pax_global_header00006660000000000000000000000064137555511160014523gustar00rootroot0000000000000052 comment=ec81155dfe84a8731ad2095cc7b88e99d80e35a4 python-digitalocean-1.16.0/000077500000000000000000000000001375555111600155525ustar00rootroot00000000000000python-digitalocean-1.16.0/.dockerignore000066400000000000000000000000311375555111600202200ustar00rootroot00000000000000.git/ *.pyc *__pycache__ python-digitalocean-1.16.0/.github/000077500000000000000000000000001375555111600171125ustar00rootroot00000000000000python-digitalocean-1.16.0/.github/workflows/000077500000000000000000000000001375555111600211475ustar00rootroot00000000000000python-digitalocean-1.16.0/.github/workflows/publish.yml000066400000000000000000000010601375555111600233350ustar00rootroot00000000000000name: Upload Python Package on Release on: release: types: [created] jobs: deploy: runs-on: ubuntu-latest steps: - uses: actions/checkout@v2 - name: Set up Python uses: actions/setup-python@v2 with: python-version: '3.x' - name: Install dependencies run: make publish_deps deps - name: Test code run: make test - name: Build and publish env: TWINE_USERNAME: ${{ secrets.PYPI_USERNAME }} TWINE_PASSWORD: ${{ secrets.PYPI_PASSWORD }} run: make clean dist publish python-digitalocean-1.16.0/.github/workflows/tests.yml000066400000000000000000000010441375555111600230330ustar00rootroot00000000000000name: Python Tests on: push: branches: [ master ] pull_request: branches: [ master ] jobs: build: runs-on: ${{ matrix.os }} strategy: matrix: os: [ubuntu-latest] python-version: ['2.x', '3.x', 3.6, 3.7, 3.8] steps: - uses: actions/checkout@v2 - name: Set up Python ${{ matrix.python-version }} uses: actions/setup-python@v2 with: python-version: ${{ matrix.python-version }} - name: Install dependencies run: make deps - name: Run tests run: make test python-digitalocean-1.16.0/.gitignore000066400000000000000000000005421375555111600175430ustar00rootroot00000000000000*.py[cod] # C extensions *.so # Packages *.egg *.egg-info 
dist build eggs parts bin var sdist develop-eggs .installed.cfg lib lib64 # Installer logs pip-log.txt # Unit test / coverage reports .coverage .tox nosetests.xml # Translations *.mo # Mr Developer .mr.developer.cfg .project .pydevproject MANIFEST .idea/* .cache .venv *.sw[pon] .eggs python-digitalocean-1.16.0/DESCRIPTION.rst000066400000000000000000000122011375555111600200630ustar00rootroot00000000000000python-digitalocean =================== This library provides easy access to Digital Ocean APIs to deploy droplets, images and more. |image0| | |image1| | |image2| | |image3| How to install -------------- You can install python-digitalocean using **pip** :: pip install -U python-digitalocean or via sources: :: python setup.py install Features -------- python-digitalocean support all the features provided via digitalocean.com APIs, such as: - Get user's Droplets - Get user's Images (Snapshot and Backups) - Get public Images - Get Droplet's event status - Create and Remove a Droplet - Resize a Droplet - Shutdown, restart and boot a Droplet - Power off, power on and "power cycle" a Droplet - Perform Snapshot - Enable/Disable automatic Backups - Restore root password of a Droplet Examples --------- Listing the droplets ~~~~~~~~~~~~~~~~~~~~ This example shows how to list all the active droplets: .. code:: python import digitalocean manager = digitalocean.Manager(token="secretspecialuniquesnowflake") print(manager.get_all_droplets()) Shutdown all droplets ~~~~~~~~~~~~~~~~~~~~~ This example shows how to shutdown all the active droplets: .. code:: python import digitalocean manager = digitalocean.Manager(token="secretspecialuniquesnowflake") my_droplets = manager.get_all_droplets() for droplet in my_droplets: droplet.shutdown() Creating a Droplet and checking its status ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ This example shows how to create a droplet and how to check its status .. 
code:: python import digitalocean droplet = digitalocean.Droplet(token="secretspecialuniquesnowflake", name='Example', region='nyc2', # New York 2 image='ubuntu-14-04-x64', # Ubuntu 14.04 x64 size_slug='512mb', # 512MB backups=True) droplet.create() Checking the status of the droplet ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ .. code:: python actions = droplet.get_actions() for action in actions: action.load() # Once it shows complete, droplet is up and running print action.status Add SSHKey into DigitalOcean Account ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ .. code:: python from digitalocean import SSHKey user_ssh_key = open('/home/<$USER>/.ssh/id_rsa.pub').read() key = SSHKey(token='secretspecialuniquesnowflake', name='uniquehostname', public_key=user_ssh_key) key.create() Creating a new droplet with all your SSH keys ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ .. code:: python manager = digitalocean.Manager(token="secretspecialuniquesnowflake") keys = manager.get_all_sshkeys() droplet = digitalocean.Droplet(token="secretspecialuniquesnowflake", name='DropletWithSSHKeys', region='ams3', # Amster image='ubuntu-14-04-x64', # Ubuntu 14.04 x64 size_slug='512mb', # 512MB ssh_keys=keys, #Automatic conversion backups=False) droplet.create() Testing ------- Test using Docker ~~~~~~~~~~~~~~~~~ To test this python-digitalocean you can use `docker `__ to have a **clean environment automatically**. First you have to build the container by running in your shell on the repository directory: :: docker build -t "pdo-tests" . Then you can run all the tests (for both python 2 and python 3) :: docker run pdo-tests **Note**: This will use Ubuntu 14.04 as base and use your repository to run tests. So every time you edit some files, please run these commands to perform tests on your changes. Testing using pytest manually ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Use `pytest `__ to perform testing. 
It is recommended to use a dedicated virtualenv to perform tests, using these commands: :: $ virtualenv /tmp/digitalocean_env $ source /tmp/digitalocean_env/bin/activate $ pip install -r requirements.txt To run all the tests manually use py.test command: :: $ python -m pytest Links ----- - GitHub: https://github.com/koalalorenzo/python-digitalocean - PyPI page: https://pypi.python.org/pypi/python-digitalocean/ - Author Website: `http://who.is.lorenzo.setale.me/? `__ - Author Blog: http://blog.setale.me/ .. |image0| image:: https://travis-ci.org/koalalorenzo/python-digitalocean.svg :target: https://travis-ci.org/koalalorenzo/python-digitalocean .. |image1| image:: https://img.shields.io/github/forks/badges/shields.svg?style=social&label=Fork :target: https://travis-ci.org/koalalorenzo/python-digitalocean .. |image2| image:: https://img.shields.io/github/stars/badges/shields.svg?style=social&label=Star :target: https://travis-ci.org/koalalorenzo/python-digitalocean .. |image3| image:: https://img.shields.io/github/watchers/badges/shields.svg?style=social&label=Watch :target: https://travis-ci.org/koalalorenzo/python-digitalocean python-digitalocean-1.16.0/Dockerfile000066400000000000000000000004421375555111600175440ustar00rootroot00000000000000# Dockerfile made to test this FROM python:alpine MAINTAINER Lorenzo Setale RUN pip3 install -U python-digitalocean pytest WORKDIR /root/ ADD . /root/python-digitalocean WORKDIR /root/python-digitalocean RUN pip3 install -U -r requirements.txt CMD python3 -m pytestpython-digitalocean-1.16.0/LICENSE.txt000066400000000000000000000167421375555111600174070ustar00rootroot00000000000000 GNU LESSER GENERAL PUBLIC LICENSE Version 3, 29 June 2007 Copyright (C) 2007 Free Software Foundation, Inc. Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed. 
This version of the GNU Lesser General Public License incorporates the terms and conditions of version 3 of the GNU General Public License, supplemented by the additional permissions listed below. 0. Additional Definitions. As used herein, "this License" refers to version 3 of the GNU Lesser General Public License, and the "GNU GPL" refers to version 3 of the GNU General Public License. "The Library" refers to a covered work governed by this License, other than an Application or a Combined Work as defined below. An "Application" is any work that makes use of an interface provided by the Library, but which is not otherwise based on the Library. Defining a subclass of a class defined by the Library is deemed a mode of using an interface provided by the Library. A "Combined Work" is a work produced by combining or linking an Application with the Library. The particular version of the Library with which the Combined Work was made is also called the "Linked Version". The "Minimal Corresponding Source" for a Combined Work means the Corresponding Source for the Combined Work, excluding any source code for portions of the Combined Work that, considered in isolation, are based on the Application, and not on the Linked Version. The "Corresponding Application Code" for a Combined Work means the object code and/or source code for the Application, including any data and utility programs needed for reproducing the Combined Work from the Application, but excluding the System Libraries of the Combined Work. 1. Exception to Section 3 of the GNU GPL. You may convey a covered work under sections 3 and 4 of this License without being bound by section 3 of the GNU GPL. 2. Conveying Modified Versions. 
If you modify a copy of the Library, and, in your modifications, a facility refers to a function or data to be supplied by an Application that uses the facility (other than as an argument passed when the facility is invoked), then you may convey a copy of the modified version: a) under this License, provided that you make a good faith effort to ensure that, in the event an Application does not supply the function or data, the facility still operates, and performs whatever part of its purpose remains meaningful, or b) under the GNU GPL, with none of the additional permissions of this License applicable to that copy. 3. Object Code Incorporating Material from Library Header Files. The object code form of an Application may incorporate material from a header file that is part of the Library. You may convey such object code under terms of your choice, provided that, if the incorporated material is not limited to numerical parameters, data structure layouts and accessors, or small macros, inline functions and templates (ten or fewer lines in length), you do both of the following: a) Give prominent notice with each copy of the object code that the Library is used in it and that the Library and its use are covered by this License. b) Accompany the object code with a copy of the GNU GPL and this license document. 4. Combined Works. You may convey a Combined Work under terms of your choice that, taken together, effectively do not restrict modification of the portions of the Library contained in the Combined Work and reverse engineering for debugging such modifications, if you also do each of the following: a) Give prominent notice with each copy of the Combined Work that the Library is used in it and that the Library and its use are covered by this License. b) Accompany the Combined Work with a copy of the GNU GPL and this license document. 
c) For a Combined Work that displays copyright notices during execution, include the copyright notice for the Library among these notices, as well as a reference directing the user to the copies of the GNU GPL and this license document. d) Do one of the following: 0) Convey the Minimal Corresponding Source under the terms of this License, and the Corresponding Application Code in a form suitable for, and under terms that permit, the user to recombine or relink the Application with a modified version of the Linked Version to produce a modified Combined Work, in the manner specified by section 6 of the GNU GPL for conveying Corresponding Source. 1) Use a suitable shared library mechanism for linking with the Library. A suitable mechanism is one that (a) uses at run time a copy of the Library already present on the user's computer system, and (b) will operate properly with a modified version of the Library that is interface-compatible with the Linked Version. e) Provide Installation Information, but only if you would otherwise be required to provide such information under section 6 of the GNU GPL, and only to the extent that such information is necessary to install and execute a modified version of the Combined Work produced by recombining or relinking the Application with a modified version of the Linked Version. (If you use option 4d0, the Installation Information must accompany the Minimal Corresponding Source and Corresponding Application Code. If you use option 4d1, you must provide the Installation Information in the manner specified by section 6 of the GNU GPL for conveying Corresponding Source.) 5. Combined Libraries. 
You may place library facilities that are a work based on the Library side by side in a single library together with other library facilities that are not Applications and are not covered by this License, and convey such a combined library under terms of your choice, if you do both of the following: a) Accompany the combined library with a copy of the same work based on the Library, uncombined with any other library facilities, conveyed under the terms of this License. b) Give prominent notice with the combined library that part of it is a work based on the Library, and explaining where to find the accompanying uncombined form of the same work. 6. Revised Versions of the GNU Lesser General Public License. The Free Software Foundation may publish revised and/or new versions of the GNU Lesser General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns. Each version is given a distinguishing version number. If the Library as you received it specifies that a certain numbered version of the GNU Lesser General Public License "or any later version" applies to it, you have the option of following the terms and conditions either of that published version or of any later version published by the Free Software Foundation. If the Library as you received it does not specify a version number of the GNU Lesser General Public License, you may choose any version of the GNU Lesser General Public License ever published by the Free Software Foundation. 
If the Library as you received it specifies that a proxy can decide whether future versions of the GNU Lesser General Public License shall apply, that proxy's public statement of acceptance of any version is permanent authorization for you to choose that version for the Library.python-digitalocean-1.16.0/Makefile000066400000000000000000000006301375555111600172110ustar00rootroot00000000000000clean: rm -rf build rm -rf dist rm -rf python_digitalocean.egg-info rm -rf .pytest_cache .PHONY: clean dist: python setup.py sdist bdist_wheel publish: dist twine upload dist/* .PHONY: publish publish_deps: python -m pip install --upgrade pip pip install setuptools wheel twine .PHONY: publish_deps deps: pip install -U -r requirements.txt .PHONY: deps test: python -m pytest .PHONY: test python-digitalocean-1.16.0/README.md000066400000000000000000000340701375555111600170350ustar00rootroot00000000000000

# python-digitalocean

Easy access to Digital Ocean APIs to deploy droplets, images and more.

Build Status

## Table of Contents

- [How to install](#how-to-install)
- [Configurations](#configurations)
- [Features](#features)
- [Examples](#examples)
  - [Listing the droplets](#listing-the-droplets)
  - [Listing the droplets by tags](#listing-the-droplets-by-tags)
  - [Add a tag to a droplet](#add-a-tag-to-a-droplet)
  - [Shutdown all droplets](#shutdown-all-droplets)
  - [Creating a Droplet and checking its status](#creating-a-droplet-and-checking-its-status)
  - [Checking the status of the droplet](#checking-the-status-of-the-droplet)
  - [Listing the Projects](#listing-the-projects)
  - [Assign a resource for specific project](#assign-a-resource-for-specific-project)
  - [List all the resources of a project](#list-all-the-resources-of-a-project)
  - [Add SSHKey into DigitalOcean Account](#add-sshkey-into-digitalocean-account)
  - [Creating a new droplet with all your SSH keys](#creating-a-new-droplet-with-all-your-ssh-keys)
  - [Creating a Firewall](#creating-a-firewall)
  - [Listing the domains](#listing-the-domains)
  - [Listing records of a domain](#listing-records-of-a-domain)
  - [Creating a domain record](#creating-a-domain-record)
  - [Update a domain record](#update-a-domain-record)
- [Getting account requests/hour limits status](#getting-account-requestshour-limits-status)
- [Session customization](#session-customization)
- [Testing](#testing)
  - [Test using Docker](#test-using-docker)
  - [Testing using pytest manually](#testing-using-pytest-manually)
- [Links](#links)

## How to install

You can install python-digitalocean using **pip**

    pip install -U python-digitalocean

or via sources:

    python setup.py install

**[⬆ back to top](#table-of-contents)**

## Configurations

Specify a custom provider using an environment variable

    export DIGITALOCEAN_END_POINT=http://example.com/

Specify the DIGITALOCEAN_ACCESS_TOKEN using an environment variable

    export DIGITALOCEAN_ACCESS_TOKEN='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx'

Note: You probably want to add the export line above to your
`.bashrc` file.

**[⬆ back to top](#table-of-contents)**

## Features

python-digitalocean supports all the features provided via the digitalocean.com APIs, such as:

* Get user's Projects
* Assign a resource to a user project
* List the resources of user's project
* Get user's Droplets
* Get user's Images (Snapshot and Backups)
* Get public Images
* Get Droplet's event status
* Create and Remove a Droplet
* Create, Add and Remove Tags from Droplets
* Resize a Droplet
* Shutdown, restart and boot a Droplet
* Power off, power on and "power cycle" a Droplet
* Perform Snapshot
* Enable/Disable automatic Backups
* Restore root password of a Droplet

**[⬆ back to top](#table-of-contents)**

## Examples

### Listing the droplets

This example shows how to list all the active droplets:

```python
import digitalocean
manager = digitalocean.Manager(token="secretspecialuniquesnowflake")
my_droplets = manager.get_all_droplets()
print(my_droplets)
```

This example shows how to specify a custom provider's end point URL:

```python
import digitalocean
manager = digitalocean.Manager(token="secretspecialuniquesnowflake",
                               end_point="http://example.com/")
```

**[⬆ back to top](#table-of-contents)**

### Listing the droplets by tags

This example shows how to list all the active droplets with a given tag:

```python
import digitalocean
manager = digitalocean.Manager(token="secretspecialuniquesnowflake")
my_droplets = manager.get_all_droplets(tag_name="awesome")
print(my_droplets)
```

**[⬆ back to top](#table-of-contents)**

### Add a tag to a droplet

This example shows how to add a tag to a droplet:

```python
import digitalocean
tag = digitalocean.Tag(token="secretspecialuniquesnowflake", name="tag_name")
tag.create()  # create tag if not already created
tag.add_droplets(["DROPLET_ID"])
```

**[⬆ back to top](#table-of-contents)**

### Shutdown all droplets

This example shows how to shutdown all the active droplets:

```python
import digitalocean
manager = digitalocean.Manager(token="secretspecialuniquesnowflake")
my_droplets = manager.get_all_droplets()
for droplet in my_droplets:
    droplet.shutdown()
```

**[⬆ back to top](#table-of-contents)**

### Creating a Droplet and checking its status

This example shows how to create a droplet and how to check its status:

```python
import digitalocean
droplet = digitalocean.Droplet(token="secretspecialuniquesnowflake",
                               name='Example',
                               region='nyc2',  # New York 2
                               image='ubuntu-20-04-x64',  # Ubuntu 20.04 x64
                               size_slug='s-1vcpu-1gb',  # 1GB RAM, 1 vCPU
                               backups=True)
droplet.create()
```

**[⬆ back to top](#table-of-contents)**

### Checking the status of the droplet

```python
actions = droplet.get_actions()
for action in actions:
    action.load()
    # Once it shows "completed", droplet is up and running
    print(action.status)
```

**[⬆ back to top](#table-of-contents)**

### Listing the Projects

This example shows how to list all the projects:

```python
import digitalocean
manager = digitalocean.Manager(token="secretspecialuniquesnowflake")
my_projects = manager.get_all_projects()
print(my_projects)
```

**[⬆ back to top](#table-of-contents)**

### Assign a resource for specific project

```python
import digitalocean
manager = digitalocean.Manager(token="secretspecialuniquesnowflake")
my_projects = manager.get_all_projects()
my_projects[0].assign_resource(["do:droplet:<droplet_id>"])
```

**[⬆ back to top](#table-of-contents)**

### List all the resources of a project

```python
import digitalocean
manager = digitalocean.Manager(token="secretspecialuniquesnowflake")
my_projects = manager.get_all_projects()
resources = my_projects[0].get_all_resources()
print(resources)
```

**[⬆ back to top](#table-of-contents)**

### Add SSHKey into DigitalOcean Account

```python
from digitalocean import SSHKey

user_ssh_key = open('/home/<$USER>/.ssh/id_rsa.pub').read()
key = SSHKey(token='secretspecialuniquesnowflake',
             name='uniquehostname',
             public_key=user_ssh_key)
key.create()
```

**[⬆ back to top](#table-of-contents)**

### Creating a new droplet with all your SSH keys

```python
manager = digitalocean.Manager(token="secretspecialuniquesnowflake")
keys = manager.get_all_sshkeys()
droplet = digitalocean.Droplet(token=manager.token,
                               name='DropletWithSSHKeys',
                               region='ams3',  # Amsterdam 3
                               image='ubuntu-20-04-x64',  # Ubuntu 20.04 x64
                               size_slug='s-1vcpu-1gb',  # 1GB RAM, 1 vCPU
                               ssh_keys=keys,  # Automatic conversion
                               backups=False)
droplet.create()
```

**[⬆ back to top](#table-of-contents)**

### Creating a Firewall

This example creates a firewall that only accepts inbound tcp traffic on port 80 from a specific load balancer and allows outbound tcp traffic on all ports to all addresses.

```python
from digitalocean import Firewall, InboundRule, OutboundRule, Destinations, Sources

inbound_rule = InboundRule(protocol="tcp", ports="80",
                           sources=Sources(
                               load_balancer_uids=[
                                   "4de7ac8b-495b-4884-9a69-1050c6793cd6"]
                           ))

outbound_rule = OutboundRule(protocol="tcp", ports="all",
                             destinations=Destinations(
                                 addresses=[
                                     "0.0.0.0/0",
                                     "::/0"]
                             ))

firewall = Firewall(token="secretspecialuniquesnowflake",
                    name="new-firewall",
                    inbound_rules=[inbound_rule],
                    outbound_rules=[outbound_rule],
                    droplet_ids=[8043964, 8043972])
firewall.create()
```

**[⬆ back to top](#table-of-contents)**

### Listing the domains

This example shows how to list all the active domains:

```python
import digitalocean
TOKEN="secretspecialuniquesnowflake"
manager = digitalocean.Manager(token=TOKEN)
my_domains = manager.get_all_domains()
print(my_domains)
```

**[⬆ back to top](#table-of-contents)**

### Listing records of a domain

This example shows how to list all records of a domain:

```python
import digitalocean
TOKEN="secretspecialuniquesnowflake"
domain = digitalocean.Domain(token=TOKEN, name="example.com")
records = domain.get_records()
for r in records:
    print(r.name, r.domain, r.type, r.data)
```

**[⬆ back to top](#table-of-contents)**

### Creating a domain record

This example shows how to create a new domain record (sub.example.com):

```python
import digitalocean
TOKEN="secretspecialuniquesnowflake"
domain = digitalocean.Domain(token=TOKEN, name="example.com")
new_record = domain.create_new_domain_record(
    type='A',
    name='sub',
    data='93.184.216.34'
)
print(new_record)
```

**[⬆ back to top](#table-of-contents)**

### Update a domain record

This example shows how to update an existing domain record (sub.example.com):

```python
import digitalocean
TOKEN="secretspecialuniquesnowflake"
domain = digitalocean.Domain(token=TOKEN, name="example.com")
records = domain.get_records()
for r in records:
    if r.name == 'sub':
        r.data = '1.1.1.1'
        r.save()
```

**[⬆ back to top](#table-of-contents)**

## Getting account requests/hour limits status

Each request will also include the rate limit information:

```python
import digitalocean
account = digitalocean.Account(token="secretspecialuniquesnowflake").load()
# or
manager = digitalocean.Manager(token="secretspecialuniquesnowflake")
account = manager.get_account()
```

Output:

```text
droplet_limit: 25
email: 'name@domain.me'
email_verified: True
end_point: 'https://api.digitalocean.com/v2/'
floating_ip_limit: 3
ratelimit_limit: '5000'
ratelimit_remaining: '4995'
ratelimit_reset: '1505378973'
status: 'active'
status_message: ''
token: 'my_secret_token'
uuid: 'my_id'
```

When using the Manager().get_all... functions, the rate limit will be stored on the manager object:

```python
import digitalocean
manager = digitalocean.Manager(token="secretspecialuniquesnowflake")
domains = manager.get_all_domains()
print(manager.ratelimit_limit)
```

**[⬆ back to top](#table-of-contents)**

## Session customization

You can take advantage of the [requests](http://docs.python-requests.org/en/master/) library and configure the HTTP client under python-digitalocean.
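For instance, the rate-limit headers that the API returns on every response can be surfaced through a response hook. The sketch below is a minimal illustration: the `log_ratelimit` helper name is ours, and attaching it assumes `Manager` exposes its underlying `requests.Session` as `_session`, as the examples in this section do.

```python
# Hypothetical helper, not part of python-digitalocean itself.
def log_ratelimit(response, *args, **kwargs):
    # requests invokes response hooks with the Response plus extra
    # positional/keyword arguments; returning the response (or None)
    # keeps the hook chain intact.
    remaining = response.headers.get("Ratelimit-Remaining")
    if remaining is not None:
        print("API calls remaining this hour:", remaining)
    return response

# Attach it once, and every call made through the manager reports
# the remaining request budget:
#
#   manager = digitalocean.Manager(token="secretspecialuniquesnowflake")
#   manager._session.hooks["response"].append(log_ratelimit)
```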
### Configure retries in case of connection error

This example shows how to configure your client to retry 3 times in case of `ConnectionError`:

```python
import digitalocean
import requests
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.util.retry import Retry

manager = digitalocean.Manager(token="secretspecialuniquesnowflake")
retry = Retry(connect=3)
adapter = HTTPAdapter(max_retries=retry)
manager._session.mount('https://', adapter)
```

See the [`Retry`](https://urllib3.readthedocs.io/en/latest/reference/urllib3.util.html#urllib3.util.retry.Retry) object reference to get more details about all the retry options.

### Configure a hook on a specified answer

This example shows how to launch custom actions if an HTTP 500 occurs:

```python
import digitalocean

def handle_response(response, *args, **kwargs):
    if response.status_code == 500:
        # Do whatever you need with the raw response
        pass
    return response

manager = digitalocean.Manager(token="secretspecialuniquesnowflake")
manager._session.hooks['response'].append(handle_response)
```

See the [event hooks documentation](http://docs.python-requests.org/en/master/user/advanced/?highlight=HTTPAdapter#event-hooks) to get more details about this feature.

**[⬆ back to top](#table-of-contents)**

## Testing

### Test using Docker

To test python-digitalocean you can use [docker](https://www.docker.com) to have a **clean environment automatically**. First you have to build the container by running in your shell on the repository directory:

    docker build -t "pdo-tests" .

Then you can run all the tests:

    docker run pdo-tests

**Note**: This uses the `python:alpine` image as base and copies your repository into the image, so every time you edit some files, please re-run these commands to perform tests on your changes.

**[⬆ back to top](#table-of-contents)**

### Testing using pytest manually

Use [pytest](http://pytest.org/) to perform testing.
It is recommended to use a dedicated virtualenv to perform tests, using these commands: $ virtualenv /tmp/digitalocean_env $ source /tmp/digitalocean_env/bin/activate $ pip install -r requirements.txt To run all the tests manually use py.test command: $ python -m pytest **[⬆ back to top](#table-of-contents)** ## Links * GitHub: [https://github.com/koalalorenzo/python-digitalocean](https://github.com/koalalorenzo/python-digitalocean) * PyPI page: [https://pypi.python.org/pypi/python-digitalocean/](https://pypi.python.org/pypi/python-digitalocean/) * Author Website: [http://who.is.lorenzo.setale.me/?](http://setale.me/) * Author Blog: [http://blog.setale.me/](http://blog.setale.me/) **[⬆ back to top](#table-of-contents)** python-digitalocean-1.16.0/digitalocean/000077500000000000000000000000001375555111600201755ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/Account.py000066400000000000000000000016101375555111600221410ustar00rootroot00000000000000# -*- coding: utf-8 -*- from .baseapi import BaseAPI class Account(BaseAPI): def __init__(self, *args, **kwargs): self.droplet_limit = None self.floating_ip_limit = None self.email = None self.uuid = None self.email_verified = None self.status = None self.status_message = None super(Account, self).__init__(*args, **kwargs) @classmethod def get_object(cls, api_token): """ Class method that will return an Account object. 
""" acct = cls(token=api_token) acct.load() return acct def load(self): # URL https://api.digitalocean.com/v2/account data = self.get_data("account/") account = data['account'] for attr in account.keys(): setattr(self, attr, account[attr]) def __str__(self): return "%s" % (self.email) python-digitalocean-1.16.0/digitalocean/Action.py000066400000000000000000000041601375555111600217650ustar00rootroot00000000000000# -*- coding: utf-8 -*- from time import sleep from .baseapi import BaseAPI class Action(BaseAPI): def __init__(self, *args, **kwargs): self.id = None self.status = None self.type = None self.started_at = None self.completed_at = None self.resource_id = None self.resource_type = None self.region = None self.region_slug = None # Custom, not provided by the json object. self.droplet_id = None super(Action, self).__init__(*args, **kwargs) @classmethod def get_object(cls, api_token, action_id): """ Class method that will return a Action object by ID. """ action = cls(token=api_token, id=action_id) action.load_directly() return action def load_directly(self): action = self.get_data("actions/%s" % self.id) if action: action = action[u'action'] # Loading attributes for attr in action.keys(): setattr(self, attr, action[attr]) def load(self): if not self.droplet_id: action = self.load_directly() else: action = self.get_data( "droplets/%s/actions/%s" % ( self.droplet_id, self.id ) ) if action: action = action[u'action'] # Loading attributes for attr in action.keys(): setattr(self, attr, action[attr]) def wait(self, update_every_seconds=1): """ Wait until the action is marked as completed or with an error. It will return True in case of success, otherwise False. Optional Args: update_every_seconds - int : number of seconds to wait before checking if the action is completed. 
""" while self.status == u'in-progress': sleep(update_every_seconds) self.load() return self.status == u'completed' def __str__(self): return "" % (self.id, self.type, self.status) python-digitalocean-1.16.0/digitalocean/Balance.py000066400000000000000000000015171375555111600221000ustar00rootroot00000000000000# -*- coding: utf-8 -*- from .baseapi import BaseAPI class Balance(BaseAPI): def __init__(self, *args, **kwargs): self.month_to_date_balance = None self.account_balance = None self.month_to_date_usage = None self.generated_at = None super(Balance, self).__init__(*args, **kwargs) @classmethod def get_object(cls, api_token): """ Class method that will return an Balance object. """ acct = cls(token=api_token) acct.load() return acct def load(self): # URL https://api.digitalocean.com/customers/my/balance balance = self.get_data("customers/my/balance") for attr in balance.keys(): setattr(self, attr, balance[attr]) def __str__(self): return "" % (self.account_balance) python-digitalocean-1.16.0/digitalocean/Certificate.py000066400000000000000000000104411375555111600227710ustar00rootroot00000000000000# -*- coding: utf-8 -*- from .baseapi import BaseAPI, POST, DELETE class Certificate(BaseAPI): """ An object representing an SSL Certificate stored on DigitalOcean. Attributes accepted at creation time: Args: name (str): A name for the Certificate private_key (str, optional): The contents of a PEM-formatted private-key corresponding to the SSL certificate. Only used when uploading a custom certificate. leaf_certificate (str, optional): The contents of a PEM-formatted public SSL certificate. Only used when uploading a custom certificate. certificate_chain (str, optional): The full PEM-formatted trust chain between the certificate authority's certificate and your domain's SSL certificate. Only used when uploading a custom certificate. 
dns_names (:obj:`str`): A list of fully qualified domain names (FQDNs) for which the certificate will be issued by Let's Encrypt type (str): Specifies the type of certificate to be created. The value should be "custom" for a user-uploaded certificate or "lets_encrypt" for one automatically generated with Let's Encrypt. Attributes returned by API: name (str): The name of the Certificate id (str): A unique identifier for the Certificate not_after (str): A string that represents the Certificate's expiration date. sha1_fingerprint (str): A unique identifier for the Certificate generated from its SHA-1 fingerprint created_at (str): A string that represents when the Certificate was created dns_names (:obj:`str`): A list of fully qualified domain names (FQDNs) for which a Let's Encrypt generated certificate is issued. type (str): Specifies the type of certificate. The value will be "custom" for a user-uploaded certificate or "lets_encrypt" for one automatically generated with Let's Encrypt. state (str): Represents the current state of the certificate. It may be "pending", "verified", or "errored". """ def __init__(self, *args, **kwargs): self.id = "" self.name = None self.private_key = None self.leaf_certificate = None self.certificate_chain = None self.not_after = None self.sha1_fingerprint = None self.created_at = None self.dns_names = [] self.type = None self.state = None super(Certificate, self).__init__(*args, **kwargs) @classmethod def get_object(cls, api_token, cert_id): """ Class method that will return a Certificate object by its ID. """ certificate = cls(token=api_token, id=cert_id) certificate.load() return certificate def load(self): """ Load the Certificate object from DigitalOcean. Requires self.id to be set. 
""" data = self.get_data("certificates/%s" % self.id) certificate = data["certificate"] for attr in certificate.keys(): setattr(self, attr, certificate[attr]) return self def create(self): """ Create the Certificate """ params = { "name": self.name, "type": self.type, "dns_names": self.dns_names, "private_key": self.private_key, "leaf_certificate": self.leaf_certificate, "certificate_chain": self.certificate_chain } data = self.get_data("certificates", type=POST, params=params) if data: self.id = data['certificate']['id'] self.not_after = data['certificate']['not_after'] self.sha1_fingerprint = data['certificate']['sha1_fingerprint'] self.created_at = data['certificate']['created_at'] self.type = data['certificate']['type'] self.dns_names = data['certificate']['dns_names'] self.state = data['certificate']['state'] return self def destroy(self): """ Delete the Certificate """ return self.get_data("certificates/%s" % self.id, type=DELETE) def __str__(self): return "" % (self.id, self.name) python-digitalocean-1.16.0/digitalocean/Domain.py000066400000000000000000000073551375555111600217700ustar00rootroot00000000000000# -*- coding: utf-8 -*- from .Record import Record from .baseapi import BaseAPI, GET, POST, DELETE class Domain(BaseAPI): def __init__(self, *args, **kwargs): self.name = None self.ttl = None self.zone_file = None self.ip_address = None super(Domain, self).__init__(*args, **kwargs) @classmethod def get_object(cls, api_token, domain_name): """ Class method that will return a Domain object by ID. 
""" domain = cls(token=api_token, name=domain_name) domain.load() return domain def load(self): # URL https://api.digitalocean.com/v2/domains domains = self.get_data("domains/%s" % self.name) domain = domains['domain'] for attr in domain.keys(): setattr(self, attr, domain[attr]) def destroy(self): """ Destroy the domain by name """ # URL https://api.digitalocean.com/v2/domains/[NAME] return self.get_data("domains/%s" % self.name, type=DELETE) def create_new_domain_record(self, *args, **kwargs): """ Create new domain record. https://developers.digitalocean.com/#create-a-new-domain-record Args: type: The record type (A, MX, CNAME, etc). name: The host name, alias, or service being defined by the record data: Variable data depending on record type. Optional Args: priority: The priority of the host port: The port that the service is accessible on weight: The weight of records with the same priority ttl: This value is the time to live for the record, in seconds. flags: An unsigned integer between 0-255 used for CAA records. tag: The parameter tag for CAA records. Valid values are "issue", "issuewild", or "iodef". 
""" data = { "type": kwargs.get("type", None), "name": kwargs.get("name", None), "data": kwargs.get("data", None) } #  Optional Args if kwargs.get("priority", None) != None: data['priority'] = kwargs.get("priority", None) if kwargs.get("port", None): data['port'] = kwargs.get("port", None) if kwargs.get("weight", None) != None: data['weight'] = kwargs.get("weight", None) if kwargs.get("ttl", None): data['ttl'] = kwargs.get("ttl", 1800) if kwargs.get("flags", None) != None: data['flags'] = kwargs.get("flags", None) if kwargs.get("tag", None): data['tag'] = kwargs.get("tag", "issue") return self.get_data( "domains/%s/records" % self.name, type=POST, params=data ) def create(self): """ Create new doamin """ # URL https://api.digitalocean.com/v2/domains data = { "name": self.name, "ip_address": self.ip_address, } domain = self.get_data("domains", type=POST, params=data) return domain def get_records(self, params=None): """ Returns a list of Record objects """ if params is None: params = {} # URL https://api.digitalocean.com/v2/domains/[NAME]/records/ records = [] data = self.get_data("domains/%s/records/" % self.name, type=GET, params=params) for record_data in data['domain_records']: record = Record(domain_name=self.name, **record_data) record.token = self.token records.append(record) return records def __str__(self): return "%s" % (self.name) python-digitalocean-1.16.0/digitalocean/Droplet.py000066400000000000000000000512361375555111600221670ustar00rootroot00000000000000# -*- coding: utf-8 -*- import re from .Action import Action from .Image import Image from .Kernel import Kernel from .baseapi import BaseAPI, Error, GET, POST, DELETE from .SSHKey import SSHKey from .Volume import Volume class DropletError(Error): """Base exception class for this module""" pass class BadKernelObject(DropletError): pass class BadSSHKeyFormat(DropletError): pass class Droplet(BaseAPI): """Droplet management Attributes accepted at creation time: Args: name (str): name size_slug (str): 
droplet size image (str): image name to use to create droplet region (str): region ssh_keys (:obj:`str`, optional): list of ssh keys backups (bool): True if backups enabled ipv6 (bool): True if ipv6 enabled private_networking (bool): True if private networking enabled user_data (str): arbitrary data to pass to droplet volumes (:obj:`str`, optional): list of blockstorage volumes monitoring (bool): True if installing the DigitalOcean monitoring agent vpc_uuid (str, optional): ID of a VPC in which the Droplet will be created Attributes returned by API: * id (int): droplet id * memory (str): memory size * vcpus (int): number of vcpus * disk (int): disk size in GB * locked (bool): True if locked * created_at (str): creation date in format u'2014-11-06T10:42:09Z' * status (str): status, e.g. 'new', 'active', etc * networks (dict): details of connected networks * kernel (dict): details of kernel * backup_ids (:obj:`int`, optional): list of ids of backups of this droplet * snapshot_ids (:obj:`int`, optional): list of ids of snapshots of this droplet * action_ids (:obj:`int`, optional): list of ids of actions * features (:obj:`str`, optional): list of enabled features. e.g. 
[u'private_networking', u'virtio'] * image (dict): details of image used to create this droplet * ip_address (str): public ip addresses * private_ip_address (str): private ip address * ip_v6_address (:obj:`str`, optional): list of ipv6 addresses assigned * end_point (str): url of api endpoint used * volume_ids (:obj:`str`, optional): list of blockstorage volumes * vpc_uuid (str, optional): ID of the VPC that the Droplet is assigned to """ def __init__(self, *args, **kwargs): # Defining default values self.id = None self.name = None self.memory = None self.vcpus = None self.disk = None self.region = [] self.image = None self.size_slug = None self.locked = None self.created_at = None self.status = None self.networks = [] self.kernel = None self.backup_ids = [] self.snapshot_ids = [] self.action_ids = [] self.features = [] self.ip_address = None self.private_ip_address = None self.ip_v6_address = None self.ssh_keys = [] self.backups = None self.ipv6 = None self.private_networking = None self.user_data = None self.volumes = [] self.tags = [] self.monitoring = None self.vpc_uuid = None # This will load also the values passed super(Droplet, self).__init__(*args, **kwargs) @classmethod def get_object(cls, api_token, droplet_id): """Class method that will return a Droplet object by ID. 
Args: api_token (str): token droplet_id (int): droplet id """ droplet = cls(token=api_token, id=droplet_id) droplet.load() return droplet @classmethod def create_multiple(*args, **kwargs): api = BaseAPI(token=kwargs.get("token")) data = { "names": kwargs.get("names"), "size": kwargs.get("size_slug") or kwargs.get("size"), "image": kwargs.get("image"), "region": kwargs.get("region"), "backups": bool(kwargs.get("backups")), "ipv6": bool(kwargs.get("ipv6")), "private_networking": bool(kwargs.get("private_networking")), "tags": kwargs.get("tags"), "monitoring": bool(kwargs.get("monitoring")), "vpc_uuid": kwargs.get("vpc_uuid"), } if kwargs.get("ssh_keys"): data["ssh_keys"] = Droplet.__get_ssh_keys_id_or_fingerprint( kwargs["ssh_keys"], kwargs.get("token"), kwargs["names"][0]) if kwargs.get("user_data"): data["user_data"] = kwargs["user_data"] droplets = [] data = api.get_data("droplets/", type=POST, params=data) if data: action_ids = [data["links"]["actions"][0]["id"]] for droplet_json in data["droplets"]: droplet_json["token"] = kwargs["token"] droplet = Droplet(**droplet_json) droplet.action_ids = action_ids droplets.append(droplet) return droplets def __check_actions_in_data(self, data): # reloading actions if actions is provided. 
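`Droplet.create_multiple()` above batches creation by sending a `names` list instead of the single `name` used for one droplet. A sketch of that payload shape — the helper, slugs, and hostnames here are illustrative only:

```python
def build_multi_create_params(names, size, image, region, **kwargs):
    # Mirrors the dict create_multiple() POSTs to "droplets/": the "names"
    # list is what makes the API create several droplets in one call.
    return {
        "names": names,
        "size": size,
        "image": image,
        "region": region,
        "backups": bool(kwargs.get("backups")),
        "ipv6": bool(kwargs.get("ipv6")),
        "private_networking": bool(kwargs.get("private_networking")),
        "monitoring": bool(kwargs.get("monitoring")),
    }

params = build_multi_create_params(["web-1", "web-2"], "s-1vcpu-1gb",
                                   "ubuntu-20-04-x64", "nyc3")
```

Note that the booleans are coerced with `bool()`, so omitted flags serialize as `False` rather than `None`, matching the original code.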
if u"actions" in data: self.action_ids = [] for action in data[u'actions']: self.action_ids.append(action[u'id']) def get_data(self, *args, **kwargs): """ Customized version of get_data to perform __check_actions_in_data """ data = super(Droplet, self).get_data(*args, **kwargs) if "type" in kwargs: if kwargs["type"] == POST: self.__check_actions_in_data(data) return data def load(self): """ Fetch data about droplet - use this instead of get_data() """ droplets = self.get_data("droplets/%s" % self.id) droplet = droplets['droplet'] for attr in droplet.keys(): setattr(self, attr, droplet[attr]) for net in self.networks['v4']: if net['type'] == 'private': self.private_ip_address = net['ip_address'] if net['type'] == 'public': self.ip_address = net['ip_address'] if self.networks['v6']: self.ip_v6_address = self.networks['v6'][0]['ip_address'] if "backups" in self.features: self.backups = True else: self.backups = False if "ipv6" in self.features: self.ipv6 = True else: self.ipv6 = False if "private_networking" in self.features: self.private_networking = True else: self.private_networking = False if "tags" in droplets: self.tags = droplets["tags"] return self def _perform_action(self, params, return_dict=True): """ Perform a droplet action. Args: params (dict): parameters of the action Optional Args: return_dict (bool): Return a dict when True (default), otherwise return an Action. Returns dict or Action """ action = self.get_data( "droplets/%s/actions/" % self.id, type=POST, params=params ) if return_dict: return action else: action = action[u'action'] return_action = Action(token=self.tokens) # Loading attributes for attr in action.keys(): setattr(return_action, attr, action[attr]) return return_action def power_on(self, return_dict=True): """ Boot up the droplet Optional Args: return_dict (bool): Return a dict when True (default), otherwise return an Action. 
Returns dict or Action """ return self._perform_action({'type': 'power_on'}, return_dict) def shutdown(self, return_dict=True): """ shutdown the droplet Optional Args: return_dict (bool): Return a dict when True (default), otherwise return an Action. Returns dict or Action """ return self._perform_action({'type': 'shutdown'}, return_dict) def reboot(self, return_dict=True): """ restart the droplet Optional Args: return_dict (bool): Return a dict when True (default), otherwise return an Action. Returns dict or Action """ return self._perform_action({'type': 'reboot'}, return_dict) def power_cycle(self, return_dict=True): """ restart the droplet Optional Args: return_dict (bool): Return a dict when True (default), otherwise return an Action. Returns dict or Action """ return self._perform_action({'type': 'power_cycle'}, return_dict) def power_off(self, return_dict=True): """ restart the droplet Optional Args: return_dict (bool): Return a dict when True (default), otherwise return an Action. Returns dict or Action """ return self._perform_action({'type': 'power_off'}, return_dict) def reset_root_password(self, return_dict=True): """ reset the root password Optional Args: return_dict (bool): Return a dict when True (default), otherwise return an Action. Returns dict or Action """ return self._perform_action({'type': 'password_reset'}, return_dict) def resize(self, new_size_slug, return_dict=True, disk=True): """Resize the droplet to a new size slug. https://developers.digitalocean.com/documentation/v2/#resize-a-droplet Args: new_size_slug (str): name of new size Optional Args: return_dict (bool): Return a dict when True (default), \ otherwise return an Action. disk (bool): If a permanent resize, with disk changes included. 
Returns dict or Action """ options = {"type": "resize", "size": new_size_slug} if disk: options["disk"] = "true" return self._perform_action(options, return_dict) def take_snapshot(self, snapshot_name, return_dict=True, power_off=False): """Take a snapshot! Args: snapshot_name (str): name of snapshot Optional Args: return_dict (bool): Return a dict when True (default), otherwise return an Action. power_off (bool): Before taking the snapshot the droplet will be turned off with another API call. It will wait until the droplet will be powered off. Returns dict or Action """ if power_off is True and self.status != "off": action = self.power_off(return_dict=False) action.wait() self.load() return self._perform_action( {"type": "snapshot", "name": snapshot_name}, return_dict ) def restore(self, image_id, return_dict=True): """Restore the droplet to an image ( snapshot or backup ) Args: image_id (int): id of image Optional Args: return_dict (bool): Return a dict when True (default), otherwise return an Action. Returns dict or Action """ return self._perform_action( {"type": "restore", "image": image_id}, return_dict ) def rebuild(self, image_id=None, return_dict=True): """Restore the droplet to an image ( snapshot or backup ) Args: image_id (int): id of image Optional Args: return_dict (bool): Return a dict when True (default), otherwise return an Action. Returns dict or Action """ if not image_id: image_id = self.image['id'] return self._perform_action( {"type": "rebuild", "image": image_id}, return_dict ) def enable_backups(self, return_dict=True): """ Enable automatic backups Optional Args: return_dict (bool): Return a dict when True (default), otherwise return an Action. Returns dict or Action """ return self._perform_action({'type': 'enable_backups'}, return_dict) def disable_backups(self, return_dict=True): """ Disable automatic backups Optional Args: return_dict (bool): Return a dict when True (default), otherwise return an Action. 
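`take_snapshot(power_off=True)` above first powers the droplet off, waits for that action, reloads, and only then requests the snapshot. The control flow can be exercised offline with a stub — `FakeDroplet` is a hypothetical stand-in, not part of the library:

```python
class FakeDroplet:
    """Stand-in used to illustrate take_snapshot()'s power_off=True flow."""
    def __init__(self, status):
        self.status = status
        self.actions = []          # records the order of triggered actions

    def power_off(self):
        self.actions.append('power_off')
        self.status = 'off'        # pretend the power-off action completed

    def take_snapshot(self, name, power_off=False):
        # The real method also wait()s on the action and re-load()s itself.
        if power_off and self.status != 'off':
            self.power_off()
        self.actions.append('snapshot:%s' % name)

d = FakeDroplet(status='active')
d.take_snapshot('before-upgrade', power_off=True)
```

A droplet that is already `off` skips the power-off step, just as the `self.status != "off"` guard in the original does.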
Returns dict or Action """ return self._perform_action({'type': 'disable_backups'}, return_dict) def destroy(self): """ Destroy the droplet Returns dict """ return self.get_data("droplets/%s" % self.id, type=DELETE) def rename(self, name, return_dict=True): """Rename the droplet Args: name (str): new name Optional Args: return_dict (bool): Return a dict when True (default), otherwise return an Action. Returns dict or Action """ return self._perform_action( {'type': 'rename', 'name': name}, return_dict ) def enable_private_networking(self, return_dict=True): """ Enable private networking on an existing Droplet where available. Optional Args: return_dict (bool): Return a dict when True (default), otherwise return an Action. Returns dict or Action """ return self._perform_action( {'type': 'enable_private_networking'}, return_dict ) def enable_ipv6(self, return_dict=True): """ Enable IPv6 on an existing Droplet where available. Optional Args: return_dict (bool): Return a dict when True (default), otherwise return an Action. Returns dict or Action """ return self._perform_action({'type': 'enable_ipv6'}, return_dict) def change_kernel(self, kernel, return_dict=True): """Change the kernel to a new one Args: kernel : instance of digitalocean.Kernel.Kernel Optional Args: return_dict (bool): Return a dict when True (default), otherwise return an Action. Returns dict or Action """ if type(kernel) != Kernel: raise BadKernelObject("Use Kernel object") return self._perform_action( {'type': 'change_kernel', 'kernel': kernel.id}, return_dict ) @staticmethod def __get_ssh_keys_id_or_fingerprint(ssh_keys, token, name): """ Check and return a list of SSH key IDs or fingerprints according to DigitalOcean's API. This method is used to check and create a droplet with the correct SSH keys. 
""" ssh_keys_id = list() for ssh_key in ssh_keys: if type(ssh_key) in [int, type(2 ** 64)]: ssh_keys_id.append(int(ssh_key)) elif type(ssh_key) == SSHKey: ssh_keys_id.append(ssh_key.id) elif type(ssh_key) in [type(u''), type('')]: # ssh_key could either be a fingerprint or a public key # # type(u'') and type('') is the same in python 3 but # different in 2. See: # https://github.com/koalalorenzo/python-digitalocean/issues/80 regexp_of_fingerprint = '([0-9a-fA-F]{2}:){15}[0-9a-fA-F]' match = re.match(regexp_of_fingerprint, ssh_key) if match is not None and match.end() == len(ssh_key) - 1: ssh_keys_id.append(ssh_key) else: key = SSHKey() key.token = token results = key.load_by_pub_key(ssh_key) if results is None: key.public_key = ssh_key key.name = "SSH Key %s" % name key.create() else: key = results ssh_keys_id.append(key.id) else: raise BadSSHKeyFormat( "Droplet.ssh_keys should be a list of IDs, public keys" + " or fingerprints." ) return ssh_keys_id def create(self, *args, **kwargs): """ Create the droplet with object properties. Note: Every argument and parameter given to this method will be assigned to the object. 
""" for attr in kwargs.keys(): setattr(self, attr, kwargs[attr]) # Provide backwards compatibility if not self.size_slug and self.size: self.size_slug = self.size ssh_keys_id = Droplet.__get_ssh_keys_id_or_fingerprint(self.ssh_keys, self.token, self.name) data = { "name": self.name, "size": self.size_slug, "image": self.image, "region": self.region, "ssh_keys": ssh_keys_id, "backups": bool(self.backups), "ipv6": bool(self.ipv6), "private_networking": bool(self.private_networking), "volumes": self.volumes, "tags": self.tags, "monitoring": bool(self.monitoring), "vpc_uuid": self.vpc_uuid, } if self.user_data: data["user_data"] = self.user_data data = self.get_data("droplets/", type=POST, params=data) if data: self.id = data['droplet']['id'] action_id = data['links']['actions'][0]['id'] self.action_ids = [] self.action_ids.append(action_id) def get_events(self): """ A helper function for backwards compatibility. Calls get_actions() """ return self.get_actions() def get_actions(self): """ Returns a list of Action objects This actions can be used to check the droplet's status """ answer = self.get_data("droplets/%s/actions/" % self.id, type=GET) actions = [] for action_dict in answer['actions']: action = Action(**action_dict) action.token = self.tokens action.droplet_id = self.id action.load() actions.append(action) return actions def get_action(self, action_id): """Returns a specific Action by its ID. Args: action_id (int): id of action """ return Action.get_object( api_token=self.tokens, action_id=action_id ) def get_snapshots(self): """ This method will return the snapshots/images connected to that specific droplet. 
""" snapshots = list() for id in self.snapshot_ids: snapshot = Image() snapshot.id = id snapshot.token = self.tokens snapshots.append(snapshot) return snapshots def get_kernel_available(self): """ Get a list of kernels available """ kernels = list() data = self.get_data("droplets/%s/kernels/" % self.id) while True: for jsond in data[u'kernels']: kernel = Kernel(**jsond) kernel.token = self.tokens kernels.append(kernel) try: url = data[u'links'][u'pages'].get(u'next') if not url: break data = self.get_data(url) except KeyError: # No links. break return kernels def update_volumes_data(self): """ Trigger volume objects list refresh. When called on a droplet instance, it will take all volumes ids(gathered in initial droplet details collection) and will create list of object of Volume types. Each volume is a separate api call. """ self.volumes = list() for volume_id in self.volume_ids: volume = Volume().get_object(self.tokens, volume_id) self.volumes.append(volume) def __str__(self): return "" % (self.id, self.name) python-digitalocean-1.16.0/digitalocean/Firewall.py000066400000000000000000000204141375555111600223150ustar00rootroot00000000000000# -*- coding: utf-8 -*- from .baseapi import BaseAPI, POST, DELETE, PUT import jsonpickle class _targets(object): """ An internal object that both `Sources` and `Destinations` derive from. Not for direct use by end users. """ def __init__(self, **kwargs): self.addresses = [] self.droplet_ids = [] self.load_balancer_uids = [] self.tags = [] for attr in kwargs.keys(): setattr(self, attr, kwargs[attr]) class Sources(_targets): """ An object holding information about an InboundRule's sources. Args: addresses (obj:`list`): An array of strings containing the IPv4 addresses, IPv6 addresses, IPv4 CIDRs, and/or IPv6 CIDRs to which the Firewall will allow traffic. droplet_ids (obj:`list`): An array containing the IDs of the Droplets to which the Firewall will allow traffic. 
load_balancer_uids (obj:`list`): An array containing the IDs of the Load Balancers to which the Firewall will allow traffic. tags (obj:`list`): An array containing the names of Tags corresponding to groups of Droplets to which the Firewall will allow traffic. """ pass class Destinations(_targets): """ An object holding information about an OutboundRule's destinations. Args: addresses (obj:`list`): An array of strings containing the IPv4 addresses, IPv6 addresses, IPv4 CIDRs, and/or IPv6 CIDRs to which the Firewall will allow traffic. droplet_ids (obj:`list`): An array containing the IDs of the Droplets to which the Firewall will allow traffic. load_balancer_uids (obj:`list`): An array containing the IDs of the Load Balancers to which the Firewall will allow traffic. tags (obj:`list`): An array containing the names of Tags corresponding to groups of Droplets to which the Firewall will allow traffic. """ pass class InboundRule(object): """ An object holding information about a Firewall's inbound rule. Args: protocol (str): The type of traffic to be allowed. This may be one of "tcp", "udp", or "icmp". ports (str): The ports on which traffic will be allowed specified as a string containing a single port, a range (e.g. "8000-9000"), or "all" to open all ports for a protocol. sources (obj): A `Sources` object. """ def __init__(self, protocol="", ports="", sources=""): self.protocol = protocol self.ports = ports if isinstance(sources, Sources): self.sources = sources else: for source in sources: self.sources = Sources(**sources) class OutboundRule(object): """ An object holding information about a Firewall's outbound rule. Args: protocol (str): The type of traffic to be allowed. This may be one of "tcp", "udp", or "icmp". ports (str): The ports on which traffic will be allowed specified as a string containing a single port, a range (e.g. "8000-9000"), or "all" to open all ports for a protocol. destinations (obj): A `Destinations` object. 
""" def __init__(self, protocol="", ports="", destinations=""): self.protocol = protocol self.ports = ports if isinstance(destinations, Destinations): self.destinations = destinations else: for destination in destinations: self.destinations = Destinations(**destinations) class Firewall(BaseAPI): """ An object representing an DigitalOcean Firewall. Attributes accepted at creation time: Args: name (str): The Firewall's name. droplet_ids (obj:`list` of `int`): A list of Droplet IDs to be assigned to the Firewall. tags (obj:`list` of `str`): A list Tag names to be assigned to the Firewall. inbound_rules (obj:`list`): A list of `InboundRules` objects outbound_rules (obj:`list`): A list of `OutboundRules` objects Attributes returned by API: id (str): A UUID to identify and reference a Firewall. status (str): A status string indicating the current state of the Firewall. This can be "waiting", "succeeded", or "failed". created_at (str): The time at which the Firewall was created. name (str): The Firewall's name. pending_changes (obj:`list`): Details exactly which Droplets are having their security policies updated. droplet_ids (obj:`list` of `int`): A list of Droplet IDs to be assigned to the Firewall. tags (obj:`list` of `str`): A list Tag names to be assigned to the Firewall. inbound_rules (obj:`list`): A list of `InboundRules` objects outbound_rules (obj:`list`): A list of `OutboundRules` objects """ def __init__(self, *args, **kwargs): self.id = None self.status = None self.created_at = None self.pending_changes = [] self.name = None self.inbound_rules = [] self.outbound_rules = [] self.droplet_ids = None self.tags = None super(Firewall, self).__init__(*args, **kwargs) @classmethod def get_object(cls, api_token, firewall_id): """ Class method that will return a Firewall object by ID. 
""" firewall = cls(token=api_token, id=firewall_id) firewall.load() return firewall def _set_firewall_attributes(self, data): self.id = data['firewall']['id'] self.name = data['firewall']['name'] self.status = data['firewall']['status'] self.created_at = data['firewall']['created_at'] self.pending_changes = data['firewall']['pending_changes'] self.droplet_ids = data['firewall']['droplet_ids'] self.tags = data['firewall']['tags'] in_rules = list() for rule in data['firewall']['inbound_rules']: in_rules.append(InboundRule(**rule)) self.inbound_rules = in_rules out_rules = list() for rule in data['firewall']['outbound_rules']: out_rules.append(OutboundRule(**rule)) self.outbound_rules = out_rules def load(self): data = self.get_data("firewalls/%s" % self.id) if data: self._set_firewall_attributes(data) return self def create(self, *args, **kwargs): inbound = jsonpickle.encode(self.inbound_rules, unpicklable=False) outbound = jsonpickle.encode(self.outbound_rules, unpicklable=False) params = {'name': self.name, 'droplet_ids': self.droplet_ids, 'inbound_rules': jsonpickle.decode(inbound), 'outbound_rules': jsonpickle.decode(outbound), 'tags': self.tags} data = self.get_data('firewalls/', type=POST, params=params) if data: self._set_firewall_attributes(data) return self def add_droplets(self, droplet_ids): """ Add droplets to this Firewall. """ return self.get_data( "firewalls/%s/droplets" % self.id, type=POST, params={"droplet_ids": droplet_ids} ) def remove_droplets(self, droplet_ids): """ Remove droplets from this Firewall. """ return self.get_data( "firewalls/%s/droplets" % self.id, type=DELETE, params={"droplet_ids": droplet_ids} ) def add_tags(self, tags): """ Add tags to this Firewall. """ return self.get_data( "firewalls/%s/tags" % self.id, type=POST, params={"tags": tags} ) def remove_tags(self, tags): """ Remove tags from this Firewall. 
""" return self.get_data( "firewalls/%s/tags" % self.id, type=DELETE, params={"tags": tags} ) # TODO: Other Firewall calls (Add/Remove rules, Create / Delete etc) def destroy(self): """ Destroy the Firewall """ return self.get_data("firewalls/%s/" % self.id, type=DELETE) def __str__(self): return "" % (self.id, self.name) python-digitalocean-1.16.0/digitalocean/FloatingIP.py000066400000000000000000000060271375555111600225500ustar00rootroot00000000000000# -*- coding: utf-8 -*- from .baseapi import BaseAPI, GET, POST, DELETE class FloatingIP(BaseAPI): def __init__(self, *args, **kwargs): self.ip = None self.droplet = [] self.region = [] super(FloatingIP, self).__init__(*args, **kwargs) @classmethod def get_object(cls, api_token, ip): """ Class method that will return a FloatingIP object by its IP. Args: api_token: str - token ip: str - floating ip address """ floating_ip = cls(token=api_token, ip=ip) floating_ip.load() return floating_ip def load(self): """ Load the FloatingIP object from DigitalOcean. Requires self.ip to be set. """ data = self.get_data('floating_ips/%s' % self.ip, type=GET) floating_ip = data['floating_ip'] # Setting the attribute values for attr in floating_ip.keys(): setattr(self, attr, floating_ip[attr]) return self def create(self, *args, **kwargs): """ Creates a FloatingIP and assigns it to a Droplet. Note: Every argument and parameter given to this method will be assigned to the object. Args: droplet_id: int - droplet id """ data = self.get_data('floating_ips/', type=POST, params={'droplet_id': self.droplet_id}) if data: self.ip = data['floating_ip']['ip'] self.region = data['floating_ip']['region'] return self def reserve(self, *args, **kwargs): """ Creates a FloatingIP in a region without assigning it to a specific Droplet. Note: Every argument and parameter given to this method will be assigned to the object. Args: region_slug: str - region's slug (e.g. 
'nyc3') """ data = self.get_data('floating_ips/', type=POST, params={'region': self.region_slug}) if data: self.ip = data['floating_ip']['ip'] self.region = data['floating_ip']['region'] return self def destroy(self): """ Destroy the FloatingIP """ return self.get_data('floating_ips/%s/' % self.ip, type=DELETE) def assign(self, droplet_id): """ Assign a FloatingIP to a Droplet. Args: droplet_id: int - droplet id """ return self.get_data( "floating_ips/%s/actions/" % self.ip, type=POST, params={"type": "assign", "droplet_id": droplet_id} ) def unassign(self): """ Unassign a FloatingIP. """ return self.get_data( "floating_ips/%s/actions/" % self.ip, type=POST, params={"type": "unassign"} ) def __str__(self): return "%s" % (self.ip) python-digitalocean-1.16.0/digitalocean/Image.py000066400000000000000000000130661375555111600215770ustar00rootroot00000000000000# -*- coding: utf-8 -*- from .baseapi import BaseAPI, POST, DELETE, PUT, NotFoundError class Image(BaseAPI): """ An object representing an DigitalOcean Image. Attributes accepted at creation time: Args: name (str): The name to be given to an image. url (str): A URL from which the virtual machine image may be retrieved. region (str): The slug of the region where the image will be available. distribution (str, optional): The name of the image's distribution. description (str, optional): Free-form text field to describe an image. tags (obj:`list` of `str`, optional): List of tag names to apply to \ the image. Attributes returned by API: * id (int): A unique number to identify and reference a image. * name (str): The display name given to an image. * type (str): The kind of image. This will be either "snapshot", "backup", or "custom". * distribution (str): The name of the image's distribution. * slug (str): A uniquely identifying string that is associated with each \ of the DigitalOcean-provided public images. * public (bool): Indicates whether the image is public or not. 
* regions (obj:`list` of `str`): A list of the slugs of the regions where \ the image is available for use. * created_at (str): A time value given in ISO8601 combined date and time \ format that represents when the image was created. * min_disk_size (int): The minimum disk size in GB required for a Droplet \ to use this image. * size_gigabytes (int): The size of the image in gigabytes. * description (str): Free-form text field to describing an image. * tags (obj:`list` of `str`): List of tag names to applied to the image. * status (str): Indicates the state of a custom image. This may be "NEW", \ "available", "pending", or "deleted". * error_message (str): Information about errors that may occur when \ importing a custom image. """ def __init__(self, *args, **kwargs): self.id = None self.name = None self.distribution = None self.slug = None self.min_disk_size = None self.public = None self.regions = [] self.created_at = None self.size_gigabytes = None self.description = None self.status = None self.tags = [] self.error_message = None self.url = None self.region = None super(Image, self).__init__(*args, **kwargs) @classmethod def get_object(cls, api_token, image_id_or_slug): """ Class method that will return an Image object by ID or slug. This method is used to validate the type of the image. If it is a number, it will be considered as an Image ID, instead if it is a string, it will considered as slug. """ if cls._is_string(image_id_or_slug): image = cls(token=api_token, slug=image_id_or_slug) image.load(use_slug=True) else: image = cls(token=api_token, id=image_id_or_slug) image.load() return image @staticmethod def _is_string(value): """ Checks if the value provided is a string (True) or not integer (False) or something else (None). 
""" if type(value) in [type(u''), type('')]: return True elif type(value) in [int, type(2 ** 64)]: return False else: return None def create(self): """ Creates a new custom DigitalOcean Image from the Linux virtual machine image located at the provided `url`. """ params = {'name': self.name, 'region': self.region, 'url': self.url, 'distribution': self.distribution, 'description': self.description, 'tags': self.tags} data = self.get_data('images', type=POST, params=params) if data: for attr in data['image'].keys(): setattr(self, attr, data['image'][attr]) return self def load(self, use_slug=False): """ Load the Image. Loads by id, or by slug if id is not present or use_slug is True. """ identifier = None if use_slug or not self.id: identifier = self.slug else: identifier = self.id if not identifier: raise NotFoundError("One of self.id or self.slug must be set.") data = self.get_data("images/%s" % identifier) image_dict = data['image'] # Setting the attribute values for attr in image_dict.keys(): setattr(self, attr, image_dict[attr]) return self def destroy(self): """ Destroy the image """ return self.get_data("images/%s/" % self.id, type=DELETE) def transfer(self, new_region_slug): """ Transfer the image """ return self.get_data( "images/%s/actions/" % self.id, type=POST, params={"type": "transfer", "region": new_region_slug} ) def rename(self, new_name): """ Rename an image """ return self.get_data( "images/%s" % self.id, type=PUT, params={"name": new_name} ) def __str__(self): return "<Image: %s %s %s>" % (self.id, self.distribution, self.name) python-digitalocean-1.16.0/digitalocean/Kernel.py # -*- coding: utf-8 -*- from .baseapi import BaseAPI class Kernel(BaseAPI): def __init__(self, *args, **kwargs): self.name = "" self.id = "" self.version = "" super(Kernel, self).__init__(*args, **kwargs) def __str__(self): return "<Kernel: %s %s>" % (self.name, self.version)
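The id-or-slug dispatch used by `Image.get_object` above can be sketched as a standalone helper. This is a Python 3 simplification for illustration only; `classify_image_identifier` is a hypothetical name, not part of the library API:

```python
def classify_image_identifier(value):
    """Mirror of Image._is_string: True means "treat as slug",
    False means "treat as a numeric image ID", None means unsupported type."""
    if isinstance(value, str):
        return True
    # bool is excluded, matching the original's exact type() check
    if isinstance(value, int) and not isinstance(value, bool):
        return False
    return None


# get_object picks the lookup strategy from the identifier's type:
assert classify_image_identifier("ubuntu-20-04-x64") is True   # slug -> load(use_slug=True)
assert classify_image_identifier(63663980) is False            # ID -> load()
assert classify_image_identifier(3.14) is None                 # unsupported
```

The three-valued return mimics the original `_is_string`, which deliberately distinguishes "not a string but a valid ID" from "not usable at all".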
python-digitalocean-1.16.0/digitalocean/LoadBalancer.py # -*- coding: utf-8 -*- from .baseapi import BaseAPI, GET, POST, PUT, DELETE class StickySessions(object): """ An object holding information on a LoadBalancer's sticky sessions settings. Args: type (str): The type of sticky sessions used. Can be "cookies" or "none" cookie_name (str, optional): The name used for the client cookie when using cookies for sticky sessions cookie_ttl_seconds (int, optional): The number of seconds until the cookie expires """ def __init__(self, type='none', cookie_name='', cookie_ttl_seconds=None, **kwargs): self.type = type if type == 'cookies': # Apply the cookie defaults only when the caller did not provide values cookie_name = cookie_name or 'DO-LB' cookie_ttl_seconds = cookie_ttl_seconds or 300 self.cookie_name = cookie_name self.cookie_ttl_seconds = cookie_ttl_seconds class ForwardingRule(object): """ An object holding information about a LoadBalancer forwarding rule setting. Args: entry_protocol (str): The protocol used for traffic to a LoadBalancer. The possible values are: "http", "https", or "tcp" entry_port (int): The port the LoadBalancer instance will listen on target_protocol (str): The protocol used for traffic from a LoadBalancer to the backend Droplets.
The possible values are: "http", "https", or "tcp" target_port (int): The port on the backend Droplets on which the LoadBalancer will send traffic certificate_id (str, optional): The ID of the TLS certificate used for SSL termination if enabled tls_passthrough (bool, optional): A boolean indicating if SSL encrypted traffic will be passed through to the backend Droplets """ def __init__(self, entry_protocol=None, entry_port=None, target_protocol=None, target_port=None, certificate_id="", tls_passthrough=False): self.entry_protocol = entry_protocol self.entry_port = entry_port self.target_protocol = target_protocol self.target_port = target_port self.certificate_id = certificate_id self.tls_passthrough = tls_passthrough class HealthCheck(object): """ An object holding information about a LoadBalancer's health check settings. Args: protocol (str): The protocol used for health checks. The possible values are "http" or "tcp". port (int): The port on the backend Droplets for health checks path (str): The path to send a health check request to check_interval_seconds (int): The number of seconds between two consecutive health checks response_timeout_seconds (int): The number of seconds the Load Balancer instance will wait for a response until marking a check as failed healthy_threshold (int): The number of times a health check must pass for a backend Droplet to be re-added to the pool unhealthy_threshold (int): The number of times a health check must fail for a backend Droplet to be removed from the pool """ def __init__(self, protocol='http', port=80, path='/', check_interval_seconds=10, response_timeout_seconds=5, healthy_threshold=5, unhealthy_threshold=3): self.protocol = protocol self.port = port self.path = path self.check_interval_seconds = check_interval_seconds self.response_timeout_seconds = response_timeout_seconds self.healthy_threshold = healthy_threshold self.unhealthy_threshold = unhealthy_threshold class LoadBalancer(BaseAPI): """ An object
representing a DigitalOcean Load Balancer. Attributes accepted at creation time: Args: name (str): The Load Balancer's name region (str): The slug identifier for a DigitalOcean region algorithm (str, optional): The load balancing algorithm to be \ used. Currently, it must be either "round_robin" or \ "least_connections" forwarding_rules (obj:`list`): A list of `ForwardingRule` objects health_check (obj, optional): A `HealthCheck` object sticky_sessions (obj, optional): A `StickySessions` object redirect_http_to_https (bool, optional): A boolean indicating \ whether HTTP requests to the Load Balancer should be \ redirected to HTTPS droplet_ids (obj:`list` of `int`): A list of IDs representing \ Droplets to be added to the Load Balancer (mutually \ exclusive with 'tag') tag (str): A string representing a DigitalOcean Droplet tag \ (mutually exclusive with 'droplet_ids') vpc_uuid (str): ID of a VPC in which the Load Balancer will be created Attributes returned by API: * name (str): The Load Balancer's name * id (str): A unique identifier for a LoadBalancer * ip (str): Public IP address for a LoadBalancer * region (str): The slug identifier for a DigitalOcean region * algorithm (str, optional): The load balancing algorithm to be \ used. Currently, it must be either "round_robin" or \ "least_connections" * forwarding_rules (obj:`list`): A list of `ForwardingRule` objects * health_check (obj, optional): A `HealthCheck` object * sticky_sessions (obj, optional): A `StickySessions` object * redirect_http_to_https (bool, optional): A boolean indicating \ whether HTTP requests to the Load Balancer should be \ redirected to HTTPS * droplet_ids (obj:`list` of `int`): A list of IDs representing \ Droplets to be added to the Load Balancer * tag (str): A string representing a DigitalOcean Droplet tag * status (str): An indication of the current state of the LoadBalancer * created_at (str): The date and time when the LoadBalancer was created * vpc_uuid (str): ID of the VPC to which the Load Balancer is assigned """ def __init__(self, *args, **kwargs): self.id = None self.name = None self.region = None self.algorithm = None self.forwarding_rules = [] self.health_check = None self.sticky_sessions = None self.redirect_http_to_https = False self.droplet_ids = [] self.tag = None self.status = None self.created_at = None self.vpc_uuid = None super(LoadBalancer, self).__init__(*args, **kwargs) @classmethod def get_object(cls, api_token, id): """ Class method that will return a LoadBalancer object by its ID. Args: api_token (str): DigitalOcean API token id (str): Load Balancer ID """ load_balancer = cls(token=api_token, id=id) load_balancer.load() return load_balancer def load(self): """ Loads updated attributes for a LoadBalancer object. Requires self.id to be set.
""" data = self.get_data('load_balancers/%s' % self.id, type=GET) load_balancer = data['load_balancer'] # Setting the attribute values for attr in load_balancer.keys(): if attr == 'health_check': health_check = HealthCheck(**load_balancer['health_check']) setattr(self, attr, health_check) elif attr == 'sticky_sessions': sticky_ses = StickySessions(**load_balancer['sticky_sessions']) setattr(self, attr, sticky_ses) elif attr == 'forwarding_rules': rules = list() for rule in load_balancer['forwarding_rules']: rules.append(ForwardingRule(**rule)) setattr(self, attr, rules) else: setattr(self, attr, load_balancer[attr]) return self def create(self, *args, **kwargs): """ Creates a new LoadBalancer. Note: Every argument and parameter given to this method will be assigned to the object. Args: name (str): The Load Balancer's name region (str): The slug identifier for a DigitalOcean region algorithm (str, optional): The load balancing algorithm to be used. Currently, it must be either "round_robin" or "least_connections" forwarding_rules (obj:`list`): A list of `ForwardingRule` objects health_check (obj, optional): A `HealthCheck` object sticky_sessions (obj, optional): A `StickySessions` object redirect_http_to_https (bool, optional): A boolean indicating whether HTTP requests to the Load Balancer should be redirected to HTTPS droplet_ids (obj:`list` of `int`): A list of IDs representing Droplets to be added to the Load Balancer (mutually exclusive with 'tag') tag (str): A string representing a DigitalOcean Droplet tag (mutually exclusive with 'droplet_ids') vpc_uuid (str): ID of the VPC in which the Load Balancer will be created """ rules_dict = [rule.__dict__ for rule in self.forwarding_rules] params = {'name': self.name, 'region': self.region, 'forwarding_rules': rules_dict, 'redirect_http_to_https': self.redirect_http_to_https, 'vpc_uuid': self.vpc_uuid} if self.droplet_ids and self.tag: raise ValueError('droplet_ids and tag are mutually exclusive args') elif
self.tag: params['tag'] = self.tag else: params['droplet_ids'] = self.droplet_ids if self.algorithm: params['algorithm'] = self.algorithm if self.health_check: params['health_check'] = self.health_check.__dict__ if self.sticky_sessions: params['sticky_sessions'] = self.sticky_sessions.__dict__ data = self.get_data('load_balancers', type=POST, params=params) if data: self.id = data['load_balancer']['id'] self.ip = data['load_balancer']['ip'] self.algorithm = data['load_balancer']['algorithm'] self.health_check = HealthCheck( **data['load_balancer']['health_check']) self.sticky_sessions = StickySessions( **data['load_balancer']['sticky_sessions']) self.droplet_ids = data['load_balancer']['droplet_ids'] self.status = data['load_balancer']['status'] self.created_at = data['load_balancer']['created_at'] self.vpc_uuid = data['load_balancer']['vpc_uuid'] return self def save(self): """ Save the LoadBalancer """ forwarding_rules = [rule.__dict__ for rule in self.forwarding_rules] data = { 'name': self.name, 'region': self.region['slug'], 'forwarding_rules': forwarding_rules, 'redirect_http_to_https': self.redirect_http_to_https, 'vpc_uuid': self.vpc_uuid } if self.tag: data['tag'] = self.tag else: data['droplet_ids'] = self.droplet_ids if self.algorithm: data["algorithm"] = self.algorithm if self.health_check: data['health_check'] = self.health_check.__dict__ if self.sticky_sessions: data['sticky_sessions'] = self.sticky_sessions.__dict__ return self.get_data("load_balancers/%s" % self.id, type=PUT, params=data) def destroy(self): """ Destroy the LoadBalancer """ return self.get_data('load_balancers/%s' % self.id, type=DELETE) def add_droplets(self, droplet_ids): """ Assign a LoadBalancer to a Droplet. Args: droplet_ids (obj:`list` of `int`): A list of Droplet IDs """ return self.get_data( "load_balancers/%s/droplets" % self.id, type=POST, params={"droplet_ids": droplet_ids} ) def remove_droplets(self, droplet_ids): """ Unassign a LoadBalancer. 
Args: droplet_ids (obj:`list` of `int`): A list of Droplet IDs """ return self.get_data( "load_balancers/%s/droplets" % self.id, type=DELETE, params={"droplet_ids": droplet_ids} ) def add_forwarding_rules(self, forwarding_rules): """ Adds new forwarding rules to a LoadBalancer. Args: forwarding_rules (obj:`list`): A list of `ForwardingRule` objects """ rules_dict = [rule.__dict__ for rule in forwarding_rules] return self.get_data( "load_balancers/%s/forwarding_rules" % self.id, type=POST, params={"forwarding_rules": rules_dict} ) def remove_forwarding_rules(self, forwarding_rules): """ Removes existing forwarding rules from a LoadBalancer. Args: forwarding_rules (obj:`list`): A list of `ForwardingRule` objects """ rules_dict = [rule.__dict__ for rule in forwarding_rules] return self.get_data( "load_balancers/%s/forwarding_rules" % self.id, type=DELETE, params={"forwarding_rules": rules_dict} ) def __str__(self): return "%s" % (self.id) python-digitalocean-1.16.0/digitalocean/Manager.py # -*- coding: utf-8 -*- try: from urlparse import urlparse, parse_qs except ImportError: from urllib.parse import urlparse, parse_qs # noqa from .baseapi import BaseAPI from .Account import Account from .Action import Action from .Balance import Balance from .Certificate import Certificate from .Domain import Domain from .Droplet import Droplet from .FloatingIP import FloatingIP from .Firewall import Firewall, InboundRule, OutboundRule from .Image import Image from .LoadBalancer import LoadBalancer from .LoadBalancer import StickySessions, HealthCheck, ForwardingRule from .Region import Region from .SSHKey import SSHKey from .Size import Size from .Snapshot import Snapshot from .Tag import Tag from .Volume import Volume from .VPC import VPC from .Project import Project class Manager(BaseAPI): def __init__(self, *args, **kwargs): super(Manager, self).__init__(*args, **kwargs) def get_account(self): """
Returns an Account object. """ return Account.get_object(api_token=self.tokens) def get_balance(self): """ Returns a Balance object. """ return Balance.get_object(api_token=self.token) def get_all_regions(self): """ This function returns a list of Region object. """ data = self.get_data("regions/") regions = list() for jsoned in data['regions']: region = Region(**jsoned) region.token = self.tokens regions.append(region) return regions def get_all_droplets(self, params=None, tag_name=None): """ This function returns a list of Droplet object. """ if params is None: params = dict() if tag_name: params["tag_name"] = tag_name data = self.get_data("droplets/", params=params) droplets = list() for jsoned in data['droplets']: droplet = Droplet(**jsoned) droplet.token = self.tokens for net in droplet.networks['v4']: if net['type'] == 'private': droplet.private_ip_address = net['ip_address'] if net['type'] == 'public': droplet.ip_address = net['ip_address'] if droplet.networks['v6']: droplet.ip_v6_address = droplet.networks['v6'][0]['ip_address'] if "backups" in droplet.features: droplet.backups = True else: droplet.backups = False if "ipv6" in droplet.features: droplet.ipv6 = True else: droplet.ipv6 = False if "private_networking" in droplet.features: droplet.private_networking = True else: droplet.private_networking = False droplets.append(droplet) return droplets def get_droplet(self, droplet_id): """ Return a Droplet by its ID. """ return Droplet.get_object(api_token=self.tokens, droplet_id=droplet_id) def get_all_sizes(self): """ This function returns a list of Size object. """ data = self.get_data("sizes/") sizes = list() for jsoned in data['sizes']: size = Size(**jsoned) size.token = self.tokens sizes.append(size) return sizes def get_images(self, private=False, type=None): """ This function returns a list of Image object. 
""" params = {} if private: params['private'] = 'true' if type: params['type'] = type data = self.get_data("images/", params=params) images = list() for jsoned in data['images']: image = Image(**jsoned) image.token = self.tokens images.append(image) return images def get_all_images(self): """ This function returns a list of Image objects containing all available DigitalOcean images, both public and private. """ images = self.get_images() return images def get_image(self, image_id_or_slug): """ Return an Image by its ID/Slug. """ return Image.get_object( api_token=self.tokens, image_id_or_slug=image_id_or_slug, ) def get_my_images(self): """ This function returns a list of Image objects representing private DigitalOcean images (e.g. snapshots and backups). """ images = self.get_images(private=True) return images def get_global_images(self): """ This function returns a list of Image objects representing public DigitalOcean images (e.g. base distribution images and 'One-Click' applications). """ data = self.get_images() images = list() for i in data: if i.public: i.token = self.tokens images.append(i) return images def get_distro_images(self): """ This function returns a list of Image objects representing public base distribution images. """ images = self.get_images(type='distribution') return images def get_app_images(self): """ This function returns a list of Image objects representing public DigitalOcean 'One-Click' application images. """ images = self.get_images(type='application') return images def get_all_domains(self): """ This function returns a list of Domain objects.
""" data = self.get_data("domains/") domains = list() for jsoned in data['domains']: domain = Domain(**jsoned) domain.token = self.tokens domains.append(domain) return domains def get_domain(self, domain_name): """ Return a Domain by its domain_name """ return Domain.get_object(api_token=self.tokens, domain_name=domain_name) def get_all_sshkeys(self): """ This function returns a list of SSHKey objects. """ data = self.get_data("account/keys/") ssh_keys = list() for jsoned in data['ssh_keys']: ssh_key = SSHKey(**jsoned) ssh_key.token = self.tokens ssh_keys.append(ssh_key) return ssh_keys def get_ssh_key(self, ssh_key_id): """ Return a SSHKey object by its ID. """ return SSHKey.get_object(api_token=self.tokens, ssh_key_id=ssh_key_id) def get_all_tags(self): """ This method returns a list of all tags. """ data = self.get_data("tags") return [ Tag(token=self.token, **tag) for tag in data['tags'] ] def get_action(self, action_id): """ Return an Action object by a specific ID. """ return Action.get_object(api_token=self.tokens, action_id=action_id) def get_all_floating_ips(self): """ This function returns a list of FloatingIP objects. """ data = self.get_data("floating_ips") floating_ips = list() for jsoned in data['floating_ips']: floating_ip = FloatingIP(**jsoned) floating_ip.token = self.tokens floating_ips.append(floating_ip) return floating_ips def get_floating_ip(self, ip): """ Returns a FloatingIP object by its IP address. """ return FloatingIP.get_object(api_token=self.tokens, ip=ip) def get_all_load_balancers(self): """ Returns a list of Load Balancer objects.
""" data = self.get_data("load_balancers") load_balancers = list() for jsoned in data['load_balancers']: load_balancer = LoadBalancer(**jsoned) load_balancer.token = self.tokens load_balancer.health_check = HealthCheck(**jsoned['health_check']) load_balancer.sticky_sessions = StickySessions(**jsoned['sticky_sessions']) forwarding_rules = list() for rule in jsoned['forwarding_rules']: forwarding_rules.append(ForwardingRule(**rule)) load_balancer.forwarding_rules = forwarding_rules load_balancers.append(load_balancer) return load_balancers def get_load_balancer(self, id): """ Returns a Load Balancer object by its ID. Args: id (str): Load Balancer ID """ return LoadBalancer.get_object(api_token=self.tokens, id=id) def get_certificate(self, id): """ Returns a Certificate object by its ID. Args: id (str): Certificate ID """ return Certificate.get_object(api_token=self.tokens, cert_id=id) def get_all_certificates(self): """ This function returns a list of Certificate objects. """ data = self.get_data("certificates") certificates = list() for jsoned in data['certificates']: cert = Certificate(**jsoned) cert.token = self.tokens certificates.append(cert) return certificates def get_snapshot(self, snapshot_id): """ Return a Snapshot by its ID. """ return Snapshot.get_object( api_token=self.tokens, snapshot_id=snapshot_id ) def get_all_snapshots(self): """ This method returns a list of all Snapshots. """ data = self.get_data("snapshots/") return [ Snapshot(token=self.tokens, **snapshot) for snapshot in data['snapshots'] ] def get_droplet_snapshots(self): """ This method returns a list of all Snapshots based on Droplets. """ data = self.get_data("snapshots?resource_type=droplet") return [ Snapshot(token=self.tokens, **snapshot) for snapshot in data['snapshots'] ] def get_volume_snapshots(self): """ This method returns a list of all Snapshots based on volumes. 
""" data = self.get_data("snapshots?resource_type=volume") return [ Snapshot(token=self.tokens, **snapshot) for snapshot in data['snapshots'] ] def get_all_volumes(self, region=None): """ This function returns a list of Volume objects. """ if region: url = "volumes?region={}".format(region) else: url = "volumes" data = self.get_data(url) volumes = list() for jsoned in data['volumes']: volume = Volume(**jsoned) volume.token = self.tokens volumes.append(volume) return volumes def get_volume(self, volume_id): """ Returns a Volume object by its ID. """ return Volume.get_object(api_token=self.tokens, volume_id=volume_id) def get_all_projects(self): """ All the projects of the account """ data = self.get_data("projects") projects = list() for jsoned in data['projects']: project = Project(**jsoned) project.token = self.token projects.append(project) return projects def get_project(self, project_id): """ Return a Project by its ID. """ return Project.get_object( api_token=self.token, project_id=project_id, ) def get_default_project(self): """ Return default project of the account """ return Project.get_object( api_token=self.token, project_id="default", ) def get_all_firewalls(self): """ This function returns a list of Firewall objects. """ data = self.get_data("firewalls") firewalls = list() for jsoned in data['firewalls']: firewall = Firewall(**jsoned) firewall.token = self.tokens in_rules = list() for rule in jsoned['inbound_rules']: in_rules.append(InboundRule(**rule)) firewall.inbound_rules = in_rules out_rules = list() for rule in jsoned['outbound_rules']: out_rules.append(OutboundRule(**rule)) firewall.outbound_rules = out_rules firewalls.append(firewall) return firewalls def get_firewall(self, firewall_id): """ Return a Firewall by its ID. """ return Firewall.get_object( api_token=self.tokens, firewall_id=firewall_id, ) def get_vpc(self, id): """ Returns a VPC object by its ID. 
Args: id (str): The VPC's ID """ return VPC.get_object(api_token=self.token, vpc_id=id) def get_all_vpcs(self): """ This function returns a list of VPC objects. """ data = self.get_data("vpcs") vpcs = list() for jsoned in data['vpcs']: vpc = VPC(**jsoned) vpc.token = self.token vpcs.append(vpc) return vpcs def __str__(self): return "<Manager>" python-digitalocean-1.16.0/digitalocean/Metadata.py # -*- coding: utf-8 -*- import requests try: from urlparse import urljoin except ImportError: from urllib.parse import urljoin from .baseapi import BaseAPI class Metadata(BaseAPI): """ Metadata API: Provide useful information about the current Droplet. See: https://developers.digitalocean.com/metadata/#introduction """ droplet_id = None end_point = "http://169.254.169.254/metadata/v1" def __init__(self, *args, **kwargs): super(Metadata, self).__init__(*args, **kwargs) self.end_point = "http://169.254.169.254/metadata/v1" def get_data(self, url, headers=dict(), params=dict(), render_json=True): """ Customized version of get_data to directly get the data without using the authentication method.
""" url = urljoin(self.end_point, url) response = requests.get(url, headers=headers, params=params, timeout=self.get_timeout()) if render_json: return response.json() return response.content def load(self): metadata = self.get_data("v1.json") for attr in metadata.keys(): setattr(self, attr, metadata[attr]) return self def __str__(self): return "<Metadata: %s>" % (self.droplet_id) python-digitalocean-1.16.0/digitalocean/Project.py from .baseapi import BaseAPI, GET, POST, DELETE, PUT class Project(BaseAPI): def __init__(self, *args, **kwargs): self.name = None self.description = None self.purpose = None self.environment = None self.id = None self.is_default = None self.owner_uuid = None self.owner_id = None self.created_at = None self.updated_at = None self.resources = None super(Project, self).__init__(*args, **kwargs) @classmethod def get_object(cls, api_token, project_id): """Class method that will return a Project object by ID.
Args: api_token (str): token project_id (str): project ID """ project = cls(token=api_token, id=project_id) project.load() return project def load(self): # URL https://api.digitalocean.com/v2/projects project = self.get_data("projects/%s" % self.id) project = project['project'] for attr in project.keys(): setattr(self, attr, project[attr]) def set_as_default_project(self): data = { "name": self.name, "description": self.description, "purpose": self.purpose, "environment": self.environment, "is_default": True } project = self.get_data("projects/%s" % self.id, type=PUT, params=data) return project def create_project(self): """Create a Project with the following arguments Args: api_token (str): token "name": Name of the Project - Required "description": Description of the Project - Optional "purpose": Purpose of the project - Required "environment": Related Environment of Project - Optional - Development - Staging - Production """ data = { "name": self.name, "purpose": self.purpose } if self.description: data['description'] = self.description if self.environment: data['environment'] = self.environment data = self.get_data("projects", type=POST, params=data) if data: self.id = data['project']['id'] self.owner_uuid = data['project']['owner_uuid'] self.owner_id = data['project']['owner_id'] self.name = data['project']['name'] self.description = data['project']['description'] self.purpose = data['project']['purpose'] self.environment = data['project']['environment'] self.is_default = data['project']['is_default'] self.created_at = data['project']['created_at'] self.updated_at = data['project']['updated_at'] def delete_project(self): data = dict() return self.get_data("projects/%s" % self.id, type=DELETE, params=data) def update_project(self, **kwargs): data = dict() data['name'] = kwargs.get("name", self.name) data['description'] = kwargs.get("description", self.description) data['purpose'] = kwargs.get("purpose", self.purpose) """ Options for Purpose by Digital Ocean - Just Trying out DigitalOcean - Class Project / Educational Purposes - Website or blog - Web Application - Service or API - Mobile Application - Machine Learning / AI / Data Processing - IoT - Operational / Developer tooling - Other """ data['environment'] = kwargs.get("environment", self.environment) """ Options for Environment by Digital Ocean - Development - Staging - Production """ data['is_default'] = kwargs.get("is_default", self.is_default) update_response = self.get_data("projects/%s" % self.id, type=PUT, params=data) for attr in update_response['project'].keys(): setattr(self, attr, update_response['project'][attr]) def get_all_resources(self): project_resources_response = self.get_data("projects/%s/resources" % self.id) project_resources = project_resources_response['resources'] self.resources = [] for i in project_resources: self.resources.append(i['urn']) return self.resources def load_resources(self): project_resources_response = self.get_data("projects/%s/resources" % self.id) project_resources = project_resources_response['resources'] self.resources = [] for i in project_resources: self.resources.append(i['urn']) def assign_resource(self, resources): data = { 'resources': resources } return self.get_data("projects/%s/resources" % self.id, type=POST, params=data) def __str__(self): return "<Project: %s>" % self.id python-digitalocean-1.16.0/digitalocean/Record.py # -*- coding: utf-8 -*- from .baseapi import BaseAPI, POST, DELETE, PUT class Record(BaseAPI): """ An object representing a DigitalOcean Domain Record. Args: type (str): The type of the DNS record (e.g. A, CNAME, TXT). name (str): The host name, alias, or service being defined by the record. data (int): Variable data depending on record type. priority (int): The priority for SRV and MX records. port (int): The port for SRV records. ttl (int): The time to live for the record, in seconds.
weight (int): The weight for SRV records. flags (int): An unsigned integer between 0-255 used for CAA records. tags (string): The parameter tag for CAA records. Valid values are "issue", "wildissue", or "iodef" """ def __init__(self, domain_name=None, *args, **kwargs): self.domain = domain_name if domain_name else "" self.id = None self.type = None self.name = None self.data = None self.priority = None self.port = None self.ttl = None self.weight = None self.flags = None self.tags = None super(Record, self).__init__(*args, **kwargs) @classmethod def get_object(cls, api_token, domain, record_id): """ Class method that will return a Record object by ID and the domain. """ record = cls(token=api_token, domain=domain, id=record_id) record.load() return record def create(self): """ Creates a new record for a domain. Args: type (str): The type of the DNS record (e.g. A, CNAME, TXT). name (str): The host name, alias, or service being defined by the record. data (int): Variable data depending on record type. priority (int): The priority for SRV and MX records. port (int): The port for SRV records. ttl (int): The time to live for the record, in seconds. weight (int): The weight for SRV records. flags (int): An unsigned integer between 0-255 used for CAA records. tags (string): The parameter tag for CAA records. 
Valid values are "issue", "wildissue", or "iodef" """ input_params = { "type": self.type, "data": self.data, "name": self.name, "priority": self.priority, "port": self.port, "ttl": self.ttl, "weight": self.weight, "flags": self.flags, "tags": self.tags } data = self.get_data( "domains/%s/records" % (self.domain), type=POST, params=input_params, ) if data: self.id = data['domain_record']['id'] def destroy(self): """ Destroy the record """ return self.get_data( "domains/%s/records/%s" % (self.domain, self.id), type=DELETE, ) def save(self): """ Save existing record """ data = { "type": self.type, "data": self.data, "name": self.name, "priority": self.priority, "port": self.port, "ttl": self.ttl, "weight": self.weight, "flags": self.flags, "tags": self.tags } return self.get_data( "domains/%s/records/%s" % (self.domain, self.id), type=PUT, params=data ) def load(self): url = "domains/%s/records/%s" % (self.domain, self.id) record = self.get_data(url) if record: record = record[u'domain_record'] # Setting the attribute values for attr in record.keys(): setattr(self, attr, record[attr]) def __str__(self): return "<Record: %s %s>" % (self.id, self.domain) python-digitalocean-1.16.0/digitalocean/Region.py # -*- coding: utf-8 -*- from .baseapi import BaseAPI class Region(BaseAPI): def __init__(self, *args, **kwargs): self.name = None self.slug = None self.sizes = [] self.available = None self.features = [] super(Region, self).__init__(*args, **kwargs) def __str__(self): return "<Region: %s %s>" % (self.slug, self.name) python-digitalocean-1.16.0/digitalocean/SSHKey.py # -*- coding: utf-8 -*- from .baseapi import BaseAPI, GET, POST, DELETE, PUT class SSHKey(BaseAPI): def __init__(self, *args, **kwargs): self.id = "" self.name = None self.public_key = None self.fingerprint = None super(SSHKey, self).__init__(*args, **kwargs) @classmethod def
get_object(cls, api_token, ssh_key_id): """ Class method that will return an SSHKey object by ID. """ ssh_key = cls(token=api_token, id=ssh_key_id) ssh_key.load() return ssh_key def load(self): """ Load the SSHKey object from DigitalOcean. Requires either self.id or self.fingerprint to be set. """ identifier = None if self.id: identifier = self.id elif self.fingerprint is not None: identifier = self.fingerprint data = self.get_data("account/keys/%s" % identifier, type=GET) ssh_key = data['ssh_key'] # Setting the attribute values for attr in ssh_key.keys(): setattr(self, attr, ssh_key[attr]) self.id = ssh_key['id'] def load_by_pub_key(self, public_key): """ This method will load an SSHKey object from DigitalOcean from a public_key. This method will avoid problems like uploading the same public_key twice. """ data = self.get_data("account/keys/") for jsoned in data['ssh_keys']: if jsoned.get('public_key', "") == public_key: self.id = jsoned['id'] self.load() return self return None def create(self): """ Create the SSH Key """ input_params = { "name": self.name, "public_key": self.public_key, } data = self.get_data("account/keys/", type=POST, params=input_params) if data: self.id = data['ssh_key']['id'] def edit(self): """ Edit the SSH Key """ input_params = { "name": self.name, "public_key": self.public_key, } data = self.get_data( "account/keys/%s" % self.id, type=PUT, params=input_params ) if data: self.id = data['ssh_key']['id'] def destroy(self): """ Destroy the SSH Key """ return self.get_data("account/keys/%s" % self.id, type=DELETE) def __str__(self): return "<SSHKey: %s %s>" % (self.id, self.name) python-digitalocean-1.16.0/digitalocean/Size.py000066400000000000000000000007051375555111600214630ustar00rootroot00000000000000# -*- coding: utf-8 -*- from .baseapi import BaseAPI class Size(BaseAPI): def __init__(self, *args, **kwargs): self.slug = None self.memory = None self.vcpus = None self.disk = None self.transfer = None self.price_monthly = None self.price_hourly = None
self.regions = [] super(Size, self).__init__(*args, **kwargs) def __str__(self): return "%s" % (self.slug) python-digitalocean-1.16.0/digitalocean/Snapshot.py000066400000000000000000000022611375555111600223470ustar00rootroot00000000000000# -*- coding: utf-8 -*- from .baseapi import BaseAPI, POST, DELETE, PUT class Snapshot(BaseAPI): def __init__(self, *args, **kwargs): self.id = None self.name = None self.created_at = None self.regions = [] self.resource_id = None self.resource_type = None self.min_disk_size = None self.size_gigabytes = None super(Snapshot, self).__init__(*args, **kwargs) @classmethod def get_object(cls, api_token, snapshot_id): """ Class method that will return a Snapshot object by ID. """ snapshot = cls(token=api_token, id=snapshot_id) snapshot.load() return snapshot def load(self): data = self.get_data("snapshots/%s" % self.id) snapshot_dict = data['snapshot'] # Setting the attribute values for attr in snapshot_dict.keys(): setattr(self, attr, snapshot_dict[attr]) return self def destroy(self): """ Destroy the snapshot """ return self.get_data("snapshots/%s/" % self.id, type=DELETE) def __str__(self): return "<Snapshot: %s %s>" % (self.id, self.name) python-digitalocean-1.16.0/digitalocean/Tag.py000066400000000000000000000132261375555111600212660ustar00rootroot00000000000000from .baseapi import BaseAPI from .Droplet import Droplet from .Snapshot import Snapshot class Tag(BaseAPI): def __init__(self, *args, **kwargs): self.name = "" self.resources = {} super(Tag, self).__init__(*args, **kwargs) @classmethod def get_object(cls, api_token, tag_name): tag = cls(token=api_token, name=tag_name) tag.load() return tag def load(self): """ Fetch data about the tag """ tags = self.get_data("tags/%s" % self.name) tag = tags['tag'] for attr in tag.keys(): setattr(self, attr, tag[attr]) return self def create(self, **kwargs): """ Create the tag.
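As a sketch of how the tagging payload is assembled (mirroring the private `__build_resources_field` helper in `Tag`): ids given as strings, ints, or objects exposing an `id` attribute are normalized into the `resources` array the DigitalOcean tag API expects. `FakeDroplet` is a hypothetical stand-in for `digitalocean.Droplet`, used only so the sketch is self-contained.

```python
class FakeDroplet:
    """Hypothetical stand-in exposing the .id attribute the helper reads."""
    def __init__(self, id):
        self.id = id

def build_resources_field(resources_to_tag, object_class, resource_type):
    # Normalize str/int ids or full objects into the
    # [{"resource_id": ..., "resource_type": ...}, ...] shape.
    resources_field = []
    for item in resources_to_tag:
        if isinstance(item, (str, int)):
            res = {"resource_id": str(item)}
        elif isinstance(item, object_class):
            res = {"resource_id": str(item.id)}
        else:
            continue  # silently skip unknown types, as the helper does
        res["resource_type"] = resource_type
        resources_field.append(res)
    return resources_field

payload = build_resources_field([123, "456", FakeDroplet(789)],
                                FakeDroplet, "droplet")
```

The same normalization backs `add_droplets`, `remove_droplets`, `add_snapshots`, and `remove_snapshots`, with `resource_type` set to "droplet" or "volume_snapshot" respectively.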
""" for attr in kwargs.keys(): setattr(self, attr, kwargs[attr]) params = {"name": self.name} output = self.get_data("tags", type="POST", params=params) if output: self.name = output['tag']['name'] self.resources = output['tag']['resources'] def delete(self): return self.get_data("tags/%s" % self.name, type="DELETE") def __get_resources(self, resources, method): """ Method used to talk directly to the API (TAGs' Resources) """ tagged = self.get_data( 'tags/%s/resources' % self.name, params={ "resources": resources }, type=method, ) return tagged def __add_resources(self, resources): """ Add the resources to this tag. Attributes accepted at creation time: resources: array - See API. """ return self.__get_resources(resources, method='POST') def __remove_resources(self, resources): """ Remove resources from this tag. Attributes accepted at creation time: resources: array - See API. """ return self.__get_resources(resources, method='DELETE') def __build_resources_field(self, resources_to_tag, object_class, resource_type): """ Private method to build the `resources` field used to tag/untag DO resources. Returns an array of objects containing two fields: resource_id and resource_type. It checks the type of objects in the 1st argument and build the right structure for the API. It accepts array of strings, array of ints and array of the object type defined by object_class arg. The 3rd argument specify the resource type as defined by DO API (like droplet, image, volume or volume_snapshot). 
See: https://developers.digitalocean.com/documentation/v2/#tag-a-resource """ resources_field = [] if not isinstance(resources_to_tag, list): return resources_to_tag for resource_to_tag in resources_to_tag: res = {} try: if isinstance(resource_to_tag, unicode): res = {"resource_id": resource_to_tag} except NameError: pass if isinstance(resource_to_tag, str) or isinstance(resource_to_tag, int): res = {"resource_id": str(resource_to_tag)} elif isinstance(resource_to_tag, object_class): res = {"resource_id": str(resource_to_tag.id)} if len(res) > 0: res["resource_type"] = resource_type resources_field.append(res) return resources_field def add_droplets(self, droplet): """ Add the Tag to a Droplet. Attributes accepted at creation time: droplet: array of string or array of int, or array of Droplets. """ droplets = droplet if not isinstance(droplets, list): droplets = [droplet] # Extracting data from the Droplet object resources = self.__build_resources_field(droplets, Droplet, "droplet") if len(resources) > 0: return self.__add_resources(resources) return False def remove_droplets(self, droplet): """ Remove the Tag from the Droplet. Attributes accepted at creation time: droplet: array of string or array of int, or array of Droplets. """ droplets = droplet if not isinstance(droplets, list): droplets = [droplet] # Build resources field from the Droplet objects resources = self.__build_resources_field(droplets, Droplet, "droplet") if len(resources) > 0: return self.__remove_resources(resources) return False def add_snapshots(self, snapshots): """ Add the Tag to the Snapshot. Attributes accepted at creation time: snapshots: array of string or array of int or array of Snapshot. """ if not isinstance(snapshots, list): snapshots = [snapshots] resources = self.__build_resources_field(snapshots, Snapshot, "volume_snapshot") if len(resources) > 0: return self.__add_resources(resources) return False def remove_snapshots(self, snapshots): """ remove the Tag from the Snapshot. 
Attributes accepted at creation time: snapshots: array of string or array of int or array of Snapshot. """ if not isinstance(snapshots, list): snapshots = [snapshots] resources = self.__build_resources_field(snapshots, Snapshot, "volume_snapshot") if len(resources) > 0: return self.__remove_resources(resources) return False python-digitalocean-1.16.0/digitalocean/VPC.py000066400000000000000000000073531375555111600212070ustar00rootroot00000000000000# -*- coding: utf-8 -*- from .baseapi import BaseAPI, PATCH, POST, DELETE class VPC(BaseAPI): """ An object representing a DigitalOcean VPC. Attributes accepted at creation time: Args: name (str): A name for the VPC region (str): The slug for the region where the VPC will be created description(str): A free-form text field for describing the VPC ip_range (str): The requested range of IP addresses for the VPC in \ CIDR notation Attributes returned by API: * id (str): A unique identifier for the VPC * name (str): The name of the VPC * region (str): The slug for the region where the VPC is located * description(str): A free-form text field for describing the VPC * ip_range (str): The requested range of IP addresses for the VPC in \ CIDR notation * urn (str): The uniform resource name (URN) for the VPC * created_at (str): A string that represents when the VPC was created * default (bool): A boolen representing whether or not the VPC is the \ user's default VPC for the region """ def __init__(self, *args, **kwargs): self.id = "" self.name = None self.region = None self.description = None self.ip_range = None self.urn = None self.created_at = None self.default = False super(VPC, self).__init__(*args, **kwargs) @classmethod def get_object(cls, api_token, vpc_id): """ Class method that will return a VPC object by its ID. """ vpc = cls(token=api_token, id=vpc_id) vpc.load() return vpc def load(self): """ Load the VPC object from DigitalOcean. Requires self.id to be set. 
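The load pattern used throughout these classes copies every field of the API payload onto the object with `setattr`. A minimal stdlib-only sketch (the `ApiObject` class and the sample payload are hypothetical, not part of the library):

```python
class ApiObject:
    """Hypothetical miniature of the BaseAPI subclasses' load() pattern."""
    def __init__(self):
        self.id = ""
        self.name = None
        self.region = None

    def load_from(self, payload):
        # Mirror of: for attr in vpc.keys(): setattr(self, attr, vpc[attr])
        for attr, value in payload.items():
            setattr(self, attr, value)
        return self

vpc = ApiObject().load_from({"id": "vpc-1", "name": "example", "region": "nyc3"})
```

One consequence of this pattern is that any field the API adds later appears on the object automatically, without code changes.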
""" data = self.get_data("vpcs/%s" % self.id) vpc = data["vpc"] for attr in vpc.keys(): setattr(self, attr, vpc[attr]) return self def create(self): """ Create the VPC """ params = { "name": self.name, "region": self.region, "description": self.description, "ip_range": self.ip_range } data = self.get_data("vpcs", type=POST, params=params) if data: self.id = data['vpc']['id'] self.name = data['vpc']['name'] self.region = data['vpc']['region'] self.description = data['vpc']['description'] self.ip_range = data['vpc']['ip_range'] self.urn = data['vpc']['urn'] self.created_at = data['vpc']['created_at'] self.default = data['vpc']['default'] return self def rename(self, new_name): """ Rename a VPC Args: new_name (str): The new name for the VPC """ data = self.get_data("vpcs/%s" % self.id, type=PATCH, params={"name": new_name}) vpc = data["vpc"] for attr in vpc.keys(): setattr(self, attr, vpc[attr]) return self def destroy(self): """ Delete the VPC """ return self.get_data("vpcs/%s" % self.id, type=DELETE) def __str__(self): return "<VPC: %s %s>" % (self.id, self.name) python-digitalocean-1.16.0/digitalocean/Volume.py000066400000000000000000000151721375555111600220240ustar00rootroot00000000000000# -*- coding: utf-8 -*- from .baseapi import BaseAPI, POST, DELETE from .Snapshot import Snapshot class Volume(BaseAPI): def __init__(self, *args, **kwargs): self.id = None self.name = None self.droplet_ids = [] self.region = None self.description = None self.size_gigabytes = None self.created_at = None self.snapshot_id = None self.filesystem_type = None self.filesystem_label = None self.tags = None super(Volume, self).__init__(*args, **kwargs) @classmethod def get_object(cls, api_token, volume_id): """ Class method that will return a Volume
object by ID. """ volume = cls(token=api_token, id=volume_id) volume.load() return volume def load(self): data = self.get_data("volumes/%s" % self.id) volume_dict = data['volume'] # Setting the attribute values for attr in volume_dict.keys(): setattr(self, attr, volume_dict[attr]) return self def create(self, *args, **kwargs): """ Creates a Block Storage volume Note: Every argument and parameter given to this method will be assigned to the object. Args: name: string - a name for the volume region: string - slug identifier for the region size_gigabytes: int - size of the Block Storage volume in GiB filesystem_type: string, optional - name of the filesystem type the volume will be formatted with ('ext4' or 'xfs') filesystem_label: string, optional - the label to be applied to the filesystem, only used in conjunction with filesystem_type Optional Args: description: string - text field to describe a volume tags: List[string], optional - the tags to be applied to the volume """ data = self.get_data('volumes/', type=POST, params={'name': self.name, 'region': self.region, 'size_gigabytes': self.size_gigabytes, 'description': self.description, 'filesystem_type': self.filesystem_type, 'filesystem_label': self.filesystem_label, 'tags': self.tags, }) if data: self.id = data['volume']['id'] self.created_at = data['volume']['created_at'] return self def create_from_snapshot(self, *args, **kwargs): """ Creates a Block Storage volume Note: Every argument and parameter given to this method will be assigned to the object. 
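As a sketch of the request body that `create()` and `create_from_snapshot()` send to `POST /v2/volumes` (field names taken from the code; the values and the helper function itself are illustrative, not part of the library):

```python
def build_volume_create_params(name, size_gigabytes, region=None,
                               snapshot_id=None, description=None,
                               filesystem_type=None, filesystem_label=None,
                               tags=None):
    # Mirrors the params dict the Volume methods pass to get_data();
    # unset optional fields go through as None, exactly as in the code.
    return {
        "name": name,
        "region": region,
        "snapshot_id": snapshot_id,
        "size_gigabytes": size_gigabytes,
        "description": description,
        "filesystem_type": filesystem_type,
        "filesystem_label": filesystem_label,
        "tags": tags,
    }

params = build_volume_create_params("example-volume", 10, region="nyc1",
                                    filesystem_type="ext4")
```

`create()` identifies the target location with `region`, while `create_from_snapshot()` additionally sends `snapshot_id`; the rest of the payload is the same.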
Args: name: string - a name for the volume snapshot_id: string - unique identifier for the volume snapshot size_gigabytes: int - size of the Block Storage volume in GiB filesystem_type: string, optional - name of the filesystem type the volume will be formatted with ('ext4' or 'xfs') filesystem_label: string, optional - the label to be applied to the filesystem, only used in conjunction with filesystem_type Optional Args: description: string - text field to describe a volume tags: List[string], optional - the tags to be applied to the volume """ data = self.get_data('volumes/', type=POST, params={'name': self.name, 'snapshot_id': self.snapshot_id, 'region': self.region, 'size_gigabytes': self.size_gigabytes, 'description': self.description, 'filesystem_type': self.filesystem_type, 'filesystem_label': self.filesystem_label, 'tags': self.tags, }) if data: self.id = data['volume']['id'] self.created_at = data['volume']['created_at'] return self def destroy(self): """ Destroy a volume """ return self.get_data("volumes/%s/" % self.id, type=DELETE) def attach(self, droplet_id, region): """ Attach a Volume to a Droplet. Args: droplet_id: int - droplet id region: string - slug identifier for the region """ return self.get_data( "volumes/%s/actions/" % self.id, type=POST, params={"type": "attach", "droplet_id": droplet_id, "region": region} ) def detach(self, droplet_id, region): """ Detach a Volume to a Droplet. Args: droplet_id: int - droplet id region: string - slug identifier for the region """ return self.get_data( "volumes/%s/actions/" % self.id, type=POST, params={"type": "detach", "droplet_id": droplet_id, "region": region} ) def resize(self, size_gigabytes, region): """ Detach a Volume to a Droplet. 
Args: size_gigabytes: int - size of the Block Storage volume in GiB region: string - slug identifier for the region """ return self.get_data( "volumes/%s/actions/" % self.id, type=POST, params={"type": "resize", "size_gigabytes": size_gigabytes, "region": region} ) def snapshot(self, name): """ Create a snapshot of the volume. Args: name: string - a human-readable name for the snapshot """ return self.get_data( "volumes/%s/snapshots/" % self.id, type=POST, params={"name": name} ) def get_snapshots(self): """ Retrieve the list of snapshots that have been created from a volume. """ data = self.get_data("volumes/%s/snapshots/" % self.id) snapshots = list() for jsond in data[u'snapshots']: snapshot = Snapshot(**jsond) snapshot.token = self.tokens snapshots.append(snapshot) return snapshots def __str__(self): return "<Volume: %s %s %s>" % (self.id, self.name, self.size_gigabytes) python-digitalocean-1.16.0/digitalocean/__init__.py000066400000000000000000000021501375555111600223040ustar00rootroot00000000000000# -*- coding: utf-8 -*- """digitalocean API to manage droplets""" __version__ = "1.16.0" __author__ = "Lorenzo Setale ( http://who.is.lorenzo.setale.me/?
)" __author_email__ = "lorenzo@setale.me" __license__ = "LGPL v3" __copyright__ = "Copyright (c) 2012-2020 Lorenzo Setale" from .Manager import Manager from .Droplet import Droplet, DropletError, BadKernelObject, BadSSHKeyFormat from .Region import Region from .Size import Size from .Image import Image from .Action import Action from .Account import Account from .Balance import Balance from .Domain import Domain from .Record import Record from .SSHKey import SSHKey from .Kernel import Kernel from .FloatingIP import FloatingIP from .Volume import Volume from .baseapi import Error, EndPointError, TokenError, DataReadError, NotFoundError from .Tag import Tag from .LoadBalancer import LoadBalancer from .LoadBalancer import StickySessions, ForwardingRule, HealthCheck from .Certificate import Certificate from .Snapshot import Snapshot from .Project import Project from .Firewall import Firewall, InboundRule, OutboundRule, Destinations, Sources from .VPC import VPC python-digitalocean-1.16.0/digitalocean/baseapi.py000066400000000000000000000176771375555111600221750ustar00rootroot00000000000000# -*- coding: utf-8 -*- import os import json import logging import requests from . 
import __name__, __version__ try: import urlparse except ImportError: from urllib import parse as urlparse GET = 'GET' POST = 'POST' DELETE = 'DELETE' PUT = 'PUT' PATCH = 'PATCH' REQUEST_TIMEOUT_ENV_VAR = 'PYTHON_DIGITALOCEAN_REQUEST_TIMEOUT_SEC' class Error(Exception): """Base exception class for this module""" pass class TokenError(Error): pass class DataReadError(Error): pass class JSONReadError(Error): pass class NotFoundError(Error): pass class EndPointError(Error): pass class BaseAPI(object): """ Basic api class for """ tokens = [] _last_used = 0 end_point = "https://api.digitalocean.com/v2/" def __init__(self, *args, **kwargs): self.token = os.getenv("DIGITALOCEAN_ACCESS_TOKEN", "") self.end_point = os.getenv("DIGITALOCEAN_END_POINT", "https://api.digitalocean.com/v2/") self._log = logging.getLogger(__name__) self._session = requests.Session() for attr in kwargs.keys(): setattr(self, attr, kwargs[attr]) parsed_url = urlparse.urlparse(self.end_point) if not parsed_url.scheme or not parsed_url.netloc: raise EndPointError("Provided end point is not a valid URL. Please use a valid URL") if not parsed_url.path: self.end_point += '/' def __getstate__(self): state = self.__dict__.copy() # The logger is not pickleable due to using thread.lock del state['_log'] return state def __setstate__(self, state): self.__dict__ = state self._log = logging.getLogger(__name__) def __perform_request(self, url, type=GET, params=None): """ This method will perform the real request, in this way we can customize only the "output" of the API call by using self.__call_api method. This method will return the request object. """ if params is None: params = {} if not self.token: raise TokenError("No token provided. 
Please use a valid token") url = urlparse.urljoin(self.end_point, url) # lookup table to find out the appropriate requests method, # headers and payload type (json or query parameters) identity = lambda x: x json_dumps = lambda x: json.dumps(x) lookup = { GET: (self._session.get, {'Content-type': 'application/json'}, 'params', identity), PATCH: (requests.patch, {'Content-type': 'application/json'}, 'data', json_dumps), POST: (requests.post, {'Content-type': 'application/json'}, 'data', json_dumps), PUT: (self._session.put, {'Content-type': 'application/json'}, 'data', json_dumps), DELETE: (self._session.delete, {'content-type': 'application/json'}, 'data', json_dumps), } requests_method, headers, payload, transform = lookup[type] agent = "{0}/{1} {2}/{3}".format('python-digitalocean', __version__, requests.__name__, requests.__version__) headers.update({'Authorization': 'Bearer ' + self.token, 'User-Agent': agent}) kwargs = {'headers': headers, payload: transform(params)} timeout = self.get_timeout() if timeout: kwargs['timeout'] = timeout # remove token from log headers_str = str(headers) for i, token in enumerate(self.tokens): headers_str = headers_str.replace(token.strip(), 'TOKEN%s' % i) self._log.debug('%s %s %s:%s %s %s' % (type, url, payload, params, headers_str, timeout)) return requests_method(url, **kwargs) def __deal_with_pagination(self, url, method, params, data): """ Perform multiple calls in order to have a full list of elements when the API are "paginated". 
(content list is divided in more than one page) """ all_data = data while data.get("links", {}).get("pages", {}).get("next"): url, query = data["links"]["pages"]["next"].split("?", 1) # Merge the query parameters for key, value in urlparse.parse_qs(query).items(): params[key] = value data = self.__perform_request(url, method, params).json() # Merge the dictionaries for key, value in data.items(): if isinstance(value, list) and key in all_data: all_data[key] += value else: all_data[key] = value return all_data def __init_ratelimit(self, headers): # Add the account requests/hour limit self.ratelimit_limit = headers.get('Ratelimit-Limit', None) # Add the account requests remaining self.ratelimit_remaining = headers.get('Ratelimit-Remaining', None) # Add the account requests limit reset time self.ratelimit_reset = headers.get('Ratelimit-Reset', None) @property def token(self): # use all the tokens round-robin style if self.tokens: self._last_used = (self._last_used + 1) % len(self.tokens) return self.tokens[self._last_used] return "" @token.setter def token(self, token): self._last_used = 0 if isinstance(token, list): self.tokens = token else: # for backward compatibility self.tokens = [token] def get_timeout(self): """ Checks if any timeout for the requests to DigitalOcean is required. To set a timeout, use the REQUEST_TIMEOUT_ENV_VAR environment variable. """ timeout_str = os.environ.get(REQUEST_TIMEOUT_ENV_VAR) if timeout_str: try: return float(timeout_str) except: self._log.error('Failed parsing the request read timeout of ' '"%s". Please use a valid float number!' % timeout_str) return None def get_data(self, url, type=GET, params=None): """ This method is a basic implementation of __call_api that checks errors too. In case of success the method will return True or the content of the response to the request. 
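The merge performed by `__deal_with_pagination` can be sketched with plain dicts: list-valued keys from each page are concatenated into one result, while scalar keys are simply overwritten by the latest page. The sample pages below are illustrative.

```python
def merge_pages(pages):
    # Mirror of the merge loop in __deal_with_pagination:
    #   if isinstance(value, list) and key in all_data: all_data[key] += value
    #   else: all_data[key] = value
    all_data = {}
    for page in pages:
        for key, value in page.items():
            if isinstance(value, list) and key in all_data:
                all_data[key] += value
            else:
                all_data[key] = value
    return all_data

merged = merge_pages([
    {"droplets": [{"id": 1}], "meta": {"total": 2}},
    {"droplets": [{"id": 2}], "meta": {"total": 2}},
])
```

This is why callers of `get_data` on a list endpoint receive a single dict whose list fields span every page, while `meta` reflects the last page fetched.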
Pagination is automatically detected and handled accordingly """ if params is None: params = dict() # If per_page is not set, make sure it has a sane default if type is GET: params.setdefault("per_page", 200) req = self.__perform_request(url, type, params) if req.status_code == 204: return True if req.status_code == 404: raise NotFoundError() try: data = req.json() except ValueError as e: raise JSONReadError( 'Read failed from DigitalOcean: %s' % str(e) ) if not req.ok: msg = [data[m] for m in ("id", "message") if m in data][1] raise DataReadError(msg) # init request limits self.__init_ratelimit(req.headers) # If there are more elements available (total) than the elements per # page, try to deal with pagination. Note: Breaking the logic on # multiple pages, pages = data.get("links", {}).get("pages", {}) if pages.get("next") and "page" not in params: return self.__deal_with_pagination(url, type, params, data) else: return data def __str__(self): return "<%s>" % self.__class__.__name__ def __unicode__(self): return u"%s" % self.__str__() def __repr__(self): return str(self) python-digitalocean-1.16.0/digitalocean/tests/000077500000000000000000000000001375555111600213375ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/BaseTest.py000066400000000000000000000025221375555111600234240ustar00rootroot00000000000000import os import unittest DEFAULT_PER_PAGE = 200 class BaseTest(unittest.TestCase): def setUp(self): self.base_url = "https://api.digitalocean.com/v2/" self.token = "afaketokenthatwillworksincewemockthings" def load_from_file(self, json_file): cwd = os.path.dirname(__file__) with open(os.path.join(cwd, 'data/%s' % json_file), 'r') as f: return f.read() def split_url(self, url): bits = url.split('?') if len(bits) == 1: return url, [] qlist = bits[1].split('&') qlist.sort() return bits[0], qlist def assert_url_query_equal(self, url1, url2): """ Test if two URL queries are equal The key=value pairs after the ? 
in a URL can occur in any order (especially since dicts in python 3 are not deterministic across runs). The method sorts the key=value pairs and then compares the URLs. """ base1, qlist1 = self.split_url(url1) base2, qlist2 = self.split_url(url2) self.assertEqual(base1, base2) self.assertEqual(qlist1, qlist2) def assert_get_url_equal(self, url1, url2): if "?" in url2: url2 += "&" else: url2 += "?" url2 += "per_page=%d" % DEFAULT_PER_PAGE return self.assert_url_query_equal(url1, url2) python-digitalocean-1.16.0/digitalocean/tests/__init__.py000066400000000000000000000000001375555111600234360ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/000077500000000000000000000000001375555111600222505ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/account/000077500000000000000000000000001375555111600237045ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/account/account.json000066400000000000000000000003651375555111600262370ustar00rootroot00000000000000{ "account": { "droplet_limit": 25, "floating_ip_limit": 3, "email": "web@digitalocean.com", "uuid": "b6fc48dbf6d9906cace5f3c78dc9851e757381ef", "email_verified": true, "status": "active", "status_message": "" } }python-digitalocean-1.16.0/digitalocean/tests/data/actions/000077500000000000000000000000001375555111600237105ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/actions/create_completed.json000066400000000000000000000004511375555111600301020ustar00rootroot00000000000000{ "action": { "region_slug":"nyc3", "id":39290099, "status":"completed", "type":"create", "started_at":"2014-12-19T19:14:36Z", "completed_at":"2014-12-19T19:15:23Z", "resource_id":12345, "resource_type":"droplet", "region":"nyc3" } }python-digitalocean-1.16.0/digitalocean/tests/data/actions/ipv6_completed.json000066400000000000000000000004561375555111600275300ustar00rootroot00000000000000{ "action": { "region_slug":"nyc3", 
"id":39388122, "status":"completed", "type":"enable_ipv6", "started_at":"2014-12-21T04:15:29Z", "completed_at":"2014-12-21T04:15:31Z", "resource_id":12345, "resource_type":"droplet", "region":"nyc3" } }python-digitalocean-1.16.0/digitalocean/tests/data/actions/multi.json000066400000000000000000000012601375555111600257340ustar00rootroot00000000000000{ "actions": [ { "region_slug":"nyc3", "id":39388122, "status":"completed", "type":"enable_ipv6", "started_at":"2014-12-21T04:15:29Z", "completed_at":"2014-12-21T04:15:31Z", "resource_id":12345, "resource_type":"droplet", "region":"nyc3" }, { "region_slug":"nyc3", "id":39290099, "status":"completed", "type":"create", "started_at":"2014-12-19T19:14:36Z", "completed_at":"2014-12-19T19:15:23Z", "resource_id":12345, "resource_type":"droplet", "region":"nyc3" } ], "links": {}, "meta": {"total":2} }python-digitalocean-1.16.0/digitalocean/tests/data/balance/000077500000000000000000000000001375555111600236355ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/balance/balance.json000066400000000000000000000002201375555111600261070ustar00rootroot00000000000000{ "month_to_date_balance": "23.44", "account_balance": "12.23", "month_to_date_usage": "11.21", "generated_at": "2019-07-09T15:01:12Z" }python-digitalocean-1.16.0/digitalocean/tests/data/certificate/000077500000000000000000000000001375555111600245325ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/certificate/custom.json000066400000000000000000000005031375555111600267350ustar00rootroot00000000000000{ "certificate": { "id": "892071a0-bb95-49bc-8021-3afd67a210bf", "name": "web-cert-01", "not_after": "2017-02-22T00:23:00Z", "sha1_fingerprint": "dfcc9f57d86bf58e321c2c6c31c7a971be244ac7", "created_at": "2017-02-08T16:02:37Z", "dns_names": [""], "state": "verified", "type": "custom" } 
}python-digitalocean-1.16.0/digitalocean/tests/data/certificate/lets_encrpyt.json000066400000000000000000000006071375555111600301430ustar00rootroot00000000000000{ "certificate": { "id": "ba9b9c18-6c59-46c2-99df-70da170a42ba", "name": "web-cert-02", "not_after": "2018-06-07T17:44:12Z", "sha1_fingerprint": "479c82b5c63cb6d3e6fac4624d58a33b267e166c", "created_at": "2018-03-09T18:44:11Z", "dns_names": ["www.example.com","example.com"], "state": "pending", "type": "lets_encrypt" } }python-digitalocean-1.16.0/digitalocean/tests/data/certificate/list.json000066400000000000000000000013721375555111600264030ustar00rootroot00000000000000{ "certificates": [ { "id": "892071a0-bb95-49bc-8021-3afd67a210bf", "name": "web-cert-01", "not_after": "2017-02-22T00:23:00Z", "sha1_fingerprint": "dfcc9f57d86bf58e321c2c6c31c7a971be244ac7", "created_at": "2017-02-08T16:02:37Z", "dns_names": [""], "state": "verified", "type": "custom" }, { "id": "ba9b9c18-6c59-46c2-99df-70da170a42ba", "name": "web-cert-02", "not_after": "2018-06-07T17:44:12Z", "sha1_fingerprint": "479c82b5c63cb6d3e6fac4624d58a33b267e166c", "created_at": "2018-03-09T18:44:11Z", "dns_names": ["www.example.com","example.com"], "state": "pending", "type": "lets_encrypt" } ], "links": { }, "meta": { "total": 2 } }python-digitalocean-1.16.0/digitalocean/tests/data/domains/000077500000000000000000000000001375555111600237025ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/domains/all.json000066400000000000000000000002411375555111600253420ustar00rootroot00000000000000{ "domains": [ { "name": "example.com", "ttl": 1800, "zone_file": "Example zone file text..." 
} ], "meta": { "total": 1 } }python-digitalocean-1.16.0/digitalocean/tests/data/domains/create.json000066400000000000000000000001271375555111600260400ustar00rootroot00000000000000{ "domain": { "name": "example.com", "ttl": 1800, "zone_file": null } }python-digitalocean-1.16.0/digitalocean/tests/data/domains/create_caa_record.json000066400000000000000000000003361375555111600302040ustar00rootroot00000000000000{ "domain_record": { "id": 18, "type": "CAA", "name": "@", "data": "letsencrypt.org.", "priority": null, "port": null, "ttl": 1800, "weight": null, "flags": 0, "tag": "issue" } }python-digitalocean-1.16.0/digitalocean/tests/data/domains/create_record.json000066400000000000000000000002561375555111600274010ustar00rootroot00000000000000{ "domain_record": { "id": 16, "type": "CNAME", "name": "www", "data": "@", "priority": null, "port": null, "ttl": 600, "weight": null } }python-digitalocean-1.16.0/digitalocean/tests/data/domains/create_srv_record.json000066400000000000000000000002571375555111600302740ustar00rootroot00000000000000{ "domain_record": { "id": 17, "type": "SRV", "name": "service", "data": "service", "priority": 0, "port": 53, "ttl": 600, "weight": 0 } } python-digitalocean-1.16.0/digitalocean/tests/data/domains/records.json000066400000000000000000000026051375555111600262410ustar00rootroot00000000000000{ "domain_records": [ { "id": 1, "type": "A", "name": "@", "data": "8.8.8.8", "priority": null, "port": null, "ttl": 1800, "weight": null, "flags": null, "tag": null }, { "id": 2, "type": "NS", "name": null, "data": "NS1.DIGITALOCEAN.COM.", "priority": null, "port": null, "ttl": 1800, "weight": null, "flags": null, "tag": null }, { "id": 3, "type": "NS", "name": null, "data": "NS2.DIGITALOCEAN.COM.", "priority": null, "port": null, "ttl": 1800, "weight": null, "flags": null, "tag": null }, { "id": 4, "type": "NS", "name": null, "data": "NS3.DIGITALOCEAN.COM.", "priority": null, "port": null, "ttl": 1800, "weight": null, "flags": null, "tag": null }, { 
"id": 5, "type": "CNAME", "name": "example", "data": "@", "priority": null, "port": null, "ttl": 600, "weight": null, "flags": null, "tag": null }, { "id": 6, "type": "CAA", "name": "@", "data": "letsencrypt.org.", "priority": null, "port": null, "ttl": 1800, "weight": null, "flags": 0, "tag": "issue" } ], "meta": { "total": 5 } } python-digitalocean-1.16.0/digitalocean/tests/data/domains/single.json000066400000000000000000000001561375555111600260600ustar00rootroot00000000000000{ "domain": { "name": "example.com", "ttl": 1800, "zone_file": "Example zone file text..." } }python-digitalocean-1.16.0/digitalocean/tests/data/droplet_actions/000077500000000000000000000000001375555111600254415ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/droplet_actions/change_kernel.json000066400000000000000000000004011375555111600311140ustar00rootroot00000000000000{ "action": { "id":54321, "status":"in-progress", "type":"change_kernel", "started_at":"2014-12-21T02:19:17Z", "completed_at":null, "resource_id":12345, "resource_type":"droplet", "region":"nyc3" } }python-digitalocean-1.16.0/digitalocean/tests/data/droplet_actions/create.json000066400000000000000000000014021375555111600275740ustar00rootroot00000000000000{ "droplet": { "id": 3164494, "name": "example.com", "memory": 512, "vcpus": 1, "disk": 20, "locked": true, "status": "new", "kernel": { "id": 2233, "name": "Ubuntu 14.04 x64 vmlinuz-3.13.0-37-generic", "version": "3.13.0-37-generic" }, "created_at": "2014-11-14T16:36:31Z", "features": [ "virtio", "backups", "ipv6" ], "backup_ids": [ ], "snapshot_ids": [ ], "image": { }, "size_slug": "512mb", "networks": { }, "region": { }, "tags": [ "web" ] }, "links": { "actions": [ { "id": 36805096, "rel": "create", "href": "https://api.digitalocean.com/v2/actions/36805096" } ] } }python-digitalocean-1.16.0/digitalocean/tests/data/droplet_actions/create_multiple.json000066400000000000000000000025141375555111600315140ustar00rootroot00000000000000{ 
"droplets": [{ "id": 3164494, "name": "example.com", "memory": 512, "vcpus": 1, "disk": 20, "locked": true, "status": "new", "kernel": { "id": 2233, "name": "Ubuntu 14.04 x64 vmlinuz-3.13.0-37-generic", "version": "3.13.0-37-generic" }, "created_at": "2014-11-14T16:36:31Z", "features": [ "virtio", "backups", "ipv6" ], "backup_ids": [ ], "snapshot_ids": [ ], "image": { }, "size_slug": "512mb", "networks": { }, "region": { }, "tags": [ "web" ] }, { "id": 3164495, "name": "example2.com", "memory": 512, "vcpus": 1, "disk": 20, "locked": true, "status": "new", "kernel": { "id": 2233, "name": "Ubuntu 14.04 x64 vmlinuz-3.13.0-37-generic", "version": "3.13.0-37-generic" }, "created_at": "2014-11-14T16:36:31Z", "features": [ "virtio", "backups", "ipv6" ], "backup_ids": [ ], "snapshot_ids": [ ], "image": { }, "size_slug": "512mb", "networks": { }, "region": { }, "tags": [ "web" ] }], "links": { "actions": [ { "id": 36805096, "rel": "create", "href": "https://api.digitalocean.com/v2/actions/36805096" } ] } }python-digitalocean-1.16.0/digitalocean/tests/data/droplet_actions/disable_backups.json000066400000000000000000000004031375555111600314440ustar00rootroot00000000000000{ "action": { "id":54321, "status":"in-progress", "type":"disable_backups", "started_at":"2014-12-21T02:19:17Z", "completed_at":null, "resource_id":12345, "resource_type":"droplet", "region":"nyc3" } }python-digitalocean-1.16.0/digitalocean/tests/data/droplet_actions/enable_backups.json000066400000000000000000000004021375555111600312660ustar00rootroot00000000000000{ "action": { "id":54321, "status":"in-progress", "type":"enable_backups", "started_at":"2014-12-21T02:19:17Z", "completed_at":null, "resource_id":12345, "resource_type":"droplet", "region":"nyc3" } }python-digitalocean-1.16.0/digitalocean/tests/data/droplet_actions/enable_ipv6.json000066400000000000000000000003771375555111600305350ustar00rootroot00000000000000{ "action": { "id":54321, "status":"in-progress", "type":"enable_ipv6", 
"started_at":"2014-12-21T02:19:17Z", "completed_at":null, "resource_id":12345, "resource_type":"droplet", "region":"nyc3" } }python-digitalocean-1.16.0/digitalocean/tests/data/droplet_actions/enable_private_networking.json000066400000000000000000000004151375555111600335630ustar00rootroot00000000000000{ "action": { "id":54321, "status":"in-progress", "type":"enable_private_networking", "started_at":"2014-12-21T02:19:17Z", "completed_at":null, "resource_id":12345, "resource_type":"droplet", "region":"nyc3" } }python-digitalocean-1.16.0/digitalocean/tests/data/droplet_actions/password_reset.json000066400000000000000000000004021375555111600313740ustar00rootroot00000000000000{ "action": { "id":54321, "status":"in-progress", "type":"password_reset", "started_at":"2014-12-21T02:19:17Z", "completed_at":null, "resource_id":12345, "resource_type":"droplet", "region":"nyc3" } }python-digitalocean-1.16.0/digitalocean/tests/data/droplet_actions/power_cycle.json000066400000000000000000000003771375555111600306560ustar00rootroot00000000000000{ "action": { "id":54321, "status":"in-progress", "type":"power_cycle", "started_at":"2014-12-21T02:19:17Z", "completed_at":null, "resource_id":12345, "resource_type":"droplet", "region":"nyc3" } }python-digitalocean-1.16.0/digitalocean/tests/data/droplet_actions/power_off.json000066400000000000000000000003751375555111600303270ustar00rootroot00000000000000{ "action": { "id":54321, "status":"in-progress", "type":"power_off", "started_at":"2014-12-21T02:19:17Z", "completed_at":null, "resource_id":12345, "resource_type":"droplet", "region":"nyc3" } }python-digitalocean-1.16.0/digitalocean/tests/data/droplet_actions/power_on.json000066400000000000000000000003741375555111600301700ustar00rootroot00000000000000{ "action": { "id":54321, "status":"in-progress", "type":"power_on", "started_at":"2014-12-21T02:19:17Z", "completed_at":null, "resource_id":12345, "resource_type":"droplet", "region":"nyc3" } 
}python-digitalocean-1.16.0/digitalocean/tests/data/droplet_actions/reboot.json000066400000000000000000000003721375555111600276300ustar00rootroot00000000000000{ "action": { "id":54321, "status":"in-progress", "type":"reboot", "started_at":"2014-12-21T02:19:17Z", "completed_at":null, "resource_id":12345, "resource_type":"droplet", "region":"nyc3" } }python-digitalocean-1.16.0/digitalocean/tests/data/droplet_actions/rebuild.json000066400000000000000000000003731375555111600277650ustar00rootroot00000000000000{ "action": { "id":54321, "status":"in-progress", "type":"rebuild", "started_at":"2014-12-21T02:19:17Z", "completed_at":null, "resource_id":12345, "resource_type":"droplet", "region":"nyc3" } }python-digitalocean-1.16.0/digitalocean/tests/data/droplet_actions/rename.json000066400000000000000000000003721375555111600276050ustar00rootroot00000000000000{ "action": { "id":54321, "status":"in-progress", "type":"rename", "started_at":"2014-12-21T02:19:17Z", "completed_at":null, "resource_id":12345, "resource_type":"droplet", "region":"nyc3" } }python-digitalocean-1.16.0/digitalocean/tests/data/droplet_actions/resize.json000066400000000000000000000003721375555111600276370ustar00rootroot00000000000000{ "action": { "id":54321, "status":"in-progress", "type":"resize", "started_at":"2014-12-21T02:19:17Z", "completed_at":null, "resource_id":12345, "resource_type":"droplet", "region":"nyc3" } }python-digitalocean-1.16.0/digitalocean/tests/data/droplet_actions/restore.json000066400000000000000000000003731375555111600300220ustar00rootroot00000000000000{ "action": { "id":54321, "status":"in-progress", "type":"restore", "started_at":"2014-12-21T02:19:17Z", "completed_at":null, "resource_id":12345, "resource_type":"droplet", "region":"nyc3" } }python-digitalocean-1.16.0/digitalocean/tests/data/droplet_actions/shutdown.json000066400000000000000000000003741375555111600302130ustar00rootroot00000000000000{ "action": { "id":54321, "status":"in-progress", "type":"shutdown", 
"started_at":"2014-12-21T02:19:17Z", "completed_at":null, "resource_id":12345, "resource_type":"droplet", "region":"nyc3" } }python-digitalocean-1.16.0/digitalocean/tests/data/droplet_actions/snapshot.json000066400000000000000000000003741375555111600301770ustar00rootroot00000000000000{ "action": { "id":54321, "status":"in-progress", "type":"snapshot", "started_at":"2014-12-21T02:19:17Z", "completed_at":null, "resource_id":12345, "resource_type":"droplet", "region":"nyc3" } }python-digitalocean-1.16.0/digitalocean/tests/data/droplets/000077500000000000000000000000001375555111600241045ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/droplets/all.json000066400000000000000000000035221375555111600255510ustar00rootroot00000000000000{ "droplets": [ { "id": 3164444, "name": "example.com", "memory": 512, "vcpus": 1, "disk": 20, "locked": false, "status": "active", "kernel": { "id": 2233, "name": "Ubuntu 14.04 x64 vmlinuz-3.13.0-37-generic", "version": "3.13.0-37-generic" }, "created_at": "2014-11-14T16:29:21Z", "features": [ "backups", "ipv6", "virtio" ], "backup_ids": [ 7938002 ], "snapshot_ids": [ ], "image": { "id": 6918990, "name": "14.04 x64", "distribution": "Ubuntu", "slug": "ubuntu-14-04-x64", "public": true, "regions": [ "nyc1", "ams1", "sfo1", "nyc2", "ams2", "sgp1", "lon1", "nyc3", "ams3", "nyc3" ], "created_at": "2014-10-17T20:24:33Z", "min_disk_size": 20 }, "size_slug": "512mb", "networks": { "v4": [ { "ip_address": "104.236.32.182", "netmask": "255.255.192.0", "gateway": "104.236.0.1", "type": "public" } ], "v6": [ { "ip_address": "2604:A880:0800:0010:0000:0000:02DD:4001", "netmask": 64, "gateway": "2604:A880:0800:0010:0000:0000:0000:0001", "type": "public" } ] }, "vpc_uuid": "08187eaa-90eb-40d6-a8f0-0222b28ded72", "region": { "name": "New York 3", "slug": "nyc3", "sizes": [ ], "features": [ "virtio", "private_networking", "backups", "ipv6", "metadata" ], "available": null } } ], "meta": { "total": 1 } 
}python-digitalocean-1.16.0/digitalocean/tests/data/droplets/bytag.json000066400000000000000000000042561375555111600261140ustar00rootroot00000000000000{ "droplets": [ { "id": 3164444, "name": "example.com", "memory": 512, "vcpus": 1, "disk": 20, "locked": false, "status": "active", "kernel": { "id": 2233, "name": "Ubuntu 14.04 x64 vmlinuz-3.13.0-37-generic", "version": "3.13.0-37-generic" }, "created_at": "2014-11-14T16:29:21Z", "features": [ "backups", "ipv6", "virtio" ], "backup_ids": [ 7938002 ], "snapshot_ids": [ ], "image": { "id": 6918990, "name": "14.04 x64", "distribution": "Ubuntu", "slug": "ubuntu-14-04-x64", "public": true, "regions": [ "nyc1", "ams1", "sfo1", "nyc2", "ams2", "sgp1", "lon1", "nyc3", "ams3", "nyc3" ], "created_at": "2014-10-17T20:24:33Z", "type": "snapshot", "min_disk_size": 20, "size_gigabytes": 2.34 }, "volumes": [ ], "size": { }, "size_slug": "512mb", "networks": { "v4": [ { "ip_address": "104.236.32.182", "netmask": "255.255.192.0", "gateway": "104.236.0.1", "type": "public" } ], "v6": [ { "ip_address": "2604:A880:0800:0010:0000:0000:02DD:4001", "netmask": 64, "gateway": "2604:A880:0800:0010:0000:0000:0000:0001", "type": "public" } ] }, "vpc_uuid": "08187eaa-90eb-40d6-a8f0-0222b28ded72", "region": { "name": "New York 3", "slug": "nyc3", "sizes": [ ], "features": [ "virtio", "private_networking", "backups", "ipv6", "metadata" ], "available": null }, "tags": [ "awesome" ] } ], "links": { "pages": { "last": "https://api.digitalocean.com/v2/droplets?page=3&per_page=1", "next": "https://api.digitalocean.com/v2/droplets?page=2&per_page=1" } }, "meta": { "total": 3 } } python-digitalocean-1.16.0/digitalocean/tests/data/droplets/single.json000066400000000000000000000035011375555111600262570ustar00rootroot00000000000000{ "droplet": { "volume_ids": ["506f78a4-e098-11e5-ad9f-000f53306ae1"], "id": 12345, "name": "example.com", "memory": 512, "vcpus": 1, "disk": 20, "locked": false, "status": "active", "kernel": { "id": 2233, "name": "Ubuntu 14.04 
x64 vmlinuz-3.13.0-37-generic", "version": "3.13.0-37-generic" }, "created_at": "2014-11-14T16:36:31Z", "features": [ "ipv6", "virtio" ], "backup_ids": [ ], "snapshot_ids": [ 7938206 ], "image": { "id": 6918990, "name": "14.04 x64", "distribution": "Ubuntu", "slug": "ubuntu-14-04-x64", "public": true, "regions": [ "nyc1", "ams1", "sfo1", "nyc2", "ams2", "sgp1", "lon1", "nyc3", "ams3", "nyc3" ], "created_at": "2014-10-17T20:24:33Z", "min_disk_size": 20 }, "size_slug": "512mb", "networks": { "v4": [ { "ip_address": "104.131.186.241", "netmask": "255.255.240.0", "gateway": "104.131.176.1", "type": "public" } ], "v6": [ { "ip_address": "2604:A880:0800:0010:0000:0000:031D:2001", "netmask": 64, "gateway": "2604:A880:0800:0010:0000:0000:0000:0001", "type": "public" } ] }, "vpc_uuid": "08187eaa-90eb-40d6-a8f0-0222b28ded72", "region": { "name": "New York 3", "slug": "nyc3", "sizes": [ "32gb", "16gb", "2gb", "1gb", "4gb", "8gb", "512mb", "64gb", "48gb" ], "features": [ "virtio", "private_networking", "backups", "ipv6", "metadata" ], "available": true } } } python-digitalocean-1.16.0/digitalocean/tests/data/errors/000077500000000000000000000000001375555111600235645ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/errors/unauthorized.json000066400000000000000000000000751375555111600272020ustar00rootroot00000000000000{"id":"unauthorized","message":"Unable to authenticate you."}python-digitalocean-1.16.0/digitalocean/tests/data/firewalls/000077500000000000000000000000001375555111600242405ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/firewalls/all.json000066400000000000000000000024101375555111600257000ustar00rootroot00000000000000{ "firewalls": [ { "id": "12345", "name": "firewall", "status": "succeeded", "inbound_rules": [ { "protocol": "tcp", "ports": "80", "sources": { "load_balancer_uids": [ "12345" ] } }, { "protocol": "tcp", "ports": 22, "sources": { "tags": [ "gateway" ], "addresses": [ "18.0.0.0/8" ] } } ], 
"outbound_rules": [ { "protocol": "tcp", "ports": "80", "destinations": { "addresses": [ "0.0.0.0/0", "::/0" ] } } ], "created_at": "2017-05-23T21:24:00Z", "droplet_ids": [ 12345 ], "tags": [ ], "pending_changes": [ ] } ] }python-digitalocean-1.16.0/digitalocean/tests/data/firewalls/bytag.json000066400000000000000000000024311375555111600262410ustar00rootroot00000000000000{ "firewalls": [ { "id": "12345", "name": "firewall", "status": "succeeded", "inbound_rules": [ { "protocol": "tcp", "ports": "80", "sources": { "load_balancer_uids": [ "12345" ] } }, { "protocol": "tcp", "ports": 22, "sources": { "tags": [ "gateway" ], "addresses": [ "18.0.0.0/8" ] } } ], "outbound_rules": [ { "protocol": "tcp", "ports": "80", "destinations": { "addresses": [ "0.0.0.0/0", "::/0" ] } } ], "created_at": "2017-05-23T21:24:00Z", "droplet_ids": [ 12345 ], "tags": [ "awesome" ], "pending_changes": [ ] } ] }python-digitalocean-1.16.0/digitalocean/tests/data/firewalls/droplets.json000066400000000000000000000000541375555111600267660ustar00rootroot00000000000000{ "droplet_ids": [ 12345 ] }python-digitalocean-1.16.0/digitalocean/tests/data/firewalls/single.json000066400000000000000000000016651375555111600264240ustar00rootroot00000000000000{ "firewall": { "id": 12345, "name": "firewall", "status": "succeeded", "inbound_rules": [ { "protocol": "tcp", "ports": "80", "sources": { "load_balancer_uids": [ "12345" ] } }, { "protocol": "tcp", "ports": 22, "sources": { "tags": [ "gateway" ], "addresses": [ "18.0.0.0/8" ] } } ], "outbound_rules": [ { "protocol": "tcp", "ports": "80", "destinations": { "addresses": [ "0.0.0.0/0", "::/0" ] } } ], "created_at": "2017-05-23T21:24:00Z", "droplet_ids": [ 12345 ], "tags": [ ], "pending_changes": [ ] } }python-digitalocean-1.16.0/digitalocean/tests/data/firewalls/tags.json000066400000000000000000000000511375555111600260650ustar00rootroot00000000000000{ "tags": [ "awesome" ] 
}python-digitalocean-1.16.0/digitalocean/tests/data/floatingip/000077500000000000000000000000001375555111600244045ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/floatingip/assign.json000066400000000000000000000011671375555111600265700ustar00rootroot00000000000000{ "action": { "id": 68212728, "status": "in-progress", "type": "assign_ip", "started_at": "2015-10-15T17:45:44Z", "completed_at": null, "resource_id": 758603823, "resource_type": "floating_ip", "region": { "name": "New York 3", "slug": "nyc3", "sizes": [ "512mb", "1gb", "2gb", "4gb", "8gb", "16gb", "32gb", "48gb", "64gb" ], "features": [ "private_networking", "backups", "ipv6", "metadata" ], "available": true }, "region_slug": "nyc3" } }python-digitalocean-1.16.0/digitalocean/tests/data/floatingip/list.json000066400000000000000000000011051375555111600262470ustar00rootroot00000000000000{ "floating_ips": [ { "ip": "45.55.96.47", "droplet": null, "region": { "name": "New York 3", "slug": "nyc3", "sizes": [ "512mb", "1gb", "2gb", "4gb", "8gb", "16gb", "32gb", "48gb", "64gb" ], "features": [ "private_networking", "backups", "ipv6", "metadata" ], "available": true }, "locked": false } ], "links": { }, "meta": { "total": 1 } }python-digitalocean-1.16.0/digitalocean/tests/data/floatingip/single.json000066400000000000000000000007241375555111600265630ustar00rootroot00000000000000{ "floating_ip": { "ip": "45.55.96.47", "droplet": null, "region": { "name": "New York 3", "slug": "nyc3", "sizes": [ "512mb", "1gb", "2gb", "4gb", "8gb", "16gb", "32gb", "48gb", "64gb" ], "features": [ "private_networking", "backups", "ipv6", "metadata" ], "available": true }, "locked": false } }python-digitalocean-1.16.0/digitalocean/tests/data/floatingip/unassign.json000066400000000000000000000011711375555111600271260ustar00rootroot00000000000000{ "action": { "id": 68212773, "status": "in-progress", "type": "unassign_ip", "started_at": "2015-10-15T17:46:15Z", "completed_at": null, "resource_id": 758603823, 
"resource_type": "floating_ip", "region": { "name": "New York 3", "slug": "nyc3", "sizes": [ "512mb", "1gb", "2gb", "4gb", "8gb", "16gb", "32gb", "48gb", "64gb" ], "features": [ "private_networking", "backups", "ipv6", "metadata" ], "available": true }, "region_slug": "nyc3" } }python-digitalocean-1.16.0/digitalocean/tests/data/images/000077500000000000000000000000001375555111600235155ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/images/all.json000066400000000000000000000016511375555111600251630ustar00rootroot00000000000000{ "images": [ { "id": 119192817, "name": "14.04 x64", "distribution": "Ubuntu", "slug": "ubuntu-14-04-x64", "public": true, "regions": [ "nyc1" ], "created_at": "2014-07-29T14:35:40Z", "min_disk_size": 20, "size_gigabytes": 1.34 }, { "id": 449676376, "name": "14.04 x32", "distribution": "Ubuntu", "slug": "ubuntu-14-04-x32", "public": true, "regions": [ "nyc1" ], "created_at": "2014-07-29T14:35:40Z", "min_disk_size": 20, "size_gigabytes": 1.48 }, { "id": 449676856, "name": "My Snapshot", "distribution": "Ubuntu", "slug": "", "public": false, "regions": [ "nyc1", "nyc3" ], "created_at": "2014-08-18T16:35:40Z", "min_disk_size": 20, "size_gigabytes": 2.34 } ], "meta": { "total": 3 } }python-digitalocean-1.16.0/digitalocean/tests/data/images/app.json000066400000000000000000000015241375555111600251720ustar00rootroot00000000000000{ "images": [ { "id": 11146864, "name": "MEAN on 14.04", "distribution": "Ubuntu", "slug": "mean", "public": true, "regions": [ "nyc1", "ams1", "sfo1", "nyc2", "ams2", "sgp1", "lon1", "nyc3", "ams3" ], "created_at": "2015-03-24T17:10:43Z", "min_disk_size": 20, "type": "snapshot", "size_gigabytes": 1.36 }, { "id": 11162594, "name": "Dokku v0.3.16 on 14.04", "distribution": "Ubuntu", "slug": "dokku", "public": true, "regions": [ "nyc1", "ams1", "sfo1", "nyc2", "ams2", "sgp1", "lon1", "nyc3", "ams3" ], "created_at": "2015-03-25T19:00:05Z", "min_disk_size": 20, "type": "snapshot", "size_gigabytes": 
1.34 } ], "meta": { "total": 2 } }python-digitalocean-1.16.0/digitalocean/tests/data/images/create.json000066400000000000000000000005221375555111600256520ustar00rootroot00000000000000{ "image": { "created_at": "2018-09-20T19:28:00Z", "description": "Cloud-optimized image", "distribution": "Ubuntu", "error_message": "", "id": 38413969, "name": "ubuntu-18.04-minimal", "regions": [ ], "type": "custom", "tags": [ "base-image", "prod" ], "status": "NEW" } }python-digitalocean-1.16.0/digitalocean/tests/data/images/distro.json000066400000000000000000000012001375555111600257050ustar00rootroot00000000000000{ "images": [ { "id": 119192817, "name": "14.04 x64", "distribution": "Ubuntu", "slug": "ubuntu-14-04-x64", "public": true, "regions": [ "nyc1" ], "created_at": "2014-07-29T14:35:40Z", "min_disk_size": 20, "size_gigabytes": 1.34 }, { "id": 449676376, "name": "14.04 x32", "distribution": "Ubuntu", "slug": "ubuntu-14-04-x32", "public": true, "regions": [ "nyc1" ], "created_at": "2014-07-29T14:35:40Z", "min_disk_size": 20, "size_gigabytes": 1.38 } ], "meta": { "total": 2 } }python-digitalocean-1.16.0/digitalocean/tests/data/images/private.json000066400000000000000000000005021375555111600260570ustar00rootroot00000000000000{ "images": [ { "id": 449676856, "name": "My Snapshot", "distribution": "Ubuntu", "slug": "", "public": false, "regions": [ "nyc1", "nyc3" ], "created_at": "2014-08-18T16:35:40Z", "size_gigabytes": 1.34 } ], "meta": { "total": 1 } }python-digitalocean-1.16.0/digitalocean/tests/data/images/rename.json000066400000000000000000000004731375555111600256630ustar00rootroot00000000000000{ "image": { "id": 449676856, "name": "Descriptive name", "distribution": "Ubuntu", "slug": "", "public": false, "regions": [ "nyc1", "nyc3" ], "created_at": "2014-08-18T16:35:40Z", "min_disk_size": 20, "size_gigabytes": 2.34 } }python-digitalocean-1.16.0/digitalocean/tests/data/images/single.json000066400000000000000000000004451375555111600256740ustar00rootroot00000000000000{ 
"image": { "id": 449676856, "name": "My Snapshot", "distribution": "Ubuntu", "public": false, "regions": [ "nyc1", "nyc3" ], "created_at": "2014-08-18T16:35:40Z", "min_disk_size": 20, "size_gigabytes": 2.34 } } python-digitalocean-1.16.0/digitalocean/tests/data/images/slug.json000066400000000000000000000004771375555111600253720ustar00rootroot00000000000000{ "image": { "id": null, "name": "My Slug Snapshot", "distribution": "Ubuntu", "slug": "testslug", "public": false, "regions": [ "nyc1", "nyc3" ], "created_at": "2014-08-18T16:35:40Z", "min_disk_size": 30, "size_gigabytes": 2.34 } } python-digitalocean-1.16.0/digitalocean/tests/data/images/transfer.json000066400000000000000000000011601375555111600262320ustar00rootroot00000000000000{ "action": { "id": 68212728, "status": "in-progress", "type": "transfer", "started_at": "2015-10-15T17:45:44Z", "completed_at": null, "resource_id": 449676856, "resource_type": "image", "region": { "name": "New York 3", "slug": "nyc3", "sizes": [ "512mb", "1gb", "2gb", "4gb", "8gb", "16gb", "32gb", "48gb", "64gb" ], "features": [ "private_networking", "backups", "ipv6", "metadata" ], "available": true }, "region_slug": "nyc3" } }python-digitalocean-1.16.0/digitalocean/tests/data/kernels/000077500000000000000000000000001375555111600237135ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/kernels/list.json000066400000000000000000000005051375555111600255610ustar00rootroot00000000000000{ "kernels": [ { "id": 61833229, "name": "Ubuntu 14.04 x32 vmlinuz-3.13.0-24-generic", "version": "3.13.0-24-generic" }, { "id": 485432972, "name": "Ubuntu 14.04 x64 vmlinuz-3.13.0-24-generic (1221)", "version": "3.13.0-24-generic" } ], "meta": { "total": 2 } }python-digitalocean-1.16.0/digitalocean/tests/data/kernels/page_one.json000066400000000000000000000010111375555111600263540ustar00rootroot00000000000000{ "kernels": [ { "id": 61833229, "name": "Ubuntu 14.04 x32 vmlinuz-3.13.0-24-generic", "version": "3.13.0-24-generic" }, { 
"id": 485432972, "name": "Ubuntu 14.04 x64 vmlinuz-3.13.0-24-generic (1221)", "version": "3.13.0-24-generic" } ], "links": { "pages": { "last": "https://api.digitalocean.com/v2/droplets/12345/kernels?page=2", "next": "https://api.digitalocean.com/v2/droplets/12345/kernels?page=2" } }, "meta": { "total": 3 } }python-digitalocean-1.16.0/digitalocean/tests/data/kernels/page_two.json000066400000000000000000000004511375555111600264130ustar00rootroot00000000000000{ "kernels": [ { "id": 231, "name": "Ubuntu 14.04 x64 vmlinuz-3.13.0-32-generic", "version": "3.13.0-32-generic" } ], "links": { "pages": { "last": "https://api.digitalocean.com/v2/droplets/12345/kernels?page=2" } }, "meta": { "total": 3 } }python-digitalocean-1.16.0/digitalocean/tests/data/keys/000077500000000000000000000000001375555111600232235ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/keys/all.json000066400000000000000000000005221375555111600246650ustar00rootroot00000000000000{ "ssh_keys": [ { "id": 1, "fingerprint": "f5:d1:78:ed:28:72:5f:e1:ac:94:fd:1f:e0:a3:48:6d", "public_key": "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAAAQQDGk5V68BJ4P3Ereh779Vi/Ft2qs/rbXrcjKLGo6zsyeyFUE0svJUpRDEJvFSf8RlezKx1/1ulJu9+kZsxRiUKn example", "name": "Example Key" } ], "meta": { "total": 1 } }python-digitalocean-1.16.0/digitalocean/tests/data/keys/newly_posted.json000066400000000000000000000002371375555111600266340ustar00rootroot00000000000000{ "ssh_key": { "id": 1234, "fingerprint": "ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff", "public_key": "AAAAkey", "name": "new_key" } }python-digitalocean-1.16.0/digitalocean/tests/data/loadbalancer/000077500000000000000000000000001375555111600246575ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/loadbalancer/all.json000066400000000000000000000062641375555111600263320ustar00rootroot00000000000000{ "load_balancers": [ { "id": "4de2ac7b-495b-4884-9e69-1050d6793cd4", "name": "example-lb-02", "ip": "104.131.186.248", "algorithm": 
"round_robin", "status": "new", "created_at": "2017-02-01T22:22:58Z", "forwarding_rules": [ { "entry_protocol": "http", "entry_port": 80, "target_protocol": "http", "target_port": 80, "certificate_id": "", "tls_passthrough": false }, { "entry_protocol": "https", "entry_port": 444, "target_protocol": "https", "target_port": 443, "certificate_id": "", "tls_passthrough": true } ], "health_check": { "protocol": "http", "port": 80, "path": "/", "check_interval_seconds": 10, "response_timeout_seconds": 5, "healthy_threshold": 5, "unhealthy_threshold": 3 }, "sticky_sessions": { "type": "none" }, "region": { "name": "New York 3", "slug": "nyc3", "sizes": [ "512mb", "1gb", "2gb", "4gb", "8gb", "16gb", "m-16gb", "32gb", "m-32gb", "48gb", "m-64gb", "64gb", "m-128gb", "m-224gb" ], "features": [ "private_networking", "backups", "ipv6", "metadata", "install_agent" ], "available": true }, "tag": "web", "droplet_ids": [ 3164444, 3164445 ], "redirect_http_to_https": false, "vpc_uuid": "08187eaa-90eb-40d6-a8f0-0222b28ded72" }, { "id": "4de7ac8b-495b-4884-9a69-1050c6793cd6", "name": "example-lb-01", "ip": "104.131.186.241", "algorithm": "round_robin", "status": "new", "created_at": "2017-02-01T22:22:58Z", "forwarding_rules": [ { "entry_protocol": "http", "entry_port": 80, "target_protocol": "http", "target_port": 80, "certificate_id": "", "tls_passthrough": false }, { "entry_protocol": "https", "entry_port": 444, "target_protocol": "https", "target_port": 443, "certificate_id": "", "tls_passthrough": true } ], "health_check": { "protocol": "http", "port": 80, "path": "/", "check_interval_seconds": 10, "response_timeout_seconds": 5, "healthy_threshold": 5, "unhealthy_threshold": 3 }, "sticky_sessions": { "type": "none" }, "region": { "name": "New York 3", "slug": "nyc3", "sizes": [ "512mb", "1gb", "2gb", "4gb", "8gb", "16gb", "m-16gb", "32gb", "m-32gb", "48gb", "m-64gb", "64gb", "m-128gb", "m-224gb" ], "features": [ "private_networking", "backups", "ipv6", "metadata", "install_agent" 
], "available": true }, "tag": "", "droplet_ids": [ 3164444, 3164445 ], "redirect_http_to_https": false, "vpc_uuid": "08187eaa-90eb-40d6-a8f0-0222b28ded72" }], "links": { }, "meta": { "total": 1 } }python-digitalocean-1.16.0/digitalocean/tests/data/loadbalancer/save.json000066400000000000000000000032241375555111600265110ustar00rootroot00000000000000{ "load_balancer": { "id": "4de7ac8b-495b-4884-9a69-1050c6793cd6", "name": "example-lb-01", "ip": "104.131.186.241", "algorithm": "least_connections", "status": "active", "created_at": "2017-02-01T22:22:58Z", "forwarding_rules": [ { "entry_protocol": "http", "entry_port": 80, "target_protocol": "http", "target_port": 80, "certificate_id": "", "tls_passthrough": false }, { "entry_protocol": "https", "entry_port": 444, "target_protocol": "https", "target_port": 443, "certificate_id": "", "tls_passthrough": true } ], "health_check": { "protocol": "http", "port": 80, "path": "/", "check_interval_seconds": 10, "response_timeout_seconds": 5, "healthy_threshold": 5, "unhealthy_threshold": 3 }, "sticky_sessions": { "type": "cookies", "cookie_name": "DO_LB", "cookie_ttl_seconds": 300 }, "region": { "name": "New York 3", "slug": "nyc3", "sizes": [ "512mb", "1gb", "2gb", "4gb", "8gb", "16gb", "m-16gb", "32gb", "m-32gb", "48gb", "m-64gb", "64gb", "m-128gb", "m-224gb" ], "features": [ "private_networking", "backups", "ipv6", "metadata", "install_agent" ], "available": true }, "tag": "", "droplet_ids": [ 34153248, 34153250 ], "redirect_http_to_https": false, "vpc_uuid": "08187eaa-90eb-40d6-a8f0-0222b28ded72" } } python-digitalocean-1.16.0/digitalocean/tests/data/loadbalancer/single.json000066400000000000000000000031061375555111600270330ustar00rootroot00000000000000{ "load_balancer": { "id": "4de7ac8b-495b-4884-9a69-1050c6793cd6", "name": "example-lb-01", "ip": "104.131.186.241", "algorithm": "round_robin", "status": "new", "created_at": "2017-02-01T22:22:58Z", "forwarding_rules": [ { "entry_protocol": "http", "entry_port": 80, 
"target_protocol": "http", "target_port": 80, "certificate_id": "", "tls_passthrough": false }, { "entry_protocol": "https", "entry_port": 444, "target_protocol": "https", "target_port": 443, "certificate_id": "", "tls_passthrough": true } ], "health_check": { "protocol": "http", "port": 80, "path": "/", "check_interval_seconds": 10, "response_timeout_seconds": 5, "healthy_threshold": 5, "unhealthy_threshold": 3 }, "sticky_sessions": { "type": "none" }, "region": { "name": "New York 3", "slug": "nyc3", "sizes": [ "512mb", "1gb", "2gb", "4gb", "8gb", "16gb", "m-16gb", "32gb", "m-32gb", "48gb", "m-64gb", "64gb", "m-128gb", "m-224gb" ], "features": [ "private_networking", "backups", "ipv6", "metadata", "install_agent" ], "available": true }, "tag": "", "droplet_ids": [ 3164444, 3164445 ], "redirect_http_to_https": false, "vpc_uuid": "08187eaa-90eb-40d6-a8f0-0222b28ded72" } }python-digitalocean-1.16.0/digitalocean/tests/data/loadbalancer/single_tag.json000066400000000000000000000031111375555111600276620ustar00rootroot00000000000000{ "load_balancer": { "id": "4de2ac7b-495b-4884-9e69-1050d6793cd4", "name": "example-lb-02", "ip": "104.131.186.248", "algorithm": "round_robin", "status": "new", "created_at": "2017-02-01T22:22:58Z", "forwarding_rules": [ { "entry_protocol": "http", "entry_port": 80, "target_protocol": "http", "target_port": 80, "certificate_id": "", "tls_passthrough": false }, { "entry_protocol": "https", "entry_port": 444, "target_protocol": "https", "target_port": 443, "certificate_id": "", "tls_passthrough": true } ], "health_check": { "protocol": "http", "port": 80, "path": "/", "check_interval_seconds": 10, "response_timeout_seconds": 5, "healthy_threshold": 5, "unhealthy_threshold": 3 }, "sticky_sessions": { "type": "none" }, "region": { "name": "New York 3", "slug": "nyc3", "sizes": [ "512mb", "1gb", "2gb", "4gb", "8gb", "16gb", "m-16gb", "32gb", "m-32gb", "48gb", "m-64gb", "64gb", "m-128gb", "m-224gb" ], "features": [ "private_networking", "backups", 
"ipv6", "metadata", "install_agent" ], "available": true }, "tag": "web", "droplet_ids": [ 3164444, 3164445 ], "redirect_http_to_https": false, "vpc_uuid": "08187eaa-90eb-40d6-a8f0-0222b28ded72" } }python-digitalocean-1.16.0/digitalocean/tests/data/projects/000077500000000000000000000000001375555111600241015ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/projects/all_projects_list.json000066400000000000000000000011611375555111600305070ustar00rootroot00000000000000{ "projects": [ { "id": "4e1bfbc3-dc3e-41f2-a18f-1b4d7ba71679", "owner_uuid": "99525febec065ca37b2ffe4f852fd2b2581895e7", "owner_id": 2, "name": "my-web-api", "description": "My website API", "purpose": "Service or API", "environment": "Production", "is_default": false, "created_at": "2018-09-27T20:10:35Z", "updated_at": "2018-09-27T20:10:35Z" } ], "links": { "pages": { "first": "https://api.digitalocean.com/v2/projects?page=1", "last": "https://api.digitalocean.com/v2/projects?page=1" } }, "meta": { "total": 1 } } python-digitalocean-1.16.0/digitalocean/tests/data/projects/assign_resources.json000066400000000000000000000013621375555111600303540ustar00rootroot00000000000000{ "resources": [ { "urn": "do:droplet:1", "assigned_at": "2018-09-28T19:26:37Z", "links": { "self": "https://api.digitalocean.com/v2/droplets/1" }, "status": "assigned" }, { "urn": "do:floatingip:192.168.99.100", "assigned_at": "2018-09-28T19:26:37Z", "links": { "self": "https://api.digitalocean.com/v2/floating_ips/192.168.99.100" }, "status": "assigned" } ], "links": { "pages": { "first": "https://api.digitalocean.com/v2/projects/4e1bfbc3-dc3e-41f2-a18f-1b4d7ba71679/resources?page=1", "last": "https://api.digitalocean.com/v2/projects/4e1bfbc3-dc3e-41f2-a18f-1b4d7ba71679/resources?page=1" } }, "meta": { "total": 2 } } python-digitalocean-1.16.0/digitalocean/tests/data/projects/create.json000066400000000000000000000006071375555111600262420ustar00rootroot00000000000000{ "project": { "id": 
"4e1bfbc3-dc3e-41f2-a18f-1b4d7ba71679", "owner_uuid": "99525febec065ca37b2ffe4f852fd2b2581895e7", "owner_id": 2, "name": "my-web-api", "description": "My website API", "purpose": "Service or API", "environment": "Production", "is_default": false, "created_at": "2018-09-27T15:52:48Z", "updated_at": "2018-09-27T15:52:48Z" } } python-digitalocean-1.16.0/digitalocean/tests/data/projects/default_project.json000066400000000000000000000006061375555111600301500ustar00rootroot00000000000000{ "project": { "id": "4e1bfbc3-dc3e-41f2-a18f-1b4d7ba71679", "owner_uuid": "99525febec065ca37b2ffe4f852fd2b2581895e7", "owner_id": 2, "name": "my-web-api", "description": "My website API", "purpose": "Service or API", "environment": "Production", "is_default": true, "created_at": "2018-09-27T20:10:35Z", "updated_at": "2018-09-27T20:10:35Z" } } python-digitalocean-1.16.0/digitalocean/tests/data/projects/list.json000066400000000000000000000011611375555111600257460ustar00rootroot00000000000000{ "projects": [ { "id": "4e1bfbc3-dc3e-41f2-a18f-1b4d7ba71679", "owner_uuid": "99525febec065ca37b2ffe4f852fd2b2581895e7", "owner_id": 2, "name": "my-web-api", "description": "My website API", "purpose": "Service or API", "environment": "Production", "is_default": false, "created_at": "2018-09-27T20:10:35Z", "updated_at": "2018-09-27T20:10:35Z" } ], "links": { "pages": { "first": "https://api.digitalocean.com/v2/projects?page=1", "last": "https://api.digitalocean.com/v2/projects?page=1" } }, "meta": { "total": 1 } } python-digitalocean-1.16.0/digitalocean/tests/data/projects/project_resources.json000066400000000000000000000010021375555111600305250ustar00rootroot00000000000000{ "resources": [ { "urn": "do:droplet:1", "assigned_at": "2018-09-28T19:26:37Z", "links": { "self": "https://api.digitalocean.com/v2/droplets/1" }, "status": "ok" } ], "links": { "pages": { "first": "https://api.digitalocean.com/v2/projects/4e1bfbc3-dc3e-41f2-a18f-1b4d7ba71679/resources?page=1", "last": 
"https://api.digitalocean.com/v2/projects/4e1bfbc3-dc3e-41f2-a18f-1b4d7ba71679/resources?page=1" } }, "meta": { "total": 1 } } python-digitalocean-1.16.0/digitalocean/tests/data/projects/retrieve.json000066400000000000000000000006071375555111600266240ustar00rootroot00000000000000{ "project": { "id": "4e1bfbc3-dc3e-41f2-a18f-1b4d7ba71679", "owner_uuid": "99525febec065ca37b2ffe4f852fd2b2581895e7", "owner_id": 2, "name": "my-web-api", "description": "My website API", "purpose": "Service or API", "environment": "Production", "is_default": false, "created_at": "2018-09-27T20:10:35Z", "updated_at": "2018-09-27T20:10:35Z" } } python-digitalocean-1.16.0/digitalocean/tests/data/projects/update.json000066400000000000000000000006041375555111600262560ustar00rootroot00000000000000{ "project": { "id": "4e1bfbc3-dc3e-41f2-a18f-1b4d7ba71679", "owner_uuid": "99525febec065ca37b2ffe4f852fd2b2581895e7", "owner_id": 2, "name": "my-web-api", "description": "My website API", "purpose": "Service or API", "environment": "Staging", "is_default": false, "created_at": "2018-09-27T15:52:48Z", "updated_at": "2018-09-27T15:52:48Z" } } python-digitalocean-1.16.0/digitalocean/tests/data/regions/000077500000000000000000000000001375555111600237165ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/regions/all.json000066400000000000000000000013301375555111600253560ustar00rootroot00000000000000{ "regions": [ { "slug": "nyc1", "name": "New York", "sizes": [ "1gb", "512mb" ], "available": false, "features": [ "virtio", "private_networking", "backups", "ipv6" ] }, { "slug": "sfo1", "name": "San Francisco", "sizes": [ "1gb", "512mb" ], "available": true, "features": [ "virtio", "backups" ] }, { "slug": "ams1", "name": "Amsterdam", "sizes": [ "1gb", "512mb" ], "available": true, "features": [ "virtio", "backups" ] } ], "meta": { "total": 3 } } 
python-digitalocean-1.16.0/digitalocean/tests/data/sizes/000077500000000000000000000000001375555111600234055ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/sizes/all.json000066400000000000000000000010411375555111600250440ustar00rootroot00000000000000{ "sizes": [ { "slug": "512mb", "memory": 512, "vcpus": 1, "disk": 20, "transfer": 1, "price_monthly": 5.0, "price_hourly": 0.00744, "regions": [ "nyc1", "ams1", "sfo1" ] }, { "slug": "1gb", "memory": 1024, "vcpus": 2, "disk": 30, "transfer": 2, "price_monthly": 10.0, "price_hourly": 0.01488, "regions": [ "nyc1", "ams1", "sfo1" ] } ], "meta": { "total": 2 } }python-digitalocean-1.16.0/digitalocean/tests/data/snapshots/000077500000000000000000000000001375555111600242725ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/snapshots/all.json000066400000000000000000000007371375555111600257440ustar00rootroot00000000000000{ "snapshots": [ { "id": 6372321, "name": "test", "regions": [ "nyc1", "ams1", "sfo1", "nyc2", "ams2", "sgp1", "lon1", "nyc3", "ams3", "fra1", "tor1" ], "created_at": "2014-09-26T16:40:18Z", "resource_id": 2713828, "resource_type": "droplet", "min_disk_size": 20, "size_gigabytes": 1.42 } ], "meta": { "total": 1 } } python-digitalocean-1.16.0/digitalocean/tests/data/snapshots/droplets.json000066400000000000000000000007651375555111600270310ustar00rootroot00000000000000{ "snapshots": [ { "id": 19602538, "name": "droplet-test", "regions": [ "nyc1", "sfo1", "nyc2", "ams2", "sgp1", "lon1", "nyc3", "ams3", "fra1", "tor1", "sfo2", "blr1" ], "created_at": "2016-09-10T18:06:25Z", "resource_id": null, "resource_type": "droplet", "min_disk_size": 20, "size_gigabytes": 0.31 } ], "meta": { "total": 1 } } python-digitalocean-1.16.0/digitalocean/tests/data/snapshots/single.json000066400000000000000000000005221375555111600264450ustar00rootroot00000000000000{ "snapshot": { "id": "fbe805e8-866b-11e6-96bf-000f53315a41", "name": "big-data-snapshot1475170902", "regions": 
[ "nyc1" ], "created_at": "2016-09-29T17:41:42Z", "resource_id": "fbcbc5c8-866b-11e6-96bf-000f53315a41", "resource_type": "volume", "min_disk_size": 20, "size_gigabytes": 1.42 } } python-digitalocean-1.16.0/digitalocean/tests/data/snapshots/volumes.json000066400000000000000000000006001375555111600266530ustar00rootroot00000000000000{ "snapshots": [ { "id": "4f60fc64-85d1-11e6-a004-000f53315871", "name": "volume-test", "regions": [ "nyc1" ], "created_at": "2016-09-28T23:14:30Z", "resource_id": "89bcc42f-85cf-11e6-a004-000f53315871", "resource_type": "volume", "min_disk_size": 10, "size_gigabytes": 0 } ], "meta": { "total": 1 } } python-digitalocean-1.16.0/digitalocean/tests/data/tags/000077500000000000000000000000001375555111600232065ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/tags/all.json000066400000000000000000000005031375555111600246470ustar00rootroot00000000000000{ "tags": [ { "name": "test", "resources": { "count": 0, "droplets": { "count": 0 }, "images": { "count": 0 }, "volumes": { "count": 0 }, "volume_snapshots": { "count": 0 } } } ] } python-digitalocean-1.16.0/digitalocean/tests/data/tags/single.json000066400000000000000000000004311375555111600253600ustar00rootroot00000000000000{ "tag": { "name": "awesome", "resources": { "count": 0, "droplets": { "count": 0 }, "images": { "count": 0 }, "volumes": { "count": 0 }, "volume_snapshots": { "count": 0 } } } } python-digitalocean-1.16.0/digitalocean/tests/data/tags/updatetag.json000066400000000000000000000006731375555111600260650ustar00rootroot00000000000000{ "tag": { "name": "extra-awesome", "resources": { "count": 1, "last_tagged_uri": "https://api.digitalocean.com/v2/droplets/3164444", "droplets": { "count": 1, "last_tagged_uri": "https://api.digitalocean.com/v2/droplets/3164444" }, "images": { "count": 0 }, "volumes": { "count": 0 }, "volume_snapshots": { "count": 0 } } } } 
python-digitalocean-1.16.0/digitalocean/tests/data/volumes/000077500000000000000000000000001375555111600237425ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/volumes/all.json000066400000000000000000000030261375555111600254060ustar00rootroot00000000000000{ "volumes": [ { "id": "506f78a4-e098-11e5-ad9f-000f53306ae1", "region": { "name": "New York 1", "slug": "nyc1", "sizes": [ "512mb", "1gb", "2gb", "4gb", "8gb", "16gb", "32gb", "48gb", "64gb" ], "features": [ "private_networking", "backups", "ipv6", "metadata" ], "available": true }, "droplet_ids": [ ], "name": "example", "description": "Block store for examples", "size_gigabytes": 100, "filesystem_type": "ext4", "filesystem_label": "label", "created_at": "2016-03-02T17:00:49Z" }, { "id": "2d2967ff-491d-11e6-860c-000f53315870", "region": { "name": "New York 3", "slug": "nyc3", "sizes": [ "512mb", "1gb", "2gb", "4gb", "8gb", "16gb", "32gb", "48gb", "64gb" ], "features": [ "private_networking", "backups", "ipv6", "metadata" ], "available": true }, "droplet_ids": [ 19486237 ], "name": "another-example", "description": "A bigger example volume", "size_gigabytes": 500, "filesystem_type": "ext4", "filesystem_label": "label", "created_at": "2016-03-05T17:00:49Z" } ], "links": { }, "meta": { "total": 2 } }python-digitalocean-1.16.0/digitalocean/tests/data/volumes/attach.json000066400000000000000000000012011375555111600260730ustar00rootroot00000000000000{ "action": { "id": 72531856, "status": "completed", "type": "attach_volume", "started_at": "2015-11-12T17:51:03Z", "completed_at": "2015-11-12T17:51:14Z", "resource_id": null, "resource_type": "volume", "region": { "name": "New York 1", "slug": "nyc1", "sizes": [ "1gb", "2gb", "4gb", "8gb", "32gb", "64gb", "512mb", "48gb", "16gb" ], "features": [ "private_networking", "backups", "ipv6", "metadata" ], "available": true }, "region_slug": "nyc1" } 
}python-digitalocean-1.16.0/digitalocean/tests/data/volumes/detach.json000066400000000000000000000011621375555111600260650ustar00rootroot00000000000000{ "action": { "id": 68212773, "status": "in-progress", "type": "detach_volume", "started_at": "2015-10-15T17:46:15Z", "completed_at": null, "resource_id": null, "resource_type": "backend", "region": { "name": "New York 1", "slug": "nyc1", "sizes": [ "512mb", "1gb", "2gb", "4gb", "8gb", "16gb", "32gb", "48gb", "64gb" ], "features": [ "private_networking", "backups", "ipv6", "metadata" ], "available": true }, "region_slug": "nyc1" } }python-digitalocean-1.16.0/digitalocean/tests/data/volumes/resize.json000066400000000000000000000012031375555111600261320ustar00rootroot00000000000000{ "action": { "id": 72531856, "status": "in-progress", "type": "resize_volume", "started_at": "2015-11-12T17:51:03Z", "completed_at": "2015-11-12T17:51:14Z", "resource_id": null, "resource_type": "volume", "region": { "name": "New York 1", "slug": "nyc1", "sizes": [ "1gb", "2gb", "4gb", "8gb", "32gb", "64gb", "512mb", "48gb", "16gb" ], "features": [ "private_networking", "backups", "ipv6", "metadata" ], "available": true }, "region_slug": "nyc1" } }python-digitalocean-1.16.0/digitalocean/tests/data/volumes/single.json000066400000000000000000000012501375555111600261140ustar00rootroot00000000000000{ "volume": { "id": "506f78a4-e098-11e5-ad9f-000f53306ae1", "region": { "name": "New York 1", "slug": "nyc1", "sizes": [ "512mb", "1gb", "2gb", "4gb", "8gb", "16gb", "32gb", "48gb", "64gb" ], "features": [ "private_networking", "backups", "ipv6", "metadata" ], "available": true }, "droplet_ids": [ ], "name": "example", "description": "Block store for examples", "size_gigabytes": 100, "filesystem_type": "ext4", "filesystem_label": "label", "created_at": "2016-03-02T17:00:49Z" } } python-digitalocean-1.16.0/digitalocean/tests/data/volumes/single_with_tags.json000066400000000000000000000013271375555111600301720ustar00rootroot00000000000000{ "volume": {
"id": "506f78a4-e098-11e5-ad9f-000f53306ae1", "region": { "name": "New York 1", "slug": "nyc1", "sizes": [ "512mb", "1gb", "2gb", "4gb", "8gb", "16gb", "32gb", "48gb", "64gb" ], "features": [ "private_networking", "backups", "ipv6", "metadata" ], "available": true }, "droplet_ids": [ ], "name": "example", "description": "Block store for examples", "size_gigabytes": 100, "filesystem_label": "ext4", "filesystem_label": "label", "tags": [ "tag1", "tag2" ], "created_at": "2016-03-02T17:00:49Z" } }python-digitalocean-1.16.0/digitalocean/tests/data/volumes/snapshot.json000066400000000000000000000005221375555111600264730ustar00rootroot00000000000000{ "snapshot": { "id": "8fa70202-873f-11e6-8b68-000f533176b1", "name": "big-data-snapshot1475261774", "regions": [ "nyc1" ], "created_at": "2016-09-30T18:56:14Z", "resource_id": "82a48a18-873f-11e6-96bf-000f53315a41", "resource_type": "volume", "min_disk_size": 10, "size_gigabytes": 20.2 } } python-digitalocean-1.16.0/digitalocean/tests/data/volumes/snapshots.json000066400000000000000000000014021375555111600266540ustar00rootroot00000000000000{ "snapshots": [ { "id": "8eb4d51a-873f-11e6-96bf-000f53315a41", "name": "big-data-snapshot1475261752", "regions": [ "nyc1" ], "created_at": "2016-09-30T18:56:12Z", "resource_id": "82a48a18-873f-11e6-96bf-000f53315a41", "resource_type": "volume", "min_disk_size": 10, "size_gigabytes": 20.2 } ,{ "id": "8eb4d51a-873f-11e6-96bf-000f53315a42", "name": "big-data-snapshot1475261752-2", "regions": [ "nyc1" ], "created_at": "2016-08-30T18:56:12Z", "resource_id": "8eb4d51a-873f-11e6-96bf-000f53315a42", "resource_type": "volume", "min_disk_size": 20, "size_gigabytes": 40.4 } ], "links": { }, "meta": { "total": 1 } } python-digitalocean-1.16.0/digitalocean/tests/data/vpcs/000077500000000000000000000000001375555111600232235ustar00rootroot00000000000000python-digitalocean-1.16.0/digitalocean/tests/data/vpcs/list.json000066400000000000000000000017261375555111600250770ustar00rootroot00000000000000{ "vpcs": 
[ { "id": "5a4981aa-9653-4bd1-bef5-d6bff52042e4", "urn": "do:vpc:5a4981aa-9653-4bd1-bef5-d6bff52042e4", "name": "my-new-vpc", "description": "", "region": "nyc1", "ip_range": "10.10.10.0/24", "created_at": "2020-03-13T19:20:47Z", "default": false }, { "id": "e0fe0f4d-596a-465e-a902-571ce57b79fa", "urn": "do:vpc:e0fe0f4d-596a-465e-a902-571ce57b79fa", "name": "default-nyc1", "description": "", "region": "nyc1", "ip_range": "10.102.0.0/20", "created_at": "2020-03-13T19:29:20Z", "default": true }, { "id": "d455e75d-4858-4eec-8c95-da2f0a5f93a7", "urn": "do:vpc:d455e75d-4858-4eec-8c95-da2f0a5f93a7", "name": "default-nyc3", "description": "", "region": "nyc3", "ip_range": "10.100.0.0/20", "created_at": "2019-11-19T22:19:35Z", "default": true } ], "links": { }, "meta": { "total": 3 } }python-digitalocean-1.16.0/digitalocean/tests/data/vpcs/single.json000066400000000000000000000004451375555111600254020ustar00rootroot00000000000000{ "vpc": { "id": "5a4981aa-9653-4bd1-bef5-d6bff52042e4", "urn": "do:vpc:5a4981aa-9653-4bd1-bef5-d6bff52042e4", "name": "my-new-vpc", "description": "", "region": "nyc1", "ip_range": "10.10.10.0/24", "created_at": "2020-03-13T18:48:45Z", "default": false } }python-digitalocean-1.16.0/digitalocean/tests/test_action.py000066400000000000000000000030621375555111600242260ustar00rootroot00000000000000import unittest import responses import json import digitalocean from .BaseTest import BaseTest class TestAction(BaseTest): def setUp(self): super(TestAction, self).setUp() self.action = digitalocean.Action(id=39388122, token=self.token) @responses.activate def test_load_directly(self): data = self.load_from_file('actions/ipv6_completed.json') url = self.base_url + "actions/39388122" responses.add(responses.GET, url, body=data, status=200, content_type='application/json') self.action.load_directly() self.assert_get_url_equal(responses.calls[0].request.url, url) self.assertEqual(self.action.status, "completed") self.assertEqual(self.action.id, 39388122) 
self.assertEqual(self.action.region_slug, 'nyc3') @responses.activate def test_load_without_droplet_id(self): data = self.load_from_file('actions/ipv6_completed.json') url = self.base_url + "actions/39388122" responses.add(responses.GET, url, body=data, status=200, content_type='application/json') self.action.load() self.assert_get_url_equal(responses.calls[0].request.url, url) self.assertEqual(self.action.status, "completed") self.assertEqual(self.action.id, 39388122) self.assertEqual(self.action.region_slug, 'nyc3') if __name__ == '__main__': unittest.main() python-digitalocean-1.16.0/digitalocean/tests/test_baseapi.py000066400000000000000000000043011375555111600243520ustar00rootroot00000000000000import os try: import mock except ImportError: from unittest import mock import responses import requests import digitalocean from .BaseTest import BaseTest class TestBaseAPI(BaseTest): def setUp(self): super(TestBaseAPI, self).setUp() self.manager = digitalocean.Manager(token=self.token) self.user_agent = "{0}/{1} {2}/{3}".format('python-digitalocean', digitalocean.__version__, requests.__name__, requests.__version__) @responses.activate def test_user_agent(self): data = self.load_from_file('account/account.json') url = self.base_url + 'account/' responses.add(responses.GET, url, body=data, status=200, content_type='application/json') self.manager.get_account() self.assertEqual(responses.calls[0].request.headers['User-Agent'], self.user_agent) @responses.activate def test_customize_session(self): data = self.load_from_file('account/account.json') url = self.base_url + 'account/' responses.add(responses.GET, url, body=data, status=200, content_type='application/json') self.manager._session.proxies['https'] = 'https://127.0.0.1:3128' self.manager.get_account() def test_custom_endpoint(self): custom_endpoint = 'http://example.com/' with mock.patch.dict(os.environ, {'DIGITALOCEAN_END_POINT': custom_endpoint}, clear=True): base_api = digitalocean.baseapi.BaseAPI() 
self.assertEqual(base_api.end_point, custom_endpoint) def test_invalid_custom_endpoint(self): custom_endpoint = 'not a valid endpoint' with mock.patch.dict(os.environ, {'DIGITALOCEAN_END_POINT': custom_endpoint}, clear=True): self.assertRaises(digitalocean.EndPointError, digitalocean.baseapi.BaseAPI) python-digitalocean-1.16.0/digitalocean/tests/test_certificate.py000066400000000000000000000076271375555111600252460ustar00rootroot00000000000000import json import unittest import responses import digitalocean from .BaseTest import BaseTest class TestCertificate(BaseTest): def setUp(self): super(TestCertificate, self).setUp() self.cert_id = '892071a0-bb95-49bc-8021-3afd67a210bf' self.cert = digitalocean.Certificate(id=self.cert_id, token=self.token) @responses.activate def test_load(self): data = self.load_from_file('certificate/custom.json') url = self.base_url + 'certificates/' + self.cert_id responses.add(responses.GET, url, body=data, status=200, content_type='application/json') self.cert.load() self.assert_get_url_equal(responses.calls[0].request.url, url) self.assertEqual(self.cert.id, self.cert_id) self.assertEqual(self.cert.name, 'web-cert-01') self.assertEqual(self.cert.sha1_fingerprint, 'dfcc9f57d86bf58e321c2c6c31c7a971be244ac7') self.assertEqual(self.cert.not_after, '2017-02-22T00:23:00Z') self.assertEqual(self.cert.created_at, '2017-02-08T16:02:37Z') @responses.activate def test_create_custom(self): data = self.load_from_file('certificate/custom.json') url = self.base_url + 'certificates' responses.add(responses.POST, url, body=data, status=201, content_type='application/json') cert = digitalocean.Certificate(name='web-cert-01', private_key="a-b-c", leaf_certificate="e-f-g", certificate_chain="a-b-c\ne-f-g", type="custom", token=self.token).create() self.assertEqual(responses.calls[0].request.url, url) self.assertEqual(cert.id, '892071a0-bb95-49bc-8021-3afd67a210bf') self.assertEqual(cert.name, 'web-cert-01') self.assertEqual(cert.sha1_fingerprint, 
'dfcc9f57d86bf58e321c2c6c31c7a971be244ac7') self.assertEqual(cert.not_after, '2017-02-22T00:23:00Z') self.assertEqual(cert.created_at, '2017-02-08T16:02:37Z') self.assertEqual(cert.type, 'custom') @responses.activate def test_create_lets_encrypt(self): data = self.load_from_file('certificate/lets_encrpyt.json') url = self.base_url + 'certificates' responses.add(responses.POST, url, body=data, status=201, content_type='application/json') cert = digitalocean.Certificate(name='web-cert-02', dns_names=["www.example.com", "example.com"], type="lets_encrpyt", token=self.token).create() self.assertEqual(responses.calls[0].request.url, url) self.assertEqual(cert.id, 'ba9b9c18-6c59-46c2-99df-70da170a42ba') self.assertEqual(cert.name, 'web-cert-02') self.assertEqual(cert.sha1_fingerprint, '479c82b5c63cb6d3e6fac4624d58a33b267e166c') self.assertEqual(cert.not_after, '2018-06-07T17:44:12Z') self.assertEqual(cert.created_at, '2018-03-09T18:44:11Z') self.assertEqual(cert.type, 'lets_encrypt') self.assertEqual(cert.state, 'pending') @responses.activate def test_destroy(self): url = self.base_url + 'certificates/' + self.cert_id responses.add(responses.DELETE, url, status=204, content_type='application/json') self.cert.destroy() self.assertEqual(responses.calls[0].request.url, url) if __name__ == '__main__': unittest.main() python-digitalocean-1.16.0/digitalocean/tests/test_domain.py000066400000000000000000000140161375555111600242210ustar00rootroot00000000000000import json import unittest import responses import digitalocean from .BaseTest import BaseTest class TestDomain(BaseTest): def setUp(self): super(TestDomain, self).setUp() self.domain = digitalocean.Domain(name='example.com', token=self.token) @responses.activate def test_load(self): data = self.load_from_file('domains/single.json') url = self.base_url + "domains/example.com" responses.add(responses.GET, url, body=data, status=200, content_type='application/json') domain = digitalocean.Domain(name='example.com', 
token=self.token) domain.load() self.assert_get_url_equal(responses.calls[0].request.url, url) self.assertEqual(domain.name, "example.com") self.assertEqual(domain.ttl, 1800) @responses.activate def test_destroy(self): url = self.base_url + "domains/example.com" responses.add(responses.DELETE, url, status=204, content_type='application/json') self.domain.destroy() self.assertEqual(responses.calls[0].request.url, url) @responses.activate def test_create_new_domain_record(self): data = self.load_from_file('domains/create_record.json') url = self.base_url + "domains/example.com/records" responses.add(responses.POST, url, body=data, status=201, content_type='application/json') response = self.domain.create_new_domain_record( type="CNAME", name="www", data="@") self.assert_url_query_equal( responses.calls[0].request.url, self.base_url + "domains/example.com/records") self.assertEqual(json.loads(responses.calls[0].request.body), {"type": "CNAME", "data": "@", "name": "www"}) self.assertEqual(response['domain_record']['type'], "CNAME") self.assertEqual(response['domain_record']['name'], "www") self.assertEqual(response['domain_record']['data'], "@") self.assertEqual(response['domain_record']['ttl'], 600) @responses.activate def test_create_new_srv_record_zero_priority(self): data = self.load_from_file('domains/create_srv_record.json') url = self.base_url + "domains/example.com/records" responses.add(responses.POST, url, body=data, status=201, content_type='application/json') response = self.domain.create_new_domain_record( type="SRV", name="service", data="service", priority=0, weight=0) self.assert_url_query_equal( responses.calls[0].request.url, self.base_url + "domains/example.com/records") self.assertEqual(response['domain_record']['type'], "SRV") self.assertEqual(response['domain_record']['name'], "service") self.assertEqual(response['domain_record']['data'], "service") self.assertEqual(response['domain_record']['priority'], 0) 
self.assertEqual(response['domain_record']['weight'], 0) @responses.activate def test_create_new_caa_record_zero_flags(self): data = self.load_from_file('domains/create_caa_record.json') url = self.base_url + "domains/example.com/records" responses.add(responses.POST, url, body=data, status=201, content_type='application/json') response = self.domain.create_new_domain_record( type="CAA", name="@", data="letsencrypt.org.", ttl=1800, flags=0, tag="issue") self.assert_url_query_equal( responses.calls[0].request.url, self.base_url + "domains/example.com/records") self.assertEqual(response['domain_record']['type'], "CAA") self.assertEqual(response['domain_record']['name'], "@") self.assertEqual(response['domain_record']['data'], "letsencrypt.org.") self.assertEqual(response['domain_record']['ttl'], 1800) self.assertEqual(response['domain_record']['flags'], 0) self.assertEqual(response['domain_record']['tag'], "issue") @responses.activate def test_create(self): data = self.load_from_file('domains/create.json') url = self.base_url + "domains" responses.add(responses.POST, url, body=data, status=201, content_type='application/json') domain = digitalocean.Domain(name="example.com", ip_address="1.1.1.1", token=self.token).create() self.assert_url_query_equal( responses.calls[0].request.url, self.base_url + "domains") self.assertEqual(json.loads(responses.calls[0].request.body), {'ip_address': '1.1.1.1', 'name': 'example.com'}) self.assertEqual(domain['domain']['name'], "example.com") self.assertEqual(domain['domain']['ttl'], 1800) @responses.activate def test_get_records(self): data = self.load_from_file('domains/records.json') url = self.base_url + "domains/example.com/records/" responses.add(responses.GET, url, body=data, status=200, content_type='application/json') records = self.domain.get_records() self.assert_get_url_equal(responses.calls[0].request.url, url) self.assertEqual(len(records), 6) self.assertEqual(records[0].type, "A") self.assertEqual(records[0].name, "@") 
self.assertEqual(records[4].type, "CNAME") self.assertEqual(records[4].name, "example") self.assertEqual(records[4].ttl, 600) self.assertEqual(records[5].data, "letsencrypt.org.") if __name__ == '__main__': unittest.main() python-digitalocean-1.16.0/digitalocean/tests/test_droplet.py000066400000000000000000001243101375555111600244220ustar00rootroot00000000000000import json import unittest import responses import digitalocean from .BaseTest import BaseTest class TestDroplet(BaseTest): @responses.activate def setUp(self): super(TestDroplet, self).setUp() self.actions_url = self.base_url + "droplets/12345/actions/" data = self.load_from_file('droplets/single.json') url = self.base_url + "droplets/12345" responses.add(responses.GET, url, body=data, status=200, content_type='application/json') self.droplet = digitalocean.Droplet(id='12345', token=self.token).load() @responses.activate def test_load(self): data = self.load_from_file('droplets/single.json') url = self.base_url + "droplets/12345" responses.add(responses.GET, url, body=data, status=200, content_type='application/json') droplet = digitalocean.Droplet(id='12345', token=self.token) d = droplet.load() self.assert_get_url_equal(responses.calls[0].request.url, url) self.assertEqual(d.id, 12345) self.assertEqual(d.name, "example.com") self.assertEqual(d.memory, 512) self.assertEqual(d.vcpus, 1) self.assertEqual(d.disk, 20) self.assertEqual(d.backups, False) self.assertEqual(d.ipv6, True) self.assertEqual(d.private_networking, False) self.assertEqual(d.region['slug'], "nyc3") self.assertEqual(d.status, "active") self.assertEqual(d.image['slug'], "ubuntu-14-04-x64") self.assertEqual(d.size_slug, '512mb') self.assertEqual(d.created_at, "2014-11-14T16:36:31Z") self.assertEqual(d.ip_address, "104.131.186.241") self.assertEqual(d.ip_v6_address, "2604:A880:0800:0010:0000:0000:031D:2001") self.assertEqual(d.kernel['id'], 2233) self.assertEqual(d.features, ["ipv6", "virtio"]) self.assertEqual(d.tags, []) 
self.assertEqual(d.vpc_uuid, "08187eaa-90eb-40d6-a8f0-0222b28ded72") @responses.activate def test_power_off(self): data = self.load_from_file('droplet_actions/power_off.json') responses.add(responses.POST, self.actions_url, body=data, status=201, content_type='application/json') response = self.droplet.power_off() self.assertEqual(responses.calls[0].request.url, self.actions_url) self.assertEqual(json.loads(responses.calls[0].request.body), {"type": "power_off"}) self.assertEqual(response['action']['id'], 54321) self.assertEqual(response['action']['status'], "in-progress") self.assertEqual(response['action']['type'], "power_off") self.assertEqual(response['action']['resource_id'], 12345) self.assertEqual(response['action']['resource_type'], "droplet") @responses.activate def test_power_off_action(self): data = self.load_from_file('droplet_actions/power_off.json') responses.add(responses.POST, self.actions_url, body=data, status=201, content_type='application/json') response = self.droplet.power_off(False) self.assertEqual(responses.calls[0].request.url, self.actions_url) self.assertEqual(json.loads(responses.calls[0].request.body), {"type": "power_off"}) self.assertEqual(response.id, 54321) self.assertEqual(response.status, "in-progress") self.assertEqual(response.type, "power_off") self.assertEqual(response.resource_id, 12345) self.assertEqual(response.resource_type, "droplet") @responses.activate def test_power_on(self): data = self.load_from_file('droplet_actions/power_on.json') responses.add(responses.POST, self.actions_url, body=data, status=201, content_type='application/json') response = self.droplet.power_on() self.assertEqual(responses.calls[0].request.url, self.actions_url) self.assertEqual(json.loads(responses.calls[0].request.body), {"type": "power_on"}) self.assertEqual(response['action']['id'], 54321) self.assertEqual(response['action']['status'], "in-progress") self.assertEqual(response['action']['type'], "power_on") 
self.assertEqual(response['action']['resource_id'], 12345) self.assertEqual(response['action']['resource_type'], "droplet") @responses.activate def test_power_on_action(self): data = self.load_from_file('droplet_actions/power_on.json') responses.add(responses.POST, self.actions_url, body=data, status=201, content_type='application/json') response = self.droplet.power_on(return_dict=False) self.assertEqual(responses.calls[0].request.url, self.actions_url) self.assertEqual(json.loads(responses.calls[0].request.body), {"type": "power_on"}) self.assertEqual(response.id, 54321) self.assertEqual(response.status, "in-progress") self.assertEqual(response.type, "power_on") self.assertEqual(response.resource_id, 12345) self.assertEqual(response.resource_type, "droplet") @responses.activate def test_shutdown(self): data = self.load_from_file('droplet_actions/shutdown.json') responses.add(responses.POST, self.actions_url, body=data, status=201, content_type='application/json') response = self.droplet.shutdown() self.assertEqual(responses.calls[0].request.url, self.actions_url) self.assertEqual(json.loads(responses.calls[0].request.body), {"type": "shutdown"}) self.assertEqual(response['action']['id'], 54321) self.assertEqual(response['action']['status'], "in-progress") self.assertEqual(response['action']['type'], "shutdown") self.assertEqual(response['action']['resource_id'], 12345) self.assertEqual(response['action']['resource_type'], "droplet") @responses.activate def test_shutdown_action(self): data = self.load_from_file('droplet_actions/shutdown.json') responses.add(responses.POST, self.actions_url, body=data, status=201, content_type='application/json') response = self.droplet.shutdown(return_dict=False) self.assertEqual(responses.calls[0].request.url, self.actions_url) self.assertEqual(json.loads(responses.calls[0].request.body), {"type": "shutdown"}) self.assertEqual(response.id, 54321) self.assertEqual(response.status, "in-progress") self.assertEqual(response.type, 
"shutdown") self.assertEqual(response.resource_id, 12345) self.assertEqual(response.resource_type, "droplet") @responses.activate def test_reboot(self): data = self.load_from_file('droplet_actions/reboot.json') responses.add(responses.POST, self.actions_url, body=data, status=201, content_type='application/json') response = self.droplet.reboot() self.assertEqual(responses.calls[0].request.url, self.actions_url) self.assertEqual(json.loads(responses.calls[0].request.body), {"type": "reboot"}) self.assertEqual(response['action']['id'], 54321) self.assertEqual(response['action']['status'], "in-progress") self.assertEqual(response['action']['type'], "reboot") self.assertEqual(response['action']['resource_id'], 12345) self.assertEqual(response['action']['resource_type'], "droplet") @responses.activate def test_reboot_action(self): data = self.load_from_file('droplet_actions/reboot.json') responses.add(responses.POST, self.actions_url, body=data, status=201, content_type='application/json') response = self.droplet.reboot(return_dict=False) self.assertEqual(responses.calls[0].request.url, self.actions_url) self.assertEqual(json.loads(responses.calls[0].request.body), {"type": "reboot"}) self.assertEqual(response.id, 54321) self.assertEqual(response.status, "in-progress") self.assertEqual(response.type, "reboot") self.assertEqual(response.resource_id, 12345) self.assertEqual(response.resource_type, "droplet") @responses.activate def test_power_cycle(self): data = self.load_from_file('droplet_actions/power_cycle.json') responses.add(responses.POST, self.actions_url, body=data, status=201, content_type='application/json') response = self.droplet.power_cycle() self.assertEqual(responses.calls[0].request.url, self.actions_url) self.assertEqual(json.loads(responses.calls[0].request.body), {"type": "power_cycle"}) self.assertEqual(response['action']['id'], 54321) self.assertEqual(response['action']['status'], "in-progress") self.assertEqual(response['action']['type'], 
"power_cycle") self.assertEqual(response['action']['resource_id'], 12345) self.assertEqual(response['action']['resource_type'], "droplet") @responses.activate def test_power_cycle_action(self): data = self.load_from_file('droplet_actions/power_cycle.json') responses.add(responses.POST, self.actions_url, body=data, status=201, content_type='application/json') response = self.droplet.power_cycle(return_dict=False) self.assertEqual(responses.calls[0].request.url, self.actions_url) self.assertEqual(json.loads(responses.calls[0].request.body), {"type": "power_cycle"}) self.assertEqual(response.id, 54321) self.assertEqual(response.status, "in-progress") self.assertEqual(response.type, "power_cycle") self.assertEqual(response.resource_id, 12345) self.assertEqual(response.resource_type, "droplet") @responses.activate def test_reset_root_password(self): data = self.load_from_file('droplet_actions/password_reset.json') responses.add(responses.POST, self.actions_url, body=data, status=201, content_type='application/json') response = self.droplet.reset_root_password() self.assertEqual(responses.calls[0].request.url, self.actions_url) self.assertEqual(json.loads(responses.calls[0].request.body), {"type": "password_reset"}) self.assertEqual(response['action']['id'], 54321) self.assertEqual(response['action']['status'], "in-progress") self.assertEqual(response['action']['type'], "password_reset") self.assertEqual(response['action']['resource_id'], 12345) self.assertEqual(response['action']['resource_type'], "droplet") @responses.activate def test_reset_root_password_action(self): data = self.load_from_file('droplet_actions/password_reset.json') responses.add(responses.POST, self.actions_url, body=data, status=201, content_type='application/json') response = self.droplet.reset_root_password(return_dict=False) self.assertEqual(responses.calls[0].request.url, self.actions_url) self.assertEqual(json.loads(responses.calls[0].request.body), {"type": "password_reset"}) 
        self.assertEqual(response.id, 54321)
        self.assertEqual(response.status, "in-progress")
        self.assertEqual(response.type, "password_reset")
        self.assertEqual(response.resource_id, 12345)
        self.assertEqual(response.resource_type, "droplet")

    @responses.activate
    def test_take_snapshot(self):
        data = self.load_from_file('droplet_actions/snapshot.json')

        responses.add(responses.POST, self.actions_url, body=data,
                      status=201, content_type='application/json')

        response = self.droplet.take_snapshot("New Snapshot")

        self.assert_url_query_equal(responses.calls[0].request.url,
                                    self.actions_url)
        self.assertEqual(json.loads(responses.calls[0].request.body),
                         {"type": "snapshot", "name": "New Snapshot"})
        self.assertEqual(response['action']['id'], 54321)
        self.assertEqual(response['action']['status'], "in-progress")
        self.assertEqual(response['action']['type'], "snapshot")
        self.assertEqual(response['action']['resource_id'], 12345)
        self.assertEqual(response['action']['resource_type'], "droplet")

    @responses.activate
    def test_take_snapshot_action(self):
        data = self.load_from_file('droplet_actions/snapshot.json')

        responses.add(responses.POST, self.actions_url, body=data,
                      status=201, content_type='application/json')

        response = self.droplet.take_snapshot("New Snapshot",
                                              return_dict=False)

        self.assert_url_query_equal(responses.calls[0].request.url,
                                    self.actions_url)
        self.assertEqual(json.loads(responses.calls[0].request.body),
                         {"type": "snapshot", "name": "New Snapshot"})
        self.assertEqual(response.id, 54321)
        self.assertEqual(response.status, "in-progress")
        self.assertEqual(response.type, "snapshot")
        self.assertEqual(response.resource_id, 12345)
        self.assertEqual(response.resource_type, "droplet")

    @responses.activate
    def test_resize(self):
        data = self.load_from_file('droplet_actions/resize.json')

        responses.add(responses.POST, self.actions_url, body=data,
                      status=201, content_type='application/json')

        response = self.droplet.resize("64gb")

        self.assert_url_query_equal(responses.calls[0].request.url,
                                    self.actions_url)
        self.assertEqual(json.loads(responses.calls[0].request.body),
                         {"type": "resize", "size": "64gb", "disk": "true"})
        self.assertEqual(response['action']['id'], 54321)
        self.assertEqual(response['action']['status'], "in-progress")
        self.assertEqual(response['action']['type'], "resize")
        self.assertEqual(response['action']['resource_id'], 12345)
        self.assertEqual(response['action']['resource_type'], "droplet")

    @responses.activate
    def test_resize_action(self):
        data = self.load_from_file('droplet_actions/resize.json')

        responses.add(responses.POST, self.actions_url, body=data,
                      status=201, content_type='application/json')

        response = self.droplet.resize("64gb", False)

        self.assert_url_query_equal(responses.calls[0].request.url,
                                    self.actions_url)
        self.assertEqual(json.loads(responses.calls[0].request.body),
                         {"type": "resize", "size": "64gb", "disk": "true"})
        self.assertEqual(response.id, 54321)
        self.assertEqual(response.status, "in-progress")
        self.assertEqual(response.type, "resize")
        self.assertEqual(response.resource_id, 12345)
        self.assertEqual(response.resource_type, "droplet")

    @responses.activate
    def test_restore(self):
        data = self.load_from_file('droplet_actions/restore.json')

        responses.add(responses.POST, self.actions_url, body=data,
                      status=201, content_type='application/json')

        response = self.droplet.restore(image_id=78945)

        self.assert_url_query_equal(responses.calls[0].request.url,
                                    self.actions_url)
        self.assertEqual(json.loads(responses.calls[0].request.body),
                         {"image": 78945, "type": "restore"})
        self.assertEqual(response['action']['id'], 54321)
        self.assertEqual(response['action']['status'], "in-progress")
        self.assertEqual(response['action']['type'], "restore")
        self.assertEqual(response['action']['resource_id'], 12345)
        self.assertEqual(response['action']['resource_type'], "droplet")

    @responses.activate
    def test_restore_action(self):
        data = self.load_from_file('droplet_actions/restore.json')

        responses.add(responses.POST, self.actions_url, body=data,
                      status=201, content_type='application/json')

        response = self.droplet.restore(image_id=78945, return_dict=False)

        self.assert_url_query_equal(responses.calls[0].request.url,
                                    self.actions_url)
        self.assertEqual(json.loads(responses.calls[0].request.body),
                         {"image": 78945, "type": "restore"})
        self.assertEqual(response.id, 54321)
        self.assertEqual(response.status, "in-progress")
        self.assertEqual(response.type, "restore")
        self.assertEqual(response.resource_id, 12345)
        self.assertEqual(response.resource_type, "droplet")

    @responses.activate
    def test_rebuild_passing_image(self):
        """
        Test rebuilding a droplet from a provided image id.
        """
        data = self.load_from_file('droplet_actions/rebuild.json')

        responses.add(responses.POST, self.actions_url, body=data,
                      status=201, content_type='application/json')

        response = self.droplet.rebuild(image_id=78945)

        self.assert_url_query_equal(responses.calls[0].request.url,
                                    self.actions_url)
        self.assertEqual(json.loads(responses.calls[0].request.body),
                         {"image": 78945, "type": "rebuild"})
        self.assertEqual(response['action']['id'], 54321)
        self.assertEqual(response['action']['status'], "in-progress")
        self.assertEqual(response['action']['type'], "rebuild")
        self.assertEqual(response['action']['resource_id'], 12345)
        self.assertEqual(response['action']['resource_type'], "droplet")

    @responses.activate
    def test_rebuild_passing_image_action(self):
        """
        Test rebuilding a droplet from a provided image id.
        """
        data = self.load_from_file('droplet_actions/rebuild.json')

        responses.add(responses.POST, self.actions_url, body=data,
                      status=201, content_type='application/json')

        response = self.droplet.rebuild(image_id=78945, return_dict=False)

        self.assert_url_query_equal(responses.calls[0].request.url,
                                    self.actions_url)
        self.assertEqual(json.loads(responses.calls[0].request.body),
                         {"image": 78945, "type": "rebuild"})
        self.assertEqual(response.id, 54321)
        self.assertEqual(response.status, "in-progress")
        self.assertEqual(response.type, "rebuild")
        self.assertEqual(response.resource_id, 12345)
        self.assertEqual(response.resource_type, "droplet")

    @responses.activate
    def test_rebuild_not_passing_image(self):
        """
        Test rebuilding a droplet from its original parent image id.
        """
        data = self.load_from_file('droplet_actions/rebuild.json')

        responses.add(responses.POST, self.actions_url, body=data,
                      status=201, content_type='application/json')

        response = self.droplet.rebuild()

        self.assert_url_query_equal(responses.calls[0].request.url,
                                    self.actions_url)
        self.assertEqual(json.loads(responses.calls[0].request.body),
                         {"image": 6918990, "type": "rebuild"})
        self.assertEqual(response['action']['id'], 54321)
        self.assertEqual(response['action']['status'], "in-progress")
        self.assertEqual(response['action']['type'], "rebuild")
        self.assertEqual(response['action']['resource_id'], 12345)
        self.assertEqual(response['action']['resource_type'], "droplet")

    @responses.activate
    def test_rebuild_not_passing_image_action(self):
        """
        Test rebuilding a droplet from its original parent image id.
        """
        data = self.load_from_file('droplet_actions/rebuild.json')

        responses.add(responses.POST, self.actions_url, body=data,
                      status=201, content_type='application/json')

        response = self.droplet.rebuild(return_dict=False)

        self.assert_url_query_equal(responses.calls[0].request.url,
                                    self.actions_url)
        self.assertEqual(json.loads(responses.calls[0].request.body),
                         {"image": 6918990, "type": "rebuild"})
        self.assertEqual(response.id, 54321)
        self.assertEqual(response.status, "in-progress")
        self.assertEqual(response.type, "rebuild")
        self.assertEqual(response.resource_id, 12345)
        self.assertEqual(response.resource_type, "droplet")

    @responses.activate
    def test_enable_backups(self):
        data = self.load_from_file('droplet_actions/enable_backups.json')

        responses.add(responses.POST, self.actions_url, body=data,
                      status=201, content_type='application/json')

        response = self.droplet.enable_backups()

        self.assertEqual(responses.calls[0].request.url, self.actions_url)
        self.assertEqual(json.loads(responses.calls[0].request.body),
                         {"type": "enable_backups"})
        self.assertEqual(response['action']['id'], 54321)
        self.assertEqual(response['action']['status'], "in-progress")
        self.assertEqual(response['action']['type'], "enable_backups")
        self.assertEqual(response['action']['resource_id'], 12345)
        self.assertEqual(response['action']['resource_type'], "droplet")

    @responses.activate
    def test_disable_backups(self):
        data = self.load_from_file('droplet_actions/disable_backups.json')

        responses.add(responses.POST, self.actions_url, body=data,
                      status=201, content_type='application/json')

        response = self.droplet.disable_backups()

        self.assertEqual(responses.calls[0].request.url, self.actions_url)
        self.assertEqual(json.loads(responses.calls[0].request.body),
                         {"type": "disable_backups"})
        self.assertEqual(response['action']['id'], 54321)
        self.assertEqual(response['action']['status'], "in-progress")
        self.assertEqual(response['action']['type'], "disable_backups")
        self.assertEqual(response['action']['resource_id'], 12345)
        self.assertEqual(response['action']['resource_type'], "droplet")

    @responses.activate
    def test_disable_backups_action(self):
        data = self.load_from_file('droplet_actions/disable_backups.json')

        responses.add(responses.POST, self.actions_url, body=data,
                      status=201, content_type='application/json')

        response = self.droplet.disable_backups(return_dict=False)

        self.assertEqual(responses.calls[0].request.url, self.actions_url)
        self.assertEqual(json.loads(responses.calls[0].request.body),
                         {"type": "disable_backups"})
        self.assertEqual(response.id, 54321)
        self.assertEqual(response.status, "in-progress")
        self.assertEqual(response.type, "disable_backups")
        self.assertEqual(response.resource_id, 12345)
        self.assertEqual(response.resource_type, "droplet")

    @responses.activate
    def test_destroy(self):
        url = self.base_url + "droplets/12345"

        responses.add(responses.DELETE, url,
                      status=204, content_type='application/json')

        self.droplet.destroy()

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url + "droplets/12345")

    @responses.activate
    def test_rename(self):
        data = self.load_from_file('droplet_actions/rename.json')

        responses.add(responses.POST, self.actions_url, body=data,
                      status=201, content_type='application/json')

        response = self.droplet.rename(name="New Name")

        self.assert_url_query_equal(responses.calls[0].request.url,
                                    self.actions_url)
        self.assertEqual(json.loads(responses.calls[0].request.body),
                         {"type": "rename", "name": "New Name"})
        self.assertEqual(response['action']['id'], 54321)
        self.assertEqual(response['action']['status'], "in-progress")
        self.assertEqual(response['action']['type'], "rename")
        self.assertEqual(response['action']['resource_id'], 12345)
        self.assertEqual(response['action']['resource_type'], "droplet")

    @responses.activate
    def test_rename_action(self):
        data = self.load_from_file('droplet_actions/rename.json')

        responses.add(responses.POST, self.actions_url, body=data,
                      status=201, content_type='application/json')

        response = self.droplet.rename(name="New Name", return_dict=False)

        self.assert_url_query_equal(responses.calls[0].request.url,
                                    self.actions_url)
        self.assertEqual(json.loads(responses.calls[0].request.body),
                         {"type": "rename", "name": "New Name"})
        self.assertEqual(response.id, 54321)
        self.assertEqual(response.status, "in-progress")
        self.assertEqual(response.type, "rename")
        self.assertEqual(response.resource_id, 12345)
        self.assertEqual(response.resource_type, "droplet")

    @responses.activate
    def test_enable_private_networking(self):
        data = self.load_from_file(
            'droplet_actions/enable_private_networking.json')

        responses.add(responses.POST, self.actions_url, body=data,
                      status=201, content_type='application/json')

        response = self.droplet.enable_private_networking()

        self.assertEqual(responses.calls[0].request.url, self.actions_url)
        self.assertEqual(json.loads(responses.calls[0].request.body),
                         {"type": "enable_private_networking"})
        self.assertEqual(response['action']['id'], 54321)
        self.assertEqual(response['action']['status'], "in-progress")
        self.assertEqual(response['action']['type'],
                         "enable_private_networking")
        self.assertEqual(response['action']['resource_id'], 12345)
        self.assertEqual(response['action']['resource_type'], "droplet")

    @responses.activate
    def test_enable_private_networking_action(self):
        data = self.load_from_file(
            'droplet_actions/enable_private_networking.json')

        responses.add(responses.POST, self.actions_url, body=data,
                      status=201, content_type='application/json')

        response = self.droplet.enable_private_networking(return_dict=False)

        self.assertEqual(responses.calls[0].request.url, self.actions_url)
        self.assertEqual(json.loads(responses.calls[0].request.body),
                         {"type": "enable_private_networking"})
        self.assertEqual(response.id, 54321)
        self.assertEqual(response.status, "in-progress")
        self.assertEqual(response.type, "enable_private_networking")
        self.assertEqual(response.resource_id, 12345)
        self.assertEqual(response.resource_type, "droplet")

    @responses.activate
    def test_enable_ipv6(self):
        data = self.load_from_file('droplet_actions/enable_ipv6.json')

        responses.add(responses.POST, self.actions_url, body=data,
                      status=201, content_type='application/json')

        response = self.droplet.enable_ipv6()

        self.assertEqual(responses.calls[0].request.url, self.actions_url)
        self.assertEqual(json.loads(responses.calls[0].request.body),
                         {"type": "enable_ipv6"})
        self.assertEqual(response['action']['id'], 54321)
        self.assertEqual(response['action']['status'], "in-progress")
        self.assertEqual(response['action']['type'], "enable_ipv6")
        self.assertEqual(response['action']['resource_id'], 12345)
        self.assertEqual(response['action']['resource_type'], "droplet")

    @responses.activate
    def test_enable_ipv6_action(self):
        data = self.load_from_file('droplet_actions/enable_ipv6.json')

        responses.add(responses.POST, self.actions_url, body=data,
                      status=201, content_type='application/json')

        response = self.droplet.enable_ipv6(return_dict=False)

        self.assertEqual(responses.calls[0].request.url, self.actions_url)
        self.assertEqual(json.loads(responses.calls[0].request.body),
                         {"type": "enable_ipv6"})
        self.assertEqual(response.id, 54321)
        self.assertEqual(response.status, "in-progress")
        self.assertEqual(response.type, "enable_ipv6")
        self.assertEqual(response.resource_id, 12345)
        self.assertEqual(response.resource_type, "droplet")

    def test_change_kernel_exception(self):
        with self.assertRaises(Exception) as error:
            self.droplet.change_kernel(kernel=123)

        exception = error.exception
        self.assertEqual(str(exception), 'Use Kernel object')

    @responses.activate
    def test_change_kernel(self):
        data = self.load_from_file('droplet_actions/change_kernel.json')

        responses.add(responses.POST, self.actions_url, body=data,
                      status=201, content_type='application/json')

        response = self.droplet.change_kernel(digitalocean.Kernel(id=123))

        self.assert_url_query_equal(responses.calls[0].request.url,
                                    self.actions_url)
        self.assertEqual(json.loads(responses.calls[0].request.body),
                         {u"kernel": 123, u"type": u"change_kernel"})
        self.assertEqual(response['action']['id'], 54321)
        self.assertEqual(response['action']['status'], "in-progress")
        self.assertEqual(response['action']['type'], "change_kernel")
        self.assertEqual(response['action']['resource_id'], 12345)
        self.assertEqual(response['action']['resource_type'], "droplet")

    @responses.activate
    def test_change_kernel_action(self):
        data = self.load_from_file('droplet_actions/change_kernel.json')

        responses.add(responses.POST, self.actions_url, body=data,
                      status=201, content_type='application/json')

        response = self.droplet.change_kernel(digitalocean.Kernel(id=123),
                                              return_dict=False)

        self.assert_url_query_equal(responses.calls[0].request.url,
                                    self.actions_url)
        self.assertEqual(json.loads(responses.calls[0].request.body),
                         {u"kernel": 123, u"type": u"change_kernel"})
        self.assertEqual(response.id, 54321)
        self.assertEqual(response.status, "in-progress")
        self.assertEqual(response.type, "change_kernel")
        self.assertEqual(response.resource_id, 12345)
        self.assertEqual(response.resource_type, "droplet")

    @responses.activate
    def test_create_no_keys(self):
        data = self.load_from_file('droplet_actions/create.json')
        url = self.base_url + "droplets/"

        responses.add(responses.POST, url, body=data,
                      status=202, content_type='application/json')

        droplet = digitalocean.Droplet(
            name="example.com",
            size_slug="512mb",
            image="ubuntu-14-04-x64",
            region="nyc3",
            backups=True,
            ipv6=True,
            private_networking=False,
            monitoring=True,
            user_data="Some user data.",
            token=self.token,
            tags=["web"],
            vpc_uuid="08187eaa-90eb-40d6-a8f0-0222b28ded72")
        droplet.create()

        self.assert_url_query_equal(responses.calls[0].request.url, url)
        self.maxDiff = None
        self.assertEqual(
            json.loads(responses.calls[0].request.body),
            {u"name": u"example.com", u"region": u"nyc3",
             u"user_data": u"Some user data.", u"ipv6": True,
             u"private_networking": False, u"monitoring": True,
             u"backups": True, u"image": u"ubuntu-14-04-x64",
             u"size": u"512mb", u"ssh_keys": [], u"volumes": [],
             u"tags": ["web"],
             u"vpc_uuid": "08187eaa-90eb-40d6-a8f0-0222b28ded72"})
        self.assertEqual(droplet.id, 3164494)
        self.assertEqual(droplet.action_ids, [36805096])

    @responses.activate
    def test_create_multiple_no_keys(self):
        data = self.load_from_file('droplet_actions/create_multiple.json')
        url = self.base_url + "droplets/"

        responses.add(responses.POST, url, body=data,
                      status=202, content_type='application/json')

        droplets = digitalocean.Droplet.create_multiple(
            names=["example.com", "example2.com"],
            size_slug="512mb",
            image="ubuntu-14-04-x64",
            region="nyc3",
            backups=True,
            ipv6=True,
            private_networking=False,
            monitoring=True,
            user_data="Some user data.",
            token=self.token,
            tags=["web"],
            vpc_uuid="08187eaa-90eb-40d6-a8f0-0222b28ded72")

        self.assert_url_query_equal(responses.calls[0].request.url, url)
        self.assertEqual(len(droplets), 2)
        self.assertEqual(droplets[0].id, 3164494)
        self.assertEqual(droplets[1].id, 3164495)
        self.assertEqual(droplets[0].action_ids, [36805096])
        self.assertEqual(droplets[1].action_ids, [36805096])
        self.maxDiff = None
        self.assertEqual(
            json.loads(responses.calls[0].request.body),
            {u"names": [u"example.com", u"example2.com"],
             u"region": u"nyc3", u"user_data": u"Some user data.",
             u"ipv6": True, u"private_networking": False,
             u"monitoring": True, u"backups": True,
             u"image": u"ubuntu-14-04-x64", u"size": u"512mb",
             u"tags": ["web"],
             u"vpc_uuid": "08187eaa-90eb-40d6-a8f0-0222b28ded72"})

    @responses.activate
    def test_get_actions(self):
        data = self.load_from_file('actions/multi.json')
        create = self.load_from_file('actions/create_completed.json')
        ipv6 = self.load_from_file('actions/ipv6_completed.json')

        responses.add(responses.GET, self.actions_url, body=data,
                      status=200, content_type='application/json')
        responses.add(responses.GET, self.actions_url + "39388122",
                      body=create, status=200,
                      content_type='application/json')
        responses.add(responses.GET, self.actions_url + "39290099",
                      body=ipv6, status=200,
                      content_type='application/json')

        actions = self.droplet.get_actions()
        self.assertEqual(len(actions), 2)
        self.assertEqual(len(responses.calls), 3)
        self.assert_get_url_equal(responses.calls[0].request.url,
                                  self.actions_url)
        self.assert_get_url_equal(responses.calls[1].request.url,
                                  self.actions_url + "39388122")
        self.assert_get_url_equal(responses.calls[2].request.url,
                                  self.actions_url + "39290099")
        self.assertEqual(actions[0].id, 39290099)
        self.assertEqual(actions[0].type, "create")
        self.assertEqual(actions[0].status, "completed")
        self.assertEqual(actions[1].id, 39388122)
        self.assertEqual(actions[1].type, "enable_ipv6")
        self.assertEqual(actions[1].status, "completed")

    @responses.activate
    def test_get_action(self):
        data = self.load_from_file('actions/create_completed.json')
        url = self.base_url + "actions/39388122"

        responses.add(responses.GET, url, body=data,
                      status=200, content_type='application/json')

        action = self.droplet.get_action(39388122)

        self.assert_get_url_equal(responses.calls[0].request.url, url)
        self.assertEqual(action.id, 39290099)
        self.assertEqual(action.type, "create")
        self.assertEqual(action.status, "completed")

    def test_get_snapshots(self):
        snapshots = self.droplet.get_snapshots()
        self.assertEqual(len(snapshots), 1)
        self.assertEqual(snapshots[0].id, 7938206)

    @responses.activate
    def test_get_kernel_available_no_pages(self):
        data = self.load_from_file('kernels/list.json')
        url = self.base_url + "droplets/12345/kernels/"

        responses.add(responses.GET, url, body=data,
                      status=200, content_type='application/json')

        kernels = self.droplet.get_kernel_available()

        self.assert_get_url_equal(responses.calls[0].request.url, url)
        self.assertEqual(len(kernels), 2)
        self.assertEqual(kernels[0].id, 61833229)
        self.assertEqual(kernels[0].name,
                         "Ubuntu 14.04 x32 vmlinuz-3.13.0-24-generic")

    @responses.activate
    def test_get_kernel_available_with_pages(self):
        one = self.load_from_file('kernels/page_one.json')
        two = self.load_from_file('kernels/page_two.json')

        url_0 = self.base_url + "droplets/12345/kernels/"
        responses.add(responses.GET, url_0, body=one,
                      status=200, content_type='application/json')

        url_1 = self.base_url + "droplets/12345/kernels?page=2&per_page=200"
        responses.add(responses.GET, url_1, body=two,
                      status=200, content_type='application/json',
                      match_querystring=True)

        kernels = self.droplet.get_kernel_available()

        self.assert_get_url_equal(responses.calls[0].request.url, url_0)
        self.assert_url_query_equal(responses.calls[1].request.url, url_1)
        self.assertEqual(len(kernels), 3)
        self.assertEqual(kernels[0].id, 61833229)
        self.assertEqual(kernels[0].name,
                         "Ubuntu 14.04 x32 vmlinuz-3.13.0-24-generic")
        self.assertEqual(kernels[2].id, 231)
        self.assertEqual(kernels[2].name,
                         "Ubuntu 14.04 x64 vmlinuz-3.13.0-32-generic")

    @responses.activate
    def test_update_volumes_data(self):
        droplet_response = self.load_from_file('droplets/single.json')
        volume_response = self.load_from_file('volumes/single.json')
        url_droplet = self.base_url + "droplets/12345"
        url_volume = (self.base_url +
                      "volumes/506f78a4-e098-11e5-ad9f-000f53306ae1")

        responses.add(responses.GET, url_droplet, body=droplet_response,
                      status=200, content_type='application/json')
        responses.add(responses.GET, url_volume, body=volume_response,
                      status=200, content_type='application/json')

        droplet = digitalocean.Droplet(id='12345', token=self.token)
        d = droplet.load()
        d.update_volumes_data()

        self.assert_get_url_equal(responses.calls[0].request.url,
                                  url_droplet)
        self.assert_get_url_equal(responses.calls[1].request.url,
                                  url_volume)
        self.assertEqual(len(d.volumes), 1)
        self.assertEqual(d.volumes[0].id,
                         '506f78a4-e098-11e5-ad9f-000f53306ae1')
        self.assertEqual(d.volumes[0].name, 'example')
        self.assertEqual(d.volumes[0].region['slug'], 'nyc1')


if __name__ == '__main__':
    unittest.main()

# python-digitalocean-1.16.0/digitalocean/tests/test_firewall.py

import unittest
import responses
import digitalocean
import json

from .BaseTest import BaseTest


class TestFirewall(BaseTest):

    @responses.activate
    def setUp(self):
        super(TestFirewall, self).setUp()

        data = self.load_from_file('firewalls/single.json')
        url = self.base_url + "firewalls/12345"
        responses.add(responses.GET, url, body=data,
                      status=200, content_type='application/json')

        self.firewall = digitalocean.Firewall(id=12345,
                                              token=self.token).load()

    @responses.activate
    def test_load(self):
        data = self.load_from_file('firewalls/single.json')
        url = self.base_url + "firewalls/12345"

        responses.add(responses.GET, url, body=data,
                      status=200, content_type='application/json')

        firewall = digitalocean.Firewall(id=12345, token=self.token)
        f = firewall.load()

        self.assert_get_url_equal(responses.calls[0].request.url, url)
        self.assertEqual(f.id, 12345)
        self.assertEqual(f.name, "firewall")
        self.assertEqual(f.status, "succeeded")
        self.assertEqual(f.inbound_rules[0].ports, "80")
        self.assertEqual(f.inbound_rules[0].protocol, "tcp")
        self.assertEqual(f.inbound_rules[0].sources.load_balancer_uids,
                         ["12345"])
        self.assertEqual(f.inbound_rules[0].sources.addresses, [])
        self.assertEqual(f.inbound_rules[0].sources.tags, [])
        self.assertEqual(f.outbound_rules[0].ports, "80")
        self.assertEqual(f.outbound_rules[0].protocol, "tcp")
        self.assertEqual(
            f.outbound_rules[0].destinations.load_balancer_uids, [])
        self.assertEqual(f.outbound_rules[0].destinations.addresses,
                         ["0.0.0.0/0", "::/0"])
        self.assertEqual(f.outbound_rules[0].destinations.tags, [])
        self.assertEqual(f.created_at, "2017-05-23T21:24:00Z")
        self.assertEqual(f.droplet_ids, [12345])
        self.assertEqual(f.tags, [])
        self.assertEqual(f.pending_changes, [])

    @responses.activate
    def test_add_droplets(self):
        data = self.load_from_file('firewalls/droplets.json')
        url = self.base_url + "firewalls/12345/droplets"

        responses.add(responses.POST, url, body=data,
                      status=204, content_type='application/json')

        droplet_id = json.loads(data)["droplet_ids"][0]
        self.firewall.add_droplets([droplet_id])

        self.assertEqual(responses.calls[0].request.url, url)

    @responses.activate
    def test_remove_droplets(self):
        data = self.load_from_file('firewalls/droplets.json')
        url = self.base_url + "firewalls/12345/droplets"

        responses.add(responses.DELETE, url, body=data,
                      status=204, content_type='application/json')

        droplet_id = json.loads(data)["droplet_ids"][0]
        self.firewall.remove_droplets([droplet_id])

        self.assertEqual(responses.calls[0].request.url, url)

    @responses.activate
    def test_add_tags(self):
        data = self.load_from_file('firewalls/tags.json')
        url = self.base_url + "firewalls/12345/tags"

        responses.add(responses.POST, url, body=data,
                      status=204, content_type='application/json')

        tag = json.loads(data)["tags"][0]
        self.firewall.add_tags([tag])

        self.assertEqual(responses.calls[0].request.url, url)

    @responses.activate
    def test_remove_tags(self):
        data = self.load_from_file('firewalls/tags.json')
        url = self.base_url + "firewalls/12345/tags"

        responses.add(responses.DELETE, url, body=data,
                      status=204, content_type='application/json')

        tag = json.loads(data)["tags"][0]
        self.firewall.remove_tags([tag])

        self.assertEqual(responses.calls[0].request.url, url)


if __name__ == '__main__':
    unittest.main()

# python-digitalocean-1.16.0/digitalocean/tests/test_floatingip.py

import unittest
import responses
import digitalocean

from .BaseTest import BaseTest


class TestFloatingIP(BaseTest):

    def setUp(self):
        super(TestFloatingIP, self).setUp()
        self.fip = digitalocean.FloatingIP(ip='45.55.96.47',
                                           token=self.token)

    @responses.activate
    def test_load(self):
        data = self.load_from_file('floatingip/single.json')
        url = self.base_url + "floating_ips/45.55.96.47"

        responses.add(responses.GET, url, body=data,
                      status=200, content_type='application/json')

        self.fip.load()

        self.assert_get_url_equal(responses.calls[0].request.url, url)
        self.assertEqual(self.fip.ip, "45.55.96.47")
        self.assertEqual(self.fip.region['slug'], 'nyc3')

    @responses.activate
    def test_create(self):
        data = self.load_from_file('floatingip/single.json')
        url = self.base_url + "floating_ips/"

        responses.add(responses.POST, url, body=data,
                      status=201, content_type='application/json')

        fip = digitalocean.FloatingIP(droplet_id=12345,
                                      token=self.token).create()

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url + "floating_ips/")
        self.assertEqual(fip.ip, "45.55.96.47")
        self.assertEqual(fip.region['slug'], 'nyc3')

    @responses.activate
    def test_reserve(self):
        data = self.load_from_file('floatingip/single.json')
        url = self.base_url + "floating_ips/"

        responses.add(responses.POST, url, body=data,
                      status=201, content_type='application/json')

        fip = digitalocean.FloatingIP(region_slug='nyc3',
                                      token=self.token).reserve()

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url + "floating_ips/")
        self.assertEqual(fip.ip, "45.55.96.47")
        self.assertEqual(fip.region['slug'], 'nyc3')

    @responses.activate
    def test_destroy(self):
        url = self.base_url + "floating_ips/45.55.96.47/"

        responses.add(responses.DELETE, url,
                      status=204, content_type='application/json')

        self.fip.destroy()

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url + "floating_ips/45.55.96.47/")

    @responses.activate
    def test_assign(self):
        data = self.load_from_file('floatingip/assign.json')

        responses.add(responses.POST,
                      "{}floating_ips/{}/actions/".format(self.base_url,
                                                          self.fip.ip),
                      body=data,
                      status=201,
                      content_type='application/json')

        res = self.fip.assign(droplet_id=12345)

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url +
                         "floating_ips/45.55.96.47/actions/")
        self.assertEqual(res['action']['type'], 'assign_ip')
        self.assertEqual(res['action']['status'], 'in-progress')
        self.assertEqual(res['action']['id'], 68212728)

    @responses.activate
    def test_unassign(self):
        data = self.load_from_file('floatingip/unassign.json')

        responses.add(responses.POST,
                      "{}floating_ips/{}/actions/".format(self.base_url,
                                                          self.fip.ip),
                      body=data,
                      status=201,
                      content_type='application/json')

        res = self.fip.unassign()

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url +
                         "floating_ips/45.55.96.47/actions/")
        self.assertEqual(res['action']['type'], 'unassign_ip')
        self.assertEqual(res['action']['status'], 'in-progress')
        self.assertEqual(res['action']['id'], 68212773)


if __name__ == '__main__':
    unittest.main()

# python-digitalocean-1.16.0/digitalocean/tests/test_image.py

import unittest
import responses
import digitalocean

from .BaseTest import BaseTest


class TestImage(BaseTest):

    def setUp(self):
        super(TestImage, self).setUp()
        self.image = digitalocean.Image(
            id=449676856,
            token=self.token
        )
        self.image_with_slug = digitalocean.Image(
            slug='testslug',
            token=self.token
        )

    @responses.activate
    def test_load(self):
        data = self.load_from_file('images/single.json')
        url = "{}images/{}".format(self.base_url, self.image.id)

        responses.add(responses.GET, url, body=data,
                      status=200, content_type='application/json')

        self.image.load()

        self.assert_get_url_equal(responses.calls[0].request.url, url)
        self.assertEqual(self.image.id, 449676856)
        self.assertEqual(self.image.name, 'My Snapshot')
        self.assertEqual(self.image.distribution, 'Ubuntu')
        self.assertEqual(self.image.public, False)
        self.assertEqual(self.image.created_at, "2014-08-18T16:35:40Z")
        self.assertEqual(self.image.size_gigabytes, 2.34)
        self.assertEqual(self.image.min_disk_size, 20)

    @responses.activate
    def test_load_by_slug(self):
        """Test loading image by slug."""
        data = self.load_from_file('images/slug.json')
        url = "{}images/{}".format(self.base_url, self.image_with_slug.slug)

        responses.add(responses.GET, url, body=data,
                      status=200, content_type='application/json')

        self.image_with_slug.load(use_slug=True)

        self.assert_get_url_equal(responses.calls[0].request.url, url)
        self.assertEqual(self.image_with_slug.id, None)
        self.assertEqual(self.image_with_slug.slug, 'testslug')
        self.assertEqual(self.image_with_slug.name, 'My Slug Snapshot')
        self.assertEqual(self.image_with_slug.distribution, 'Ubuntu')
self.assertEqual(self.image_with_slug.public, False) self.assertEqual( self.image_with_slug.created_at, "2014-08-18T16:35:40Z" ) self.assertEqual(self.image_with_slug.size_gigabytes, 2.34) self.assertEqual(self.image_with_slug.min_disk_size, 30) @responses.activate def test_create(self): data = self.load_from_file('images/create.json') url = self.base_url + "images" responses.add(responses.POST, url, body=data, status=202, content_type='application/json') image = digitalocean.Image(name='ubuntu-18.04-minimal', url='https://www.example.com/cloud.img', distribution='Ubuntu', region='nyc3', description='Cloud-optimized image', tags=['base-image', 'prod'], token=self.token) image.create() self.assertEqual(image.id, 38413969) self.assertEqual(image.name, 'ubuntu-18.04-minimal') self.assertEqual(image.distribution, 'Ubuntu') self.assertEqual(image.type, 'custom') self.assertEqual(image.status, 'NEW') self.assertEqual(image.description, 'Cloud-optimized image') self.assertEqual(image.tags, ['base-image', 'prod']) self.assertEqual(image.created_at, '2018-09-20T19:28:00Z') @responses.activate def test_destroy(self): responses.add(responses.DELETE, '{}images/{}/'.format(self.base_url, self.image.id), status=204, content_type='application/json') self.image.destroy() self.assertEqual(responses.calls[0].request.url, self.base_url + 'images/449676856/') @responses.activate def test_transfer(self): data = self.load_from_file('images/transfer.json') responses.add(responses.POST, '{}images/{}/actions/'.format( self.base_url, self.image.id), body=data, status=201, content_type='application/json') res = self.image.transfer(new_region_slug='lon1') self.assertEqual(responses.calls[0].request.url, self.base_url + 'images/449676856/actions/') self.assertEqual(res['action']['type'], 'transfer') self.assertEqual(res['action']['status'], 'in-progress') self.assertEqual(res['action']['id'], 68212728) @responses.activate def test_rename(self): data = self.load_from_file('images/rename.json') 
        responses.add(responses.PUT,
                      '{}images/{}'.format(self.base_url, self.image.id),
                      body=data,
                      status=200,
                      content_type='application/json')

        res = self.image.rename(new_name='Descriptive name')

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url + 'images/449676856')
        self.assertEqual(res['image']['name'], 'Descriptive name')

    def test_is_string(self):
        self.assertEqual(self.image._is_string("String"), True)
        self.assertEqual(self.image._is_string("1234"), True)
        self.assertEqual(self.image._is_string(123), False)
        self.assertEqual(self.image._is_string(None), None)
        self.assertEqual(self.image._is_string(True), None)
        self.assertEqual(self.image._is_string(False), None)


if __name__ == '__main__':
    unittest.main()


python-digitalocean-1.16.0/digitalocean/tests/test_load_balancer.py

import json
import unittest

import responses

import digitalocean
from .BaseTest import BaseTest


class TestLoadBalancer(BaseTest):

    def setUp(self):
        super(TestLoadBalancer, self).setUp()
        self.lb_id = '4de7ac8b-495b-4884-9a69-1050c6793cd6'
        self.vpc_uuid = "08187eaa-90eb-40d6-a8f0-0222b28ded72"
        self.lb = digitalocean.LoadBalancer(id=self.lb_id, token=self.token)

    @responses.activate
    def test_load(self):
        data = self.load_from_file('loadbalancer/single.json')

        url = self.base_url + 'load_balancers/' + self.lb_id
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        self.lb.load()
        rules = self.lb.forwarding_rules

        self.assert_get_url_equal(responses.calls[0].request.url, url)
        self.assertEqual(self.lb.id, self.lb_id)
        self.assertEqual(self.lb.region['slug'], 'nyc3')
        self.assertEqual(self.lb.algorithm, 'round_robin')
        self.assertEqual(self.lb.ip, '104.131.186.241')
        self.assertEqual(self.lb.name, 'example-lb-01')
        self.assertEqual(len(rules), 2)
        self.assertEqual(rules[0].entry_protocol, 'http')
        self.assertEqual(rules[0].entry_port, 80)
        self.assertEqual(rules[0].target_protocol, 'http')
        self.assertEqual(rules[0].target_port, 80)
        self.assertEqual(rules[0].tls_passthrough, False)
        self.assertEqual(self.lb.health_check.protocol, 'http')
        self.assertEqual(self.lb.health_check.port, 80)
        self.assertEqual(self.lb.sticky_sessions.type, 'none')
        self.assertEqual(self.lb.droplet_ids, [3164444, 3164445])
        self.assertEqual(self.lb.vpc_uuid, self.vpc_uuid)

    @responses.activate
    def test_create_ids(self):
        data = self.load_from_file('loadbalancer/single.json')

        url = self.base_url + "load_balancers"
        responses.add(responses.POST,
                      url,
                      body=data,
                      status=201,
                      content_type='application/json')

        rule1 = digitalocean.ForwardingRule(entry_port=80,
                                            entry_protocol='http',
                                            target_port=80,
                                            target_protocol='http')
        rule2 = digitalocean.ForwardingRule(entry_port=443,
                                            entry_protocol='https',
                                            target_port=443,
                                            target_protocol='https',
                                            tls_passthrough=True)
        check = digitalocean.HealthCheck()
        sticky = digitalocean.StickySessions(type='none')
        lb = digitalocean.LoadBalancer(name='example-lb-01',
                                       region='nyc3',
                                       algorithm='round_robin',
                                       forwarding_rules=[rule1, rule2],
                                       health_check=check,
                                       sticky_sessions=sticky,
                                       redirect_http_to_https=False,
                                       droplet_ids=[3164444, 3164445],
                                       vpc_uuid=self.vpc_uuid,
                                       token=self.token).create()
        resp_rules = lb.forwarding_rules

        self.assert_url_query_equal(responses.calls[0].request.url, url)
        self.assertEqual(lb.id, self.lb_id)
        self.assertEqual(lb.algorithm, 'round_robin')
        self.assertEqual(lb.ip, '104.131.186.241')
        self.assertEqual(lb.name, 'example-lb-01')
        self.assertEqual(len(resp_rules), 2)
        self.assertEqual(resp_rules[0].entry_protocol, 'http')
        self.assertEqual(resp_rules[0].entry_port, 80)
        self.assertEqual(resp_rules[0].target_protocol, 'http')
        self.assertEqual(resp_rules[0].target_port, 80)
        self.assertEqual(resp_rules[0].tls_passthrough, False)
        self.assertEqual(lb.health_check.protocol, 'http')
        self.assertEqual(lb.health_check.port, 80)
        self.assertEqual(lb.sticky_sessions.type, 'none')
        self.assertEqual(lb.droplet_ids, [3164444, 3164445])
        self.assertEqual(lb.vpc_uuid, self.vpc_uuid)

    @responses.activate
    def test_create_tag(self):
        data = self.load_from_file('loadbalancer/single_tag.json')

        url = self.base_url + "load_balancers"
        responses.add(responses.POST,
                      url,
                      body=data,
                      status=201,
                      content_type='application/json')

        rule1 = digitalocean.ForwardingRule(entry_port=80,
                                            entry_protocol='http',
                                            target_port=80,
                                            target_protocol='http')
        rule2 = digitalocean.ForwardingRule(entry_port=443,
                                            entry_protocol='https',
                                            target_port=443,
                                            target_protocol='https',
                                            tls_passthrough=True)
        check = digitalocean.HealthCheck()
        sticky = digitalocean.StickySessions(type='none')
        lb = digitalocean.LoadBalancer(name='example-lb-01',
                                       region='nyc3',
                                       algorithm='round_robin',
                                       forwarding_rules=[rule1, rule2],
                                       health_check=check,
                                       sticky_sessions=sticky,
                                       redirect_http_to_https=False,
                                       tag='web',
                                       vpc_uuid=self.vpc_uuid,
                                       token=self.token).create()
        resp_rules = lb.forwarding_rules

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url + 'load_balancers')
        self.assertEqual(lb.id, '4de2ac7b-495b-4884-9e69-1050d6793cd4')
        self.assertEqual(lb.algorithm, 'round_robin')
        self.assertEqual(lb.ip, '104.131.186.248')
        self.assertEqual(lb.name, 'example-lb-01')
        self.assertEqual(len(resp_rules), 2)
        self.assertEqual(resp_rules[0].entry_protocol, 'http')
        self.assertEqual(resp_rules[0].entry_port, 80)
        self.assertEqual(resp_rules[0].target_protocol, 'http')
        self.assertEqual(resp_rules[0].target_port, 80)
        self.assertEqual(resp_rules[0].tls_passthrough, False)
        self.assertEqual(lb.health_check.protocol, 'http')
        self.assertEqual(lb.health_check.port, 80)
        self.assertEqual(lb.sticky_sessions.type, 'none')
        self.assertEqual(lb.tag, 'web')
        self.assertEqual(lb.droplet_ids, [3164444, 3164445])
        self.assertEqual(lb.vpc_uuid, self.vpc_uuid)

    @responses.activate
    def test_create_exception(self):
        data = self.load_from_file('loadbalancer/single_tag.json')

        url = self.base_url + "load_balancers/"
        responses.add(responses.POST,
                      url,
                      body=data,
                      status=201,
                      content_type='application/json')

        rule = digitalocean.ForwardingRule(entry_port=80,
                                           entry_protocol='http',
                                           target_port=80,
                                           target_protocol='http')
        check = digitalocean.HealthCheck()
        sticky = digitalocean.StickySessions(type='none')
        lb = digitalocean.LoadBalancer(name='example-lb-01',
                                       region='nyc3',
                                       algorithm='round_robin',
                                       forwarding_rules=[rule],
                                       health_check=check,
                                       sticky_sessions=sticky,
                                       redirect_http_to_https=False,
                                       tag='web',
                                       droplet_ids=[123456, 789456],
                                       vpc_uuid=self.vpc_uuid,
                                       token=self.token)

        with self.assertRaises(ValueError) as context:
            lb.create()

        self.assertEqual('droplet_ids and tag are mutually exclusive args',
                         str(context.exception))

    @responses.activate
    def test_save(self):
        data1 = self.load_from_file('loadbalancer/single.json')

        url = '{0}load_balancers/{1}'.format(self.base_url, self.lb_id)
        responses.add(responses.GET,
                      url,
                      body=data1,
                      status=200,
                      content_type='application/json')

        self.lb.load()
        rules = self.lb.forwarding_rules

        self.assert_get_url_equal(responses.calls[0].request.url, url)
        self.assertEqual(self.lb.id, self.lb_id)
        self.assertEqual(self.lb.region['slug'], 'nyc3')
        self.assertEqual(self.lb.algorithm, 'round_robin')
        self.assertEqual(self.lb.ip, '104.131.186.241')
        self.assertEqual(self.lb.name, 'example-lb-01')
        self.assertEqual(len(rules), 2)
        self.assertEqual(rules[0].entry_protocol, 'http')
        self.assertEqual(rules[0].entry_port, 80)
        self.assertEqual(rules[0].target_protocol, 'http')
        self.assertEqual(rules[0].target_port, 80)
        self.assertEqual(rules[0].tls_passthrough, False)
        self.assertEqual(rules[1].entry_protocol, 'https')
        self.assertEqual(rules[1].entry_port, 444)
        self.assertEqual(rules[1].target_protocol, 'https')
        self.assertEqual(rules[1].target_port, 443)
        self.assertEqual(rules[1].tls_passthrough, True)
        self.assertEqual(self.lb.health_check.protocol, 'http')
        self.assertEqual(self.lb.health_check.port, 80)
        self.assertEqual(self.lb.health_check.path, '/')
        self.assertEqual(self.lb.health_check.check_interval_seconds, 10)
        self.assertEqual(self.lb.health_check.response_timeout_seconds, 5)
        self.assertEqual(self.lb.health_check.healthy_threshold, 5)
        self.assertEqual(self.lb.health_check.unhealthy_threshold, 3)
        self.assertEqual(self.lb.sticky_sessions.type, 'none')
        self.assertEqual(self.lb.droplet_ids, [3164444, 3164445])
        self.assertEqual(self.lb.tag, '')
        self.assertEqual(self.lb.redirect_http_to_https, False)
        self.assertEqual(self.lb.vpc_uuid, self.vpc_uuid)

        data2 = self.load_from_file('loadbalancer/save.json')

        url = '{0}load_balancers/{1}'.format(self.base_url, self.lb_id)
        responses.add(responses.PUT,
                      url,
                      body=data2,
                      status=202,
                      content_type='application/json')

        self.lb.algorithm = 'least_connections'
        self.lb.sticky_sessions.type = 'cookies'
        self.lb.sticky_sessions.cookie_name = 'DO_LB'
        self.lb.sticky_sessions.cookie_ttl_seconds = 300
        self.lb.droplet_ids = [34153248, 34153250]
        self.lb.vpc_uuid = self.vpc_uuid

        res = self.lb.save()

        lb = digitalocean.LoadBalancer(**res['load_balancer'])
        lb.health_check = digitalocean.HealthCheck(
            **res['load_balancer']['health_check'])
        lb.sticky_sessions = digitalocean.StickySessions(
            **res['load_balancer']['sticky_sessions'])
        rules = list()
        for rule in lb.forwarding_rules:
            rules.append(digitalocean.ForwardingRule(**rule))

        self.assertEqual(lb.id, self.lb_id)
        self.assertEqual(lb.region['slug'], 'nyc3')
        self.assertEqual(lb.algorithm, 'least_connections')
        self.assertEqual(lb.ip, '104.131.186.241')
        self.assertEqual(lb.name, 'example-lb-01')
        self.assertEqual(len(rules), 2)
        self.assertEqual(rules[0].entry_protocol, 'http')
        self.assertEqual(rules[0].entry_port, 80)
        self.assertEqual(rules[0].target_protocol, 'http')
        self.assertEqual(rules[0].target_port, 80)
        self.assertEqual(rules[0].tls_passthrough, False)
        self.assertEqual(rules[1].entry_protocol, 'https')
        self.assertEqual(rules[1].entry_port, 444)
        self.assertEqual(rules[1].target_protocol, 'https')
        self.assertEqual(rules[1].target_port, 443)
        self.assertEqual(rules[1].tls_passthrough, True)
        self.assertEqual(lb.health_check.protocol, 'http')
        self.assertEqual(lb.health_check.port, 80)
        self.assertEqual(lb.health_check.path, '/')
        self.assertEqual(lb.health_check.check_interval_seconds, 10)
        self.assertEqual(lb.health_check.response_timeout_seconds, 5)
        self.assertEqual(lb.health_check.healthy_threshold, 5)
        self.assertEqual(lb.health_check.unhealthy_threshold, 3)
        self.assertEqual(lb.sticky_sessions.type, 'cookies')
        self.assertEqual(lb.sticky_sessions.cookie_name, 'DO_LB')
        self.assertEqual(lb.sticky_sessions.cookie_ttl_seconds, 300)
        self.assertEqual(lb.droplet_ids, [34153248, 34153250])
        self.assertEqual(lb.tag, '')
        self.assertEqual(lb.redirect_http_to_https, False)
        self.assertEqual(lb.vpc_uuid, self.vpc_uuid)

    @responses.activate
    def test_destroy(self):
        url = '{0}load_balancers/{1}'.format(self.base_url, self.lb_id)
        responses.add(responses.DELETE,
                      url,
                      status=204,
                      content_type='application/json')

        self.lb.destroy()

        self.assertEqual(responses.calls[0].request.url, url)

    @responses.activate
    def test_add_droplets(self):
        url = '{0}load_balancers/{1}/droplets'.format(self.base_url,
                                                      self.lb_id)
        responses.add(responses.POST,
                      url,
                      status=204,
                      content_type='application/json')

        self.lb.add_droplets([12345, 78945])

        body = '{"droplet_ids": [12345, 78945]}'

        self.assertEqual(responses.calls[0].request.url, url)
        self.assertEqual(responses.calls[0].request.body, body)

    @responses.activate
    def test_remove_droplets(self):
        url = '{0}load_balancers/{1}/droplets'.format(self.base_url,
                                                      self.lb_id)
        responses.add(responses.DELETE,
                      url,
                      status=204,
                      content_type='application/json')

        self.lb.remove_droplets([12345, 78945])

        body = '{"droplet_ids": [12345, 78945]}'

        self.assertEqual(responses.calls[0].request.url, url)
        self.assertEqual(responses.calls[0].request.body, body)

    @responses.activate
    def test_add_forwarding_rules(self):
        url = '{0}load_balancers/{1}/forwarding_rules'.format(self.base_url,
                                                              self.lb_id)
        responses.add(responses.POST,
                      url,
                      status=204,
                      content_type='application/json')
        rule = digitalocean.ForwardingRule(entry_port=3306,
                                           entry_protocol='tcp',
                                           target_port=3306,
                                           target_protocol='tcp')

        self.lb.add_forwarding_rules([rule])

        req_body = json.loads("""{
            "forwarding_rules": [
                {
                    "entry_protocol": "tcp",
                    "entry_port": 3306,
                    "target_protocol": "tcp",
                    "target_port": 3306,
                    "certificate_id": "",
                    "tls_passthrough": false
                }
            ]
        }""")
        body = json.loads(responses.calls[0].request.body)

        self.assertEqual(responses.calls[0].request.url, url)
        self.assertEqual(sorted(body.items()), sorted(req_body.items()))

    @responses.activate
    def test_remove_forwarding_rules(self):
        url = '{0}load_balancers/{1}/forwarding_rules'.format(self.base_url,
                                                              self.lb_id)
        responses.add(responses.DELETE,
                      url,
                      status=204,
                      content_type='application/json')

        rule = digitalocean.ForwardingRule(entry_port=3306,
                                           entry_protocol='tcp',
                                           target_port=3306,
                                           target_protocol='tcp')

        self.lb.remove_forwarding_rules([rule])

        req_body = json.loads("""{
            "forwarding_rules": [
                {
                    "entry_protocol": "tcp",
                    "entry_port": 3306,
                    "target_protocol": "tcp",
                    "target_port": 3306,
                    "certificate_id": "",
                    "tls_passthrough": false
                }
            ]
        }""")
        body = json.loads(responses.calls[0].request.body)

        self.assertEqual(responses.calls[0].request.url, url)
        self.assertEqual(sorted(body.items()), sorted(req_body.items()))


if __name__ == '__main__':
    unittest.main()


python-digitalocean-1.16.0/digitalocean/tests/test_manager.py

import json
import unittest

import responses

import digitalocean
from .BaseTest import BaseTest


class TestManager(BaseTest):

    def setUp(self):
        super(TestManager, self).setUp()
        self.manager = digitalocean.Manager(token=self.token)
        self.image = digitalocean.Image(
            id=449676856,
            slug='testslug',
            token=self.token
        )

    @responses.activate
    def test_get_account(self):
        data = self.load_from_file('account/account.json')

        url = self.base_url + 'account/'
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        acct = self.manager.get_account()

        self.assert_get_url_equal(responses.calls[0].request.url, url)
        self.assertEqual(acct.token, self.token)
        self.assertEqual(acct.email, 'web@digitalocean.com')
        self.assertEqual(acct.droplet_limit, 25)
        self.assertEqual(acct.email_verified, True)
        self.assertEqual(acct.status, "active")

    @responses.activate
    def test_get_balance(self):
        data = self.load_from_file('balance/balance.json')

        url = self.base_url + 'customers/my/balance'
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        balance = self.manager.get_balance()

        self.assert_get_url_equal(responses.calls[0].request.url, url)
        self.assertEqual(balance.token, self.token)
        self.assertEqual(balance.month_to_date_balance, '23.44')
        self.assertEqual(balance.account_balance, '12.23')
        self.assertEqual(balance.month_to_date_usage, '11.21')
        self.assertEqual(balance.generated_at, '2019-07-09T15:01:12Z')

    @responses.activate
    def test_auth_fail(self):
        data = self.load_from_file('errors/unauthorized.json')

        url = self.base_url + 'regions/'
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=401,
                      content_type='application/json')

        bad_token = digitalocean.Manager(token='thisisnotagoodtoken')

        with self.assertRaises(Exception) as error:
            bad_token.get_all_regions()

        exception = error.exception
        self.assertEqual(str(exception), 'Unable to authenticate you.')

    @responses.activate
    def test_droplets(self):
        data = self.load_from_file('droplets/all.json')

        url = self.base_url + 'droplets/'
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        droplets = self.manager.get_all_droplets()
        droplet = droplets[0]

        self.assertEqual(droplet.token, self.token)
        self.assertEqual(droplet.id, 3164444)
        self.assertEqual(droplet.name, "example.com")
        self.assertEqual(droplet.memory, 512)
        self.assertEqual(droplet.vcpus, 1)
        self.assertEqual(droplet.disk, 20)
        self.assertEqual(droplet.backups, True)
        self.assertEqual(droplet.ipv6, True)
        self.assertEqual(droplet.private_networking, False)
        self.assertEqual(droplet.region['slug'], "nyc3")
        self.assertEqual(droplet.status, "active")
        self.assertEqual(droplet.image['slug'], "ubuntu-14-04-x64")
        self.assertEqual(droplet.size_slug, '512mb')
        self.assertEqual(droplet.created_at, "2014-11-14T16:29:21Z")
        self.assertEqual(droplet.ip_address, "104.236.32.182")
        self.assertEqual(droplet.ip_v6_address,
                         "2604:A880:0800:0010:0000:0000:02DD:4001")
        self.assertEqual(droplet.kernel['id'], 2233)
        self.assertEqual(droplet.backup_ids, [7938002])
        self.assertEqual(droplet.features, ["backups", "ipv6", "virtio"])

    @responses.activate
    def test_get_droplets_by_tag(self):
        data = self.load_from_file('droplets/bytag.json')

        url = self.base_url + "droplets"
        responses.add(responses.GET,
                      url + "/",
                      body=data,
                      status=200,
                      content_type='application/json')
        # The next pages don't use trailing slashes. Return an empty result
        # to prevent an infinite loop.
        responses.add(responses.GET,
                      url,
                      body="{}",
                      status=200,
                      content_type="application/json")

        manager = digitalocean.Manager(token=self.token)
        droplets = manager.get_all_droplets(tag_name="awesome")
        droplet = droplets[0]

        self.assertEqual(droplet.token, self.token)
        self.assertEqual(droplet.id, 3164444)
        self.assertEqual(droplet.name, "example.com")
        self.assertEqual(droplet.memory, 512)
        self.assertEqual(droplet.vcpus, 1)
        self.assertEqual(droplet.disk, 20)
        self.assertEqual(droplet.backups, True)
        self.assertEqual(droplet.ipv6, True)
        self.assertEqual(droplet.private_networking, False)
        self.assertEqual(droplet.region['slug'], "nyc3")
        self.assertEqual(droplet.status, "active")
        self.assertEqual(droplet.image['slug'], "ubuntu-14-04-x64")
        self.assertEqual(droplet.size_slug, '512mb')
        self.assertEqual(droplet.created_at, "2014-11-14T16:29:21Z")
        self.assertEqual(droplet.ip_address, "104.236.32.182")
        self.assertEqual(droplet.ip_v6_address,
                         "2604:A880:0800:0010:0000:0000:02DD:4001")
        self.assertEqual(droplet.kernel['id'], 2233)
        self.assertEqual(droplet.backup_ids, [7938002])
        self.assertEqual(droplet.features, ["backups", "ipv6", "virtio"])
        self.assertEqual(droplet.tags, ["awesome"])

    @responses.activate
    def test_get_all_regions(self):
        data = self.load_from_file('regions/all.json')

        url = self.base_url + 'regions/'
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        all_regions = self.manager.get_all_regions()

        self.assertEqual(len(all_regions), 3)
        region = all_regions[0]
        self.assertEqual(region.token, self.token)
        self.assertEqual(region.name, 'New York')
        self.assertEqual(region.slug, 'nyc1')
        self.assertEqual(region.sizes, ["1gb", "512mb"])
        self.assertEqual(region.features,
                         ['virtio', 'private_networking', 'backups', 'ipv6'])

    @responses.activate
    def test_get_all_sizes(self):
        data = self.load_from_file('sizes/all.json')

        url = self.base_url + 'sizes/'
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        all_sizes = self.manager.get_all_sizes()

        self.assertEqual(len(all_sizes), 2)
        size = all_sizes[0]
        self.assertEqual(size.token, self.token)
        self.assertEqual(size.slug, '512mb')
        self.assertEqual(size.memory, 512)
        self.assertEqual(size.disk, 20)
        self.assertEqual(size.price_hourly, 0.00744)
        self.assertEqual(size.price_monthly, 5.0)
        self.assertEqual(size.transfer, 1)
        self.assertEqual(size.regions, ["nyc1", "ams1", "sfo1"])

    @responses.activate
    def test_get_image(self):
        """Test get image by id."""
        data = self.load_from_file('images/single.json')

        url = "{}images/{}".format(self.base_url, self.image.id)
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        self.image.load()

        self.assert_get_url_equal(responses.calls[0].request.url, url)
        self.assertEqual(self.image.id, 449676856)
        self.assertEqual(self.image.slug, 'testslug')
        self.assertEqual(self.image.name, 'My Snapshot')

    @responses.activate
    def test_get_image_by_slug(self):
        """Test get image by slug."""
        data = self.load_from_file('images/single.json')

        url = "{}images/{}".format(self.base_url, self.image.slug)
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        self.image.load(use_slug=True)

        self.assert_get_url_equal(responses.calls[0].request.url, url)
        self.assertEqual(self.image.id, 449676856)
        self.assertEqual(self.image.slug, 'testslug')
        self.assertEqual(self.image.name, 'My Snapshot')

    @responses.activate
    def test_get_all_images(self):
        data = self.load_from_file('images/all.json')

        url = self.base_url + 'images/'
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        all_images = self.manager.get_all_images()

        self.assertEqual(len(all_images), 3)
        image = all_images[0]
        self.assertEqual(image.token, self.token)
        self.assertEqual(image.id, 119192817)
        self.assertEqual(image.name, '14.04 x64')
        self.assertTrue(image.public)
        self.assertEqual(image.slug, "ubuntu-14-04-x64")
        self.assertEqual(image.distribution, 'Ubuntu')
        self.assertEqual(image.regions, ['nyc1'])
        self.assertEqual(image.created_at, "2014-07-29T14:35:40Z")

    @responses.activate
    def test_get_global_images(self):
        data = self.load_from_file('images/all.json')

        url = self.base_url + 'images/'
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        global_images = self.manager.get_global_images()

        self.assertEqual(len(global_images), 2)
        image = global_images[0]
        self.assertEqual(image.token, self.token)
        self.assertEqual(image.id, 119192817)
        self.assertEqual(image.name, '14.04 x64')
        self.assertTrue(image.public)
        self.assertEqual(image.slug, "ubuntu-14-04-x64")
        self.assertEqual(image.distribution, 'Ubuntu')
        self.assertEqual(image.regions, ['nyc1'])
        self.assertEqual(image.created_at, "2014-07-29T14:35:40Z")

    @responses.activate
    def test_get_my_images(self):
        data = self.load_from_file('images/private.json')

        url = self.base_url + 'images/'
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        my_images = self.manager.get_my_images()
        self.assertEqual(len(my_images), 1)
        image = my_images[0]
        self.assertEqual(image.token, self.token)
        self.assertEqual(image.id, 449676856)
        self.assertEqual(image.name, 'My Snapshot')
        self.assertFalse(image.public)
        self.assertEqual(image.slug, "")
        self.assertEqual(image.distribution, 'Ubuntu')
        self.assertEqual(image.regions, ['nyc1', 'nyc3'])
        self.assertEqual(image.created_at, "2014-08-18T16:35:40Z")
        self.assert_url_query_equal(
            responses.calls[0].request.url,
            'https://api.digitalocean.com/v2/images/?private=true&per_page=200'
        )

    @responses.activate
    def test_get_distro_images(self):
        data = self.load_from_file('images/distro.json')

        url = self.base_url + 'images/'
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        distro_images = self.manager.get_distro_images()

        self.assertEqual(len(distro_images), 2)
        image = distro_images[0]
        self.assertEqual(image.token, self.token)
        self.assertEqual(image.id, 119192817)
        self.assertEqual(image.name, '14.04 x64')
        self.assertTrue(image.public)
        self.assertEqual(image.slug, "ubuntu-14-04-x64")
        self.assertEqual(image.distribution, 'Ubuntu')
        self.assert_url_query_equal(
            responses.calls[0].request.url,
            'https://api.digitalocean.com/v2/images/?type=distribution&per_page=200')

    @responses.activate
    def test_get_app_images(self):
        data = self.load_from_file('images/app.json')

        url = self.base_url + 'images/'
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        app_images = self.manager.get_app_images()

        self.assertEqual(len(app_images), 2)
        image = app_images[0]
        self.assertEqual(image.token, self.token)
        self.assertEqual(image.id, 11146864)
        self.assertEqual(image.name, 'MEAN on 14.04')
        self.assertTrue(image.public)
        self.assertEqual(image.slug, "mean")
        self.assertEqual(image.distribution, 'Ubuntu')
        self.assert_url_query_equal(
            responses.calls[0].request.url,
            'https://api.digitalocean.com/v2/images/?type=application&per_page=200')

    @responses.activate
    def test_get_all_sshkeys(self):
        data = self.load_from_file('keys/all.json')

        url = self.base_url + 'account/keys/'
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        ssh_keys = self.manager.get_all_sshkeys()

        self.assertEqual(len(ssh_keys), 1)
        # Test the few things we can assume about a random ssh key.
        key = ssh_keys[0]
        self.assertEqual(key.token, self.token)
        self.assertEqual(key.name, "Example Key")
        self.assertEqual(key.id, 1)
        self.assertEqual(
            key.public_key,
            "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAAAQQDGk5V68BJ4P3Ereh779Vi/Ft2qs/rbXrcjKLGo6zsyeyFUE0svJUpRDEJvFSf8RlezKx1/1ulJu9+kZsxRiUKn example")
        self.assertEqual(key.fingerprint,
                         "f5:d1:78:ed:28:72:5f:e1:ac:94:fd:1f:e0:a3:48:6d")

    @responses.activate
    def test_post_new_ssh_key(self):
        data = self.load_from_file('keys/newly_posted.json')

        url = self.base_url + 'account/keys/'
        responses.add(responses.POST,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        params = {'public_key': 'AAAAkey', 'name': 'new_key'}
        ssh_key = self.manager.get_data(url='account/keys/',
                                        type='POST',
                                        params=params)

        key = ssh_key['ssh_key']
        self.assertEqual(key['id'], 1234)
        self.assertEqual(key['fingerprint'],
                         'ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff:ff')
        self.assertEqual(key['public_key'], 'AAAAkey')
        self.assertEqual(key['name'], 'new_key')

    @responses.activate
    def test_get_all_domains(self):
        data = self.load_from_file('domains/all.json')

        url = self.base_url + 'domains/'
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        domains = self.manager.get_all_domains()

        self.assertEqual(len(domains), 1)
        # Test the few things we can assume about a random domain.
        domain = domains[0]
        self.assertEqual(domain.token, self.token)
        self.assertEqual(domain.name, "example.com")
        self.assertEqual(domain.zone_file, "Example zone file text...")
        self.assertEqual(domain.ttl, 1800)

    @responses.activate
    def test_get_all_floating_ips(self):
        data = self.load_from_file('floatingip/list.json')

        url = self.base_url + "floating_ips"
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        fips = self.manager.get_all_floating_ips()

        self.assertEqual(fips[0].ip, "45.55.96.47")
        self.assertEqual(fips[0].region['slug'], 'nyc3')

    @responses.activate
    def test_get_all_load_balancers(self):
        data = self.load_from_file('loadbalancer/all.json')

        url = self.base_url + "load_balancers"
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        lbs = self.manager.get_all_load_balancers()
        resp_rules = lbs[0].forwarding_rules[0]

        self.assertEqual(lbs[0].id, '4de2ac7b-495b-4884-9e69-1050d6793cd4')
        self.assertEqual(lbs[0].algorithm, 'round_robin')
        self.assertEqual(lbs[0].ip, '104.131.186.248')
        self.assertEqual(lbs[0].name, 'example-lb-02')
        self.assertEqual(len(lbs[0].forwarding_rules), 2)
        self.assertEqual(resp_rules.entry_protocol, 'http')
        self.assertEqual(resp_rules.entry_port, 80)
        self.assertEqual(resp_rules.target_protocol, 'http')
        self.assertEqual(resp_rules.target_port, 80)
        self.assertEqual(resp_rules.tls_passthrough, False)
        self.assertEqual(lbs[0].health_check.protocol, 'http')
        self.assertEqual(lbs[0].health_check.port, 80)
        self.assertEqual(lbs[0].sticky_sessions.type, 'none')
        self.assertEqual(lbs[0].tag, 'web')
        self.assertEqual(lbs[0].droplet_ids, [3164444, 3164445])

    @responses.activate
    def test_get_all_certificates(self):
        data = self.load_from_file('certificate/list.json')

        url = self.base_url + "certificates"
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        certs = self.manager.get_all_certificates()

        self.assertEqual(certs[0].id, '892071a0-bb95-49bc-8021-3afd67a210bf')
        self.assertEqual(certs[0].name, 'web-cert-01')
        self.assertEqual(certs[0].sha1_fingerprint,
                         'dfcc9f57d86bf58e321c2c6c31c7a971be244ac7')
        self.assertEqual(certs[0].not_after, '2017-02-22T00:23:00Z')
        self.assertEqual(certs[0].created_at, '2017-02-08T16:02:37Z')
        self.assertEqual(certs[0].type, 'custom')
        self.assertEqual(certs[0].state, 'verified')
        self.assertEqual(certs[1].id, 'ba9b9c18-6c59-46c2-99df-70da170a42ba')
        self.assertEqual(certs[1].name, 'web-cert-02')
        self.assertEqual(certs[1].sha1_fingerprint,
                         '479c82b5c63cb6d3e6fac4624d58a33b267e166c')
        self.assertEqual(certs[1].not_after, '2018-06-07T17:44:12Z')
        self.assertEqual(certs[1].created_at, '2018-03-09T18:44:11Z')
        self.assertEqual(certs[1].type, 'lets_encrypt')
        self.assertEqual(certs[1].state, 'pending')

    @responses.activate
    def test_get_all_volumes(self):
        data = self.load_from_file('volumes/all.json')

        url = self.base_url + "volumes"
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        volumes = self.manager.get_all_volumes()

        self.assertEqual(volumes[0].id,
                         "506f78a4-e098-11e5-ad9f-000f53306ae1")
        self.assertEqual(volumes[0].region['slug'], 'nyc1')
        self.assertEqual(volumes[0].filesystem_type, "ext4")
        self.assertEqual(len(volumes), 2)

    @responses.activate
    def test_get_per_region_volumes(self):
        data = json.loads(self.load_from_file('volumes/all.json'))
        data["volumes"] = [
            volume for volume in data["volumes"]
            if volume["region"]["slug"] == "nyc1"]

        url = self.base_url + "volumes?region=nyc1&per_page=200"
        responses.add(responses.GET,
                      url,
                      match_querystring=True,
                      body=json.dumps(data),
                      status=200,
                      content_type='application/json')

        volumes = self.manager.get_all_volumes("nyc1")

        self.assertEqual(volumes[0].id,
                         "506f78a4-e098-11e5-ad9f-000f53306ae1")
        self.assertEqual(volumes[0].region['slug'], 'nyc1')
        self.assertEqual(len(volumes), 1)

    @responses.activate
    def test_get_all_tags(self):
        data = self.load_from_file('tags/all.json')

        url = self.base_url + 'tags'
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        all_tags = self.manager.get_all_tags()

        self.assertEqual(len(all_tags), 1)
        self.assertEqual(all_tags[0].name, 'test')
        self.assertEqual(all_tags[0].resources['droplets']['count'], 0)

    @responses.activate
    def test_get_all_snapshots(self):
        data = self.load_from_file('snapshots/all.json')

        url = self.base_url + 'snapshots/'
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        all_snapshots = self.manager.get_all_snapshots()

        self.assertEqual(len(all_snapshots), 1)
        self.assertEqual(all_snapshots[0].name, 'test')
        self.assertEqual(all_snapshots[0].id, 6372321)
        self.assertEqual(all_snapshots[0].size_gigabytes, 1.42)
        self.assertEqual(all_snapshots[0].resource_type, 'droplet')
        self.assertEqual(len(all_snapshots[0].regions), 11)

    @responses.activate
    def test_get_droplet_snapshots(self):
        data = self.load_from_file('snapshots/droplets.json')

        url = self.base_url + 'snapshots?resource_type=droplet&per_page=200'
        responses.add(responses.GET,
                      url,
                      match_querystring=True,
                      body=data,
                      status=200,
                      content_type='application/json')

        droplet_snapshots = self.manager.get_droplet_snapshots()

        self.assertEqual(len(droplet_snapshots), 1)
        self.assertEqual(droplet_snapshots[0].name, 'droplet-test')
        self.assertEqual(droplet_snapshots[0].id, 19602538)
        self.assertEqual(droplet_snapshots[0].min_disk_size, 20)
        self.assertEqual(droplet_snapshots[0].size_gigabytes, 0.31)
        self.assertEqual(droplet_snapshots[0].resource_type, 'droplet')
        self.assertEqual(len(droplet_snapshots[0].regions), 12)

    @responses.activate
    def test_get_volume_snapshots(self):
        data = self.load_from_file('snapshots/volumes.json')

        url = self.base_url + 'snapshots?resource_type=volume&per_page=200'
        responses.add(responses.GET,
                      url,
                      match_querystring=True,
                      body=data,
                      status=200,
                      content_type='application/json')

        volume_snapshots = self.manager.get_volume_snapshots()

        self.assertEqual(len(volume_snapshots), 1)
        self.assertEqual(volume_snapshots[0].name, 'volume-test')
        self.assertEqual(
            volume_snapshots[0].id,
            '4f60fc64-85d1-11e6-a004-000f53315871'
        )
        self.assertEqual(volume_snapshots[0].min_disk_size, 10)
        self.assertEqual(volume_snapshots[0].size_gigabytes, 0)
        self.assertEqual(volume_snapshots[0].resource_type, 'volume')
        self.assertEqual(len(volume_snapshots[0].regions), 1)

    @responses.activate
    def test_get_all_projects(self):
        data = self.load_from_file('projects/all_projects_list.json')

        url = self.base_url + 'projects'
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        all_projects = self.manager.get_all_projects()

        self.assertEqual(len(all_projects), 1)
        self.assertEqual(all_projects[0].id,
                         "4e1bfbc3-dc3e-41f2-a18f-1b4d7ba71679")
        self.assertEqual(all_projects[0].owner_uuid,
                         "99525febec065ca37b2ffe4f852fd2b2581895e7")
        self.assertEqual(all_projects[0].owner_id, 2)
        self.assertEqual(all_projects[0].name, "my-web-api")
        self.assertEqual(all_projects[0].description, "My website API")
        self.assertEqual(all_projects[0].purpose, "Service or API")
        self.assertEqual(all_projects[0].environment, "Production")
        self.assertEqual(all_projects[0].is_default, False)
        self.assertEqual(all_projects[0].created_at, "2018-09-27T20:10:35Z")
        self.assertEqual(all_projects[0].updated_at, "2018-09-27T20:10:35Z")

    @responses.activate
    def test_get_default_project(self):
        data = self.load_from_file('projects/default_project.json')

        url = self.base_url + 'projects' + "/default"
        responses.add(responses.GET,
                      url,
                      body=data,
                      status=200,
                      content_type='application/json')

        default_project = self.manager.get_default_project()

        self.assertEqual(default_project.id,
                         "4e1bfbc3-dc3e-41f2-a18f-1b4d7ba71679")
        self.assertEqual(default_project.owner_uuid,
                         "99525febec065ca37b2ffe4f852fd2b2581895e7")
        self.assertEqual(default_project.owner_id, 2)
        self.assertEqual(default_project.name, "my-web-api")
        self.assertEqual(default_project.description, "My website API")
self.assertEqual(default_project.purpose, "Service or API") self.assertEqual(default_project.environment, "Production") self.assertEqual(default_project.is_default, True) self.assertEqual(default_project.created_at, "2018-09-27T20:10:35Z") self.assertEqual(default_project.updated_at, "2018-09-27T20:10:35Z") @responses.activate def test_get_firewalls(self): data = self.load_from_file('firewalls/all.json') url = self.base_url + "firewalls" responses.add(responses.GET, url, body=data, status=200, content_type='application/json') firewalls = self.manager.get_all_firewalls() f = firewalls[0] self.assert_get_url_equal(responses.calls[0].request.url, url) self.assertEqual(f.id, "12345") self.assertEqual(f.name, "firewall") self.assertEqual(f.status, "succeeded") self.assertEqual(f.inbound_rules[0].ports, "80") self.assertEqual(f.inbound_rules[0].protocol, "tcp") self.assertEqual(f.inbound_rules[0].sources.load_balancer_uids, ["12345"]) self.assertEqual(f.inbound_rules[0].sources.addresses, []) self.assertEqual(f.inbound_rules[0].sources.tags, []) self.assertEqual(f.outbound_rules[0].ports, "80") self.assertEqual(f.outbound_rules[0].protocol, "tcp") self.assertEqual( f.outbound_rules[0].destinations.load_balancer_uids, []) self.assertEqual(f.outbound_rules[0].destinations.addresses, ["0.0.0.0/0", "::/0"]) self.assertEqual(f.outbound_rules[0].destinations.tags, []) self.assertEqual(f.created_at, "2017-05-23T21:24:00Z") self.assertEqual(f.droplet_ids, [12345]) self.assertEqual(f.tags, []) self.assertEqual(f.pending_changes, []) @responses.activate def test_get_vpc(self): data = self.load_from_file('vpcs/single.json') vpc_id = "5a4981aa-9653-4bd1-bef5-d6bff52042e4" url = self.base_url + 'vpcs/' + vpc_id responses.add(responses.GET, url, body=data, status=200, content_type='application/json') vpc = self.manager.get_vpc(vpc_id) self.assert_get_url_equal(responses.calls[0].request.url, url) self.assertEqual(vpc.id, vpc_id) self.assertEqual(vpc.name, 'my-new-vpc') 
self.assertEqual(vpc.region, 'nyc1') self.assertEqual(vpc.ip_range, '10.10.10.0/24') self.assertEqual(vpc.description, '') self.assertEqual(vpc.urn, 'do:vpc:5a4981aa-9653-4bd1-bef5-d6bff52042e4') self.assertEqual(vpc.created_at, '2020-03-13T18:48:45Z') self.assertEqual(vpc.default, False) @responses.activate def test_get_all_vpcs(self): data = self.load_from_file('vpcs/list.json') url = self.base_url + "vpcs" responses.add(responses.GET, url, body=data, status=200, content_type='application/json') vpcs = self.manager.get_all_vpcs() self.assertEqual(vpcs[0].id, '5a4981aa-9653-4bd1-bef5-d6bff52042e4') self.assertEqual(vpcs[0].name, 'my-new-vpc') self.assertEqual(vpcs[0].created_at, '2020-03-13T19:20:47Z') self.assertEqual(vpcs[0].region, 'nyc1') self.assertEqual(vpcs[0].description, '') self.assertEqual(vpcs[0].urn, 'do:vpc:5a4981aa-9653-4bd1-bef5-d6bff52042e4') self.assertEqual(vpcs[0].ip_range, '10.10.10.0/24') self.assertEqual(vpcs[0].default, False) self.assertEqual(vpcs[1].id, 'e0fe0f4d-596a-465e-a902-571ce57b79fa') self.assertEqual(vpcs[1].name, 'default-nyc1') self.assertEqual(vpcs[1].description, '') self.assertEqual(vpcs[1].urn, 'do:vpc:e0fe0f4d-596a-465e-a902-571ce57b79fa') self.assertEqual(vpcs[1].ip_range, '10.102.0.0/20') self.assertEqual(vpcs[1].created_at, '2020-03-13T19:29:20Z') self.assertEqual(vpcs[1].region, 'nyc1') self.assertEqual(vpcs[1].default, True) if __name__ == '__main__': unittest.main() python-digitalocean-1.16.0/digitalocean/tests/test_project.py000066400000000000000000000216341375555111600244240ustar00rootroot00000000000000import unittest import responses import digitalocean from .BaseTest import BaseTest class TestProject(BaseTest): def setUp(self): super(TestProject, self).setUp() @responses.activate def test_load(self): self.project = digitalocean.Project( id='4e1bfbc3-dc3e-41f2-a18f-1b4d7ba71679', token=self.token) data = self.load_from_file('projects/retrieve.json') project_path = "projects/4e1bfbc3-dc3e-41f2-a18f-1b4d7ba71679" 
        url = self.base_url + project_path
        responses.add(responses.GET, url, body=data, status=200,
                      content_type='application/json')

        self.project.load()

        self.assert_get_url_equal(responses.calls[0].request.url, url)
        self.assertEqual(self.project.id,
                         '4e1bfbc3-dc3e-41f2-a18f-1b4d7ba71679')
        self.assertEqual(self.project.owner_uuid,
                         "99525febec065ca37b2ffe4f852fd2b2581895e7")
        self.assertEqual(self.project.owner_id, 2)
        self.assertEqual(self.project.name, "my-web-api")
        self.assertEqual(self.project.description, "My website API")
        self.assertEqual(self.project.purpose, "Service or API")
        self.assertEqual(self.project.environment, "Production")
        self.assertEqual(self.project.is_default, False)
        self.assertEqual(self.project.updated_at, "2018-09-27T20:10:35Z")
        self.assertEqual(self.project.created_at, "2018-09-27T20:10:35Z")

    @responses.activate
    def test_create_new_project(self):
        data = self.load_from_file('projects/create.json')
        project_path = "projects"
        url = self.base_url + project_path
        responses.add(responses.POST, url, body=data, status=201,
                      content_type='application/json')

        project = digitalocean.Project(token=self.token,
                                       name="my-web-api",
                                       purpose="Service or API",
                                       description="My website API",
                                       environment="Production")
        project.create_project()

        self.assertEqual(responses.calls[0].request.url, url)
        self.assertEqual(project.id, '4e1bfbc3-dc3e-41f2-a18f-1b4d7ba71679')
        self.assertEqual(project.owner_uuid,
                         '99525febec065ca37b2ffe4f852fd2b2581895e7')
        self.assertEqual(project.is_default, False)
        self.assertEqual(project.name, "my-web-api")
        self.assertEqual(project.description, "My website API")
        self.assertEqual(project.purpose, "Service or API")
        self.assertEqual(project.environment, "Production")
        self.assertEqual(project.updated_at, "2018-09-27T15:52:48Z")
        self.assertEqual(project.created_at, "2018-09-27T15:52:48Z")

    @responses.activate
    def test_update_project(self):
        data = self.load_from_file('projects/update.json')
        project = digitalocean.Project(
            token=self.token, id="4e1bfbc3-dc3e-41f2-a18f-1b4d7ba71679")
        project_path = "projects/" + project.id
        url = self.base_url + project_path
        responses.add(responses.PUT, url, body=data, status=200,
                      content_type='application/json')

        project.update_project(name="my-web-api",
                               description="My website API",
                               purpose="Service or API",
                               environment="Staging",
                               is_default=False)

        self.assertEqual(responses.calls[0].request.url, url)
        self.assertEqual(project.is_default, False)
        self.assertEqual(project.name, "my-web-api")
        self.assertEqual(project.description, "My website API")
        self.assertEqual(project.purpose, "Service or API")
        self.assertEqual(project.environment, "Staging")

    @responses.activate
    def test_get_all_resources(self):
        data = self.load_from_file('projects/project_resources.json')
        resource_project = digitalocean.Project(
            token=self.token, id="4e1bfbc3-dc3e-41f2-a18f-1b4d7ba71679")
        url = self.base_url + 'projects/' + resource_project.id + "/resources"
        responses.add(responses.GET, url, body=data, status=200,
                      content_type='application/json')

        all_resources = resource_project.get_all_resources()

        self.assertEqual(len(all_resources), 1)
        self.assertEqual(all_resources[0], "do:droplet:1")

    @responses.activate
    def test_delete(self):
        url = self.base_url + "projects/4e1bfbc3-dc3e-41f2-a18f-1b4d7ba71679"
        responses.add(responses.DELETE, url, status=204,
                      content_type='application/json')

        project_to_be_deleted = digitalocean.Project(
            token=self.token, id="4e1bfbc3-dc3e-41f2-a18f-1b4d7ba71679")
        project_to_be_deleted.delete_project()

        self.assertEqual(responses.calls[0].request.url, url)

    @responses.activate
    def test_update_default_project(self):
        data = self.load_from_file('projects/update.json')
        project = digitalocean.Project(token=self.token, id="default")
        project_path = "projects/" + project.id
        url = self.base_url + project_path
        responses.add(responses.PUT, url, body=data, status=200,
                      content_type='application/json')

        project.update_project(name="my-web-api",
                               description="My website API",
                               purpose="Service or API",
                               environment="Staging",
                               is_default=False)

        self.assertEqual(responses.calls[0].request.url, url)
        self.assertEqual(project.is_default, False)
        self.assertEqual(project.name, "my-web-api")
        self.assertEqual(project.description, "My website API")
        self.assertEqual(project.purpose, "Service or API")
        self.assertEqual(project.environment, "Staging")

    @responses.activate
    def test_assign_resource(self):
        data = self.load_from_file('projects/assign_resources.json')
        resource_project = digitalocean.Project(
            token=self.token, id="4e1bfbc3-dc3e-41f2-a18f-1b4d7ba71679")
        url = self.base_url + 'projects/' + resource_project.id + "/resources"
        responses.add(responses.POST, url, body=data, status=200,
                      content_type='application/json')

        add_resources = {
            "resources": ["do:droplet:1", "do:floatingip:192.168.99.100"]
        }
        result_resources = resource_project.assign_resource(add_resources)

        self.assertEqual(len(result_resources['resources']), 2)
        self.assertEqual(result_resources['resources'][0]['urn'],
                         "do:droplet:1")
        self.assertEqual(result_resources['resources'][1]['urn'],
                         "do:floatingip:192.168.99.100")

    @responses.activate
    def test_list_default_project_resources(self):
        data = self.load_from_file('projects/project_resources.json')
        resource_project = digitalocean.Project(token=self.token, id="default")
        url = self.base_url + 'projects/' + resource_project.id + "/resources"
        responses.add(responses.GET, url, body=data, status=200,
                      content_type='application/json')

        all_resources = resource_project.get_all_resources()

        self.assertEqual(len(all_resources), 1)
        self.assertEqual(all_resources[0], "do:droplet:1")

    @responses.activate
    def test_assign_resource_to_default_project(self):
        data = self.load_from_file('projects/assign_resources.json')
        resource_project = digitalocean.Project(token=self.token, id="default")
        url = self.base_url + 'projects/' + resource_project.id + "/resources"
        responses.add(responses.POST, url, body=data, status=200,
                      content_type='application/json')

        add_resources = {
            "resources": ["do:droplet:1", "do:floatingip:192.168.99.100"]
        }
        result_resources = resource_project.assign_resource(add_resources)

        self.assertEqual(len(result_resources['resources']), 2)
        self.assertEqual(result_resources['resources'][0]['urn'],
                         "do:droplet:1")
        self.assertEqual(result_resources['resources'][1]['urn'],
                         "do:floatingip:192.168.99.100")


if __name__ == '__main__':
    unittest.main()

python-digitalocean-1.16.0/digitalocean/tests/test_snapshot.py

import unittest
import responses
import digitalocean
from .BaseTest import BaseTest


class TestSnapshot(BaseTest):

    def setUp(self):
        super(TestSnapshot, self).setUp()
        self.snapshot = digitalocean.Snapshot(
            id="fbe805e8-866b-11e6-96bf-000f53315a41", token=self.token)

    @responses.activate
    def test_load(self):
        data = self.load_from_file('snapshots/single.json')
        url = "{}snapshots/{}".format(self.base_url, self.snapshot.id)
        responses.add(responses.GET, url, body=data, status=200,
                      content_type='application/json')

        self.snapshot.load()

        self.assert_get_url_equal(responses.calls[0].request.url, url)
        self.assertEqual(self.snapshot.id,
                         "fbe805e8-866b-11e6-96bf-000f53315a41")
        self.assertEqual(self.snapshot.name, 'big-data-snapshot1475170902')
        self.assertEqual(self.snapshot.created_at, "2016-09-29T17:41:42Z")
        self.assertEqual(self.snapshot.size_gigabytes, 1.42)
        self.assertEqual(self.snapshot.min_disk_size, 20)

    @responses.activate
    def test_destroy(self):
        responses.add(responses.DELETE,
                      '{}snapshots/{}/'.format(self.base_url,
                                               self.snapshot.id),
                      status=204,
                      content_type='application/json')

        self.snapshot.destroy()

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url +
                         'snapshots/fbe805e8-866b-11e6-96bf-000f53315a41/')


if __name__ == '__main__':
    unittest.main()

python-digitalocean-1.16.0/digitalocean/tests/test_tag.py

import unittest
import responses
import digitalocean
import json
from .BaseTest import BaseTest


class TestTags(BaseTest):

    def setUp(self):
        super(TestTags, self).setUp()

    @responses.activate
    def test_load(self):
        data = self.load_from_file('tags/single.json')
        url = self.base_url + "tags/awesome"
        responses.add(responses.GET, url, body=data, status=200,
                      content_type='application/json')

        droplet_tag = digitalocean.Tag(name='awesome', token=self.token)
        droplet_tag.load()

        self.assert_get_url_equal(responses.calls[0].request.url, url)
        self.assertEqual(droplet_tag.name, "awesome")

    @responses.activate
    def test_create(self):
        data = self.load_from_file('tags/single.json')
        url = self.base_url + "tags"
        responses.add(responses.POST, url, body=data, status=201,
                      content_type='application/json')

        droplet_tag = digitalocean.Tag(name='awesome', token=self.token)
        droplet_tag.create()

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url + "tags")
        self.assertEqual(droplet_tag.name, "awesome")

    @responses.activate
    def test_delete(self):
        url = self.base_url + "tags/awesome"
        responses.add(responses.DELETE, url, status=204,
                      content_type='application/json')

        droplet_tag = digitalocean.Tag(name='awesome', token=self.token)
        droplet_tag.delete()

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url + "tags/awesome")
        self.assertEqual(droplet_tag.name, "awesome")

    @responses.activate
    def test_add_droplets(self):
        url = self.base_url + "tags/awesome/resources"
        responses.add(responses.POST, url, status=204,
                      content_type='application/json')

        droplet_tag = digitalocean.Tag(name='awesome', token=self.token)
        droplet_tag.add_droplets(["9569411"])

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url + "tags/awesome/resources")

    @responses.activate
    def test_remove_droplets(self):
        url = self.base_url + "tags/awesome/resources"
        responses.add(responses.DELETE, url, status=204,
                      content_type='application/json')

        droplet_tag = digitalocean.Tag(name='awesome', token=self.token)
        droplet_tag.remove_droplets(["9569411"])

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url + "tags/awesome/resources")

    @responses.activate
    def test_add_volume_snapshots(self):
        url = self.base_url + "tags/awesome/resources"
        responses.add(responses.POST, url, status=204,
                      content_type='application/json')

        tag = digitalocean.Tag(name='awesome', token=self.token)
        tag.add_snapshots(["9569411"])

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url + "tags/awesome/resources")

    @responses.activate
    def test_remove_volume_snapshots(self):
        url = self.base_url + "tags/awesome/resources"
        responses.add(responses.DELETE, url, status=204,
                      content_type='application/json')

        tag = digitalocean.Tag(name='awesome', token=self.token)
        tag.remove_snapshots(["9569411"])

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url + "tags/awesome/resources")


if __name__ == '__main__':
    unittest.main()

python-digitalocean-1.16.0/digitalocean/tests/test_volume.py

import unittest
import responses
import digitalocean
from .BaseTest import BaseTest


class TestVolume(BaseTest):

    def setUp(self):
        super(TestVolume, self).setUp()
        self.volume = digitalocean.Volume(
            id='506f78a4-e098-11e5-ad9f-000f53306ae1', token=self.token)

    @responses.activate
    def test_load(self):
        data = self.load_from_file('volumes/single.json')
        volume_path = "volumes/506f78a4-e098-11e5-ad9f-000f53306ae1"
        url = self.base_url + volume_path
        responses.add(responses.GET, url, body=data, status=200,
                      content_type='application/json')

        self.volume.load()

        self.assert_get_url_equal(responses.calls[0].request.url, url)
        self.assertEqual(self.volume.id,
                         "506f78a4-e098-11e5-ad9f-000f53306ae1")
        self.assertEqual(self.volume.size_gigabytes, 100)

    @responses.activate
    def test_create(self):
        data = self.load_from_file('volumes/single.json')
        url = self.base_url + "volumes/"
        responses.add(responses.POST, url, body=data, status=201,
                      content_type='application/json')

        volume = digitalocean.Volume(droplet_id=12345,
                                     region='nyc1',
                                     size_gigabytes=100,
                                     filesystem_type='ext4',
                                     filesystem_label='label',
                                     token=self.token).create()

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url + "volumes/")
        self.assertEqual(volume.id, "506f78a4-e098-11e5-ad9f-000f53306ae1")
        self.assertEqual(volume.size_gigabytes, 100)
        self.assertEqual(volume.filesystem_type, "ext4")

    @responses.activate
    def test_create_with_tags(self):
        data = self.load_from_file('volumes/single_with_tags.json')
        url = self.base_url + "volumes/"
        responses.add(responses.POST, url, body=data, status=201,
                      content_type='application/json')

        volume = digitalocean.Volume(droplet_id=12345,
                                     region='nyc1',
                                     size_gigabytes=100,
                                     filesystem_type='ext4',
                                     filesystem_label='label',
                                     tags=['tag1', 'tag2'],
                                     token=self.token).create()

        self.assertEqual(volume.tags, ['tag1', 'tag2'])
        self.assertEqual(responses.calls[0].request.url,
                         self.base_url + "volumes/")
        self.assertEqual(volume.id, "506f78a4-e098-11e5-ad9f-000f53306ae1")
        self.assertEqual(volume.size_gigabytes, 100)
        self.assertEqual(volume.filesystem_type, "ext4")

    @responses.activate
    def test_create_from_snapshot(self):
        data = self.load_from_file('volumes/single.json')
        url = self.base_url + "volumes/"
        responses.add(responses.POST, url, body=data, status=201,
                      content_type='application/json')

        volume = digitalocean.Volume(droplet_id=12345,
                                     snapshot_id='234234qwer',
                                     region='nyc1',
                                     size_gigabytes=100,
                                     filesystem_type='ext4',
                                     filesystem_label='label',
                                     token=self.token).create()

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url + "volumes/")
        self.assertEqual(volume.id, "506f78a4-e098-11e5-ad9f-000f53306ae1")
        self.assertEqual(volume.size_gigabytes, 100)
        self.assertEqual(volume.filesystem_type, "ext4")

    @responses.activate
    def test_destroy(self):
        volume_path = "volumes/506f78a4-e098-11e5-ad9f-000f53306ae1/"
        url = self.base_url + volume_path
        responses.add(responses.DELETE, url, status=204,
                      content_type='application/json')

        self.volume.destroy()

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url + volume_path)

    @responses.activate
    def test_attach(self):
        data = self.load_from_file('volumes/attach.json')
        volume_path = "volumes/" + self.volume.id + "/actions/"
        url = self.base_url + volume_path
        responses.add(responses.POST, url, body=data, status=201,
                      content_type='application/json')

        res = self.volume.attach(droplet_id=12345, region='nyc1')

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url + volume_path)
        self.assertEqual(res['action']['type'], 'attach_volume')
        self.assertEqual(res['action']['status'], 'completed')
        self.assertEqual(res['action']['id'], 72531856)

    @responses.activate
    def test_detach(self):
        data = self.load_from_file('volumes/detach.json')
        volume_path = "volumes/" + self.volume.id + "/actions/"
        url = self.base_url + volume_path
        responses.add(responses.POST, url, body=data, status=201,
                      content_type='application/json')

        res = self.volume.detach(droplet_id=12345, region='nyc1')

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url + volume_path)
        self.assertEqual(res['action']['type'], 'detach_volume')
        self.assertEqual(res['action']['status'], 'in-progress')
        self.assertEqual(res['action']['id'], 68212773)

    @responses.activate
    def test_resize(self):
        data = self.load_from_file('volumes/resize.json')
        volume_path = "volumes/" + self.volume.id + "/actions/"
        url = self.base_url + volume_path
        responses.add(responses.POST, url, body=data, status=201,
                      content_type='application/json')

        res = self.volume.resize(region='nyc1', size_gigabytes=1000)

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url + volume_path)
        self.assertEqual(res['action']['type'], 'resize_volume')
        self.assertEqual(res['action']['status'], 'in-progress')
        self.assertEqual(res['action']['id'], 72531856)

    @responses.activate
    def test_snapshot(self):
        data = self.load_from_file('volumes/snapshot.json')
        volume_path = "volumes/" + self.volume.id + "/snapshots/"
        url = self.base_url + volume_path
        responses.add(responses.POST, url, body=data, status=201,
                      content_type='application/json')

        res = self.volume.snapshot(name='big-data-snapshot1475261774')

        self.assertEqual(responses.calls[0].request.url,
                         self.base_url + volume_path)
        self.assertEqual(res['snapshot']['resource_type'], 'volume')
        self.assertEqual(res['snapshot']['min_disk_size'], 10)
        self.assertEqual(res['snapshot']['size_gigabytes'], 20.2)
        self.assertEqual(res['snapshot']['id'],
                         '8fa70202-873f-11e6-8b68-000f533176b1')

    @responses.activate
    def test_get_snapshots(self):
        data = self.load_from_file('volumes/snapshots.json')
        volume_path = "volumes/" + self.volume.id + "/snapshots/"
        url = self.base_url + volume_path
        responses.add(responses.GET, url, body=data, status=201,
                      content_type='application/json')

        res = self.volume.get_snapshots()

        self.assert_get_url_equal(responses.calls[0].request.url,
                                  self.base_url + volume_path)
        self.assertEqual(len(res), 2)
        self.assertEqual(res[0].id, '8eb4d51a-873f-11e6-96bf-000f53315a41')
        self.assertEqual(res[0].name, 'big-data-snapshot1475261752')
        self.assertEqual(res[0].size_gigabytes, 20.2)
        self.assertEqual(res[1].id, '8eb4d51a-873f-11e6-96bf-000f53315a42')
        self.assertEqual(res[1].name, 'big-data-snapshot1475261752-2')
        self.assertEqual(res[1].size_gigabytes, 40.4)


if __name__ == '__main__':
    unittest.main()

python-digitalocean-1.16.0/digitalocean/tests/test_vpc.py

import json
import unittest
import responses
import digitalocean
from .BaseTest import BaseTest


class TestVPC(BaseTest):

    def setUp(self):
        super(TestVPC, self).setUp()
        self.vpc_id = '5a4981aa-9653-4bd1-bef5-d6bff52042e4'
        self.vpc = digitalocean.VPC(id=self.vpc_id, token=self.token)

    @responses.activate
    def test_load(self):
        data = self.load_from_file('vpcs/single.json')
        url = self.base_url + 'vpcs/' + self.vpc_id
        responses.add(responses.GET, url, body=data, status=200,
                      content_type='application/json')

        self.vpc.load()
        self.assert_get_url_equal(responses.calls[0].request.url, url)
        self.assertEqual(self.vpc.id, self.vpc_id)
        self.assertEqual(self.vpc.name, 'my-new-vpc')
        self.assertEqual(self.vpc.region, 'nyc1')
        self.assertEqual(self.vpc.ip_range, '10.10.10.0/24')
        self.assertEqual(self.vpc.description, '')
        self.assertEqual(self.vpc.urn,
                         'do:vpc:5a4981aa-9653-4bd1-bef5-d6bff52042e4')
        self.assertEqual(self.vpc.created_at, '2020-03-13T18:48:45Z')
        self.assertEqual(self.vpc.default, False)

    @responses.activate
    def test_create(self):
        data = self.load_from_file('vpcs/single.json')
        url = self.base_url + 'vpcs'
        responses.add(responses.POST, url, body=data, status=201,
                      content_type='application/json')

        vpc = digitalocean.VPC(name='my-new-vpc',
                               region='nyc1',
                               ip_range='10.10.10.0/24',
                               token=self.token).create()

        self.assertEqual(responses.calls[0].request.url, url)
        self.assertEqual(vpc.id, '5a4981aa-9653-4bd1-bef5-d6bff52042e4')
        self.assertEqual(vpc.name, 'my-new-vpc')
        self.assertEqual(vpc.ip_range, '10.10.10.0/24')
        self.assertEqual(vpc.description, '')
        self.assertEqual(vpc.urn,
                         'do:vpc:5a4981aa-9653-4bd1-bef5-d6bff52042e4')
        self.assertEqual(vpc.created_at, '2020-03-13T18:48:45Z')

    @responses.activate
    def test_rename(self):
        data = self.load_from_file('vpcs/single.json')
        url = self.base_url + 'vpcs/' + self.vpc_id
        responses.add(responses.PATCH, url, body=data, status=200,
                      content_type='application/json')

        self.vpc.rename('my-new-vpc')

        self.assertEqual(responses.calls[0].request.url, url)
        self.assertEqual(self.vpc.id, '5a4981aa-9653-4bd1-bef5-d6bff52042e4')
        self.assertEqual(self.vpc.name, 'my-new-vpc')
        self.assertEqual(self.vpc.created_at, '2020-03-13T18:48:45Z')

    @responses.activate
    def test_destroy(self):
        url = self.base_url + 'vpcs/' + self.vpc_id
        responses.add(responses.DELETE, url, status=204,
                      content_type='application/json')

        self.vpc.destroy()

        self.assertEqual(responses.calls[0].request.url, url)


if __name__ == '__main__':
    unittest.main()
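The test files above all follow one pattern: register a canned JSON body for an endpoint, call the client method, then assert on the recorded request URL and the parsed attributes. The real suite uses the `responses` package for this; the sketch below shows the same pattern with only the standard library (`unittest.mock` in place of `responses`, and a hypothetical `get_vpc` helper standing in for the library's client method — neither name is part of python-digitalocean).

```python
import json
import unittest
import urllib.request
from unittest import mock

BASE_URL = "https://api.digitalocean.com/v2/"  # same base the tests assume


def get_vpc(vpc_id):
    """Hypothetical client helper: fetch one VPC and return its 'vpc' dict."""
    url = BASE_URL + "vpcs/" + vpc_id
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read())["vpc"]


class TestGetVPC(unittest.TestCase):
    def test_get_vpc(self):
        # Canned response body, playing the role of load_from_file(...).
        canned = {"vpc": {"id": "abc123", "name": "my-new-vpc"}}
        fake_resp = mock.MagicMock()
        fake_resp.read.return_value = json.dumps(canned).encode()
        fake_resp.__enter__.return_value = fake_resp

        # Patch the network call, mirroring what @responses.activate does.
        with mock.patch("urllib.request.urlopen",
                        return_value=fake_resp) as fake_open:
            vpc = get_vpc("abc123")

        # Assert on the recorded request URL, then on the parsed attributes.
        fake_open.assert_called_once_with(BASE_URL + "vpcs/abc123")
        self.assertEqual(vpc["name"], "my-new-vpc")
```

No network traffic happens: the patched `urlopen` returns the canned body, so the test exercises only URL construction and response parsing, exactly what the `responses`-based tests above verify.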
python-digitalocean-1.16.0/docs/Makefile

# Minimal makefile for Sphinx documentation
#

# You can set these variables from the command line.
SPHINXOPTS    =
SPHINXBUILD   = sphinx-build
SPHINXPROJ    = PythonDigitalocean
SOURCEDIR     = .
BUILDDIR      = _build

# Put it first so that "make" without argument is like "make help".
help:
	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

.PHONY: help Makefile

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

python-digitalocean-1.16.0/docs/conf.py

# -*- coding: utf-8 -*-
#
# Python Digitalocean documentation build configuration file, created by
# sphinx-quickstart on Wed Jan 25 13:52:17 2017.
#
# This file is execfile()d with the current directory set to its
# containing dir.
#
# Note that not all possible configuration values are present in this
# autogenerated file.
#
# All configuration values have a default; values that are commented out
# serve to show the default.

# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))

# -- General configuration ------------------------------------------------

# If your documentation needs a minimal Sphinx version, state it here.
#
# needs_sphinx = '1.0'

# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
    'sphinx.ext.autodoc',
    'sphinx.ext.intersphinx',
    'sphinx.ext.coverage',
    'sphinx.ext.viewcode',
    'sphinx.ext.napoleon',
]

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']

# The suffix(es) of source filenames.
# You can specify multiple suffix as a list of string:
#
# source_suffix = ['.rst', '.md']
source_suffix = '.rst'

# The master toctree document.
master_doc = 'index'

# General information about the project.
project = u'Python Digitalocean'
copyright = u'2020, Lorenzo Setale'
author = u'Lorenzo Setale'

# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
version = u'1.16.0'
# The full version, including alpha/beta/rc tags.
release = u'1.16.0'

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#
# This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases.
language = None

# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This patterns also effect to html_static_path and html_extra_path
exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']

# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'

# If true, `todo` and `todoList` produce output, else they produce nothing.
todo_include_todos = False

# -- Options for HTML output ----------------------------------------------

# The theme to use for HTML and HTML Help pages.  See the documentation for
# a list of builtin themes.
#
html_theme = 'alabaster'

# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
#
# html_theme_options = {}

# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
#
html_static_path = ['_static']

# -- Options for HTMLHelp output ------------------------------------------

# Output file base name for HTML help builder.
htmlhelp_basename = 'PythonDigitaloceandoc'

# -- Options for LaTeX output ---------------------------------------------

latex_elements = {
    # The paper size ('letterpaper' or 'a4paper').
    #
    # 'papersize': 'letterpaper',

    # The font size ('10pt', '11pt' or '12pt').
    #
    # 'pointsize': '10pt',

    # Additional stuff for the LaTeX preamble.
    #
    # 'preamble': '',

    # Latex figure (float) alignment
    #
    # 'figure_align': 'htbp',
}

# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
#  author, documentclass [howto, manual, or own class]).
latex_documents = [
    (master_doc, 'PythonDigitalocean.tex',
     u'Python Digitalocean Documentation',
     u'Lorenzo Setale', 'manual'),
]

# -- Options for manual page output ---------------------------------------

# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
    (master_doc, 'pythondigitalocean',
     u'Python Digitalocean Documentation',
     [author], 1)
]

# -- Options for Texinfo output -------------------------------------------

# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
#  dir menu entry, description, category)
texinfo_documents = [
    (master_doc, 'PythonDigitalocean',
     u'Python Digitalocean Documentation',
     author, 'PythonDigitalocean',
     'One line description of project.',
     'Miscellaneous'),
]

# -- Options for Epub output ----------------------------------------------

# Bibliographic Dublin Core info.
epub_title = project
epub_author = author
epub_publisher = author
epub_copyright = copyright

# The unique identifier of the text. This can be a ISBN number
# or the project homepage.
#
# epub_identifier = ''

# A unique identification for the text.
#
# epub_uid = ''

# A list of files that should not be packed into the epub file.
epub_exclude_files = ['search.html']

# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {'https://docs.python.org/': None}

python-digitalocean-1.16.0/docs/digitalocean.rst

digitalocean package
====================

Submodules
----------

digitalocean.Account module
---------------------------

.. automodule:: digitalocean.Account
    :members:
    :undoc-members:
    :show-inheritance:

digitalocean.Action module
--------------------------

.. automodule:: digitalocean.Action
    :members:
    :undoc-members:
    :show-inheritance:

digitalocean.Domain module
--------------------------

.. automodule:: digitalocean.Domain
    :members:
    :undoc-members:
    :show-inheritance:

digitalocean.Droplet module
---------------------------

.. automodule:: digitalocean.Droplet
    :members:
    :undoc-members:
    :show-inheritance:

digitalocean.FloatingIP module
------------------------------

.. automodule:: digitalocean.FloatingIP
    :members:
    :undoc-members:
    :show-inheritance:

digitalocean.Image module
-------------------------

.. automodule:: digitalocean.Image
    :members:
    :undoc-members:
    :show-inheritance:

digitalocean.Kernel module
--------------------------

.. automodule:: digitalocean.Kernel
    :members:
    :undoc-members:
    :show-inheritance:

digitalocean.LoadBalancer module
--------------------------------

.. automodule:: digitalocean.LoadBalancer
    :members:
    :undoc-members:
    :show-inheritance:

digitalocean.Manager module
---------------------------

.. automodule:: digitalocean.Manager
    :members:
    :undoc-members:
    :show-inheritance:

digitalocean.Metadata module
----------------------------

.. automodule:: digitalocean.Metadata
    :members:
    :undoc-members:
    :show-inheritance:

digitalocean.Record module
--------------------------

.. automodule:: digitalocean.Record
    :members:
    :undoc-members:
    :show-inheritance:

digitalocean.Region module
--------------------------

.. automodule:: digitalocean.Region
    :members:
    :undoc-members:
    :show-inheritance:

digitalocean.SSHKey module
--------------------------

.. automodule:: digitalocean.SSHKey
    :members:
    :undoc-members:
    :show-inheritance:

digitalocean.Size module
------------------------

.. automodule:: digitalocean.Size
    :members:
    :undoc-members:
    :show-inheritance:

digitalocean.Tag module
-----------------------

.. automodule:: digitalocean.Tag
    :members:
    :undoc-members:
    :show-inheritance:

digitalocean.Volume module
--------------------------

.. automodule:: digitalocean.Volume
    :members:
    :undoc-members:
    :show-inheritance:

digitalocean.baseapi module
---------------------------

.. automodule:: digitalocean.baseapi
    :members:
    :undoc-members:
    :show-inheritance:

Module contents
---------------

.. automodule:: digitalocean
    :members:
    :undoc-members:
    :show-inheritance:

python-digitalocean-1.16.0/docs/index.rst

.. Python Digitalocean documentation master file, created by
   sphinx-quickstart on Wed Jan 25 13:52:17 2017.
   You can adapt this file completely to your liking, but it should at least
   contain the root `toctree` directive.

Welcome to Python Digitalocean's documentation!
===============================================

.. toctree::
   :maxdepth: 4
   :caption: Contents:

   digitalocean

Indices and tables
==================

* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`

python-digitalocean-1.16.0/requirements.txt

requests>=2.2.1
jsonpickle

# Testing requirements
pytest
responses
mock; python_version < '3.3'

python-digitalocean-1.16.0/setup.py

#!/usr/bin/env python
import os

try:
    from setuptools import setup
except ImportError:
    from ez_setup import use_setuptools
    use_setuptools()
    from setuptools import setup

long_description = """This library provides easy access to Digital Ocean APIs
to deploy droplets, images and more."""

if os.path.isfile("DESCRIPTION.rst"):
    with open('DESCRIPTION.rst') as file:
        long_description = file.read()

setup(
    name='python-digitalocean',
    version='1.16.0',
    description='digitalocean.com API to manage Droplets and Images',
    author='Lorenzo Setale ( http://who.is.lorenzo.setale.me/? )',
    author_email='lorenzo@setale.me',
    url='https://github.com/koalalorenzo/python-digitalocean',
    packages=['digitalocean'],
    install_requires=['requests', 'jsonpickle'],
    test_suite='digitalocean.tests',
    license='LGPL v3',
    long_description=long_description
)
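setup.py above guards the `long_description` read so that a missing DESCRIPTION.rst falls back to the inline summary instead of breaking installation. The same guarded-read pattern can be factored into a small helper; `read_long_description` is a hypothetical name for illustration, not part of the package.

```python
import os

# Fallback text mirroring the inline summary in setup.py.
FALLBACK = ("This library provides easy access to Digital Ocean APIs "
            "to deploy droplets, images and more.")


def read_long_description(path="DESCRIPTION.rst", fallback=FALLBACK):
    """Return the contents of `path` if it exists, else the fallback text.

    Mirrors the guard in setup.py: a missing DESCRIPTION.rst must not
    break `pip install`, so the short inline summary is used instead.
    """
    if os.path.isfile(path):
        with open(path) as f:
            return f.read()
    return fallback
```

With the helper, setup.py's module-level logic collapses to `long_description=read_long_description()`, and the fallback behavior becomes directly unit-testable.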