aodhclient-3.7.1/.coveragerc

[run]
branch = True
source = aodhclient
omit = aodhclient/tests/*

[report]
ignore_errors = True

aodhclient-3.7.1/.stestr.conf

[DEFAULT]
test_path=./aodhclient/tests/unit
top_dir=./

aodhclient-3.7.1/.zuul.yaml

- job:
    name: aodhclient-dsvm-functional
    parent: devstack-tox-functional
    description: |
      Devstack-based functional tests for aodhclient.
    required-projects:
      - openstack/python-aodhclient
      - openstack/aodh
      # We need ceilometer's devstack plugin to install gnocchi
      - openstack/ceilometer
      - gnocchixyz/gnocchi
      - openstack-k8s-operators/sg-core
    timeout: 4200
    vars:
      devstack_localrc:
        AODH_SERVICE_HOST: localhost
        AODH_DEPLOY: uwsgi
        CEILOMETER_BACKENDS: "gnocchi,sg-core"
        PROMETHEUS_SERVICE_SCRAPE_TARGETS: prometheus,sg-core
      devstack_plugins:
        aodh: https://opendev.org/openstack/aodh
        ceilometer: https://opendev.org/openstack/ceilometer
        sg-core: https://github.com/openstack-k8s-operators/sg-core

- project:
    queue: telemetry
    templates:
      - check-requirements
      - openstack-python3-jobs
      - publish-openstack-docs-pti
      - release-notes-jobs-python3
      - openstackclient-plugin-jobs
    check:
      jobs:
        - aodhclient-dsvm-functional:
            irrelevant-files: &ac-irrelevant-files
              - ^(test-|)requirements.txt$
              - ^setup.cfg$
              - ^.*\.rst$
              - ^releasenotes/.*$
    gate:
      jobs:
        - aodhclient-dsvm-functional:
            irrelevant-files: *ac-irrelevant-files

aodhclient-3.7.1/AUTHORS

98k <18552437190@163.com> Andreas Jaeger Bhujay Kumar Bhatta Cao Xuan Hoang Chaozhe.Chen Corey Bryant Dan Radez David Rabel Doug Hellmann Dr.
Jens Harbott Emma Foley Eyal Ghanshyam Mann Ha Manh Dong Hanxi Liu Hemanth Nakkina Hervé Beraud Jake Yip Jaromir Wysoglad Jaromír Wysoglad Jon Schlueter Julien Danjou KATO Tomoyuki Kevin_Zheng Lingxian Kong Martin Mágr Matthias Bastian Mehdi ABAAKOUK Mehdi Abaakouk Mehdi Abaakouk Monty Taylor Nguyen Hai OpenStack Release Bot PanFengyun Pradeep Kilambi Rui Yuan Dou Sean McGinnis Simon Merrick Stephen Finucane Stéphane Albert Swapnil Kulkarni (coolsvap) Takashi Kajinami Takashi Kajinami Thomas Bechtold Tony Breeds Tovin Seven Vieri <15050873171@163.com> Yadnesh Kulkarni Zhao Lei ZhiQiang Fan ZhongShengping Zi Lian Ji caoyuan gord chung gordon chung houweichao jacky06 kangyufei lipan liusheng liyuanzhen pengyuesheng qingszhao rabi rajat29 venkatamahesh wangzihao whoami-rajat wu.chunyang xialinjuan xiaozhuangqing zhangguoqing zhangyangyang ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0 aodhclient-3.7.1/CONTRIBUTING.rst0000664000175000017500000000126200000000000016321 0ustar00zuulzuul00000000000000If you would like to contribute to the development of OpenStack, you must follow the steps in this page: https://docs.openstack.org/infra/manual/developers.html If you already have a good understanding of how the system works and your OpenStack accounts are set up, you can skip to the development workflow section of this documentation to learn how changes to OpenStack should be submitted for review via the Gerrit tool: https://docs.openstack.org/infra/manual/developers.html#development-workflow Pull requests submitted through GitHub will be ignored. Bugs should be filed on Launchpad, not GitHub: https://storyboard.openstack.org/#!/project/openstack/python-aodhclient ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741251064.0 aodhclient-3.7.1/ChangeLog0000664000175000017500000002676100000000000015445 0ustar00zuulzuul00000000000000CHANGES ======= 3.7.1 ----- * Switch to openstack-k8s-operators/sg-core * reno: Update master for unmaintained/2023.1 3.7.0 ----- * Bump pbr * Add note about requirements lower bounds * zuul: Use telemetry queue * Publish release notes * Remove Python 3.8 support * Avoid overriding the base url template * Add requirements check job * Enable GLOBAL\_VENV * Update master for stable/2024.2 3.6.0 ----- * Use global upper-constraints * Remove old excludes * reno: Update master for unmaintained/zed * reno: Update master for unmaintained/xena * reno: Update master for unmaintained/wallaby * reno: Update master for unmaintained/victoria * Update master for stable/2024.1 3.5.1 ----- * Remove oslo.db test cap * Add functional tests for prometheus type alarms 3.5.0 ----- * Fix releasenotes build of yoga moved to unmaintained * Bump hacking * Exclude tests directory from coverage calculation * Move functional test to dsvm and remove pifpaf * Update python classifier in setup.cfg * Bump upper version of db libraries in test requirements 3.4.0 ----- * Prometheus alarms * Fix gate * Update master for stable/2023.2 * Fix bindep.txt for python 3.11 job(Debian Bookworm) 3.3.0 ----- * Move test requirements to tox.ini * Update master for stable/2023.1 3.2.0 ----- * Fix CI jobs(py38, py310) and upload of wheels package to PyPi 3.1.0 ----- * Make tox.ini tox 4.0 compatible * Switch to 2023.1 Python3 unit tests and generic template name * Update master for stable/zed 3.0.0 ----- * Unblock the gate and update python testing * Remove six * Update master for stable/yoga 2.4.1 ----- * Add Python 
3 only classifier * setup.cfg: Replace dashes by underscores * Add Python3 yoga unit tests 2.4.0 ----- * Fix aodhclient for pyparse 3.0.6 * Update master for stable/xena * Add Python3 xena unit tests 2.3.0 ----- * Update master for stable/wallaby 2.2.0 ----- * Add Python3 wallaby unit tests * Update master for stable/victoria 2.1.1 ----- * Don't use \*/\* in accept header * Remove translation sections from setup.cfg 2.1.0 ----- * Stop to use the \_\_future\_\_ module * Switch to newer openstackdocstheme and reno versions * Bump default tox env from py37 to py38 * Add py38 package metadata * Use unittest.mock instead of third party mock * Add Python3 victoria unit tests * Update master for stable/ussuri 2.0.1 ----- * Cleanup py27 support * Update hacking for Python3 2.0.0 ----- * Support quota CLI * tox: Keeping going with docs * Drop python 2.7 support and testing 1.5.0 ----- * Support threshold type alarm again * Switch to Ussuri jobs 1.4.0 ----- * Add support for loadbalancer\_member\_health alarms * Update master for stable/train * Bump the openstackdocstheme extension to 1.20 1.3.0 ----- * Blacklist sphinx 2.1.0 (autodoc bug) * Add Python 3 Train unit tests * Sync Sphinx requirement * Updating alarm for different alarm types * Remove telemetry-tox-py37 * allow an empty string to unset options * Replace git.openstack.org URLs with opendev.org URLs * OpenDev Migration Patch * Dropping the py35 testing * missing result of the alarm show by name * Add OSprofiler support for Aodh client * add python 3.6 unit test job * Update master for stable/stein * add python 3.7 unit test job 1.2.0 ----- * Change openstack-dev to openstack-discuss * Add Python 3.6 classifier to setup.cfg * Add metavar for alarm\_id in alarm-history show * add python 3.6 unit test job * switch documentation job to new PTI * import zuul job settings from project-config * Update reno for stable/rocky * Add gating on py37 * Switch to use stestr for unit test 1.1.1 ----- * Consider interface and region options with OSC 1.1.0 ----- * Fixup README * fix tox python3 overrides * fix warnings in documentation build * cli: replace metrics by metric * Follow the new PTI for document build * Not able to set threshold value to ‘0’ in aodh alarm creation * Update reno for stable/queens 1.0.0 ----- * remove threshold alarms * Set the Cliff namespace * Fix doc builds * delete --debug in a shell command * Remove setting of version/release from releasenotes * doc: remove mention of combination alarms * Cleanup setup.cfg * Update reno for stable/pike * Docs: switch to openstackdocstheme * Fix gnocchi/pifpaf requirements * Update documentation URLs * Fix Gnocchi tarball URL * block sphinx 1.6.1 * cleanup coveragerc * tests: stop using Ceilometer legacy resource types * Remove log translations 0.9.0 ----- * Remove upper cap on pbr * Clean deprecated "query" paremeter in alarm.list() * Correct the port number of gnocchi endpoint * Trivial-fix: make the capabilities help message more specific in OSC * Remove the duplicated \_\_version\_\_ definition * Trivial: remove support for py34 * Update reno for stable/ocata * Remove unused test dependency 0.8.0 ----- * Extras usage for test requirements + basic auth against Gnocch * Modified the help info * Enable coverage report in console output * [doc] Note lack of constraints is a choice * Replaces uuid.uuid4 with uuidutils.generate\_uuid() * Bump hacking to 0.12 * Add plug-in summary for osc doc * Enable release notes translation * Add missing pyparsing to requirements.txt * Clean imports in code * 
Update reno for stable/newton * Remove 'MANIFEST.in' 0.7.0 ----- * Use osc-lib instead of openstackclient * Trival: Remove unused logging import * Fix incorrect string format mapping * Remove discover from test-requirements 0.6.0 ----- * Add Python 3.5 classifier and venv * Add event\_id for alarm-history display * Add pagination support in Aodhclient * Split alarm query and alarm list in SDK layer * Support get/set alarm state interfaces * Add releasenotes for Newton updates * add default value for http\_status in ClientException * fix help text for option --composite-rule * improve threshold & event alarm query formatting * Add explanation for how to get meter name * Add unit test to validate composite alarm args * exceptions: make error code be implicit * Update the home-page with developer documentation * remove tempest test for alarm name unique constraint * Only install hacking in pep8 * shell: treat alarm id as name if it is not uuid like * Bump hacking to 0.11 0.5.0 ----- * add valid choice for alarm type help text * shell: fix meter help string * Allow to start Aodh command with OSC * Prepare compatibility with OSC * fix KeyError when credential is invalid * remove unnecessary slashes * Enable releasenotes documentation * [Trivial] Remove executable privilege of doc/source/conf.py * add mock to test requirement list * refactor code of adding alarm id and name arguments * Switch from deprecated tempest-lib to tempest 0.4.0 ----- * Remove keystonemiddleware dependency * Use pifpaf to setup tests * Remove Babel dependency * format the output of time\_constraints * Add --filter to "alarm list" * Remove unused mailmap file * Support easier query usage for alarm history search * Add name support for show, update and delete * Correct and add examples in docs of CLI * Add --query to alarm list * Clean flake8 paths and unused config options * Don't support "alarm search" and mandatory --type in "alarm list" * tox: remove useless install\_command * Use overtest to run Gnocchi in tests * Trival fix a typo * Fix aodh client fails when command with the arg --time-constraint * Fix the -q/--query in threshold alarm creation * Use assert\_called\_once\_with() in test\_alarm\_cli.py * Fix the wrong error info of parameter resource-type * Fix a minor doc error * Remove unused package requirement of "futurist" 0.3.0 ----- * Use cli.base.execute of tempest-lib * Add composite rule alarm support in aodhclient * Make the alarm list output more concise 0.2.0 ----- * Skip the version 1.16.0 of cliff * Remove unused method * Add abbreviation of --meter-name hint in threshold alarm creation * Return the error message of API request failure * Add some unit test cases * Add gnocchi alarm rules * cleanup existing tests * add alarm history unit test * check the alarm type when list alarm 0.1.0 ----- * add alarm-history interface * clean up docs * add event alarm support * Clean flake8 ignore * support alarm type param * add alarm interface * fix test debug * fix functional tests * drop remaining gnocchi specific code and fix pep8 * remove gnocchi specific tests * remove gnocchi specific commands and set to v2 * more gnocchi to aodh rename cleanup * rename gnocchi to aodh * Make the wheel universal * Allow to set resource type for aggregation * Examples in doc should be workable * Handle rename from dbsync to upgrade * Fixed history parameter having no effect on search * tests: update setup script to use Paste * Add some help messages of Gnocchi commands * metric: fix measures --stop argument * Creates better 
exceptions for http code 409 * resource: fix help string * tests: do not create a archive policy on setup * resource: define several option for all metric action * benchmark: Add more measurements stats * benchmark: add measures show * metric: rename metric get to show * Set x-roles when noauth is used * Convert keystoneauth exceptions by ours * Fix some spelling typo in manual and help output * Always make resource\_id positional * cli: make resource type non positional * benchmark: use processes rather than threads * benchmark: add support for measures add * metric: change all resource\_id options to -r * benchmark: add get metric * benchmark: rename argument to \`count' * benchmark: add metric create * get gnocchi api version from args/env * noauth: simplify class reference * Add support for /v1/status * shell: remove unused LOG * shell: remove dead code * shell: move command definition to the class using it * metric: change delete to allow deleting several metric at once * Add capabilities support to cli * Remove upper cap on Sphinx * Remove Python 3.3 support * Name the package \`gnocchiclient' * metric: allow to create metric with name and no resource\_id * shell: improve error info * Add six in requirements * shell: fix exception catching on cleanup * Replaces 'archive policy' by 'archive-policy' * Change ignore-errors to ignore\_errors * Remove entry\_points * Don't always load client * Remove now useless build\_url method * Replace keystoneclient by keystoneauth1 * Fix help typo * tox: Allow to pass some OS\_\* variables * Support aggregation accross metrics and resources * Allow CRUD measurements * Allow CRUD metric with an resource\_id and an name * Support CRUD for metric * resource: remove -q for --query it conflicts with --quiet * Allow quoted string with ' in query parser * cli: fix archive policy create output * docs: fix archive policy rule name doc * docs: enhance reference title * docs: delete previous build * Use 'archive policy' instead of 'archivepolicy' * Fix archive policy rule list * ap rule: Use named attributes * Fix archivepolicy list * ap: allows multiple definition and name attributes * test: remove duplicated code * Add archive policy rule commands to cli * Add archive policy commands to cli * Fix py34 tests * doc: fix typo in shell.rst * functional tests: don't require devstack * fixes noauth * doc: fix typo * Allow to disable the authentification layer * doc: Fixing a shell parameter * Allow limit/sort/marker for resource listing * Fixes logging * Allow resource history * Write the base of the documentation * Allow to pass the gnocchi enpoint via cli * Update documentation * Allow search for resource * Allow add/delete metrics with resource * Add ssl/region/timeout/interface parameters * Add functional test for crud resource * Hide useless logging * Add 'delete resource' * Adding resource update * Allow to create resources * Fixes "resource show" * Initial client code * Initial Cookiecutter Commit ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0 aodhclient-3.7.1/HACKING.rst0000664000175000017500000000022100000000000015450 0ustar00zuulzuul00000000000000aodhclient Style Commandments ================================ Read the OpenStack Style Commandments https://docs.openstack.org/hacking/latest/ ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0 aodhclient-3.7.1/LICENSE0000664000175000017500000002363700000000000014677 
0ustar00zuulzuul00000000000000 Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. 
Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: (a) You must give any other recipients of the Work or Derivative Works a copy of this License; and (b) You must cause any modified files to carry prominent notices stating that You changed the files; and (c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and (d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. 
This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. 
aodhclient-3.7.1/PKG-INFO

Metadata-Version: 2.1
Name: aodhclient
Version: 3.7.1
Summary: Python client library for Aodh
Home-page: https://docs.openstack.org/python-aodhclient/latest/
Author: OpenStack
Author-email: openstack-discuss@lists.openstack.org
Classifier: Environment :: OpenStack
Classifier: Intended Audience :: Information Technology
Classifier: Intended Audience :: System Administrators
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Operating System :: POSIX :: Linux
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Requires-Python: >=3.9
License-File: LICENSE
Requires-Dist: pbr>=2.0
Requires-Dist: cliff>=1.17.0
Requires-Dist: osc-lib>=1.0.1
Requires-Dist: oslo.i18n>=1.5.0
Requires-Dist: oslo.serialization>=1.4.0
Requires-Dist: oslo.utils>=2.0.0
Requires-Dist: osprofiler>=1.4.0
Requires-Dist: keystoneauth1>=1.0.0
Requires-Dist: pyparsing>2.1.0

==========
aodhclient
==========

Python bindings to the OpenStack Aodh API

This is a client for OpenStack Aodh API. There's a `Python API `_ (the
aodhclient module), and a `command-line script `_ (installed as aodh).
Each implements the entire OpenStack Aodh API.

* Free software: Apache license
* Release notes: https://releases.openstack.org/teams/telemetry.html
* Documentation: https://docs.openstack.org/python-aodhclient/latest/
* Source: https://opendev.org/openstack/python-aodhclient
* Bugs: https://storyboard.openstack.org/#!/project/openstack/python-aodhclient

aodhclient-3.7.1/README.rst

==========
aodhclient
==========

Python bindings to the OpenStack Aodh API

This is a client for OpenStack Aodh API. There's a `Python API `_ (the
aodhclient module), and a `command-line script `_ (installed as aodh).
Each implements the entire OpenStack Aodh API.

* Free software: Apache license
* Release notes: https://releases.openstack.org/teams/telemetry.html
* Documentation: https://docs.openstack.org/python-aodhclient/latest/
* Source: https://opendev.org/openstack/python-aodhclient
* Bugs: https://storyboard.openstack.org/#!/project/openstack/python-aodhclient
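A minimal usage sketch of both interfaces follows. The Keystone URL and
credentials are placeholders, the session wiring mirrors what
``aodhclient/shell.py`` does internally, and ``alarm.list()`` is the v2 API
counterpart of the ``aodh alarm list`` CLI command::

    from keystoneauth1.identity import v3
    from keystoneauth1 import session

    from aodhclient import client

    # Placeholder credentials; use your own cloud's values.
    auth = v3.Password(auth_url='http://keystone:5000/v3',
                       username='admin',
                       password='secret',
                       project_name='admin',
                       user_domain_id='default',
                       project_domain_id='default')
    sess = session.Session(auth=auth)

    # Version '2' selects aodhclient.v2.client.Client.
    aodh = client.Client('2', session=sess)
    for alarm in aodh.alarm.list():
        print(alarm['alarm_id'], alarm['name'], alarm['state'])

The equivalent command-line call, once credentials are exported in the
environment, is ``aodh alarm list``.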
aodhclient-3.7.1/aodhclient/__init__.py

# -*- coding: utf-8 -*-
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import pbr.version

__version__ = pbr.version.VersionInfo(
    'aodhclient').version_string()

aodhclient-3.7.1/aodhclient/client.py

# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from keystoneauth1 import adapter
from oslo_utils import importutils
from osprofiler import web

from aodhclient import exceptions


def Client(version, *args, **kwargs):
    module = 'aodhclient.v%s.client' % version
    module = importutils.import_module(module)
    client_class = getattr(module, 'Client')
    return client_class(*args, **kwargs)


class SessionClient(adapter.Adapter):

    def request(self, url, method, **kwargs):
        kwargs.setdefault('headers', kwargs.get('headers', {}))
        # NOTE(sileht): The standard call raises errors from
        # keystoneauth, where we need to raise the aodhclient errors.
        raise_exc = kwargs.pop('raise_exc', True)
        kwargs['headers'].update(web.get_trace_id_headers())
        resp = super(SessionClient, self).request(url, method,
                                                  raise_exc=False, **kwargs)
        if raise_exc and resp.status_code >= 400:
            raise exceptions.from_response(resp, url, method)
        return resp

aodhclient-3.7.1/aodhclient/exceptions.py

#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

class ClientException(Exception):
    """The base exception class for all exceptions this library raises."""
    message = 'Unknown Error'
    http_status = 'N/A'

    def __init__(self, message=None, request_id=None,
                 url=None, method=None):
        self.message = message or self.__class__.message
        self.request_id = request_id
        self.url = url
        self.method = method

    # NOTE(jd) for backward compat
    @property
    def code(self):
        return self.http_status

    def __str__(self):
        formatted_string = "%s (HTTP %s)" % (self.message, self.http_status)
        if self.request_id:
            formatted_string += " (Request-ID: %s)" % self.request_id
        return formatted_string


class RetryAfterException(ClientException):
    """The base exception for ClientExceptions that use Retry-After header."""
    def __init__(self, *args, **kwargs):
        try:
            self.retry_after = int(kwargs.pop('retry_after'))
        except (KeyError, ValueError):
            self.retry_after = 0
        super(RetryAfterException, self).__init__(*args, **kwargs)


class MutipleMeaningException(object):
    """A mixin for exceptions that can be enhanced by reading the details."""


class CommandError(Exception):
    pass


class BadRequest(ClientException):
    """HTTP 400 - Bad request: you sent some malformed data."""
    http_status = 400
    message = "Bad request"


class Unauthorized(ClientException):
    """HTTP 401 - Unauthorized: bad credentials."""
    http_status = 401
    message = "Unauthorized"


class Forbidden(ClientException):
    """HTTP 403 - Forbidden: your credentials don't give you access to
    this resource.
    """
    http_status = 403
    message = "Forbidden"


class NotFound(ClientException):
    """HTTP 404 - Not found"""
    http_status = 404
    message = "Not found"


class MethodNotAllowed(ClientException):
    """HTTP 405 - Method Not Allowed"""
    http_status = 405
    message = "Method Not Allowed"


class NotAcceptable(ClientException):
    """HTTP 406 - Not Acceptable"""
    http_status = 406
    message = "Not Acceptable"


class Conflict(ClientException):
    """HTTP 409 - Conflict"""
    http_status = 409
    message = "Conflict"


class OverLimit(RetryAfterException):
    """HTTP 413 - Over limit: you're over the API limits for this time
    period.
    """
    http_status = 413
    message = "Over limit"


class RateLimit(RetryAfterException):
    """HTTP 429 - Rate limit: you've sent too many requests for this time
    period.
    """
    http_status = 429
    message = "Rate limit"


class NoUniqueMatch(Exception):
    pass


class NotImplemented(ClientException):
    """HTTP 501 - Not Implemented: the server does not support this
    operation.
    """
    http_status = 501
    message = "Not Implemented"


_error_classes = [BadRequest, Unauthorized, Forbidden, NotFound,
                  MethodNotAllowed, NotAcceptable, Conflict, OverLimit,
                  RateLimit, NotImplemented]
_error_classes_enhanced = {}
_code_map = dict(
    (c.http_status, (c, _error_classes_enhanced.get(c, [])))
    for c in _error_classes)


def from_response(response, url, method=None):
    """Return an instance of ClientException based on a requests response.

    Usage::

        resp, body = requests.request(...)
        if resp.status_code != 200:
            raise from_response(resp, url, method)
    """
    if response.status_code:
        cls, enhanced_classes = _code_map.get(response.status_code,
                                              (ClientException, []))
    req_id = response.headers.get("x-openstack-request-id")
    content_type = response.headers.get("Content-Type", "").split(";")[0]

    kwargs = {
        'method': method,
        'url': url,
        'request_id': req_id,
    }

    if "retry-after" in response.headers:
        kwargs['retry_after'] = response.headers.get('retry-after')

    if content_type == "application/json":
        try:
            body = response.json()
        except ValueError:
            pass
        else:
            desc = body.get('error_message', {}).get('faultstring')
            for enhanced_cls in enhanced_classes:
                if enhanced_cls.match.match(desc):
                    cls = enhanced_cls
                    break
            kwargs['message'] = desc
    elif content_type.startswith("text/"):
        kwargs['message'] = response.text

    if not kwargs.get('message'):
        kwargs.pop('message', None)

    exception = cls(**kwargs)
    if isinstance(exception, ClientException) and response.status_code:
        exception.http_status = response.status_code
    return exception

aodhclient-3.7.1/aodhclient/i18n.py

# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import oslo_i18n as i18n

_translators = i18n.TranslatorFactory(domain='aodhclient')

# The primary translation function using the well-known name "_"
_ = _translators.primary

aodhclient-3.7.1/aodhclient/noauth.py

#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import os

from keystoneauth1 import loading
from keystoneauth1 import plugin


class AodhNoAuthPlugin(plugin.BaseAuthPlugin):
    """No authentication plugin for Aodh

    This is a keystoneauth plugin that, instead of doing authentication,
    just fills the 'x-user-id' and 'x-project-id' headers with the
    user-provided values.
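
    A rough usage sketch (the endpoint and IDs below are illustrative, and
    the v2 ``Client`` call mirrors how ``shell.py`` builds its client)::

        from keystoneauth1 import session

        from aodhclient import client
        from aodhclient import noauth

        # Placeholder IDs and endpoint; 8042 is only the usual Aodh port.
        auth = noauth.AodhNoAuthPlugin(user_id='some-user-id',
                                       project_id='some-project-id',
                                       roles='admin',
                                       endpoint='http://localhost:8042')
        aodh = client.Client('2', session=session.Session(auth=auth))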
""" def __init__(self, user_id, project_id, roles, endpoint): self._user_id = user_id self._project_id = project_id self._endpoint = endpoint self._roles = roles def get_token(self, session, **kwargs): return '' def get_headers(self, session, **kwargs): return {'x-user-id': self._user_id, 'x-project-id': self._project_id, 'x-roles': self._roles} def get_user_id(self, session, **kwargs): return self._user_id def get_project_id(self, session, **kwargs): return self._project_id def get_endpoint(self, session, **kwargs): return self._endpoint class AodhOpt(loading.Opt): @property def argparse_args(self): return ['--%s' % o.name for o in self._all_opts] @property def argparse_default(self): # select the first ENV that is not false-y or return None for o in self._all_opts: v = os.environ.get('AODH_%s' % o.name.replace('-', '_').upper()) if v: return v return self.default class AodhNoAuthLoader(loading.BaseLoader): plugin_class = AodhNoAuthPlugin def get_options(self): options = super(AodhNoAuthLoader, self).get_options() options.extend([ AodhOpt('user-id', help='User ID', required=True), AodhOpt('project-id', help='Project ID', required=True), AodhOpt('roles', help='Roles', default="admin"), AodhOpt('aodh-endpoint', help='Aodh endpoint', dest="endpoint", required=True), ]) return options ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0 aodhclient-3.7.1/aodhclient/osc.py0000664000175000017500000000346400000000000017136 0ustar00zuulzuul00000000000000# Copyright 2014 OpenStack Foundation # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. """OpenStackClient plugin for Telemetry Alarming service.""" from osc_lib import utils DEFAULT_ALARMING_API_VERSION = '2' API_VERSION_OPTION = 'os_alarming_api_version' API_NAME = "alarming" API_VERSIONS = { "2": "aodhclient.v2.client.Client", } def make_client(instance): """Returns an queues service client.""" version = instance._api_version[API_NAME] try: version = int(version) except ValueError: version = float(version) aodh_client = utils.get_client_class( API_NAME, version, API_VERSIONS) # NOTE(sileht): ensure setup of the session is done instance.setup_auth() return aodh_client(session=instance.session, interface=instance.interface, region_name=instance.region_name) def build_option_parser(parser): """Hook to add global options.""" parser.add_argument( '--os-alarming-api-version', metavar='', default=utils.env( 'OS_ALARMING_API_VERSION', default=DEFAULT_ALARMING_API_VERSION), help=('Queues API version, default=' + DEFAULT_ALARMING_API_VERSION + ' (Env: OS_ALARMING_API_VERSION)')) return parser ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0 aodhclient-3.7.1/aodhclient/shell.py0000664000175000017500000001533700000000000017463 0ustar00zuulzuul00000000000000# Copyright 2012 OpenStack Foundation # All Rights Reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. 
You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. import logging import os import sys import warnings from cliff import app from cliff import commandmanager from keystoneauth1 import exceptions from keystoneauth1 import loading from aodhclient import __version__ from aodhclient import client from aodhclient import noauth from aodhclient.v2 import alarm_cli from aodhclient.v2 import alarm_history_cli from aodhclient.v2 import capabilities_cli class AodhCommandManager(commandmanager.CommandManager): SHELL_COMMANDS = { "alarm create": alarm_cli.CliAlarmCreate, "alarm delete": alarm_cli.CliAlarmDelete, "alarm list": alarm_cli.CliAlarmList, "alarm show": alarm_cli.CliAlarmShow, "alarm update": alarm_cli.CliAlarmUpdate, "alarm state get": alarm_cli.CliAlarmStateGet, "alarm state set": alarm_cli.CliAlarmStateSet, "alarm-history show": alarm_history_cli.CliAlarmHistoryShow, "alarm-history search": alarm_history_cli.CliAlarmHistorySearch, "capabilities list": capabilities_cli.CliCapabilitiesList, } def load_commands(self, namespace): for name, command_class in self.SHELL_COMMANDS.items(): self.add_command(name, command_class) class AodhShell(app.App): def __init__(self): super(AodhShell, self).__init__( description='Aodh command line client', version=__version__, command_manager=AodhCommandManager('aodhclient'), deferred_help=True, ) self._client = None def build_option_parser(self, description, version): """Return an argparse option parser for this application. Subclasses may override this method to extend the parser with more global options. :param description: full description of the application :paramtype description: str :param version: version number for the application :paramtype version: str """ parser = super(AodhShell, self).build_option_parser( description, version, argparse_kwargs={'allow_abbrev': False}) # Global arguments, one day this should go to keystoneauth1 parser.add_argument( '--os-region-name', metavar='', dest='region_name', default=os.environ.get('OS_REGION_NAME'), help='Authentication region name (Env: OS_REGION_NAME)') parser.add_argument( '--os-interface', metavar='', dest='interface', choices=['admin', 'public', 'internal'], default=os.environ.get('OS_INTERFACE'), help='Select an interface type.' ' Valid interface types: [admin, public, internal].' 
' (Env: OS_INTERFACE)') parser.add_argument( '--aodh-api-version', default=os.environ.get('AODH_API_VERSION', '2'), help='Defaults to env[AODH_API_VERSION] or 2.') loading.register_session_argparse_arguments(parser=parser) plugin = loading.register_auth_argparse_arguments( parser=parser, argv=sys.argv, default="password") if not isinstance(plugin, noauth.AodhNoAuthLoader): parser.add_argument( '--aodh-endpoint', metavar='', dest='endpoint', default=os.environ.get('AODH_ENDPOINT'), help='Aodh endpoint (Env: AODH_ENDPOINT)') return parser @property def client(self): # NOTE(sileht): we lazy load the client to not # load/connect auth stuffs if self._client is None: if hasattr(self.options, "endpoint"): endpoint_override = self.options.endpoint else: endpoint_override = None auth_plugin = loading.load_auth_from_argparse_arguments( self.options) session = loading.load_session_from_argparse_arguments( self.options, auth=auth_plugin) self._client = client.Client(self.options.aodh_api_version, session=session, interface=self.options.interface, region_name=self.options.region_name, endpoint_override=endpoint_override) return self._client def clean_up(self, cmd, result, err): if isinstance(err, exceptions.HttpError) and err.details: print(err.details, file=sys.stderr) def configure_logging(self): if self.options.debug: # --debug forces verbose_level 3 # Set this here so cliff.app.configure_logging() can work self.options.verbose_level = 3 super(AodhShell, self).configure_logging() root_logger = logging.getLogger('') # Set logging to the requested level if self.options.verbose_level == 0: # --quiet root_logger.setLevel(logging.ERROR) warnings.simplefilter("ignore") elif self.options.verbose_level == 1: # This is the default case, no --debug, --verbose or --quiet root_logger.setLevel(logging.WARNING) warnings.simplefilter("ignore") elif self.options.verbose_level == 2: # One --verbose root_logger.setLevel(logging.INFO) warnings.simplefilter("once") elif self.options.verbose_level >= 3: # Two or more --verbose root_logger.setLevel(logging.DEBUG) # Hide some useless message requests_log = logging.getLogger("requests") cliff_log = logging.getLogger('cliff') stevedore_log = logging.getLogger('stevedore') iso8601_log = logging.getLogger("iso8601") cliff_log.setLevel(logging.ERROR) stevedore_log.setLevel(logging.ERROR) iso8601_log.setLevel(logging.ERROR) if self.options.debug: requests_log.setLevel(logging.DEBUG) else: requests_log.setLevel(logging.ERROR) def main(args=None): if args is None: args = sys.argv[1:] return AodhShell().run(args) ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1741251064.5342808 aodhclient-3.7.1/aodhclient/tests/0000775000175000017500000000000000000000000017133 5ustar00zuulzuul00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0 aodhclient-3.7.1/aodhclient/tests/__init__.py0000664000175000017500000000000000000000000021232 0ustar00zuulzuul00000000000000././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1741251064.5342808 aodhclient-3.7.1/aodhclient/tests/functional/0000775000175000017500000000000000000000000021275 5ustar00zuulzuul00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0 aodhclient-3.7.1/aodhclient/tests/functional/__init__.py0000664000175000017500000000000000000000000023374 
0ustar00zuulzuul00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0 aodhclient-3.7.1/aodhclient/tests/functional/base.py0000664000175000017500000001002700000000000022561 0ustar00zuulzuul00000000000000# Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. import os import time import os_client_config from oslo_utils import uuidutils from tempest.lib.cli import base from tempest.lib import exceptions class AodhClient(object): """Aodh Client for tempest-lib This client doesn't use any authentication system """ def __init__(self): self.cli_dir = os.environ.get('AODH_CLIENT_EXEC_DIR') self.endpoint = os.environ.get('AODH_ENDPOINT') self.cloud = os.environ.get('OS_ADMIN_CLOUD', 'devstack-admin') self.user_id = uuidutils.generate_uuid() self.project_id = uuidutils.generate_uuid() def aodh(self, action, flags='', params='', fail_ok=False, merge_stderr=False): auth_args = [] if self.cloud is None: auth_args.append("--os-auth-type none") elif self.cloud != '': conf = os_client_config.OpenStackConfig() creds = conf.get_one_cloud(cloud=self.cloud).get_auth_args() auth_args.append(f"--os-auth-url {creds['auth_url']}") auth_args.append(f"--os-username {creds['username']}") auth_args.append(f"--os-password {creds['password']}") auth_args.append(f"--os-project-name {creds['project_name']}") auth_args.append(f"--os-user-domain-id {creds['user_domain_id']}") auth_args.append("--os-project-domain-id " f"{creds['project_domain_id']}") endpoint_arg = "--aodh-endpoint %s" % self.endpoint flags = " ".join(auth_args + [endpoint_arg] + [flags]) return base.execute("aodh", action, flags, params, fail_ok, merge_stderr, self.cli_dir) class ClientTestBase(base.ClientTestBase): """Base class for aodhclient tests. Establishes the aodhclient and retrieves the essential environment information. """ def _get_clients(self): return AodhClient() def retry_aodh(self, retry, *args, **kwargs): result = "" while not result.strip() and retry > 0: result = self.aodh(*args, **kwargs) if not result: time.sleep(1) retry -= 1 return result def aodh(self, *args, **kwargs): return self.clients.aodh(*args, **kwargs) def get_token(self): cloud = os.environ.get('OS_ADMIN_CLOUD', 'devstack-admin') if cloud is not None and cloud != "": conf = os_client_config.OpenStackConfig() region_conf = conf.get_one_cloud(cloud=cloud) return region_conf.get_auth().get_token(region_conf.get_session()) else: return "" def details_multiple(self, output_lines, with_label=False): """Return list of dicts with item details from cli output tables. If with_label is True, key '__label' is added to each items dict. For more about 'label' see OutputParser.tables(). NOTE(sileht): come from tempest-lib just because cliff use Field instead of Property as first columun header. 
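
        For example (values are illustrative), a cliff detail table such as::

            +----------+-----------+
            | Field    | Value     |
            +----------+-----------+
            | alarm_id | 123       |
            | name     | ev_alarm1 |
            +----------+-----------+

        would be returned as ``[{'alarm_id': '123', 'name': 'ev_alarm1'}]``.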
""" items = [] tables_ = self.parser.tables(output_lines) for table_ in tables_: if ('Field' not in table_['headers'] or 'Value' not in table_['headers']): raise exceptions.InvalidStructure() item = {} for value in table_['values']: item[value[0]] = value[1] if with_label: item['__label'] = table_['label'] items.append(item) return items ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0 aodhclient-3.7.1/aodhclient/tests/functional/test_alarm.py0000664000175000017500000013523100000000000024007 0ustar00zuulzuul00000000000000# Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. import os from oslo_utils import uuidutils import requests import requests.auth from tempest.lib import exceptions from aodhclient.tests.functional import base class AodhClientTest(base.ClientTestBase): def test_help(self): self.aodh("help", params="alarm create") self.aodh("help", params="alarm delete") self.aodh("help", params="alarm list") self.aodh("help", params="alarm show") self.aodh("help", params="alarm update") def test_alarm_id_or_name_scenario(self): def _test(name): params = "create --type event --name %s" % name result = self.aodh('alarm', params=params) alarm_id = self.details_multiple(result)[0]['alarm_id'] params = 'show %s' % name result = self.aodh('alarm', params=params) self.assertEqual(alarm_id, self.details_multiple(result)[0]['alarm_id']) params = 'show %s' % alarm_id result = self.aodh('alarm', params=params) self.assertEqual(alarm_id, self.details_multiple(result)[0]['alarm_id']) params = "update --state ok %s" % name result = self.aodh('alarm', params=params) self.assertEqual("ok", self.details_multiple(result)[0]['state']) params = "update --state alarm %s" % alarm_id result = self.aodh('alarm', params=params) self.assertEqual("alarm", self.details_multiple(result)[0]['state']) params = "update --name another-name %s" % name result = self.aodh('alarm', params=params) self.assertEqual("another-name", self.details_multiple(result)[0]['name']) params = "update --name %s %s" % (name, alarm_id) result = self.aodh('alarm', params=params) self.assertEqual(name, self.details_multiple(result)[0]['name']) # Check update with no change is allowed params = "update --name %s %s" % (name, name) result = self.aodh('alarm', params=params) self.assertEqual(name, self.details_multiple(result)[0]['name']) params = "update --state ok" result = self.aodh('alarm', params=params, fail_ok=True, merge_stderr=True) self.assertFirstLineStartsWith( result.splitlines(), 'You need to specify one of alarm ID and alarm name(--name) ' 'to update an alarm.') params = "delete %s" % name result = self.aodh('alarm', params=params) self.assertEqual("", result) params = "create --type event --name %s" % name result = self.aodh('alarm', params=params) alarm_id = self.details_multiple(result)[0]['alarm_id'] params = "delete %s" % alarm_id result = self.aodh('alarm', params=params) self.assertEqual("", result) _test(uuidutils.generate_uuid()) _test('normal-alarm-name') def test_event_scenario(self): 
PROJECT_ID = uuidutils.generate_uuid() # CREATE result = self.aodh(u'alarm', params=(u"create --type event --name ev_alarm1 " "--project-id %s" % PROJECT_ID)) alarm = self.details_multiple(result)[0] ALARM_ID = alarm['alarm_id'] self.assertEqual('ev_alarm1', alarm['name']) self.assertEqual('*', alarm['event_type']) # UPDATE IGNORE INVALID result = self.aodh( 'alarm', params=("update %s --severity critical --threshold 10" % ALARM_ID)) alarm_updated = self.details_multiple(result)[0] self.assertEqual(ALARM_ID, alarm_updated["alarm_id"]) self.assertEqual('critical', alarm_updated['severity']) # UPDATE IGNORE INVALID result = self.aodh( 'alarm', params=("update %s --event-type dummy" % ALARM_ID)) alarm_updated = self.details_multiple(result)[0] self.assertEqual(ALARM_ID, alarm_updated["alarm_id"]) self.assertEqual('dummy', alarm_updated['event_type']) # GET result = self.aodh( 'alarm', params="show %s" % ALARM_ID) alarm_show = self.details_multiple(result)[0] self.assertEqual(ALARM_ID, alarm_show["alarm_id"]) self.assertEqual(PROJECT_ID, alarm_show["project_id"]) self.assertEqual('ev_alarm1', alarm_show['name']) self.assertEqual('dummy', alarm_show['event_type']) # GET BY NAME result = self.aodh( 'alarm', params="show --name ev_alarm1") alarm_show = self.details_multiple(result)[0] self.assertEqual(ALARM_ID, alarm_show["alarm_id"]) self.assertEqual(PROJECT_ID, alarm_show["project_id"]) self.assertEqual('ev_alarm1', alarm_show['name']) self.assertEqual('dummy', alarm_show['event_type']) # GET BY NAME AND ID ERROR self.assertRaises(exceptions.CommandFailed, self.aodh, u'alarm', params=(u"show %s --name ev_alarm1" % ALARM_ID)) # LIST result = self.aodh('alarm', params="list --filter all_projects=true") self.assertIn(ALARM_ID, [r['alarm_id'] for r in self.parser.listing(result)]) output_colums = ['alarm_id', 'type', 'name', 'state', 'severity', 'enabled'] for alarm_list in self.parser.listing(result): self.assertEqual(sorted(output_colums), sorted(alarm_list.keys())) if alarm_list["alarm_id"] == ALARM_ID: self.assertEqual('ev_alarm1', alarm_list['name']) # LIST WITH QUERY result = self.aodh('alarm', params=("list --query project_id=%s" % PROJECT_ID)) alarm_list = self.parser.listing(result)[0] self.assertEqual(ALARM_ID, alarm_list["alarm_id"]) self.assertEqual('ev_alarm1', alarm_list['name']) # DELETE result = self.aodh('alarm', params="delete %s" % ALARM_ID) self.assertEqual("", result) # GET FAIL result = self.aodh('alarm', params="show %s" % ALARM_ID, fail_ok=True, merge_stderr=True) expected = "Alarm %s not found (HTTP 404)" % ALARM_ID self.assertFirstLineStartsWith(result.splitlines(), expected) # DELETE FAIL result = self.aodh('alarm', params="delete %s" % ALARM_ID, fail_ok=True, merge_stderr=True) self.assertFirstLineStartsWith(result.splitlines(), expected) # LIST DOES NOT HAVE ALARM result = self.aodh('alarm', params="list") self.assertNotIn(ALARM_ID, [r['alarm_id'] for r in self.parser.listing(result)]) def test_composite_scenario(self): project_id = uuidutils.generate_uuid() res_id = uuidutils.generate_uuid() # CREATE result = self.aodh( u'alarm', params=(u'create --type composite --name calarm1 --composite-rule ' '\'{"or":[{"threshold": 0.8, "metric": "cpu_util", ' '"type": "gnocchi_resources_threshold", "resource_type": ' '"generic", "resource_id": "%s", ' '"aggregation_method": "mean"},' '{"and": [{"threshold": 200, "metric": "disk.iops", ' '"type": "gnocchi_resources_threshold", "resource_type": ' '"generic", "resource_id": "%s", ' '"aggregation_method": "mean"},' '{"threshold": 
1000, "metric": "memory",' '"type": "gnocchi_resources_threshold", "resource_type": ' '"generic", "resource_id": "%s", ' '"aggregation_method": "mean"}]}]}\' --project-id %s' % (res_id, res_id, res_id, project_id))) alarm = self.details_multiple(result)[0] alarm_id = alarm['alarm_id'] self.assertEqual('calarm1', alarm['name']) self.assertEqual('composite', alarm['type']) self.assertIn('composite_rule', alarm) # CREATE FAIL MISSING PARAM self.assertRaises(exceptions.CommandFailed, self.aodh, u'alarm', params=(u"create --type composite --name calarm1 " "--project-id %s" % project_id)) # UPDATE result = self.aodh( 'alarm', params=("update %s --severity critical" % alarm_id)) alarm_updated = self.details_multiple(result)[0] self.assertEqual(alarm_id, alarm_updated["alarm_id"]) self.assertEqual('critical', alarm_updated['severity']) # GET result = self.aodh( 'alarm', params="show %s" % alarm_id) alarm_show = self.details_multiple(result)[0] self.assertEqual(alarm_id, alarm_show["alarm_id"]) self.assertEqual(project_id, alarm_show["project_id"]) self.assertEqual('calarm1', alarm_show['name']) # GET BY NAME result = self.aodh( 'alarm', params="show --name calarm1") alarm_show = self.details_multiple(result)[0] self.assertEqual(alarm_id, alarm_show["alarm_id"]) self.assertEqual(project_id, alarm_show["project_id"]) self.assertEqual('calarm1', alarm_show['name']) # GET BY NAME AND ID ERROR self.assertRaises(exceptions.CommandFailed, self.aodh, u'alarm', params=(u"show %s --name calarm1" % alarm_id)) # LIST result = self.aodh('alarm', params="list --filter all_projects=true") self.assertIn(alarm_id, [r['alarm_id'] for r in self.parser.listing(result)]) output_colums = ['alarm_id', 'type', 'name', 'state', 'severity', 'enabled'] for alarm_list in self.parser.listing(result): self.assertEqual(sorted(output_colums), sorted(alarm_list.keys())) if alarm_list["alarm_id"] == alarm_id: self.assertEqual('calarm1', alarm_list['name']) # LIST WITH QUERY result = self.aodh('alarm', params=("list --query project_id=%s" % project_id)) alarm_list = self.parser.listing(result)[0] self.assertEqual(alarm_id, alarm_list["alarm_id"]) self.assertEqual('calarm1', alarm_list['name']) # DELETE result = self.aodh('alarm', params="delete %s" % alarm_id) self.assertEqual("", result) # GET FAIL result = self.aodh('alarm', params="show %s" % alarm_id, fail_ok=True, merge_stderr=True) expected = "Alarm %s not found (HTTP 404)" % alarm_id self.assertFirstLineStartsWith(result.splitlines(), expected) # DELETE FAIL result = self.aodh('alarm', params="delete %s" % alarm_id, fail_ok=True, merge_stderr=True) self.assertFirstLineStartsWith(result.splitlines(), expected) # LIST DOES NOT HAVE ALARM result = self.aodh('alarm', params="list") self.assertNotIn(alarm_id, [r['alarm_id'] for r in self.parser.listing(result)]) def _test_alarm_create_show_query(self, create_params, expected_lines): def test(params): result = self.aodh('alarm', params=params) alarm = self.details_multiple(result)[0] for key, value in expected_lines.items(): self.assertEqual(value, alarm[key]) return alarm alarm = test(create_params) params = 'show %s' % alarm['alarm_id'] test(params) self.aodh('alarm', params='delete %s' % alarm['alarm_id']) def test_event_alarm_create_show_query(self): params = ('create --type event --name alarm-multiple-query ' '--query "traits.project_id=789;traits.resource_id=012"') expected_lines = { 'query': 'traits.project_id = 789 AND', '': 'traits.resource_id = 012', } self._test_alarm_create_show_query(params, expected_lines) params = 
('create --type event --name alarm-single-query ' '--query "traits.project_id=789"') expected_lines = {'query': 'traits.project_id = 789'} self._test_alarm_create_show_query(params, expected_lines) params = 'create --type event --name alarm-no-query' self._test_alarm_create_show_query(params, {'query': ''}) def test_set_get_alarm_state(self): result = self.aodh( 'alarm', params=('create --type event --name alarm_state_test ' '--query "traits.project_id=789;traits.resource_id=012"')) alarm = self.details_multiple(result)[0] alarm_id = alarm['alarm_id'] result = self.aodh( 'alarm', params="show %s" % alarm_id) alarm_show = self.details_multiple(result)[0] self.assertEqual('insufficient data', alarm_show['state']) result = self.aodh('alarm', params="state get %s" % alarm_id) state_get = self.details_multiple(result)[0] self.assertEqual('insufficient data', state_get['state']) self.aodh('alarm', params="state set --state ok %s" % alarm_id) result = self.aodh('alarm', params="state get %s" % alarm_id) state_get = self.details_multiple(result)[0] self.assertEqual('ok', state_get['state']) self.aodh('alarm', params='delete %s' % alarm_id) def test_update_type_event_composite(self): res_id = uuidutils.generate_uuid() # CREATE result = self.aodh(u'alarm', params=(u"create --type event --name ev_alarm123")) alarm = self.details_multiple(result)[0] ALARM_ID = alarm['alarm_id'] self.assertEqual('ev_alarm123', alarm['name']) self.assertEqual('*', alarm['event_type']) # UPDATE TYPE TO COMPOSITE result = self.aodh( 'alarm', params=('update %s --type composite --composite-rule ' '\'{"or":[{"threshold": 0.8, "metric": "cpu_util", ' '"type": "gnocchi_resources_threshold", "resource_type": ' '"generic", "resource_id": "%s", ' '"aggregation_method": "mean"},' '{"and": [{"threshold": 200, "metric": "disk.iops", ' '"type": "gnocchi_resources_threshold", "resource_type": ' '"generic", "resource_id": "%s", ' '"aggregation_method": "mean"},' '{"threshold": 1000, "metric": "memory",' '"type": "gnocchi_resources_threshold", "resource_type": ' '"generic", "resource_id": "%s", ' '"aggregation_method": "mean"}]}]}\'' % (ALARM_ID, res_id, res_id, res_id))) alarm_updated = self.details_multiple(result)[0] self.assertEqual(ALARM_ID, alarm_updated["alarm_id"]) self.assertEqual('composite', alarm_updated['type']) self.assertIn('composite_rule', alarm_updated) # UPDATE TYPE TO EVENT result = self.aodh( 'alarm', params=("update %s --type event" % ALARM_ID)) alarm_updated = self.details_multiple(result)[0] self.assertEqual(ALARM_ID, alarm_updated["alarm_id"]) self.assertEqual('event', alarm_updated['type']) self.assertEqual('*', alarm_updated['event_type']) # DELETE result = self.aodh('alarm', params="delete %s" % ALARM_ID) self.assertEqual("", result) class AodhClientGnocchiRulesTest(base.ClientTestBase): def test_gnocchi_resources_threshold_scenario(self): PROJECT_ID = uuidutils.generate_uuid() RESOURCE_ID = uuidutils.generate_uuid() req = requests.post( os.environ.get("GNOCCHI_ENDPOINT") + "/v1/resource/generic", headers={"X-Auth-Token": self.get_token()}, json={ "id": RESOURCE_ID, }) self.assertEqual(201, req.status_code) # CREATE result = self.aodh(u'alarm', params=(u"create " "--type gnocchi_resources_threshold " "--name alarm_gn1 --metric cpu_util " "--threshold 80 " "--resource-id %s --resource-type generic " "--aggregation-method last " "--project-id %s" % (RESOURCE_ID, PROJECT_ID))) alarm = self.details_multiple(result)[0] ALARM_ID = alarm['alarm_id'] self.assertEqual('alarm_gn1', alarm['name']) 
self.assertEqual('cpu_util', alarm['metric']) self.assertEqual('80.0', alarm['threshold']) self.assertEqual('last', alarm['aggregation_method']) self.assertEqual(RESOURCE_ID, alarm['resource_id']) self.assertEqual('generic', alarm['resource_type']) # CREATE WITH --TIME-CONSTRAINT result = self.aodh( u'alarm', params=(u"create --type gnocchi_resources_threshold " "--name alarm_tc --metric cpu_util --threshold 80 " "--resource-id %s --resource-type generic " "--aggregation-method last --project-id %s " "--time-constraint " "name=cons1;start='0 11 * * *';duration=300 " "--time-constraint " "name=cons2;start='0 23 * * *';duration=600 " % (RESOURCE_ID, PROJECT_ID))) alarm = self.details_multiple(result)[0] self.assertEqual('alarm_tc', alarm['name']) self.assertEqual('80.0', alarm['threshold']) self.assertIsNotNone(alarm['time_constraints']) # CREATE FAIL MISSING PARAM self.assertRaises(exceptions.CommandFailed, self.aodh, u'alarm', params=(u"create " "--type gnocchi_resources_threshold " "--name alarm1 --metric cpu_util " "--resource-id %s --resource-type generic " "--aggregation-method last " "--project-id %s" % (RESOURCE_ID, PROJECT_ID))) # UPDATE result = self.aodh( 'alarm', params=("update %s --severity critical --threshold 90" % ALARM_ID)) alarm_updated = self.details_multiple(result)[0] self.assertEqual(ALARM_ID, alarm_updated["alarm_id"]) self.assertEqual('critical', alarm_updated['severity']) self.assertEqual('90.0', alarm_updated["threshold"]) # GET result = self.aodh( 'alarm', params="show %s" % ALARM_ID) alarm_show = self.details_multiple(result)[0] self.assertEqual(ALARM_ID, alarm_show["alarm_id"]) self.assertEqual(PROJECT_ID, alarm_show["project_id"]) self.assertEqual('alarm_gn1', alarm_show['name']) self.assertEqual('cpu_util', alarm_show['metric']) self.assertEqual('90.0', alarm_show['threshold']) self.assertEqual('critical', alarm_show['severity']) self.assertEqual('last', alarm_show['aggregation_method']) self.assertEqual('generic', alarm_show['resource_type']) # GET BY NAME result = self.aodh( 'alarm', params="show --name alarm_gn1") alarm_show = self.details_multiple(result)[0] self.assertEqual(ALARM_ID, alarm_show["alarm_id"]) self.assertEqual(PROJECT_ID, alarm_show["project_id"]) self.assertEqual('alarm_gn1', alarm_show['name']) self.assertEqual('cpu_util', alarm_show['metric']) self.assertEqual('90.0', alarm_show['threshold']) self.assertEqual('critical', alarm_show['severity']) self.assertEqual('last', alarm_show['aggregation_method']) self.assertEqual('generic', alarm_show['resource_type']) # GET BY NAME AND ID ERROR self.assertRaises(exceptions.CommandFailed, self.aodh, u'alarm', params=(u"show %s --name alarm_gn1" % ALARM_ID)) # LIST result = self.aodh('alarm', params="list --filter all_projects=true") self.assertIn(ALARM_ID, [r['alarm_id'] for r in self.parser.listing(result)]) output_colums = ['alarm_id', 'type', 'name', 'state', 'severity', 'enabled'] for alarm_list in self.parser.listing(result): self.assertEqual(sorted(output_colums), sorted(alarm_list.keys())) if alarm_list["alarm_id"] == ALARM_ID: self.assertEqual('alarm_gn1', alarm_list['name']) # LIST WITH PAGINATION # list with limit result = self.aodh('alarm', params="list --filter all_projects=true --limit 1") alarm_list = self.parser.listing(result) self.assertEqual(1, len(alarm_list)) # list with sort with key=name dir=asc result = self.aodh( 'alarm', params="list --filter all_projects=true --sort name:asc") names = [r['name'] for r in self.parser.listing(result)] sorted_name = sorted(names) 
self.assertEqual(sorted_name, names) # list with sort with key=name dir=asc and key=alarm_id dir=asc result = self.aodh( u'alarm', params=(u"create --type gnocchi_resources_threshold " "--name alarm_th --metric cpu_util --threshold 80 " "--resource-id %s --resource-type generic " "--aggregation-method last --project-id %s " % (RESOURCE_ID, PROJECT_ID))) created_alarm_id = self.details_multiple(result)[0]['alarm_id'] result = self.aodh( 'alarm', params="list --filter all_projects=true --sort name:asc " "--sort alarm_id:asc") alarm_list = self.parser.listing(result) ids_with_same_name = [] names = [] for alarm in alarm_list: names.append(['alarm_name']) if alarm['name'] == 'alarm_th': ids_with_same_name.append(alarm['alarm_id']) sorted_ids = sorted(ids_with_same_name) sorted_names = sorted(names) self.assertEqual(sorted_names, names) self.assertEqual(sorted_ids, ids_with_same_name) # list with sort with key=name dir=desc and with the marker equal to # the alarm_id of the alarm_th we created for this test. result = self.aodh( 'alarm', params="list --filter all_projects=true --sort name:desc " "--marker %s" % created_alarm_id) self.assertIn('alarm_tc', [r['name'] for r in self.parser.listing(result)]) self.aodh('alarm', params="delete %s" % created_alarm_id) # LIST WITH QUERY result = self.aodh('alarm', params=("list --query project_id=%s" % PROJECT_ID)) alarm_list = self.parser.listing(result)[0] self.assertEqual(ALARM_ID, alarm_list["alarm_id"]) self.assertEqual('alarm_gn1', alarm_list['name']) # DELETE result = self.aodh('alarm', params="delete %s" % ALARM_ID) self.assertEqual("", result) # GET FAIL result = self.aodh('alarm', params="show %s" % ALARM_ID, fail_ok=True, merge_stderr=True) expected = "Alarm %s not found (HTTP 404)" % ALARM_ID self.assertFirstLineStartsWith(result.splitlines(), expected) # DELETE FAIL result = self.aodh('alarm', params="delete %s" % ALARM_ID, fail_ok=True, merge_stderr=True) self.assertFirstLineStartsWith(result.splitlines(), expected) # LIST DOES NOT HAVE ALARM result = self.aodh('alarm', params="list") self.assertNotIn(ALARM_ID, [r['alarm_id'] for r in self.parser.listing(result)]) def test_gnocchi_aggr_by_resources_scenario(self): # CREATE result = self.aodh( u'alarm', params=(u"create " "--type " "gnocchi_aggregation_by_resources_threshold " "--name alarm1 --metric cpu --threshold 80 " "--query " '\'{"=": {"creator": "cr3at0r"}}\' ' "--resource-type generic " "--aggregation-method mean ")) alarm = self.details_multiple(result)[0] ALARM_ID = alarm['alarm_id'] self.assertEqual('alarm1', alarm['name']) self.assertEqual('cpu', alarm['metric']) self.assertEqual('80.0', alarm['threshold']) self.assertEqual('mean', alarm['aggregation_method']) self.assertEqual('generic', alarm['resource_type']) self.assertEqual('{"=": {"creator": "cr3at0r"}}', alarm['query']) # CREATE FAIL MISSING PARAM self.assertRaises( exceptions.CommandFailed, self.aodh, u'alarm', params=(u"create " "--type " "gnocchi_aggregation_by_resources_threshold " "--name alarm1 --metric cpu " "--query " '\'{"=": {"creator": "cr3at0r"}}\' ' "--resource-type generic " "--aggregation-method mean ")) # UPDATE result = self.aodh( 'alarm', params=("update %s --severity critical --threshold 90" % ALARM_ID)) alarm_updated = self.details_multiple(result)[0] self.assertEqual(ALARM_ID, alarm_updated["alarm_id"]) self.assertEqual('critical', alarm_updated['severity']) self.assertEqual('90.0', alarm_updated["threshold"]) # GET result = self.aodh( 'alarm', params="show %s" % ALARM_ID) alarm_show = 
self.details_multiple(result)[0] self.assertEqual(ALARM_ID, alarm_show["alarm_id"]) self.assertEqual('alarm1', alarm_show['name']) self.assertEqual('cpu', alarm_show['metric']) self.assertEqual('90.0', alarm_show['threshold']) self.assertEqual('critical', alarm_show['severity']) self.assertEqual('mean', alarm_show['aggregation_method']) self.assertEqual('generic', alarm_show['resource_type']) # LIST result = self.aodh('alarm', params="list --filter all_projects=true") self.assertIn(ALARM_ID, [r['alarm_id'] for r in self.parser.listing(result)]) output_colums = ['alarm_id', 'type', 'name', 'state', 'severity', 'enabled'] for alarm_list in self.parser.listing(result): self.assertEqual(sorted(output_colums), sorted(alarm_list.keys())) if alarm_list["alarm_id"] == ALARM_ID: self.assertEqual('alarm1', alarm_list['name']) # DELETE result = self.aodh('alarm', params="delete %s" % ALARM_ID) self.assertEqual("", result) # GET FAIL result = self.aodh('alarm', params="show %s" % ALARM_ID, fail_ok=True, merge_stderr=True) expected = "Alarm %s not found (HTTP 404)" % ALARM_ID self.assertFirstLineStartsWith(result.splitlines(), expected) # DELETE FAIL result = self.aodh('alarm', params="delete %s" % ALARM_ID, fail_ok=True, merge_stderr=True) self.assertFirstLineStartsWith(result.splitlines(), expected) # LIST DOES NOT HAVE ALARM result = self.aodh('alarm', params="list") self.assertNotIn(ALARM_ID, [r['alarm_id'] for r in self.parser.listing(result)]) def test_gnocchi_aggr_by_metrics_scenario(self): PROJECT_ID = uuidutils.generate_uuid() METRIC1 = 'cpu' METRIC2 = 'cpu_util' # CREATE result = self.aodh( u'alarm', params=(u"create " "--type gnocchi_aggregation_by_metrics_threshold " "--name alarm1 " "--metrics %s " "--metric %s " "--threshold 80 " "--aggregation-method last " "--project-id %s" % (METRIC1, METRIC2, PROJECT_ID))) alarm = self.details_multiple(result)[0] ALARM_ID = alarm['alarm_id'] self.assertEqual('alarm1', alarm['name']) metrics = "['cpu', 'cpu_util']" self.assertEqual(metrics, alarm['metrics']) self.assertEqual('80.0', alarm['threshold']) self.assertEqual('last', alarm['aggregation_method']) # CREATE FAIL MISSING PARAM self.assertRaises( exceptions.CommandFailed, self.aodh, u'alarm', params=(u"create " "--type gnocchi_aggregation_by_metrics_threshold " "--name alarm1 " "--metrics %s " "--metrics %s " "--aggregation-method last " "--project-id %s" % (METRIC1, METRIC2, PROJECT_ID))) # UPDATE result = self.aodh( 'alarm', params=("update %s --severity critical --threshold 90" % ALARM_ID)) alarm_updated = self.details_multiple(result)[0] self.assertEqual(ALARM_ID, alarm_updated["alarm_id"]) self.assertEqual('critical', alarm_updated['severity']) self.assertEqual('90.0', alarm_updated["threshold"]) # GET result = self.aodh( 'alarm', params="show %s" % ALARM_ID) alarm_show = self.details_multiple(result)[0] self.assertEqual(ALARM_ID, alarm_show["alarm_id"]) self.assertEqual(PROJECT_ID, alarm_show["project_id"]) self.assertEqual('alarm1', alarm_show['name']) self.assertEqual(metrics, alarm_show['metrics']) self.assertEqual('90.0', alarm_show['threshold']) self.assertEqual('critical', alarm_show['severity']) self.assertEqual('last', alarm_show['aggregation_method']) # LIST result = self.aodh('alarm', params="list --filter all_projects=true") self.assertIn(ALARM_ID, [r['alarm_id'] for r in self.parser.listing(result)]) for alarm_list in self.parser.listing(result): if alarm_list["alarm_id"] == ALARM_ID: self.assertEqual('alarm1', alarm_list['name']) # LIST WITH QUERY result = self.aodh('alarm', 
params=("list --query project_id=%s" % PROJECT_ID)) alarm_list = self.parser.listing(result)[0] self.assertEqual(ALARM_ID, alarm_list["alarm_id"]) self.assertEqual('alarm1', alarm_list['name']) # DELETE result = self.aodh('alarm', params="delete %s" % ALARM_ID) self.assertEqual("", result) # GET FAIL result = self.aodh('alarm', params="show %s" % ALARM_ID, fail_ok=True, merge_stderr=True) expected = "Alarm %s not found (HTTP 404)" % ALARM_ID self.assertFirstLineStartsWith(result.splitlines(), expected) # DELETE FAIL result = self.aodh('alarm', params="delete %s" % ALARM_ID, fail_ok=True, merge_stderr=True) self.assertFirstLineStartsWith(result.splitlines(), expected) # LIST DOES NOT HAVE ALARM result = self.aodh('alarm', params="list") output_colums = ['alarm_id', 'type', 'name', 'state', 'severity', 'enabled'] for alarm_list in self.parser.listing(result): self.assertEqual(sorted(output_colums), sorted(alarm_list.keys())) self.assertNotIn(ALARM_ID, [r['alarm_id'] for r in self.parser.listing(result)]) def test_update_gnresthr_gnaggrresthr(self): RESOURCE_ID = uuidutils.generate_uuid() # CREATE result = self.aodh(u'alarm', params=(u"create " "--type gnocchi_resources_threshold " "--name alarm_gn123 --metric cpu_util " "--resource-id %s --threshold 80 " "--resource-type generic " "--aggregation-method last " % RESOURCE_ID)) alarm = self.details_multiple(result)[0] ALARM_ID = alarm['alarm_id'] self.assertEqual('alarm_gn123', alarm['name']) self.assertEqual('cpu_util', alarm['metric']) self.assertEqual('80.0', alarm['threshold']) self.assertEqual('last', alarm['aggregation_method']) self.assertEqual('generic', alarm['resource_type']) # UPDATE TYPE TO GNOCCHI_AGGREGATION_BY_RESOURCES_THRESHOLD result = self.aodh( 'alarm', params=("update %s --type " "gnocchi_aggregation_by_resources_threshold " "--metric cpu --threshold 90 " "--query " '\'{"=": {"creator": "cr3at0r"}}\' ' "--resource-type generic " "--aggregation-method mean " % ALARM_ID)) alarm_updated = self.details_multiple(result)[0] self.assertEqual(ALARM_ID, alarm_updated["alarm_id"]) self.assertEqual('cpu', alarm_updated['metric']) self.assertEqual('90.0', alarm_updated['threshold']) self.assertEqual('mean', alarm_updated['aggregation_method']) self.assertEqual('generic', alarm_updated['resource_type']) self.assertEqual('{"=": {"creator": "cr3at0r"}}', alarm_updated['query']) self.assertEqual('gnocchi_aggregation_by_resources_threshold', alarm_updated['type']) # UPDATE TYPE TO GNOCCHI_RESOURCES_THRESHOLD result = self.aodh( 'alarm', params=("update %s " "--type gnocchi_resources_threshold " "--metric cpu_util " "--resource-id %s --threshold 80 " "--resource-type generic " "--aggregation-method last " % (ALARM_ID, RESOURCE_ID))) alarm_updated = self.details_multiple(result)[0] self.assertEqual(ALARM_ID, alarm_updated["alarm_id"]) self.assertEqual('cpu_util', alarm_updated['metric']) self.assertEqual('80.0', alarm_updated['threshold']) self.assertEqual('last', alarm_updated['aggregation_method']) self.assertEqual('generic', alarm_updated['resource_type']) self.assertEqual('gnocchi_resources_threshold', alarm_updated['type']) # DELETE result = self.aodh('alarm', params="delete %s" % ALARM_ID) self.assertEqual("", result) def test_update_gnaggrresthr_gnaggrmetricthr(self): METRIC1 = 'cpu' METRIC2 = 'cpu_util' # CREATE result = self.aodh( u'alarm', params=(u"create " "--type " "gnocchi_aggregation_by_resources_threshold " "--name alarm123 --metric cpu --threshold 80 " "--query " '\'{"=": {"creator": "cr3at0r"}}\' ' "--resource-type generic " 
"--aggregation-method mean ")) alarm = self.details_multiple(result)[0] ALARM_ID = alarm['alarm_id'] self.assertEqual('alarm123', alarm['name']) self.assertEqual('cpu', alarm['metric']) self.assertEqual('80.0', alarm['threshold']) self.assertEqual('mean', alarm['aggregation_method']) self.assertEqual('generic', alarm['resource_type']) self.assertEqual('{"=": {"creator": "cr3at0r"}}', alarm['query']) # UPDATE TYPE TO GNOCCHI_AGGREGATION_BY_METRICS_THRESHOLD result = self.aodh( 'alarm', params=("update %s --type " "gnocchi_aggregation_by_metrics_threshold " "--metrics %s " "--metrics %s " "--threshold 80 " "--aggregation-method last" % (ALARM_ID, METRIC1, METRIC2))) alarm_updated = self.details_multiple(result)[0] self.assertEqual(ALARM_ID, alarm_updated["alarm_id"]) metrics = "['cpu', 'cpu_util']" self.assertEqual(metrics, alarm_updated['metrics']) self.assertEqual('80.0', alarm_updated['threshold']) self.assertEqual('last', alarm_updated['aggregation_method']) self.assertEqual('gnocchi_aggregation_by_metrics_threshold', alarm_updated['type']) # UPDATE TYPE TO GNOCCHI_AGGREGATION_BY_RESOURCES_THRESHOLD result = self.aodh( 'alarm', params=("update %s --type " "gnocchi_aggregation_by_resources_threshold " "--metric cpu --threshold 80 " "--query " '\'{"=": {"creator": "cr3at0r"}}\' ' "--resource-type generic " "--aggregation-method mean " % ALARM_ID)) alarm_updated = self.details_multiple(result)[0] self.assertEqual(ALARM_ID, alarm_updated["alarm_id"]) self.assertEqual('cpu', alarm_updated['metric']) self.assertEqual('80.0', alarm_updated['threshold']) self.assertEqual('mean', alarm_updated['aggregation_method']) self.assertEqual('generic', alarm_updated['resource_type']) self.assertEqual('{"=": {"creator": "cr3at0r"}}', alarm_updated['query']) self.assertEqual('gnocchi_aggregation_by_resources_threshold', alarm_updated['type']) # DELETE result = self.aodh('alarm', params="delete %s" % ALARM_ID) self.assertEqual("", result) class AodhClientPrometheusRulesTest(base.ClientTestBase): def test_prometheus_type_scenario(self): QUERY = "ceilometer_image_size" req = requests.get( os.environ.get("PROMETHEUS_ENDPOINT") + "/api/v1/status/runtimeinfo", ) self.assertEqual(200, req.status_code) # CREATE result = self.aodh(u'alarm', params=(u"create " "--type prometheus " "--name alarm_p1 " "--threshold 80 " "--query %s " % QUERY)) alarm = self.details_multiple(result)[0] ALARM_ID = alarm['alarm_id'] self.assertEqual('alarm_p1', alarm['name']) self.assertEqual(QUERY, alarm['query']) self.assertEqual('80.0', alarm['threshold']) # CREATE WITH --TIME-CONSTRAINT result = self.aodh( u'alarm', params=(u"create --type prometheus " "--name alarm_ptc --threshold 80 " "--time-constraint " "name=cons1;start='0 11 * * *';duration=300 " "--time-constraint " "name=cons2;start='0 23 * * *';duration=600 " "--query %s " % QUERY)) alarm = self.details_multiple(result)[0] alarm_ptc_id = alarm["alarm_id"] self.assertEqual('alarm_ptc', alarm['name']) self.assertEqual('80.0', alarm['threshold']) self.assertIsNotNone(alarm['time_constraints']) # CREATE FAIL MISSING PARAM self.assertRaises(exceptions.CommandFailed, self.aodh, u'alarm', params=(u"create " "--type prometheus " "--name alarm1 " "--query %s " % QUERY)) # UPDATE result = self.aodh( 'alarm', params=("update %s --severity critical --threshold 90" % ALARM_ID)) alarm_updated = self.details_multiple(result)[0] self.assertEqual(ALARM_ID, alarm_updated["alarm_id"]) self.assertEqual('critical', alarm_updated['severity']) self.assertEqual('90.0', alarm_updated["threshold"]) # 
GET result = self.aodh( 'alarm', params="show %s" % ALARM_ID) alarm_show = self.details_multiple(result)[0] self.assertEqual(ALARM_ID, alarm_show["alarm_id"]) self.assertEqual('alarm_p1', alarm_show['name']) self.assertEqual('90.0', alarm_show['threshold']) self.assertEqual('critical', alarm_show['severity']) self.assertEqual(QUERY, alarm_show['query']) # GET BY NAME result = self.aodh( 'alarm', params="show --name alarm_p1") alarm_show = self.details_multiple(result)[0] self.assertEqual(ALARM_ID, alarm_show["alarm_id"]) self.assertEqual('alarm_p1', alarm_show['name']) self.assertEqual('90.0', alarm_show['threshold']) self.assertEqual('critical', alarm_show['severity']) self.assertEqual(QUERY, alarm_show['query']) # GET BY NAME AND ID ERROR self.assertRaises(exceptions.CommandFailed, self.aodh, u'alarm', params=(u"show %s --name alarm_p1" % ALARM_ID)) # LIST result = self.aodh('alarm', params="list --filter all_projects=true") self.assertIn(ALARM_ID, [r['alarm_id'] for r in self.parser.listing(result)]) output_colums = ['alarm_id', 'type', 'name', 'state', 'severity', 'enabled'] for alarm_list in self.parser.listing(result): self.assertEqual(sorted(output_colums), sorted(alarm_list.keys())) if alarm_list["alarm_id"] == ALARM_ID: self.assertEqual('alarm_p1', alarm_list['name']) # LIST WITH PAGINATION # list with limit result = self.aodh('alarm', params="list --filter all_projects=true --limit 1") alarm_list = self.parser.listing(result) self.assertEqual(1, len(alarm_list)) # list with sort with key=name dir=asc result = self.aodh( 'alarm', params="list --filter all_projects=true --sort name:asc") names = [r['name'] for r in self.parser.listing(result)] sorted_name = sorted(names) self.assertEqual(sorted_name, names) # list with sort with key=name dir=asc and key=alarm_id dir=asc result = self.aodh( u'alarm', params=(u"create --type prometheus " "--name alarm_p2 --threshold 80 " "--query %s" % QUERY)) created_alarm_id = self.details_multiple(result)[0]['alarm_id'] result = self.aodh( 'alarm', params="list --filter all_projects=true --sort name:asc " "--sort alarm_id:asc") alarm_list = self.parser.listing(result) ids_with_same_name = [] names = [] for alarm in alarm_list: names.append(['alarm_name']) if alarm['name'] == 'alarm_p2': ids_with_same_name.append(alarm['alarm_id']) sorted_ids = sorted(ids_with_same_name) sorted_names = sorted(names) self.assertEqual(sorted_names, names) self.assertEqual(sorted_ids, ids_with_same_name) # List with sort with key=name dir=desc and with the marker # verify that we handle non full and empty pages correctly. # Verify that marker works as expected # We have 3 alarms. List the first 2. result = self.aodh( 'alarm', params="list --filter all_projects=true --filter " "type=prometheus --sort name:asc --limit 2") alarm_list = self.parser.listing(result) self.assertNotIn('alarm_ptc', [r['name'] for r in alarm_list]) self.assertEqual(2, len(alarm_list)) # List the next page. Only the 3rd alarm should be returned result = self.aodh( 'alarm', params="list --filter all_projects=true --filter " "type=prometheus --sort name:asc " "--marker %s --limit 2" % alarm_list[1]['alarm_id']) alarm_list = self.parser.listing(result) self.assertIn('alarm_ptc', [r['name'] for r in alarm_list]) self.assertEqual(1, len(alarm_list)) # List the next page. 
Empty page should be returned result = self.aodh( 'alarm', params="list --filter all_projects=true --filter " "type=prometheus --sort name:asc " "--marker %s --limit 2" % alarm_list[0]['alarm_id']) alarm_list = self.parser.listing(result) self.assertEqual(0, len(alarm_list)) # Delete the last created alarm self.aodh('alarm', params="delete %s" % created_alarm_id) # Delete alarm_ptc self.aodh('alarm', params="delete %s" % alarm_ptc_id) # DELETE result = self.aodh('alarm', params="delete %s" % ALARM_ID) self.assertEqual("", result) # GET FAIL result = self.aodh('alarm', params="show %s" % ALARM_ID, fail_ok=True, merge_stderr=True) expected = "Alarm %s not found (HTTP 404)" % ALARM_ID self.assertFirstLineStartsWith(result.splitlines(), expected) # DELETE FAIL result = self.aodh('alarm', params="delete %s" % ALARM_ID, fail_ok=True, merge_stderr=True) self.assertFirstLineStartsWith(result.splitlines(), expected) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0 aodhclient-3.7.1/aodhclient/tests/functional/test_alarm_history.py0000664000175000017500000001225600000000000025571 0ustar00zuulzuul00000000000000# Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. from oslo_serialization import jsonutils from oslo_utils import uuidutils from aodhclient.tests.functional import base class AlarmHistoryTest(base.ClientTestBase): def test_help(self): self.aodh("help", params="alarm-history show") self.aodh("help", params="alarm-history search") def test_alarm_history_scenario(self): PROJECT_ID = uuidutils.generate_uuid() RESOURCE_ID = uuidutils.generate_uuid() result = self.aodh(u'alarm', params=(u"create " "--type gnocchi_resources_threshold " "--name history1 --metric cpu_util " "--threshold 5 " "--resource-id %s --resource-type generic " "--aggregation-method last " "--project-id %s" % (RESOURCE_ID, PROJECT_ID))) alarm = self.details_multiple(result)[0] ALARM_ID = alarm['alarm_id'] result = self.aodh(u'alarm', params=(u"create " "--type gnocchi_resources_threshold " "--name history2 --metric cpu_util " "--threshold 10 " "--resource-id %s --resource-type generic " "--aggregation-method last " "--project-id %s" % (RESOURCE_ID, PROJECT_ID))) alarm = self.details_multiple(result)[0] ALARM_ID2 = alarm['alarm_id'] # LIST WITH PAGINATION # list with limit result = self.aodh('alarm-history', params=("show %s --limit 1" % ALARM_ID)) alarm_list = self.parser.listing(result) self.assertEqual(1, len(alarm_list)) # list with sort key=timestamp, dir=asc result = self.aodh('alarm-history', params=("show %s --sort timestamp:asc" % ALARM_ID)) alarm_history_list = self.parser.listing(result) timestamp = [r['timestamp'] for r in alarm_history_list] sorted_timestamp = sorted(timestamp) self.assertEqual(sorted_timestamp, timestamp) # list with sort key=type dir = desc and key=timestamp, dir=asc result = self.aodh('alarm-history', params=("show %s --sort type:desc " "--sort timestamp:asc" % ALARM_ID)) alarm_history_list = self.parser.listing(result) creation = alarm_history_list.pop(-1) 
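        # Descriptive note on the sort check below: the trailing entry popped
        # off above is expected to be the alarm's 'creation' record (the
        # primary sort key, type, is descending); any remaining entries should
        # come back in ascending timestamp order per the secondary sort key,
        # which is what the following assertions verify.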
timestamp = [r['timestamp'] for r in alarm_history_list] sorted_timestamp = sorted(timestamp) self.assertEqual(sorted_timestamp, timestamp) self.assertEqual('creation', creation['type']) # TEST FIELDS result = self.aodh( 'alarm-history', params=("show %s" % ALARM_ID)) history = self.parser.listing(result)[0] for key in ["timestamp", "type", "detail", "event_id"]: self.assertIn(key, history) # SHOW result = self.aodh( 'alarm-history', params=("show %s" % ALARM_ID)) history = self.parser.listing(result)[0] self.assertEqual('creation', history['type']) self.assertEqual('history1', jsonutils.loads(history['detail'])['name']) result = self.aodh( 'alarm-history', params=("show %s" % ALARM_ID2)) history = self.parser.listing(result)[0] self.assertEqual('creation', history['type']) self.assertEqual('history2', jsonutils.loads(history['detail'])['name']) # SEARCH ALL result = self.aodh('alarm-history', params=("search")) self.assertIn(ALARM_ID, [r['alarm_id'] for r in self.parser.listing(result)]) self.assertIn(ALARM_ID2, [r['alarm_id'] for r in self.parser.listing(result)]) # SEARCH result = self.aodh('alarm-history', params=("search --query " "alarm_id=%s" % ALARM_ID)) history = self.parser.listing(result)[0] self.assertEqual(ALARM_ID, history["alarm_id"]) self.assertEqual('creation', history['type']) self.assertEqual('history1', jsonutils.loads(history['detail'])['name']) # CLEANUP self.aodh('alarm', params="delete %s" % ALARM_ID) self.aodh('alarm', params="delete %s" % ALARM_ID2) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0 aodhclient-3.7.1/aodhclient/tests/functional/test_capabilities.py0000664000175000017500000000163300000000000025342 0ustar00zuulzuul00000000000000# Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. from aodhclient.tests.functional import base class CapabilitiesClientTest(base.ClientTestBase): def test_capabilities_scenario(self): # GET result = self.aodh('capabilities', params="list") caps = self.parser.listing(result)[0] self.assertIsNotNone(caps) self.assertEqual('alarm_storage', caps['Field']) ././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1741251064.5382805 aodhclient-3.7.1/aodhclient/tests/unit/0000775000175000017500000000000000000000000020112 5ustar00zuulzuul00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0 aodhclient-3.7.1/aodhclient/tests/unit/__init__.py0000664000175000017500000000000000000000000022211 0ustar00zuulzuul00000000000000././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0 aodhclient-3.7.1/aodhclient/tests/unit/test_alarm_cli.py0000664000175000017500000002773000000000000023457 0ustar00zuulzuul00000000000000# # Copyright IBM 2016. All rights reserved # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. 
You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. import argparse from unittest import mock import testtools from aodhclient.v2 import alarm_cli class CliAlarmCreateTest(testtools.TestCase): def setUp(self): super(CliAlarmCreateTest, self).setUp() self.app = mock.Mock() self.parser = mock.Mock() self.cli_alarm_create = ( alarm_cli.CliAlarmCreate(self.app, self.parser)) @mock.patch.object(argparse.ArgumentParser, 'error') def test_validate_args_gnocchi_resources_threshold(self, mock_arg): # Cover the test case of the method _validate_args for # gnocchi_resources_threshold parser = self.cli_alarm_create.get_parser('aodh alarm create') test_parsed_args = parser.parse_args([ '--name', 'gnocchi_resources_threshold_test', '--type', 'gnocchi_resources_threshold', '--metric', 'cpu', '--aggregation-method', 'last', '--resource-type', 'generic', '--threshold', '80' ]) self.cli_alarm_create._validate_args(test_parsed_args) mock_arg.assert_called_once_with( 'gnocchi_resources_threshold requires --metric, ' '--threshold, --resource-id, --resource-type and ' '--aggregation-method') @mock.patch.object(argparse.ArgumentParser, 'error') def test_validate_args_threshold(self, mock_arg): # Cover the test case of the method _validate_args for # threshold parser = self.cli_alarm_create.get_parser('aodh alarm create') test_parsed_args = parser.parse_args([ '--name', 'threshold_test', '--type', 'threshold', '--threshold', '80' ]) self.cli_alarm_create._validate_args(test_parsed_args) mock_arg.assert_called_once_with( 'Threshold alarm requires -m/--meter-name and ' '--threshold parameters. 
Meter name can be ' 'found in Ceilometer') @mock.patch.object(argparse.ArgumentParser, 'error') def test_validate_args_composite(self, mock_arg): # Cover the test case of the method _validate_args for # composite parser = self.cli_alarm_create.get_parser('aodh alarm create') test_parsed_args = parser.parse_args([ '--name', 'composite_test', '--type', 'composite' ]) self.cli_alarm_create._validate_args(test_parsed_args) mock_arg.assert_called_once_with( 'Composite alarm requires --composite-rule parameter') @mock.patch.object(argparse.ArgumentParser, 'error') def test_validate_args_gno_agg_by_resources_threshold(self, mock_arg): # Cover the test case of the method _validate_args for # gnocchi_aggregation_by_resources_threshold parser = self.cli_alarm_create.get_parser('aodh alarm create') test_parsed_args = parser.parse_args([ '--name', 'gnocchi_aggregation_by_resources_threshold_test', '--type', 'gnocchi_aggregation_by_resources_threshold', '--metric', 'cpu', '--aggregation-method', 'last', '--resource-type', 'generic', '--threshold', '80' ]) self.cli_alarm_create._validate_args(test_parsed_args) mock_arg.assert_called_once_with( 'gnocchi_aggregation_by_resources_threshold requires ' '--metric, --threshold, --aggregation-method, --query and ' '--resource-type') @mock.patch.object(argparse.ArgumentParser, 'error') def test_validate_args_gno_agg_by_metrics_threshold(self, mock_arg): # Cover the test case of the method _validate_args for # gnocchi_aggregation_by_metrics_threshold parser = self.cli_alarm_create.get_parser('aodh alarm create') test_parsed_args = parser.parse_args([ '--name', 'gnocchi_aggregation_by_metrics_threshold_test', '--type', 'gnocchi_aggregation_by_metrics_threshold', '--resource-type', 'generic', '--threshold', '80' ]) self.cli_alarm_create._validate_args(test_parsed_args) mock_arg.assert_called_once_with( 'gnocchi_aggregation_by_metrics_threshold requires ' '--metric, --threshold and --aggregation-method') @mock.patch.object(argparse.ArgumentParser, 'error') def test_validate_args_prometheus(self, mock_arg): # Cover the test case of the method _validate_args for # prometheus parser = self.cli_alarm_create.get_parser('aodh alarm create') test_parsed_args = parser.parse_args([ '--name', 'prom_test', '--type', 'prometheus', '--comparison-operator', 'gt', '--threshold', '666', ]) self.cli_alarm_create._validate_args(test_parsed_args) mock_arg.assert_called_once_with( 'Prometheus alarm requires --query and --threshold parameters.') def test_alarm_from_args(self): # The test case to cover the method _alarm_from_args parser = self.cli_alarm_create.get_parser('aodh alarm create') test_parsed_args = parser.parse_args([ '--type', 'event', '--name', 'alarm_from_args_test', '--project-id', '01919bbd-8b0e-451c-be28-abe250ae9b1b', '--user-id', '01919bbd-8b0e-451c-be28-abe250ae9c1c', '--description', 'For Test', '--state', 'ok', '--severity', 'critical', '--enabled', 'True', '--alarm-action', 'http://something/alarm', '--ok-action', 'http://something/ok', '--repeat-action', 'True', '--insufficient-data-action', 'http://something/insufficient', '--time-constraint', 'name=cons1;start="0 11 * * *";duration=300;description=desc1', '--evaluation-periods', '60', '--comparison-operator', 'le', '--threshold', '80', '--event-type', 'event', '--query', 'resource=fake-resource-id', '--granularity', '60', '--aggregation-method', 'last', '--metric', 'cpu', '--resource-id', '01919bbd-8b0e-451c-be28-abe250ae9c1c', '--resource-type', 'generic', '--stack-id', '0809ab348-8b0e-451c-be28-abe250ae9c1c', 
'--pool-id', '79832aabf-343ba-be28-abe250ae9c1c', '--autoscaling-group-id', 'abe250ae9c1c-79832aabf-343ba-be28' ]) # Output for the test alarm = { 'name': 'alarm_from_args_test', 'project_id': '01919bbd-8b0e-451c-be28-abe250ae9b1b', 'user_id': '01919bbd-8b0e-451c-be28-abe250ae9c1c', 'description': 'For Test', 'state': 'ok', 'severity': 'critical', 'enabled': True, 'alarm_actions': ['http://something/alarm'], 'ok_actions': ['http://something/ok'], 'insufficient_data_actions': ['http://something/insufficient'], 'time_constraints': [{'description': 'desc1', 'duration': '300', 'name': 'cons1', 'start': '0 11 * * *'}], 'repeat_actions': True, 'event_rule': { 'event_type': 'event', 'query': [{'field': 'resource', 'op': 'eq', 'type': '', 'value': 'fake-resource-id'}] }, 'threshold_rule': {'comparison_operator': 'le', 'evaluation_periods': 60, 'query': [{'field': 'resource', 'op': 'eq', 'type': '', 'value': 'fake-resource-id'}], 'threshold': 80.0}, 'prometheus_rule': {'comparison_operator': 'le', 'query': [{'field': 'resource', 'op': 'eq', 'type': '', 'value': 'fake-resource-id'}], 'threshold': 80.0}, 'gnocchi_resources_threshold_rule': { 'granularity': '60', 'metric': 'cpu', 'aggregation_method': 'last', 'evaluation_periods': 60, 'resource_id': '01919bbd-8b0e-451c-be28-abe250ae9c1c', 'comparison_operator': 'le', 'threshold': 80.0, 'resource_type': 'generic' }, 'gnocchi_aggregation_by_metrics_threshold_rule': { 'granularity': '60', 'aggregation_method': 'last', 'evaluation_periods': 60, 'comparison_operator': 'le', 'threshold': 80.0, 'metrics': ['cpu'], }, 'loadbalancer_member_health_rule': { 'autoscaling_group_id': 'abe250ae9c1c-79832aabf-343ba-be28', 'pool_id': '79832aabf-343ba-be28-abe250ae9c1c', 'stack_id': '0809ab348-8b0e-451c-be28-abe250ae9c1c' }, 'gnocchi_aggregation_by_resources_threshold_rule': { 'granularity': '60', 'metric': 'cpu', 'aggregation_method': 'last', 'evaluation_periods': 60, 'comparison_operator': 'le', 'threshold': 80.0, 'query': [{'field': 'resource', 'op': 'eq', 'type': '', 'value': 'fake-resource-id'}], 'resource_type': 'generic' }, 'composite_rule': None, 'type': 'event' } alarm_rep = self.cli_alarm_create._alarm_from_args(test_parsed_args) self.assertEqual(alarm, alarm_rep) def test_alarm_from_args_for_prometheus(self): # The test case to cover the method _alarm_from_args parser = self.cli_alarm_create.get_parser('aodh alarm create') test_parsed_args = parser.parse_args([ '--name', 'alarm_prom', '--type', 'prometheus', '--comparison-operator', 'gt', '--threshold', '666', '--query', r'some_metric{some_label="some_value"}' ]) prom_rule = {'comparison_operator': 'gt', 'query': r'some_metric{some_label="some_value"}', 'threshold': 666.0} alarm_rep = self.cli_alarm_create._alarm_from_args(test_parsed_args) self.assertEqual(prom_rule, alarm_rep['prometheus_rule']) def test_validate_time_constraint(self): starts = ['0 11 * * *', ' 0 11 * * * ', '"0 11 * * *"', '\'0 11 * * *\''] for start in starts: string = 'name=const1;start=%s;duration=1' % start expected = dict(name='const1', start='0 11 * * *', duration='1') self.assertEqual( expected, self.cli_alarm_create.validate_time_constraint(string)) def test_validate_time_constraint_with_bad_format(self): string = 'name=const2;start="0 11 * * *";duration:2' self.assertRaises(argparse.ArgumentTypeError, self.cli_alarm_create.validate_time_constraint, string) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0 
aodhclient-3.7.1/aodhclient/tests/unit/test_alarm_history.py0000664000175000017500000000365100000000000024405 0ustar00zuulzuul00000000000000# # Copyright IBM 2016. All rights reserved # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. from unittest import mock import testtools from aodhclient.v2 import alarm_history class AlarmHistoryManagerTest(testtools.TestCase): def setUp(self): super(AlarmHistoryManagerTest, self).setUp() self.client = mock.Mock() @mock.patch.object(alarm_history.AlarmHistoryManager, '_get') def test_get(self, mock_ahm): ahm = alarm_history.AlarmHistoryManager(self.client) ahm.get('01919bbd-8b0e-451c-be28-abe250ae9b1b') mock_ahm.assert_called_with( 'v2/alarms/01919bbd-8b0e-451c-be28-abe250ae9b1b/history') @mock.patch.object(alarm_history.AlarmHistoryManager, '_post') def test_search(self, mock_ahm): ahm = alarm_history.AlarmHistoryManager(self.client) q = ('{"and": [{"=": {"type": "gnocchi_resources_threshold"}}, ' '{"=": {"alarm_id": "87bacbcb-a09c-4cb9-86d0-ad410dd8ad98"}}]}') ahm.search(q) expected_called_data = ( '{"filter": "{\\"and\\": [' '{\\"=\\": {\\"type\\": \\"gnocchi_resources_threshold\\"}}, ' '{\\"=\\": {\\"alarm_id\\": ' '\\"87bacbcb-a09c-4cb9-86d0-ad410dd8ad98\\"}}]}"}') mock_ahm.assert_called_with( 'v2/query/alarms/history', data=expected_called_data, headers={'Content-Type': 'application/json'}) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0 aodhclient-3.7.1/aodhclient/tests/unit/test_alarm_manager.py0000664000175000017500000000653200000000000024317 0ustar00zuulzuul00000000000000# # Copyright IBM 2016. All rights reserved # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. 
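# The tests below drive aodhclient.v2.alarm.AlarmManager directly against a
# mocked transport. For orientation only, normal use reaches the same manager
# through the v2 client object, roughly like this (a hedged sketch: the
# keystoneauth session setup is elided and the exact client constructor
# arguments are an assumption -- check aodhclient.client / aodhclient.v2.client
# before relying on them):
#
#   from aodhclient.v2 import client
#   aodh = client.Client(session=keystone_session)   # constructor shape assumed
#   aodh.alarm.list(filters={'severity': 'low'})
#   aodh.alarm.get('01919bbd-8b0e-451c-be28-abe250ae9b1b')
#   aodh.alarm.delete('01919bbd-8b0e-451c-be28-abe250ae9b1b')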
import testtools from unittest import mock from aodhclient.v2 import alarm class AlarmManagerTest(testtools.TestCase): def setUp(self): super(AlarmManagerTest, self).setUp() self.client = mock.Mock() self.alarms = { 'event_alarm': { 'gnocchi_aggregation_by_metrics_threshold_rule': {}, 'gnocchi_resources_threshold_rule': {}, 'name': 'event_alarm', 'gnocchi_aggregation_by_resources_threshold_rule': {}, 'event_rule': {}, 'type': 'event'} } self.results = { "result1": { "event_rule": {} }, } @mock.patch.object(alarm.AlarmManager, '_get') def test_list(self, mock_am): am = alarm.AlarmManager(self.client) am.list() mock_am.assert_called_with('v2/alarms') @mock.patch.object(alarm.AlarmManager, '_post') def test_query(self, mock_am): am = alarm.AlarmManager(self.client) query = '{"=": {"type": "event"}}' am.query(query) url = 'v2/query/alarms' expected_value = ('{"filter": "{\\"=\\": {\\"type\\":' ' \\"event\\"}}"}') headers_value = {'Content-Type': "application/json"} mock_am.assert_called_with( url, data=expected_value, headers=headers_value) @mock.patch.object(alarm.AlarmManager, '_get') def test_list_with_filters(self, mock_am): am = alarm.AlarmManager(self.client) filters = dict(type='gnocchi_resources_threshold', severity='low') am.list(filters=filters) expected_url = ( "v2/alarms?q.field=severity&q.op=eq&q.value=low&" "q.field=type&q.op=eq&q.value=gnocchi_resources_threshold") mock_am.assert_called_with(expected_url) @mock.patch.object(alarm.AlarmManager, '_get') def test_get(self, mock_am): am = alarm.AlarmManager(self.client) am.get('01919bbd-8b0e-451c-be28-abe250ae9b1b') mock_am.assert_called_with( 'v2/alarms/01919bbd-8b0e-451c-be28-abe250ae9b1b') @mock.patch.object(alarm.AlarmManager, '_delete') def test_delete(self, mock_am): am = alarm.AlarmManager(self.client) am.delete('01919bbd-8b0e-451c-be28-abe250ae9b1b') mock_am.assert_called_with( 'v2/alarms/01919bbd-8b0e-451c-be28-abe250ae9b1b') def test_clean_rules_event_alarm(self): am = alarm.AlarmManager(self.client) alarm_value = self.alarms.get('event_alarm') am._clean_rules('event', alarm_value) alarm_value.pop('type') alarm_value.pop('name') result = self.results.get("result1") self.assertEqual(alarm_value, result) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0 aodhclient-3.7.1/aodhclient/tests/unit/test_exceptions.py0000664000175000017500000000277100000000000023713 0ustar00zuulzuul00000000000000# Copyright 2016 Hewlett Packard Enterprise Development Company, L.P. # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. 
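# Context for the assertions below: aodhclient.exceptions.from_response()
# builds an exception out of an HTTP error response. When the status code is
# not one it recognises (such as the 520 used in the second test), it falls
# back to the generic ClientException, folding the response body, the status
# code and the x-openstack-request-id header into the exception message.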
from unittest import mock from oslotest import base from aodhclient import exceptions class AodhclientExceptionsTest(base.BaseTestCase): def test_string_format_base_exception(self): # ensure http_status has initial value N/A self.assertEqual('Unknown Error (HTTP N/A)', '%s' % exceptions.ClientException()) def test_no_match_exception_from_response(self): resp = mock.MagicMock(status_code=520) resp.headers = { 'Content-Type': 'text/plain', 'x-openstack-request-id': 'fake-request-id' } resp.text = 'Of course I still love you' e = exceptions.from_response(resp, 'http://no.where:2333/v2/alarms') self.assertIsInstance(e, exceptions.ClientException) self.assertEqual('Of course I still love you (HTTP 520) ' '(Request-ID: fake-request-id)', '%s' % e) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0 aodhclient-3.7.1/aodhclient/tests/unit/test_quota.py0000664000175000017500000000560600000000000022663 0ustar00zuulzuul00000000000000# # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. import testtools from unittest import mock from aodhclient import exceptions from aodhclient.v2 import quota_cli class QuotaShowTest(testtools.TestCase): def setUp(self): super(QuotaShowTest, self).setUp() self.app = mock.Mock() self.quota_mgr_mock = self.app.client_manager.alarming.quota self.parser = mock.Mock() self.quota_show = ( quota_cli.QuotaShow(self.app, self.parser)) def test_quota_show(self): self.quota_mgr_mock.list.return_value = { "project_id": "fake_project", "quotas": [ { "limit": 20, "resource": "alarms" } ] } parser = self.quota_show.get_parser('') args = parser.parse_args(['--project', 'fake_project']) # Something like [('alarms',), (20,)] ret = list(self.quota_show.take_action(args)) self.quota_mgr_mock.list.assert_called_once_with( project='fake_project') self.assertIn('alarms', ret[0]) self.assertIn(20, ret[1]) class QuotaSetTest(testtools.TestCase): def setUp(self): super(QuotaSetTest, self).setUp() self.app = mock.Mock() self.quota_mgr_mock = self.app.client_manager.alarming.quota self.parser = mock.Mock() self.quota_set = ( quota_cli.QuotaSet(self.app, self.parser)) def test_quota_set(self): self.quota_mgr_mock.create.return_value = { "project_id": "fake_project", "quotas": [ { "limit": 20, "resource": "alarms" } ] } parser = self.quota_set.get_parser('') args = parser.parse_args(['fake_project', '--alarm', '20']) ret = list(self.quota_set.take_action(args)) self.quota_mgr_mock.create.assert_called_once_with( 'fake_project', [{'resource': 'alarms', 'limit': 20}]) self.assertIn('alarms', ret[0]) self.assertIn(20, ret[1]) def test_quota_set_invalid_quota(self): parser = self.quota_set.get_parser('') args = parser.parse_args(['fake_project', '--alarm', '-2']) self.assertRaises(exceptions.CommandError, self.quota_set.take_action, args) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0 aodhclient-3.7.1/aodhclient/tests/unit/test_shell.py0000664000175000017500000000261100000000000022632 
0ustar00zuulzuul00000000000000# Copyright 2016 Hewlett Packard Enterprise Development Company, L.P. # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. import io import sys from unittest import mock from keystoneauth1 import exceptions import testtools from aodhclient import shell class CliTest(testtools.TestCase): @mock.patch('sys.stderr', io.StringIO()) def test_cli_http_error_with_details(self): shell.AodhShell().clean_up( None, None, exceptions.HttpError('foo', details='bar')) stderr_lines = sys.stderr.getvalue().splitlines() self.assertEqual(1, len(stderr_lines)) self.assertEqual('bar', stderr_lines[0]) @mock.patch('sys.stderr', io.StringIO()) def test_cli_http_error_without_details(self): shell.AodhShell().clean_up(None, None, exceptions.HttpError('foo')) stderr_lines = sys.stderr.getvalue().splitlines() self.assertEqual(0, len(stderr_lines)) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0 aodhclient-3.7.1/aodhclient/tests/unit/test_utils.py0000664000175000017500000000622300000000000022666 0ustar00zuulzuul00000000000000# -*- encoding: utf-8 -*- # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. from oslotest import base from aodhclient import utils class SearchQueryBuilderTest(base.BaseTestCase): def _do_test(self, expr, expected): req = utils.search_query_builder(expr) self.assertEqual(expected, req) def test_search_query_builder(self): self._do_test('foo=bar', {"=": {"foo": "bar"}}) self._do_test('foo!=1', {"!=": {"foo": 1.0}}) self._do_test('foo=True', {"=": {"foo": True}}) self._do_test('foo=null', {"=": {"foo": None}}) self._do_test('foo="null"', {"=": {"foo": "null"}}) self._do_test('not (foo="quote" or foo="what!" ' 'or bar="who?")', {"not": {"or": [ {"=": {"bar": "who?"}}, {"=": {"foo": "what!"}}, {"=": {"foo": "quote"}}, ]}}) self._do_test('(foo="quote" or not foo="what!" ' 'or bar="who?") and cat="meme"', {"and": [ {"=": {"cat": "meme"}}, {"or": [ {"=": {"bar": "who?"}}, {"not": {"=": {"foo": "what!"}}}, {"=": {"foo": "quote"}}, ]} ]}) self._do_test('foo="quote" or foo="what!" ' 'or bar="who?" and cat="meme"', {"or": [ {"and": [ {"=": {"cat": "meme"}}, {"=": {"bar": "who?"}}, ]}, {"=": {"foo": "what!"}}, {"=": {"foo": "quote"}}, ]}) self._do_test('foo="quote" and foo="what!" ' 'or bar="who?" 
or cat="meme"', {'or': [ {'=': {'cat': 'meme'}}, {'=': {'bar': 'who?'}}, {'and': [ {'=': {'foo': 'what!'}}, {'=': {'foo': 'quote'}} ]} ]}) class CliQueryToArray(base.BaseTestCase): def test_cli_query_to_arrary(self): cli_query = "this<=34;that=string::foo" ret_array = utils.cli_to_array(cli_query) expected_query = [ {"field": "this", "type": "", "value": "34", "op": "le"}, {"field": "that", "type": "string", "value": "foo", "op": "eq"}] self.assertEqual(expected_query, ret_array) ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0 aodhclient-3.7.1/aodhclient/utils.py0000664000175000017500000001550400000000000017510 0ustar00zuulzuul00000000000000# -*- encoding: utf-8 -*- # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. import re from urllib import parse as urllib_parse import pyparsing as pp uninary_operators = ("not", ) binary_operator = (u">=", u"<=", u"!=", u">", u"<", u"=", u"==", u"eq", u"ne", u"lt", u"gt", u"ge", u"le") multiple_operators = (u"and", u"or") operator = pp.Regex(u"|".join(binary_operator)) null = pp.Regex("None|none|null").setParseAction(pp.replaceWith(None)) boolean = "False|True|false|true" boolean = pp.Regex(boolean).setParseAction(lambda t: t[0].lower() == "true") hex_string = lambda n: pp.Word(pp.hexnums, exact=n) # noqa: E731 uuid = pp.Combine(hex_string(8) + ("-" + hex_string(4)) * 3 + "-" + hex_string(12)) number = r"[+-]?\d+(:?\.\d*)?(:?[eE][+-]?\d+)?" 
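# The pyparsing grammar assembled in this module backs search_query_builder(),
# which turns the CLI's filter expressions into the nested-dict query format
# the Aodh API accepts. A few examples, lifted from
# SearchQueryBuilderTest.test_search_query_builder above:
#
#   search_query_builder('foo=bar')   -> {"=": {"foo": "bar"}}
#   search_query_builder('foo!=1')    -> {"!=": {"foo": 1.0}}
#   search_query_builder('foo=null')  -> {"=": {"foo": None}}
#
# Conditions can be combined with "and" / "or" / "not" and parenthesised,
# e.g. 'not (foo="quote" or bar="who?")'.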
number = pp.Regex(number).setParseAction(lambda t: float(t[0]))
identifier = pp.Word(pp.alphas, pp.alphanums + "_")
quoted_string = pp.QuotedString('"') | pp.QuotedString("'")
comparison_term = pp.Forward()
in_list = pp.Group(pp.Suppress('[') +
                   pp.Optional(pp.delimitedList(comparison_term)) +
                   pp.Suppress(']'))("list")
comparison_term << (null | boolean | uuid | identifier | number |
                    quoted_string)
condition = pp.Group(comparison_term + operator + comparison_term)
expr = pp.infixNotation(condition, [
    ("not", 1, pp.opAssoc.RIGHT, ),
    ("and", 2, pp.opAssoc.LEFT, ),
    ("or", 2, pp.opAssoc.LEFT, ),
])

OP_LOOKUP = {'!=': 'ne',
             '>=': 'ge',
             '<=': 'le',
             '>': 'gt',
             '<': 'lt',
             '=': 'eq'}

OP_LOOKUP_KEYS = '|'.join(sorted(OP_LOOKUP.keys(), key=len, reverse=True))
OP_SPLIT_RE = re.compile(r'(%s)' % OP_LOOKUP_KEYS)


def _parsed_query2dict(parsed_query):
    result = None
    while parsed_query:
        part = parsed_query.pop()
        if part in binary_operator:
            result = {part: {parsed_query.pop(): result}}
        elif part in multiple_operators:
            if result.get(part):
                result[part].append(
                    _parsed_query2dict(parsed_query.pop()))
            else:
                result = {part: [result]}
        elif part in uninary_operators:
            result = {part: result}
        elif isinstance(part, pp.ParseResults):
            kind = part.getName()
            if kind == "list":
                res = part.asList()
            else:
                res = _parsed_query2dict(part)
            if result is None:
                result = res
            elif isinstance(result, dict):
                list(result.values())[0].append(res)
        else:
            result = part
    return result


def search_query_builder(query):
    parsed_query = expr.parseString(query)[0]
    return _parsed_query2dict(parsed_query)


def list2cols(cols, objs):
    return cols, [tuple([o[k] for k in cols]) for o in objs]


def format_string_list(objs, field):
    objs[field] = ", ".join(objs[field])


def format_dict_list(objs, field):
    objs[field] = "\n".join(
        "- " + ", ".join("%s: %s" % (k, v)
                         for k, v in elem.items())
        for elem in objs[field])


def format_move_dict_to_root(obj, field):
    for attr in obj[field]:
        obj["%s/%s" % (field, attr)] = obj[field][attr]
    del obj[field]


def format_archive_policy(ap):
    format_dict_list(ap, "definition")
    format_string_list(ap, "aggregation_methods")


def dict_from_parsed_args(parsed_args, attrs):
    d = {}
    for attr in attrs:
        if attr == "metric":
            if parsed_args.metrics:
                value = parsed_args.metrics[0]
            else:
                value = None
        else:
            value = getattr(parsed_args, attr)
        if value is not None:
            if value == [""]:
                # NOTE(jake): As options like --alarm-actions are arrays,
                # their value can be an array containing an empty string if
                # the user set the option to ''. In this case we set it to
                # None here so that a None value gets sent to the API.
                d[attr] = None
            else:
                d[attr] = value
    return d


def dict_to_querystring(objs):
    return "&".join(["%s=%s" % (k, v)
                     for k, v in objs.items()
                     if v is not None])
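# Illustrative behaviour of the helpers above (for reference only; the first
# value is taken from the unit tests, the second follows directly from
# dict_to_querystring's definition):
#
#   search_query_builder('foo=bar')
#       -> {"=": {"foo": "bar"}}
#   dict_to_querystring({'q.field': 'type', 'q.op': None})
#       -> "q.field=type"
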
def cli_to_array(cli_query):
    """Convert CLI list of queries to the Python API format.

    This will convert the following:
        "this<=34;that=string::foo"
    to
        "[{field=this,op=le,value=34,type=''},
          {field=that,op=eq,value=foo,type=string}]"
    """
    opts = []
    queries = cli_query.split(';')
    for q in queries:
        try:
            field, q_operator, type_value = OP_SPLIT_RE.split(q, maxsplit=1)
        except ValueError:
            raise ValueError('Invalid or missing operator in query %(q)s, '
                             'the supported operators are: %(k)s' %
                             {'q': q, 'k': OP_LOOKUP.keys()})
        if not field:
            raise ValueError('Missing field in query %s' % q)
        if not type_value:
            raise ValueError('Missing value in query %s' % q)
        opt = dict(field=field, op=OP_LOOKUP[q_operator])
        if '::' not in type_value:
            opt['type'], opt['value'] = '', type_value
        else:
            opt['type'], _, opt['value'] = type_value.partition('::')
        if opt['type'] and opt['type'] not in (
                'string', 'integer', 'float', 'datetime', 'boolean'):
            err = ('Invalid value type %(type)s, the type of the value '
                   'should be one of: integer, string, float, datetime, '
                   'boolean.' % opt)
            raise ValueError(err)
        opts.append(opt)
    return opts


def get_pagination_options(limit=None, marker=None, sorts=None):
    options = []
    if limit:
        options.append("limit=%d" % limit)
    if marker:
        options.append("marker=%s" % urllib_parse.quote(marker))
    for sort in sorts or []:
        options.append("sort=%s" % urllib_parse.quote(sort))
    return "&".join(options)


def get_client(obj):
    if hasattr(obj.app, 'client_manager'):
        # NOTE(liusheng): cliff objects loaded by OSC
        return obj.app.client_manager.alarming
    else:
        # TODO(liusheng): Remove this when OSC is able
        # to install the aodh client binary itself
        return obj.app.client

././@PaxHeader0000000000000000000000000000003400000000000011452 xustar000000000000000028 mtime=1741251064.5422802
aodhclient-3.7.1/aodhclient/v2/0000775000175000017500000000000000000000000016320 5ustar00zuulzuul00000000000000
././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0
aodhclient-3.7.1/aodhclient/v2/__init__.py0000664000175000017500000000000000000000000020417 0ustar00zuulzuul00000000000000
././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0
aodhclient-3.7.1/aodhclient/v2/alarm.py0000664000175000017500000002120300000000000017764 0ustar00zuulzuul00000000000000
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

from oslo_serialization import jsonutils

from aodhclient import utils
from aodhclient.v2 import alarm_cli
from aodhclient.v2 import base


class AlarmManager(base.Manager):

    url = "v2/alarms"

    @staticmethod
    def _filtersdict_to_url(filters):
        urls = []
        for k, v in sorted(filters.items()):
            url = "q.field=%s&q.op=eq&q.value=%s" % (k, v)
            urls.append(url)
        return '&'.join(urls)

    def list(self, filters=None, limit=None, marker=None, sorts=None):
        """List alarms.

        :param filters: A dict of filter parameters, for example,
                        {'type': 'gnocchi_resources_threshold',
                         'severity': 'low'} represents the filters to query
                        alarms with type='gnocchi_resources_threshold'
                        and severity='low'.
:type filters: dict :param limit: maximum number of resources to return :type limit: int :param marker: the last item of the previous page; we return the next results after this value. :type marker: str :param sorts: list of resource attributes to order by. :type sorts: list of str """ pagination = utils.get_pagination_options(limit, marker, sorts) filter_string = (self._filtersdict_to_url(filters) if filters else "") url = self.url options = [] if filter_string: options.append(filter_string) if pagination: options.append(pagination) if options: url += "?" + "&".join(options) return self._get(url).json() def query(self, query=None): """Query alarms. :param query: A json format complex query expression, like this: '{"=":{"type":"gnocchi_resources_threshold"}}', this expression is used to query all the gnocchi_resources_threshold type alarms. :type query: json """ query = {'filter': query} url = "v2/query/alarms" return self._post(url, headers={'Content-Type': "application/json"}, data=jsonutils.dumps(query)).json() def get(self, alarm_id): """Get an alarm :param alarm_id: ID of the alarm :type alarm_id: str """ return self._get(self.url + '/' + alarm_id).json() @staticmethod def _clean_rules(alarm_type, alarm): for rule in alarm_cli.ALARM_TYPES: if rule != alarm_type: alarm.pop('%s_rule' % rule, None) def create(self, alarm): """Create an alarm :param alarm: the alarm :type alarm: dict """ self._clean_rules(alarm['type'], alarm) return self._post( self.url, headers={'Content-Type': "application/json"}, data=jsonutils.dumps(alarm)).json() def update(self, alarm_id, alarm_update): """Update an alarm :param alarm_id: ID of the alarm :type alarm_id: str :param attributes: Attributes of the alarm :type attributes: dict """ alarm = self._get(self.url + '/' + alarm_id).json() if 'type' not in alarm_update: self._clean_rules(alarm['type'], alarm_update) else: self._clean_rules(alarm_update['type'], alarm_update) if 'prometheus_rule' in alarm_update: rule = alarm_update.get('prometheus_rule') alarm['prometheus_rule'].update(rule) alarm_update.pop('prometheus_rule') elif 'threshold_rule' in alarm_update: alarm['threshold_rule'].update(alarm_update.get('threshold_rule')) alarm_update.pop('threshold_rule') elif 'event_rule' in alarm_update: if ('type' in alarm_update and alarm_update['type'] != alarm['type']): alarm.pop('%s_rule' % alarm['type'], None) alarm['event_rule'] = alarm_update['event_rule'] else: alarm['event_rule'].update(alarm_update.get('event_rule')) alarm_update.pop('event_rule') elif 'gnocchi_resources_threshold_rule' in alarm_update: if ('type' in alarm_update and alarm_update['type'] != alarm['type']): alarm.pop('%s_rule' % alarm['type'], None) alarm['gnocchi_resources_threshold_rule'] = alarm_update[ 'gnocchi_resources_threshold_rule'] else: alarm['gnocchi_resources_threshold_rule'].update( alarm_update.get('gnocchi_resources_threshold_rule')) alarm_update.pop('gnocchi_resources_threshold_rule') elif 'gnocchi_aggregation_by_metrics_threshold_rule' in alarm_update: if ('type' in alarm_update and alarm_update['type'] != alarm['type']): alarm.pop('%s_rule' % alarm['type'], None) alarm['gnocchi_aggregation_by_metrics_threshold_rule'] = \ alarm_update[ 'gnocchi_aggregation_by_metrics_threshold_rule'] else: alarm['gnocchi_aggregation_by_metrics_threshold_rule'].update( alarm_update.get( 'gnocchi_aggregation_by_metrics_threshold_rule')) alarm_update.pop('gnocchi_aggregation_by_metrics_threshold_rule') elif 'gnocchi_aggregation_by_resources_threshold_rule' in alarm_update: if ('type' in 
alarm_update and alarm_update['type'] != alarm['type']): alarm.pop('%s_rule' % alarm['type'], None) alarm['gnocchi_aggregation_by_resources_threshold_rule'] = \ alarm_update[ 'gnocchi_aggregation_by_resources_threshold_rule'] else: alarm['gnocchi_aggregation_by_resources_threshold_rule'].\ update(alarm_update.get( 'gnocchi_aggregation_by_resources_threshold_rule')) alarm_update.pop( 'gnocchi_aggregation_by_resources_threshold_rule') elif 'composite_rule' in alarm_update: if ('type' in alarm_update and alarm_update['type'] != alarm['type']): alarm.pop('%s_rule' % alarm['type'], None) if alarm_update['composite_rule'] is not None: alarm['composite_rule'] = alarm_update[ 'composite_rule'] alarm_update.pop('composite_rule') elif 'loadbalancer_member_health_rule' in alarm_update: if ('type' in alarm_update and alarm_update['type'] != alarm['type']): alarm.pop('%s_rule' % alarm['type'], None) if alarm_update['loadbalancer_member_health_rule'] is not None: alarm['loadbalancer_member_health_rule'] = alarm_update[ 'loadbalancer_member_health_rule'] alarm_update.pop('loadbalancer_member_health_rule') alarm.update(alarm_update) return self._put( self.url + '/' + alarm_id, headers={'Content-Type': "application/json"}, data=jsonutils.dumps(alarm)).json() def delete(self, alarm_id): """Delete an alarm :param alarm_id: ID of the alarm :type alarm_id: str """ self._delete(self.url + '/' + alarm_id) def get_state(self, alarm_id): """Get the state of an alarm :param alarm_id: ID of the alarm :type alarm_id: str """ return self._get(self.url + '/' + alarm_id + '/state').json() def set_state(self, alarm_id, state): """Set the state of an alarm :param alarm_id: ID of the alarm :type alarm_id: str :param state: the state to be updated to the alarm :type state: str """ return self._put(self.url + '/' + alarm_id + '/state', headers={'Content-Type': "application/json"}, data='"%s"' % state ).json() ././@PaxHeader0000000000000000000000000000002600000000000011453 xustar000000000000000022 mtime=1741250992.0 aodhclient-3.7.1/aodhclient/v2/alarm_cli.py0000664000175000017500000006600500000000000020624 0ustar00zuulzuul00000000000000# # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. 
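# Example invocations handled by the command classes in this module
# (illustrative only; the actual command names are registered through the
# package's entry points, which are not shown here, and assume the standard
# "aodh" console script provided by this client):
#
#     aodh alarm list --filter type=gnocchi_resources_threshold --limit 10
#     aodh alarm list --sort name:asc
#     aodh alarm show <alarm-id-or-name>
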
import argparse from cliff import command from cliff import lister from cliff import show from oslo_serialization import jsonutils from oslo_utils import strutils from oslo_utils import uuidutils from aodhclient import exceptions from aodhclient.i18n import _ from aodhclient import utils ALARM_TYPES = ['prometheus', 'event', 'composite', 'threshold', 'gnocchi_resources_threshold', 'gnocchi_aggregation_by_metrics_threshold', 'gnocchi_aggregation_by_resources_threshold', 'loadbalancer_member_health'] ALARM_STATES = ['ok', 'alarm', 'insufficient data'] ALARM_SEVERITY = ['low', 'moderate', 'critical'] ALARM_OPERATORS = ['lt', 'le', 'eq', 'ne', 'ge', 'gt'] ALARM_OP_MAP = dict(zip(ALARM_OPERATORS, ('<', '<=', '=', '!=', '>=', '>'))) STATISTICS = ['max', 'min', 'avg', 'sum', 'count'] ALARM_LIST_COLS = ['alarm_id', 'type', 'name', 'state', 'severity', 'enabled'] class CliAlarmList(lister.Lister): """List alarms""" @staticmethod def split_filter_param(param): key, eq_op, value = param.partition('=') if not eq_op: msg = 'Malformed parameter(%s). Use the key=value format.' % param raise ValueError(msg) return key, value def get_parser(self, prog_name): parser = super(CliAlarmList, self).get_parser(prog_name) exclusive_group = parser.add_mutually_exclusive_group() exclusive_group.add_argument("--query", help="Rich query supported by aodh, " "e.g. project_id!=my-id " "user_id=foo or user_id=bar") exclusive_group.add_argument('--filter', dest='filter', metavar='', type=self.split_filter_param, action='append', help='Filter parameters to apply on' ' returned alarms.') parser.add_argument("--limit", type=int, metavar="", help="Number of resources to return " "(Default is server default)") parser.add_argument("--marker", metavar="", help="Last item of the previous listing. " "Return the next results after this value," "the supported marker is alarm_id.") parser.add_argument("--sort", action="append", metavar="", help="Sort of resource attribute, " "e.g. 
name:asc") return parser def take_action(self, parsed_args): if parsed_args.query: if any([parsed_args.limit, parsed_args.sort, parsed_args.marker]): raise exceptions.CommandError( "Query and pagination options are mutually " "exclusive.") query = jsonutils.dumps( utils.search_query_builder(parsed_args.query)) alarms = utils.get_client(self).alarm.query(query=query) else: filters = dict(parsed_args.filter) if parsed_args.filter else None alarms = utils.get_client(self).alarm.list( filters=filters, sorts=parsed_args.sort, limit=parsed_args.limit, marker=parsed_args.marker) return utils.list2cols(ALARM_LIST_COLS, alarms) def _format_alarm(alarm): if alarm.get('composite_rule'): composite_rule = jsonutils.dumps(alarm['composite_rule'], indent=2) alarm['composite_rule'] = composite_rule return alarm for alarm_type in ALARM_TYPES: if alarm.get('%s_rule' % alarm_type): alarm.update(alarm.pop('%s_rule' % alarm_type)) if alarm["time_constraints"]: alarm["time_constraints"] = jsonutils.dumps(alarm["time_constraints"], sort_keys=True, indent=2) # only works for threshold and event alarm if isinstance(alarm.get('query'), list): query_rows = [] for q in alarm['query']: op = ALARM_OP_MAP.get(q['op'], q['op']) query_rows.append('%s %s %s' % (q['field'], op, q['value'])) alarm['query'] = ' AND\n'.join(query_rows) return alarm def _find_alarm_by_name(client, name): # then try to get entity as name query = jsonutils.dumps({"=": {"name": name}}) alarms = client.alarm.query(query) if len(alarms) > 1: msg = (_("Multiple alarms matches found for '%s', " "use an ID to be more specific.") % name) raise exceptions.NoUniqueMatch(msg) elif not alarms: msg = (_("Alarm %s not found") % name) raise exceptions.NotFound(msg) else: return alarms[0] def _find_alarm_id_by_name(client, name): alarm = _find_alarm_by_name(client, name) return alarm['alarm_id'] def _check_name_and_id_coexist(parsed_args, action): if parsed_args.id and parsed_args.name: raise exceptions.CommandError( "You should provide only one of " "alarm ID and alarm name(--name) " "to %s an alarm." 
% action) def _check_name_and_id_exist(parsed_args, action): if not parsed_args.id and not parsed_args.name: msg = (_("You need to specify one of " "alarm ID and alarm name(--name) " "to %s an alarm.") % action) raise exceptions.CommandError(msg) def _check_name_and_id(parsed_args, action): _check_name_and_id_coexist(parsed_args, action) _check_name_and_id_exist(parsed_args, action) def _add_name_to_parser(parser, required=False): parser.add_argument('--name', metavar='', required=required, help='Name of the alarm') return parser def _add_id_to_parser(parser): parser.add_argument("id", nargs='?', metavar='', help="ID or name of an alarm.") return parser class CliAlarmShow(show.ShowOne): """Show an alarm""" def get_parser(self, prog_name): return _add_name_to_parser( _add_id_to_parser( super(CliAlarmShow, self).get_parser(prog_name))) def take_action(self, parsed_args): _check_name_and_id(parsed_args, 'query') c = utils.get_client(self) if parsed_args.name: alarm = _find_alarm_by_name(c, parsed_args.name) else: if uuidutils.is_uuid_like(parsed_args.id): try: alarm = c.alarm.get(alarm_id=parsed_args.id) except exceptions.NotFound: # Maybe it's a name alarm = _find_alarm_by_name(c, parsed_args.id) else: alarm = _find_alarm_by_name(c, parsed_args.id) return self.dict2columns(_format_alarm(alarm)) class CliAlarmCreate(show.ShowOne): """Create an alarm""" create = True def get_parser(self, prog_name): parser = _add_name_to_parser( super(CliAlarmCreate, self).get_parser(prog_name), required=self.create) parser.add_argument('-t', '--type', metavar='', required=self.create, choices=ALARM_TYPES, help='Type of alarm, should be one of: ' '%s.' % ', '.join(ALARM_TYPES)) parser.add_argument('--project-id', metavar='', help='Project to associate with alarm ' '(configurable by admin users only)') parser.add_argument('--user-id', metavar='', help='User to associate with alarm ' '(configurable by admin users only)') parser.add_argument('--description', metavar='', help='Free text description of the alarm') parser.add_argument('--state', metavar='', choices=ALARM_STATES, help='State of the alarm, one of: ' + str(ALARM_STATES)) parser.add_argument('--severity', metavar='', choices=ALARM_SEVERITY, help='Severity of the alarm, one of: ' + str(ALARM_SEVERITY)) parser.add_argument('--enabled', type=strutils.bool_from_string, metavar='{True|False}', help=('True if alarm evaluation is enabled')) parser.add_argument('--alarm-action', dest='alarm_actions', metavar='', action='append', help=('URL to invoke when state transitions to ' 'alarm. May be used multiple times')) parser.add_argument('--ok-action', dest='ok_actions', metavar='', action='append', help=('URL to invoke when state transitions to ' 'OK. May be used multiple times')) parser.add_argument('--insufficient-data-action', dest='insufficient_data_actions', metavar='', action='append', help=('URL to invoke when state transitions to ' 'insufficient data. May be used multiple ' 'times')) parser.add_argument( '--time-constraint', dest='time_constraints', metavar='