django-prometheus-2.0.0/.gitignore
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
# C extensions
*.so
# Distribution / packaging
.Python
env*/
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
*.egg-info/
.installed.cfg
*.egg
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*,cover
# Translations
*.mo
*.pot
# Django stuff:
*.log
*.sqlite3
# Sphinx documentation
docs/_build/
# PyBuilder
target/
# VSCode
.vscode/
### Emacs ###
# -*- mode: gitignore; -*-
*~
\#*\#
/.emacs.desktop
/.emacs.desktop.lock
*.elc
auto-save-list
tramp
.\#*
# Org-mode
.org-id-locations
*_archive
# flymake-mode
*_flymake.*
# eshell files
/eshell/history
/eshell/lastdir
# elpa packages
/elpa/
# reftex files
*.rel
# AUCTeX auto folder
/auto/
# cask packages
.cask/
# venv
venv/
### Prometheus ###
examples/prometheus/data
django-prometheus-2.0.0/.travis.yml
dist: xenial
language: python
python:
- "3.5"
- "3.6"
- "3.7"
- "3.8"
- "2.7"
services:
- memcached
- redis
- mysql
- postgresql
install:
- pip install tox-travis coverage coveralls codacy-coverage
- mysql -e 'create database django_prometheus_1;'
script:
- tox
after_success:
- coverage combine .coverage django_prometheus/tests/end2end/.coverage
- coveralls
- coverage xml
- python-codacy-coverage -r coverage.xml
before_deploy:
- git checkout $TRAVIS_BRANCH
- git fetch --unshallow
- python update_version_from_git.py
deploy:
- provider: pypi
distributions: sdist bdist_wheel
skip_cleanup: true
skip_existing: true
user: __token__
on:
branch:
- master
django-prometheus-2.0.0/CHANGELOG.md
# Changelog
## v2.0.0 - Jan 20, 2020
* Added support for newer Django and Python versions
* Added an extension point so that applications can add their own labels to middleware (request/response) metrics
* Allow overriding and setting custom bucket values for request/response latency histogram metric
* Internal improvements:
  * Use tox
  * Use pytest
  * Use Black
  * Automate pre-releases on every commit to master
  * Fix flaky tests
## v1.1.0 - Sep 28, 2019
* Maintenance release that updates this library to support recent and supported versions of Python & Django

django-prometheus-2.0.0/CONTRIBUTING.md
# Contributing
## Git
Feel free to send pull requests, even for the tiniest things. Watch
for Travis' opinion on them (https://travis-ci.org/korfuri/django-prometheus).
Travis will also make sure your code is pep8 compliant, and it's a
good idea to run flake8 as well (on django_prometheus/ and on
tests/). The code contains "unused" imports on purpose so flake8 isn't
run automatically.
## Tests
Please write unit tests for your change. There are two kinds of tests:
* Regular unit tests that test the code directly, without loading
Django. This is limited to pieces of the code that don't depend on
Django, since a lot of the Django code will require a full Django
environment (anything that interacts with models, for instance,
needs a full database configuration).
* End-to-end tests are Django unit tests in a test application. The
test application doubles as an easy way to interactively test your
changes. It uses most of the basic Django features and a few
advanced features, so you can test things for yourself.
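For example, a minimal end-to-end test could drive the exporter through the Django test client (a sketch; the test app serves the exporter at `/metrics`, as described below, and the exact assertions are up to you):
```python
from django.test import TestCase


class MetricsEndpointTest(TestCase):
    def test_exports_request_counters(self):
        response = self.client.get("/metrics")
        self.assertEqual(response.status_code, 200)
        # The middleware metrics are registered by the time the view runs.
        self.assertIn(
            b"django_http_requests_before_middlewares_total", response.content
        )
```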
### Running all tests
```shell
python setup.py test
cd tests/end2end/ && PYTHONPATH=../.. ./manage.py test
```
The former runs the regular unit tests; the latter runs the Django
unit tests.
To avoid setting PYTHONPATH every time, you can also run `python
setup.py install`.
### Running the test Django app
```shell
cd tests/end2end/ && PYTHONPATH=../.. ./manage.py runserver
```
By default, this will start serving on http://localhost:8000/. Metrics
are available at `/metrics`.
## Running Prometheus
See the Prometheus documentation (https://prometheus.io/) for instructions on installing
Prometheus. Once you have Prometheus installed, you can use the
example rules and dashboard in `examples/prometheus/`. See
`examples/prometheus/README.md` to run Prometheus and view the example
dashboard.
django-prometheus-2.0.0/LICENSE
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "{}"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright {yyyy} {name of copyright owner}
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
django-prometheus-2.0.0/MANIFEST.in
include LICENSE
include README.md
django-prometheus-2.0.0/README.md
# django-prometheus
Export Django monitoring metrics for Prometheus.io
[Gitter](https://gitter.im/django-prometheus/community?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
[PyPI version](http://badge.fury.io/py/django-prometheus)
[Build Status](https://travis-ci.org/korfuri/django-prometheus)
[Coverage Status](https://coveralls.io/github/korfuri/django-prometheus?branch=master)
[PyPI](https://pypi.python.org/pypi/django-prometheus)
[Code style: black](https://github.com/psf/black)
## Usage
### Requirements
* Django >= 1.11
### Installation
Install with:
```shell
pip install django-prometheus
```
Or, if you're using a development version cloned from this repository:
```shell
python path-to-where-you-cloned-django-prometheus/setup.py install
```
This will install [prometheus_client](https://github.com/prometheus/client_python) as a dependency.
### Quickstart
In your settings.py:
```python
INSTALLED_APPS = (
...
'django_prometheus',
...
)
MIDDLEWARE = (
'django_prometheus.middleware.PrometheusBeforeMiddleware',
# All your other middlewares go here, including the default
# middlewares like SessionMiddleware, CommonMiddleware,
# CsrfViewMiddleware, SecurityMiddleware, etc.
'django_prometheus.middleware.PrometheusAfterMiddleware',
)
```
In your urls.py:
```python
urlpatterns = [
...
url('', include('django_prometheus.urls')),
]
```
### Configuration
Prometheus uses histogram-based grouping for monitoring latencies. The default
buckets are defined here: https://github.com/prometheus/client_python/blob/master/prometheus_client/core.py
You can define custom buckets for latency; adding more buckets decreases performance but
increases accuracy: https://prometheus.io/docs/practices/histograms/
```python
PROMETHEUS_LATENCY_BUCKETS = (.1, .2, .5, .6, .8, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.5, 9.0, 12.0, 15.0, 20.0, 30.0, float("inf"))
```
### Monitoring your databases
SQLite, MySQL, and PostgreSQL databases can be monitored. Just
update the `ENGINE` property of your database, replacing
`django.db.backends` with `django_prometheus.db.backends`.
```python
DATABASES = {
'default': {
'ENGINE': 'django_prometheus.db.backends.sqlite3',
'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
},
}
```
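The other supported engines follow the same pattern. For example, a PostgreSQL configuration might look like the following sketch (the database name, credentials, host, and port are placeholders for your own setup):
```python
DATABASES = {
    'default': {
        'ENGINE': 'django_prometheus.db.backends.postgresql',
        'NAME': 'mydb',
        'USER': 'postgres',
        'PASSWORD': '',
        'HOST': 'localhost',
        'PORT': '5432',
    },
}
```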
### Monitoring your caches
File-based, memcached, and Redis caches can be monitored. Just replace
the cache `BACKEND` with the one provided by django_prometheus,
substituting `django.core.cache.backends` with `django_prometheus.cache.backends`.
```python
CACHES = {
'default': {
'BACKEND': 'django_prometheus.cache.backends.filebased.FileBasedCache',
'LOCATION': '/var/tmp/django_cache',
}
}
```
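The Redis backend (which wraps django-redis, so that package must be installed) follows the same pattern. A sketch, where the `LOCATION` is a placeholder for your own Redis instance:
```python
CACHES = {
    'default': {
        'BACKEND': 'django_prometheus.cache.backends.redis.RedisCache',
        'LOCATION': 'redis://127.0.0.1:6379/1',
    }
}
```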
### Monitoring your models
You may want to monitor the creation/deletion/update rate of your
models. This can be done by adding a mixin to them. This is safe to do
on existing models (it does not require a migration).
If your model is:
```python
class Dog(models.Model):
name = models.CharField(max_length=100, unique=True)
breed = models.CharField(max_length=100, blank=True, null=True)
age = models.PositiveIntegerField(blank=True, null=True)
```
Just add the `ExportModelOperationsMixin` as such:
```python
from django_prometheus.models import ExportModelOperationsMixin
class Dog(ExportModelOperationsMixin('dog'), models.Model):
name = models.CharField(max_length=100, unique=True)
breed = models.CharField(max_length=100, blank=True, null=True)
age = models.PositiveIntegerField(blank=True, null=True)
```
This will export 3 metrics, `django_model_inserts_total{model="dog"}`,
`django_model_updates_total{model="dog"}` and
`django_model_deletes_total{model="dog"}`.
Note that the exported metrics are counters of creations,
modifications and deletions done in the current process. They are not
gauges of the number of objects in the model.
Starting with Django 1.7, migrations are also monitored. Two gauges
are exported, `django_migrations_applied_total` and
`django_migrations_unapplied_total`, labelled by database connection.
You may want to alert if there are unapplied migrations.
If you want to disable the Django migration metrics, set the
`PROMETHEUS_EXPORT_MIGRATIONS` setting to False.
### Monitoring and aggregating the metrics
Prometheus is quite easy to set up. An example prometheus.conf to
scrape `127.0.0.1:8001` can be found in `examples/prometheus`.
Here's an example of a PromDash displaying some of the metrics
collected by django-prometheus:

## Adding your own metrics
You can add application-level metrics in your code by using
[prometheus_client](https://github.com/prometheus/client_python)
directly. The exporter is global and will pick up your metrics.
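For example, a minimal sketch (the metric name, its help text, and the view are ours, not part of django-prometheus):
```python
from django.http import HttpResponse
from prometheus_client import Counter

# Registered once at import time; the global exporter exposes it
# alongside the django-prometheus metrics.
greetings_total = Counter("myapp_greetings_total", "Number of greetings served.")


def greet(request):
    greetings_total.inc()
    return HttpResponse("Hello!")
```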
To add metrics to the Django internals, the easiest way is to extend
django-prometheus' classes. Please consider contributing your metrics;
pull requests are welcome. Make sure to read the Prometheus best
practices on
[instrumentation](http://prometheus.io/docs/practices/instrumentation/)
and [naming](http://prometheus.io/docs/practices/naming/).
## Importing Django Prometheus using only local settings
If you wish to use Django Prometheus but are not able to change
the code base, it's possible to have all the default metrics by
modifying only the settings.
The first step is to inject the prometheus middlewares and to add
django_prometheus to INSTALLED_APPS:
```python
MIDDLEWARE = (
('django_prometheus.middleware.PrometheusBeforeMiddleware',) +
MIDDLEWARE +
('django_prometheus.middleware.PrometheusAfterMiddleware',)
)
INSTALLED_APPS = INSTALLED_APPS + ('django_prometheus',)
```
The second step is to create the /metrics endpoint. For that, we need
another file (called urls_prometheus_wrapper.py in this example) that
wraps the app's URLs and adds one on top:
```python
from django.conf.urls import include, url
urlpatterns = []
urlpatterns.append(url('^prometheus/', include('django_prometheus.urls')))
urlpatterns.append(url('', include('myapp.urls')))
```
This file adds a "/prometheus/metrics" endpoint to Django's URLs, which will
export the metrics (replace `myapp` with your project name).
Then we inject the wrapper in settings:
```python
ROOT_URLCONF = "myapp.urls_prometheus_wrapper"
```
## Adding custom labels to middleware (request/response) metrics
You can add application-specific labels to the metrics reported by the django-prometheus middleware.
This involves extending the classes defined in middleware.py.
* Extend the `Metrics` class and override the `register_metric` method to add the application-specific labels.
* Extend the middleware classes, set the `metrics_cls` class attribute to the extended metrics class, and override the `label_metric` method to attach the custom labels, as sketched below.
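A minimal sketch, assuming you want to add a `tenant` label taken from a request header (both the label and the `X-Tenant` header are illustrative, not part of the library):
```python
from django_prometheus.middleware import (
    Metrics,
    PrometheusAfterMiddleware,
    PrometheusBeforeMiddleware,
)

# Only these metrics get the extra label; unlabelled metrics must stay unlabelled.
EXTENDED_METRICS = [
    "django_http_requests_latency_seconds_by_view_method",
    "django_http_responses_total_by_status_view_method",
]


class CustomMetrics(Metrics):
    def register_metric(self, metric_cls, name, documentation, labelnames=tuple(), **kwargs):
        if name in EXTENDED_METRICS:
            labelnames = tuple(labelnames) + ("tenant",)
        return super(CustomMetrics, self).register_metric(
            metric_cls, name, documentation, labelnames=labelnames, **kwargs
        )


class AppMetricsBeforeMiddleware(PrometheusBeforeMiddleware):
    metrics_cls = CustomMetrics


class AppMetricsAfterMiddleware(PrometheusAfterMiddleware):
    metrics_cls = CustomMetrics

    def label_metric(self, metric, request, response=None, **labels):
        extended = (
            self.metrics.requests_latency_by_view_method,
            self.metrics.responses_by_status_view_method,
        )
        if any(metric is m for m in extended):
            # The header used here is purely illustrative.
            labels["tenant"] = request.META.get("HTTP_X_TENANT", "unknown")
        return super(AppMetricsAfterMiddleware, self).label_metric(
            metric, request, response=response, **labels
        )
```
Use both subclasses in `MIDDLEWARE` in place of the stock middlewares so that they share the same extended `Metrics` instance.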
See an implementation example in [the test app](django_prometheus/tests/end2end/testapp/test_middleware_custom_labels.py#L19-L46).

django-prometheus-2.0.0/django_prometheus/__init__.py
"""Django-Prometheus
https://github.com/korfuri/django-prometheus
"""
# Import all files that define metrics. This has the effect that
# `import django_prometheus` will always instantiate all metric
# objects right away.
from django_prometheus import middleware, models
__all__ = ["middleware", "models", "pip_prometheus"]
__version__ = "2.0.0"
# Import pip_prometheus to export the pip metrics automatically.
try:
import pip_prometheus
except ImportError:
# If people don't have pip, don't export anything.
pass
default_app_config = "django_prometheus.apps.DjangoPrometheusConfig"
django-prometheus-2.0.0/django_prometheus/apps.py
import django_prometheus
from django.apps import AppConfig
from django.conf import settings
from django_prometheus.exports import SetupPrometheusExportsFromConfig
from django_prometheus.migrations import ExportMigrations
class DjangoPrometheusConfig(AppConfig):
name = django_prometheus.__name__
verbose_name = "Django-Prometheus"
def ready(self):
"""Initializes the Prometheus exports if they are enabled in the config.
Note that this is called even for other management commands
than `runserver`. As such, it is possible to scrape the
metrics of a running `manage.py test` or of another command,
which shouldn't be done for real monitoring (since these jobs
are usually short-lived), but can be useful for debugging.
"""
SetupPrometheusExportsFromConfig()
if getattr(settings, "PROMETHEUS_EXPORT_MIGRATIONS", True):
ExportMigrations()
django-prometheus-2.0.0/django_prometheus/cache/__init__.py  (empty)
django-prometheus-2.0.0/django_prometheus/cache/backends/__init__.py  (empty)

django-prometheus-2.0.0/django_prometheus/cache/backends/django_memcached_consul.py
from __future__ import absolute_import
from django_memcached_consul import memcached
from django_prometheus.cache.metrics import (
django_cache_get_total,
django_cache_hits_total,
django_cache_misses_total,
)
class MemcachedCache(memcached.MemcachedCache):
"""Inherit django_memcached_consul to add metrics about hit/miss ratio"""
def get(self, key, default=None, version=None):
django_cache_get_total.labels(backend="django_memcached_consul").inc()
cached = super(MemcachedCache, self).get(key, default=None, version=version)
if cached is not None:
django_cache_hits_total.labels(backend="django_memcached_consul").inc()
else:
django_cache_misses_total.labels(backend="django_memcached_consul").inc()
        return cached if cached is not None else default
django-prometheus-2.0.0/django_prometheus/cache/backends/filebased.py
from django.core.cache.backends import filebased
from django_prometheus.cache.metrics import (
django_cache_get_total,
django_cache_hits_total,
django_cache_misses_total,
)
class FileBasedCache(filebased.FileBasedCache):
"""Inherit filebased cache to add metrics about hit/miss ratio"""
def get(self, key, default=None, version=None):
django_cache_get_total.labels(backend="filebased").inc()
cached = super(FileBasedCache, self).get(key, default=None, version=version)
if cached is not None:
django_cache_hits_total.labels(backend="filebased").inc()
else:
django_cache_misses_total.labels(backend="filebased").inc()
        return cached if cached is not None else default
django-prometheus-2.0.0/django_prometheus/cache/backends/locmem.py
from django.core.cache.backends import locmem
from django_prometheus.cache.metrics import (
django_cache_get_total,
django_cache_hits_total,
django_cache_misses_total,
)
class LocMemCache(locmem.LocMemCache):
"""Inherit filebased cache to add metrics about hit/miss ratio"""
def get(self, key, default=None, version=None):
django_cache_get_total.labels(backend="locmem").inc()
cached = super(LocMemCache, self).get(key, default=None, version=version)
if cached is not None:
django_cache_hits_total.labels(backend="locmem").inc()
else:
django_cache_misses_total.labels(backend="locmem").inc()
        return cached if cached is not None else default
django-prometheus-2.0.0/django_prometheus/cache/backends/memcached.py
from django.core.cache.backends import memcached
from django_prometheus.cache.metrics import (
django_cache_get_total,
django_cache_hits_total,
django_cache_misses_total,
)
class MemcachedCache(memcached.MemcachedCache):
"""Inherit memcached to add metrics about hit/miss ratio"""
def get(self, key, default=None, version=None):
django_cache_get_total.labels(backend="memcached").inc()
cached = super(MemcachedCache, self).get(key, default=None, version=version)
if cached is not None:
django_cache_hits_total.labels(backend="memcached").inc()
else:
django_cache_misses_total.labels(backend="memcached").inc()
        return cached if cached is not None else default
django-prometheus-2.0.0/django_prometheus/cache/backends/redis.py
from django_prometheus.cache.metrics import (
django_cache_get_fail_total,
django_cache_get_total,
django_cache_hits_total,
django_cache_misses_total,
)
from django_redis import cache, exceptions
class RedisCache(cache.RedisCache):
"""Inherit redis to add metrics about hit/miss/interruption ratio"""
@cache.omit_exception
def get(self, key, default=None, version=None, client=None):
try:
django_cache_get_total.labels(backend="redis").inc()
cached = self.client.get(key, default=None, version=version, client=client)
except exceptions.ConnectionInterrupted as e:
django_cache_get_fail_total.labels(backend="redis").inc()
if cache.DJANGO_REDIS_IGNORE_EXCEPTIONS or self._ignore_exceptions:
if cache.DJANGO_REDIS_LOG_IGNORED_EXCEPTIONS:
cache.logger.error(str(e))
return default
raise
else:
if cached is not None:
django_cache_hits_total.labels(backend="redis").inc()
return cached
else:
django_cache_misses_total.labels(backend="redis").inc()
return default
django-prometheus-2.0.0/django_prometheus/cache/metrics.py
from prometheus_client import Counter
django_cache_get_total = Counter(
"django_cache_get_total", "Total get requests on cache", ["backend"]
)
django_cache_hits_total = Counter(
"django_cache_get_hits_total", "Total hits on cache", ["backend"]
)
django_cache_misses_total = Counter(
"django_cache_get_misses_total", "Total misses on cache", ["backend"]
)
django_cache_get_fail_total = Counter(
"django_cache_get_fail_total", "Total get request failures by cache", ["backend"]
)
django-prometheus-2.0.0/django_prometheus/db/__init__.py
# Import all metrics
from django_prometheus.db.metrics import (
Counter,
connection_errors_total,
connections_total,
errors_total,
execute_many_total,
execute_total,
)
__all__ = [
"Counter",
"connection_errors_total",
"connections_total",
"errors_total",
"execute_many_total",
"execute_total",
]
django-prometheus-2.0.0/django_prometheus/db/backends/README.md
# Adding new database wrapper types
Unfortunately, I don't have the resources to create wrappers for all
database vendors. Doing so should be straightforward, but testing that
it works and maintaining it is a lot of busywork, or is impossible for
me for commercial databases.
This document should be enough for people who wish to implement a new
database wrapper.
## Structure
A database engine in Django requires 3 classes (it really requires 2,
but the 3rd one is required for our purposes):
* A DatabaseFeatures class, which describes what features the database
supports. For our usage, we can simply extend the existing
DatabaseFeatures class without any changes.
* A DatabaseWrapper class, which abstracts the interface to the
database.
* A CursorWrapper class, which abstracts the interface to a cursor. A
cursor is the object that can execute SQL statements via an open
connection.
An easy example can be found in the sqlite3 module. Here are a few tips:
* The `self.alias` and `self.vendor` properties are present in all
DatabaseWrappers.
* The CursorWrapper doesn't have access to the alias and vendor, so we
generate the class in a function that accepts them as arguments.
* Most methods you overload should just increment a counter, forward
all arguments to the original method and return the
result. `execute` and `execute_many` should also wrap the call to
the parent method in a `try...except` block to increment the
`errors_total` counter as appropriate.
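As a sketch, a wrapper for a hypothetical `acmedb` vendor (the vendor module is made up; the mixin and cursor factory are the real helpers from `django_prometheus.db.common`) could look like this, mirroring the mysql backend:
```python
from django.db.backends.acmedb import base  # hypothetical vendor backend

from django_prometheus.db.common import DatabaseWrapperMixin, ExportingCursorWrapper


class DatabaseFeatures(base.DatabaseFeatures):
    """Our database has the exact same features as the base one."""

    pass


class DatabaseWrapper(DatabaseWrapperMixin, base.DatabaseWrapper):
    # Cursor class that ExportingCursorWrapper extends with counters.
    CURSOR_CLASS = base.CursorWrapper

    def create_cursor(self, name=None):
        cursor = self.connection.cursor()
        CursorWrapper = ExportingCursorWrapper(
            self.CURSOR_CLASS, self.alias, self.vendor
        )
        return CursorWrapper(cursor)
```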
django-prometheus-2.0.0/django_prometheus/db/backends/__init__.py  (empty)
django-prometheus-2.0.0/django_prometheus/db/backends/mysql/__init__.py  (empty)

django-prometheus-2.0.0/django_prometheus/db/backends/mysql/base.py
from django.db.backends.mysql import base
from django_prometheus.db.common import DatabaseWrapperMixin, ExportingCursorWrapper
class DatabaseFeatures(base.DatabaseFeatures):
"""Our database has the exact same features as the base one."""
pass
class DatabaseWrapper(DatabaseWrapperMixin, base.DatabaseWrapper):
CURSOR_CLASS = base.CursorWrapper
def create_cursor(self, name=None):
cursor = self.connection.cursor()
CursorWrapper = ExportingCursorWrapper(
self.CURSOR_CLASS, self.alias, self.vendor
)
return CursorWrapper(cursor)
django-prometheus-2.0.0/django_prometheus/db/backends/postgresql/__init__.py  (empty)

django-prometheus-2.0.0/django_prometheus/db/backends/postgresql/base.py
import psycopg2.extensions
from django.db.backends.postgresql import base
from django_prometheus.db.common import DatabaseWrapperMixin, ExportingCursorWrapper
class DatabaseFeatures(base.DatabaseFeatures):
"""Our database has the exact same features as the base one."""
pass
class DatabaseWrapper(DatabaseWrapperMixin, base.DatabaseWrapper):
def get_connection_params(self):
conn_params = super(DatabaseWrapper, self).get_connection_params()
conn_params["cursor_factory"] = ExportingCursorWrapper(
psycopg2.extensions.cursor, self.alias, self.vendor
)
return conn_params
def create_cursor(self, name=None):
# cursor_factory is a kwarg to connect() so restore create_cursor()'s
# default behavior
return base.DatabaseWrapper.create_cursor(self, name=name)
django-prometheus-2.0.0/django_prometheus/db/backends/sqlite3/__init__.py  (empty)

django-prometheus-2.0.0/django_prometheus/db/backends/sqlite3/base.py
from django.db.backends.sqlite3 import base
from django_prometheus.db.common import DatabaseWrapperMixin
class DatabaseFeatures(base.DatabaseFeatures):
"""Our database has the exact same features as the base one."""
pass
class DatabaseWrapper(DatabaseWrapperMixin, base.DatabaseWrapper):
CURSOR_CLASS = base.SQLiteCursorWrapper
django-prometheus-2.0.0/django_prometheus/db/common.py
from django_prometheus.db import (
connection_errors_total,
connections_total,
errors_total,
execute_many_total,
execute_total,
)
class ExceptionCounterByType(object):
"""A context manager that counts exceptions by type.
Exceptions increment the provided counter, whose last label's name
must match the `type_label` argument.
In other words:
c = Counter('http_request_exceptions_total', 'Counter of exceptions',
['method', 'type'])
with ExceptionCounterByType(c, extra_labels={'method': 'GET'}):
handle_get_request()
"""
def __init__(self, counter, type_label="type", extra_labels=None):
self._counter = counter
self._type_label = type_label
        self._labels = extra_labels or {}
def __enter__(self):
pass
def __exit__(self, typ, value, traceback):
if typ is not None:
self._labels.update({self._type_label: typ.__name__})
self._counter.labels(**self._labels).inc()
class DatabaseWrapperMixin(object):
"""Extends the DatabaseWrapper to count connections and cursors."""
def get_new_connection(self, *args, **kwargs):
connections_total.labels(self.alias, self.vendor).inc()
try:
return super(DatabaseWrapperMixin, self).get_new_connection(*args, **kwargs)
except Exception:
connection_errors_total.labels(self.alias, self.vendor).inc()
raise
def create_cursor(self, name=None):
return self.connection.cursor(
factory=ExportingCursorWrapper(self.CURSOR_CLASS, self.alias, self.vendor)
)
def ExportingCursorWrapper(cursor_class, alias, vendor):
"""Returns a CursorWrapper class that knows its database's alias and
vendor name.
"""
class CursorWrapper(cursor_class):
"""Extends the base CursorWrapper to count events."""
def execute(self, *args, **kwargs):
execute_total.labels(alias, vendor).inc()
with ExceptionCounterByType(
errors_total, extra_labels={"alias": alias, "vendor": vendor}
):
return super(CursorWrapper, self).execute(*args, **kwargs)
def executemany(self, query, param_list, *args, **kwargs):
execute_total.labels(alias, vendor).inc(len(param_list))
execute_many_total.labels(alias, vendor).inc(len(param_list))
with ExceptionCounterByType(
errors_total, extra_labels={"alias": alias, "vendor": vendor}
):
return super(CursorWrapper, self).executemany(
query, param_list, *args, **kwargs
)
return CursorWrapper
django-prometheus-2.0.0/django_prometheus/db/metrics.py
from prometheus_client import Counter
connections_total = Counter(
"django_db_new_connections_total",
"Counter of created connections by database and by vendor.",
["alias", "vendor"],
)
connection_errors_total = Counter(
"django_db_new_connection_errors_total",
"Counter of connection failures by database and by vendor.",
["alias", "vendor"],
)
execute_total = Counter(
"django_db_execute_total",
(
"Counter of executed statements by database and by vendor, including"
" bulk executions."
),
["alias", "vendor"],
)
execute_many_total = Counter(
"django_db_execute_many_total",
("Counter of executed statements in bulk operations by database and" " by vendor."),
["alias", "vendor"],
)
errors_total = Counter(
"django_db_errors_total",
("Counter of execution errors by database, vendor and exception type."),
["alias", "vendor", "type"],
)
django-prometheus-2.0.0/django_prometheus/exports.py
import logging
import os
import socket
import threading
import prometheus_client
from prometheus_client import multiprocess
from django.conf import settings
from django.http import HttpResponse
try:
# Python 2
from BaseHTTPServer import HTTPServer
except ImportError:
# Python 3
from http.server import HTTPServer
logger = logging.getLogger(__name__)
def SetupPrometheusEndpointOnPort(port, addr=""):
"""Exports Prometheus metrics on an HTTPServer running in its own thread.
    The server runs on the given port and is by default listening on
all interfaces. This HTTPServer is fully independent of Django and
its stack. This offers the advantage that even if Django becomes
unable to respond, the HTTPServer will continue to function and
export metrics. However, this also means that the features
offered by Django (like middlewares or WSGI) can't be used.
Now here's the really weird part. When Django runs with the
auto-reloader enabled (which is the default, you can disable it
with `manage.py runserver --noreload`), it forks and executes
manage.py twice. That's wasteful but usually OK. It starts being a
problem when you try to open a port, like we do. We can detect
that we're running under an autoreloader through the presence of
the RUN_MAIN environment variable, so we abort if we're trying to
export under an autoreloader and trying to open a port.
"""
assert os.environ.get("RUN_MAIN") != "true", (
"The thread-based exporter can't be safely used when django's "
"autoreloader is active. Use the URL exporter, or start django "
"with --noreload. See documentation/exports.md."
)
prometheus_client.start_http_server(port, addr=addr)
class PrometheusEndpointServer(threading.Thread):
"""A thread class that holds an http and makes it serve_forever()."""
def __init__(self, httpd, *args, **kwargs):
self.httpd = httpd
super(PrometheusEndpointServer, self).__init__(*args, **kwargs)
def run(self):
self.httpd.serve_forever()
def SetupPrometheusEndpointOnPortRange(port_range, addr=""):
"""Like SetupPrometheusEndpointOnPort, but tries several ports.
This is useful when you're running Django as a WSGI application
with multiple processes and you want Prometheus to discover all
workers. Each worker will grab a port and you can use Prometheus
to aggregate across workers.
port_range may be any iterable object that contains a list of
ports. Typically this would be a `range` of contiguous ports.
As soon as one port is found that can serve, use this one and stop
trying.
Returns the port chosen (an `int`), or `None` if no port in the
supplied range was available.
The same caveats regarding autoreload apply. Do not use this when
Django's autoreloader is active.
"""
assert os.environ.get("RUN_MAIN") != "true", (
"The thread-based exporter can't be safely used when django's "
"autoreloader is active. Use the URL exporter, or start django "
"with --noreload. See documentation/exports.md."
)
for port in port_range:
try:
httpd = HTTPServer((addr, port), prometheus_client.MetricsHandler)
except (OSError, socket.error):
# Python 2 raises socket.error, in Python 3 socket.error is an
# alias for OSError
continue # Try next port
thread = PrometheusEndpointServer(httpd)
thread.daemon = True
thread.start()
logger.info("Exporting Prometheus /metrics/ on port %s" % port)
return port # Stop trying ports at this point
logger.warning(
"Cannot export Prometheus /metrics/ - " "no available ports in supplied range"
)
return None
def SetupPrometheusExportsFromConfig():
"""Exports metrics so Prometheus can collect them."""
port = getattr(settings, "PROMETHEUS_METRICS_EXPORT_PORT", None)
port_range = getattr(settings, "PROMETHEUS_METRICS_EXPORT_PORT_RANGE", None)
addr = getattr(settings, "PROMETHEUS_METRICS_EXPORT_ADDRESS", "")
if port_range:
SetupPrometheusEndpointOnPortRange(port_range, addr)
elif port:
SetupPrometheusEndpointOnPort(port, addr)
def ExportToDjangoView(request):
"""Exports /metrics as a Django view.
You can use django_prometheus.urls to map /metrics to this view.
"""
if "prometheus_multiproc_dir" in os.environ:
registry = prometheus_client.CollectorRegistry()
multiprocess.MultiProcessCollector(registry)
else:
registry = prometheus_client.REGISTRY
metrics_page = prometheus_client.generate_latest(registry)
return HttpResponse(
metrics_page, content_type=prometheus_client.CONTENT_TYPE_LATEST
)
django-prometheus-2.0.0/django_prometheus/middleware.py
from prometheus_client import Counter, Histogram
from django.conf import settings
from django.utils.deprecation import MiddlewareMixin
from django_prometheus.utils import PowersOf, Time, TimeSince
DEFAULT_LATENCY_BUCKETS = (
0.01,
0.025,
0.05,
0.075,
0.1,
0.25,
0.5,
0.75,
1.0,
2.5,
5.0,
7.5,
10.0,
25.0,
50.0,
75.0,
float("inf"),
)
class Metrics(object):
_instance = None
@classmethod
def get_instance(cls):
if not cls._instance:
cls._instance = cls()
return cls._instance
def register_metric(
self, metric_cls, name, documentation, labelnames=tuple(), **kwargs
):
return metric_cls(name, documentation, labelnames=labelnames, **kwargs)
def __init__(self, *args, **kwargs):
self.register()
def register(self):
self.requests_total = self.register_metric(
Counter,
"django_http_requests_before_middlewares_total",
"Total count of requests before middlewares run.",
)
self.responses_total = self.register_metric(
Counter,
"django_http_responses_before_middlewares_total",
"Total count of responses before middlewares run.",
)
self.requests_latency_before = self.register_metric(
Histogram,
"django_http_requests_latency_including_middlewares_seconds",
(
"Histogram of requests processing time (including middleware "
"processing time)."
),
)
self.requests_unknown_latency_before = self.register_metric(
Counter,
"django_http_requests_unknown_latency_including_middlewares_total",
(
"Count of requests for which the latency was unknown (when computing "
"django_http_requests_latency_including_middlewares_seconds)."
),
)
self.requests_latency_by_view_method = self.register_metric(
Histogram,
"django_http_requests_latency_seconds_by_view_method",
"Histogram of request processing time labelled by view.",
["view", "method"],
buckets=getattr(
settings, "PROMETHEUS_LATENCY_BUCKETS", DEFAULT_LATENCY_BUCKETS
),
)
self.requests_unknown_latency = self.register_metric(
Counter,
"django_http_requests_unknown_latency_total",
"Count of requests for which the latency was unknown.",
)
# Set in process_request
self.requests_ajax = self.register_metric(
Counter, "django_http_ajax_requests_total", "Count of AJAX requests."
)
self.requests_by_method = self.register_metric(
Counter,
"django_http_requests_total_by_method",
"Count of requests by method.",
["method"],
)
self.requests_by_transport = self.register_metric(
Counter,
"django_http_requests_total_by_transport",
"Count of requests by transport.",
["transport"],
)
# Set in process_view
self.requests_by_view_transport_method = self.register_metric(
Counter,
"django_http_requests_total_by_view_transport_method",
"Count of requests by view, transport, method.",
["view", "transport", "method"],
)
self.requests_body_bytes = self.register_metric(
Histogram,
"django_http_requests_body_total_bytes",
"Histogram of requests by body size.",
buckets=PowersOf(2, 30),
)
# Set in process_template_response
self.responses_by_templatename = self.register_metric(
Counter,
"django_http_responses_total_by_templatename",
"Count of responses by template name.",
["templatename"],
)
# Set in process_response
self.responses_by_status = self.register_metric(
Counter,
"django_http_responses_total_by_status",
"Count of responses by status.",
["status"],
)
self.responses_by_status_view_method = self.register_metric(
Counter,
"django_http_responses_total_by_status_view_method",
"Count of responses by status, view, method.",
["status", "view", "method"],
)
self.responses_body_bytes = self.register_metric(
Histogram,
"django_http_responses_body_total_bytes",
"Histogram of responses by body size.",
buckets=PowersOf(2, 30),
)
self.responses_by_charset = self.register_metric(
Counter,
"django_http_responses_total_by_charset",
"Count of responses by charset.",
["charset"],
)
self.responses_streaming = self.register_metric(
Counter,
"django_http_responses_streaming_total",
"Count of streaming responses.",
)
# Set in process_exception
self.exceptions_by_type = self.register_metric(
Counter,
"django_http_exceptions_total_by_type",
"Count of exceptions by object type.",
["type"],
)
self.exceptions_by_view = self.register_metric(
Counter,
"django_http_exceptions_total_by_view",
"Count of exceptions by view.",
["view"],
)
class PrometheusBeforeMiddleware(MiddlewareMixin):
"""Monitoring middleware that should run before other middlewares."""
metrics_cls = Metrics
def __init__(self, get_response=None):
super(PrometheusBeforeMiddleware, self).__init__(get_response)
self.metrics = self.metrics_cls.get_instance()
def process_request(self, request):
self.metrics.requests_total.inc()
request.prometheus_before_middleware_event = Time()
def process_response(self, request, response):
self.metrics.responses_total.inc()
if hasattr(request, "prometheus_before_middleware_event"):
self.metrics.requests_latency_before.observe(
TimeSince(request.prometheus_before_middleware_event)
)
else:
self.metrics.requests_unknown_latency_before.inc()
return response
class PrometheusAfterMiddleware(MiddlewareMixin):
"""Monitoring middleware that should run after other middlewares."""
metrics_cls = Metrics
def __init__(self, get_response=None):
super(PrometheusAfterMiddleware, self).__init__(get_response)
self.metrics = self.metrics_cls.get_instance()
def _transport(self, request):
return "https" if request.is_secure() else "http"
def _method(self, request):
m = request.method
if m not in (
"GET",
"HEAD",
"POST",
"PUT",
"DELETE",
"TRACE",
"OPTIONS",
"CONNECT",
"PATCH",
):
return ""
return m
def label_metric(self, metric, request, response=None, **labels):
return metric.labels(**labels) if labels else metric
def process_request(self, request):
transport = self._transport(request)
method = self._method(request)
self.label_metric(self.metrics.requests_by_method, request, method=method).inc()
self.label_metric(
self.metrics.requests_by_transport, request, transport=transport
).inc()
if request.is_ajax():
self.label_metric(self.metrics.requests_ajax, request).inc()
content_length = int(request.META.get("CONTENT_LENGTH") or 0)
self.label_metric(self.metrics.requests_body_bytes, request).observe(
content_length
)
request.prometheus_after_middleware_event = Time()
def _get_view_name(self, request):
view_name = ""
if hasattr(request, "resolver_match"):
if request.resolver_match is not None:
if request.resolver_match.view_name is not None:
view_name = request.resolver_match.view_name
return view_name
def process_view(self, request, view_func, *view_args, **view_kwargs):
transport = self._transport(request)
method = self._method(request)
if hasattr(request, "resolver_match"):
name = request.resolver_match.view_name or ""
self.label_metric(
self.metrics.requests_by_view_transport_method,
request,
view=name,
transport=transport,
method=method,
).inc()
def process_template_response(self, request, response):
if hasattr(response, "template_name"):
self.label_metric(
self.metrics.responses_by_templatename,
request,
response=response,
templatename=str(response.template_name),
).inc()
return response
def process_response(self, request, response):
method = self._method(request)
name = self._get_view_name(request)
status = str(response.status_code)
self.label_metric(
self.metrics.responses_by_status, request, response, status=status
).inc()
self.label_metric(
self.metrics.responses_by_status_view_method,
request,
response,
status=status,
view=name,
method=method,
).inc()
if hasattr(response, "charset"):
self.label_metric(
self.metrics.responses_by_charset,
request,
response,
charset=str(response.charset),
).inc()
if hasattr(response, "streaming") and response.streaming:
self.label_metric(self.metrics.responses_streaming, request, response).inc()
if hasattr(response, "content"):
self.label_metric(
self.metrics.responses_body_bytes, request, response
).observe(len(response.content))
if hasattr(request, "prometheus_after_middleware_event"):
self.label_metric(
self.metrics.requests_latency_by_view_method,
request,
response,
view=self._get_view_name(request),
method=request.method,
).observe(TimeSince(request.prometheus_after_middleware_event))
else:
self.label_metric(
self.metrics.requests_unknown_latency, request, response
).inc()
return response
def process_exception(self, request, exception):
self.label_metric(
self.metrics.exceptions_by_type, request, type=type(exception).__name__
).inc()
if hasattr(request, "resolver_match"):
name = request.resolver_match.view_name or ""
self.label_metric(self.metrics.exceptions_by_view, request, view=name).inc()
if hasattr(request, "prometheus_after_middleware_event"):
self.label_metric(
self.metrics.requests_latency_by_view_method,
request,
view=self._get_view_name(request),
method=request.method,
).observe(TimeSince(request.prometheus_after_middleware_event))
else:
self.label_metric(self.metrics.requests_unknown_latency, request).inc()
django-prometheus-2.0.0/django_prometheus/migrations.py
from prometheus_client import Gauge
from django.db import connections
from django.db.backends.dummy.base import DatabaseWrapper
unapplied_migrations = Gauge(
"django_migrations_unapplied_total",
"Count of unapplied migrations by database connection",
["connection"],
)
applied_migrations = Gauge(
"django_migrations_applied_total",
"Count of applied migrations by database connection",
["connection"],
)
def ExportMigrationsForDatabase(alias, executor):
plan = executor.migration_plan(executor.loader.graph.leaf_nodes())
unapplied_migrations.labels(alias).set(len(plan))
applied_migrations.labels(alias).set(len(executor.loader.applied_migrations))
def ExportMigrations():
"""Exports counts of unapplied migrations.
This is meant to be called during app startup, ideally by
django_prometheus.apps.AppConfig.
"""
# Import MigrationExecutor lazily. MigrationExecutor checks at
# import time that the apps are ready, and they are not when
# django_prometheus is imported. ExportMigrations() should be
# called in AppConfig.ready(), which signals that all apps are
# ready.
from django.db.migrations.executor import MigrationExecutor
if "default" in connections and (
isinstance(connections["default"], DatabaseWrapper)
):
# This is the case where DATABASES = {} in the configuration,
# i.e. the user is not using any databases. Django "helpfully"
# adds a dummy database and then throws when you try to
# actually use it. So we don't do anything, because trying to
# export stats would crash the app on startup.
return
for alias in connections.databases:
executor = MigrationExecutor(connections[alias])
ExportMigrationsForDatabase(alias, executor)
django-prometheus-2.0.0/django_prometheus/models.py
from prometheus_client import Counter
model_inserts = Counter(
"django_model_inserts_total", "Number of insert operations by model.", ["model"]
)
model_updates = Counter(
"django_model_updates_total", "Number of update operations by model.", ["model"]
)
model_deletes = Counter(
"django_model_deletes_total", "Number of delete operations by model.", ["model"]
)
def ExportModelOperationsMixin(model_name):
"""Returns a mixin for models to export counters for lifecycle operations.
Usage:
class User(ExportModelOperationsMixin('user'), Model):
...
"""
# Force create the labels for this model in the counters. This
# is not necessary but it avoids gaps in the aggregated data.
model_inserts.labels(model_name)
model_updates.labels(model_name)
model_deletes.labels(model_name)
class Mixin(object):
def _do_insert(self, *args, **kwargs):
model_inserts.labels(model_name).inc()
return super(Mixin, self)._do_insert(*args, **kwargs)
def _do_update(self, *args, **kwargs):
model_updates.labels(model_name).inc()
return super(Mixin, self)._do_update(*args, **kwargs)
def delete(self, *args, **kwargs):
model_deletes.labels(model_name).inc()
return super(Mixin, self).delete(*args, **kwargs)
return Mixin
django-prometheus-2.0.0/django_prometheus/tests/__init__.py  (empty)

django-prometheus-2.0.0/django_prometheus/tests/end2end/manage.py
#!/usr/bin/env python
import os
import sys
if __name__ == "__main__":
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "testapp.settings")
from django.core.management import execute_from_command_line
execute_from_command_line(sys.argv)
django-prometheus-2.0.0/django_prometheus/tests/end2end/testapp/__init__.py  (empty)

django-prometheus-2.0.0/django_prometheus/tests/end2end/testapp/helpers.py
DJANGO_MIDDLEWARES = [
"django.contrib.sessions.middleware.SessionMiddleware",
"django.middleware.common.CommonMiddleware",
"django.middleware.csrf.CsrfViewMiddleware",
"django.contrib.auth.middleware.AuthenticationMiddleware",
"django.contrib.messages.middleware.MessageMiddleware",
"django.middleware.clickjacking.XFrameOptionsMiddleware",
"django.middleware.security.SecurityMiddleware",
]
def get_middleware(before, after):
middleware = [before]
middleware.extend(DJANGO_MIDDLEWARES)
middleware.append(after)
return middleware
django-prometheus-2.0.0/django_prometheus/tests/end2end/testapp/models.py 0000664 0000000 0000000 00000000702 13611433254 0026660 0 ustar 00root root 0000000 0000000 from django.db.models import CharField, Model, PositiveIntegerField
from django_prometheus.models import ExportModelOperationsMixin
class Dog(ExportModelOperationsMixin("dog"), Model):
name = CharField(max_length=100, unique=True)
breed = CharField(max_length=100, blank=True, null=True)
age = PositiveIntegerField(blank=True, null=True)
class Lawn(ExportModelOperationsMixin("lawn"), Model):
location = CharField(max_length=100)
django-prometheus-2.0.0/django_prometheus/tests/end2end/testapp/settings.py 0000664 0000000 0000000 00000007526 13611433254 0027250 0 ustar 00root root 0000000 0000000 import os
import tempfile
from testapp.helpers import get_middleware
# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = ")0-t%mc5y1^fn8e7i**^^v166@5iu(&-2%9#kxud0&4ap#k!_k"
DEBUG = True
ALLOWED_HOSTS = []
# Application definition
INSTALLED_APPS = (
"django.contrib.admin",
"django.contrib.auth",
"django.contrib.contenttypes",
"django.contrib.sessions",
"django.contrib.messages",
"django.contrib.staticfiles",
"django_prometheus",
"testapp",
)
MIDDLEWARE = get_middleware(
"django_prometheus.middleware.PrometheusBeforeMiddleware",
"django_prometheus.middleware.PrometheusAfterMiddleware",
)
ROOT_URLCONF = "testapp.urls"
TEMPLATES = [
{
"BACKEND": "django.template.backends.django.DjangoTemplates",
"DIRS": [],
"APP_DIRS": True,
"OPTIONS": {
"context_processors": [
"django.template.context_processors.debug",
"django.template.context_processors.request",
"django.contrib.auth.context_processors.auth",
"django.contrib.messages.context_processors.messages",
]
},
}
]
WSGI_APPLICATION = "testapp.wsgi.application"
DATABASES = {
"default": {
"ENGINE": "django_prometheus.db.backends.sqlite3",
"NAME": "db.sqlite3",
},
# Comment this to not test django_prometheus.db.backends.postgres.
"postgresql": {
"ENGINE": "django_prometheus.db.backends.postgresql",
"NAME": "postgres",
"USER": "postgres",
"PASSWORD": "",
"HOST": "localhost",
"PORT": "5432",
},
# Comment this to not test django_prometheus.db.backends.mysql.
"mysql": {
"ENGINE": "django_prometheus.db.backends.mysql",
"NAME": "django_prometheus_1",
"USER": "travis",
"PASSWORD": "",
"HOST": "localhost",
"PORT": "3306",
},
# The following databases are used by test_db.py only
"test_db_1": {
"ENGINE": "django_prometheus.db.backends.sqlite3",
"NAME": "test_db_1.sqlite3",
},
"test_db_2": {
"ENGINE": "django_prometheus.db.backends.sqlite3",
"NAME": "test_db_2.sqlite3",
},
}
# Caches
_tmp_cache_dir = tempfile.mkdtemp()
CACHES = {
"default": {
"BACKEND": "django_prometheus.cache.backends.memcached.MemcachedCache",
"LOCATION": "localhost:11211",
},
"memcached": {
"BACKEND": "django_prometheus.cache.backends.memcached.MemcachedCache",
"LOCATION": "localhost:11211",
},
"filebased": {
"BACKEND": "django_prometheus.cache.backends.filebased.FileBasedCache",
"LOCATION": os.path.join(_tmp_cache_dir, "django_cache"),
},
"locmem": {
"BACKEND": "django_prometheus.cache.backends.locmem.LocMemCache",
"LOCATION": os.path.join(_tmp_cache_dir, "locmem_cache"),
},
"redis": {
"BACKEND": "django_prometheus.cache.backends.redis.RedisCache",
"LOCATION": "redis://127.0.0.1:6379/1",
},
# Fake redis config emulating a stopped service
"stopped_redis": {
"BACKEND": "django_prometheus.cache.backends.redis.RedisCache",
"LOCATION": "redis://127.0.0.1:6666/1",
},
"stopped_redis_ignore_exception": {
"BACKEND": "django_prometheus.cache.backends.redis.RedisCache",
"LOCATION": "redis://127.0.0.1:6666/1",
"OPTIONS": {"IGNORE_EXCEPTIONS": True},
},
}
# Internationalization
LANGUAGE_CODE = "en-us"
TIME_ZONE = "UTC"
USE_I18N = True
USE_L10N = True
USE_TZ = True
# Static files (CSS, JavaScript, Images)
STATIC_URL = "/static/"
LOGGING = {
"version": 1,
"disable_existing_loggers": False,
"handlers": {"console": {"class": "logging.StreamHandler"}},
"root": {"handlers": ["console"], "level": "INFO"},
"loggers": {"django": {"handlers": ["console"], "level": "INFO"}},
}
django-prometheus-2.0.0/django_prometheus/tests/end2end/testapp/templates/ 0000775 0000000 0000000 00000000000 13611433254 0027022 5 ustar 00root root 0000000 0000000 django-prometheus-2.0.0/django_prometheus/tests/end2end/testapp/templates/help.html 0000664 0000000 0000000 00000004126 13611433254 0030643 0 ustar 00root root 0000000 0000000 Can't Help Falling in Love
Remembering Helps Me to Forget
Helplessly, Hopelessly
Love Helps Those
I Need a Little Help
For a while We Helped Each Other Out
Give Me a Helping Hand
I Can't Help You, I'm Falling Too
How Can I Help You Say Goodbye?
Time Hasn't Helped
Jukebox, Help Me Find My Baby
I Just Can't Help Myself
Help Me, Girl
I Can't Help it
Help Somebody
Help, Help
I Can't Help How I Feel
No Help From Me
I Can Help
Somebody Help Me
Please Help Me I'm Falling in Love With You
Help Yourself
Outside Help
Helping Hand
Help Me, Rhonda
Can't Help Feeling So Blue
We All Agreed to Help
Help Pour Out the Rain (Lacey's Song)
Sleep Won't Help Me
I Can't Help Myself (Sugarpie, Honeybunch)
Cry for Help
She's Helping Me Get Over You
Mama Help Me
Help Yourself to Me
Can't Help But Wonder
Heaven Help the Working Girl
Help Me Pick Up the Pieces
Crying Won't Help Now
I Couldn't Help Myself
So Help Me, Girl
Heaven Help the Fool
Help Wanted
Help Me Get Over You
Helpless
Help
Can't Help it
Can't Help Calling Your Name
If She Just Helps Me Get Over You
Helpless Heart
No Help Wanted
It Didn't Help Much
Help Me Make it Through the Night
Help Me Understand
I Just Can't Help Believing
Can't Help Thinking About Me
How Could I Help But Love You?
Heaven Help My Heart
I Can't Help Remembering You
Help Me Hold on
Helping Me Get Over You
I Can't Help it if I'm Still in Love with You
Girl Can't Help it, The
I Can't Help it, I'm Falling in Love
With a Little Help from My Friends
Heaven Help the Child
Help Me
Can't Help But Love You
Help is on the Way
I Got Some Help I Don't Need
Heaven Help Us All
Heaven Help Me
Helplessly Hoping
django-prometheus-2.0.0/django_prometheus/tests/end2end/testapp/templates/index.html 0000664 0000000 0000000 00000000023 13611433254 0031012 0 ustar 00root root 0000000 0000000 This is the index.
django-prometheus-2.0.0/django_prometheus/tests/end2end/testapp/templates/lawn.html 0000664 0000000 0000000 00000000105 13611433254 0030645 0 ustar 00root root 0000000 0000000 Aaah, {{ lawn.location }}, the best place on Earth, probably.
django-prometheus-2.0.0/django_prometheus/tests/end2end/testapp/templates/slow.html 0000664 0000000 0000000 00000004252 13611433254 0030677 0 ustar 00root root 0000000 0000000
_.---"'"""""'`--.._
_,.-' `-._
_,." -.
.-"" ___...---------.._ `.
`---'"" `-. `.
`. \
`. \
\ \
. \
| .
| |
_________ | |
_,.-'" `"'-.._ : |
_,-' `-._.' |
_.' `. '
_.-. _,+......__ `. .
.' `-"' `"-.,-""--._ \ /
/ ,' | __ \ \ /
` .. +" ) \ \ /
`.' \ ,-"`-.. | | \ /
/ " | .' \ '. _.' .'
|,.."--"""--..| " | `""`. |
," `-._ | | |
.' `-._+ | |
/ `. / |
| ` ' | / |
`-.....--.__ | | / |
`./ "| / `-.........--.- ' | ,' '
/| || `.' ,' .' |_,-+ /
/ ' '.`. _,' ,' `. | ' _,.. /
/ `. `"'"'""'" _,^--------"`. | `.'_ _/
/... _.`:.________,.' `._,.-..| "'
`.__.' `._ /
"' mh
Art by Maija Haavisto
django-prometheus-2.0.0/django_prometheus/tests/end2end/testapp/templates/sql.html 0000664 0000000 0000000 00000001244 13611433254 0030510 0 ustar 00root root 0000000 0000000
Execute some SQL here, for fun and profit!
Note that this is a very bad vulnerability: it gives anyone direct
access to your whole database. This only exists to test that
django_prometheus is working.
{% if query %}
Your query was:
{{ query }}
Your results were:
{{ rows }}
{% endif %}
django-prometheus-2.0.0/django_prometheus/tests/end2end/testapp/test_caches.py 0000664 0000000 0000000 00000007772 13611433254 0027700 0 ustar 00root root 0000000 0000000 from django.core.cache import caches
from django.test import TestCase
from django_prometheus.testutils import PrometheusTestCaseMixin
from redis import RedisError
class TestCachesMetrics(PrometheusTestCaseMixin, TestCase):
"""Test django_prometheus.caches metrics."""
def testCounters(self):
supported_caches = ["memcached", "filebased", "locmem", "redis"]
# Note: those tests require a memcached server running
for supported_cache in supported_caches:
tested_cache = caches[supported_cache]
total_before = (
self.getMetric("django_cache_get_total", backend=supported_cache) or 0
)
hit_before = (
self.getMetric("django_cache_get_hits_total", backend=supported_cache)
or 0
)
miss_before = (
self.getMetric("django_cache_get_misses_total", backend=supported_cache)
or 0
)
tested_cache.set("foo1", "bar")
tested_cache.get("foo1")
tested_cache.get("foo1")
tested_cache.get("foofoo")
result = tested_cache.get("foofoo", default="default")
assert result == "default"
self.assertMetricEquals(
total_before + 4, "django_cache_get_total", backend=supported_cache
)
self.assertMetricEquals(
hit_before + 2, "django_cache_get_hits_total", backend=supported_cache
)
self.assertMetricEquals(
miss_before + 2,
"django_cache_get_misses_total",
backend=supported_cache,
)
def test_redis_cache_fail(self):
# Note: this test uses a fake service config (as if the server had been stopped)
supported_cache = "redis"
total_before = (
self.getMetric("django_cache_get_total", backend=supported_cache) or 0
)
fail_before = (
self.getMetric("django_cache_get_fail_total", backend=supported_cache) or 0
)
hit_before = (
self.getMetric("django_cache_get_hits_total", backend=supported_cache) or 0
)
miss_before = (
self.getMetric("django_cache_get_misses_total", backend=supported_cache)
or 0
)
tested_cache = caches["stopped_redis_ignore_exception"]
tested_cache.get("foo1")
self.assertMetricEquals(
hit_before, "django_cache_get_hits_total", backend=supported_cache
)
self.assertMetricEquals(
miss_before, "django_cache_get_misses_total", backend=supported_cache
)
self.assertMetricEquals(
total_before + 1, "django_cache_get_total", backend=supported_cache
)
self.assertMetricEquals(
fail_before + 1, "django_cache_get_fail_total", backend=supported_cache
)
tested_cache = caches["stopped_redis"]
with self.assertRaises(RedisError):
tested_cache.get("foo1")
self.assertMetricEquals(
hit_before, "django_cache_get_hits_total", backend=supported_cache
)
self.assertMetricEquals(
miss_before, "django_cache_get_misses_total", backend=supported_cache
)
self.assertMetricEquals(
total_before + 2, "django_cache_get_total", backend=supported_cache
)
self.assertMetricEquals(
fail_before + 2, "django_cache_get_fail_total", backend=supported_cache
)
def test_cache_version_support(self):
supported_caches = ["memcached", "filebased", "locmem", "redis"]
# Note: those tests require a memcached server running
for supported_cache in supported_caches:
tested_cache = caches[supported_cache]
tested_cache.set("foo1", "bar v.1", version=1)
tested_cache.set("foo1", "bar v.2", version=2)
assert "bar v.1" == tested_cache.get("foo1", version=1)
assert "bar v.2" == tested_cache.get("foo1", version=2)
django-prometheus-2.0.0/django_prometheus/tests/end2end/testapp/test_db.py 0000664 0000000 0000000 00000010634 13611433254 0027026 0 ustar 00root root 0000000 0000000 from unittest import skipUnless
from django.db import connections
from django.test import TestCase
from django_prometheus.testutils import PrometheusTestCaseMixin
class BaseDbMetricTest(PrometheusTestCaseMixin, TestCase):
# https://docs.djangoproject.com/en/2.2/topics/testing/tools/#django.test.SimpleTestCase.databases
databases = "__all__"
@skipUnless(
connections["test_db_1"].vendor == "sqlite", "Skipped unless test_db_1 uses sqlite"
)
class TestDbMetrics(BaseDbMetricTest):
"""Test django_prometheus.db metrics.
Note regarding the values of metrics: many tests interact with the
database, and the test runner itself does. As such, tests that
require that a metric has a specific value are at best very
fragile. Consider asserting that the value exceeds a certain
threshold, or check by how much it increased during the test.
"""
def testConfigHasExpectedDatabases(self):
"""Not a real unit test: ensures that testapp.settings contains the
databases this test expects."""
assert "default" in connections.databases.keys()
assert "test_db_1" in connections.databases.keys()
assert "test_db_2" in connections.databases.keys()
def testCounters(self):
cursor_db1 = connections["test_db_1"].cursor()
cursor_db2 = connections["test_db_2"].cursor()
cursor_db1.execute("SELECT 1")
for _ in range(200):
cursor_db2.execute("SELECT 2")
cursor_db1.execute("SELECT 3")
try:
cursor_db1.execute("this is clearly not valid SQL")
except Exception:
pass
self.assertMetricEquals(
1,
"django_db_errors_total",
alias="test_db_1",
vendor="sqlite",
type="OperationalError",
)
assert (
self.getMetric(
"django_db_execute_total", alias="test_db_1", vendor="sqlite"
)
> 0
)
assert (
self.getMetric(
"django_db_execute_total", alias="test_db_2", vendor="sqlite"
)
>= 200
)
def testExecuteMany(self):
registry = self.saveRegistry()
cursor_db1 = connections["test_db_1"].cursor()
cursor_db1.executemany(
"INSERT INTO testapp_lawn(location) VALUES (?)",
[("Paris",), ("New York",), ("Berlin",), ("San Francisco",)],
)
self.assertMetricDiff(
registry,
4,
"django_db_execute_many_total",
alias="test_db_1",
vendor="sqlite",
)
@skipUnless(
"postgresql" in connections, "Skipped unless postgresql database is enabled"
)
class TestPostgresDbMetrics(BaseDbMetricTest):
"""Test django_prometheus.db metrics for postgres backend.
Note regarding the values of metrics: many tests interact with the
database, and the test runner itself does. As such, tests that
require that a metric has a specific value are at best very
fragile. Consider asserting that the value exceeds a certain
threshold, or check by how much it increased during the test.
"""
def testCounters(self):
registry = self.saveRegistry()
cursor = connections["postgresql"].cursor()
for _ in range(20):
cursor.execute("SELECT 1")
self.assertMetricCompare(
registry,
lambda a, b: a + 20 <= b < a + 25,
"django_db_execute_total",
alias="postgresql",
vendor="postgresql",
)
@skipUnless("mysql" in connections, "Skipped unless mysql database is enabled")
class TestMysqlDbMetrics(BaseDbMetricTest):
"""Test django_prometheus.db metrics for the mysql backend.
Note regarding the values of metrics: many tests interact with the
database, and the test runner itself does. As such, tests that
require that a metric has a specific value are at best very
fragile. Consider asserting that the value exceeds a certain
threshold, or check by how much it increased during the test.
"""
def testCounters(self):
registry = self.saveRegistry()
cursor = connections["mysql"].cursor()
for _ in range(20):
cursor.execute("SELECT 1")
self.assertMetricCompare(
registry,
lambda a, b: a + 20 <= b < a + 25,
"django_db_execute_total",
alias="mysql",
vendor="mysql",
)
django-prometheus-2.0.0/django_prometheus/tests/end2end/testapp/test_middleware.py 0000664 0000000 0000000 00000012246 13611433254 0030557 0 ustar 00root root 0000000 0000000 from django.test import SimpleTestCase, override_settings
from django_prometheus.testutils import PrometheusTestCaseMixin
from testapp.views import ObjectionException
def M(metric_name):
"""Makes a full metric name from a short metric name.
This is just intended to help keep the lines shorter in test
cases.
"""
return "django_http_%s" % metric_name
def T(metric_name):
"""Makes a full metric name from a short metric name like M(metric_name)
This method adds a '_total' postfix for metrics."""
return "%s_total" % M(metric_name)
@override_settings(
PROMETHEUS_LATENCY_BUCKETS=(0.05, 1.0, 2.0, 4.0, 5.0, 10.0, float("inf"))
)
class TestMiddlewareMetrics(PrometheusTestCaseMixin, SimpleTestCase):
"""Test django_prometheus.middleware.
Note that counters related to exceptions can't be tested as
Django's test Client only simulates requests and the exception
handling flow is very different in that simulation.
"""
def test_request_counters(self):
registry = self.saveRegistry()
self.client.get("/")
self.client.get("/")
self.client.get("/help")
self.client.post("/", {"test": "data"})
self.assertMetricDiff(registry, 4, M("requests_before_middlewares_total"))
self.assertMetricDiff(registry, 4, M("responses_before_middlewares_total"))
self.assertMetricDiff(registry, 3, T("requests_total_by_method"), method="GET")
self.assertMetricDiff(registry, 1, T("requests_total_by_method"), method="POST")
self.assertMetricDiff(
registry, 4, T("requests_total_by_transport"), transport="http"
)
self.assertMetricDiff(
registry,
2,
T("requests_total_by_view_transport_method"),
view="testapp.views.index",
transport="http",
method="GET",
)
self.assertMetricDiff(
registry,
1,
T("requests_total_by_view_transport_method"),
view="testapp.views.help",
transport="http",
method="GET",
)
self.assertMetricDiff(
registry,
1,
T("requests_total_by_view_transport_method"),
view="testapp.views.index",
transport="http",
method="POST",
)
# We have 3 requests with no post body, and one with a few
# bytes, but buckets are cumulative so that is 4 requests with
# <=128 bytes bodies.
self.assertMetricDiff(
registry, 3, M("requests_body_total_bytes_bucket"), le="0.0"
)
self.assertMetricDiff(
registry, 4, M("requests_body_total_bytes_bucket"), le="128.0"
)
self.assertMetricEquals(
None, M("responses_total_by_templatename"), templatename="help.html"
)
self.assertMetricDiff(
registry, 3, T("responses_total_by_templatename"), templatename="index.html"
)
self.assertMetricDiff(registry, 4, T("responses_total_by_status"), status="200")
self.assertMetricDiff(
registry, 0, M("responses_body_total_bytes_bucket"), le="0.0"
)
self.assertMetricDiff(
registry, 3, M("responses_body_total_bytes_bucket"), le="128.0"
)
self.assertMetricDiff(
registry, 4, M("responses_body_total_bytes_bucket"), le="8192.0"
)
self.assertMetricDiff(
registry, 4, T("responses_total_by_charset"), charset="utf-8"
)
self.assertMetricDiff(registry, 0, M("responses_streaming_total"))
def test_latency_histograms(self):
# Caution: this test is timing-based. This is not ideal. It
# runs slowly (each request to /slow takes at least .1 seconds
# to complete); to eliminate flakiness, we adjust the buckets used
# in the test suite.
registry = self.saveRegistry()
# This always takes more than .1 second, so checking the lower
# buckets is fine.
self.client.get("/slow")
self.assertMetricDiff(
registry,
0,
M("requests_latency_seconds_by_view_method_bucket"),
le="0.05",
view="slow",
method="GET",
)
self.assertMetricDiff(
registry,
1,
M("requests_latency_seconds_by_view_method_bucket"),
le="5.0",
view="slow",
method="GET",
)
def test_exception_latency_histograms(self):
registry = self.saveRegistry()
try:
self.client.get("/objection")
except ObjectionException:
pass
self.assertMetricDiff(
registry,
2,
M("requests_latency_seconds_by_view_method_bucket"),
le="2.0",
view="testapp.views.objection",
method="GET",
)
def test_streaming_responses(self):
registry = self.saveRegistry()
self.client.get("/")
self.client.get("/file")
self.assertMetricDiff(registry, 1, M("responses_streaming_total"))
self.assertMetricDiff(
registry, 1, M("responses_body_total_bytes_bucket"), le="+Inf"
)
django-prometheus-2.0.0/django_prometheus/tests/end2end/testapp/test_middleware_custom_labels.py 0000664 0000000 0000000 00000010036 13611433254 0033466 0 ustar 00root root 0000000 0000000 from prometheus_client import REGISTRY
from prometheus_client.metrics import MetricWrapperBase
from django.test import SimpleTestCase, override_settings
from django_prometheus.middleware import (
Metrics,
PrometheusAfterMiddleware,
PrometheusBeforeMiddleware,
)
from django_prometheus.testutils import PrometheusTestCaseMixin
from testapp.helpers import get_middleware
from testapp.test_middleware import M, T
EXTENDED_METRICS = [
M("requests_latency_seconds_by_view_method"),
M("responses_total_by_status_view_method"),
]
class CustomMetrics(Metrics):
def register_metric(
self, metric_cls, name, documentation, labelnames=tuple(), **kwargs
):
if name in EXTENDED_METRICS:
labelnames.extend(("view_type", "user_agent_type"))
return super(CustomMetrics, self).register_metric(
metric_cls, name, documentation, labelnames=labelnames, **kwargs
)
class AppMetricsBeforeMiddleware(PrometheusBeforeMiddleware):
metrics_cls = CustomMetrics
class AppMetricsAfterMiddleware(PrometheusAfterMiddleware):
metrics_cls = CustomMetrics
def label_metric(self, metric, request, response=None, **labels):
new_labels = labels
if metric._name in EXTENDED_METRICS:
new_labels = {"view_type": "foo", "user_agent_type": "browser"}
new_labels.update(labels)
return super(AppMetricsAfterMiddleware, self).label_metric(
metric, request, response=response, **new_labels
)
@override_settings(
MIDDLEWARE=get_middleware(
"testapp.test_middleware_custom_labels.AppMetricsBeforeMiddleware",
"testapp.test_middleware_custom_labels.AppMetricsAfterMiddleware",
)
)
class TestMiddlewareMetricsWithCustomLabels(PrometheusTestCaseMixin, SimpleTestCase):
@classmethod
def setUpClass(cls):
super(TestMiddlewareMetricsWithCustomLabels, cls).setUpClass()
# Allow CustomMetrics to be used
for metric in Metrics._instance.__dict__.values():
if isinstance(metric, MetricWrapperBase):
REGISTRY.unregister(metric)
Metrics._instance = None
def test_request_counters(self):
registry = self.saveRegistry()
self.client.get("/")
self.client.get("/")
self.client.get("/help")
self.client.post("/", {"test": "data"})
self.assertMetricDiff(registry, 4, M("requests_before_middlewares_total"))
self.assertMetricDiff(registry, 4, M("responses_before_middlewares_total"))
self.assertMetricDiff(registry, 3, T("requests_total_by_method"), method="GET")
self.assertMetricDiff(registry, 1, T("requests_total_by_method"), method="POST")
self.assertMetricDiff(
registry, 4, T("requests_total_by_transport"), transport="http"
)
self.assertMetricDiff(
registry,
2,
T("requests_total_by_view_transport_method"),
view="testapp.views.index",
transport="http",
method="GET",
)
self.assertMetricDiff(
registry,
1,
T("requests_total_by_view_transport_method"),
view="testapp.views.help",
transport="http",
method="GET",
)
self.assertMetricDiff(
registry,
1,
T("requests_total_by_view_transport_method"),
view="testapp.views.index",
transport="http",
method="POST",
)
self.assertMetricDiff(
registry,
2.0,
T("responses_total_by_status_view_method"),
status="200",
view="testapp.views.index",
method="GET",
view_type="foo",
user_agent_type="browser",
)
self.assertMetricDiff(
registry,
1.0,
T("responses_total_by_status_view_method"),
status="200",
view="testapp.views.help",
method="GET",
view_type="foo",
user_agent_type="browser",
)
django-prometheus-2.0.0/django_prometheus/tests/end2end/testapp/test_migrations.py 0000664 0000000 0000000 00000002723 13611433254 0030615 0 ustar 00root root 0000000 0000000 import sys
from django.test import SimpleTestCase
from django_prometheus.migrations import ExportMigrationsForDatabase
from django_prometheus.testutils import PrometheusTestCaseMixin
if sys.version_info[:2] >= (3, 0):
from unittest.mock import MagicMock
else:
from mock import MagicMock
def M(metric_name):
"""Make a full metric name from a short metric name.
This is just intended to help keep the lines shorter in test
cases.
"""
return "django_migrations_%s" % metric_name
class TestMigrations(PrometheusTestCaseMixin, SimpleTestCase):
"""Test migration counters."""
def test_counters(self):
executor = MagicMock()
executor.migration_plan = MagicMock()
executor.migration_plan.return_value = set()
executor.loader.applied_migrations = set(["a", "b", "c"])
ExportMigrationsForDatabase("fakedb1", executor)
self.assertEqual(executor.migration_plan.call_count, 1)
executor.migration_plan = MagicMock()
executor.migration_plan.return_value = set(["a"])
executor.loader.applied_migrations = set(["b", "c"])
ExportMigrationsForDatabase("fakedb2", executor)
self.assertMetricEquals(3, M("applied_total"), connection="fakedb1")
self.assertMetricEquals(0, M("unapplied_total"), connection="fakedb1")
self.assertMetricEquals(2, M("applied_total"), connection="fakedb2")
self.assertMetricEquals(1, M("unapplied_total"), connection="fakedb2")
django-prometheus-2.0.0/django_prometheus/tests/end2end/testapp/test_models.py 0000664 0000000 0000000 00000003264 13611433254 0027725 0 ustar 00root root 0000000 0000000 from django.test import TestCase
from django_prometheus.testutils import PrometheusTestCaseMixin
from testapp.models import Dog, Lawn
def M(metric_name):
"""Make a full metric name from a short metric name.
This is just intended to help keep the lines shorter in test
cases.
"""
return "django_model_%s" % metric_name
class TestModelMetrics(PrometheusTestCaseMixin, TestCase):
"""Test django_prometheus.models."""
def test_counters(self):
registry = self.saveRegistry()
cool = Dog()
cool.name = "Cool"
cool.save()
self.assertMetricDiff(registry, 1, M("inserts_total"), model="dog")
elysees = Lawn()
elysees.location = "Champs Elysees, Paris"
elysees.save()
self.assertMetricDiff(registry, 1, M("inserts_total"), model="lawn")
self.assertMetricDiff(registry, 1, M("inserts_total"), model="dog")
galli = Dog()
galli.name = "Galli"
galli.save()
self.assertMetricDiff(registry, 2, M("inserts_total"), model="dog")
cool.breed = "Wolfhound"
self.assertMetricDiff(registry, 2, M("inserts_total"), model="dog")
cool.save()
self.assertMetricDiff(registry, 2, M("inserts_total"), model="dog")
self.assertMetricDiff(registry, 1, M("updates_total"), model="dog")
cool.age = 9
cool.save()
self.assertMetricDiff(registry, 2, M("updates_total"), model="dog")
cool.delete() # :(
self.assertMetricDiff(registry, 2, M("inserts_total"), model="dog")
self.assertMetricDiff(registry, 2, M("updates_total"), model="dog")
self.assertMetricDiff(registry, 1, M("deletes_total"), model="dog")
django-prometheus-2.0.0/django_prometheus/tests/end2end/testapp/urls.py 0000664 0000000 0000000 00000000742 13611433254 0026366 0 ustar 00root root 0000000 0000000 from django.conf.urls import include, url
from django.contrib import admin
from testapp import views
urlpatterns = [
url(r"^$", views.index),
url(r"^help$", views.help),
url(r"^slow$", views.slow, name="slow"),
url(r"^objection$", views.objection),
url(r"^sql$", views.sql),
url(r"^newlawn/(?P[a-zA-Z0-9 ]+)$", views.newlawn),
url(r"^file$", views.file),
url("", include("django_prometheus.urls")),
url(r"^admin/", admin.site.urls),
]
django-prometheus-2.0.0/django_prometheus/tests/end2end/testapp/views.py 0000664 0000000 0000000 00000003106 13611433254 0026533 0 ustar 00root root 0000000 0000000 import os
import time
from django.db import connections
from django.http import FileResponse
from django.shortcuts import render
from django.template.response import TemplateResponse
from testapp.models import Lawn
def index(request):
return TemplateResponse(request, "index.html", {})
def help(request):
# render does not instantiate a TemplateResponse, so it does not
# increment the "by_templatename" counters.
return render(request, "help.html", {})
def slow(request):
"""This view takes .1s to load, on purpose."""
time.sleep(0.1)
return TemplateResponse(request, "slow.html", {})
def newlawn(request, location):
"""This view creates a new Lawn instance in the database."""
lawn = Lawn()
lawn.location = location
lawn.save()
return TemplateResponse(request, "lawn.html", {"lawn": lawn})
class ObjectionException(Exception):
pass
def objection(request):
raise ObjectionException("Objection!")
def sql(request):
databases = connections.databases.keys()
query = request.GET.get("query")
db = request.GET.get("database")
if query and db:
cursor = connections[db].cursor()
cursor.execute(query, [])
results = cursor.fetchall()
return TemplateResponse(
request,
"sql.html",
{"query": query, "rows": results, "databases": databases},
)
else:
return TemplateResponse(
request, "sql.html", {"query": None, "rows": None, "databases": databases}
)
def file(request):
return FileResponse(open(os.devnull, "rb"))
django-prometheus-2.0.0/django_prometheus/tests/end2end/testapp/wsgi.py 0000664 0000000 0000000 00000000607 13611433254 0026352 0 ustar 00root root 0000000 0000000 """
WSGI config for testapp project.
It exposes the WSGI callable as a module-level variable named ``application``.
For more information on this file, see
https://docs.djangoproject.com/en/1.8/howto/deployment/wsgi/
"""
import os
from django.core.wsgi import get_wsgi_application
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "testapp.settings")
application = get_wsgi_application()
django-prometheus-2.0.0/django_prometheus/tests/test_django_prometheus.py 0000664 0000000 0000000 00000000717 13611433254 0027160 0 ustar 00root root 0000000 0000000 #!/usr/bin/env python
import unittest
from django_prometheus.utils import PowersOf
class DjangoPrometheusTest(unittest.TestCase):
def testPowersOf(self):
"""Tests utils.PowersOf."""
assert [0, 1, 2, 4, 8] == PowersOf(2, 4)
assert [0, 3, 9, 27, 81, 243] == PowersOf(3, 5, lower=1)
assert [1, 2, 4, 8] == PowersOf(2, 4, include_zero=False)
assert [4, 8, 16, 32, 64, 128] == PowersOf(2, 6, lower=2, include_zero=False)
django-prometheus-2.0.0/django_prometheus/tests/test_exports.py 0000664 0000000 0000000 00000002111 13611433254 0025135 0 ustar 00root root 0000000 0000000 #!/usr/bin/env python
import socket
from django_prometheus.exports import SetupPrometheusEndpointOnPortRange
from mock import ANY, MagicMock, call, patch
@patch("django_prometheus.exports.HTTPServer")
def test_port_range_available(httpserver_mock):
"""Test port range setup with an available port."""
httpserver_mock.side_effect = [socket.error, MagicMock()]
port_range = [8000, 8001]
port_chosen = SetupPrometheusEndpointOnPortRange(port_range)
assert port_chosen in port_range
expected_calls = [call(("", 8000), ANY), call(("", 8001), ANY)]
assert httpserver_mock.mock_calls == expected_calls
@patch("django_prometheus.exports.HTTPServer")
def test_port_range_unavailable(httpserver_mock):
"""Test port range setup with no available ports."""
httpserver_mock.side_effect = [socket.error, socket.error]
port_range = [8000, 8001]
port_chosen = SetupPrometheusEndpointOnPortRange(port_range)
expected_calls = [call(("", 8000), ANY), call(("", 8001), ANY)]
assert httpserver_mock.mock_calls == expected_calls
assert port_chosen is None
django-prometheus-2.0.0/django_prometheus/tests/test_testutils.py 0000664 0000000 0000000 00000013207 13611433254 0025501 0 ustar 00root root 0000000 0000000 #!/usr/bin/env python
import unittest
from operator import itemgetter
import prometheus_client
from django_prometheus.testutils import PrometheusTestCaseMixin
class SomeTestCase(PrometheusTestCaseMixin):
"""A class that pretends to be a unit test."""
def __init__(self):
self.passes = True
super(SomeTestCase, self).__init__()
def assertEqual(self, left, right, *args, **kwargs):
self.passes = self.passes and (left == right)
def assertFalse(self, expression, *args, **kwargs):
self.passes = self.passes and (not expression)
class PrometheusTestCaseMixinTest(unittest.TestCase):
def setUp(self):
self.registry = prometheus_client.CollectorRegistry()
self.some_gauge = prometheus_client.Gauge(
"some_gauge", "Some gauge.", registry=self.registry
)
self.some_gauge.set(42)
self.some_labelled_gauge = prometheus_client.Gauge(
"some_labelled_gauge",
"Some labelled gauge.",
["labelred", "labelblue"],
registry=self.registry,
)
self.some_labelled_gauge.labels("pink", "indigo").set(1)
self.some_labelled_gauge.labels("pink", "royal").set(2)
self.some_labelled_gauge.labels("carmin", "indigo").set(3)
self.some_labelled_gauge.labels("carmin", "royal").set(4)
self.test_case = SomeTestCase()
def testGetMetrics(self):
"""Tests getMetric."""
assert 42 == self.test_case.getMetric("some_gauge", registry=self.registry)
assert 1 == self.test_case.getMetric(
"some_labelled_gauge",
registry=self.registry,
labelred="pink",
labelblue="indigo",
)
def testGetMetricVector(self):
"""Tests getMetricVector."""
vector = self.test_case.getMetricVector(
"some_nonexistent_gauge", registry=self.registry
)
assert [] == vector
vector = self.test_case.getMetricVector("some_gauge", registry=self.registry)
assert [({}, 42)] == vector
vector = self.test_case.getMetricVector(
"some_labelled_gauge", registry=self.registry
)
assert sorted(
[
({"labelred": u"pink", "labelblue": u"indigo"}, 1),
({"labelred": u"pink", "labelblue": u"royal"}, 2),
({"labelred": u"carmin", "labelblue": u"indigo"}, 3),
({"labelred": u"carmin", "labelblue": u"royal"}, 4),
],
key=itemgetter(1),
) == sorted(vector, key=itemgetter(1))
def testAssertMetricEquals(self):
"""Tests assertMetricEquals."""
# First we test that a scalar metric can be tested.
self.test_case.assertMetricEquals(42, "some_gauge", registry=self.registry)
assert self.test_case.passes is True
self.test_case.assertMetricEquals(43, "some_gauge", registry=self.registry)
assert self.test_case.passes is False
self.test_case.passes = True
# Here we test that assertMetricEquals fails on nonexistent gauges.
self.test_case.assertMetricEquals(
42, "some_nonexistent_gauge", registry=self.registry
)
assert not self.test_case.passes
self.test_case.passes = True
# Here we test that labelled metrics can be tested.
self.test_case.assertMetricEquals(
1,
"some_labelled_gauge",
registry=self.registry,
labelred="pink",
labelblue="indigo",
)
assert self.test_case.passes is True
self.test_case.assertMetricEquals(
1,
"some_labelled_gauge",
registry=self.registry,
labelred="tomato",
labelblue="sky",
)
assert self.test_case.passes is False
self.test_case.passes = True
def testRegistrySaving(self):
"""Tests saveRegistry and frozen registries operations."""
frozen_registry = self.test_case.saveRegistry(registry=self.registry)
# Test that we can manipulate a frozen scalar metric.
assert 42 == self.test_case.getMetricFromFrozenRegistry(
"some_gauge", frozen_registry
)
self.some_gauge.set(99)
assert 42 == self.test_case.getMetricFromFrozenRegistry(
"some_gauge", frozen_registry
)
self.test_case.assertMetricDiff(
frozen_registry, 99 - 42, "some_gauge", registry=self.registry
)
assert self.test_case.passes is True
self.test_case.assertMetricDiff(
frozen_registry, 1, "some_gauge", registry=self.registry
)
assert self.test_case.passes is False
self.test_case.passes = True
# Now test the same thing with a labelled metric.
assert 1 == self.test_case.getMetricFromFrozenRegistry(
"some_labelled_gauge", frozen_registry, labelred="pink", labelblue="indigo"
)
self.some_labelled_gauge.labels("pink", "indigo").set(5)
assert 1 == self.test_case.getMetricFromFrozenRegistry(
"some_labelled_gauge", frozen_registry, labelred="pink", labelblue="indigo"
)
self.test_case.assertMetricDiff(
frozen_registry,
5 - 1,
"some_labelled_gauge",
registry=self.registry,
labelred="pink",
labelblue="indigo",
)
assert self.test_case.passes is True
self.test_case.assertMetricDiff(
frozen_registry,
1,
"some_labelled_gauge",
registry=self.registry,
labelred="pink",
labelblue="indigo",
)
assert self.test_case.passes is False
django-prometheus-2.0.0/django_prometheus/testutils.py 0000664 0000000 0000000 00000013305 13611433254 0023277 0 ustar 00root root 0000000 0000000 import copy
from prometheus_client import REGISTRY
METRIC_EQUALS_ERR_EXPLANATION = """
%s%s = %s, expected %s.
The values for %s are:
%s"""
METRIC_DIFF_ERR_EXPLANATION = """
%s%s changed by %f, expected %f.
Value before: %s
Value after: %s
"""
METRIC_COMPARE_ERR_EXPLANATION = """
The change in value of %s%s didn't match the predicate.
Value before: %s
Value after: %s
"""
METRIC_DIFF_ERR_NONE_EXPLANATION = """
%s%s was None after.
Value before: %s
Value after: %s
"""
class PrometheusTestCaseMixin(object):
"""A collection of utilities that make it easier to write test cases
that interact with metrics.
"""
def saveRegistry(self, registry=REGISTRY):
"""Freezes a registry. This lets a user test changes to a metric
instead of testing the absolute value. A typical use case looks like:
registry = self.saveRegistry()
doStuff()
self.assertMetricDiff(registry, 1, 'stuff_done_total')
"""
return copy.deepcopy(list(registry.collect()))
def getMetricFromFrozenRegistry(self, metric_name, frozen_registry, **labels):
"""Gets a single metric from a frozen registry."""
for metric in frozen_registry:
for sample in metric.samples:
if sample[0] == metric_name and sample[1] == labels:
return sample[2]
def getMetric(self, metric_name, registry=REGISTRY, **labels):
"""Gets a single metric."""
return self.getMetricFromFrozenRegistry(
metric_name, registry.collect(), **labels
)
def getMetricVectorFromFrozenRegistry(self, metric_name, frozen_registry):
"""Like getMetricVector, but from a frozen registry."""
output = []
for metric in frozen_registry:
for sample in metric.samples:
if sample[0] == metric_name:
output.append((sample[1], sample[2]))
return output
def getMetricVector(self, metric_name, registry=REGISTRY):
"""Returns the values for all labels of a given metric.
The result is returned as a list of (labels, value) tuples,
where `labels` is a dict.
This is quite a hack since it relies on the internal
representation of the prometheus_client, and it should
probably be provided as a function there instead.
"""
return self.getMetricVectorFromFrozenRegistry(metric_name, registry.collect())
def formatLabels(self, labels):
"""Format a set of labels to Prometheus representation.
In:
{'method': 'GET', 'port': '80'}
Out:
'{method="GET",port="80"}'
"""
return "{%s}" % ",".join(['%s="%s"' % (k, v) for k, v in labels.items()])
def formatVector(self, vector):
"""Formats a list of (labels, value) where labels is a dict into a
human-readable representation.
"""
return "\n".join(
["%s = %s" % (self.formatLabels(labels), value) for labels, value in vector]
)
def assertMetricEquals(
self, expected_value, metric_name, registry=REGISTRY, **labels
):
"""Asserts that metric_name{**labels} == expected_value."""
value = self.getMetric(metric_name, registry=registry, **labels)
self.assertEqual(
expected_value,
value,
METRIC_EQUALS_ERR_EXPLANATION
% (
metric_name,
self.formatLabels(labels),
value,
expected_value,
metric_name,
self.formatVector(self.getMetricVector(metric_name)),
),
)
def assertMetricDiff(
self, frozen_registry, expected_diff, metric_name, registry=REGISTRY, **labels
):
"""Asserts that metric_name{**labels} changed by expected_diff between
the frozen registry and now. A frozen registry can be obtained
by calling saveRegistry, typically at the beginning of a test
case.
"""
saved_value = self.getMetricFromFrozenRegistry(
metric_name, frozen_registry, **labels
)
current_value = self.getMetric(metric_name, registry=registry, **labels)
self.assertFalse(
current_value is None,
METRIC_DIFF_ERR_NONE_EXPLANATION
% (metric_name, self.formatLabels(labels), saved_value, current_value),
)
diff = current_value - (saved_value or 0.0)
self.assertEqual(
expected_diff,
diff,
METRIC_DIFF_ERR_EXPLANATION
% (
metric_name,
self.formatLabels(labels),
diff,
expected_diff,
saved_value,
current_value,
),
)
def assertMetricCompare(
self, frozen_registry, predicate, metric_name, registry=REGISTRY, **labels
):
"""Asserts that metric_name{**labels} changed according to a provided
predicate function between the frozen registry and now. A
frozen registry can be obtained by calling saveRegistry,
typically at the beginning of a test case.
"""
saved_value = self.getMetricFromFrozenRegistry(
metric_name, frozen_registry, **labels
)
current_value = self.getMetric(metric_name, registry=registry, **labels)
self.assertFalse(
current_value is None,
METRIC_DIFF_ERR_NONE_EXPLANATION
% (metric_name, self.formatLabels(labels), saved_value, current_value),
)
self.assertTrue(
predicate(saved_value, current_value),
METRIC_COMPARE_ERR_EXPLANATION
% (metric_name, self.formatLabels(labels), saved_value, current_value),
)
django-prometheus-2.0.0/django_prometheus/urls.py 0000664 0000000 0000000 00000000256 13611433254 0022225 0 ustar 00root root 0000000 0000000 from django.conf.urls import url
from django_prometheus import exports
urlpatterns = [
url(r"^metrics$", exports.ExportToDjangoView, name="prometheus-django-metrics")
]
django-prometheus-2.0.0/django_prometheus/utils.py 0000664 0000000 0000000 00000001524 13611433254 0022377 0 ustar 00root root 0000000 0000000 from timeit import default_timer
def Time():
"""Returns some representation of the current time.
This wrapper is meant to take advantage of a higher time
resolution when available. Thus, its return value should be
treated as an opaque object. It can be compared to the current
time with TimeSince().
"""
return default_timer()
def TimeSince(t):
"""Compares a value returned by Time() to the current time.
Returns:
the time since t, in fractional seconds.
"""
return default_timer() - t
def PowersOf(logbase, count, lower=0, include_zero=True):
"""Returns a list of count powers of logbase (from logbase**lower)."""
if not include_zero:
return [logbase ** i for i in range(lower, count + lower)]
else:
return [0] + [logbase ** i for i in range(lower, count + lower)]
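# Illustrative examples (the expected values below match the unit tests in
# django_prometheus/tests/test_django_prometheus.py):
#   PowersOf(2, 4)                      == [0, 1, 2, 4, 8]
#   PowersOf(2, 4, include_zero=False)  == [1, 2, 4, 8]
#   PowersOf(3, 5, lower=1)             == [0, 3, 9, 27, 81, 243]
# Such lists are typically used as histogram bucket boundaries.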
django-prometheus-2.0.0/documentation/ 0000775 0000000 0000000 00000000000 13611433254 0020017 5 ustar 00root root 0000000 0000000 django-prometheus-2.0.0/documentation/exports.md 0000664 0000000 0000000 00000011101 13611433254 0022037 0 ustar 00root root 0000000 0000000 # Exports
## Default: exporting /metrics as a Django view
/metrics can be exported as a Django view very easily. Simply
include('django_prometheus.urls') with no prefix like so:
```python
urlpatterns = [
...
url('', include('django_prometheus.urls')),
]
```
This will reserve the /metrics path on your server. If that is a
problem for you, you can use a prefix instead. For instance, the following
will export the metrics at `/monitoring/metrics` instead. You will
need to configure Prometheus to use that path instead of the default.
```python
urlpatterns = [
...
url('^monitoring/', include('django_prometheus.urls')),
]
```
## Exporting /metrics in a dedicated thread
To ensure that issues in your Django app do not affect the monitoring,
it is recommended to export /metrics from an HTTPServer running in a
daemon thread. That way, problems such as thread starvation or
low-level bugs in Django cannot affect the export of your metrics,
which may be needed more than ever when such problems occur.
It can be enabled by adding the following lines to your `settings.py`:
```python
PROMETHEUS_METRICS_EXPORT_PORT = 8001
PROMETHEUS_METRICS_EXPORT_ADDRESS = '' # all addresses
```
However, by default this mechanism is disabled, because it is not
compatible with Django's autoreloader. The autoreloader is the feature
that allows you to edit your code and see the changes
immediately. This works by forking multiple processes of Django, which
would compete for the port. As such, this code will assert-fail if the
autoreloader is active.
You can run Django without the autoreloader by passing `--noreload` to
`manage.py`. If you decide to enable the thread-based exporter in
production, you may wish to modify your manage.py to ensure that this
option is always active:
```python
execute_from_command_line(sys.argv + ['--noreload'])
```
## Exporting /metrics in a WSGI application with multiple processes per box
If you're running Django under WSGI (e.g. with uwsgi or gunicorn) with multiple
worker processes, neither option above works: requests served by the Django view
hit whichever worker happens to handle them, so the reported values are
inconsistent from one scrape to the next, and the workers cannot all export on a
single dedicated port.
The following settings can be used instead:
```python
PROMETHEUS_METRICS_EXPORT_PORT_RANGE = range(8001, 8050)
```
This will make Django-Prometheus try to export /metrics on port
8001. If this fails (i.e. the port is in use), it will try 8002, then
8003, etc.
You can then configure Prometheus to collect metrics on as many
targets as you have workers, using each port separately.
## Exporting /metrics in a WSGI application with multiple processes globally
In some WSGI applications, workers are short-lived (less than a minute), so some
are never scraped by Prometheus at all with the approaches above. The Prometheus
client already provides a mechanism to aggregate metrics across processes: the
environment variable `prometheus_multiproc_dir`, which configures the directory
where each process stores its metrics as files.
Configuration in uwsgi would look like:
```ini
env = prometheus_multiproc_dir=/path/to/django_metrics
```
You can also set this environment variable elsewhere, such as in a Kubernetes
manifest. Note that the environment variable name is lower-case.
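For reference, here is a minimal sketch of how the per-process files are merged at
scrape time, assuming you serve the metrics yourself (the bundled `/metrics` view is
expected to do the equivalent automatically when the variable is set, and the
`metrics_payload` helper below is just a made-up name for illustration):

```python
# Illustration only: aggregate per-process metric files on each scrape.
from prometheus_client import CollectorRegistry, generate_latest, multiprocess


def metrics_payload():
    # A fresh registry per scrape; MultiProcessCollector reads the files
    # written to prometheus_multiproc_dir by every worker and merges them.
    registry = CollectorRegistry()
    multiprocess.MultiProcessCollector(registry)
    return generate_latest(registry)
```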
Setting `prometheus_multiproc_dir` will create four files (one for counters, one
for summaries, etc.) for each pid used. In uwsgi, the number of distinct pids can
grow quite large (the pid changes every time a worker respawns). To avoid ending
up with thousands of files, it is possible to name the files after worker ids
rather than pids.
To do this, change the function the Prometheus client uses to identify the
current process so that it returns the uwsgi worker id. Modify this in your
settings before any metrics are created:
```python
try:
import prometheus_client
import uwsgi
prometheus_client.values.ValueClass = prometheus_client.values.MultiProcessValue(
_pidFunc=uwsgi.worker_id)
except ImportError:
pass # not running in uwsgi
```
Note that this code uses internal interfaces of prometheus_client.
The underlying implementation may change.
The number of resulting files will be `number of processes * 4`
(counter, histogram, gauge, summary).
Be aware that by default this will consume a large number of file descriptors:
each worker keeps 3 file descriptors open for each file it creates.
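For example (illustrative numbers only): with 16 workers you would end up with
16 × 4 = 64 metric files and roughly 16 × 4 × 3 = 192 open file descriptors.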
Since these files will be written often, you should consider mounting this directory
as a `tmpfs` or using a subdir of an existing one such as `/run/` or `/var/run/`.
If uwsgi is not using lazy-apps (`lazy-apps = true`), there will be a file
descriptor leak (tens to hundreds of fds on a single file) due to the way uwsgi
forks processes to create workers.
django-prometheus-2.0.0/examples/django-promdash.png (binary PNG screenshot; image data omitted)