graypy-2.1.0/0000775000372000037200000000000013544502033013666 5ustar travistravis00000000000000graypy-2.1.0/MANIFEST.in0000664000372000037200000000014013544501373015425 0ustar travistravis00000000000000include LICENSE include README.rst recursive-include tests *.py recursive-include tests/config *graypy-2.1.0/graypy.egg-info/0000775000372000037200000000000013544502033016673 5ustar travistravis00000000000000graypy-2.1.0/graypy.egg-info/PKG-INFO0000664000372000037200000002521613544502033017776 0ustar travistravis00000000000000Metadata-Version: 2.1 Name: graypy Version: 2.1.0 Summary: Python logging handlers that send messages in the Graylog Extended Log Format (GELF). Home-page: https://github.com/severb/graypy Author: Sever Banesiu Author-email: banesiu.sever@gmail.com License: BSD License Description: ###### graypy ###### .. image:: https://img.shields.io/pypi/v/graypy.svg :target: https://pypi.python.org/pypi/graypy :alt: PyPI Status .. image:: https://travis-ci.org/severb/graypy.svg?branch=master :target: https://travis-ci.org/severb/graypy :alt: Build Status .. image:: https://readthedocs.org/projects/graypy/badge/?version=stable :target: https://graypy.readthedocs.io/en/stable/?badge=stable :alt: Documentation Status .. image:: https://codecov.io/gh/severb/graypy/branch/master/graph/badge.svg :target: https://codecov.io/gh/severb/graypy :alt: Coverage Status Description =========== Python logging handlers that send log messages in the Graylog Extended Log Format (GELF_). graypy supports sending GELF logs to both Graylog2 and Graylog3 servers. Installing ========== Using pip --------- Install the basic graypy python logging handlers: .. code-block:: console pip install graypy Install with requirements for ``GELFRabbitHandler``: .. code-block:: console pip install graypy[amqp] Using easy_install ------------------ Install the basic graypy python logging handlers: .. code-block:: console easy_install graypy Install with requirements for ``GELFRabbitHandler``: .. code-block:: console easy_install graypy[amqp] Usage ===== graypy sends GELF logs to a Graylog server via subclasses of the python `logging.Handler`_ class. Below is the list of ready to run GELF logging handlers defined by graypy: * ``GELFUDPHandler`` - UDP log forwarding * ``GELFTCPHandler`` - TCP log forwarding * ``GELFTLSHandler`` - TCP log forwarding with TLS support * ``GELFHTTPHandler`` - HTTP log forwarding * ``GELFRabbitHandler`` - RabbitMQ log forwarding UDP Logging ----------- UDP Log forwarding to a locally hosted Graylog server can be easily done with the ``GELFUDPHandler``: .. code-block:: python import logging import graypy my_logger = logging.getLogger('test_logger') my_logger.setLevel(logging.DEBUG) handler = graypy.GELFUDPHandler('localhost', 12201) my_logger.addHandler(handler) my_logger.debug('Hello Graylog.') UDP GELF Chunkers ^^^^^^^^^^^^^^^^^ `GELF UDP Chunking`_ is supported by the ``GELFUDPHandler`` and is defined by the ``gelf_chunker`` argument within its constructor. By default the ``GELFWarningChunker`` is used, thus, GELF messages that chunk overflow (i.e. consisting of more than 128 chunks) will issue a ``GELFChunkOverflowWarning`` and **will be dropped**. Other ``gelf_chunker`` options are also available: * ``BaseGELFChunker`` silently drops GELF messages that chunk overflow * ``GELFTruncatingChunker`` issues a ``GELFChunkOverflowWarning`` and simplifies and truncates GELF messages that chunk overflow in a attempt to send some content to Graylog. 
If this process fails to prevent another chunk overflow a ``GELFTruncationFailureWarning`` is issued. RabbitMQ Logging ---------------- Alternately, use ``GELFRabbitHandler`` to send messages to RabbitMQ and configure your Graylog server to consume messages via AMQP. This prevents log messages from being lost due to dropped UDP packets (``GELFUDPHandler`` sends messages to Graylog using UDP). You will need to configure RabbitMQ with a ``gelf_log`` queue and bind it to the ``logging.gelf`` exchange so messages are properly routed to a queue that can be consumed by Graylog (the queue and exchange names may be customized to your liking). .. code-block:: python import logging import graypy my_logger = logging.getLogger('test_logger') my_logger.setLevel(logging.DEBUG) handler = graypy.GELFRabbitHandler('amqp://guest:guest@localhost/', exchange='logging.gelf') my_logger.addHandler(handler) my_logger.debug('Hello Graylog.') Django Logging -------------- It's easy to integrate ``graypy`` with Django's logging settings. Just add a new handler in your ``settings.py``: .. code-block:: python LOGGING = { 'version': 1, # other dictConfig keys here... 'handlers': { 'graypy': { 'level': 'WARNING', 'class': 'graypy.GELFUDPHandler', 'host': 'localhost', 'port': 12201, }, }, 'loggers': { 'django.request': { 'handlers': ['graypy'], 'level': 'ERROR', 'propagate': True, }, }, } Traceback Logging ----------------- By default log captured exception tracebacks are added to the GELF log as ``full_message`` fields: .. code-block:: python import logging import graypy my_logger = logging.getLogger('test_logger') my_logger.setLevel(logging.DEBUG) handler = graypy.GELFUDPHandler('localhost', 12201) my_logger.addHandler(handler) try: puff_the_magic_dragon() except NameError: my_logger.debug('No dragons here.', exc_info=1) Default Logging Fields ---------------------- By default a number of debugging logging fields are automatically added to the GELF log if available: * function * pid * process_name * thread_name You can disable automatically adding these debugging logging fields by specifying ``debugging_fields=False`` in the handler's constructor: .. code-block:: python handler = graypy.GELFUDPHandler('localhost', 12201, debugging_fields=False) Adding Custom Logging Fields ---------------------------- graypy also supports including custom fields in the GELF logs sent to Graylog. This can be done by using Python's LoggerAdapter_ and Filter_ classes. Using LoggerAdapter ^^^^^^^^^^^^^^^^^^^ LoggerAdapter_ makes it easy to add static information to your GELF log messages: .. code-block:: python import logging import graypy my_logger = logging.getLogger('test_logger') my_logger.setLevel(logging.DEBUG) handler = graypy.GELFUDPHandler('localhost', 12201) my_logger.addHandler(handler) my_adapter = logging.LoggerAdapter(logging.getLogger('test_logger'), {'username': 'John'}) my_adapter.debug('Hello Graylog from John.') Using Filter ^^^^^^^^^^^^ Filter_ gives more flexibility and allows for dynamic information to be added to your GELF logs: .. code-block:: python import logging import graypy class UsernameFilter(logging.Filter): def __init__(self): # In an actual use case would dynamically get this # (e.g. 
from memcache) self.username = 'John' def filter(self, record): record.username = self.username return True my_logger = logging.getLogger('test_logger') my_logger.setLevel(logging.DEBUG) handler = graypy.GELFUDPHandler('localhost', 12201) my_logger.addHandler(handler) my_logger.addFilter(UsernameFilter()) my_logger.debug('Hello Graylog from John.') Contributors ============ * Sever Banesiu * Daniel Miller * Tushar Makkar * Nathan Klapstein .. _GELF: https://docs.graylog.org/en/latest/pages/gelf.html .. _logging.Handler: https://docs.python.org/3/library/logging.html#logging.Handler .. _GELF UDP Chunking: https://docs.graylog.org/en/latest/pages/gelf.html#chunking .. _LoggerAdapter: https://docs.python.org/howto/logging-cookbook.html#using-loggeradapters-to-impart-contextual-information .. _Filter: https://docs.python.org/howto/logging-cookbook.html#using-filters-to-impart-contextual-information Keywords: logging gelf graylog2 graylog udp amqp Platform: UNKNOWN Classifier: License :: OSI Approved :: BSD License Classifier: Intended Audience :: Developers Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 2 Classifier: Programming Language :: Python :: 2.7 Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.2 Classifier: Programming Language :: Python :: 3.3 Classifier: Programming Language :: Python :: 3.4 Classifier: Programming Language :: Python :: 3.5 Classifier: Programming Language :: Python :: 3.6 Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: Implementation :: CPython Classifier: Programming Language :: Python :: Implementation :: PyPy Classifier: Topic :: System :: Logging Description-Content-Type: text/x-rst Provides-Extra: docs Provides-Extra: amqp graypy-2.1.0/graypy.egg-info/dependency_links.txt0000664000372000037200000000000113544502033022741 0ustar travistravis00000000000000 graypy-2.1.0/graypy.egg-info/top_level.txt0000664000372000037200000000001513544502033021421 0ustar travistravis00000000000000graypy tests graypy-2.1.0/graypy.egg-info/not-zip-safe0000664000372000037200000000000113544502033021121 0ustar travistravis00000000000000 graypy-2.1.0/graypy.egg-info/SOURCES.txt0000664000372000037200000000163313544502033020562 0ustar travistravis00000000000000LICENSE MANIFEST.in README.rst setup.cfg setup.py graypy/__init__.py graypy/handler.py graypy/rabbitmq.py graypy.egg-info/PKG-INFO graypy.egg-info/SOURCES.txt graypy.egg-info/dependency_links.txt graypy.egg-info/not-zip-safe graypy.egg-info/requires.txt graypy.egg-info/top_level.txt tests/__init__.py tests/helper.py tests/config/create_ssl_certs.sh tests/config/docker-compose.yml tests/config/inputs.json tests/config/start_local_graylog_server.sh tests/config/stop_local_graylog_server.sh tests/integration/__init__.py tests/integration/helper.py tests/integration/test_chunked_logging.py tests/integration/test_common_logging.py tests/integration/test_debugging_fields.py tests/integration/test_extra_fields.py tests/integration/test_status_issue.py tests/unit/__init__.py tests/unit/helper.py tests/unit/test_ExcludeFilter.py tests/unit/test_GELFRabbitHandler.py tests/unit/test_chunking.py tests/unit/test_handler.pygraypy-2.1.0/graypy.egg-info/requires.txt0000664000372000037200000000017213544502033021273 0ustar travistravis00000000000000 [amqp] amqplib==1.0.2 [docs] sphinx<3.0.0,>=2.1.2 sphinx_rtd_theme<1.0.0,>=0.4.3 sphinx-autodoc-typehints<2.0.0,>=1.6.0 
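The ``[amqp]`` and ``[docs]`` sections of ``requires.txt`` above are setuptools extras (they mirror ``extras_require`` in ``setup.py``). A minimal sketch of installing them with pip — the quoting is only needed on shells that expand square brackets:

.. code-block:: console

    pip install graypy[amqp]           # adds amqplib, needed by GELFRabbitHandler
    pip install "graypy[amqp,docs]"    # also pulls in the Sphinx toolchain for building the docs
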
graypy-2.1.0/PKG-INFO0000664000372000037200000002521613544502033014771 0ustar travistravis00000000000000Metadata-Version: 2.1 Name: graypy Version: 2.1.0 Summary: Python logging handlers that send messages in the Graylog Extended Log Format (GELF). Home-page: https://github.com/severb/graypy Author: Sever Banesiu Author-email: banesiu.sever@gmail.com License: BSD License Description: ###### graypy ###### .. image:: https://img.shields.io/pypi/v/graypy.svg :target: https://pypi.python.org/pypi/graypy :alt: PyPI Status .. image:: https://travis-ci.org/severb/graypy.svg?branch=master :target: https://travis-ci.org/severb/graypy :alt: Build Status .. image:: https://readthedocs.org/projects/graypy/badge/?version=stable :target: https://graypy.readthedocs.io/en/stable/?badge=stable :alt: Documentation Status .. image:: https://codecov.io/gh/severb/graypy/branch/master/graph/badge.svg :target: https://codecov.io/gh/severb/graypy :alt: Coverage Status Description =========== Python logging handlers that send log messages in the Graylog Extended Log Format (GELF_). graypy supports sending GELF logs to both Graylog2 and Graylog3 servers. Installing ========== Using pip --------- Install the basic graypy python logging handlers: .. code-block:: console pip install graypy Install with requirements for ``GELFRabbitHandler``: .. code-block:: console pip install graypy[amqp] Using easy_install ------------------ Install the basic graypy python logging handlers: .. code-block:: console easy_install graypy Install with requirements for ``GELFRabbitHandler``: .. code-block:: console easy_install graypy[amqp] Usage ===== graypy sends GELF logs to a Graylog server via subclasses of the python `logging.Handler`_ class. Below is the list of ready to run GELF logging handlers defined by graypy: * ``GELFUDPHandler`` - UDP log forwarding * ``GELFTCPHandler`` - TCP log forwarding * ``GELFTLSHandler`` - TCP log forwarding with TLS support * ``GELFHTTPHandler`` - HTTP log forwarding * ``GELFRabbitHandler`` - RabbitMQ log forwarding UDP Logging ----------- UDP Log forwarding to a locally hosted Graylog server can be easily done with the ``GELFUDPHandler``: .. code-block:: python import logging import graypy my_logger = logging.getLogger('test_logger') my_logger.setLevel(logging.DEBUG) handler = graypy.GELFUDPHandler('localhost', 12201) my_logger.addHandler(handler) my_logger.debug('Hello Graylog.') UDP GELF Chunkers ^^^^^^^^^^^^^^^^^ `GELF UDP Chunking`_ is supported by the ``GELFUDPHandler`` and is defined by the ``gelf_chunker`` argument within its constructor. By default the ``GELFWarningChunker`` is used, thus, GELF messages that chunk overflow (i.e. consisting of more than 128 chunks) will issue a ``GELFChunkOverflowWarning`` and **will be dropped**. Other ``gelf_chunker`` options are also available: * ``BaseGELFChunker`` silently drops GELF messages that chunk overflow * ``GELFTruncatingChunker`` issues a ``GELFChunkOverflowWarning`` and simplifies and truncates GELF messages that chunk overflow in a attempt to send some content to Graylog. If this process fails to prevent another chunk overflow a ``GELFTruncationFailureWarning`` is issued. RabbitMQ Logging ---------------- Alternately, use ``GELFRabbitHandler`` to send messages to RabbitMQ and configure your Graylog server to consume messages via AMQP. This prevents log messages from being lost due to dropped UDP packets (``GELFUDPHandler`` sends messages to Graylog using UDP). 
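As the next paragraph explains, the broker has to be prepared with a ``gelf_log`` queue bound to the ``logging.gelf`` exchange before Graylog can consume anything. A rough sketch of one way to create these, assuming the RabbitMQ management plugin and its ``rabbitmqadmin`` CLI are available (the queue and exchange names match the defaults used below):

.. code-block:: console

    rabbitmqadmin declare exchange name=logging.gelf type=fanout durable=true
    rabbitmqadmin declare queue name=gelf_log durable=true
    rabbitmqadmin declare binding source=logging.gelf destination=gelf_log
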
You will need to configure RabbitMQ with a ``gelf_log`` queue and bind it to the ``logging.gelf`` exchange so messages are properly routed to a queue that can be consumed by Graylog (the queue and exchange names may be customized to your liking). .. code-block:: python import logging import graypy my_logger = logging.getLogger('test_logger') my_logger.setLevel(logging.DEBUG) handler = graypy.GELFRabbitHandler('amqp://guest:guest@localhost/', exchange='logging.gelf') my_logger.addHandler(handler) my_logger.debug('Hello Graylog.') Django Logging -------------- It's easy to integrate ``graypy`` with Django's logging settings. Just add a new handler in your ``settings.py``: .. code-block:: python LOGGING = { 'version': 1, # other dictConfig keys here... 'handlers': { 'graypy': { 'level': 'WARNING', 'class': 'graypy.GELFUDPHandler', 'host': 'localhost', 'port': 12201, }, }, 'loggers': { 'django.request': { 'handlers': ['graypy'], 'level': 'ERROR', 'propagate': True, }, }, } Traceback Logging ----------------- By default log captured exception tracebacks are added to the GELF log as ``full_message`` fields: .. code-block:: python import logging import graypy my_logger = logging.getLogger('test_logger') my_logger.setLevel(logging.DEBUG) handler = graypy.GELFUDPHandler('localhost', 12201) my_logger.addHandler(handler) try: puff_the_magic_dragon() except NameError: my_logger.debug('No dragons here.', exc_info=1) Default Logging Fields ---------------------- By default a number of debugging logging fields are automatically added to the GELF log if available: * function * pid * process_name * thread_name You can disable automatically adding these debugging logging fields by specifying ``debugging_fields=False`` in the handler's constructor: .. code-block:: python handler = graypy.GELFUDPHandler('localhost', 12201, debugging_fields=False) Adding Custom Logging Fields ---------------------------- graypy also supports including custom fields in the GELF logs sent to Graylog. This can be done by using Python's LoggerAdapter_ and Filter_ classes. Using LoggerAdapter ^^^^^^^^^^^^^^^^^^^ LoggerAdapter_ makes it easy to add static information to your GELF log messages: .. code-block:: python import logging import graypy my_logger = logging.getLogger('test_logger') my_logger.setLevel(logging.DEBUG) handler = graypy.GELFUDPHandler('localhost', 12201) my_logger.addHandler(handler) my_adapter = logging.LoggerAdapter(logging.getLogger('test_logger'), {'username': 'John'}) my_adapter.debug('Hello Graylog from John.') Using Filter ^^^^^^^^^^^^ Filter_ gives more flexibility and allows for dynamic information to be added to your GELF logs: .. code-block:: python import logging import graypy class UsernameFilter(logging.Filter): def __init__(self): # In an actual use case would dynamically get this # (e.g. from memcache) self.username = 'John' def filter(self, record): record.username = self.username return True my_logger = logging.getLogger('test_logger') my_logger.setLevel(logging.DEBUG) handler = graypy.GELFUDPHandler('localhost', 12201) my_logger.addHandler(handler) my_logger.addFilter(UsernameFilter()) my_logger.debug('Hello Graylog from John.') Contributors ============ * Sever Banesiu * Daniel Miller * Tushar Makkar * Nathan Klapstein .. _GELF: https://docs.graylog.org/en/latest/pages/gelf.html .. _logging.Handler: https://docs.python.org/3/library/logging.html#logging.Handler .. _GELF UDP Chunking: https://docs.graylog.org/en/latest/pages/gelf.html#chunking .. 
_LoggerAdapter: https://docs.python.org/howto/logging-cookbook.html#using-loggeradapters-to-impart-contextual-information .. _Filter: https://docs.python.org/howto/logging-cookbook.html#using-filters-to-impart-contextual-information Keywords: logging gelf graylog2 graylog udp amqp Platform: UNKNOWN Classifier: License :: OSI Approved :: BSD License Classifier: Intended Audience :: Developers Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 2 Classifier: Programming Language :: Python :: 2.7 Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.2 Classifier: Programming Language :: Python :: 3.3 Classifier: Programming Language :: Python :: 3.4 Classifier: Programming Language :: Python :: 3.5 Classifier: Programming Language :: Python :: 3.6 Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: Implementation :: CPython Classifier: Programming Language :: Python :: Implementation :: PyPy Classifier: Topic :: System :: Logging Description-Content-Type: text/x-rst Provides-Extra: docs Provides-Extra: amqp graypy-2.1.0/graypy/0000775000372000037200000000000013544502033015201 5ustar travistravis00000000000000graypy-2.1.0/graypy/rabbitmq.py0000664000372000037200000001040313544501373017360 0ustar travistravis00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- """Logging Handler integrating RabbitMQ and Graylog Extended Log Format (GELF)""" import json from logging import Filter from logging.handlers import SocketHandler from amqplib import client_0_8 as amqp # pylint: disable=import-error from graypy.handler import BaseGELFHandler try: from urllib.parse import urlparse, unquote except ImportError: from urlparse import urlparse from urllib import unquote _ifnone = lambda v, x: x if v is None else v class GELFRabbitHandler(BaseGELFHandler, SocketHandler): """RabbitMQ / GELF handler .. note:: This handler ignores all messages logged by amqplib. """ def __init__( self, url, exchange="logging.gelf", exchange_type="fanout", virtual_host="/", routing_key="", **kwargs ): """Initialize the GELFRabbitHandler :param url: RabbitMQ URL (ex: amqp://guest:guest@localhost:5672/) :type url: str :param exchange: RabbitMQ exchange. A queue binding must be defined on the server to prevent GELF logs from being dropped. :type exchange: str :param exchange_type: RabbitMQ exchange type. 
:type exchange_type: str :param virtual_host: :type virtual_host: str :param routing_key: :type routing_key: str """ self.url = url parsed = urlparse(url) if parsed.scheme != "amqp": raise ValueError('invalid URL scheme (expected "amqp"): %s' % url) host = parsed.hostname or "localhost" port = _ifnone(parsed.port, 5672) self.virtual_host = ( virtual_host if not unquote(parsed.path[1:]) else unquote(parsed.path[1:]) ) self.cn_args = { "host": "%s:%s" % (host, port), "userid": _ifnone(parsed.username, "guest"), "password": _ifnone(parsed.password, "guest"), "virtual_host": self.virtual_host, "insist": False, } self.exchange = exchange self.exchange_type = exchange_type self.routing_key = routing_key BaseGELFHandler.__init__(self, **kwargs) SocketHandler.__init__(self, host, port) self.addFilter(ExcludeFilter("amqplib")) def makeSocket(self, timeout=1): return RabbitSocket( self.cn_args, timeout, self.exchange, self.exchange_type, self.routing_key ) def makePickle(self, record): message_dict = self._make_gelf_dict(record) return json.dumps(message_dict) class RabbitSocket(object): def __init__(self, cn_args, timeout, exchange, exchange_type, routing_key): self.cn_args = cn_args self.timeout = timeout self.exchange = exchange self.exchange_type = exchange_type self.routing_key = routing_key self.connection = amqp.Connection(connection_timeout=timeout, **self.cn_args) self.channel = self.connection.channel() self.channel.exchange_declare( exchange=self.exchange, type=self.exchange_type, durable=True, auto_delete=False, ) def sendall(self, data): msg = amqp.Message(data, delivery_mode=2) self.channel.basic_publish( msg, exchange=self.exchange, routing_key=self.routing_key ) def close(self): """Close the connection to the RabbitMQ socket""" try: self.connection.close() except Exception: pass class ExcludeFilter(Filter): """A subclass of :class:`logging.Filter` which should be instantiated with the name of the logger which, together with its children, will have its events excluded (filtered out)""" def __init__(self, name): """Initialize the ExcludeFilter :param name: Name to match for within a :class:`logging.LogRecord`'s ``name`` field for filtering. 
:type name: str """ if not name: raise ValueError("ExcludeFilter requires a non-empty name") Filter.__init__(self, name) def filter(self, record): return not ( record.name.startswith(self.name) and (len(record.name) == self.nlen or record.name[self.nlen] == ".") ) graypy-2.1.0/graypy/handler.py0000664000372000037200000006266013544501373017210 0ustar travistravis00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- """Logging Handlers that send messages in Graylog Extended Log Format (GELF)""" import warnings import abc import datetime import json import logging import math import random import socket import ssl import struct import sys import traceback import zlib from logging.handlers import DatagramHandler, SocketHandler WAN_CHUNK = 1420 LAN_CHUNK = 8154 if sys.version_info[0] == 3: # check if python3+ data, text = bytes, str else: data, text = str, unicode # pylint: disable=undefined-variable # fixes for using ABC if sys.version_info >= (3, 4): # check if python3.4+ ABC = abc.ABC else: ABC = abc.ABCMeta(str("ABC"), (), {}) try: import httplib except ImportError: import http.client as httplib SYSLOG_LEVELS = { logging.CRITICAL: 2, logging.ERROR: 3, logging.WARNING: 4, logging.INFO: 6, logging.DEBUG: 7, } GELF_MAX_CHUNK_NUMBER = 128 class BaseGELFHandler(logging.Handler, ABC): """Abstract class defining the basic functionality of converting a :obj:`logging.LogRecord` into a GELF log. Provides the boilerplate for all GELF handlers defined within graypy.""" def __init__( self, debugging_fields=True, extra_fields=True, fqdn=False, localname=None, facility=None, level_names=False, compress=True, ): """Initialize the BaseGELFHandler :param debugging_fields: If :obj:`True` add debug fields from the log record into the GELF logs to be sent to Graylog. :type debugging_fields: bool :param extra_fields: If :obj:`True` add extra fields from the log record into the GELF logs to be sent to Graylog. :type extra_fields: bool :param fqdn: If :obj:`True` use the fully qualified domain name of localhost to populate the ``host`` GELF field. :type fqdn: bool :param localname: If specified and ``fqdn`` is :obj:`False`, use the specified hostname to populate the ``host`` GELF field. :type localname: str or None :param facility: If specified, replace the ``facility`` GELF field with the specified value. Also add a additional ``_logger`` GELF field containing the ``LogRecord.name``. :type facility: str :param level_names: If :obj:`True` use python logging error level name strings instead of syslog numerical values. :type level_names: bool :param compress: If :obj:`True` compress the GELF message before sending it to the Graylog server. :type compress: bool """ logging.Handler.__init__(self) self.debugging_fields = debugging_fields self.extra_fields = extra_fields if fqdn and localname: raise ValueError("cannot specify 'fqdn' and 'localname' arguments together") self.fqdn = fqdn self.localname = localname self.facility = facility self.level_names = level_names self.compress = compress def makePickle(self, record): """Convert a :class:`logging.LogRecord` into bytes representing a GELF log :param record: :class:`logging.LogRecord` to convert into a GELF log. :type record: logging.LogRecord :return: bytes representing a GELF log. 
:rtype: bytes """ gelf_dict = self._make_gelf_dict(record) packed = self._pack_gelf_dict(gelf_dict) pickle = zlib.compress(packed) if self.compress else packed return pickle def _make_gelf_dict(self, record): """Create a dictionary representing a GELF log from a python :class:`logging.LogRecord` :param record: :class:`logging.LogRecord` to create a GELF log from. :type record: logging.LogRecord :return: Dictionary representing a GELF log. :rtype: dict """ # construct the base GELF format gelf_dict = { "version": "1.0", "host": self._resolve_host(self.fqdn, self.localname), "short_message": self.formatter.format(record) if self.formatter else record.getMessage(), "timestamp": record.created, "level": SYSLOG_LEVELS.get(record.levelno, record.levelno), "facility": self.facility or record.name, } # add in specified optional extras self._add_full_message(gelf_dict, record) if self.level_names: self._add_level_names(gelf_dict, record) if self.facility is not None: self._set_custom_facility(gelf_dict, self.facility, record) if self.debugging_fields: self._add_debugging_fields(gelf_dict, record) if self.extra_fields: self._add_extra_fields(gelf_dict, record) return gelf_dict @staticmethod def _add_level_names(gelf_dict, record): """Add the ``level_name`` field to the ``gelf_dict`` which notes the logging level via the string error level names instead of numerical values :param gelf_dict: Dictionary representing a GELF log. :type gelf_dict: dict :param record: :class:`logging.LogRecord` to extract a logging level from to insert into the given ``gelf_dict``. :type record: logging.LogRecord """ gelf_dict["level_name"] = logging.getLevelName(record.levelno) @staticmethod def _set_custom_facility(gelf_dict, facility_value, record): """Set the ``gelf_dict``'s ``facility`` field to the specified value Also add a additional ``_logger`` field containing the ``LogRecord.name``. :param gelf_dict: Dictionary representing a GELF log. :type gelf_dict: dict :param facility_value: Value to set as the ``gelf_dict``'s ``facility`` field. :type facility_value: str :param record: :class:`logging.LogRecord` to extract it's record name to insert into the given ``gelf_dict`` as the ``_logger`` field. :type record: logging.LogRecord """ gelf_dict.update({"facility": facility_value, "_logger": record.name}) @staticmethod def _add_full_message(gelf_dict, record): """Add the ``full_message`` field to the ``gelf_dict`` if any traceback information exists within the logging record :param gelf_dict: Dictionary representing a GELF log. :type gelf_dict: dict :param record: :class:`logging.LogRecord` to extract a full logging message from to insert into the given ``gelf_dict``. :type record: logging.LogRecord """ # if a traceback exists add it to the log as the full_message field full_message = None # format exception information if present if record.exc_info: full_message = "\n".join(traceback.format_exception(*record.exc_info)) # use pre-formatted exception information in cases where the primary # exception information was removed, e.g. for LogRecord serialization if record.exc_text: full_message = record.exc_text if full_message: gelf_dict["full_message"] = full_message @staticmethod def _resolve_host(fqdn, localname): """Resolve the ``host`` GELF field :param fqdn: Boolean indicating whether to use :meth:`socket.getfqdn` to obtain the ``host`` GELF field. :type fqdn: bool :param localname: Use specified hostname as the ``host`` GELF field. :type localname: str or None :return: String representing the ``host`` GELF field. 
:rtype: str """ if fqdn: return socket.getfqdn() elif localname is not None: return localname return socket.gethostname() @staticmethod def _add_debugging_fields(gelf_dict, record): """Add debugging fields to the given ``gelf_dict`` :param gelf_dict: Dictionary representing a GELF log. :type gelf_dict: dict :param record: :class:`logging.LogRecord` to extract debugging fields from to insert into the given ``gelf_dict``. :type record: logging.LogRecord """ gelf_dict.update( { "file": record.pathname, "line": record.lineno, "_function": record.funcName, "_pid": record.process, "_thread_name": record.threadName, } ) # record.processName was added in Python 2.6.2 pn = getattr(record, "processName", None) if pn is not None: gelf_dict["_process_name"] = pn @staticmethod def _add_extra_fields(gelf_dict, record): """Add extra fields to the given ``gelf_dict`` However, this does not add additional fields in to ``message_dict`` that are either duplicated from standard :class:`logging.LogRecord` attributes, duplicated from the python logging module source (e.g. ``exc_text``), or violate GELF format (i.e. ``id``). .. seealso:: The list of standard :class:`logging.LogRecord` attributes can be found at: http://docs.python.org/library/logging.html#logrecord-attributes :param gelf_dict: Dictionary representing a GELF log. :type gelf_dict: dict :param record: :class:`logging.LogRecord` to extract extra fields from to insert into the given ``gelf_dict``. :type record: logging.LogRecord """ # skip_list is used to filter additional fields in a log message. skip_list = ( "args", "asctime", "created", "exc_info", "exc_text", "filename", "funcName", "id", "levelname", "levelno", "lineno", "module", "msecs", "message", "msg", "name", "pathname", "process", "processName", "relativeCreated", "thread", "threadName", ) for key, value in record.__dict__.items(): if key not in skip_list and not key.startswith("_"): gelf_dict["_%s" % key] = value @classmethod def _pack_gelf_dict(cls, gelf_dict): """Convert a given ``gelf_dict`` into JSON-encoded UTF-8 bytes, thus, creating an uncompressed GELF log ready for consumption by Graylog. Since we cannot be 100% sure of what is contained in the ``gelf_dict`` we have to do some sanitation. :param gelf_dict: Dictionary representing a GELF log. :type gelf_dict: dict :return: Bytes representing a uncompressed GELF log. :rtype: bytes """ gelf_dict = cls._sanitize_to_unicode(gelf_dict) packed = json.dumps(gelf_dict, separators=",:", default=cls._object_to_json) return packed.encode("utf-8") @classmethod def _sanitize_to_unicode(cls, obj): """Convert all strings records of the object to unicode :param obj: Object to sanitize to unicode. :type obj: object :return: Unicode string representing the given object. :rtype: str """ if isinstance(obj, dict): return dict( (cls._sanitize_to_unicode(k), cls._sanitize_to_unicode(v)) for k, v in obj.items() ) if isinstance(obj, (list, tuple)): return obj.__class__([cls._sanitize_to_unicode(i) for i in obj]) if isinstance(obj, data): obj = obj.decode("utf-8", errors="replace") return obj @staticmethod def _object_to_json(obj): """Convert objects that cannot be natively serialized into JSON into their string representation (for later JSON serialization). :class:`datetime.datetime` based objects will be converted into a ISO formatted timestamp string. :param obj: Object to convert into a string representation. :type obj: object :return: String representing the given object. 
:rtype: str """ if isinstance(obj, datetime.datetime): return obj.isoformat() return repr(obj) class BaseGELFChunker(object): """Base UDP GELF message chunker .. warning:: This will silently drop chunk overflowing GELF messages. (i.e. GELF messages that consist of more than 128 chunks) .. note:: UDP GELF message chunking is only supported for the :class:`.handler.GELFUDPHandler`. """ def __init__(self, chunk_size=WAN_CHUNK): """Initialize the BaseGELFChunker. :param chunk_size: Message chunk size. Messages larger than this size should be sent to Graylog in multiple chunks. :type chunk_size: int """ self.chunk_size = chunk_size def _message_chunk_number(self, message): """Get the number of chunks a GELF message requires :return: Number of chunks the specified GELF message requires. :rtype: int """ return int(math.ceil(len(message) * 1.0 / self.chunk_size)) @staticmethod def _encode(message_id, chunk_seq, total_chunks, chunk): return b"".join( [ b"\x1e\x0f", struct.pack("Q", message_id), struct.pack("B", chunk_seq), struct.pack("B", total_chunks), chunk, ] ) def _gen_gelf_chunks(self, message): """Generate and iter chunks for a GELF message :param message: GELF message to generate and iter chunks for. :type; bytes :return: Iterator of the chunks of a GELF message. :rtype: Iterator[bytes] """ total_chunks = self._message_chunk_number(message) message_id = random.randint(0, 0xFFFFFFFFFFFFFFFF) for sequence, chunk in enumerate( ( message[i : i + self.chunk_size] for i in range(0, len(message), self.chunk_size) ) ): yield self._encode(message_id, sequence, total_chunks, chunk) def chunk_message(self, message): """Chunk a GELF message Silently drop chunk overflowing GELF messages. :param message: GELF message to chunk. :type message: bytes :return: Iterator of the chunks of a GELF message. :rtype: Iterator[bytes], None """ if self._message_chunk_number(message) > GELF_MAX_CHUNK_NUMBER: return for chunk in self._gen_gelf_chunks(message): yield chunk class GELFChunkOverflowWarning(Warning): """Warning that a chunked GELF UDP message requires more than 128 chunks""" class GELFWarningChunker(BaseGELFChunker): """GELF UDP message chunker that warns and drops overflowing messages""" def chunk_message(self, message): """Chunk a GELF message Issue a :class:`.handler.GELFChunkOverflowWarning` on chunk overflowing GELF messages. Then drop them. """ if self._message_chunk_number(message) > GELF_MAX_CHUNK_NUMBER: warnings.warn( "chunk overflowing GELF message: {}".format(message), GELFChunkOverflowWarning, ) return for chunk in self._gen_gelf_chunks(message): yield chunk class GELFTruncationFailureWarning(GELFChunkOverflowWarning): """Warning that the truncation of a chunked GELF UDP message failed to prevent chunk overflowing""" class GELFTruncatingChunker(BaseGELFChunker): """GELF UDP message chunker that truncates overflowing messages""" def __init__( self, chunk_size=WAN_CHUNK, compress=True, gelf_packer=BaseGELFHandler._pack_gelf_dict, ): """Initialize the GELFTruncatingChunker :param compress: Boolean noting whether the given GELF messages are originally compressed. :type compress: bool :param gelf_packer: Function handle for packing a GELF dictionary into bytes. Should be of the form ``gelf_packer(gelf_dict)``. 
:type gelf_packer: Callable[dict] """ BaseGELFChunker.__init__(self, chunk_size) self.gelf_packer = gelf_packer self.compress = compress def gen_chunk_overflow_gelf_log(self, raw_message): """Attempt to truncate a chunk overflowing GELF message :param raw_message: Original bytes of a chunk overflowing GELF message. :type raw_message: bytes :return: Truncated and simplified version of raw_message. :rtype: bytes """ if self.compress: message = zlib.decompress(raw_message) else: message = raw_message gelf_dict = json.loads(message.decode("UTF-8")) # Simplified GELF message dictionary to base the truncated # GELF message from simplified_gelf_dict = { "version": gelf_dict["version"], "host": gelf_dict["host"], "short_message": "", "timestamp": gelf_dict["timestamp"], "level": SYSLOG_LEVELS.get(logging.ERROR, logging.ERROR), "facility": gelf_dict["facility"], "_chunk_overflow": True, } # compute a estimate of the number of message chunks left this is # used to estimate the amount of truncation to apply gelf_chunks_free = GELF_MAX_CHUNK_NUMBER - self._message_chunk_number( zlib.compress(self.gelf_packer(simplified_gelf_dict)) if self.compress else self.gelf_packer(simplified_gelf_dict) ) truncated_short_message = gelf_dict["short_message"][ : self.chunk_size * gelf_chunks_free ] for clip in range(gelf_chunks_free, -1, -1): simplified_gelf_dict["short_message"] = truncated_short_message packed_message = self.gelf_packer(simplified_gelf_dict) if self.compress: packed_message = zlib.compress(packed_message) if self._message_chunk_number(packed_message) <= GELF_MAX_CHUNK_NUMBER: return packed_message else: truncated_short_message = truncated_short_message[: -self.chunk_size] else: raise GELFTruncationFailureWarning( "truncation failed preventing chunk overflowing for GELF message: {}".format( raw_message ) ) def chunk_message(self, message): """Chunk a GELF message Issue a :class:`.handler.GELFChunkOverflowWarning` on chunk overflowing GELF messages. Then attempt to truncate and simplify the chunk overflowing GELF message so that it may be successfully chunked without overflowing. If the truncation and simplification of the chunk overflowing GELF message fails issue a :class:`.handler.GELFTruncationFailureWarning` and drop the overflowing GELF message. """ if self._message_chunk_number(message) > GELF_MAX_CHUNK_NUMBER: warnings.warn( "truncating GELF chunk overflowing message: {}".format(message), GELFChunkOverflowWarning, ) try: message = self.gen_chunk_overflow_gelf_log(message) except GELFTruncationFailureWarning as w: warnings.warn(w) return for chunk in self._gen_gelf_chunks(message): yield chunk class GELFUDPHandler(BaseGELFHandler, DatagramHandler): """GELF UDP handler""" def __init__(self, host, port=12202, gelf_chunker=GELFWarningChunker(), **kwargs): """Initialize the GELFUDPHandler .. note:: By default a :class:`.handler.GELFWarningChunker` is used as the ``gelf_chunker``. Thus, GELF messages that chunk overflow will issue a :class:`.handler.GELFChunkOverflowWarning` and will be dropped. :param host: GELF UDP input host. :type host: str :param port: GELF UDP input port. :type port: int :param gelf_chunker: :class:`.handler.BaseGELFChunker` instance to handle chunking larger GELF messages. 
:type gelf_chunker: GELFWarningChunker """ BaseGELFHandler.__init__(self, **kwargs) DatagramHandler.__init__(self, host, port) self.gelf_chunker = gelf_chunker def send(self, s): if len(s) < self.gelf_chunker.chunk_size: super(GELFUDPHandler, self).send(s) else: for chunk in self.gelf_chunker.chunk_message(s): super(GELFUDPHandler, self).send(chunk) class GELFTCPHandler(BaseGELFHandler, SocketHandler): """GELF TCP handler""" def __init__(self, host, port=12201, **kwargs): """Initialize the GELFTCPHandler :param host: GELF TCP input host. :type host: str :param port: GELF TCP input port. :type port: int .. attention:: GELF TCP does not support compression due to the use of the null byte (``\\0``) as frame delimiter. Thus, :class:`.handler.GELFTCPHandler` does not support setting ``compress`` to :obj:`True` and is locked to :obj:`False`. """ BaseGELFHandler.__init__(self, compress=False, **kwargs) SocketHandler.__init__(self, host, port) def makePickle(self, record): """Add a null terminator to generated pickles as TCP frame objects need to be null terminated :param record: :class:`logging.LogRecord` to create a null terminated GELF log. :type record: logging.LogRecord :return: Null terminated bytes representing a GELF log. :rtype: bytes """ return super(GELFTCPHandler, self).makePickle(record) + b"\x00" class GELFTLSHandler(GELFTCPHandler): """GELF TCP handler with TLS support""" def __init__( self, host, port=12204, validate=False, ca_certs=None, certfile=None, keyfile=None, **kwargs ): """Initialize the GELFTLSHandler :param host: GELF TLS input host. :type host: str :param port: GELF TLS input port. :type port: int :param validate: If :obj:`True`, validate the Graylog server's certificate. In this case specifying ``ca_certs`` is also required. :type validate: bool :param ca_certs: Path to CA bundle file. :type ca_certs: str :param certfile: Path to the client certificate file. :type certfile: str :param keyfile: Path to the client private key. If the private key is stored with the certificate, this parameter can be ignored. :type keyfile: str """ if validate and ca_certs is None: raise ValueError("CA bundle file path must be specified") if keyfile is not None and certfile is None: raise ValueError("certfile must be specified") GELFTCPHandler.__init__(self, host=host, port=port, **kwargs) self.ca_certs = ca_certs self.reqs = ssl.CERT_REQUIRED if validate else ssl.CERT_NONE self.certfile = certfile self.keyfile = keyfile if keyfile else certfile def makeSocket(self, timeout=1): """Create a TLS wrapped socket""" plain_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM) if hasattr(plain_socket, "settimeout"): plain_socket.settimeout(timeout) wrapped_socket = ssl.wrap_socket( plain_socket, ca_certs=self.ca_certs, cert_reqs=self.reqs, keyfile=self.keyfile, certfile=self.certfile, ) wrapped_socket.connect((self.host, self.port)) return wrapped_socket # TODO: add https? class GELFHTTPHandler(BaseGELFHandler): """GELF HTTP handler""" def __init__( self, host, port=12203, compress=True, path="/gelf", timeout=5, **kwargs ): """Initialize the GELFHTTPHandler :param host: GELF HTTP input host. :type host: str :param port: GELF HTTP input port. :type port: int :param compress: If :obj:`True` compress the GELF message before sending it to the Graylog server. :type compress: bool :param path: Path of the HTTP input. 
(see http://docs.graylog.org/en/latest/pages/sending_data.html#gelf-via-http) :type path: str :param timeout: Number of seconds the HTTP client should wait before it discards the request if the Graylog server doesn't respond. :type timeout: int """ BaseGELFHandler.__init__(self, compress=compress, **kwargs) self.host = host self.port = port self.path = path self.timeout = timeout self.headers = {} if compress: self.headers["Content-Encoding"] = "gzip,deflate" def emit(self, record): """Convert a :class:`logging.LogRecord` to GELF and emit it to Graylog via a HTTP POST request :param record: :class:`logging.LogRecord` to convert into a GELF log and emit to Graylog via a HTTP POST request. :type record: logging.LogRecord """ pickle = self.makePickle(record) connection = httplib.HTTPConnection( host=self.host, port=self.port, timeout=self.timeout ) connection.request("POST", self.path, pickle, self.headers) graypy-2.1.0/graypy/__init__.py0000664000372000037200000000106513544501373017322 0ustar travistravis00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- """graypy Python logging handlers that send messages in the Graylog Extended Log Format (GELF). Modules: + :mod:`.handler` - Basic GELF Logging Handlers + :mod:`.rabbitmq` - RabbitMQ GELF Logging Handler """ from graypy.handler import ( GELFUDPHandler, GELFTCPHandler, GELFTLSHandler, GELFHTTPHandler, WAN_CHUNK, LAN_CHUNK, ) try: from graypy.rabbitmq import GELFRabbitHandler, ExcludeFilter except ImportError: pass # amqplib is probably not installed __version__ = (2, 1, 0) graypy-2.1.0/setup.cfg0000664000372000037200000000014613544502033015510 0ustar travistravis00000000000000[bdist_wheel] universal = 1 [metadata] license_file = LICENSE [egg_info] tag_build = tag_date = 0 graypy-2.1.0/LICENSE0000664000372000037200000000274413544501373014710 0ustar travistravis00000000000000Copyright (c) 2011, Sever Băneşiu All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. * Neither the name of the author nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. graypy-2.1.0/README.rst0000664000372000037200000001650413544501373015371 0ustar travistravis00000000000000###### graypy ###### .. 
image:: https://img.shields.io/pypi/v/graypy.svg :target: https://pypi.python.org/pypi/graypy :alt: PyPI Status .. image:: https://travis-ci.org/severb/graypy.svg?branch=master :target: https://travis-ci.org/severb/graypy :alt: Build Status .. image:: https://readthedocs.org/projects/graypy/badge/?version=stable :target: https://graypy.readthedocs.io/en/stable/?badge=stable :alt: Documentation Status .. image:: https://codecov.io/gh/severb/graypy/branch/master/graph/badge.svg :target: https://codecov.io/gh/severb/graypy :alt: Coverage Status Description =========== Python logging handlers that send log messages in the Graylog Extended Log Format (GELF_). graypy supports sending GELF logs to both Graylog2 and Graylog3 servers. Installing ========== Using pip --------- Install the basic graypy python logging handlers: .. code-block:: console pip install graypy Install with requirements for ``GELFRabbitHandler``: .. code-block:: console pip install graypy[amqp] Using easy_install ------------------ Install the basic graypy python logging handlers: .. code-block:: console easy_install graypy Install with requirements for ``GELFRabbitHandler``: .. code-block:: console easy_install graypy[amqp] Usage ===== graypy sends GELF logs to a Graylog server via subclasses of the python `logging.Handler`_ class. Below is the list of ready to run GELF logging handlers defined by graypy: * ``GELFUDPHandler`` - UDP log forwarding * ``GELFTCPHandler`` - TCP log forwarding * ``GELFTLSHandler`` - TCP log forwarding with TLS support * ``GELFHTTPHandler`` - HTTP log forwarding * ``GELFRabbitHandler`` - RabbitMQ log forwarding UDP Logging ----------- UDP Log forwarding to a locally hosted Graylog server can be easily done with the ``GELFUDPHandler``: .. code-block:: python import logging import graypy my_logger = logging.getLogger('test_logger') my_logger.setLevel(logging.DEBUG) handler = graypy.GELFUDPHandler('localhost', 12201) my_logger.addHandler(handler) my_logger.debug('Hello Graylog.') UDP GELF Chunkers ^^^^^^^^^^^^^^^^^ `GELF UDP Chunking`_ is supported by the ``GELFUDPHandler`` and is defined by the ``gelf_chunker`` argument within its constructor. By default the ``GELFWarningChunker`` is used, thus, GELF messages that chunk overflow (i.e. consisting of more than 128 chunks) will issue a ``GELFChunkOverflowWarning`` and **will be dropped**. Other ``gelf_chunker`` options are also available: * ``BaseGELFChunker`` silently drops GELF messages that chunk overflow * ``GELFTruncatingChunker`` issues a ``GELFChunkOverflowWarning`` and simplifies and truncates GELF messages that chunk overflow in a attempt to send some content to Graylog. If this process fails to prevent another chunk overflow a ``GELFTruncationFailureWarning`` is issued. RabbitMQ Logging ---------------- Alternately, use ``GELFRabbitHandler`` to send messages to RabbitMQ and configure your Graylog server to consume messages via AMQP. This prevents log messages from being lost due to dropped UDP packets (``GELFUDPHandler`` sends messages to Graylog using UDP). You will need to configure RabbitMQ with a ``gelf_log`` queue and bind it to the ``logging.gelf`` exchange so messages are properly routed to a queue that can be consumed by Graylog (the queue and exchange names may be customized to your liking). .. 
code-block:: python import logging import graypy my_logger = logging.getLogger('test_logger') my_logger.setLevel(logging.DEBUG) handler = graypy.GELFRabbitHandler('amqp://guest:guest@localhost/', exchange='logging.gelf') my_logger.addHandler(handler) my_logger.debug('Hello Graylog.') Django Logging -------------- It's easy to integrate ``graypy`` with Django's logging settings. Just add a new handler in your ``settings.py``: .. code-block:: python LOGGING = { 'version': 1, # other dictConfig keys here... 'handlers': { 'graypy': { 'level': 'WARNING', 'class': 'graypy.GELFUDPHandler', 'host': 'localhost', 'port': 12201, }, }, 'loggers': { 'django.request': { 'handlers': ['graypy'], 'level': 'ERROR', 'propagate': True, }, }, } Traceback Logging ----------------- By default log captured exception tracebacks are added to the GELF log as ``full_message`` fields: .. code-block:: python import logging import graypy my_logger = logging.getLogger('test_logger') my_logger.setLevel(logging.DEBUG) handler = graypy.GELFUDPHandler('localhost', 12201) my_logger.addHandler(handler) try: puff_the_magic_dragon() except NameError: my_logger.debug('No dragons here.', exc_info=1) Default Logging Fields ---------------------- By default a number of debugging logging fields are automatically added to the GELF log if available: * function * pid * process_name * thread_name You can disable automatically adding these debugging logging fields by specifying ``debugging_fields=False`` in the handler's constructor: .. code-block:: python handler = graypy.GELFUDPHandler('localhost', 12201, debugging_fields=False) Adding Custom Logging Fields ---------------------------- graypy also supports including custom fields in the GELF logs sent to Graylog. This can be done by using Python's LoggerAdapter_ and Filter_ classes. Using LoggerAdapter ^^^^^^^^^^^^^^^^^^^ LoggerAdapter_ makes it easy to add static information to your GELF log messages: .. code-block:: python import logging import graypy my_logger = logging.getLogger('test_logger') my_logger.setLevel(logging.DEBUG) handler = graypy.GELFUDPHandler('localhost', 12201) my_logger.addHandler(handler) my_adapter = logging.LoggerAdapter(logging.getLogger('test_logger'), {'username': 'John'}) my_adapter.debug('Hello Graylog from John.') Using Filter ^^^^^^^^^^^^ Filter_ gives more flexibility and allows for dynamic information to be added to your GELF logs: .. code-block:: python import logging import graypy class UsernameFilter(logging.Filter): def __init__(self): # In an actual use case would dynamically get this # (e.g. from memcache) self.username = 'John' def filter(self, record): record.username = self.username return True my_logger = logging.getLogger('test_logger') my_logger.setLevel(logging.DEBUG) handler = graypy.GELFUDPHandler('localhost', 12201) my_logger.addHandler(handler) my_logger.addFilter(UsernameFilter()) my_logger.debug('Hello Graylog from John.') Contributors ============ * Sever Banesiu * Daniel Miller * Tushar Makkar * Nathan Klapstein .. _GELF: https://docs.graylog.org/en/latest/pages/gelf.html .. _logging.Handler: https://docs.python.org/3/library/logging.html#logging.Handler .. _GELF UDP Chunking: https://docs.graylog.org/en/latest/pages/gelf.html#chunking .. _LoggerAdapter: https://docs.python.org/howto/logging-cookbook.html#using-loggeradapters-to-impart-contextual-information .. 
_Filter: https://docs.python.org/howto/logging-cookbook.html#using-filters-to-impart-contextual-information graypy-2.1.0/setup.py0000775000372000037200000000622113544501373015412 0ustar travistravis00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- """setup.py for graypy""" import codecs import re import sys import os from setuptools import setup, find_packages from setuptools.command.test import test def find_version(*file_paths): with codecs.open( os.path.join(os.path.abspath(os.path.dirname(__file__)), *file_paths), "r" ) as fp: version_file = fp.read() m = re.search(r"^__version__ = \((\d+), ?(\d+), ?(\d+)\)", version_file, re.M) if m: return "{}.{}.{}".format(*m.groups()) raise RuntimeError("Unable to find a valid version") VERSION = find_version("graypy", "__init__.py") class Pylint(test): def run_tests(self): from pylint.lint import Run Run( [ "graypy", "--persistent", "y", "--rcfile", ".pylintrc", "--output-format", "colorized", ] ) class PyTest(test): user_options = [("pytest-args=", "a", "Arguments to pass to pytest")] def initialize_options(self): test.initialize_options(self) self.pytest_args = "-v --cov={}".format("graypy") def run_tests(self): import shlex # import here, cause outside the eggs aren't loaded import pytest errno = pytest.main(shlex.split(self.pytest_args)) sys.exit(errno) setup( name="graypy", version=VERSION, description="Python logging handlers that send messages in the Graylog Extended Log Format (GELF).", long_description=open("README.rst").read(), long_description_content_type="text/x-rst", keywords="logging gelf graylog2 graylog udp amqp", author="Sever Banesiu", author_email="banesiu.sever@gmail.com", url="https://github.com/severb/graypy", license="BSD License", packages=find_packages(), include_package_data=True, zip_safe=False, tests_require=[ "pytest>=2.8.7,<4.0.0", "pytest-cov<=2.6.0,<3.0.0", "pylint>=1.9.3,<2.0.0", "mock>=2.0.0,<3.0.0", "requests>=2.20.1,<3.0.0", "amqplib>=1.0.2,<2.0.0", ], extras_require={ "amqp": ["amqplib==1.0.2"], "docs": [ "sphinx>=2.1.2,<3.0.0", "sphinx_rtd_theme>=0.4.3,<1.0.0", "sphinx-autodoc-typehints>=1.6.0,<2.0.0", ], }, classifiers=[ "License :: OSI Approved :: BSD License", "Intended Audience :: Developers", "Programming Language :: Python", "Programming Language :: Python :: 2", "Programming Language :: Python :: 2.7", "Programming Language :: Python :: 3", "Programming Language :: Python :: 3.2", "Programming Language :: Python :: 3.3", "Programming Language :: Python :: 3.4", "Programming Language :: Python :: 3.5", "Programming Language :: Python :: 3.6", "Programming Language :: Python :: 3.7", "Programming Language :: Python :: Implementation :: CPython", "Programming Language :: Python :: Implementation :: PyPy", "Topic :: System :: Logging", ], cmdclass={"test": PyTest, "lint": Pylint}, ) graypy-2.1.0/tests/0000775000372000037200000000000013544502033015030 5ustar travistravis00000000000000graypy-2.1.0/tests/integration/0000775000372000037200000000000013544502033017353 5ustar travistravis00000000000000graypy-2.1.0/tests/integration/test_common_logging.py0000664000372000037200000000147313544501373023775 0ustar travistravis00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- """pytests sending logs to a local Graylog instance""" import logging import pytest from graypy.handler import SYSLOG_LEVELS from tests.helper import handler, logger from tests.integration import LOCAL_GRAYLOG_UP from tests.integration.helper import get_unique_message, get_graylog_response @pytest.mark.skipif(not 
LOCAL_GRAYLOG_UP, reason="local Graylog instance not up") def test_common_logging(logger): """Test sending a common usage log""" message = get_unique_message() logger.error(message) graylog_response = get_graylog_response(message) assert message == graylog_response["message"] assert "long_message" not in graylog_response assert "timestamp" in graylog_response assert SYSLOG_LEVELS[logging.ERROR] == graylog_response["level"] graypy-2.1.0/tests/integration/helper.py0000664000372000037200000000336513544501373021221 0ustar travistravis00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- """helper functions for testing graypy with a local Graylog instance""" from time import sleep from uuid import uuid4 import requests def get_unique_message(): return str(uuid4()) DEFAULT_FIELDS = [ "message", "full_message", "source", "level", "func", "file", "line", "module", "logger_name", ] BASE_API_URL = 'http://127.0.0.1:9000/api/search/universal/relative?query=message:"{0}"&range=300&fields=' def get_graylog_response(message, fields=None): """Search for a given log message (with possible additional fields) within a local Graylog instance""" fields = fields if fields else [] tries = 0 while True: try: return _parse_api_response( api_response=_get_api_response(message, fields), wanted_message=message ) except ValueError: sleep(2) if tries == 5: raise tries += 1 def _build_api_string(message, fields): return BASE_API_URL.format(message) + "%2C".join(set(DEFAULT_FIELDS + fields)) def _get_api_response(message, fields): url = _build_api_string(message, fields) api_response = requests.get( url, auth=("admin", "admin"), headers={"accept": "application/json"} ) return api_response def _parse_api_response(api_response, wanted_message): assert api_response.status_code == 200 print(api_response.json()) for message in api_response.json()["messages"]: if message["message"]["message"] == wanted_message: return message["message"] raise ValueError( "wanted_message: '{}' not within api_response: {}".format( wanted_message, api_response ) ) graypy-2.1.0/tests/integration/test_status_issue.py0000664000372000037200000000366713544501373023541 0ustar travistravis00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- """pytests for addressing potential issues with adding an ``status`` extra field withing a given log and having the log failing to appear within graylog. 
Related issue: - Fails to log silently with specific extra field #85 URL: - https://github.com/severb/graypy/issues/85 """ import pytest from tests.helper import handler, logger from tests.integration import LOCAL_GRAYLOG_UP from tests.integration.helper import get_unique_message, get_graylog_response @pytest.mark.skipif(not LOCAL_GRAYLOG_UP, reason="local Graylog instance not up") def test_non_status_field_log(logger): message = get_unique_message() logger.error(message, extra={"foo": "bar"}) graylog_response = get_graylog_response(message, fields=["foo"]) assert message == graylog_response["message"] assert "long_message" not in graylog_response assert "timestamp" in graylog_response assert "bar" == graylog_response["foo"] @pytest.mark.skipif(not LOCAL_GRAYLOG_UP, reason="local Graylog instance not up") def test_status_field_issue(logger): message = get_unique_message() logger.error(message, extra={"status": "OK"}) graylog_response = get_graylog_response(message, fields=["status"]) assert message == graylog_response["message"] assert "long_message" not in graylog_response assert "timestamp" in graylog_response assert "OK" == graylog_response["status"] @pytest.mark.skipif(not LOCAL_GRAYLOG_UP, reason="local Graylog instance not up") def test_status_field_issue_multi(logger): message = get_unique_message() logger.error(message, extra={"foo": "bar", "status": "OK"}) graylog_response = get_graylog_response(message, fields=["foo", "status"]) assert message == graylog_response["message"] assert "long_message" not in graylog_response assert "timestamp" in graylog_response assert "bar" == graylog_response["foo"] assert "OK" == graylog_response["status"] graypy-2.1.0/tests/integration/test_debugging_fields.py0000664000372000037200000000372513544501373024262 0ustar travistravis00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- """pytests validating the emitting of valid debugging fields for graypy loggers""" import pytest from tests.helper import ( logger, TEST_CERT, TEST_TCP_PORT, TEST_HTTP_PORT, TEST_TLS_PORT, TEST_UDP_PORT, ) from tests.integration import LOCAL_GRAYLOG_UP from tests.integration.helper import get_graylog_response, get_unique_message from graypy import GELFUDPHandler, GELFTCPHandler, GELFTLSHandler, GELFHTTPHandler @pytest.fixture( params=[ GELFTCPHandler("127.0.0.1", TEST_TCP_PORT, debugging_fields=True), GELFUDPHandler("127.0.0.1", TEST_UDP_PORT, debugging_fields=True), GELFUDPHandler( "127.0.0.1", TEST_UDP_PORT, compress=False, debugging_fields=True ), GELFHTTPHandler("127.0.0.1", TEST_HTTP_PORT, debugging_fields=True), GELFHTTPHandler( "127.0.0.1", TEST_HTTP_PORT, compress=False, debugging_fields=True ), GELFTLSHandler("127.0.0.1", TEST_TLS_PORT, debugging_fields=True), GELFTLSHandler( "127.0.0.1", TEST_TLS_PORT, debugging_fields=True, validate=True, ca_certs=TEST_CERT, ), ] ) def handler(request): return request.param @pytest.mark.skipif(not LOCAL_GRAYLOG_UP, reason="local Graylog instance not up") def test_debug_mode(logger): message = get_unique_message() logger.error(message) graylog_response = get_graylog_response( message, fields=["function", "pid", "thread_name"] ) assert message == graylog_response["message"] assert "long_message" not in graylog_response assert "timestamp" in graylog_response assert graylog_response["file"].endswith("test_debugging_fields.py") assert "test_debug_mode" == graylog_response["function"] assert "line" in graylog_response assert "file" in graylog_response assert "pid" in graylog_response assert "thread_name" in graylog_response 
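

# A minimal local sketch for eyeballing the debugging fields without a
# running Graylog instance. Assumptions: ``BaseGELFHandler`` accepts
# ``debugging_fields`` just like the concrete handlers parametrized above,
# and an uncompressed pickle decodes as plain UTF-8 JSON (the same decode
# approach used in tests/unit/test_handler.py). The GELF key names are
# printed rather than asserted.
def _print_debugging_fields_locally():
    import json
    import logging

    from graypy.handler import BaseGELFHandler

    record = logging.LogRecord(
        "debugging_fields_sketch", logging.INFO, None, None, "hello", None, None
    )
    gelf_dict = json.loads(
        BaseGELFHandler(compress=False, debugging_fields=True)
        .makePickle(record)
        .decode("utf-8")
    )
    print(sorted(gelf_dict))  # inspect which debugging keys were added


if __name__ == "__main__":
    _print_debugging_fields_locally()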
graypy-2.1.0/tests/integration/test_extra_fields.py0000664000372000037200000000432113544501373023443 0ustar travistravis00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- """pytests for validating the addition of extra fields within GELF logs""" import logging import pytest from graypy import GELFTLSHandler, GELFTCPHandler, GELFUDPHandler, GELFHTTPHandler from tests.helper import ( TEST_CERT, TEST_TCP_PORT, TEST_HTTP_PORT, TEST_TLS_PORT, TEST_UDP_PORT, ) from tests.integration import LOCAL_GRAYLOG_UP from tests.integration.helper import get_unique_message, get_graylog_response class DummyFilter(logging.Filter): def filter(self, record): record.ozzy = "diary of a madman" record.van_halen = 1984 record.id = 42 return True @pytest.fixture( params=[ GELFTCPHandler("127.0.0.1", TEST_TCP_PORT, extra_fields=True), GELFUDPHandler("127.0.0.1", TEST_UDP_PORT, extra_fields=True), GELFUDPHandler("127.0.0.1", TEST_UDP_PORT, compress=False, extra_fields=True), GELFHTTPHandler("127.0.0.1", TEST_HTTP_PORT, extra_fields=True), GELFHTTPHandler("127.0.0.1", TEST_HTTP_PORT, compress=False, extra_fields=True), GELFTLSHandler("127.0.0.1", TEST_TLS_PORT, extra_fields=True), GELFTLSHandler( "127.0.0.1", TEST_TLS_PORT, validate=True, ca_certs=TEST_CERT, extra_fields=True, ), ] ) def handler(request): return request.param @pytest.yield_fixture def logger(handler): logger = logging.getLogger("test") dummy_filter = DummyFilter() logger.addFilter(dummy_filter) logger.addHandler(handler) yield logger logger.removeHandler(handler) logger.removeFilter(dummy_filter) @pytest.mark.skipif(not LOCAL_GRAYLOG_UP, reason="local Graylog instance not up") def test_dynamic_fields(logger): message = get_unique_message() logger.error(message) graylog_response = get_graylog_response(message, fields=["ozzy", "van_halen"]) assert message == graylog_response["message"] assert "long_message" not in graylog_response assert "timestamp" in graylog_response assert "diary of a madman" == graylog_response["ozzy"] assert 1984 == graylog_response["van_halen"] assert 42 != graylog_response["_id"] assert "id" not in graylog_response graypy-2.1.0/tests/integration/__init__.py0000664000372000037200000000073713544501373021501 0ustar travistravis00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- """integration pytests for :mod:`graypy` .. note:: These tests require an local instance of Graylog to send messages to. 
""" import requests def validate_local_graylog_up(): """Test to see if a localhost instance of Graylog is currently running""" try: requests.get("http://127.0.0.1:9000/api") return True except Exception: return False LOCAL_GRAYLOG_UP = validate_local_graylog_up() graypy-2.1.0/tests/integration/test_chunked_logging.py0000664000372000037200000000241313544501373024121 0ustar travistravis00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- """pytests sending logs to a local Graylog instance that need to be chunked""" import logging import pytest from graypy.handler import ( SYSLOG_LEVELS, GELFUDPHandler, GELFWarningChunker, BaseGELFChunker, GELFTruncatingChunker, ) from tests.helper import TEST_UDP_PORT from tests.integration import LOCAL_GRAYLOG_UP from tests.integration.helper import get_unique_message, get_graylog_response @pytest.mark.parametrize( "gelf_chunker", [BaseGELFChunker, GELFWarningChunker, GELFTruncatingChunker] ) @pytest.mark.skipif(not LOCAL_GRAYLOG_UP, reason="local Graylog instance not up") def test_chunked_logging(gelf_chunker): """Test sending a log that requires chunking to be fully sent""" logger = logging.getLogger("test_chunked_logger") handler = GELFUDPHandler( "127.0.0.1", TEST_UDP_PORT, gelf_chunker=gelf_chunker(chunk_size=10) ) logger.addHandler(handler) message = get_unique_message() logger.error(message) graylog_response = get_graylog_response(message) assert message == graylog_response["message"] assert "long_message" not in graylog_response assert "timestamp" in graylog_response assert SYSLOG_LEVELS[logging.ERROR] == graylog_response["level"] graypy-2.1.0/tests/helper.py0000664000372000037200000000457313544501373016700 0ustar travistravis00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- """helper functions for testing graypy These functions are used for both the integration and unit testing. 
""" import logging import pytest from graypy import GELFUDPHandler, GELFTCPHandler, GELFTLSHandler, GELFHTTPHandler TEST_CERT = "tests/config/localhost.cert.pem" KEY_PASS = "secret" TEST_TCP_PORT = 12201 TEST_UDP_PORT = 12202 TEST_HTTP_PORT = 12203 TEST_TLS_PORT = 12204 @pytest.fixture( params=[ GELFTCPHandler("127.0.0.1", TEST_TCP_PORT), GELFTCPHandler("127.0.0.1", TEST_TCP_PORT, extra_fields=True), GELFTCPHandler( "127.0.0.1", TEST_TCP_PORT, extra_fields=True, debugging_fields=True ), GELFTLSHandler("localhost", TEST_TLS_PORT), GELFTLSHandler("localhost", TEST_TLS_PORT, validate=True, ca_certs=TEST_CERT), GELFTLSHandler("127.0.0.1", TEST_TLS_PORT), GELFTLSHandler("127.0.0.1", TEST_TLS_PORT, validate=True, ca_certs=TEST_CERT), GELFHTTPHandler("127.0.0.1", TEST_HTTP_PORT), GELFHTTPHandler("127.0.0.1", TEST_HTTP_PORT, compress=False), GELFUDPHandler("127.0.0.1", TEST_UDP_PORT), GELFUDPHandler("127.0.0.1", TEST_UDP_PORT, compress=False), # the below handler tests are essentially smoke tests # that help cover the argument permutations of BaseGELFHandler GELFUDPHandler( "127.0.0.1", TEST_UDP_PORT, debugging_fields=True, extra_fields=True, localname="foobar_localname", facility="foobar_facility", level_names=True, compress=False, ), GELFUDPHandler( "127.0.0.1", TEST_UDP_PORT, debugging_fields=True, extra_fields=True, fqdn=True, facility="foobar_facility", level_names=True, compress=False, ), ] ) def handler(request): return request.param @pytest.yield_fixture def logger(handler): logger_ = logging.getLogger("test_logger") logger_.addHandler(handler) yield logger_ logger_.removeHandler(handler) @pytest.yield_fixture def formatted_logger(handler): logger_ = logging.getLogger("formatted_test_logger") handler.setFormatter(logging.Formatter("%(levelname)s : %(message)s")) logger_.addHandler(handler) yield logger_ logger_.removeHandler(handler) graypy-2.1.0/tests/__init__.py0000664000372000037200000000011713544501373017146 0ustar travistravis00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- """pytests for :mod:`graypy`""" graypy-2.1.0/tests/unit/0000775000372000037200000000000013544502033016007 5ustar travistravis00000000000000graypy-2.1.0/tests/unit/test_chunking.py0000664000372000037200000001172713544501373021244 0ustar travistravis00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- """pytests for various GELF UDP message chunkers""" import json import logging import struct import zlib import pytest from graypy.handler import ( GELFTruncatingChunker, GELFWarningChunker, BaseGELFChunker, BaseGELFHandler, SYSLOG_LEVELS, GELFChunkOverflowWarning, GELFTruncationFailureWarning, ) @pytest.mark.parametrize( "gelf_chunker", [BaseGELFChunker, GELFWarningChunker, GELFTruncatingChunker] ) def test_gelf_chunking(gelf_chunker): """Test various GELF chunkers""" message = b"12345" header = b"\x1e\x0f" chunks = list(gelf_chunker(chunk_size=2).chunk_message(message)) expected = [ (struct.pack("b", 0), struct.pack("b", 3), b"12"), (struct.pack("b", 1), struct.pack("b", 3), b"34"), (struct.pack("b", 2), struct.pack("b", 3), b"5"), ] assert len(chunks) == len(expected) for index, chunk in enumerate(chunks): expected_index, expected_chunks_count, expected_chunk = expected[index] assert header == chunk[:2] assert expected_index == chunk[10:11] assert expected_chunks_count == chunk[11:12] assert expected_chunk == chunk[12:] def rebuild_gelf_bytes_from_udp_chunks(chunks): gelf_bytes = b"" bsize = len(chunks[0]) for chunk in chunks: if len(chunk) < bsize: gelf_bytes += chunk[-(bsize - len(chunk)) :] else: 
gelf_bytes += chunk[((2 + struct.calcsize("QBB")) - len(chunk)) :] return gelf_bytes @pytest.mark.parametrize( "gelf_chunker", [BaseGELFChunker, GELFWarningChunker, GELFTruncatingChunker] ) def test_gelf_chunkers(gelf_chunker): message = BaseGELFHandler().makePickle( logging.LogRecord( "test_gelf_chunkers", logging.INFO, None, None, "1" * 10, None, None ) ) chunks = list(gelf_chunker(chunk_size=2).chunk_message(message)) assert len(chunks) <= 128 @pytest.mark.parametrize( "gelf_chunker", [BaseGELFChunker, GELFWarningChunker, GELFTruncatingChunker] ) def test_gelf_chunkers_overflow(gelf_chunker): message = BaseGELFHandler().makePickle( logging.LogRecord( "test_gelf_chunkers_overflow", logging.INFO, None, None, "1" * 1000, None, None, ) ) chunks = list(gelf_chunker(chunk_size=1).chunk_message(message)) assert len(chunks) <= 128 def test_chunk_overflow_truncate_uncompressed(): message = BaseGELFHandler(compress=False).makePickle( logging.LogRecord( "test_chunk_overflow_truncate_uncompressed", logging.INFO, None, None, "1" * 1000, None, None, ) ) with pytest.warns(GELFChunkOverflowWarning): chunks = list( GELFTruncatingChunker(chunk_size=2, compress=False).chunk_message(message) ) assert len(chunks) <= 128 payload = rebuild_gelf_bytes_from_udp_chunks(chunks).decode("UTF-8") glef_json = json.loads(payload) assert glef_json["_chunk_overflow"] is True assert glef_json["short_message"] in "1" * 1000 assert glef_json["level"] == SYSLOG_LEVELS.get(logging.ERROR, logging.ERROR) def test_chunk_overflow_truncate_compressed(): message = BaseGELFHandler(compress=True).makePickle( logging.LogRecord( "test_chunk_overflow_truncate_compressed", logging.INFO, None, None, "123412345" * 5000, None, None, ) ) with pytest.warns(GELFChunkOverflowWarning): chunks = list( GELFTruncatingChunker(chunk_size=2, compress=True).chunk_message(message) ) assert len(chunks) <= 128 payload = zlib.decompress(rebuild_gelf_bytes_from_udp_chunks(chunks)).decode( "UTF-8" ) glef_json = json.loads(payload) assert glef_json["_chunk_overflow"] is True assert glef_json["short_message"] in "123412345" * 5000 assert glef_json["level"] == SYSLOG_LEVELS.get(logging.ERROR, logging.ERROR) def test_chunk_overflow_truncate_fail(): message = BaseGELFHandler().makePickle( logging.LogRecord( "test_chunk_overflow_truncate_fail", logging.INFO, None, None, "1" * 1000, None, None, ) ) with pytest.warns(GELFTruncationFailureWarning): list(GELFTruncatingChunker(1).chunk_message(message)) def test_chunk_overflow_truncate_fail_large_inherited_field(): message = BaseGELFHandler( facility="this is a really long facility" * 5000 ).makePickle( logging.LogRecord( "test_chunk_overflow_truncate_fail", logging.INFO, None, None, "reasonable message", None, None, ) ) with pytest.warns(GELFTruncationFailureWarning): list(GELFTruncatingChunker(2).chunk_message(message)) graypy-2.1.0/tests/unit/helper.py0000664000372000037200000000056313544501373017652 0ustar travistravis00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- """helper functions for testing graypy with mocks of python logging and Graylog services""" import logging MOCK_LOG_RECORD_NAME = "MOCK_LOG_RECORD" MOCK_LOG_RECORD = logging.LogRecord( MOCK_LOG_RECORD_NAME, logging.INFO, pathname=None, lineno=None, msg="Log message", args=(), exc_info=None, ) graypy-2.1.0/tests/unit/test_GELFRabbitHandler.py0000664000372000037200000000233313544501373022566 0ustar travistravis00000000000000#!/usr/bin/env python # -*- coding: utf-8 -*- """pytests for :class:`graypy.rabbitmq.GELFRabbitHandler`""" import json 
import pytest

from graypy.rabbitmq import GELFRabbitHandler
from graypy.handler import SYSLOG_LEVELS

from tests.unit.helper import MOCK_LOG_RECORD


def test_invalid_url():
    """Test constructing :class:`graypy.rabbitmq.GELFRabbitHandler` with an
    invalid rabbitmq url"""
    with pytest.raises(ValueError):
        GELFRabbitHandler("BADURL")


def test_valid_url():
    """Test constructing :class:`graypy.rabbitmq.GELFRabbitHandler` with a
    valid rabbitmq url"""
    handler = GELFRabbitHandler("amqp://localhost")
    assert handler
    assert "amqp://localhost" == handler.url


@pytest.mark.xfail(reason="rabbitmq service is not up")
def test_socket_creation_failure():
    """Test attempting to open a socket to a rabbitmq instance when no such
    service exists"""
    handler = GELFRabbitHandler("amqp://localhost")
    handler.makeSocket()


def test_make_pickle():
    handler = GELFRabbitHandler("amqp://localhost")
    pickle = json.loads(handler.makePickle(MOCK_LOG_RECORD))
    assert "Log message" == pickle["short_message"]
    assert SYSLOG_LEVELS[MOCK_LOG_RECORD.levelno] == pickle["level"]
graypy-2.1.0/tests/unit/__init__.py0000664000372000037200000000032413544501373020125 0ustar travistravis00000000000000#!/usr/bin/env python
# -*- coding: utf-8 -*-

"""unit pytests for :mod:`graypy`

.. note::

    These tests mock sending to Graylog and thus do not require a local
    instance of Graylog to successfully run.
"""
graypy-2.1.0/tests/unit/test_ExcludeFilter.py0000664000372000037200000000226313544501373022170 0ustar travistravis00000000000000#!/usr/bin/env python
# -*- coding: utf-8 -*-

"""pytests for :class:`graypy.rabbitmq.ExcludeFilter`"""

import pytest

from graypy import ExcludeFilter

from tests.unit.helper import MOCK_LOG_RECORD_NAME, MOCK_LOG_RECORD


@pytest.mark.parametrize("name", [None, ""])
def test_invalid_name(name):
    """Test constructing :class:`graypy.rabbitmq.ExcludeFilter` with an
    invalid ``name`` argument"""
    with pytest.raises(ValueError):
        ExcludeFilter(name)


@pytest.mark.parametrize("name", ["foobar", ".", " "])
def test_valid_name(name):
    """Test constructing :class:`graypy.rabbitmq.ExcludeFilter` with a
    valid ``name`` argument"""
    exclude_filter = ExcludeFilter(name)
    assert exclude_filter
    assert name == exclude_filter.name
    assert len(name) == exclude_filter.nlen


def test_non_filtering_record():
    exclude_filter = ExcludeFilter("NOT" + MOCK_LOG_RECORD_NAME)
    assert exclude_filter.filter(MOCK_LOG_RECORD)
    assert MOCK_LOG_RECORD.name != exclude_filter.name


def test_filtering_record():
    exclude_filter = ExcludeFilter(MOCK_LOG_RECORD_NAME)
    assert not exclude_filter.filter(MOCK_LOG_RECORD)
    assert MOCK_LOG_RECORD.name == exclude_filter.name
graypy-2.1.0/tests/unit/test_handler.py0000664000372000037200000001734313544501373021053 0ustar travistravis00000000000000#!/usr/bin/env python
# -*- coding: utf-8 -*-

"""pytests for the formatting and construction of GELF logs by the graypy
logging handlers

.. note::

    These tests mock sending to Graylog and do not require an active
    Graylog instance to operate.
""" import datetime import json import logging import socket import sys import zlib import mock import pytest from graypy.handler import BaseGELFHandler, GELFHTTPHandler, GELFTLSHandler from tests.helper import handler, logger, formatted_logger from tests.unit.helper import MOCK_LOG_RECORD, MOCK_LOG_RECORD_NAME UNICODE_REPLACEMENT = u"\ufffd" class TestClass(object): def __repr__(self): return "" @pytest.yield_fixture def mock_send(handler): try: with mock.patch.object(handler, "send") as mock_send: yield mock_send except Exception: with mock.patch.object(handler, "emit") as mock_send: yield mock_send def get_mock_send_arg(mock_send): assert mock_send.call_args_list != [] [[[arg], _]] = mock_send.call_args_list # TODO: this is inaccurate solution for mocking non-send handlers if isinstance(arg, logging.LogRecord): return json.loads( BaseGELFHandler(compress=False).makePickle(arg).decode("utf-8") ) try: return json.loads(zlib.decompress(arg).decode("utf-8")) except zlib.error: # we have a uncompress message try: return json.loads(arg.decode("utf-8")) except Exception: # that is null terminated return json.loads(arg[:-1].decode("utf-8")) @pytest.mark.parametrize( "message,expected", [ (u"\u20AC", u"\u20AC"), (u"\u20AC".encode("utf-8"), u"\u20AC"), (b"\xc3", UNICODE_REPLACEMENT), (["a", b"\xc3"], ["a", UNICODE_REPLACEMENT]), ], ) def test_pack(message, expected): assert expected == json.loads( BaseGELFHandler._pack_gelf_dict(message).decode("utf-8") ) def test_manual_exc_info_handler(logger, mock_send): """Check that a the ``full_message`` traceback info is passed when the ``exc_info=1`` flag is given within a log message""" try: raise SyntaxError("Syntax error") except SyntaxError: logger.error("Failed", exc_info=1) arg = get_mock_send_arg(mock_send) assert "Failed" == arg["short_message"] assert arg["full_message"].startswith("Traceback (most recent call last):") # GELFHTTPHandler mocking does not complete the stacktrace # thus a missing \n assert arg["full_message"].endswith("SyntaxError: Syntax error") or arg[ "full_message" ].endswith("SyntaxError: Syntax error\n") def test_normal_exception_handler(logger, mock_send): try: raise SyntaxError("Syntax error") except SyntaxError: logger.exception("Failed") arg = get_mock_send_arg(mock_send) assert "Failed" == arg["short_message"] assert arg["full_message"].startswith("Traceback (most recent call last):") # GELFHTTPHandler mocking does not complete the stacktrace # thus a missing \n assert arg["full_message"].endswith("SyntaxError: Syntax error") or arg[ "full_message" ].endswith("SyntaxError: Syntax error\n") def test_unicode(logger, mock_send): logger.error(u"Mensaje de registro espa\xf1ol") arg = get_mock_send_arg(mock_send) assert u"Mensaje de registro espa\xf1ol" == arg["short_message"] @pytest.mark.skipif(sys.version_info[0] >= 3, reason="python2 only") def test_broken_unicode_python2(logger, mock_send): # py3 record.getMessage() returns a binary string here # which is safely converted to unicode during the sanitization # process logger.error(b"Broken \xde log message") decoded = get_mock_send_arg(mock_send) assert u"Broken %s log message" % UNICODE_REPLACEMENT == decoded["short_message"] @pytest.mark.skipif(sys.version_info[0] < 3, reason="python3 only") def test_broken_unicode_python3(logger, mock_send): # py3 record.getMessage() returns somewhat broken "b"foo"" if the # message string is not a string, but a binary object: b"foo" logger.error(b"Broken \xde log message") decoded = get_mock_send_arg(mock_send) assert "b'Broken \\xde log 
message'" == decoded["short_message"] def test_extra_field(logger, mock_send): logger.error("Log message", extra={"foo": "bar"}) decoded = get_mock_send_arg(mock_send) assert "Log message" == decoded["short_message"] assert "bar" == decoded["_foo"] def test_list(logger, mock_send): logger.error("Log message", extra={"foo": ["bar", "baz"]}) decoded = get_mock_send_arg(mock_send) assert "Log message" == decoded["short_message"] assert ["bar", "baz"] == decoded["_foo"] def test_arbitrary_object(logger, mock_send): logger.error("Log message", extra={"foo": TestClass()}) decoded = get_mock_send_arg(mock_send) assert "Log message" == decoded["short_message"] assert "" == decoded["_foo"] def test_message_to_pickle_serializes_datetime_objects_instead_of_blindly_repring_them( logger, mock_send ): timestamp = datetime.datetime(2001, 2, 3, 4, 5, 6, 7) logger.error("Log message", extra={"ts": timestamp}) decoded = get_mock_send_arg(mock_send) assert "datetime.datetime" not in decoded["_ts"] assert timestamp.isoformat() == decoded["_ts"] def test_status_field_issue(logger, mock_send): logger.error("Log message", extra={"status": "OK"}) decoded = get_mock_send_arg(mock_send) assert "Log message" == decoded["short_message"] assert "OK" == decoded["_status"] def test_add_level_name(): gelf_dict = dict() BaseGELFHandler._add_level_names(gelf_dict, MOCK_LOG_RECORD) assert "INFO" == gelf_dict["level_name"] def test_resolve_host(): """Test all posible resolutions of :meth:`BaseGELFHandler._resolve_host`""" assert socket.gethostname() == BaseGELFHandler._resolve_host(False, None) assert socket.getfqdn() == BaseGELFHandler._resolve_host(True, None) assert socket.getfqdn() == BaseGELFHandler._resolve_host(True, "localhost") assert "localhost" == BaseGELFHandler._resolve_host(False, "localhost") assert "" == BaseGELFHandler._resolve_host(False, "") def test_set_custom_facility(): gelf_dict = dict() facility = "test facility" BaseGELFHandler._set_custom_facility(gelf_dict, facility, MOCK_LOG_RECORD) assert MOCK_LOG_RECORD_NAME == gelf_dict["_logger"] assert "test facility" == gelf_dict["facility"] def test_formatted_logger(formatted_logger, mock_send): """Test the ability to set and modify the graypy handler's :class:`logging.Formatter` and have the resultant ``short_message`` be formatted by the set :class:`logging.Formatter`""" for handler in formatted_logger.handlers: if isinstance(handler, GELFHTTPHandler): pytest.skip("formatting not mocked for GELFHTTPHandler") formatted_logger.error("Log message") decoded = get_mock_send_arg(mock_send) assert "ERROR : Log message" == decoded["short_message"] def test_invalid_fqdn_localhost(): """Test constructing :class:`graypy.handler.BaseGELFHandler` with specifying conflicting arguments ``fqdn`` and ``localname``""" with pytest.raises(ValueError): BaseGELFHandler(fqdn=True, localname="localhost") def test_invalid_ca_certs(): """Test constructing :class:`graypy.handler.GELFTLSHandler` with incorrect arguments specifying server ca cert verification""" with pytest.raises(ValueError): GELFTLSHandler("127.0.0.1", validate=True) def test_invalid_client_certs(): """Test constructing :class:`graypy.handler.GELFTLSHandler` with incorrect arguments specifying client cert/key verification""" with pytest.raises(ValueError): # missing client cert GELFTLSHandler("127.0.0.1", keyfile="/dev/null") graypy-2.1.0/tests/config/0000775000372000037200000000000013544502033016275 5ustar 
travistravis00000000000000graypy-2.1.0/tests/config/stop_local_graylog_server.sh0000664000372000037200000000040013544501373024102 0ustar travistravis00000000000000#!/usr/bin/env bash # stop the local graylog server used for integration testing graypy # do work within ./test/config directory DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )" cd ${DIR} docker-compose -f docker-compose.yml down graypy-2.1.0/tests/config/docker-compose.yml0000664000372000037200000000206713544501373021745 0ustar travistravis00000000000000version: '2' services: mongo: image: "mongo:3" elasticsearch: image: "elasticsearch:2" command: "elasticsearch -Des.cluster.name='graylog'" graylog: image: graylog2/server environment: GRAYLOG_PASSWORD_SECRET: CVanHILkuYhsxE50BrNR6FFt75rS3h0V2uUlHxAshGB90guZznEoDxN7zhPx6Bcn61mfhY2T5r0PRkZVwowsTkHU2rBZnv0d GRAYLOG_ROOT_PASSWORD_SHA2: 8c6976e5b5410415bde908bd4dee15dfb167a9c873fc4bb8a81f6f2ab448a918 GRAYLOG_WEB_ENDPOINT_URI: http://127.0.0.1:9000/api GRAYLOG_CONTENT_PACKS_AUTO_LOAD: grok-patterns.json,inputs.json GRAYLOG_ELASTICSEARCH_HOSTS: http://elasticsearch:9200 volumes: - ./inputs.json:/usr/share/graylog/data/contentpacks/inputs.json - ./localhost.cert.pem:/usr/share/graylog/data/cert.pem - ./localhost.pkcs8-encrypted.key.pem:/usr/share/graylog/data/key.pem links: - mongo - elasticsearch depends_on: - mongo - elasticsearch ports: - "9000:9000" - "12201:12201/tcp" - "12202:12202/udp" - "12203:12203" - "12204:12204/tcp" graypy-2.1.0/tests/config/inputs.json0000664000372000037200000000253213544501373020522 0ustar travistravis00000000000000{ "inputs": [ { "title": "tcp", "configuration": { "bind_address": "0.0.0.0", "port": 12201 }, "type": "org.graylog2.inputs.gelf.tcp.GELFTCPInput", "global": true }, { "title": "udp", "configuration": { "bind_address": "0.0.0.0", "port": 12202 }, "type": "org.graylog2.inputs.gelf.udp.GELFUDPInput", "global": true }, { "title": "http", "configuration": { "bind_address": "0.0.0.0", "port": 12203 }, "type": "org.graylog2.inputs.gelf.http.GELFHttpInput", "global": true }, { "title": "tls", "configuration": { "bind_address": "0.0.0.0", "port": 12204, "tls_enable": true, "tls_cert_file": "/usr/share/graylog/data/cert.pem", "tls_key_file": "/usr/share/graylog/data/key.pem", "tls_key_password": "secret" }, "type": "org.graylog2.inputs.gelf.tcp.GELFTCPInput", "global": true } ], "streams": [], "outputs": [], "dashboards": [], "grok_patterns": [] }graypy-2.1.0/tests/config/start_local_graylog_server.sh0000664000372000037200000000132313544501373024257 0ustar travistravis00000000000000#!/usr/bin/env bash # start a local graylog server for integration testing graypy # do work within ./test/config directory DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )" cd ${DIR} # create ssl certs for enabling the graylog server to use a # TLS connection for GELF input bash create_ssl_certs.sh -h localhost -i 127.0.0.1 # start the graylog server docker container docker-compose -f docker-compose.yml down docker-compose -f docker-compose.yml up -d # wait for the graylog server docker container to start sleep 40 # test that the graylog server docker container is started curl -u admin:admin 'http://127.0.0.1:9000/api/search/universal/relative?query=test&range=5&fields=message' || true graypy-2.1.0/tests/config/create_ssl_certs.sh0000664000372000037200000001232013544501373022161 0ustar travistravis00000000000000#!/bin/bash -e # # This scrip generate self-signed certificate to be used in development. 
# it sets the CN to the first provided hostname and will add all other # provided names to subjectAltName. # # Some Magic is added to the script that tries to find some settings for the # current host where this script is started. # # This script was first created by Jan Doberstein 2017-07-30 # # This script is tested on CentOS 7, Ubuntu 14.04, Ubuntu 16.04, MacOS 10.12 OPSSLBIN=$(which openssl) while getopts "d:h:i:m?" opt; do case ${opt} in h) HNAME+=("${OPTARG}");; i) HIP+=("${OPTARG}");; m) MMODE=active;; d) VALIDDAYS=${OPTARG};; s) KEYSECRET=${OPTARG};; ?) HELPME=yes;; *) HELPME=yes;; esac done if [ -n "${HELPME}" ]; then echo " This script will generate self-signed ssl certificates, they will be written to the current directory Options available: -h to set Hostnames (can be used multiple times) -i to set IP Adresses (can be used multiple times) -m (optional) activates a magic mode where the script try to find Hostnames and IPs of the current Host -d (optional) Number of Days the certificate is valid (default=365) -s (optional) The secret that is used for the crypted key (default=secret) " exit 0 fi if [ -n "${MMODE}" ]; then echo "Magic Mode is on this will try to find the hostname and IP of host where this script is executed. it will then add this to the list of possible Hostnames and IPs If you get an error with the Magic Mode then retry with only one hostname set via -h option " HOSTNAME_BIN=$(type -p hostname) # possible addition # # try if dig is installed and check the hostname and ip resolve # dig_bin=$(which dig) if [ -n "${HOSTNAME_BIN}" ];then HNAME+=("$(hostname -s)") HNAME+=("$(hostname -A)") # add localhost as hostname to easy up debugging HNAME+=(localhost) # try if hostname -I returns the IP, if not # nasty workaround two steps because the array will get # entries that can't be parsed out correct GETIP=$({hostname -I 2>/dev/null || echo "127.0.0.1") HIP+=($(echo $GETIP | tr -d '[:blank:]')) else echo "The command hostname can't be found aborting Magic mode please use manual mode and provide at least one hostname with -h " exit 1 fi # take all IP Adresses returned by the command IP into the list # first check if all binaries are present that are needed # (when only bash build-ins are needed would be awesome) IPCMD=$(type -p ip) GRPCMD=$(type -p grep) AWKCMD=$(type -p awk) CUTCMD=$(type -p cut) if [ -n "${IPCMD}" ] && [ -n "${GRPCMD}" ] && [ -n "${AWKCMD}" ] && [ -n "${CUTCMD}" ]; then # to avoid error output in the array 2>/dev/null # every IP that is returned will be added to the array # ip addr show | grep 'inet ' | awk '{ print $2}' | cut -d"/" -f1 HIP+=($("${IPCMD}" addr show 2>/dev/null | "${GRPCMD}" 'inet ' 2>/dev/null| "${AWKCMD}" '{print $2}' 2>/dev/null| "${CUTCMD}" -d"/" -f1 2>/dev/null)) fi fi if [ -z "${HNAME}" ]; then echo "please provide hostname (-h) at least once. Try -? 
for help."; exit 1; fi if [ -z "${OPSSLBIN}" ]; then echo "no openssl detected aborting" exit 1; fi # set localhost IP if no other set if [ -z "${HIP}" ]; then HIP+=(127.0.0.1) fi # if no VALIDDAYS are set, default 365 if [ -z "${VALIDDAYS}" ]; then VALIDDAYS=365 fi # if no Key provided, set default secret if [ -z "${KEYSECRET}" ]; then KEYSECRET=secret fi # sort array entries and make them uniq NAMES=($(printf "DNS:%q\n" ${HNAME[@]} | sort -u)) IPADD=($(printf "IP:%q\n" ${HIP[@]} | sort -u)) # print each elemet of both arrays with comma seperator # and create a string from the array content SUBALT=$(IFS=','; echo "${NAMES[*]},${IPADD[*]}") #### output some informatione echo "This script will generate a SSL certificate with the following settings: CN Hostname = ${HNAME} subjectAltName = ${SUBALT} " # --------------------------- local_openssl_config=" [ req ] prompt = no distinguished_name = req_distinguished_name x509_extensions = san_self_signed [ req_distinguished_name ] CN=${HNAME} [ san_self_signed ] subjectAltName = ${SUBALT} subjectKeyIdentifier = hash authorityKeyIdentifier = keyid:always,issuer basicConstraints = CA:true " ${OPSSLBIN} req \ -newkey rsa:2048 -nodes \ -keyout "${HNAME}.pkcs5-plain.key.pem" \ -x509 -sha256 -days ${VALIDDAYS} \ -config <(echo "$local_openssl_config") \ -out "${HNAME}.cert.pem" 2>openssl_error.log || { echo -e "ERROR !\nOpenSSL returns an error, sorry this script will not work \n Possible reason: the openssl version is to old and does not support self signed san certificates \n Check openssl_error.log in your current directory for details"; exit 1; } ${OPSSLBIN} pkcs8 -in "${HNAME}.pkcs5-plain.key.pem" -topk8 -nocrypt -out "${HNAME}.pkcs8-plain.key.pem" ${OPSSLBIN} pkcs8 -in "${HNAME}.pkcs5-plain.key.pem" -topk8 -passout pass:"${KEYSECRET}" -out "${HNAME}.pkcs8-encrypted.key.pem" echo "the following files are written to the current directory:" echo " - ${HNAME}.pkcs5-plain.key.pem" echo " - ${HNAME}.pkcs8-plain.key.pem" echo " - ${HNAME}.pkcs8-encrypted.key.pem" echo " with the password: ${KEYSECRET}" echo "" rm openssl_error.log #EOF