oslo.log-4.1.1/0000775000175000017500000000000013643050376013302 5ustar zuulzuul00000000000000oslo.log-4.1.1/.zuul.yaml0000664000175000017500000000277213643050265015250 0ustar zuulzuul00000000000000- job: name: oslo.log-src-grenade-devstack parent: legacy-dsvm-base voting: false irrelevant-files: - ^(test-|)requirements.txt$ - ^setup.cfg$ post-run: playbooks/legacy/oslo.log-src-grenade-devstack/post.yaml required-projects: - openstack/grenade - openstack/devstack-gate - openstack/oslo.log run: playbooks/legacy/oslo.log-src-grenade-devstack/run.yaml timeout: 10800 - job: name: oslo.log-jsonformatter parent: devstack-tempest timeout: 10800 vars: devstack_local_conf: post-config: $NOVA_CONF: DEFAULT: use_json: True $NEUTRON_CONF: DEFAULT: use_json: True $GLANCE_CONF: DEFAULT: use_json: True $CINDER_CONF: DEFAULT: use_json: True $KEYSTONE_CONF: DEFAULT: use_json: True irrelevant-files: - ^.*\.rst$ - ^api-ref/.*$ - ^doc/.*$ - ^releasenotes/.*$ - project: check: jobs: - oslo.log-src-grenade-devstack - oslo.log-jsonformatter gate: jobs: - oslo.log-jsonformatter templates: - check-requirements - lib-forward-testing-python3 - openstack-lower-constraints-jobs - openstack-python3-ussuri-jobs - periodic-stable-jobs - publish-openstack-docs-pti - release-notes-jobs-python3 periodic: jobs: - requirements-check oslo.log-4.1.1/requirements.txt0000664000175000017500000000115013643050265016560 0ustar zuulzuul00000000000000# The order of packages is significant, because pip processes them in the order # of appearance. Changing the order has an impact on the overall integration # process, which may cause wedges in the gate later. pbr>=3.1.1 # Apache-2.0 oslo.config>=5.2.0 # Apache-2.0 oslo.context>=2.20.0 # Apache-2.0 oslo.i18n>=3.20.0 # Apache-2.0 oslo.utils>=3.36.0 # Apache-2.0 oslo.serialization>=2.25.0 # Apache-2.0 debtcollector>=1.19.0 # Apache-2.0 pyinotify>=0.9.6;sys_platform!='win32' and sys_platform!='darwin' and sys_platform!='sunos5' # MIT python-dateutil>=2.7.0 # BSD monotonic>=1.4;python_version<'3.3' # Apache-2.0 oslo.log-4.1.1/setup.py0000664000175000017500000000170113643050265015010 0ustar zuulzuul00000000000000# Copyright (c) 2013 Hewlett-Packard Development Company, L.P. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or # implied. # See the License for the specific language governing permissions and # limitations under the License. import setuptools # In python < 2.7.4, a lazy loading of package `pbr` will break # setuptools if some other modules registered functions in `atexit`. # solution from: http://bugs.python.org/issue15881#msg170215 try: import multiprocessing # noqa except ImportError: pass setuptools.setup( setup_requires=['pbr>=2.0.0'], pbr=True) oslo.log-4.1.1/babel.cfg0000664000175000017500000000002013643050265015015 0ustar zuulzuul00000000000000[python: **.py] oslo.log-4.1.1/LICENSE0000664000175000017500000002363613643050265014316 0ustar zuulzuul00000000000000 Apache License Version 2.0, January 2004 http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 1. Definitions. "License" shall mean the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 9 of this document. "Licensor" shall mean the copyright owner or entity authorized by the copyright owner that is granting the License. "Legal Entity" shall mean the union of the acting entity and all other entities that control, are controlled by, or are under common control with that entity. For the purposes of this definition, "control" means (i) the power, direct or indirect, to cause the direction or management of such entity, whether by contract or otherwise, or (ii) ownership of fifty percent (50%) or more of the outstanding shares, or (iii) beneficial ownership of such entity. "You" (or "Your") shall mean an individual or Legal Entity exercising permissions granted by this License. "Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files. "Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types. "Work" shall mean the work of authorship, whether in Source or Object form, made available under the License, as indicated by a copyright notice that is included in or attached to the work (an example is provided in the Appendix below). "Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from, or merely link (or bind by name) to the interfaces of, the Work and Derivative Works thereof. "Contribution" shall mean any work of authorship, including the original version of the Work and any modifications or additions to that Work or Derivative Works thereof, that is intentionally submitted to Licensor for inclusion in the Work by the copyright owner or by an individual or Legal Entity authorized to submit on behalf of the copyright owner. For the purposes of this definition, "submitted" means any form of electronic, verbal, or written communication sent to the Licensor or its representatives, including but not limited to communication on electronic mailing lists, source code control systems, and issue tracking systems that are managed by, or on behalf of, the Licensor for the purpose of discussing and improving the Work, but excluding communication that is conspicuously marked or otherwise designated in writing by the copyright owner as "Not a Contribution." "Contributor" shall mean Licensor and any individual or Legal Entity on behalf of whom a Contribution has been received by Licensor and subsequently incorporated within the Work. 2. Grant of Copyright License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable copyright license to reproduce, prepare Derivative Works of, publicly display, publicly perform, sublicense, and distribute the Work and such Derivative Works in Source or Object form. 3. Grant of Patent License. Subject to the terms and conditions of this License, each Contributor hereby grants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable (except as stated in this section) patent license to make, have made, use, offer to sell, sell, import, and otherwise transfer the Work, where such license applies only to those patent claims licensable by such Contributor that are necessarily infringed by their Contribution(s) alone or by combination of their Contribution(s) with the Work to which such Contribution(s) was submitted. If You institute patent litigation against any entity (including a cross-claim or counterclaim in a lawsuit) alleging that the Work or a Contribution incorporated within the Work constitutes direct or contributory patent infringement, then any patent licenses granted to You under this License for that Work shall terminate as of the date such litigation is filed. 4. Redistribution. You may reproduce and distribute copies of the Work or Derivative Works thereof in any medium, with or without modifications, and in Source or Object form, provided that You meet the following conditions: (a) You must give any other recipients of the Work or Derivative Works a copy of this License; and (b) You must cause any modified files to carry prominent notices stating that You changed the files; and (c) You must retain, in the Source form of any Derivative Works that You distribute, all copyright, patent, trademark, and attribution notices from the Source form of the Work, excluding those notices that do not pertain to any part of the Derivative Works; and (d) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that You distribute must include a readable copy of the attribution notices contained within such NOTICE file, excluding those notices that do not pertain to any part of the Derivative Works, in at least one of the following places: within a NOTICE text file distributed as part of the Derivative Works; within the Source form or documentation, if provided along with the Derivative Works; or, within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. You may add Your own attribution notices within Derivative Works that You distribute, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. You may add Your own copyright statement to Your modifications and may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Derivative Works as a whole, provided Your use, reproduction, and distribution of the Work otherwise complies with the conditions stated in this License. 5. Submission of Contributions. Unless You explicitly state otherwise, any Contribution intentionally submitted for inclusion in the Work by You to the Licensor shall be under the terms and conditions of this License, without any additional terms or conditions. Notwithstanding the above, nothing herein shall supersede or modify the terms of any separate license agreement you may have executed with Licensor regarding such Contributions. 6. Trademarks. This License does not grant permission to use the trade names, trademarks, service marks, or product names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file. 7. Disclaimer of Warranty. Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Contributor provides its Contributions) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You are solely responsible for determining the appropriateness of using or redistributing the Work and assume any risks associated with Your exercise of permissions under this License. 8. Limitation of Liability. In no event and under no legal theory, whether in tort (including negligence), contract, or otherwise, unless required by applicable law (such as deliberate and grossly negligent acts) or agreed to in writing, shall any Contributor be liable to You for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if such Contributor has been advised of the possibility of such damages. 9. Accepting Warranty or Additional Liability. While redistributing the Work or Derivative Works thereof, You may choose to offer, and charge a fee for, acceptance of support, warranty, indemnity, or other liability obligations and/or rights consistent with this License. However, in accepting such obligations, You may act only on Your own behalf and on Your sole responsibility, not on behalf of any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against, such Contributor by reason of your accepting any such warranty or additional liability. oslo.log-4.1.1/.coveragerc0000664000175000017500000000013513643050265015417 0ustar zuulzuul00000000000000[run] branch = True source = oslo_log omit = oslo_log/tests/* [report] ignore_errors = True oslo.log-4.1.1/HACKING.rst0000664000175000017500000000017013643050265015073 0ustar zuulzuul00000000000000Style Commandments ================== Read the OpenStack Style Commandments https://docs.openstack.org/hacking/latest/ oslo.log-4.1.1/PKG-INFO0000664000175000017500000000432513643050376014403 0ustar zuulzuul00000000000000Metadata-Version: 2.1 Name: oslo.log Version: 4.1.1 Summary: oslo.log library Home-page: https://docs.openstack.org/oslo.log/latest Author: OpenStack Author-email: openstack-discuss@lists.openstack.org License: UNKNOWN Description: ======================== Team and repository tags ======================== .. image:: https://governance.openstack.org/tc/badges/oslo.log.svg :target: https://governance.openstack.org/tc/reference/tags/index.html .. Change things from this point on ================================ oslo.log -- Oslo Logging Library ================================ .. image:: https://img.shields.io/pypi/v/oslo.log.svg :target: https://pypi.org/project/oslo.log/ :alt: Latest Version .. image:: https://img.shields.io/pypi/dm/oslo.log.svg :target: https://pypi.org/project/oslo.log/ :alt: Downloads The oslo.log (logging) configuration library provides standardized configuration for all openstack projects. It also provides custom formatters, handlers and support for context specific logging (like resource id's etc). * Free software: Apache license * Documentation: https://docs.openstack.org/oslo.log/latest/ * Source: https://opendev.org/openstack/oslo.log * Bugs: https://bugs.launchpad.net/oslo.log * Release notes: https://docs.openstack.org/releasenotes/oslo.log/ Platform: UNKNOWN Classifier: Environment :: OpenStack Classifier: Intended Audience :: Information Technology Classifier: Intended Audience :: System Administrators Classifier: License :: OSI Approved :: Apache Software License Classifier: Operating System :: POSIX :: Linux Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.6 Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: 3 :: Only Classifier: Programming Language :: Python :: Implementation :: CPython Requires-Python: >=3.6 Provides-Extra: fixtures Provides-Extra: systemd Provides-Extra: test oslo.log-4.1.1/oslo_log/0000775000175000017500000000000013643050376015117 5ustar zuulzuul00000000000000oslo.log-4.1.1/oslo_log/formatters.py0000664000175000017500000005163413643050265017665 0ustar zuulzuul00000000000000# Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. import datetime import debtcollector import functools import io import itertools import logging import logging.config import logging.handlers import re import socket import sys import traceback from dateutil import tz from oslo_context import context as context_utils from oslo_serialization import jsonutils from oslo_utils import encodeutils def _dictify_context(context): if getattr(context, 'get_logging_values', None): return context.get_logging_values() elif getattr(context, 'to_dict', None): debtcollector.deprecate( 'The RequestContext.get_logging_values() ' 'method should be defined for logging context specific ' 'information. The to_dict() method is deprecated ' 'for oslo.log use.', version='3.8.0', removal_version='5.0.0') return context.to_dict() # This dict only style logging format will become deprecated # when projects using a dictionary object for context are updated elif isinstance(context, dict): return context return {} # A configuration object is given to us when the application registers # the logging options. _CONF = None def _store_global_conf(conf): global _CONF _CONF = conf def _update_record_with_context(record): """Given a log record, update it with context information. The request context, if there is one, will either be passed with the incoming record or in the global thread-local store. """ context = record.__dict__.get( 'context', context_utils.get_current() ) if context: d = _dictify_context(context) # Copy the context values directly onto the record so they can be # used by the formatting strings. for k, v in d.items(): setattr(record, k, v) return context def _ensure_unicode(msg): """Do our best to turn the input argument into a unicode object. """ if isinstance(msg, str): return msg if not isinstance(msg, bytes): return str(msg) return encodeutils.safe_decode( msg, incoming='utf-8', errors='xmlcharrefreplace') def _get_error_summary(record): """Return the error summary If there is no active exception, return the default. If the record is being logged below the warning level, return an empty string. If there is an active exception, format it and return the resulting string. """ error_summary = '' if record.levelno < logging.WARNING: return '' if record.exc_info: # Save the exception we were given so we can include the # summary in the log line. exc_info = record.exc_info else: # Check to see if there is an active exception that was # not given to us explicitly. If so, save it so we can # include the summary in the log line. exc_info = sys.exc_info() # If we get (None, None, None) because there is no # exception, convert it to a simple None to make the logic # that uses the value simpler. if not exc_info[0]: exc_info = None elif exc_info[0] in (TypeError, ValueError, KeyError, AttributeError, ImportError): # NOTE(dhellmann): Do not include information about # common built-in exceptions used to detect cases of # bad or missing data. We don't use isinstance() here # to limit this filter to only the built-in # classes. This check is only performed for cases # where the exception info is being detected # automatically so if a caller gives us an exception # we will definitely log it. exc_info = None # If we have an exception, format it to be included in the # output. if exc_info: try: # Build the exception summary in the line with the # primary log message, to serve as a mnemonic for error # and warning cases. error_summary = traceback.format_exception_only( exc_info[0], exc_info[1], )[0].rstrip() # If the exc_info wasn't explicitly passed to us, take only the # first line of it. _Remote exceptions from oslo.messaging append # the full traceback to the exception message, so we want to avoid # outputting the traceback unless we've been passed exc_info # directly (via LOG.exception(), for example). if not record.exc_info: error_summary = error_summary.split('\n', 1)[0] except TypeError as type_err: # Work around https://bugs.python.org/issue28603 error_summary = "" % str(type_err) finally: # Remove the local reference to the exception and # traceback to avoid a memory leak through the frame # references. del exc_info return error_summary class _ReplaceFalseValue(dict): def __getitem__(self, key): return dict.get(self, key, None) or '-' _MSG_KEY_REGEX = re.compile(r'(%+)\((\w+)\)') def _json_dumps_with_fallback(obj): # Bug #1593641: If an object cannot be serialized to JSON, convert # it using repr() to prevent serialization errors. Using repr() is # not ideal, but serialization errors are unexpected on logs, # especially when the code using logs is not aware that the # JSONFormatter will be used. convert = functools.partial(jsonutils.to_primitive, fallback=repr) return jsonutils.dumps(obj, default=convert) class JSONFormatter(logging.Formatter): def __init__(self, fmt=None, datefmt=None, style='%'): # NOTE(sfinucan) we ignore the fmt and style arguments, but they're # still there since logging.config.fileConfig passes the former in # Python < 3.2 and both in Python >= 3.2 self.datefmt = datefmt try: self.hostname = socket.gethostname() except socket.error: self.hostname = None def formatException(self, ei, strip_newlines=True): try: lines = traceback.format_exception(*ei) except TypeError as type_error: # Work around https://bugs.python.org/issue28603 msg = str(type_error) lines = ['\n' % msg] if strip_newlines: lines = [filter( lambda x: x, line.rstrip().splitlines()) for line in lines] lines = list(itertools.chain(*lines)) return lines def format(self, record): args = record.args if isinstance(args, dict): msg_keys = _MSG_KEY_REGEX.findall(record.msg) # NOTE(bnemec): The logic around skipping escaped placeholders is # tricky and error-prone to include in the regex. Much easier to # just grab them all and filter after the fact. msg_keys = [m[1] for m in msg_keys if len(m[0]) == 1] # If no named keys were found, then the entire dict must have been # the value to be formatted. Don't filter anything. if msg_keys: args = {k: v for k, v in args.items() if k in msg_keys} message = {'message': record.getMessage(), 'asctime': self.formatTime(record, self.datefmt), 'name': record.name, 'msg': record.msg, 'args': args, 'levelname': record.levelname, 'levelno': record.levelno, 'pathname': record.pathname, 'filename': record.filename, 'module': record.module, 'lineno': record.lineno, 'funcname': record.funcName, 'created': record.created, 'msecs': record.msecs, 'relative_created': record.relativeCreated, 'thread': record.thread, 'thread_name': record.threadName, 'process_name': record.processName, 'process': record.process, 'traceback': None, 'hostname': self.hostname, 'error_summary': _get_error_summary(record)} # Build the extra values that were given to us, including # the context. context = _update_record_with_context(record) if hasattr(record, 'extra'): extra = record.extra.copy() else: extra = {} for key in getattr(record, 'extra_keys', []): if key not in extra: extra[key] = getattr(record, key) # The context object might have been given from the logging call. if # that was the case, it'll come in the 'extra' entry already. If not, # lets use the context we fetched above. In either case, we explode it # into the 'context' entry because the values are more useful than the # object reference. if 'context' in extra and extra['context']: message['context'] = _dictify_context(extra['context']) elif context: message['context'] = _dictify_context(context) else: message['context'] = {} extra.pop('context', None) message['extra'] = extra if record.exc_info: message['traceback'] = self.formatException(record.exc_info) return _json_dumps_with_fallback(message) class FluentFormatter(logging.Formatter): """A formatter for fluentd. format() returns dict, not string. It expects to be used by fluent.handler.FluentHandler. (included in fluent-logger-python) .. versionadded:: 3.17 """ def __init__(self, fmt=None, datefmt=None, style='%s'): # NOTE(sfinucan) we ignore the fmt and style arguments for the same # reason as JSONFormatter. self.datefmt = datefmt try: self.hostname = socket.gethostname() except socket.error: self.hostname = None self.cmdline = " ".join(sys.argv) def formatException(self, exc_info, strip_newlines=True): try: lines = traceback.format_exception(*exc_info) except TypeError as type_error: # Work around https://bugs.python.org/issue28603 msg = str(type_error) lines = ['\n' % msg] if strip_newlines: lines = functools.reduce(lambda a, line: a + line.rstrip().splitlines(), lines, []) return lines def format(self, record): message = {'message': record.getMessage(), 'time': self.formatTime(record, self.datefmt), 'name': record.name, 'level': record.levelname, 'filename': record.filename, 'lineno': record.lineno, 'module': record.module, 'funcname': record.funcName, 'process_name': record.processName, 'cmdline': self.cmdline, 'hostname': self.hostname, 'traceback': None, 'error_summary': _get_error_summary(record)} # Build the extra values that were given to us, including # the context. context = _update_record_with_context(record) if hasattr(record, 'extra'): extra = record.extra.copy() else: extra = {} for key in getattr(record, 'extra_keys', []): if key not in extra: extra[key] = getattr(record, key) # The context object might have been given from the logging call. if # that was the case, it'll come in the 'extra' entry already. If not, # lets use the context we fetched above. In either case, we explode it # into the extra dictionary because the values are more useful than the # object reference. if 'context' in extra and extra['context']: message['context'] = _dictify_context(extra['context']) elif context: message['context'] = _dictify_context(context) else: message['context'] = {} extra.pop('context', None) # NOTE(vdrok): try to dump complex objects primitive_types = (str, int, bool, type(None), float, list, dict) for key, value in extra.items(): if not isinstance(value, primitive_types): extra[key] = _json_dumps_with_fallback(value) message['extra'] = extra if record.exc_info: message['traceback'] = self.formatException(record.exc_info) return message class ContextFormatter(logging.Formatter): """A context.RequestContext aware formatter configured through flags. The flags used to set format strings are: logging_context_format_string and logging_default_format_string. You can also specify logging_debug_format_suffix to append extra formatting if the log level is debug. The standard variables available to the formatter are listed at: http://docs.python.org/library/logging.html#formatter In addition to the standard variables, one custom variable is available to both formatting string: `isotime` produces a timestamp in ISO8601 format, suitable for producing RFC5424-compliant log messages. Furthermore, logging_context_format_string has access to all of the data in a dict representation of the context. """ def __init__(self, *args, **kwargs): """Initialize ContextFormatter instance Takes additional keyword arguments which can be used in the message format string. :keyword project: project name :type project: string :keyword version: project version :type version: string """ self.project = kwargs.pop('project', 'unknown') self.version = kwargs.pop('version', 'unknown') self.conf = kwargs.pop('config', _CONF) logging.Formatter.__init__(self, *args, **kwargs) def format(self, record): """Uses contextstring if request_id is set, otherwise default.""" # store project info record.project = self.project record.version = self.version # FIXME(dims): We need a better way to pick up the instance # or instance_uuid parameters from the kwargs from say # LOG.info or LOG.warn instance_extra = '' instance = getattr(record, 'instance', None) instance_uuid = getattr(record, 'instance_uuid', None) context = _update_record_with_context(record) if instance: try: instance_extra = (self.conf.instance_format % instance) except TypeError: instance_extra = instance elif instance_uuid: instance_extra = (self.conf.instance_uuid_format % {'uuid': instance_uuid}) elif context: # FIXME(dhellmann): We should replace these nova-isms with # more generic handling in the Context class. See the # app-agnostic-logging-parameters blueprint. instance = getattr(context, 'instance', None) instance_uuid = getattr(context, 'instance_uuid', None) # resource_uuid was introduced in oslo_context's # RequestContext resource_uuid = getattr(context, 'resource_uuid', None) if instance: instance_extra = (self.conf.instance_format % {'uuid': instance}) elif instance_uuid: instance_extra = (self.conf.instance_uuid_format % {'uuid': instance_uuid}) elif resource_uuid: instance_extra = (self.conf.instance_uuid_format % {'uuid': resource_uuid}) record.instance = instance_extra # NOTE(sdague): default the fancier formatting params # to an empty string so we don't throw an exception if # they get used for key in ('instance', 'color', 'user_identity', 'resource', 'user_name', 'project_name'): if key not in record.__dict__: record.__dict__[key] = '' # Set the "user_identity" value of "logging_context_format_string" # by using "logging_user_identity_format" and # get_logging_values of oslo.context. if context: record.user_identity = ( self.conf.logging_user_identity_format % _ReplaceFalseValue(_dictify_context(context)) ) if record.__dict__.get('request_id'): fmt = self.conf.logging_context_format_string else: fmt = self.conf.logging_default_format_string # Cache the formatted traceback on the record, Logger will # respect our formatted copy if record.exc_info: record.exc_text = self.formatException(record.exc_info, record) record.error_summary = _get_error_summary(record) if '%(error_summary)s' in fmt: # If we have been told explicitly how to format the error # summary, make sure there is always a default value for # it. record.error_summary = record.error_summary or '-' elif record.error_summary: # If we have not been told how to format the error and # there is an error to summarize, make sure the format # string includes the bits we need to include it. fmt += ': %(error_summary)s' if (record.levelno == logging.DEBUG and self.conf.logging_debug_format_suffix): fmt += " " + self.conf.logging_debug_format_suffix self._compute_iso_time(record) if sys.version_info < (3, 2): self._fmt = fmt else: self._style = logging.PercentStyle(fmt) self._fmt = self._style._fmt try: return logging.Formatter.format(self, record) except TypeError as err: # Something went wrong, report that instead so we at least # get the error message. record.msg = 'Error formatting log line msg={!r} err={!r}'.format( record.msg, err).replace('%', '*') return logging.Formatter.format(self, record) def formatException(self, exc_info, record=None): """Format exception output with CONF.logging_exception_prefix.""" if not record: try: return logging.Formatter.formatException(self, exc_info) except TypeError as type_error: # Work around https://bugs.python.org/issue28603 msg = str(type_error) return '\n' % msg stringbuffer = io.StringIO() try: traceback.print_exception(exc_info[0], exc_info[1], exc_info[2], None, stringbuffer) except TypeError as type_error: # Work around https://bugs.python.org/issue28603 msg = str(type_error) stringbuffer.write('\n' % msg) lines = stringbuffer.getvalue().split('\n') stringbuffer.close() if self.conf.logging_exception_prefix.find('%(asctime)') != -1: record.asctime = self.formatTime(record, self.datefmt) self._compute_iso_time(record) formatted_lines = [] for line in lines: pl = self.conf.logging_exception_prefix % record.__dict__ fl = '%s%s' % (pl, line) formatted_lines.append(fl) return '\n'.join(formatted_lines) def _compute_iso_time(self, record): # set iso8601 timestamp localtz = tz.tzlocal() record.isotime = datetime.datetime.fromtimestamp( record.created).replace(tzinfo=localtz).isoformat() if record.created == int(record.created): # NOTE(stpierre): when the timestamp includes no # microseconds -- e.g., 1450274066.000000 -- then the # microseconds aren't included in the isoformat() time. As # a result, in literally one in a million cases # isoformat() looks different. This adds microseconds when # that happens. record.isotime = "%s.000000%s" % (record.isotime[:-6], record.isotime[-6:]) oslo.log-4.1.1/oslo_log/_options.py0000664000175000017500000002525713643050265017333 0ustar zuulzuul00000000000000# Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. from oslo_config import cfg from oslo_log import versionutils _DEFAULT_LOG_DATE_FORMAT = "%Y-%m-%d %H:%M:%S" DEFAULT_LOG_LEVELS = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO'] _IGNORE_MESSAGE = "This option is ignored if log_config_append is set." common_cli_opts = [ cfg.BoolOpt('debug', short='d', default=False, mutable=True, help='If set to true, the logging level will be set to ' 'DEBUG instead of the default INFO level.'), ] logging_cli_opts = [ cfg.StrOpt('log-config-append', metavar='PATH', deprecated_name='log-config', mutable=True, help='The name of a logging configuration file. This file ' 'is appended to any existing logging configuration ' 'files. For details about logging configuration files, ' 'see the Python logging module documentation. Note that ' 'when logging configuration files are used then all ' 'logging configuration is set in the configuration file ' 'and other logging configuration options are ignored ' '(for example, log-date-format).'), cfg.StrOpt('log-date-format', default=_DEFAULT_LOG_DATE_FORMAT, metavar='DATE_FORMAT', help='Defines the format string for %%(asctime)s in log ' 'records. Default: %(default)s . ' + _IGNORE_MESSAGE), cfg.StrOpt('log-file', metavar='PATH', deprecated_name='logfile', help='(Optional) Name of log file to send logging output to. ' 'If no default is set, logging will go to stderr as ' 'defined by use_stderr. ' + _IGNORE_MESSAGE), cfg.StrOpt('log-dir', deprecated_name='logdir', help='(Optional) The base directory used for relative log_file ' ' paths. ' + _IGNORE_MESSAGE), cfg.BoolOpt('watch-log-file', default=False, help='Uses logging handler designed to watch file ' 'system. When log file is moved or removed this handler ' 'will open a new log file with specified path ' 'instantaneously. It makes sense only if log_file option ' 'is specified and Linux platform is used. ' + _IGNORE_MESSAGE), cfg.BoolOpt('use-syslog', default=False, help='Use syslog for logging. ' 'Existing syslog format is DEPRECATED ' 'and will be changed later to honor RFC5424. ' + _IGNORE_MESSAGE), cfg.BoolOpt('use-journal', default=False, help='Enable journald for logging. ' 'If running in a systemd environment you may wish ' 'to enable journal support. Doing so will use the ' 'journal native protocol which includes structured ' 'metadata in addition to log messages.' + _IGNORE_MESSAGE), cfg.StrOpt('syslog-log-facility', default='LOG_USER', help='Syslog facility to receive log lines. ' + _IGNORE_MESSAGE), cfg.BoolOpt('use-json', default=False, help='Use JSON formatting for logging. ' + _IGNORE_MESSAGE), ] generic_log_opts = [ cfg.BoolOpt('use_stderr', default=False, help='Log output to standard error. ' + _IGNORE_MESSAGE), cfg.BoolOpt('use_eventlog', default=False, help='Log output to Windows Event Log.'), cfg.IntOpt('log_rotate_interval', default=1, help='The amount of time before the log files are rotated. ' 'This option is ignored unless log_rotation_type is set' 'to "interval".'), cfg.StrOpt('log_rotate_interval_type', choices=['Seconds', 'Minutes', 'Hours', 'Days', 'Weekday', 'Midnight'], ignore_case=True, default='days', help='Rotation interval type. The time of the last file ' 'change (or the time when the service was started) is ' 'used when scheduling the next rotation.'), cfg.IntOpt('max_logfile_count', default=30, help='Maximum number of rotated log files.'), cfg.IntOpt('max_logfile_size_mb', default=200, help='Log file maximum size in MB. This option is ignored if ' '"log_rotation_type" is not set to "size".'), cfg.StrOpt('log_rotation_type', default='none', choices=[('interval', 'Rotate logs at predefined time intervals.'), ('size', 'Rotate logs once they reach a predefined size.'), ('none', 'Do not rotate log files.')], ignore_case=True, help='Log rotation type.') ] log_opts = [ cfg.StrOpt('logging_context_format_string', default='%(asctime)s.%(msecs)03d %(process)d %(levelname)s ' '%(name)s [%(request_id)s %(user_identity)s] ' '%(instance)s%(message)s', help='Format string to use for log messages with context. ' 'Used by oslo_log.formatters.ContextFormatter'), cfg.StrOpt('logging_default_format_string', default='%(asctime)s.%(msecs)03d %(process)d %(levelname)s ' '%(name)s [-] %(instance)s%(message)s', help='Format string to use for log messages when context is ' 'undefined. ' 'Used by oslo_log.formatters.ContextFormatter'), cfg.StrOpt('logging_debug_format_suffix', default='%(funcName)s %(pathname)s:%(lineno)d', help='Additional data to append to log message when logging ' 'level for the message is DEBUG. ' 'Used by oslo_log.formatters.ContextFormatter'), cfg.StrOpt('logging_exception_prefix', default='%(asctime)s.%(msecs)03d %(process)d ERROR %(name)s ' '%(instance)s', help='Prefix each line of exception output with this format. ' 'Used by oslo_log.formatters.ContextFormatter'), cfg.StrOpt('logging_user_identity_format', default='%(user)s %(tenant)s ' '%(domain)s %(user_domain)s %(project_domain)s', help='Defines the format string for %(user_identity)s that ' 'is used in logging_context_format_string. ' 'Used by oslo_log.formatters.ContextFormatter'), cfg.ListOpt('default_log_levels', default=DEFAULT_LOG_LEVELS, help='List of package logging levels in logger=LEVEL pairs. ' + _IGNORE_MESSAGE), cfg.BoolOpt('publish_errors', default=False, help='Enables or disables publication of error events.'), # NOTE(mikal): there are two options here because sometimes we are handed # a full instance (and could include more information), and other times we # are just handed a UUID for the instance. cfg.StrOpt('instance_format', default='[instance: %(uuid)s] ', help='The format for an instance that is passed with the log ' 'message.'), cfg.StrOpt('instance_uuid_format', default='[instance: %(uuid)s] ', help='The format for an instance UUID that is passed with the ' 'log message.'), cfg.IntOpt('rate_limit_interval', default=0, help='Interval, number of seconds, of log rate limiting.'), cfg.IntOpt('rate_limit_burst', default=0, help='Maximum number of logged messages per ' 'rate_limit_interval.'), cfg.StrOpt('rate_limit_except_level', default='CRITICAL', help='Log level name used by rate limiting: CRITICAL, ERROR, ' 'INFO, WARNING, DEBUG or empty string. Logs with level ' 'greater or equal to rate_limit_except_level are not ' 'filtered. An empty string means that all levels are ' 'filtered.'), ] def list_opts(): """Returns a list of oslo.config options available in the library. The returned list includes all oslo.config options which may be registered at runtime by the library. Each element of the list is a tuple. The first element is the name of the group under which the list of elements in the second element will be registered. A group name of None corresponds to the [DEFAULT] group in config files. The purpose of this is to allow tools like the Oslo sample config file generator (oslo-config-generator) to discover the options exposed to users by this library. :returns: a list of (group_name, opts) tuples """ return [(None, (common_cli_opts + logging_cli_opts + generic_log_opts + log_opts + versionutils.deprecated_opts))] oslo.log-4.1.1/oslo_log/versionutils.py0000664000175000017500000002404613643050265020242 0ustar zuulzuul00000000000000# Copyright (c) 2013 OpenStack Foundation # All Rights Reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. """ Helpers for comparing version strings. """ import functools import inspect import logging from oslo_config import cfg from oslo_log._i18n import _ LOG = logging.getLogger(__name__) CONF = cfg.CONF _DEPRECATED_EXCEPTIONS = set() deprecated_opts = [ cfg.BoolOpt('fatal_deprecations', default=False, help='Enables or disables fatal status of deprecations.'), ] _deprecated_msg_with_alternative = _( '%(what)s is deprecated as of %(as_of)s in favor of ' '%(in_favor_of)s and may be removed in %(remove_in)s.') _deprecated_msg_no_alternative = _( '%(what)s is deprecated as of %(as_of)s and may be ' 'removed in %(remove_in)s. It will not be superseded.') _deprecated_msg_with_alternative_no_removal = _( '%(what)s is deprecated as of %(as_of)s in favor of %(in_favor_of)s.') _deprecated_msg_with_no_alternative_no_removal = _( '%(what)s is deprecated as of %(as_of)s. It will not be superseded.') _RELEASES = { # NOTE(morganfainberg): Bexar is used for unit test purposes, it is # expected we maintain a gap between Bexar and Folsom in this list. 'B': 'Bexar', 'F': 'Folsom', 'G': 'Grizzly', 'H': 'Havana', 'I': 'Icehouse', 'J': 'Juno', 'K': 'Kilo', 'L': 'Liberty', 'M': 'Mitaka', 'N': 'Newton', 'O': 'Ocata', 'P': 'Pike', 'Q': 'Queens', 'R': 'Rocky', 'S': 'Stein', 'T': 'Train', 'U': 'Ussuri', 'V': 'Victoria', 'W': 'Wallaby', } def register_options(): """Register configuration options used by this library. .. note: This is optional since the options are also registered automatically when the functions in this module are used. """ CONF.register_opts(deprecated_opts) class deprecated(object): """A decorator to mark callables as deprecated. This decorator logs a deprecation message when the callable it decorates is used. The message will include the release where the callable was deprecated, the release where it may be removed and possibly an optional replacement. It also logs a message when a deprecated exception is being caught in a try-except block, but not when subclasses of that exception are being caught. Examples: 1. Specifying the required deprecated release >>> @deprecated(as_of=deprecated.ICEHOUSE) ... def a(): pass 2. Specifying a replacement: >>> @deprecated(as_of=deprecated.ICEHOUSE, in_favor_of='f()') ... def b(): pass 3. Specifying the release where the functionality may be removed: >>> @deprecated(as_of=deprecated.ICEHOUSE, remove_in=+1) ... def c(): pass 4. Specifying the deprecated functionality will not be removed: >>> @deprecated(as_of=deprecated.ICEHOUSE, remove_in=None) ... def d(): pass 5. Specifying a replacement, deprecated functionality will not be removed: >>> @deprecated(as_of=deprecated.ICEHOUSE, in_favor_of='f()', ... remove_in=None) ... def e(): pass .. warning:: The hook used to detect when a deprecated exception is being *caught* does not work under Python 3. Deprecated exceptions are still logged if they are thrown. """ # NOTE(morganfainberg): Bexar is used for unit test purposes, it is # expected we maintain a gap between Bexar and Folsom in this list. BEXAR = 'B' FOLSOM = 'F' GRIZZLY = 'G' HAVANA = 'H' ICEHOUSE = 'I' JUNO = 'J' KILO = 'K' LIBERTY = 'L' MITAKA = 'M' NEWTON = 'N' OCATA = 'O' PIKE = 'P' QUEENS = 'Q' ROCKY = 'R' STEIN = 'S' TRAIN = 'T' USSURI = 'U' def __init__(self, as_of, in_favor_of=None, remove_in=2, what=None): """Initialize decorator :param as_of: the release deprecating the callable. Constants are define in this class for convenience. :param in_favor_of: the replacement for the callable (optional) :param remove_in: an integer specifying how many releases to wait before removing (default: 2) :param what: name of the thing being deprecated (default: the callable's name) """ self.as_of = as_of self.in_favor_of = in_favor_of self.remove_in = remove_in self.what = what def __call__(self, func_or_cls): report_deprecated = functools.partial( deprecation_warning, what=self.what or func_or_cls.__name__ + '()', as_of=self.as_of, in_favor_of=self.in_favor_of, remove_in=self.remove_in) if inspect.isfunction(func_or_cls): @functools.wraps(func_or_cls) def wrapped(*args, **kwargs): report_deprecated() return func_or_cls(*args, **kwargs) return wrapped elif inspect.isclass(func_or_cls): orig_init = func_or_cls.__init__ @functools.wraps(orig_init, assigned=('__name__', '__doc__')) def new_init(self, *args, **kwargs): if self.__class__ in _DEPRECATED_EXCEPTIONS: report_deprecated() orig_init(self, *args, **kwargs) func_or_cls.__init__ = new_init _DEPRECATED_EXCEPTIONS.add(func_or_cls) if issubclass(func_or_cls, Exception): # NOTE(dhellmann): The subclasscheck is called, # sometimes, to test whether a class matches the type # being caught in an exception. This lets us warn # folks that they are trying to catch an exception # that has been deprecated. However, under Python 3 # the test for whether one class is a subclass of # another has been optimized so that the abstract # check is only invoked in some cases. (See # PyObject_IsSubclass in cpython/Objects/abstract.c # for the short-cut.) class ExceptionMeta(type): def __subclasscheck__(self, subclass): if self in _DEPRECATED_EXCEPTIONS: report_deprecated() return super(ExceptionMeta, self).__subclasscheck__(subclass) func_or_cls.__meta__ = ExceptionMeta _DEPRECATED_EXCEPTIONS.add(func_or_cls) return func_or_cls else: raise TypeError('deprecated can be used only with functions or ' 'classes') def _get_safe_to_remove_release(release, remove_in): # TODO(dstanek): this method will have to be reimplemented once # when we get to the X release because once we get to the Y # release, what is Y+2? if remove_in is None: remove_in = 0 new_release = chr(ord(release) + remove_in) if new_release in _RELEASES: return _RELEASES[new_release] else: return new_release def deprecation_warning(what, as_of, in_favor_of=None, remove_in=2, logger=LOG): """Warn about the deprecation of a feature. :param what: name of the thing being deprecated. :param as_of: the release deprecating the callable. :param in_favor_of: the replacement for the callable (optional) :param remove_in: an integer specifying how many releases to wait before removing (default: 2) :param logger: the logging object to use for reporting (optional). """ details = dict(what=what, as_of=_RELEASES[as_of], remove_in=_get_safe_to_remove_release(as_of, remove_in)) if in_favor_of: details['in_favor_of'] = in_favor_of if remove_in is not None and remove_in > 0: msg = _deprecated_msg_with_alternative else: # There are no plans to remove this function, but it is # now deprecated. msg = _deprecated_msg_with_alternative_no_removal else: if remove_in is not None and remove_in > 0: msg = _deprecated_msg_no_alternative else: # There are no plans to remove this function, but it is # now deprecated. msg = _deprecated_msg_with_no_alternative_no_removal report_deprecated_feature(logger, msg, details) # Track the messages we have sent already. See # report_deprecated_feature(). _deprecated_messages_sent = {} def report_deprecated_feature(logger, msg, *args, **kwargs): """Call this function when a deprecated feature is used. If the system is configured for fatal deprecations then the message is logged at the 'critical' level and :class:`DeprecatedConfig` will be raised. Otherwise, the message will be logged (once) at the 'warn' level. :raises: :class:`DeprecatedConfig` if the system is configured for fatal deprecations. """ stdmsg = _("Deprecated: %s") % msg register_options() if CONF.fatal_deprecations: logger.critical(stdmsg, *args, **kwargs) raise DeprecatedConfig(msg=stdmsg) # Using a list because a tuple with dict can't be stored in a set. sent_args = _deprecated_messages_sent.setdefault(msg, list()) if args in sent_args: # Already logged this message, so don't log it again. return sent_args.append(args) logger.warning(stdmsg, *args, **kwargs) class DeprecatedConfig(Exception): message = _("Fatal call to deprecated config: %(msg)s") def __init__(self, msg): super(Exception, self).__init__(self.message % dict(msg=msg)) oslo.log-4.1.1/oslo_log/_i18n.py0000664000175000017500000000152513643050265016407 0ustar zuulzuul00000000000000# Copyright 2014 IBM Corp. # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. """oslo.i18n integration module. See https://docs.openstack.org/oslo.i18n/latest/user/index.html . """ import oslo_i18n _translators = oslo_i18n.TranslatorFactory(domain='oslo_log') # The primary translation function using the well-known name "_" _ = _translators.primary oslo.log-4.1.1/oslo_log/rate_limit.py0000664000175000017500000001140113643050265017614 0ustar zuulzuul00000000000000# Copyright 2016 Red Hat, Inc. All Rights Reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. import logging try: from time import monotonic as monotonic_clock # noqa except ImportError: from monotonic import monotonic as monotonic_clock # noqa class _LogRateLimit(logging.Filter): def __init__(self, burst, interval, except_level=None): logging.Filter.__init__(self) self.burst = burst self.interval = interval self.except_level = except_level self.logger = logging.getLogger() self._reset() def _reset(self, now=None): if now is None: now = monotonic_clock() self.counter = 0 self.end_time = now + self.interval self.emit_warn = False def filter(self, record): if (self.except_level is not None and record.levelno >= self.except_level): # don't limit levels >= except_level return True timestamp = monotonic_clock() if timestamp >= self.end_time: self._reset(timestamp) self.counter += 1 return True self.counter += 1 if self.counter <= self.burst: return True if self.emit_warn: # Allow to log our own warning: self.logger is also filtered by # rate limiting return True if self.counter == self.burst + 1: self.emit_warn = True self.logger.error("Logging rate limit: " "drop after %s records/%s sec", self.burst, self.interval) self.emit_warn = False # Drop the log return False def _iter_loggers(): """Iterate on existing loggers.""" # Sadly, Logger.manager and Manager.loggerDict are not documented, # but there is no logging public function to iterate on all loggers. # The root logger is not part of loggerDict. yield logging.getLogger() manager = logging.Logger.manager for logger in manager.loggerDict.values(): if isinstance(logger, logging.PlaceHolder): continue yield logger _LOG_LEVELS = { 'CRITICAL': logging.CRITICAL, 'ERROR': logging.ERROR, 'INFO': logging.INFO, 'WARNING': logging.WARNING, 'DEBUG': logging.DEBUG, } def install_filter(burst, interval, except_level='CRITICAL'): """Install a rate limit filter on existing and future loggers. Limit logs to *burst* messages every *interval* seconds, except of levels >= *except_level*. *except_level* is a log level name like 'CRITICAL'. If *except_level* is an empty string, all levels are filtered. The filter uses a monotonic clock, the timestamp of log records is not used. Raise an exception if a rate limit filter is already installed. """ if install_filter.log_filter is not None: raise RuntimeError("rate limit filter already installed") try: except_levelno = _LOG_LEVELS[except_level] except KeyError: raise ValueError("invalid log level name: %r" % except_level) log_filter = _LogRateLimit(burst, interval, except_levelno) install_filter.log_filter = log_filter install_filter.logger_class = logging.getLoggerClass() class RateLimitLogger(install_filter.logger_class): def __init__(self, *args, **kw): logging.Logger.__init__(self, *args, **kw) self.addFilter(log_filter) # Setup our own logger class to automatically add the filter # to new loggers. logging.setLoggerClass(RateLimitLogger) # Add the filter to all existing loggers for logger in _iter_loggers(): logger.addFilter(log_filter) install_filter.log_filter = None install_filter.logger_class = None def uninstall_filter(): """Uninstall the rate filter installed by install_filter(). Do nothing if the filter was already uninstalled. """ if install_filter.log_filter is None: # not installed (or already uninstalled) return # Restore the old logger class logging.setLoggerClass(install_filter.logger_class) # Remove the filter from all existing loggers for logger in _iter_loggers(): logger.removeFilter(install_filter.log_filter) install_filter.logger_class = None install_filter.log_filter = None oslo.log-4.1.1/oslo_log/handlers.py0000664000175000017500000001116113643050265017266 0ustar zuulzuul00000000000000# -*- coding: utf-8 -*- # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. import inspect import logging import logging.config import logging.handlers import os try: from systemd import journal except ImportError: journal = None try: import syslog except ImportError: syslog = None NullHandler = logging.NullHandler def _get_binary_name(): return os.path.basename(inspect.stack()[-1][1]) _AUDIT = logging.INFO + 1 _TRACE = 5 # This is a copy of the numerical constants from syslog.h. The # definition of these goes back at least 20 years, and is specifically # 3 bits in a packed field, so these aren't likely to ever need # changing. SYSLOG_MAP = { "CRITICAL": 2, "ERROR": 3, "WARNING": 4, "WARN": 4, "INFO": 6, "DEBUG": 7, } class OSSysLogHandler(logging.Handler): """Syslog based handler. Only available on UNIX-like platforms.""" def __init__(self, facility=None): # Default values always get evaluated, for which reason we avoid # using 'syslog' directly, which may not be available. facility = facility if facility is not None else syslog.LOG_USER # Do not use super() unless type(logging.Handler) is 'type' # (i.e. >= Python 2.7). if not syslog: raise RuntimeError("Syslog not available on this platform") logging.Handler.__init__(self) binary_name = _get_binary_name() syslog.openlog(binary_name, 0, facility) def emit(self, record): priority = SYSLOG_MAP.get(record.levelname, 7) message = self.format(record) syslog.syslog(priority, message) class OSJournalHandler(logging.Handler): custom_fields = ( 'project_name', 'project_id', 'user_name', 'user_id', 'request_id', ) def __init__(self): # Do not use super() unless type(logging.Handler) is 'type' # (i.e. >= Python 2.7). if not journal: raise RuntimeError("Systemd bindings do not exist") logging.Handler.__init__(self) self.binary_name = _get_binary_name() def emit(self, record): priority = SYSLOG_MAP.get(record.levelname, 7) message = self.format(record) extras = { 'CODE_FILE': record.pathname, 'CODE_LINE': record.lineno, 'CODE_FUNC': record.funcName, 'THREAD_NAME': record.threadName, 'PROCESS_NAME': record.processName, 'LOGGER_NAME': record.name, 'LOGGER_LEVEL': record.levelname, 'SYSLOG_IDENTIFIER': self.binary_name, 'PRIORITY': priority } if record.exc_info: # Cache the traceback text to avoid converting it multiple times # (it's constant anyway) if not record.exc_text: record.exc_text = self.formatter.formatException( record.exc_info) if record.exc_text: extras['EXCEPTION_INFO'] = record.exc_text # Leave EXCEPTION_TEXT for backward compatibility extras['EXCEPTION_TEXT'] = record.exc_text for field in self.custom_fields: value = record.__dict__.get(field) if value: extras[field.upper()] = value journal.send(message, **extras) class ColorHandler(logging.StreamHandler): """Log handler that sets the 'color' key based on the level To use, include a '%(color)s' entry in the logging_context_format_string. There is also a '%(reset_color)s' key that can be used to manually reset the color within a log line. """ LEVEL_COLORS = { _TRACE: '\033[00;35m', # MAGENTA logging.DEBUG: '\033[00;32m', # GREEN logging.INFO: '\033[00;36m', # CYAN _AUDIT: '\033[01;36m', # BOLD CYAN logging.WARN: '\033[01;33m', # BOLD YELLOW logging.ERROR: '\033[01;31m', # BOLD RED logging.CRITICAL: '\033[01;31m', # BOLD RED } def format(self, record): record.color = self.LEVEL_COLORS[record.levelno] record.reset_color = '\033[00m' return logging.StreamHandler.format(self, record) + record.reset_color oslo.log-4.1.1/oslo_log/version.py0000664000175000017500000000126013643050265017152 0ustar zuulzuul00000000000000# Copyright 2016 OpenStack Foundation # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. import pbr.version version_info = pbr.version.VersionInfo('oslo.log') oslo.log-4.1.1/oslo_log/cmds/0000775000175000017500000000000013643050376016045 5ustar zuulzuul00000000000000oslo.log-4.1.1/oslo_log/cmds/convert_json.py0000664000175000017500000001536013643050265021132 0ustar zuulzuul00000000000000#!/usr/bin/env python # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. from __future__ import print_function import argparse import collections import functools import sys import time from oslo_serialization import jsonutils from oslo_utils import importutils from oslo_log import log termcolor = importutils.try_import('termcolor') _USE_COLOR = False DEFAULT_LEVEL_KEY = 'levelname' DEFAULT_TRACEBACK_KEY = 'traceback' def main(): global _USE_COLOR args = parse_args() _USE_COLOR = args.color formatter = functools.partial( console_format, args.prefix, args.locator, loggers=args.loggers, levels=args.levels, level_key=args.levelkey, traceback_key=args.tbkey, ) if args.lines: # Read backward until we find all of our newline characters # or reach the beginning of the file args.file.seek(0, 2) newlines = 0 pos = args.file.tell() while newlines <= args.lines and pos > 0: pos = pos - 1 args.file.seek(pos) if args.file.read(1) == '\n': newlines = newlines + 1 try: for line in reformat_json(args.file, formatter, args.follow): print(line) except KeyboardInterrupt: sys.exit(0) def parse_args(): parser = argparse.ArgumentParser() parser.add_argument("file", nargs='?', default=sys.stdin, type=argparse.FileType(), help="JSON log file to read from (if not provided" " standard input is used instead)") parser.add_argument("--prefix", default='%(asctime)s.%(msecs)03d' ' %(process)s %(levelname)s %(name)s', help="Message prefixes") parser.add_argument("--locator", default='[%(funcname)s %(pathname)s:%(lineno)s]', help="Locator to append to DEBUG records") parser.add_argument("--levelkey", default=DEFAULT_LEVEL_KEY, help="Key in the JSON record where the level is held") parser.add_argument("--tbkey", default=DEFAULT_TRACEBACK_KEY, help="Key in the JSON record where the" " traceback/exception is held") parser.add_argument("-c", "--color", action='store_true', default=False, help="Color log levels (requires `termcolor`)") parser.add_argument("-f", "--follow", action='store_true', default=False, help="Continue parsing new data until" " KeyboardInterrupt") parser.add_argument("-n", "--lines", required=False, type=int, help="Last N number of records to view." " (May show less than N records when used" " in conjuction with --loggers or --levels)") parser.add_argument("--loggers", nargs='*', default=[], help="only return results matching given logger(s)") parser.add_argument("--levels", nargs='*', default=[], help="Only return lines matching given log level(s)") args = parser.parse_args() if args.color and not termcolor: raise ImportError("Coloring requested but `termcolor` is not" " importable") return args def colorise(key, text=None): if text is None: text = key if not _USE_COLOR: return text colors = { 'exc': ('red', ['reverse', 'bold']), 'FATAL': ('red', ['reverse', 'bold']), 'ERROR': ('red', ['bold']), 'WARNING': ('yellow', ['bold']), 'WARN': ('yellow', ['bold']), 'INFO': ('white', ['bold']), } color, attrs = colors.get(key, ('', [])) if color: return termcolor.colored(text, color=color, attrs=attrs) return text def warn(prefix, msg): return "%s: %s" % (colorise('exc', prefix), msg) def reformat_json(fh, formatter, follow=False): # using readline allows interactive stdin to respond to every line while True: line = fh.readline() if not line: if follow: time.sleep(0.1) continue else: break line = line.strip() if not line: continue try: record = jsonutils.loads(line) except ValueError: yield warn("Not JSON", line) continue for out_line in formatter(record): yield out_line def console_format(prefix, locator, record, loggers=[], levels=[], level_key=DEFAULT_LEVEL_KEY, traceback_key=DEFAULT_TRACEBACK_KEY): # Provide an empty string to format-specifiers the record is # missing, instead of failing. Doesn't work for non-string # specifiers. record = collections.defaultdict(str, record) # skip if the record doesn't match a logger we are looking at if loggers: name = record.get('name') if not any(name.startswith(n) for n in loggers): return if levels: if record.get(level_key) not in levels: return levelname = record.get(level_key) if levelname: record[level_key] = colorise(levelname) try: prefix = prefix % record except TypeError: # Thrown when a non-string format-specifier can't be filled in. # Dict comprehension cleans up the output yield warn('Missing non-string placeholder in record', {str(k): str(v) if isinstance(v, str) else v for k, v in record.items()}) return locator = '' if (record.get('levelno', 100) <= log.DEBUG or levelname == 'DEBUG'): locator = locator % record yield ' '.join(x for x in [prefix, record['message'], locator] if x) tb = record.get(traceback_key) if tb: if type(tb) is str: tb = tb.rstrip().split("\n") for tb_line in tb: yield ' '.join([prefix, tb_line]) if __name__ == '__main__': main() oslo.log-4.1.1/oslo_log/cmds/__init__.py0000664000175000017500000000000013643050265020141 0ustar zuulzuul00000000000000oslo.log-4.1.1/oslo_log/watchers.py0000664000175000017500000000727613643050265017322 0ustar zuulzuul00000000000000# Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. import errno import logging import logging.config import logging.handlers import os import pyinotify import stat import time try: import syslog except ImportError: syslog = None """Linux specific pyinotify based logging handlers""" class _FileKeeper(pyinotify.ProcessEvent): def my_init(self, watched_handler, watched_file): self._watched_handler = watched_handler self._watched_file = watched_file def process_default(self, event): if event.name == self._watched_file: self._watched_handler.reopen_file() class _EventletThreadedNotifier(pyinotify.ThreadedNotifier): def loop(self): """Eventlet friendly ThreadedNotifier EventletFriendlyThreadedNotifier contains additional time.sleep() call insude loop to allow switching to other thread when eventlet is used. It can be used with eventlet and native threads as well. """ while not self._stop_event.is_set(): self.process_events() time.sleep(0) ref_time = time.time() if self.check_events(): self._sleep(ref_time) self.read_events() class FastWatchedFileHandler(logging.handlers.WatchedFileHandler, object): """Frequency of reading events. Watching thread sleeps max(0, READ_FREQ - (TIMEOUT / 1000)) seconds. """ READ_FREQ = 5 """Poll timeout in milliseconds. See https://docs.python.org/2/library/select.html#select.poll.poll""" TIMEOUT = 5 def __init__(self, logpath, *args, **kwargs): self._log_file = os.path.basename(logpath) self._log_dir = os.path.dirname(logpath) super(FastWatchedFileHandler, self).__init__(logpath, *args, **kwargs) self._watch_file() def _watch_file(self): mask = pyinotify.IN_MOVED_FROM | pyinotify.IN_DELETE watch_manager = pyinotify.WatchManager() handler = _FileKeeper(watched_handler=self, watched_file=self._log_file) notifier = _EventletThreadedNotifier( watch_manager, default_proc_fun=handler, read_freq=FastWatchedFileHandler.READ_FREQ, timeout=FastWatchedFileHandler.TIMEOUT) notifier.daemon = True watch_manager.add_watch(self._log_dir, mask) notifier.start() def reopen_file(self): try: # stat the file by path, checking for existence sres = os.stat(self.baseFilename) except OSError as err: if err.errno == errno.ENOENT: sres = None else: raise # compare file system stat with that of our stream file handle if (not sres or sres[stat.ST_DEV] != self.dev or sres[stat.ST_INO] != self.ino): if self.stream is not None: # we have an open file handle, clean it up self.stream.flush() self.stream.close() self.stream = None # open a new file handle and get new stat info from that fd self.stream = self._open() self._statstream() oslo.log-4.1.1/oslo_log/locale/0000775000175000017500000000000013643050376016356 5ustar zuulzuul00000000000000oslo.log-4.1.1/oslo_log/locale/en_GB/0000775000175000017500000000000013643050376017330 5ustar zuulzuul00000000000000oslo.log-4.1.1/oslo_log/locale/en_GB/LC_MESSAGES/0000775000175000017500000000000013643050376021115 5ustar zuulzuul00000000000000oslo.log-4.1.1/oslo_log/locale/en_GB/LC_MESSAGES/oslo_log.po0000664000175000017500000000347213643050265023275 0ustar zuulzuul00000000000000# Andi Chandler , 2016. #zanata msgid "" msgstr "" "Project-Id-Version: oslo.log VERSION\n" "Report-Msgid-Bugs-To: https://bugs.launchpad.net/openstack-i18n/\n" "POT-Creation-Date: 2018-02-09 00:09+0000\n" "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" "PO-Revision-Date: 2016-06-28 05:56+0000\n" "Last-Translator: Andi Chandler \n" "Language-Team: English (United Kingdom)\n" "Language: en_GB\n" "X-Generator: Zanata 4.3.3\n" "Plural-Forms: nplurals=2; plural=(n != 1)\n" #, python-format msgid "" "%(what)s is deprecated as of %(as_of)s and may be removed in %(remove_in)s. " "It will not be superseded." msgstr "" "%(what)s is deprecated as of %(as_of)s and may be removed in %(remove_in)s. " "It will not be superseded." #, python-format msgid "" "%(what)s is deprecated as of %(as_of)s in favor of %(in_favor_of)s and may " "be removed in %(remove_in)s." msgstr "" "%(what)s is deprecated as of %(as_of)s in favour of %(in_favor_of)s and may " "be removed in %(remove_in)s." #, python-format msgid "%(what)s is deprecated as of %(as_of)s in favor of %(in_favor_of)s." msgstr "%(what)s is deprecated as of %(as_of)s in favour of %(in_favor_of)s." #, python-format msgid "%(what)s is deprecated as of %(as_of)s. It will not be superseded." msgstr "%(what)s is deprecated as of %(as_of)s. It will not be superseded." #, python-format msgid "Deprecated: %s" msgstr "Deprecated: %s" #, python-format msgid "Error loading logging config %(log_config)s: %(err_msg)s" msgstr "Error loading logging config %(log_config)s: %(err_msg)s" #, python-format msgid "Fatal call to deprecated config: %(msg)s" msgstr "Fatal call to deprecated config: %(msg)s" #, python-format msgid "syslog facility must be one of: %s" msgstr "syslog facility must be one of: %s" oslo.log-4.1.1/oslo_log/locale/ja/0000775000175000017500000000000013643050376016750 5ustar zuulzuul00000000000000oslo.log-4.1.1/oslo_log/locale/ja/LC_MESSAGES/0000775000175000017500000000000013643050376020535 5ustar zuulzuul00000000000000oslo.log-4.1.1/oslo_log/locale/ja/LC_MESSAGES/oslo_log.po0000664000175000017500000000410613643050265022710 0ustar zuulzuul00000000000000# Andreas Jaeger , 2016. #zanata msgid "" msgstr "" "Project-Id-Version: oslo.log 3.4.1.dev1\n" "Report-Msgid-Bugs-To: https://bugs.launchpad.net/openstack-i18n/\n" "POT-Creation-Date: 2016-04-19 12:12+0000\n" "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" "PO-Revision-Date: 2016-02-20 06:53+0000\n" "Last-Translator: KATO Tomoyuki \n" "Language-Team: Japanese\n" "Language: ja\n" "X-Generator: Zanata 3.7.3\n" "Plural-Forms: nplurals=1; plural=0\n" #, python-format msgid "" "%(what)s is deprecated as of %(as_of)s and may be removed in %(remove_in)s. " "It will not be superseded." msgstr "" "%(what)s は %(as_of)s において非推奨になります。%(remove_in)s において削除さ" "れる可能性があります。後継はありません。" #, python-format msgid "" "%(what)s is deprecated as of %(as_of)s in favor of %(in_favor_of)s and may " "be removed in %(remove_in)s." msgstr "" "%(what)s は、%(in_favor_of)s に移行するために %(as_of)s において非推奨になり" "ます。 %(remove_in)s において削除される可能性があります。" #, python-format msgid "%(what)s is deprecated as of %(as_of)s in favor of %(in_favor_of)s." msgstr "" "%(what)s は、%(in_favor_of)s に移行するために %(as_of)s において非推奨になり" "ます。" #, python-format msgid "%(what)s is deprecated as of %(as_of)s. It will not be superseded." msgstr "%(what)s は %(as_of)s において非推奨になります。後継はありません。" #, python-format msgid "Deprecated: %s" msgstr "非推奨: %s" #, python-format msgid "Error loading logging config %(log_config)s: %(err_msg)s" msgstr "ロギング設定 %(log_config)s の読み込み中にエラー発生: %(err_msg)s" #, python-format msgid "Fatal call to deprecated config: %(msg)s" msgstr "非推奨設定の致命的な呼び出し: %(msg)s" #, python-format msgid "syslog facility must be one of: %s" msgstr "syslog ファシリティーは次のどれかである必要があります: %s" oslo.log-4.1.1/oslo_log/locale/de/0000775000175000017500000000000013643050376016746 5ustar zuulzuul00000000000000oslo.log-4.1.1/oslo_log/locale/de/LC_MESSAGES/0000775000175000017500000000000013643050376020533 5ustar zuulzuul00000000000000oslo.log-4.1.1/oslo_log/locale/de/LC_MESSAGES/oslo_log.po0000664000175000017500000000371013643050265022706 0ustar zuulzuul00000000000000# Andreas Jaeger , 2016. #zanata msgid "" msgstr "" "Project-Id-Version: oslo.log 3.10.1.dev3\n" "Report-Msgid-Bugs-To: https://bugs.launchpad.net/openstack-i18n/\n" "POT-Creation-Date: 2016-06-24 09:24+0000\n" "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" "PO-Revision-Date: 2016-06-20 06:43+0000\n" "Last-Translator: Andreas Jaeger \n" "Language-Team: German\n" "Language: de\n" "X-Generator: Zanata 3.7.3\n" "Plural-Forms: nplurals=2; plural=(n != 1)\n" #, python-format msgid "" "%(what)s is deprecated as of %(as_of)s and may be removed in %(remove_in)s. " "It will not be superseded." msgstr "" "Seit %(as_of)s wird %(what)s nicht mehr unterstützt und voraussichtlich in " "%(remove_in)s entfernt. Es gibt keine Alternative." #, python-format msgid "" "%(what)s is deprecated as of %(as_of)s in favor of %(in_favor_of)s and may " "be removed in %(remove_in)s." msgstr "" "Seit %(as_of)s wird %(what)s nicht mehr unterstützt und voraussichtlich in " "%(remove_in)s entfernt. Benutzen Sie %(in_favor_of)s." #, python-format msgid "%(what)s is deprecated as of %(as_of)s in favor of %(in_favor_of)s." msgstr "" "Seit %(as_of)s wird %(what)s nicht mehr unterstützt. Benutzen Sie " "%(in_favor_of)s." #, python-format msgid "%(what)s is deprecated as of %(as_of)s. It will not be superseded." msgstr "" "Seit %(as_of)s wird %(what)s nicht mehr unterstützt. Es gibt keine " "Alternative." #, python-format msgid "Deprecated: %s" msgstr "Nicht weiter unterstützt: %s" #, python-format msgid "Error loading logging config %(log_config)s: %(err_msg)s" msgstr "" "Fehler beim Laden der Logging Konfiguration %(log_config)s: %(err_msg)s" #, python-format msgid "Fatal call to deprecated config: %(msg)s" msgstr "Aufruf zu nicht weiter unterstützter Konfiguration: %(msg)s" #, python-format msgid "syslog facility must be one of: %s" msgstr "Sylog Facility muß einer von %s sein." oslo.log-4.1.1/oslo_log/locale/es/0000775000175000017500000000000013643050376016765 5ustar zuulzuul00000000000000oslo.log-4.1.1/oslo_log/locale/es/LC_MESSAGES/0000775000175000017500000000000013643050376020552 5ustar zuulzuul00000000000000oslo.log-4.1.1/oslo_log/locale/es/LC_MESSAGES/oslo_log.po0000664000175000017500000000415513643050265022731 0ustar zuulzuul00000000000000# Translations template for oslo.log. # Copyright (C) 2015 ORGANIZATION # This file is distributed under the same license as the oslo.log project. # # Translators: # Adriana Chisco Landazábal , 2015 # Andreas Jaeger , 2016. #zanata msgid "" msgstr "" "Project-Id-Version: oslo.log 3.4.1.dev1\n" "Report-Msgid-Bugs-To: https://bugs.launchpad.net/openstack-i18n/\n" "POT-Creation-Date: 2016-04-19 12:12+0000\n" "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" "PO-Revision-Date: 2015-06-22 08:59+0000\n" "Last-Translator: Adriana Chisco Landazábal \n" "Language: es\n" "Plural-Forms: nplurals=2; plural=(n != 1);\n" "Generated-By: Babel 2.0\n" "X-Generator: Zanata 3.7.3\n" "Language-Team: Spanish\n" #, python-format msgid "" "%(what)s is deprecated as of %(as_of)s and may be removed in %(remove_in)s. " "It will not be superseded." msgstr "" "%(what)s está en desuso así como %(as_of)s y puede ser removido en " "%(remove_in)s. No se sustituirá." #, python-format msgid "" "%(what)s is deprecated as of %(as_of)s in favor of %(in_favor_of)s and may " "be removed in %(remove_in)s." msgstr "" "%(what)s esté en desuso así como %(as_of)s en beneficio de %(in_favor_of)s y " "puede ser removido en %(remove_in)s." #, python-format msgid "%(what)s is deprecated as of %(as_of)s in favor of %(in_favor_of)s." msgstr "" "%(what)s está en desuso así como %(as_of)s en beneficio de %(in_favor_of)s." #, python-format msgid "%(what)s is deprecated as of %(as_of)s. It will not be superseded." msgstr "%(what)s está en desuso así como %(as_of)s. No se sustituirá." #, python-format msgid "Deprecated: %s" msgstr "En desuso: %s" #, python-format msgid "Error loading logging config %(log_config)s: %(err_msg)s" msgstr "" "Error al cargar la configuración de registro %(log_config)s: %(err_msg)s" #, python-format msgid "Fatal call to deprecated config: %(msg)s" msgstr "Aviso urgente de configuración en desuso: %(msg)s" #, python-format msgid "syslog facility must be one of: %s" msgstr "El recurso syslog debe ser uno de: %s" oslo.log-4.1.1/oslo_log/log.py0000664000175000017500000004427013643050265016256 0ustar zuulzuul00000000000000# Copyright 2011 OpenStack Foundation. # Copyright 2010 United States Government as represented by the # Administrator of the National Aeronautics and Space Administration. # All Rights Reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. """OpenStack logging handler. This module adds to logging functionality by adding the option to specify a context object when calling the various log methods. If the context object is not specified, default formatting is used. Additionally, an instance uuid may be passed as part of the log message, which is intended to make it easier for admins to find messages related to a specific instance. It also allows setting of formatting information through conf. """ import configparser import logging import logging.config import logging.handlers import os import platform import sys try: import syslog except ImportError: syslog = None from oslo_config import cfg from oslo_utils import importutils from oslo_utils import units from oslo_log._i18n import _ from oslo_log import _options from oslo_log import formatters from oslo_log import handlers CRITICAL = logging.CRITICAL FATAL = logging.FATAL ERROR = logging.ERROR WARNING = logging.WARNING WARN = logging.WARNING INFO = logging.INFO DEBUG = logging.DEBUG NOTSET = logging.NOTSET TRACE = handlers._TRACE logging.addLevelName(TRACE, 'TRACE') LOG_ROTATE_INTERVAL_MAPPING = { 'seconds': 's', 'minutes': 'm', 'hours': 'h', 'days': 'd', 'weekday': 'w', 'midnight': 'midnight' } def _get_log_file_path(conf, binary=None): logfile = conf.log_file logdir = conf.log_dir if logfile and not logdir: return logfile if logfile and logdir: return os.path.join(logdir, logfile) if logdir: binary = binary or handlers._get_binary_name() return '%s.log' % (os.path.join(logdir, binary),) return None def _iter_loggers(): """Iterate on existing loggers.""" # Sadly, Logger.manager and Manager.loggerDict are not documented, # but there is no logging public function to iterate on all loggers. # The root logger is not part of loggerDict. yield logging.getLogger() manager = logging.Logger.manager for logger in manager.loggerDict.values(): if isinstance(logger, logging.PlaceHolder): continue yield logger class BaseLoggerAdapter(logging.LoggerAdapter): warn = logging.LoggerAdapter.warning @property def handlers(self): return self.logger.handlers def trace(self, msg, *args, **kwargs): self.log(TRACE, msg, *args, **kwargs) class KeywordArgumentAdapter(BaseLoggerAdapter): """Logger adapter to add keyword arguments to log record's extra data Keywords passed to the log call are added to the "extra" dictionary passed to the underlying logger so they are emitted with the log message and available to the format string. Special keywords: extra An existing dictionary of extra values to be passed to the logger. If present, the dictionary is copied and extended. resource A dictionary-like object containing a ``name`` key or ``type`` and ``id`` keys. """ def process(self, msg, kwargs): # Make a new extra dictionary combining the values we were # given when we were constructed and anything from kwargs. extra = {} extra.update(self.extra) if 'extra' in kwargs: extra.update(kwargs.pop('extra')) # Move any unknown keyword arguments into the extra # dictionary. for name in list(kwargs.keys()): if name == 'exc_info': continue extra[name] = kwargs.pop(name) # NOTE(dhellmann): The gap between when the adapter is called # and when the formatter needs to know what the extra values # are is large enough that we can't get back to the original # extra dictionary easily. We leave a hint to ourselves here # in the form of a list of keys, which will eventually be # attributes of the LogRecord processed by the formatter. That # allows the formatter to know which values were original and # which were extra, so it can treat them differently (see # JSONFormatter for an example of this). We sort the keys so # it is possible to write sane unit tests. extra['extra_keys'] = list(sorted(extra.keys())) # Place the updated extra values back into the keyword # arguments. kwargs['extra'] = extra # NOTE(jdg): We would like an easy way to add resource info # to logging, for example a header like 'volume-' # Turns out Nova implemented this but it's Nova specific with # instance. Also there's resource_uuid that's been added to # context, but again that only works for Instances, and it # only works for contexts that have the resource id set. resource = kwargs['extra'].get('resource', None) if resource: # Many OpenStack resources have a name entry in their db ref # of the form -, let's just use that if # it's passed in if not resource.get('name', None): # For resources that don't have the name of the format we wish # to use (or places where the LOG call may not have the full # object ref, allow them to pass in a dict: # resource={'type': volume, 'id': uuid} resource_type = resource.get('type', None) resource_id = resource.get('id', None) if resource_type and resource_id: kwargs['extra']['resource'] = ('[' + resource_type + '-' + resource_id + '] ') else: # FIXME(jdg): Since the name format can be specified via conf # entry, we may want to consider allowing this to be configured # here as well kwargs['extra']['resource'] = ('[' + resource.get('name', '') + '] ') return msg, kwargs def _create_logging_excepthook(product_name): def logging_excepthook(exc_type, value, tb): extra = {'exc_info': (exc_type, value, tb)} getLogger(product_name).critical('Unhandled error', **extra) return logging_excepthook class LogConfigError(Exception): message = _('Error loading logging config %(log_config)s: %(err_msg)s') def __init__(self, log_config, err_msg): self.log_config = log_config self.err_msg = err_msg def __str__(self): return self.message % dict(log_config=self.log_config, err_msg=self.err_msg) def _load_log_config(log_config_append): try: if not hasattr(_load_log_config, "old_time"): _load_log_config.old_time = 0 new_time = os.path.getmtime(log_config_append) if _load_log_config.old_time != new_time: # Reset all existing loggers before reloading config as fileConfig # does not reset non-child loggers. for logger in _iter_loggers(): logger.setLevel(logging.NOTSET) logger.handlers = [] logger.propagate = 1 logging.config.fileConfig(log_config_append, disable_existing_loggers=False) _load_log_config.old_time = new_time except (configparser.Error, KeyError, os.error) as exc: raise LogConfigError(log_config_append, str(exc)) def _mutate_hook(conf, fresh): """Reconfigures oslo.log according to the mutated options.""" if (None, 'debug') in fresh: _refresh_root_level(conf.debug) if (None, 'log-config-append') in fresh: _load_log_config.old_time = 0 if conf.log_config_append: _load_log_config(conf.log_config_append) def register_options(conf): """Register the command line and configuration options used by oslo.log.""" # Sometimes logging occurs before logging is ready (e.g., oslo_config). # To avoid "No handlers could be found," temporarily log to sys.stderr. root_logger = logging.getLogger(None) if not root_logger.handlers: root_logger.addHandler(logging.StreamHandler()) conf.register_cli_opts(_options.common_cli_opts) conf.register_cli_opts(_options.logging_cli_opts) conf.register_opts(_options.generic_log_opts) conf.register_opts(_options.log_opts) formatters._store_global_conf(conf) conf.register_mutate_hook(_mutate_hook) def setup(conf, product_name, version='unknown'): """Setup logging for the current application.""" if conf.log_config_append: _load_log_config(conf.log_config_append) else: _setup_logging_from_conf(conf, product_name, version) sys.excepthook = _create_logging_excepthook(product_name) def set_defaults(logging_context_format_string=None, default_log_levels=None): """Set default values for the configuration options used by oslo.log.""" # Just in case the caller is not setting the # default_log_level. This is insurance because # we introduced the default_log_level parameter # later in a backwards in-compatible change if default_log_levels is not None: cfg.set_defaults( _options.log_opts, default_log_levels=default_log_levels) if logging_context_format_string is not None: cfg.set_defaults( _options.log_opts, logging_context_format_string=logging_context_format_string) def tempest_set_log_file(filename): """Provide an API for tempest to set the logging filename. .. warning:: Only Tempest should use this function. We don't want applications to set a default log file, so we don't want this in set_defaults(). Because tempest doesn't use a configuration file we don't have another convenient way to safely set the log file default. """ cfg.set_defaults(_options.logging_cli_opts, log_file=filename) def _find_facility(facility): # NOTE(jd): Check the validity of facilities at run time as they differ # depending on the OS and Python version being used. valid_facilities = [f for f in ["LOG_KERN", "LOG_USER", "LOG_MAIL", "LOG_DAEMON", "LOG_AUTH", "LOG_SYSLOG", "LOG_LPR", "LOG_NEWS", "LOG_UUCP", "LOG_CRON", "LOG_AUTHPRIV", "LOG_FTP", "LOG_LOCAL0", "LOG_LOCAL1", "LOG_LOCAL2", "LOG_LOCAL3", "LOG_LOCAL4", "LOG_LOCAL5", "LOG_LOCAL6", "LOG_LOCAL7"] if getattr(syslog, f, None)] facility = facility.upper() if not facility.startswith("LOG_"): facility = "LOG_" + facility if facility not in valid_facilities: raise TypeError(_('syslog facility must be one of: %s') % ', '.join("'%s'" % fac for fac in valid_facilities)) return getattr(syslog, facility) def _refresh_root_level(debug): """Set the level of the root logger. :param debug: If 'debug' is True, the level will be DEBUG. Otherwise the level will be INFO. """ log_root = getLogger(None).logger if debug: log_root.setLevel(logging.DEBUG) else: log_root.setLevel(logging.INFO) def _setup_logging_from_conf(conf, project, version): log_root = getLogger(None).logger # Remove all handlers for handler in list(log_root.handlers): log_root.removeHandler(handler) logpath = _get_log_file_path(conf) if logpath: # On Windows, in-use files cannot be moved or deleted. if conf.watch_log_file and platform.system() == 'Linux': from oslo_log import watchers file_handler = watchers.FastWatchedFileHandler filelog = file_handler(logpath) elif conf.log_rotation_type.lower() == "interval": file_handler = logging.handlers.TimedRotatingFileHandler when = conf.log_rotate_interval_type.lower() interval_type = LOG_ROTATE_INTERVAL_MAPPING[when] # When weekday is configured, "when" has to be a value between # 'w0'-'w6' (w0 for Monday, w1 for Tuesday, and so on)' if interval_type == 'w': interval_type = interval_type + str(conf.log_rotate_interval) filelog = file_handler(logpath, when=interval_type, interval=conf.log_rotate_interval, backupCount=conf.max_logfile_count) elif conf.log_rotation_type.lower() == "size": file_handler = logging.handlers.RotatingFileHandler maxBytes = conf.max_logfile_size_mb * units.Mi filelog = file_handler(logpath, maxBytes=maxBytes, backupCount=conf.max_logfile_count) else: file_handler = logging.handlers.WatchedFileHandler filelog = file_handler(logpath) log_root.addHandler(filelog) if conf.use_stderr: streamlog = handlers.ColorHandler() log_root.addHandler(streamlog) if conf.use_journal: journal = handlers.OSJournalHandler() log_root.addHandler(journal) if conf.use_eventlog: if platform.system() == 'Windows': eventlog = logging.handlers.NTEventLogHandler(project) log_root.addHandler(eventlog) else: raise RuntimeError(_("Windows Event Log is not available on this " "platform.")) # if None of the above are True, then fall back to standard out if not logpath and not conf.use_stderr and not conf.use_journal: # pass sys.stdout as a positional argument # python2.6 calls the argument strm, in 2.7 it's stream streamlog = handlers.ColorHandler(sys.stdout) log_root.addHandler(streamlog) if conf.publish_errors: handler = importutils.import_object( "oslo_messaging.notify.log_handler.PublishErrorsHandler", logging.ERROR) log_root.addHandler(handler) if conf.use_syslog: global syslog if syslog is None: raise RuntimeError("syslog is not available on this platform") facility = _find_facility(conf.syslog_log_facility) syslog_handler = handlers.OSSysLogHandler(facility=facility) log_root.addHandler(syslog_handler) datefmt = conf.log_date_format if not conf.use_json: for handler in log_root.handlers: handler.setFormatter(formatters.ContextFormatter(project=project, version=version, datefmt=datefmt, config=conf)) else: for handler in log_root.handlers: handler.setFormatter(formatters.JSONFormatter(datefmt=datefmt)) _refresh_root_level(conf.debug) for pair in conf.default_log_levels: mod, _sep, level_name = pair.partition('=') logger = logging.getLogger(mod) numeric_level = None try: # NOTE(harlowja): integer's are valid level names, and for some # libraries they have a lower level than DEBUG that is typically # defined at level 5, so to make that accessible, try to convert # this to a integer, and if not keep the original... numeric_level = int(level_name) except ValueError: # nosec pass if numeric_level is not None: logger.setLevel(numeric_level) else: logger.setLevel(level_name) if conf.rate_limit_burst >= 1 and conf.rate_limit_interval >= 1: from oslo_log import rate_limit rate_limit.install_filter(conf.rate_limit_burst, conf.rate_limit_interval, conf.rate_limit_except) _loggers = {} def get_loggers(): """Return a copy of the oslo loggers dictionary.""" return _loggers.copy() def getLogger(name=None, project='unknown', version='unknown'): """Build a logger with the given name. :param name: The name for the logger. This is usually the module name, ``__name__``. :type name: string :param project: The name of the project, to be injected into log messages. For example, ``'nova'``. :type project: string :param version: The version of the project, to be injected into log messages. For example, ``'2014.2'``. :type version: string """ # NOTE(dhellmann): To maintain backwards compatibility with the # old oslo namespace package logger configurations, and to make it # possible to control all oslo logging with one logger node, we # replace "oslo_" with "oslo." so that modules under the new # non-namespaced packages get loggers as though they are. if name and name.startswith('oslo_'): name = 'oslo.' + name[5:] if name not in _loggers: _loggers[name] = KeywordArgumentAdapter(logging.getLogger(name), {'project': project, 'version': version}) return _loggers[name] def get_default_log_levels(): """Return the Oslo Logging default log levels. Returns a copy of the list so an application can change the value and not affect the default value used in the log_opts configuration setup. """ return list(_options.DEFAULT_LOG_LEVELS) def is_debug_enabled(conf): """Determine if debug logging mode is enabled.""" return conf.debug oslo.log-4.1.1/oslo_log/fixture/0000775000175000017500000000000013643050376016605 5ustar zuulzuul00000000000000oslo.log-4.1.1/oslo_log/fixture/setlevel.py0000664000175000017500000000320513643050265020777 0ustar zuulzuul00000000000000# All Rights Reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. import logging import fixtures class SetLogLevel(fixtures.Fixture): """Override the log level for the named loggers, restoring their previous value at the end of the test. To use:: from oslo_log import fixture as log_fixture self.useFixture(log_fixture.SetLogLevel(['myapp.foo'], logging.DEBUG)) :param logger_names: Sequence of logger names, as would be passed to getLogger(). :type logger_names: list(str) :param level: Logging level, usually one of logging.DEBUG, logging.INFO, etc. :type level: int """ def __init__(self, logger_names, level): self.logger_names = logger_names self.level = level def setUp(self): super(SetLogLevel, self).setUp() for name in self.logger_names: # NOTE(dhellmann): Use the stdlib version of getLogger() # so we get the logger and not any adaptor wrapping it. logger = logging.getLogger(name) self.addCleanup(logger.setLevel, logger.level) logger.setLevel(self.level) oslo.log-4.1.1/oslo_log/fixture/__init__.py0000664000175000017500000000123113643050265020710 0ustar zuulzuul00000000000000# All Rights Reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. from .logging_error import get_logging_handle_error_fixture from .setlevel import SetLogLevel oslo.log-4.1.1/oslo_log/fixture/logging_error.py0000664000175000017500000000223113643050265022011 0ustar zuulzuul00000000000000# All Rights Reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. import fixtures def get_logging_handle_error_fixture(): """returns a fixture to make logging raise formatting exceptions. To use:: from oslo_log import fixture as log_fixture self.useFixture(log_fixture.get_logging_handle_error_fixture()) """ return fixtures.MonkeyPatch('logging.Handler.handleError', _handleError) def _handleError(self, record): """Monkey patch for logging.Handler.handleError. The default handleError just logs the error to stderr but we want the option of actually raising an exception. """ raise oslo.log-4.1.1/oslo_log/tests/0000775000175000017500000000000013643050376016261 5ustar zuulzuul00000000000000oslo.log-4.1.1/oslo_log/tests/unit/0000775000175000017500000000000013643050376017240 5ustar zuulzuul00000000000000oslo.log-4.1.1/oslo_log/tests/unit/test_rate_limit.py0000664000175000017500000000726713643050265023013 0ustar zuulzuul00000000000000# Copyright 2016 Red Hat, Inc. All Rights Reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. import io import logging from unittest import mock from oslotest import base as test_base from oslo_log import rate_limit class LogRateLimitTestCase(test_base.BaseTestCase): def tearDown(self): super(LogRateLimitTestCase, self).tearDown() rate_limit.uninstall_filter() def install_filter(self, *args): rate_limit.install_filter(*args) logger = logging.getLogger() # remove handlers to not pollute stdout def restore_handlers(logger, handlers): for handler in handlers: logger.addHandler(handler) self.addCleanup(restore_handlers, logger, list(logger.handlers)) for handler in list(logger.handlers): logger.removeHandler(handler) # install our handler writing logs into a StringIO stream = io.StringIO() handler = logging.StreamHandler(stream) logger.addHandler(handler) return (logger, stream) @mock.patch('oslo_log.rate_limit.monotonic_clock') def test_rate_limit(self, mock_clock): mock_clock.return_value = 1 logger, stream = self.install_filter(2, 1) # first burst logger.error("message 1") logger.error("message 2") logger.error("message 3") self.assertEqual(stream.getvalue(), 'message 1\n' 'message 2\n' 'Logging rate limit: drop after 2 records/1 sec\n') # second burst (clock changed) stream.seek(0) stream.truncate() mock_clock.return_value = 2 logger.error("message 4") logger.error("message 5") logger.error("message 6") self.assertEqual(stream.getvalue(), 'message 4\n' 'message 5\n' 'Logging rate limit: drop after 2 records/1 sec\n') @mock.patch('oslo_log.rate_limit.monotonic_clock') def test_rate_limit_except_level(self, mock_clock): mock_clock.return_value = 1 logger, stream = self.install_filter(1, 1, 'CRITICAL') # first burst logger.error("error 1") logger.error("error 2") logger.critical("critical 3") logger.critical("critical 4") self.assertEqual(stream.getvalue(), 'error 1\n' 'Logging rate limit: drop after 1 records/1 sec\n' 'critical 3\n' 'critical 4\n') def test_install_twice(self): rate_limit.install_filter(100, 1) self.assertRaises(RuntimeError, rate_limit.install_filter, 100, 1) @mock.patch('oslo_log.rate_limit.monotonic_clock') def test_uninstall(self, mock_clock): mock_clock.return_value = 1 logger, stream = self.install_filter(1, 1) rate_limit.uninstall_filter() # not limited logger.error("message 1") logger.error("message 2") logger.error("message 3") self.assertEqual(stream.getvalue(), 'message 1\n' 'message 2\n' 'message 3\n') oslo.log-4.1.1/oslo_log/tests/unit/test_log.py0000664000175000017500000022731013643050265021434 0ustar zuulzuul00000000000000# -*- coding: utf-8 -*- # Copyright (c) 2011 United States Government as represented by the # Administrator of the National Aeronautics and Space Administration. # All Rights Reserved. # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. from contextlib import contextmanager import copy import datetime import io import logging import os import platform import shutil import sys try: import syslog except ImportError: syslog = None import tempfile import time from unittest import mock from dateutil import tz from oslo_config import cfg from oslo_config import fixture as fixture_config # noqa from oslo_context import context from oslo_context import fixture as fixture_context from oslo_i18n import fixture as fixture_trans from oslo_serialization import jsonutils from oslotest import base as test_base import testtools from oslo_log import _options from oslo_log import formatters from oslo_log import handlers from oslo_log import log from oslo_utils import units MIN_LOG_INI = b"""[loggers] keys=root [formatters] keys= [handlers] keys= [logger_root] handlers= """ def _fake_context(): ctxt = context.RequestContext(1, 1, overwrite=True) ctxt.user = 'myuser' ctxt.tenant = 'mytenant' ctxt.domain = 'mydomain' ctxt.project_domain = 'myprojectdomain' ctxt.user_domain = 'myuserdomain' return ctxt def _fake_new_context(): # New style contexts have a user_name / project_name, this is done # distinctly from the above context to not have to rewrite all the # other tests. ctxt = context.RequestContext(1, 1, overwrite=True) ctxt.user_name = 'myuser' ctxt.project_name = 'mytenant' ctxt.domain = 'mydomain' ctxt.project_domain = 'myprojectdomain' ctxt.user_domain = 'myuserdomain' return ctxt class CommonLoggerTestsMixIn(object): """These tests are shared between LoggerTestCase and LazyLoggerTestCase. """ def setUp(self): super(CommonLoggerTestsMixIn, self).setUp() # common context has different fields to the defaults in log.py self.config_fixture = self.useFixture( fixture_config.Config(cfg.ConfigOpts())) self.config = self.config_fixture.config self.CONF = self.config_fixture.conf log.register_options(self.config_fixture.conf) self.config(logging_context_format_string='%(asctime)s %(levelname)s ' '%(name)s [%(request_id)s ' '%(user)s %(tenant)s] ' '%(message)s') self.log = None log._setup_logging_from_conf(self.config_fixture.conf, 'test', 'test') self.log_handlers = log.getLogger(None).logger.handlers def test_handlers_have_context_formatter(self): formatters_list = [] for h in self.log.logger.handlers: f = h.formatter if isinstance(f, formatters.ContextFormatter): formatters_list.append(f) self.assertTrue(formatters_list) self.assertEqual(len(formatters_list), len(self.log.logger.handlers)) def test_handles_context_kwarg(self): self.log.info("foo", context=_fake_context()) self.assertTrue(True) # didn't raise exception def test_will_be_debug_if_debug_flag_set(self): self.config(debug=True) logger_name = 'test_is_debug' log.setup(self.CONF, logger_name) logger = logging.getLogger(logger_name) self.assertEqual(logging.DEBUG, logger.getEffectiveLevel()) def test_will_be_info_if_debug_flag_not_set(self): self.config(debug=False) logger_name = 'test_is_not_debug' log.setup(self.CONF, logger_name) logger = logging.getLogger(logger_name) self.assertEqual(logging.INFO, logger.getEffectiveLevel()) def test_no_logging_via_module(self): for func in ('critical', 'error', 'exception', 'warning', 'warn', 'info', 'debug', 'log'): self.assertRaises(AttributeError, getattr, log, func) @mock.patch('platform.system', return_value='Linux') def test_eventlog_missing(self, platform_mock): self.config(use_eventlog=True) self.assertRaises(RuntimeError, log._setup_logging_from_conf, self.CONF, 'test', 'test') @mock.patch('platform.system', return_value='Windows') @mock.patch('logging.handlers.NTEventLogHandler') @mock.patch('oslo_log.log.getLogger') def test_eventlog(self, loggers_mock, handler_mock, platform_mock): self.config(use_eventlog=True) log._setup_logging_from_conf(self.CONF, 'test', 'test') handler_mock.assert_called_once_with('test') mock_logger = loggers_mock.return_value.logger mock_logger.addHandler.assert_any_call(handler_mock.return_value) @mock.patch('oslo_log.watchers.FastWatchedFileHandler') @mock.patch('oslo_log.log._get_log_file_path', return_value='test.conf') @mock.patch('platform.system', return_value='Linux') def test_watchlog_on_linux(self, platfotm_mock, path_mock, handler_mock): self.config(watch_log_file=True) log._setup_logging_from_conf(self.CONF, 'test', 'test') handler_mock.assert_called_once_with(path_mock.return_value) self.assertEqual(self.log_handlers[0], handler_mock.return_value) @mock.patch('logging.handlers.WatchedFileHandler') @mock.patch('oslo_log.log._get_log_file_path', return_value='test.conf') @mock.patch('platform.system', return_value='Windows') def test_watchlog_on_windows(self, platform_mock, path_mock, handler_mock): self.config(watch_log_file=True) log._setup_logging_from_conf(self.CONF, 'test', 'test') handler_mock.assert_called_once_with(path_mock.return_value) self.assertEqual(self.log_handlers[0], handler_mock.return_value) @mock.patch('logging.handlers.TimedRotatingFileHandler') @mock.patch('oslo_log.log._get_log_file_path', return_value='test.conf') def test_timed_rotate_log(self, path_mock, handler_mock): rotation_type = 'interval' when = 'weekday' interval = 2 backup_count = 2 self.config(log_rotation_type=rotation_type, log_rotate_interval=interval, log_rotate_interval_type=when, max_logfile_count=backup_count) log._setup_logging_from_conf(self.CONF, 'test', 'test') handler_mock.assert_called_once_with(path_mock.return_value, when='w2', interval=interval, backupCount=backup_count) self.assertEqual(self.log_handlers[0], handler_mock.return_value) @mock.patch('logging.handlers.RotatingFileHandler') @mock.patch('oslo_log.log._get_log_file_path', return_value='test.conf') def test_rotate_log(self, path_mock, handler_mock): rotation_type = 'size' max_logfile_size_mb = 100 maxBytes = max_logfile_size_mb * units.Mi backup_count = 2 self.config(log_rotation_type=rotation_type, max_logfile_size_mb=max_logfile_size_mb, max_logfile_count=backup_count) log._setup_logging_from_conf(self.CONF, 'test', 'test') handler_mock.assert_called_once_with(path_mock.return_value, maxBytes=maxBytes, backupCount=backup_count) self.assertEqual(self.log_handlers[0], handler_mock.return_value) class LoggerTestCase(CommonLoggerTestsMixIn, test_base.BaseTestCase): def setUp(self): super(LoggerTestCase, self).setUp() self.log = log.getLogger(None) class BaseTestCase(test_base.BaseTestCase): def setUp(self): super(BaseTestCase, self).setUp() self.context_fixture = self.useFixture( fixture_context.ClearRequestContext()) self.config_fixture = self.useFixture( fixture_config.Config(cfg.ConfigOpts())) self.config = self.config_fixture.config self.CONF = self.config_fixture.conf log.register_options(self.CONF) log.setup(self.CONF, 'base') class LogTestBase(BaseTestCase): """Base test class that provides some convenience functions.""" def _add_handler_with_cleanup(self, log_instance, handler=None, formatter=None): """Add a log handler to a log instance. This function should be used to add handlers to loggers in test cases instead of directly adding them to ensure that the handler is correctly removed at the end of the test. Otherwise the handler may be left on the logger and interfere with subsequent tests. :param log_instance: The log instance to which the handler will be added. :param handler: The handler class to be added. Must be the class itself, not an instance. :param formatter: The formatter class to set on the handler. Must be the class itself, not an instance. """ self.stream = io.StringIO() if handler is None: handler = logging.StreamHandler self.handler = handler(self.stream) if formatter is None: formatter = formatters.ContextFormatter self.handler.setFormatter(formatter()) log_instance.logger.addHandler(self.handler) self.addCleanup(log_instance.logger.removeHandler, self.handler) def _set_log_level_with_cleanup(self, log_instance, level): """Set the log level of a logger for the duration of a test. Use this function to set the log level of a logger and add the necessary cleanup to reset it back to default at the end of the test. :param log_instance: The logger whose level will be changed. :param level: The new log level to use. """ self.level = log_instance.logger.getEffectiveLevel() log_instance.logger.setLevel(level) self.addCleanup(log_instance.logger.setLevel, self.level) class LogHandlerTestCase(BaseTestCase): def test_log_path_logdir(self): path = os.path.join('some', 'path') binary = 'foo-bar' expected = os.path.join(path, '%s.log' % binary) self.config(log_dir=path, log_file=None) self.assertEqual(log._get_log_file_path(self.config_fixture.conf, binary=binary), expected) def test_log_path_logfile(self): path = os.path.join('some', 'path') binary = 'foo-bar' expected = os.path.join(path, '%s.log' % binary) self.config(log_file=expected) self.assertEqual(log._get_log_file_path(self.config_fixture.conf, binary=binary), expected) def test_log_path_none(self): prefix = 'foo-bar' self.config(log_dir=None, log_file=None) self.assertIsNone(log._get_log_file_path(self.config_fixture.conf, binary=prefix)) def test_log_path_logfile_overrides_logdir(self): path = os.path.join(os.sep, 'some', 'path') prefix = 'foo-bar' expected = os.path.join(path, '%s.log' % prefix) self.config(log_dir=os.path.join('some', 'other', 'path'), log_file=expected) self.assertEqual(log._get_log_file_path(self.config_fixture.conf, binary=prefix), expected) def test_iter_loggers(self): mylog = logging.getLogger("abc.cde") loggers = list(log._iter_loggers()) self.assertIn(logging.getLogger(), loggers) self.assertIn(mylog, loggers) class SysLogHandlersTestCase(BaseTestCase): """Test the standard Syslog handler.""" def setUp(self): super(SysLogHandlersTestCase, self).setUp() self.facility = logging.handlers.SysLogHandler.LOG_USER self.logger = logging.handlers.SysLogHandler(facility=self.facility) def test_standard_format(self): """Ensure syslog msg isn't modified for standard handler.""" logrecord = logging.LogRecord('name', logging.WARNING, '/tmp', 1, 'Message', None, None) expected = logrecord self.assertEqual(expected.getMessage(), self.logger.format(logrecord)) @testtools.skipUnless(syslog, "syslog is not available") class OSSysLogHandlerTestCase(BaseTestCase): def test_handler(self): handler = handlers.OSSysLogHandler() syslog.syslog = mock.Mock() handler.emit( logging.LogRecord("foo", logging.INFO, "path", 123, "hey!", None, None)) self.assertTrue(syslog.syslog.called) def test_syslog_binary_name(self): # There is no way to test the actual output written to the # syslog (e.g. /var/log/syslog) to confirm binary_name value # is actually present syslog.openlog = mock.Mock() handlers.OSSysLogHandler() syslog.openlog.assert_called_with(handlers._get_binary_name(), 0, syslog.LOG_USER) def test_find_facility(self): self.assertEqual(syslog.LOG_USER, log._find_facility("user")) self.assertEqual(syslog.LOG_LPR, log._find_facility("LPR")) self.assertEqual(syslog.LOG_LOCAL3, log._find_facility("log_local3")) self.assertEqual(syslog.LOG_UUCP, log._find_facility("LOG_UUCP")) self.assertRaises(TypeError, log._find_facility, "fougere") def test_syslog(self): msg_unicode = u"Benoît Knecht & François Deppierraz login failure" handler = handlers.OSSysLogHandler() syslog.syslog = mock.Mock() handler.emit( logging.LogRecord("name", logging.INFO, "path", 123, msg_unicode, None, None)) syslog.syslog.assert_called_once_with(syslog.LOG_INFO, msg_unicode) class OSJournalHandlerTestCase(BaseTestCase): """Test systemd journal logging. This is a lightweight test for testing systemd journal logging. It mocks out the journal interface itself, which allows us to not have to have systemd-python installed (which is not possible to install on non Linux environments). Real world testing is also encouraged. """ def setUp(self): super(OSJournalHandlerTestCase, self).setUp() self.config(use_journal=True) self.journal = mock.patch("oslo_log.handlers.journal").start() self.addCleanup(self.journal.stop) log.setup(self.CONF, 'testing') def test_emit(self): logger = log.getLogger('nova-test.foo') local_context = _fake_new_context() logger.info("Foo", context=local_context) self.assertEqual( mock.call(mock.ANY, CODE_FILE=mock.ANY, CODE_FUNC='test_emit', CODE_LINE=mock.ANY, LOGGER_LEVEL='INFO', LOGGER_NAME='nova-test.foo', PRIORITY=6, SYSLOG_IDENTIFIER=mock.ANY, REQUEST_ID=mock.ANY, PROJECT_NAME='mytenant', PROCESS_NAME='MainProcess', THREAD_NAME='MainThread', USER_NAME='myuser'), self.journal.send.call_args) args, kwargs = self.journal.send.call_args self.assertEqual(len(args), 1) self.assertIsInstance(args[0], str) self.assertIsInstance(kwargs['CODE_LINE'], int) self.assertIsInstance(kwargs['PRIORITY'], int) del kwargs['CODE_LINE'], kwargs['PRIORITY'] for key, arg in kwargs.items(): self.assertIsInstance(key, str) self.assertIsInstance(arg, (bytes, str)) def test_emit_exception(self): logger = log.getLogger('nova-exception.foo') local_context = _fake_new_context() try: raise Exception("Some exception") except Exception: logger.exception("Foo", context=local_context) self.assertEqual( mock.call(mock.ANY, CODE_FILE=mock.ANY, CODE_FUNC='test_emit_exception', CODE_LINE=mock.ANY, LOGGER_LEVEL='ERROR', LOGGER_NAME='nova-exception.foo', PRIORITY=3, SYSLOG_IDENTIFIER=mock.ANY, REQUEST_ID=mock.ANY, EXCEPTION_INFO=mock.ANY, EXCEPTION_TEXT=mock.ANY, PROJECT_NAME='mytenant', PROCESS_NAME='MainProcess', THREAD_NAME='MainThread', USER_NAME='myuser'), self.journal.send.call_args) args, kwargs = self.journal.send.call_args self.assertEqual(len(args), 1) self.assertIsInstance(args[0], str) self.assertIsInstance(kwargs['CODE_LINE'], int) self.assertIsInstance(kwargs['PRIORITY'], int) del kwargs['CODE_LINE'], kwargs['PRIORITY'] for key, arg in kwargs.items(): self.assertIsInstance(key, str) self.assertIsInstance(arg, (bytes, str)) class LogLevelTestCase(BaseTestCase): def setUp(self): super(LogLevelTestCase, self).setUp() levels = self.CONF.default_log_levels info_level = 'nova-test' warn_level = 'nova-not-debug' other_level = 'nova-below-debug' trace_level = 'nova-trace' levels.append(info_level + '=INFO') levels.append(warn_level + '=WARN') levels.append(other_level + '=7') levels.append(trace_level + '=TRACE') self.config(default_log_levels=levels) log.setup(self.CONF, 'testing') self.log = log.getLogger(info_level) self.log_no_debug = log.getLogger(warn_level) self.log_below_debug = log.getLogger(other_level) self.log_trace = log.getLogger(trace_level) def test_is_enabled_for(self): self.assertTrue(self.log.isEnabledFor(logging.INFO)) self.assertFalse(self.log_no_debug.isEnabledFor(logging.DEBUG)) self.assertTrue(self.log_below_debug.isEnabledFor(logging.DEBUG)) self.assertTrue(self.log_below_debug.isEnabledFor(7)) self.assertTrue(self.log_trace.isEnabledFor(log.TRACE)) def test_has_level_from_flags(self): self.assertEqual(logging.INFO, self.log.logger.getEffectiveLevel()) def test_has_level_from_flags_for_trace(self): self.assertEqual(log.TRACE, self.log_trace.logger.getEffectiveLevel()) def test_child_log_has_level_of_parent_flag(self): logger = log.getLogger('nova-test.foo') self.assertEqual(logging.INFO, logger.logger.getEffectiveLevel()) def test_child_log_has_level_of_parent_flag_for_trace(self): logger = log.getLogger('nova-trace.foo') self.assertEqual(log.TRACE, logger.logger.getEffectiveLevel()) def test_get_loggers(self): log._loggers['sentinel_log'] = mock.sentinel.sentinel_log res = log.get_loggers() self.assertDictEqual(log._loggers, res) class JSONFormatterTestCase(LogTestBase): def setUp(self): super(JSONFormatterTestCase, self).setUp() self.log = log.getLogger('test-json') self._add_handler_with_cleanup(self.log, formatter=formatters.JSONFormatter) self._set_log_level_with_cleanup(self.log, logging.DEBUG) def test_json_w_context_in_extras(self): test_msg = 'This is a %(test)s line' test_data = {'test': 'log'} local_context = _fake_context() self.log.debug(test_msg, test_data, key='value', context=local_context) self._validate_json_data('test_json_w_context_in_extras', test_msg, test_data, local_context) def test_json_w_fetched_global_context(self): test_msg = 'This is a %(test)s line' test_data = {'test': 'log'} local_context = _fake_context() # NOTE we're not passing the context explicitly here. But it'll add the # context to the extras anyway since the call to fake_context adds the # context to the thread. The context will be fetched with the # _update_record_with_context call that's done in the formatter. self.log.debug(test_msg, test_data, key='value') self._validate_json_data('test_json_w_fetched_global_context', test_msg, test_data, local_context) def _validate_json_data(self, testname, test_msg, test_data, ctx): data = jsonutils.loads(self.stream.getvalue()) self.assertTrue(data) self.assertIn('extra', data) self.assertIn('context', data) extra = data['extra'] context = data['context'] self.assertNotIn('context', extra) self.assertEqual('value', extra['key']) self.assertEqual(ctx.user, context['user']) self.assertEqual(ctx.user_name, context['user_name']) self.assertEqual(ctx.project_name, context['project_name']) self.assertEqual('test-json', data['name']) self.assertIn('request_id', context) self.assertEqual(ctx.request_id, context['request_id']) self.assertIn('global_request_id', context) self.assertEqual(ctx.global_request_id, context['global_request_id']) self.assertEqual(test_msg % test_data, data['message']) self.assertEqual(test_msg, data['msg']) self.assertEqual(test_data, data['args']) self.assertEqual('test_log.py', data['filename']) self.assertEqual(testname, data['funcname']) self.assertEqual('DEBUG', data['levelname']) self.assertEqual(logging.DEBUG, data['levelno']) self.assertFalse(data['traceback']) def test_json_exception(self): test_msg = 'This is %s' test_data = 'exceptional' try: raise Exception('This is exceptional') except Exception: self.log.exception(test_msg, test_data) data = jsonutils.loads(self.stream.getvalue()) self.assertTrue(data) self.assertIn('extra', data) self.assertEqual('test-json', data['name']) self.assertEqual(test_msg % test_data, data['message']) self.assertEqual(test_msg, data['msg']) self.assertEqual([test_data], data['args']) self.assertEqual('ERROR', data['levelname']) self.assertEqual(logging.ERROR, data['levelno']) self.assertTrue(data['traceback']) def test_json_with_extra(self): test_msg = 'This is a %(test)s line' test_data = {'test': 'log'} extra_data = {'special_user': 'user1', 'special_tenant': 'unicorns'} self.log.debug(test_msg, test_data, key='value', extra=extra_data) data = jsonutils.loads(self.stream.getvalue()) self.assertTrue(data) self.assertIn('extra', data) for k, v in extra_data.items(): self.assertIn(k, data['extra']) self.assertEqual(v, data['extra'][k]) def test_json_with_extra_keys(self): test_msg = 'This is a %(test)s line' test_data = {'test': 'log'} extra_keys = ['special_tenant', 'special_user'] special_tenant = 'unicorns' special_user = 'user2' self.log.debug(test_msg, test_data, key='value', extra_keys=extra_keys, special_tenant=special_tenant, special_user=special_user) data = jsonutils.loads(self.stream.getvalue()) self.assertTrue(data) self.assertIn('extra', data) self.assertIn(extra_keys[0], data['extra']) self.assertEqual(special_tenant, data['extra'][extra_keys[0]]) self.assertIn(extra_keys[1], data['extra']) self.assertEqual(special_user, data['extra'][extra_keys[1]]) def test_can_process_strings(self): expected = b'\\u2622' # see ContextFormatterTestCase.test_can_process_strings expected = '\\\\xe2\\\\x98\\\\xa2' self.log.info(b'%s', u'\u2622'.encode('utf8')) self.assertIn(expected, self.stream.getvalue()) def test_exception(self): ctxt = _fake_context() ctxt.request_id = str('99') try: raise RuntimeError('test_exception') except RuntimeError: self.log.warning('testing', context=ctxt) data = jsonutils.loads(self.stream.getvalue()) self.assertIn('error_summary', data) self.assertEqual('RuntimeError: test_exception', data['error_summary']) def test_no_exception(self): ctxt = _fake_context() ctxt.request_id = str('99') self.log.info('testing', context=ctxt) data = jsonutils.loads(self.stream.getvalue()) self.assertIn('error_summary', data) self.assertEqual('', data['error_summary']) def test_exception_without_exc_info_passed(self): ctxt = _fake_context() ctxt.request_id = str('99') try: raise RuntimeError('test_exception\ntraceback\nfrom\nremote error') except RuntimeError: self.log.warning('testing', context=ctxt) data = jsonutils.loads(self.stream.getvalue()) self.assertIn('error_summary', data) self.assertEqual('RuntimeError: test_exception', data['error_summary']) def test_exception_with_exc_info_passed(self): ctxt = _fake_context() ctxt.request_id = str('99') try: raise RuntimeError('test_exception\ntraceback\nfrom\nremote error') except RuntimeError: self.log.exception('testing', context=ctxt) data = jsonutils.loads(self.stream.getvalue()) self.assertIn('error_summary', data) self.assertEqual('RuntimeError: test_exception' '\ntraceback\nfrom\nremote error', data['error_summary']) def test_fallback(self): class MyObject(object): def __str__(self): return 'str' def __repr__(self): return 'repr' obj = MyObject() self.log.debug('obj=%s', obj) data = jsonutils.loads(self.stream.getvalue()) self.assertEqual('obj=str', data['message']) # Bug #1593641: If an object of record.args cannot be serialized, # convert it using repr() to prevent serialization error on logging. self.assertEqual(['repr'], data['args']) def test_extra_args_filtered(self): test_msg = 'This is a %(test)s line %%(unused)' test_data = {'test': 'log', 'unused': 'removeme'} self.log.debug(test_msg, test_data) data = jsonutils.loads(self.stream.getvalue()) self.assertNotIn('unused', data['args']) def test_entire_dict(self): test_msg = 'This is a %s dict' test_data = {'test': 'log', 'other': 'value'} self.log.debug(test_msg, test_data) data = jsonutils.loads(self.stream.getvalue()) self.assertEqual(test_data, data['args']) def get_fake_datetime(retval): class FakeDateTime(datetime.datetime): @classmethod def fromtimestamp(cls, timestamp): return retval return FakeDateTime class DictStreamHandler(logging.StreamHandler): """Serialize dict in order to avoid TypeError in python 3. It is needed for FluentFormatterTestCase. """ def emit(self, record): try: msg = self.format(record) jsonutils.dump(msg, self.stream) self.stream.flush() except AttributeError: self.handleError(record) class FluentFormatterTestCase(LogTestBase): def setUp(self): super(FluentFormatterTestCase, self).setUp() self.log = log.getLogger('test-fluent') self._add_handler_with_cleanup(self.log, handler=DictStreamHandler, formatter=formatters.FluentFormatter) self._set_log_level_with_cleanup(self.log, logging.DEBUG) def test_fluent(self): test_msg = 'This is a %(test)s line' test_data = {'test': 'log'} local_context = _fake_context() self.log.debug(test_msg, test_data, key='value', context=local_context) data = jsonutils.loads(self.stream.getvalue()) self.assertIn('lineno', data) self.assertIn('extra', data) extra = data['extra'] context = data['context'] self.assertEqual('value', extra['key']) self.assertEqual(local_context.user, context['user']) self.assertEqual('test-fluent', data['name']) self.assertIn('request_id', context) self.assertEqual(local_context.request_id, context['request_id']) self.assertIn('global_request_id', context) self.assertEqual(local_context.global_request_id, context['global_request_id']) self.assertEqual(test_msg % test_data, data['message']) self.assertEqual('test_log.py', data['filename']) self.assertEqual('test_fluent', data['funcname']) self.assertEqual('DEBUG', data['level']) self.assertFalse(data['traceback']) def test_exception(self): local_context = _fake_context() try: raise RuntimeError('test_exception') except RuntimeError: self.log.warning('testing', context=local_context) data = jsonutils.loads(self.stream.getvalue()) self.assertIn('error_summary', data) self.assertEqual('RuntimeError: test_exception', data['error_summary']) def test_no_exception(self): local_context = _fake_context() self.log.info('testing', context=local_context) data = jsonutils.loads(self.stream.getvalue()) self.assertIn('error_summary', data) self.assertEqual('', data['error_summary']) def test_json_exception(self): test_msg = 'This is %s' test_data = 'exceptional' try: raise Exception('This is exceptional') except Exception: self.log.exception(test_msg, test_data) data = jsonutils.loads(self.stream.getvalue()) self.assertTrue(data) self.assertIn('extra', data) self.assertEqual('test-fluent', data['name']) self.assertEqual(test_msg % test_data, data['message']) self.assertEqual('ERROR', data['level']) self.assertTrue(data['traceback']) class ContextFormatterTestCase(LogTestBase): def setUp(self): super(ContextFormatterTestCase, self).setUp() self.config(logging_context_format_string="HAS CONTEXT " "[%(request_id)s]: " "%(message)s", logging_default_format_string="NOCTXT: %(message)s", logging_debug_format_suffix="--DBG") self.log = log.getLogger('') # obtain root logger instead of 'unknown' self._add_handler_with_cleanup(self.log) self._set_log_level_with_cleanup(self.log, logging.DEBUG) self.trans_fixture = self.useFixture(fixture_trans.Translation()) def test_uncontextualized_log(self): message = 'foo' self.log.info(message) self.assertEqual("NOCTXT: %s\n" % message, self.stream.getvalue()) def test_contextualized_log(self): ctxt = _fake_context() message = 'bar' self.log.info(message, context=ctxt) expected = 'HAS CONTEXT [%s]: %s\n' % (ctxt.request_id, message) self.assertEqual(expected, self.stream.getvalue()) def test_context_is_taken_from_tls_variable(self): ctxt = _fake_context() message = 'bar' self.log.info(message) expected = "HAS CONTEXT [%s]: %s\n" % (ctxt.request_id, message) self.assertEqual(expected, self.stream.getvalue()) def test_contextual_information_is_imparted_to_3rd_party_log_records(self): ctxt = _fake_context() sa_log = logging.getLogger('sqlalchemy.engine') sa_log.setLevel(logging.INFO) message = 'emulate logging within sqlalchemy' sa_log.info(message) expected = ('HAS CONTEXT [%s]: %s\n' % (ctxt.request_id, message)) self.assertEqual(expected, self.stream.getvalue()) def test_message_logging_3rd_party_log_records(self): ctxt = _fake_context() ctxt.request_id = str('99') sa_log = logging.getLogger('sqlalchemy.engine') sa_log.setLevel(logging.INFO) message = self.trans_fixture.lazy('test ' + chr(128)) sa_log.info(message) expected = ('HAS CONTEXT [%s]: %s\n' % (ctxt.request_id, str(message))) self.assertEqual(expected, self.stream.getvalue()) def test_debugging_log(self): message = 'baz' self.log.debug(message) self.assertEqual("NOCTXT: %s --DBG\n" % message, self.stream.getvalue()) def test_message_logging(self): # NOTE(luisg): Logging message objects with unicode objects # may cause trouble by the logging mechanism trying to coerce # the Message object, with a wrong encoding. This test case # tests that problem does not occur. ctxt = _fake_context() ctxt.request_id = str('99') message = self.trans_fixture.lazy('test ' + chr(128)) self.log.info(message, context=ctxt) expected = "HAS CONTEXT [%s]: %s\n" % (ctxt.request_id, str(message)) self.assertEqual(expected, self.stream.getvalue()) def test_exception_logging(self): # NOTE(dhellmann): If there is an exception and %(error_summary)s # does not appear in the format string, ensure that it is # appended to the end of the log lines. ctxt = _fake_context() ctxt.request_id = str('99') message = self.trans_fixture.lazy('test ' + chr(128)) try: raise RuntimeError('test_exception_logging') except RuntimeError: self.log.warning(message, context=ctxt) expected = 'RuntimeError: test_exception_logging\n' self.assertTrue(self.stream.getvalue().endswith(expected)) def test_skip_logging_builtin_exceptions(self): # NOTE(dhellmann): Several of the built-in exception types # should not be automatically added to the log output. ctxt = _fake_context() ctxt.request_id = str('99') message = self.trans_fixture.lazy('test ' + chr(128)) ignored_exceptions = [ ValueError, TypeError, KeyError, AttributeError, ImportError ] for ignore in ignored_exceptions: try: raise ignore('test_exception_logging') except ignore as e: self.log.warning(message, context=ctxt) expected = '{}: {}'.format(e.__class__.__name__, e) self.assertNotIn(expected, self.stream.getvalue()) def test_exception_logging_format_string(self): # NOTE(dhellmann): If the format string includes # %(error_summary)s then ensure the exception message ends up in # that position in the output. self.config(logging_context_format_string="A %(error_summary)s B") ctxt = _fake_context() ctxt.request_id = str('99') message = self.trans_fixture.lazy('test ' + chr(128)) try: raise RuntimeError('test_exception_logging') except RuntimeError: self.log.warning(message, context=ctxt) expected = 'A RuntimeError: test_exception_logging' self.assertTrue(self.stream.getvalue().startswith(expected)) def test_no_exception_logging_format_string(self): # NOTE(dhellmann): If there is no exception but the format # string includes %(error_summary)s then ensure the "-" is # inserted. self.config(logging_context_format_string="%(error_summary)s") ctxt = _fake_context() ctxt.request_id = str('99') message = self.trans_fixture.lazy('test ' + chr(128)) self.log.info(message, context=ctxt) expected = '-\n' self.assertTrue(self.stream.getvalue().startswith(expected)) def test_unicode_conversion_in_adapter(self): ctxt = _fake_context() ctxt.request_id = str('99') message = "Exception is (%s)" ex = Exception(self.trans_fixture.lazy('test' + chr(128))) self.log.debug(message, ex, context=ctxt) message = str(message) % ex expected = "HAS CONTEXT [%s]: %s --DBG\n" % (ctxt.request_id, message) self.assertEqual(expected, self.stream.getvalue()) def test_unicode_conversion_in_formatter(self): ctxt = _fake_context() ctxt.request_id = str('99') no_adapt_log = logging.getLogger('no_adapt') no_adapt_log.setLevel(logging.INFO) message = "Exception is (%s)" ex = Exception(self.trans_fixture.lazy('test' + chr(128))) no_adapt_log.info(message, ex) message = str(message) % ex expected = "HAS CONTEXT [%s]: %s\n" % (ctxt.request_id, message) self.assertEqual(expected, self.stream.getvalue()) def test_user_identity_logging(self): self.config(logging_context_format_string="HAS CONTEXT " "[%(request_id)s " "%(user_identity)s]: " "%(message)s") ctxt = _fake_context() ctxt.request_id = u'99' message = 'test' self.log.info(message, context=ctxt) expected = ("HAS CONTEXT [%s %s %s %s %s %s]: %s\n" % (ctxt.request_id, ctxt.user, ctxt.tenant, ctxt.domain, ctxt.user_domain, ctxt.project_domain, str(message))) self.assertEqual(expected, self.stream.getvalue()) def test_user_identity_logging_set_format(self): self.config(logging_context_format_string="HAS CONTEXT " "[%(request_id)s " "%(user_identity)s]: " "%(message)s", logging_user_identity_format="%(user)s " "%(tenant)s") ctxt = _fake_context() ctxt.request_id = u'99' message = 'test' self.log.info(message, context=ctxt) expected = ("HAS CONTEXT [%s %s %s]: %s\n" % (ctxt.request_id, ctxt.user, ctxt.tenant, str(message))) self.assertEqual(expected, self.stream.getvalue()) @mock.patch("datetime.datetime", get_fake_datetime( datetime.datetime(2015, 12, 16, 13, 54, 26, 517893))) @mock.patch("dateutil.tz.tzlocal", new=mock.Mock(return_value=tz.tzutc())) def test_rfc5424_isotime_format(self): self.config(logging_default_format_string="%(isotime)s %(message)s") message = "test" expected = "2015-12-16T13:54:26.517893+00:00 %s\n" % message self.log.info(message) self.assertEqual(expected, self.stream.getvalue()) @mock.patch("datetime.datetime", get_fake_datetime( datetime.datetime(2015, 12, 16, 13, 54, 26))) @mock.patch("time.time", new=mock.Mock(return_value=1450274066.000000)) @mock.patch("dateutil.tz.tzlocal", new=mock.Mock(return_value=tz.tzutc())) def test_rfc5424_isotime_format_no_microseconds(self): self.config(logging_default_format_string="%(isotime)s %(message)s") message = "test" expected = "2015-12-16T13:54:26.000000+00:00 %s\n" % message self.log.info(message) self.assertEqual(expected, self.stream.getvalue()) def test_can_process_strings(self): expected = b'\xe2\x98\xa2' # logging format string should be unicode string # or it will fail and inserting byte string in unicode string # causes such formatting expected = '\\xe2\\x98\\xa2' self.log.info(b'%s', u'\u2622'.encode('utf8')) self.assertIn(expected, self.stream.getvalue()) def test_dict_args_with_unicode(self): msg = '%(thing)s' arg = {'thing': '\xc6\x91\xc6\xa1\xc6\xa1'} self.log.info(msg, arg) self.assertIn(arg['thing'], self.stream.getvalue()) class ExceptionLoggingTestCase(LogTestBase): """Test that Exceptions are logged.""" def test_excepthook_logs_exception(self): product_name = 'somename' exc_log = log.getLogger(product_name) self._add_handler_with_cleanup(exc_log) excepthook = log._create_logging_excepthook(product_name) try: raise Exception('Some error happened') except Exception: excepthook(*sys.exc_info()) expected_string = ("CRITICAL somename [-] Unhandled error: " "Exception: Some error happened") self.assertIn(expected_string, self.stream.getvalue(), message="Exception is not logged") def test_excepthook_installed(self): log.setup(self.CONF, "test_excepthook_installed") self.assertTrue(sys.excepthook != sys.__excepthook__) @mock.patch("datetime.datetime", get_fake_datetime( datetime.datetime(2015, 12, 16, 13, 54, 26, 517893))) @mock.patch("dateutil.tz.tzlocal", new=mock.Mock(return_value=tz.tzutc())) def test_rfc5424_isotime_format(self): self.config( logging_default_format_string="%(isotime)s %(message)s", logging_exception_prefix="%(isotime)s ", ) product_name = 'somename' exc_log = log.getLogger(product_name) self._add_handler_with_cleanup(exc_log) excepthook = log._create_logging_excepthook(product_name) message = 'Some error happened' try: raise Exception(message) except Exception: excepthook(*sys.exc_info()) expected_string = ("2015-12-16T13:54:26.517893+00:00 " "Exception: %s" % message) self.assertIn(expected_string, self.stream.getvalue()) class FancyRecordTestCase(LogTestBase): """Test how we handle fancy record keys that are not in the base python logging. """ def setUp(self): super(FancyRecordTestCase, self).setUp() # NOTE(sdague): use the different formatters to demonstrate format # string with valid fancy keys and without. Slightly hacky, but given # the way log objects layer up seemed to be most concise approach self.config(logging_context_format_string="%(color)s " "[%(request_id)s]: " "%(instance)s" "%(resource)s" "%(message)s", logging_default_format_string="%(missing)s: %(message)s") self.colorlog = log.getLogger() self._add_handler_with_cleanup(self.colorlog, handlers.ColorHandler) self._set_log_level_with_cleanup(self.colorlog, logging.DEBUG) def test_unsupported_key_in_log_msg(self): # NOTE(sdague): exception logging bypasses the main stream # and goes to stderr. Suggests on a better way to do this are # welcomed. error = sys.stderr sys.stderr = io.StringIO() self.colorlog.info("foo") self.assertNotEqual(-1, sys.stderr.getvalue().find("KeyError: 'missing'")) sys.stderr = error def _validate_keys(self, ctxt, keyed_log_string): infocolor = handlers.ColorHandler.LEVEL_COLORS[logging.INFO] warncolor = handlers.ColorHandler.LEVEL_COLORS[logging.WARN] info_msg = 'info' warn_msg = 'warn' infoexpected = "%s %s %s" % (infocolor, keyed_log_string, info_msg) warnexpected = "%s %s %s" % (warncolor, keyed_log_string, warn_msg) self.colorlog.info(info_msg, context=ctxt) self.assertIn(infoexpected, self.stream.getvalue()) self.assertEqual('\033[00;36m', infocolor) self.colorlog.warn(warn_msg, context=ctxt) self.assertIn(infoexpected, self.stream.getvalue()) self.assertIn(warnexpected, self.stream.getvalue()) self.assertEqual('\033[01;33m', warncolor) def test_fancy_key_in_log_msg(self): ctxt = _fake_context() self._validate_keys(ctxt, '[%s]:' % ctxt.request_id) def test_instance_key_in_log_msg(self): ctxt = _fake_context() ctxt.resource_uuid = '1234' self._validate_keys(ctxt, ('[%s]: [instance: %s]' % (ctxt.request_id, ctxt.resource_uuid))) def test_resource_key_in_log_msg(self): color = handlers.ColorHandler.LEVEL_COLORS[logging.INFO] ctxt = _fake_context() resource = 'resource-202260f9-1224-490d-afaf-6a744c13141f' fake_resource = {'name': resource} message = 'info' self.colorlog.info(message, context=ctxt, resource=fake_resource) expected = ('%s [%s]: [%s] %s\033[00m\n' % (color, ctxt.request_id, resource, message)) self.assertEqual(expected, self.stream.getvalue()) def test_resource_key_dict_in_log_msg(self): color = handlers.ColorHandler.LEVEL_COLORS[logging.INFO] ctxt = _fake_context() type = 'fake_resource' resource_id = '202260f9-1224-490d-afaf-6a744c13141f' fake_resource = {'type': type, 'id': resource_id} message = 'info' self.colorlog.info(message, context=ctxt, resource=fake_resource) expected = ('%s [%s]: [%s-%s] %s\033[00m\n' % (color, ctxt.request_id, type, resource_id, message)) self.assertEqual(expected, self.stream.getvalue()) class InstanceRecordTestCase(LogTestBase): def setUp(self): super(InstanceRecordTestCase, self).setUp() self.config(logging_context_format_string="[%(request_id)s]: " "%(instance)s" "%(resource)s" "%(message)s", logging_default_format_string="%(instance)s" "%(resource)s" "%(message)s") self.log = log.getLogger() self._add_handler_with_cleanup(self.log) self._set_log_level_with_cleanup(self.log, logging.DEBUG) def test_instance_dict_in_context_log_msg(self): ctxt = _fake_context() uuid = 'C9B7CCC6-8A12-4C53-A736-D7A1C36A62F3' fake_resource = {'uuid': uuid} message = 'info' self.log.info(message, context=ctxt, instance=fake_resource) expected = '[instance: %s]' % uuid self.assertIn(expected, self.stream.getvalue()) def test_instance_dict_in_default_log_msg(self): uuid = 'C9B7CCC6-8A12-4C53-A736-D7A1C36A62F3' fake_resource = {'uuid': uuid} message = 'info' self.log.info(message, instance=fake_resource) expected = '[instance: %s]' % uuid self.assertIn(expected, self.stream.getvalue()) def test_instance_uuid_as_arg_in_context_log_msg(self): ctxt = _fake_context() uuid = 'C9B7CCC6-8A12-4C53-A736-D7A1C36A62F3' message = 'info' self.log.info(message, context=ctxt, instance_uuid=uuid) expected = '[instance: %s]' % uuid self.assertIn(expected, self.stream.getvalue()) def test_instance_uuid_as_arg_in_default_log_msg(self): uuid = 'C9B7CCC6-8A12-4C53-A736-D7A1C36A62F3' message = 'info' self.log.info(message, instance_uuid=uuid) expected = '[instance: %s]' % uuid self.assertIn(expected, self.stream.getvalue()) def test_instance_uuid_from_context_in_context_log_msg(self): ctxt = _fake_context() ctxt.instance_uuid = 'CCCCCCCC-8A12-4C53-A736-D7A1C36A62F3' message = 'info' self.log.info(message, context=ctxt) expected = '[instance: %s]' % ctxt.instance_uuid self.assertIn(expected, self.stream.getvalue()) def test_resource_uuid_from_context_in_context_log_msg(self): ctxt = _fake_context() ctxt.resource_uuid = 'RRRRRRRR-8A12-4C53-A736-D7A1C36A62F3' message = 'info' self.log.info(message, context=ctxt) expected = '[instance: %s]' % ctxt.resource_uuid self.assertIn(expected, self.stream.getvalue()) def test_instance_from_context_in_context_log_msg(self): # NOTE: instance when passed in a context object is just a uuid. # When passed to the log record, it is a dict. ctxt = _fake_context() ctxt.instance = 'IIIIIIII-8A12-4C53-A736-D7A1C36A62F3' message = 'info' self.log.info(message, context=ctxt) expected = '[instance: %s]' % ctxt.instance self.assertIn(expected, self.stream.getvalue()) class TraceLevelTestCase(LogTestBase): def setUp(self): super(TraceLevelTestCase, self).setUp() self.config(logging_context_format_string="%(message)s") self.mylog = log.getLogger() self._add_handler_with_cleanup(self.mylog) self._set_log_level_with_cleanup(self.mylog, log.TRACE) def test_trace_log_msg(self): ctxt = _fake_context() message = 'my trace message' self.mylog.trace(message, context=ctxt) self.assertEqual('%s\n' % message, self.stream.getvalue()) class DomainTestCase(LogTestBase): def setUp(self): super(DomainTestCase, self).setUp() self.config(logging_context_format_string="[%(request_id)s]: " "%(user_identity)s " "%(message)s") self.mylog = log.getLogger() self._add_handler_with_cleanup(self.mylog) self._set_log_level_with_cleanup(self.mylog, logging.DEBUG) def _validate_keys(self, ctxt, keyed_log_string): info_message = 'info' infoexpected = "%s %s\n" % (keyed_log_string, info_message) warn_message = 'warn' warnexpected = "%s %s\n" % (keyed_log_string, warn_message) self.mylog.info(info_message, context=ctxt) self.assertEqual(infoexpected, self.stream.getvalue()) self.mylog.warn(warn_message, context=ctxt) self.assertEqual(infoexpected + warnexpected, self.stream.getvalue()) def test_domain_in_log_msg(self): ctxt = _fake_context() user_identity = ctxt.get_logging_values()['user_identity'] self.assertIn(ctxt.domain, user_identity) self.assertIn(ctxt.project_domain, user_identity) self.assertIn(ctxt.user_domain, user_identity) self._validate_keys(ctxt, ('[%s]: %s' % (ctxt.request_id, user_identity))) class SetDefaultsTestCase(BaseTestCase): class TestConfigOpts(cfg.ConfigOpts): def __call__(self, args=None): return cfg.ConfigOpts.__call__(self, args=args, prog='test', version='1.0', usage='%(prog)s FOO BAR', default_config_files=[]) def setUp(self): super(SetDefaultsTestCase, self).setUp() self.conf = self.TestConfigOpts() self.conf.register_opts(_options.log_opts) self.conf.register_cli_opts(_options.logging_cli_opts) self._orig_defaults = dict([(o.dest, o.default) for o in _options.log_opts]) self.addCleanup(self._restore_log_defaults) def _restore_log_defaults(self): for opt in _options.log_opts: opt.default = self._orig_defaults[opt.dest] def test_default_log_level_to_none(self): log.set_defaults(logging_context_format_string=None, default_log_levels=None) self.conf([]) self.assertEqual(_options.DEFAULT_LOG_LEVELS, self.conf.default_log_levels) def test_default_log_level_method(self): self.assertEqual(_options.DEFAULT_LOG_LEVELS, log.get_default_log_levels()) def test_change_default(self): my_default = '%(asctime)s %(levelname)s %(name)s [%(request_id)s '\ '%(user_id)s %(project)s] %(instance)s'\ '%(message)s' log.set_defaults(logging_context_format_string=my_default) self.conf([]) self.assertEqual(self.conf.logging_context_format_string, my_default) def test_change_default_log_level(self): package_log_level = 'foo=bar' log.set_defaults(default_log_levels=[package_log_level]) self.conf([]) self.assertEqual([package_log_level], self.conf.default_log_levels) self.assertIsNotNone(self.conf.logging_context_format_string) def test_tempest_set_log_file(self): log_file = 'foo.log' log.tempest_set_log_file(log_file) self.addCleanup(log.tempest_set_log_file, None) log.set_defaults() self.conf([]) self.assertEqual(log_file, self.conf.log_file) def test_log_file_defaults_to_none(self): log.set_defaults() self.conf([]) self.assertIsNone(self.conf.log_file) @testtools.skipIf(platform.system() != 'Linux', 'pyinotify library works on Linux platform only.') class FastWatchedFileHandlerTestCase(BaseTestCase): def setUp(self): super(FastWatchedFileHandlerTestCase, self).setUp() def _config(self): os_level, log_path = tempfile.mkstemp() log_dir_path = os.path.dirname(log_path) log_file_path = os.path.basename(log_path) self.CONF(['--log-dir', log_dir_path, '--log-file', log_file_path]) self.config(use_stderr=False) self.config(watch_log_file=True) log.setup(self.CONF, 'test', 'test') return log_path def test_instantiate(self): self._config() logger = log._loggers[None].logger self.assertEqual(1, len(logger.handlers)) from oslo_log import watchers self.assertIsInstance(logger.handlers[0], watchers.FastWatchedFileHandler) def test_log(self): log_path = self._config() logger = log._loggers[None].logger text = 'Hello World!' logger.info(text) with open(log_path, 'r') as f: file_content = f.read() self.assertIn(text, file_content) def test_move(self): log_path = self._config() os_level_dst, log_path_dst = tempfile.mkstemp() os.rename(log_path, log_path_dst) time.sleep(6) self.assertTrue(os.path.exists(log_path)) def test_remove(self): log_path = self._config() os.remove(log_path) time.sleep(6) self.assertTrue(os.path.exists(log_path)) class MutateTestCase(BaseTestCase): def setUp(self): super(MutateTestCase, self).setUp() if hasattr(log._load_log_config, 'old_time'): del log._load_log_config.old_time def setup_confs(self, *confs): paths = self.create_tempfiles( ('conf_%d' % i, conf) for i, conf in enumerate(confs)) self.CONF(['--config-file', paths[0]]) return paths def test_debug(self): paths = self.setup_confs( "[DEFAULT]\ndebug = false\n", "[DEFAULT]\ndebug = true\n") log_root = log.getLogger(None).logger log._setup_logging_from_conf(self.CONF, 'test', 'test') self.assertEqual(False, self.CONF.debug) self.assertEqual(log.INFO, log_root.getEffectiveLevel()) shutil.copy(paths[1], paths[0]) self.CONF.mutate_config_files() self.assertEqual(True, self.CONF.debug) self.assertEqual(log.DEBUG, log_root.getEffectiveLevel()) @mock.patch.object(logging.config, "fileConfig") def test_log_config_append(self, mock_fileConfig): logini = self.create_tempfiles([('log.ini', MIN_LOG_INI)])[0] paths = self.setup_confs( "[DEFAULT]\nlog_config_append = no_exist\n", "[DEFAULT]\nlog_config_append = %s\n" % logini) self.assertRaises(log.LogConfigError, log.setup, self.CONF, '') self.assertFalse(mock_fileConfig.called) shutil.copy(paths[1], paths[0]) self.CONF.mutate_config_files() mock_fileConfig.assert_called_once_with( logini, disable_existing_loggers=False) @mock.patch.object(logging.config, "fileConfig") def test_log_config_append_no_touch(self, mock_fileConfig): logini = self.create_tempfiles([('log.ini', MIN_LOG_INI)])[0] self.setup_confs("[DEFAULT]\nlog_config_append = %s\n" % logini) log.setup(self.CONF, '') mock_fileConfig.assert_called_once_with( logini, disable_existing_loggers=False) mock_fileConfig.reset_mock() self.CONF.mutate_config_files() self.assertFalse(mock_fileConfig.called) @mock.patch.object(logging.config, "fileConfig") def test_log_config_append_touch(self, mock_fileConfig): logini = self.create_tempfiles([('log.ini', MIN_LOG_INI)])[0] self.setup_confs("[DEFAULT]\nlog_config_append = %s\n" % logini) log.setup(self.CONF, '') mock_fileConfig.assert_called_once_with( logini, disable_existing_loggers=False) mock_fileConfig.reset_mock() # No thread sync going on here, just ensure the mtimes are different time.sleep(1) os.utime(logini, None) self.CONF.mutate_config_files() mock_fileConfig.assert_called_once_with( logini, disable_existing_loggers=False) def mk_log_config(self, data): """Turns a dictConfig-like structure into one suitable for fileConfig. The schema is not validated as this is a test helper not production code. Garbage in, garbage out. Particularly, don't try to use filters, fileConfig doesn't support them. Handler args must be passed like 'args': (1, 2). dictConfig passes keys by keyword name and fileConfig passes them by position so accepting the dictConfig form makes it nigh impossible to produce the fileConfig form. I traverse dicts by sorted keys for output stability but it doesn't matter if defaulted keys are out of order. """ lines = [] for section in ['formatters', 'handlers', 'loggers']: items = data.get(section, {}) keys = sorted(items) skeys = ",".join(keys) if section == 'loggers' and 'root' in data: skeys = ("root," + skeys) if skeys else "root" lines.extend(["[%s]" % section, "keys=%s" % skeys]) for key in keys: lines.extend(["", "[%s_%s]" % (section[:-1], key)]) item = items[key] lines.extend("%s=%s" % (k, item[k]) for k in sorted(item)) if section == 'handlers': if 'args' not in item: lines.append("args=()") elif section == 'loggers': lines.append("qualname=%s" % key) if 'handlers' not in item: lines.append("handlers=") lines.append("") root = data.get('root', {}) if root: lines.extend(["[logger_root]"]) lines.extend("%s=%s" % (k, root[k]) for k in sorted(root)) if 'handlers' not in root: lines.append("handlers=") return "\n".join(lines) def test_mk_log_config_full(self): data = {'loggers': {'aaa': {'level': 'INFO'}, 'bbb': {'level': 'WARN', 'propagate': False}}, 'handlers': {'aaa': {'level': 'INFO'}, 'bbb': {'level': 'WARN', 'propagate': False, 'args': (1, 2)}}, 'formatters': {'aaa': {'level': 'INFO'}, 'bbb': {'level': 'WARN', 'propagate': False}}, 'root': {'level': 'INFO', 'handlers': 'aaa'}, } full = """[formatters] keys=aaa,bbb [formatter_aaa] level=INFO [formatter_bbb] level=WARN propagate=False [handlers] keys=aaa,bbb [handler_aaa] level=INFO args=() [handler_bbb] args=(1, 2) level=WARN propagate=False [loggers] keys=root,aaa,bbb [logger_aaa] level=INFO qualname=aaa handlers= [logger_bbb] level=WARN propagate=False qualname=bbb handlers= [logger_root] handlers=aaa level=INFO""" self.assertEqual(full, self.mk_log_config(data)) def test_mk_log_config_empty(self): """Ensure mk_log_config tolerates missing bits""" empty = """[formatters] keys= [handlers] keys= [loggers] keys= """ self.assertEqual(empty, self.mk_log_config({})) @contextmanager def mutate_conf(self, conf1, conf2): loginis = self.create_tempfiles([ ('log1.ini', self.mk_log_config(conf1)), ('log2.ini', self.mk_log_config(conf2))]) confs = self.setup_confs( "[DEFAULT]\nlog_config_append = %s\n" % loginis[0], "[DEFAULT]\nlog_config_append = %s\n" % loginis[1]) log.setup(self.CONF, '') yield loginis, confs shutil.copy(confs[1], confs[0]) # prevent the mtime ever matching os.utime(self.CONF.log_config_append, (0, 0)) self.CONF.mutate_config_files() @mock.patch.object(logging.config, "fileConfig") def test_log_config_append_change_file(self, mock_fileConfig): with self.mutate_conf({}, {}) as (loginis, confs): mock_fileConfig.assert_called_once_with( loginis[0], disable_existing_loggers=False) mock_fileConfig.reset_mock() mock_fileConfig.assert_called_once_with( loginis[1], disable_existing_loggers=False) def set_root_stream(self): root = logging.getLogger() self.assertEqual(1, len(root.handlers)) handler = root.handlers[0] handler.stream = io.StringIO() return handler.stream def test_remove_handler(self): fake_handler = {'class': 'logging.StreamHandler', 'args': ()} conf1 = {'root': {'handlers': 'fake'}, 'handlers': {'fake': fake_handler}} conf2 = {'root': {'handlers': ''}} with self.mutate_conf(conf1, conf2) as (loginis, confs): stream = self.set_root_stream() root = logging.getLogger() root.error("boo") self.assertEqual("boo\n", stream.getvalue()) stream.truncate(0) root.error("boo") self.assertEqual("", stream.getvalue()) def test_remove_logger(self): fake_handler = {'class': 'logging.StreamHandler'} fake_logger = {'level': 'WARN'} conf1 = {'root': {'handlers': 'fake'}, 'handlers': {'fake': fake_handler}, 'loggers': {'a.a': fake_logger}} conf2 = {'root': {'handlers': 'fake'}, 'handlers': {'fake': fake_handler}} stream = io.StringIO() with self.mutate_conf(conf1, conf2) as (loginis, confs): stream = self.set_root_stream() log = logging.getLogger("a.a") log.info("info") log.warn("warn") self.assertEqual("warn\n", stream.getvalue()) stream = self.set_root_stream() log.info("info") log.warn("warn") self.assertEqual("info\nwarn\n", stream.getvalue()) class LogConfigOptsTestCase(BaseTestCase): def setUp(self): super(LogConfigOptsTestCase, self).setUp() def test_print_help(self): f = io.StringIO() self.CONF([]) self.CONF.print_help(file=f) for option in ['debug', 'log-config', 'watch-log-file']: self.assertIn(option, f.getvalue()) def test_debug(self): self.CONF(['--debug']) self.assertEqual(True, self.CONF.debug) def test_logging_opts(self): self.CONF([]) self.assertIsNone(self.CONF.log_config_append) self.assertIsNone(self.CONF.log_file) self.assertIsNone(self.CONF.log_dir) self.assertEqual(_options._DEFAULT_LOG_DATE_FORMAT, self.CONF.log_date_format) self.assertEqual(False, self.CONF.use_syslog) self.assertEqual(False, self.CONF.use_json) def test_log_file(self): log_file = '/some/path/foo-bar.log' self.CONF(['--log-file', log_file]) self.assertEqual(log_file, self.CONF.log_file) def test_log_dir_handlers(self): log_dir = tempfile.mkdtemp() self.CONF(['--log-dir', log_dir]) self.CONF.set_default('use_stderr', False) log._setup_logging_from_conf(self.CONF, 'test', 'test') logger = log._loggers[None].logger self.assertEqual(1, len(logger.handlers)) self.assertIsInstance(logger.handlers[0], logging.handlers.WatchedFileHandler) def test_log_publish_errors_handlers(self): fake_handler = mock.MagicMock() with mock.patch('oslo_utils.importutils.import_object', return_value=fake_handler) as mock_import: log_dir = tempfile.mkdtemp() self.CONF(['--log-dir', log_dir]) self.CONF.set_default('use_stderr', False) self.CONF.set_default('publish_errors', True) log._setup_logging_from_conf(self.CONF, 'test', 'test') logger = log._loggers[None].logger self.assertEqual(2, len(logger.handlers)) self.assertIsInstance(logger.handlers[0], logging.handlers.WatchedFileHandler) self.assertEqual(fake_handler, logger.handlers[1]) mock_import.assert_called_once_with( 'oslo_messaging.notify.log_handler.PublishErrorsHandler', logging.ERROR) def test_logfile_deprecated(self): logfile = '/some/other/path/foo-bar.log' self.CONF(['--logfile', logfile]) self.assertEqual(logfile, self.CONF.log_file) def test_log_dir(self): log_dir = '/some/path/' self.CONF(['--log-dir', log_dir]) self.assertEqual(log_dir, self.CONF.log_dir) def test_logdir_deprecated(self): logdir = '/some/other/path/' self.CONF(['--logdir', logdir]) self.assertEqual(logdir, self.CONF.log_dir) def test_default_formatter(self): log._setup_logging_from_conf(self.CONF, 'test', 'test') logger = log._loggers[None].logger for handler in logger.handlers: formatter = handler.formatter self.assertIsInstance(formatter, formatters.ContextFormatter) def test_json_formatter(self): self.CONF(['--use-json']) log._setup_logging_from_conf(self.CONF, 'test', 'test') logger = log._loggers[None].logger for handler in logger.handlers: formatter = handler.formatter self.assertIsInstance(formatter, formatters.JSONFormatter) def test_handlers_cleanup(self): """Test that all old handlers get removed from log_root.""" old_handlers = [log.handlers.ColorHandler(), log.handlers.ColorHandler()] log._loggers[None].logger.handlers = list(old_handlers) log._setup_logging_from_conf(self.CONF, 'test', 'test') handlers = log._loggers[None].logger.handlers self.assertEqual(1, len(handlers)) self.assertNotIn(handlers[0], old_handlers) def test_list_opts(self): all_options = _options.list_opts() (group, options) = all_options[0] self.assertIsNone(group) self.assertEqual((_options.common_cli_opts + _options.logging_cli_opts + _options.generic_log_opts + _options.log_opts + _options.versionutils.deprecated_opts), options) class LogConfigTestCase(BaseTestCase): def setUp(self): super(LogConfigTestCase, self).setUp() names = self.create_tempfiles([('logging', MIN_LOG_INI)]) self.log_config_append = names[0] if hasattr(log._load_log_config, 'old_time'): del log._load_log_config.old_time def test_log_config_append_ok(self): self.config(log_config_append=self.log_config_append) log.setup(self.CONF, 'test_log_config_append') def test_log_config_append_not_exist(self): os.remove(self.log_config_append) self.config(log_config_append=self.log_config_append) self.assertRaises(log.LogConfigError, log.setup, self.CONF, 'test_log_config_append') def test_log_config_append_invalid(self): names = self.create_tempfiles([('logging', 'squawk')]) self.log_config_append = names[0] self.config(log_config_append=self.log_config_append) self.assertRaises(log.LogConfigError, log.setup, self.CONF, 'test_log_config_append') def test_log_config_append_unreadable(self): os.chmod(self.log_config_append, 0) self.config(log_config_append=self.log_config_append) self.assertRaises(log.LogConfigError, log.setup, self.CONF, 'test_log_config_append') def test_log_config_append_disable_existing_loggers(self): self.config(log_config_append=self.log_config_append) with mock.patch('logging.config.fileConfig') as fileConfig: log.setup(self.CONF, 'test_log_config_append') fileConfig.assert_called_once_with(self.log_config_append, disable_existing_loggers=False) class SavingAdapter(log.KeywordArgumentAdapter): def __init__(self, *args, **kwds): super(log.KeywordArgumentAdapter, self).__init__(*args, **kwds) self.results = [] def process(self, msg, kwargs): # Run the real adapter and save the inputs and outputs # before returning them so the test can examine both. results = super(SavingAdapter, self).process(msg, kwargs) self.results.append((msg, kwargs, results)) return results class KeywordArgumentAdapterTestCase(BaseTestCase): def setUp(self): super(KeywordArgumentAdapterTestCase, self).setUp() # Construct a mock that will look like a Logger configured to # emit messages at DEBUG or higher. self.mock_log = mock.Mock() self.mock_log.manager.disable = logging.NOTSET self.mock_log.isEnabledFor.return_value = True self.mock_log.getEffectiveLevel.return_value = logging.DEBUG def test_empty_kwargs(self): a = log.KeywordArgumentAdapter(self.mock_log, {}) msg, kwargs = a.process('message', {}) self.assertEqual({'extra': {'extra_keys': []}}, kwargs) def test_include_constructor_extras(self): key = 'foo' val = 'blah' data = {key: val} a = log.KeywordArgumentAdapter(self.mock_log, data) msg, kwargs = a.process('message', {}) self.assertEqual({'extra': {key: val, 'extra_keys': [key]}}, kwargs) def test_pass_through_exc_info(self): a = log.KeywordArgumentAdapter(self.mock_log, {}) exc_message = 'exception' msg, kwargs = a.process('message', {'exc_info': exc_message}) self.assertEqual( {'extra': {'extra_keys': []}, 'exc_info': exc_message}, kwargs) def test_update_extras(self): a = log.KeywordArgumentAdapter(self.mock_log, {}) data = {'context': 'some context object', 'instance': 'instance identifier', 'resource_uuid': 'UUID for instance', 'anything': 'goes'} expected = copy.copy(data) msg, kwargs = a.process('message', data) self.assertEqual( {'extra': {'anything': expected['anything'], 'context': expected['context'], 'extra_keys': sorted(expected.keys()), 'instance': expected['instance'], 'resource_uuid': expected['resource_uuid']}}, kwargs) def test_pass_args_to_log(self): a = SavingAdapter(self.mock_log, {}) message = 'message' exc_message = 'exception' val = 'value' a.log(logging.DEBUG, message, name=val, exc_info=exc_message) expected = { 'exc_info': exc_message, 'extra': {'name': val, 'extra_keys': ['name']}, } actual = a.results[0] self.assertEqual(message, actual[0]) self.assertEqual(expected, actual[1]) results = actual[2] self.assertEqual(message, results[0]) self.assertEqual(expected, results[1]) def test_pass_args_via_debug(self): a = SavingAdapter(self.mock_log, {}) message = 'message' exc_message = 'exception' val = 'value' a.debug(message, name=val, exc_info=exc_message) expected = { 'exc_info': exc_message, 'extra': {'name': val, 'extra_keys': ['name']}, } actual = a.results[0] self.assertEqual(message, actual[0]) self.assertEqual(expected, actual[1]) results = actual[2] self.assertEqual(message, results[0]) self.assertEqual(expected, results[1]) class UnicodeConversionTestCase(BaseTestCase): _MSG = u'Message with unicode char \ua000 in the middle' def test_ascii_to_unicode(self): msg = self._MSG enc_msg = msg.encode('utf-8') result = formatters._ensure_unicode(enc_msg) self.assertEqual(msg, result) self.assertIsInstance(result, str) def test_unicode_to_unicode(self): msg = self._MSG result = formatters._ensure_unicode(msg) self.assertEqual(msg, result) self.assertIsInstance(result, str) def test_exception_to_unicode(self): msg = self._MSG exc = Exception(msg) result = formatters._ensure_unicode(exc) self.assertEqual(msg, result) self.assertIsInstance(result, str) class LoggerNameTestCase(LoggerTestCase): def test_oslo_dot(self): logger_name = 'oslo.subname' logger = log.getLogger(logger_name) self.assertEqual(logger_name, logger.logger.name) def test_oslo_underscore(self): logger_name = 'oslo_subname' expected = logger_name.replace('_', '.') logger = log.getLogger(logger_name) self.assertEqual(expected, logger.logger.name) class IsDebugEnabledTestCase(test_base.BaseTestCase): def setUp(self): super(IsDebugEnabledTestCase, self).setUp() self.config_fixture = self.useFixture( fixture_config.Config(cfg.ConfigOpts())) self.config = self.config_fixture.config self.CONF = self.config_fixture.conf log.register_options(self.config_fixture.conf) def _test_is_debug_enabled(self, debug=False): self.config(debug=debug) self.assertEqual(debug, log.is_debug_enabled(self.CONF)) def test_is_debug_enabled_off(self): self._test_is_debug_enabled() def test_is_debug_enabled_on(self): self._test_is_debug_enabled(debug=True) oslo.log-4.1.1/oslo_log/tests/unit/test_versionutils.py0000664000175000017500000003334313643050265023422 0ustar zuulzuul00000000000000# Copyright (c) 2013 OpenStack Foundation # All Rights Reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. from unittest import mock from oslotest import base as test_base from testtools import matchers from oslo_log import versionutils class DeprecatedTestCase(test_base.BaseTestCase): def assert_deprecated(self, mock_reporter, no_removal=False, **expected_details): if 'in_favor_of' in expected_details: if no_removal is False: expected_msg = versionutils._deprecated_msg_with_alternative else: expected_msg = getattr( versionutils, '_deprecated_msg_with_alternative_no_removal') else: if no_removal is False: expected_msg = versionutils._deprecated_msg_no_alternative else: expected_msg = getattr( versionutils, '_deprecated_msg_with_no_alternative_no_removal') # The first argument is the logger, and we don't care about # that, so ignore it with ANY. mock_reporter.assert_called_with(mock.ANY, expected_msg, expected_details) @mock.patch('oslo_log.versionutils.report_deprecated_feature') def test_deprecating_a_function_returns_correct_value(self, mock_reporter): @versionutils.deprecated(as_of=versionutils.deprecated.ICEHOUSE) def do_outdated_stuff(data): return data expected_rv = 'expected return value' retval = do_outdated_stuff(expected_rv) self.assertThat(retval, matchers.Equals(expected_rv)) @mock.patch('oslo_log.versionutils.report_deprecated_feature') def test_deprecating_a_method_returns_correct_value(self, mock_reporter): class C(object): @versionutils.deprecated(as_of=versionutils.deprecated.ICEHOUSE) def outdated_method(self, *args): return args retval = C().outdated_method(1, 'of anything') self.assertThat(retval, matchers.Equals((1, 'of anything'))) @mock.patch('oslo_log.versionutils.report_deprecated_feature') def test_deprecated_with_unknown_future_release(self, mock_reporter): @versionutils.deprecated(as_of=versionutils.deprecated.BEXAR, in_favor_of='different_stuff()') def do_outdated_stuff(): return do_outdated_stuff() self.assert_deprecated(mock_reporter, what='do_outdated_stuff()', in_favor_of='different_stuff()', as_of='Bexar', remove_in='D') @mock.patch('oslo_log.versionutils.report_deprecated_feature') def test_deprecated_with_known_future_release(self, mock_reporter): @versionutils.deprecated(as_of=versionutils.deprecated.GRIZZLY, in_favor_of='different_stuff()') def do_outdated_stuff(): return do_outdated_stuff() self.assert_deprecated(mock_reporter, what='do_outdated_stuff()', in_favor_of='different_stuff()', as_of='Grizzly', remove_in='Icehouse') @mock.patch('oslo_log.versionutils.report_deprecated_feature') def test_deprecated_without_replacement(self, mock_reporter): @versionutils.deprecated(as_of=versionutils.deprecated.GRIZZLY) def do_outdated_stuff(): return do_outdated_stuff() self.assert_deprecated(mock_reporter, what='do_outdated_stuff()', as_of='Grizzly', remove_in='Icehouse') @mock.patch('oslo_log.versionutils.report_deprecated_feature') def test_deprecated_with_custom_what(self, mock_reporter): @versionutils.deprecated(as_of=versionutils.deprecated.GRIZZLY, what='v2.0 API', in_favor_of='v3 API') def do_outdated_stuff(): return do_outdated_stuff() self.assert_deprecated(mock_reporter, what='v2.0 API', in_favor_of='v3 API', as_of='Grizzly', remove_in='Icehouse') @mock.patch('oslo_log.versionutils.report_deprecated_feature') def test_deprecated_with_removed_next_release(self, mock_reporter): @versionutils.deprecated(as_of=versionutils.deprecated.GRIZZLY, remove_in=1) def do_outdated_stuff(): return do_outdated_stuff() self.assert_deprecated(mock_reporter, what='do_outdated_stuff()', as_of='Grizzly', remove_in='Havana') @mock.patch('oslo_log.versionutils.report_deprecated_feature') def test_deprecated_with_removed_plus_3(self, mock_reporter): @versionutils.deprecated(as_of=versionutils.deprecated.GRIZZLY, remove_in=+3) def do_outdated_stuff(): return do_outdated_stuff() self.assert_deprecated(mock_reporter, what='do_outdated_stuff()', as_of='Grizzly', remove_in='Juno') @mock.patch('oslo_log.versionutils.report_deprecated_feature') def test_deprecated_with_removed_zero(self, mock_reporter): @versionutils.deprecated(as_of=versionutils.deprecated.GRIZZLY, remove_in=0) def do_outdated_stuff(): return do_outdated_stuff() self.assert_deprecated(mock_reporter, no_removal=True, what='do_outdated_stuff()', as_of='Grizzly', remove_in='Grizzly') @mock.patch('oslo_log.versionutils.report_deprecated_feature') def test_deprecated_with_removed_none(self, mock_reporter): @versionutils.deprecated(as_of=versionutils.deprecated.GRIZZLY, remove_in=None) def do_outdated_stuff(): return do_outdated_stuff() self.assert_deprecated(mock_reporter, no_removal=True, what='do_outdated_stuff()', as_of='Grizzly', remove_in='Grizzly') @mock.patch('oslo_log.versionutils.report_deprecated_feature') def test_deprecated_with_removed_zero_and_alternative(self, mock_reporter): @versionutils.deprecated(as_of=versionutils.deprecated.GRIZZLY, in_favor_of='different_stuff()', remove_in=0) def do_outdated_stuff(): return do_outdated_stuff() self.assert_deprecated(mock_reporter, no_removal=True, what='do_outdated_stuff()', as_of='Grizzly', in_favor_of='different_stuff()', remove_in='Grizzly') @mock.patch('oslo_log.versionutils.report_deprecated_feature') def test_deprecated_class_without_init(self, mock_reporter): @versionutils.deprecated(as_of=versionutils.deprecated.JUNO, remove_in=+1) class OutdatedClass(object): pass obj = OutdatedClass() self.assertIsInstance(obj, OutdatedClass) self.assert_deprecated(mock_reporter, what='OutdatedClass()', as_of='Juno', remove_in='Kilo') @mock.patch('oslo_log.versionutils.report_deprecated_feature') def test_deprecated_class_with_init(self, mock_reporter): mock_arguments = mock.MagicMock() args = (1, 5, 7) kwargs = {'first': 10, 'second': 20} @versionutils.deprecated(as_of=versionutils.deprecated.JUNO, remove_in=+1) class OutdatedClass(object): def __init__(self, *args, **kwargs): """It is __init__ method.""" mock_arguments.args = args mock_arguments.kwargs = kwargs super(OutdatedClass, self).__init__() obj = OutdatedClass(*args, **kwargs) self.assertIsInstance(obj, OutdatedClass) self.assertEqual('__init__', obj.__init__.__name__) self.assertEqual('It is __init__ method.', obj.__init__.__doc__) self.assertEqual(args, mock_arguments.args) self.assertEqual(kwargs, mock_arguments.kwargs) self.assert_deprecated(mock_reporter, what='OutdatedClass()', as_of='Juno', remove_in='Kilo') @mock.patch('oslo_log.versionutils.report_deprecated_feature') def test_deprecated_exception_old(self, mock_log): @versionutils.deprecated(as_of=versionutils.deprecated.ICEHOUSE, remove_in=+1) class OldException(Exception): pass try: raise OldException() except OldException: pass self.assert_deprecated(mock_log, what='OldException()', as_of='Icehouse', remove_in='Juno') @mock.patch('oslo_log.versionutils.report_deprecated_feature') def test_deprecated_exception_new(self, mock_log): @versionutils.deprecated(as_of=versionutils.deprecated.ICEHOUSE, remove_in=+1) class OldException(Exception): pass class NewException(OldException): pass try: raise NewException() except NewException: pass mock_log.assert_not_called() @mock.patch('oslo_log.versionutils.report_deprecated_feature') def test_deprecated_exception_unrelated(self, mock_log): @versionutils.deprecated(as_of=versionutils.deprecated.ICEHOUSE, remove_in=+1) class OldException(Exception): pass class UnrelatedException(Exception): pass try: raise UnrelatedException() except UnrelatedException: pass mock_log.assert_not_called() @mock.patch.object(versionutils.CONF, 'register_opts') def test_register_options(self, mock_register_opts): # Calling register_options registers the config options. versionutils.register_options() mock_register_opts.assert_called_once_with( versionutils.deprecated_opts) @mock.patch('oslo_log.versionutils.report_deprecated_feature') def test_deprecated_mitaka_plus_two(self, mock_reporter): @versionutils.deprecated(as_of=versionutils.deprecated.MITAKA, remove_in=+2) class OutdatedClass(object): pass obj = OutdatedClass() self.assertIsInstance(obj, OutdatedClass) self.assert_deprecated(mock_reporter, what='OutdatedClass()', as_of='Mitaka', remove_in='Ocata') @mock.patch('oslo_log.versionutils.report_deprecated_feature') def test_deprecated_newton_plus_two(self, mock_reporter): @versionutils.deprecated(as_of=versionutils.deprecated.NEWTON, remove_in=+2) class OutdatedClass(object): pass obj = OutdatedClass() self.assertIsInstance(obj, OutdatedClass) self.assert_deprecated(mock_reporter, what='OutdatedClass()', as_of='Newton', remove_in='Pike') @mock.patch('oslo_log.versionutils.report_deprecated_feature') def test_deprecated_ocata_plus_two(self, mock_reporter): @versionutils.deprecated(as_of=versionutils.deprecated.OCATA, remove_in=+2) class OutdatedClass(object): pass obj = OutdatedClass() self.assertIsInstance(obj, OutdatedClass) self.assert_deprecated(mock_reporter, what='OutdatedClass()', as_of='Ocata', remove_in='Queens') @mock.patch('oslo_log.versionutils.report_deprecated_feature') def test_deprecated_message(self, mock_reporter): versionutils.deprecation_warning('outdated_stuff', as_of=versionutils.deprecated.KILO, in_favor_of='different_stuff', remove_in=+2) self.assert_deprecated(mock_reporter, what='outdated_stuff', in_favor_of='different_stuff', as_of='Kilo', remove_in='Mitaka') oslo.log-4.1.1/oslo_log/tests/unit/test_helpers.py0000664000175000017500000000532713643050265022317 0ustar zuulzuul00000000000000# Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. from unittest import mock from oslotest import base as test_base from oslo_log import helpers class LogHelpersTestCase(test_base.BaseTestCase): def test_log_decorator(self): '''Test that LOG.debug is called with proper arguments.''' class test_class(object): @helpers.log_method_call def test_method(self, arg1, arg2, arg3, *args, **kwargs): pass @classmethod @helpers.log_method_call def test_classmethod(cls, arg1, arg2, arg3, *args, **kwargs): pass args = tuple(range(6)) kwargs = {'kwarg1': 6, 'kwarg2': 7} obj = test_class() for method_name in ('test_method', 'test_classmethod'): data = {'caller': helpers._get_full_class_name(test_class), 'method_name': method_name, 'args': args, 'kwargs': kwargs} method = getattr(obj, method_name) with mock.patch('logging.Logger.debug') as debug: method(*args, **kwargs) debug.assert_called_with(mock.ANY, data) def test_log_decorator_for_static(self): '''Test that LOG.debug is called with proper arguments.''' @helpers.log_method_call def _static_method(): pass class test_class(object): @staticmethod @helpers.log_method_call def test_staticmethod(arg1, arg2, arg3, *args, **kwargs): pass data = {'caller': 'static', 'method_name': '_static_method', 'args': (), 'kwargs': {}} with mock.patch('logging.Logger.debug') as debug: _static_method() debug.assert_called_with(mock.ANY, data) args = tuple(range(6)) kwargs = {'kwarg1': 6, 'kwarg2': 7} data = {'caller': 'static', 'method_name': 'test_staticmethod', 'args': args, 'kwargs': kwargs} with mock.patch('logging.Logger.debug') as debug: test_class.test_staticmethod(*args, **kwargs) debug.assert_called_with(mock.ANY, data) oslo.log-4.1.1/oslo_log/tests/unit/test_custom_loghandler.py0000664000175000017500000000371613643050265024366 0ustar zuulzuul00000000000000# Copyright (c) 2016 OpenStack Foundation # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. """Unit Tests for oslo.log with custom log handler""" import logging from oslo_log import log from oslo_log.tests.unit.test_log import LogTestBase class CustomLogHandler(logging.StreamHandler): # Custom loghandler to mimick the error which was later fixed by # https://github.com/openstack/oslo.privsep/commit/3c47348ced0d3ace1113ba8de8dff015792b0b89 def emit(self, record): # Make args None; this was the error, which broke oslo_log formatting record.args = None # This is intentionally wrong super(CustomLogHandler, self).emit(record) class CustomLogHandlerTestCase(LogTestBase): def setUp(self): super(CustomLogHandlerTestCase, self).setUp() self.config(logging_context_format_string="HAS CONTEXT " "[%(request_id)s]: " "%(message)s", logging_default_format_string="NOCTXT: %(message)s", logging_debug_format_suffix="--DBG") self.log = log.getLogger('') # obtain root logger instead of 'unknown' self._add_handler_with_cleanup(self.log, handler=CustomLogHandler) self._set_log_level_with_cleanup(self.log, logging.DEBUG) def test_log(self): message = 'foo' self.log.info(message) self.assertEqual("NOCTXT: %s\n" % message, self.stream.getvalue()) oslo.log-4.1.1/oslo_log/tests/unit/test_convert_json.py0000664000175000017500000000562613643050265023370 0ustar zuulzuul00000000000000# Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. import io from oslo_log.cmds import convert_json from oslo_serialization import jsonutils from oslotest import base as test_base TRIVIAL_RECORD = {'message': 'msg'} DEBUG_LEVELNAME_RECORD = { 'message': 'msg', 'levelname': 'DEBUG', } DEBUG_LEVELNO_RECORD = { 'message': 'msg', 'levelno': 0, } TRACEBACK_RECORD = { 'message': 'msg', 'traceback': "abc\ndef", } DEBUG_LEVEL_KEY_RECORD = { 'message': 'msg', 'level': 'DEBUG', } EXCEPTION_RECORD = { 'message': 'msg', 'exception': "abc\ndef", } class ConvertJsonTestCase(test_base.BaseTestCase): def setUp(self): super(ConvertJsonTestCase, self).setUp() def _reformat(self, text): fh = io.StringIO(text) return list(convert_json.reformat_json(fh, lambda x: [x])) def test_reformat_json_single(self): text = jsonutils.dumps(TRIVIAL_RECORD) self.assertEqual([TRIVIAL_RECORD], self._reformat(text)) def test_reformat_json_blanks(self): text = jsonutils.dumps(TRIVIAL_RECORD) self.assertEqual([TRIVIAL_RECORD], self._reformat(text + "\n\n")) def test_reformat_json_double(self): text = jsonutils.dumps(TRIVIAL_RECORD) self.assertEqual( [TRIVIAL_RECORD, TRIVIAL_RECORD], self._reformat("\n".join([text, text]))) def _lines(self, record, pre='pre', loc='loc', **args): return list(convert_json.console_format(pre, loc, record, **args)) def test_console_format_trivial(self): lines = self._lines(TRIVIAL_RECORD) self.assertEqual(['pre msg'], lines) def test_console_format_debug_levelname(self): lines = self._lines(DEBUG_LEVELNAME_RECORD) self.assertEqual(['pre msg'], lines) def test_console_format_debug_levelno(self): lines = self._lines(DEBUG_LEVELNO_RECORD) self.assertEqual(['pre msg'], lines) def test_console_format_debug_level_key(self): lines = self._lines(DEBUG_LEVEL_KEY_RECORD, level_key='level') self.assertEqual(['pre msg'], lines) def test_console_format_traceback(self): lines = self._lines(TRACEBACK_RECORD) self.assertEqual(['pre msg', 'pre abc', 'pre def'], lines) def test_console_format_exception(self): lines = self._lines(EXCEPTION_RECORD, traceback_key='exception') self.assertEqual(['pre msg', 'pre abc', 'pre def'], lines) oslo.log-4.1.1/oslo_log/tests/unit/fixture/0000775000175000017500000000000013643050376020726 5ustar zuulzuul00000000000000oslo.log-4.1.1/oslo_log/tests/unit/fixture/test_setlevel.py0000664000175000017500000000265113643050265024163 0ustar zuulzuul00000000000000# Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. import logging from oslo_log import fixture from oslotest import base as test_base class TestSetLevelFixture(test_base.BaseTestCase): def test_unset_before(self): logger = logging.getLogger('no-such-logger-unset') self.assertEqual(logging.NOTSET, logger.level) fix = fixture.SetLogLevel(['no-such-logger-unset'], logging.DEBUG) with fix: self.assertEqual(logging.DEBUG, logger.level) self.assertEqual(logging.NOTSET, logger.level) def test_set_before(self): logger = logging.getLogger('no-such-logger-set') logger.setLevel(logging.ERROR) self.assertEqual(logging.ERROR, logger.level) fix = fixture.SetLogLevel(['no-such-logger-set'], logging.DEBUG) with fix: self.assertEqual(logging.DEBUG, logger.level) self.assertEqual(logging.ERROR, logger.level) oslo.log-4.1.1/oslo_log/tests/unit/fixture/test_logging_error.py0000664000175000017500000000223713643050265025177 0ustar zuulzuul00000000000000# Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. from oslo_log import fixture from oslo_log import log as logging from oslotest import base as test_base class TestLoggingFixture(test_base.BaseTestCase): def setUp(self): super(TestLoggingFixture, self).setUp() self.log = logging.getLogger(__name__) def test_logging_handle_error(self): self.log.error('pid of first child is %(foo)s', 1) self.useFixture(fixture.get_logging_handle_error_fixture()) self.assertRaises(TypeError, self.log.error, 'pid of first child is %(foo)s', 1) oslo.log-4.1.1/oslo_log/tests/unit/fixture/__init__.py0000664000175000017500000000107213643050265023034 0ustar zuulzuul00000000000000# -*- coding: utf-8 -*- # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. oslo.log-4.1.1/oslo_log/tests/unit/test_formatters.py0000664000175000017500000001204413643050265023035 0ustar zuulzuul00000000000000# Copyright (c) 2016 OpenStack Foundation # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. """Unit Tests for oslo.log formatter""" import logging import sys from unittest import mock from oslo_config import cfg from oslo_config import fixture as config_fixture from oslo_context import context from oslotest import base as test_base from oslo_log import formatters from oslo_log import log def _fake_context(): ctxt = context.RequestContext(user="user", tenant="tenant", project_domain="pdomain", user_domain="udomain", overwrite=True) return ctxt class AlternativeRequestContext(object): def __init__(self, user=None, tenant=None): self.user = user self.tenant = tenant def to_dict(self): return {'user': self.user, 'tenant': self.tenant} class FormatterTest(test_base.BaseTestCase): def setUp(self): super(FormatterTest, self).setUp() def test_replace_false_value_exists(self): d = {"user": "user1"} s = "%(user)s" % formatters._ReplaceFalseValue(d) self.assertEqual(d['user'], s) def test_replace_false_value_not_exists(self): d = {"user": "user1"} s = "%(project)s" % formatters._ReplaceFalseValue(d) self.assertEqual("-", s) def test_dictify_context_empty(self): self.assertEqual({}, formatters._dictify_context(None)) @mock.patch("debtcollector.deprecate") def test_dictify_context_with_dict(self, mock_deprecate): d = {"user": "user"} self.assertEqual(d, formatters._dictify_context(d)) mock_deprecate.assert_not_called() @mock.patch("debtcollector.deprecate") def test_dictify_context_with_context(self, mock_deprecate): ctxt = _fake_context() self.assertEqual(ctxt.get_logging_values(), formatters._dictify_context(ctxt)) mock_deprecate.assert_not_called() @mock.patch("debtcollector.deprecate") def test_dictify_context_without_get_logging_values(self, mock_deprecate): ctxt = AlternativeRequestContext(user="user", tenant="tenant") d = {"user": "user", "tenant": "tenant"} self.assertEqual(d, formatters._dictify_context(ctxt)) mock_deprecate.assert_called_with( 'The RequestContext.get_logging_values() ' 'method should be defined for logging context specific ' 'information. The to_dict() method is deprecated ' 'for oslo.log use.', removal_version='5.0.0', version='3.8.0') # Test for https://bugs.python.org/issue28603 class FormatUnhashableExceptionTest(test_base.BaseTestCase): def setUp(self): super(FormatUnhashableExceptionTest, self).setUp() self.config_fixture = self.useFixture( config_fixture.Config(cfg.ConfigOpts())) self.conf = self.config_fixture.conf log.register_options(self.conf) def _unhashable_exception_info(self): class UnhashableException(Exception): __hash__ = None try: raise UnhashableException() except UnhashableException: return sys.exc_info() def test_error_summary(self): exc_info = self._unhashable_exception_info() record = logging.LogRecord('test', logging.ERROR, 'test', 0, 'test message', [], exc_info) err_summary = formatters._get_error_summary(record) self.assertTrue(err_summary) def test_json_format_exception(self): exc_info = self._unhashable_exception_info() formatter = formatters.JSONFormatter() tb = ''.join(formatter.formatException(exc_info)) self.assertTrue(tb) def test_fluent_format_exception(self): exc_info = self._unhashable_exception_info() formatter = formatters.FluentFormatter() tb = formatter.formatException(exc_info) self.assertTrue(tb) def test_context_format_exception_norecord(self): exc_info = self._unhashable_exception_info() formatter = formatters.ContextFormatter(config=self.conf) tb = formatter.formatException(exc_info) self.assertTrue(tb) def test_context_format_exception(self): exc_info = self._unhashable_exception_info() formatter = formatters.ContextFormatter(config=self.conf) record = logging.LogRecord('test', logging.ERROR, 'test', 0, 'test message', [], exc_info) tb = formatter.format(record) self.assertTrue(tb) oslo.log-4.1.1/oslo_log/tests/unit/__init__.py0000664000175000017500000000107213643050265021346 0ustar zuulzuul00000000000000# -*- coding: utf-8 -*- # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. oslo.log-4.1.1/oslo_log/tests/__init__.py0000664000175000017500000000000013643050265020355 0ustar zuulzuul00000000000000oslo.log-4.1.1/oslo_log/__init__.py0000664000175000017500000000000013643050265017213 0ustar zuulzuul00000000000000oslo.log-4.1.1/oslo_log/helpers.py0000664000175000017500000000423113643050265017130 0ustar zuulzuul00000000000000# Copyright (C) 2013 eNovance SAS # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. """Log helper functions.""" import functools import inspect import logging def _get_full_class_name(cls): return '%s.%s' % (cls.__module__, getattr(cls, '__qualname__', cls.__name__)) def _is_method(obj, method): """Returns True if a given method is obj's method. You can not simply test a given method like: return inspect.ismethod(method) This is because functools.wraps converts the method to a function in log_method_call function. """ return inspect.ismethod(getattr(obj, method.__name__, None)) def log_method_call(method): """Decorator helping to log method calls. :param method: Method to decorate to be logged. :type method: method definition """ log = logging.getLogger(method.__module__) @functools.wraps(method) def wrapper(*args, **kwargs): args_start_pos = 0 if args: first_arg = args[0] if _is_method(first_arg, method): cls = (first_arg if isinstance(first_arg, type) else first_arg.__class__) caller = _get_full_class_name(cls) args_start_pos = 1 else: caller = 'static' else: caller = 'static' data = {'caller': caller, 'method_name': method.__name__, 'args': args[args_start_pos:], 'kwargs': kwargs} log.debug('%(caller)s method %(method_name)s ' 'called with arguments %(args)s %(kwargs)s', data) return method(*args, **kwargs) return wrapper oslo.log-4.1.1/oslo.log.egg-info/0000775000175000017500000000000013643050376016530 5ustar zuulzuul00000000000000oslo.log-4.1.1/oslo.log.egg-info/top_level.txt0000664000175000017500000000001113643050376021252 0ustar zuulzuul00000000000000oslo_log oslo.log-4.1.1/oslo.log.egg-info/SOURCES.txt0000664000175000017500000001006513643050376020416 0ustar zuulzuul00000000000000.coveragerc .mailmap .stestr.conf .zuul.yaml AUTHORS CONTRIBUTING.rst ChangeLog HACKING.rst LICENSE README.rst babel.cfg lower-constraints.txt requirements.txt setup.cfg setup.py test-requirements.txt tox.ini doc/requirements.txt doc/source/conf.py doc/source/index.rst doc/source/admin/advanced_config.rst doc/source/admin/example_nova.rst doc/source/admin/index.rst doc/source/admin/journal.rst doc/source/admin/log_rotation.rst doc/source/admin/nova_sample.conf doc/source/configuration/index.rst doc/source/contributor/index.rst doc/source/install/index.rst doc/source/reference/fixtures.rst doc/source/reference/formatters.rst doc/source/reference/handlers.rst doc/source/reference/helpers.rst doc/source/reference/index.rst doc/source/reference/log.rst doc/source/reference/versionutils.rst doc/source/reference/watchers.rst doc/source/user/examples.rst doc/source/user/guidelines.rst doc/source/user/history.rst doc/source/user/index.rst doc/source/user/usage.rst doc/source/user/examples/_i18n.py doc/source/user/examples/oslo_logging.py doc/source/user/examples/python_logging.py doc/source/user/examples/usage.py doc/source/user/examples/usage_context.py doc/source/user/examples/usage_helper.py oslo.log.egg-info/PKG-INFO oslo.log.egg-info/SOURCES.txt oslo.log.egg-info/dependency_links.txt oslo.log.egg-info/entry_points.txt oslo.log.egg-info/not-zip-safe oslo.log.egg-info/pbr.json oslo.log.egg-info/requires.txt oslo.log.egg-info/top_level.txt oslo_log/__init__.py oslo_log/_i18n.py oslo_log/_options.py oslo_log/formatters.py oslo_log/handlers.py oslo_log/helpers.py oslo_log/log.py oslo_log/rate_limit.py oslo_log/version.py oslo_log/versionutils.py oslo_log/watchers.py oslo_log/cmds/__init__.py oslo_log/cmds/convert_json.py oslo_log/fixture/__init__.py oslo_log/fixture/logging_error.py oslo_log/fixture/setlevel.py oslo_log/locale/de/LC_MESSAGES/oslo_log.po oslo_log/locale/en_GB/LC_MESSAGES/oslo_log.po oslo_log/locale/es/LC_MESSAGES/oslo_log.po oslo_log/locale/ja/LC_MESSAGES/oslo_log.po oslo_log/tests/__init__.py oslo_log/tests/unit/__init__.py oslo_log/tests/unit/test_convert_json.py oslo_log/tests/unit/test_custom_loghandler.py oslo_log/tests/unit/test_formatters.py oslo_log/tests/unit/test_helpers.py oslo_log/tests/unit/test_log.py oslo_log/tests/unit/test_rate_limit.py oslo_log/tests/unit/test_versionutils.py oslo_log/tests/unit/fixture/__init__.py oslo_log/tests/unit/fixture/test_logging_error.py oslo_log/tests/unit/fixture/test_setlevel.py playbooks/legacy/oslo.log-src-grenade-devstack/post.yaml playbooks/legacy/oslo.log-src-grenade-devstack/run.yaml releasenotes/notes/.placeholder releasenotes/notes/add-context-section-0b2f411ec64f42f6.yaml releasenotes/notes/add-reno-e4fedb11ece56f1e.yaml releasenotes/notes/always-add-error-text-715022964364ffa0.yaml releasenotes/notes/drop-python27-support-0fe4909a5468feb3.yaml releasenotes/notes/info-logging-7b7be9fc7a95aebc.yaml releasenotes/notes/is_debug_enabled-d7afee4c811a46df.yaml releasenotes/notes/jsonformatter-repr-fd616eb6fa6caeb3.yaml releasenotes/notes/log-rotation-595f8232cd987a6d.yaml releasenotes/notes/reload_log_config-743817192b1172b6.yaml releasenotes/notes/remove-log-format-b4b949701cee3315.yaml releasenotes/notes/remove-syslog-rfc-format-7a06772c0bb48e9b.yaml releasenotes/notes/remove-verbose-option-d0d1381e182d1be1.yaml releasenotes/notes/systemd-journal-support-fcbc34b3c5ce93ec.yaml releasenotes/notes/use-json-option-96f71da54a3b9a18.yaml releasenotes/notes/use_stderr_default_false-50d846b88cf2be90.yaml releasenotes/notes/windows-eventlog-2beb0a6010e342eb.yaml releasenotes/source/conf.py releasenotes/source/index.rst releasenotes/source/liberty.rst releasenotes/source/mitaka.rst releasenotes/source/newton.rst releasenotes/source/ocata.rst releasenotes/source/pike.rst releasenotes/source/queens.rst releasenotes/source/rocky.rst releasenotes/source/stein.rst releasenotes/source/train.rst releasenotes/source/unreleased.rst releasenotes/source/_static/.placeholder releasenotes/source/_templates/.placeholder releasenotes/source/locale/en_GB/LC_MESSAGES/releasenotes.po releasenotes/source/locale/fr/LC_MESSAGES/releasenotes.pooslo.log-4.1.1/oslo.log.egg-info/not-zip-safe0000664000175000017500000000000113643050376020756 0ustar zuulzuul00000000000000 oslo.log-4.1.1/oslo.log.egg-info/PKG-INFO0000664000175000017500000000432513643050376017631 0ustar zuulzuul00000000000000Metadata-Version: 2.1 Name: oslo.log Version: 4.1.1 Summary: oslo.log library Home-page: https://docs.openstack.org/oslo.log/latest Author: OpenStack Author-email: openstack-discuss@lists.openstack.org License: UNKNOWN Description: ======================== Team and repository tags ======================== .. image:: https://governance.openstack.org/tc/badges/oslo.log.svg :target: https://governance.openstack.org/tc/reference/tags/index.html .. Change things from this point on ================================ oslo.log -- Oslo Logging Library ================================ .. image:: https://img.shields.io/pypi/v/oslo.log.svg :target: https://pypi.org/project/oslo.log/ :alt: Latest Version .. image:: https://img.shields.io/pypi/dm/oslo.log.svg :target: https://pypi.org/project/oslo.log/ :alt: Downloads The oslo.log (logging) configuration library provides standardized configuration for all openstack projects. It also provides custom formatters, handlers and support for context specific logging (like resource id's etc). * Free software: Apache license * Documentation: https://docs.openstack.org/oslo.log/latest/ * Source: https://opendev.org/openstack/oslo.log * Bugs: https://bugs.launchpad.net/oslo.log * Release notes: https://docs.openstack.org/releasenotes/oslo.log/ Platform: UNKNOWN Classifier: Environment :: OpenStack Classifier: Intended Audience :: Information Technology Classifier: Intended Audience :: System Administrators Classifier: License :: OSI Approved :: Apache Software License Classifier: Operating System :: POSIX :: Linux Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.6 Classifier: Programming Language :: Python :: 3.7 Classifier: Programming Language :: Python :: 3 :: Only Classifier: Programming Language :: Python :: Implementation :: CPython Requires-Python: >=3.6 Provides-Extra: fixtures Provides-Extra: systemd Provides-Extra: test oslo.log-4.1.1/oslo.log.egg-info/entry_points.txt0000664000175000017500000000017513643050376022031 0ustar zuulzuul00000000000000[console_scripts] convert-json = oslo_log.cmds.convert_json:main [oslo.config.opts] oslo.log = oslo_log._options:list_opts oslo.log-4.1.1/oslo.log.egg-info/dependency_links.txt0000664000175000017500000000000113643050376022576 0ustar zuulzuul00000000000000 oslo.log-4.1.1/oslo.log.egg-info/requires.txt0000664000175000017500000000075213643050376021134 0ustar zuulzuul00000000000000pbr>=3.1.1 oslo.config>=5.2.0 oslo.context>=2.20.0 oslo.i18n>=3.20.0 oslo.utils>=3.36.0 oslo.serialization>=2.25.0 debtcollector>=1.19.0 python-dateutil>=2.7.0 [:(python_version<'3.3')] monotonic>=1.4 [:(sys_platform!='win32' and sys_platform!='darwin' and sys_platform!='sunos5')] pyinotify>=0.9.6 [fixtures] fixtures>=3.0.0 [systemd] systemd-python>=234 [test] hacking<2.1.0,>=2.0.0 stestr>=2.0.0 testtools>=2.3.0 oslotest>=3.3.0 coverage>=4.5.1 bandit<1.6.0,>=1.1.0 fixtures>=3.0.0 oslo.log-4.1.1/oslo.log.egg-info/pbr.json0000664000175000017500000000005613643050376020207 0ustar zuulzuul00000000000000{"git_version": "2aaf7b0", "is_release": true}oslo.log-4.1.1/releasenotes/0000775000175000017500000000000013643050376015773 5ustar zuulzuul00000000000000oslo.log-4.1.1/releasenotes/source/0000775000175000017500000000000013643050376017273 5ustar zuulzuul00000000000000oslo.log-4.1.1/releasenotes/source/conf.py0000664000175000017500000000306613643050265020574 0ustar zuulzuul00000000000000# -*- coding: utf-8 -*- # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or # implied. # See the License for the specific language governing permissions and # limitations under the License. # -- General configuration ------------------------------------------------ # If your documentation needs a minimal Sphinx version, state it here. # needs_sphinx = '1.0' # Add any Sphinx extension module names here, as strings. They can be # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom # ones. extensions = [ 'reno.sphinxext', 'openstackdocstheme' ] # The master toctree document. master_doc = 'index' # General information about the project. copyright = u'2016, oslo.log Developers' # Release notes do not need a version in the title, they span # multiple versions. # The full version, including alpha/beta/rc tags. release = '' # The short X.Y version. version = '' # -- Options for HTML output ---------------------------------------------- # The theme to use for HTML and HTML Help pages. See the documentation for # a list of builtin themes. html_theme = 'openstackdocs' # -- Options for Internationalization output ------------------------------ locale_dirs = ['locale/'] oslo.log-4.1.1/releasenotes/source/train.rst0000664000175000017500000000017613643050265021143 0ustar zuulzuul00000000000000========================== Train Series Release Notes ========================== .. release-notes:: :branch: stable/train oslo.log-4.1.1/releasenotes/source/rocky.rst0000664000175000017500000000022113643050265021144 0ustar zuulzuul00000000000000=================================== Rocky Series Release Notes =================================== .. release-notes:: :branch: stable/rocky oslo.log-4.1.1/releasenotes/source/index.rst0000664000175000017500000000033013643050265021125 0ustar zuulzuul00000000000000======================== oslo.log Release Notes ======================== .. toctree:: :maxdepth: 1 unreleased train stein rocky queens pike ocata newton mitaka liberty oslo.log-4.1.1/releasenotes/source/locale/0000775000175000017500000000000013643050376020532 5ustar zuulzuul00000000000000oslo.log-4.1.1/releasenotes/source/locale/en_GB/0000775000175000017500000000000013643050376021504 5ustar zuulzuul00000000000000oslo.log-4.1.1/releasenotes/source/locale/en_GB/LC_MESSAGES/0000775000175000017500000000000013643050376023271 5ustar zuulzuul00000000000000oslo.log-4.1.1/releasenotes/source/locale/en_GB/LC_MESSAGES/releasenotes.po0000664000175000017500000001436413643050265026327 0ustar zuulzuul00000000000000# Andi Chandler , 2016. #zanata # Andi Chandler , 2017. #zanata # Andi Chandler , 2018. #zanata msgid "" msgstr "" "Project-Id-Version: oslo.log Release Notes\n" "Report-Msgid-Bugs-To: \n" "POT-Creation-Date: 2018-07-26 22:58+0000\n" "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" "PO-Revision-Date: 2018-08-08 09:51+0000\n" "Last-Translator: Andi Chandler \n" "Language-Team: English (United Kingdom)\n" "Language: en_GB\n" "X-Generator: Zanata 4.3.3\n" "Plural-Forms: nplurals=2; plural=(n != 1)\n" msgid "3.1.0" msgstr "3.1.0" msgid "3.12.0" msgstr "3.12.0" msgid "3.17.0" msgstr "3.17.0" msgid "3.2.0" msgstr "3.2.0" msgid "3.21.0" msgstr "3.21.0" msgid "3.24.0" msgstr "3.24.0" msgid "3.27.0" msgstr "3.27.0" msgid "3.30.2" msgstr "3.30.2" msgid "3.33.0" msgstr "3.33.0" msgid "3.34.0" msgstr "3.34.0" msgid "3.35.0" msgstr "3.35.0" msgid "3.8.0" msgstr "3.8.0" msgid "" "A new ``oslo_log.log.is_debug_enabled`` helper function is added that allows " "to determine whether debug mode is enabled for logging." msgstr "" "A new ``oslo_log.log.is_debug_enabled`` helper function is added that allows " "to determine whether debug mode is enabled for logging." msgid "Bug Fixes" msgstr "Bug Fixes" msgid "" "Configuration option `use_stderr`'s default value is False now, this will " "avoid same logs on service log and specific log file by option --log-file." msgstr "" "Configuration option `use_stderr`'s default value is False now, this will " "avoid same logs on service log and specific log file by option --log-file." msgid "" "If the log format string includes ``%(error_summary)s``, it will be replaced " "with a summary of the current error when there is one and with \"``-``\" " "when there is no error. If the log format string does not include ``" "%(error_summary)s`` the error summary will be appended to the end of the " "line automatically, only if there is an error." msgstr "" "If the log format string includes ``%(error_summary)s``, it will be replaced " "with a summary of the current error when there is one and with \"``-``\" " "when there is no error. If the log format string does not include ``" "%(error_summary)s`` the error summary will be appended to the end of the " "line automatically, only if there is an error." msgid "Liberty Series Release Notes" msgstr "Liberty Series Release Notes" msgid "Mitaka Series Release Notes" msgstr "Mitaka Series Release Notes" msgid "New Features" msgstr "New Features" msgid "Newton Series Release Notes" msgstr "Newton Series Release Notes" msgid "Ocata Series Release Notes" msgstr "Ocata Series Release Notes" msgid "Other Notes" msgstr "Other Notes" msgid "Pike Series Release Notes" msgstr "Pike Series Release Notes" msgid "Queens Series Release Notes" msgstr "Queens Series Release Notes" msgid "Rocky Series Release Notes" msgstr "Rocky Series Release Notes" msgid "Switch to reno for managing release notes." msgstr "Switch to reno for managing release notes." msgid "" "Systemd native journal support is added. You can enable this in services " "with the ``use_journal`` flag." msgstr "" "Systemd native journal support is added. You can enable this in services " "with the ``use_journal`` flag." msgid "" "The JSON based formatters (namely JSONFormatter and FluentFormatter) now " "output an extra section called 'context' that contains the context-related " "keys and values, e.g. user, project and domain." msgstr "" "The JSON based formatters (namely JSONFormatter and FluentFormatter) now " "output an extra section called 'context' that contains the context-related " "keys and values, e.g. user, project and domain." msgid "" "The JSONFormatter formatter now converts unserializable objects with repr() " "to prevent JSON serialization errors on logging. The fix requires oslo." "serialization 2.20.2 or newer. (Bug #1593641)" msgstr "" "The JSONFormatter formatter now converts un-serialisable objects with repr() " "to prevent JSON serialisation errors on logging. The fix requires oslo." "serialisation 2.20.2 or newer. (Bug #1593641)" msgid "" "The JSONFormatter formatter now converts unserializable objects with repr() " "to prevent JSON serialization errors on logging. The fix requires oslo." "serialization 2.21.1 or newer. (Bug #1593641)" msgstr "" "The JSONFormatter formatter now converts unserialisable objects with repr() " "to prevent JSON serialization errors on logging. The fix requires oslo." "serialisation 2.21.1 or newer. (Bug #1593641)" msgid "The deprecated 'verbose' option has been removed." msgstr "The deprecated 'verbose' option has been removed." msgid "The deprecated log_format configuration option has been removed." msgstr "The deprecated log_format configuration option has been removed." msgid "" "The deprecated use_syslog_rfc_format configuration option has been removed." msgstr "" "The deprecated use_syslog_rfc_format configuration option has been removed." msgid "" "The log_config_append configuration option is now mutable and the logging " "settings it controls are reconfigured when the configuration file is reread. " "This can be used to, for example, change logger or handler levels." msgstr "" "The log_config_append configuration option is now mutable and the logging " "settings it controls are reconfigured when the configuration file is re-" "read. This can be used to, for example, change logger or handler levels." msgid "" "The use_json configuration option was added. It enables JSON formatting in " "the logs when set to True. The option is also available through the command " "line via the ``--use-json`` flag." msgstr "" "The use_json configuration option was added. It enables JSON formatting in " "the logs when set to True. The option is also available through the command " "line via the ``--use-json`` flag." msgid "Unreleased Release Notes" msgstr "Unreleased Release Notes" msgid "Upgrade Notes" msgstr "Upgrade Notes" msgid "" "When removing the \"verbose\" option, the default logging level was set to " "\"WARNING\" by mistake. Fixed it back to \"INFO\"." msgstr "" "When removing the \"verbose\" option, the default logging level was set to " "\"WARNING\" by mistake. Fixed it back to \"INFO\"." msgid "oslo.log Release Notes" msgstr "oslo.log Release Notes" oslo.log-4.1.1/releasenotes/source/locale/fr/0000775000175000017500000000000013643050376021141 5ustar zuulzuul00000000000000oslo.log-4.1.1/releasenotes/source/locale/fr/LC_MESSAGES/0000775000175000017500000000000013643050376022726 5ustar zuulzuul00000000000000oslo.log-4.1.1/releasenotes/source/locale/fr/LC_MESSAGES/releasenotes.po0000664000175000017500000000252113643050265025754 0ustar zuulzuul00000000000000# Gérald LONLAS , 2016. #zanata msgid "" msgstr "" "Project-Id-Version: oslo.log Release Notes 3.31.1\n" "Report-Msgid-Bugs-To: \n" "POT-Creation-Date: 2017-09-20 20:50+0000\n" "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=UTF-8\n" "Content-Transfer-Encoding: 8bit\n" "PO-Revision-Date: 2016-10-22 06:01+0000\n" "Last-Translator: Gérald LONLAS \n" "Language-Team: French\n" "Language: fr\n" "X-Generator: Zanata 3.9.6\n" "Plural-Forms: nplurals=2; plural=(n > 1)\n" msgid "3.1.0" msgstr "3.1.0" msgid "3.12.0" msgstr "3.12.0" msgid "3.2.0" msgstr "3.2.0" msgid "3.8.0" msgstr "3.8.0" msgid "Bug Fixes" msgstr "Corrections de bugs" msgid "Liberty Series Release Notes" msgstr "Note de release pour Liberty" msgid "Mitaka Series Release Notes" msgstr "Note de release pour Mitaka" msgid "New Features" msgstr "Nouvelles fonctionnalités" msgid "Newton Series Release Notes" msgstr "Note de release pour Newton" msgid "Other Notes" msgstr "Autres notes" msgid "Switch to reno for managing release notes." msgstr "Commence à utiliser reno pour la gestion des notes de release" msgid "Unreleased Release Notes" msgstr "Note de release pour les changements non déployées" msgid "Upgrade Notes" msgstr "Notes de mises à jours" msgid "oslo.log Release Notes" msgstr "Note de release pour oslo.log" oslo.log-4.1.1/releasenotes/source/ocata.rst0000664000175000017500000000023013643050265021104 0ustar zuulzuul00000000000000=================================== Ocata Series Release Notes =================================== .. release-notes:: :branch: origin/stable/ocata oslo.log-4.1.1/releasenotes/source/newton.rst0000664000175000017500000000021613643050265021333 0ustar zuulzuul00000000000000============================= Newton Series Release Notes ============================= .. release-notes:: :branch: origin/stable/newton oslo.log-4.1.1/releasenotes/source/_static/0000775000175000017500000000000013643050376020721 5ustar zuulzuul00000000000000oslo.log-4.1.1/releasenotes/source/_static/.placeholder0000664000175000017500000000000013643050265023167 0ustar zuulzuul00000000000000oslo.log-4.1.1/releasenotes/source/pike.rst0000664000175000017500000000021713643050265020752 0ustar zuulzuul00000000000000=================================== Pike Series Release Notes =================================== .. release-notes:: :branch: stable/pike oslo.log-4.1.1/releasenotes/source/queens.rst0000664000175000017500000000022313643050265021317 0ustar zuulzuul00000000000000=================================== Queens Series Release Notes =================================== .. release-notes:: :branch: stable/queens oslo.log-4.1.1/releasenotes/source/liberty.rst0000664000175000017500000000022113643050265021467 0ustar zuulzuul00000000000000============================== Liberty Series Release Notes ============================== .. release-notes:: :branch: origin/stable/liberty oslo.log-4.1.1/releasenotes/source/stein.rst0000664000175000017500000000022113643050265021137 0ustar zuulzuul00000000000000=================================== Stein Series Release Notes =================================== .. release-notes:: :branch: stable/stein oslo.log-4.1.1/releasenotes/source/mitaka.rst0000664000175000017500000000023213643050265021265 0ustar zuulzuul00000000000000=================================== Mitaka Series Release Notes =================================== .. release-notes:: :branch: origin/stable/mitaka oslo.log-4.1.1/releasenotes/source/_templates/0000775000175000017500000000000013643050376021430 5ustar zuulzuul00000000000000oslo.log-4.1.1/releasenotes/source/_templates/.placeholder0000664000175000017500000000000013643050265023676 0ustar zuulzuul00000000000000oslo.log-4.1.1/releasenotes/source/unreleased.rst0000664000175000017500000000014413643050265022150 0ustar zuulzuul00000000000000========================== Unreleased Release Notes ========================== .. release-notes:: oslo.log-4.1.1/releasenotes/notes/0000775000175000017500000000000013643050376017123 5ustar zuulzuul00000000000000oslo.log-4.1.1/releasenotes/notes/always-add-error-text-715022964364ffa0.yaml0000664000175000017500000000057513643050265026145 0ustar zuulzuul00000000000000--- features: - | If the log format string includes ``%(error_summary)s``, it will be replaced with a summary of the current error when there is one and with "``-``" when there is no error. If the log format string does not include ``%(error_summary)s`` the error summary will be appended to the end of the line automatically, only if there is an error. oslo.log-4.1.1/releasenotes/notes/windows-eventlog-2beb0a6010e342eb.yaml0000664000175000017500000000022213643050265025560 0ustar zuulzuul00000000000000features: - | Added Windows EventLog functionality to oslo.log. Set use_eventlog to true in the service's configuration file to use it. oslo.log-4.1.1/releasenotes/notes/use_stderr_default_false-50d846b88cf2be90.yaml0000664000175000017500000000025613643050265027271 0ustar zuulzuul00000000000000--- upgrade: - Configuration option `use_stderr`'s default value is False now, this will avoid same logs on service log and specific log file by option --log-file. oslo.log-4.1.1/releasenotes/notes/use-json-option-96f71da54a3b9a18.yaml0000664000175000017500000000033113643050265025273 0ustar zuulzuul00000000000000--- features: - | The use_json configuration option was added. It enables JSON formatting in the logs when set to True. The option is also available through the command line via the ``--use-json`` flag. oslo.log-4.1.1/releasenotes/notes/systemd-journal-support-fcbc34b3c5ce93ec.yaml0000664000175000017500000000020413643050265027365 0ustar zuulzuul00000000000000--- features: - | Systemd native journal support is added. You can enable this in services with the ``use_journal`` flag. oslo.log-4.1.1/releasenotes/notes/remove-log-format-b4b949701cee3315.yaml0000664000175000017500000000012013643050265025473 0ustar zuulzuul00000000000000--- upgrade: - The deprecated log_format configuration option has been removed. oslo.log-4.1.1/releasenotes/notes/reload_log_config-743817192b1172b6.yaml0000664000175000017500000000036413643050265025357 0ustar zuulzuul00000000000000--- features: - The log_config_append configuration option is now mutable and the logging settings it controls are reconfigured when the configuration file is reread. This can be used to, for example, change logger or handler levels. oslo.log-4.1.1/releasenotes/notes/is_debug_enabled-d7afee4c811a46df.yaml0000664000175000017500000000024113643050265025673 0ustar zuulzuul00000000000000--- features: - | A new ``oslo_log.log.is_debug_enabled`` helper function is added that allows to determine whether debug mode is enabled for logging. oslo.log-4.1.1/releasenotes/notes/add-reno-e4fedb11ece56f1e.yaml0000664000175000017500000000007213643050265024211 0ustar zuulzuul00000000000000--- other: - Switch to reno for managing release notes. oslo.log-4.1.1/releasenotes/notes/log-rotation-595f8232cd987a6d.yaml0000664000175000017500000000042113643050265024576 0ustar zuulzuul00000000000000--- features: - | The following new config options will allow rotating log files, especially useful on Windows: * ``log_rotate_interval`` * ``log_rotate_interval_type`` * ``max_logfile_count`` * ``max_logfile_size_mb`` * ``log_rotation_type`` oslo.log-4.1.1/releasenotes/notes/remove-verbose-option-d0d1381e182d1be1.yaml0000664000175000017500000000010313643050265026443 0ustar zuulzuul00000000000000--- upgrade: - The deprecated 'verbose' option has been removed. oslo.log-4.1.1/releasenotes/notes/drop-python27-support-0fe4909a5468feb3.yaml0000664000175000017500000000017113643050265026377 0ustar zuulzuul00000000000000--- upgrade: - | Python 2.7 is no longer supported. The minimum supported version of Python is now Python 3.6. oslo.log-4.1.1/releasenotes/notes/info-logging-7b7be9fc7a95aebc.yaml0000664000175000017500000000021313643050265025106 0ustar zuulzuul00000000000000--- fixes: - When removing the "verbose" option, the default logging level was set to "WARNING" by mistake. Fixed it back to "INFO". oslo.log-4.1.1/releasenotes/notes/remove-syslog-rfc-format-7a06772c0bb48e9b.yaml0000664000175000017500000000013513643050265027072 0ustar zuulzuul00000000000000--- upgrade: - The deprecated use_syslog_rfc_format configuration option has been removed. oslo.log-4.1.1/releasenotes/notes/.placeholder0000664000175000017500000000000013643050265021371 0ustar zuulzuul00000000000000oslo.log-4.1.1/releasenotes/notes/jsonformatter-repr-fd616eb6fa6caeb3.yaml0000664000175000017500000000033613643050265026365 0ustar zuulzuul00000000000000--- fixes: - | The JSONFormatter formatter now converts unserializable objects with repr() to prevent JSON serialization errors on logging. The fix requires oslo.serialization 2.21.1 or newer. (Bug #1593641) oslo.log-4.1.1/releasenotes/notes/add-context-section-0b2f411ec64f42f6.yaml0000664000175000017500000000034413643050265026070 0ustar zuulzuul00000000000000--- features: - | The JSON based formatters (namely JSONFormatter and FluentFormatter) now output an extra section called 'context' that contains the context-related keys and values, e.g. user, project and domain. oslo.log-4.1.1/lower-constraints.txt0000664000175000017500000000223713643050265017541 0ustar zuulzuul00000000000000alabaster==0.7.10 appdirs==1.4.3 Babel==2.5.3 bandit==1.4.0 certifi==2018.1.18 chardet==3.0.4 coverage==4.5.1 debtcollector==1.19.0 docutils==0.14 dulwich==0.19.0 extras==1.0.0 fixtures==3.0.0 flake8==2.5.5 gitdb2==2.0.3 GitPython==2.1.8 hacking==2.0.0 idna==2.6 imagesize==1.0.0 iso8601==0.1.12 Jinja2==2.10 keystoneauth1==3.4.0 linecache2==1.0.0 MarkupSafe==1.0 mccabe==0.2.1 monotonic==1.4;python_version<'3.3' mox3==0.25.0 msgpack==0.5.6 netaddr==0.7.19 netifaces==0.10.6 openstack-requirements==1.2.0 openstackdocstheme==1.20.0 os-client-config==1.29.0 oslo.config==5.2.0 oslo.context==2.20.0 oslo.i18n==3.20.0 oslo.serialization==2.25.0 oslo.utils==3.36.0 oslotest==3.3.0 packaging==17.1 Parsley==1.3 pbr==3.1.1 pep8==1.5.7 pyflakes==0.8.1 Pygments==2.2.0 pyinotify==0.9.6 pyparsing==2.2.0 python-dateutil==2.7.0 python-mimeparse==1.6.0 python-subunit==1.2.0 pytz==2018.3 PyYAML==3.12 reno==2.7.0 requests==2.18.4 requestsexceptions==1.4.0 rfc3986==1.1.0 smmap2==2.0.3 snowballstemmer==1.2.1 Sphinx==1.8.0 sphinxcontrib-websupport==1.0.1 stestr==2.0.0 stevedore==1.28.0 systemd-python==234 testrepository==0.0.20 testtools==2.3.0 traceback2==1.4.0 urllib3==1.22 wrapt==1.10.11 oslo.log-4.1.1/playbooks/0000775000175000017500000000000013643050376015305 5ustar zuulzuul00000000000000oslo.log-4.1.1/playbooks/legacy/0000775000175000017500000000000013643050376016551 5ustar zuulzuul00000000000000oslo.log-4.1.1/playbooks/legacy/oslo.log-src-grenade-devstack/0000775000175000017500000000000013643050376024277 5ustar zuulzuul00000000000000oslo.log-4.1.1/playbooks/legacy/oslo.log-src-grenade-devstack/post.yaml0000664000175000017500000000063313643050265026147 0ustar zuulzuul00000000000000- hosts: primary tasks: - name: Copy files from {{ ansible_user_dir }}/workspace/ on node synchronize: src: '{{ ansible_user_dir }}/workspace/' dest: '{{ zuul.executor.log_root }}' mode: pull copy_links: true verify_host: true rsync_opts: - --include=/logs/** - --include=*/ - --exclude=* - --prune-empty-dirs oslo.log-4.1.1/playbooks/legacy/oslo.log-src-grenade-devstack/run.yaml0000664000175000017500000000355013643050265025767 0ustar zuulzuul00000000000000- hosts: all name: Autoconverted job legacy-oslo.log-src-grenade-dsvm from old job gate-oslo.log-src-grenade-dsvm-ubuntu-xenial-nv tasks: - name: Ensure legacy workspace directory file: path: '{{ ansible_user_dir }}/workspace' state: directory - shell: cmd: | set -e set -x cat > clonemap.yaml << EOF clonemap: - name: openstack/devstack-gate dest: devstack-gate EOF /usr/zuul-env/bin/zuul-cloner -m clonemap.yaml --cache-dir /opt/git \ https://opendev.org \ openstack/devstack-gate executable: /bin/bash chdir: '{{ ansible_user_dir }}/workspace' environment: '{{ zuul | zuul_legacy_vars }}' - shell: cmd: | set -e set -x export PROJECTS="openstack/grenade $PROJECTS" export PYTHONUNBUFFERED=true export DEVSTACK_GATE_TEMPEST=1 export DEVSTACK_GATE_GRENADE=pullup export DEVSTACK_GATE_USE_PYTHON3=True export BRANCH_OVERRIDE=default if [ "$BRANCH_OVERRIDE" != "default" ] ; then export OVERRIDE_ZUUL_BRANCH=$BRANCH_OVERRIDE fi export DEVSTACK_PROJECT_FROM_GIT=$ZUUL_SHORT_PROJECT_NAME # Even if the branch is overridden, make sure we use # the correct branch using the OVERRIDE_*_PROJECT_BRANCH # variable. uc_project=`echo $DEVSTACK_PROJECT_FROM_GIT | tr [:lower:] [:upper:] | tr '-' '_' | sed 's/[^A-Z_]//'` export "OVERRIDE_"$uc_project"_PROJECT_BRANCH"=$ZUUL_BRANCH cp devstack-gate/devstack-vm-gate-wrap.sh ./safe-devstack-vm-gate-wrap.sh ./safe-devstack-vm-gate-wrap.sh executable: /bin/bash chdir: '{{ ansible_user_dir }}/workspace' environment: '{{ zuul | zuul_legacy_vars }}' oslo.log-4.1.1/setup.cfg0000664000175000017500000000245413643050376015130 0ustar zuulzuul00000000000000[metadata] name = oslo.log summary = oslo.log library description-file = README.rst author = OpenStack author-email = openstack-discuss@lists.openstack.org home-page = https://docs.openstack.org/oslo.log/latest python-requires = >=3.6 classifier = Environment :: OpenStack Intended Audience :: Information Technology Intended Audience :: System Administrators License :: OSI Approved :: Apache Software License Operating System :: POSIX :: Linux Programming Language :: Python Programming Language :: Python :: 3 Programming Language :: Python :: 3.6 Programming Language :: Python :: 3.7 Programming Language :: Python :: 3 :: Only Programming Language :: Python :: Implementation :: CPython [files] packages = oslo_log [extras] fixtures = fixtures>=3.0.0 # Apache-2.0/BSD systemd = systemd-python>=234 # LGPLv2+ [entry_points] oslo.config.opts = oslo.log = oslo_log._options:list_opts console_scripts = convert-json = oslo_log.cmds.convert_json:main [compile_catalog] directory = oslo_log/locale domain = oslo_log [update_catalog] domain = oslo_log output_dir = oslo_log/locale input_file = oslo_log/locale/oslo_log.pot [extract_messages] keywords = _ gettext ngettext l_ lazy_gettext mapping_file = babel.cfg output_file = oslo_log/locale/oslo_log.pot [egg_info] tag_build = tag_date = 0 oslo.log-4.1.1/AUTHORS0000664000175000017500000001510113643050376014350 0ustar zuulzuul00000000000000Adam Spiers Akash Gangil Akihiro Motoki Alexander Gorodnev Alexis Lee Alexis Lee Alvaro Lopez Garcia Andrea Frittoli Andreas Jaeger Andreas Jaeger Andrei Bacos Andrew Bogott Andrew Laski Andrey Volkov Angus Salkeld Anita Kuno Ann Kamyshnikova Atsushi SAKAI Ben Nemec Bogdan Dobrelya Bogdan Dobrelya Brad Pokorny Brant Knudson Chang Bo Guo ChangBo Guo(gcb) Chris St. Pierre Chris Yeoh Christian Berendt Chuck Short Cyril Roelandt Cyril Roelandt Daisuke Fujita Dan Prince Daniel Vincze Davanum Srinivas Davanum Srinivas David Stanek DennyZhang Dina Belova Dirk Mueller Dmitry Mescheryakov Dmitry Tantsur Dolph Mathews Doug Hellmann Duan Jiong Edan David Edwin Zhai Elena Ezhova Eric Brown Eric Harney Eric Windisch Erlon R. Cruz Flavio Percoco Flavio Percoco Francois Deppierraz Gage Hugo Gary Kotton Ghanshyam Mann Gorka Eguileor Hervé Beraud Ian Cordasco Ian Wienand Ihar Hrachyshka James Carey James E. Blair Jamie Lennox Jan Vondra Jason Kölker Javeme Jay Pipes Jay S. Bryant Jeremy Stanley JingLiu Joe D'Andrea Joe Gordon Joe Gordon John Griffith John L. Villalovos John Stanford Joshua Harlow Joshua Harlow Joshua Harlow Juan Antonio Osorio Robles Julien Danjou Kenneth Giusti Kevin Benton Kevin Rasmussen Kiall Mac Innes Kirill Zaitsev Kiseok Kim Lance Bragstad Lance Bragstad Lianhao Lu Lucian Petrut Luis A. Garcia Marian Horban Mark Doffman Mark McLoughlin Masaki Matsushita Mate Lakat Matt Odden Matt Riedemann Michael Basnight Michael Kerrin Michael Still Michał Dulko Michel Nederlof Min Pae Monty Taylor Morgan Fainberg Nam Nguyen Hoai Natal Ngétal Nataliia Uvarova Nikita Gerasimov OpenStack Release Bot Pavlo Shchelokovskyy Pádraig Brady Radomir Dopieralski Rajesh Tailor Rick Harris Roman Podoliaka Ronald Bradford Russell Bryant Sean Dague Sean Dague Sean Dague Sean McGinnis Sean McGinnis Sergey Kraynev Sergey Lukjanov Sergey Vilgelm Stanislav Kudriashev Stephen Finucane Steve Martinelli Steve Martinelli Stuart McLaren Suff Thomas Bechtold Thomas Herve Thomas Herve Timur Sufiev Tony Breeds Tony Xu Venkatesh Sampath Victor Sergeyev Victor Stinner Vishakha Agarwal Vishvananda Ishaya Vladyslav Drok Yaguang Tang YangLei Zane Bitter ZhiQiang Fan Zhiteng Huang ZhongShengping Zhongyue Luo abhishekkekane avnish caoyuan eeldill gecong1973 gengchc2 gtt116 gujin jacky06 liangjingtao lingyongxu loooosy malei melanie witt melissaml pengyuesheng ricolin sonu.kumar tanlin venkatamahesh wangqi yan.haifeng zhang-jinnan zhouxinyong oslo.log-4.1.1/test-requirements.txt0000664000175000017500000000071013643050265017536 0ustar zuulzuul00000000000000# The order of packages is significant, because pip processes them in the order # of appearance. Changing the order has an impact on the overall integration # process, which may cause wedges in the gate later. hacking>=2.0.0,<2.1.0 # Apache-2.0 stestr>=2.0.0 # Apache-2.0 testtools>=2.3.0 # MIT oslotest>=3.3.0 # Apache-2.0 coverage>=4.5.1 # Apache-2.0 # Bandit security code scanner bandit>=1.1.0,<1.6.0 # Apache-2.0 fixtures>=3.0.0 # Apache-2.0/BSD oslo.log-4.1.1/.stestr.conf0000664000175000017500000000006513643050265015551 0ustar zuulzuul00000000000000[DEFAULT] test_path=./oslo_log/tests/unit top_dir=./ oslo.log-4.1.1/README.rst0000664000175000017500000000207313643050265014770 0ustar zuulzuul00000000000000======================== Team and repository tags ======================== .. image:: https://governance.openstack.org/tc/badges/oslo.log.svg :target: https://governance.openstack.org/tc/reference/tags/index.html .. Change things from this point on ================================ oslo.log -- Oslo Logging Library ================================ .. image:: https://img.shields.io/pypi/v/oslo.log.svg :target: https://pypi.org/project/oslo.log/ :alt: Latest Version .. image:: https://img.shields.io/pypi/dm/oslo.log.svg :target: https://pypi.org/project/oslo.log/ :alt: Downloads The oslo.log (logging) configuration library provides standardized configuration for all openstack projects. It also provides custom formatters, handlers and support for context specific logging (like resource id's etc). * Free software: Apache license * Documentation: https://docs.openstack.org/oslo.log/latest/ * Source: https://opendev.org/openstack/oslo.log * Bugs: https://bugs.launchpad.net/oslo.log * Release notes: https://docs.openstack.org/releasenotes/oslo.log/ oslo.log-4.1.1/ChangeLog0000664000175000017500000006320013643050376015055 0ustar zuulzuul00000000000000CHANGES ======= 4.1.1 ----- * Use unittest.mock instead of third party mock * drop use of six 4.1.0 ----- * Add Victoria and Wallaby releases to versionutils 4.0.1 ----- * remove outdated header * Switch to hacking 2.x * Stop to build universal wheel * Ignore releasenote artifacts files 4.0.0 ----- * Drop python 2.7 support and testing * Drop use of unittest2 * tox: Trivial cleanup 3.45.2 ------ * Always use jsonutils.to\_primitive 'fallback' parameter 3.45.1 ------ * Serialize complex objects in FluentFormatter * Migrate grenade jobs to py3 3.45.0 ------ * tox: Keeping going with docs * Switch to official Ussuri jobs * Update master for stable/train * Add Ussuri release to versionutils 3.44.1 ------ * Add Python 3 Train unit tests * Use setLevel instead of setting logger.level directly * Bump the openstackdocstheme extension to 1.20 * Blacklist sphinx 2.1.0 (autodoc bug) * Remove incubator migration docs * Modify the constraints url in tox * Add logging guidelines based on previous spec * Fix guidelines w.r.t. translation of log messages * Schedule a periodical check of requirements to catch py2.7 issues quickly 3.44.0 ------ * Avoid tox\_install.sh for constraints support * Cap bandit below 1.6.0 version and update sphinx and limit monotonic * Replace git.openstack.org URLs with opendev.org URLs 3.43.0 ------ * OpenDev Migration Patch * Dropping the py35 testing * Add TRAIN to deprecated releases * Use raw string for regex * Added cmdline information into fluentFormatter event message * Replace openstack.org git:// URLs with https:// * Update master for stable/stein 3.42.3 ------ * Clarify some config options * Add 'levelkey' + 'tbkey' params 3.42.2 ------ * Use template for lower-constraints 3.42.1 ------ * Default oslo.policy logging to INFO * Update mailinglist from dev to discuss * Fix handling of exc\_info in OSJournalHandler * Fix up nits in log rotation change 3.42.0 ------ * Add config options for log rotation * Advancing the protocal of the website to HTTPS in usage.rst 3.41.0 ------ * Add Windows Event Log handler * Clean up .gitignore references to personal tools * Always build universal wheels * Add devstack job with JSONFormatter configured 3.40.1 ------ * Filter args dict in JSONFormatter * add lib-forward-testing-python3 test job * add python 3.6 unit test job * rewrite tests to not rely on implementation details of logging module * import zuul job settings from project-config * Follow the new PTI for document build * Migrate to stestr * Fix lower-constraints job * Imported Translations from Zanata * Update reno for stable/rocky 3.39.0 ------ * Add release notes link to README * Automatically append reset\_color to log lines * fix tox python3 overrides * Provide reset\_color key on log record * tox: Group targets and tool configuration together * tox: Don't set basepython in testenv 3.38.1 ------ * Fix Formatter subclasses for Python 3.2+ * Fix file permissions * Remove stale pip-missing-reqs tox test * Trivial: Update pypi url to new url * Fix sphinx-docs job * set default python to python3 3.38.0 ------ * Add Stein release to versionutils * Add ROCKY to deprecated releases * add lower-constraints job * Increase sleep time in testsuite to make it more robust * Updated from global requirements 3.37.0 ------ * Add Rocky release to versionutils.\_RELEASES * Updated from global requirements * Update links in README * Imported Translations from Zanata * Zuul: Remove project name * Imported Translations from Zanata * Zuul: Remove project name * Update reno for stable/queens * Updated from global requirements * Imported Translations from Zanata * update structured logging tests to prove context id is included * Updated from global requirements * Updated from global requirements * Updated from global requirements 3.36.0 ------ * Truncate error\_summary if exc\_info not explicitly passed * Cleanup test-requirements * Updated from global requirements * Imported Translations from Zanata 3.35.0 ------ * Updated from global requirements 3.34.0 ------ * Remove setting of version/release from releasenotes * Updated from global requirements * Capture context in its own key for JSON-based formatters 3.33.0 ------ * Updated from global requirements * Remove checks for auth\_token in JSON-based formatter tests * Add release note for use\_json option * Add option to use JSON formatter * Updated from global requirements * Zuul: add file extension to playbook path * JSONFormatter convert unserializable with repr() 3.32.0 ------ * Allow logging of unhashable exceptions in Python 3 * Updated from global requirements * Migrate to Zuul v3 * Imported Translations from Zanata * Updated from global requirements 3.31.0 ------ * Updated from global requirements * Update the documentation link for doc migration * Update the documentation link * Updated from global requirements * Update reno for stable/pike * Updated from global requirements 3.30.0 ------ * Updated from global requirements * Update URLs according to document migration * Add missing variable html\_last\_updated\_fmt 3.29.0 ------ * Updated from global requirements * switch from oslosphinx to openstackdocstheme * rearrange content to fit the new standard layout * only show error\_summary for warning and error messages * Updated from global requirements * Add log.get\_loggers method * Updated from global requirements 3.28.1 ------ * do not add error\_summary for debug log messages 3.28.0 ------ * Updated from global requirements * formatter: skip ImportError when adding error\_summary * Updated from global requirements 3.27.0 ------ * Updated from global requirements * Fix bug in log\_method\_call decorator * clarify release note for error summary handling * fix test description comment * Updated from global requirements * Oslo i18n 3.15.2 has broken deps * Remove deprecated module loggers * Updated from global requirements * add line number information to fluentd formatter * add error\_summary support for fluentd formatter * add error\_summary support to JSONFormatter * refactor error summary logic so it can be reused * improve the documentation for log format strings * skip built-in exceptions when adding error\_summary * make handling of error\_summary more flexible * add exception summaries to the main log line * Updated from global requirements 3.26.1 ------ * Use dict arg values for unicode checks in ContextFormatter 3.26.0 ------ * Add oslo\_messaging to the list of log levels * Add additional info like python-systemd does 3.25.0 ------ * Fix syslog module usage breaking Windows compatibility * Updated from global requirements 3.24.0 ------ * add an extras dependency for systemd * Optimize the link address * Always create OSSysLogHandler * protect systemd class initialization when syslog is not available * Documentation for journal usage * Systemd native journal support * When record.args is None, it should not give an exception 3.23.0 ------ * Trivial: Remove testscenarios from test-requirements.txt * Check reStructuredText documents for common style issues * Use Sphinx 1.5 warning-is-error * Fix some reST field lists in docstrings * Remove log translations 3.22.0 ------ * Updated from global requirements * Remove 'verbose' option (again) 3.21.0 ------ * Added is\_debug\_enabled helper * Updated from global requirements * [Fix gate]Update test requirement * Revert "Remove 'verbose' option (again)" * Updated from global requirements * Remove support for py34 * pbr.version.VersionInfo needs package name (oslo.xyz and not oslo\_xyz) * tail support, log filtering, executable, and splitlines bug fix * Must not go underneath the context object and access \_\_dict\_\_ * Fix devstack colors * Update reno for stable/ocata * Remove 'verbose' option (again) * Remove references to Python 3.4 3.20.0 ------ * Replace method attr in vars() to hasattr * Add Constraints support 3.19.0 ------ * Avoid converting to unicode if not needed * Show team and repo badges on README 3.18.0 ------ * Updated from global requirements * Updated from global requirements * Updated from global requirements * Imported Translations from Zanata 3.17.0 ------ * Modify use of assertTrue(A in B) * Change assertTrue(isinstance()) by optimal assert * Add a json reformatter command * Enable release notes translation * Add support for P and Q release names * Updated from global requirements * Updated from global requirements * modify the home-page info with the developer documentation * Add a filter to rate limit logs * Implement FluentFormatter * Fix races in unit tests * standardize release note page ordering * Use six.wraps instead of functools * Update reno for stable/newton * Updated from global requirements * Fix typos 3.16.0 ------ * Updated from global requirements * Default use\_stderr to False 3.15.0 ------ 3.14.0 ------ * Updated from global requirements * Updated from global requirements * Fixes unit tests on Windows 3.13.0 ------ * Updated from global requirements * Fix parameters of assertEqual are misplaced * Updated from global requirements * Remove discover from test-requirements * Add Python 3.5 classifier and venv 3.12.0 ------ * Replace "LOG.exception(\_" with "LOG.exception(\_LE" * Updated from global requirements * Reload log\_config\_append config on SIGHUP * Imported Translations from Zanata * Updated from global requirements * log: Introduce \_iter\_loggers * Imported Translations from Zanata * Updated from global requirements * Updated from global requirements 3.11.0 ------ 3.10.0 ------ * Updated from global requirements * Provide a normal method for deprecation warnings 3.9.0 ----- * Updated from global requirements * Make available to log encoded strings as arguments * Updated from global requirements * Fix typo: 'Olso' to 'Oslo' * Updated from global requirements * Convert unicode data to utf-8 before calling syslog.syslog() * log: don't create foo.log * Updated from global requirements * Use new logging specific method for context info * Reduce READ\_FREQ and TIMEOUT for watch-file 3.8.0 ----- * Revert "Remove 'verbose' option" * Fix regression causing the default log level to become WARNING * Remove 'verbose' option 3.7.0 ----- * Fix example issue * Updated from global requirements * Allow reload of 'debug' option 3.6.0 ----- * Imported Translations from Zanata 3.5.0 ----- * Remove direct dependency on babel 3.4.0 ----- * Updated from global requirements * Updated from global requirements * Updated from global requirements * Remove outdated comment in ContextFormatter * Enable log\_method\_call to work on static method * Explicitly exclude tests from bandit scan * Improve olso.log test coverage for edge cases * Improve test code coverage of \_options * Update reno for stable/mitaka * Unit test cleanup and validation improvements * Added +2 release names for versionutils * Fix broken links in docs usage page * Enable bandit in gate * Updated from global requirements 3.2.0 ----- * use log.warning instead of log.warn * Imported Translations from Zanata * Updated from global requirements * Remove deprecated use-syslog-rfc-format option 3.1.0 ----- * Add release note for removed log\_format option * Updated from global requirements * add page for release notes for unreleased versions * add a release note about using reno 3.0.0 ----- * Add reno for release notes management * remove pypy from default tox environment list * stop making a copy of options discovered by config generator * always run coverage report * Remove bandit.yaml in favor of defaults 2.4.0 ----- * Updated from global requirements * Fix spell typos * set oslo.cache and dogpile to INFO * Update translation setup * Updated from global requirements * Updated from global requirements * Updated from global requirements * Imported Translations from Zanata * Updated from global requirements * Improve Logging docs with inline examples and context example * Revert "Pass environment variables of proxy to tox" * Clean up removed hacking rule from [flake8] ignore lists * Provide a deprecated\_reason for use\_syslog\_rfc\_format * Remove deprecated log-format option 2.3.0 ----- * Improve documentataion of Oslo Log Usage * Added public method to getting default log levels * Updated from global requirements * enable isotime for exceptions * assertIsNone(val) instead of assertEqual(None,val) 2.2.0 ----- * Set keystoneauth default log level to WARN * Add ISO8601/RFC3339 timestamp to ContextFormatter * Format record before passing it to syslog * Updated from global requirements * Pass environment variables of proxy to tox * Updated from global requirements * Trival: Remove 'MANIFEST.in' 2.1.0 ----- * Remove iso8601 dependency * Remove duplicated profiles section from bandit.yaml * test\_logging\_error: build a logger at the test level * Cleanup all handlers in \_setup\_logging\_from\_conf * Drop python 2.6 support * Add a 'bandit' target to tox.ini 2.0.0 ----- * Updated from global requirements * Log to sys.stderr to avoid "No handlers could be found..." * Remove python 2.6 classifier * Remove python 2.6 and cleanup tox.ini * Refactor Python 2.6 check to use constant 1.14.0 ------ * The user\_identity format flexibility * Updated from global requirements * Imported Translations from Zanata * Updated from global requirements 1.13.0 ------ * Updated from global requirements * Updated from global requirements 1.12.1 ------ * Allow oslo.log to work on non-linux platforms 1.12.0 ------ * Fix coverage configuration and execution * No need for Oslo Incubator Sync * Add hostname field to JSONFormatter * Imported Translations from Zanata * Fix unintended assignment of "syslog" * Make doc title consistent with readme * add documentation with example of an external configuration file * add auto-generated docs for config options * Update option docs for when log config is used * Updated from global requirements * Add optional 'fixture' dependencies * Change ignore-errors to ignore\_errors * Fix the home-page value in setup.cfg with openstack.org * FastWatchedFileHandler class was added 1.11.0 ------ * Fix poor examples of exception logging * Updated from global requirements * Updated from global requirements 1.10.0 ------ * Fix package name for PublishErrorsHandler * Updated from global requirements * Fix duplicate-key pylint issue * Maintain old oslo logger names 1.9.0 ----- * Add Mitaka release to versionutils * Update single letter release names to full names * Provide a way to register versionutils options * Imported Translations from Transifex * Updated from global requirements 1.8.0 ----- * Set verbose to True and deprecate it * Define TRACE logging level * Imported Translations from Transifex * Updated from global requirements 1.7.0 ----- * Imported Translations from Transifex * Add more default fancier formatting params * Updated from global requirements * Updated from global requirements * Updated from global requirements * Do not report deprecations in subclasses * Imported Translations from Transifex * Updated from global requirements * Add tox target to find missing requirements 1.6.0 ----- * Remove duplication of fatal\_deprecations option * setting taskflow log level to WARN * Imported Translations from Transifex 1.5.0 ----- * Updated from global requirements * Updated from global requirements * Switch badges from 'pypip.in' to 'shields.io' * Deprecate use-syslog-rfc-format for removal 1.4.0 ----- 1.3.0 ----- * Do not fail if syslog is not available * Allow integer logging levels 1.2.0 ----- * Use proper deprecation for use-syslog-rfc-format option * Replace RFCSysLogHandler by a syslog() based one * Make remove\_in=0 (no removal) use a better syntax * Remove is\_compatible from versionutils * Add versionutils options to list\_opts * Add versionutils to API documentation * Advertise support for Python3.4 / Remove support for Python 3.3 * Updated from global requirements * Updated from global requirements * Remove run\_cross\_tests.sh * Deprecate WritableLogger - used for eventlet logging * Log deprecation message when catching deprecated exceptions * Change misleading TRACE to ERROR 1.1.0 ----- * Uncap library requirements for liberty * Provide an API to let tempest control the log file * fix pep8 errors * Add pypi download + version badges * Update to latest hacking * Add link to Logging Guidelines * move versionutils into place * Add liberty release name to versionutils * Expose opts entry point for version\_utils * Switch from oslo.config to oslo\_config * Remove oslo.log code and clean up versionutils API * Remove code that moved to oslo.i18n * Enhance versionutils.deprecated to work with classes * Add Kilo release name to versionutils * Allow deprecated decorator to specify no plan for removal * Add JUNO as a target to versionutils module * pep8: fixed multiple violations * Use oslotest instead of common test module * Use hacking import\_exceptions for gettextutils.\_ * fixed typos * Fix violations of H302:import only modules * Adds decorator to deprecate functions and methods * Remove vim header * Add \`versionutils\` for version compatibility checks * Default to True for use-syslog-rfc-format * Updated from global requirements * Restore automatic unicode conversion * Add migration notes 1.0.0 ----- * Updated from global requirements 0.4.0 ----- * Pickup instance from log format record * Make use\_syslog=True log to syslog via /dev/log 0.3.0 ----- * Updated from global requirements * update urllib3.util.retry log level to WARN 0.2.0 ----- * Expose fixtures through oslo\_log.fixture * Add fixture to let tests change log levels * Rename logging fixture module * Update comment to match implementation * fix link to bug tracker in readme * Updated from global requirements * Update Oslo imports to remove namespace package 0.1.0 ----- * Updated from global requirements * Add API documentation * Implement resource to logging extra keywords * Use RequestContext store in oslo\_context * Correct the translation domain for loading messages * Correct the position of the syslog handler * Enhance the README a bit * Switch to oslo.context * Move files out of the namespace package * Updated from global requirements * Workflow documentation is now in infra-manual * Added helper decorator to log method arguments * Updated from global requirements * Add oslo.config.opts entry\_points in setup.cfg * Updated from global requirements * Updated from global requirements * Activate pep8 check that \_ is imported * Add pbr to installation requirements * Updated from global requirements * Updated from global requirements * Remove audit log level * Switch from ContextAdapter to ContextFormatter * Move adapter properties to base class * Add KeywordArgumentAdapter * Remove extraneous vim editor configuration comments * Support building wheels (PEP-427) * Imported Translations from Transifex * Imported Translations from Transifex * Use oslo.utils and oslo.serialization * Fix test env order for testrepository db format * log: add missing space in error message * fix typo and formatting in contributing docs * Updated from global requirements * Remove duplicate test and cleanup unnecessary files * Use fixtures from oslo.i18n and oslo.cfg * Extract WritableLogger from log module * Move handlers and formatters out * Remove dependency on global CONF * switch test from info to error * Test formatting errors with log level being emitted * Imported Translations from Transifex * Simple doc cleanup * Work toward Python 3.4 support and testing * warn against sorting requirements * Make the local module private * Move the option definitions into a private file * Initial translation setup * Fix testr failure under python2.6 * Get py27 amd pep8 to work * exported from oslo-incubator by graduate.sh * Set stevedore log level to WARN by default * Add unicode coercion of logged messages to ContextFormatter * Correct coercion of logged message to unicode * Except socket.error if syslog isn't running * Fix E126 pep8 errors * log: make tests portable * Set keystonemiddleware and routes.middleware to log on WARN level * Adjust oslo logging to provide adapter is enabled for * Make logging\_context\_format\_string optional in log.set\_defaults * log: make set\_defaults() tests clean up properly * Add default log level for websocket * Ability to customize default\_log\_levels for each project * Python 3: enable tests/unit/test\_log.py * Move \`mask\_password\` to strutils * update new requests logger to default WARN * Remove extra whitespace * Use oslo.messaging to publish log errors * pep8: fixed multiple violations * Add a RequestContext.from\_dict method * Fix common.log.ContextFormatter for Python 3 * Mask passwords included without quotes at the ends of commands * Use moxstubout and mockpatch from oslotest * Fixes a simple spelling mistake * always log a traceback in the sys.excepthook * Remove redundant default=None for config options * Fix logging setup for Python 3.4 * Mask passwords that are included in commands * Improve help strings * Remove str() from LOG.\* and exceptions * Fix python26 compatibility for RFCSysLogHandler * Use oslotest instead of common test module * Revert setting oslo-incubator logs to INFO * Set default log levels for oslo.messaging and oslo-incubator * Python 3: enable tests/unit/middleware/test\_request\_id.py * Add default user\_identity to logging record * Add model\_query() to db.sqlalchemy.utils module * Remove None for dict.get() * Rename Openstack to OpenStack * Fixture to reraise exceptions raised during logging * Emit message which merged user-supplied argument in log\_handler * Log unit test improvements * Use ContextFormatter for imparting context info * Fix deprecated messages sent multiple times * default connectionpool to WARN log level * Backport 'ident' from python 3.3 for Oslo's SysLogHandler * remove extra newlines that eventlet seems to add * Small edits on help strings * Add error type to unhandled exception log message * Logging excepthook: print exception info if debug=True * Utilizes assertIsNone and assertIsNotNone * Fix spelling errors in comments * Use hacking import\_exceptions for gettextutils.\_ * Correct invalid docstrings * Translation Message improvements * Remove keystone from default\_log\_levels default * Adding domain to context and log * Unify different names between Python2/3 with six.moves * Remove vim header * Don't log to stdout when log\_dir is set * Remove uuidutils imports in oslo modules * Adds admin\_password as key to be sanitized when logging * Revert "Removes generate\_uuid from uuidutils" * Do not name variables as builtins * Removes generate\_uuid from uuidutils * Default iso8601 logging to WARN * Use six.text\_type instead of unicode function in tests * Add mask password impl from other projects * Use fileutils.write\_to\_tempfile in LogConfigTestCase * allow keeping of existing loggers with fileConfig * Add amqp=WARN,qpid=WARN to default\_log\_levels * Replace assert\_ with assertTrue * Don't override default value for eventlet.wsgi.server logging * \_get\_log\_file\_path explictly return, when logfile/logdire unset * Make openstack.common.log Python 3 compatible * Make Messages unicode before hitting logging * Adding instance\_uuid to context and log * Replace using tests.utils part2 * Make a cStringIO usage in test\_log py3 compatible * Bump hacking to 0.7.0 * Replace using tests.utils with openstack.common.test * Modify local.py to not be dependent on Eventlet * python3: handle module moves in log * Enable H302 hacking check * Add missing license header * Fix bad default for show\_deleted * Highlighting the deprecated nature of 'log-format' * Enable hacking H404 test * Enable hacking H402 test * python3: python3 binary/text data compatbility * Enable hacking H403 test * Remove the notifier and its dependencies from log.py * Deprecate log\_format and change default to None * oslo logging tries to run chmod on file * Improve Python 3.x compatibility * Support for lazily instantiated loggers * Incorrect logging setup - duplicating root handlers * Replaces the standard uuid with common in the context module * Gracefully handle errors in logging config files * clarify --log-file comments * Include PID in default logging\_context\_format\_string * Initialize root logger in \_setup\_logging\_from\_conf() * Fix Copyright Headers - Rename LLC to Foundation * Unignore log\_format option * Fix inconsistency with auth\_tok/auth\_token * Setup exception handler after configuring logging * Use oslo-config-2013.1b3 * Don't use subprocess for testing excepthook * Emit a warning if RPC calls made with lock * Replace direct use of testtools BaseTestCase * Use testtools as test base class * Move logging config options into the log module * Fixes import order errors * Verbose should not enable debug level logging * Fix pep8 E125 errors * Improve millisecond logging * Enable millisecond logging by default * Allow nova and others to override some logging defaults * update deprecated stanza * Adjust the logging\_context\_format\_string * Fix the log test so it uses the available context fields * Restore proper LoggerTestCase * move nova.common.deprecated to openstack-common * Use pep8 v1.3.3 * Improve logging of process id * Fix meaningless test case * Add multiple-driver support to the notifier api * Install a qualified except hook * Remove code to clear basicConfig root log handlers * don't throw exceptions if %(color)s tag is used * fix bug lp:1019348,update openstack-common to support pep8 1.3 * Fix missing gettextutils in several modules * Move get\_context\_from\_function\_and\_args() to context.py * Switch common files to using jsonutils * Pass in stream as positional argument to StreamHandler * Add common logging and notification * Added dictify() and uuids to the common request context * Add greenthread local storage model from nova * add context 'tests' * make the skeleton project a template * reog from import merge * Add some more generic middleware, request context, utils, and versioning. Add basic template for server binary * Initial skeleton project oslo.log-4.1.1/tox.ini0000664000175000017500000000342613643050265014617 0ustar zuulzuul00000000000000[tox] minversion = 3.1 envlist = py36,py37,pep8 ignore_basepython_conflict = true [testenv] basepython = python3 whitelist_externals = find deps = -c{env:UPPER_CONSTRAINTS_FILE:https://releases.openstack.org/constraints/upper/master} -r{toxinidir}/test-requirements.txt commands = find . -type f -name "*.pyc" -delete stestr run {posargs} stestr slowest [testenv:pep8] commands = flake8 # Run security linter bandit -r oslo_log -x tests -n5 [testenv:venv] commands = {posargs} [testenv:docs] whitelist_externals = rm deps = -c{env:UPPER_CONSTRAINTS_FILE:https://releases.openstack.org/constraints/upper/master} -r{toxinidir}/doc/requirements.txt commands = rm -fr doc/build sphinx-build -a -E -W --keep-going -b html doc/source doc/build/html [testenv:releasenotes] whitelist_externals = rm deps = -c{env:UPPER_CONSTRAINTS_FILE:https://releases.openstack.org/constraints/upper/master} -r{toxinidir}/doc/requirements.txt commands = rm -rf releasenotes/build sphinx-build -a -E -W --keep-going -b html releasenotes/source releasenotes/build/html [testenv:cover] commands = coverage erase {[testenv]commands} coverage combine coverage html -d cover coverage xml -o cover/coverage.xml coverage report --show-missing [testenv:bandit] commands = bandit -r oslo_log -x tests -n5 [testenv:lower-constraints] deps = -c{toxinidir}/lower-constraints.txt -r{toxinidir}/test-requirements.txt -r{toxinidir}/requirements.txt [flake8] # E123, E125 skipped as they are invalid PEP-8. # W503, W504 skipped: https://www.python.org/dev/peps/pep-0008/#should-a-line-break-before-or-after-a-binary-operator show-source = True ignore = E123,E125,H405,W503,W504 exclude=.venv,.git,.tox,dist,doc,*lib/python*,*egg,build,__init__.py [hacking] import_exceptions = oslo_log._i18n oslo.log-4.1.1/.mailmap0000664000175000017500000000013013643050265014712 0ustar zuulzuul00000000000000# Format is: # # oslo.log-4.1.1/CONTRIBUTING.rst0000664000175000017500000000103513643050265015737 0ustar zuulzuul00000000000000If you would like to contribute to the development of OpenStack, you must follow the steps in this page: https://docs.openstack.org/infra/manual/developers.html Once those steps have been completed, changes to OpenStack should be submitted for review via the Gerrit tool, following the workflow documented at: https://docs.openstack.org/infra/manual/developers.html#development-workflow Pull requests submitted through GitHub will be ignored. Bugs should be filed on Launchpad, not GitHub: https://bugs.launchpad.net/oslo.log oslo.log-4.1.1/doc/0000775000175000017500000000000013643050376014047 5ustar zuulzuul00000000000000oslo.log-4.1.1/doc/requirements.txt0000664000175000017500000000061013643050265017325 0ustar zuulzuul00000000000000# The order of packages is significant, because pip processes them in the order # of appearance. Changing the order has an impact on the overall integration # process, which may cause wedges in the gate later. sphinx>=1.8.0,!=2.1.0 # BSD openstackdocstheme>=1.20.0 # Apache-2.0 reno>=2.5.0 # Apache-2.0 # Optional dependencies that are needed to build docs fixtures>=3.0.0 # Apache-2.0/BSD oslo.log-4.1.1/doc/source/0000775000175000017500000000000013643050376015347 5ustar zuulzuul00000000000000oslo.log-4.1.1/doc/source/reference/0000775000175000017500000000000013643050376017305 5ustar zuulzuul00000000000000oslo.log-4.1.1/doc/source/reference/log.rst0000664000175000017500000000023713643050265020617 0ustar zuulzuul00000000000000============== oslo_log.log ============== .. automodule:: oslo_log.log :members: :undoc-members: :show-inheritance: .. seealso:: :ref:`using` oslo.log-4.1.1/doc/source/reference/versionutils.rst0000664000175000017500000000017313643050265022603 0ustar zuulzuul00000000000000======================= oslo_log.versionutils ======================= .. automodule:: oslo_log.versionutils :members: oslo.log-4.1.1/doc/source/reference/helpers.rst0000664000175000017500000000022013643050265021470 0ustar zuulzuul00000000000000================== oslo_log.helpers ================== .. automodule:: oslo_log.helpers :members: :undoc-members: :show-inheritance: oslo.log-4.1.1/doc/source/reference/handlers.rst0000664000175000017500000000023013643050265021627 0ustar zuulzuul00000000000000===================== oslo_log.handlers ===================== .. automodule:: oslo_log.handlers :members: :undoc-members: :show-inheritance: oslo.log-4.1.1/doc/source/reference/index.rst0000664000175000017500000000015013643050265021137 0ustar zuulzuul00000000000000======================== oslo.log API Reference ======================== .. toctree:: :glob: * oslo.log-4.1.1/doc/source/reference/watchers.rst0000664000175000017500000000026113643050265021653 0ustar zuulzuul00000000000000================== oslo_log.watchers ================== .. automodule:: oslo_log.watchers :members: :undoc-members: :show-inheritance: .. seealso:: :ref:`using` oslo.log-4.1.1/doc/source/reference/formatters.rst0000664000175000017500000000023413643050265022221 0ustar zuulzuul00000000000000===================== oslo_log.formatters ===================== .. automodule:: oslo_log.formatters :members: :undoc-members: :show-inheritance: oslo.log-4.1.1/doc/source/reference/fixtures.rst0000664000175000017500000000031013643050265021677 0ustar zuulzuul00000000000000================== oslo_log.fixture ================== .. module:: oslo_log.fixture .. autofunction:: oslo_log.fixture.get_logging_handle_error_fixture .. autoclass:: oslo_log.fixture.SetLogLevel oslo.log-4.1.1/doc/source/conf.py0000664000175000017500000000273713643050265016654 0ustar zuulzuul00000000000000# -*- coding: utf-8 -*- # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or # implied. # See the License for the specific language governing permissions and # limitations under the License. import os import sys sys.path.insert(0, os.path.abspath('../..')) # -- General configuration ---------------------------------------------------- # Add any Sphinx extension module names here, as strings. They can be # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom ones. extensions = [ 'sphinx.ext.autodoc', 'sphinx.ext.doctest', 'oslo_config.sphinxext', 'openstackdocstheme', ] # openstackdocstheme options repository_name = 'openstack/oslo.log' bug_project = 'oslo.log' bug_tag = '' # The master toctree document. master_doc = 'index' # General information about the project. copyright = u'2014, OpenStack Foundation' # -- Options for HTML output -------------------------------------------------- # The theme to use for HTML and HTML Help pages. Major themes that come with # Sphinx are currently 'default' and 'sphinxdoc'. # html_static_path = ['static'] html_theme = 'openstackdocs' oslo.log-4.1.1/doc/source/index.rst0000664000175000017500000000105113643050265017202 0ustar zuulzuul00000000000000================================ oslo.log -- Oslo Logging Library ================================ The oslo.log (logging) configuration library provides standardized configuration for all openstack projects. It also provides custom formatters, handlers and support for context specific logging (like resource id's etc). .. toctree:: :maxdepth: 1 install/index user/index reference/index configuration/index admin/index contributor/index Indices and tables ================== * :ref:`genindex` * :ref:`modindex` * :ref:`search` oslo.log-4.1.1/doc/source/install/0000775000175000017500000000000013643050376017015 5ustar zuulzuul00000000000000oslo.log-4.1.1/doc/source/install/index.rst0000664000175000017500000000037613643050265020661 0ustar zuulzuul00000000000000============ Installation ============ At the command line:: $ pip install oslo.log To use ``oslo_log.fixture``, some additional dependencies are needed. They can be installed using the ``fixtures`` extra:: $ pip install 'oslo.log[fixtures]' oslo.log-4.1.1/doc/source/contributor/0000775000175000017500000000000013643050376017721 5ustar zuulzuul00000000000000oslo.log-4.1.1/doc/source/contributor/index.rst0000664000175000017500000000011713643050265021556 0ustar zuulzuul00000000000000============ Contributing ============ .. include:: ../../../CONTRIBUTING.rst oslo.log-4.1.1/doc/source/user/0000775000175000017500000000000013643050376016325 5ustar zuulzuul00000000000000oslo.log-4.1.1/doc/source/user/examples.rst0000664000175000017500000000164513643050265020700 0ustar zuulzuul00000000000000========== Examples ========== .. _examples: These files can be found in the docs/source/examples directory of the git source of this project. They can also be found in the `online git repository`_ of this project. .. _online git repository: https://opendev.org/openstack/oslo.log/src/branch/master/doc/source/user/examples python_logging.py ----------------- .. _example_python_logging.py: .. highlight:: python .. literalinclude:: examples/python_logging.py :linenos: oslo_logging.py --------------- .. _example_oslo_logging.py: .. literalinclude:: examples/oslo_logging.py usage.py -------- .. _example_usage.py: .. literalinclude:: examples/usage.py :linenos: usage_helper.py --------------- .. _example_usage_helper.py: .. literalinclude:: examples/usage_helper.py :linenos: usage_context.py ---------------- .. _example_usage_context.py: .. literalinclude:: examples/usage_context.py :linenos: oslo.log-4.1.1/doc/source/user/index.rst0000664000175000017500000000021313643050265020157 0ustar zuulzuul00000000000000.. _using: ================ Using oslo.log ================ .. toctree:: :maxdepth: 2 usage examples guidelines history oslo.log-4.1.1/doc/source/user/history.rst0000664000175000017500000000043313643050265020555 0ustar zuulzuul00000000000000.. Add a fake link so that the reference to "assert_" in one of the changelog messages that looks like a reST link but isn't has something to resolve to. .. _assert: https://docs.python.org/2/library/unittest.html#unittest.TestCase.assertTrue .. include:: ../../../ChangeLog oslo.log-4.1.1/doc/source/user/guidelines.rst0000664000175000017500000002627613643050265021221 0ustar zuulzuul00000000000000.. This work is licensed under a Creative Commons Attribution 3.0 Unported License. http://creativecommons.org/licenses/by/3.0/legalcode ================== Logging Guidelines ================== .. note:: Many of these guidelines were originally authored as `a cross-project spec `_. Motivation ========== A consistent, unified logging format will better enable cloud administrators to monitor and maintain their environments. Therefore this document provides guidelines for best practices regarding how developers should use logging within their code. Adding Variables to Log Messages ================================ String interpolation should be delayed to be handled by the logging code, rather than being done at the point of the logging call. For example, **do not do this**:: # WRONG LOG.info('some message: variable=%s' % variable) Instead, use this style:: # RIGHT LOG.info('some message: variable=%s', variable) This allows the logging package to skip creating the formatted log message if the message is not going to be emitted because of the current log level. Definition of Log Levels ======================== .. note:: The following definitions were originally taken from `a popular answer on StackOverflow `_. ``DEBUG`` Shows everything and is likely not suitable for normal production operation due to the sheer size of logs generated. ``INFO`` Usually indicates successful service start/stop, versions and such non-error related data. This should include largely positive units of work that are accomplished (such as starting a compute service, creating a user, deleting a volume, etc.). ``AUDIT`` This should not be used. All previous messages at ``AUDIT`` level should be changed to ``INFO``, or sent as notifications to a notification queue. (The origin of ``AUDIT`` was a NASA-specific requirement which led to confusion/misuse and is no longer relevant to the current code.) ``WARNING`` Indicates that there might be a systemic issue; potential predictive failure notice. ``ERROR`` An error has occurred and an administrator should research the event. ``CRITICAL`` An error has occurred and the system might be unstable; administrator attention is required immediately. Log levels from an operator perspective --------------------------------------- We can think of this from an operator perspective the following ways (Note: we are not specifying operator policy here, just trying to set tone for developers that aren't familiar with how these messages will be interpreted): ``CRITICAL`` ZOMG! Cluster on FIRE! Call all pagers, wake up everyone. This is an unrecoverable error with a service that has or probably will lead to service death or massive degradation. ``ERROR`` Serious issue with cloud: administrator should be notified immediately via email/pager. On call people expected to respond. ``WARNING`` Something is not right; should get looked into during the next work week. Administrators should be working through eliminating warnings as part of normal work. ``INFO`` Normal status messages showing measurable units of positive work passing through under normal functioning of the system. Should not be so verbose as to overwhelm real signal with noise. Should not be continuous "I'm alive!" messages. See `Log messages at INFO and above should represent a "unit of work"`_ for more details. ``DEBUG`` Developer logging level. Only enable if you are interested in reading through a ton of additional information about what is going on. See `Debugging start / end messages`_ for more details. ``TRACE`` In functions which support this level, details every parameter and operation to help diagnose subtle bugs. This should only be enabled for specific areas of interest or the log volume will be overwhelming. Some system performance degradation should be expected. Overall logging principles ========================== The following principles should apply to all messages. Debugging start / end messages ------------------------------ At the ``DEBUG`` log level it is often extremely important to flag the beginning and ending of actions to track the progression of flows (which might error out before the unit of work is completed). This should be made clear by there being a "starting" message with some indication of completion for that starting point. In a real OpenStack environment lots of things are happening in parallel. There are multiple workers per services, multiple instances of services in the cloud. Log messages at INFO and above should represent a "unit of work" ---------------------------------------------------------------- The ``INFO`` log level is defined as: "normal status messages showing measurable units of positive work passing through under normal functioning of the system." A measurable unit of work should be describable by a short sentence fragment, in the past tense with a noun and a verb of something significant. Examples:: Instance spawned Instance destroyed Volume attached Image failed to copy Words like "started", "finished", or any verb ending in "ing" are flags for non unit of work messages. Examples of good and bad uses of INFO ------------------------------------- Below are some examples of good and bad uses of ``INFO``. In the good examples we can see the 'noun / verb' fragment for a unit of work. "Successfully" is probably superfluous and could be removed. **Good** :: 2014-01-26 15:36:10.597 28297 INFO nova.virt.libvirt.driver [-] [instance: b1b8e5c7-12f0-4092-84f6-297fe7642070] Instance spawned successfully. 2014-01-26 15:36:14.307 28297 INFO nova.virt.libvirt.driver [-] [instance: b1b8e5c7-12f0-4092-84f6-297fe7642070] Instance destroyed successfully. In the bad examples we see trace-level thinking put into messages at ``INFO`` level and above: **Bad** :: 2014-01-26 15:36:11.198 INFO nova.virt.libvirt.driver [req-ded67509-1e5d-4fb2-a0e2-92932bba9271 FixedIPsNegativeTestXml-1426989627 FixedIPsNegativeTestXml-38506689] [instance: fd027464-6e15-4f5d-8b1f-c389bdb8772a] Creating image 2014-01-26 15:36:11.525 INFO nova.virt.libvirt.driver [req-ded67509-1e5d-4fb2-a0e2-92932bba9271 FixedIPsNegativeTestXml-1426989627 FixedIPsNegativeTestXml-38506689] [instance: fd027464-6e15-4f5d-8b1f-c389bdb8772a] Using config drive 2014-01-26 15:36:12.326 AUDIT nova.compute.manager [req-714315e2-6318-4005-8f8f-05d7796ff45d FixedIPsTestXml-911165017 FixedIPsTestXml-1315774890] [instance: b1b8e5c7-12f0-4092-84f6-297fe7642070] Terminating instance 2014-01-26 15:36:12.570 INFO nova.virt.libvirt.driver [req-ded67509-1e5d-4fb2-a0e2-92932bba9271 FixedIPsNegativeTestXml-1426989627 FixedIPsNegativeTestXml-38506689] [instance: fd027464-6e15-4f5d-8b1f-c389bdb8772a] Creating config drive at /opt/stack/data/nova/instances/fd027464-6e15-4f5d-8b1f -c389bdb8772a/disk.config This is mostly an overshare issue. At ``INFO``, these are stages that don't really need to be fully communicated. Messages shouldn't need a secret decoder ring --------------------------------------------- **Bad** :: 2014-01-26 15:36:14.256 28297 INFO nova.compute.manager [-] Lifecycle event 1 on VM b1b8e5c7-12f0-4092-84f6-297fe7642070 As a general rule, when using constants or enums, ensure they are translated back to user strings prior to being sent to the user. Specific event types -------------------- In addition to the above guidelines very specific additional recommendations exist. These are guidelines rather than hard rules to be adhered to, so common sense should always be exercised. WSGI requests ~~~~~~~~~~~~~ - Should be logged at ``INFO`` level. - Should be logged exactly once per request. - Should include enough information to know what the request was (but not so much as to overwhelm the logs). The last point is notable, because some ``POST`` API requests don't include enough information in the URL alone to determine what the API did. For instance, Nova server actions (where ``POST`` includes a method name), although including ``POST`` request payloads could be excessive, so common sense should be exercised. **Rationale:** Operators should be able to easily see what API requests their users are making in their cloud to understand the usage patterns of their users with their cloud. Operator deprecation warnings ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - Should be logged at ``WARN`` level. - Where possible, should be logged exactly once per service start (not on every request through code). However it may be tricky to keep track of whether a warning was already issued, so common sense should dictate the best approach. - Should include directions on what to do to migrate from the deprecated state. **Rationale:** Operators need to know that some aspect of their cloud configuration is now deprecated, and will require changes in the future. And they need enough of a bread crumb trail to figure out how to do that. REST API deprecation warnings ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ - Should **not** be logged any higher than ``DEBUG``, since these are not operator-facing messages. - Should be logged no more than once per REST API usage / tenant, definitely not on *every* REST API call. **Rationale:** The users of the REST API don't have access to the system logs. Therefore logging at a ``WARNING`` level is telling the wrong people about the fact that they are using a deprecated API. Deprecation of user-facing APIs should be communicated via user-facing mechanisms, e.g. API change notes associated with new API versions. Stacktraces in logs ~~~~~~~~~~~~~~~~~~~ - Should be **exceptional** events, for unforeseeable circumstances that are not yet recoverable by the system. - Should be logged at ``ERROR`` level. - Should be considered high priority bugs to be addressed by the development team. **Rationale:** The current behavior of OpenStack is extremely stack trace happy. Many existing stack traces in the logs are considered *normal*. This dramatically increases the time to find the root cause of real issues in OpenStack. Logging by non-OpenStack components ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ OpenStack uses a ton of libraries, which have their own definitions of logging. This causes a lot of extraneous information in normal logs by wildly different definitions of those libraries. As such, all 3rd party libraries should have their logging levels adjusted so only real errors are logged. Currently proposed settings for 3rd party libraries: - ``amqp=WARN`` - ``boto=WARN`` - ``qpid=WARN`` - ``sqlalchemy=WARN`` - ``suds=INFO`` - ``iso8601=WARN`` - ``requests.packages.urllib3.connectionpool=WARN`` - ``urllib3.connectionpool=WARN`` Testing ======= See tests provided by https://blueprints.launchpad.net/nova/+spec/clean-logs References ========== - Security Log Guidelines - https://wiki.openstack.org/wiki/Security/Guidelines/logging_guidelines - Wiki page for basic logging standards proposal developed early in Icehouse - https://wiki.openstack.org/wiki/LoggingStandards - Apache Log4j levels (which many tools work with) - https://logging.apache.org/log4j/1.2/apidocs/org/apache/log4j/Level.html oslo.log-4.1.1/doc/source/user/examples/0000775000175000017500000000000013643050376020143 5ustar zuulzuul00000000000000oslo.log-4.1.1/doc/source/user/examples/_i18n.py0000664000175000017500000000304413643050265021431 0ustar zuulzuul00000000000000# Copyright (c) 2016 OpenStack Foundation # All Rights Reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. """A demonstration of oslo.i18n integration module that is used in projects wanting to implement Oslo i18n translation. See https://docs.openstack.org/oslo.i18n/latest/user/index.html """ import oslo_i18n DOMAIN = "demo" _translators = oslo_i18n.TranslatorFactory(domain=DOMAIN) # The primary translation function using the well-known name "_" _ = _translators.primary # The contextual translation function using the name "_C" _C = _translators.contextual_form # The plural translation function using the name "_P" _P = _translators.plural_form # Translators for log levels. # # The abbreviated names are meant to reflect the usual use of a short # name like '_'. The "L" is for "log" and the other letter comes from # the level. _LI = _translators.log_info _LW = _translators.log_warning _LE = _translators.log_error _LC = _translators.log_critical def get_available_languages(): return oslo_i18n.get_available_languages(DOMAIN) oslo.log-4.1.1/doc/source/user/examples/usage_helper.py0000664000175000017500000000603713643050265023163 0ustar zuulzuul00000000000000# Copyright (c) 2016 OpenStack Foundation # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. """A usage example with helper debugging of minimum Oslo Logging This example requires the following package to be installed. $ pip install oslo.log Additional Oslo packages installed include oslo.config, oslo.context, oslo.i18n, oslo.serialization and oslo.utils. More information about Oslo Logging can be found at: https://docs.openstack.org/oslo.log/latest/user/index.html """ # Use default Python logging to display running output import logging as py_logging from oslo_config import cfg from oslo_log import log as logging LOG = py_logging.getLogger(__name__) CONF = cfg.CONF DOMAIN = "demo" def prepare(): """Prepare Oslo Logging (2 or 3 steps) Use of Oslo Logging involves the following: * logging.register_options * logging.set_defaults (optional) * logging.setup """ LOG.debug("Prepare Oslo Logging") LOG.info("Size of configuration options before %d", len(CONF)) # Required step to register common, logging and generic configuration # variables logging.register_options(CONF) LOG.info("Size of configuration options after %d", len(CONF)) # Optional step to set new defaults if necessary for # * logging_context_format_string # * default_log_levels # # These variables default to respectively: # # import oslo_log # oslo_log._options.DEFAULT_LOG_LEVELS # oslo_log._options.log_opts[0].default # custom_log_level_defaults = logging.get_default_log_levels() + [ 'dogpile=INFO', 'routes=INFO' ] logging.set_defaults(default_log_levels=custom_log_level_defaults) # NOTE: We cannot show the contents of the CONF object # after register_options() because accessing this caches # the default_log_levels subsequently modified with set_defaults() LOG.info("List of Oslo Logging configuration options and current values") LOG.info("=" * 80) for c in CONF: LOG.info("%s = %s" % (c, CONF[c])) LOG.info("=" * 80) # Required setup based on configuration and domain logging.setup(CONF, DOMAIN) if __name__ == '__main__': py_logging.basicConfig(level=py_logging.DEBUG) prepare() # NOTE: These examples do not demonstration Oslo i18n messages LOG.info("Welcome to Oslo Logging") LOG.debug("A debugging message") LOG.warning("A warning occurred") LOG.error("An error occurred") try: raise Exception("This is exceptional") except Exception: LOG.exception("An Exception occurred") oslo.log-4.1.1/doc/source/user/examples/python_logging.py0000664000175000017500000000156213643050265023545 0ustar zuulzuul00000000000000# Copyright (c) 2016 OpenStack Foundation # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. """A syntax example of Python Logging""" import logging LOG = logging.getLogger(__name__) # Define a default handler at INFO logging level logging.basicConfig(level=logging.INFO) LOG.info("Python Standard Logging") LOG.warning("Python Standard Logging") LOG.error("Python Standard Logging") oslo.log-4.1.1/doc/source/user/examples/usage_context.py0000664000175000017500000000454313643050265023370 0ustar zuulzuul00000000000000# Copyright (c) 2016 OpenStack Foundation # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. """A usage example of Oslo Logging with context This example requires the following package to be installed. $ pip install oslo.log Additional Oslo packages installed include oslo.config, oslo.context, oslo.i18n, oslo.serialization and oslo.utils. More information about Oslo Logging can be found at: https://docs.openstack.org/oslo.log/latest/user/index.html https://docs.openstack.org/oslo.context/latest/user/index.html """ from oslo_config import cfg from oslo_context import context from oslo_log import log as logging LOG = logging.getLogger(__name__) CONF = cfg.CONF DOMAIN = 'demo' def prepare(): """Prepare Oslo Logging (2 or 3 steps) Use of Oslo Logging involves the following: * logging.register_options * logging.set_defaults (optional) * logging.setup """ # Required step to register common, logging and generic configuration # variables logging.register_options(CONF) # Optional step to set new defaults if necessary for # * logging_context_format_string # * default_log_levels # # These variables default to respectively: # # import oslo_log # oslo_log._options.DEFAULT_LOG_LEVELS # oslo_log._options.log_opts[0].default # extra_log_level_defaults = [ 'dogpile=INFO', 'routes=INFO' ] logging.set_defaults( default_log_levels=logging.get_default_log_levels() + extra_log_level_defaults) # Required setup based on configuration and domain logging.setup(CONF, DOMAIN) if __name__ == '__main__': prepare() LOG.info("Welcome to Oslo Logging") LOG.info("Without context") context.RequestContext(user='6ce90b4d', tenant='d6134462', domain='a6b9360e') LOG.info("With context") oslo.log-4.1.1/doc/source/user/examples/oslo_logging.py0000664000175000017500000000165613643050265023204 0ustar zuulzuul00000000000000# Copyright (c) 2016 OpenStack Foundation # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. """A minimal syntax example of Oslo Logging""" from oslo_config import cfg from oslo_log import log as logging LOG = logging.getLogger(__name__) CONF = cfg.CONF DOMAIN = "demo" logging.register_options(CONF) logging.setup(CONF, DOMAIN) # Oslo Logging uses INFO as default LOG.info("Oslo Logging") LOG.warning("Oslo Logging") LOG.error("Oslo Logging") oslo.log-4.1.1/doc/source/user/examples/usage.py0000664000175000017500000000453513643050265021625 0ustar zuulzuul00000000000000# Copyright (c) 2016 OpenStack Foundation # # Licensed under the Apache License, Version 2.0 (the "License"); you may # not use this file except in compliance with the License. You may obtain # a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, WITHOUT # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the # License for the specific language governing permissions and limitations # under the License. """A usage example of Oslo Logging This example requires the following package to be installed. $ pip install oslo.log Additional Oslo packages installed include oslo.config, oslo.context, oslo.i18n, oslo.serialization and oslo.utils. More information about Oslo Logging can be found at: https://docs.openstack.org/oslo.log/latest/user/index.html """ from oslo_config import cfg from oslo_log import log as logging LOG = logging.getLogger(__name__) CONF = cfg.CONF DOMAIN = 'demo' def prepare(): """Prepare Oslo Logging (2 or 3 steps) Use of Oslo Logging involves the following: * logging.register_options * logging.set_defaults (optional) * logging.setup """ # Required step to register common, logging and generic configuration # variables logging.register_options(CONF) # Optional step to set new defaults if necessary for # * logging_context_format_string # * default_log_levels # # These variables default to respectively: # # import oslo_log # oslo_log._options.DEFAULT_LOG_LEVELS # oslo_log._options.log_opts[0].default # extra_log_level_defaults = [ 'dogpile=INFO', 'routes=INFO' ] logging.set_defaults( default_log_levels=logging.get_default_log_levels() + extra_log_level_defaults) # Required setup based on configuration and domain logging.setup(CONF, DOMAIN) if __name__ == '__main__': prepare() # NOTE: These examples do not demonstration Oslo i18n messages LOG.info("Welcome to Oslo Logging") LOG.debug("A debugging message") LOG.warning("A warning occurred") LOG.error("An error occurred") try: raise Exception("This is exceptional") except Exception: LOG.exception("An Exception occurred") oslo.log-4.1.1/doc/source/user/usage.rst0000664000175000017500000001116113643050265020160 0ustar zuulzuul00000000000000======= Usage ======= .. _usage-app: In an Application ================= When using `Python's standard logging library`_ the following minimal setup demonstrates basic logging. .. _Python's standard logging library: https://docs.python.org/2/library/logging.html .. highlight:: python .. literalinclude:: examples/python_logging.py :linenos: :lines: 17-26 Source: :ref:`examples/python_logging.py ` When using Oslo Logging the following setup demonstrates a comparative syntax with Python standard logging. .. literalinclude:: examples/oslo_logging.py :linenos: :lines: 17-30 :emphasize-lines: 8,9 Source: :ref:`examples/oslo_logging.py ` Oslo Logging Setup Methods -------------------------- Applications need to use the oslo.log configuration functions to register logging-related configuration options and configure the root and other default loggers before using standard logging functions. Call :func:`~oslo_log.log.register_options` with an oslo.config CONF object before parsing any application command line options. .. literalinclude:: examples/usage.py :linenos: :lines: 33,36-37,46-49 :emphasize-lines: 7 Optionally call :func:`~oslo_log.log.set_defaults` before setup to change default logging levels if necessary. .. literalinclude:: examples/usage.py :linenos: :lines: 51-53,61-69 :emphasize-lines: 10 Call :func:`~oslo_log.log.setup` with the oslo.config CONF object used when registering objects, along with the domain and optionally a version to configure logging for the application. .. literalinclude:: examples/usage.py :linenos: :lines: 34,36-37,70-72 :emphasize-lines: 6 Source: :ref:`examples/usage.py ` Oslo Logging Functions ---------------------- Use standard Python logging functions to produce log records at applicable log levels. .. literalinclude:: examples/usage.py :linenos: :lines: 77-83 **Example Logging Output:** :: 2016-01-14 21:07:51.394 12945 INFO __main__ [-] Welcome to Oslo Logging 2016-01-14 21:07:51.395 12945 WARNING __main__ [-] A warning occurred 2016-01-14 21:07:51.395 12945 ERROR __main__ [-] An error occurred 2016-01-14 21:07:51.396 12945 ERROR __main__ [-] An Exception occurred 2016-01-14 21:07:51.396 12945 ERROR __main__ None 2016-01-14 21:07:51.396 12945 ERROR __main__ Oslo Log Translation -------------------- As of the Pike release, `logging within an application should no longer use Oslo International Utilities (i18n) marker functions `_ to provide language translation capabilities. Adding Context to Logging ------------------------- With the use of `Oslo Context`_, log records can also contain additional contextual information applicable for your application. .. _Oslo Context: https://docs.openstack.org/oslo.context/latest .. literalinclude:: examples/usage_context.py :linenos: :lines: 80-85 :emphasize-lines: 3-5 **Example Logging Output:** :: 2016-01-14 20:04:34.562 11266 INFO __main__ [-] Welcome to Oslo Logging 2016-01-14 20:04:34.563 11266 INFO __main__ [-] Without context 2016-01-14 20:04:34.563 11266 INFO __main__ [req-bbc837a6-be80-4eb2-8ca3-53043a93b78d 6ce90b4d d6134462 a6b9360e - -] With context The log record output format without context is defined with :oslo.config:option:`logging_default_format_string` configuration variable. When specifying context the :oslo.config:option:`logging_context_format_string` configuration variable is used. The Oslo RequestContext object contains a number of attributes that can be specified in :oslo.config:option:`logging_context_format_string`. An application can extend this object to provide additional attributes that can be specified in log records. Examples -------- :ref:`examples/usage.py ` provides a documented example of Oslo Logging setup. :ref:`examples/usage_helper.py ` provides an example showing debugging logging at each step details the configuration and logging at each step of Oslo Logging setup. :ref:`examples/usage_context.py ` provides a documented example of Oslo Logging with Oslo Context. In a Library ============ oslo.log is primarily used for configuring logging in an application, but it does include helpers that can be useful from libraries. :func:`~oslo_log.log.getLogger` wraps the function of the same name from Python's standard library to add a :class:`~oslo_log.log.KeywordArgumentAdapter`, making it easier to pass data to the formatters provided by oslo.log and configured by an application. oslo.log-4.1.1/doc/source/configuration/0000775000175000017500000000000013643050376020216 5ustar zuulzuul00000000000000oslo.log-4.1.1/doc/source/configuration/index.rst0000664000175000017500000001055013643050265022055 0ustar zuulzuul00000000000000.. _opts: ======================= Configuration Options ======================= oslo.log uses oslo.config to define and manage configuration options to allow the deployer to control how an application's logs are handled. .. show-options:: oslo.log Format Strings and Log Record Metadata ====================================== oslo.log builds on top of the Python standard library logging module. The format string supports all of the built-in replacement keys provided by that library, with some additions. Some of the more useful keys are listed here. Refer to the `section on LogRecord attributes `__ in the library documentation for complete details about the built-in values. Basic Information ----------------- .. list-table:: :header-rows: 1 :widths: 25,75 - * Format key * Description - * ``%(message)s`` * The message passed from the application code. Time Information ---------------- .. list-table:: :header-rows: 1 :widths: 25,75 - * Format key * Description - * ``%(asctime)s`` * Human-readable time stamp of when the logging record was created, formatted as '2003-07-08 16:49:45,896' (the numbers after the comma are milliseconds). - * ``%(isotime)s`` * Human-readable time stamp of when the logging record was created, using `Python's isoformat() `__ function in ISO 8601 format (``YYYY-MM-DDTHH:MM:SS.mmmmmm`` or, if the microseconds value is 0, ``YYYY-MM-DDTHH:MM:SS``). Location Information -------------------- .. list-table:: :header-rows: 1 :widths: 25,75 - * Format key * Description - * ``%(pathname)s`` * Full name of the source file where the logging call was issued, when it is available. - * ``%(filename)s`` * Filename portion of ``pathname``. - * ``%(lineno)d`` * Source line number where the logging call was issued, when it is available. - * ``%(module)s`` * The module name is derived from the filename. - * ``%(name)s`` * The name of the logger used to log the call. For OpenStack projects, this usually corresponds to the full module name (i.e., ``nova.api`` or ``oslo_config.cfg``). Severity Information -------------------- .. list-table:: :header-rows: 1 :widths: 25,75 - * Format key * Description - * ``%(levelname)s`` * Text logging level for the message (``DEBUG``, ``INFO``, ``WARNING``, ``ERROR``, ``CRITICAL``). - * ``%(levelno)s`` * Numeric logging level for the message. DEBUG level messages have a lower numerical value than INFO, which have a lower value than WARNING, etc. Error Handling Information -------------------------- .. list-table:: :header-rows: 1 :widths: 25,75 - * Format key * Description - * ``%(error_summary)s`` * The name of the exception being processed and any message associated with it. Identity Information -------------------- *These keys are only available in OpenStack applications that also use oslo.context.* .. list-table:: :header-rows: 1 :widths: 25,75 - * Format key * Description - * ``%(user_identity)s`` * The pre-formatted identity information about the user. See the ``logging_user_identity_format`` configuration option. - * ``%(user_name)s`` * The name of the authenticated user, if available. - * ``%(user)s`` * The ID of the authenticated user, if available. - * ``%(tenant_name)s`` * The name of the authenticated tenant, if available. - * ``%(tenant)s`` * The ID of the authenticated tenant, if available. - * ``%(user_domain)s`` * The ID of the authenticated user domain, if available. - * ``%(project_domain)s`` * The ID of the authenticated project/tenant, if available. - * ``%(request_id)s`` * The ID of the current request. This value can be used to tie multiple log messages together as relating to the same operation. - * ``%(resource_uuid)s`` * The ID of the resource on which the current operation will have effect. For example, the instance, network, volume, etc. .. seealso:: * `Python logging library LogRecord attributes `__ oslo.log-4.1.1/doc/source/admin/0000775000175000017500000000000013643050376016437 5ustar zuulzuul00000000000000oslo.log-4.1.1/doc/source/admin/nova_sample.conf0000664000175000017500000000336113643050265021612 0ustar zuulzuul00000000000000[loggers] keys = root, nova [handlers] keys = stderr, stdout, watchedfile, syslog, fluent, null [formatters] keys = context, default, fluent [logger_root] level = WARNING handlers = null [logger_nova] level = INFO handlers = stderr qualname = nova [logger_amqp] level = WARNING handlers = stderr qualname = amqp [logger_amqplib] level = WARNING handlers = stderr qualname = amqplib [logger_sqlalchemy] level = WARNING handlers = stderr qualname = sqlalchemy # "level = INFO" logs SQL queries. # "level = DEBUG" logs SQL queries and results. # "level = WARNING" logs neither. (Recommended for production systems.) [logger_boto] level = WARNING handlers = stderr qualname = boto # NOTE(mikal): suds is used by the vmware driver, removing this will # cause many extraneous log lines for their tempest runs. Refer to # https://review.opendev.org/#/c/219225/ for details. [logger_suds] level = INFO handlers = stderr qualname = suds [logger_eventletwsgi] level = WARNING handlers = stderr qualname = eventlet.wsgi.server [handler_stderr] class = StreamHandler args = (sys.stderr,) formatter = context [handler_stdout] class = StreamHandler args = (sys.stdout,) formatter = context [handler_watchedfile] class = handlers.WatchedFileHandler args = ('nova.log',) formatter = context [handler_syslog] class = handlers.SysLogHandler args = ('/dev/log', handlers.SysLogHandler.LOG_USER) formatter = context [handler_fluent] class = fluent.handler.FluentHandler args = ('openstack.nova', 'localhost', 24224) formatter = fluent [handler_null] class = logging.NullHandler formatter = default args = () [formatter_context] class = oslo_log.formatters.ContextFormatter [formatter_default] format = %(message)s [formatter_fluent] class = oslo_log.formatters.FluentFormatter oslo.log-4.1.1/doc/source/admin/log_rotation.rst0000664000175000017500000000256413643050265021675 0ustar zuulzuul00000000000000============= Log rotation ============= oslo.log can work with ``logrotate``, picking up file changes once log files are rotated. Make sure to set the ``watch-log-file`` config option. Log rotation on Windows ----------------------- On Windows, in-use files cannot be renamed or moved. For this reason, oslo.log allows setting maximum log file sizes or log rotation interval, in which case the service itself will take care of the log rotation (as opposed to having an external daemon). Configuring log rotation ------------------------ Use the following options to set a maximum log file size. In this sample, log files will be rotated when reaching 1GB, having at most 30 log files. .. code-block:: ini [DEFAULT] log_rotation_type = size max_logfile_size_mb = 1024 # MB max_logfile_count = 30 The following sample configures log rotation to be performed every 12 hours. .. code-block:: ini [DEFAULT] log_rotation_type = interval log_rotate_interval = 12 log_rotate_interval_type = Hours max_logfile_count = 60 .. note:: The time of the next rotation is computed when the service starts or when a log rotation is performed, using the time of the last file modification or the service start time, to which the configured log rotation interval is added. This means that service restarts may delay periodic log file rotations. oslo.log-4.1.1/doc/source/admin/journal.rst0000664000175000017500000001177413643050265020652 0ustar zuulzuul00000000000000========================= Systemd Journal Support ========================= One of the newer features in oslo.log is the ability to integrate with the systemd journal service (journald) natively on newer Linux systems. When using native journald support, additional metadata will be logged on each log message in addition to the message itself, which can later be used to do some interesting searching through your logs. Enabling ======== In order to enable the support you must have Python bindings for systemd installed. On Red Hat based systems, run:: yum install systemd-python On Ubuntu/Debian based systems, run:: apt install python-systemd If there is no native package for your distribution, or you are running in a virtualenv, you can install with pip.:: pip install systemd-python .. note:: There are also many non official systemd python modules on pypi, with confusingly close names. Make sure you install `systemd-python `_. After the package is installed, you must enable journald support manually in all services that will be using it. Add the following to the config files for all services: .. code-block:: ini [DEFAULT] use_journal = True In all relevant config files. Extra Metadata ============== Journald supports the concept of adding structured metadata in addition to the log message in question. This makes it much easier to take the output of journald and push it into other logging systems like Elastic Search, without needing to regex guess relevant data. It also allows you to search the journal by these fields using ``journalctl``. We use this facility to add our own structured information, if it is known at the time of logging the message. CODE_FILE=, CODE_LINE=, CODE_FUNC= The code location generating this message, if known. Contains the source filename, the line number and the function name. (This is the same as systemd uses) THREAD_NAME=, PROCESS_NAME= Information about the thread and process, if known. (This is the same as systemd uses) EXCEPTION_TEXT=, EXCEPTION_INFO= Information about an exception, if an exception has been logged. LOGGER_NAME= The name of the python logger that emitted the log message. Very often this is the module where the log message was emitted from. LOGGER_LEVEL= The name of the python logging level, which allows seeing all 'ERROR' messages very easily without remembering how they are translated to syslog priorities. SYSLOG_IDENTIFIER= The binary name identified for syslog compatibility. It will be the basename of the process that emits the log messages (e.g. ``nova-api``, ``neutron-l3-agent``) PRIORITY= The syslog priority (based on LOGGER_LEVEL), which allows syslog style filtering of messages based on their priority (an openstack.err log file for instance). REQUEST_ID= Most OpenStack services generate a unique ``request-id`` on every REST API call, which is then passed between it's sub services as that request is handled. For example, this can be very useful in tracking the build of a nova server from the initial HTTP POST to final VM create. PROJECT_ID=, PROJECT_NAME=, USER_ID=, USER_NAME= The keystone known user and project information about the requestor. Both the id and name are provided for easier searching. This can be used to understand when particular users or projects are reporting issues in the environment. Additional fields may be added over time. It is unlikely that fields will be removed, but if so they will be deprecated for one release cycle before that happens. Using Journalctl ================ Because systemd is relatively new in the Linux ecosystem, it's worth noting how one can effectively use journal control. If you want to follow all the journal logs you would do so with:: journalctl -f That's going to be nearly everything on your system, which you will probably find overwhelming. You can limit this to a smaller number of things using the ``SYSLOG_IDENTIFIER=``:: journalctl -f SYSLOG_IDENTIFIER=nova-compute SYSLOG_IDENTIFIER=neutron-l3-agent Specifying a query parameter multiple times defaults to an ``OR`` operation, so that will show either nova-compute or neutron-l3-agent logs. You can also query by request id to see the entire flow of a REST call:: journalctl REQUEST_ID=req-b1903300-77a8-401d-984c-8e7d17e4a15f References ========== - A complete list of the systemd journal fields is here, it is worth making yourself familiar with them - https://www.freedesktop.org/software/systemd/man/systemd.journal-fields.html - The complete journalctl manual is worth reading, especially the ``-o`` parameter, as default displayed time resolution is only in seconds (even though systemd internally is tracking microsecs) - https://www.freedesktop.org/software/systemd/man/journalctl.html - The guide for using systemd in devstack provides additional examples of effective journalctl queries - https://opendev.org/openstack/devstack/src/branch/master/doc/source/systemd.rst oslo.log-4.1.1/doc/source/admin/index.rst0000664000175000017500000000033113643050265020272 0ustar zuulzuul00000000000000============================================== Administering Applications that use oslo.log ============================================== .. toctree:: :maxdepth: 2 advanced_config journal log_rotation oslo.log-4.1.1/doc/source/admin/advanced_config.rst0000664000175000017500000000313513643050265022262 0ustar zuulzuul00000000000000============================== Advanced Configuration Files ============================== The oslo.config options described in :ref:`opts` make it easy to enable some default logging configuration behavior such as setting the default log level and output file. For more advanced configurations using translations or multiple output destinations oslo.log relies on the Python standard library logging module configuration file features. The configuration file can be used to tie together the loggers, handlers, and formatters and provide all of the necessary configuration values to enable any desired behavior. Refer to the `Python logging Module Tutorial`_ for descriptions of these concepts. Logger Names ============ Loggers are configured by name. Most OpenStack applications use logger names based on the source file where the message is coming from. A file named ``myapp/package/module.py`` corresponds to a logger named ``myapp.package.module``. Loggers are configured in a tree structure, and the names reflect their location in this hierarchy. It is not necessary to configure every logger, since messages are passed up the tree during processing. To control the logging for ``myapp``, for example, it is only necessary to set up a logger for ``myapp`` and not ``myapp.package.module``. The base of the tree, through which all log message may pass unless otherwise discarded, is called the ``root`` logger. Example Files ============= .. toctree:: :glob: example* .. seealso:: * `Python logging Module Tutorial`_ .. _Python logging Module Tutorial: https://docs.python.org/2.7/howto/logging.html oslo.log-4.1.1/doc/source/admin/example_nova.rst0000664000175000017500000000534113643050265021647 0ustar zuulzuul00000000000000========================================= Example Configuration File for ``nova`` ========================================= This sample configuration file demonstrates how the OpenStack compute service (nova) might be configured. .. literalinclude:: nova_sample.conf :language: ini :prepend: # nova_sample.conf Two logger nodes are set up, ``root`` and ``nova``. .. literalinclude:: nova_sample.conf :language: ini :lines: 1-2 Several handlers are created, to send messages to different outputs. .. literalinclude:: nova_sample.conf :language: ini :lines: 4-5 And two formatters are created to be used based on whether the logging location will have OpenStack request context information available or not. A Fluentd formatter is also shown. .. literalinclude:: nova_sample.conf :language: ini :lines: 7-8 The ``root`` logger is configured to send messages to the ``null`` handler, silencing most messages that are not part of the nova application code namespace. .. literalinclude:: nova_sample.conf :language: ini :lines: 10-12 The ``nova`` logger is configured to send messages marked as ``INFO`` and higher level to the standard error stream. .. literalinclude:: nova_sample.conf :language: ini :lines: 14-17 The ``amqp`` and ``amqplib`` loggers, used by the module that connects the application to the message bus, are configured to emit warning messages to the standard error stream. .. literalinclude:: nova_sample.conf :language: ini :lines: 19-27 The ``sqlalchemy`` logger, used by the module that connects the application to the database, is configured to emit warning messages to the standard error stream. .. literalinclude:: nova_sample.conf :language: ini :lines: 29-35 Similarly, ``boto``, ``suds``, and ``eventlet.wsgi.server`` are configured to send warnings to the standard error stream. .. literalinclude:: nova_sample.conf :language: ini :lines: 37-53 The ``stderr`` handler, being used by most of the loggers above, is configured to write to the standard error stream on the console. .. literalinclude:: nova_sample.conf :language: ini :lines: 55-58 The ``stderr`` handler uses the ``context`` formatter, which takes its configuration settings from ``oslo.config``. .. literalinclude:: nova_sample.conf :language: ini :lines: 85-86 The ``stdout`` and ``syslog`` handlers are defined, but not used. The ``fluent`` handler is useful to send logs to ``fluentd``. It is a part of fluent-logger-python and you can install it as following. :: $ pip install fluent-logger This handler is configured to use ``fluent`` formatter. .. literalinclude:: nova_sample.conf :language: ini :lines: 75-78 .. literalinclude:: nova_sample.conf :language: ini :lines: 91-92