api_hour-0.8.2/0000775000175000017500000000000013201335070012351 5ustar lglg00000000000000api_hour-0.8.2/PKG-INFO0000664000175000017500000002615213201335070013454 0ustar lglg00000000000000Metadata-Version: 1.1 Name: api_hour Version: 0.8.2 Summary: Write efficient network daemons (HTTP, SSH...) with ease. Home-page: http://www.api-hour.io Author: Eyepea Dev Team Author-email: gmludo@gmail.com License: Apache 2 Download-URL: https://pypi.python.org/pypi/api_hour Description: API Hour ======== API-Hour is a lightweight daemon framework that lets you write powerful applications. It was created to answer the need for a simple, robust, and super-fast server-side environment to build very efficient Daemons with ease. By default, the API-Hour Starter Kit (Cookiecutter) creates an HTTP daemon for you to develop WebServices. With API-Hour, you can quickly convert any AsyncIO server library into a multi-processing daemon, ready for production. .. image:: https://raw.githubusercontent.com/Eyepea/API-Hour/master/docs/API-Hour_small.png Quick'n'dirty HTTP benchmarks on a kitchen table ------------------------------------------------ .. image:: https://raw.githubusercontent.com/Eyepea/API-Hour/master/propaganda/en/stats.png Scale: Number of queries during 30 seconds with 400 simultaneous connections. Benchmark made on a Dell Precision M6800 between API-Hour and Gunicorn with 16 workers. For details, read the information in `benchmarks `_. Where is the magic behind these performances? '''''''''''''''''''''''''''''''''''''''''''''''' Architecture matters a lot more than tools. We use asynchronous and multiprocess patterns, combined together, to handle as many HTTP requests as possible. Ideally, the limitation should be your network card, not your CPU nor memory. Moreover, we've tried to reduce the layers between your code and async sockets as much as possible. For each layer, we use the best in terms of performance and simplicity: #. 
`AsyncIO `_: an easy asynchronous framework, directly integrated in Python 3.4+ #. `aiohttp.web `_: HTTP protocol implementation for AsyncIO + Web framework #. `ujson `_: fastest JSON serialization Examples -------- #. `API-Hour Starter Kit (Cookiecutter) `_ #. `API-Hour implementation of TechEmpower Web Framework Benchmarks `_ #. `HTTP+SSH Daemon `_ #. `Quick'n'dirty benchmarks on a kitchen table `_ How to start an API-Hour project? ---------------------------------- You can follow `one of our tutorials `_ Support ------- * `Documentation `_. * `Mailing-list `_ Requirements ------------ - Python 3.5+ Install ------- Follow the `official documentation `_. License ------- ``API-Hour`` is offered under the Apache 2 license. Architecture ------------ ``API-Hour`` is the glue between your code and Gunicorn to launch your code in several processes. Origin ------ API-Hour was a fork of aiorest; it is now only based on Gunicorn for multiprocessing. Thanks ------ Thanks to the Gunicorn, aiorest, aiohttp and AsyncIO communities: they did 99.9999% of the job for API-Hour. Special thanks to **Andrew Svetlov**, the creator of aiorest. Goals of API-Hour ----------------- #. **Fast**: API-Hour is designed from the bottom up to be extremely fast, and capable of handling a huge load. It uses Python 3 and its new powerful AsyncIO package. #. **Scalable**: API-Hour is built to be elastic, and easily scalable. #. **Lightweight**: #. **small codebase**: Doing less means being faster: the codebase for processing a request is kept as small as possible. Beyond this base footprint, you can of course activate, preload and initialize more plugins or packages, but that choice is yours. #. **flexible setup**: Some people have no problem with using many dependencies, while others want to have none (other than Python). Some people are OK with losing a bit of performance for the ease (and speed) of coding, while others wouldn't sacrifice a millisecond for ready-made functionality. 
These choices are yours, so there is no mandatory extra layer, plugin or middleware. #. **Easy**: API-Hour is meant to be very easy to grasp: no steep learning curve, no mountain of docs to read: download our turn-key "Hello-world" applications, and immediately start coding your own application from there. #. **Packages-friendly and friendly-packages**: We try to let you use external packages without the need to re-write them, adapt them, "wrap" them or embed them in the framework. On the other hand, API-Hour "plugins" are written as much as possible to be usable as stand-alone packages outside the framework, to benefit more people. #. **Asynchronous... or not**: If you don't need the extra complexity of building asynchronous code, you don't have to (you'll still enjoy tremendous performance). You can just handle your requests in a traditional synchronous way. On the other hand, if your project does IO or processing that could benefit from parallelizing tasks, the whole power of AsyncIO, futures, coroutines and tasks is at your fingertips. All provided plugins (in particular, Database plugins) are Async-ready. CHANGES ======= 0.8.2 (2017-11-10) ------------------ * Add pre_start coroutine * Fix setup.py to correctly check the minimal Python version. 
Thanks @romuald 0.8.1 (2016-07-08) ------------------ * Drop support of Python 3.3 and 3.4 0.7.1 (2016-07-08) ------------------ * Merge bugfix from https://github.com/KeepSafe/aiohttp/pull/879 0.7.0 (2015-05-04) ------------------ * Add HTML serializer plugin * Add AsyncIO high level stream server support (Used by FastAGI implementation of Panoramisk) * Now, you can use make_handler method to connect directly your handlers with your sockets for more flexibility 0.6.2 (2015-02-24) ------------------ * You can customize event loop used with make_event_loop() class method in Container 0.6.1 (2015-02-10) ------------------ * Release a new version because PyPI is bugged: 0.6.0 is broken on PyPI 0.6.0 (2015-01-13) ------------------ * API-Hour config file is now optional, use -ac to auto-configure your app * Add Python 3.3 compatibility to use easily Python 3 directly from distributions package * Add Debian/Ubuntu package * ujson is now optional for aiohttp.web * More documentation with tutorials: all-in-one and Starter Kit * If api_hour CLI has no logging file, enable logging on console by default 0.5.0 (2015-01-07) ------------------ * Project reboot * Change API-Hour main goal: API-Hour can now multiprocess all AsyncIO lib server, not only HTTP * API-Hour is now based on Gunicorn * Remove aiorest fork, recommend to use aiohttp.web for HTTP daemons in cookiecutter 0.3.3 (2014-12-19) ------------------ * Static files can be served automatically * body and json_body and transport accessible in Request * loop accessible in Application * Asset Serializer accepts encoding * cookiecutter available at https://github.com/Eyepea/cookiecutter-API-Hour * Use of ujson * Bugfixes 0.3.2 (2014-10-31) ------------------ * Refactoring and clean-up * Publish benchmark server for API-Hour * English version of PyCON-FR presentation about API-Hour * Fix response.write_eof() to follow aiohttp changes (Thanks aiorest for the patch) 0.3.1 (2014-10-28) ------------------ * Rename 
multi_process to arbiter * Improve Python packaging 0.3.0 (2014-10-26) ------------------ * First version of API-Hour, performance oriented version of aiorest * cookiecutter template * Serialization support * replace json by ujson * basic multiprocessing 0.2.4 (2014-09-12) ------------------ * Make loop a keyword-only parameter in the create_session_factory() function 0.2.3 (2014-08-28) ------------------ * Redis session switched from asyncio_redis to aioredis 0.2.2 (2014-08-15) ------------------ * Added Pyramid-like matchdict to request (see https://github.com/aio-libs/aiorest/pull/18) * Return "400 Bad Request" for incorrect JSON body in POST/PUT methods * README fixed * Custom response status code (see https://github.com/aio-libs/aiorest/pull/23) 0.1.1 (2014-07-09) ------------------ * Switched to aiohttp v0.9.0 0.1.0 (2014-07-07) ------------------ * Basic REST API Keywords: asyncio,performance,efficient,web,service,rest,json,daemon,application Platform: OS Independent Classifier: Development Status :: 5 - Production/Stable Classifier: Environment :: No Input/Output (Daemon) Classifier: Environment :: Web Environment Classifier: Intended Audience :: Developers Classifier: License :: OSI Approved :: Apache Software License Classifier: Natural Language :: English Classifier: Operating System :: OS Independent Classifier: Programming Language :: Python Classifier: Programming Language :: Python :: 3 Classifier: Programming Language :: Python :: 3.5 Classifier: Programming Language :: Python :: 3 :: Only Classifier: Programming Language :: Python :: Implementation :: CPython Classifier: Topic :: Internet :: WWW/HTTP Classifier: Topic :: Internet :: WWW/HTTP :: Dynamic Content Classifier: Topic :: Internet :: WWW/HTTP :: HTTP Servers Classifier: Topic :: System :: Networking Provides: api_hour api_hour-0.8.2/LICENSE0000664000175000017500000000105712452752107013374 0ustar lglg00000000000000Copyright [2015] [Eyepea Dev Team] Licensed under the Apache License, Version 2.0 
(the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.api_hour-0.8.2/README.rst0000664000175000017500000001225413201333073014045 0ustar lglg00000000000000API Hour ======== API-Hour is a lightweight daemon framework that lets you write powerful applications. It was created to answer the need for a simple, robust, and super-fast server-side environment to build very efficient Daemons with ease. By default, the API-Hour Starter Kit (Cookiecutter) creates an HTTP daemon for you to develop WebServices. With API-Hour, you can quickly convert any AsyncIO server library into a multi-processing daemon, ready for production. .. image:: https://raw.githubusercontent.com/Eyepea/API-Hour/master/docs/API-Hour_small.png Quick'n'dirty HTTP benchmarks on a kitchen table ------------------------------------------------ .. image:: https://raw.githubusercontent.com/Eyepea/API-Hour/master/propaganda/en/stats.png Scale: Number of queries during 30 seconds with 400 simultaneous connections. Benchmark made on a Dell Precision M6800 between API-Hour and Gunicorn with 16 workers. For details, read the information in `benchmarks `_. Where is the magic behind these performances? '''''''''''''''''''''''''''''''''''''''''''''''' Architecture matters a lot more than tools. We use asynchronous and multiprocess patterns, combined together, to handle as many HTTP requests as possible. Ideally, the limitation should be your network card, not your CPU nor memory. Moreover, we've tried to reduce the layers between your code and async sockets as much as possible. 
For each layer, we use the best in terms of performance and simplicity: #. `AsyncIO `_: an easy asynchronous framework, directly integrated in Python 3.4+ #. `aiohttp.web `_: HTTP protocol implementation for AsyncIO + Web framework #. `ujson `_: fastest JSON serialization Examples -------- #. `API-Hour Starter Kit (Cookiecutter) `_ #. `API-Hour implementation of TechEmpower Web Framework Benchmarks `_ #. `HTTP+SSH Daemon `_ #. `Quick'n'dirty benchmarks on a kitchen table `_ How to start an API-Hour project? ---------------------------------- You can follow `one of our tutorials `_ Support ------- * `Documentation `_. * `Mailing-list `_ Requirements ------------ - Python 3.5+ Install ------- Follow the `official documentation `_. License ------- ``API-Hour`` is offered under the Apache 2 license. Architecture ------------ ``API-Hour`` is the glue between your code and Gunicorn to launch your code in several processes. Origin ------ API-Hour was a fork of aiorest; it is now only based on Gunicorn for multiprocessing. Thanks ------ Thanks to the Gunicorn, aiorest, aiohttp and AsyncIO communities: they did 99.9999% of the job for API-Hour. Special thanks to **Andrew Svetlov**, the creator of aiorest. Goals of API-Hour ----------------- #. **Fast**: API-Hour is designed from the bottom up to be extremely fast, and capable of handling a huge load. It uses Python 3 and its new powerful AsyncIO package. #. **Scalable**: API-Hour is built to be elastic, and easily scalable. #. **Lightweight**: #. **small codebase**: Doing less means being faster: the codebase for processing a request is kept as small as possible. Beyond this base footprint, you can of course activate, preload and initialize more plugins or packages, but that choice is yours. #. **flexible setup**: Some people have no problem with using many dependencies, while others want to have none (other than Python). 
Some people are OK with losing a bit of performance for the ease (and speed) of coding, while others wouldn't sacrifice a millisecond for ready-made functionality. These choices are yours, so there is no mandatory extra layer, plugin or middleware. #. **Easy**: API-Hour is meant to be very easy to grasp: no steep learning curve, no mountain of docs to read: download our turn-key "Hello-world" applications, and immediately start coding your own application from there. #. **Packages-friendly and friendly-packages**: We try to let you use external packages without the need to re-write them, adapt them, "wrap" them or embed them in the framework. On the other hand, API-Hour "plugins" are written as much as possible to be usable as stand-alone packages outside the framework, to benefit more people. #. **Asynchronous... or not**: If you don't need the extra complexity of building asynchronous code, you don't have to (you'll still enjoy tremendous performance). You can just handle your requests in a traditional synchronous way. On the other hand, if your project does IO or processing that could benefit from parallelizing tasks, the whole power of AsyncIO, futures, coroutines and tasks is at your fingertips. All provided plugins (in particular, Database plugins) are Async-ready. 
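The container lifecycle that the worker drives (``pre_start``, then ``start``, then ``shutdown`` and ``cleanup`` on the way down, as seen in ``api_hour/container.py`` and ``api_hour/worker.py``) can be sketched with plain ``asyncio``. This is a minimal, self-contained illustration: ``MiniContainer`` and ``run_lifecycle`` are hypothetical names for this sketch, not part of the real ``api_hour.Container`` API, which also receives a Gunicorn worker and builds servers from its sockets via ``make_servers()``.

```python
import asyncio


class MiniContainer:
    """Illustrative stand-in for the lifecycle hooks of an API-Hour container."""

    def __init__(self, config):
        self.config = config
        self.events = []  # records which hooks ran, and in which order

    async def pre_start(self):
        # e.g. open database engines before the sockets accept traffic
        self.events.append('pre_start')

    async def start(self):
        self.events.append('start')

    async def shutdown(self):
        # stop accepting new work, let in-flight requests finish
        self.events.append('shutdown')

    async def cleanup(self):
        # release engines and connections
        self.events.append('cleanup')


async def run_lifecycle(container):
    # Same ordering the worker uses: pre_start -> start -> serve -> shutdown -> cleanup
    await container.pre_start()
    await container.start()
    # ... requests would be served here ...
    await container.shutdown()
    await container.cleanup()
    return container.events


events = asyncio.run(run_lifecycle(MiniContainer(config={})))
print(events)  # ['pre_start', 'start', 'shutdown', 'cleanup']
```

In the real framework you subclass ``Container``, implement ``make_servers()`` to return your protocol handlers, and the Gunicorn worker calls these hooks around the serving loop in each process. (The sketch uses ``asyncio.run``, available from Python 3.7.)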
api_hour-0.8.2/api_hour/0000775000175000017500000000000013201335070014157 5ustar lglg00000000000000api_hour-0.8.2/api_hour/container.py0000664000175000017500000000257513146643774016541 0ustar lglg00000000000000from abc import abstractmethod import asyncio from collections import OrderedDict import logging __all__ = [ 'Container', ] LOG = logging.getLogger(__name__) class Container: def __init__(self, config, worker, loop=None): if loop is None: loop = asyncio.get_event_loop() self.loop = loop super().__init__() self.config = config self.worker = worker # Engines initialisation self.engines = {} # Services initialisation self.services = {} self.servers = OrderedDict() self._stopping = False @abstractmethod async def make_servers(self, sockets): """Return handlers to serve data""" @classmethod def make_event_loop(cls, config): """To customize loop generation""" return asyncio.new_event_loop() async def pre_start(self): pass async def start(self): LOG.info('Starting application...') def pre_stop(self): if not self._stopping: self._stopping = True task = asyncio.ensure_future(self.shutdown(), loop=self.loop) task.add_done_callback(self.post_stop) else: LOG.debug('Already stopping application, not doing anything') async def shutdown(self): pass async def cleanup(self): pass def post_stop(self, future): pass api_hour-0.8.2/api_hour/config.py0000664000175000017500000000354612454570157016022 0ustar lglg00000000000000import logging import logging.config import os import sys from gunicorn import util from gunicorn.config import Setting, validate_string, validate_bool from gunicorn.errors import ConfigError import yaml LOG = logging.getLogger(__name__) def get_config(overrides: dict) -> dict: """ :param overrides: config values that override the config file(s). 
:type overrides: dict :return: a dict of config values :rtype: dict :Example: get_config(vars(p.parse_args())) """ config_file = os.path.join(overrides['config_dir'], 'main/main.yaml') try: with open(config_file, 'r') as f: conf = yaml.safe_load(f) except IOError as e: print(e) print('Configuration file "%s" cannot be found. Please fix this and retry.' % config_file) sys.exit(1) LOG.info('Config file used: %s', config_file) return conf def validate_config_dir(val): if val is None: return val else: # valid if the value is a string val = validate_string(val) # transform relative paths path = os.path.abspath(os.path.normpath(os.path.join(util.getcwd(), val))) # test if the path exists if not os.path.exists(path): raise ConfigError("can't find a config directory in %r" % val) return path class ConfigDir(Setting): name = "config_dir" section = "API-Hour" cli = ["--config_dir"] validator = validate_config_dir default = None desc = """\ Config directory of your API-Hour Daemon. Example: /etc/hello/ """ class AutoConfig(Setting): name = "auto_config" section = "API-Hour" cli = ['-ac', "--auto_config"] validator = validate_bool action = "store_true" default = False desc = """\ Enable auto-configuration discovery based on the daemon name """ api_hour-0.8.2/api_hour/plugins/0000775000175000017500000000000013201335070015640 5ustar lglg00000000000000api_hour-0.8.2/api_hour/plugins/__init__.py0000664000175000017500000000002212452752107017756 0ustar lglg00000000000000__author__ = 'lg' api_hour-0.8.2/api_hour/plugins/aiohttp/0000775000175000017500000000000013201335070017310 5ustar lglg00000000000000api_hour-0.8.2/api_hour/plugins/aiohttp/response.py0000664000175000017500000000151413146142356021534 0ustar lglg00000000000000from aiohttp.web import HTTPException try: import ujson as json except ImportError: import json class JSON(HTTPException): """Serialize response to JSON with aiohttp.web""" def __init__(self, data, status=200, reason=None, headers=None): body = 
json.dumps(data).encode('utf-8') self.status_code = status super().__init__(body=body, reason=reason, headers=headers, content_type='application/json') class HTML(HTTPException): """Serialize response to HTML with aiohttp.web""" def __init__(self, data, status=200, reason=None, headers=None): body = data.encode('utf-8') self.status_code = status super().__init__(body=body, reason=reason, headers=headers, content_type='text/html') api_hour-0.8.2/api_hour/plugins/aiohttp/router.py0000664000175000017500000000112313146142356021212 0ustar lglg00000000000000from aiohttp.web_urldispatcher import UrlDispatcher class Router(UrlDispatcher): def __init__(self, container): self._container = container self._container.env_config = dict() super().__init__() def add_route(self, method, path, handler, *, name=None, expect_handler=None, env_config=None): resource = self.add_resource(path, name=name) if name: self._container.env_config[name] = env_config or [] return resource.add_route(method, handler, expect_handler=expect_handler) api_hour-0.8.2/api_hour/plugins/aiohttp/__init__.py0000664000175000017500000000015413146142356021434 0ustar lglg00000000000000from .response import JSON, HTML from .environment import env_middleware_factory from .router import Router api_hour-0.8.2/api_hour/plugins/aiohttp/environment.py0000664000175000017500000001111413146142356022237 0ustar lglg00000000000000import logging try: import ujson as json except ImportError: import json from . 
import JSON LOG = logging.getLogger(__name__) async def env_middleware_factory(app, handler): async def env_middleware(request): LOG.debug('incoming request %s from: %s', request.method, request.path) request['env'] = { 'container': request.app['ah_container'], 'id': request.headers.get('Request_ID', None) } try: name = request.match_info.route.resource.name config = request['env']['container'].env_config.get(name, []) except AttributeError: LOG.debug('Can not retrieve resource name for %(http_path)s', {'http_path': request.path}) return await handler(request) LOG.debug('Resource name: %s, config: %s', name, config) if 'pg' in config and 'pg' in request['env']['container'].engines: LOG.debug('Creating pg cursor') request['env']['pg'] = dict() request['env']['pg']['engine'] = await request['env']['container'].engines['pg'] request['env']['pg']['cursor_context_manager'] = await request['env']['pg']['engine'].cursor() request['env']['pg']['cursor'] = request['env']['pg']['cursor_context_manager'].__enter__() await request['env']['pg']['cursor'].execute('BEGIN') if 'mysql' in config and 'mysql' in request['env']['container'].engines: LOG.debug('Creating mysql cursor') request['env']['mysql'] = dict() request['env']['mysql']['engine'] = await request['env']['container'].engines['mysql'] request['env']['mysql']['connection'] = await request['env']['mysql']['engine'].acquire() request['env']['mysql']['cursor'] = await request['env']['mysql']['connection'].cursor() await request['env']['mysql']['connection'].begin() if 'redis' in config and 'redis' in request['env']['container'].engines: LOG.debug('Creating redis connection') request['env']['redis'] = dict() request['env']['redis']['engine'] = await request['env']['container'].engines['redis'] request['env']['redis']['connection'] = await request['env']['redis']['engine'].acquire() if 'json' in config: LOG.debug('Parsing json') try: request['env']['json'] = await request.json(loads=json.loads) except ValueError: raise 
JSON(status=400, data='''The payload must be of json format''') LOG.debug('Json parsed') request['env']['name'] = name request['env']['config'] = config try: LOG.debug('Handling request') response = await handler(request) except Exception as exception: if 'pg' in config and 'pg' in request['env'] and not request['env']['pg']['cursor'].closed: LOG.debug('Rolling back postgres transaction') await request['env']['pg']['cursor'].execute('ROLLBACK') request['env']['pg']['cursor_context_manager'].__exit__() if 'mysql' in config and 'mysql' in request['env'] and not request['env']['mysql']['cursor'].closed: LOG.debug('Rolling back mysql transaction') await request['env']['mysql']['connection'].rollback() await request['env']['mysql']['cursor'].close() request['env']['mysql']['engine'].release(request['env']['mysql']['connection']) raise exception else: if 'pg' in config and 'pg' in request['env'] and not request['env']['pg']['cursor'].closed: LOG.debug('Committing postgres transaction') await request['env']['pg']['cursor'].execute('COMMIT') request['env']['pg']['cursor_context_manager'].__exit__() if 'mysql' in config and 'mysql' in request['env'] and not request['env']['mysql']['cursor'].closed: LOG.debug('Committing mysql transaction') await request['env']['mysql']['connection'].commit() await request['env']['mysql']['cursor'].close() request['env']['mysql']['engine'].release(request['env']['mysql']['connection']) finally: if 'redis' in config and 'redis' in request['env'] and not request['env']['redis']['connection'].closed: LOG.debug('Closing redis connection') request['env']['redis']['connection'].close() await request['env']['redis']['connection'].wait_closed() return response return env_middleware api_hour-0.8.2/api_hour/utils.py0000664000175000017500000000006212452752107015702 0ustar lglg00000000000000import logging LOG = logging.getLogger(__name__)api_hour-0.8.2/api_hour/application.py0000664000175000017500000000517112455043010017040 0ustar lglg00000000000000# 
This file is part of API-Hour, forked from gunicorn/app/wsgiapp.py released under the MIT license. import logging import os import sys from gunicorn.app.base import Application as GunicornApp from gunicorn import util from gunicorn.config import Config from .config import get_config class Application(GunicornApp): def init(self, parser, opts, args): if len(args) < 1: parser.error("No application module specified.") self.cfg.set("default_proc_name", args[0]) self.app_uri = args[0] logging.captureWarnings(True) if opts.auto_config: # Default config_dir config_dir = os.path.join(self.cfg.chdir, 'etc', self.app_uri.split(":", 1)[0]) if os.path.exists(config_dir): self.cfg.set('config_dir', config_dir) # Define dev etc folder as default config directory # generate config_dir directly: egg or chicken problem if opts.config_dir: self.cfg.set('config_dir', opts.config_dir) if not opts.config: opts.config = os.path.join(self.cfg.config_dir, 'api_hour/gunicorn_conf.py') if not self.cfg.logconfig: self.cfg.set('logconfig', os.path.join(self.cfg.config_dir, 'api_hour/logging.ini')) else: # To avoid with Gunicorn 19 that the console it's empty when you test if not opts.errorlog: opts.errorlog = '-' if not opts.accesslog: opts.accesslog = '-' def load_default_config(self): # init configuration self.cfg = Config(self.usage, prog=self.prog) self.cfg.set('worker_class', 'api_hour.Worker') # Define api_hour.Worker as default def load_config(self): # parse console args super().load_config() if self.cfg.config_dir: self.config = get_config({'config_dir': self.cfg.config_dir}) else: self.config = None # import ipdb; ipdb.set_trace() def chdir(self): # chdir to the configured path before loading, # default is the current dir os.chdir(self.cfg.chdir) # add the path to sys.path sys.path.insert(0, self.cfg.chdir) def load(self): self.chdir() # load the app return util.import_app(self.app_uri) def run(): """\ The ``api_hour`` command line runner for launching API-Hour with container. 
""" from . import Application Application("%(prog)s [OPTIONS] [APP_MODULE]").run() if __name__ == '__main__': run() api_hour-0.8.2/api_hour/worker.py0000664000175000017500000001325113146644172016062 0ustar lglg00000000000000# This file is part of API-Hour, forked from aiohttp/worker.py. __all__ = ['Worker'] import asyncio import os import signal import sys import gunicorn.workers.base as base try: import aiohttp.web except ImportError: aiohttp = None # from pycallgraph import PyCallGraph # from pycallgraph import Config # from pycallgraph.output import GraphvizOutput class Worker(base.Worker): def __init__(self, *args, **kw): # pragma: no cover super().__init__(*args, **kw) self.handlers = {} self.exit_code = 0 self.container = None self.loop = None def init_process(self): # create new event_loop after fork asyncio.get_event_loop().close() super().init_process() # graphviz = GraphvizOutput() # graphviz.output_file = '/tmp/test.png' # # with PyCallGraph(output=graphviz): # super().init_process() def run(self): self.loop = self.app.callable.make_event_loop(config=self.app.config) asyncio.set_event_loop(self.loop) self._init_signals() self._runner = asyncio.ensure_future(self._run(), loop=self.loop) # import cProfile # prof = cProfile.Profile() # prof.enable() try: self.loop.run_until_complete(self._runner) finally: self.loop.close() # prof.disable() # prof.dump_stats('/tmp/out.pyprof') sys.exit(self.exit_code) async def close(self): if self.handlers: servers = self.handlers self.handlers = None # stop accepting connections self.log.info("Closing %s servers. 
PID: %s", len(servers), self.pid) closing = list() for server, handler in servers.items(): server.close() closing.append(server.wait_closed()) if closing: await asyncio.wait(closing, return_when=asyncio.ALL_COMPLETED, loop=self.loop) self.log.debug('Shutting down') await self.container.shutdown() tasks = [] for handler in servers.values(): if aiohttp and isinstance(handler, aiohttp.web.Server): tasks.append(handler.shutdown(timeout=self.cfg.graceful_timeout / 100 * 80)) if tasks: await asyncio.wait(tasks, loop=self.loop, return_when=asyncio.ALL_COMPLETED) self.log.debug('Cleaning container') await self.container.cleanup() self.log.debug('All server closed') else: await self.container.shutdown() await self.container.cleanup() async def _run(self): self.container = self.app.callable(config=self.app.config, worker=self, loop=self.loop) await self.container.pre_start() if asyncio.iscoroutinefunction(self.container.make_servers): self.handlers = await self.container.make_servers(self.sockets) else: handlers = self.container.make_servers() for i, sock in enumerate(self.sockets): if len(handlers) == 1: handler = handlers[0] else: handler = handlers[i] if asyncio.iscoroutinefunction(handler): self.log.info('Handler "%s" is a coroutine => High-level AsyncIO API', handler) srv = await asyncio.start_server(handler, sock=sock.sock, loop=self.loop) else: self.log.info('Handler "%s" is a function => Low-level AsyncIO API', handler) srv = await self.loop.create_server(handler, sock=sock.sock) self.handlers[srv] = handler await self.container.start() # If our parent changed then we shut down. 
pid = os.getpid() try: while self.alive: self.notify() if pid == os.getpid() and self.ppid != os.getppid(): self.alive = False self.log.info("Parent changed, shutting down: %s", self) else: await asyncio.sleep(1.0, loop=self.loop) except (Exception, BaseException, GeneratorExit, KeyboardInterrupt): pass await self.close() def init_signals(self): # init_signals initialized later in _init_signals because self.loop isn't initialized yet pass def _init_signals(self): # Set up signals through the event loop API. self.loop.add_signal_handler(signal.SIGQUIT, self.handle_quit, signal.SIGQUIT, None) self.loop.add_signal_handler(signal.SIGTERM, self.handle_exit, signal.SIGTERM, None) self.loop.add_signal_handler(signal.SIGINT, self.handle_quit, signal.SIGINT, None) self.loop.add_signal_handler(signal.SIGWINCH, self.handle_winch, signal.SIGWINCH, None) self.loop.add_signal_handler(signal.SIGUSR1, self.handle_usr1, signal.SIGUSR1, None) self.loop.add_signal_handler(signal.SIGABRT, self.handle_abort, signal.SIGABRT, None) # Don't let SIGTERM and SIGUSR1 disturb active requests # by interrupting system calls signal.siginterrupt(signal.SIGTERM, False) signal.siginterrupt(signal.SIGUSR1, False) def handle_quit(self, sig, frame): self.alive = False def handle_abort(self, sig, frame): self.alive = False self.exit_code = 1 api_hour-0.8.2/api_hour/__init__.py0000664000175000017500000000224713201333611016274 0ustar lglg00000000000000import re import sys import collections __author__ = 'Ludovic Gasc (GMLudo)' __email__ = 'git@gmludo.eu' __version__ = '0.8.2' version = __version__ + ' , Python ' + sys.version VersionInfo = collections.namedtuple('VersionInfo', 'major minor micro releaselevel serial') def _parse_version(ver): RE = (r'^(?P<major>\d+)\.(?P<minor>\d+)\.' 
r'(?P<micro>\d+)((?P<releaselevel>[a-z]+)(?P<serial>\d+)?)?$') match = re.match(RE, ver) try: major = int(match.group('major')) minor = int(match.group('minor')) micro = int(match.group('micro')) levels = {'c': 'candidate', 'a': 'alpha', 'b': 'beta', None: 'final'} releaselevel = levels[match.group('releaselevel')] serial = int(match.group('serial')) if match.group('serial') else 0 return VersionInfo(major, minor, micro, releaselevel, serial) except Exception: raise ImportError("Invalid package version {}".format(ver)) version_info = _parse_version(__version__) from .application import Application from .container import Container from .worker import Workerapi_hour-0.8.2/setup.py0000664000175000017500000000501113201333073014061 0ustar lglg00000000000000import os import re import sys from setuptools import setup, find_packages __docformat__ = 'rst' PY_VER = sys.version_info[:3] if PY_VER < (3, 5, 0): PY_VERS = '.'.join(map(str, PY_VER)) raise RuntimeError("api_hour doesn't support Python earlier than 3.5.0, " "current Python version is: %s" % PY_VERS) install_requires = ['gunicorn', 'PyYAML', 'setproctitle'] def read(f): return open(os.path.join(os.path.dirname(__file__), f)).read().strip() def read_version(): regexp = re.compile(r"^__version__\W*=\W*'([\d.abrc]+)'") init_py = os.path.join(os.path.dirname(__file__), 'api_hour', '__init__.py') with open(init_py) as f: for line in f: match = regexp.match(line) if match is not None: return match.group(1) else: raise RuntimeError('Cannot find version in api_hour/__init__.py') classifiers = [ 'Development Status :: 5 - Production/Stable', 'Environment :: No Input/Output (Daemon)', 'Environment :: Web Environment', 'Intended Audience :: Developers', 'License :: OSI Approved :: Apache Software License', 'Natural Language :: English', 'Operating System :: OS Independent', 'Programming Language :: Python', 'Programming Language :: Python :: 3', 'Programming Language :: Python :: 3.5', 'Programming Language :: Python :: 3 :: Only', 'Programming Language :: 
Python :: Implementation :: CPython', 'Topic :: Internet :: WWW/HTTP', 'Topic :: Internet :: WWW/HTTP :: Dynamic Content', 'Topic :: Internet :: WWW/HTTP :: HTTP Servers', 'Topic :: System :: Networking', ] setup(name='api_hour', version=read_version(), description=('Write efficient network daemons (HTTP, SSH...) with ease.'), long_description='\n\n'.join((read('README.rst'), read('HISTORY.rst'))), classifiers=classifiers, platforms=['OS Independent'], author='Eyepea Dev Team', author_email='gmludo@gmail.com', url='http://www.api-hour.io', download_url='https://pypi.python.org/pypi/api_hour', keywords = ['asyncio', 'performance', 'efficient', 'web', 'service', 'rest', 'json', 'daemon', 'application'], license='Apache 2', packages=find_packages(), install_requires=install_requires, # tests_require = tests_require, # test_suite = 'nose.collector', provides=['api_hour'], include_package_data=True, entry_points=""" [console_scripts] api_hour=api_hour.application:run """, ) api_hour-0.8.2/api_hour.egg-info/0000775000175000017500000000000013201335070015651 5ustar lglg00000000000000api_hour-0.8.2/api_hour.egg-info/SOURCES.txt0000664000175000017500000000123613201335070017537 0ustar lglg00000000000000HISTORY.rst LICENSE MANIFEST.in README.rst setup.cfg setup.py api_hour/__init__.py api_hour/application.py api_hour/config.py api_hour/container.py api_hour/utils.py api_hour/worker.py api_hour.egg-info/PKG-INFO api_hour.egg-info/SOURCES.txt api_hour.egg-info/dependency_links.txt api_hour.egg-info/entry_points.txt api_hour.egg-info/requires.txt api_hour.egg-info/top_level.txt api_hour/plugins/__init__.py api_hour/plugins/aiohttp/__init__.py api_hour/plugins/aiohttp/environment.py api_hour/plugins/aiohttp/response.py api_hour/plugins/aiohttp/router.py test/__init__.py test/plugins/__init__.py test/plugins/aiohttp/__init__.py test/plugins/aiohttp/test_serialize.pyapi_hour-0.8.2/api_hour.egg-info/PKG-INFO0000664000175000017500000002615213201335070016754 0ustar 
lglg00000000000000Metadata-Version: 1.1
Name: api-hour
Version: 0.8.2
Summary: Write efficient network daemons (HTTP, SSH...) with ease.
Home-page: http://www.api-hour.io
Author: Eyepea Dev Team
Author-email: gmludo@gmail.com
License: Apache 2
Download-URL: https://pypi.python.org/pypi/api_hour
Description: API Hour
        ========

        API-Hour is a lightweight daemon framework that lets you write powerful applications.

        It was created to answer the need for a simple, robust and super-fast server-side environment to build very efficient daemons with ease. By default, the API-Hour Starter Kit (Cookiecutter) creates an HTTP daemon for you, to develop WebServices.

        With API-Hour, you can quickly convert any AsyncIO server library into a multi-processing daemon, ready for production.

        .. image:: https://raw.githubusercontent.com/Eyepea/API-Hour/master/docs/API-Hour_small.png

        Quick'n'dirty HTTP benchmarks on a kitchen table
        ------------------------------------------------

        .. image:: https://raw.githubusercontent.com/Eyepea/API-Hour/master/propaganda/en/stats.png

        Scale: number of queries handled in 30 seconds, with 400 simultaneous connections.

        Benchmark made on a Dell Precision M6800, between API-Hour and Gunicorn with 16 workers. For details, read the information in `benchmarks `_.

        Where does the magic behind these performances come from?
        '''''''''''''''''''''''''''''''''''''''''''''''''''''''''

        Architecture matters a lot more than tools. We combine asynchronous and multiprocess patterns to handle as many HTTP requests as possible. Ideally, the limitation should be your network card, not your CPU or memory.

        Moreover, we've tried to reduce as much as possible the number of layers between your code and the async sockets. For each layer, we use the best option in terms of performance and simplicity:

        #. `AsyncIO `_: an easy asynchronous framework, directly integrated in Python 3.4+
        #. `aiohttp.web `_: HTTP protocol implementation for AsyncIO + Web framework
        #. `ujson `_: fastest JSON serialization

        Examples
        --------

        #. `API-Hour Starter Kit (Cookiecutter) `_
        #. `API-Hour implementation of TechEmpower Web Framework Benchmarks `_
        #. `HTTP+SSH Daemon `_
        #. `Quick'n'dirty benchmarks on a kitchen table `_

        How to start an API-Hour project?
        ---------------------------------

        You can follow `one of our tutorials `_.

        Support
        -------

        * `Documentation `_
        * `Mailing-list `_

        Requirements
        ------------

        - Python 3.5+

        Install
        -------

        Follow the `official documentation `_.

        License
        -------

        ``API-Hour`` is offered under the Apache 2 license.

        Architecture
        ------------

        ``API-Hour`` is glue between your code and Gunicorn, to launch your code in several processes.

        Origin
        ------

        API-Hour started as a fork of aiorest; it is now based only on Gunicorn for multiprocessing.

        Thanks
        ------

        Thanks to the Gunicorn, aiorest, aiohttp and AsyncIO communities: they did 99.9999% of the job for API-Hour.

        Special thanks to **Andrew Svetlov**, the creator of aiorest.

        Goals of API-Hour
        -----------------

        #. **Fast**: API-Hour is designed from the bottom up to be extremely fast, and capable of handling a huge load. It uses Python 3 and its new powerful AsyncIO package.
        #. **Scalable**: API-Hour is built to be elastic, and easily scalable.
        #. **Lightweight**:

           #. **small codebase**: doing less means being faster: the codebase for processing a request is kept as small as possible. Beyond this base footprint, you can of course activate, preload and initialize more plugins or packages, but that choice is yours.
           #. **flexible setup**: some people have no problem with using many dependencies, while others want none (other than Python). Some people are OK to lose a bit of performance for the ease (and speed) of coding, while others wouldn't sacrifice a millisecond for ready-made functionality. These choices are yours, so there is no mandatory extra layer, plugin or middleware.

        #. **Easy**: API-Hour is meant to be very easy to grasp: no steep learning curve, no mountain of docs to read. Download our turn-key "Hello world" applications, and immediately start coding your own application from there.
        #. **Packages-friendly and friendly-packages**: we try to let you use external packages without the need to rewrite them, adapt them, "wrap" them or embed them in the framework. On the other hand, API-Hour "plugins" are written as much as possible to be usable as stand-alone packages outside the framework, to benefit more people.
        #. **Asynchronous... or not**: if you don't need the extra complexity of building asynchronous code, you don't have to (you'll still enjoy tremendous performance): you can just handle your requests in a traditional synchronous way. On the other hand, if your project does I/O or processing that could benefit from parallelizing tasks, the whole power of AsyncIO, futures, coroutines and tasks is at your fingertips. All provided plugins (in particular, database plugins) are async-ready.

        CHANGES
        =======

        0.8.2 (2017-11-10)
        ------------------

        * Add pre_start coroutine
        * Fix setup.py to correctly check the minimal Python version.
          Thanks @romuald

        0.8.1 (2016-07-08)
        ------------------

        * Drop support for Python 3.3 and 3.4

        0.7.1 (2016-07-08)
        ------------------

        * Merge bugfix from https://github.com/KeepSafe/aiohttp/pull/879

        0.7.0 (2015-05-04)
        ------------------

        * Add HTML serializer plugin
        * Add AsyncIO high-level stream server support (used by the FastAGI implementation of Panoramisk)
        * Now, you can use the make_handler method to connect your handlers directly to your sockets, for more flexibility

        0.6.2 (2015-02-24)
        ------------------

        * You can customize the event loop used, with the make_event_loop() class method in Container

        0.6.1 (2015-02-10)
        ------------------

        * Release a new version because PyPI is bugged: 0.6.0 is broken on PyPI

        0.6.0 (2015-01-13)
        ------------------

        * API-Hour config file is now optional; use -ac to auto-configure your app
        * Add Python 3.3 compatibility, to easily use Python 3 directly from distribution packages
        * Add Debian/Ubuntu package
        * ujson is now optional for aiohttp.web
        * More documentation with tutorials: all-in-one and Starter Kit
        * If the api_hour CLI has no logging file, enable logging on the console by default

        0.5.0 (2015-01-07)
        ------------------

        * Project reboot
        * Change API-Hour's main goal: API-Hour can now multiprocess any AsyncIO server library, not only HTTP
        * API-Hour is now based on Gunicorn
        * Remove the aiorest fork; recommend aiohttp.web for HTTP daemons in the cookiecutter

        0.3.3 (2014-12-19)
        ------------------

        * Static files can be served automatically
        * body, json_body and transport accessible in Request
        * loop accessible in Application
        * Asset serializer accepts an encoding
        * cookiecutter available at https://github.com/Eyepea/cookiecutter-API-Hour
        * Use of ujson
        * Bugfixes

        0.3.2 (2014-10-31)
        ------------------

        * Refactoring and clean-up
        * Publish benchmark server for API-Hour
        * English version of the PyCON-FR presentation about API-Hour
        * Fix response.write_eof() to follow aiohttp changes (thanks aiorest for the patch)

        0.3.1 (2014-10-28)
        ------------------

        * Rename multi_process to arbiter
        * Improve Python packaging

        0.3.0 (2014-10-26)
        ------------------

        * First version of API-Hour, performance-oriented version of aiorest
        * cookiecutter template
        * Serialization support
        * Replace json with ujson
        * Basic multiprocessing

        0.2.4 (2014-09-12)
        ------------------

        * Make loop a keyword-only parameter in the create_session_factory() function

        0.2.3 (2014-08-28)
        ------------------

        * Redis session switched from asyncio_redis to aioredis

        0.2.2 (2014-08-15)
        ------------------

        * Added Pyramid-like matchdict to request (see https://github.com/aio-libs/aiorest/pull/18)
        * Return "400 Bad Request" for incorrect JSON body in POST/PUT methods
        * README fixed
        * Custom response status code (see https://github.com/aio-libs/aiorest/pull/23)

        0.1.1 (2014-07-09)
        ------------------

        * Switched to aiohttp v0.9.0

        0.1.0 (2014-07-07)
        ------------------

        * Basic REST API
Keywords: asyncio,performance,efficient,web,service,rest,json,daemon,application
Platform: OS Independent
Classifier: Development Status :: 5 - Production/Stable
Classifier: Environment :: No Input/Output (Daemon)
Classifier: Environment :: Web Environment
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: Apache Software License
Classifier: Natural Language :: English
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Topic :: Internet :: WWW/HTTP
Classifier: Topic :: Internet :: WWW/HTTP :: Dynamic Content
Classifier: Topic :: Internet :: WWW/HTTP :: HTTP Servers
Classifier: Topic :: System :: Networking
Provides: api_hour
api_hour-0.8.2/api_hour.egg-info/top_level.txt0000664000175000017500000000001613201335070020400 0ustar lglg00000000000000
api_hour
test
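The `_parse_version` helper near the top of `api_hour/__init__.py` (earlier in this archive) turns version strings like the `Version: 0.8.2` field above into a `VersionInfo` tuple. The named groups of its regex were stripped to bare `(?P\d+)` in this copy, so the standalone sketch below is a hedged reconstruction: the group names are inferred from the `match.group(...)` calls, the exact original pattern may differ, and `parse_version`/`VersionInfo` are stand-ins for the package's private helpers.

```python
import re
from collections import namedtuple

VersionInfo = namedtuple('VersionInfo',
                         ['major', 'minor', 'micro', 'releaselevel', 'serial'])

# Assumed reconstruction of the pattern: <major>/<minor>/<micro>/<releaselevel>/
# <serial> are inferred from the match.group(...) calls in __init__.py.
RE = re.compile(r"^(?P<major>\d+)\.(?P<minor>\d+)\.(?P<micro>\d+)"
                r"((?P<releaselevel>[a-z]+)(?P<serial>\d+)?)?$")


def parse_version(ver):
    match = RE.match(ver)
    try:
        major = int(match.group('major'))
        minor = int(match.group('minor'))
        micro = int(match.group('micro'))
        # 'c'/'a'/'b' suffixes map to release levels; no suffix means 'final'.
        levels = {'c': 'candidate', 'a': 'alpha', 'b': 'beta', None: 'final'}
        releaselevel = levels[match.group('releaselevel')]
        serial = int(match.group('serial')) if match.group('serial') else 0
        return VersionInfo(major, minor, micro, releaselevel, serial)
    except Exception:
        raise ImportError("Invalid package version {}".format(ver))


print(parse_version('0.8.2'))    # VersionInfo(major=0, minor=8, micro=2, releaselevel='final', serial=0)
print(parse_version('0.8.2b1'))  # VersionInfo(major=0, minor=8, micro=2, releaselevel='beta', serial=1)
```

Note how a malformed string (e.g. `'0.8'`) makes `match` `None`, so the `match.group` call raises inside the `try` block and surfaces as `ImportError`, failing the package import early.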
api_hour-0.8.2/api_hour.egg-info/dependency_links.txt0000664000175000017500000000000113201335070021717 0ustar lglg00000000000000
api_hour-0.8.2/api_hour.egg-info/requires.txt0000664000175000017500000000003513201335070020247 0ustar lglg00000000000000
gunicorn
PyYAML
setproctitle
api_hour-0.8.2/api_hour.egg-info/entry_points.txt0000664000175000017500000000010713201335070021145 0ustar lglg00000000000000
[console_scripts]
api_hour=api_hour.application:run
api_hour-0.8.2/HISTORY.rst0000664000175000017500000000574513201334056014252 0ustar lglg00000000000000
CHANGES
=======

0.8.2 (2017-11-10)
------------------

* Add pre_start coroutine
* Fix setup.py to correctly check the minimal Python version. Thanks @romuald

0.8.1 (2016-07-08)
------------------

* Drop support for Python 3.3 and 3.4

0.7.1 (2016-07-08)
------------------

* Merge bugfix from https://github.com/KeepSafe/aiohttp/pull/879

0.7.0 (2015-05-04)
------------------

* Add HTML serializer plugin
* Add AsyncIO high-level stream server support (used by the FastAGI implementation of Panoramisk)
* Now, you can use the make_handler method to connect your handlers directly to your sockets, for more flexibility

0.6.2 (2015-02-24)
------------------

* You can customize the event loop used, with the make_event_loop() class method in Container

0.6.1 (2015-02-10)
------------------

* Release a new version because PyPI is bugged: 0.6.0 is broken on PyPI

0.6.0 (2015-01-13)
------------------

* API-Hour config file is now optional; use -ac to auto-configure your app
* Add Python 3.3 compatibility, to easily use Python 3 directly from distribution packages
* Add Debian/Ubuntu package
* ujson is now optional for aiohttp.web
* More documentation with tutorials: all-in-one and Starter Kit
* If the api_hour CLI has no logging file, enable logging on the console by default

0.5.0 (2015-01-07)
------------------

* Project reboot
* Change API-Hour's main goal: API-Hour can now multiprocess any AsyncIO server library, not only HTTP
* API-Hour is now based on Gunicorn
* Remove the aiorest fork; recommend aiohttp.web for HTTP daemons in the cookiecutter

0.3.3 (2014-12-19)
------------------

* Static files can be served automatically
* body, json_body and transport accessible in Request
* loop accessible in Application
* Asset serializer accepts an encoding
* cookiecutter available at https://github.com/Eyepea/cookiecutter-API-Hour
* Use of ujson
* Bugfixes

0.3.2 (2014-10-31)
------------------

* Refactoring and clean-up
* Publish benchmark server for API-Hour
* English version of the PyCON-FR presentation about API-Hour
* Fix response.write_eof() to follow aiohttp changes (thanks aiorest for the patch)

0.3.1 (2014-10-28)
------------------

* Rename multi_process to arbiter
* Improve Python packaging

0.3.0 (2014-10-26)
------------------

* First version of API-Hour, performance-oriented version of aiorest
* cookiecutter template
* Serialization support
* Replace json with ujson
* Basic multiprocessing

0.2.4 (2014-09-12)
------------------

* Make loop a keyword-only parameter in the create_session_factory() function

0.2.3 (2014-08-28)
------------------

* Redis session switched from asyncio_redis to aioredis

0.2.2 (2014-08-15)
------------------

* Added Pyramid-like matchdict to request (see https://github.com/aio-libs/aiorest/pull/18)
* Return "400 Bad Request" for incorrect JSON body in POST/PUT methods
* README fixed
* Custom response status code (see https://github.com/aio-libs/aiorest/pull/23)

0.1.1 (2014-07-09)
------------------

* Switched to aiohttp v0.9.0

0.1.0 (2014-07-07)
------------------

* Basic REST API
api_hour-0.8.2/test/0000775000175000017500000000000013201335070013330 5ustar lglg00000000000000
api_hour-0.8.2/test/plugins/0000775000175000017500000000000013201335070015011 5ustar lglg00000000000000
api_hour-0.8.2/test/plugins/__init__.py0000664000175000017500000000000012477002507017122 0ustar lglg00000000000000
api_hour-0.8.2/test/plugins/aiohttp/0000775000175000017500000000000013201335070016461 5ustar 
lglg00000000000000api_hour-0.8.2/test/plugins/aiohttp/__init__.py0000664000175000017500000000000012477002507020572 0ustar lglg00000000000000
api_hour-0.8.2/test/plugins/aiohttp/test_serialize.py0000664000175000017500000000075712477002507022104 0ustar lglg00000000000000
from api_hour.plugins.aiohttp import JSON
from api_hour.plugins.aiohttp import HTML


def test_json_body():
    x = JSON({'foo': 'bar'})
    assert x._body == b'{"foo":"bar"}'


def test_json_content_type():
    x = JSON({'foo': 'bar'})
    assert x._content_type == 'application/json'


def test_html_body():
    x = HTML('test')
    assert x._body == b'test'


def test_html_content_type():
    x = HTML('test')
    assert x._content_type == 'text/html'
api_hour-0.8.2/test/__init__.py0000664000175000017500000000000012477002507015441 0ustar lglg00000000000000
api_hour-0.8.2/setup.cfg0000664000175000017500000000017413201335070014174 0ustar lglg00000000000000
[metadata]
description-file = README.rst

[wheel]
universal = 1

[egg_info]
tag_build = 
tag_date = 0
tag_svn_revision = 0
api_hour-0.8.2/MANIFEST.in0000664000175000017500000000014212423231114014103 0ustar lglg00000000000000
include LICENSE
include README.rst
include HISTORY.rst
graft api_hour
global-exclude *.pyc *.swp
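The tests in `test_serialize.py` pin down the contract of the `JSON` response plugin: compact, unspaced JSON bytes and an `application/json` content type. A minimal stdlib-only sketch of that behaviour follows; `JSONResponse` here is a hypothetical stand-in, since the real `api_hour.plugins.aiohttp.JSON` presumably builds on an aiohttp response class and prefers `ujson` when it is installed.

```python
import json


class JSONResponse:
    # Hypothetical stand-in for api_hour.plugins.aiohttp.JSON, reproducing
    # only the two attributes the package's tests check.
    def __init__(self, data):
        self._content_type = 'application/json'
        # separators=(',', ':') produces the compact form the tests expect:
        # b'{"foo":"bar"}' rather than b'{"foo": "bar"}'.
        self._body = json.dumps(data, separators=(',', ':')).encode('utf-8')


resp = JSONResponse({'foo': 'bar'})
print(resp._body)          # b'{"foo":"bar"}'
print(resp._content_type)  # application/json
```

The compact separators matter because the test compares serialized bytes exactly; `ujson.dumps` emits this compact form by default, which is why the stdlib sketch must opt into it explicitly.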