pax_global_header00006660000000000000000000000064132236351260014515gustar00rootroot0000000000000052 comment=691151b0975ac5e29059302673a14232e0d6bc8a daphne-1.4.2/000077500000000000000000000000001322363512600127605ustar00rootroot00000000000000daphne-1.4.2/.gitignore000066400000000000000000000001111322363512600147410ustar00rootroot00000000000000*.egg-info *.pyc __pycache__ dist/ build/ /.tox .hypothesis .cache .eggs daphne-1.4.2/.travis.yml000066400000000000000000000002161322363512600150700ustar00rootroot00000000000000sudo: false dist: trusty language: python python: - "2.7" - "3.4" - "3.5" - "3.6" install: pip install tox tox-travis script: tox daphne-1.4.2/CHANGELOG.txt000066400000000000000000000171341322363512600150160ustar00rootroot000000000000001.4.2 (2018-01-05) ------------------ * Bugfix for WebSocket protocol when X-Forwarded-For is turned on. 1.4.1 (2018-01-02) ------------------ * Bugfix for a bad merge of HTTPFactory for X-Forwarded-Proto causing Daphne to not start. 1.4.0 (2018-01-02) ------------------ * The X-Forwarded-Proto header can now be used to pass along protocol from a reverse proxy. * WebSocket headers are now correctly always passed as bytestrings. 1.3.0 (2017-06-16) ------------------ * Ability to set the websocket connection timeout * Server no longer reveals the exact Autobahn version number for security * A few unicode fixes for Python 2/3 compatability * Stopped logging messages to already-closed connections as ERROR 1.2.0 (2017-04-01) ------------------ * The new process-specific channel support is now implemented, resulting in significantly less traffic to your channel backend. * Native twisted blocking support for channel layers that support it is now used. While it is a lot more efficient, it is also sometimes slightly more latent; you can disable it using --force-sync. * Native SSL termination is now correctly reflected in the ASGI-HTTP `scheme` key. * accept: False is now a valid way to deny a connection, as well as close: True. 
* HTTP version is now correctly sent as one of "1.0", "1.1" or "2". * More command line options for websocket timeouts 1.1.0 (2017-03-18) ------------------ * HTTP/2 termination is now supported natively. The Twisted dependency has been increased to at least 17.1 as a result; for more information about setting up HTTP/2, see the README. * X-Forwarded-For decoding support understands IPv6 addresses, and picks the most remote (leftmost) entry if there are multiple relay hosts. * Fixed an error where `disconnect` messages would still try and get sent even if the client never finished a request. 1.0.3 (2017-02-12) ------------------ * IPv6 addresses are correctly accepted as bind targets on the command line * Twisted 17.1 compatability fixes for WebSocket receiving/keepalive and proxy header detection. 1.0.2 (2017-02-01) ------------------ * The "null" WebSocket origin (including file:// and no value) is now accepted by Daphne and passed onto the application to accept/deny. * Listening on file descriptors works properly again. * The DeprecationError caused by not passing endpoints into a Server class directly is now a warning instead. 1.0.1 (2017-01-09) ------------------ * Endpoint unicode strings now work correctly on Python 2 and Python 3 1.0.0 (2017-01-08) ------------------ * BREAKING CHANGE: Daphne now requires acceptance of WebSocket connections before it finishes the socket handshake and relays incoming packets. You must upgrade to at least Channels 1.0.0 as well; see http://channels.readthedocs.io/en/latest/releases/1.0.0.html for more. * http.disconnect now has a `path` key * WebSockets can now be closed with a specific code * X-Forwarded-For header support; defaults to X-Forwarded-For, override with --proxy-headers on the commandline. 
* Twisted endpoint description string support with `-e` on the command line (allowing for SNI/ACME support, among other things) * Logging/error verbosity fixes and access log flushes properly 0.15.0 (2016-08-28) ------------------- * Connections now force-close themselves after pings fail for a certain timeframe, controllable via the new --ping-timeout option. * Badly-formatted websocket response messages now log to console in all situations * Compatability with Twisted 16.3 and up 0.14.3 (2016-07-21) ------------------- * File descriptors can now be passed on the commandline for process managers that pass sockets along like this. * websocket.disconnect messages now come with a "code" attribute matching the WebSocket spec. * A memory leak in request logging has been fixed. 0.14.2 (2016-07-07) ------------------- * Marked as incompatible with twisted 16.3 and above until we work out why it stops incoming websocket messages working 0.14.1 (2016-07-06) ------------------- * Consumption of websocket.receive is also now required. 0.14.0 (2016-07-06) ------------------- * Consumption of websocket.connect is now required (channels 0.16 enforces this); getting backpressure on it now results in the socket being force closed. 0.13.1 (2016-06-28) ------------------- * Bad WebSocket handshakes now return 400 and an error messages rather than 500 with no content. 0.13.0 (2016-06-22) ------------------- * Query strings are now sent as bytestrings and the application is responsible for decoding. Ensure you're running Channels 0.15 or higher. 0.12.2 (2016-06-21) ------------------- * Plus signs in query string are now handled by Daphne, not Django-by-mistake. Ensure you're running Channels 0.14.3 or higher. * New --root-path and DAPHNE_ROOT_PATH options for setting root path. 
0.12.1 (2016-05-18) ------------------- * Fixed bug where a non-ASCII byte in URL paths would crash the HTTP parser without a response; now returns 400, and hardening in place to catch most other errors and return a 500. * WebSocket header format now matches HTTP header format and the ASGI spec. No update needed to channels library, but user code may need updating. 0.12.0 (2016-05-07) ------------------- * Backpressure on http.request now causes incoming requests to drop with 503. Websockets will drop connection/disconnection messages/received frames if backpressure is encountered; options are coming soon to instead drop the connection if this happens. 0.11.4 (2016-05-04) ------------------- * Don't try to send TCP host info in message for unix sockets 0.11.3 (2016-04-27) ------------------- * Don't decode + as a space in URLs 0.11.2 (2016-04-27) ------------------- * Correctly encode all path params for WebSockets 0.11.1 (2016-04-26) ------------------- * Fix bugs with WebSocket path parsing under Python 2 0.11.0 (2016-04-26) ------------------- * HTTP paths and query strings are now pre-decoded before going to ASGI 0.10.3 (2016-04-05) ------------------- * Error on badly formatted websocket reply messages 0.10.2 (2016-04-03) ------------------- * Access logging in NCSAish format now printed to stdout, configurable to another file using --access-log=filename 0.10.1 (2016-03-29) ------------------- * WebSockets now close after they've been open for longer than the channel layer group expiry (86400 seconds by default for most layers). * Binding to UNIX sockets is now possible (use the -u argument) * WebSockets now send keepalive pings if they've had no data for a certain amount of time (20 seconds by default, set with --ping-interval) 0.10.0 (2016-03-21) ------------------- * Multiple cookies are now set correctly * Follows new ASGI single-response-channel spec for ! 
* Follows new ASGI header encoding spec for HTTP 0.9.3 (2016-03-08) ------------------ * WebSocket query strings are correctly encoded 0.9.2 (2016-03-02) ------------------ * HTTP requests now time out after a configurable amount of time and return 503 (default is 2 minutes) 0.9.1 (2016-03-01) ------------------ * Main thread actually idles rather than sitting at 100% * WebSocket packets have an "order" attribute attached * WebSocket upgrade header detection is now case insensitive 0.9 (2016-02-21) ---------------- * Signal handlers can now be disabled if you want to run inside a thread (e.g. inside Django autoreloader) * Logging hooks that can be used to allow calling code to show requests and other events. * Headers are now transmitted for websocket.connect * http.disconnect messages are now sent * Request handling speed significantly improved daphne-1.4.2/LICENSE000066400000000000000000000030201322363512600137600ustar00rootroot00000000000000Copyright (c) Django Software Foundation and individual contributors. All rights reserved. Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 3. Neither the name of Django nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. daphne-1.4.2/Makefile000066400000000000000000000005061322363512600144210ustar00rootroot00000000000000.PHONY: release all: release: ifndef version $(error Please supply a version) endif @echo Releasing version $(version) ifeq (,$(findstring $(version),$(shell git log --oneline -1))) $(error Last commit does not match version) endif git tag $(version) git push git push --tags python setup.py sdist bdist_wheel upload daphne-1.4.2/README.rst000066400000000000000000000124011322363512600144450ustar00rootroot00000000000000daphne ====== .. image:: https://api.travis-ci.org/django/daphne.svg :target: https://travis-ci.org/django/daphne .. image:: https://img.shields.io/pypi/v/daphne.svg :target: https://pypi.python.org/pypi/daphne Daphne is a HTTP, HTTP2 and WebSocket protocol server for `ASGI `_, and developed to power Django Channels. It supports automatic negotiation of protocols; there's no need for URL prefixing to determine WebSocket endpoints versus HTTP endpoints. 
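Daphne is always pointed at an ASGI channel layer instance such as ``django_project.asgi:channel_layer``. For reference, with Channels 1.x that ``asgi.py`` module is typically a small configuration file along these lines (the project and settings-module names here are illustrative, not part of this repository):

```python
# django_project/asgi.py -- minimal Channels 1.x sketch; "django_project" is a placeholder name
import os

from channels.asgi import get_channel_layer

# Point Django at the project settings before building the channel layer
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "django_project.settings")

# The object Daphne is handed on the command line
channel_layer = get_channel_layer()
```

The ``module:attribute`` path to this ``channel_layer`` object is what all of the ``daphne`` invocations below take as their final argument.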
Running ------- Simply point Daphne to your ASGI channel layer instance, and optionally set a bind address and port (defaults to localhost, port 8000):: daphne -b 0.0.0.0 -p 8001 django_project.asgi:channel_layer If you intend to run daphne behind a proxy server you can use UNIX sockets to communicate between the two:: daphne -u /tmp/daphne.sock django_project.asgi:channel_layer If daphne is being run inside a process manager such as `Circus `_ you might want it to bind to a file descriptor passed down from a parent process. To achieve this you can use the --fd flag:: daphne --fd 5 django_project.asgi:channel_layer If you want more control over the port/socket bindings you can fall back to using `twisted's endpoint description strings `_ by using the `--endpoint (-e)` flag, which can be used multiple times. This line would start a SSL server on port 443, assuming that `key.pem` and `crt.pem` exist in the current directory (requires pyopenssl to be installed):: daphne -e ssl:443:privateKey=key.pem:certKey=crt.pem django_project.asgi:channel_layer Endpoints even let you use the ``txacme`` endpoint syntax to get automatic certificates from Let's Encrypt, which you can read more about at http://txacme.readthedocs.io/en/stable/. To see all available command line options run daphne with the *-h* flag. HTTP/2 Support -------------- Daphne 1.1 and above supports terminating HTTP/2 connections natively. You'll need to do a couple of things to get it working, though. First, you need to make sure you install the Twisted ``http2`` and ``tls`` extras:: pip install -U 'Twisted[tls,http2]' Next, because all current browsers only support HTTP/2 when using TLS, you will need to start Daphne with TLS turned on, which can be done using the Twisted endpoint syntax:: daphne -e ssl:443:privateKey=key.pem:certKey=crt.pem django_project.asgi:channel_layer Alternatively, you can use the ``txacme`` endpoint syntax or anything else that enables TLS under the hood. 
You will also need to be on a system that has **OpenSSL 1.0.2 or greater**; if you are using Ubuntu, this means you need at least 16.04. Now, when you start up Daphne, it should tell you this in the log:: 2017-03-18 19:14:02,741 INFO Starting server at ssl:port=8000:privateKey=privkey.pem:certKey=cert.pem, channel layer django_project.asgi:channel_layer. 2017-03-18 19:14:02,742 INFO HTTP/2 support enabled Then, connect with a browser that supports HTTP/2, and everything should be working. It's often hard to tell that HTTP/2 is working, as the log Daphne gives you will be identical (it's HTTP, after all), and most browsers don't make it obvious in their network inspector windows. There are browser extensions that will let you know clearly if it's working or not. Daphne only supports "normal" requests over HTTP/2 at this time; there is not yet support for extended features like Server Push. It will, however, result in much faster connections and lower overheads. If you have a reverse proxy in front of your site to serve static files or similar, HTTP/2 will only work if that proxy understands and passes through the connection correctly. Root Path (SCRIPT_NAME) ----------------------- In order to set the root path for Daphne, which is the equivalent of the WSGI ``SCRIPT_NAME`` setting, you have two options: * Pass a header value ``Daphne-Root-Path``, with the desired root path as a URLencoded ASCII value. This header will not be passed down to applications. * Set the ``--root-path`` commandline option with the desired root path as a URLencoded ASCII value. The header takes precedence if both are set. As with ``SCRIPT_ALIAS``, the value should start with a slash, but not end with one; for example:: daphne --root-path=/forum django_project.asgi:channel_layer Dependencies ------------ All Channels projects currently support Python 2.7, 3.4 and 3.5. `daphne` requires Twisted 17.1 or greater. Contributing ------------ Please refer to the `main Channels contributing docs `_. 
That also contains advice on how to set up the development environment and run the tests. Maintenance and Security ------------------------ To report security issues, please contact security@djangoproject.com. For GPG signatures and more security process information, see https://docs.djangoproject.com/en/dev/internals/security/. To report bugs or request new features, please open a new GitHub issue. This repository is part of the Channels project. For the shepherd and maintenance team, please see the `main Channels readme `_. daphne-1.4.2/daphne/000077500000000000000000000000001322363512600142175ustar00rootroot00000000000000daphne-1.4.2/daphne/__init__.py000077500000000000000000000000261322363512600163310ustar00rootroot00000000000000__version__ = "1.4.2" daphne-1.4.2/daphne/access.py000066400000000000000000000045051322363512600160360ustar00rootroot00000000000000import datetime class AccessLogGenerator(object): """ Object that implements the Daphne "action logger" internal interface in order to provide an access log in something resembling NCSA format. """ def __init__(self, stream): self.stream = stream def __call__(self, protocol, action, details): """ Called when an action happens; use it to generate log entries. 
""" # HTTP requests if protocol == "http" and action == "complete": self.write_entry( host=details['client'], date=datetime.datetime.now(), request="%(method)s %(path)s" % details, status=details['status'], length=details['size'], ) # Websocket requests elif protocol == "websocket" and action == "connecting": self.write_entry( host=details['client'], date=datetime.datetime.now(), request="WSCONNECTING %(path)s" % details, ) elif protocol == "websocket" and action == "rejected": self.write_entry( host=details['client'], date=datetime.datetime.now(), request="WSREJECT %(path)s" % details, ) elif protocol == "websocket" and action == "connected": self.write_entry( host=details['client'], date=datetime.datetime.now(), request="WSCONNECT %(path)s" % details, ) elif protocol == "websocket" and action == "disconnected": self.write_entry( host=details['client'], date=datetime.datetime.now(), request="WSDISCONNECT %(path)s" % details, ) def write_entry(self, host, date, request, status=None, length=None, ident=None, user=None): """ Writes an NCSA-style entry to the log file (some liberty is taken with what the entries are for non-HTTP) """ self.stream.write( "%s %s %s [%s] \"%s\" %s %s\n" % ( host, ident or "-", user or "-", date.strftime("%d/%b/%Y:%H:%M:%S"), request, status or "-", length or "-", ) ) daphne-1.4.2/daphne/cli.py000077500000000000000000000163771322363512600153610ustar00rootroot00000000000000import sys import argparse import logging import importlib from .server import Server, build_endpoint_description_strings from .access import AccessLogGenerator logger = logging.getLogger(__name__) DEFAULT_HOST = '127.0.0.1' DEFAULT_PORT = 8000 class CommandLineInterface(object): """ Acts as the main CLI entry point for running the server. 
""" description = "Django HTTP/WebSocket server" def __init__(self): self.parser = argparse.ArgumentParser( description=self.description, ) self.parser.add_argument( '-p', '--port', type=int, help='Port number to listen on', default=None, ) self.parser.add_argument( '-b', '--bind', dest='host', help='The host/address to bind to', default=None, ) self.parser.add_argument( '--websocket_timeout', type=int, help='Maximum time to allow a websocket to be connected. -1 for infinite.', default=None, ) self.parser.add_argument( '--websocket_connect_timeout', type=int, help='Maximum time to allow a connection to handshake. -1 for infinite', default=5, ) self.parser.add_argument( '-u', '--unix-socket', dest='unix_socket', help='Bind to a UNIX socket rather than a TCP host/port', default=None, ) self.parser.add_argument( '--fd', type=int, dest='file_descriptor', help='Bind to a file descriptor rather than a TCP host/port or named unix socket', default=None, ) self.parser.add_argument( '-e', '--endpoint', dest='socket_strings', action='append', help='Use raw server strings passed directly to twisted', default=[], ) self.parser.add_argument( '-v', '--verbosity', type=int, help='How verbose to make the output', default=1, ) self.parser.add_argument( '-t', '--http-timeout', type=int, help='How long to wait for worker server before timing out HTTP connections', default=120, ) self.parser.add_argument( '--access-log', help='Where to write the access log (- for stdout, the default for verbosity=1)', default=None, ) self.parser.add_argument( '--ping-interval', type=int, help='The number of seconds a WebSocket must be idle before a keepalive ping is sent', default=20, ) self.parser.add_argument( '--ping-timeout', type=int, help='The number of seconds before a WebSocket is closed if no response to a keepalive ping', default=30, ) self.parser.add_argument( '--ws-protocol', nargs='*', dest='ws_protocols', help='The WebSocket protocols you wish to support', default=None, ) 
self.parser.add_argument( '--root-path', dest='root_path', help='The setting for the ASGI root_path variable', default="", ) self.parser.add_argument( '--proxy-headers', dest='proxy_headers', help='Enable parsing and using of X-Forwarded-For and X-Forwarded-Port headers and using that as the ' 'client address', default=False, action='store_true', ) self.parser.add_argument( '--force-sync', dest='force_sync', action='store_true', help='Force the server to use synchronous mode on its ASGI channel layer', default=False, ) self.parser.add_argument( 'channel_layer', help='The ASGI channel layer instance to use as path.to.module:instance.path', ) self.server = None @classmethod def entrypoint(cls): """ Main entrypoint for external starts. """ cls().run(sys.argv[1:]) def run(self, args): """ Pass in raw argument list and it will decode them and run the server. """ # Decode args args = self.parser.parse_args(args) # Set up logging logging.basicConfig( level={ 0: logging.WARN, 1: logging.INFO, 2: logging.DEBUG, }[args.verbosity], format="%(asctime)-15s %(levelname)-8s %(message)s", ) # If verbosity is 1 or greater, or they told us explicitly, set up access log access_log_stream = None if args.access_log: if args.access_log == "-": access_log_stream = sys.stdout else: access_log_stream = open(args.access_log, "a", 1) elif args.verbosity >= 1: access_log_stream = sys.stdout # Import channel layer sys.path.insert(0, ".") module_path, object_path = args.channel_layer.split(":", 1) channel_layer = importlib.import_module(module_path) for bit in object_path.split("."): channel_layer = getattr(channel_layer, bit) if not any([args.host, args.port, args.unix_socket, args.file_descriptor, args.socket_strings]): # no advanced binding options passed, patch in defaults args.host = DEFAULT_HOST args.port = DEFAULT_PORT elif args.host and not args.port: args.port = DEFAULT_PORT elif args.port and not args.host: args.host = DEFAULT_HOST # build endpoint description strings from (optional) 
cli arguments endpoints = build_endpoint_description_strings( host=args.host, port=args.port, unix_socket=args.unix_socket, file_descriptor=args.file_descriptor ) endpoints = sorted( args.socket_strings + endpoints ) logger.info( 'Starting server at %s, channel layer %s.' % (', '.join(endpoints), args.channel_layer) ) self.server = Server( channel_layer=channel_layer, endpoints=endpoints, http_timeout=args.http_timeout, ping_interval=args.ping_interval, ping_timeout=args.ping_timeout, websocket_timeout=args.websocket_timeout, websocket_connect_timeout=args.websocket_connect_timeout, action_logger=AccessLogGenerator(access_log_stream) if access_log_stream else None, ws_protocols=args.ws_protocols, root_path=args.root_path, verbosity=args.verbosity, proxy_forwarded_address_header='X-Forwarded-For' if args.proxy_headers else None, proxy_forwarded_port_header='X-Forwarded-Port' if args.proxy_headers else None, proxy_forwarded_proto_header='X-Forwarded-Proto' if args.proxy_headers else None, force_sync=args.force_sync, ) self.server.run() daphne-1.4.2/daphne/http_protocol.py000077500000000000000000000465371322363512600175130ustar00rootroot00000000000000from __future__ import unicode_literals import logging import random import six import string import time import traceback from zope.interface import implementer from six.moves.urllib_parse import unquote, unquote_plus from twisted.internet.interfaces import IProtocolNegotiationFactory from twisted.protocols.policies import ProtocolWrapper from twisted.web import http from .utils import parse_x_forwarded_for from .ws_protocol import WebSocketProtocol, WebSocketFactory logger = logging.getLogger(__name__) class WebRequest(http.Request): """ Request that either hands off information to channels, or offloads to a WebSocket class. Does some extra processing over the normal Twisted Web request to separate GET and POST out. """ error_template = """ %(title)s

<h1>%(title)s</h1>

<p>%(body)s</p>

""".replace("\n", "").replace(" ", " ").replace(" ", " ").replace(" ", " ") # Shorten it a bit, bytes wise def __init__(self, *args, **kwargs): try: http.Request.__init__(self, *args, **kwargs) # Easy factory link self.factory = self.channel.factory # Make a name for our reply channel self.reply_channel = self.factory.make_send_channel() # Tell factory we're that channel's client self.last_keepalive = time.time() self.factory.reply_protocols[self.reply_channel] = self self._got_response_start = False except Exception: logger.error(traceback.format_exc()) raise def process(self): try: self.request_start = time.time() # Get upgrade header upgrade_header = None if self.requestHeaders.hasHeader(b"Upgrade"): upgrade_header = self.requestHeaders.getRawHeaders(b"Upgrade")[0] # Get client address if possible if hasattr(self.client, "host") and hasattr(self.client, "port"): # client.host and host.host are byte strings in Python 2, but spec # requires unicode string. self.client_addr = [six.text_type(self.client.host), self.client.port] self.server_addr = [six.text_type(self.host.host), self.host.port] else: self.client_addr = None self.server_addr = None self.client_scheme = 'https' if self.isSecure() else 'http' if self.factory.proxy_forwarded_address_header: self.client_addr, self.client_scheme = parse_x_forwarded_for( self.requestHeaders, self.factory.proxy_forwarded_address_header, self.factory.proxy_forwarded_port_header, self.factory.proxy_forwarded_proto_header, self.client_addr, self.client_scheme ) # Check for unicodeish path (or it'll crash when trying to parse) try: self.path.decode("ascii") except UnicodeDecodeError: self.path = b"/" self.basic_error(400, b"Bad Request", "Invalid characters in path") return # Calculate query string self.query_string = b"" if b"?" in self.uri: self.query_string = self.uri.split(b"?", 1)[1] # Is it WebSocket? IS IT?! 
if upgrade_header and upgrade_header.lower() == b"websocket": # Make WebSocket protocol to hand off to protocol = self.factory.ws_factory.buildProtocol(self.transport.getPeer()) if not protocol: # If protocol creation fails, we signal "internal server error" self.setResponseCode(500) logger.warn("Could not make WebSocket protocol") self.finish() # Give it the raw query string protocol._raw_query_string = self.query_string # Port across transport protocol.set_main_factory(self.factory) transport, self.transport = self.transport, None if isinstance(transport, ProtocolWrapper): # i.e. TLS is a wrapping protocol transport.wrappedProtocol = protocol else: transport.protocol = protocol protocol.makeConnection(transport) # Re-inject request data = self.method + b' ' + self.uri + b' HTTP/1.1\x0d\x0a' for h in self.requestHeaders.getAllRawHeaders(): data += h[0] + b': ' + b",".join(h[1]) + b'\x0d\x0a' data += b"\x0d\x0a" data += self.content.read() protocol.dataReceived(data) # Remove our HTTP reply channel association if hasattr(protocol, "reply_channel"): logger.debug("Upgraded connection %s to WebSocket %s", self.reply_channel, protocol.reply_channel) else: logger.debug("Connection %s did not get successful WS handshake.", self.reply_channel) del self.factory.reply_protocols[self.reply_channel] self.reply_channel = None # Resume the producer so we keep getting data, if it's available as a method # 17.1 version if hasattr(self.channel, "_networkProducer"): self.channel._networkProducer.resumeProducing() # 16.x version elif hasattr(self.channel, "resumeProducing"): self.channel.resumeProducing() # Boring old HTTP. 
else: # Sanitize and decode headers, potentially extracting root path self.clean_headers = [] self.root_path = self.factory.root_path for name, values in self.requestHeaders.getAllRawHeaders(): # Prevent CVE-2015-0219 if b"_" in name: continue for value in values: if name.lower() == b"daphne-root-path": self.root_path = self.unquote(value) else: self.clean_headers.append((name.lower(), value)) logger.debug("HTTP %s request for %s", self.method, self.reply_channel) self.content.seek(0, 0) # Send message try: self.factory.channel_layer.send("http.request", { "reply_channel": self.reply_channel, # TODO: Correctly say if it's 1.1 or 1.0 "http_version": self.clientproto.split(b"/")[-1].decode("ascii"), "method": self.method.decode("ascii"), "path": self.unquote(self.path), "root_path": self.root_path, "scheme": self.client_scheme, "query_string": self.query_string, "headers": self.clean_headers, "body": self.content.read(), "client": self.client_addr, "server": self.server_addr, }) except self.factory.channel_layer.ChannelFull: # Channel is too full; reject request with 503 self.basic_error(503, b"Service Unavailable", "Request queue full.") except Exception: logger.error(traceback.format_exc()) self.basic_error(500, b"Internal Server Error", "HTTP processing error") @classmethod def unquote(cls, value, plus_as_space=False): """ Python 2 and 3 compat layer for utf-8 unquoting """ if six.PY2: if plus_as_space: return unquote_plus(value).decode("utf8") else: return unquote(value).decode("utf8") else: if plus_as_space: return unquote_plus(value.decode("ascii")) else: return unquote(value.decode("ascii")) def send_disconnect(self): """ Sends a disconnect message on the http.disconnect channel. Useful only really for long-polling. """ # If we don't yet have a path, then don't send as we never opened. 
        if self.path:
            try:
                self.factory.channel_layer.send("http.disconnect", {
                    "reply_channel": self.reply_channel,
                    "path": self.unquote(self.path),
                })
            except self.factory.channel_layer.ChannelFull:
                pass

    def connectionLost(self, reason):
        """
        Cleans up reply channel on close.
        """
        if self.reply_channel and self.reply_channel in self.channel.factory.reply_protocols:
            self.send_disconnect()
            del self.channel.factory.reply_protocols[self.reply_channel]
        logger.debug("HTTP disconnect for %s", self.reply_channel)
        http.Request.connectionLost(self, reason)

    def finish(self):
        """
        Cleans up reply channel on close.
        """
        if self.reply_channel and self.reply_channel in self.channel.factory.reply_protocols:
            self.send_disconnect()
            del self.channel.factory.reply_protocols[self.reply_channel]
        logger.debug("HTTP close for %s", self.reply_channel)
        http.Request.finish(self)

    def serverResponse(self, message):
        """
        Writes a received HTTP response back out to the transport.
        """
        if not self._got_response_start:
            self._got_response_start = True
            if 'status' not in message:
                raise ValueError("Specifying a status code is required for a Response message.")
            # Set HTTP status code
            self.setResponseCode(message['status'])
            # Write headers
            for header, value in message.get("headers", {}):
                # Shim code from old ASGI version, can be removed after a while
                if isinstance(header, six.text_type):
                    header = header.encode("latin1")
                self.responseHeaders.addRawHeader(header, value)
            logger.debug("HTTP %s response started for %s", message['status'], self.reply_channel)
        else:
            if 'status' in message:
                raise ValueError("Got multiple Response messages for %s!" % self.reply_channel)

        # Write out body
        http.Request.write(self, message.get('content', b''))

        # End if there's no more content
        if not message.get("more_content", False):
            self.finish()
            logger.debug("HTTP response complete for %s", self.reply_channel)
            try:
                self.factory.log_action("http", "complete", {
                    "path": self.uri.decode("ascii"),
                    "status": self.code,
                    "method": self.method.decode("ascii"),
                    "client": "%s:%s" % tuple(self.client_addr) if self.client_addr else None,
                    "time_taken": self.duration(),
                    "size": self.sentLength,
                })
            except Exception as e:
                logging.error(traceback.format_exc())
        else:
            logger.debug("HTTP response chunk for %s", self.reply_channel)

    def duration(self):
        """
        Returns the time since the start of the request.
        """
        if not hasattr(self, "request_start"):
            return 0
        return time.time() - self.request_start

    def basic_error(self, status, status_text, body):
        """
        Responds with a server-level error page (very basic)
        """
        self.serverResponse({
            "status": status,
            "status_text": status_text,
            "headers": [
                (b"Content-Type", b"text/html; charset=utf-8"),
            ],
            "content": (self.error_template % {
                "title": six.text_type(status) + " " + status_text.decode("ascii"),
                "body": body,
            }).encode("utf8"),
        })


@implementer(IProtocolNegotiationFactory)
class HTTPFactory(http.HTTPFactory):
    """
    Factory which takes care of tracking which protocol instances or request
    instances are responsible for which named response channels, so incoming
    messages can be routed appropriately.
    """

    def __init__(self, channel_layer, action_logger=None, send_channel=None, timeout=120,
                 websocket_timeout=86400, ping_interval=20, ping_timeout=30, ws_protocols=None,
                 root_path="", websocket_connect_timeout=30, proxy_forwarded_address_header=None,
                 proxy_forwarded_port_header=None, proxy_forwarded_proto_header=None,
                 websocket_handshake_timeout=5):
        http.HTTPFactory.__init__(self)
        self.channel_layer = channel_layer
        self.action_logger = action_logger
        self.send_channel = send_channel
        assert self.send_channel is not None
        self.timeout = timeout
        self.websocket_timeout = websocket_timeout
        self.websocket_connect_timeout = websocket_connect_timeout
        self.ping_interval = ping_interval
        self.proxy_forwarded_address_header = proxy_forwarded_address_header
        self.proxy_forwarded_port_header = proxy_forwarded_port_header
        self.proxy_forwarded_proto_header = proxy_forwarded_proto_header
        # We track all sub-protocols for response channel mapping
        self.reply_protocols = {}
        # Make a factory for WebSocket protocols
        self.ws_factory = WebSocketFactory(self, protocols=ws_protocols, server='Daphne')
        self.ws_factory.setProtocolOptions(
            autoPingTimeout=ping_timeout,
            allowNullOrigin=True,
            openHandshakeTimeout=websocket_handshake_timeout
        )
        self.ws_factory.protocol = WebSocketProtocol
        self.ws_factory.reply_protocols = self.reply_protocols
        self.root_path = root_path

    def buildProtocol(self, addr):
        """
        Builds protocol instances. This override is used to ensure we use our
        own Request object instead of the default.
        """
        try:
            protocol = http.HTTPFactory.buildProtocol(self, addr)
            protocol.requestFactory = WebRequest
            return protocol
        except Exception as e:
            logger.error("Cannot build protocol: %s" % traceback.format_exc())
            raise

    def make_send_channel(self):
        """
        Makes a new send channel for a protocol with our process prefix.
        """
        protocol_id = "".join(random.choice(string.ascii_letters) for i in range(10))
        return self.send_channel + protocol_id

    def reply_channels(self):
        return self.reply_protocols.keys()

    def dispatch_reply(self, channel, message):
        # If we don't know about the channel, ignore it (likely a channel we
        # used to have that's now in a group).
        # TODO: Find a better way of alerting people when this happens so
        # they can do more cleanup, that's not an error.
        if channel not in self.reply_protocols:
            logger.debug("Message on unknown channel %r - discarding" % channel)
            return

        if isinstance(self.reply_protocols[channel], WebRequest):
            self.reply_protocols[channel].serverResponse(message)
        elif isinstance(self.reply_protocols[channel], WebSocketProtocol):
            # Switch depending on current socket state
            protocol = self.reply_protocols[channel]
            # See if the message is valid
            unknown_keys = set(message.keys()) - {"bytes", "text", "close", "accept"}
            if unknown_keys:
                raise ValueError(
                    "Got invalid WebSocket reply message on %s - "
                    "contains unknown keys %s (looking for either {'accept', 'text', 'bytes', 'close'})" % (
                        channel,
                        unknown_keys,
                    )
                )
            # Accepts allow bytes/text afterwards
            if message.get("accept", None) and protocol.state == protocol.STATE_CONNECTING:
                protocol.serverAccept()
            # Rejections must be the only thing
            if message.get("accept", None) == False and protocol.state == protocol.STATE_CONNECTING:
                protocol.serverReject()
                return
            # You're only allowed one of bytes or text
            if message.get("bytes", None) and message.get("text", None):
                raise ValueError(
                    "Got invalid WebSocket reply message on %s - contains both bytes and text keys" % (
                        channel,
                    )
                )
            if message.get("bytes", None):
                protocol.serverSend(message["bytes"], True)
            if message.get("text", None):
                protocol.serverSend(message["text"], False)
            closing_code = message.get("close", False)
            if closing_code:
                if protocol.state == protocol.STATE_CONNECTING:
                    protocol.serverReject()
                else:
                    protocol.serverClose(code=closing_code)
        else:
            raise ValueError("Unknown protocol class")

    def log_action(self, protocol, action, details):
        """
        Dispatches to any registered action logger, if there is one.
        """
        if self.action_logger:
            self.action_logger(protocol, action, details)

    def check_timeouts(self):
        """
        Runs through all HTTP protocol instances and times them out if they've
        taken too long (and so their message is probably expired)
        """
        for protocol in list(self.reply_protocols.values()):
            # Web timeout checking
            if isinstance(protocol, WebRequest) and protocol.duration() > self.timeout:
                protocol.basic_error(503, b"Service Unavailable", "Worker server failed to respond within time limit.")
            # WebSocket timeout checking and keepalive ping sending
            elif isinstance(protocol, WebSocketProtocol):
                # Timeout check
                if protocol.duration() > self.websocket_timeout and self.websocket_timeout >= 0:
                    protocol.serverClose()
                # Ping check
                else:
                    protocol.check_ping()

    # IProtocolNegotiationFactory
    def acceptableProtocols(self):
        """
        Protocols this server can speak after ALPN negotiation.

        Currently that is HTTP/1.1 and optionally HTTP/2. Websockets cannot be
        negotiated using ALPN, so that doesn't go here: anyone wanting
        websockets will negotiate HTTP/1.1 and then do the upgrade dance.
        """
        baseProtocols = [b'http/1.1']

        if http.H2_ENABLED:
            baseProtocols.insert(0, b'h2')

        return baseProtocols

daphne-1.4.2/daphne/server.py

from __future__ import unicode_literals

import logging
import random
import string
import warnings

from twisted.internet import reactor, defer
from twisted.internet.endpoints import serverFromString
from twisted.logger import globalLogBeginner, STDLibLogObserver
from twisted.web import http

from .http_protocol import HTTPFactory

logger = logging.getLogger(__name__)


class Server(object):

    def __init__(
        self,
        channel_layer,
        host=None,
        port=None,
        endpoints=None,
        unix_socket=None,
        file_descriptor=None,
        signal_handlers=True,
        action_logger=None,
        http_timeout=120,
        websocket_timeout=None,
        websocket_connect_timeout=20,
        ping_interval=20,
        ping_timeout=30,
        ws_protocols=None,
        root_path="",
        proxy_forwarded_address_header=None,
        proxy_forwarded_port_header=None,
        proxy_forwarded_proto_header=None,
        force_sync=False,
        verbosity=1,
        websocket_handshake_timeout=5
    ):
        self.channel_layer = channel_layer
        self.endpoints = endpoints or []

        if any([host, port, unix_socket, file_descriptor]):
            warnings.warn('''
                The host/port/unix_socket/file_descriptor keyword arguments to %s are deprecated.
            ''' % self.__class__.__name__, DeprecationWarning)
            # build endpoint description strings from deprecated kwargs
            self.endpoints = sorted(self.endpoints + build_endpoint_description_strings(
                host=host,
                port=port,
                unix_socket=unix_socket,
                file_descriptor=file_descriptor
            ))

        if len(self.endpoints) == 0:
            raise UserWarning("No endpoints. This server will not listen on anything.")

        self.listeners = []
        self.signal_handlers = signal_handlers
        self.action_logger = action_logger
        self.http_timeout = http_timeout
        self.ping_interval = ping_interval
        self.ping_timeout = ping_timeout
        self.proxy_forwarded_address_header = proxy_forwarded_address_header
        self.proxy_forwarded_port_header = proxy_forwarded_port_header
        self.proxy_forwarded_proto_header = proxy_forwarded_proto_header
        # If they did not provide a websocket timeout, default it to the
        # channel layer's group_expiry value if present, or one day if not.
        self.websocket_timeout = websocket_timeout or getattr(channel_layer, "group_expiry", 86400)
        self.websocket_connect_timeout = websocket_connect_timeout
        self.websocket_handshake_timeout = websocket_handshake_timeout
        self.ws_protocols = ws_protocols
        self.root_path = root_path
        self.force_sync = force_sync
        self.verbosity = verbosity

    def run(self):
        # Create process-local channel prefixes
        # TODO: Can we guarantee non-collision better?
        process_id = "".join(random.choice(string.ascii_letters) for i in range(10))
        self.send_channel = "daphne.response.%s!" % process_id
        # Make the factory
        self.factory = HTTPFactory(
            self.channel_layer,
            action_logger=self.action_logger,
            send_channel=self.send_channel,
            timeout=self.http_timeout,
            websocket_timeout=self.websocket_timeout,
            websocket_connect_timeout=self.websocket_connect_timeout,
            ping_interval=self.ping_interval,
            ping_timeout=self.ping_timeout,
            ws_protocols=self.ws_protocols,
            root_path=self.root_path,
            proxy_forwarded_address_header=self.proxy_forwarded_address_header,
            proxy_forwarded_port_header=self.proxy_forwarded_port_header,
            proxy_forwarded_proto_header=self.proxy_forwarded_proto_header,
            websocket_handshake_timeout=self.websocket_handshake_timeout
        )
        if self.verbosity <= 1:
            # Redirect the Twisted log to nowhere
            globalLogBeginner.beginLoggingTo([lambda _: None], redirectStandardIO=False, discardBuffer=True)
        else:
            globalLogBeginner.beginLoggingTo([STDLibLogObserver(__name__)])

        # Detect what Twisted features are enabled
        if http.H2_ENABLED:
            logger.info("HTTP/2 support enabled")
        else:
            logger.info("HTTP/2 support not enabled (install the http2 and tls Twisted extras)")

        if "twisted" in self.channel_layer.extensions and not self.force_sync:
            logger.info("Using native Twisted mode on channel layer")
            reactor.callLater(0, self.backend_reader_twisted)
        else:
            logger.info("Using busy-loop synchronous mode on channel layer")
            reactor.callLater(0, self.backend_reader_sync)
        reactor.callLater(2, self.timeout_checker)

        for socket_description in self.endpoints:
            logger.info("Listening on endpoint %s" % socket_description)
            # Twisted requires str on python2 (not unicode) and str on python3 (not bytes)
            ep = serverFromString(reactor, str(socket_description))
            listener = ep.listen(self.factory)
            listener.addErrback(self.on_listener_error)
            self.listeners.append(listener)

        reactor.run(installSignalHandlers=self.signal_handlers)

    def on_listener_error(self, failure):
        """
        Callback function used to process interface listener errors.
        """
        logger.error(failure.getErrorMessage())

    def backend_reader_sync(self):
        """
        Runs as an as-often-as-possible task with the reactor, unless there was
        no result previously in which case we add a small delay.
        """
        channels = [self.send_channel]
        delay = 0
        # Quit if reactor is stopping
        if not reactor.running:
            logger.debug("Backend reader quitting due to reactor stop")
            return
        # Try to receive a message
        try:
            channel, message = self.channel_layer.receive(channels, block=False)
        except Exception as e:
            # Log the error and wait a bit to retry
            logger.error('Error trying to receive messages: %s' % e)
            delay = 5.00
        else:
            if channel:
                # Deal with the message
                try:
                    self.factory.dispatch_reply(channel, message)
                except Exception as e:
                    logger.error("HTTP/WS send decode error: %s" % e)
            else:
                # If there's no messages, idle a little bit.
                delay = 0.05
        # We can't loop inside here as this is synchronous code.
        reactor.callLater(delay, self.backend_reader_sync)

    @defer.inlineCallbacks
    def backend_reader_twisted(self):
        """
        Runs as an as-often-as-possible task with the reactor, unless there was
        no result previously in which case we add a small delay.
        """
        channels = [self.send_channel]
        while True:
            if not reactor.running:
                logging.debug("Backend reader quitting due to reactor stop")
                return
            try:
                channel, message = yield self.channel_layer.receive_twisted(channels)
            except Exception as e:
                logger.error('Error trying to receive messages: %s' % e)
                yield self.sleep(5.00)
            else:
                # Deal with the message
                if channel:
                    try:
                        self.factory.dispatch_reply(channel, message)
                    except Exception as e:
                        logger.error("HTTP/WS send decode error: %s" % e)

    def sleep(self, delay):
        d = defer.Deferred()
        reactor.callLater(delay, d.callback, None)
        return d

    def timeout_checker(self):
        """
        Called periodically to enforce timeout rules on all connections.
        Also checks pings at the same time.
        """
        self.factory.check_timeouts()
        reactor.callLater(2, self.timeout_checker)


def build_endpoint_description_strings(
    host=None,
    port=None,
    unix_socket=None,
    file_descriptor=None
):
    """
    Build a list of twisted endpoint description strings that the server will listen on.
    This is to streamline the generation of twisted endpoint description strings from
    easier-to-use command line args such as host, port, unix sockets etc.
    """
    socket_descriptions = []
    if host and port:
        host = host.strip('[]').replace(':', '\:')
        socket_descriptions.append('tcp:port=%d:interface=%s' % (int(port), host))
    elif any([host, port]):
        raise ValueError('TCP binding requires both port and host kwargs.')

    if unix_socket:
        socket_descriptions.append('unix:%s' % unix_socket)

    if file_descriptor is not None:
        socket_descriptions.append('fd:fileno=%d' % int(file_descriptor))

    return socket_descriptions

daphne-1.4.2/daphne/tests/__init__.py

from hypothesis import HealthCheck, settings

settings.register_profile(
    'daphne',
    settings(suppress_health_check=[HealthCheck.too_slow]),
)
settings.load_profile('daphne')

daphne-1.4.2/daphne/tests/asgi.py

# coding=utf-8
channel_layer = {}

daphne-1.4.2/daphne/tests/factories.py

from __future__ import unicode_literals

import six
from six.moves.urllib import parse

from asgiref.inmemory import ChannelLayer
from twisted.test import proto_helpers

from daphne.http_protocol import HTTPFactory


def message_for_request(method, path, params=None, headers=None, body=None):
    """
    Constructs a HTTP request according to the given parameters, runs
    that through daphne and returns the emitted channel message.
""" request = _build_request(method, path, params, headers, body) message, factory, transport = _run_through_daphne(request, 'http.request') return message def response_for_message(message): """ Returns the raw HTTP response that Daphne constructs when sending a reply to a HTTP request. The current approach actually first builds a HTTP request (similar to message_for_request) because we need a valid reply channel. I'm sure this can be streamlined, but it works for now. """ request = _build_request('GET', '/') request_message, factory, transport = _run_through_daphne(request, 'http.request') factory.dispatch_reply(request_message['reply_channel'], message) return transport.value() def _build_request(method, path, params=None, headers=None, body=None): """ Takes request parameters and returns a byte string of a valid HTTP/1.1 request. We really shouldn't manually build a HTTP request, and instead try to capture what e.g. urllib or requests would do. But that is non-trivial, so meanwhile we hope that our request building doesn't mask any errors. This code is messy, because urllib behaves rather different between Python 2 and 3. Readability is further obstructed by the fact that Python 3.4 doesn't support % formatting for bytes, so we need to concat everything. If we run into more issues with this, the python-future library has a backport of Python 3's urllib. :param method: ASCII string of HTTP method. :param path: unicode string of URL path. :param params: List of two-tuples of bytestrings, ready for consumption for urlencode. Encode to utf8 if necessary. :param headers: List of two-tuples ASCII strings of HTTP header, value. :param body: ASCII string of request body. ASCII string is short for a unicode string containing only ASCII characters, or a byte string with ASCII encoding. """ if headers is None: headers = [] else: headers = headers[:] if six.PY3: quoted_path = parse.quote(path) if params: quoted_path += '?' 
+ parse.urlencode(params) quoted_path = quoted_path.encode('ascii') else: quoted_path = parse.quote(path.encode('utf8')) if params: quoted_path += b'?' + parse.urlencode(params) request = method.encode('ascii') + b' ' + quoted_path + b" HTTP/1.1\r\n" for name, value in headers: request += header_line(name, value) request += b'\r\n' if body: request += body.encode('ascii') return request def build_websocket_upgrade(path, params, headers): ws_headers = [ ('Host', 'somewhere.com'), ('Upgrade', 'websocket'), ('Connection', 'Upgrade'), ('Sec-WebSocket-Key', 'x3JJHMbDL1EzLkh9GBhXDw=='), ('Sec-WebSocket-Protocol', 'chat, superchat'), ('Sec-WebSocket-Version', '13'), ('Origin', 'http://example.com') ] return _build_request('GET', path, params, headers=headers + ws_headers, body=None) def header_line(name, value): """ Given a header name and value, returns the line to use in a HTTP request or response. """ return name.encode('ascii') + b': ' + value.encode('ascii') + b"\r\n" def _run_through_daphne(request, channel_name): """ Returns Daphne's channel message for a given request. This helper requires a fair bit of scaffolding and can certainly be improved, but it works for now. """ channel_layer = ChannelLayer() factory = HTTPFactory(channel_layer, send_channel="test!") proto = factory.buildProtocol(('127.0.0.1', 0)) transport = proto_helpers.StringTransport() proto.makeConnection(transport) proto.dataReceived(request) _, message = channel_layer.receive([channel_name]) return message, factory, transport def content_length_header(body): """ Returns an appropriate Content-Length HTTP header for a given body. 
""" return 'Content-Length', six.text_type(len(body)) daphne-1.4.2/daphne/tests/http_strategies.py000066400000000000000000000074001322363512600211450ustar00rootroot00000000000000""" Assorted Hypothesis strategies useful for generating HTTP requests and responses """ from __future__ import unicode_literals from six.moves.urllib import parse import string from hypothesis import strategies HTTP_METHODS = ['OPTIONS', 'GET', 'HEAD', 'POST', 'PUT', 'DELETE', 'TRACE', 'CONNECT'] # Unicode characters of the "Letter" category letters = strategies.characters(whitelist_categories=('Lu', 'Ll', 'Lt', 'Lm', 'Lo', 'Nl')) def http_method(): return strategies.sampled_from(HTTP_METHODS) def _http_path_portion(): alphabet = string.ascii_letters + string.digits + '-._~' return strategies.text(min_size=1, average_size=10, max_size=128, alphabet=alphabet) def http_path(): """ Returns a URL path (not encoded). """ return strategies.lists( _http_path_portion(), min_size=0, max_size=10).map(lambda s: '/' + '/'.join(s)) def http_body(): """ Returns random printable ASCII characters. This may be exceeding what HTTP allows, but seems to not cause an issue so far. """ return strategies.text(alphabet=string.printable, min_size=0, average_size=600, max_size=1500) def binary_payload(): return strategies.binary(min_size=0, average_size=600, max_size=1500) def valid_bidi(value): """ Rejects strings which nonsensical Unicode text direction flags. Relying on random Unicode characters means that some combinations don't make sense, from a direction of text point of view. This little helper just rejects those. """ try: value.encode('idna') except UnicodeError: return False else: return True def _domain_label(): return strategies.text( alphabet=letters, min_size=1, average_size=6, max_size=63).filter(valid_bidi) def international_domain_name(): """ Returns a byte string of a domain name, IDNA-encoded. 
""" return strategies.lists( _domain_label(), min_size=2, average_size=2).map(lambda s: ('.'.join(s)).encode('idna')) def _query_param(): return strategies.text(alphabet=letters, min_size=1, average_size=10, max_size=255).\ map(lambda s: s.encode('utf8')) def query_params(): """ Returns a list of two-tuples byte strings, ready for encoding with urlencode. We're aiming for a total length of a URL below 2083 characters, so this strategy ensures that the total urlencoded query string is not longer than 1500 characters. """ return strategies.lists( strategies.tuples(_query_param(), _query_param()), min_size=0, average_size=5).\ filter(lambda x: len(parse.urlencode(x)) < 1500) def header_name(): """ Strategy returning something that looks like a HTTP header field https://en.wikipedia.org/wiki/List_of_HTTP_header_fields suggests they are between 4 and 20 characters long """ return strategies.text( alphabet=string.ascii_letters + string.digits + '-', min_size=1, max_size=30) def header_value(): """ Strategy returning something that looks like a HTTP header value "For example, the Apache 2.3 server by default limits the size of each field to 8190 bytes" https://en.wikipedia.org/wiki/List_of_HTTP_header_fields """ return strategies.text( alphabet=string.ascii_letters + string.digits + string.punctuation + ' /t', min_size=1, average_size=40, max_size=8190).filter(lambda s: len(s.encode('utf8')) < 8190) def headers(): """ Strategy returning a list of tuples, containing HTTP header fields and their values. "[Apache 2.3] there can be at most 100 header fields in a single request." 
https://en.wikipedia.org/wiki/List_of_HTTP_header_fields """ return strategies.lists( strategies.tuples(header_name(), header_value()), min_size=0, average_size=10, max_size=100) daphne-1.4.2/daphne/tests/test_endpoints.py000066400000000000000000000130321322363512600207740ustar00rootroot00000000000000# coding: utf8 from __future__ import unicode_literals import logging from unittest import TestCase from six import string_types from ..cli import CommandLineInterface from ..server import Server, build_endpoint_description_strings # this is the callable that will be tested here build = build_endpoint_description_strings class TestEndpointDescriptions(TestCase): def testBasics(self): self.assertEqual(build(), [], msg="Empty list returned when no kwargs given") def testTcpPortBindings(self): self.assertEqual( build(port=1234, host='example.com'), ['tcp:port=1234:interface=example.com'] ) self.assertEqual( build(port=8000, host='127.0.0.1'), ['tcp:port=8000:interface=127.0.0.1'] ) self.assertEqual( build(port=8000, host='[200a::1]'), ['tcp:port=8000:interface=200a\:\:1'] ) self.assertEqual( build(port=8000, host='200a::1'), ['tcp:port=8000:interface=200a\:\:1'] ) # incomplete port/host kwargs raise errors self.assertRaises( ValueError, build, port=123 ) self.assertRaises( ValueError, build, host='example.com' ) def testUnixSocketBinding(self): self.assertEqual( build(unix_socket='/tmp/daphne.sock'), ['unix:/tmp/daphne.sock'] ) def testFileDescriptorBinding(self): self.assertEqual( build(file_descriptor=5), ['fd:fileno=5'] ) def testMultipleEnpoints(self): self.assertEqual( sorted( build( file_descriptor=123, unix_socket='/tmp/daphne.sock', port=8080, host='10.0.0.1' ) ), sorted([ 'tcp:port=8080:interface=10.0.0.1', 'unix:/tmp/daphne.sock', 'fd:fileno=123' ]) ) class TestCLIInterface(TestCase): # construct a string that will be accepted as the channel_layer argument _import_channel_layer_string = 'daphne.tests.asgi:channel_layer' def setUp(self): 
        logging.disable(logging.CRITICAL)
        # patch out the server's run method
        self._default_server_run = Server.run
        Server.run = lambda x: x

    def tearDown(self):
        logging.disable(logging.NOTSET)
        # restore the original server run method
        Server.run = self._default_server_run

    def build_cli(self, cli_args=''):
        # split the string and append the channel_layer positional argument
        if isinstance(cli_args, string_types):
            cli_args = cli_args.split()
        args = cli_args + [self._import_channel_layer_string]
        cli = CommandLineInterface()
        cli.run(args)
        return cli

    def get_endpoints(self, cli_args=''):
        cli = self.build_cli(cli_args=cli_args)
        return cli.server.endpoints

    def checkCLI(self, args='', endpoints=None, msg='Expected endpoints do not match.'):
        endpoints = endpoints or []
        cli = self.build_cli(cli_args=args)
        generated_endpoints = sorted(cli.server.endpoints)
        endpoints.sort()
        self.assertEqual(
            generated_endpoints,
            endpoints,
            msg=msg
        )

    def testCLIBasics(self):
        self.checkCLI(
            '',
            ['tcp:port=8000:interface=127.0.0.1']
        )
        self.checkCLI(
            '-p 123',
            ['tcp:port=123:interface=127.0.0.1']
        )
        self.checkCLI(
            '-b 10.0.0.1',
            ['tcp:port=8000:interface=10.0.0.1']
        )
        self.checkCLI(
            '-b 200a::1',
            ['tcp:port=8000:interface=200a\:\:1']
        )
        self.checkCLI(
            '-b [200a::1]',
            ['tcp:port=8000:interface=200a\:\:1']
        )
        self.checkCLI(
            '-p 8080 -b example.com',
            ['tcp:port=8080:interface=example.com']
        )

    def testCLIEndpointCreation(self):
        self.checkCLI(
            '-p 8080 -u /tmp/daphne.sock',
            [
                'tcp:port=8080:interface=127.0.0.1',
                'unix:/tmp/daphne.sock',
            ],
            'Default binding host patched in when only port given'
        )
        self.checkCLI(
            '-b example.com -u /tmp/daphne.sock',
            [
                'tcp:port=8000:interface=example.com',
                'unix:/tmp/daphne.sock',
            ],
            'Default port patched in when missing.'
        )
        self.checkCLI(
            '-u /tmp/daphne.sock --fd 5',
            [
                'fd:fileno=5',
                'unix:/tmp/daphne.sock'
            ],
            'File descriptor and unix socket bound, TCP ignored.'
        )

    def testMixedCLIEndpointCreation(self):
        self.checkCLI(
            '-p 8080 -e unix:/tmp/daphne.sock',
            [
                'tcp:port=8080:interface=127.0.0.1',
                'unix:/tmp/daphne.sock'
            ],
            'Mix host/port args with endpoint args'
        )
        self.checkCLI(
            '-p 8080 -e tcp:port=8080:interface=127.0.0.1',
            [
                'tcp:port=8080:interface=127.0.0.1',
            ] * 2,
            'Do not try to de-duplicate endpoint description strings. '
            'This would fail when running the server.'
        )

    def testCustomEndpoints(self):
        self.checkCLI(
            '-e imap:',
            ['imap:']
        )

daphne-1.4.2/daphne/tests/test_http_request.py

# coding: utf8
"""
Tests for the HTTP request section of the ASGI spec
"""
from __future__ import unicode_literals

import unittest

from six.moves.urllib import parse

from asgiref.inmemory import ChannelLayer
from hypothesis import given, assume
from twisted.test import proto_helpers

from daphne.http_protocol import HTTPFactory
from daphne.tests import testcases, http_strategies
from daphne.tests.factories import message_for_request, content_length_header


class TestHTTPRequestSpec(testcases.ASGIHTTPTestCase):
    """
    Tests which try to pour the HTTP request section of the ASGI spec into code.
    The heavy lifting is done by the assert_valid_http_request_message function,
    the tests mostly serve to wire up hypothesis so that it exercises its power
    to find edge cases.
    """

    def test_minimal_request(self):
        """
        Smallest viable example. Mostly verifies that our request building works.
""" request_method, request_path = 'GET', '/' message = message_for_request(request_method, request_path) self.assert_valid_http_request_message(message, request_method, request_path) @given( request_path=http_strategies.http_path(), request_params=http_strategies.query_params() ) def test_get_request(self, request_path, request_params): """ Tests a typical HTTP GET request, with a path and query parameters """ request_method = 'GET' message = message_for_request(request_method, request_path, request_params) self.assert_valid_http_request_message( message, request_method, request_path, request_params=request_params) @given( request_path=http_strategies.http_path(), request_body=http_strategies.http_body() ) def test_post_request(self, request_path, request_body): """ Tests a typical POST request, submitting some data in a body. """ request_method = 'POST' headers = [content_length_header(request_body)] message = message_for_request( request_method, request_path, headers=headers, body=request_body) self.assert_valid_http_request_message( message, request_method, request_path, request_headers=headers, request_body=request_body) @given(request_headers=http_strategies.headers()) def test_headers(self, request_headers): """ Tests that HTTP header fields are handled as specified """ request_method, request_path = 'OPTIONS', '/te st-à/' message = message_for_request(request_method, request_path, headers=request_headers) self.assert_valid_http_request_message( message, request_method, request_path, request_headers=request_headers) @given(request_headers=http_strategies.headers()) def test_duplicate_headers(self, request_headers): """ Tests that duplicate header values are preserved """ assume(len(request_headers) >= 2) # Set all header field names to the same value header_name = request_headers[0][0] duplicated_headers = [(header_name, header[1]) for header in request_headers] request_method, request_path = 'OPTIONS', '/te st-à/' message = 
message_for_request(request_method, request_path, headers=duplicated_headers) self.assert_valid_http_request_message( message, request_method, request_path, request_headers=duplicated_headers) @given( request_method=http_strategies.http_method(), request_path=http_strategies.http_path(), request_params=http_strategies.query_params(), request_headers=http_strategies.headers(), request_body=http_strategies.http_body(), ) # This test is slow enough that on Travis, hypothesis sometimes complains. def test_kitchen_sink( self, request_method, request_path, request_params, request_headers, request_body): """ Throw everything at channels that we dare. The idea is that if a combination of method/path/headers/body would break the spec, hypothesis will eventually find it. """ request_headers.append(content_length_header(request_body)) message = message_for_request( request_method, request_path, request_params, request_headers, request_body) self.assert_valid_http_request_message( message, request_method, request_path, request_params, request_headers, request_body) def test_headers_are_lowercased_and_stripped(self): request_method, request_path = 'GET', '/' headers = [('MYCUSTOMHEADER', ' foobar ')] message = message_for_request(request_method, request_path, headers=headers) self.assert_valid_http_request_message( message, request_method, request_path, request_headers=headers) # Note that Daphne returns a list of tuples here, which is fine, because the spec # asks to treat them interchangeably. assert message['headers'] == [(b'mycustomheader', b'foobar')] @given(daphne_path=http_strategies.http_path()) def test_root_path_header(self, daphne_path): """ Tests root_path handling. 
""" request_method, request_path = 'GET', '/' # Daphne-Root-Path must be URL encoded when submitting as HTTP header field headers = [('Daphne-Root-Path', parse.quote(daphne_path.encode('utf8')))] message = message_for_request(request_method, request_path, headers=headers) # Daphne-Root-Path is not included in the returned 'headers' section. So we expect # empty headers. expected_headers = [] self.assert_valid_http_request_message( message, request_method, request_path, request_headers=expected_headers) # And what we're looking for, root_path being set. assert message['root_path'] == daphne_path class TestProxyHandling(unittest.TestCase): """ Tests that concern interaction of Daphne with proxies. They live in a separate test case, because they're not part of the spec. """ def setUp(self): self.channel_layer = ChannelLayer() self.factory = HTTPFactory(self.channel_layer, send_channel="test!") self.proto = self.factory.buildProtocol(('127.0.0.1', 0)) self.tr = proto_helpers.StringTransport() self.proto.makeConnection(self.tr) def test_x_forwarded_for_ignored(self): self.proto.dataReceived( b"GET /te%20st-%C3%A0/?foo=+bar HTTP/1.1\r\n" + b"Host: somewhere.com\r\n" + b"X-Forwarded-For: 10.1.2.3\r\n" + b"X-Forwarded-Port: 80\r\n" + b"\r\n" ) # Get the resulting message off of the channel layer _, message = self.channel_layer.receive(["http.request"]) self.assertEqual(message['client'], ['192.168.1.1', 54321]) def test_x_forwarded_for_parsed(self): self.factory.proxy_forwarded_address_header = 'X-Forwarded-For' self.factory.proxy_forwarded_port_header = 'X-Forwarded-Port' self.factory.proxy_forwarded_proto_header = 'X-Forwarded-Proto' self.proto.dataReceived( b"GET /te%20st-%C3%A0/?foo=+bar HTTP/1.1\r\n" + b"Host: somewhere.com\r\n" + b"X-Forwarded-For: 10.1.2.3\r\n" + b"X-Forwarded-Port: 80\r\n" + b"\r\n" ) # Get the resulting message off of the channel layer _, message = self.channel_layer.receive(["http.request"]) self.assertEqual(message['client'], ['10.1.2.3', 80]) 
    def test_x_forwarded_for_port_missing(self):
        self.factory.proxy_forwarded_address_header = 'X-Forwarded-For'
        self.factory.proxy_forwarded_port_header = 'X-Forwarded-Port'
        self.factory.proxy_forwarded_proto_header = 'X-Forwarded-Proto'
        self.proto.dataReceived(
            b"GET /te%20st-%C3%A0/?foo=+bar HTTP/1.1\r\n" +
            b"Host: somewhere.com\r\n" +
            b"X-Forwarded-For: 10.1.2.3\r\n" +
            b"\r\n"
        )
        # Get the resulting message off of the channel layer
        _, message = self.channel_layer.receive(["http.request"])
        self.assertEqual(message['client'], ['10.1.2.3', 0])

daphne-1.4.2/daphne/tests/test_http_response.py

# coding: utf8
"""
Tests for the HTTP response section of the ASGI spec
"""
from __future__ import unicode_literals

from unittest import TestCase

from asgiref.inmemory import ChannelLayer
from hypothesis import given
from twisted.test import proto_helpers

from daphne.http_protocol import HTTPFactory
from . import factories, http_strategies, testcases


class TestHTTPResponseSpec(testcases.ASGIHTTPTestCase):

    def test_minimal_response(self):
        """
        Smallest viable example. Mostly verifies that our response building works.
        """
        message = {'status': 200}
        response = factories.response_for_message(message)

        self.assert_valid_http_response_message(message, response)
        self.assertIn(b'200 OK', response)
        # Assert that the response is the last of the chunks.
        # N.b. at the time of writing, Daphne did not support multiple response
        # chunks, but still sends with Transfer-Encoding: chunked if no
        # Content-Length header is specified (and maybe even if specified).
        self.assertTrue(response.endswith(b'0\r\n\r\n'))

    def test_status_code_required(self):
        """
        Asserts that passing in the 'status' key is required.

        Previous versions of Daphne did not enforce this, so this test is here
        to make sure it stays required.
        """
        with self.assertRaises(ValueError):
            factories.response_for_message({})

    def test_status_code_is_transmitted(self):
        """
        Tests that a custom status code is present in the response.

        We can't really use hypothesis to test all sorts of status codes, because
        a lot of them have meaning that is respected by Twisted. E.g. setting 204
        (No Content) as a status code results in Twisted discarding the body.
        """
        message = {'status': 201}  # 'Created'
        response = factories.response_for_message(message)

        self.assert_valid_http_response_message(message, response)
        self.assertIn(b'201 Created', response)

    @given(body=http_strategies.http_body())
    def test_body_is_transmitted(self, body):
        message = {'status': 200, 'content': body.encode('ascii')}
        response = factories.response_for_message(message)

        self.assert_valid_http_response_message(message, response)

    @given(headers=http_strategies.headers())
    def test_headers(self, headers):
        # The ASGI spec requires us to lowercase our header names
        message = {'status': 200, 'headers': [(name.lower(), value) for name, value in headers]}
        response = factories.response_for_message(message)

        # The assert_ method does the heavy lifting of checking that headers are
        # as expected.
        self.assert_valid_http_response_message(message, response)

    @given(
        headers=http_strategies.headers(),
        body=http_strategies.http_body(),
    )
    def test_kitchen_sink(self, headers, body):
        """
        This test tries to let Hypothesis find combinations of variables that
        result in breaking our assumptions. But responses are less exciting than
        requests, so there's not a lot going on here.
        """
        message = {
            'status': 202,  # 'Accepted'
            'headers': [(name.lower(), value) for name, value in headers],
            'content': body.encode('ascii')
        }
        response = factories.response_for_message(message)

        self.assert_valid_http_response_message(message, response)


class TestHTTPResponse(TestCase):
    """
    Tests that the HTTP protocol class correctly generates and parses messages.
    """

    def setUp(self):
        self.channel_layer = ChannelLayer()
        self.factory = HTTPFactory(self.channel_layer, send_channel="test!")
        self.proto = self.factory.buildProtocol(('127.0.0.1', 0))
        self.tr = proto_helpers.StringTransport()
        self.proto.makeConnection(self.tr)

    def test_http_disconnect_sets_path_key(self):
        """
        Tests http disconnect has the path key set, see
        https://channels.readthedocs.io/en/latest/asgi.html#disconnect
        """
        # Send a simple request to the protocol
        self.proto.dataReceived(
            b"GET /te%20st-%C3%A0/?foo=bar HTTP/1.1\r\n" +
            b"Host: anywhere.com\r\n" +
            b"\r\n"
        )
        # Get the request message
        _, message = self.channel_layer.receive(["http.request"])

        # Send back an example response
        self.factory.dispatch_reply(
            message['reply_channel'],
            {
                "status": 200,
                "status_text": b"OK",
                "content": b"DISCO",
            }
        )

        # Get the disconnection notification
        _, disconnect_message = self.channel_layer.receive(["http.disconnect"])

        self.assertEqual(disconnect_message['path'], "/te st-à/")

daphne-1.4.2/daphne/tests/test_utils.py

# coding: utf8
from __future__ import unicode_literals

from unittest import TestCase

import six
from twisted.web.http_headers import Headers

from ..utils import parse_x_forwarded_for


class TestXForwardedForHttpParsing(TestCase):
    """
    Tests that the parse_x_forwarded_for util correctly parses twisted Header.
""" def test_basic(self): headers = Headers({ b'X-Forwarded-For': [b'10.1.2.3'], b'X-Forwarded-Port': [b'1234'], b'X-Forwarded-Proto': [b'https'] }) result = parse_x_forwarded_for(headers) self.assertEqual(result, (['10.1.2.3', 1234], 'https')) self.assertIsInstance(result[0][0], six.text_type) self.assertIsInstance(result[1], six.text_type) def test_address_only(self): headers = Headers({ b'X-Forwarded-For': [b'10.1.2.3'], }) self.assertEqual( parse_x_forwarded_for(headers), (['10.1.2.3', 0], None) ) def test_v6_address(self): headers = Headers({ b'X-Forwarded-For': [b'1043::a321:0001, 10.0.5.6'], }) self.assertEqual( parse_x_forwarded_for(headers), (['1043::a321:0001', 0], None) ) def test_multiple_proxys(self): headers = Headers({ b'X-Forwarded-For': [b'10.1.2.3, 10.1.2.4'], }) self.assertEqual( parse_x_forwarded_for(headers), (['10.1.2.3', 0], None) ) def test_original_addr(self): headers = Headers({}) self.assertEqual( parse_x_forwarded_for(headers, original_addr=['127.0.0.1', 80]), (['127.0.0.1', 80], None) ) def test_original_proto(self): headers = Headers({}) self.assertEqual( parse_x_forwarded_for(headers, original_scheme='http'), (None, 'http') ) def test_no_original(self): headers = Headers({}) self.assertEqual( parse_x_forwarded_for(headers), (None, None) ) def test_address_and_proto(self): headers = Headers({ b'X-Forwarded-For': [b'10.1.2.3'], b'X-Forwarded-Proto': [b'https'], }) self.assertEqual( parse_x_forwarded_for(headers), (['10.1.2.3', 0], 'https') ) class TestXForwardedForWsParsing(TestCase): """ Tests that the parse_x_forwarded_for util correctly parses dict headers. 
""" def test_basic(self): headers = { b'X-Forwarded-For': b'10.1.2.3', b'X-Forwarded-Port': b'1234', } self.assertEqual( parse_x_forwarded_for(headers), (['10.1.2.3', 1234], None) ) def test_address_only(self): headers = { b'X-Forwarded-For': b'10.1.2.3', } self.assertEqual( parse_x_forwarded_for(headers), (['10.1.2.3', 0], None) ) def test_v6_address(self): headers = { b'X-Forwarded-For': [b'1043::a321:0001, 10.0.5.6'], } self.assertEqual( parse_x_forwarded_for(headers), (['1043::a321:0001', 0], None) ) def test_multiple_proxys(self): headers = { b'X-Forwarded-For': b'10.1.2.3, 10.1.2.4', } self.assertEqual( parse_x_forwarded_for(headers), (['10.1.2.3', 0], None) ) def test_original(self): headers = {} self.assertEqual( parse_x_forwarded_for(headers, original_addr=['127.0.0.1', 80]), (['127.0.0.1', 80], None) ) def test_no_original(self): headers = {} self.assertEqual( parse_x_forwarded_for(headers), (None, None) ) daphne-1.4.2/daphne/tests/test_ws.py000066400000000000000000000222371322363512600174310ustar00rootroot00000000000000# coding: utf8 from __future__ import unicode_literals from hypothesis import assume, given, strategies, settings from twisted.test import proto_helpers from asgiref.inmemory import ChannelLayer from daphne.http_protocol import HTTPFactory from daphne.tests import http_strategies, testcases, factories class WebSocketConnection(object): """ Helper class that makes it easier to test Dahpne's WebSocket support. 
""" def __init__(self): self.last_message = None self.channel_layer = ChannelLayer() self.factory = HTTPFactory(self.channel_layer, send_channel="test!") self.proto = self.factory.buildProtocol(('127.0.0.1', 0)) self.transport = proto_helpers.StringTransport() self.proto.makeConnection(self.transport) def receive(self, request): """ Low-level method to let Daphne handle HTTP/WebSocket data """ self.proto.dataReceived(request) _, self.last_message = self.channel_layer.receive(['websocket.connect']) return self.last_message def send(self, content): """ Method to respond with a channel message """ if self.last_message is None: # Auto-connect for convenience. self.connect() self.factory.dispatch_reply(self.last_message['reply_channel'], content) response = self.transport.value() self.transport.clear() return response def connect(self, path='/', params=None, headers=None): """ High-level method to perform the WebSocket handshake """ request = factories.build_websocket_upgrade(path, params, headers or []) message = self.receive(request) return message class TestHandshake(testcases.ASGIWebSocketTestCase): """ Tests for the WebSocket handshake """ def test_minimal(self): message = WebSocketConnection().connect() self.assert_valid_websocket_connect_message(message) @given( path=http_strategies.http_path(), params=http_strategies.query_params(), headers=http_strategies.headers(), ) @settings(perform_health_check=False) def test_connection(self, path, params, headers): message = WebSocketConnection().connect(path, params, headers) self.assert_valid_websocket_connect_message(message, path, params, headers) class TestSendCloseAccept(testcases.ASGIWebSocketTestCase): """ Tests that, essentially, try to translate the send/close/accept section of the spec into code. 
""" def test_empty_accept(self): response = WebSocketConnection().send({'accept': True}) self.assert_websocket_upgrade(response) @given(text=http_strategies.http_body()) def test_accept_and_text(self, text): response = WebSocketConnection().send({'accept': True, 'text': text}) self.assert_websocket_upgrade(response, text.encode('ascii')) @given(data=http_strategies.binary_payload()) def test_accept_and_bytes(self, data): response = WebSocketConnection().send({'accept': True, 'bytes': data}) self.assert_websocket_upgrade(response, data) def test_accept_false(self): response = WebSocketConnection().send({'accept': False}) self.assert_websocket_denied(response) def test_accept_false_with_text(self): """ Tests that even if text is given, the connection is denied. We can't easily use Hypothesis to generate data for this test because it's hard to detect absence of the body if e.g. Hypothesis would generate a 'GET' """ text = 'foobar' response = WebSocketConnection().send({'accept': False, 'text': text}) self.assert_websocket_denied(response) self.assertNotIn(text.encode('ascii'), response) def test_accept_false_with_bytes(self): """ Tests that even if data is given, the connection is denied. We can't easily use Hypothesis to generate data for this test because it's hard to detect absence of the body if e.g. Hypothesis would generate a 'GET' """ data = b'foobar' response = WebSocketConnection().send({'accept': False, 'bytes': data}) self.assert_websocket_denied(response) self.assertNotIn(data, response) @given(text=http_strategies.http_body()) def test_just_text(self, text): assume(len(text) > 0) # If content is sent, accept=True is implied. response = WebSocketConnection().send({'text': text}) self.assert_websocket_upgrade(response, text.encode('ascii')) @given(data=http_strategies.binary_payload()) def test_just_bytes(self, data): assume(len(data) > 0) # If content is sent, accept=True is implied. 
response = WebSocketConnection().send({'bytes': data}) self.assert_websocket_upgrade(response, data) def test_close_boolean(self): response = WebSocketConnection().send({'close': True}) self.assert_websocket_denied(response) @given(number=strategies.integers(min_value=1)) def test_close_integer(self, number): response = WebSocketConnection().send({'close': number}) self.assert_websocket_denied(response) @given(text=http_strategies.http_body()) def test_close_with_text(self, text): assume(len(text) > 0) response = WebSocketConnection().send({'close': True, 'text': text}) self.assert_websocket_upgrade(response, text.encode('ascii'), expect_close=True) @given(data=http_strategies.binary_payload()) def test_close_with_data(self, data): assume(len(data) > 0) response = WebSocketConnection().send({'close': True, 'bytes': data}) self.assert_websocket_upgrade(response, data, expect_close=True) class TestWebSocketProtocol(testcases.ASGIWebSocketTestCase): """ Tests that the WebSocket protocol class correctly generates and parses messages. """ def setUp(self): self.connection = WebSocketConnection() def test_basic(self): # Send a simple request to the protocol and get the resulting message off # of the channel layer. 
message = self.connection.receive( b"GET /chat HTTP/1.1\r\n" b"Host: somewhere.com\r\n" b"Upgrade: websocket\r\n" b"Connection: Upgrade\r\n" b"Sec-WebSocket-Key: x3JJHMbDL1EzLkh9GBhXDw==\r\n" b"Sec-WebSocket-Protocol: chat, superchat\r\n" b"Sec-WebSocket-Version: 13\r\n" b"Origin: http://example.com\r\n" b"\r\n" ) self.assertEqual(message['path'], "/chat") self.assertEqual(message['query_string'], b"") self.assertEqual( sorted(message['headers']), [(b'connection', b'Upgrade'), (b'host', b'somewhere.com'), (b'origin', b'http://example.com'), (b'sec-websocket-key', b'x3JJHMbDL1EzLkh9GBhXDw=='), (b'sec-websocket-protocol', b'chat, superchat'), (b'sec-websocket-version', b'13'), (b'upgrade', b'websocket')] ) self.assert_valid_websocket_connect_message(message, '/chat') # Accept the connection response = self.connection.send({'accept': True}) self.assert_websocket_upgrade(response) # Send some text response = self.connection.send({'text': "Hello World!"}) self.assertEqual(response, b"\x81\x0cHello World!") # Send some bytes response = self.connection.send({'bytes': b"\xaa\xbb\xcc\xdd"}) self.assertEqual(response, b"\x82\x04\xaa\xbb\xcc\xdd") # Close the connection response = self.connection.send({'close': True}) self.assertEqual(response, b"\x88\x02\x03\xe8") def test_connection_with_file_origin_is_accepted(self): message = self.connection.receive( b"GET /chat HTTP/1.1\r\n" b"Host: somewhere.com\r\n" b"Upgrade: websocket\r\n" b"Connection: Upgrade\r\n" b"Sec-WebSocket-Key: x3JJHMbDL1EzLkh9GBhXDw==\r\n" b"Sec-WebSocket-Protocol: chat, superchat\r\n" b"Sec-WebSocket-Version: 13\r\n" b"Origin: file://\r\n" b"\r\n" ) self.assertIn((b'origin', b'file://'), message['headers']) self.assert_valid_websocket_connect_message(message, '/chat') # Accept the connection response = self.connection.send({'accept': True}) self.assert_websocket_upgrade(response) def test_connection_with_no_origin_is_accepted(self): message = self.connection.receive( b"GET /chat HTTP/1.1\r\n" b"Host: 
somewhere.com\r\n" b"Upgrade: websocket\r\n" b"Connection: Upgrade\r\n" b"Sec-WebSocket-Key: x3JJHMbDL1EzLkh9GBhXDw==\r\n" b"Sec-WebSocket-Protocol: chat, superchat\r\n" b"Sec-WebSocket-Version: 13\r\n" b"\r\n" ) self.assertNotIn(b'origin', [header_tuple[0] for header_tuple in message['headers']]) self.assert_valid_websocket_connect_message(message, '/chat') # Accept the connection response = self.connection.send({'accept': True}) self.assert_websocket_upgrade(response) daphne-1.4.2/daphne/tests/testcases.py000066400000000000000000000243431322363512600177370ustar00rootroot00000000000000""" Contains a test case class to allow verifying ASGI messages """ from __future__ import unicode_literals from collections import defaultdict import six from six.moves.urllib import parse import socket import unittest from . import factories class ASGITestCaseBase(unittest.TestCase): """ Base class for our test classes which contains shared method. """ def assert_is_ip_address(self, address): """ Tests whether a given address string is a valid IPv4 or IPv6 address. """ try: socket.inet_aton(address) except socket.error: self.fail("'%s' is not a valid IP address." % address) def assert_presence_of_message_keys(self, keys, required_keys, optional_keys): present_keys = set(keys) self.assertTrue(required_keys <= present_keys) # Assert that no other keys are present self.assertEqual(set(), present_keys - required_keys - optional_keys) def assert_valid_reply_channel(self, reply_channel): self.assertIsInstance(reply_channel, six.text_type) # The reply channel is decided by the server. 
self.assertTrue(reply_channel.startswith('test!')) def assert_valid_path(self, path, request_path): self.assertIsInstance(path, six.text_type) self.assertEqual(path, request_path) # Assert that it's already url decoded self.assertEqual(path, parse.unquote(path)) def assert_valid_address_and_port(self, host): address, port = host self.assertIsInstance(address, six.text_type) self.assert_is_ip_address(address) self.assertIsInstance(port, int) class ASGIHTTPTestCase(ASGITestCaseBase): """ Test case with helpers for verifying HTTP channel messages """ def assert_valid_http_request_message( self, channel_message, request_method, request_path, request_params=None, request_headers=None, request_body=None): """ Asserts that a given channel message conforms to the HTTP request section of the ASGI spec. """ self.assertTrue(channel_message) self.assert_presence_of_message_keys( channel_message.keys(), {'reply_channel', 'http_version', 'method', 'path', 'query_string', 'headers'}, {'scheme', 'root_path', 'body', 'body_channel', 'client', 'server'}) # == Assertions about required channel_message fields == self.assert_valid_reply_channel(channel_message['reply_channel']) self.assert_valid_path(channel_message['path'], request_path) http_version = channel_message['http_version'] self.assertIsInstance(http_version, six.text_type) self.assertIn(http_version, ['1.0', '1.1', '1.2']) method = channel_message['method'] self.assertIsInstance(method, six.text_type) self.assertTrue(method.isupper()) self.assertEqual(channel_message['method'], request_method) query_string = channel_message['query_string'] # Assert that query_string is a byte string and still url encoded self.assertIsInstance(query_string, six.binary_type) self.assertEqual(query_string, parse.urlencode(request_params or []).encode('ascii')) # Ordering of header names is not important, but the order of values for a header # name is. 
To assert whether that order is kept, we transform both the request # headers and the channel message headers into a dictionary # {name: [value1, value2, ...]} and check if they're equal. transformed_message_headers = defaultdict(list) for name, value in channel_message['headers']: transformed_message_headers[name].append(value) transformed_request_headers = defaultdict(list) for name, value in (request_headers or []): expected_name = name.lower().strip().encode('ascii') expected_value = value.strip().encode('ascii') transformed_request_headers[expected_name].append(expected_value) self.assertEqual(transformed_message_headers, transformed_request_headers) # == Assertions about optional channel_message fields == scheme = channel_message.get('scheme') if scheme is not None: self.assertIsInstance(scheme, six.text_type) self.assertTrue(scheme) # May not be empty root_path = channel_message.get('root_path') if root_path is not None: self.assertIsInstance(root_path, six.text_type) body = channel_message.get('body') # Ensure we test for presence of 'body' if a request body was given if request_body is not None or body is not None: self.assertIsInstance(body, six.binary_type) self.assertEqual(body, (request_body or '').encode('ascii')) body_channel = channel_message.get('body_channel') if body_channel is not None: self.assertIsInstance(body_channel, six.text_type) self.assertIn('?', body_channel) client = channel_message.get('client') if client is not None: self.assert_valid_address_and_port(channel_message['client']) server = channel_message.get('server') if server is not None: self.assert_valid_address_and_port(channel_message['server']) def assert_valid_http_response_message(self, message, response): self.assertTrue(message) self.assertTrue(response.startswith(b'HTTP')) status_code_bytes = six.text_type(message['status']).encode('ascii') self.assertIn(status_code_bytes, response) if 'content' in message: self.assertIn(message['content'], response) # Check that headers 
are in the given order. # N.b. HTTP spec only enforces that the order of header values is kept, but # the ASGI spec requires that order of all headers is kept. This code # checks conformance with the stricter ASGI spec. if 'headers' in message: for name, value in message['headers']: expected_header = factories.header_line(name, value) # Daphne or Twisted turn our lower cased header names ('foo-bar') into title # case ('Foo-Bar'). So technically we want to to match that the header name is # present while ignoring casing, and want to ensure the value is present without # altered casing. The approach below does this well enough. self.assertIn(expected_header.lower(), response.lower()) self.assertIn(value.encode('ascii'), response) class ASGIWebSocketTestCase(ASGITestCaseBase): """ Test case with helpers for verifying WebSocket channel messages """ def assert_websocket_upgrade(self, response, body=b'', expect_close=False): self.assertIn(b"HTTP/1.1 101 Switching Protocols", response) self.assertIn(b"Sec-WebSocket-Accept: HSmrc0sMlYUkAGmm5OPpG2HaGWk=\r\n", response) self.assertIn(body, response) self.assertEqual(expect_close, response.endswith(b"\x88\x02\x03\xe8")) def assert_websocket_denied(self, response): self.assertIn(b'HTTP/1.1 403', response) def assert_valid_websocket_connect_message( self, channel_message, request_path='/', request_params=None, request_headers=None): """ Asserts that a given channel message conforms to the HTTP request section of the ASGI spec. 
""" self.assertTrue(channel_message) self.assert_presence_of_message_keys( channel_message.keys(), {'reply_channel', 'path', 'headers', 'order'}, {'scheme', 'query_string', 'root_path', 'client', 'server'}) # == Assertions about required channel_message fields == self.assert_valid_reply_channel(channel_message['reply_channel']) self.assert_valid_path(channel_message['path'], request_path) order = channel_message['order'] self.assertIsInstance(order, int) self.assertEqual(order, 0) # Ordering of header names is not important, but the order of values for a header # name is. To assert whether that order is kept, we transform the request # headers and the channel message headers into a set # {('name1': 'value1,value2'), ('name2': 'value3')} and check if they're equal. # Note that unlike for HTTP, Daphne never gives out individual header values; instead we # get one string per header field with values separated by comma. transformed_request_headers = defaultdict(list) for name, value in (request_headers or []): expected_name = name.lower().strip().encode('ascii') expected_value = value.strip().encode('ascii') transformed_request_headers[expected_name].append(expected_value) final_request_headers = { (name, b','.join(value)) for name, value in transformed_request_headers.items() } # Websockets carry a lot of additional header fields, so instead of verifying that # headers look exactly like expected, we just check that the expected header fields # and values are present - additional header fields (e.g. Sec-WebSocket-Key) are allowed # and not tested for. 
assert final_request_headers.issubset(set(channel_message['headers'])) # == Assertions about optional channel_message fields == scheme = channel_message.get('scheme') if scheme: self.assertIsInstance(scheme, six.text_type) self.assertIn(scheme, ['ws', 'wss']) query_string = channel_message.get('query_string') if query_string: # Assert that query_string is a byte string and still url encoded self.assertIsInstance(query_string, six.binary_type) self.assertEqual(query_string, parse.urlencode(request_params or []).encode('ascii')) root_path = channel_message.get('root_path') if root_path is not None: self.assertIsInstance(root_path, six.text_type) client = channel_message.get('client') if client is not None: self.assert_valid_address_and_port(channel_message['client']) server = channel_message.get('server') if server is not None: self.assert_valid_address_and_port(channel_message['server']) daphne-1.4.2/daphne/twisted/000077500000000000000000000000001322363512600157025ustar00rootroot00000000000000daphne-1.4.2/daphne/twisted/plugins/000077500000000000000000000000001322363512600173635ustar00rootroot00000000000000daphne-1.4.2/daphne/twisted/plugins/fd_endpoint.py000066400000000000000000000014651322363512600222340ustar00rootroot00000000000000from twisted.plugin import IPlugin from zope.interface import implementer from twisted.internet.interfaces import IStreamServerEndpointStringParser from twisted.internet import endpoints import socket @implementer(IPlugin, IStreamServerEndpointStringParser) class _FDParser(object): prefix = "fd" def _parseServer(self, reactor, fileno, domain=socket.AF_INET): fileno = int(fileno) return endpoints.AdoptedStreamServerEndpoint(reactor, fileno, domain) def parseStreamServer(self, reactor, *args, **kwargs): # Delegate to another function with a sane signature. This function has # an insane signature to trick zope.interface into believing the # interface is correctly implemented. 
        return self._parseServer(reactor, *args, **kwargs)


parser = _FDParser()
# daphne-1.4.2/daphne/utils.py
from twisted.web.http_headers import Headers


def header_value(headers, header_name):
    value = headers[header_name]
    if isinstance(value, list):
        value = value[0]
    # decode to utf-8 if value is bytes
    if isinstance(value, bytes):
        value = value.decode("utf-8")
    return value


def parse_x_forwarded_for(headers,
                          address_header_name='X-Forwarded-For',
                          port_header_name='X-Forwarded-Port',
                          proto_header_name='X-Forwarded-Proto',
                          original_addr=None,
                          original_scheme=None):
    """
    Parses an X-Forwarded-For header and returns a host/port pair as a list.

    @param headers: The twisted-style object containing a request's headers
    @param address_header_name: The name of the expected host header
    @param port_header_name: The name of the expected port header
    @param proto_header_name: The name of the expected protocol header
    @param original_addr: A host/port pair that should be returned if the headers are not in the request
    @param original_scheme: A scheme that should be returned if the headers are not in the request

    @return: A tuple containing a list [host (string), port (int)] as the first
        entry and a proto (string) as the second
    """
    if not address_header_name:
        return (original_addr, original_scheme)

    if isinstance(headers, Headers):
        # Convert twisted-style headers into a dict
        headers = dict(headers.getAllRawHeaders())
        # Lowercase all header keys
        headers = {name.lower(): values for name, values in headers.items()}
    else:
        # Lowercase (and encode to utf-8 where needed) non-twisted header keys
        headers = {name.lower() if isinstance(name, bytes) else name.lower().encode("utf-8"): values
                   for name, values in headers.items()}

    address_header_name = address_header_name.lower().encode("utf-8")
    result_addr = original_addr
    result_scheme = original_scheme
    if address_header_name in headers:
        address_value = header_value(headers,
address_header_name) if ',' in address_value: address_value = address_value.split(",")[0].strip() result_addr = [address_value, 0] if port_header_name: # We only want to parse the X-Forwarded-Port header if we also parsed the X-Forwarded-For # header to avoid inconsistent results. port_header_name = port_header_name.lower().encode("utf-8") if port_header_name in headers: port_value = header_value(headers, port_header_name) try: result_addr[1] = int(port_value) except ValueError: pass if proto_header_name: proto_header_name = proto_header_name.lower().encode("utf-8") if proto_header_name in headers: result_scheme = header_value(headers, proto_header_name) return result_addr, result_scheme daphne-1.4.2/daphne/ws_protocol.py000077500000000000000000000245441322363512600171570ustar00rootroot00000000000000from __future__ import unicode_literals import logging import six import time import traceback from six.moves.urllib_parse import unquote from twisted.internet import defer from autobahn.twisted.websocket import WebSocketServerProtocol, WebSocketServerFactory, ConnectionDeny from .utils import parse_x_forwarded_for logger = logging.getLogger(__name__) class WebSocketProtocol(WebSocketServerProtocol): """ Protocol which supports WebSockets and forwards incoming messages to the websocket channels. """ # If we should send no more messages (e.g. 
we error-closed the socket) muted = False def set_main_factory(self, main_factory): self.main_factory = main_factory self.channel_layer = self.main_factory.channel_layer def onConnect(self, request): self.request = request self.packets_received = 0 self.protocol_to_accept = None self.socket_opened = time.time() self.last_data = time.time() try: # Sanitize and decode headers self.clean_headers = [] for name, value in request.headers.items(): name = name.encode("ascii") # Prevent CVE-2015-0219 if b"_" in name: continue self.clean_headers.append((name.lower(), value.encode("latin1"))) # Make sending channel self.reply_channel = self.main_factory.make_send_channel() # Tell main factory about it self.main_factory.reply_protocols[self.reply_channel] = self # Get client address if possible peer = self.transport.getPeer() host = self.transport.getHost() if hasattr(peer, "host") and hasattr(peer, "port"): self.client_addr = [six.text_type(peer.host), peer.port] self.server_addr = [six.text_type(host.host), host.port] else: self.client_addr = None self.server_addr = None if self.main_factory.proxy_forwarded_address_header: self.client_addr, self.client_scheme = parse_x_forwarded_for( self.http_headers, self.main_factory.proxy_forwarded_address_header, self.main_factory.proxy_forwarded_port_header, self.main_factory.proxy_forwarded_proto_header, self.client_addr ) # Make initial request info dict from request (we only have it here) self.path = request.path.encode("ascii") self.request_info = { "path": self.unquote(self.path), "headers": self.clean_headers, "query_string": self._raw_query_string, # Passed by HTTP protocol "client": self.client_addr, "server": self.server_addr, "reply_channel": self.reply_channel, "order": 0, } except: # Exceptions here are not displayed right, just 500. # Turn them into an ERROR log. 
logger.error(traceback.format_exc()) raise ws_protocol = None for header, value in self.clean_headers: if header == b'sec-websocket-protocol': protocols = [x.strip() for x in self.unquote(value).split(",")] for protocol in protocols: if protocol in self.factory.protocols: ws_protocol = protocol break # Work out what subprotocol we will accept, if any if ws_protocol and ws_protocol in self.factory.protocols: self.protocol_to_accept = ws_protocol else: self.protocol_to_accept = None # Send over the connect message try: self.channel_layer.send("websocket.connect", self.request_info) except self.channel_layer.ChannelFull: # You have to consume websocket.connect according to the spec, # so drop the connection. self.muted = True logger.warn("WebSocket force closed for %s due to connect backpressure", self.reply_channel) # Send code 503 "Service Unavailable" with close. raise ConnectionDeny(code=503, reason="Connection queue at capacity") else: self.factory.log_action("websocket", "connecting", { "path": self.request.path, "client": "%s:%s" % tuple(self.client_addr) if self.client_addr else None, }) # Make a deferred and return it - we'll either call it or err it later on self.handshake_deferred = defer.Deferred() return self.handshake_deferred @classmethod def unquote(cls, value): """ Python 2 and 3 compat layer for utf-8 unquoting """ if six.PY2: return unquote(value).decode("utf8") else: return unquote(value.decode("ascii")) def onOpen(self): # Send news that this channel is open logger.debug("WebSocket %s open and established", self.reply_channel) self.factory.log_action("websocket", "connected", { "path": self.request.path, "client": "%s:%s" % tuple(self.client_addr) if self.client_addr else None, }) def onMessage(self, payload, isBinary): # If we're muted, do nothing. 
if self.muted: logger.debug("Muting incoming frame on %s", self.reply_channel) return logger.debug("WebSocket incoming frame on %s", self.reply_channel) self.packets_received += 1 self.last_data = time.time() try: if isBinary: self.channel_layer.send("websocket.receive", { "reply_channel": self.reply_channel, "path": self.unquote(self.path), "order": self.packets_received, "bytes": payload, }) else: self.channel_layer.send("websocket.receive", { "reply_channel": self.reply_channel, "path": self.unquote(self.path), "order": self.packets_received, "text": payload.decode("utf8"), }) except self.channel_layer.ChannelFull: # You have to consume websocket.receive according to the spec, # so drop the connection. self.muted = True logger.warn("WebSocket force closed for %s due to receive backpressure", self.reply_channel) # Send code 1013 "try again later" with close. self.sendCloseFrame(code=1013, isReply=False) def serverAccept(self): """ Called when we get a message saying to accept the connection. """ self.handshake_deferred.callback(self.protocol_to_accept) logger.debug("WebSocket %s accepted by application", self.reply_channel) def serverReject(self): """ Called when we get a message saying to reject the connection. """ self.handshake_deferred.errback(ConnectionDeny(code=403, reason="Access denied")) self.cleanup() logger.debug("WebSocket %s rejected by application", self.reply_channel) self.factory.log_action("websocket", "rejected", { "path": self.request.path, "client": "%s:%s" % tuple(self.client_addr) if self.client_addr else None, }) def serverSend(self, content, binary=False): """ Server-side channel message to send a message. 
""" if self.state == self.STATE_CONNECTING: self.serverAccept() self.last_data = time.time() logger.debug("Sent WebSocket packet to client for %s", self.reply_channel) if binary: self.sendMessage(content, binary) else: self.sendMessage(content.encode("utf8"), binary) def serverClose(self, code=True): """ Server-side channel message to close the socket """ code = 1000 if code is True else code self.sendClose(code=code) def onClose(self, wasClean, code, reason): self.cleanup() if hasattr(self, "reply_channel"): logger.debug("WebSocket closed for %s", self.reply_channel) try: if not self.muted: self.channel_layer.send("websocket.disconnect", { "reply_channel": self.reply_channel, "code": code, "path": self.unquote(self.path), "order": self.packets_received + 1, }) except self.channel_layer.ChannelFull: pass self.factory.log_action("websocket", "disconnected", { "path": self.request.path, "client": "%s:%s" % tuple(self.client_addr) if self.client_addr else None, }) else: logger.debug("WebSocket closed before handshake established") def cleanup(self): """ Call to clean up this socket after it's closed. """ if hasattr(self, "reply_channel"): if self.reply_channel in self.factory.reply_protocols: del self.factory.reply_protocols[self.reply_channel] def duration(self): """ Returns the time since the socket was opened """ return time.time() - self.socket_opened def check_ping(self): """ Checks to see if we should send a keepalive ping/deny socket connection """ # If we're still connecting, deny the connection if self.state == self.STATE_CONNECTING: if self.duration() > self.main_factory.websocket_connect_timeout: self.serverReject() elif self.state == self.STATE_OPEN: if (time.time() - self.last_data) > self.main_factory.ping_interval: self._sendAutoPing() self.last_data = time.time() class WebSocketFactory(WebSocketServerFactory): """ Factory subclass that remembers what the "main" factory is, so WebSocket protocols can access it to get reply ID info. 
""" def __init__(self, main_factory, *args, **kwargs): self.main_factory = main_factory WebSocketServerFactory.__init__(self, *args, **kwargs) def log_action(self, *args, **kwargs): self.main_factory.log_action(*args, **kwargs) daphne-1.4.2/setup.cfg000066400000000000000000000000321322363512600145740ustar00rootroot00000000000000[bdist_wheel] universal=1 daphne-1.4.2/setup.py000077500000000000000000000031211322363512600144720ustar00rootroot00000000000000import os from setuptools import find_packages, setup from daphne import __version__ # We use the README as the long_description readme_path = os.path.join(os.path.dirname(__file__), "README.rst") with open(readme_path) as fp: long_description = fp.read() setup( name='daphne', version=__version__, url='https://github.com/django/daphne', author='Django Software Foundation', author_email='foundation@djangoproject.com', description='Django ASGI (HTTP/WebSocket) server', long_description=long_description, license='BSD', zip_safe=False, package_dir={'twisted': 'daphne/twisted'}, packages=find_packages() + ['twisted.plugins'], include_package_data=True, install_requires=[ 'asgiref~=1.1', 'twisted>=17.1', 'autobahn>=0.18', ], extras_require={ 'tests': ['hypothesis', 'tox'] }, entry_points={'console_scripts': [ 'daphne = daphne.cli:CommandLineInterface.entrypoint', ]}, classifiers=[ 'Development Status :: 4 - Beta', 'Environment :: Web Environment', 'Intended Audience :: Developers', 'License :: OSI Approved :: BSD License', 'Operating System :: OS Independent', 'Programming Language :: Python', 'Programming Language :: Python :: 2', 'Programming Language :: Python :: 2.7', 'Programming Language :: Python :: 3', 'Programming Language :: Python :: 3.4', 'Programming Language :: Python :: 3.5', 'Programming Language :: Python :: 3.6', 'Topic :: Internet :: WWW/HTTP', ], ) daphne-1.4.2/tox.ini000066400000000000000000000003751322363512600143000ustar00rootroot00000000000000# We test against the oldest supported Twisted release, 
and the current release. [tox] envlist = py{27,34,35,36}-twisted-{old,new} [testenv] deps = twisted-old: twisted==17.1.0 commands = pip install -e .[tests] python -m unittest discover
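The generative `envlist` in tox.ini above expands into one environment per Python/Twisted combination (py27-twisted-old, py27-twisted-new, and so on). A minimal sketch of how that brace expansion works — illustrative only, not tox's actual parser, and `expand_envlist` is a made-up name:

```python
import re
from itertools import product


def expand_envlist(spec):
    """Expand a tox-style generative env name like 'py{27,34}-twisted-{old,new}'."""
    # Split into literal chunks and {a,b,...} alternative groups.
    parts = re.split(r"(\{[^}]*\})", spec)
    options = [p[1:-1].split(",") if p.startswith("{") else [p] for p in parts]
    # Cartesian product of all chunks reassembles every concrete env name.
    return ["".join(combo) for combo in product(*options)]


print(expand_envlist("py{27,34}-twisted-{old,new}"))
# -> ['py27-twisted-old', 'py27-twisted-new', 'py34-twisted-old', 'py34-twisted-new']
```

The `twisted-old: twisted==17.1.0` line in `deps` is a factor-conditional dependency: it pins Twisted to 17.1.0 only in environments whose name contains the `twisted-old` factor, while `twisted-new` environments get whatever `setup.py`'s `twisted>=17.1` resolves to.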