ooniprobe-1.3.2/ 0000755 0001750 0001750 00000000000 12623630152 011647 5 ustar irl irl ooniprobe-1.3.2/bin/ 0000755 0001750 0001750 00000000000 12623630152 012417 5 ustar irl irl ooniprobe-1.3.2/bin/oonireport 0000755 0001750 0001750 00000001330 12623613431 014543 0 ustar irl irl #!/usr/bin/env python
import sys
import exceptions
from twisted.internet import defer, reactor
from ooni.utils import log
from ooni.report import cli
exitCode = 128
def failed(failure):
global exitCode
r = failure.trap(exceptions.SystemExit,
Exception)
if r != exceptions.SystemExit:
log.err("Failed to run oonireport")
log.exception(failure)
exitCode = 127
else:
exitCode = failure.value.code
reactor.stop()
def done(result):
global exitCode
exitCode = 0
reactor.stop()
def start():
d = defer.maybeDeferred(cli.run)
d.addCallback(done)
d.addErrback(failed)
reactor.callWhenRunning(start)
reactor.run()
sys.exit(exitCode)
ooniprobe-1.3.2/bin/oonideckgen 0000755 0001750 0001750 00000001331 12531110611 014617 0 ustar irl irl #!/usr/bin/env python
import sys
import exceptions
from twisted.internet import defer, reactor
from ooni.utils import log
from ooni.deckgen import cli
exitCode = 128
def failed(failure):
global exitCode
r = failure.trap(exceptions.SystemExit,
Exception)
if r != exceptions.SystemExit:
log.err("Failed to run oonideckgen")
log.exception(failure)
exitCode = 127
else:
exitCode = failure.value.code
reactor.stop()
def done(result):
global exitCode
exitCode = 0
reactor.stop()
def start():
d = defer.maybeDeferred(cli.run)
d.addCallback(done)
d.addErrback(failed)
reactor.callWhenRunning(start)
reactor.run()
sys.exit(exitCode)
ooniprobe-1.3.2/bin/ooniresources 0000755 0001750 0001750 00000001310 12531110611 015226 0 ustar irl irl #!/usr/bin/env python
import sys
from twisted.internet import defer, reactor
from ooni.utils import log
from ooni.resources import cli
exitCode = 128
def failed(failure):
global exitCode
r = failure.trap(exceptions.SystemExit,
Exception)
if r != exceptions.SystemExit:
log.err("Failed to run ooniresources")
log.exception(failure)
exitCode = 127
else:
exitCode = failure.value.code
reactor.stop()
def done(result):
global exitCode
exitCode = 0
reactor.stop()
def start():
d = defer.maybeDeferred(cli.run)
d.addCallback(done)
d.addErrback(done)
reactor.callWhenRunning(start)
reactor.run()
sys.exit(exitCode)
ooniprobe-1.3.2/bin/ooniprobe 0000755 0001750 0001750 00000000563 12502236531 014344 0 ustar irl irl #!/usr/bin/env python
import copy_reg
from twisted.internet import reactor
# This is a hack to overcome a bug in python
from ooni.utils.hacks import patched_reduce_ex
copy_reg._reduce_ex = patched_reduce_ex
# from ooni.oonicli import run
# run()
from ooni.oonicli import runWithDirector
d = runWithDirector()
@d.addBoth
def cb(result):
reactor.stop()
reactor.run()
ooniprobe-1.3.2/requirements.txt 0000644 0001750 0001750 00000000353 12623613431 015135 0 ustar irl irl PyYAML>=3.10
Twisted>=12.2.0
ipaddr>=2.1.10
pyOpenSSL>=0.13
geoip
txtorcon>=0.7
txsocksx>=0.0.2
parsley>=1.1
scapy-real>=2.2.0-dev
pypcap>=1.1
service-identity
pydumbnet
pyasn1
pyasn1-modules
characteristic
zope.interface
cryptography
ooniprobe-1.3.2/LICENSE 0000644 0001750 0001750 00000002736 12447563404 012675 0 ustar irl irl Copyright (c) 2012-2014, Jacob Appelbaum, Aaron Gibson, Arturo Filastò, Isis Lovecruft, The Tor Project
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR
ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
This product includes GeoLite data created by MaxMind, available from
http://www.maxmind.com.
ooniprobe-1.3.2/setup.py 0000644 0001750 0001750 00000017557 12623616571 013411 0 ustar irl irl """
ooniprobe: a network interference detection tool
================================================
.. image:: https://travis-ci.org/TheTorProject/ooni-probe.png?branch=master
:target: https://travis-ci.org/TheTorProject/ooni-probe
.. image:: https://coveralls.io/repos/TheTorProject/ooni-probe/badge.png
:target: https://coveralls.io/r/TheTorProject/ooni-probe
___________________________________________________________________________
.. image:: https://ooni.torproject.org/theme/img/ooni-logo.png
:target: https:://ooni.torproject.org/
OONI, the Open Observatory of Network Interference, is a global observation
network which aims is to collect high quality data using open methodologies,
using Free and Open Source Software (FL/OSS) to share observations and data
about the various types, methods, and amounts of network tampering in the
world.
Read this before running ooniprobe!
-----------------------------------
Running ooniprobe is a potentially risky activity. This greatly depends on the
jurisdiction in which you are in and which test you are running. It is
technically possible for a person observing your internet connection to be
aware of the fact that you are running ooniprobe. This means that if running
network measurement tests is something considered to be illegal in your country
then you could be spotted.
Furthermore, ooniprobe takes no precautions to protect the install target machine
from forensics analysis. If the fact that you have installed or used ooni
probe is a liability for you, please be aware of this risk.
Setup ooniprobe
-------------------
To install ooniprobe you will need the following dependencies:
* python
* python-dev
* python-setuptools
* build-essential
* libdumbnet1
* python-dumbnet
* python-libpcap
* tor
* libgeoip-dev
* libpcap0.8-dev
* libssl-dev
* libffi-dev
* libdumbnet-dev
When you got them run:
.. code:: bash
sudo pip install ooniprobe
Using ooniprobe
---------------
To generate a test deck for your country, cd to the directory where you want it
and run:
.. code:: bash
oonideckgen
To setup a daily cronjob run this:
.. code:: bash
(crontab -l 2>/dev/null; echo "@daily ooniprobe `oonideckgen | grep -e '^ooniprobe'`") | crontab -
Have fun!
"""
from ooni import __version__, __author__
import os
import sys
import tempfile
from ConfigParser import SafeConfigParser
from os.path import join as pj
from setuptools import setup
from setuptools.command.install import install as _st_install
from distutils.spawn import find_executable
class install(_st_install):
def gen_config(self, share_path):
config_file = pj(tempfile.mkdtemp(), "ooniprobe.conf.sample")
o = open(config_file, "w+")
with open("data/ooniprobe.conf.sample") as f:
for line in f:
if "/usr/share" in line:
line = line.replace("/usr/share", share_path)
o.write(line)
o.close()
return config_file
def set_data_files(self, prefix):
share_path = pj(prefix, 'share')
if prefix.startswith("/usr"):
var_path = "/var/lib/"
else:
var_path = pj(prefix, 'var', 'lib')
for root, dirs, file_names in os.walk('data/'):
files = []
for file_name in file_names:
if file_name.endswith('.pyc'):
continue
elif file_name.endswith('.dat') and \
file_name.startswith('Geo'):
continue
elif file_name == "ooniprobe.conf.sample":
files.append(self.gen_config(share_path))
continue
files.append(pj(root, file_name))
self.distribution.data_files.append(
[
pj(share_path, 'ooni', root.replace('data/', '')),
files
]
)
settings = SafeConfigParser()
settings.add_section("directories")
settings.set("directories", "usr_share",
os.path.join(share_path, "ooni"))
settings.set("directories", "var_lib",
os.path.join(var_path, "ooni"))
with open("ooni/settings.ini", "w+") as fp:
settings.write(fp)
try:
os.makedirs(pj(var_path, 'ooni'))
except OSError:
pass
try:
os.makedirs(pj(share_path, 'ooni'))
except OSError:
pass
def ooniresources(self):
ooniresources = find_executable("ooniresources")
from subprocess import Popen
process = Popen([ooniresources, '--update-inputs', '--update-geoip'],
stdout=sys.stdout.fileno(), stderr=sys.stderr.fileno())
process.wait()
def run(self):
prefix = os.path.abspath(self.prefix)
self.set_data_files(prefix)
self.do_egg_install()
self.ooniresources()
install_requires = []
dependency_links = []
data_files = []
packages = [
'ooni',
'ooni.api',
'ooni.deckgen',
'ooni.deckgen.processors',
'ooni.kit',
'ooni.nettests',
'ooni.nettests.manipulation',
'ooni.nettests.experimental',
'ooni.nettests.scanning',
'ooni.nettests.blocking',
'ooni.nettests.third_party',
'ooni.report',
'ooni.resources',
'ooni.templates',
'ooni.tests',
'ooni.utils'
]
with open('requirements.txt') as f:
for line in f:
if line.startswith("#"):
continue
if line.startswith('https'):
dependency_links.append(line)
continue
install_requires.append(line)
setup(
name="ooniprobe",
version=__version__,
author=__author__,
author_email="ooni-dev@lists.torproject.org",
description="Network measurement tool for"
"identifying traffic manipulation and blocking.",
long_description=__doc__,
license='BSD 2 clause',
url="https://ooni.torproject.org/",
package_dir={'ooni': 'ooni'},
data_files=data_files,
packages=packages,
include_package_data=True,
scripts=["bin/oonideckgen", "bin/ooniprobe",
"bin/oonireport", "bin/ooniresources"],
dependency_links=dependency_links,
install_requires=install_requires,
zip_safe=False,
cmdclass={"install": install},
classifiers=(
"Development Status :: 5 - Production/Stable",
"Environment :: Console",
"Framework :: Twisted",
"Intended Audience :: Developers",
"Intended Audience :: Education",
"Intended Audience :: End Users/Desktop",
"Intended Audience :: Information Technology",
"Intended Audience :: Science/Research",
"Intended Audience :: Telecommunications Industry",
"License :: OSI Approved :: BSD License"
"Programming Language :: Python",
"Programming Language :: Python :: 2",
"Programming Language :: Python :: 2 :: Only",
"Programming Language :: Python :: 2.6",
"Programming Language :: Python :: 2.7",
"Operating System :: MacOS :: MacOS X",
"Operating System :: POSIX",
"Operating System :: POSIX :: BSD",
"Operating System :: POSIX :: BSD :: BSD/OS",
"Operating System :: POSIX :: BSD :: FreeBSD",
"Operating System :: POSIX :: BSD :: NetBSD",
"Operating System :: POSIX :: BSD :: OpenBSD",
"Operating System :: POSIX :: Linux",
"Operating System :: Unix",
"Topic :: Scientific/Engineering :: Information Analysis",
"Topic :: Security",
"Topic :: Security :: Cryptography",
"Topic :: Software Development :: Libraries :: Application Frameworks",
"Topic :: Software Development :: Libraries :: Python Modules",
"Topic :: Software Development :: Testing",
"Topic :: Software Development :: Testing :: Traffic Generation",
"Topic :: System :: Networking :: Monitoring",
)
)
ooniprobe-1.3.2/ooni/ 0000755 0001750 0001750 00000000000 12623630152 012613 5 ustar irl irl ooniprobe-1.3.2/ooni/ngdeck.py 0000644 0001750 0001750 00000000147 12402050424 014413 0 ustar irl irl class NgDeck(object):
def __init__(self, deck_path):
pass
def run(self):
pass
ooniprobe-1.3.2/ooni/oonibclient.py 0000644 0001750 0001750 00000016036 12474100761 015502 0 ustar irl irl import json
from twisted.web.client import Agent
from twisted.internet import defer, reactor
from twisted.internet.endpoints import TCP4ClientEndpoint
from ooni import errors as e
from ooni.settings import config
from ooni.utils import log
from ooni.utils.net import BodyReceiver, StringProducer, Downloader
from ooni.utils.trueheaders import TrueHeadersSOCKS5Agent
class OONIBClient(object):
retries = 3
def __init__(self, address):
self.address = address
def _request(self, method, urn, genReceiver, bodyProducer=None):
address = self.address
if self.address.startswith('httpo://'):
address = self.address.replace('httpo://', 'http://')
agent = TrueHeadersSOCKS5Agent(reactor,
proxyEndpoint=TCP4ClientEndpoint(reactor, '127.0.0.1',
config.tor.socks_port))
elif self.address.startswith('https://'):
log.err("HTTPS based bouncers are currently not supported.")
raise e.InvalidOONIBBouncerAddress
elif self.address.startswith('http://'):
log.msg("Warning using unencrypted backend")
agent = Agent(reactor)
attempts = 0
finished = defer.Deferred()
def perform_request(attempts):
uri = address + urn
d = agent.request(method, uri, bodyProducer=bodyProducer)
@d.addCallback
def callback(response):
try:
content_length = int(response.headers.getRawHeaders('content-length')[0])
except:
content_length = None
response.deliverBody(genReceiver(finished, content_length))
def errback(err, attempts):
# We we will recursively keep trying to perform a request until
# we have reached the retry count.
if attempts < self.retries:
log.err("Lookup failed. Retrying.")
attempts += 1
perform_request(attempts)
else:
log.err("Failed. Giving up.")
finished.errback(err)
d.addErrback(errback, attempts)
perform_request(attempts)
return finished
def queryBackend(self, method, urn, query=None):
bodyProducer = None
if query:
bodyProducer = StringProducer(json.dumps(query))
def genReceiver(finished, content_length):
def process_response(s):
# If empty string then don't parse it.
if not s:
return
try:
response = json.loads(s)
except ValueError:
raise e.get_error(None)
if 'error' in response:
print "Got this backend error message %s" % response
log.err("Got this backend error message %s" % response)
raise e.get_error(response['error'])
return response
return BodyReceiver(finished, content_length, process_response)
return self._request(method, urn, genReceiver, bodyProducer)
def download(self, urn, download_path):
def genReceiver(finished, content_length):
return Downloader(download_path, finished, content_length)
return self._request('GET', urn, genReceiver)
def getInput(self, input_hash):
from ooni.deck import InputFile
input_file = InputFile(input_hash)
if input_file.descriptorCached:
return defer.succeed(input_file)
else:
d = self.queryBackend('GET', '/input/' + input_hash)
@d.addCallback
def cb(descriptor):
input_file.load(descriptor)
input_file.save()
return input_file
@d.addErrback
def err(err):
log.err("Failed to get descriptor for input %s" % input_hash)
log.exception(err)
return d
def getInputList(self):
return self.queryBackend('GET', '/input')
def downloadInput(self, input_hash):
from ooni.deck import InputFile
input_file = InputFile(input_hash)
if input_file.fileCached:
return defer.succeed(input_file)
else:
d = self.download('/input/' + input_hash + '/file', input_file.cached_file)
@d.addCallback
def cb(res):
input_file.verify()
return input_file
@d.addErrback
def err(err):
log.err("Failed to download the input file %s" % input_hash)
log.exception(err)
return d
def getInputPolicy(self):
return self.queryBackend('GET', '/policy/input')
def getNettestPolicy(self):
return self.queryBackend('GET', '/policy/nettest')
def getDeckList(self):
return self.queryBackend('GET', '/deck')
def getDeck(self, deck_hash):
from ooni.deck import Deck
deck = Deck(deck_hash)
if deck.descriptorCached:
return defer.succeed(deck)
else:
d = self.queryBackend('GET', '/deck/' + deck_hash)
@d.addCallback
def cb(descriptor):
deck.load(descriptor)
deck.save()
return deck
@d.addErrback
def err(err):
log.err("Failed to get descriptor for deck %s" % deck_hash)
print err
log.exception(err)
return d
def downloadDeck(self, deck_hash):
from ooni.deck import Deck
deck = Deck(deck_hash)
if deck.fileCached:
return defer.succeed(deck)
else:
d = self.download('/deck/' + deck_hash + '/file', deck.cached_file)
@d.addCallback
def cb(res):
deck.verify()
return deck
@d.addErrback
def err(err):
log.err("Failed to download the deck %s" % deck_hash)
print err
log.exception(err)
return d
@defer.inlineCallbacks
def lookupTestCollector(self, net_tests):
try:
test_collector = yield self.queryBackend('POST', '/bouncer/net-tests',
query={'net-tests': net_tests})
except Exception as exc:
log.exception(exc)
raise e.CouldNotFindTestCollector
defer.returnValue(test_collector)
@defer.inlineCallbacks
def lookupTestHelpers(self, test_helper_names):
try:
test_helper = yield self.queryBackend('POST', '/bouncer/test-helpers',
query={'test-helpers': test_helper_names})
except Exception as exc:
log.exception(exc)
raise e.CouldNotFindTestHelper
if not test_helper:
raise e.CouldNotFindTestHelper
defer.returnValue(test_helper)
ooniprobe-1.3.2/ooni/kit/ 0000755 0001750 0001750 00000000000 12623630152 013402 5 ustar irl irl ooniprobe-1.3.2/ooni/kit/domclass.py 0000644 0001750 0001750 00000017046 12463144534 015577 0 ustar irl irl """
how this works
--------------
This classifier uses the DOM structure of a website to determine how similar
the two sites are.
The procedure we use is the following:
* First we parse all the DOM tree of the web page and we build a list of
TAG parent child relationships (ex. =>
(html, a), (a, b), (html, c)).
* We then use this information to build a matrix (M) where m[i][j] = P(of
transitioning from tag[i] to tag[j]). If tag[i] does not exists P() = 0.
Note: M is a square matrix that is number_of_tags wide.
* We then calculate the eigenvectors (v_i) and eigenvalues (e) of M.
* The corelation between page A and B is given via this formula:
correlation = dot_product(e_A, e_B), where e_A and e_B are
resepectively the eigenvalues for the probability matrix A and the
probability matrix B.
"""
import numpy
import time
from ooni import log
# All HTML4 tags
# XXX add link to W3C page where these came from
alltags = ['A', 'ABBR', 'ACRONYM', 'ADDRESS', 'APPLET', 'AREA', 'B', 'BASE',
'BASEFONT', 'BD', 'BIG', 'BLOCKQUOTE', 'BODY', 'BR', 'BUTTON', 'CAPTION',
'CENTER', 'CITE', 'CODE', 'COL', 'COLGROUP', 'DD', 'DEL', 'DFN', 'DIR', 'DIV',
'DL', 'DT', 'E M', 'FIELDSET', 'FONT', 'FORM', 'FRAME', 'FRAMESET', 'H1', 'H2',
'H3', 'H4', 'H5', 'H6', 'HEAD', 'HR', 'HTML', 'I', 'IFRAME ', 'IMG',
'INPUT', 'INS', 'ISINDEX', 'KBD', 'LABEL', 'LEGEND', 'LI', 'LINK', 'MAP',
'MENU', 'META', 'NOFRAMES', 'NOSCRIPT', 'OBJECT', 'OL', 'OPTGROUP', 'OPTION',
'P', 'PARAM', 'PRE', 'Q', 'S', 'SAMP', 'SCRIPT', 'SELECT', 'SMALL', 'SPAN',
'STRIKE', 'STRONG', 'STYLE', 'SUB', 'SUP', 'TABLE', 'TBODY', 'TD',
'TEXTAREA', 'TFOOT', 'TH', 'THEAD', 'TITLE', 'TR', 'TT', 'U', 'UL', 'VAR']
# Reduced subset of only the most common tags
commontags = ['A', 'B', 'BLOCKQUOTE', 'BODY', 'BR', 'BUTTON', 'CAPTION',
'CENTER', 'CITE', 'CODE', 'COL', 'DD', 'DIV',
'DL', 'DT', 'EM', 'FIELDSET', 'FONT', 'FORM', 'FRAME', 'FRAMESET', 'H1', 'H2',
'H3', 'H4', 'H5', 'H6', 'HEAD', 'HR', 'HTML', 'IFRAME ', 'IMG',
'INPUT', 'INS', 'LABEL', 'LEGEND', 'LI', 'LINK', 'MAP',
'MENU', 'META', 'NOFRAMES', 'NOSCRIPT', 'OBJECT', 'OL', 'OPTION',
'P', 'PRE', 'SCRIPT', 'SELECT', 'SMALL', 'SPAN',
'STRIKE', 'STRONG', 'STYLE', 'SUB', 'SUP', 'TABLE', 'TBODY', 'TD',
'TEXTAREA', 'TFOOT', 'TH', 'THEAD', 'TITLE', 'TR', 'TT', 'U', 'UL']
# The tags we are intested in using for our analysis
thetags = ['A', 'DIV', 'FRAME', 'H1', 'H2',
'H3', 'H4', 'IFRAME ', 'INPUT',
'LABEL','LI', 'P', 'SCRIPT', 'SPAN',
'STYLE', 'TR']
def compute_probability_matrix(dataset):
"""
Compute the probability matrix based on the input dataset.
:dataset: an array of pairs representing the parent child relationships.
"""
matrix = numpy.zeros((len(thetags) + 1, len(thetags) + 1))
for data in dataset:
x = data[0].upper()
y = data[1].upper()
try:
x = thetags.index(x)
except:
x = len(thetags)
try:
y = thetags.index(y)
except:
y = len(thetags)
matrix[x,y] += 1
for x in xrange(len(thetags) + 1):
possibilities = 0
for y in matrix[x]:
possibilities += y
for i in xrange(len(matrix[x])):
if possibilities != 0:
matrix[x][i] = matrix[x][i]/possibilities
return matrix
def compute_eigenvalues(matrix):
"""
Returns the eigenvalues of the supplied square matrix.
:matrix: must be a square matrix and diagonalizable.
"""
return numpy.linalg.eigvals(matrix)
def readDOM(content=None, filename=None, debug=False):
"""
Parses the DOM of the HTML page and returns an array of parent, child
pairs.
:content: the content of the HTML page to be read.
:filename: the filename to be read from for getting the content of the
page.
"""
try:
from bs4 import BeautifulSoup
except ImportError:
log.err("BeautifulSoup is not installed. This test canno run")
raise Exception
if filename:
f = open(filename)
content = ''.join(f.readlines())
f.close()
if debug:
start = time.time()
print "Running BeautifulSoup on content"
dom = BeautifulSoup(content)
if debug:
print "done in %s" % (time.time() - start)
if debug:
start = time.time()
print "Creating couples matrix"
couples = []
for x in dom.findAll():
couples.append((str(x.parent.name), str(x.name)))
if debug:
print "done in %s" % (time.time() - start)
return couples
def compute_eigenvalues_from_DOM(*arg,**kw):
dom = readDOM(*arg, **kw)
probability_matrix = compute_probability_matrix(dom)
eigenvalues = compute_eigenvalues(probability_matrix)
return eigenvalues
def compute_correlation(matrix_a, matrix_b):
correlation = numpy.vdot(matrix_a, matrix_b)
correlation /= numpy.linalg.norm(matrix_a)*numpy.linalg.norm(matrix_b)
correlation = (correlation + 1)/2
return correlation
def benchmark():
"""
Running some very basic benchmarks on this input data:
Data files:
683 filea.txt
678 fileb.txt
diff file* | wc -l
283
We get such results:
Read file B
Running BeautifulSoup on content
done in 0.768223047256
Creating couples matrix
done in 0.023903131485
--------
total done in 0.796372890472
Read file A
Running BeautifulSoup on content
done in 0.752885818481
Creating couples matrix
done in 0.0163578987122
--------
total done in 0.770951986313
Computing prob matrix
done in 0.0475239753723
Computing eigenvalues
done in 0.00161099433899
Computing prob matrix B
done in 0.0408289432526
Computing eigen B
done in 0.000268936157227
Computing correlation
done in 0.00016713142395
Corelation: 0.999999079331
What this means is that the bottleneck is not in the maths, but is rather
in the computation of the DOM tree matrix.
XXX We should focus on optimizing the parsing of the HTML (this depends on
beautiful soup). Perhaps we can find and alternative to it that is
sufficient for us.
"""
start = time.time()
print "Read file B"
site_a = readDOM(filename='filea.txt', debug=True)
print "--------"
print "total done in %s" % (time.time() - start)
start = time.time()
print "Read file A"
site_b = readDOM(filename='fileb.txt', debug=True)
print "--------"
print "total done in %s" % (time.time() - start)
a = {}
b = {}
start = time.time()
print "Computing prob matrix"
a['matrix'] = compute_probability_matrix(site_a)
print "done in %s" % (time.time() - start)
start = time.time()
print "Computing eigenvalues"
a['eigen'] = compute_eigenvalues(a['matrix'])
print "done in %s" % (time.time() - start)
start = time.time()
start = time.time()
print "Computing prob matrix B"
b['matrix'] = compute_probability_matrix(site_b)
print "done in %s" % (time.time() - start)
start = time.time()
print "Computing eigen B"
b['eigen'] = compute_eigenvalues(b['matrix'])
print "done in %s" % (time.time() - start)
start = time.time()
print "Computing correlation"
correlation = compute_correlation(a['eigen'], b['eigen'])
print "done in %s" % (time.time() - start)
print "Corelation: %s" % correlation
#benchmark()
ooniprobe-1.3.2/ooni/kit/__init__.py 0000644 0001750 0001750 00000000060 12373757527 015531 0 ustar irl irl #__all__ = ['domclass']
#from . import domclass
ooniprobe-1.3.2/ooni/kit/daphn3.py 0000644 0001750 0001750 00000015033 12463144534 015141 0 ustar irl irl import yaml
from twisted.internet import protocol, defer
from ooni.utils import log
def read_pcap(filename):
"""
@param filename: Filesystem path to the pcap.
Returns:
[{"client": "\x17\x52\x15"}, {"server": "\x17\x15\x13"}]
"""
from scapy.all import IP, Raw, rdpcap
packets = rdpcap(filename)
checking_first_packet = True
client_ip_addr = None
server_ip_addr = None
ssl_packets = []
messages = []
"""
pcap assumptions:
pcap only contains packets exchanged between a Tor client and a Tor
server. (This assumption makes sure that there are only two IP addresses
in the pcap file)
The first packet of the pcap is sent from the client to the server. (This
assumption is used to get the IP address of the client.)
All captured packets are TLS packets: that is TCP session
establishment/teardown packets should be filtered out (no SYN/SYN+ACK)
"""
"""
Minimally validate the pcap and also find out what's the client
and server IP addresses.
"""
for packet in packets:
if checking_first_packet:
client_ip_addr = packet[IP].src
checking_first_packet = False
else:
if packet[IP].src != client_ip_addr:
server_ip_addr = packet[IP].src
try:
if (packet[Raw]):
ssl_packets.append(packet)
except IndexError:
pass
"""Form our list."""
for packet in ssl_packets:
if packet[IP].src == client_ip_addr:
messages.append({"client": str(packet[Raw])})
elif packet[IP].src == server_ip_addr:
messages.append({"server": str(packet[Raw])})
else:
raise("Detected third IP address! pcap is corrupted.")
return messages
def read_yaml(filename):
f = open(filename)
obj = yaml.safe_load(f)
f.close()
return obj
class NoInputSpecified(Exception):
pass
class StepError(Exception):
pass
def daphn3MutateString(string, i):
"""
Takes a string and mutates the ith bytes of it.
"""
mutated = ""
for y in range(len(string)):
if y == i:
mutated += chr(ord(string[i]) + 1)
else:
mutated += string[y]
return mutated
def daphn3Mutate(steps, step_idx, mutation_idx):
"""
Take a set of steps and a step index and mutates the step of that
index at the mutation_idx'th byte.
"""
mutated_steps = []
for idx, step in enumerate(steps):
if idx == step_idx:
step_string = step.values()[0]
step_key = step.keys()[0]
mutated_string = daphn3MutateString(step_string,
mutation_idx)
mutated_steps.append({step_key: mutated_string})
else:
mutated_steps.append(step)
return mutated_steps
class Daphn3Protocol(protocol.Protocol):
steps = None
role = "client"
report = None
# We use this index to keep track of where we are in the state machine
current_step = 0
current_data_received = 0
# We use this to keep track of the mutated steps
mutated_steps = None
d = defer.Deferred()
def _current_step_role(self):
return self.steps[self.current_step].keys()[0]
def _current_step_data(self):
step_idx, mutation_idx = self.factory.mutation
log.debug("Mutating %s %s" % (step_idx, mutation_idx))
mutated_step = daphn3Mutate(self.steps,
step_idx, mutation_idx)
log.debug("Mutated packet into %s" % mutated_step)
return mutated_step[self.current_step].values()[0]
def sendPayload(self):
self.debug("Sending payload")
current_step_role = self._current_step_role()
current_step_data = self._current_step_data()
if current_step_role == self.role:
print "In a state to do shit %s" % current_step_data
self.transport.write(current_step_data)
self.nextStep()
else:
print "Not in a state to do anything"
def connectionMade(self):
print "Got connection"
def debug(self, msg):
log.debug("Current step %s" % self.current_step)
log.debug("Current data received %s" % self.current_data_received)
log.debug("Current role %s" % self.role)
log.debug("Current steps %s" % self.steps)
log.debug("Current step data %s" % self._current_step_data())
def nextStep(self):
"""
XXX this method is overwritten individually by client and server transport.
There is probably a smarter way to do this and refactor the common
code into one place, but for the moment like this is good.
"""
pass
def dataReceived(self, data):
current_step_role = self.steps[self.current_step].keys()[0]
log.debug("Current step role %s" % current_step_role)
if current_step_role == self.role:
log.debug("Got a state error!")
raise StepError("I should not have gotten data, while I did, \
perhaps there is something wrong with the state machine?")
self.current_data_received += len(data)
expected_data_in_this_state = len(self.steps[self.current_step].values()[0])
log.debug("Current data received %s" % self.current_data_received)
if self.current_data_received >= expected_data_in_this_state:
self.nextStep()
def nextMutation(self):
log.debug("Moving onto next mutation")
# [step_idx, mutation_idx]
c_step_idx, c_mutation_idx = self.factory.mutation
log.debug("[%s]: c_step_idx: %s | c_mutation_idx: %s" % (self.role,
c_step_idx, c_mutation_idx))
if c_step_idx >= (len(self.steps) - 1):
log.err("No censorship fingerprint bisected.")
log.err("Givinig up.")
self.transport.loseConnection()
return
# This means we have mutated all bytes in the step
# we should proceed to mutating the next step.
log.debug("steps: %s | %s" % (self.steps, self.steps[c_step_idx]))
if c_mutation_idx >= (len(self.steps[c_step_idx].values()[0]) - 1):
log.debug("Finished mutating step")
# increase step
self.factory.mutation[0] += 1
# reset mutation idx
self.factory.mutation[1] = 0
else:
log.debug("Mutating next byte in step")
# increase mutation index
self.factory.mutation[1] += 1
def connectionLost(self, reason):
self.debug("--- Lost the connection ---")
self.nextMutation()
ooniprobe-1.3.2/ooni/report/ 0000755 0001750 0001750 00000000000 12623630152 014126 5 ustar irl irl ooniprobe-1.3.2/ooni/report/tool.py 0000644 0001750 0001750 00000005374 12545443435 015477 0 ustar irl irl import yaml
from twisted.internet import defer
from ooni.reporter import OONIBReporter, OONIBReportLog
from ooni.utils import log
from ooni.report import parser
from ooni.settings import config
from ooni.oonibclient import OONIBClient
@defer.inlineCallbacks
def upload(report_file, collector=None, bouncer=None):
oonib_report_log = OONIBReportLog()
print "Attempting to upload %s" % report_file
with open(config.report_log_file) as f:
report_log = yaml.safe_load(f)
report = parser.ReportLoader(report_file)
if bouncer and not collector:
oonib_client = OONIBClient(bouncer)
net_tests = [{
'test-helpers': [],
'input-hashes': report.header['input_hashes'],
'name': report.header['test_name'],
'version': report.header['test_version'],
}]
result = yield oonib_client.lookupTestCollector(
net_tests
)
collector = str(result['net-tests'][0]['collector'])
if collector is None:
try:
collector = report_log[report_file]['collector']
if collector is None:
raise KeyError
except KeyError:
raise Exception(
"No collector or bouncer specified"
" and collector not in report log."
)
oonib_reporter = OONIBReporter(report.header, collector)
log.msg("Creating report for %s with %s" % (report_file, collector))
report_id = yield oonib_reporter.createReport()
yield oonib_report_log.created(report_file, collector, report_id)
for entry in report:
print "Writing entry"
yield oonib_reporter.writeReportEntry(entry)
log.msg("Closing report.")
yield oonib_reporter.finish()
yield oonib_report_log.closed(report_file)
@defer.inlineCallbacks
def upload_all(collector=None, bouncer=None):
oonib_report_log = OONIBReportLog()
for report_file, value in oonib_report_log.reports_to_upload:
try:
yield upload(report_file, collector, bouncer)
except Exception as exc:
log.exception(exc)
def print_report(report_file, value):
print "* %s" % report_file
print " %s" % value['created_at']
def status():
oonib_report_log = OONIBReportLog()
print "Reports to be uploaded"
print "----------------------"
for report_file, value in oonib_report_log.reports_to_upload:
print_report(report_file, value)
print "Reports in progress"
print "-------------------"
for report_file, value in oonib_report_log.reports_in_progress:
print_report(report_file, value)
print "Incomplete reports"
print "------------------"
for report_file, value in oonib_report_log.reports_incomplete:
print_report(report_file, value)
ooniprobe-1.3.2/ooni/report/__init__.py 0000644 0001750 0001750 00000000026 12531110611 016224 0 ustar irl irl __version__ = "0.1.0"
ooniprobe-1.3.2/ooni/report/parser.py 0000644 0001750 0001750 00000001302 12545443435 016001 0 ustar irl irl import yaml
class ReportLoader(object):
_header_keys = (
'probe_asn',
'probe_cc',
'probe_ip',
'start_time',
'test_name',
'test_version',
'options',
'input_hashes',
'software_name',
'software_version'
)
def __init__(self, report_filename):
self._fp = open(report_filename)
self._yfp = yaml.safe_load_all(self._fp)
self.header = self._yfp.next()
def __iter__(self):
return self
def next(self):
try:
self._yfp.next()
except StopIteration:
self.close()
raise StopIteration
def close(self):
self._fp.close()
ooniprobe-1.3.2/ooni/report/cli.py 0000644 0001750 0001750 00000004363 12560161671 015262 0 ustar irl irl from __future__ import print_function
import os
import sys
from ooni.report import __version__
from ooni.report import tool
from ooni.settings import config
from twisted.python import usage
class Options(usage.Options):
synopsis = """%s [options] upload | status
""" % (os.path.basename(sys.argv[0]),)
optParameters = [
["configfile", "f", None,
"Specify the configuration file to use."],
["collector", "c", None,
"Specify the collector to upload the result to."],
["bouncer", "b", "httpo://nkvphnp3p6agi5qq.onion",
"Specify the bouncer to query for a collector."]
]
def opt_version(self):
print("oonireport version: %s" % __version__)
sys.exit(0)
def parseArgs(self, *args):
if len(args) == 0:
raise usage.UsageError(
"Must specify at least one command"
)
return
self['command'] = args[0]
if self['command'] not in ("upload", "status"):
raise usage.UsageError(
"Must specify either command upload or status"
)
if self['command'] == "upload":
try:
self['report_file'] = args[1]
except IndexError:
self['report_file'] = None
def tor_check():
if not config.tor.socks_port:
print("Currently oonireport requires that you start Tor yourself "
"and set the socks_port inside of ooniprobe.conf")
sys.exit(1)
def run():
options = Options()
try:
options.parseOptions()
except Exception as exc:
print("Error: %s" % exc)
print(options)
sys.exit(2)
config.global_options = dict(options)
config.set_paths()
config.read_config_file()
if options['command'] == "upload" and options['report_file']:
tor_check()
return tool.upload(options['report_file'],
options['collector'],
options['bouncer'])
elif options['command'] == "upload":
tor_check()
return tool.upload_all(options['collector'],
options['bouncer'])
elif options['command'] == "status":
return tool.status()
else:
print(options)
ooniprobe-1.3.2/ooni/otime.py 0000644 0001750 0001750 00000004155 12447563404 014320 0 ustar irl irl from datetime import datetime, timedelta, tzinfo
class UTC(tzinfo):
"""UTC"""
ZERO = timedelta(0)
def utcoffset(self, dt):
return self.ZERO
def tzname(self, dt):
return "UTC"
def dst(self, dt):
return self.ZERO
def prettyDateNow():
"""
Returns a good looking string for the local time.
"""
return datetime.now().ctime()
def utcPrettyDateNow():
"""
Returns a good looking string for utc time.
"""
return datetime.utcnow().ctime()
class InvalidTimestampFormat(Exception):
pass
def fromTimestamp(s):
"""
Converts a string that is output from the timestamp function back to a
datetime object
Args:
s (str): a ISO8601 formatted string.
ex. 1912-06-23T101234Z"
Note: we currently only support parsing strings that are generated from the
timestamp function and have no intention in supporting the full standard.
"""
try:
date_part, time_part = s.split('T')
hours, minutes, seconds = time_part[:2], time_part[2:4], time_part[4:6]
year, month, day = date_part.split('-')
except:
raise InvalidTimestampFormat(s)
return datetime(int(year), int(month), int(day), int(hours), int(minutes),
int(seconds))
def timestamp(t=None):
"""
The timestamp for ooni reports follows ISO 8601 in
UTC time format.
We do not inlcude ':' and include seconds.
Example:
if the current date is "10:12:34 AM, June 23 1912" (datetime(1912, 6,
23, 10, 12, 34))
the timestamp will be:
"1912-06-23T101234Z"
Args:
t (datetime): a datetime object representing the
time to be represented (*MUST* be expressed
in UTC).
If not specified will default to the current time
in UTC.
"""
if not t:
t = datetime.utcnow()
ISO8601 = "%Y-%m-%dT%H%M%SZ"
return t.strftime(ISO8601)
def epochToTimestamp(seconds):
return timestamp(datetime.fromtimestamp(seconds, UTC()))
def epochToUTC(seconds):
return float(datetime.utcfromtimestamp(seconds).strftime("%s"))
ooniprobe-1.3.2/ooni/templates/ 0000755 0001750 0001750 00000000000 12623630152 014611 5 ustar irl irl ooniprobe-1.3.2/ooni/templates/__init__.py 0000644 0001750 0001750 00000000000 12373757552 016730 0 ustar irl irl ooniprobe-1.3.2/ooni/templates/httpt.py 0000644 0001750 0001750 00000030017 12533065720 016332 0 ustar irl irl import random
from twisted.internet import defer
from txtorcon.interface import StreamListenerMixin
from twisted.internet import reactor
from twisted.internet.endpoints import TCP4ClientEndpoint
from ooni.utils.trueheaders import TrueHeadersAgent, TrueHeadersSOCKS5Agent
from ooni.nettest import NetTestCase
from ooni.utils import log
from ooni.settings import config
from ooni.utils.net import BodyReceiver, StringProducer, userAgents
from ooni.utils.trueheaders import TrueHeaders
from ooni.errors import handleAllFailures
class InvalidSocksProxyOption(Exception):
pass
class StreamListener(StreamListenerMixin):
def __init__(self, request):
self.request = request
def stream_succeeded(self, stream):
host=self.request['url'].split('/')[2]
try:
if stream.target_host == host and len(self.request['tor']) == 1:
self.request['tor']['exit_ip'] = stream.circuit.path[-1].ip
self.request['tor']['exit_name'] = stream.circuit.path[-1].name
config.tor_state.stream_listeners.remove(self)
except:
log.err("Tor Exit ip detection failed")
class HTTPTest(NetTestCase):
"""
A utility class for dealing with HTTP based testing. It provides methods to
be overriden for dealing with HTTP based testing.
The main functions to look at are processResponseBody and
processResponseHeader that are invoked once the headers have been received
and once the request body has been received.
To perform requests over Tor you will have to use the special URL schema
"shttp". For example to request / on example.com you will have to do
specify as URL "shttp://example.com/".
XXX all of this requires some refactoring.
"""
name = "HTTP Test"
version = "0.1.1"
randomizeUA = False
followRedirects = False
baseParameters = [['socksproxy', 's', None,
'Specify a socks proxy to use for requests (ip:port)']]
def _setUp(self):
super(HTTPTest, self)._setUp()
try:
import OpenSSL
except:
log.err("Warning! pyOpenSSL is not installed. https websites will "
"not work")
self.control_agent = TrueHeadersSOCKS5Agent(reactor,
proxyEndpoint=TCP4ClientEndpoint(reactor, '127.0.0.1',
config.tor.socks_port))
self.report['socksproxy'] = None
sockshost, socksport = (None, None)
if self.localOptions['socksproxy']:
try:
sockshost, socksport = self.localOptions['socksproxy'].split(':')
self.report['socksproxy'] = self.localOptions['socksproxy']
except ValueError:
raise InvalidSocksProxyOption
socksport = int(socksport)
self.agent = TrueHeadersSOCKS5Agent(reactor,
proxyEndpoint=TCP4ClientEndpoint(reactor, sockshost,
socksport))
else:
self.agent = TrueHeadersAgent(reactor)
self.report['agent'] = 'agent'
if self.followRedirects:
try:
from twisted.web.client import RedirectAgent
self.control_agent = RedirectAgent(self.control_agent)
self.agent = RedirectAgent(self.agent)
self.report['agent'] = 'redirect'
except:
log.err("Warning! You are running an old version of twisted"\
"(<= 10.1). I will not be able to follow redirects."\
"This may make the testing less precise.")
self.processInputs()
log.debug("Finished test setup")
def randomize_useragent(self, request):
user_agent = random.choice(userAgents)
request['headers']['User-Agent'] = [user_agent]
def processInputs(self):
pass
def addToReport(self, request, response=None, response_body=None, failure_string=None):
"""
Adds to the report the specified request and response.
Args:
request (dict): A dict describing the request that was made
response (instance): An instance of
:class:twisted.web.client.Response.
Note: headers is our modified True Headers version.
failure (instance): An instance of :class:twisted.internet.failure.Failure
"""
log.debug("Adding %s to report" % request)
request_headers = TrueHeaders(request['headers'])
request_response = {
'request': {
'headers': list(request_headers.getAllRawHeaders()),
'body': request['body'],
'url': request['url'],
'method': request['method'],
'tor': request['tor']
}
}
if response:
request_response['response'] = {
'headers': list(response.headers.getAllRawHeaders()),
'body': response_body,
'code': response.code
}
if failure_string:
request_response['failure'] = failure_string
self.report['requests'].append(request_response)
def _processResponseBody(self, response_body, request, response, body_processor):
log.debug("Processing response body")
HTTPTest.addToReport(self, request, response, response_body)
if body_processor:
body_processor(response_body)
else:
self.processResponseBody(response_body)
response.body = response_body
return response
def _processResponseBodyFail(self, failure, request, response):
failure_string = handleAllFailures(failure)
HTTPTest.addToReport(self, request, response,
failure_string=failure_string)
return response
def processResponseBody(self, body):
"""
Overwrite this method if you wish to interact with the response body of
every request that is made.
Args:
body (str): The body of the HTTP response
"""
pass
def processResponseHeaders(self, headers):
"""
This should take care of dealing with the returned HTTP headers.
Args:
headers (dict): The returned header fields.
"""
pass
def processRedirect(self, location):
"""
Handle a redirection via a 3XX HTTP status code.
Here you may place logic that evaluates the destination that you are
being redirected to. Matches against known censor redirects, etc.
Note: if self.followRedirects is set to True, then this method will
never be called.
XXX perhaps we may want to hook _handleResponse in RedirectAgent to
call processRedirect every time we get redirected.
Args:
location (str): the url that we are being redirected to.
"""
pass
def _cbResponse(self, response, request,
headers_processor, body_processor):
"""
This callback is fired once we have gotten a response for our request.
If we are using a RedirectAgent then this will fire once we have
reached the end of the redirect chain.
Args:
response (:twisted.web.iweb.IResponse:): a provider for getting our response
request (dict): the dict containing our response (XXX this should be dropped)
header_processor (func): a function to be called with argument a
dict containing the response headers. This will lead
self.headerProcessor to not be called.
body_processor (func): a function to be called with as argument the
body of the response. This will lead self.bodyProcessor to not
be called.
"""
if not response:
log.err("Got no response for request %s" % request)
HTTPTest.addToReport(self, request, response)
return
else:
log.debug("Got response %s" % response)
if str(response.code).startswith('3'):
self.processRedirect(response.headers.getRawHeaders('Location')[0])
# [!] We are passing to the headers_processor the headers dict and
# not the Headers() object
response_headers_dict = list(response.headers.getAllRawHeaders())
if headers_processor:
headers_processor(response_headers_dict)
else:
self.processResponseHeaders(response_headers_dict)
try:
content_length = int(response.headers.getRawHeaders('content-length')[0])
except Exception:
content_length = None
finished = defer.Deferred()
response.deliverBody(BodyReceiver(finished, content_length))
finished.addCallback(self._processResponseBody, request,
response, body_processor)
finished.addErrback(self._processResponseBodyFail, request,
response)
return finished
def doRequest(self, url, method="GET",
headers={}, body=None, headers_processor=None,
body_processor=None, use_tor=False):
"""
Perform an HTTP request with the specified method and headers.
Args:
url (str): the full URL of the request. The scheme may be either
http, https, or httpo for http over Tor Hidden Service.
Kwargs:
method (str): the HTTP method name to use for the request
headers (dict): the request headers to send
body (str): the request body
headers_processor : a function to be used for processing the HTTP
header responses (defaults to self.processResponseHeaders).
This function takes as argument the HTTP headers as a dict.
body_processory: a function to be used for processing the HTTP
response body (defaults to self.processResponseBody). This
function takes the response body as an argument.
use_tor (bool): specify if the HTTP request should be done over Tor
or not.
"""
# We prefix the URL with 's' to make the connection go over the
# configured socks proxy
if use_tor:
log.debug("Using Tor for the request to %s" % url)
agent = self.control_agent
else:
agent = self.agent
if self.localOptions['socksproxy']:
log.debug("Using SOCKS proxy %s for request" % (self.localOptions['socksproxy']))
log.debug("Performing request %s %s %s" % (url, method, headers))
request = {}
request['method'] = method
request['url'] = url
request['headers'] = headers
request['body'] = body
request['tor'] = {}
if use_tor:
request['tor']['is_tor'] = True
else:
request['tor']['is_tor'] = False
if self.randomizeUA:
log.debug("Randomizing user agent")
self.randomize_useragent(request)
if 'requests' not in self.report:
self.report['requests'] = []
# If we have a request body payload, set the request body to such
# content
if body:
body_producer = StringProducer(request['body'])
else:
body_producer = None
headers = TrueHeaders(request['headers'])
def errback(failure, request):
if request['tor']['is_tor']:
log.err("Error performing torified request: %s" % request['url'])
else:
log.err("Error performing request: %s" % request['url'])
failure_string = handleAllFailures(failure)
self.addToReport(request, failure_string=failure_string)
return failure
if use_tor:
state = config.tor_state
if state:
state.add_stream_listener(StreamListener(request))
d = agent.request(request['method'], request['url'], headers,
body_producer)
d.addErrback(errback, request)
d.addCallback(self._cbResponse, request, headers_processor,
body_processor)
return d
ooniprobe-1.3.2/ooni/templates/dnst.py 0000644 0001750 0001750 00000015506 12461441644 016150 0 ustar irl irl # -*- encoding: utf-8 -*-
#
# :authors: Arturo Filastò
# :licence: see LICENSE
from twisted.internet import udp, error, base
from twisted.internet.defer import TimeoutError
from twisted.names import client, dns
from twisted.names.client import Resolver
from ooni.utils import log
from ooni.nettest import NetTestCase
from ooni.errors import failureToString
import socket
from socket import gaierror
dns.DNSDatagramProtocol.noisy = False
def _bindSocket(self):
"""
_bindSocket taken from Twisted 13.1.0 to suppress logging.
"""
try:
skt = self.createInternetSocket()
skt.bind((self.interface, self.port))
except socket.error as le:
raise error.CannotListenError(self.interface, self.port, le)
# Make sure that if we listened on port 0, we update that to
# reflect what the OS actually assigned us.
self._realPortNumber = skt.getsockname()[1]
# Here we remove the logging.
# log.msg("%s starting on %s" % (
# self._getLogPrefix(self.protocol), self._realPortNumber))
self.connected = 1
self.socket = skt
self.fileno = self.socket.fileno
udp.Port._bindSocket = _bindSocket
def connectionLost(self, reason=None):
"""
Taken from Twisted 13.1.0 to suppress log.msg printing.
"""
# Here we remove the logging.
# log.msg('(UDP Port %s Closed)' % self._realPortNumber)
self._realPortNumber = None
base.BasePort.connectionLost(self, reason)
self.protocol.doStop()
self.socket.close()
del self.socket
del self.fileno
if hasattr(self, "d"):
self.d.callback(None)
del self.d
udp.Port.connectionLost = connectionLost
def representAnswer(answer):
# We store the resource record and the answer payload in a
# tuple
return (repr(answer), repr(answer.payload))
class DNSTest(NetTestCase):
name = "Base DNS Test"
version = 0.1
requiresRoot = False
queryTimeout = [1]
def _setUp(self):
super(DNSTest, self)._setUp()
self.report['queries'] = []
def performPTRLookup(self, address, dns_server = None):
"""
Does a reverse DNS lookup on the input ip address
:address: the IP Address as a dotted quad to do a reverse lookup on.
:dns_server: is the dns_server that should be used for the lookup as a
tuple of ip port (ex. ("127.0.0.1", 53))
if None, system dns settings will be used
"""
ptr = '.'.join(address.split('.')[::-1]) + '.in-addr.arpa'
return self.dnsLookup(ptr, 'PTR', dns_server)
def performALookup(self, hostname, dns_server = None):
"""
Performs an A lookup and returns an array containg all the dotted quad
IP addresses in the response.
:hostname: is the hostname to perform the A lookup on
:dns_server: is the dns_server that should be used for the lookup as a
tuple of ip port (ex. ("127.0.0.1", 53))
if None, system dns settings will be used
"""
return self.dnsLookup(hostname, 'A', dns_server)
def performNSLookup(self, hostname, dns_server = None):
"""
Performs a NS lookup and returns an array containg all nameservers in
the response.
:hostname: is the hostname to perform the NS lookup on
:dns_server: is the dns_server that should be used for the lookup as a
tuple of ip port (ex. ("127.0.0.1", 53))
if None, system dns settings will be used
"""
return self.dnsLookup(hostname, 'NS', dns_server)
def performSOALookup(self, hostname, dns_server = None):
"""
Performs a SOA lookup and returns the response (name,serial).
:hostname: is the hostname to perform the SOA lookup on
:dns_server: is the dns_server that should be used for the lookup as a
tuple of ip port (ex. ("127.0.0.1", 53))
if None, system dns settings will be used
"""
return self.dnsLookup(hostname,'SOA',dns_server)
def dnsLookup(self, hostname, dns_type, dns_server = None):
"""
Performs a DNS lookup and returns the response.
:hostname: is the hostname to perform the DNS lookup on
:dns_type: type of lookup 'NS'/'A'/'SOA'
:dns_server: is the dns_server that should be used for the lookup as a
tuple of ip port (ex. ("127.0.0.1", 53))
"""
types={'NS':dns.NS,'A':dns.A,'SOA':dns.SOA,'PTR':dns.PTR}
dnsType=types[dns_type]
query = [dns.Query(hostname, dnsType, dns.IN)]
def gotResponse(message):
log.debug(dns_type+" Lookup successful")
log.debug(str(message))
addrs = []
answers = []
if dns_server:
msg = message.answers
else:
msg = message[0]
for answer in msg:
if answer.type is dnsType:
if dnsType is dns.SOA:
addr = (answer.name.name,answer.payload.serial)
elif dnsType in [dns.NS,dns.PTR]:
addr = answer.payload.name.name
elif dnsType is dns.A:
addr = answer.payload.dottedQuad()
else:
addr = None
addrs.append(addr)
answers.append(representAnswer(answer))
DNSTest.addToReport(self, query, resolver=dns_server, query_type=dns_type,
answers=answers, addrs=addrs)
return addrs
def gotError(failure):
failure.trap(gaierror, TimeoutError)
DNSTest.addToReport(self, query, resolver=dns_server, query_type=dns_type,
failure=failure)
return failure
if dns_server:
resolver = Resolver(servers=[dns_server])
d = resolver.queryUDP(query, timeout=self.queryTimeout)
else:
lookupFunction={'NS':client.lookupNameservers, 'SOA':client.lookupAuthority, 'A':client.lookupAddress, 'PTR':client.lookupPointer}
d = lookupFunction[dns_type](hostname)
d.addCallback(gotResponse)
d.addErrback(gotError)
return d
def addToReport(self, query, resolver=None, query_type=None,
answers=None, name=None, addrs=None, failure=None):
log.debug("Adding %s to report)" % query)
result = {}
result['resolver'] = resolver
result['query_type'] = query_type
result['query'] = repr(query)
if failure:
result['failure'] = failureToString(failure)
if answers:
result['answers'] = answers
if name:
result['name'] = name
if addrs:
result['addrs'] = addrs
self.report['queries'].append(result)
ooniprobe-1.3.2/ooni/templates/tcpt.py 0000644 0001750 0001750 00000006051 12447563404 016150 0 ustar irl irl from twisted.internet import protocol, defer, reactor
from twisted.internet.endpoints import TCP4ClientEndpoint
from ooni.nettest import NetTestCase
from ooni.errors import failureToString
from ooni.utils import log
class TCPSender(protocol.Protocol):
def __init__(self):
self.received_data = ''
self.sent_data = ''
def dataReceived(self, data):
"""
We receive data until the total amount of data received reaches that
which we have sent. At that point we append the received data to the
report and we fire the callback of the test template sendPayload
function.
This is used in pair with a TCP Echo server.
The reason why we put the data received inside of an array is that in
future we may want to expand this to support state and do something
similar to what daphne does, but without the mutation.
XXX Actually daphne will probably be refactored to be a subclass of the
TCP Test Template.
"""
if self.payload_len:
self.received_data += data
def sendPayload(self, payload):
"""
Write the payload to the wire and set the expected size of the payload
we are to receive.
Args:
payload: the data to be sent on the wire.
"""
self.payload_len = len(payload)
self.sent_data = payload
self.transport.write(payload)
class TCPSenderFactory(protocol.Factory):
def buildProtocol(self, addr):
return TCPSender()
class TCPTest(NetTestCase):
name = "Base TCP Test"
version = "0.1"
requiresRoot = False
timeout = 5
address = None
port = None
def _setUp(self):
super(TCPTest, self)._setUp()
self.report['sent'] = []
self.report['received'] = []
def sendPayload(self, payload):
d1 = defer.Deferred()
def closeConnection(proto):
self.report['sent'].append(proto.sent_data)
self.report['received'].append(proto.received_data)
proto.transport.loseConnection()
log.debug("Closing connection")
d1.callback(proto.received_data)
def timedOut(proto):
self.report['failure'] = 'tcp_timed_out_error'
proto.transport.loseConnection()
def errback(failure):
self.report['failure'] = failureToString(failure)
d1.errback(failure)
def connected(proto):
log.debug("Connected to %s:%s" % (self.address, self.port))
proto.report = self.report
proto.deferred = d1
proto.sendPayload(payload)
if self.timeout:
# XXX-Twisted this logic should probably go inside of the protocol
reactor.callLater(self.timeout, closeConnection, proto)
point = TCP4ClientEndpoint(reactor, self.address, self.port)
log.debug("Connecting to %s:%s" % (self.address, self.port))
d2 = point.connect(TCPSenderFactory())
d2.addCallback(connected)
d2.addErrback(errback)
return d1
ooniprobe-1.3.2/ooni/templates/process.py 0000644 0001750 0001750 00000007076 12623613431 016654 0 ustar irl irl from twisted.internet import protocol, defer, reactor
from ooni.nettest import NetTestCase
from ooni.utils import log
class ProcessDirector(protocol.ProcessProtocol):
def __init__(self, d, finished=None, timeout=None, stdin=None):
self.d = d
self.stderr = ""
self.stdout = ""
self.finished = finished
self.timeout = timeout
self.stdin = stdin
self.timer = None
self.exit_reason = None
def cancelTimer(self):
if self.timeout and self.timer:
self.timer.cancel()
self.timer = None
def close(self, reason=None):
self.reason = reason
self.transport.loseConnection()
def resetTimer(self):
if self.timeout is not None:
if self.timer is not None and self.timer.active():
self.timer.cancel()
self.timer = reactor.callLater(self.timeout,
self.close,
"timeout_reached")
def finish(self, exit_reason=None):
if not self.exit_reason:
self.exit_reason = exit_reason
data = {
"stderr": self.stderr,
"stdout": self.stdout,
"exit_reason": self.exit_reason
}
self.d.callback(data)
def shouldClose(self):
if self.finished is None:
return False
return self.finished(self.stdout, self.stderr)
def connectionMade(self):
self.resetTimer()
if self.stdin is not None:
self.transport.write(self.stin)
self.transport.closeStdin()
def outReceived(self, data):
log.debug("STDOUT: %s" % data)
self.stdout += data
if self.shouldClose():
self.close("condition_met")
self.handleRead(data, None)
def errReceived(self, data):
log.debug("STDERR: %s" % data)
self.stderr += data
if self.shouldClose():
self.close("condition_met")
self.handlRead(None, data)
def inConnectionLost(self):
log.debug("inConnectionLost")
# self.d.callback(self.data())
def outConnectionLost(self):
log.debug("outConnectionLost")
def errConnectionLost(self):
log.debug("errConnectionLost")
def processExited(self, reason):
log.debug("Exited %s" % reason)
def processEnded(self, reason):
log.debug("Ended %s" % reason)
self.finish("process_done")
def handleRead(self, stdout, stderr=None):
pass
class ProcessTest(NetTestCase):
name = "Base Process Test"
version = "0.1"
requiresRoot = False
timeout = 5
processDirector = None
def _setUp(self):
super(ProcessTest, self)._setUp()
def processEnded(self, result, command):
log.debug("Finished %s: %s" % (command, result))
key = ' '.join(command)
self.report[key] = {
'stdout': result['stdout'],
'stderr': result['stderr'],
'exit_reason': result['exit_reason']
}
return result
def run(self, command, finished=None, env={}, path=None, usePTY=0):
d = defer.Deferred()
d.addCallback(self.processEnded, command)
self.processDirector = ProcessDirector(d, finished, self.timeout)
self.processDirector.handleRead = self.handleRead
reactor.spawnProcess(self.processDirector, command[0], command, env=env, path=path, usePTY=usePTY)
return d
# handleRead is not an abstract method to be backwards compatible
def handleRead(self, stdout, stderr=None):
pass
ooniprobe-1.3.2/ooni/templates/scapyt.py 0000644 0001750 0001750 00000011212 12463144534 016471 0 ustar irl irl from ooni.nettest import NetTestCase
from ooni.utils import log
from ooni.settings import config
from ooni.utils.net import hasRawSocketPermission
from ooni.utils.txscapy import ScapySender, ScapyFactory
class BaseScapyTest(NetTestCase):
"""
The report of a test run with scapy looks like this:
report:
sent_packets: [
{
'raw_packet': BASE64Encoding of packet,
'summary': 'IP / TCP 192.168.2.66:ftp_data > 8.8.8.8:http S'
}
]
answered_packets: []
"""
name = "Base Scapy Test"
version = 0.1
requiresRoot = not hasRawSocketPermission()
baseFlags = [
['ipsrc', 's',
'Does *not* check if IP src and ICMP IP citation '
'matches when processing answers'],
['seqack', 'k',
'Check if TCP sequence number and ACK match in the '
'ICMP citation when processing answers'],
['ipid', 'i', 'Check if the IPID matches when processing answers']]
def _setUp(self):
super(BaseScapyTest, self)._setUp()
if config.scapyFactory is None:
log.debug("Scapy factory not set, registering it.")
config.scapyFactory = ScapyFactory(config.advanced.interface)
self.report['answer_flags'] = []
if self.localOptions['ipsrc']:
config.checkIPsrc = 0
else:
self.report['answer_flags'].append('ipsrc')
config.checkIPsrc = 1
if self.localOptions['ipid']:
self.report['answer_flags'].append('ipid')
config.checkIPID = 1
else:
config.checkIPID = 0
# XXX we don't support strict matching
# since (from scapy's documentation), some stacks have a bug for which
# the bytes in the IPID are swapped.
# Perhaps in the future we will want to have more fine grained control
# over this.
if self.localOptions['seqack']:
self.report['answer_flags'].append('seqack')
config.check_TCPerror_seqack = 1
else:
config.check_TCPerror_seqack = 0
self.report['sent_packets'] = []
self.report['answered_packets'] = []
def finishedSendReceive(self, packets):
"""
This gets called when all packets have been sent and received.
"""
answered, unanswered = packets
for snd, rcv in answered:
log.debug("Writing report for scapy test")
sent_packet = snd
received_packet = rcv
if not config.privacy.includeip:
log.debug("Detected you would not like to "
"include your ip in the report")
log.debug(
"Stripping source and destination IPs from the reports")
sent_packet.src = '127.0.0.1'
received_packet.dst = '127.0.0.1'
self.report['sent_packets'].append(sent_packet)
self.report['answered_packets'].append(received_packet)
return packets
def sr(self, packets, timeout=None, *arg, **kw):
"""
Wrapper around scapy.sendrecv.sr for sending and receiving of packets
at layer 3.
"""
scapySender = ScapySender(timeout=timeout)
config.scapyFactory.registerProtocol(scapySender)
log.debug("Using sending with hash %s" % scapySender.__hash__)
d = scapySender.startSending(packets)
d.addCallback(self.finishedSendReceive)
return d
def sr1(self, packets, *arg, **kw):
def done(packets):
"""
We do this so that the returned value is only the one packet that
we expected a response for, identical to the scapy implementation
of sr1.
"""
try:
return packets[0][0][1]
except IndexError:
log.err("Got no response...")
return packets
scapySender = ScapySender()
scapySender.expected_answers = 1
config.scapyFactory.registerProtocol(scapySender)
log.debug("Running sr1")
d = scapySender.startSending(packets)
log.debug("Started to send")
d.addCallback(self.finishedSendReceive)
d.addCallback(done)
return d
def send(self, packets, *arg, **kw):
"""
Wrapper around scapy.sendrecv.send for sending of packets at layer 3
"""
scapySender = ScapySender()
config.scapyFactory.registerProtocol(scapySender)
scapySender.startSending(packets)
scapySender.stopSending()
for sent_packet in packets:
self.report['sent_packets'].append(sent_packet)
ScapyTest = BaseScapyTest
ooniprobe-1.3.2/ooni/api/ 0000755 0001750 0001750 00000000000 12623630152 013364 5 ustar irl irl ooniprobe-1.3.2/ooni/api/spec.py 0000644 0001750 0001750 00000017106 12463144534 014703 0 ustar irl irl import os
import re
import copy
import json
import types
import tempfile
import functools
from twisted.python import usage
from cyclone import web, escape
from ooni.reporter import YAMLReporter, OONIBReporter, collector_supported
from ooni import errors
from ooni.nettest import NetTestLoader
from ooni.settings import config
class InvalidInputFilename(Exception):
pass
class FilenameExists(Exception):
pass
def check_xsrf(method):
@functools.wraps(method)
def wrapper(self, *args, **kw):
xsrf_header = self.request.headers.get("X-XSRF-TOKEN")
if self.xsrf_token != xsrf_header:
raise web.HTTPError(403, "Invalid XSRF token.")
return method(self, *args, **kw)
return wrapper
class ORequestHandler(web.RequestHandler):
serialize_lists = True
xsrf_cookie_name = "XSRF-TOKEN"
def write(self, chunk):
"""
XXX This is a patch that can be removed once
https://github.com/fiorix/cyclone/pull/92 makes it into a release.
"""
if isinstance(chunk, types.ListType):
chunk = escape.json_encode(chunk)
self.set_header("Content-Type", "application/json")
web.RequestHandler.write(self, chunk)
class Status(ORequestHandler):
@check_xsrf
def get(self):
result = {'active_tests': oonidApplication.director.activeNetTests}
self.write(result)
def list_inputs():
input_list = []
for filename in os.listdir(config.inputs_directory):
input_list.append({'filename': filename})
return input_list
class Inputs(ORequestHandler):
"""
This handler is responsible for listing and adding new inputs.
"""
@check_xsrf
def get(self):
"""
Obtain the list of currently installed inputs. Inputs are stored inside
of $OONI_HOME/inputs/.
"""
input_list = list_inputs()
self.write(input_list)
@check_xsrf
def post(self):
"""
Add a new input to the currently installed inputs.
"""
input_file = self.request.files.get("file")[0]
filename = input_file['filename']
if not filename or not re.match('(\w.*\.\w.*).*', filename):
raise InvalidInputFilename
if os.path.exists(filename):
raise FilenameExists
content_type = input_file["content_type"]
body = input_file["body"]
fn = os.path.join(config.inputs_directory, filename)
with open(os.path.abspath(fn), "w") as fp:
fp.write(body)
class ListTests(ORequestHandler):
@check_xsrf
def get(self):
test_list = copy.deepcopy(oonidApplication.director.netTests)
for test_id in test_list.keys():
test_list[test_id].pop('path')
self.write(test_list)
def get_net_test_loader(test_options, test_file):
"""
Args:
test_options: (dict) containing as keys the option names.
test_file: (string) the path to the test_file to be run.
Returns:
an instance of :class:`ooni.nettest.NetTestLoader` with the specified
test_file and the specified options.
"""
options = []
for k, v in test_options.items():
options.append('--'+k)
options.append(v)
net_test_loader = NetTestLoader(options,
test_file=test_file)
return net_test_loader
def get_reporters(net_test_loader):
"""
Determines which reports are able to run and returns an instance of them.
We always report to flat file via the :class:`ooni.reporters.YAMLReporter`
and the :class:`ooni.reporters.OONIBReporter`.
The later will be used only if we determine that Tor is running.
Returns:
a list of reporter instances
"""
test_details = net_test_loader.testDetails
reporters = []
yaml_reporter = YAMLReporter(test_details, config.reports_directory)
reporters.append(yaml_reporter)
if config.reports.collector and collector_supported(config.reports.collector):
oonib_reporter = OONIBReporter(test_details, config.reports.collector)
reporters.append(oonib_reporter)
return reporters
def write_temporary_input(content):
"""
Creates a temporary file for the given content.
Returns:
the path to the temporary file.
"""
fd, path = tempfile.mkstemp()
with open(path, 'w') as f:
f.write(content)
f.close()
print "This is the path %s" % path
return fd, path
class StartTest(ORequestHandler):
@check_xsrf
def post(self, test_name):
"""
Starts a test with the specified options.
"""
test_file = oonidApplication.director.netTests[test_name]['path']
test_options = json.loads(self.request.body)
tmp_files = []
if ('manual_input' in test_options):
for option, content in test_options['manual_input'].items():
fd, path = write_temporary_input(content)
test_options[option] = path
tmp_files.append((fd, path))
test_options.pop('manual_input')
net_test_loader = get_net_test_loader(test_options, test_file)
try:
net_test_loader.checkOptions()
d = oonidApplication.director.startNetTest(net_test_loader,
get_reporters(net_test_loader))
@d.addBoth
def cleanup(result):
for fd, path in tmp_files:
os.close(fd)
os.remove(path)
except errors.MissingRequiredOption, option_name:
self.write({'error':
'Missing required option: "%s"' % option_name})
except usage.UsageError, e:
self.write({'error':
'Error in parsing options'})
except errors.InsufficientPrivileges:
self.write({'error':
'Insufficient priviledges'})
class StopTest(ORequestHandler):
@check_xsrf
def delete(self, test_name):
pass
def get_test_results(test_id):
"""
Returns:
a list of test dicts that correspond to the test results for the given
test_id.
The dict is made like so:
{
'name': The name of the report,
'content': The content of the report
}
"""
test_results = []
for test_result in os.listdir(config.reports_directory):
if test_result.startswith('report-'+test_id):
with open(os.path.join(config.reports_directory, test_result)) as f:
test_content = ''.join(f.readlines())
test_results.append({'name': test_result,
'content': test_content})
test_results.reverse()
return test_results
class TestStatus(ORequestHandler):
@check_xsrf
def get(self, test_id):
"""
Returns the requested test_id details and the stored results for such
test.
"""
try:
test = copy.deepcopy(oonidApplication.director.netTests[test_id])
test.pop('path')
test['results'] = get_test_results(test_id)
self.write(test)
except KeyError:
self.write({'error':
'Test with such ID not found!'})
config.read_config_file()
oonidAPI = [
(r"/status", Status),
(r"/inputs", Inputs),
(r"/test", ListTests),
(r"/test/(.*)/start", StartTest),
(r"/test/(.*)/stop", StopTest),
(r"/test/(.*)", TestStatus),
(r"/(.*)", web.StaticFileHandler,
{"path": os.path.join(config.data_directory, 'ui', 'app'),
"default_filename": "index.html"})
]
oonidApplication = web.Application(oonidAPI, debug=True)
ooniprobe-1.3.2/ooni/api/__init__.py 0000644 0001750 0001750 00000000000 12373757526 015504 0 ustar irl irl ooniprobe-1.3.2/ooni/tasks.py 0000644 0001750 0001750 00000011075 12447563404 014327 0 ustar irl irl import time
from twisted.internet import defer, reactor
from ooni import errors as e
from ooni.settings import config
from ooni import otime
class BaseTask(object):
_timer = None
_running = None
def __init__(self):
"""
If you want to schedule a task multiple times, remember to create fresh
instances of it.
"""
self.failures = 0
self.startTime = time.time()
self.runtime = 0
# This is a deferred that gets called when a test has reached it's
# final status, this means: all retries have been attempted or the test
# has successfully executed.
# Such deferred will be called on completion by the TaskManager.
self.done = defer.Deferred()
def _failed(self, failure):
self.failures += 1
self.failed(failure)
return failure
def _succeeded(self, result):
self.runtime = time.time() - self.startTime
self.succeeded(result)
return result
def start(self):
self._running = defer.maybeDeferred(self.run)
self._running.addErrback(self._failed)
self._running.addCallback(self._succeeded)
return self._running
def succeeded(self, result):
"""
Place here the logic to handle a successful execution of the task.
"""
pass
def failed(self, failure):
"""
Place in here logic to handle failure.
"""
pass
def run(self):
"""
Override this with the logic of your task.
Must return a deferred.
"""
pass
class TaskWithTimeout(BaseTask):
timeout = 30
# So that we can test the callLater calls
clock = reactor
def _timedOut(self):
"""Internal method for handling timeout failure"""
if self._running:
self._failed(e.TaskTimedOut)
self._running.cancel()
def _cancelTimer(self):
if self._timer.active():
self._timer.cancel()
def _succeeded(self, result):
self._cancelTimer()
return BaseTask._succeeded(self, result)
def _failed(self, failure):
self._cancelTimer()
return BaseTask._failed(self, failure)
def start(self):
self._timer = self.clock.callLater(self.timeout, self._timedOut)
return BaseTask.start(self)
class Measurement(TaskWithTimeout):
def __init__(self, test_instance, test_method, test_input):
"""
test_class:
is the class, subclass of NetTestCase, of the test to be run
test_method:
is a string representing the test method to be called to perform
this measurement
test_input:
is the input to the test
net_test:
a reference to the net_test object such measurement belongs to.
"""
self.testInstance = test_instance
self.testInstance.input = test_input
self.testInstance._setUp()
if not hasattr(self.testInstance, '_start_time'):
self.testInstance._start_time = time.time()
if 'input' not in self.testInstance.report.keys():
self.testInstance.report['input'] = test_input
if 'test_start_time' not in self.testInstance.report.keys():
start_time = otime.epochToUTC(self.testInstance._start_time)
self.testInstance.report['test_start_time'] = start_time
self.testInstance.setUp()
self.netTestMethod = getattr(self.testInstance, test_method)
if 'timeout' in dir(test_instance):
if isinstance(test_instance.timeout, int) or isinstance(test_instance.timeout, float):
# If the test has a timeout option set we set the measurement
# timeout to that value + 8 seconds to give it enough time to
# trigger it's internal timeout before we start trigger the
# measurement timeout.
self.timeout = test_instance.timeout + 8
elif config.advanced.measurement_timeout:
self.timeout = config.advanced.measurement_timeout
TaskWithTimeout.__init__(self)
def succeeded(self, result):
pass
def failed(self, failure):
pass
def run(self):
return self.netTestMethod()
class ReportEntry(TaskWithTimeout):
def __init__(self, reporter, entry):
self.reporter = reporter
self.entry = entry
if config.advanced.reporting_timeout:
self.timeout = config.advanced.reporting_timeout
TaskWithTimeout.__init__(self)
def run(self):
return self.reporter.writeReportEntry(self.entry)
ooniprobe-1.3.2/ooni/__init__.py 0000644 0001750 0001750 00000000362 12623614117 014730 0 ustar irl irl # -*- encoding: utf-8 -*-
__author__ = "Open Observatory of Network Interference"
__version__ = "1.3.2"
__all__ = ['config', 'inputunit', 'kit',
'lib', 'nettest', 'oonicli', 'report', 'reporter',
'templates', 'utils']
ooniprobe-1.3.2/ooni/deckgen/ 0000755 0001750 0001750 00000000000 12623630152 014213 5 ustar irl irl ooniprobe-1.3.2/ooni/deckgen/__init__.py 0000644 0001750 0001750 00000000026 12531110611 016311 0 ustar irl irl __version__ = "0.1.0"
ooniprobe-1.3.2/ooni/deckgen/processors/ 0000755 0001750 0001750 00000000000 12623630152 016415 5 ustar irl irl ooniprobe-1.3.2/ooni/deckgen/processors/__init__.py 0000644 0001750 0001750 00000000000 12447563404 020525 0 ustar irl irl ooniprobe-1.3.2/ooni/deckgen/processors/citizenlab_test_lists.py 0000644 0001750 0001750 00000002707 12531110611 023365 0 ustar irl irl import os
import csv
from ooni.settings import config
def load_input(file_input, file_output):
fw = open(file_output, "w+")
with open(file_input) as f:
csvreader = csv.reader(f)
csvreader.next()
for row in csvreader:
fw.write("%s\n" % row[0])
fw.close()
def generate_country_input(country_code, dst):
"""
Write to dst/citizenlab-urls-{country_code}.txt
the list for the given country code.
Returns:
the path to the generated input
"""
country_code = country_code.lower()
filename = os.path.join(dst, "citizenlab-urls-%s.txt" % country_code)
input_list = config.get_data_file_path("resources/"
"citizenlab-test-lists/"
"test-lists-master/lists/"
+ country_code + ".csv")
if not os.path.exists(input_list):
raise Exception("Could not find list for country %s" % country_code)
load_input(input_list, filename)
return filename
def generate_global_input(dst):
filename = os.path.join(dst, "citizenlab-urls-global.txt")
input_list = config.get_data_file_path("resources/"
"citizenlab-test-lists/"
"test-lists-master/lists/"
"global.csv")
load_input(input_list, filename)
return filename
ooniprobe-1.3.2/ooni/deckgen/processors/namebench_dns_servers.py 0000644 0001750 0001750 00000002636 12474100761 023335 0 ustar irl irl import os
import csv
import GeoIP
from ooni.settings import config
class GeoIPDB(object):
_borg = {}
country = None
asn = None
def __init__(self):
self.__dict__ = self._borg
if not self.country:
try:
country_file = config.get_data_file_path('GeoIP/GeoIP.dat')
self.country = GeoIP.open(country_file,
GeoIP.GEOIP_STANDARD)
except:
raise Exception("Edit the geoip_data_dir line in your config"
" file to point to your geoip files")
def generate_country_input(country_code, dst):
csv_file = config.get_data_file_path("resources/"
"namebench-dns-servers.csv")
filename = os.path.join(dst, "dns-server-%s.txt" % country_code)
fw = open(filename, "w")
geoip_db = GeoIPDB()
reader = csv.reader(open(csv_file))
for row in reader:
if row[2] == 'X-Internal-IP':
continue
elif row[2] == 'X-Unroutable':
continue
elif row[2] == 'X-Link_local':
continue
ipaddr = row[0]
cc = geoip_db.country.country_code_by_addr(ipaddr)
if not cc:
continue
if cc.lower() == country_code.lower():
fw.write(ipaddr + "\n")
fw.close()
return filename
def generate_global_input(dst):
pass
ooniprobe-1.3.2/ooni/deckgen/cli.py 0000644 0001750 0001750 00000011615 12560161671 015345 0 ustar irl irl import os
import sys
import copy
import errno
import yaml
from twisted.internet import defer
from twisted.python import usage
from ooni import errors
from ooni.geoip import ProbeIP
from ooni.settings import config
from ooni.deckgen import __version__
from ooni.resources import inputs
class Options(usage.Options):
synopsis = """%s [options]
""" % sys.argv[0]
optParameters = [
["country-code", "c",
None,
"Specify the two letter country code for which we should "
"generate the deck."
],
["output", "o",
None,
"Specify the directory where to write output."
]
]
def opt_version(self):
print("oonideckgen version: %s" % __version__)
sys.exit(0)
class Deck(object):
_base_entry = {
"options": {
"collector": None,
"help": 0,
"logfile": None,
"no-default-reporter": 0,
"parallelism": None,
"pcapfile": None,
"reportfile": None,
"resume": 0,
"testdeck": None
}
}
def __init__(self):
self.deck = []
def add_test(self, test_file, subargs=[]):
deck_entry = copy.deepcopy(self._base_entry)
deck_entry['options']['test_file'] = test_file
deck_entry['options']['subargs'] = subargs
self.deck.append(deck_entry)
def pprint(self):
print yaml.safe_dump(self.deck)
def write_to_file(self, filename):
with open(filename, "w+") as f:
f.write(yaml.safe_dump(self.deck))
def generate_deck(options):
dns_servers_processor = inputs['namebench-dns-servers.csv']['processor']
url_lists_processor = inputs['citizenlab-test-lists.zip']['processor']
try:
url_list_country = url_lists_processor.generate_country_input(
options['country-code'],
options['output']
)
except Exception:
print "Could not generate country specific url list"
print "We will just use the global one."
url_list_country = None
url_list_global = url_lists_processor.generate_global_input(
options['output']
)
dns_servers = dns_servers_processor.generate_country_input(
options['country-code'],
options['output']
)
deck = Deck()
# deck.add_test('manipulation/http_host', ['-f', 'somefile.txt'])
deck.add_test('blocking/http_requests', ['-f', url_list_global])
deck.add_test('blocking/dns_consistency',
['-f', url_list_global, '-T', dns_servers])
if url_list_country is not None:
deck.add_test('blocking/dns_consistency',
['-f', url_list_country, '-T', dns_servers])
deck.add_test('blocking/http_requests', ['-f', url_list_country])
deck.add_test('manipulation/http_invalid_request_line')
deck.add_test('manipulation/http_header_field_manipulation')
# deck.add_test('manipulation/traceroute')
if config.advanced.debug:
deck.pprint()
deck_filename = os.path.join(options['output'],
"%s-%s-user.deck" % (__version__,
options['country-code']))
deck.write_to_file(deck_filename)
print "Deck written to %s" % deck_filename
print "Run ooniprobe like so:"
print "ooniprobe -i %s" % deck_filename
@defer.inlineCallbacks
def get_user_country_code():
config.privacy.includecountry = True
probe_ip = ProbeIP()
yield probe_ip.lookup()
defer.returnValue(probe_ip.geodata['countrycode'])
@defer.inlineCallbacks
def run():
options = Options()
try:
options.parseOptions()
except usage.UsageError as error_message:
print "%s: %s" % (sys.argv[0], error_message)
print options
sys.exit(1)
if not options['output']:
options['output'] = os.getcwd()
if not options['country-code']:
try:
options['country-code'] = yield get_user_country_code()
except errors.ProbeIPUnknown:
print "Could not determine your IP address."
print "Check your internet connection or specify a country code with -c."
sys.exit(4)
if len(options['country-code']) != 2:
print "%s: --country-code must be 2 characters" % sys.argv[0]
sys.exit(2)
if not os.path.isdir(options['output']):
print "%s: %s is not a directory" % (sys.argv[0],
options['output'])
sys.exit(3)
options['country-code'] = options['country-code'].lower()
output_dir = os.path.abspath(options['output'])
output_dir = os.path.join(output_dir,
"deck-%s" % options['country-code'])
options['output'] = output_dir
try:
os.makedirs(options['output'])
except OSError as exception:
if exception.errno != errno.EEXIST:
raise
generate_deck(options)
ooniprobe-1.3.2/ooni/errors.py 0000644 0001750 0001750 00000020247 12623613431 014507 0 ustar irl irl from twisted.internet.defer import CancelledError
from twisted.internet.defer import TimeoutError as DeferTimeoutError
from twisted.web._newclient import ResponseNeverReceived
from twisted.web.error import Error
from twisted.internet.error import ConnectionRefusedError, TCPTimedOutError
from twisted.internet.error import DNSLookupError, ConnectError, ConnectionLost
from twisted.internet.error import TimeoutError as GenericTimeoutError
from twisted.internet.error import ProcessDone, ConnectionDone
from twisted.python import usage
from txsocksx.errors import SOCKSError
from txsocksx.errors import MethodsNotAcceptedError, AddressNotSupported
from txsocksx.errors import ConnectionError, NetworkUnreachable
from txsocksx.errors import ConnectionLostEarly, ConnectionNotAllowed
from txsocksx.errors import NoAcceptableMethods, ServerFailure
from txsocksx.errors import HostUnreachable, ConnectionRefused
from txsocksx.errors import TTLExpired, CommandNotSupported
from socket import gaierror
def handleAllFailures(failure):
"""
Here we make sure to trap all the failures that are supported by the
failureToString function and we return the the string that represents the
failure.
"""
failure.trap(
ConnectionRefusedError,
gaierror,
DNSLookupError,
TCPTimedOutError,
ResponseNeverReceived,
DeferTimeoutError,
GenericTimeoutError,
SOCKSError,
MethodsNotAcceptedError,
AddressNotSupported,
ConnectionError,
NetworkUnreachable,
ConnectionLostEarly,
ConnectionNotAllowed,
NoAcceptableMethods,
ServerFailure,
HostUnreachable,
ConnectionRefused,
TTLExpired,
CommandNotSupported,
ConnectError,
ConnectionLost,
CancelledError,
ConnectionDone)
return failureToString(failure)
def failureToString(failure):
"""
Given a failure instance return a string representing the kind of error
that occurred.
Args:
failure: a :class:twisted.internet.error instance
Returns:
A string representing the HTTP response error message.
"""
string = None
if isinstance(failure.value, ConnectionRefusedError):
# log.err("Connection refused.")
string = 'connection_refused_error'
elif isinstance(failure.value, ConnectionLost):
# log.err("Connection lost.")
string = 'connection_lost_error'
elif isinstance(failure.value, ConnectError):
# log.err("Connect error.")
string = 'connect_error'
elif isinstance(failure.value, gaierror):
# log.err("Address family for hostname not supported")
string = 'address_family_not_supported_error'
elif isinstance(failure.value, DNSLookupError):
# log.err("DNS lookup failure")
string = 'dns_lookup_error'
elif isinstance(failure.value, TCPTimedOutError):
# log.err("TCP Timed Out Error")
string = 'tcp_timed_out_error'
elif isinstance(failure.value, ResponseNeverReceived):
# log.err("Response Never Received")
string = 'response_never_received'
elif isinstance(failure.value, DeferTimeoutError):
# log.err("Deferred Timeout Error")
string = 'deferred_timeout_error'
elif isinstance(failure.value, GenericTimeoutError):
# log.err("Time Out Error")
string = 'generic_timeout_error'
elif isinstance(failure.value, ServerFailure):
# log.err("SOCKS error: ServerFailure")
string = 'socks_server_failure'
elif isinstance(failure.value, ConnectionNotAllowed):
# log.err("SOCKS error: ConnectionNotAllowed")
string = 'socks_connection_not_allowed'
elif isinstance(failure.value, NetworkUnreachable):
# log.err("SOCKS error: NetworkUnreachable")
string = 'socks_network_unreachable'
elif isinstance(failure.value, HostUnreachable):
# log.err("SOCKS error: HostUnreachable")
string = 'socks_host_unreachable'
elif isinstance(failure.value, ConnectionRefused):
# log.err("SOCKS error: ConnectionRefused")
string = 'socks_connection_refused'
elif isinstance(failure.value, TTLExpired):
# log.err("SOCKS error: TTLExpired")
string = 'socks_ttl_expired'
elif isinstance(failure.value, CommandNotSupported):
# log.err("SOCKS error: CommandNotSupported")
string = 'socks_command_not_supported'
elif isinstance(failure.value, AddressNotSupported):
# log.err("SOCKS error: AddressNotSupported")
string = 'socks_address_not_supported'
elif isinstance(failure.value, SOCKSError):
# log.err("Generic SOCKS error")
string = 'socks_error'
elif isinstance(failure.value, CancelledError):
# log.err("Task timed out")
string = 'task_timed_out'
elif isinstance(failure.value, ProcessDone):
string = 'process_done'
elif isinstance(failure.value, ConnectionDone):
string = 'connection_done'
else:
# log.err("Unknown failure type: %s" % type(failure.value))
string = 'unknown_failure %s' % str(failure.value)
return string
class DirectorException(Exception):
pass
class UnableToStartTor(DirectorException):
pass
class InvalidOONIBCollectorAddress(Exception):
pass
class InvalidOONIBBouncerAddress(Exception):
pass
class AllReportersFailed(Exception):
pass
class GeoIPDataFilesNotFound(Exception):
pass
class ReportNotCreated(Exception):
pass
class ReportAlreadyClosed(Exception):
pass
class TorStateNotFound(Exception):
pass
class TorControlPortNotFound(Exception):
pass
class InsufficientPrivileges(Exception):
pass
class ProbeIPUnknown(Exception):
pass
class NoMoreReporters(Exception):
pass
class TorNotRunning(Exception):
pass
class OONIBError(Exception):
pass
class OONIBInvalidRequest(OONIBError):
pass
class OONIBReportError(OONIBError):
pass
class OONIBReportUpdateError(OONIBReportError):
pass
class OONIBReportCreationError(OONIBReportError):
pass
class OONIBTestDetailsLookupError(OONIBReportError):
pass
class OONIBInputError(OONIBError):
pass
class OONIBInputDescriptorNotFound(OONIBInputError):
pass
class UnableToLoadDeckInput(Exception):
pass
class CouldNotFindTestHelper(Exception):
pass
class CouldNotFindTestCollector(Exception):
pass
class NetTestNotFound(Exception):
pass
class MissingRequiredOption(Exception):
def __init__(self, message, net_test_loader):
super(MissingRequiredOption, self).__init__()
self.net_test_loader = net_test_loader
self.message = message
def __str__(self):
return ','.join(self.message)
class OONIUsageError(usage.UsageError):
def __init__(self, net_test_loader):
super(OONIUsageError, self).__init__()
self.net_test_loader = net_test_loader
class FailureToLoadNetTest(Exception):
pass
class NoPostProcessor(Exception):
pass
class InvalidOption(Exception):
pass
class IncoherentOptions(Exception):
def __init__(self, first_options, second_options):
super(IncoherentOptions, self).__init__()
self.message = "%s is different to %s" % (first_options, second_options)
def __str__(self):
return self.message
class TaskTimedOut(Exception):
pass
class InvalidInputFile(Exception):
pass
class ReporterException(Exception):
pass
class InvalidDestination(ReporterException):
pass
class ReportLogExists(Exception):
pass
class InvalidConfigFile(Exception):
pass
class ConfigFileIncoherent(Exception):
pass
def get_error(error_key):
if error_key == 'test-helpers-key-missing':
return CouldNotFindTestHelper
if error_key == 'input-descriptor-not-found':
return OONIBInputDescriptorNotFound
if error_key == 'invalid-request':
return OONIBInvalidRequest
elif isinstance(error_key, int):
return Error("%d" % error_key)
else:
return OONIBError
class IfaceError(Exception):
pass
class ProtocolNotRegistered(Exception):
pass
class ProtocolAlreadyRegistered(Exception):
pass
class LibraryNotInstalledError(Exception):
pass
ooniprobe-1.3.2/ooni/tests/ 0000755 0001750 0001750 00000000000 12623630152 013755 5 ustar irl irl ooniprobe-1.3.2/ooni/tests/test_trueheaders.py 0000644 0001750 0001750 00000002220 12447563404 017706 0 ustar irl irl from twisted.trial import unittest
from ooni.utils.trueheaders import TrueHeaders
dummy_headers_dict = {
'Header1': ['Value1', 'Value2'],
'Header2': ['ValueA', 'ValueB']
}
dummy_headers_dict2 = {
'Header1': ['Value1', 'Value2'],
'Header2': ['ValueA', 'ValueB'],
'Header3': ['ValueA', 'ValueB'],
}
dummy_headers_dict3 = {
'Header1': ['Value1', 'Value2'],
'Header2': ['ValueA', 'ValueB'],
'Header4': ['ValueA', 'ValueB'],
}
class TestTrueHeaders(unittest.TestCase):
def test_names_match(self):
th = TrueHeaders(dummy_headers_dict)
self.assertEqual(th.getDiff(TrueHeaders(dummy_headers_dict)), set())
def test_names_not_match(self):
th = TrueHeaders(dummy_headers_dict)
self.assertEqual(th.getDiff(TrueHeaders(dummy_headers_dict2)), set(['Header3']))
th = TrueHeaders(dummy_headers_dict3)
self.assertEqual(th.getDiff(TrueHeaders(dummy_headers_dict2)), set(['Header3', 'Header4']))
def test_names_match_expect_ignore(self):
th = TrueHeaders(dummy_headers_dict)
self.assertEqual(th.getDiff(TrueHeaders(dummy_headers_dict2), ignore=['Header3']), set())
ooniprobe-1.3.2/ooni/tests/mocks.py 0000644 0001750 0001750 00000011056 12447563404 015457 0 ustar irl irl from twisted.python import failure
from twisted.internet import defer
from ooni.tasks import BaseTask, TaskWithTimeout
from ooni.managers import TaskManager
class MockMeasurementFailOnce(BaseTask):
def run(self):
f = open('dummyTaskFailOnce.txt', 'w')
f.write('fail')
f.close()
if self.failure >= 1:
return defer.succeed(self)
else:
return defer.fail(failure.Failure)
class MockMeasurementManager(TaskManager):
def __init__(self):
self.successes = []
TaskManager.__init__(self)
def failed(self, failure, task):
pass
def succeeded(self, result, task):
self.successes.append((result, task))
class MockReporter(object):
def __init__(self):
self.created = defer.Deferred()
def writeReportEntry(self, entry):
pass
def createReport(self):
self.created.callback(self)
def finish(self):
pass
class MockFailure(Exception):
pass
# # from test_managers
mockFailure = failure.Failure(MockFailure('mock'))
class MockSuccessTask(BaseTask):
def run(self):
return defer.succeed(42)
class MockFailTask(BaseTask):
def run(self):
return defer.fail(mockFailure)
class MockFailOnceTask(BaseTask):
def run(self):
if self.failures >= 1:
return defer.succeed(42)
else:
return defer.fail(mockFailure)
class MockSuccessTaskWithTimeout(TaskWithTimeout):
def run(self):
return defer.succeed(42)
class MockFailTaskThatTimesOut(TaskWithTimeout):
def run(self):
return defer.Deferred()
class MockTimeoutOnceTask(TaskWithTimeout):
def run(self):
if self.failures >= 1:
return defer.succeed(42)
else:
return defer.Deferred()
class MockFailTaskWithTimeout(TaskWithTimeout):
def run(self):
return defer.fail(mockFailure)
class MockNetTest(object):
def __init__(self):
self.successes = []
def succeeded(self, measurement):
self.successes.append(measurement)
class MockMeasurement(TaskWithTimeout):
def __init__(self, net_test):
TaskWithTimeout.__init__(self)
self.netTest = net_test
def succeeded(self, result):
return self.netTest.succeeded(42)
class MockSuccessMeasurement(MockMeasurement):
def run(self):
return defer.succeed(42)
class MockFailMeasurement(MockMeasurement):
def run(self):
return defer.fail(mockFailure)
class MockFailOnceMeasurement(MockMeasurement):
def run(self):
if self.failures >= 1:
return defer.succeed(42)
else:
return defer.fail(mockFailure)
class MockDirector(object):
def __init__(self):
self.successes = []
def measurementFailed(self, failure, measurement):
pass
def measurementSucceeded(self, measurement):
self.successes.append(measurement)
## from test_reporter.py
class MockOReporter(object):
def __init__(self):
self.created = defer.Deferred()
def writeReportEntry(self, entry):
return defer.succeed(42)
def finish(self):
pass
def createReport(self):
from ooni.utils import log
log.debug("Creating report with %s" % self)
self.created.callback(self)
class MockOReporterThatFailsWrite(MockOReporter):
def writeReportEntry(self, entry):
raise MockFailure
class MockOReporterThatFailsOpen(MockOReporter):
def createReport(self):
raise MockFailure
class MockOReporterThatFailsWriteOnce(MockOReporter):
def __init__(self):
self.failure = 0
MockOReporter.__init__(self)
def writeReportEntry(self, entry):
if self.failure >= 1:
return defer.succeed(42)
else:
self.failure += 1
raise MockFailure
class MockTaskManager(TaskManager):
def __init__(self):
self.successes = []
TaskManager.__init__(self)
def failed(self, failure, task):
pass
def succeeded(self, result, task):
self.successes.append((result, task))
class MockOONIBClient(object):
def lookupTestHelpers(self, required_test_helpers):
ret = {
'default': {
'address': '127.0.0.1',
'collector': 'httpo://thirteenchars1234.onion'
}
}
for required_test_helper in required_test_helpers:
ret[required_test_helper] = {
'address': '127.0.0.1',
'collector': 'httpo://thirteenchars1234.onion'
}
return defer.succeed(ret)
ooniprobe-1.3.2/ooni/tests/test_otime.py 0000644 0001750 0001750 00000000633 12447563404 016516 0 ustar irl irl import unittest
from datetime import datetime
from ooni import otime
test_date = datetime(2002, 6, 26, 22, 45, 49)
class TestOtime(unittest.TestCase):
def test_timestamp(self):
self.assertEqual(otime.timestamp(test_date), "2002-06-26T224549Z")
def test_fromTimestamp(self):
time_stamp = otime.timestamp(test_date)
self.assertEqual(test_date, otime.fromTimestamp(time_stamp))
ooniprobe-1.3.2/ooni/tests/test_geoip.py 0000644 0001750 0001750 00000002157 12447563404 016507 0 ustar irl irl
from twisted.internet import defer
from ooni.tests import is_internet_connected, bases
from ooni import geoip
class TestGeoIP(bases.ConfigTestCase):
def test_ip_to_location(self):
location = geoip.IPToLocation('8.8.8.8')
assert 'countrycode' in location
assert 'asn' in location
assert 'city' in location
@defer.inlineCallbacks
def test_probe_ip(self):
if not is_internet_connected():
self.skipTest(
"You must be connected to the internet to run this test"
)
probe_ip = geoip.ProbeIP()
res = yield probe_ip.lookup()
assert len(res.split('.')) == 4
def test_geoip_database_version(self):
version = geoip.database_version()
assert 'GeoIP' in version.keys()
assert 'GeoIPASNum' in version.keys()
assert 'GeoLiteCity' in version.keys()
assert len(version['GeoIP']['sha256']) == 64
assert isinstance(version['GeoIP']['timestamp'], float)
assert len(version['GeoIPASNum']['sha256']) == 64
assert isinstance(version['GeoIPASNum']['timestamp'], float)
ooniprobe-1.3.2/ooni/tests/bases.py 0000644 0001750 0001750 00000000525 12507276511 015434 0 ustar irl irl from twisted.trial import unittest
from ooni.settings import config
class ConfigTestCase(unittest.TestCase):
def setUp(self):
config.initialize_ooni_home("ooni_home")
def skipTest(self, reason):
raise unittest.SkipTest(reason)
def tearDown(self):
config.set_paths()
config.read_config_file()
ooniprobe-1.3.2/ooni/tests/test_deck.py 0000644 0001750 0001750 00000010165 12447563404 016310 0 ustar irl irl import os
from twisted.internet import defer
from twisted.trial import unittest
from hashlib import sha256
from ooni.deck import InputFile, Deck
from ooni.tests.mocks import MockOONIBClient
net_test_string = """
from twisted.python import usage
from ooni.nettest import NetTestCase
class UsageOptions(usage.Options):
optParameters = [['spam', 's', None, 'ham']]
class DummyTestCase(NetTestCase):
usageOptions = UsageOptions
requiredTestHelpers = {'spam': 'test-helper-typeA'}
def test_a(self):
self.report['bar'] = 'bar'
def test_b(self):
self.report['foo'] = 'foo'
"""
class BaseTestCase(unittest.TestCase):
def setUp(self):
self.filename = ""
self.cwd = os.getcwd()
self.dummy_deck_content = """- options:
collector: null
help: 0
logfile: null
no-default-reporter: 0
parallelism: null
pcapfile: null
reportfile: null
resume: 0
subargs: []
test_file: manipulation/http_invalid_request_line
testdeck: null
"""
class TestInputFile(BaseTestCase):
def tearDown(self):
if self.filename != "":
os.remove(self.filename)
def test_file_cached(self):
self.filename = file_hash = sha256(self.dummy_deck_content).hexdigest()
input_file = InputFile(file_hash, base_path='.')
with open(file_hash, 'w+') as f:
f.write(self.dummy_deck_content)
assert input_file.fileCached
def test_file_invalid_hash(self):
self.filename = invalid_hash = 'a' * 64
with open(invalid_hash, 'w+') as f:
f.write("b" * 100)
input_file = InputFile(invalid_hash, base_path='.')
self.assertRaises(AssertionError, input_file.verify)
def test_save_descriptor(self):
descriptor = {
'name': 'spam',
'id': 'spam',
'version': 'spam',
'author': 'spam',
'date': 'spam',
'description': 'spam'
}
file_id = 'a' * 64
self.filename = file_id + '.desc'
input_file = InputFile(file_id, base_path='.')
input_file.load(descriptor)
input_file.save()
assert os.path.isfile(self.filename)
assert input_file.descriptorCached
class TestDeck(BaseTestCase):
def setUp(self):
super(TestDeck, self).setUp()
deck_hash = sha256(self.dummy_deck_content).hexdigest()
self.deck_file = os.path.join(self.cwd, deck_hash)
with open(self.deck_file, 'w+') as f:
f.write(self.dummy_deck_content)
with open(os.path.join(self.cwd, 'dummy_test.py'), 'w+') as f:
f.write(net_test_string)
def tearDown(self):
os.remove(os.path.join(self.cwd, 'dummy_test.py'))
os.remove(self.deck_file)
if self.filename != "":
os.remove(self.filename)
def test_open_deck(self):
deck = Deck(decks_directory=".")
deck.bouncer = "httpo://foo.onion"
deck.loadDeck(self.deck_file)
assert len(deck.netTestLoaders) == 1
def test_save_deck_descriptor(self):
deck = Deck(decks_directory=".")
deck.bouncer = "httpo://foo.onion"
deck.loadDeck(self.deck_file)
deck.load({'name': 'spam',
'id': 'spam',
'version': 'spam',
'author': 'spam',
'date': 'spam',
'description': 'spam'
})
deck.save()
self.filename = self.deck_file + ".desc"
deck.verify()
@defer.inlineCallbacks
def test_lookuptest_helpers(self):
deck = Deck(decks_directory=".")
deck.bouncer = "httpo://foo.onion"
deck.oonibclient = MockOONIBClient()
deck.loadDeck(self.deck_file)
yield deck.lookupTestHelpers()
assert deck.netTestLoaders[0].collector == 'httpo://thirteenchars1234.onion'
required_test_helpers = deck.netTestLoaders[0].requiredTestHelpers
assert len(required_test_helpers) == 1
assert required_test_helpers[0]['test_class'].localOptions['backend'] == '127.0.0.1'
ooniprobe-1.3.2/ooni/tests/__init__.py 0000644 0001750 0001750 00000000611 12507276502 016072 0 ustar irl irl import socket
from ooni.settings import config
config.initialize_ooni_home('ooni_home')
config.read_config_file()
config.logging = False
config.advanced.debug = False
def is_internet_connected():
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
s.connect(('torproject.org', 80))
s.shutdown(2)
return True
except Exception:
return False
ooniprobe-1.3.2/ooni/tests/test_oonicli.py 0000644 0001750 0001750 00000015006 12623613431 017025 0 ustar irl irl import os
import sys
import yaml
from twisted.internet import defer
from ooni.tests import is_internet_connected
from ooni.tests.bases import ConfigTestCase
from ooni.settings import config
from ooni.oonicli import runWithDirector
from ooni.errors import InsufficientPrivileges
from ooni.utils.net import hasRawSocketPermission
def verify_header(header):
assert 'input_hashes' in header.keys()
assert 'options' in header.keys()
assert 'probe_asn' in header.keys()
assert 'probe_cc' in header.keys()
assert 'probe_ip' in header.keys()
assert 'software_name' in header.keys()
assert 'software_version' in header.keys()
assert 'test_name' in header.keys()
assert 'test_version' in header.keys()
def verify_entry(entry):
assert 'input' in entry
config_includepcap = """
basic:
logfile: ~/.ooni/ooniprobe.log
privacy:
includeip: false
includeasn: true
includecountry: true
includecity: false
includepcap: true
reports:
pcap: null
collector: null
advanced:
geoip_data_dir: /usr/share/GeoIP
debug: false
interface: auto
start_tor: false
measurement_timeout: 60
measurement_retries: 2
measurement_concurrency: 10
reporting_timeout: 80
reporting_retries: 3
reporting_concurrency: 15
data_dir: %s
oonid_api_port: 8042
tor:
socks_port: 9050
""" % config.data_directory
class TestRunDirector(ConfigTestCase):
timeout = 220
def setUp(self):
super(TestRunDirector, self).setUp()
if not is_internet_connected():
self.skipTest("You must be connected to the internet to run this test")
elif not hasRawSocketPermission():
self.skipTest("You must run this test as root or have the capabilities "
"cap_net_admin,cap_net_raw+eip")
config.tor.socks_port = 9050
config.tor.control_port = None
self.filenames = ['example-input.txt']
with open('example-input.txt', 'w+') as f:
f.write('http://torproject.org/\n')
f.write('http://bridges.torproject.org/\n')
f.write('http://blog.torproject.org/\n')
def tearDown(self):
super(TestRunDirector, self).tearDown()
for filename in self.filenames:
if os.path.exists(filename):
os.remove(filename)
self.filenames = []
@defer.inlineCallbacks
def run_helper(self, test_name, nettest_args, verify_function, ooni_args=()):
output_file = os.path.abspath('test_report.yamloo')
self.filenames.append(output_file)
oldargv = sys.argv
sys.argv = ['']
sys.argv.extend(ooni_args)
sys.argv.extend(['-n', '-o', output_file, test_name])
sys.argv.extend(nettest_args)
yield runWithDirector(False, False, False)
with open(output_file) as f:
entries = yaml.safe_load_all(f)
header = entries.next()
try:
first_entry = entries.next()
except StopIteration:
raise Exception("Missing entry in report")
verify_header(header)
verify_entry(first_entry)
verify_function(first_entry)
sys.argv = oldargv
@defer.inlineCallbacks
def test_http_requests(self):
def verify_function(entry):
assert 'body_length_match' in entry
assert 'body_proportion' in entry
assert 'control_failure' in entry
assert 'experiment_failure' in entry
assert 'factor' in entry
assert 'headers_diff' in entry
assert 'headers_match' in entry
yield self.run_helper('blocking/http_requests',
['-u', 'http://torproject.org/'],
verify_function)
@defer.inlineCallbacks
def test_http_requests_with_file(self):
def verify_function(entry):
assert 'body_length_match' in entry
assert 'body_proportion' in entry
assert 'control_failure' in entry
assert 'experiment_failure' in entry
assert 'factor' in entry
assert 'headers_diff' in entry
assert 'headers_match' in entry
yield self.run_helper('blocking/http_requests',
['-f', 'example-input.txt'],
verify_function)
@defer.inlineCallbacks
def test_dnsconsistency(self):
def verify_function(entry):
assert 'queries' in entry
assert 'control_resolver' in entry
assert 'tampering' in entry
assert len(entry['tampering']) == 1
yield self.run_helper('blocking/dns_consistency',
['-b', '8.8.8.8:53',
'-t', '8.8.8.8',
'-f', 'example-input.txt'],
verify_function)
@defer.inlineCallbacks
def test_http_header_field_manipulation(self):
self.skipTest("This test requires a property configured backend")
def verify_function(entry):
assert 'agent' in entry
assert 'requests' in entry
assert 'socksproxy' in entry
assert 'tampering' in entry
assert 'header_field_name' in entry['tampering']
assert 'header_field_number' in entry['tampering']
assert 'header_field_value' in entry['tampering']
assert 'header_name_capitalization' in entry['tampering']
assert 'header_name_diff' in entry['tampering']
assert 'request_line_capitalization' in entry['tampering']
assert 'total' in entry['tampering']
yield self.run_helper('manipulation/http_header_field_manipulation',
['-b', 'http://4.15.35.157:80'],
verify_function)
@defer.inlineCallbacks
def test_sniffing_activated(self):
self.skipTest("Not properly set packet capture?")
filename = os.path.abspath('test_report.pcap')
self.filenames.append(filename)
conf_file = os.path.abspath('fake_config.conf')
with open(conf_file, 'w') as cfg:
cfg.writelines(config_includepcap)
self.filenames.append(conf_file)
def verify_function(_):
assert os.path.exists(filename)
self.assertGreater(os.stat(filename).st_size, 0)
yield self.run_helper('blocking/http_requests',
['-f', 'example-input.txt'],
verify_function, ooni_args=['-f', conf_file])
config.scapyFactory.connectionLost('')
ooniprobe-1.3.2/ooni/tests/test_txscapy.py 0000644 0001750 0001750 00000003460 12463144534 017072 0 ustar irl irl from mock import MagicMock
from twisted.internet import defer
from twisted.trial import unittest
from ooni.utils import txscapy
defer.setDebugging(True)
class TestTxScapy(unittest.TestCase):
def setUp(self):
# if not txscapy.hasRawSocketPermission():
# self.skipTest("No raw socket permissions...")
mock_super_socket = MagicMock()
mock_super_socket.ins.fileno.return_value = 1
self.scapy_factory = txscapy.ScapyFactory('foo', mock_super_socket)
def tearDown(self):
self.scapy_factory.connectionLost(None)
def test_pcapdnet_installed(self):
assert txscapy.pcapdnet_installed() is True
def test_send_packet_no_answer(self):
from scapy.all import IP, TCP
sender = txscapy.ScapySender()
self.scapy_factory.registerProtocol(sender)
packet = IP(dst='8.8.8.8') / TCP(dport=53)
sender.startSending([packet])
self.scapy_factory.super_socket.send.assert_called_with(packet)
assert len(sender.sent_packets) == 1
@defer.inlineCallbacks
def test_send_packet_with_answer(self):
from scapy.all import IP, TCP
sender = txscapy.ScapySender()
self.scapy_factory.registerProtocol(sender)
packet_sent = IP(dst='8.8.8.8', src='127.0.0.1') / TCP(dport=53,
sport=5300)
packet_received = IP(dst='127.0.0.1', src='8.8.8.8') / TCP(sport=53,
dport=5300)
d = sender.startSending([packet_sent])
self.scapy_factory.super_socket.send.assert_called_with(packet_sent)
sender.packetReceived(packet_received)
result = yield d
assert result[0][0][0] == packet_sent
assert result[0][0][1] == packet_received
ooniprobe-1.3.2/ooni/tests/test_managers.py 0000644 0001750 0001750 00000020613 12463144534 017173 0 ustar irl irl import os
from twisted.trial import unittest
from twisted.internet import defer, task
from ooni.managers import MeasurementManager
from ooni.tests.mocks import MockSuccessTask, MockFailTask, MockFailOnceTask, MockFailure
from ooni.tests.mocks import MockSuccessTaskWithTimeout, MockFailTaskThatTimesOut
from ooni.tests.mocks import MockTimeoutOnceTask, MockFailTaskWithTimeout
from ooni.tests.mocks import MockTaskManager, mockFailure, MockDirector
from ooni.tests.mocks import MockNetTest, MockSuccessMeasurement
from ooni.tests.mocks import MockFailMeasurement
from ooni.settings import config
class TestTaskManager(unittest.TestCase):
timeout = 1
def setUp(self):
self.measurementManager = MockTaskManager()
self.measurementManager.concurrency = 20
self.measurementManager.retries = 2
self.measurementManager.start()
self.clock = task.Clock()
data_dir = os.path.dirname(os.path.abspath(__file__))
data_dir = os.path.join(data_dir, '..', '..', 'data')
self.old_datadir = ""
if hasattr(config.global_options, 'datadir'):
self.old_datadir = config.global_options['datadir']
config.global_options['datadir'] = data_dir
config.set_paths()
def tearDown(self):
if self.old_datadir == "":
del config.global_options['datadir']
else:
config.global_options['datadir'] = self.old_datadir
config.set_paths()
def schedule_successful_tasks(self, task_type, number=1):
all_done = []
for x in range(number):
mock_task = task_type()
all_done.append(mock_task.done)
self.measurementManager.schedule(mock_task)
d = defer.DeferredList(all_done)
@d.addCallback
def done(res):
for task_result, task_instance in self.measurementManager.successes:
self.assertEqual(task_result, 42)
self.assertIsInstance(task_instance, task_type)
return d
def schedule_failing_tasks(self, task_type, number=1):
all_done = []
for x in range(number):
mock_task = task_type()
all_done.append(mock_task.done)
mock_task.done.addErrback(lambda x: None)
self.measurementManager.schedule(mock_task)
d = defer.DeferredList(all_done)
@d.addCallback
def done(res):
# 10*2 because 2 is the number of retries
self.assertEqual(self.measurementManager.failures, number * 3)
# XXX @aagbsn is there a reason why you switched to using an int
# over a using a list?
# self.assertEqual(len(self.measurementManager.failures), number*3)
# for task_result, task_instance in self.measurementManager.failures:
# self.assertEqual(task_result, mockFailure)
# self.assertIsInstance(task_instance, task_type)
return d
def test_schedule_failing_with_mock_failure_task(self):
mock_task = MockFailTask()
self.measurementManager.schedule(mock_task)
self.assertFailure(mock_task.done, MockFailure)
return mock_task.done
def test_schedule_successful_one_task(self):
return self.schedule_successful_tasks(MockSuccessTask)
def test_schedule_successful_one_task_with_timeout(self):
return self.schedule_successful_tasks(MockSuccessTaskWithTimeout)
def test_schedule_failing_tasks_that_timesout(self):
self.measurementManager.retries = 0
task_type = MockFailTaskThatTimesOut
task_timeout = 5
mock_task = task_type()
mock_task.timeout = task_timeout
mock_task.clock = self.clock
self.measurementManager.schedule(mock_task)
self.clock.advance(task_timeout)
@mock_task.done.addBoth
def done(res):
self.assertEqual(self.measurementManager.failures, 1)
# self.assertEqual(len(self.measurementManager.failures), 1)
# for task_result, task_instance in self.measurementManager.failures:
# self.assertIsInstance(task_instance, task_type)
return mock_task.done
def test_schedule_time_out_once(self):
task_type = MockTimeoutOnceTask
task_timeout = 5
mock_task = task_type()
mock_task.timeout = task_timeout
mock_task.clock = self.clock
self.measurementManager.schedule(mock_task)
self.clock.advance(task_timeout)
@mock_task.done.addBoth
def done(res):
self.assertEqual(self.measurementManager.failures, 1)
# self.assertEqual(len(self.measurementManager.failures), 1)
# for task_result, task_instance in self.measurementManager.failures:
# self.assertIsInstance(task_instance, task_type)
for task_result, task_instance in self.measurementManager.successes:
self.assertEqual(task_result, 42)
self.assertIsInstance(task_instance, task_type)
return mock_task.done
def test_schedule_failing_one_task(self):
return self.schedule_failing_tasks(MockFailTask)
def test_schedule_failing_one_task_with_timeout(self):
return self.schedule_failing_tasks(MockFailTaskWithTimeout)
def test_schedule_successful_ten_tasks(self):
return self.schedule_successful_tasks(MockSuccessTask, number=10)
def test_schedule_failing_ten_tasks(self):
return self.schedule_failing_tasks(MockFailTask, number=10)
def test_schedule_successful_27_tasks(self):
return self.schedule_successful_tasks(MockSuccessTask, number=27)
def test_schedule_failing_27_tasks(self):
return self.schedule_failing_tasks(MockFailTask, number=27)
def test_task_retry_and_succeed(self):
mock_task = MockFailOnceTask()
self.measurementManager.schedule(mock_task)
@mock_task.done.addCallback
def done(res):
self.assertEqual(self.measurementManager.failures, 1)
# self.assertEqual(len(self.measurementManager.failures), 1)
# self.assertEqual(self.measurementManager.failures,
# [(mockFailure, mock_task)])
self.assertEqual(self.measurementManager.successes,
[(42, mock_task)])
return mock_task.done
def test_task_retry_and_succeed_56_tasks(self):
"""
XXX this test fails in a non-deterministic manner.
"""
all_done = []
number = 56
for x in range(number):
mock_task = MockFailOnceTask()
all_done.append(mock_task.done)
self.measurementManager.schedule(mock_task)
d = defer.DeferredList(all_done)
@d.addCallback
def done(res):
self.assertEqual(self.measurementManager.failures, number)
# self.assertEqual(len(self.measurementManager.failures), number)
for task_result, task_instance in self.measurementManager.successes:
self.assertEqual(task_result, 42)
self.assertIsInstance(task_instance, MockFailOnceTask)
return d
class TestMeasurementManager(unittest.TestCase):
def setUp(self):
mock_director = MockDirector()
self.measurementManager = MeasurementManager()
self.measurementManager.director = mock_director
self.measurementManager.concurrency = 10
self.measurementManager.retries = 2
self.measurementManager.start()
self.mockNetTest = MockNetTest()
def test_schedule_and_net_test_notified(self, number=1):
# XXX we should probably be inheriting from the base test class
mock_task = MockSuccessMeasurement(self.mockNetTest)
self.measurementManager.schedule(mock_task)
@mock_task.done.addCallback
def done(res):
self.assertEqual(self.mockNetTest.successes,
[42])
self.assertEqual(len(self.mockNetTest.successes), 1)
return mock_task.done
def test_schedule_failing_one_measurement(self):
mock_task = MockFailMeasurement(self.mockNetTest)
self.measurementManager.schedule(mock_task)
@mock_task.done.addErrback
def done(failure):
self.assertEqual(self.measurementManager.failures, 3)
# self.assertEqual(len(self.measurementManager.failures), 3)
self.assertEqual(failure, mockFailure)
self.assertEqual(len(self.mockNetTest.successes), 0)
return mock_task.done
ooniprobe-1.3.2/ooni/tests/test_utils.py 0000644 0001750 0001750 00000005533 12463144534 016542 0 ustar irl irl import os
from twisted.trial import unittest
from ooni.utils import pushFilenameStack, log, generate_filename, net
class TestUtils(unittest.TestCase):
def setUp(self):
self.test_details = {
'test_name': 'foo',
'start_time': 441763200
}
self.extension = 'ext'
self.prefix = 'prefix'
self.basename = 'filename'
self.filename = 'filename.txe'
def test_pushFilenameStack(self):
basefilename = os.path.join(os.getcwd(), 'dummyfile')
f = open(basefilename, "w+")
f.write("0\n")
f.close()
for i in xrange(1, 20):
f = open("%s.%d" % (basefilename, i), "w+")
f.write("%s\n" % i)
f.close()
pushFilenameStack(basefilename)
for i in xrange(1, 20):
f = open("%s.%d" % (basefilename, i))
c = f.readlines()[0].strip()
self.assertEqual(str(i-1), str(c))
f.close()
for i in xrange(1, 21):
os.remove("%s.%d" % (basefilename, i))
def test_log_encode(self):
logmsgs = (
(r"spam\x07\x08", "spam\a\b"),
(r"spam\x07\x08", u"spam\a\b"),
(r"ham\u237e", u"ham"+u"\u237e")
)
for encoded_logmsg, logmsg in logmsgs:
self.assertEqual(log.log_encode(logmsg), encoded_logmsg)
def test_generate_filename(self):
filename = generate_filename(self.test_details)
self.assertEqual(filename, 'foo-1984-01-01T000000Z')
def test_generate_filename_with_extension(self):
filename = generate_filename(self.test_details, extension=self.extension)
self.assertEqual(filename, 'foo-1984-01-01T000000Z.ext')
def test_generate_filename_with_prefix(self):
filename = generate_filename(self.test_details, prefix=self.prefix)
self.assertEqual(filename, 'prefix-foo-1984-01-01T000000Z')
def test_generate_filename_with_extension_and_prefix(self):
filename = generate_filename(self.test_details, prefix=self.prefix, extension=self.extension)
self.assertEqual(filename, 'prefix-foo-1984-01-01T000000Z.ext')
def test_generate_filename_with_filename(self):
filename = generate_filename(self.test_details, filename=self.filename)
self.assertEqual(filename, 'filename.txe')
def test_generate_filename_with_extension_and_filename(self):
filename = generate_filename(self.test_details, extension=self.extension, filename=self.filename)
self.assertEqual(filename, 'filename.ext')
def test_generate_filename_with_extension_and_basename(self):
filename = generate_filename(self.test_details, extension=self.extension, filename=self.basename)
self.assertEqual(filename, 'filename.ext')
def test_get_addresses(self):
addresses = net.getAddresses()
assert isinstance(addresses, list)
ooniprobe-1.3.2/ooni/tests/test_onion.py 0000644 0001750 0001750 00000004302 12531110611 016476 0 ustar irl irl from twisted.trial import unittest
from ooni.utils import onion
from mock import Mock, patch
sample_transport_lines = {
'fte': 'fte exec /fakebin --managed',
'scramblesuit': 'scramblesuit exec /fakebin --log-min-severity info --log-file /log.txt managed',
'obfs2': 'obfs2 exec /fakebin --log-min-severity info --log-file /log.txt managed',
'obfs3': 'obfs3 exec /fakebin --log-min-severity info --log-file /log.txt managed',
'obfs4': 'obfs4 exec /fakebin --enableLogging=true --logLevel=INFO' }
class TestOnion(unittest.TestCase):
def test_tor_details(self):
assert isinstance(onion.tor_details, dict)
assert onion.tor_details['version']
assert onion.tor_details['binary']
def test_transport_dicts(self):
self.assertEqual( set(onion.transport_bin_name.keys()),
set(onion._transport_line_templates.keys()) )
def test_bridge_line(self):
self.assertRaises(onion.UnrecognizedTransport,
onion.bridge_line, 'rot13', '/log.txt')
onion.find_executable = Mock(return_value=False)
self.assertRaises(onion.UninstalledTransport,
onion.bridge_line, 'fte', '/log.txt')
onion.find_executable = Mock(return_value="/fakebin")
for transport, exp_line in sample_transport_lines.iteritems():
self.assertEqual(onion.bridge_line(transport, '/log.txt'),
exp_line)
with patch.dict(onion.obfsproxy_details,
{'version': onion.OBFSProxyVersion('0.1.12')}):
self.assertRaises(onion.OutdatedObfsproxy,
onion.bridge_line, 'obfs2', '/log.txt')
with patch.dict(onion.tor_details,
{'version': onion.TorVersion('0.2.4.20')}):
onion.bridge_line('fte', '/log.txt')
self.assertRaises(onion.OutdatedTor,
onion.bridge_line, 'scramblesuit', '/log.txt')
self.assertRaises(onion.OutdatedTor,
onion.bridge_line, 'obfs4', '/log.txt')
with patch.dict(onion.tor_details,
{'version': onion.TorVersion('0.2.3.20')}):
self.assertRaises(onion.OutdatedTor,
onion.bridge_line, 'fte', '/log.txt')
ooniprobe-1.3.2/ooni/tests/test_mutate.py 0000644 0001750 0001750 00000001147 12447563404 016701 0 ustar irl irl import unittest
from ooni.kit import daphn3
class TestDaphn3(unittest.TestCase):
def test_mutate_string(self):
original_string = '\x00\x00\x00'
mutated = daphn3.daphn3MutateString(original_string, 1)
self.assertEqual(mutated, '\x00\x01\x00')
def test_mutate_daphn3(self):
original_dict = [{'client': '\x00\x00\x00'},
{'server': '\x00\x00\x00'}]
mutated_dict = daphn3.daphn3Mutate(original_dict, 1, 1)
self.assertEqual(mutated_dict, [{'client': '\x00\x00\x00'},
{'server': '\x00\x01\x00'}])
ooniprobe-1.3.2/ooni/tests/test_nettest.py 0000644 0001750 0001750 00000025376 12447563404 017102 0 ustar irl irl import os
from tempfile import mkstemp
from twisted.trial import unittest
from twisted.internet import defer, reactor
from twisted.python.usage import UsageError
from ooni.settings import config
from ooni.errors import MissingRequiredOption, OONIUsageError, IncoherentOptions
from ooni.nettest import NetTest, NetTestLoader
from ooni.director import Director
from ooni.tests.bases import ConfigTestCase
net_test_string = """
from twisted.python import usage
from ooni.nettest import NetTestCase
class UsageOptions(usage.Options):
optParameters = [['spam', 's', None, 'ham']]
class DummyTestCase(NetTestCase):
usageOptions = UsageOptions
def test_a(self):
self.report['bar'] = 'bar'
def test_b(self):
self.report['foo'] = 'foo'
"""
double_net_test_string = """
from twisted.python import usage
from ooni.nettest import NetTestCase
class UsageOptions(usage.Options):
optParameters = [['spam', 's', None, 'ham']]
class DummyTestCaseA(NetTestCase):
usageOptions = UsageOptions
def test_a(self):
self.report['bar'] = 'bar'
class DummyTestCaseB(NetTestCase):
usageOptions = UsageOptions
def test_b(self):
self.report['foo'] = 'foo'
"""
double_different_options_net_test_string = """
from twisted.python import usage
from ooni.nettest import NetTestCase
class UsageOptionsA(usage.Options):
optParameters = [['spam', 's', None, 'ham']]
class UsageOptionsB(usage.Options):
optParameters = [['spam', 's', None, 'ham']]
class DummyTestCaseA(NetTestCase):
usageOptions = UsageOptionsA
def test_a(self):
self.report['bar'] = 'bar'
class DummyTestCaseB(NetTestCase):
usageOptions = UsageOptionsB
def test_b(self):
self.report['foo'] = 'foo'
"""
net_test_root_required = net_test_string + """
requiresRoot = True
"""
net_test_string_with_file = """
from twisted.python import usage
from ooni.nettest import NetTestCase
class UsageOptions(usage.Options):
optParameters = [['spam', 's', None, 'ham']]
class DummyTestCase(NetTestCase):
inputFile = ['file', 'f', None, 'The input File']
usageOptions = UsageOptions
def test_a(self):
self.report['bar'] = 'bar'
def test_b(self):
self.report['foo'] = 'foo'
"""
net_test_string_with_required_option = """
from twisted.python import usage
from ooni.nettest import NetTestCase
class UsageOptions(usage.Options):
optParameters = [['spam', 's', None, 'ham'],
['foo', 'o', None, 'moo'],
['bar', 'o', None, 'baz'],
]
class DummyTestCase(NetTestCase):
inputFile = ['file', 'f', None, 'The input File']
requiredOptions = ['foo', 'bar']
usageOptions = UsageOptions
def test_a(self):
self.report['bar'] = 'bar'
def test_b(self):
self.report['foo'] = 'foo'
"""
http_net_test = """
from twisted.internet import defer
from twisted.python import usage, failure
from ooni.utils import log
from ooni.utils.net import userAgents
from ooni.templates import httpt
from ooni.errors import failureToString, handleAllFailures
class UsageOptions(usage.Options):
optParameters = [
['url', 'u', None, 'Specify a single URL to test.'],
]
class HTTPBasedTest(httpt.HTTPTest):
usageOptions = UsageOptions
def test_get(self):
return self.doRequest(self.localOptions['url'], method="GET",
use_tor=False)
"""
dummyInputs = range(1)
dummyArgs = ('--spam', 'notham')
dummyOptions = {'spam': 'notham'}
dummyInvalidArgs = ('--cram', 'jam')
dummyInvalidOptions = {'cram': 'jam'}
dummyArgsWithRequiredOptions = ('--foo', 'moo', '--bar', 'baz')
dummyRequiredOptions = {'foo': 'moo', 'bar': 'baz'}
dummyArgsWithFile = ('--spam', 'notham', '--file', 'dummyInputFile.txt')
dummyInputFile = 'dummyInputFile.txt'
class TestNetTest(unittest.TestCase):
timeout = 1
def setUp(self):
self.filename = ""
with open(dummyInputFile, 'w') as f:
for i in range(10):
f.write("%s\n" % i)
def tearDown(self):
os.remove(dummyInputFile)
if self.filename != "":
os.remove(self.filename)
def assertCallable(self, thing):
self.assertIn('__call__', dir(thing))
def verifyMethods(self, testCases):
uniq_test_methods = set()
for test_class, test_methods in testCases:
instance = test_class()
for test_method in test_methods:
c = getattr(instance, test_method)
self.assertCallable(c)
uniq_test_methods.add(test_method)
self.assertEqual(set(['test_a', 'test_b']), uniq_test_methods)
def verifyClasses(self, test_cases, control_classes):
actual_classes = set()
for test_class, test_methods in test_cases:
actual_classes.add(test_class.__name__)
self.assertEqual(actual_classes, control_classes)
def test_load_net_test_from_file(self):
"""
Given a file verify that the net test cases are properly
generated.
"""
__, net_test_file = mkstemp()
with open(net_test_file, 'w') as f:
f.write(net_test_string)
f.close()
ntl = NetTestLoader(dummyArgs)
ntl.loadNetTestFile(net_test_file)
self.verifyMethods(ntl.testCases)
os.unlink(net_test_file)
def test_load_net_test_from_str(self):
"""
Given a file like object verify that the net test cases are properly
generated.
"""
ntl = NetTestLoader(dummyArgs)
ntl.loadNetTestString(net_test_string)
self.verifyMethods(ntl.testCases)
def test_load_net_test_multiple(self):
ntl = NetTestLoader(dummyArgs)
ntl.loadNetTestString(double_net_test_string)
self.verifyMethods(ntl.testCases)
self.verifyClasses(ntl.testCases, set(('DummyTestCaseA', 'DummyTestCaseB')))
ntl.checkOptions()
def test_load_net_test_multiple_different_options(self):
ntl = NetTestLoader(dummyArgs)
ntl.loadNetTestString(double_different_options_net_test_string)
self.verifyMethods(ntl.testCases)
self.verifyClasses(ntl.testCases, set(('DummyTestCaseA', 'DummyTestCaseB')))
self.assertRaises(IncoherentOptions, ntl.checkOptions)
def test_load_with_option(self):
ntl = NetTestLoader(dummyArgs)
ntl.loadNetTestString(net_test_string)
self.assertIsInstance(ntl, NetTestLoader)
for test_klass, test_meth in ntl.testCases:
for option in dummyOptions.keys():
self.assertIn(option, test_klass.usageOptions())
def test_load_with_invalid_option(self):
ntl = NetTestLoader(dummyInvalidArgs)
ntl.loadNetTestString(net_test_string)
self.assertRaises(UsageError, ntl.checkOptions)
self.assertRaises(OONIUsageError, ntl.checkOptions)
def test_load_with_required_option(self):
ntl = NetTestLoader(dummyArgsWithRequiredOptions)
ntl.loadNetTestString(net_test_string_with_required_option)
self.assertIsInstance(ntl, NetTestLoader)
def test_load_with_missing_required_option(self):
ntl = NetTestLoader(dummyArgs)
ntl.loadNetTestString(net_test_string_with_required_option)
self.assertRaises(MissingRequiredOption, ntl.checkOptions)
def test_net_test_inputs(self):
ntl = NetTestLoader(dummyArgsWithFile)
ntl.loadNetTestString(net_test_string_with_file)
ntl.checkOptions()
nt = NetTest(ntl, None)
nt.initializeInputProcessor()
# XXX: if you use the same test_class twice you will have consumed all
# of its inputs!
tested = set([])
for test_class, test_method in ntl.testCases:
if test_class not in tested:
tested.update([test_class])
self.assertEqual(len(list(test_class.inputs)), 10)
def test_setup_local_options_in_test_cases(self):
ntl = NetTestLoader(dummyArgs)
ntl.loadNetTestString(net_test_string)
ntl.checkOptions()
for test_class, test_method in ntl.testCases:
self.assertEqual(test_class.localOptions, dummyOptions)
def test_generate_measurements_size(self):
ntl = NetTestLoader(dummyArgsWithFile)
ntl.loadNetTestString(net_test_string_with_file)
ntl.checkOptions()
net_test = NetTest(ntl, None)
net_test.initializeInputProcessor()
measurements = list(net_test.generateMeasurements())
self.assertEqual(len(measurements), 20)
def test_net_test_completed_callback(self):
ntl = NetTestLoader(dummyArgsWithFile)
ntl.loadNetTestString(net_test_string_with_file)
ntl.checkOptions()
director = Director()
self.filename = 'dummy_report.yamloo'
d = director.startNetTest(ntl, self.filename)
@d.addCallback
def complete(result):
self.assertEqual(result, None)
self.assertEqual(director.successfulMeasurements, 20)
return d
def test_require_root_succeed(self):
# XXX: will require root to run
ntl = NetTestLoader(dummyArgs)
ntl.loadNetTestString(net_test_root_required)
for test_class, method in ntl.testCases:
self.assertTrue(test_class.requiresRoot)
class TestNettestTimeout(ConfigTestCase):
@defer.inlineCallbacks
def setUp(self):
super(TestNettestTimeout, self).setUp()
from twisted.internet.protocol import Protocol, Factory
from twisted.internet.endpoints import TCP4ServerEndpoint
class DummyProtocol(Protocol):
def dataReceived(self, data):
pass
class DummyFactory(Factory):
def __init__(self):
self.protocols = []
def buildProtocol(self, addr):
proto = DummyProtocol()
self.protocols.append(proto)
return proto
def stopFactory(self):
for proto in self.protocols:
proto.transport.loseConnection()
self.factory = DummyFactory()
endpoint = TCP4ServerEndpoint(reactor, 8007)
self.port = yield endpoint.listen(self.factory)
config.advanced.measurement_timeout = 2
def tearDown(self):
super(TestNettestTimeout, self).tearDown()
self.factory.stopFactory()
self.port.stopListening()
os.remove(self.filename)
def test_nettest_timeout(self):
ntl = NetTestLoader(('-u', 'http://localhost:8007/'))
ntl.loadNetTestString(http_net_test)
ntl.checkOptions()
director = Director()
self.filename = 'dummy_report.yamloo'
d = director.startNetTest(ntl, self.filename)
@d.addCallback
def complete(result):
assert director.failedMeasurements == 1
return d
ooniprobe-1.3.2/ooni/tests/test_ngdeck.py 0000644 0001750 0001750 00000000323 12402050362 016611 0 ustar irl irl from twisted.trial import unittest
from ooni.ngdeck import NgDeck
class TestNgDeck(unittest.TestCase):
deck_path = ""
def test_run_deck(self):
deck = NgDeck(self.deck_path)
deck.run()
ooniprobe-1.3.2/ooni/tests/test_director.py 0000644 0001750 0001750 00000007732 12447563404 017223 0 ustar irl irl import time
from mock import patch, MagicMock
from ooni.settings import config
from ooni.director import Director
from ooni.tests.bases import ConfigTestCase
from twisted.internet import defer
from twisted.trial import unittest
from txtorcon import TorControlProtocol
proto = MagicMock()
proto.tor_protocol = TorControlProtocol()
mock_TorState = MagicMock()
# We use the instance of mock_TorState so that the mock caching will
# return the same instance when TorState is created.
mts = mock_TorState()
mts.protocol.get_conf = lambda x: defer.succeed({'SocksPort': '4242'})
mts.post_bootstrap = defer.succeed(mts)
# Set the tor_protocol to be already fired
state = MagicMock()
proto.tor_protocol.post_bootstrap = defer.succeed(state)
mock_launch_tor = MagicMock()
mock_launch_tor.return_value = defer.succeed(proto)
class TestDirector(ConfigTestCase):
def tearDown(self):
super(TestDirector, self).tearDown()
config.tor_state = None
def test_get_net_tests(self):
director = Director()
nettests = director.getNetTests()
assert 'http_requests' in nettests
assert 'dns_consistency' in nettests
assert 'http_header_field_manipulation' in nettests
assert 'traceroute' in nettests
@patch('ooni.director.TorState', mock_TorState)
@patch('ooni.director.launch_tor', mock_launch_tor)
def test_start_tor(self):
@defer.inlineCallbacks
def director_start_tor():
director = Director()
yield director.startTor()
assert config.tor.socks_port == 4242
assert config.tor.control_port == 4242
return director_start_tor()
class TestStartSniffing(unittest.TestCase):
def setUp(self):
self.director = Director()
self.testDetails = {
'test_name': 'foo',
'start_time': time.time()
}
# Each NetTestCase has a name attribute
class FooTestCase(object):
name = 'foo'
self.FooTestCase = FooTestCase
def test_start_sniffing_once(self):
with patch('ooni.settings.config.scapyFactory') as mock_scapy_factory:
with patch('ooni.utils.txscapy.ScapySniffer') as mock_scapy_sniffer:
self.director.startSniffing(self.testDetails)
sniffer = mock_scapy_sniffer.return_value
mock_scapy_factory.registerProtocol.assert_called_once_with(sniffer)
def test_start_sniffing_twice(self):
with patch('ooni.settings.config.scapyFactory') as mock_scapy_factory:
with patch('ooni.utils.txscapy.ScapySniffer') as mock_scapy_sniffer:
sniffer = mock_scapy_sniffer.return_value
sniffer.pcapwriter.filename = 'foo1_filename'
self.director.startSniffing(self.testDetails)
self.assertEqual(len(self.director.sniffers), 1)
self.testDetails = {
'test_name': 'bar',
'start_time': time.time()
}
with patch('ooni.utils.txscapy.ScapySniffer') as mock_scapy_sniffer:
sniffer = mock_scapy_sniffer.return_value
sniffer.pcapwriter.filename = 'foo2_filename'
self.director.startSniffing(self.testDetails)
self.assertEqual(len(self.director.sniffers), 2)
def test_measurement_succeeded(self):
with patch('ooni.settings.config.scapyFactory') as mock_scapy_factory:
with patch('ooni.utils.txscapy.ScapySniffer') as mock_scapy_sniffer:
self.director.startSniffing(self.testDetails)
self.assertEqual(len(self.director.sniffers), 1)
measurement = MagicMock()
measurement.testInstance = self.FooTestCase()
self.director.measurementSucceeded('awesome', measurement)
self.assertEqual(len(self.director.sniffers), 0)
sniffer = mock_scapy_sniffer.return_value
mock_scapy_factory.unRegisterProtocol.assert_called_once_with(sniffer)
ooniprobe-1.3.2/ooni/tests/test_templates.py 0000644 0001750 0001750 00000002657 12447563404 017407 0 ustar irl irl from ooni.templates import httpt
from twisted.internet.error import DNSLookupError
from twisted.internet import reactor, defer
from twisted.trial import unittest
class TestHTTPT(unittest.TestCase):
def setUp(self):
from twisted.web.resource import Resource
from twisted.web.server import Site
class DummyResource(Resource):
isLeaf = True
def render_GET(self, request):
return "%s" % request.method
r = DummyResource()
factory = Site(r)
self.port = reactor.listenTCP(8880, factory)
def tearDown(self):
self.port.stopListening()
@defer.inlineCallbacks
def test_do_request(self):
http_test = httpt.HTTPTest()
http_test.localOptions['socksproxy'] = None
http_test._setUp()
response = yield http_test.doRequest('http://localhost:8880/')
assert response.body == "GET"
assert len(http_test.report['requests']) == 1
assert 'request' in http_test.report['requests'][0]
assert 'response' in http_test.report['requests'][0]
@defer.inlineCallbacks
def test_do_failing_request(self):
http_test = httpt.HTTPTest()
http_test.localOptions['socksproxy'] = None
http_test._setUp()
yield self.assertFailure(http_test.doRequest('http://invaliddomain/'), DNSLookupError)
assert http_test.report['requests'][0]['failure'] == 'dns_lookup_error'
ooniprobe-1.3.2/ooni/tests/disable_test_dns.py 0000644 0001750 0001750 00000001171 12463144534 017643 0 ustar irl irl #
# This unittest is to verify that our usage of the twisted DNS resolver does
# not break with new versions of twisted.
from twisted.trial import unittest
from twisted.names import dns
from twisted.names.client import Resolver
class DNSTest(unittest.TestCase):
def test_a_lookup_ooni_query(self):
def done_query(message, *arg):
answer = message.answers[0]
self.assertEqual(answer.type, 1)
dns_query = [dns.Query('ooni.nu', type=dns.A)]
resolver = Resolver(servers=[('8.8.8.8', 53)])
d = resolver.queryUDP(dns_query)
d.addCallback(done_query)
return d
ooniprobe-1.3.2/ooni/tests/test_safe_represent.py 0000644 0001750 0001750 00000000431 12447563404 020402 0 ustar irl irl import yaml
from twisted.trial import unittest
from ooni.reporter import OSafeDumper
from scapy.all import IP, UDP
class TestScapyRepresent(unittest.TestCase):
def test_represent_scapy(self):
data = IP() / UDP()
yaml.dump_all([data], Dumper=OSafeDumper)
ooniprobe-1.3.2/ooni/tests/test_oonibclient.py 0000644 0001750 0001750 00000012171 12447563404 017706 0 ustar irl irl import os
import shutil
import socket
from twisted.internet import defer
from twisted.web import error
from ooni import errors as e
from ooni.settings import config
from ooni.oonibclient import OONIBClient
from ooni.tests.bases import ConfigTestCase
input_id = '37e60e13536f6afe47a830bfb6b371b5cf65da66d7ad65137344679b24fdccd1'
deck_id = 'd4ae40ecfb3c1b943748cce503ab8233efce7823f3e391058fc0f87829c644ed'
class TestOONIBClient(ConfigTestCase):
def setUp(self):
super(TestOONIBClient, self).setUp()
host = '127.0.0.1'
port = 8889
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
s.connect((host, port))
s.shutdown(2)
data_dir = '/tmp/testooni'
config.advanced.data_dir = data_dir
if os.path.exists(data_dir):
shutil.rmtree(data_dir)
os.mkdir(data_dir)
os.mkdir(os.path.join(data_dir, 'inputs'))
os.mkdir(os.path.join(data_dir, 'decks'))
except Exception:
self.skipTest("OONIB must be listening on port 8888 to run this test (tor_hidden_service: false)")
self.oonibclient = OONIBClient('http://' + host + ':' + str(port))
@defer.inlineCallbacks
def test_query(self):
res = yield self.oonibclient.queryBackend('GET', '/policy/input')
self.assertTrue(isinstance(res, list))
@defer.inlineCallbacks
def test_get_input_list(self):
input_list = yield self.oonibclient.getInputList()
self.assertTrue(isinstance(input_list, list))
@defer.inlineCallbacks
def test_get_input_descriptor(self):
input_descriptor = yield self.oonibclient.getInput(input_id)
for key in ['name', 'description',
'version', 'author', 'date', 'id']:
self.assertTrue(hasattr(input_descriptor, key))
@defer.inlineCallbacks
def test_download_input(self):
yield self.oonibclient.downloadInput(input_id)
@defer.inlineCallbacks
def test_get_deck_list(self):
deck_list = yield self.oonibclient.getDeckList()
self.assertTrue(isinstance(deck_list, list))
@defer.inlineCallbacks
def test_get_deck_descriptor(self):
deck_descriptor = yield self.oonibclient.getDeck(deck_id)
for key in ['name', 'description',
'version', 'author', 'date', 'id']:
self.assertTrue(hasattr(deck_descriptor, key))
@defer.inlineCallbacks
def test_download_deck(self):
yield self.oonibclient.downloadDeck(deck_id)
def test_lookup_invalid_helpers(self):
self.oonibclient.address = 'http://127.0.0.1:8888'
return self.failUnlessFailure(
self.oonibclient.lookupTestHelpers([
'sdadsadsa', 'dns'
]), e.CouldNotFindTestHelper)
@defer.inlineCallbacks
def test_lookup_no_test_helpers(self):
self.oonibclient.address = 'http://127.0.0.1:8888'
required_helpers = []
helpers = yield self.oonibclient.lookupTestHelpers(required_helpers)
self.assertTrue('default' in helpers.keys())
@defer.inlineCallbacks
def test_lookup_test_helpers(self):
self.oonibclient.address = 'http://127.0.0.1:8888'
required_helpers = [u'http-return-json-headers', u'dns']
helpers = yield self.oonibclient.lookupTestHelpers(required_helpers)
self.assertEqual(set(helpers.keys()), set(required_helpers + [u'default']))
self.assertTrue(helpers['http-return-json-headers']['address'].startswith('http'))
self.assertTrue(int(helpers['dns']['address'].split('.')[0]))
@defer.inlineCallbacks
def test_input_descriptor_not_found(self):
yield self.assertFailure(self.oonibclient.queryBackend('GET', '/input/' + 'a'*64), e.OONIBInputDescriptorNotFound)
@defer.inlineCallbacks
def test_http_errors(self):
yield self.assertFailure(self.oonibclient.queryBackend('PUT', '/policy/input'), error.Error)
@defer.inlineCallbacks
def test_create_report(self):
res = yield self.oonibclient.queryBackend('POST', '/report', {
'software_name': 'spam',
'software_version': '2.0',
'probe_asn': 'AS0',
'probe_cc': 'ZZ',
'test_name': 'foobar',
'test_version': '1.0',
'input_hashes': []
})
assert isinstance(res['report_id'], unicode)
@defer.inlineCallbacks
def test_report_lifecycle(self):
res = yield self.oonibclient.queryBackend('POST', '/report', {
'software_name': 'spam',
'software_version': '2.0',
'probe_asn': 'AS0',
'probe_cc': 'ZZ',
'test_name': 'foobar',
'test_version': '1.0',
'input_hashes': []
})
report_id = str(res['report_id'])
res = yield self.oonibclient.queryBackend('POST', '/report/' + report_id, {
'content': '---\nspam: ham\n...\n'
})
res = yield self.oonibclient.queryBackend('POST', '/report/' + report_id, {
'content': '---\nspam: ham\n...\n'
})
res = yield self.oonibclient.queryBackend('POST', '/report/' + report_id + '/close')
ooniprobe-1.3.2/ooni/tests/test_settings.py 0000644 0001750 0001750 00000012002 12623613431 017222 0 ustar irl irl import random
from twisted.internet import defer, reactor
from twisted.internet.protocol import Protocol, Factory
from scapy.all import get_if_list
import txtorcon
from ooni.settings import OConfig
from ooni import errors
from ooni.utils import net
from bases import ConfigTestCase
class TestSettings(ConfigTestCase):
def setUp(self):
super(TestSettings, self).setUp()
self.conf = OConfig()
self.configuration = {'advanced': {'interface': 'auto',
'start_tor': True},
'tor': {}}
self.silly_listener = None
self.tor_protocol = None
def tearDown(self):
super(TestSettings, self).tearDown()
if self.silly_listener is not None:
self.silly_listener.stopListening()
def run_tor(self):
def progress(percent, tag, summary):
ticks = int((percent/100.0) * 10.0)
prog = (ticks * '#') + ((10 - ticks) * '.')
print '%s %s' % (prog, summary)
config = txtorcon.TorConfig()
config.SocksPort = self.conf.tor.socks_port
config.ControlPort = self.conf.tor.control_port
d = txtorcon.launch_tor(config, reactor, progress_updates=progress)
return d
def run_silly_server(self):
class SillyProtocol(Protocol):
def __init__(self, factory):
self.factory = factory
class SillyFactory(Factory):
protocol = SillyProtocol
def buildProtocol(self, address):
p = self.protocol(self)
return p
self.silly_listener = reactor.listenTCP(self.conf.tor.socks_port, SillyFactory())
def test_vanilla_configuration(self):
self.conf.check_incoherences(self.configuration)
@defer.inlineCallbacks
def test_check_tor_missing_options(self):
self.conf.advanced.start_tor = False
try:
yield self.conf.check_tor()
except errors.ConfigFileIncoherent:
pass
self.conf.tor.socks_port = 9999
try:
yield self.conf.check_tor()
except errors.ConfigFileIncoherent:
pass
self.conf.tor.socks_port = None
self.conf.tor.control_port = 9998
try:
yield self.conf.check_tor()
except errors.ConfigFileIncoherent:
pass
@defer.inlineCallbacks
def test_check_tor_correct(self):
"""
This test has been disabled because there is a strange concatenation of
conditions that make it not possible to run it on travis.
The tests need to be run as root on travis so that the ones that use
scapy will work properly. When running tor as root, though, it will by
default drop privileges to a lesser priviledged user (on debian based
systems debian-tor). The problem is that the datadir will have already
been created with the privileges of root, hence it will fail to use it
as a datadir and fail.
txtorcon addressed this issue in https://github.com/meejah/txtorcon/issues/26
by chmodding the datadir with what is set as User.
So we could either:
1) Set User to root so that tor has access to that directory, but
this will not work because then it will not be happy that
/var/run/tor has more lax permissions (also debian-tor can read it)
so it will fail. We could disable the control port, hence not
needing to use /var/run/tor, but this is not possible due to:
https://github.com/meejah/txtorcon/issues/80
2) We set the User to be the owner of /var/run/tor, but this does
not exist on all systems, so it would only work for travis.
For the time being I am just going to disable this test and wait for
one of the above bugs to have a better fix.
"""
self.skipTest("See comment in the code")
self.conf.advanced.start_tor = False
self.conf.tor.socks_port = net.randomFreePort()
self.conf.tor.control_port = net.randomFreePort()
self.tor_process = yield self.run_tor()
yield self.conf.check_incoherences(self.configuration)
self.tor_process.transport.signalProcess('TERM')
d = defer.Deferred()
reactor.callLater(10, d.callback, None)
yield d
@defer.inlineCallbacks
def test_check_tor_silly_listener(self):
self.conf.advanced.start_tor = False
self.conf.tor.socks_port = net.randomFreePort()
self.conf.tor.control_port = None
self.run_silly_server()
try:
yield self.conf.check_tor()
except errors.ConfigFileIncoherent:
pass
def test_check_incoherences_interface(self):
self.configuration['advanced']['interface'] = 'funky'
self.assertRaises(errors.ConfigFileIncoherent, self.conf.check_incoherences, self.configuration)
self.configuration['advanced']['interface'] = random.choice(get_if_list())
self.conf.check_incoherences(self.configuration)
ooniprobe-1.3.2/ooni/tests/test_reporter.py 0000644 0001750 0001750 00000012634 12463144534 017244 0 ustar irl irl import os
import yaml
import json
import time
from mock import MagicMock
from twisted.internet import defer
from twisted.trial import unittest
from ooni import errors as e
from ooni.reporter import YAMLReporter, OONIBReporter, OONIBReportLog
class MockTest(object):
_start_time = time.time()
report = {'report_content': 'ham'}
input = 'spam'
test_details = {
'test_name': 'spam',
'test_version': '1.0',
'software_name': 'spam',
'software_version': '1.0',
'input_hashes': [],
'probe_asn': 'AS0',
'start_time': time.time()
}
oonib_new_report_message = {
'report_id': "2014-01-29T202038Z_AS0_" + "A" * 50,
'backend_version': "1.0"
}
oonib_generic_error_message = {
'error': 'generic-error'
}
class TestYAMLReporter(unittest.TestCase):
def setUp(self):
self.filename = ""
def tearDown(self):
if self.filename != "":
os.remove(self.filename)
def test_write_report(self):
y_reporter = YAMLReporter(test_details)
y_reporter.createReport()
with open(y_reporter.report_path) as f:
self.filename = y_reporter.report_path
report_entries = yaml.safe_load_all(f)
# Check for keys in header
entry = report_entries.next()
assert all(x in entry for x in ['test_name', 'test_version'])
class TestOONIBReporter(unittest.TestCase):
def setUp(self):
self.mock_response = {}
self.collector_address = 'http://example.com'
self.oonib_reporter = OONIBReporter(
test_details,
self.collector_address)
self.oonib_reporter.agent = MagicMock()
self.mock_agent_response = MagicMock()
def deliverBody(body_receiver):
body_receiver.dataReceived(json.dumps(self.mock_response))
body_receiver.connectionLost(None)
self.mock_agent_response.deliverBody = deliverBody
self.oonib_reporter.agent.request.return_value = defer.succeed(
self.mock_agent_response)
@defer.inlineCallbacks
def test_create_report(self):
self.mock_response = oonib_new_report_message
yield self.oonib_reporter.createReport()
assert self.oonib_reporter.reportID == oonib_new_report_message[
'report_id']
@defer.inlineCallbacks
def test_create_report_failure(self):
self.mock_response = oonib_generic_error_message
self.mock_agent_response.code = 406
yield self.assertFailure(self.oonib_reporter.createReport(),
e.OONIBReportCreationError)
@defer.inlineCallbacks
def test_write_report_entry(self):
req = {'content': 'something'}
yield self.oonib_reporter.writeReportEntry(req)
assert self.oonib_reporter.agent.request.called
class TestOONIBReportLog(unittest.TestCase):
def setUp(self):
self.report_log = OONIBReportLog('report_log')
self.report_log.create_report_log()
def tearDown(self):
os.remove(self.report_log.file_name)
@defer.inlineCallbacks
def test_report_created(self):
yield self.report_log.created("path_to_my_report.yaml",
'httpo://foo.onion',
'someid')
with open(self.report_log.file_name) as f:
report = yaml.safe_load(f)
assert "path_to_my_report.yaml" in report
@defer.inlineCallbacks
def test_concurrent_edit(self):
d1 = self.report_log.created("path_to_my_report1.yaml",
'httpo://foo.onion',
'someid1')
d2 = self.report_log.created("path_to_my_report2.yaml",
'httpo://foo.onion',
'someid2')
yield defer.DeferredList([d1, d2])
with open(self.report_log.file_name) as f:
report = yaml.safe_load(f)
assert "path_to_my_report1.yaml" in report
assert "path_to_my_report2.yaml" in report
@defer.inlineCallbacks
def test_report_closed(self):
yield self.report_log.created("path_to_my_report.yaml",
'httpo://foo.onion',
'someid')
yield self.report_log.closed("path_to_my_report.yaml")
with open(self.report_log.file_name) as f:
report = yaml.safe_load(f)
assert "path_to_my_report.yaml" not in report
@defer.inlineCallbacks
def test_report_creation_failed(self):
yield self.report_log.creation_failed("path_to_my_report.yaml",
'httpo://foo.onion')
with open(self.report_log.file_name) as f:
report = yaml.safe_load(f)
assert "path_to_my_report.yaml" in report
assert report["path_to_my_report.yaml"]["status"] == "creation-failed"
@defer.inlineCallbacks
def test_list_reports(self):
yield self.report_log.creation_failed("failed_report.yaml",
'httpo://foo.onion')
yield self.report_log.created("created_report.yaml",
'httpo://foo.onion', 'XXXX')
assert len(self.report_log.reports_in_progress) == 1
assert len(self.report_log.reports_incomplete) == 0
assert len(self.report_log.reports_to_upload) == 1
ooniprobe-1.3.2/ooni/nettest.py 0000644 0001750 0001750 00000062403 12531110611 014647 0 ustar irl irl import os
import re
import time
import sys
from hashlib import sha256
from twisted.internet import defer
from twisted.trial.runner import filenameToModule
from twisted.python import usage, reflect
from ooni import otime
from ooni.tasks import Measurement
from ooni.utils import log, sanitize_options, randomStr
from ooni.utils.net import hasRawSocketPermission
from ooni.settings import config
from ooni import errors as e
from inspect import getmembers
from StringIO import StringIO
class NoTestCasesFound(Exception):
pass
def getTestClassFromFile(net_test_file):
"""
Will return the first class that is an instance of NetTestCase.
XXX this means that if inside of a test there are more than 1 test case
then we will only run the first one.
"""
module = filenameToModule(net_test_file)
for __, item in getmembers(module):
try:
assert issubclass(item, NetTestCase)
return item
except (TypeError, AssertionError):
pass
def getOption(opt_parameter, required_options, type='text'):
"""
Arguments:
usage_options: a list as should be the optParameters of an UsageOptions
class.
required_options: a list containing the strings of the options that are
required.
type: a string containing the type of the option.
Returns:
a dict containing
{
'description': the description of the option,
'default': the default value of the option,
'required': True|False if the option is required or not,
'type': the type of the option ('text' or 'file')
}
"""
option_name, _, default, description = opt_parameter
if option_name in required_options:
required = True
else:
required = False
return {'description': description,
'value': default, 'required': required,
'type': type
}
def getArguments(test_class):
arguments = {}
if test_class.inputFile:
option_name = test_class.inputFile[0]
arguments[option_name] = getOption(
test_class.inputFile,
test_class.requiredOptions,
type='file')
try:
list(test_class.usageOptions.optParameters)
except AttributeError:
return arguments
for opt_parameter in test_class.usageOptions.optParameters:
option_name = opt_parameter[0]
opt_type = "text"
if opt_parameter[3].lower().startswith("file"):
opt_type = "file"
arguments[option_name] = getOption(
opt_parameter,
test_class.requiredOptions,
type=opt_type)
return arguments
def test_class_name_to_name(test_class_name):
return test_class_name.lower().replace(' ', '_')
def getNetTestInformation(net_test_file):
"""
Returns a dict containing:
{
'id': the test filename excluding the .py extension,
'name': the full name of the test,
'description': the description of the test,
'version': version number of this test,
'arguments': a dict containing as keys the supported arguments and as
values the argument description.
}
"""
test_class = getTestClassFromFile(net_test_file)
test_id = os.path.basename(net_test_file).replace('.py', '')
information = {'id': test_id,
'name': test_class.name,
'description': test_class.description,
'version': test_class.version,
'arguments': getArguments(test_class),
'path': net_test_file,
}
return information
class NetTestLoader(object):
method_prefix = 'test'
collector = None
requiresTor = False
reportID = None
def __init__(self, options, test_file=None, test_string=None):
self.onionInputRegex = re.compile(
"(httpo://[a-z0-9]{16}\.onion)/input/([a-z0-9]{64})$")
self.options = options
self.testCases = []
if test_file:
self.loadNetTestFile(test_file)
elif test_string:
self.loadNetTestString(test_string)
@property
def requiredTestHelpers(self):
required_test_helpers = []
if not self.testCases:
return required_test_helpers
for test_class, test_methods in self.testCases:
for option, name in test_class.requiredTestHelpers.items():
required_test_helpers.append({
'name': name,
'option': option,
'test_class': test_class
})
return required_test_helpers
@property
def inputFiles(self):
input_files = []
if not self.testCases:
return input_files
for test_class, test_methods in self.testCases:
if test_class.inputFile:
key = test_class.inputFile[0]
filename = test_class.localOptions[key]
if not filename:
continue
input_file = {
'key': key,
'test_class': test_class
}
m = self.onionInputRegex.match(filename)
if m:
input_file['url'] = filename
input_file['address'] = m.group(1)
input_file['hash'] = m.group(2)
else:
input_file['filename'] = filename
try:
with open(filename) as f:
h = sha256()
for l in f:
h.update(l)
except:
raise e.InvalidInputFile(filename)
input_file['hash'] = h.hexdigest()
input_files.append(input_file)
return input_files
@property
def testDetails(self):
from ooni import __version__ as software_version
input_file_hashes = []
for input_file in self.inputFiles:
input_file_hashes.append(input_file['hash'])
options = sanitize_options(self.options)
test_details = {
'start_time': otime.epochToUTC(time.time()),
'probe_asn': config.probe_ip.geodata['asn'],
'probe_cc': config.probe_ip.geodata['countrycode'],
'probe_ip': config.probe_ip.geodata['ip'],
'probe_city': config.probe_ip.geodata['city'],
'test_name': self.testName,
'test_version': self.testVersion,
'software_name': 'ooniprobe',
'software_version': software_version,
'options': options,
'input_hashes': input_file_hashes,
'report_id': self.reportID,
'test_helpers': self.testHelpers
}
return test_details
def _parseNetTestOptions(self, klass):
"""
Helper method to assemble the options into a single UsageOptions object
"""
usage_options = klass.usageOptions
if not hasattr(usage_options, 'optParameters'):
usage_options.optParameters = []
else:
for parameter in usage_options.optParameters:
if len(parameter) == 5:
parameter.pop()
if klass.inputFile:
usage_options.optParameters.append(klass.inputFile)
if klass.baseParameters:
for parameter in klass.baseParameters:
usage_options.optParameters.append(parameter)
if klass.baseFlags:
if not hasattr(usage_options, 'optFlags'):
usage_options.optFlags = []
for flag in klass.baseFlags:
usage_options.optFlags.append(flag)
return usage_options
@property
def usageOptions(self):
usage_options = None
for test_class, test_method in self.testCases:
if not usage_options:
usage_options = self._parseNetTestOptions(test_class)
else:
if usage_options != test_class.usageOptions:
raise e.IncoherentOptions(usage_options.__name__,
test_class.usageOptions.__name__)
return usage_options
def loadNetTestString(self, net_test_string):
"""
Load NetTest from a string.
WARNING input to this function *MUST* be sanitized and *NEVER* take
untrusted input.
Failure to do so will result in code exec.
net_test_string:
a string that contains the net test to be run.
"""
net_test_file_object = StringIO(net_test_string)
ns = {}
test_cases = []
exec net_test_file_object.read() in ns
for item in ns.itervalues():
test_cases.extend(self._get_test_methods(item))
if not test_cases:
raise e.NoTestCasesFound
self.setupTestCases(test_cases)
def loadNetTestFile(self, net_test_file):
"""
Load NetTest from a file.
"""
test_cases = []
module = filenameToModule(net_test_file)
for __, item in getmembers(module):
test_cases.extend(self._get_test_methods(item))
if not test_cases:
raise e.NoTestCasesFound
self.setupTestCases(test_cases)
def setupTestCases(self, test_cases):
"""
Creates all the necessary test_cases (a list of tuples containing the
NetTestCase (test_class, test_method))
example:
[(test_classA, test_method1),
(test_classA, test_method2),
(test_classA, test_method3),
(test_classA, test_method4),
(test_classA, test_method5),
(test_classB, test_method1),
(test_classB, test_method2)]
Note: the inputs must be valid for test_classA and test_classB.
net_test_file:
is either a file path or a file like object that will be used to
generate the test_cases.
"""
test_class, _ = test_cases[0]
self.testVersion = test_class.version
self.testName = test_class_name_to_name(test_class.name)
self.testCases = test_cases
self.testClasses = set([])
self.testHelpers = {}
if config.reports.unique_id is True:
self.reportID = randomStr(64)
for test_class, test_method in self.testCases:
self.testClasses.add(test_class)
def checkOptions(self):
"""
Call processTest and processOptions methods of each NetTestCase
"""
for klass in self.testClasses:
options = self.usageOptions()
try:
options.parseOptions(self.options)
except usage.UsageError:
tb = sys.exc_info()[2]
raise e.OONIUsageError(self), None, tb
if options:
klass.localOptions = options
test_instance = klass()
if test_instance.requiresRoot and not hasRawSocketPermission():
raise e.InsufficientPrivileges
if test_instance.requiresTor:
self.requiresTor = True
test_instance.requirements()
test_instance._checkRequiredOptions()
test_instance._checkValidOptions()
def _get_test_methods(self, item):
"""
Look for test_ methods in subclasses of NetTestCase
"""
test_cases = []
try:
assert issubclass(item, NetTestCase)
methods = reflect.prefixedMethodNames(item, self.method_prefix)
test_methods = []
for method in methods:
test_methods.append(self.method_prefix + method)
if test_methods:
test_cases.append((item, test_methods))
except (TypeError, AssertionError):
pass
return test_cases
class NetTestState(object):
def __init__(self, allTasksDone):
"""
This keeps track of the state of a running NetTests case.
Args:
allTasksDone is a deferred that will get fired once all the NetTest
cases have reached a final done state.
"""
self.doneTasks = 0
self.tasks = 0
self.completedScheduling = False
self.allTasksDone = allTasksDone
def taskCreated(self):
self.tasks += 1
def checkAllTasksDone(self):
log.debug("Checking all tasks for completion %s == %s" %
(self.doneTasks, self.tasks))
if self.completedScheduling and \
self.doneTasks == self.tasks:
self.allTasksDone.callback(self.doneTasks)
def taskDone(self):
"""
This is called every time a task has finished running.
"""
self.doneTasks += 1
self.checkAllTasksDone()
def allTasksScheduled(self):
"""
This should be called once all the tasks that need to run have been
scheduled.
XXX this is ghetto.
The reason for which we are calling allTasksDone inside of the
allTasksScheduled method is called after all tasks are done, then we
will run into a race condition. The race is that we don't end up
checking that all the tasks are complete because no task is to be
scheduled.
"""
self.completedScheduling = True
self.checkAllTasksDone()
class NetTest(object):
director = None
def __init__(self, net_test_loader, report):
"""
net_test_loader:
an instance of :class:ooni.nettest.NetTestLoader containing
the test to be run.
report:
an instance of :class:ooni.reporter.Reporter
"""
self.report = report
self.testCases = net_test_loader.testCases
self.testClasses = net_test_loader.testClasses
self.testDetails = net_test_loader.testDetails
self.summary = {}
# This will fire when all the measurements have been completed and
# all the reports are done. Done means that they have either completed
# successfully or all the possible retries have been reached.
self.done = defer.Deferred()
self.done.addCallback(self.doneNetTest)
self.state = NetTestState(self.done)
def __str__(self):
return ' '.join(tc.name for tc, _ in self.testCases)
def doneNetTest(self, result):
if self.summary:
print "Summary for %s" % self.testDetails['test_name']
print "------------" + "-"*len(self.testDetails['test_name'])
for test_class in self.testClasses:
test_instance = test_class()
test_instance.displaySummary(self.summary)
if self.testDetails["report_id"]:
print "Report ID: %s" % self.testDetails["report_id"]
def doneReport(self, report_results):
"""
This will get called every time a report is done and therefore a
measurement is done.
The state for the NetTest is informed of the fact that another task has
reached the done state.
"""
self.state.taskDone()
return report_results
def makeMeasurement(self, test_instance, test_method, test_input=None):
"""
Creates a new instance of :class:ooni.tasks.Measurement and add's it's
callbacks and errbacks.
Args:
test_class:
a subclass of :class:ooni.nettest.NetTestCase
test_method:
a string that represents the method to be called on test_class
test_input:
optional argument that represents the input to be passed to the
NetTestCase
"""
measurement = Measurement(test_instance, test_method, test_input)
measurement.netTest = self
if self.director:
measurement.done.addCallback(self.director.measurementSucceeded,
measurement)
measurement.done.addErrback(self.director.measurementFailed,
measurement)
return measurement
@defer.inlineCallbacks
def initializeInputProcessor(self):
for test_class, _ in self.testCases:
test_class.inputs = yield defer.maybeDeferred(
test_class().getInputProcessor
)
if not test_class.inputs:
test_class.inputs = [None]
def generateMeasurements(self):
"""
This is a generator that yields measurements and registers the
callbacks for when a measurement is successful or has failed.
"""
for test_class, test_methods in self.testCases:
# load the input processor as late as possible
for input in test_class.inputs:
measurements = []
test_instance = test_class()
test_instance.summary = self.summary
for method in test_methods:
log.debug("Running %s %s" % (test_class, method))
measurement = self.makeMeasurement(
test_instance,
method,
input)
measurements.append(measurement.done)
self.state.taskCreated()
yield measurement
# When the measurement.done callbacks have all fired
# call the postProcessor before writing the report
if self.report:
post = defer.DeferredList(measurements)
@post.addBoth
def set_runtime(results):
runtime = time.time() - test_instance._start_time
for _, m in results:
m.testInstance.report['test_runtime'] = runtime
test_instance.report['test_runtime'] = runtime
return results
# Call the postProcessor, which must return a single report
# or a deferred
post.addCallback(test_instance.postProcessor)
def noPostProcessor(failure, report):
failure.trap(e.NoPostProcessor)
return report
post.addErrback(noPostProcessor, test_instance.report)
post.addCallback(self.report.write)
if self.report and self.director:
# ghetto hax to keep NetTestState counts are accurate
[post.addBoth(self.doneReport) for _ in measurements]
self.state.allTasksScheduled()
class NetTestCase(object):
"""
This is the base of the OONI nettest universe. When you write a nettest
you will subclass this object.
* inputs: can be set to a static set of inputs. All the tests (the methods
starting with the "test" prefix) will be run once per input. At every
run the _input_ attribute of the TestCase instance will be set to the
value of the current iteration over inputs. Any python iterable object
can be set to inputs.
* inputFile: attribute should be set to an array containing the command
line argument that should be used as the input file. Such array looks
like this:
``["commandlinearg", "c", "default value" "The description"]``
The second value of such arrray is the shorthand for the command line
arg. The user will then be able to specify inputs to the test via:
``ooniprobe mytest.py --commandlinearg path/to/file.txt``
or
``ooniprobe mytest.py -c path/to/file.txt``
* inputProcessor: should be set to a function that takes as argument a
filename and it will return the input to be passed to the test
instance.
* name: should be set to the name of the test.
* author: should contain the name and contact details for the test author.
The format for such string is as follows:
``The Name ``
* version: is the version string of the test.
* requiresRoot: set to True if the test must be run as root.
* usageOptions: a subclass of twisted.python.usage.Options for processing
of command line arguments
* localOptions: contains the parsed command line arguments.
Quirks:
Every class that is prefixed with test *must* return a
twisted.internet.defer.Deferred.
"""
name = "This test is nameless"
author = "Jane Doe "
version = "0.0.0"
description = "Sorry, this test has no description :("
inputs = None
inputFile = None
inputFilename = None
report = {}
usageOptions = usage.Options
optParameters = None
baseParameters = None
baseFlags = None
requiredTestHelpers = {}
requiredOptions = []
requiresRoot = False
requiresTor = False
localOptions = {}
def _setUp(self):
"""
This is the internal setup method to be overwritten by templates.
"""
self.report = {}
self.inputs = None
def requirements(self):
"""
Place in here logic that will be executed before the test is to be run.
If some condition is not met then you should raise an exception.
"""
pass
def setUp(self):
"""
Place here your logic to be executed when the test is being setup.
"""
pass
def postProcessor(self, measurements):
"""
Subclass this to do post processing tasks that are to occur once all
the test methods have been called once per input.
postProcessing works exactly like test methods, in the sense that
anything that gets written to the object self.report[] will be added to
the final test report.
You should also place in this method any logic that is required for
generating the summary.
"""
raise e.NoPostProcessor
def displaySummary(self, summary):
"""
This gets called after the test has run to allow printing out of a
summary of the test run.
"""
pass
def inputProcessor(self, filename):
"""
You may replace this with your own custom input processor. It takes as
input a file name.
An inputProcessor is an iterator that will yield one item from the file
and takes as argument a filename.
This can be useful when you have some input data that is in a certain
format and you want to set the input attribute of the test to something
that you will be able to properly process.
For example you may wish to have an input processor that will allow you
to ignore comments in files. This can be easily achieved like so::
fp = open(filename)
for x in fp.xreadlines():
if x.startswith("#"):
continue
yield x.strip()
fp.close()
Other fun stuff is also possible.
"""
log.debug("Running default input processor")
with open(filename) as f:
for line in f:
l = line.strip()
# Skip empty lines
if not l:
continue
# Skip comment lines
elif l.startswith('#'):
continue
yield l
@property
def inputFileSpecified(self):
"""
Returns:
True
when inputFile is supported and is specified
False
when input is either not support or not specified
"""
if not self.inputFile:
return False
k = self.inputFile[0]
if self.localOptions.get(k):
return True
else:
return False
def getInputProcessor(self):
"""
This method must be called after all options are validated by
_checkValidOptions and _checkRequiredOptions, which ensure that
if the inputFile is a required option it will be present.
We check to see if it's possible to have an input file and if the user
has specified such file.
If the operations to be done here are network related or blocking, they
should be wrapped in a deferred. That is the return value of this
method should be a :class:`twisted.internet.defer.Deferred`.
Returns:
a generator that will yield one item from the file based on the
inputProcessor.
"""
if self.inputFileSpecified:
self.inputFilename = self.localOptions[self.inputFile[0]]
return self.inputProcessor(self.inputFilename)
if self.inputs:
return self.inputs
return None
def _checkValidOptions(self):
for option in self.localOptions:
if option not in self.usageOptions():
if not self.inputFile or option not in self.inputFile:
raise e.InvalidOption
def _checkRequiredOptions(self):
missing_options = []
for required_option in self.requiredOptions:
log.debug("Checking if %s is present" % required_option)
if required_option not in self.localOptions or \
self.localOptions[required_option] is None:
missing_options.append(required_option)
if missing_options:
raise e.MissingRequiredOption(missing_options, self)
def __repr__(self):
return "<%s inputs=%s>" % (self.__class__, self.inputs)
ooniprobe-1.3.2/ooni/nettests/ 0000755 0001750 0001750 00000000000 12623630152 014464 5 ustar irl irl ooniprobe-1.3.2/ooni/nettests/third_party/ 0000755 0001750 0001750 00000000000 12623630152 017015 5 ustar irl irl ooniprobe-1.3.2/ooni/nettests/third_party/__init__.py 0000644 0001750 0001750 00000000000 12373757547 021140 0 ustar irl irl ooniprobe-1.3.2/ooni/nettests/third_party/psiphon.py 0000644 0001750 0001750 00000011605 12623613431 021053 0 ustar irl irl import tempfile
import stat
import os
import sys
from twisted.internet import defer, reactor
from twisted.internet.endpoints import TCP4ClientEndpoint
from twisted.web.client import readBody
from twisted.python import usage
from txsocksx.http import SOCKS5Agent
from ooni.errors import handleAllFailures, TaskTimedOut
from ooni.utils import log
from ooni.templates import process, httpt
from ooni.templates.process import ProcessTest
class UsageOptions(usage.Options):
log.debug("UsageOptions")
optParameters = [
['url', 'u', None, 'Specify a single URL to test.'],
['psiphonpath', 'p', None, 'Specify psiphon python client path.'],
['socksproxy', 's', None, 'Specify psiphon socks proxy ip:port.'],]
class PsiphonTest(httpt.HTTPTest, process.ProcessTest):
"""
This class tests Psiphon python client
test_psiphon:
Starts a Psiphon, check if it bootstraps successfully
(print a line in stdout).
Then, perform an HTTP request using the proxy
"""
name = "Psiphon Test"
description = "Bootstraps Psiphon and \
does a HTTP GET for the specified URL"
author = "juga"
version = "0.0.1"
timeout = 20
usageOptions = UsageOptions
def _setUp(self):
# it is necessary to do this in _setUp instead of setUp
# because it needs to happen before HTTPTest's _setUp.
# incidentally, setting this option in setUp results in HTTPTest
# *saying* it is using this proxy while not actually using it.
log.debug('PiphonTest._setUp: setting socksproxy')
self.localOptions['socksproxy'] = '127.0.0.1:1080'
super(PsiphonTest, self)._setUp()
def setUp(self):
log.debug('PsiphonTest.setUp')
self.bootstrapped = defer.Deferred()
if self.localOptions['url']:
self.url = self.localOptions['url']
else:
self.url = 'https://check.torproject.org'
if self.localOptions['psiphonpath']:
self.psiphonpath = self.localOptions['psiphonpath']
else:
# Psiphon is not installable and to run it manually, it has to be
# run from the psiphon directory, so it wouldn't make sense to
# install it in the PATH. For now, we assume that Psiphon sources
# are in the user's home directory.
from os import path, getenv
self.psiphonpath = path.join(
getenv('HOME'), 'psiphon-circumvention-system/pyclient')
log.debug('psiphon path: %s' % self.psiphonpath)
# psi_client.py can not be run directly because the paths in the
# code are relative, so it'll fail to execute from this test
x = """
from psi_client import connect
connect(False)
"""
f = tempfile.NamedTemporaryFile(delete=False)
f.write(x)
f.close()
self.command = [sys.executable, f.name]
log.debug('command: %s' % ''.join(self.command))
def handleRead(self, stdout, stderr):
if 'Press Ctrl-C to terminate.' in self.processDirector.stdout:
if not self.bootstrapped.called:
log.debug("PsiphonTest: calling bootstrapped.callback")
self.bootstrapped.callback(None)
def test_psiphon(self):
log.debug('PsiphonTest.test_psiphon')
if not os.path.exists(self.psiphonpath):
log.err('psiphon path does not exists, is it installed?')
self.report['success'] = False
self.report['psiphon_installed'] = False
log.debug("Adding %s to report" % self.report)
# XXX: the original code written by juga0 readed
# > return defer.succeed(None)
# but this caused `ooniprobe -ng` to hang forever, so I
# rewrote the code to return a deferred and simulate calling
# its callback method, to trigger an event.
# -sbs
reactor.callLater(0.0, self.bootstrapped.callback, None)
return self.bootstrapped
self.report['psiphon_installed'] = True
log.debug("Adding %s to report" % self.report)
# Using pty to see output lines as soon as they get wrotten in the
# buffer, otherwise the test might not see lines until the buffer is
# full with some block size and therefore the test would
# terminate with error
finished = self.run(self.command,
env=dict(PYTHONPATH=self.psiphonpath),
path=self.psiphonpath,
usePTY=1)
def callDoRequest(_):
return self.doRequest(self.url)
self.bootstrapped.addCallback(callDoRequest)
def cleanup(_):
log.debug('PsiphonTest:cleanup')
self.processDirector.transport.signalProcess('INT')
os.remove(self.command[1])
return finished
self.bootstrapped.addBoth(cleanup)
return self.bootstrapped
ooniprobe-1.3.2/ooni/nettests/third_party/netalyzr.py 0000644 0001750 0001750 00000004322 12463144534 021246 0 ustar irl irl # -*- encoding: utf-8 -*-
#
# This is a wrapper around the Netalyzer Java command line client
#
# :authors: Jacob Appelbaum
# Arturo "hellais" Filastò
# :licence: see LICENSE
from ooni import nettest
from ooni.utils import log
import time
import os
from twisted.internet import reactor, threads
class NetalyzrWrapperTest(nettest.NetTestCase):
name = "NetalyzrWrapper"
requiresRoot = False
requiresTor = False
def setUp(self):
cwd = os.path.abspath(os.path.join(os.path.abspath(__file__), '..'))
# XXX set the output directory to something more uniform
outputdir = os.path.join(cwd, '..', '..')
program_path = os.path.join(cwd, 'NetalyzrCLI.jar')
program = "java -jar %s -d" % program_path
test_token = time.asctime(time.gmtime()).replace(" ", "_").strip()
self.output_file = os.path.join(outputdir,
"NetalyzrCLI_" + test_token + ".out")
self.output_file.strip()
self.run_me = program + " 2>&1 >> " + self.output_file
def blocking_call(self):
try:
result = threads.blockingCallFromThread(reactor, os.system, self.run_me)
except:
log.debug("Netalyzr had an error, please see the log file: %s" % self.output_file)
finally:
self.clean_up()
def clean_up(self):
self.report['netalyzr_report'] = self.output_file
log.debug("finished running NetalzrWrapper")
log.debug("Please check %s for Netalyzr output" % self.output_file)
def test_run_netalyzr(self):
"""
This test simply wraps netalyzr and runs it from command line
"""
log.msg("Running NetalyzrWrapper (this will take some time, be patient)")
log.debug("with command '%s'" % self.run_me)
# XXX we probably want to use a processprotocol here to obtain the
# stdout from Netalyzr. This would allows us to visualize progress
# (currently there is no progress because the stdout of os.system is
# trapped by twisted) and to include the link to the netalyzr report
# directly in the OONI report, perhaps even downloading it.
reactor.callInThread(self.blocking_call)
ooniprobe-1.3.2/ooni/nettests/third_party/lantern.py 0000644 0001750 0001750 00000006753 12620155315 021045 0 ustar irl irl from twisted.internet import defer, reactor
from twisted.internet.endpoints import TCP4ClientEndpoint
from twisted.python import usage
from twisted.web.client import ProxyAgent, readBody
from ooni.templates.process import ProcessTest, ProcessDirector
from ooni.utils import log
from ooni.errors import handleAllFailures, TaskTimedOut
import os.path
from os import getenv
class UsageOptions(usage.Options):
optParameters = [
['url', 'u', None, 'Specify a single URL to test.'],]
class LanternProcessDirector(ProcessDirector):
"""
This Process Director monitors Lantern during its
bootstrap and fires a callback if bootstrap is
successful or an errback if it fails to bootstrap
before timing out.
"""
def __init__(self, d, timeout=None):
self.d = d
self.stderr = ""
self.stdout = ""
self.finished = None
self.timeout = timeout
self.stdin = None
self.timer = None
self.exit_reason = None
self.bootstrapped = defer.Deferred()
def outReceived(self, data):
self.stdout += data
# output received, see if we have bootstrapped
if not self.bootstrapped.called and "client (http) proxy at" in self.stdout:
log.debug("Bootstrap Detected")
self.cancelTimer()
self.bootstrapped.callback("bootstrapped")
class LanternTest(ProcessTest):
"""
This class tests Lantern (https://getlantern.org).
test_lantern_circumvent
Starts Lantern on Linux in --headless mode and
determine if it bootstraps successfully or not.
Then, make a HTTP request for http://google.com
and records the response body or failure string.
"""
name = "Lantern Circumvention Tool Test"
description = "Bootstraps Lantern and does a HTTP GET for the specified URL"
author = "Aaron Gibson"
version = "0.0.1"
timeout = 20
usageOptions = UsageOptions
requiredOptions = ['url']
def setUp(self):
self.command = ["lantern_linux", "--headless"]
self.d = defer.Deferred()
self.processDirector = LanternProcessDirector(self.d, timeout=self.timeout)
self.d.addCallback(self.processEnded, self.command)
if self.localOptions['url']:
self.url = self.localOptions['url']
def runLantern(self):
paths = filter(os.path.exists,[os.path.join(os.path.expanduser(x), self.command[0]) for x in getenv('PATH').split(':')])
log.debug("Spawning Lantern")
reactor.spawnProcess(self.processDirector, paths[0], self.command)
def test_lantern_circumvent(self):
proxyEndpoint=TCP4ClientEndpoint(reactor, '127.0.0.1', 8787)
agent = ProxyAgent(proxyEndpoint, reactor)
def addResultToReport(result):
self.report['body'] = result
self.report['success'] = True
def addFailureToReport(failure):
self.report['failure'] = handleAllFailures(failure)
self.report['success'] = False
def doRequest(noreason):
log.debug("Doing HTTP request via Lantern (127.0.0.1:8787) for %s" % self.url)
request = agent.request("GET", self.url)
request.addCallback(readBody)
request.addCallback(addResultToReport)
request.addCallback(self.processDirector.close)
return request
self.processDirector.bootstrapped.addCallback(doRequest)
self.processDirector.bootstrapped.addErrback(addFailureToReport)
self.runLantern()
return self.d
ooniprobe-1.3.2/ooni/nettests/scanning/ 0000755 0001750 0001750 00000000000 12623630152 016264 5 ustar irl irl ooniprobe-1.3.2/ooni/nettests/scanning/__init__.py 0000644 0001750 0001750 00000000000 12373757547 020407 0 ustar irl irl ooniprobe-1.3.2/ooni/nettests/scanning/http_url_list.py 0000644 0001750 0001750 00000006276 12623613431 021546 0 ustar irl irl # -*- encoding: utf-8 -*-
#
# :authors: Arturo Filastò
# :licence: see LICENSE
from twisted.python import usage
from ooni.templates import httpt
from ooni.utils import log
class UsageOptions(usage.Options):
optParameters = [['content', 'c', None,
'The file to read from containing the content of a block page'],
['url', 'u', None, 'Specify a single URL to test.']
]
class HTTPURLList(httpt.HTTPTest):
"""
Performs GET, POST and PUT requests to a list of URLs specified as
input and checks if the page that we get back as a result matches that
of a block page given as input.
If no block page is given as input to the test it will simply collect the
responses to the HTTP requests and write them to a report file.
"""
name = "HTTP URL List"
author = "Arturo Filastò"
version = "0.1.3"
usageOptions = UsageOptions
requiresRoot = False
requiresTor = False
inputFile = ['file', 'f', None,
'List of URLS to perform GET and POST requests to']
def setUp(self):
"""
Check for inputs.
"""
if self.input:
self.url = self.input
elif self.localOptions['url']:
self.url = self.localOptions['url']
else:
raise Exception("No input specified")
def check_for_content_censorship(self, body):
"""
If we have specified what a censorship page looks like here we will
check if the page we are looking at matches it.
XXX this is not tested, though it is basically what was used to detect
censorship in the palestine case.
"""
self.report['censored'] = True
censorship_page = open(self.localOptions['content']).xreadlines()
response_page = iter(body.split("\n"))
# We first align the two pages to the first HTML tag (something
# starting with <). This is useful so that we can give as input to this
# test something that comes from the output of curl -kis
# http://the_page/
for line in censorship_page:
if line.strip().startswith("<"):
break
for line in response_page:
if line.strip().startswith("<"):
break
for censorship_line in censorship_page:
try:
response_line = response_page.next()
except StopIteration:
# The censored page and the response we got do not match in
# length.
self.report['censored'] = False
break
censorship_line = censorship_line.replace("\n", "")
if response_line != censorship_line:
self.report['censored'] = False
censorship_page.close()
def processResponseBody(self, body):
if self.localOptions['content']:
log.msg("Checking for censorship in response body")
self.check_for_content_censorship(body)
def test_get(self):
return self.doRequest(self.url, method="GET")
def test_post(self):
return self.doRequest(self.url, method="POST")
def test_put(self):
return self.doRequest(self.url, method="PUT")
ooniprobe-1.3.2/ooni/nettests/__init__.py 0000644 0001750 0001750 00000000000 12373757530 016577 0 ustar irl irl ooniprobe-1.3.2/ooni/nettests/manipulation/ 0000755 0001750 0001750 00000000000 12623630152 017164 5 ustar irl irl ooniprobe-1.3.2/ooni/nettests/manipulation/captiveportal.py 0000644 0001750 0001750 00000062757 12531110611 022423 0 ustar irl irl # -*- coding: utf-8 -*-
# captiveportal
# *************
#
# This test is a collection of tests to detect the presence of a
# captive portal. Code is taken, in part, from the old ooni-probe,
# which was written by Jacob Appelbaum and Arturo Filastò.
#
# This module performs multiple tests that match specific vendor captive
# portal tests. This is a basic internet captive portal filter tester written
# for RECon 2011.
#
# Read the following URLs to understand the captive portal detection process
# for various vendors:
#
# http://technet.microsoft.com/en-us/library/cc766017%28WS.10%29.aspx
# http://blog.superuser.com/2011/05/16/windows-7-network-awareness/
# http://isc.sans.org/diary.html?storyid=10312&
# http://src.chromium.org/viewvc/chrome?view=rev&revision=74608
# http://code.google.com/p/chromium-os/issues/detail?3281ttp,
# http://crbug.com/52489
# http://crbug.com/71736
# https://bugzilla.mozilla.org/show_bug.cgi?id=562917
# https://bugzilla.mozilla.org/show_bug.cgi?id=603505
# http://lists.w3.org/Archives/Public/ietf-http-wg/2011JanMar/0086.html
# http://tools.ietf.org/html/draft-nottingham-http-portal-02
#
# :authors: Jacob Appelbaum, Arturo Filastò, Isis Lovecruft
# :license: see LICENSE for more details
import random
import re
import string
from urlparse import urlparse
from twisted.names import error
from twisted.python import usage
from twisted.internet import defer
from ooni.templates import httpt, dnst
from ooni.utils import net
from ooni.utils import log
__plugoo__ = "captiveportal"
__desc__ = "Captive portal detection test"
class UsageOptions(usage.Options):
optParameters = [['asset', 'a', None, 'Asset file'],
['experiment-url', 'e', 'http://google.com/', 'Experiment URL'],
['user-agent', 'u', random.choice(net.userAgents),
'User agent for HTTP requests']
]
class CaptivePortal(httpt.HTTPTest, dnst.DNSTest):
"""
Compares content and status codes of HTTP responses, and attempts
to determine if content has been altered.
"""
name = "captiveportal"
description = "Captive Portal Test"
version = '0.3'
author = "Isis Lovecruft"
usageOptions = UsageOptions
requiresRoot = False
requiresTor = False
@defer.inlineCallbacks
def http_fetch(self, url, headers={}):
"""
Parses an HTTP url, fetches it, and returns a response
object.
"""
url = urlparse(url).geturl()
#XXX: HTTP Error 302: The HTTP server returned a redirect error that
#would lead to an infinite loop. The last 30x error message was: Found
try:
response = yield self.doRequest(url, "GET", headers)
defer.returnValue(response)
except Exception:
log.err("HTTPError")
defer.returnValue(None)
@defer.inlineCallbacks
def http_content_match_fuzzy_opt(self, experimental_url, control_result,
headers=None, fuzzy=False):
"""
Makes an HTTP request on port 80 for experimental_url, then
compares the response_content of experimental_url with the
control_result. Optionally, if the fuzzy parameter is set to
True, the response_content is compared with a regex of the
control_result. If the response_content from the
experimental_url and the control_result match, returns True
with the HTTP status code and headers; False, status code, and
headers if otherwise.
"""
if headers is None:
default_ua = self.local_options['user-agent']
headers = {'User-Agent': default_ua}
response = yield self.http_fetch(experimental_url, headers)
response_headers = response.headers
response_content = response.body if response else None
response_code = response.code if response else None
if response_content is None:
log.err("HTTP connection appears to have failed.")
r = (False, False, False)
defer.returnValue(r)
if fuzzy:
pattern = re.compile(control_result)
match = pattern.search(response_content)
log.msg("Fuzzy HTTP content comparison for experiment URL")
log.msg("'%s'" % experimental_url)
if not match:
log.msg("does not match!")
r = (False, response_code, response_headers)
defer.returnValue(r)
else:
log.msg("and the expected control result yielded a match.")
r = (True, response_code, response_headers)
defer.returnValue(r)
else:
if str(response_content) != str(control_result):
log.msg("HTTP content comparison of experiment URL")
log.msg("'%s'" % experimental_url)
log.msg("and the expected control result do not match.")
r = (False, response_code, response_headers)
defer.returnValue(r)
else:
r = (True, response_code, response_headers)
defer.returnValue(r)
def http_status_code_match(self, experiment_code, control_code):
"""
Compare two HTTP status codes, returns True if they match.
"""
return int(experiment_code) == int(control_code)
def http_status_code_no_match(self, experiment_code, control_code):
"""
Compare two HTTP status codes, returns True if they do not match.
"""
return int(experiment_code) != int(control_code)
@defer.inlineCallbacks
def dns_resolve(self, hostname, nameserver=None):
"""
Resolves hostname(s) though nameserver to corresponding
address(es). hostname may be either a single hostname string,
or a list of strings. If nameserver is not given, use local
DNS resolver, and if that fails try using 8.8.8.8.
"""
if isinstance(hostname, str):
hostname = [hostname]
response = []
answer = None
for hn in hostname:
try:
answer = yield self.performALookup(hn)
if not answer:
answer = yield self.performALookup(hn, ('8.8.8.8', 53))
except error.DNSNameError:
log.msg("DNS resolution for %s returned NXDOMAIN" % hn)
response.append('NXDOMAIN')
except Exception:
log.err("DNS Resolution failed")
finally:
if not answer:
defer.returnValue(response)
for addr in answer:
response.append(addr)
defer.returnValue(response)
@defer.inlineCallbacks
def dns_resolve_match(self, experiment_hostname, control_address):
"""
Resolve experiment_hostname, and check to see that it returns
an experiment_address which matches the control_address. If
they match, returns True and experiment_address; otherwise
returns False and experiment_address.
"""
experiment_address = yield self.dns_resolve(experiment_hostname)
if not experiment_address:
log.debug("dns_resolve() for %s failed" % experiment_hostname)
ret = None, experiment_address
defer.returnValue(ret)
if len(set(experiment_address) & set([control_address])) > 0:
ret = True, experiment_address
defer.returnValue(ret)
else:
log.msg("DNS comparison of control '%s' does not" % control_address)
log.msg("match experiment response '%s'" % experiment_address)
ret = False, experiment_address
defer.returnValue(ret)
@defer.inlineCallbacks
def get_auth_nameservers(self, hostname):
"""
Many CPs set a nameserver to be used. Let's query that
nameserver for the authoritative nameservers of hostname.
The equivalent of:
$ dig +short NS ooni.nu
"""
auth_nameservers = yield self.performNSLookup(hostname)
defer.returnValue(auth_nameservers)
def hostname_to_0x20(self, hostname):
"""
MaKEs yOur HOsTnaME lOoK LiKE THis.
For more information, see:
D. Dagon, et. al. "Increased DNS Forgery Resistance
Through 0x20-Bit Encoding". Proc. CSS, 2008.
"""
hostname_0x20 = ''
for char in hostname:
l33t = random.choice(['caps', 'nocaps'])
if l33t == 'caps':
hostname_0x20 += char.capitalize()
else:
hostname_0x20 += char.lower()
return hostname_0x20
@defer.inlineCallbacks
def check_0x20_to_auth_ns(self, hostname, sample_size=None):
"""
Resolve a 0x20 DNS request for hostname over hostname's
authoritative nameserver(s), and check to make sure that
the capitalization in the 0x20 request matches that of the
response. Also, check the serial numbers of the SOA (Start
of Authority) records on the authoritative nameservers to
make sure that they match.
If sample_size is given, a random sample equal to that number
of authoritative nameservers will be queried; default is 5.
"""
log.msg("")
log.msg("Testing random capitalization of DNS queries...")
log.msg("Testing that Start of Authority serial numbers match...")
auth_nameservers = yield self.get_auth_nameservers(hostname)
if sample_size is None:
sample_size = 5
res = yield self.dns_resolve(auth_nameservers)
resolved_auth_ns = random.sample(res, sample_size)
querynames = []
answernames = []
serials = []
# Even when gevent monkey patching is on, the requests here
# are sent without being 0x20'd, so we need to 0x20 them.
hostname = self.hostname_to_0x20(hostname)
for auth_ns in resolved_auth_ns:
querynames.append(hostname)
try:
answer = yield self.performSOALookup(hostname, (auth_ns, 53))
except Exception:
continue
for soa in answer:
answernames.append(soa[0])
serials.append(str(soa[1]))
if len(set(querynames).intersection(answernames)) == 1:
log.msg("Capitalization in DNS queries and responses match.")
name_match = True
else:
log.msg("The random capitalization '%s' used in" % hostname)
log.msg("DNS queries to that hostname's authoritative")
log.msg("nameservers does not match the capitalization in")
log.msg("the response.")
name_match = False
if len(set(serials)) == 1:
log.msg("Start of Authority serial numbers all match.")
serial_match = True
else:
log.msg("Some SOA serial numbers did not match the rest!")
serial_match = False
if name_match and serial_match:
log.msg("Your DNS queries do not appear to be tampered.")
elif name_match or serial_match:
log.msg("Something is tampering with your DNS queries.")
elif not name_match and not serial_match:
log.msg("Your DNS queries are definitely being tampered with.")
ret = {
'result': name_match and serial_match,
'name_match': name_match,
'serial_match': serial_match,
'querynames': querynames,
'answernames': answernames,
'SOA_serials': serials
}
defer.returnValue(ret)
def get_random_url_safe_string(self, length):
"""
Returns a random url-safe string of specified length, where
0 < length <= 256. The returned string will always start with
an alphabetic character.
"""
if length <= 0:
length = 1
elif length > 256:
length = 256
random_string = ''
while length > 0:
random_string += random.choice(string.lowercase)
length -= 1
return random_string
def get_random_hostname(self, length=None):
"""
Returns a random hostname with SLD of specified length. If
length is unspecified, length=32 is used.
These *should* all resolve to NXDOMAIN. If they actually
resolve to a box that isn't part of a captive portal that
would be rather interesting.
"""
if length is None:
length = 32
random_sld = self.get_random_url_safe_string(length)
tld_list = ['.com', '.net', '.org', '.info', '.test', '.invalid']
random_tld = random.choice(tld_list)
random_hostname = random_sld + random_tld
return random_hostname
@defer.inlineCallbacks
def compare_random_hostnames(self, hostname_count=None, hostname_length=None):
"""
Get hostname_count number of random hostnames with SLD length
of hostname_length, and then attempt DNS resolution. If no
arguments are given, default to three hostnames of 32 bytes
each. These random hostnames *should* resolve to NXDOMAIN,
except in the case where a user is presented with a captive
portal and remains unauthenticated, in which case the captive
portal may return the address of the authentication page.
If the cardinality of the intersection of the set of resolved
random hostnames and the single element control set
(['NXDOMAIN']) are equal to one, then DNS properly resolved.
Returns true if only NXDOMAINs were returned, otherwise returns
False with the relative complement of the control set in the
response set.
"""
if hostname_count is None:
hostname_count = 3
log.msg("Generating random hostnames...")
log.msg("Resolving DNS for %d random hostnames..." % hostname_count)
control = ['NXDOMAIN']
responses = []
for x in range(hostname_count):
random_hostname = self.get_random_hostname(hostname_length)
response_match, response_address = yield self.dns_resolve_match(random_hostname,
control[0])
for address in response_address:
if response_match is False:
log.msg("Strangely, DNS resolution of the random hostname")
log.msg("%s actually points to %s"
% (random_hostname, response_address))
responses = responses + [address]
else:
responses = responses + [address]
intersection = set(responses) & set(control)
relative_complement = set(responses) - set(control)
r = set(responses)
if len(intersection) == 1:
log.msg("All %d random hostnames properly resolved to NXDOMAIN."
% hostname_count)
ret = True, relative_complement
defer.returnValue(ret)
elif (len(intersection) == 0) and (len(r) > 1):
log.msg("Something odd happened. Some random hostnames correctly")
log.msg("resolved to NXDOMAIN, but several others resolved to")
log.msg("to the following addresses: %s" % relative_complement)
ret = False, relative_complement
defer.returnValue(ret)
elif (len(intersection) == 0) and (len(r) == 1):
log.msg("All random hostnames resolved to the IP address ")
log.msg("'%s', which is indicative of a captive portal." % r)
ret = False, relative_complement
defer.returnValue(ret)
else:
log.debug("Apparently, pigs are flying on your network, 'cause a")
log.debug("bunch of hostnames made from 32-byte random strings")
log.debug("just magically resolved to a bunch of random addresses.")
log.debug("That is definitely highly improbable. In fact, my napkin")
log.debug("tells me that the probability of just one of those")
log.debug("hostnames resolving to an address is 1.68e-59, making")
log.debug("it nearly twice as unlikely as an MD5 hash collision.")
log.debug("Either someone is seriously messing with your network,")
log.debug("or else you are witnessing the impossible. %s" % r)
ret = False, relative_complement
defer.returnValue(ret)
@defer.inlineCallbacks
def google_dns_cp_test(self):
"""
Google Chrome resolves three 10-byte random hostnames.
"""
subtest = "Google Chrome DNS-based"
log.msg("Running the Google Chrome DNS-based captive portal test...")
gmatch, google_dns_result = yield self.compare_random_hostnames(3, 10)
ret = {
'result': gmatch,
'addresses': google_dns_result
}
if gmatch:
log.msg("Google Chrome DNS-based captive portal test did not")
log.msg("detect a captive portal.")
defer.returnValue(ret)
else:
log.msg("Google Chrome DNS-based captive portal test believes")
log.msg("you are in a captive portal, or else something very")
log.msg("odd is happening with your DNS.")
defer.returnValue(ret)
@defer.inlineCallbacks
def ms_dns_cp_test(self):
"""
Microsoft "phones home" to a server which will always resolve
to the same address.
"""
subtest = "Microsoft NCSI DNS-based"
log.msg("")
log.msg("Running the Microsoft NCSI DNS-based captive portal")
log.msg("test...")
msmatch, ms_dns_result = yield self.dns_resolve_match("dns.msftncsi.com",
"131.107.255.255")
ret = {
'result': msmatch,
'address': ms_dns_result
}
if msmatch:
log.msg("Microsoft NCSI DNS-based captive portal test did not")
log.msg("detect a captive portal.")
defer.returnValue(ms_dns_result)
else:
log.msg("Microsoft NCSI DNS-based captive portal test ")
log.msg("believes you are in a captive portal.")
defer.returnValue(ms_dns_result)
@defer.inlineCallbacks
def run_vendor_dns_tests(self):
"""
Run the vendor DNS tests.
"""
report = {}
report['google_dns_cp'] = yield self.google_dns_cp_test()
report['ms_dns_cp'] = yield self.ms_dns_cp_test()
defer.returnValue(report)
@defer.inlineCallbacks
def run_vendor_tests(self, *a, **kw):
"""
These are several vendor tests used to detect the presence of
a captive portal. Each test compares HTTP status code and
content to the control results and has its own User-Agent
string, in order to emulate the test as it would occur on the
device it was intended for. Vendor tests are defined in the
format:
[exp_url, ctrl_result, ctrl_code, ua, test_name]
"""
vendor_tests = [['http://www.apple.com/library/test/success.html',
'Success',
'200',
'Mozilla/5.0 (iPhone; U; CPU like Mac OS X; en) AppleWebKit/420+ (KHTML, like Gecko) Version/3.0 Mobile/1A543a Safari/419.3',
'Apple HTTP Captive Portal'],
['http://tools.ietf.org/html/draft-nottingham-http-portal-02',
'428 Network Authentication Required',
'428',
'Mozilla/5.0 (Windows NT 6.1; rv:5.0) Gecko/20100101 Firefox/5.0',
'W3 Captive Portal'],
['http://www.msftncsi.com/ncsi.txt',
'Microsoft NCSI',
'200',
'Microsoft NCSI',
'MS HTTP Captive Portal', ]]
cm = self.http_content_match_fuzzy_opt
sm = self.http_status_code_match
snm = self.http_status_code_no_match
@defer.inlineCallbacks
def compare_content(status_func, fuzzy, experiment_url, control_result,
control_code, headers, test_name):
log.msg("")
log.msg("Running the %s test..." % test_name)
content_match, experiment_code, experiment_headers = yield cm(experiment_url,
control_result,
headers, fuzzy)
status_match = status_func(experiment_code, control_code)
if status_match and content_match:
log.msg("The %s test was unable to detect" % test_name)
log.msg("a captive portal.")
defer.returnValue(True)
else:
log.msg("The %s test shows that your network" % test_name)
log.msg("is filtered.")
defer.returnValue(False)
result = {}
for vt in vendor_tests:
report = {}
experiment_url = vt[0]
control_result = vt[1]
control_code = vt[2]
headers = {'User-Agent': vt[3]}
test_name = vt[4]
args = (experiment_url, control_result, control_code, headers, test_name)
if test_name == "MS HTTP Captive Portal":
report['result'] = yield compare_content(sm, False, *args)
elif test_name == "Apple HTTP Captive Portal":
report['result'] = yield compare_content(sm, True, *args)
elif test_name == "W3 Captive Portal":
report['result'] = yield compare_content(snm, True, *args)
else:
log.err("Ooni is trying to run an undefined CP vendor test.")
report['URL'] = experiment_url
report['http_status_summary'] = control_result
report['http_status_number'] = control_code
report['User_Agent'] = vt[3]
result[test_name] = report
defer.returnValue(result)
@defer.inlineCallbacks
def control(self, experiment_result, args):
"""
Compares the content and status code of the HTTP response for
experiment_url with the control_result and control_code
respectively. If the status codes match, but the experimental
content and control_result do not match, fuzzy matching is enabled
to determine if the control_result is at least included somewhere
in the experimental content. Returns True if matches are found,
and False if otherwise.
"""
# XXX put this back to being parametrized
#experiment_url = self.local_options['experiment-url']
experiment_url = 'http://google.com/'
control_result = 'XX'
control_code = 200
ua = self.local_options['user-agent']
cm = self.http_content_match_fuzzy_opt
sm = self.http_status_code_match
snm = self.http_status_code_no_match
log.msg("Running test for '%s'..." % experiment_url)
content_match, experiment_code, experiment_headers = yield cm(experiment_url,
control_result)
status_match = sm(experiment_code, control_code)
if status_match and content_match:
log.msg("The test for '%s'" % experiment_url)
log.msg("was unable to detect a captive portal.")
self.report['result'] = True
elif status_match and not content_match:
log.msg("Retrying '%s' with fuzzy match enabled."
% experiment_url)
fuzzy_match, experiment_code, experiment_headers = yield cm(experiment_url,
control_result,
fuzzy=True)
if fuzzy_match:
self.report['result'] = True
else:
log.msg("Found modified content on '%s'," % experiment_url)
log.msg("which could indicate a captive portal.")
self.report['result'] = False
else:
log.msg("The content comparison test for ")
log.msg("'%s'" % experiment_url)
log.msg("shows that your HTTP traffic is filtered.")
self.report['result'] = False
@defer.inlineCallbacks
def test_captive_portal(self):
"""
Runs the CaptivePortal(Test).
CONFIG OPTIONS
If "do_captive_portal_vendor_tests" is set to "true", then vendor
specific captive portal HTTP-based tests will be run.
If "do_captive_portal_dns_tests" is set to "true", then vendor
specific captive portal DNS-based tests will be run.
If "check_dns_requests" is set to "true", then Ooni-probe will
attempt to check that your DNS requests are not being tampered with
by a captive portal.
If "captive_portal" = "yourfilename.txt", then user-specified tests
will be run.
Any combination of the above tests can be run.
"""
log.msg("")
log.msg("Running vendor tests...")
self.report['vendor_tests'] = yield self.run_vendor_tests()
log.msg("")
log.msg("Running vendor DNS-based tests...")
self.report['vendor_dns_tests'] = yield self.run_vendor_dns_tests()
log.msg("")
log.msg("Checking that DNS requests are not being tampered...")
self.report['check0x20'] = yield self.check_0x20_to_auth_ns('ooni.nu')
log.msg("")
log.msg("Captive portal test finished!")
ooniprobe-1.3.2/ooni/nettests/manipulation/traceroute.py 0000644 0001750 0001750 00000007671 12447563404 021737 0 ustar irl irl # -*- encoding: utf-8 -*-
from twisted.internet import defer
from twisted.python import usage
from ooni.templates import scapyt
from ooni.utils import log
from ooni.utils.txscapy import MPTraceroute
from ooni.settings import config
class UsageOptions(usage.Options):
optParameters = [
['backend', 'b', None, 'Test backend to use'],
['timeout', 't', 5, 'The timeout for the traceroute test'],
['maxttl', 'm', 30,
'The maximum value of ttl to set on packets'],
['dstport', 'd', None,
'Specify a single destination port. May be repeated.'],
['interval', 'i', None,
'Specify the inter-packet delay in seconds'],
['numPackets', 'n', None,
'Specify the number of packets to send per hop'],
]
class Traceroute(scapyt.BaseScapyTest):
name = "Traceroute"
description = "Performs a UDP, TCP, ICMP traceroute with destination port number "\
"set to 0, 22, 23, 53, 80, 123, 443, 8080 and 65535"
requiredTestHelpers = {'backend': 'traceroute'}
requiresRoot = True
requiresTor = False
usageOptions = UsageOptions
dst_ports = [0, 22, 23, 53, 80, 123, 443, 8080, 65535]
version = "0.3"
def setUp(self):
self.report['test_tcp_traceroute'] = dict(
[('hops_%d' % d, []) for d in self.dst_ports])
self.report['test_udp_traceroute'] = dict(
[('hops_%d' % d, []) for d in self.dst_ports])
self.report['test_icmp_traceroute'] = {'hops': []}
@defer.inlineCallbacks
def run_traceroute(self, protocol):
st = MPTraceroute()
if self.localOptions['maxttl']:
st.ttl_max = int(self.localOptions['maxttl'])
if self.localOptions['dstport']:
st.dst_ports = [int(self.localOptions['dstport'])]
if self.localOptions['interval']:
st.interval = float(self.localOptions['interval'])
log.msg("Running %s traceroute towards %s" % (protocol,
self.localOptions['backend']))
log.msg("This will take about %s seconds" % st.timeout)
config.scapyFactory.registerProtocol(st)
traceroute = getattr(st, protocol + 'Traceroute')
yield traceroute(self.localOptions['backend'])
st.stopListening()
st.matchResponses()
for packet in st.sent_packets:
self.report['sent_packets'].append(packet)
for packet in st.matched_packets.values():
self.report['answered_packets'].extend(packet)
for ttl in xrange(st.ttl_min, st.ttl_max):
matchedPackets = filter(
lambda x: x.ttl == ttl,
st.matched_packets.keys())
for packet in matchedPackets:
for response in st.matched_packets[packet]:
self.addToReport(packet, response)
def test_icmp_traceroute(self):
return self.run_traceroute('ICMP')
def test_tcp_traceroute(self):
return self.run_traceroute('TCP')
def test_udp_traceroute(self):
return self.run_traceroute('UDP')
def addToReport(self, packet, response):
if packet.proto == 1:
self.report['test_icmp_traceroute']['hops'].append(
{'ttl': packet.ttl, 'rtt': response.time - packet.time,
'address': response.src})
elif packet.proto == 6:
self.report['test_tcp_traceroute'][
'hops_%s' % packet.dport].append(
{'ttl': packet.ttl, 'rtt': response.time - packet.time,
'address': response.src, 'sport': response.sport})
else:
self.report['test_udp_traceroute'][
'hops_%s' % packet.dport].append(
{'ttl': packet.ttl, 'rtt': response.time - packet.time,
'address': response.src, 'sport': response.sport})
ooniprobe-1.3.2/ooni/nettests/manipulation/dns_spoof.py 0000644 0001750 0001750 00000005360 12623613431 021535 0 ustar irl irl # -*- encoding: utf-8 -*-
#
# :authors: Arturo Filastò
# :licence: see LICENSE
from twisted.internet import defer
from twisted.python import usage
from scapy.all import IP, UDP, DNS, DNSQR
from ooni.templates import scapyt
from ooni.utils import log
class UsageOptions(usage.Options):
optParameters = [
['resolver', 'r', None,
'Specify the resolver that should be used for DNS queries (ip:port)'],
['hostname', 'h', None, 'Specify the hostname of a censored site'],
['backend', 'b', None,
'Specify the IP address of a good DNS resolver (ip:port)']]
class DNSSpoof(scapyt.ScapyTest):
name = "DNS Spoof"
description = "Used to validate if the type of censorship " \
"happening is DNS spoofing or not."
author = "Arturo Filastò"
version = "0.0.1"
timeout = 2
usageOptions = UsageOptions
requiredTestHelpers = {'backend': 'dns'}
requiredOptions = ['hostname', 'resolver']
requiresRoot = True
requiresTor = False
def setUp(self):
self.resolverAddr, self.resolverPort = self.localOptions['resolver'].split(':')
self.resolverPort = int(self.resolverPort)
self.controlResolverAddr, self.controlResolverPort = self.localOptions['backend'].split(':')
self.controlResolverPort = int(self.controlResolverPort)
self.hostname = self.localOptions['hostname']
def postProcessor(self, measurements):
"""
This is not tested, but the concept is that if the two responses
match up then spoofing is occurring.
"""
try:
test_answer = self.report['answered_packets'][0][UDP]
control_answer = self.report['answered_packets'][1][UDP]
except IndexError:
self.report['spoofing'] = 'no_answer'
else:
if test_answer == control_answer:
self.report['spoofing'] = False
else:
self.report['spoofing'] = True
return self.report
@defer.inlineCallbacks
def test_a_lookup(self):
question = IP(dst=self.resolverAddr) / \
UDP() / \
DNS(rd=1, qd=DNSQR(qtype="A", qclass="IN", qname=self.hostname))
log.msg("Performing query to %s with %s:%s" %
(self.hostname, self.resolverAddr, self.resolverPort))
yield self.sr1(question)
@defer.inlineCallbacks
def test_control_a_lookup(self):
question = IP(dst=self.controlResolverAddr) / \
UDP() / \
DNS(rd=1, qd=DNSQR(qtype="A", qclass="IN", qname=self.hostname))
log.msg("Performing query to %s with %s:%s" %
(self.hostname, self.controlResolverAddr, self.controlResolverPort))
yield self.sr1(question)
ooniprobe-1.3.2/ooni/nettests/manipulation/__init__.py 0000644 0001750 0001750 00000000000 12373757544 021304 0 ustar irl irl ooniprobe-1.3.2/ooni/nettests/manipulation/daphne.py 0000644 0001750 0001750 00000010132 12502236531 020771 0 ustar irl irl # -*- encoding: utf-8 -*-
from twisted.python import usage
from twisted.internet import protocol, endpoints, reactor
from ooni import nettest
from ooni.kit import daphn3
from ooni.utils import log
class Daphn3ClientProtocol(daphn3.Daphn3Protocol):
def nextStep(self):
log.debug("Moving on to next step in the state walk")
self.current_data_received = 0
if self.current_step >= (len(self.steps) - 1):
log.msg("Reached the end of the state machine")
log.msg("Censorship fingerprint bisected!")
step_idx, mutation_idx = self.factory.mutation
log.msg("step_idx: %s | mutation_id: %s" % (step_idx, mutation_idx))
#self.transport.loseConnection()
if self.report:
self.report['mutation_idx'] = mutation_idx
self.report['step_idx'] = step_idx
self.d.callback(None)
return
else:
self.current_step += 1
if self._current_step_role() == self.role:
# We need to send more data because we are again responsible for
# doing so.
self.sendPayload()
class Daphn3ClientFactory(protocol.ClientFactory):
protocol = daphn3.Daphn3Protocol
mutation = [0,0]
steps = None
def buildProtocol(self, addr):
p = self.protocol()
p.steps = self.steps
p.factory = self
return p
def startedConnecting(self, connector):
log.msg("Started connecting %s" % connector)
def clientConnectionFailed(self, reason, connector):
log.err("We failed connecting the the OONIB")
log.err("Cannot perform test. Perhaps it got blocked?")
log.err("Please report this to tor-assistants@torproject.org")
def clientConnectionLost(self, reason, connector):
log.err("Daphn3 client connection lost")
print reason
class daphn3Args(usage.Options):
optParameters = [
['host', 'h', '127.0.0.1', 'Target Hostname'],
['port', 'p', 57003, 'Target port number']]
optFlags = [['pcap', 'c', 'Specify that the input file is a pcap file'],
['yaml', 'y', 'Specify that the input file is a YAML file (default)']]
class daphn3Test(nettest.NetTestCase):
name = "Daphn3"
description = "Bisects the censors fingerprint by mutating the given input packets."
usageOptions = daphn3Args
inputFile = ['file', 'f', None,
'Specify the pcap or YAML file to be used as input to the test']
#requiredOptions = ['file']
requiresRoot = False
requiresTor = False
steps = None
def inputProcessor(self, filename):
"""
step_idx is the step in the packet exchange
ex.
[.X.] are packets sent by a client or a server
client: [.1.] [.3.] [.4.]
server: [.2.] [.5.]
mutation_idx: is the sub index of the packet as in the byte of the
packet at the step_idx that is to be mutated
"""
if self.localOptions['pcap']:
daphn3Steps = daphn3.read_pcap(filename)
else:
daphn3Steps = daphn3.read_yaml(filename)
log.debug("Loaded these steps %s" % daphn3Steps)
yield daphn3Steps
def test_daphn3(self):
host = self.localOptions['host']
port = int(self.localOptions['port'])
def failure(failure):
log.msg("Failed to connect")
self.report['censored'] = True
self.report['mutation'] = 0
raise Exception("Error in connection, perhaps the backend is censored")
return
def success(protocol):
log.msg("Successfully connected")
protocol.sendPayload()
return protocol.d
log.msg("Connecting to %s:%s" % (host, port))
endpoint = endpoints.TCP4ClientEndpoint(reactor, host, port)
daphn3_factory = Daphn3ClientFactory()
daphn3_factory.steps = self.input
daphn3_factory.report = self.report
d = endpoint.connect(daphn3_factory)
d.addErrback(failure)
d.addCallback(success)
return d
ooniprobe-1.3.2/ooni/nettests/manipulation/http_invalid_request_line.py 0000644 0001750 0001750 00000007241 12447563404 025017 0 ustar irl irl # -*- encoding: utf-8 -*-
from twisted.python import usage
from ooni.utils import log
from ooni.utils import randomStr, randomSTR
from ooni.templates import tcpt
class UsageOptions(usage.Options):
optParameters = [
['backend', 'b', None, 'The OONI backend that runs a TCP echo server'],
['backendport', 'p', 80,
'Specify the port that the TCP echo server is running '
'(should only be set for debugging)']]
class HTTPInvalidRequestLine(tcpt.TCPTest):
"""
The goal of this test is to do some very basic and not very noisy fuzzing
on the HTTP request line. We generate a series of requests that are not
valid HTTP requests.
Unless elsewhere stated 'Xx'*N refers to N*2 random upper or lowercase
ascii letters or numbers ('XxXx' will be 4).
"""
name = "HTTP Invalid Request Line"
description = "Performs out of spec HTTP requests in the attempt to "\
"trigger a proxy error message."
version = "0.2"
authors = "Arturo Filastò"
usageOptions = UsageOptions
requiredTestHelpers = {'backend': 'tcp-echo'}
requiredOptions = ['backend']
requiresRoot = False
requiresTor = False
def setUp(self):
self.port = int(self.localOptions['backendport'])
self.address = self.localOptions['backend']
self.report['tampering'] = False
def check_for_manipulation(self, response, payload):
log.debug("Checking if %s == %s" % (response, payload))
if response != payload:
self.report['tampering'] = True
else:
self.report['tampering'] = self.report['tampering'] | False
def test_random_invalid_method(self):
"""
We test sending data to a TCP echo server listening on port 80, if what
we get back is not what we have sent then there is tampering going on.
This is for example what squid will return when performing such
request:
HTTP/1.0 400 Bad Request
Server: squid/2.6.STABLE21
Date: Sat, 23 Jul 2011 02:22:44 GMT
Content-Type: text/html
Content-Length: 1178
Expires: Sat, 23 Jul 2011 02:22:44 GMT
X-Squid-Error: ERR_INVALID_REQ 0
X-Cache: MISS from cache_server
X-Cache-Lookup: NONE from cache_server:3128
Via: 1.0 cache_server:3128 (squid/2.6.STABLE21)
Proxy-Connection: close
"""
payload = randomSTR(4) + " / HTTP/1.1\n\r"
d = self.sendPayload(payload)
d.addCallback(self.check_for_manipulation, payload)
return d
def test_random_invalid_field_count(self):
"""
This generates a request that looks like this:
XxXxX XxXxX XxXxX XxXxX
This may trigger some bugs in the HTTP parsers of transparent HTTP
proxies.
"""
payload = ' '.join(randomStr(5) for x in range(4))
payload += "\n\r"
d = self.sendPayload(payload)
d.addCallback(self.check_for_manipulation, payload)
return d
def test_random_big_request_method(self):
"""
This generates a request that looks like this:
Xx*512 / HTTP/1.1
"""
payload = randomStr(1024) + ' / HTTP/1.1\n\r'
d = self.sendPayload(payload)
d.addCallback(self.check_for_manipulation, payload)
return d
def test_random_invalid_version_number(self):
"""
This generates a request that looks like this:
GET / HTTP/XxX
"""
payload = 'GET / HTTP/' + randomStr(3)
payload += '\n\r'
d = self.sendPayload(payload)
d.addCallback(self.check_for_manipulation, payload)
return d
ooniprobe-1.3.2/ooni/nettests/manipulation/http_host.py 0000644 0001750 0001750 00000014376 12447563404 021576 0 ustar irl irl # -*- encoding: utf-8 -*-
#
# HTTP Host Test
# **************
#
# :authors: Arturo Filastò
# :licence: see LICENSE
import sys
import json
from twisted.internet import defer
from twisted.python import usage
from ooni.utils import randomStr
from ooni.utils import log
from ooni.templates import httpt
class UsageOptions(usage.Options):
optParameters = [['backend', 'b', None,
'URL of the test backend to use. Should be \
listening on port 80 and be a \
HTTPReturnJSONHeadersHelper'],
['content', 'c', None, 'The file to read \
from containing the content of a block page']]
class HTTPHost(httpt.HTTPTest):
"""
This test performs various manipulations of the HTTP Host header field and
attempts to detect which filter bypassing strategies will work against the
censor.
Usually this test should be run with a list of sites that are known to be
blocked inside of a particular network to assess which filter evasion
strategies will work.
"""
name = "HTTP Host"
description = "Tests a variety of different filter bypassing techniques " \
"based on the HTTP Host header field."
author = "Arturo Filastò"
version = "0.2.4"
randomizeUA = False
usageOptions = UsageOptions
inputFile = ['file', 'f', None,
'List of hostnames to test for censorship']
requiredTestHelpers = {'backend': 'http-return-json-headers'}
requiredOptions = ['backend']
requiresTor = False
requiresRoot = False
def setUp(self):
self.report['transparent_http_proxy'] = False
def check_for_censorship(self, body, test_name):
"""
XXX this is to be filled in with either a domclass based classified or
with a rule that will allow to detect that the body of the result is
that of a censored site.
"""
# If we don't see a json dict we know that something is wrong for
# sure
if not body.startswith("{"):
log.msg("This does not appear to be JSON")
self.report['transparent_http_proxy'] = True
self.check_for_censorship(body)
return
try:
content = json.loads(body)
except:
log.msg("The json does not parse, this is not what we expected")
self.report['transparent_http_proxy'] = True
self.check_for_censorship(body)
return
# We base the determination of the presence of a transparent HTTP
# proxy on the basis of the response containing the json that is to be
# returned by a HTTP Request Test Helper
if 'request_headers' in content and \
'request_line' in content and \
'headers_dict' in content:
log.msg("Found the keys I expected in %s" % content)
self.report['transparent_http_proxy'] = self.report[
'transparent_http_proxy'] | False
self.report[test_name] = False
else:
log.msg("Did not find the keys I expected in %s" % content)
self.report['transparent_http_proxy'] = True
if self.localOptions['content']:
self.report[test_name] = True
censorship_page = open(self.localOptions['content'])
response_page = iter(body.split("\n"))
for censorship_line in censorship_page:
response_line = response_page.next()
if response_line != censorship_line:
self.report[test_name] = False
break
censorship_page.close()
@defer.inlineCallbacks
def test_filtering_prepend_newline_to_method(self):
test_name = sys._getframe().f_code.co_name.replace('test_', '')
headers = {}
headers["Host"] = [self.input]
response = yield self.doRequest(self.localOptions['backend'],
method="\nGET",
headers=headers)
self.check_for_censorship(response.body, test_name)
@defer.inlineCallbacks
def test_filtering_add_tab_to_host(self):
test_name = sys._getframe().f_code.co_name.replace('test_', '')
headers = {}
headers["Host"] = [self.input + '\t']
response = yield self.doRequest(self.localOptions['backend'],
headers=headers)
self.check_for_censorship(response.body, test_name)
@defer.inlineCallbacks
def test_filtering_of_subdomain(self):
test_name = sys._getframe().f_code.co_name.replace('test_', '')
headers = {}
headers["Host"] = [randomStr(10) + '.' + self.input]
response = yield self.doRequest(self.localOptions['backend'],
headers=headers)
self.check_for_censorship(response.body, test_name)
@defer.inlineCallbacks
def test_filtering_via_fuzzy_matching(self):
test_name = sys._getframe().f_code.co_name.replace('test_', '')
headers = {}
headers["Host"] = [randomStr(10) + self.input + randomStr(10)]
response = yield self.doRequest(self.localOptions['backend'],
headers=headers)
self.check_for_censorship(response.body, test_name)
@defer.inlineCallbacks
def test_send_host_header(self):
"""
Stuffs the HTTP Host header field with the site to be tested for
censorship and does an HTTP request of this kind to our backend.
We randomize the HTTP User Agent headers.
"""
test_name = sys._getframe().f_code.co_name.replace('test_', '')
headers = {}
headers["Host"] = [self.input]
response = yield self.doRequest(self.localOptions['backend'],
headers=headers)
self.check_for_censorship(response.body, test_name)
def inputProcessor(self, filename=None):
"""
This inputProcessor extracts domain names from urls
"""
if filename:
fp = open(filename)
for x in fp.readlines():
yield x.strip().split('//')[-1].split('/')[0]
fp.close()
else:
pass
ooniprobe-1.3.2/ooni/nettests/manipulation/http_header_field_manipulation.py 0000644 0001750 0001750 00000014205 12531110611 025741 0 ustar irl irl # -*- encoding: utf-8 -*-
#
# :authors: Arturo Filastò
# :licence: see LICENSE
import random
import json
import yaml
from twisted.python import usage
from ooni.utils import log, net, randomStr
from ooni.templates import httpt
from ooni.utils.trueheaders import TrueHeaders
def random_capitalization(string):
output = ""
original_string = string
string = string.swapcase()
for i in range(len(string)):
if random.randint(0, 1):
output += string[i].swapcase()
else:
output += string[i]
if original_string == output:
return random_capitalization(output)
else:
return output
class UsageOptions(usage.Options):
optParameters = [
['backend', 'b', None,
'URL of the backend to use for sending the requests'],
['headers', 'h', None,
'Specify a yaml formatted file from which to read '
'the request headers to send']
]
class HTTPHeaderFieldManipulation(httpt.HTTPTest):
"""
It performes HTTP requests with request headers that vary capitalization
towards a backend. If the headers reported by the server differ from
the ones we sent, then we have detected tampering.
"""
name = "HTTP Header Field Manipulation"
description = "Checks if the HTTP request the server " \
"sees is the same as the one that the client has created."
author = "Arturo Filastò"
version = "0.1.5"
randomizeUA = False
usageOptions = UsageOptions
requiredTestHelpers = {'backend': 'http-return-json-headers'}
requiredOptions = ['backend']
requiresTor = False
requiresRoot = False
def setUp(self):
super(HTTPHeaderFieldManipulation, self).setUp()
self.url = self.localOptions['backend']
def get_headers(self):
headers = {}
if self.localOptions['headers']:
try:
f = open(self.localOptions['headers'])
except IOError:
raise Exception("Specified input file does not exist")
content = ''.join(f.readlines())
f.close()
headers = yaml.safe_load(content)
return headers
else:
# XXX generate these from a random choice taken from
# whatheaders.com
# http://s3.amazonaws.com/data.whatheaders.com/whatheaders-latest.xml.zip
headers = {
"User-Agent": [
random.choice(
net.userAgents)],
"Accept": ["text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"],
"Accept-Encoding": ["gzip,deflate,sdch"],
"Accept-Language": ["en-US,en;q=0.8"],
"Accept-Charset": ["ISO-8859-1,utf-8;q=0.7,*;q=0.3"],
"Host": [
randomStr(15) +
'.com']}
return headers
def get_random_caps_headers(self):
headers = {}
normal_headers = self.get_headers()
for k, v in normal_headers.items():
new_key = random_capitalization(k)
headers[new_key] = v
return headers
def processResponseBody(self, data):
self.check_for_tampering(data)
def check_for_tampering(self, data):
"""
Here we do checks to verify if the request we made has been tampered
with. We have 3 categories of tampering:
* **total** when the response is not a json object and therefore we were not
able to reach the ooniprobe test backend
* **request_line_capitalization** when the HTTP Request line (e.x. GET /
HTTP/1.1) does not match the capitalization we set.
* **header_field_number** when the number of headers we sent does not match
with the ones the backend received
* **header_name_capitalization** when the header field names do not match
those that we sent.
* **header_field_value** when the header field value does not match with the
one we transmitted.
"""
log.msg("Checking for tampering on %s" % self.url)
self.report['tampering'] = {
'total': False,
'request_line_capitalization': False,
'header_name_capitalization': False,
'header_field_value': False,
'header_field_number': False
}
try:
response = json.loads(data)
except ValueError:
self.report['tampering']['total'] = True
return
request_request_line = "%s / HTTP/1.1" % self.request_method
try:
response_request_line = response['request_line']
response_headers_dict = response['headers_dict']
except KeyError:
self.report['tampering']['total'] = True
return
if request_request_line != response_request_line:
self.report['tampering']['request_line_capitalization'] = True
request_headers = TrueHeaders(self.request_headers)
diff = request_headers.getDiff(TrueHeaders(response_headers_dict),
ignore=['Connection'])
if diff:
self.report['tampering']['header_field_name'] = True
else:
self.report['tampering']['header_field_name'] = False
self.report['tampering']['header_name_diff'] = list(diff)
log.msg(" total: %(total)s" % self.report['tampering'])
log.msg(
" request_line_capitalization: %(request_line_capitalization)s" %
self.report['tampering'])
log.msg(
" header_name_capitalization: %(header_name_capitalization)s" %
self.report['tampering'])
log.msg(
" header_field_value: %(header_field_value)s" %
self.report['tampering'])
log.msg(
" header_field_number: %(header_field_number)s" %
self.report['tampering'])
def test_get_random_capitalization(self):
self.request_method = random_capitalization("GET")
self.request_headers = self.get_random_caps_headers()
return self.doRequest(self.url, self.request_method,
headers=self.request_headers)
ooniprobe-1.3.2/ooni/nettests/blocking/ 0000755 0001750 0001750 00000000000 12623630152 016254 5 ustar irl irl ooniprobe-1.3.2/ooni/nettests/blocking/tcp_connect.py 0000644 0001750 0001750 00000005522 12447563404 021142 0 ustar irl irl # -*- encoding: utf-8 -*-
from twisted.internet.protocol import Factory, Protocol
from twisted.internet.endpoints import TCP4ClientEndpoint
from ooni import nettest
from ooni.errors import handleAllFailures
from ooni.utils import log
class TCPFactory(Factory):
def buildProtocol(self, addr):
return Protocol()
class TCPConnectTest(nettest.NetTestCase):
name = "TCP Connect"
description = "Performs a TCP connect scan of all the " \
"host port combinations given as input."
author = "Arturo Filastò"
version = "0.1"
inputFile = [
'file',
'f',
None,
'File containing the IP:PORT combinations to be tested, one per line']
requiresTor = False
requiresRoot = False
requiredOptions = ['file']
def test_connect(self):
"""
This test performs a TCP connection to the remote host on the
specified port.
The report will contains the string 'success' if the test has
succeeded, or the reason for the failure if it has failed.
"""
host, port = self.input.split(":")
def connectionSuccess(protocol):
protocol.transport.loseConnection()
log.debug("Got a connection to %s" % self.input)
self.report["connection"] = 'success'
def connectionFailed(failure):
self.report['connection'] = handleAllFailures(failure)
from twisted.internet import reactor
point = TCP4ClientEndpoint(reactor, host, int(port))
d = point.connect(TCPFactory())
d.addCallback(connectionSuccess)
d.addErrback(connectionFailed)
return d
def inputProcessor(self, filename=None):
"""
This inputProcessor extracts name:port pairs from urls
XXX: Does not support unusual port numbers
"""
def strip_url(address):
proto, path = x.strip().split('://')
proto = proto.lower()
host = path.split('/')[0]
if proto == 'http':
return "%s:80" % host
if proto == 'https':
return "%s:443" % host
pluggable_transports = ("obfs3", "obfs2", "fte", "scramblesuit")
def is_bridge_line(line):
first = line.split(" ")[0]
return first.lower() in pluggable_transports + ("bridge",)
def strip_bridge(line):
if line.lower().startswith("Bridge"):
return line.split(" ")[2]
return line.split(" ")[1]
if filename:
fp = open(filename)
for x in fp.readlines():
if x.startswith("http"):
yield strip_url(x)
elif is_bridge_line(x):
yield strip_bridge(x)
else:
yield x.split(" ")[0]
fp.close()
else:
pass
ooniprobe-1.3.2/ooni/nettests/blocking/dns_n_http.py 0000644 0001750 0001750 00000001246 12373757531 021006 0 ustar irl irl from twisted.python import usage
from urlparse import urlsplit
from ooni.templates import dnst, httpt
class UsageOptions(usage.Options):
optParameters = [
['resolver', 'r', '8.8.8.8',
'Specify the DNS resolver to use for DNS queries'],
]
class DNSnHTTP(dnst.DNSTest, httpt.HTTPTest):
inputFile = ["filename", "f", None,
"file containing urls to test for blocking."]
usageOptions = UsageOptions
name = 'DNS n HTTP'
version = '0.1'
def test_http(self):
return self.doRequest(self.input)
def test_dns(self):
hostname = urlsplit(self.input)[1]
return self.performALookup(hostname)
ooniprobe-1.3.2/ooni/nettests/blocking/__init__.py 0000644 0001750 0001750 00000000001 12441552466 020365 0 ustar irl irl
ooniprobe-1.3.2/ooni/nettests/blocking/bridge_reachability.py 0000644 0001750 0001750 00000017316 12531110611 022601 0 ustar irl irl # -*- encoding: utf-8 -*-
import os
import random
import tempfile
import shutil
from twisted.python import usage
from twisted.internet import reactor, error
import txtorcon
from ooni.utils import log, onion
from ooni import nettest
class TorIsNotInstalled(Exception):
pass
class UsageOptions(usage.Options):
optParameters = [
['timeout', 't', 120,
'Specify the timeout after which to consider '
'the Tor bootstrapping process to have failed'], ]
class BridgeReachability(nettest.NetTestCase):
name = "Bridge Reachability"
description = "A test for checking if bridges are reachable " \
"from a given location."
author = "Arturo Filastò"
version = "0.1.2"
usageOptions = UsageOptions
inputFile = ['file', 'f', None,
'File containing bridges to test reachability for. '
'They should be one per line IP:ORPort or '
'TransportType IP:ORPort (ex. obfs2 127.0.0.1:443)']
requiredOptions = ['file']
def requirements(self):
if not onion.find_tor_binary():
raise TorIsNotInstalled(
"For instructions on installing Tor see: "
"https://www.torproject.org/download/download")
def setUp(self):
self.tor_progress = 0
self.timeout = int(self.localOptions['timeout'])
fd, self.tor_logfile = tempfile.mkstemp()
os.close(fd)
fd, self.obfsproxy_logfile = tempfile.mkstemp()
os.close(fd)
self.tor_datadir = tempfile.mkdtemp()
self.report['error'] = None
self.report['success'] = None
self.report['timeout'] = self.timeout
self.report['transport_name'] = 'vanilla'
self.report['tor_version'] = str(onion.tor_details['version'])
self.report['tor_progress'] = 0
self.report['tor_progress_tag'] = None
self.report['tor_progress_summary'] = None
self.report['tor_log'] = None
self.report['obfsproxy_version'] = str(onion.obfsproxy_details['version'])
self.report['obfsproxy_log'] = None
self.report['bridge_address'] = None
self.bridge = self.input
if self.input.startswith('Bridge'):
self.bridge = self.input.replace('Bridge ', '')
def postProcessor(self, measurements):
if 'successes' not in self.summary:
self.summary['successes'] = []
if 'failures' not in self.summary:
self.summary['failures'] = []
details = {
'address': self.report['bridge_address'],
'transport_name': self.report['transport_name'],
'tor_progress': self.report['tor_progress']
}
if self.report['success']:
self.summary['successes'].append(details)
else:
self.summary['failures'].append(details)
return self.report
def displaySummary(self, summary):
successful_count = {}
failure_count = {}
def count(results, counter):
for result in results:
if result['transport_name'] not in counter:
counter[result['transport_name']] = 0
counter[result['transport_name']] += 1
count(summary['successes'], successful_count)
count(summary['failures'], failure_count)
working_bridges = ', '.join(
["%s %s" % (x['transport_name'], x['address'])
for x in summary['successes']])
failing_bridges = ', '.join(
["%s %s (at %s%%)"
% (x['transport_name'], x['address'], x['tor_progress'])
for x in summary['failures']])
log.msg("Total successes: %d" % len(summary['successes']))
log.msg("Total failures: %d" % len(summary['failures']))
for transport, count in successful_count.items():
log.msg("%s successes: %d" % (transport.title(), count))
for transport, count in failure_count.items():
log.msg("%s failures: %d" % (transport.title(), count))
log.msg("Working bridges: %s" % working_bridges)
log.msg("Failing bridges: %s" % failing_bridges)
def test_full_tor_connection(self):
config = txtorcon.TorConfig()
config.ControlPort = random.randint(2**14, 2**16)
config.SocksPort = random.randint(2**14, 2**16)
config.DataDirectory = self.tor_datadir
log.msg(
"Connecting to %s with tor %s" %
(self.bridge, onion.tor_details['version']))
transport_name = onion.transport_name(self.bridge)
if transport_name == None:
self.report['bridge_address'] = self.bridge.split(' ')[0]
else:
self.report['bridge_address'] = self.bridge.split(' ')[1]
self.report['transport_name'] = transport_name
try:
config.ClientTransportPlugin = \
onion.bridge_line(transport_name, self.obfsproxy_logfile)
except onion.UnrecognizedTransport:
log.err("Unable to test bridge because we don't recognize "
"the %s transport" % transport_name)
self.report['error'] = "unrecognized-transport"
return
except onion.UninstalledTransport:
bin_name = onion.transport_bin_name.get(transport_name)
log.err("Unable to test bridge because %s is not installed" %
bin_name)
self.report['error'] = "missing-%s" % bin_name
return
except onion.OutdatedObfsproxy:
log.err("The obfsproxy version you are using " \
"appears to be outdated.")
self.report['error'] = 'old-obfsproxy'
return
except onion.OutdatedTor:
log.err("Unsupported Tor version.")
self.report['error'] = 'unsupported-tor-version'
return
log.debug("Using ClientTransportPlugin '%s'" % \
config.ClientTransportPlugin)
config.Bridge = self.bridge
config.UseBridges = 1
config.log = ['notice stdout', 'notice file %s' % self.tor_logfile]
config.save()
def updates(prog, tag, summary):
log.msg("%s: %s%%" % (self.bridge, prog))
self.report['tor_progress'] = int(prog)
self.report['tor_progress_tag'] = tag
self.report['tor_progress_summary'] = summary
d = txtorcon.launch_tor(config, reactor, timeout=self.timeout,
progress_updates=updates)
@d.addCallback
def setup_complete(proto):
try:
proto.transport.signalProcess('TERM')
except error.ProcessExitedAlready:
proto.transport.loseConnection()
log.msg("Successfully connected to %s" % self.bridge)
self.report['success'] = True
@d.addErrback
def setup_failed(failure):
log.msg("Failed to connect to %s" % self.bridge)
self.report['success'] = False
self.report['error'] = 'timeout-reached'
return
@d.addCallback
def write_log(_):
with open(self.tor_logfile) as f:
self.report['tor_log'] = f.read()
os.remove(self.tor_logfile)
with open(self.obfsproxy_logfile) as f:
self.report['obfsproxy_log'] = f.read()
os.remove(self.obfsproxy_logfile)
try:
with open(os.path.join(self.tor_datadir,
'pt_state', 'obfs4proxy.log')) as f:
self.report['obfsproxy_log'] = f.read()
except:
pass
finally:
shutil.rmtree(self.tor_datadir)
return d
ooniprobe-1.3.2/ooni/nettests/blocking/meek_fronted_requests.py 0000644 0001750 0001750 00000005637 12623624024 023237 0 ustar irl irl # -*- encoding: utf-8 -*-
#
# :licence: see LICENSE
from twisted.python import usage
from ooni.templates import httpt
from ooni.utils import log
class UsageOptions(usage.Options):
optParameters = [ ['expectedBody', 'B',
'I’m just a happy little web server.\n',
'Expected body content from GET response'],
['domainName', 'D', None,
'Specify a single fronted domainName to test.'],
['hostHeader', 'H', None,
'Specify "inside" Host Header to test.']
]
class meekTest(httpt.HTTPTest):
"""
Performs a HTTP GET request to a list of fronted domains with the Host
Header of the "inside" meek-server. The meek-server handles a GET request
and response with: "I’m just a happy little web server.\n".
The input file should be formatted as (one per line):
"domainName:hostHeader"
www.google.com:meek-reflect.appspot.com
ajax.aspnetcdn.com:az668014.vo.msecnd.net
a0.awsstatic.com:d2zfqthxsdq309.cloudfront.net
"""
name = "Meek fronted requests test"
description = "This tests for the Meek Tor pluggable transport frontend reachability"
version = "0.0.1"
usageOptions = UsageOptions
inputFile = ['file', 'f', None,
"File containing the domainName:hostHeader combinations to\
be tested, one per line."]
inputs = [('www.google.com', 'meek-reflect.appspot.com'),
('ajax.aspnetcdn.com', 'az668014.vo.msecnd.net'),
('a0.awsstatic.com', 'd2zfqthxsdq309.cloudfront.net')]
requiresRoot = False
requiresTor = False
def setUp(self):
"""
Check for inputs.
"""
if self.input:
if (isinstance(self.input, tuple) or isinstance(self.input, list)):
self.domainName, self.header = self.input
else:
self.domainName, self.header = self.input.split(':')
elif (self.localOptions['domainName'] and
self.localOptions['hostHeader']):
self.domainName = self.localOptions['domainName']
self.header = self.localOptions['hostHeader']
self.expectedBody = self.localOptions['expectedBody']
self.domainName = 'https://' + self.domainName
def test_meek_response(self):
"""
Detects if the fronted request is blocked.
"""
log.msg("Testing fronted domain:%s with Host Header:%s"
% (self.domainName, self.header))
def process_body(body):
if self.expectedBody != body:
self.report['success'] = False
else:
self.report['success'] = True
headers = {}
headers['Host'] = [self.header]
return self.doRequest(self.domainName, method="GET", headers=headers,
body_processor=process_body)
ooniprobe-1.3.2/ooni/nettests/blocking/http_requests.py 0000644 0001750 0001750 00000012121 12533065720 021540 0 ustar irl irl # -*- encoding: utf-8 -*-
#
# :authors: Arturo Filastò
# :licence: see LICENSE
import random
from twisted.python import usage, failure
from ooni.utils import log
from ooni.utils.net import userAgents
from ooni.templates import httpt
from ooni.errors import failureToString
class UsageOptions(usage.Options):
optParameters = [
['url', 'u', None, 'Specify a single URL to test.'],
['factor', 'f', 0.8,
'What factor should be used for triggering censorship (0.8 == 80%)']]
class HTTPRequestsTest(httpt.HTTPTest):
"""
Performs a two GET requests to the set of sites to be tested for
censorship, one over a known good control channel (Tor), the other over the
test network.
We check to see if the response headers match and if the response body
lengths match.
"""
name = "HTTP Requests"
description = "Performs a HTTP GET request over Tor and one over the " \
"local network and compares the two results."
author = "Arturo Filastò"
version = "0.2.4"
usageOptions = UsageOptions
inputFile = ['file', 'f', None,
'List of URLS to perform GET and POST requests to']
requiresRoot = False
requiresTor = True
# These values are used for determining censorship based on response body
# lengths
control_body_length = None
experiment_body_length = None
def setUp(self):
"""
Check for inputs.
"""
if self.input:
self.url = self.input
elif self.localOptions['url']:
self.url = self.localOptions['url']
else:
raise Exception("No input specified")
self.factor = self.localOptions['factor']
self.report['control_failure'] = None
self.report['experiment_failure'] = None
self.report['body_length_match'] = None
self.report['body_proportion'] = None
self.report['factor'] = float(self.factor)
self.report['headers_diff'] = None
self.report['headers_match'] = None
self.headers = {'User-Agent': [random.choice(userAgents)]}
def compare_body_lengths(self, body_length_a, body_length_b):
if body_length_b == 0 and body_length_a != 0:
rel = float(body_length_b)/float(body_length_a)
elif body_length_b == 0 and body_length_a == 0:
rel = float(1)
else:
rel = float(body_length_a)/float(body_length_b)
if rel > 1:
rel = 1/rel
self.report['body_proportion'] = rel
if rel > float(self.factor):
log.msg("The two body lengths appear to match")
log.msg("censorship is probably not happening")
self.report['body_length_match'] = True
else:
log.msg("The two body lengths appear to not match")
log.msg("censorship could be happening")
self.report['body_length_match'] = False
def compare_headers(self, headers_a, headers_b):
diff = headers_a.getDiff(headers_b)
if diff:
log.msg("Headers appear to *not* match")
self.report['headers_diff'] = diff
self.report['headers_match'] = False
else:
log.msg("Headers appear to match")
self.report['headers_diff'] = diff
self.report['headers_match'] = True
def test_get_experiment(self):
log.msg("Performing GET request to %s" % self.url)
return self.doRequest(self.url, method="GET",
use_tor=False, headers=self.headers)
def test_get_control(self):
log.msg("Performing GET request to %s over Tor" % self.url)
return self.doRequest(self.url, method="GET",
use_tor=True, headers=self.headers)
def postProcessor(self, measurements):
experiment = control = None
for status, measurement in measurements:
net_test_method = measurement.netTestMethod.im_func.func_name
if net_test_method == "test_get_experiment":
if isinstance(measurement.result, failure.Failure):
self.report['experiment_failure'] = failureToString(
measurement.result)
else:
experiment = measurement.result
elif net_test_method == "test_get_control":
if isinstance(measurement.result, failure.Failure):
self.report['control_failure'] = failureToString(
measurement.result)
else:
control = measurement.result
if experiment and control:
if hasattr(experiment, 'body') and hasattr(control, 'body') \
and experiment.body and control.body:
self.compare_body_lengths(len(control.body),
len(experiment.body))
if hasattr(experiment, 'headers') and hasattr(control, 'headers') \
and experiment.headers and control.headers:
self.compare_headers(control.headers,
experiment.headers)
return self.report
ooniprobe-1.3.2/ooni/nettests/blocking/top_ports.py 0000644 0001750 0001750 00000005024 12373757532 020676 0 ustar irl irl # -*- encoding: utf-8 -*-
import random
from twisted.python import usage
from twisted.internet.protocol import Factory, Protocol
from twisted.internet.endpoints import TCP4ClientEndpoint
from twisted.internet.error import ConnectionRefusedError
from twisted.internet.error import TCPTimedOutError, TimeoutError
from ooni.templates import tcpt
from ooni.errors import handleAllFailures
from ooni.utils import log, randomStr
class TCPFactory(Factory):
def buildProtocol(self, addr):
return Protocol()
class UsageOptions(usage.Options):
optParameters = [
['backend', 'b', None,
'Backend to connect to that binds to the 100 most common ports']
]
class TopPortsTest(tcpt.TCPTest):
name = "Top ports"
description = "This test will verify if it can connect to the 100 most common ports"
author = "Arturo Filastò"
version = "0.1"
usageOptions = UsageOptions
# These are the top 100 most probably open ports according to
# https://svn.nmap.org/nmap/nmap-services
inputs = [80, 23, 443, 21, 22, 25, 3389, 110, 445, 139,
143, 53, 135, 3306, 8080, 1723, 111, 995, 993,
5900, 1025, 587, 8888, 199, 1720, 465, 548, 113,
81, 6001, 10000, 514, 5060, 179, 1026, 2000,
8443, 8000, 32768, 554, 26, 1433, 49152, 2001,
515, 8008, 49154, 1027, 5666, 646, 5000, 5631,
631, 49153, 8081, 2049, 88, 79, 5800, 106, 2121,
1110, 49155, 6000, 513, 990, 5357, 427, 49156,
543, 544, 5101, 144, 7, 389, 8009, 3128, 444,
9999, 5009, 7070, 5190, 3000, 5432, 3986, 1900,
13, 1029, 9, 6646, 5051, 49157, 1028, 873, 1755,
2717, 4899, 9100, 119, 37]
requiredTestHelpers = {'backend': 'top-ports'}
requiredOptions = ['backend']
requiresTor = False
requiresRoot = False
def setUp(self):
self.address = self.localOptions['backend']
self.port = self.input
def test_connect(self):
"""
This test performs a TCP connection to the remote host on the specified port.
the report will contains the string 'success' if the test has
succeeded, or the reason for the failure if it has failed.
"""
payload = randomStr(42)
self.report['response_match'] = None
d = self.sendPayload(payload)
@d.addCallback
def cb(result):
self.report['response_match'] = False
if result == payload:
self.report['response_match'] = True
return d
ooniprobe-1.3.2/ooni/nettests/blocking/dns_consistency.py 0000644 0001750 0001750 00000017152 12474100761 022043 0 ustar irl irl # -*- encoding: utf-8 -*-
#
# dnsconsistency
# **************
#
# The test reports censorship if the cardinality of the intersection of
# the query result set from the control server and the query result set
# from the experimental server is zero, which is to say, if the two sets
# have no matching results whatsoever.
#
# NOTE: This test frequently results in false positives due to GeoIP-based
# load balancing on major global sites such as google, facebook, and
# youtube, etc.
#
# :authors: Arturo Filastò, Isis Lovecruft
# :licence: see LICENSE
from twisted.python import usage
from twisted.internet import defer
from ooni.templates import dnst
from ooni.utils import log
class UsageOptions(usage.Options):
optParameters = [['backend', 'b', None,
'The OONI backend that runs the DNS resolver'],
['testresolvers', 'T', None,
'File containing list of DNS resolvers to test against'],
['testresolver', 't', None,
'Specify a single test resolver to use for testing']
]
class DNSConsistencyTest(dnst.DNSTest):
name = "DNS Consistency"
description = "Checks to see if the DNS responses from a "\
"set of DNS resolvers are consistent."
version = "0.6"
authors = "Arturo Filastò, Isis Lovecruft"
inputFile = ['file', 'f', None,
'Input file of list of hostnames to attempt to resolve']
requiredTestHelpers = {'backend': 'dns'}
requiresRoot = False
requiresTor = False
usageOptions = UsageOptions
requiredOptions = ['backend', 'file']
def setUp(self):
if (not self.localOptions['testresolvers'] and
not self.localOptions['testresolver']):
self.test_resolvers = []
with open('/etc/resolv.conf') as f:
for line in f:
if line.startswith('nameserver'):
self.test_resolvers.append(line.split(' ')[1].strip())
self.report['test_resolvers'] = self.test_resolvers
elif self.localOptions['testresolvers']:
test_resolvers_file = self.localOptions['testresolvers']
elif self.localOptions['testresolver']:
self.test_resolvers = [self.localOptions['testresolver']]
try:
with open(test_resolvers_file) as f:
self.test_resolvers = [
x.split('#')[0].strip() for x in f.readlines()]
self.report['test_resolvers'] = self.test_resolvers
f.close()
except IOError as e:
log.exception(e)
raise usage.UsageError("Invalid test resolvers file")
except NameError:
log.debug("No test resolver file configured")
dns_ip, dns_port = self.localOptions['backend'].split(':')
self.control_dns_server = (str(dns_ip), int(dns_port))
self.report['control_resolver'] = "%s:%d" % self.control_dns_server
@defer.inlineCallbacks
def test_a_lookup(self):
"""
We perform an A lookup on the DNS test servers for the domains to be
tested and an A lookup on the known good DNS server.
We then compare the results from test_resolvers and that from
control_resolver and see if they match up.
If they match up then no censorship is happening (tampering: false).
If they do not we do a reverse lookup (PTR) on the test_resolvers and
the control resolver for every IP address we got back and check to see
if anyone of them matches the control ones.
If they do, then we take note of the fact that censorship is probably
not happening (tampering: reverse-match).
If they do not match then censorship is probably going on (tampering:
true).
"""
log.msg("Doing the test lookups on %s" % self.input)
hostname = self.input
self.report['tampering'] = {}
try:
control_answers = yield self.performALookup(hostname,
self.control_dns_server)
if not control_answers:
log.err(
"Got no response from control DNS server %s:%d, "
"perhaps the DNS resolver is down?" %
self.control_dns_server)
self.report['tampering'][
"%s:%d" %
self.control_dns_server] = 'no_answer'
except:
self.report['tampering'][
"%s:%d" %
self.control_dns_server] = 'error'
control_answers = None
for test_resolver in self.test_resolvers:
log.msg("Testing resolver: %s" % test_resolver)
test_dns_server = (test_resolver, 53)
try:
experiment_answers = yield self.performALookup(hostname,
test_dns_server)
except Exception:
log.err("Problem performing the DNS lookup")
self.report['tampering'][test_resolver] = 'dns_lookup_error'
continue
if not experiment_answers:
log.err("Got no response, perhaps the DNS resolver is down?")
self.report['tampering'][test_resolver] = 'no_answer'
continue
else:
log.debug(
"Got the following A lookup answers %s from %s" %
(experiment_answers, test_resolver))
def lookup_details():
"""
A closure useful for printing test details.
"""
log.msg("test resolver: %s" % test_resolver)
log.msg("experiment answers: %s" % experiment_answers)
log.msg("control answers: %s" % control_answers)
log.debug(
"Comparing %s with %s" %
(experiment_answers, control_answers))
if not control_answers:
log.msg("Skipping control resolver comparison")
self.report['tampering'][test_resolver] = None
elif set(experiment_answers) & set(control_answers):
lookup_details()
log.msg("tampering: false")
self.report['tampering'][test_resolver] = False
else:
log.msg("Trying to do reverse lookup")
experiment_reverse = yield self.performPTRLookup(experiment_answers[0],
test_dns_server)
control_reverse = yield self.performPTRLookup(control_answers[0],
self.control_dns_server)
if experiment_reverse == control_reverse:
log.msg("Further testing has eliminated false positives")
lookup_details()
log.msg("tampering: reverse_match")
self.report['tampering'][test_resolver] = 'reverse_match'
else:
log.msg("Reverse lookups do not match")
lookup_details()
log.msg("tampering: true")
self.report['tampering'][test_resolver] = True
def inputProcessor(self, filename=None):
"""
This inputProcessor extracts domain names from urls
"""
log.debug("Running dnsconsistency default processor")
if filename:
fp = open(filename)
for x in fp.readlines():
yield x.strip().split('//')[-1].split('/')[0]
fp.close()
else:
pass
ooniprobe-1.3.2/ooni/nettests/experimental/ 0000755 0001750 0001750 00000000000 12623630152 017161 5 ustar irl irl ooniprobe-1.3.2/ooni/nettests/experimental/tls_handshake.py 0000644 0001750 0001750 00000112640 12447563404 022360 0 ustar irl irl #!/usr/bin/env python
# -*- encoding: utf-8 -*-
"""
tls_handshake.py
----------------
This file contains test cases for determining if a TLS handshake completes
successfully, including ways to test if a TLS handshake which uses Mozilla
Firefox's current ciphersuite list completes. Rather than using Twisted and
OpenSSL's methods for automatically completing a handshake, which includes
setting all the parameters, such as the ciphersuite list, these tests use
non-blocking sockets and implement asychronous error-handling transversal of
OpenSSL's memory BIO state machine, allowing us to determine where and why a
handshake fails.
This network test is a complete rewrite of a pseudonymously contributed
script by Hackerberry Finn, in order to fit into OONI's core network tests.
@authors: Isis Agora Lovecruft
@license: see included LICENSE file
@copyright: © 2013 Isis Lovecruft, The Tor Project Inc.
"""
from socket import error as socket_error
from socket import timeout as socket_timeout
from socket import inet_aton as socket_inet_aton
from socket import gethostbyname as socket_gethostbyname
from time import sleep
import os
import socket
import struct
import sys
import types
import ipaddr
import OpenSSL
from OpenSSL import SSL, crypto
from twisted.internet import defer, threads
from twisted.python import usage, failure
from ooni import nettest
from ooni.utils import log
from ooni.errors import InsufficientPrivileges
from ooni.settings import config
## For a way to obtain the current version of Firefox's default ciphersuite
## list, see https://trac.torproject.org/projects/tor/attachment/ticket/4744/
## and the attached file "get_mozilla_files.py".
##
## Note, however, that doing so requires the source code to the version of
## firefox that you wish to emulate.
firefox_ciphers = ["ECDHE-ECDSA-AES256-SHA",
"ECDHE-RSA-AES256-SHA",
"DHE-RSA-CAMELLIA256-SHA",
"DHE-DSS-CAMELLIA256-SHA",
"DHE-RSA-AES256-SHA",
"DHE-DSS-AES256-SHA",
"ECDH-ECDSA-AES256-CBC-SHA",
"ECDH-RSA-AES256-CBC-SHA",
"CAMELLIA256-SHA",
"AES256-SHA",
"ECDHE-ECDSA-RC4-SHA",
"ECDHE-ECDSA-AES128-SHA",
"ECDHE-RSA-RC4-SHA",
"ECDHE-RSA-AES128-SHA",
"DHE-RSA-CAMELLIA128-SHA",
"DHE-DSS-CAMELLIA128-SHA",]
class SSLContextError(usage.UsageError):
"""Raised when we're missing the SSL context method, or incompatible
contexts were provided. The SSL context method should be one of the
following:
:attr:`OpenSSL.SSL.SSLv2_METHOD `
:attr:`OpenSSL.SSL.SSLv23_METHOD `
:attr:`OpenSSL.SSL.SSLv3_METHOD `
:attr:`OpenSSL.SSL.TLSv1_METHOD `
To use the pre-defined error messages, construct with one of the
:meth:`SSLContextError.errors.keys ` as the ``message`` string, like
so:
``SSLContextError('NO_CONTEXT')``
"""
#: Pre-defined error messages.
errors = {
'NO_CONTEXT': 'No SSL/TLS context chosen! Defaulting to TLSv1.',
'INCOMPATIBLE': str("Testing TLSv1 (option '--tls1') is incompatible "
+ "with testing SSL ('--ssl2' and '--ssl3')."),
'MISSING_SSLV2': str("Your version of OpenSSL was compiled without "
+ "support for SSLv2. This is normal on newer "
+ "versions of OpenSSL, but it means that you "
+ "will be unable to test SSLv2 handshakes "
+ "without recompiling OpenSSL."), }
def __init__(self, message):
if message in self.errors.keys():
message = self.errors[message]
super(usage.UsageError, self).__init__(message)
class HostUnreachableError(Exception):
"""Raised when the host IP address appears to be unreachable."""
pass
class HostUnresolveableError(Exception):
"""Raised when the host address appears to be unresolveable."""
pass
class ConnectionTimeout(Exception):
"""Raised when we receive a :class:`socket.timeout `, in order to
pass the Exception along to
:func:`TLSHandshakeTest.test_handshake.connectionFailed
`.
"""
pass
class HandshakeOptions(usage.Options):
""" :class:`usage.Options ` parser for the tls-handshake test."""
optParameters = [
['host', 'h', None,
'Remote host IP address (v4/v6) and port, i.e. "1.2.3.4:443"'],
['port', 'p', None,
'Use this port for all hosts, regardless of port specified in file'],
['ciphersuite', 'c', None ,
'File containing ciphersuite list, one per line'],]
optFlags = [
['ssl2', '2', 'Use SSLv2'],
['ssl3', '3', 'Use SSLv3'],
['tls1', 't', 'Use TLSv1'],]
class HandshakeTest(nettest.NetTestCase):
"""An ooniprobe NetTestCase for determining if we can complete a TLS/SSL
handshake with a remote host.
"""
name = 'tls-handshake'
author = 'Isis Lovecruft '
description = 'A test to determing if we can complete a TLS hankshake.'
version = '0.0.3'
requiresRoot = False
requiresTor = False
usageOptions = HandshakeOptions
host = None
inputFile = ['file', 'f', None, 'List of :s to test']
#: Default SSL/TLS context method.
context = SSL.Context(SSL.TLSv1_METHOD)
def setUp(self, *args, **kwargs):
"""Set defaults for a :class:`HandshakeTest `."""
self.ciphers = list()
if self.localOptions:
options = self.localOptions
## check that we're testing an IP:PORT, else exit gracefully:
if not (options['host'] or options['file']):
raise SystemExit("Need --host or --file!")
if options['host']:
self.host = options['host']
## If no context was chosen, explain our default to the user:
if not (options['ssl2'] or options['ssl3'] or options['tls1']):
try: raise SSLContextError('NO_CONTEXT')
except SSLContextError as sce: log.err(sce.message)
context = None
else:
## If incompatible contexts were chosen, inform the user:
if options['tls1'] and (options['ssl2'] or options['ssl3']):
try: raise SSLContextError('INCOMPATIBLE')
except SSLContextError as sce: log.err(sce.message)
finally: log.msg('Defaulting to testing only TLSv1.')
elif options['ssl2']:
try:
if not options['ssl3']:
context = SSL.Context(SSL.SSLv2_METHOD)
else:
context = SSL.Context(SSL.SSLv23_METHOD)
except ValueError as ve:
log.err(ve.message)
try: raise SSLContextError('MISSING_SSLV2')
except SSLContextError as sce:
log.err(sce.message)
log.msg("Falling back to testing only TLSv1.")
context = SSL.Context(SSL.TLSv1_METHOD)
elif options['ssl3']:
context = SSL.Context(SSL.SSLv3_METHOD)
## finally, reset the context if the user's choice was okay:
if context: self.context = context
## if we weren't given a file with a list of ciphersuites to use,
## then use the firefox default list:
if not options['ciphersuite']:
self.ciphers = firefox_ciphers
log.msg('Using default Firefox ciphersuite list.')
else:
if os.path.isfile(options['ciphersuite']):
log.msg('Using ciphersuite list from "%s"'
% options['ciphersuite'])
with open(options['ciphersuite']) as cipherfile:
for line in cipherfile.readlines():
self.ciphers.append(line.strip())
self.ciphersuite = ":".join(self.ciphers)
if getattr(config.advanced, 'default_timeout', None) is not None:
self.timeout = config.advanced.default_timeout
else:
self.timeout = 30 ## default the timeout to 30 seconds
## xxx For debugging, set the socket timeout higher anyway:
self.timeout = 30
## We have to set the default timeout on our sockets before creation:
socket.setdefaulttimeout(self.timeout)
def isIP(self,addr):
try:
socket_inet_aton(addr)
return True
except socket_error:
return False
def resolveHost(self,addr):
try:
return socket_gethostbyname(addr)
except socket_error:
raise HostUnresolveableError
def splitInput(self, input):
addr, port = input.strip().rsplit(':', 1)
#if addr is hostname it is resolved to ip
if not self.isIP(addr):
addr=self.resolveHost(addr)
if self.localOptions['port']:
port = self.localOptions['port']
return (str(addr), int(port))
def inputProcessor(self, file=None):
if self.host:
yield self.splitInput(self.host)
if os.path.isfile(file):
with open(file) as fh:
for line in fh.readlines():
if line.startswith('#'):
continue
try:
yield self.splitInput(line)
except HostUnresolveableError:
continue
def buildSocket(self, addr):
global s
ip = ipaddr.IPAddress(addr) ## learn if we're IPv4 or IPv6
if ip.version == 4:
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
elif ip.version == 6:
s = socket.socket(socket.AF_INET6, socket.SOCK_STREAM)
return s
def getContext(self):
self.context.set_cipher_list(self.ciphersuite)
return self.context
@staticmethod
def getPeerCert(connection, get_chain=False):
"""Get the PEM-encoded certificate or cert chain of the remote host.
:param connection: A :class:`OpenSSL.SSL.Connection `.
:param bool get_chain: If True, get the all certificates in the
chain. Otherwise, only get the remote host's certificate.
:returns: A PEM-encoded x509 certificate. If
:param:`getPeerCert.get_chain ` is True, returns a list
of PEM-encoded x509 certificates.
"""
if not get_chain:
x509_cert = connection.get_peer_certificate()
pem_cert = crypto.dump_certificate(crypto.FILETYPE_PEM, x509_cert)
return pem_cert
else:
cert_chain = []
x509_cert_chain = connection.get_peer_cert_chain()
for x509_cert in x509_cert_chain:
pem_cert = crypto.dump_certificate(crypto.FILETYPE_PEM,
x509_cert)
cert_chain.append(pem_cert)
return cert_chain
@staticmethod
def getX509Name(certificate, get_components=False):
"""Get the DER-encoded form of the Name fields of an X509 certificate.
@param certificate: A :class:`OpenSSL.crypto.X509Name` object.
@param get_components: A boolean. If True, returns a list of tuples of
the (name, value)s of each Name field in the
:param:`certificate`. If False, returns the DER
encoded form of the Name fields of the
:param:`certificate`.
"""
x509_name = None
try:
assert isinstance(certificate, crypto.X509Name), \
"getX509Name takes OpenSSL.crypto.X509Name as first argument!"
x509_name = crypto.X509Name(certificate)
except AssertionError as ae:
log.err(ae)
except Exception as exc:
log.exception(exc)
if not x509_name is None:
if not get_components:
return x509_name.der()
else:
return x509_name.get_components()
else:
log.debug("getX509Name: got None for ivar x509_name")
@staticmethod
def getPublicKey(key):
"""Get the PEM-encoded format of a host certificate's public key.
:param key: A :class:`OpenSSL.crypto.PKey ` object.
"""
try:
assert isinstance(key, crypto.PKey), \
"getPublicKey expects type OpenSSL.crypto.PKey for parameter key"
except AssertionError as ae:
log.err(ae)
else:
pubkey = crypto.dump_privatekey(crypto.FILETYPE_PEM, key)
return pubkey
def test_handshake(self):
"""xxx fill me in"""
def makeConnection(host):
"""Create a socket to the remote host's IP address, then get the
TLS/SSL context method and ciphersuite list. Lastly, initiate a
connection to the host.
:param tuple host: A tuple of the remote host's IP address as a
string, and an integer specifying the remote host port, i.e.
('1.1.1.1',443)
:raises: :exc:`ConnectionTimeout` if the socket timed out.
:returns: A :class:`OpenSSL.SSL.Connection `.
"""
addr, port = host
sckt = self.buildSocket(addr)
context = self.getContext()
connection = SSL.Connection(context, sckt)
try:
connection.connect(host)
except socket_timeout as stmo:
error = ConnectionTimeout(stmo.message)
return failure.Failure(error)
else:
return connection
def connectionFailed(connection, host):
"""Handle errors raised while attempting to create the socket and
:class:`OpenSSL.SSL.Connection `, and setting the
TLS/SSL context.
:type connection: :exc:Exception
:param connection: The exception that was raised in
:func:`HandshakeTest.test_handshake.makeConnection
`.
:param tuple host: A tuple of the host IP address as a string, and
an int specifying the host port, i.e. ('1.1.1.1', 443)
:rtype: :exc:Exception
:returns: The original exception.
"""
addr, port = host
if not isinstance(connection, SSL.Connection):
if isinstance(connection, IOError):
## On some *nix distros, /dev/random is 0600 root:root and
## we get a permissions error when trying to read
if connection.message.find("[Errno 13]"):
raise InsufficientPrivileges(
"%s" % connection.message.split("[Errno 13]", 1)[1])
elif isinstance(connection, socket_error):
if connection.message.find("[Errno 101]"):
raise HostUnreachableError(
"Host unreachable: %s:%s" % (addr, port))
elif isinstance(connection, Exception):
log.debug("connectionFailed: got Exception:")
log.err("Connection failed with reason: %s"
% connection.message)
else:
log.err("Connection failed with reason: %s" % str(connection))
self.report['host'] = addr
self.report['port'] = port
self.report['state'] = 'CONNECTION_FAILED'
return connection
def connectionSucceeded(connection, host, timeout):
"""If we have created a connection, set the socket options, and log
the connection state and peer name.
:param connection: A :class:`OpenSSL.SSL.Connection `.
:param tuple host: A tuple of the remote host's IP address as a
string, and an integer specifying the remote host port, i.e.
('1.1.1.1',443)
"""
## xxx TODO to get this to work with a non-blocking socket, see how
## twisted.internet.tcp.Client handles socket objects.
connection.setblocking(1)
## Set the timeout on the connection:
##
## We want to set SO_RCVTIMEO and SO_SNDTIMEO, which both are
## defined in the socket option definitions in , and
## which both take as their value, according to socket(7), a
## struct timeval, which is defined in the libc manual:
## https://www.gnu.org/software/libc/manual/html_node/Elapsed-Time.html
timeval = struct.pack('ll', int(timeout), 0)
connection.setsockopt(socket.SOL_SOCKET, socket.SO_RCVTIMEO, timeval)
connection.setsockopt(socket.SOL_SOCKET, socket.SO_SNDTIMEO, timeval)
## Set the connection state to client mode:
connection.set_connect_state()
peer_name, peer_port = connection.getpeername()
if peer_name:
log.msg("Connected to %s" % peer_name)
else:
log.debug("Couldn't get peer name from connection: %s" % host)
log.msg("Connected to %s" % host)
log.debug("Connection state: %s " % connection.state_string())
return connection
def connectionRenegotiate(connection, host, error_message):
"""Handle a server-initiated SSL/TLS handshake renegotiation.
:param connection: A :class:`OpenSSL.SSL.Connection `.
:param tuple host: A tuple of the remote host's IP address as a
string, and an integer specifying the remote host port, i.e.
('1.1.1.1',443)
"""
log.msg("Server requested renegotiation from: %s" % host)
log.debug("Renegotiation reason: %s" % error_message)
log.debug("State: %s" % connection.state_string())
if connection.renegotiate():
log.debug("Renegotiation possible.")
log.msg("Retrying handshake with %s..." % host)
try:
connection.do_handshake()
while connection.renegotiate_pending():
log.msg("Renegotiation with %s in progress..." % host)
log.debug("State: %s" % connection.state_string())
sleep(1)
else:
log.msg("Renegotiation with %s complete!" % host)
except SSL.WantReadError, wre:
connection = handleWantRead(connection)
log.debug("State: %s" % connection.state_string())
except SSL.WantWriteError, wwe:
connection = handleWantWrite(connection)
log.debug("State: %s" % connection.state_string())
return connection
def connectionShutdown(connection, host):
"""Handle shutting down a :class:`OpenSSL.SSL.Connection
`, including correct handling of halfway shutdown
connections.
Calls to :meth:`OpenSSL.SSL.Connection.shutdown
` return a boolean value -- if the
connection is already shutdown, it returns True, else it returns
false. Thus we loop through a block which detects if the connection
is an a partial shutdown state and corrects that if that is the
case, else it waits for one second, then attempts shutting down the
connection again.
Detection of a partial shutdown state is done through
:meth:`OpenSSL.SSL.Connection.get_shutdown
` which queries OpenSSL for a bitvector
of the server and client shutdown states. For example, the binary
string '0b00' is an open connection, and '0b10' is a partially
closed connection that has been shutdown on the serverside.
:param connection: A :class:`OpenSSL.SSL.Connection `.
:param tuple host: A tuple of the remote host's IP address as a
string, and an integer specifying the remote host port, i.e.
('1.1.1.1',443)
"""
peername, peerport = host
if isinstance(connection, SSL.Connection):
log.msg("Closing connection to %s:%d..." % (peername, peerport))
while not connection.shutdown():
## if the connection is halfway shutdown, we have to
## wait for a ZeroReturnError on connection.recv():
if (bin(connection.get_shutdown()) == '0b01') \
or (bin(connection.get_shutdown()) == '0b10'):
try:
_read_buffer = connection.pending()
connection.recv(_read_buffer)
except SSL.ZeroReturnError, zre: continue
else:
sleep(1)
else:
log.msg("Closed connection to %s:%d"
% (peername, peerport))
elif isinstance(connection, types.NoneType):
log.debug("connectionShutdown: got NoneType for connection")
return
else:
log.debug("connectionShutdown: expected connection, got %r"
% connection.__repr__())
return connection
def handleWantRead(connection):
"""From OpenSSL memory BIO documentation on ssl_read():
If the underlying BIO is blocking, SSL_read() will only
return, once the read operation has been finished or an error
occurred, except when a renegotiation take place, in which
case a SSL_ERROR_WANT_READ may occur. This behaviour can be
controlled with the SSL_MODE_AUTO_RETRY flag of the
SSL_CTX_set_mode(3) call.
If the underlying BIO is non-blocking, SSL_read() will also
return when the underlying BIO could not satisfy the needs of
SSL_read() to continue the operation. In this case a call to
SSL_get_error(3) with the return value of SSL_read() will
yield SSL_ERROR_WANT_READ or SSL_ERROR_WANT_WRITE. As at any
time a re-negotiation is possible, a call to SSL_read() can
also cause write operations! The calling process then must
repeat the call after taking appropriate action to satisfy the
needs of SSL_read(). The action depends on the underlying
BIO. When using a non-blocking socket, nothing is to be done,
but select() can be used to check for the required condition.
And from the OpenSSL memory BIO documentation on ssl_get_error():
SSL_ERROR_WANT_READ, SSL_ERROR_WANT_WRITE
The operation did not complete; the same TLS/SSL I/O function
should be called again later. If, by then, the underlying BIO
has data available for reading (if the result code is
SSL_ERROR_WANT_READ) or allows writing data
(SSL_ERROR_WANT_WRITE), then some TLS/SSL protocol progress
will take place, i.e. at least part of an TLS/SSL record will
be read or written. Note that the retry may again lead to a
SSL_ERROR_WANT_READ or SSL_ERROR_WANT_WRITE condition. There
is no fixed upper limit for the number of iterations that may
be necessary until progress becomes visible at application
protocol level.
For socket BIOs (e.g. when SSL_set_fd() was used), select() or
poll() on the underlying socket can be used to find out when
the TLS/SSL I/O function should be retried.
Caveat: Any TLS/SSL I/O function can lead to either of
SSL_ERROR_WANT_READ and SSL_ERROR_WANT_WRITE. In particular,
SSL_read() or SSL_peek() may want to write data and
SSL_write() may want to read data. This is mainly because
TLS/SSL handshakes may occur at any time during the protocol
(initiated by either the client or the server); SSL_read(),
SSL_peek(), and SSL_write() will handle any pending
handshakes.
Also, see http://stackoverflow.com/q/3952104
"""
try:
while connection.want_read():
self.state = connection.state_string()
log.debug("Connection to %s HAS want_read" % host)
_read_buffer = connection.pending()
log.debug("Rereading %d bytes..." % _read_buffer)
sleep(1)
rereceived = connection.recv(int(_read_buffer))
log.debug("Received %d bytes" % rereceived)
log.debug("State: %s" % connection.state_string())
else:
self.state = connection.state_string()
peername, peerport = connection.getpeername()
log.debug("Connection to %s:%s DOES NOT HAVE want_read"
% (peername, peerport))
log.debug("State: %s" % connection.state_string())
except SSL.WantWriteError, wwe:
self.state = connection.state_string()
log.debug("Got WantWriteError while handling want_read")
log.debug("WantWriteError: %s" % wwe.message)
log.debug("Switching to handleWantWrite()...")
handleWantWrite(connection)
return connection
def handleWantWrite(connection):
"""See :func:HandshakeTest.test_hanshake.handleWantRead """
try:
while connection.want_write():
self.state = connection.state_string()
log.debug("Connection to %s HAS want_write" % host)
sleep(1)
resent = connection.send("o\r\n")
log.debug("Sent: %d" % resent)
log.debug("State: %s" % connection.state_string())
except SSL.WantReadError, wre:
self.state = connection.state_string()
log.debug("Got WantReadError while handling want_write")
log.debug("WantReadError: %s" % wre.message)
log.debug("Switching to handleWantRead()...")
handleWantRead(connection)
return connection
def doHandshake(connection):
"""Attempt a TLS/SSL handshake with the host.
If, after the first attempt at handshaking, OpenSSL's memory BIO
state machine does not report success, then try reading and
writing from the connection, and handle any SSL_ERROR_WANT_READ or
SSL_ERROR_WANT_WRITE which occurs.
If multiple want_reads occur, then try renegotiation with the
host, and start over. If multiple want_writes occur, then it is
possible that the connection has timed out, and move on to the
connectionShutdown step.
:param connection: A :class:`OpenSSL.SSL.Connection `.
:ivar peername: The host IP address, as reported by
:meth:`Connection.getpeername `.
:ivar peerport: The host port, reported by
:meth:`Connection.getpeername `.
:ivar int sent: The number of bytes sent to to the remote host.
:ivar int received: The number of bytes received from the remote
host.
:ivar int _read_buffer: The max bytes that can be read from the
connection.
:returns: The :param:`doHandshake.connection ` with
handshake completed, else the unhandled error that was
raised.
"""
peername, peerport = connection.getpeername()
try:
log.msg("Attempting handshake: %s" % peername)
connection.do_handshake()
except OpenSSL.SSL.WantReadError() as wre:
self.state = connection.state_string()
log.debug("Handshake state: %s" % self.state)
log.debug("doHandshake: WantReadError on first handshake attempt.")
connection = handleWantRead(connection)
except OpenSSL.SSL.WantWriteError() as wwe:
self.state = connection.state_string()
log.debug("Handshake state: %s" % self.state)
log.debug("doHandshake: WantWriteError on first handshake attempt.")
connection = handleWantWrite(connection)
else:
self.state = connection.state_string()
if self.state == 'SSL negotiation finished successfully':
## jump to handshakeSuccessful and get certchain
return connection
else:
sent = connection.send("o\r\n")
self.state = connection.state_string()
log.debug("Handshake state: %s" % self.state)
log.debug("Transmitted %d bytes" % sent)
_read_buffer = connection.pending()
log.debug("Max bytes in receive buffer: %d" % _read_buffer)
try:
received = connection.recv(int(_read_buffer))
except SSL.WantReadError, wre:
if connection.want_read():
self.state = connection.state_string()
connection = handleWantRead(connection)
else:
## if we still have an SSL_ERROR_WANT_READ, then try to
## renegotiate
self.state = connection.state_string()
connection = connectionRenegotiate(connection,
connection.getpeername(),
wre.message)
except SSL.WantWriteError, wwe:
self.state = connection.state_string()
log.debug("Handshake state: %s" % self.state)
if connection.want_write():
connection = handleWantWrite(connection)
else:
raise ConnectionTimeout("Connection to %s:%d timed out."
% (peername, peerport))
else:
log.msg("Received: %s" % received)
self.state = connection.state_string()
log.debug("Handshake state: %s" % self.state)
return connection
def handshakeSucceeded(connection):
"""Get the details from the server certificate, cert chain, and
server ciphersuite list, and put them in our report.
WARNING: do *not* do this:
>>> server_cert.get_pubkey()
>>> pk = server_cert.get_pubkey()
>>> pk.check()
Segmentation fault
:param connection: A :class:`OpenSSL.SSL.Connection `.
:returns: :param:`handshakeSucceeded.connection `.
"""
host, port = connection.getpeername()
log.msg("Handshake with %s:%d successful!" % (host, port))
server_cert = self.getPeerCert(connection)
server_cert_chain = self.getPeerCert(connection, get_chain=True)
renegotiations = connection.total_renegotiations()
cipher_list = connection.get_cipher_list()
session_key = connection.master_key()
rawcert = connection.get_peer_certificate()
## xxx TODO this hash needs to be formatted as SHA1, not long
cert_subj_hash = rawcert.subject_name_hash()
cert_serial = rawcert.get_serial_number()
cert_sig_algo = rawcert.get_signature_algorithm()
cert_subject = self.getX509Name(rawcert.get_subject(),
get_components=True)
cert_issuer = self.getX509Name(rawcert.get_issuer(),
get_components=True)
cert_pubkey = self.getPublicKey(rawcert.get_pubkey())
self.report['host'] = host
self.report['port'] = port
self.report['state'] = self.state
self.report['renegotiations'] = renegotiations
self.report['server_cert'] = server_cert
self.report['server_cert_chain'] = \
''.join([cert for cert in server_cert_chain])
self.report['server_ciphersuite'] = cipher_list
self.report['cert_subject'] = cert_subject
self.report['cert_subj_hash'] = cert_subj_hash
self.report['cert_issuer'] = cert_issuer
self.report['cert_public_key'] = cert_pubkey
self.report['cert_serial_no'] = cert_serial
self.report['cert_sig_algo'] = cert_sig_algo
## The session's master key is only valid for that session, and
## will allow us to decrypt any packet captures (if they were
## collected). Because we are not requesting URLs, only host:port
## (which would be visible in pcaps anyway, since the FQDN is
## never encrypted) I do not see a way for this to log any user or
## identifying information. Correct me if I'm wrong.
self.report['session_key'] = session_key
log.msg("Server certificate:\n\n%s" % server_cert)
log.msg("Server certificate chain:\n\n%s"
% ''.join([cert for cert in server_cert_chain]))
log.msg("Negotiated ciphersuite:\n%s"
% '\n\t'.join([cipher for cipher in cipher_list]))
log.msg("Certificate subject: %s" % cert_subject)
log.msg("Certificate subject hash: %d" % cert_subj_hash)
log.msg("Certificate issuer: %s" % cert_issuer)
log.msg("Certificate public key:\n\n%s" % cert_pubkey)
log.msg("Certificate signature algorithm: %s" % cert_sig_algo)
log.msg("Certificate serial number: %s" % cert_serial)
log.msg("Total renegotiations: %d" % renegotiations)
return connection
def handshakeFailed(connection, host):
"""Handle a failed handshake attempt and report the failure reason.
:type connection: :class:`twisted.python.failure.Failure `
or :exc:Exception
:param connection: The failed connection.
:param tuple host: A tuple of the remote host's IP address as a
string, and an integer specifying the remote host port, i.e.
('1.1.1.1',443)
:returns: None
"""
addr, port = host
log.msg("Handshake with %s:%d failed!" % host)
self.report['host'] = host
self.report['port'] = port
if isinstance(connection, Exception) \
or isinstance(connection, ConnectionTimeout):
log.msg("Handshake failed with reason: %s" % connection.message)
self.report['state'] = connection.message
elif isinstance(connection, failure.Failure):
log.msg("Handshake failed with reason: Socket %s"
% connection.getErrorMessage())
self.report['state'] = connection.getErrorMessage()
ctmo = connection.trap(ConnectionTimeout)
if ctmo == ConnectionTimeout:
connection.cleanFailure()
else:
log.msg("Handshake failed with reason: %s" % str(connection))
if not 'state' in self.report.keys():
self.report['state'] = str(connection)
return None
def deferMakeConnection(host):
return threads.deferToThread(makeConnection, self.input)
if self.host and not self.input:
self.input = self.splitInput(self.host)
log.msg("Beginning handshake test for %s:%s" % self.input)
connection = deferMakeConnection(self.input)
connection.addCallbacks(connectionSucceeded, connectionFailed,
callbackArgs=[self.input, self.timeout],
errbackArgs=[self.input])
handshake = defer.Deferred()
handshake.addCallback(doHandshake)
handshake.addCallbacks(handshakeSucceeded, handshakeFailed,
errbackArgs=[self.input])
connection.chainDeferred(handshake)
connection.addCallbacks(connectionShutdown, defer.passthru,
callbackArgs=[self.input])
connection.addBoth(log.exception)
return connection
ooniprobe-1.3.2/ooni/nettests/experimental/dns_injection.py 0000644 0001750 0001750 00000003706 12463144534 022375 0 ustar irl irl # -*- encoding: utf-8 -*-
from twisted.python import usage
from twisted.internet import defer
from ooni.templates import dnst
from ooni.utils import log
class UsageOptions(usage.Options):
optParameters = [
['resolver', 'r', '8.8.8.1', 'an invalid DNS resolver'],
['timeout', 't', 3, 'timeout after which we should consider the query failed']
]
class DNSInjectionTest(dnst.DNSTest):
"""
This test detects DNS spoofed DNS responses by performing UDP based DNS
queries towards an invalid DNS resolver.
For it to work we must be traversing the network segment of a machine that
is actively injecting DNS query answers.
"""
name = "DNS Injection"
description = "Checks for injection of spoofed DNS answers"
version = "0.1"
authors = "Arturo Filastò"
inputFile = ['file', 'f', None,
'Input file of list of hostnames to attempt to resolve']
usageOptions = UsageOptions
requiredOptions = ['resolver', 'file']
requiresRoot = False
requiresTor = False
def setUp(self):
self.resolver = (self.localOptions['resolver'], 53)
self.queryTimeout = [self.localOptions['timeout']]
def inputProcessor(self, filename):
fp = open(filename)
for line in fp:
if line.startswith('http://'):
yield line.replace('http://', '').replace('/', '').strip()
else:
yield line.strip()
fp.close()
def test_injection(self):
self.report['injected'] = None
d = self.performALookup(self.input, self.resolver)
@d.addCallback
def cb(res):
log.msg("The DNS query for %s is injected" % self.input)
self.report['injected'] = True
@d.addErrback
def err(err):
err.trap(defer.TimeoutError)
log.msg("The DNS query for %s is not injected" % self.input)
self.report['injected'] = False
return d
ooniprobe-1.3.2/ooni/nettests/experimental/tor_censorship.py 0000644 0001750 0001750 00000001402 12411545735 022600 0 ustar irl irl from twisted.internet import reactor
from ooni.templates import tcpt
from ooni.templates import httpt
# class TorWebsiteTest(httpt.HTTPTest):
# inputs = [
# 'http://blog.torproject.org/',
# 'http://trac.torproject.org/',
# 'http://bridges.torproject.org/',
# 'http://torproject.org/',
# ]
#
# name = 'tor_website_test'
#
# def test_website_censorship(self):
# return self.doRequest(self.input)
#
class TorConnectionTest(tcpt.TCPTest):
name = 'tor_connect'
inputs = [
'8.8.8.8:53',
'127.0.0.1:2838'
]
def setUp(self):
self.address, self.port = self.input.split(':')
self.port = int(self.port)
def test_connect_to_relay(self):
return self.connect()
ooniprobe-1.3.2/ooni/nettests/experimental/keyword_filtering.py 0000644 0001750 0001750 00000003414 12463144534 023272 0 ustar irl irl # -*- encoding: utf-8 -*-
#
# :authors: Arturo Filastò
# :licence: see LICENSE
from twisted.python import usage
from ooni.templates import scapyt
from scapy.layers.inet import TCP, IP
from scapy.volatile import RandShort
class UsageOptions(usage.Options):
optParameters = [
['backend', 'b', '127.0.0.1:57002', 'Test backend running TCP echo'],
['timeout', 't', 5, 'Timeout after which to give up waiting for RST packets']
]
class KeywordFiltering(scapyt.BaseScapyTest):
name = "Keyword Filtering detection based on RST packets"
author = "Arturo Filastò"
version = "0.2"
usageOptions = UsageOptions
inputFile = ['file', 'f', None,
'List of keywords to use for censorship testing']
requiresRoot = True
requiresTor = False
def test_tcp_keyword_filtering(self):
"""
Places the keyword to be tested in the payload of a TCP packet.
XXX need to implement bisection method for enumerating keywords.
though this should not be an issue since we are testing all
the keywords in parallel.
"""
backend_ip, backend_port = self.localOptions['backend'].split(':')
timeout = int(self.localOptions['timeout'])
keyword_to_test = str(self.input)
packets = IP(dst=backend_ip, id=RandShort()) / TCP(sport=4000, dport=int(backend_port)) / keyword_to_test
d = self.sr(packets, timeout=timeout)
@d.addCallback
def finished(packets):
answered, unanswered = packets
self.report['rst_packets'] = []
for snd, rcv in answered:
# The received packet has the RST flag
if rcv[TCP].flags == 4:
self.report['rst_packets'].append(rcv)
return d
ooniprobe-1.3.2/ooni/nettests/experimental/http_trix.py 0000644 0001750 0001750 00000003036 12463144534 021570 0 ustar irl irl # -*- encoding: utf-8 -*-
from twisted.python import usage
from ooni.utils import log
from ooni.templates import tcpt
class UsageOptions(usage.Options):
optParameters = [['backend', 'b', '127.0.0.1',
'The OONI backend that runs a TCP echo server'],
['backendport', 'p', 80, 'Specify the port that the TCP echo server is running (should only be set for debugging)']]
class HTTPTrix(tcpt.TCPTest):
name = "HTTPTrix"
version = "0.1"
authors = "Arturo Filastò"
usageOptions = UsageOptions
requiresTor = False
requiresRoot = False
requiredOptions = ['backend']
def setUp(self):
self.port = int(self.localOptions['backendport'])
self.address = self.localOptions['backend']
def check_for_manipulation(self, response, payload):
log.debug("Checking if %s == %s" % (response, payload))
if response != payload:
self.report['tampering'] = True
else:
self.report['tampering'] = False
def test_for_squid_cache_object(self):
"""
This detects the presence of a squid transparent HTTP proxy by sending
a request for cache_object://localhost/info.
This tests for the presence of a Squid Transparent proxy by sending:
GET cache_object://localhost/info HTTP/1.1
"""
payload = 'GET cache_object://localhost/info HTTP/1.1'
payload += '\n\r'
d = self.sendPayload(payload)
d.addCallback(self.check_for_manipulation, payload)
return d
ooniprobe-1.3.2/ooni/nettests/experimental/__init__.py 0000644 0001750 0001750 00000000000 12373757532 021276 0 ustar irl irl ooniprobe-1.3.2/ooni/nettests/experimental/lizard_mafia_fuck.py 0000644 0001750 0001750 00000002775 12447365765 023223 0 ustar irl irl from twisted.web.client import Agent
from twisted.internet import reactor
from ooni.settings import config
from ooni.templates.tort import TorTest
from ooni import errors
class LizardMafiaFuckTest(TorTest):
name = "Lizard Mafia Fuck Test"
version = "4.20"
description = "Scan shit via lizard mafia exit nodes."
def getInputProcessor(self):
# XXX: doesn't seem that we have any of the exitpolicy available :\
# XXX: so the circuit might fail if port 80 isn't allowed
exits = filter(lambda router: 'exit' in router.flags,
config.tor_state.routers.values())
hexes = [exit.id_hex for exit in exits]
for curse in hexes:
if curse.name.startswith("LizardNSA"):
import pdb; pdb.set_trace()
yield curse
def test_fetch_exit_ip(self):
try:
exit = self.state.routers[self.input]
except KeyError:
# Router not in consensus, sorry
self.report['failure'] = "Router %s not in consensus." % self.input
return
self.report['exit_ip'] = exit.ip
endpoint = self.getExitSpecificEndpoint((host, port), exit)
endpoint.connect()
def addResultToReport(result):
self.report['external_exit_ip'] = result
def addFailureToReport(failure):
self.report['failure'] = errors.handleAllFailures(failure)
d.addCallback(addResultToReport)
d.addErrback(addFailureToReport)
return d
ooniprobe-1.3.2/ooni/nettests/experimental/script.py 0000644 0001750 0001750 00000005254 12447563404 021056 0 ustar irl irl from ooni import nettest
from ooni.utils import log
from twisted.internet import defer, protocol, reactor
from twisted.python import usage
import os
def which(program):
def is_exe(fpath):
return os.path.isfile(fpath) and os.access(fpath, os.X_OK)
fpath, fname = os.path.split(program)
if fpath:
if is_exe(program):
return program
else:
for path in os.environ["PATH"].split(os.pathsep):
path = path.strip('"')
exe_file = os.path.join(path, program)
if is_exe(exe_file):
return exe_file
return None
class UsageOptions(usage.Options):
optParameters = [
['interpreter', 'i', '', 'The interpreter to use'],
['script', 's', '', 'The script to run']
]
class ScriptProcessProtocol(protocol.ProcessProtocol):
def __init__(self, test_case):
self.test_case = test_case
self.deferred = defer.Deferred()
def connectionMade(self):
log.debug("connectionMade")
self.transport.closeStdin()
self.test_case.report['lua_output'] = ""
def outReceived(self, data):
log.debug('outReceived: %s' % data)
self.test_case.report['lua_output'] += data
def errReceived(self, data):
log.err('Script error: %s' % data)
self.transport.signalProcess('KILL')
def processEnded(self, status):
rc = status.value.exitCode
log.debug('processEnded: %s, %s' % \
(rc, self.test_case.report['lua_output']))
if rc == 0:
self.deferred.callback(self)
else:
self.deferred.errback(rc)
# TODO: Maybe the script requires a back-end.
class Script(nettest.NetTestCase):
name = "Script test"
version = "0.1"
authors = "Dominic Hamon"
usageOptions = UsageOptions
requiredOptions = ['interpreter', 'script']
requiresRoot = False
requiresTor = False
def test_run_script(self):
"""
We run the script specified in the usage options and take whatever
is printed to stdout as the results of the test.
"""
processProtocol = ScriptProcessProtocol(self)
interpreter = self.localOptions['interpreter']
if not which(interpreter):
log.err('Unable to find %s executable in PATH.' % interpreter)
return
reactor.spawnProcess(processProtocol,
interpreter,
args=[interpreter, self.localOptions['script']],
env={'HOME': os.environ['HOME']},
usePTY=True)
if not reactor.running:
reactor.run()
return processProtocol.deferred
ooniprobe-1.3.2/ooni/nettests/experimental/squid.py 0000644 0001750 0001750 00000010721 12463144534 020667 0 ustar irl irl # -*- encoding: utf-8 -*-
#
# Squid transparent HTTP proxy detector
# *************************************
#
# :authors: Arturo Filastò
# :licence: see LICENSE
from ooni import utils
from ooni.utils import log
from ooni.templates import httpt
import re
class SquidTest(httpt.HTTPTest):
"""
This test aims at detecting the presence of a squid based transparent HTTP
proxy. It also tries to detect the version number.
"""
name = "Squid test"
author = "Arturo Filastò"
version = "0.1"
optParameters = [['backend', 'b', 'http://ooni.nu/test/', 'Test backend to use']]
#inputFile = ['urls', 'f', None, 'Urls file']
inputs =['http://google.com']
requiresRoot = False
requiresTor = False
def test_cacheobject(self):
"""
This detects the presence of a squid transparent HTTP proxy by sending
a request for cache_object://localhost/info.
The response to this request will usually also contain the squid
version number.
"""
log.debug("Running")
def process_body(body):
if "Access Denied." in body:
self.report['transparent_http_proxy'] = True
else:
self.report['transparent_http_proxy'] = False
log.msg("Testing Squid proxy presence by sending a request for "\
"cache_object")
headers = {}
#headers["Host"] = [self.input]
self.report['trans_http_proxy'] = None
method = "GET"
body = "cache_object://localhost/info"
return self.doRequest(self.localOptions['backend'], method=method, body=body,
headers=headers, body_processor=process_body)
def test_search_bad_request(self):
"""
Attempts to perform a request with a random invalid HTTP method.
If we are being MITMed by a Transparent Squid HTTP proxy we will get
back a response containing the X-Squid-Error header.
"""
def process_headers(headers):
log.debug("Processing headers in test_search_bad_request")
if 'X-Squid-Error' in headers:
log.msg("Detected the presence of a transparent HTTP "\
"squid proxy")
self.report['trans_http_proxy'] = True
else:
log.msg("Did not detect the presence of transparent HTTP "\
"squid proxy")
self.report['transparent_http_proxy'] = False
log.msg("Testing Squid proxy presence by sending a random bad request")
headers = {}
#headers["Host"] = [self.input]
method = utils.randomSTR(10, True)
self.report['transparent_http_proxy'] = None
return self.doRequest(self.localOptions['backend'], method=method,
headers=headers, headers_processor=process_headers)
def test_squid_headers(self):
"""
Detects the presence of a squid transparent HTTP proxy based on the
response headers it adds to the responses to requests.
"""
def process_headers(headers):
"""
Checks if any of the headers that squid is known to add match the
squid regexp.
We are looking for something that looks like this:
via: 1.0 cache_server:3128 (squid/2.6.STABLE21)
x-cache: MISS from cache_server
x-cache-lookup: MISS from cache_server:3128
"""
squid_headers = {'via': r'.* \((squid.*)\)',
'x-cache': r'MISS from (\w+)',
'x-cache-lookup': r'MISS from (\w+:?\d+?)'
}
self.report['transparent_http_proxy'] = False
for key in squid_headers.keys():
if key in headers:
log.debug("Found %s in headers" % key)
m = re.search(squid_headers[key], headers[key])
if m:
log.msg("Detected the presence of squid transparent"\
" HTTP Proxy")
self.report['transparent_http_proxy'] = True
log.msg("Testing Squid proxy by looking at response headers")
headers = {}
#headers["Host"] = [self.input]
method = "GET"
self.report['transparent_http_proxy'] = None
d = self.doRequest(self.localOptions['backend'], method=method,
headers=headers, headers_processor=process_headers)
return d
ooniprobe-1.3.2/ooni/nettests/experimental/http_uk_mobile_networks.py 0000644 0001750 0001750 00000005023 12463144534 024502 0 ustar irl irl # -*- encoding: utf-8 -*-
import yaml
from twisted.python import usage
from ooni.templates import httpt
from ooni.utils import log
class UsageOptions(usage.Options):
"""
See https://github.com/hellais/ooni-inputs/processed/uk_mobile_networks_redirects.yaml
to see how the rules file should look like.
"""
optParameters = [
['rules', 'y', None,
'Specify the redirect rules file ']
]
class HTTPUKMobileNetworksTest(httpt.HTTPTest):
"""
This test was thought of by Open Rights Group and implemented with the
purpose of detecting censorship in the UK.
For more details on this test see:
https://trac.torproject.org/projects/tor/ticket/6437
XXX port the knowledge from the trac ticket into this test docstring
"""
name = "HTTP UK mobile network redirect test"
usageOptions = UsageOptions
followRedirects = True
inputFile = ['urls', 'f', None, 'List of urls one per line to test for censorship']
requiredOptions = ['urls']
requiresRoot = False
requiresTor = False
def testPattern(self, value, pattern, type):
if type == 'eq':
return value == pattern
elif type == 're':
import re
if re.match(pattern, value):
return True
else:
return False
else:
return None
def testPatterns(self, patterns, location):
test_result = False
if type(patterns) == list:
for pattern in patterns:
test_result |= self.testPattern(location, pattern['value'], pattern['type'])
rules_file = self.localOptions['rules']
return test_result
def testRules(self, rules, location):
result = {}
blocked = False
for rule, value in rules.items():
current_rule = {}
current_rule['name'] = value['name']
current_rule['patterns'] = value['patterns']
current_rule['test'] = self.testPatterns(value['patterns'], location)
blocked |= current_rule['test']
result[rule] = current_rule
result['blocked'] = blocked
return result
def processRedirect(self, location):
self.report['redirect'] = None
rules_file = self.localOptions['rules']
fp = open(rules_file)
rules = yaml.safe_load(fp)
fp.close()
log.msg("Testing rules %s" % rules)
redirect = self.testRules(rules, location)
self.report['redirect'] = redirect
ooniprobe-1.3.2/ooni/nettests/experimental/domclass_collector.py 0000644 0001750 0001750 00000001603 12463144534 023414 0 ustar irl irl # -*- encoding: utf-8 -*-
#
# The purpose of this collector is to compute the eigenvector for the input
# file containing a list of sites.
#
#
# :authors: Arturo Filastò
# :licence: see LICENSE
from ooni.kit import domclass
from ooni.templates import httpt
class DOMClassCollector(httpt.HTTPTest):
name = "DOM class collector"
author = "Arturo Filastò"
version = 0.1
followRedirects = True
inputFile = ['file', 'f', None, 'The list of urls to build a domclass for']
requiresTor = False
requiresRoot = False
def test_collect(self):
if self.input:
url = self.input
return self.doRequest(url)
else:
raise Exception("No input specified")
def processResponseBody(self, body):
eigenvalues = domclass.compute_eigenvalues_from_DOM(content=body)
self.report['eigenvalues'] = eigenvalues.tolist()
ooniprobe-1.3.2/ooni/nettests/experimental/http_keyword_filtering.py 0000644 0001750 0001750 00000002572 12447563404 024340 0 ustar irl irl # -*- encoding: utf-8 -*-
#
# :authors: Arturo Filastò
# :licence: see LICENSE
from twisted.python import usage
from ooni.templates import httpt
class UsageOptions(usage.Options):
optParameters = [['backend', 'b', 'http://127.0.0.1:57001',
'URL of the test backend to use']]
class HTTPKeywordFiltering(httpt.HTTPTest):
"""
This test involves performing HTTP requests containing to be tested for
censorship keywords.
It does not detect censorship on the client, but just logs the response from the
HTTP backend server.
"""
name = "HTTP Keyword Filtering"
author = "Arturo Filastò"
version = "0.1.1"
inputFile = ['file', 'f', None, 'List of keywords to use for censorship testing']
usageOptions = UsageOptions
requiresTor = False
requiresRoot = False
requiredOptions = ['backend']
def test_get(self):
"""
Perform a HTTP GET request to the backend containing the keyword to be
tested inside of the request body.
"""
return self.doRequest(self.localOptions['backend'], method="GET", body=self.input)
def test_post(self):
"""
Perform a HTTP POST request to the backend containing the keyword to be
tested inside of the request body.
"""
return self.doRequest(self.localOptions['backend'], method="POST", body=self.input)
ooniprobe-1.3.2/ooni/nettests/experimental/parasitictraceroute.py 0000644 0001750 0001750 00000004044 12463144534 023620 0 ustar irl irl from twisted.internet import defer, reactor
from ooni.errors import handleAllFailures
from ooni.templates import scapyt
from ooni.utils import log
from ooni.utils.txscapy import ParasiticTraceroute
from ooni.settings import config
from scapy.all import TCPerror, IPerror
class ParasiticTracerouteTest(scapyt.BaseScapyTest):
name = "Parasitic Traceroute Test"
description = "Injects duplicate TCP packets with varying TTL values by sniffing traffic"
version = '0.1'
samplePeriod = 40
requiresTor = False
def setUp(self):
self.report['parasitic_traceroute'] = {}
def test_parasitic_traceroute(self):
self.pt = ParasiticTraceroute()
log.debug("Starting ParasiticTraceroute for up to %d hosts at inject "
"rate %d with %s" % (self.pt.numHosts, self.pt.rate, self.pt))
config.scapyFactory.registerProtocol(self.pt)
d = defer.Deferred()
reactor.callLater(self.samplePeriod, d.callback, self)
d.addCallback(self.addToReport)
d.addErrback(handleAllFailures)
return d
def addToReport(self, result):
log.debug("Stopping ParasiticTraceroute")
self.pt.stopListening()
self.report['received_packets'] = self.pt.received_packets
for packet in self.pt.received_packets:
k = (packet[IPerror].id, packet[TCPerror].sport, packet[TCPerror].dport, packet[TCPerror].seq)
if k in self.pt.matched_packets:
ttl = self.pt.matched_packets[k]['ttl']
else:
ttl = 'unknown'
hop = (ttl, packet.src)
path = 'hops_%s' % packet[IPerror].dst
if path in self.report['parasitic_traceroute']:
self.report['parasitic_traceroute'][path].append(hop)
else:
self.report['parasitic_traceroute'][path] = [hop]
for p in self.report['parasitic_traceroute'].keys():
self.report['parasitic_traceroute'][p].sort(key=lambda x: x[0])
self.report['sent_packets'] = self.pt.sent_packets
ooniprobe-1.3.2/ooni/nettests/experimental/http_filtering_bypassing.py 0000644 0001750 0001750 00000005415 12463144534 024647 0 ustar irl irl # -*- encoding: utf-8 -*-
from twisted.python import usage
from ooni.utils import log
from ooni.utils import randomStr
from ooni.templates import tcpt
class UsageOptions(usage.Options):
optParameters = [['backend', 'b', '127.0.0.1',
'The OONI backend that runs a TCP echo server'],
['backendport', 'p', 80, 'Specify the port that the TCP echo server is running (should only be set for debugging)']]
class HTTPFilteringBypass(tcpt.TCPTest):
name = "HTTPFilteringBypass"
version = "0.1"
authors = "xx"
inputFile = ['file', 'f', None,
'Specify a list of hostnames to use as inputs']
usageOptions = UsageOptions
requiredOptions = ['backend']
requiresRoot = False
requiresTor = False
def setUp(self):
self.port = int(self.localOptions['backendport'])
self.address = self.localOptions['backend']
self.report['tampering'] = None
def check_for_manipulation(self, response, payload):
log.debug("Checking if %s == %s" % (response, payload))
if response != payload:
self.report['tampering'] = True
else:
self.report['tampering'] = False
def test_prepend_newline(self):
payload = "\nGET / HTTP/1.1\n\r"
payload += "Host: %s\n\r" % self.input
d = self.sendPayload(payload)
d.addCallback(self.check_for_manipulation, payload)
return d
def test_tab_trick(self):
payload = "GET / HTTP/1.1\n\r"
payload += "Host: %s\t\n\r" % self.input
d = self.sendPayload(payload)
d.addCallback(self.check_for_manipulation, payload)
return d
def test_subdomain_blocking(self):
payload = "GET / HTTP/1.1\n\r"
payload += "Host: %s\n\r" % randomStr(10) + '.' + self.input
d = self.sendPayload(payload)
d.addCallback(self.check_for_manipulation, payload)
return d
def test_fuzzy_domain_blocking(self):
hostname_field = randomStr(10) + '.' + self.input + '.' + randomStr(10)
payload = "GET / HTTP/1.1\n\r"
payload += "Host: %s\n\r" % hostname_field
d = self.sendPayload(payload)
d.addCallback(self.check_for_manipulation, payload)
return d
def test_fuzzy_match_blocking(self):
hostname_field = randomStr(10) + self.input + randomStr(10)
payload = "GET / HTTP/1.1\n\r"
payload += "Host: %s\n\r" % hostname_field
d = self.sendPayload(payload)
d.addCallback(self.check_for_manipulation, payload)
return d
def test_normal_request(self):
payload = "GET / HTTP/1.1\n\r"
payload += "Host: %s\n\r" % self.input
d = self.sendPayload(payload)
d.addCallback(self.check_for_manipulation, payload)
return d
ooniprobe-1.3.2/ooni/nettests/experimental/chinatrigger.py 0000644 0001750 0001750 00000007067 12502236531 022212 0 ustar irl irl import random
import string
import struct
import time
from twisted.python import usage
from ooni.templates.scapyt import BaseScapyTest
class UsageOptions(usage.Options):
optParameters = [['dst', 'd', None, 'Specify the target address'],
['port', 'p', None, 'Specify the target port']
]
class ChinaTriggerTest(BaseScapyTest):
"""
This test is a OONI based implementation of the C tool written
by Philipp Winter to engage chinese probes in active scanning.
Example of running it:
ooniprobe chinatrigger -d 127.0.0.1 -p 8080
"""
name = "chinatrigger"
usageOptions = UsageOptions
requiredOptions = ['dst', 'port']
timeout = 2
def setUp(self):
self.dst = self.localOptions['dst']
self.port = int(self.localOptions['port'])
@staticmethod
def set_random_servername(pkt):
ret = pkt[:121]
for i in range(16):
ret += random.choice(string.ascii_lowercase)
ret += pkt[121+16:]
return ret
@staticmethod
def set_random_time(pkt):
ret = pkt[:11]
ret += struct.pack('!I', int(time.time()))
ret += pkt[11+4:]
return ret
@staticmethod
def set_random_field(pkt):
ret = pkt[:15]
for i in range(28):
ret += chr(random.randint(0, 255))
ret += pkt[15+28:]
return ret
@staticmethod
def mutate(pkt, idx):
"""
Slightly changed mutate function.
"""
ret = pkt[:idx-1]
mutation = chr(random.randint(0, 255))
while mutation == pkt[idx]:
mutation = chr(random.randint(0, 255))
ret += mutation
ret += pkt[idx:]
return ret
@staticmethod
def set_all_random_fields(pkt):
pkt = ChinaTriggerTest.set_random_servername(pkt)
pkt = ChinaTriggerTest.set_random_time(pkt)
pkt = ChinaTriggerTest.set_random_field(pkt)
return pkt
def test_send_mutations(self):
from scapy.all import IP, TCP
pkt = "\x16\x03\x01\x00\xcc\x01\x00\x00\xc8"\
"\x03\x01\x4f\x12\xe5\x63\x3f\xef\x7d"\
"\x20\xb9\x94\xaa\x04\xb0\xc1\xd4\x8c"\
"\x50\xcd\xe2\xf9\x2f\xa9\xfb\x78\xca"\
"\x02\xa8\x73\xe7\x0e\xa8\xf9\x00\x00"\
"\x3a\xc0\x0a\xc0\x14\x00\x39\x00\x38"\
"\xc0\x0f\xc0\x05\x00\x35\xc0\x07\xc0"\
"\x09\xc0\x11\xc0\x13\x00\x33\x00\x32"\
"\xc0\x0c\xc0\x0e\xc0\x02\xc0\x04\x00"\
"\x04\x00\x05\x00\x2f\xc0\x08\xc0\x12"\
"\x00\x16\x00\x13\xc0\x0d\xc0\x03\xfe"\
"\xff\x00\x0a\x00\xff\x01\x00\x00\x65"\
"\x00\x00\x00\x1d\x00\x1b\x00\x00\x18"\
"\x77\x77\x77\x2e\x67\x6e\x6c\x69\x67"\
"\x78\x7a\x70\x79\x76\x6f\x35\x66\x76"\
"\x6b\x64\x2e\x63\x6f\x6d\x00\x0b\x00"\
"\x04\x03\x00\x01\x02\x00\x0a\x00\x34"\
"\x00\x32\x00\x01\x00\x02\x00\x03\x00"\
"\x04\x00\x05\x00\x06\x00\x07\x00\x08"\
"\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00"\
"\x0d\x00\x0e\x00\x0f\x00\x10\x00\x11"\
"\x00\x12\x00\x13\x00\x14\x00\x15\x00"\
"\x16\x00\x17\x00\x18\x00\x19\x00\x23"\
"\x00\x00"
pkt = ChinaTriggerTest.set_all_random_fields(pkt)
pkts = [IP(dst=self.dst)/TCP(dport=self.port)/pkt]
for x in range(len(pkt)):
mutation = IP(dst=self.dst)/TCP(dport=self.port)/ChinaTriggerTest.mutate(pkt, x)
pkts.append(mutation)
return self.sr(pkts, timeout=2)
ooniprobe-1.3.2/ooni/managers.py 0000644 0001750 0001750 00000014050 12447563404 014773 0 ustar irl irl import itertools
from ooni.utils import log
from ooni.settings import config
def makeIterable(item):
"""
Takes as argument or an iterable and if it's not an iterable object then it
will return a listiterator.
"""
try:
iterable = iter(item)
except TypeError:
iterable = iter([item])
return iterable
class TaskManager(object):
retries = 2
concurrency = 10
def __init__(self):
self._tasks = iter(())
self._active_tasks = []
self.failures = 0
def _failed(self, failure, task):
"""
The has failed to complete, we append it to the end of the task chain
to be re-run once all the currently scheduled tasks have run.
"""
log.debug("Task %s has failed %s times" % (task, task.failures))
if config.advanced.debug:
log.exception(failure)
self._active_tasks.remove(task)
self.failures = self.failures + 1
if task.failures <= self.retries:
log.debug("Rescheduling...")
self._tasks = itertools.chain(makeIterable(task), self._tasks)
else:
# This fires the errback when the task is done but has failed.
log.debug('Permanent failure for %s' % task)
task.done.errback(failure)
self._fillSlots()
self.failed(failure, task)
def _fillSlots(self):
"""
Called on test completion and schedules measurements to be run for the
available slots.
"""
for _ in range(self.availableSlots):
try:
task = self._tasks.next()
self._run(task)
except StopIteration:
break
except ValueError:
# XXX this is a workaround the race condition that leads the
# _tasks generator to throw the exception
# ValueError: generator already called.
continue
def _run(self, task):
"""
This gets called to add a task to the list of currently active and
running tasks.
"""
self._active_tasks.append(task)
d = task.start()
d.addCallback(self._succeeded, task)
d.addErrback(self._failed, task)
def _succeeded(self, result, task):
"""
We have successfully completed a measurement.
"""
self._active_tasks.remove(task)
# Fires the done deferred when the task has completed
task.done.callback(result)
self._fillSlots()
self.succeeded(result, task)
@property
def failedMeasurements(self):
return self.failures
@property
def availableSlots(self):
"""
Returns the number of available slots for running tests.
"""
return self.concurrency - len(self._active_tasks)
def schedule(self, task_or_task_iterator):
"""
Takes as argument a single task or a task iterable and appends it to
the task generator queue.
"""
log.debug("Starting this task %s" % repr(task_or_task_iterator))
iterable = makeIterable(task_or_task_iterator)
self._tasks = itertools.chain(self._tasks, iterable)
self._fillSlots()
def start(self):
"""
This is called to start the task manager.
"""
self.failures = 0
self._fillSlots()
def failed(self, failure, task):
"""
This hoook is called every time a task has failed.
The default failure handling logic is to reschedule the task up until
we reach the maximum number of retries.
"""
raise NotImplemented
def succeeded(self, result, task):
"""
This hook is called every time a task has been successfully executed.
"""
raise NotImplemented
class LinkedTaskManager(TaskManager):
def __init__(self):
super(LinkedTaskManager, self).__init__()
self.child = None
self.parent = None
@property
def availableSlots(self):
mySlots = self.concurrency - len(self._active_tasks)
if self.child:
s = self.child.availableSlots
return min(s, mySlots)
return mySlots
def _succeeded(self, result, task):
super(LinkedTaskManager, self)._succeeded(result, task)
if self.parent:
self.parent._fillSlots()
def _failed(self, result, task):
super(LinkedTaskManager, self)._failed(result, task)
if self.parent:
self.parent._fillSlots()
class MeasurementManager(LinkedTaskManager):
"""
This is the Measurement Tracker. In here we keep track of active
measurements and issue new measurements once the active ones have been
completed.
MeasurementTracker does not keep track of the typology of measurements that
it is running. It just considers a measurement something that has an input
and a method to be called.
NetTest on the contrary is aware of the typology of measurements that it is
dispatching as they are logically grouped by test file.
"""
def __init__(self):
if config.advanced.measurement_retries:
self.retries = config.advanced.measurement_retries
if config.advanced.measurement_concurrency:
self.concurrency = config.advanced.measurement_concurrency
super(MeasurementManager, self).__init__()
def succeeded(self, result, measurement):
log.debug("Successfully performed measurement %s" % measurement)
log.debug("%s" % result)
def failed(self, failure, measurement):
pass
class ReportEntryManager(LinkedTaskManager):
def __init__(self):
if config.advanced.reporting_retries:
self.retries = config.advanced.reporting_retries
if config.advanced.reporting_concurrency:
self.concurrency = config.advanced.reporting_concurrency
super(ReportEntryManager, self).__init__()
def succeeded(self, result, task):
log.debug("Successfully performed report %s" % task)
log.debug(str(result))
def failed(self, failure, task):
pass
ooniprobe-1.3.2/ooni/settings.py 0000644 0001750 0001750 00000020263 12531110612 015020 0 ustar irl irl import os
import yaml
import getpass
from ConfigParser import SafeConfigParser
from twisted.internet import defer, reactor
from twisted.internet.endpoints import TCP4ClientEndpoint
from os.path import abspath, expanduser
from ooni.utils.net import ConnectAndCloseProtocol, connectProtocol
from ooni import geoip
from ooni.utils import Storage, log, get_ooni_root
from ooni import errors
class OConfig(object):
_custom_home = None
def __init__(self):
self.current_user = getpass.getuser()
self.global_options = {}
self.reports = Storage()
self.scapyFactory = None
self.tor_state = None
# This is used to store the probes IP address obtained via Tor
self.probe_ip = geoip.ProbeIP()
self.logging = True
self.basic = Storage()
self.advanced = Storage()
self.tor = Storage()
self.privacy = Storage()
self.set_paths()
def embedded_settings(self, category, option):
embedded_settings = os.path.join(get_ooni_root(), 'settings.ini')
if os.path.isfile(embedded_settings):
settings = SafeConfigParser()
with open(embedded_settings) as fp:
settings.readfp(fp)
return settings.get(category, option)
return None
@property
def var_lib_path(self):
var_lib_path = self.embedded_settings("directories", "var_lib")
if var_lib_path:
return os.path.abspath(var_lib_path)
return "/var/lib/ooni"
@property
def usr_share_path(self):
usr_share_path = self.embedded_settings("directories", "usr_share")
if usr_share_path:
return os.path.abspath(usr_share_path)
return "/usr/share/ooni"
@property
def data_directory_candidates(self):
dirs = [
os.path.join(expanduser('~'+self.current_user), '.ooni'),
self.var_lib_path,
self.usr_share_path,
os.path.join(get_ooni_root(), '..', 'data'),
'/usr/share/'
]
if os.getenv("OONI_DATA_DIR"):
dirs.insert(0, os.getenv("OONI_DATA_DIR"))
if self.global_options.get('datadir'):
dirs.insert(0, abspath(expanduser(self.global_options['datadir'])))
return dirs
@property
def data_directory(self):
for target_dir in self.data_directory_candidates:
if os.path.isdir(target_dir):
return target_dir
return self.var_lib_path
@property
def ooni_home(self):
home = expanduser('~'+self.current_user)
if os.getenv("HOME"):
home = os.getenv("HOME")
if self._custom_home:
return self._custom_home
else:
return os.path.join(home, '.ooni')
def get_data_file_path(self, file_name):
for target_dir in self.data_directory_candidates:
file_path = os.path.join(target_dir, file_name)
if os.path.isfile(file_path):
return file_path
def set_paths(self):
self.nettest_directory = os.path.join(get_ooni_root(), 'nettests')
if self.advanced.inputs_dir:
self.inputs_directory = self.advanced.inputs_dir
else:
self.inputs_directory = os.path.join(self.ooni_home, 'inputs')
if self.advanced.decks_dir:
self.decks_directory = self.advanced.decks_dir
else:
self.decks_directory = os.path.join(self.ooni_home, 'decks')
self.reports_directory = os.path.join(self.ooni_home, 'reports')
self.resources_directory = os.path.join(self.data_directory,
"resources")
if self.advanced.report_log_file:
self.report_log_file = self.advanced.report_log_file
else:
self.report_log_file = os.path.join(self.ooni_home,
'reporting.yml')
if self.global_options.get('configfile'):
config_file = self.global_options['configfile']
self.config_file = expanduser(config_file)
else:
self.config_file = os.path.join(self.ooni_home, 'ooniprobe.conf')
if 'logfile' in self.basic:
self.basic.logfile = expanduser(self.basic.logfile.replace(
'~', '~'+self.current_user))
def initialize_ooni_home(self, custom_home=None):
if custom_home:
self._custom_home = custom_home
self.set_paths()
if not os.path.isdir(self.ooni_home):
print "Ooni home directory does not exist."
print "Creating it in '%s'." % self.ooni_home
os.mkdir(self.ooni_home)
os.mkdir(self.inputs_directory)
os.mkdir(self.decks_directory)
def _create_config_file(self):
target_config_file = self.config_file
print "Creating it for you in '%s'." % target_config_file
sample_config_file = self.get_data_file_path('ooniprobe.conf.sample')
with open(sample_config_file) as f:
with open(target_config_file, 'w+') as w:
for line in f:
if line.startswith(' logfile: '):
w.write(' logfile: %s\n' % (
os.path.join(self.ooni_home, 'ooniprobe.log'))
)
else:
w.write(line)
def read_config_file(self, check_incoherences=False):
if not os.path.isfile(self.config_file):
print "Configuration file does not exist."
self._create_config_file()
self.read_config_file()
with open(self.config_file) as f:
config_file_contents = '\n'.join(f.readlines())
configuration = yaml.safe_load(config_file_contents)
for setting in configuration.keys():
if setting in dir(self) and configuration[setting] is not None:
for k, v in configuration[setting].items():
getattr(self, setting)[k] = v
self.set_paths()
if check_incoherences:
self.check_incoherences(configuration)
def check_incoherences(self, configuration):
incoherent = []
if configuration['advanced']['interface'] != 'auto':
from scapy.all import get_if_list
if configuration['advanced']['interface'] not in get_if_list():
incoherent.append('advanced:interface')
self.log_incoherences(incoherent)
def log_incoherences(self, incoherences):
if len(incoherences) > 0:
if len(incoherences) > 1:
incoherent_pretty = ", ".join(incoherences[:-1]) + ' and ' + incoherences[-1]
else:
incoherent_pretty = incoherences[0]
log.err("You must set properly %s in %s." % (incoherent_pretty, self.config_file))
raise errors.ConfigFileIncoherent
@defer.inlineCallbacks
def check_tor(self):
"""
Called only when we must start tor by director.start
"""
incoherent = []
if not self.advanced.start_tor:
if self.tor.socks_port is None:
incoherent.append('tor:socks_port')
else:
socks_port_ep = TCP4ClientEndpoint(reactor,
"localhost",
self.tor.socks_port)
try:
yield connectProtocol(socks_port_ep, ConnectAndCloseProtocol())
except Exception:
incoherent.append('tor:socks_port')
if self.tor.control_port is not None:
control_port_ep = TCP4ClientEndpoint(reactor,
"localhost",
self.tor.control_port)
try:
yield connectProtocol(control_port_ep, ConnectAndCloseProtocol())
except Exception:
incoherent.append('tor:control_port')
self.log_incoherences(incoherent)
config = OConfig()
if not os.path.isfile(config.config_file) \
and os.path.isfile('/etc/ooniprobe.conf'):
config.global_options['configfile'] = '/etc/ooniprobe.conf'
config.set_paths()
ooniprobe-1.3.2/ooni/utils/ 0000755 0001750 0001750 00000000000 12623630152 013753 5 ustar irl irl ooniprobe-1.3.2/ooni/utils/net.py 0000644 0001750 0001750 00000012615 12533065720 015123 0 ustar irl irl import sys
import socket
from random import randint
from zope.interface import implements
from twisted.internet import protocol, defer
from twisted.web.iweb import IBodyProducer
from scapy.config import conf
from ooni.errors import IfaceError
try:
from twisted.internet.endpoints import connectProtocol
except ImportError:
def connectProtocol(endpoint, protocol):
class OneShotFactory(protocol.Factory):
def buildProtocol(self, addr):
return protocol
return endpoint.connect(OneShotFactory())
# if sys.platform.system() == 'Windows':
# import _winreg as winreg
# These user agents are taken from the "How Unique Is Your Web Browser?"
# (https://panopticlick.eff.org/browser-uniqueness.pdf) paper as the browser user
# agents with largest anonymity set.
userAgents = ("Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.7) Gecko/20091221 Firefox/3.5.7",
"Mozilla/5.0 (iPhone; U; CPU iPhone OS 3 1 2 like Mac OS X; en-us)"
"AppleWebKit/528.18 (KHTML, like Gecko) Mobile/7D11",
"Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2) Gecko/20100115 Firefox/3.6",
"Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2) Gecko/20100115 Firefox/3.6",
"Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2) Gecko/20100115 Firefox/3.6",
"Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.2) Gecko/20100115 Firefox/3.6",
"Mozilla/5.0 (Windows; U; Windows NT 6.1; de; rv:1.9.2) Gecko/20100115 Firefox/3.6",
"Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.2) Gecko/20100115 Firefox/3.6",
"Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.1.7) Gecko/20091221 Firefox/3.5.7",
"Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.1.7) "
"Gecko/20091221 Firefox/3.5.7 (.NET CLR 3.5.30729)")
PLATFORMS = {'LINUX': sys.platform.startswith("linux"),
'OPENBSD': sys.platform.startswith("openbsd"),
'FREEBSD': sys.platform.startswith("freebsd"),
'NETBSD': sys.platform.startswith("netbsd"),
'DARWIN': sys.platform.startswith("darwin"),
'SOLARIS': sys.platform.startswith("sunos"),
'WINDOWS': sys.platform.startswith("win32")}
class StringProducer(object):
implements(IBodyProducer)
def __init__(self, body):
self.body = body
self.length = len(body)
def startProducing(self, consumer):
consumer.write(self.body)
return defer.succeed(None)
def pauseProducing(self):
pass
def stopProducing(self):
pass
class BodyReceiver(protocol.Protocol):
def __init__(self, finished, content_length=None, body_processor=None):
self.finished = finished
self.data = ""
self.bytes_remaining = content_length
self.body_processor = body_processor
def dataReceived(self, b):
self.data += b
if self.bytes_remaining:
if self.bytes_remaining == 0:
self.connectionLost(None)
else:
self.bytes_remaining -= len(b)
def connectionLost(self, reason):
try:
if self.body_processor:
self.data = self.body_processor(self.data)
self.finished.callback(self.data)
except Exception as exc:
self.finished.errback(exc)
class Downloader(protocol.Protocol):
def __init__(self, download_path,
finished, content_length=None):
self.finished = finished
self.bytes_remaining = content_length
self.fp = open(download_path, 'w+')
def dataReceived(self, b):
self.fp.write(b)
if self.bytes_remaining:
if self.bytes_remaining == 0:
self.connectionLost(None)
else:
self.bytes_remaining -= len(b)
def connectionLost(self, reason):
self.fp.flush()
self.fp.close()
self.finished.callback(None)
class ConnectAndCloseProtocol(protocol.Protocol):
def connectionMade(self):
self.transport.loseConnection()
def randomFreePort(addr="127.0.0.1"):
"""
Args:
addr (str): the IP address to attempt to bind to.
Returns an int representing the free port number at the moment of calling
Note: there is no guarantee that some other application will attempt to
bind to this port once this function has been called.
"""
free = False
while not free:
port = randint(1024, 65535)
s = socket.socket()
try:
s.bind((addr, port))
free = True
except:
pass
s.close()
return port
def getDefaultIface():
""" Return the default interface or raise IfaceError """
iface = conf.route.route('0.0.0.0', verbose=0)[0]
if len(iface) > 0:
return iface
raise IfaceError
def getAddresses():
from scapy.all import get_if_addr, get_if_list
from ipaddr import IPAddress
addresses = set()
for i in get_if_list():
try:
addresses.add(get_if_addr(i))
except:
pass
if '0.0.0.0' in addresses:
addresses.remove('0.0.0.0')
return [IPAddress(addr) for addr in addresses]
def hasRawSocketPermission():
try:
socket.socket(socket.AF_INET, socket.SOCK_RAW, socket.IPPROTO_RAW)
return True
except socket.error:
return False ooniprobe-1.3.2/ooni/utils/trueheaders.py 0000644 0001750 0001750 00000013402 12623613431 016641 0 ustar irl irl # :authors: Giovanni Pellerano
# :licence: see LICENSE
#
# Here we make sure that the HTTP Headers sent and received are True. By this
# we mean that they are not normalized and that the ordering is maintained.
import itertools
from copy import copy
from twisted.web import client, _newclient, http_headers
from twisted.web._newclient import RequestNotSent, RequestGenerationFailed
from twisted.web._newclient import TransportProxyProducer, STATUS
from twisted.internet import reactor
from twisted.internet.defer import Deferred, fail, maybeDeferred, failure
from txsocksx.http import SOCKS5Agent
from txsocksx.client import SOCKS5ClientFactory
SOCKS5ClientFactory.noisy = False
from ooni.utils import log
import twisted
from twisted.python.versions import Version
class TrueHeaders(http_headers.Headers):
def __init__(self, rawHeaders=None):
self._rawHeaders = dict()
if rawHeaders is not None:
for name, values in rawHeaders.iteritems():
if type(values) is list:
self.setRawHeaders(name, values[:])
elif type(values) is dict:
self._rawHeaders[name.lower()] = values
elif type(values) is str:
self.setRawHeaders(name, values)
def setRawHeaders(self, name, values):
if name.lower() not in self._rawHeaders:
self._rawHeaders[name.lower()] = dict()
self._rawHeaders[name.lower()]['name'] = name
self._rawHeaders[name.lower()]['values'] = values
def getDiff(self, headers, ignore=[]):
"""
Args:
headers: a TrueHeaders object
ignore: specify a list of header fields to ignore
Returns:
a set containing the header names that are not present in
header_dict or not present in self.
"""
diff = set()
field_names = []
headers_a = copy(self)
headers_b = copy(headers)
for name in ignore:
try:
del headers_a._rawHeaders[name.lower()]
except KeyError:
pass
try:
del headers_b._rawHeaders[name.lower()]
except KeyError:
pass
for k, v in itertools.chain(headers_a.getAllRawHeaders(),
headers_b.getAllRawHeaders()):
field_names.append(k)
for name in field_names:
if self.getRawHeaders(name) and headers.getRawHeaders(name):
pass
else:
diff.add(name)
return diff
def getAllRawHeaders(self):
for k, v in self._rawHeaders.iteritems():
yield v['name'], v['values']
def getRawHeaders(self, name, default=None):
if name.lower() in self._rawHeaders:
return self._rawHeaders[name.lower()]['values']
return default
class HTTPClientParser(_newclient.HTTPClientParser):
def logPrefix(self):
return 'HTTPClientParser'
def connectionMade(self):
self.headers = TrueHeaders()
self.connHeaders = TrueHeaders()
self.state = STATUS
self._partialHeader = None
def headerReceived(self, name, value):
if self.isConnectionControlHeader(name):
headers = self.connHeaders
else:
headers = self.headers
headers.addRawHeader(name, value)
class HTTP11ClientProtocol(_newclient.HTTP11ClientProtocol):
def request(self, request):
if self._state != 'QUIESCENT':
return fail(RequestNotSent())
self._state = 'TRANSMITTING'
_requestDeferred = maybeDeferred(request.writeTo, self.transport)
self._finishedRequest = Deferred()
self._currentRequest = request
self._transportProxy = TransportProxyProducer(self.transport)
self._parser = HTTPClientParser(request, self._finishResponse)
self._parser.makeConnection(self._transportProxy)
self._responseDeferred = self._parser._responseDeferred
def cbRequestWrotten(ignored):
if self._state == 'TRANSMITTING':
self._state = 'WAITING'
self._responseDeferred.chainDeferred(self._finishedRequest)
def ebRequestWriting(err):
if self._state == 'TRANSMITTING':
self._state = 'GENERATION_FAILED'
self.transport.loseConnection()
self._finishedRequest.errback(
failure.Failure(RequestGenerationFailed([err])))
else:
log.err(err, 'Error writing request, but not in valid state '
'to finalize request: %s' % self._state)
_requestDeferred.addCallbacks(cbRequestWrotten, ebRequestWriting)
return self._finishedRequest
class _HTTP11ClientFactory(client._HTTP11ClientFactory):
noisy = False
def buildProtocol(self, addr):
return HTTP11ClientProtocol(self._quiescentCallback)
class HTTPConnectionPool(client.HTTPConnectionPool):
_factory = _HTTP11ClientFactory
class TrueHeadersAgent(client.Agent):
def __init__(self, *args, **kw):
super(TrueHeadersAgent, self).__init__(*args, **kw)
self._pool = HTTPConnectionPool(reactor, False)
_twisted_15_0 = Version('twisted', 15, 0, 0)
class TrueHeadersSOCKS5Agent(SOCKS5Agent):
def __init__(self, *args, **kw):
super(TrueHeadersSOCKS5Agent, self).__init__(*args, **kw)
pool = HTTPConnectionPool(reactor, False)
#
# With Twisted > 15.0 txsocksx wraps the twisted agent using a
# wrapper class, hence we must set the _pool attribute in the
# inner class rather than into its external wrapper.
#
if twisted.version >= _twisted_15_0:
self._wrappedAgent._pool = pool
else:
self._pool = pool
ooniprobe-1.3.2/ooni/utils/onion.py 0000644 0001750 0001750 00000007106 12531110611 015442 0 ustar irl irl import string
import subprocess
from distutils.spawn import find_executable
from distutils.version import LooseVersion
from txtorcon.util import find_tor_binary as tx_find_tor_binary
from ooni.settings import config
class TorVersion(LooseVersion):
pass
class OBFSProxyVersion(LooseVersion):
pass
def find_tor_binary():
if config.advanced.tor_binary:
return config.advanced.tor_binary
return tx_find_tor_binary()
def executable_version(binary, strip=lambda x: x):
if not binary:
return None
try:
proc = subprocess.Popen((binary, '--version'),
stdout=subprocess.PIPE, stderr=subprocess.PIPE)
except OSError:
pass
else:
stdout, _ = proc.communicate()
if proc.poll() == 0 and stdout != '':
version = stdout.strip()
return LooseVersion(strip(version))
return None
def tor_version():
version = executable_version(find_tor_binary(),
lambda x: x.split(' ')[2])
return TorVersion(str(version))
def obfsproxy_version():
version = executable_version(find_executable('obfsproxy'))
return OBFSProxyVersion(str(version))
def transport_name(address):
"""
If the address of the bridge starts with a valid c identifier then
we consider it to be a bridge.
Returns:
The transport_name if it's a transport.
None if it's not a obfsproxy bridge.
"""
transport_name = address.split(' ')[0]
transport_name_chars = string.ascii_letters + string.digits
if all(c in transport_name_chars for c in transport_name):
return transport_name
else:
return None
tor_details = {
'binary': find_tor_binary(),
'version': tor_version()
}
obfsproxy_details = {
'binary': find_executable('obfsproxy'),
'version': obfsproxy_version()
}
transport_bin_name = { 'fte': 'fteproxy',
'scramblesuit': 'obfsproxy',
'obfs2': 'obfsproxy',
'obfs3': 'obfsproxy',
'obfs4': 'obfs4proxy' }
_pyobfsproxy_line = lambda transport, bin_loc, log_file: \
"%s exec %s --log-min-severity info --log-file %s managed" % \
(transport, bin_loc, log_file)
_transport_line_templates = {
'fte': lambda bin_loc, log_file : \
"fte exec %s --managed" % bin_loc,
'scramblesuit': lambda bin_loc, log_file: \
_pyobfsproxy_line('scramblesuit', bin_loc, log_file),
'obfs2': lambda bin_loc, log_file: \
_pyobfsproxy_line('obfs2', bin_loc, log_file),
'obfs3': lambda bin_loc, log_file: \
_pyobfsproxy_line('obfs3', bin_loc, log_file),
'obfs4': lambda bin_loc, log_file: \
"obfs4 exec %s --enableLogging=true --logLevel=INFO" % bin_loc }
class UnrecognizedTransport(Exception):
pass
class UninstalledTransport(Exception):
pass
class OutdatedObfsproxy(Exception):
pass
class OutdatedTor(Exception):
pass
def bridge_line(transport, log_file):
bin_name = transport_bin_name.get(transport)
if not bin_name:
raise UnrecognizedTransport
bin_loc = find_executable(bin_name)
if not bin_loc:
raise UninstalledTransport
if OBFSProxyVersion('0.2') > obfsproxy_details['version']:
raise OutdatedObfsproxy
if (transport == 'scramblesuit' or \
bin_name == 'obfs4proxy') and \
TorVersion('0.2.5.1') > tor_details['version']:
raise OutdatedTor
if TorVersion('0.2.4.1') > tor_details['version']:
raise OutdatedTor
return _transport_line_templates[transport](bin_loc, log_file)
ooniprobe-1.3.2/ooni/utils/__init__.py 0000644 0001750 0001750 00000011275 12463144534 016100 0 ustar irl irl import shutil
import string
import random
import glob
import os
import gzip
from zipfile import ZipFile
from ooni import otime
from ooni import errors
class Storage(dict):
"""
A Storage object is like a dictionary except `obj.foo` can be used
in addition to `obj['foo']`.
>>> o = Storage(a=1)
>>> o.a
1
>>> o['a']
1
>>> o.a = 2
>>> o['a']
2
>>> del o.a
>>> o.a
None
"""
def __getattr__(self, key):
try:
return self[key]
except KeyError:
return None
def __setattr__(self, key, value):
self[key] = value
def __delattr__(self, key):
try:
del self[key]
except KeyError, k:
raise AttributeError(k)
def __repr__(self):
return ''
def __getstate__(self):
return dict(self)
def __setstate__(self, value):
for (k, v) in value.items():
self[k] = v
def checkForRoot():
if os.getuid() != 0:
raise errors.InsufficientPrivileges
def randomSTR(length, num=True):
"""
Returns a random all uppercase alfa-numerical (if num True) string long length
"""
chars = string.ascii_uppercase
if num:
chars += string.digits
return ''.join(random.choice(chars) for x in range(length))
def randomstr(length, num=True):
"""
Returns a random all lowercase alfa-numerical (if num True) string long length
"""
chars = string.ascii_lowercase
if num:
chars += string.digits
return ''.join(random.choice(chars) for x in range(length))
def randomStr(length, num=True):
"""
Returns a random a mixed lowercase, uppercase, alfanumerical (if num True)
string long length
"""
chars = string.ascii_lowercase + string.ascii_uppercase
if num:
chars += string.digits
return ''.join(random.choice(chars) for x in range(length))
def pushFilenameStack(filename):
"""
Takes as input a target filename and checks to see if a file by such name
already exists. If it does exist then it will attempt to rename it to .1,
if .1 exists it will rename .1 to .2 if .2 exists then it will rename it to
.3, etc.
This is similar to pushing into a LIFO stack.
Args:
filename (str): the path to filename that you wish to create.
"""
stack = glob.glob(filename + ".*")
stack.sort(key=lambda x: int(x.split('.')[-1]))
for f in reversed(stack):
c_idx = f.split(".")[-1]
c_filename = '.'.join(f.split(".")[:-1])
new_idx = int(c_idx) + 1
new_filename = "%s.%s" % (c_filename, new_idx)
os.rename(f, new_filename)
os.rename(filename, filename + ".1")
def generate_filename(testDetails, prefix=None, extension=None, filename=None):
"""
Returns a filename for every test execution.
It's used to assure that all files of a certain test have a common basename but different
extension.
"""
if filename is None:
test_name, start_time = testDetails['test_name'], testDetails['start_time']
start_time = otime.epochToTimestamp(start_time)
suffix = "%s-%s" % (test_name, start_time)
basename = '%s-%s' % (prefix, suffix) if prefix is not None else suffix
final_filename = '%s.%s' % (basename, extension) if extension is not None else basename
else:
if extension is not None:
basename = filename.split('.')[0] if '.' in filename else filename
final_filename = '%s.%s' % (basename, extension)
else:
final_filename = filename
return final_filename
def sanitize_options(options):
"""
Strips all possible user identifying information from the ooniprobe test
options.
Currently only strips leading directories from filepaths.
"""
sanitized_options = []
for option in options:
option = os.path.basename(option)
sanitized_options.append(option)
return sanitized_options
def unzip(filename, dst):
assert filename.endswith('.zip')
dst_path = os.path.join(
dst,
os.path.basename(filename).replace(".zip", "")
)
with open(filename) as zfp:
zip_file = ZipFile(zfp)
zip_file.extractall(dst_path)
return dst_path
def gunzip(filename, dst):
assert filename.endswith(".gz")
dst_path = os.path.join(
dst,
os.path.basename(filename).replace(".gz", "")
)
with open(dst_path, "w+") as fw:
gzip_file = gzip.open(filename)
shutil.copyfileobj(gzip_file, fw)
gzip_file.close()
def get_ooni_root():
script = os.path.join(__file__, '..')
return os.path.dirname(os.path.realpath(script))
ooniprobe-1.3.2/ooni/utils/log.py 0000644 0001750 0001750 00000006746 12447563404 015134 0 ustar irl irl import os
import sys
import codecs
import logging
import traceback
from twisted.python import log as txlog
from twisted.python import util
from twisted.python.failure import Failure
from twisted.python.logfile import DailyLogFile
from ooni import otime
# Get rid of the annoying "No route found for
# IPv6 destination warnings":
logging.getLogger("scapy.runtime").setLevel(logging.ERROR)
def log_encode(logmsg):
"""
I encode logmsg (a str or unicode) as printable ASCII. Each case
gets a distinct prefix, so that people differentiate a unicode
from a utf-8-encoded-byte-string or binary gunk that would
otherwise result in the same final output.
"""
if isinstance(logmsg, unicode):
return codecs.encode(logmsg, 'unicode_escape')
elif isinstance(logmsg, str):
try:
unicodelogmsg = logmsg.decode('utf-8')
except UnicodeDecodeError:
return codecs.encode(logmsg, 'string_escape')
else:
return codecs.encode(unicodelogmsg, 'unicode_escape')
else:
raise Exception("I accept only a unicode object or a string, "
"not a %s object like %r" % (type(logmsg),
repr(logmsg)))
class LogWithNoPrefix(txlog.FileLogObserver):
def emit(self, eventDict):
text = txlog.textFromEventDict(eventDict)
if text is None:
return
util.untilConcludes(self.write, "%s\n" % text)
util.untilConcludes(self.flush) # Hoorj!
class OONILogger(object):
def start(self, logfile=None, application_name="ooniprobe"):
from ooni.settings import config
daily_logfile = None
if not logfile:
logfile = os.path.expanduser(config.basic.logfile)
log_folder = os.path.dirname(logfile)
log_filename = os.path.basename(logfile)
daily_logfile = DailyLogFile(log_filename, log_folder)
txlog.msg("Starting %s on %s (%s UTC)" % (application_name,
otime.prettyDateNow(),
otime.utcPrettyDateNow()))
self.fileObserver = txlog.FileLogObserver(daily_logfile)
self.stdoutObserver = LogWithNoPrefix(sys.stdout)
txlog.startLoggingWithObserver(self.stdoutObserver.emit)
txlog.addObserver(self.fileObserver.emit)
def stop(self):
self.stdoutObserver.stop()
self.fileObserver.stop()
oonilogger = OONILogger()
def start(logfile=None, application_name="ooniprobe"):
oonilogger.start(logfile, application_name)
def stop():
oonilogger.stop()
def msg(msg, *arg, **kw):
from ooni.settings import config
if config.logging:
print "%s" % log_encode(msg)
def debug(msg, *arg, **kw):
from ooni.settings import config
if config.advanced.debug and config.logging:
print "[D] %s" % log_encode(msg)
def err(msg, *arg, **kw):
from ooni.settings import config
if config.logging:
if isinstance(msg, Exception):
msg = "%s: %s" % (msg.__class__.__name__, msg)
print "[!] %s" % log_encode(msg)
def exception(error):
"""
Error can either be an error message to print to stdout and to the logfile
or it can be a twisted.python.failure.Failure instance.
"""
if isinstance(error, Failure):
error.printTraceback()
else:
exc_type, exc_value, exc_traceback = sys.exc_info()
traceback.print_exception(exc_type, exc_value, exc_traceback)
ooniprobe-1.3.2/ooni/utils/txscapy.py 0000644 0001750 0001750 00000037066 12623613431 016035 0 ustar irl irl import sys
import time
import random
from twisted.internet import fdesc
from twisted.internet import reactor
from twisted.internet import defer, abstract
from scapy.config import conf
from scapy.all import RandShort, IP, IPerror, ICMP, ICMPerror, TCP, TCPerror, UDP, UDPerror
from ooni.errors import ProtocolNotRegistered, ProtocolAlreadyRegistered, LibraryNotInstalledError
from ooni.utils import log
from ooni.utils.net import getDefaultIface, getAddresses
from ooni.settings import config
def pcapdnet_installed():
"""
Checks to see if libdnet or libpcap are installed and set the according
variables.
Returns:
True
if pypcap and libdnet are installed
False
if one of the two is absent
"""
# In debian libdnet is called dumbnet instead of dnet, but scapy is
# expecting "dnet" so we try and import it under such name.
try:
import dumbnet
sys.modules['dnet'] = dumbnet
except ImportError:
pass
try:
conf.use_pcap = True
conf.use_dnet = True
from scapy.arch import pcapdnet
config.pcap_dnet = True
except ImportError as e:
log.err(e.message + ". Pypcap or dnet are not properly installed. Certain tests may not work.")
config.pcap_dnet = False
conf.use_pcap = False
conf.use_dnet = False
# This is required for unix systems that are different than linux (OSX for
# example) since scapy explicitly wants pcap and libdnet installed for it
# to work.
try:
from scapy.arch import pcapdnet
except ImportError:
log.err("Your platform requires having libdnet and libpcap installed.")
raise LibraryNotInstalledError
return config.pcap_dnet
if pcapdnet_installed():
from scapy.all import PcapWriter
else:
class DummyPcapWriter:
def __init__(self, pcap_filename, *arg, **kw):
log.err("Initializing DummyPcapWriter. We will not actually write to a pcapfile")
@staticmethod
def write(self):
pass
PcapWriter = DummyPcapWriter
from scapy.all import Gen, SetGen, MTU
class ScapyFactory(abstract.FileDescriptor):
"""
Inspired by muxTCP scapyLink:
https://github.com/enki/muXTCP/blob/master/scapyLink.py
"""
def __init__(self, interface, super_socket=None, timeout=5):
abstract.FileDescriptor.__init__(self, reactor)
if interface == 'auto':
interface = getDefaultIface()
if not super_socket and sys.platform == 'darwin':
super_socket = conf.L3socket(iface=interface, promisc=True, filter='')
elif not super_socket:
super_socket = conf.L3socket(iface=interface)
self.protocols = []
fdesc._setCloseOnExec(super_socket.ins.fileno())
self.super_socket = super_socket
def writeSomeData(self, data):
"""
XXX we actually want to use this, but this requires overriding doWrite
or writeSequence.
"""
pass
def send(self, packet):
"""
Write a scapy packet to the wire.
"""
return self.super_socket.send(packet)
def fileno(self):
return self.super_socket.ins.fileno()
def doRead(self):
packet = self.super_socket.recv(MTU)
if packet:
for protocol in self.protocols:
protocol.packetReceived(packet)
def registerProtocol(self, protocol):
if not self.connected:
self.startReading()
if protocol not in self.protocols:
protocol.factory = self
self.protocols.append(protocol)
else:
raise ProtocolAlreadyRegistered
def unRegisterProtocol(self, protocol):
if protocol in self.protocols:
self.protocols.remove(protocol)
if len(self.protocols) == 0:
self.loseConnection()
else:
raise ProtocolNotRegistered
class ScapyProtocol(object):
factory = None
def packetReceived(self, packet):
"""
When you register a protocol, this method will be called with argument
the packet it received.
Every protocol that is registered will have this method called.
"""
raise NotImplementedError
class ScapySender(ScapyProtocol):
timeout = 5
# This deferred will fire when we have finished sending a receiving packets.
# Should we look for multiple answers for the same sent packet?
multi = False
# When 0 we stop when all the packets we have sent have received an
# answer
expected_answers = 0
def processPacket(self, packet):
"""
Hook useful for processing packets as they come in.
"""
def processAnswer(self, packet, answer_hr):
log.debug("Got a packet from %s" % packet.src)
log.debug("%s" % self.__hash__)
for i in range(len(answer_hr)):
if packet.answers(answer_hr[i]):
self.answered_packets.append((answer_hr[i], packet))
if not self.multi:
del (answer_hr[i])
break
if len(self.answered_packets) == len(self.sent_packets):
log.debug("All of our questions have been answered.")
self.stopSending()
return
if self.expected_answers and self.expected_answers == len(self.answered_packets):
log.debug("Got the number of expected answers")
self.stopSending()
def packetReceived(self, packet):
if self.timeout and time.time() - self._start_time > self.timeout:
self.stopSending()
if packet:
self.processPacket(packet)
# A string that has the same value for the request than for the
# response.
hr = packet.hashret()
if hr in self.hr_sent_packets:
answer_hr = self.hr_sent_packets[hr]
self.processAnswer(packet, answer_hr)
def stopSending(self):
result = (self.answered_packets, self.sent_packets)
self.d.callback(result)
self.factory.unRegisterProtocol(self)
def sendPackets(self, packets):
if not isinstance(packets, Gen):
packets = SetGen(packets)
for packet in packets:
hashret = packet.hashret()
if hashret in self.hr_sent_packets:
self.hr_sent_packets[hashret].append(packet)
else:
self.hr_sent_packets[hashret] = [packet]
self.sent_packets.append(packet)
self.factory.send(packet)
def startSending(self, packets):
# This dict is used to store the unique hashes that allow scapy to
# match up request with answer
self.hr_sent_packets = {}
# These are the packets we have received as answer to the ones we sent
self.answered_packets = []
# These are the packets we send
self.sent_packets = []
self._start_time = time.time()
self.d = defer.Deferred()
self.sendPackets(packets)
return self.d
class ScapySniffer(ScapyProtocol):
def __init__(self, pcap_filename, *arg, **kw):
self.pcapwriter = PcapWriter(pcap_filename, *arg, **kw)
def packetReceived(self, packet):
self.pcapwriter.write(packet)
def close(self):
self.pcapwriter.close()
class ParasiticTraceroute(ScapyProtocol):
def __init__(self):
self.numHosts = 7
self.rate = 15
self.hosts = {}
self.ttl_max = 15
self.ttl_min = 1
self.sent_packets = []
self.received_packets = []
self.matched_packets = {}
self.addresses = [str(x) for x in getAddresses()]
def sendPacket(self, packet):
self.factory.send(packet)
self.sent_packets.append(packet)
log.debug("Sent packet to %s with ttl %d" % (packet.dst, packet.ttl))
def packetReceived(self, packet):
try:
packet[IP]
except IndexError:
return
# Add TTL Expired responses.
if isinstance(packet.getlayer(3), TCPerror):
self.received_packets.append(packet)
# Live traceroute?
log.debug("%s replied with icmp-ttl-exceeded for %s" % (packet.src, packet[IPerror].dst))
return
elif packet.dst in self.hosts:
if random.randint(1, 100) > self.rate:
# Don't send a packet this time
return
try:
packet[IP].ttl = self.hosts[packet.dst]['ttl'].pop()
del packet.chksum # XXX Why is this incorrect?
self.sendPacket(packet)
k = (packet.id, packet[TCP].sport, packet[TCP].dport, packet[TCP].seq)
self.matched_packets[k] = {'ttl': packet.ttl}
return
except IndexError:
return
def maxttl(packet=None):
if packet:
return min(self.ttl_max, *map(lambda x: x - packet.ttl, [64, 128, 256])) - 1
else:
return self.ttl_max
def genttl(packet=None):
ttl = range(self.ttl_min, maxttl(packet))
random.shuffle(ttl)
return ttl
if len(self.hosts) < self.numHosts:
if packet.dst not in self.hosts \
and packet.dst not in self.addresses \
and isinstance(packet.getlayer(1), TCP):
self.hosts[packet.dst] = {'ttl': genttl()}
log.debug("Tracing to %s" % packet.dst)
return
if packet.src not in self.hosts \
and packet.src not in self.addresses \
and isinstance(packet.getlayer(1), TCP):
self.hosts[packet.src] = {'ttl': genttl(packet),
'ttl_max': maxttl(packet)}
log.debug("Tracing to %s" % packet.src)
return
if packet.src in self.hosts and not 'ttl_max' in self.hosts[packet.src]:
self.hosts[packet.src]['ttl_max'] = ttl_max = maxttl(packet)
log.debug("set ttl_max to %d for host %s" % (ttl_max, packet.src))
ttl = []
for t in self.hosts[packet.src]['ttl']:
if t < ttl_max:
ttl.append(t)
self.hosts[packet.src]['ttl'] = ttl
return
def stopListening(self):
self.factory.unRegisterProtocol(self)
class MPTraceroute(ScapyProtocol):
dst_ports = [0, 22, 23, 53, 80, 123, 443, 8080, 65535]
ttl_min = 1
ttl_max = 30
def __init__(self):
self.sent_packets = []
self._recvbuf = []
self.received_packets = {}
self.matched_packets = {}
self.hosts = []
self.interval = 0.2
self.timeout = ((self.ttl_max - self.ttl_min) * len(self.dst_ports) * self.interval) + 5
self.numPackets = 1
def ICMPTraceroute(self, host):
if host not in self.hosts:
self.hosts.append(host)
d = defer.Deferred()
reactor.callLater(self.timeout, d.callback, self)
self.sendPackets(IP(dst=host, ttl=(self.ttl_min, self.ttl_max), id=RandShort()) / ICMP(id=RandShort()))
return d
def UDPTraceroute(self, host):
if host not in self.hosts:
self.hosts.append(host)
d = defer.Deferred()
reactor.callLater(self.timeout, d.callback, self)
for dst_port in self.dst_ports:
self.sendPackets(
IP(dst=host, ttl=(self.ttl_min, self.ttl_max), id=RandShort()) / UDP(dport=dst_port, sport=RandShort()))
return d
def TCPTraceroute(self, host):
if host not in self.hosts:
self.hosts.append(host)
d = defer.Deferred()
reactor.callLater(self.timeout, d.callback, self)
for dst_port in self.dst_ports:
self.sendPackets(
IP(dst=host, ttl=(self.ttl_min, self.ttl_max), id=RandShort()) / TCP(flags=2L, dport=dst_port,
sport=RandShort(),
seq=RandShort()))
return d
@defer.inlineCallbacks
def sendPackets(self, packets):
def sleep(seconds):
d = defer.Deferred()
reactor.callLater(seconds, d.callback, seconds)
return d
if not isinstance(packets, Gen):
packets = SetGen(packets)
for packet in packets:
for i in xrange(self.numPackets):
self.sent_packets.append(packet)
self.factory.super_socket.send(packet)
yield sleep(self.interval)
def matchResponses(self):
def addToReceivedPackets(key, packet):
"""
Add a packet into the received packets dictionary,
typically the key is a tuple of packet fields used
to correlate sent packets with received packets.
"""
# Initialize or append to the lists of packets
# with the same key
if key in self.received_packets:
self.received_packets[key].append(packet)
else:
self.received_packets[key] = [packet]
def matchResponse(k, p):
if k in self.received_packets:
if p in self.matched_packets:
log.debug("Matched sent packet to more than one response!")
self.matched_packets[p].extend(self.received_packets[k])
else:
self.matched_packets[p] = self.received_packets[k]
log.debug("Packet %s matched %s" % ([p], self.received_packets[k]))
return 1
return 0
for p in self._recvbuf:
l = p.getlayer(2)
if isinstance(l, IPerror):
l = p.getlayer(3)
if isinstance(l, ICMPerror):
addToReceivedPackets(('icmp', l.id), p)
elif isinstance(l, TCPerror):
addToReceivedPackets(('tcp', l.dport, l.sport), p)
elif isinstance(l, UDPerror):
addToReceivedPackets(('udp', l.dport, l.sport), p)
elif hasattr(p, 'src') and p.src in self.hosts:
l = p.getlayer(1)
if isinstance(l, ICMP):
addToReceivedPackets(('icmp', l.id), p)
elif isinstance(l, TCP):
addToReceivedPackets(('tcp', l.ack - 1, l.dport, l.sport), p)
elif isinstance(l, UDP):
addToReceivedPackets(('udp', l.dport, l.sport), p)
for p in self.sent_packets:
# for each sent packet, find corresponding
# received packets
l = p.getlayer(1)
i = 0
if isinstance(l, ICMP):
i += matchResponse(('icmp', p.id), p) # match by ipid
i += matchResponse(('icmp', l.id), p) # match by icmpid
if isinstance(l, TCP):
i += matchResponse(('tcp', l.dport, l.sport), p) # match by s|dport
i += matchResponse(('tcp', l.seq, l.sport, l.dport), p)
if isinstance(l, UDP):
i += matchResponse(('udp', l.dport, l.sport), p)
i += matchResponse(('udp', l.sport, l.dport), p)
if i == 0:
log.debug("No response for packet %s" % [p])
del self._recvbuf
def packetReceived(self, packet):
l = packet.getlayer(1)
if not l:
return
elif isinstance(l, ICMP) or isinstance(l, UDP) or isinstance(l, TCP):
self._recvbuf.append(packet)
def stopListening(self):
self.factory.unRegisterProtocol(self)
ooniprobe-1.3.2/ooni/utils/hacks.py 0000644 0001750 0001750 00000003660 12623613431 015424 0 ustar irl irl # When some software has issues and we need to fix it in a
# hackish way, we put it in here. This one day will be empty.
import copy_reg
from twisted.web.client import SchemeNotSupported
from txsocksx.http import SOCKS5Agent as SOCKS5AgentOriginal
def patched_reduce_ex(self, proto):
"""
This is a hack to overcome a bug in one of pythons core functions. It is
located inside of copy_reg and is called _reduce_ex.
Some background on the issue can be found here:
http://stackoverflow.com/questions/569754/how-to-tell-for-which-object-attribute-pickle
http://stackoverflow.com/questions/2049849/why-cant-i-pickle-this-object
There was also an open bug on the pyyaml trac repo, but it got closed because
they could not reproduce.
http://pyyaml.org/ticket/190
It turned out to be easier to patch the python core library than to monkey
patch yaml.
XXX see if there is a better way. sigh...
"""
_HEAPTYPE = 1 << 9
assert proto < 2
for base in self.__class__.__mro__:
if hasattr(base, '__flags__') and not base.__flags__ & _HEAPTYPE:
break
else:
base = object # not really reachable
if base is object:
state = None
elif base is int:
state = None
else:
if base is self.__class__:
raise TypeError("can't pickle %s objects" % base.__name__)
state = base(self)
args = (self.__class__, base, state)
try:
getstate = self.__getstate__
except AttributeError:
if getattr(self, "__slots__", None):
raise TypeError("a class that defines __slots__ without "
"defining __getstate__ cannot be pickled")
try:
dict = self.__dict__
except AttributeError:
dict = None
else:
dict = getstate()
if dict:
return copy_reg._reconstructor, args, dict
else:
return copy_reg._reconstructor, args
ooniprobe-1.3.2/ooni/settings.ini 0000644 0001750 0001750 00000000177 12555437730 015174 0 ustar irl irl [directories]
usr_share = /Users/x/.virtualenvs/ooni-probe/share/ooni
var_lib = /Users/x/.virtualenvs/ooni-probe/var/lib/ooni
ooniprobe-1.3.2/ooni/oonid.py 0000644 0001750 0001750 00000000713 12463144534 014304 0 ustar irl irl from twisted.application import service, internet
from ooni.settings import config
from ooni.api.spec import oonidApplication
from ooni.director import Director
def getOonid():
director = Director()
director.start()
oonidApplication.director = director
return internet.TCPServer(int(config.advanced.oonid_api_port), oonidApplication)
application = service.Application("ooniprobe")
service = getOonid()
service.setServiceParent(application)
ooniprobe-1.3.2/ooni/deck.py 0000644 0001750 0001750 00000025355 12531110611 014074 0 ustar irl irl # -*- coding: utf-8 -*-
from ooni.oonibclient import OONIBClient
from ooni.nettest import NetTestLoader
from ooni.settings import config
from ooni.utils import log
from ooni import errors as e
from twisted.python.filepath import FilePath
from twisted.internet import defer
import os
import yaml
import json
from hashlib import sha256
class InputFile(object):
def __init__(self, input_hash, base_path=config.inputs_directory):
self.id = input_hash
cache_path = os.path.join(os.path.abspath(base_path), input_hash)
self.cached_file = cache_path
self.cached_descriptor = cache_path + '.desc'
@property
def descriptorCached(self):
if os.path.exists(self.cached_descriptor):
with open(self.cached_descriptor) as f:
descriptor = json.load(f)
self.load(descriptor)
return True
return False
@property
def fileCached(self):
if os.path.exists(self.cached_file):
try:
self.verify()
except AssertionError:
log.err("The input %s failed validation."
"Going to consider it not cached." % self.id)
return False
return True
return False
def save(self):
with open(self.cached_descriptor, 'w+') as f:
json.dump({
'name': self.name,
'id': self.id,
'version': self.version,
'author': self.author,
'date': self.date,
'description': self.description
}, f)
def load(self, descriptor):
self.name = descriptor['name']
self.version = descriptor['version']
self.author = descriptor['author']
self.date = descriptor['date']
self.description = descriptor['description']
def verify(self):
digest = os.path.basename(self.cached_file)
with open(self.cached_file) as f:
file_hash = sha256(f.read())
assert file_hash.hexdigest() == digest
def nettest_to_path(path, allow_arbitrary_paths=False):
"""
Takes as input either a path or a nettest name.
Args:
allow_arbitrary_paths:
allow also paths that are not relative to the nettest_directory.
Returns:
full path to the nettest file.
"""
if allow_arbitrary_paths and os.path.exists(path):
return path
fp = FilePath(config.nettest_directory).preauthChild(path + '.py')
if fp.exists():
return fp.path
else:
raise e.NetTestNotFound(path)
class Deck(InputFile):
def __init__(self, deck_hash=None,
deckFile=None,
decks_directory=config.decks_directory,
no_collector=False):
self.id = deck_hash
self.requiresTor = False
self.no_collector = no_collector
self.bouncer = ''
self.netTestLoaders = []
self.inputs = []
self.oonibclient = OONIBClient(self.bouncer)
self.decksDirectory = os.path.abspath(decks_directory)
self.deckHash = deck_hash
if deckFile:
self.loadDeck(deckFile)
@property
def cached_file(self):
return os.path.join(self.decksDirectory, self.deckHash)
@property
def cached_descriptor(self):
return self.cached_file + '.desc'
def loadDeck(self, deckFile):
with open(deckFile) as f:
self.deckHash = sha256(f.read()).hexdigest()
f.seek(0)
test_deck = yaml.safe_load(f)
for test in test_deck:
try:
nettest_path = nettest_to_path(test['options']['test_file'])
except e.NetTestNotFound:
log.err("Could not find %s" % test['options']['test_file'])
log.msg("Skipping...")
continue
net_test_loader = NetTestLoader(test['options']['subargs'],
test_file=nettest_path)
if test['options']['collector']:
net_test_loader.collector = test['options']['collector']
self.insert(net_test_loader)
def insert(self, net_test_loader):
""" Add a NetTestLoader to this test deck """
def has_test_helper(missing_option):
for rth in net_test_loader.requiredTestHelpers:
if missing_option == rth['option']:
return True
return False
try:
net_test_loader.checkOptions()
if net_test_loader.requiresTor:
self.requiresTor = True
except e.MissingRequiredOption as missing_options:
if not self.bouncer:
raise
for missing_option in missing_options.message:
if not has_test_helper(missing_option):
raise
self.requiresTor = True
self.netTestLoaders.append(net_test_loader)
@defer.inlineCallbacks
def setup(self):
""" fetch and verify inputs for all NetTests in the deck """
log.msg("Fetching required net test inputs...")
for net_test_loader in self.netTestLoaders:
yield self.fetchAndVerifyNetTestInput(net_test_loader)
if self.bouncer:
log.msg("Looking up collector and test helpers")
yield self.lookupCollector()
@defer.inlineCallbacks
def lookupCollector(self):
self.oonibclient.address = self.bouncer
required_nettests = []
requires_test_helpers = False
requires_collector = False
for net_test_loader in self.netTestLoaders:
nettest = {
'name': net_test_loader.testDetails['test_name'],
'version': net_test_loader.testDetails['test_version'],
'test-helpers': [],
'input-hashes': [x['hash'] for x in net_test_loader.inputFiles]
}
if not net_test_loader.collector and not self.no_collector:
requires_collector = True
for th in net_test_loader.requiredTestHelpers:
# {'name':'', 'option':'', 'test_class':''}
if th['test_class'].localOptions[th['option']]:
continue
nettest['test-helpers'].append(th['name'])
requires_test_helpers = True
required_nettests.append(nettest)
if not requires_test_helpers and not requires_collector:
defer.returnValue(None)
response = yield self.oonibclient.lookupTestCollector(required_nettests)
provided_net_tests = response['net-tests']
def find_collector_and_test_helpers(test_name, test_version, input_files):
input_files = [u""+x['hash'] for x in input_files]
for net_test in provided_net_tests:
if net_test['name'] != test_name:
continue
if net_test['version'] != test_version:
continue
if set(net_test['input-hashes']) != set(input_files):
continue
return net_test['collector'], net_test['test-helpers']
for net_test_loader in self.netTestLoaders:
log.msg("Setting collector and test helpers for %s" % net_test_loader.testDetails['test_name'])
collector, test_helpers = \
find_collector_and_test_helpers(net_test_loader.testDetails['test_name'],
net_test_loader.testDetails['test_version'],
net_test_loader.inputFiles)
for th in net_test_loader.requiredTestHelpers:
if not th['test_class'].localOptions[th['option']]:
th['test_class'].localOptions[th['option']] = test_helpers[th['name']].encode('utf-8')
net_test_loader.testHelpers[th['option']] = th['test_class'].localOptions[th['option']]
if not net_test_loader.collector:
net_test_loader.collector = collector.encode('utf-8')
@defer.inlineCallbacks
def lookupTestHelpers(self):
self.oonibclient.address = self.bouncer
required_test_helpers = []
requires_collector = []
for net_test_loader in self.netTestLoaders:
if not net_test_loader.collector and not self.no_collector:
requires_collector.append(net_test_loader)
for th in net_test_loader.requiredTestHelpers:
# {'name':'', 'option':'', 'test_class':''}
if th['test_class'].localOptions[th['option']]:
continue
required_test_helpers.append(th['name'])
if not required_test_helpers and not requires_collector:
defer.returnValue(None)
response = yield self.oonibclient.lookupTestHelpers(required_test_helpers)
for net_test_loader in self.netTestLoaders:
log.msg("Setting collector and test helpers for %s" %
net_test_loader.testDetails['test_name'])
# Only set the collector if the no collector has been specified
# from the command line or via the test deck.
if not net_test_loader.requiredTestHelpers and \
net_test_loader in requires_collector:
log.msg("Using the default collector: %s" %
response['default']['collector'])
net_test_loader.collector = response['default']['collector'].encode('utf-8')
continue
for th in net_test_loader.requiredTestHelpers:
# Only set helpers which are not already specified
if th['name'] not in required_test_helpers:
continue
test_helper = response[th['name']]
log.msg("Using this helper: %s" % test_helper)
th['test_class'].localOptions[th['option']] = test_helper['address'].encode('utf-8')
if net_test_loader in requires_collector:
net_test_loader.collector = test_helper['collector'].encode('utf-8')
@defer.inlineCallbacks
def fetchAndVerifyNetTestInput(self, net_test_loader):
""" fetch and verify a single NetTest's inputs """
log.debug("Fetching and verifying inputs")
for i in net_test_loader.inputFiles:
if 'url' in i:
log.debug("Downloading %s" % i['url'])
self.oonibclient.address = i['address']
try:
input_file = yield self.oonibclient.downloadInput(i['hash'])
except:
raise e.UnableToLoadDeckInput
try:
input_file.verify()
except AssertionError:
raise e.UnableToLoadDeckInput
i['test_class'].localOptions[i['key']] = input_file.cached_file
ooniprobe-1.3.2/ooni/geoip.py 0000644 0001750 0001750 00000017603 12623613431 014300 0 ustar irl irl import re
import os
import random
from hashlib import sha256
from twisted.web import client, http_headers
from ooni.utils.net import hasRawSocketPermission
client._HTTP11ClientFactory.noisy = False
from twisted.internet import reactor, defer
from ooni.utils import log
from ooni import errors
try:
from pygeoip import GeoIP
except ImportError:
try:
import GeoIP as CGeoIP
def GeoIP(database_path, *args, **kwargs):
return CGeoIP.open(database_path, CGeoIP.GEOIP_STANDARD)
except ImportError:
log.err("Unable to import pygeoip. We will not be able to run geo IP related measurements")
class GeoIPDataFilesNotFound(Exception):
pass
def IPToLocation(ipaddr):
from ooni.settings import config
city_file = config.get_data_file_path('GeoIP/GeoLiteCity.dat')
country_file = config.get_data_file_path('GeoIP/GeoIP.dat')
asn_file = config.get_data_file_path('GeoIP/GeoIPASNum.dat')
location = {'city': None, 'countrycode': 'ZZ', 'asn': 'AS0'}
def error():
log.err("Could not find GeoIP data file in %s."
"Try running ooniresources --update-geoip or"
" edit your ooniprobe.conf" % config.advanced.geoip_data_dir)
try:
country_dat = GeoIP(country_file)
location['countrycode'] = country_dat.country_code_by_addr(ipaddr)
if not location['countrycode']:
location['countrycode'] = 'ZZ'
except IOError:
error()
try:
city_dat = GeoIP(city_file)
location['city'] = city_dat.record_by_addr(ipaddr)['city']
except:
error()
try:
asn_dat = GeoIP(asn_file)
location['asn'] = asn_dat.org_by_addr(ipaddr).split(' ')[0]
except:
error()
return location
def database_version():
from ooni.settings import config
version = {
'GeoIP': {
'sha256': None,
'timestamp': None,
},
'GeoIPASNum': {
'sha256': None,
'timestamp': None
},
'GeoLiteCity': {
'sha256': None,
'timestamp': None
}
}
for key in version.keys():
geoip_file = config.get_data_file_path("GeoIP/" + key + ".dat")
if not geoip_file or not os.path.isfile(geoip_file):
continue
timestamp = os.stat(geoip_file).st_mtime
sha256hash = sha256()
with open(geoip_file) as f:
while True:
chunk = f.read(8192)
if not chunk:
break
sha256hash.update(chunk)
version[key]['timestamp'] = timestamp
version[key]['sha256'] = sha256hash.hexdigest()
return version
class HTTPGeoIPLookupper(object):
url = None
_agent = client.Agent
def __init__(self):
self.agent = self._agent(reactor)
def _response(self, response):
from ooni.utils.net import BodyReceiver
content_length = response.headers.getRawHeaders('content-length')
finished = defer.Deferred()
response.deliverBody(BodyReceiver(finished, content_length))
finished.addCallback(self.parseResponse)
return finished
def parseResponse(self, response_body):
"""
Override this with the logic for parsing the response.
Should return the IP address of the probe.
"""
pass
def failed(self, failure):
log.err("Failed to lookup via %s" % self.url)
log.exception(failure)
return failure
def lookup(self):
from ooni.utils.net import userAgents
headers = {}
headers['User-Agent'] = [random.choice(userAgents)]
d = self.agent.request("GET", self.url, http_headers.Headers(headers))
d.addCallback(self._response)
d.addErrback(self.failed)
return d
class UbuntuGeoIP(HTTPGeoIPLookupper):
url = "http://geoip.ubuntu.com/lookup"
def parseResponse(self, response_body):
m = re.match(".*(.*).*", response_body)
probe_ip = m.group(1)
return probe_ip
class TorProjectGeoIP(HTTPGeoIPLookupper):
url = "https://check.torproject.org/"
def parseResponse(self, response_body):
regexp = "Your IP address appears to be: ((\d+\.)+(\d+))"
probe_ip = re.search(regexp, response_body).group(1)
return probe_ip
class ProbeIP(object):
strategy = None
address = None
def __init__(self):
self.geoIPServices = {
'ubuntu': UbuntuGeoIP,
'torproject': TorProjectGeoIP
}
self.geodata = {
'asn': 'AS0',
'city': None,
'countrycode': 'ZZ',
'ip': '127.0.0.1'
}
def resolveGeodata(self):
from ooni.settings import config
self.geodata = IPToLocation(self.address)
self.geodata['ip'] = self.address
if not config.privacy.includeasn:
self.geodata['asn'] = 'AS0'
if not config.privacy.includecity:
self.geodata['city'] = None
if not config.privacy.includecountry:
self.geodata['countrycode'] = 'ZZ'
if not config.privacy.includeip:
self.geodata['ip'] = '127.0.0.1'
@defer.inlineCallbacks
def lookup(self):
try:
yield self.askTor()
log.msg("Found your IP via Tor %s" % self.address)
self.resolveGeodata()
defer.returnValue(self.address)
except errors.TorStateNotFound:
log.debug("Tor is not running. Skipping IP lookup via Tor.")
except Exception:
log.msg("Unable to lookup the probe IP via Tor.")
try:
yield self.askTraceroute()
log.msg("Found your IP via Traceroute %s" % self.address)
self.resolveGeodata()
defer.returnValue(self.address)
except errors.InsufficientPrivileges:
log.debug("Cannot determine the probe IP address with a traceroute, becase of insufficient privileges")
except:
log.msg("Unable to lookup the probe IP via traceroute")
try:
yield self.askGeoIPService()
log.msg("Found your IP via a GeoIP service: %s" % self.address)
self.resolveGeodata()
defer.returnValue(self.address)
except Exception:
log.msg("Unable to lookup the probe IP via GeoIPService")
raise
@defer.inlineCallbacks
def askGeoIPService(self):
# Shuffle the order in which we test the geoip services.
services = self.geoIPServices.items()
random.shuffle(services)
for service_name, service in services:
s = service()
log.msg("Looking up your IP address via %s" % service_name)
try:
self.address = yield s.lookup()
self.strategy = 'geo_ip_service-' + service_name
break
except Exception:
log.msg("Failed to lookup your IP via %s" % service_name)
if not self.address:
raise errors.ProbeIPUnknown
def askTraceroute(self):
"""
Perform a UDP traceroute to determine the probes IP address.
"""
if not hasRawSocketPermission():
raise errors.InsufficientPrivileges
raise NotImplemented
def askTor(self):
"""
Obtain the probes IP address by asking the Tor Control port via GET INFO
address.
XXX this lookup method is currently broken when there are cached descriptors or consensus documents
see: https://trac.torproject.org/projects/tor/ticket/8214
"""
from ooni.settings import config
if config.tor_state:
d = config.tor_state.protocol.get_info("address")
@d.addCallback
def cb(result):
self.strategy = 'tor_get_info_address'
self.address = result.values()[0]
return d
else:
raise errors.TorStateNotFound
ooniprobe-1.3.2/ooni/resources/ 0000755 0001750 0001750 00000000000 12623630152 014625 5 ustar irl irl ooniprobe-1.3.2/ooni/resources/__init__.py 0000644 0001750 0001750 00000003531 12531110611 016727 0 ustar irl irl import os
from ooni.settings import config
from ooni.utils import unzip, gunzip
from ooni.deckgen.processors import citizenlab_test_lists
from ooni.deckgen.processors import namebench_dns_servers
__version__ = "0.1.0"
if os.access(config.var_lib_path, os.W_OK):
resources_directory = os.path.join(config.var_lib_path,
"resources")
geoip_directory = os.path.join(config.var_lib_path,
"GeoIP")
else:
resources_directory = os.path.join(config.ooni_home,
"resources")
geoip_directory = os.path.join(config.ooni_home,
"GeoIP")
inputs = {
"namebench-dns-servers.csv": {
"url": "https://namebench.googlecode.com/svn/trunk/config/servers.csv",
"action": None,
"action_args": [],
"processor": namebench_dns_servers,
},
"citizenlab-test-lists.zip": {
"url": "https://github.com/citizenlab/test-lists/archive/master.zip",
"action": unzip,
"action_args": [resources_directory],
"processor": citizenlab_test_lists
}
}
geoip = {
"GeoLiteCity.dat.gz": {
"url": "https://geolite.maxmind.com/download/"
"geoip/database/GeoLiteCity.dat.gz",
"action": gunzip,
"action_args": [geoip_directory],
"processor": None
},
"GeoIPASNum.dat.gz": {
"url": "https://geolite.maxmind.com/download/"
"geoip/database/asnum/GeoIPASNum.dat.gz",
"action": gunzip,
"action_args": [geoip_directory],
"processor": None
},
"GeoIP.dat.gz": {
"url": "https://geolite.maxmind.com/"
"download/geoip/database/GeoLiteCountry/GeoIP.dat.gz",
"action": gunzip,
"action_args": [geoip_directory],
"processor": None
}
}
ooniprobe-1.3.2/ooni/resources/update.py 0000644 0001750 0001750 00000002610 12474100761 016462 0 ustar irl irl import os
from twisted.internet import defer
from twisted.web.client import downloadPage
from ooni.resources import inputs, geoip, resources_directory
from ooni.utils import unzip, gunzip
@defer.inlineCallbacks
def download_resource(resources):
for filename, resource in resources.items():
print "Downloading %s" % filename
if resource["action"] in [unzip, gunzip] and resource["action_args"]:
dirname = resource["action_args"][0]
filename = os.path.join(dirname, filename)
else:
filename = os.path.join(resources_directory, filename)
if not os.path.exists(filename):
directory = os.path.dirname(filename)
if not os.path.isdir(directory):
os.makedirs(directory)
f = open(filename, 'w')
f.close()
elif not os.path.isfile(filename):
print "[!] %s must be a file." % filename
defer.returnValue(False)
yield downloadPage(resource['url'], filename)
if resource['action'] is not None:
yield defer.maybeDeferred(resource['action'],
filename,
*resource['action_args'])
print "%s written." % filename
def download_inputs():
return download_resource(inputs)
def download_geoip():
return download_resource(geoip)
ooniprobe-1.3.2/ooni/resources/cli.py 0000644 0001750 0001750 00000002662 12447563404 015765 0 ustar irl irl import sys
from twisted.internet import defer
from twisted.python import usage
from ooni.utils import log
from ooni.resources import __version__
from ooni.resources import update
class Options(usage.Options):
synopsis = """%s""" % sys.argv[0]
optFlags = [
["update-inputs", None, "Update the resources needed for inputs."],
["update-geoip", None, "Update the geoip related resources."]
]
optParameters = []
def opt_version(self):
print("ooniresources version: %s" % __version__)
sys.exit(0)
@defer.inlineCallbacks
def run():
options = Options()
try:
options.parseOptions()
except usage.UsageError as error_message:
print "%s: %s" % (sys.argv[0], error_message)
print "%s: Try --help for usage details." % (sys.argv[0])
sys.exit(1)
if not any(options.values()):
print("%s: no command specified" % sys.argv[0])
print options
sys.exit(1)
if options['update-inputs']:
print "Downloading inputs"
try:
yield update.download_inputs()
except Exception as exc:
log.err("failed to download geoip files")
log.exception(exc)
if options['update-geoip']:
print "Downloading geoip files"
try:
yield update.download_geoip()
except Exception as exc:
log.err("failed to download geoip files")
log.exception(exc)
ooniprobe-1.3.2/ooni/oonicli.py 0000644 0001750 0001750 00000027320 12545443435 014636 0 ustar irl irl import sys
import os
import yaml
from twisted.python import usage
from twisted.python.util import spewer
from twisted.internet import defer
from ooni import errors, __version__
from ooni.settings import config
from ooni.director import Director
from ooni.deck import Deck, nettest_to_path
from ooni.nettest import NetTestLoader
from ooni.utils import log
from ooni.utils.net import hasRawSocketPermission
class Options(usage.Options):
synopsis = """%s [options] [path to test].py
""" % (os.path.basename(sys.argv[0]),)
longdesc = ("ooniprobe loads and executes a suite or a set of suites of"
" network tests. These are loaded from modules, packages and"
" files listed on the command line")
optFlags = [["help", "h"],
["resume", "r"],
["no-collector", "n"],
["no-geoip", "g"],
["list", "s"],
["printdeck", "p"],
["verbose", "v"]
]
optParameters = [
["reportfile", "o", None, "report file name"],
["testdeck", "i", None,
"Specify as input a test deck: a yaml file containing the tests to run and their arguments"],
["collector", "c", None,
"Address of the collector of test results. This option should not be used, but you should always use a bouncer."],
["bouncer", "b", 'httpo://nkvphnp3p6agi5qq.onion',
"Address of the bouncer for test helpers."],
["logfile", "l", None, "log file name"],
["pcapfile", "O", None, "pcap file name"],
["configfile", "f", None,
"Specify a path to the ooniprobe configuration file"],
["datadir", "d", None,
"Specify a path to the ooniprobe data directory"],
["annotations", "a", None,
"Annotate the report with a key:value[, key:value] format."]]
compData = usage.Completions(
extraActions=[usage.CompleteFiles(
"*.py", descr="file | module | package | TestCase | testMethod",
repeat=True)],)
tracer = None
def __init__(self):
self['test'] = None
usage.Options.__init__(self)
def opt_spew(self):
"""
Print an insanely verbose log of everything that happens. Useful
when debugging freezes or locks in complex code.
"""
sys.settrace(spewer)
def opt_version(self):
"""
Display the ooniprobe version and exit.
"""
print "ooniprobe version:", __version__
sys.exit(0)
def parseArgs(self, *args):
if self['testdeck'] or self['list']:
return
try:
self['test_file'] = args[0]
self['subargs'] = args[1:]
except:
raise usage.UsageError("No test filename specified!")
def parseOptions():
print "WARNING: running ooniprobe involves some risk that varies greatly"
print " from country to country. You should be aware of this when"
print " running the tool. Read more about this in the manpage or README."
cmd_line_options = Options()
if len(sys.argv) == 1:
cmd_line_options.getUsage()
try:
cmd_line_options.parseOptions()
except usage.UsageError as ue:
print cmd_line_options.getUsage()
raise SystemExit("%s: %s" % (sys.argv[0], ue))
return dict(cmd_line_options)
def runWithDirector(logging=True, start_tor=True, check_incoherences=True):
"""
Instance the director, parse command line options and start an ooniprobe
test!
"""
global_options = parseOptions()
config.global_options = global_options
config.set_paths()
config.initialize_ooni_home()
try:
config.read_config_file(check_incoherences=check_incoherences)
except errors.ConfigFileIncoherent:
sys.exit(6)
if global_options['verbose']:
config.advanced.debug = True
if not start_tor:
config.advanced.start_tor = False
if logging:
log.start(global_options['logfile'])
if config.privacy.includepcap:
if hasRawSocketPermission():
from ooni.utils.txscapy import ScapyFactory
config.scapyFactory = ScapyFactory(config.advanced.interface)
else:
log.err("Insufficient Privileges to capture packets."
" See ooniprobe.conf privacy.includepcap")
sys.exit(2)
director = Director()
if global_options['list']:
print "# Installed nettests"
for net_test_id, net_test in director.getNetTests().items():
print "* %s (%s/%s)" % (net_test['name'],
net_test['category'],
net_test['id'])
print " %s" % net_test['description']
sys.exit(0)
elif global_options['printdeck']:
del global_options['printdeck']
print "# Copy and paste the lines below into a test deck to run the specified test with the specified arguments"
print yaml.safe_dump([{'options': global_options}]).strip()
sys.exit(0)
if global_options.get('annotations') is not None:
annotations = {}
for annotation in global_options["annotations"].split(","):
pair = annotation.split(":")
if len(pair) == 2:
key = pair[0].strip()
value = pair[1].strip()
annotations[key] = value
else:
log.err("Invalid annotation: %s" % annotation)
sys.exit(1)
global_options["annotations"] = annotations
if global_options['no-collector']:
log.msg("Not reporting using a collector")
global_options['collector'] = None
start_tor = False
else:
start_tor = True
deck = Deck(no_collector=global_options['no-collector'])
deck.bouncer = global_options['bouncer']
if global_options['collector']:
start_tor |= True
try:
if global_options['testdeck']:
deck.loadDeck(global_options['testdeck'])
else:
log.debug("No test deck detected")
test_file = nettest_to_path(global_options['test_file'], True)
net_test_loader = NetTestLoader(global_options['subargs'],
test_file=test_file)
if global_options['collector']:
net_test_loader.collector = global_options['collector']
deck.insert(net_test_loader)
except errors.MissingRequiredOption as option_name:
log.err('Missing required option: "%s"' % option_name)
incomplete_net_test_loader = option_name.net_test_loader
print incomplete_net_test_loader.usageOptions().getUsage()
sys.exit(2)
except errors.NetTestNotFound as path:
log.err('Requested NetTest file not found (%s)' % path)
sys.exit(3)
except errors.OONIUsageError as e:
log.err(e)
print e.net_test_loader.usageOptions().getUsage()
sys.exit(4)
except Exception as e:
if config.advanced.debug:
log.exception(e)
log.err(e)
sys.exit(5)
start_tor |= deck.requiresTor
d = director.start(start_tor=start_tor,
check_incoherences=check_incoherences)
def setup_nettest(_):
try:
return deck.setup()
except errors.UnableToLoadDeckInput as error:
return defer.failure.Failure(error)
def director_startup_handled_failures(failure):
log.err("Could not start the director")
failure.trap(errors.TorNotRunning,
errors.InvalidOONIBCollectorAddress,
errors.UnableToLoadDeckInput,
errors.CouldNotFindTestHelper,
errors.CouldNotFindTestCollector,
errors.ProbeIPUnknown,
errors.InvalidInputFile,
errors.ConfigFileIncoherent)
if isinstance(failure.value, errors.TorNotRunning):
log.err("Tor does not appear to be running")
log.err("Reporting with the collector %s is not possible" %
global_options['collector'])
log.msg(
"Try with a different collector or disable collector reporting with -n")
elif isinstance(failure.value, errors.InvalidOONIBCollectorAddress):
log.err("Invalid format for oonib collector address.")
log.msg(
"Should be in the format http://:")
log.msg("for example: ooniprobe -c httpo://nkvphnp3p6agi5qq.onion")
elif isinstance(failure.value, errors.UnableToLoadDeckInput):
log.err("Unable to fetch the required inputs for the test deck.")
log.msg(
"Please file a ticket on our issue tracker: https://github.com/thetorproject/ooni-probe/issues")
elif isinstance(failure.value, errors.CouldNotFindTestHelper):
log.err("Unable to obtain the required test helpers.")
log.msg(
"Try with a different bouncer or check that Tor is running properly.")
elif isinstance(failure.value, errors.CouldNotFindTestCollector):
log.err("Could not find a valid collector.")
log.msg(
"Try with a different bouncer, specify a collector with -c or disable reporting to a collector with -n.")
elif isinstance(failure.value, errors.ProbeIPUnknown):
log.err("Failed to lookup probe IP address.")
log.msg("Check your internet connection.")
elif isinstance(failure.value, errors.InvalidInputFile):
log.err("Invalid input file \"%s\"" % failure.value)
elif isinstance(failure.value, errors.ConfigFileIncoherent):
log.err("Incoherent config file")
if config.advanced.debug:
log.exception(failure)
def director_startup_other_failures(failure):
log.err("An unhandled exception occurred while starting the director!")
log.exception(failure)
# Wait until director has started up (including bootstrapping Tor)
# before adding tests
def post_director_start(_):
for net_test_loader in deck.netTestLoaders:
# Decks can specify different collectors
# for each net test, so that each NetTest
# may be paired with a test_helper and its collector
# However, a user can override this behavior by
# specifying a collector from the command-line (-c).
# If a collector is not specified in the deck, or the
# deck is a singleton, the default collector set in
# ooniprobe.conf will be used
collector = None
if not global_options['no-collector']:
if global_options['collector']:
collector = global_options['collector']
elif 'collector' in config.reports \
and config.reports['collector']:
collector = config.reports['collector']
elif net_test_loader.collector:
collector = net_test_loader.collector
if collector and collector.startswith('httpo:') \
and (not (config.tor_state or config.tor.socks_port)):
raise errors.TorNotRunning
test_details = net_test_loader.testDetails
test_details['annotations'] = global_options['annotations']
director.startNetTest(net_test_loader,
global_options['reportfile'],
collector)
return director.allTestsDone
def start():
d.addCallback(setup_nettest)
d.addCallback(post_director_start)
d.addErrback(director_startup_handled_failures)
d.addErrback(director_startup_other_failures)
return d
return start()
ooniprobe-1.3.2/ooni/director.py 0000644 0001750 0001750 00000034575 12533065720 015021 0 ustar irl irl import pwd
import os
from ooni.managers import ReportEntryManager, MeasurementManager
from ooni.reporter import Report
from ooni.utils import log, generate_filename
from ooni.utils.net import randomFreePort
from ooni.nettest import NetTest, getNetTestInformation
from ooni.settings import config
from ooni import errors
from ooni.nettest import test_class_name_to_name
from txtorcon import TorConfig, TorState, launch_tor, build_tor_connection
from twisted.internet import defer, reactor
from twisted.internet.endpoints import TCP4ClientEndpoint
class Director(object):
"""
Singleton object responsible for coordinating the Measurements Manager
and the Reporting Manager.
How this all looks like is as follows:
+------------------------------------------------+
| Director |<--+
+------------------------------------------------+ |
^ ^ |
| Measurement | |
+---------+ [---------] +--------------------+ |
| | | MeasurementManager | |
| NetTest | [---------] +--------------------+ |
| | | [----------------] | |
+---------+ [---------] | [----------------] | |
| | [----------------] | |
| +--------------------+ |
v |
+---------+ ReportEntry |
| | [---------] +--------------------+ |
| Report | | ReportEntryManager | |
| | [---------] +--------------------+ |
+---------+ | [----------------] | |
[---------] | [----------------] |--
| [----------------] |
+--------------------+
[------------] are Tasks
+------+
| | are TaskManagers
+------+
| |
+------+
+------+
| | are general purpose objects
+------+
"""
_scheduledTests = 0
# Only list NetTests belonging to these categories
categories = ['blocking', 'manipulation']
def __init__(self):
self.activeNetTests = []
self.measurementManager = MeasurementManager()
self.measurementManager.director = self
self.reportEntryManager = ReportEntryManager()
self.reportEntryManager.director = self
# Link the TaskManager's by least available slots.
self.measurementManager.child = self.reportEntryManager
# Notify the parent when tasks complete # XXX deadlock!?
self.reportEntryManager.parent = self.measurementManager
self.successfulMeasurements = 0
self.failedMeasurements = 0
self.totalMeasurements = 0
# The cumulative runtime of all the measurements
self.totalMeasurementRuntime = 0
self.failures = []
self.torControlProtocol = None
# This deferred is fired once all the measurements and their reporting
# tasks are completed.
self.allTestsDone = defer.Deferred()
self.sniffers = {}
def getNetTests(self):
nettests = {}
def is_nettest(filename):
return not filename == '__init__.py' and filename.endswith('.py')
for category in self.categories:
dirname = os.path.join(config.nettest_directory, category)
# print path to all filenames.
for filename in os.listdir(dirname):
if is_nettest(filename):
net_test_file = os.path.join(dirname, filename)
try:
nettest = getNetTestInformation(net_test_file)
except:
log.err("Error processing %s" % filename)
continue
nettest['category'] = category.replace('/', '')
if nettest['id'] in nettests:
log.err("Found a two tests with the same name %s, %s" %
(net_test_file,
nettests[nettest['id']]['path']))
else:
category = dirname.replace(config.nettest_directory,
'')
nettests[nettest['id']] = nettest
return nettests
@defer.inlineCallbacks
def start(self, start_tor=False, check_incoherences=True):
self.netTests = self.getNetTests()
if start_tor:
if check_incoherences:
yield config.check_tor()
if config.advanced.start_tor:
yield self.startTor()
elif config.tor.control_port:
log.msg("Connecting to Tor Control Port...")
yield self.getTorState()
if config.global_options['no-geoip']:
aux = [False]
if config.global_options.get('annotations') is not None:
annotations = [k.lower() for k in config.global_options['annotations'].keys()]
aux = map(lambda x: x in annotations, ["city", "country", "asn"])
if not all(aux):
log.msg("You should add annotations for the country, city and ASN")
else:
yield config.probe_ip.lookup()
@property
def measurementSuccessRatio(self):
if self.totalMeasurements == 0:
return 0
return self.successfulMeasurements / self.totalMeasurements
@property
def measurementFailureRatio(self):
if self.totalMeasurements == 0:
return 0
return self.failedMeasurements / self.totalMeasurements
@property
def measurementSuccessRate(self):
"""
The speed at which tests are succeeding globally.
This means that fast tests that perform a lot of measurements will
impact this value quite heavily.
"""
if self.totalMeasurementRuntime == 0:
return 0
return self.successfulMeasurements / self.totalMeasurementRuntime
@property
def measurementFailureRate(self):
"""
The speed at which tests are failing globally.
"""
if self.totalMeasurementRuntime == 0:
return 0
return self.failedMeasurements / self.totalMeasurementRuntime
def measurementTimedOut(self, measurement):
"""
This gets called every time a measurement times out independenty from
the fact that it gets re-scheduled or not.
"""
pass
def measurementStarted(self, measurement):
self.totalMeasurements += 1
def measurementSucceeded(self, result, measurement):
log.debug("Successfully completed measurement: %s" % measurement)
self.totalMeasurementRuntime += measurement.runtime
self.successfulMeasurements += 1
measurement.result = result
test_name = test_class_name_to_name(measurement.testInstance.name)
if test_name in self.sniffers:
sniffer = self.sniffers[test_name]
config.scapyFactory.unRegisterProtocol(sniffer)
sniffer.close()
del self.sniffers[test_name]
return measurement
def measurementFailed(self, failure, measurement):
log.debug("Failed doing measurement: %s" % measurement)
self.totalMeasurementRuntime += measurement.runtime
self.failedMeasurements += 1
measurement.result = failure
return measurement
def reporterFailed(self, failure, net_test):
"""
This gets called every time a reporter is failing and has been removed
from the reporters of a NetTest.
Once a report has failed to be created that net_test will never use the
reporter again.
XXX hook some logic here.
note: failure contains an extra attribute called failure.reporter
"""
pass
def netTestDone(self, net_test):
self.activeNetTests.remove(net_test)
if len(self.activeNetTests) == 0:
self.allTestsDone.callback(None)
@defer.inlineCallbacks
def startNetTest(self, net_test_loader, report_filename,
collector_address=None):
"""
Create the Report for the NetTest and start the report NetTest.
Args:
net_test_loader:
an instance of :class:ooni.nettest.NetTestLoader
"""
if self.allTestsDone.called:
self.allTestsDone = defer.Deferred()
if config.privacy.includepcap:
self.startSniffing(net_test_loader.testDetails)
report = Report(net_test_loader.testDetails, report_filename,
self.reportEntryManager, collector_address)
net_test = NetTest(net_test_loader, report)
net_test.director = self
yield net_test.report.open()
yield net_test.initializeInputProcessor()
try:
self.activeNetTests.append(net_test)
self.measurementManager.schedule(net_test.generateMeasurements())
yield net_test.done
yield report.close()
finally:
self.netTestDone(net_test)
def startSniffing(self, testDetails):
""" Start sniffing with Scapy. Exits if required privileges (root) are not
available.
"""
from ooni.utils.txscapy import ScapySniffer, ScapyFactory
if config.scapyFactory is None:
config.scapyFactory = ScapyFactory(config.advanced.interface)
if not config.reports.pcap:
prefix = 'report'
else:
prefix = config.reports.pcap
filename = config.global_options['reportfile'] if 'reportfile' in config.global_options.keys() else None
filename_pcap = generate_filename(testDetails, filename=filename, prefix=prefix, extension='pcap')
if len(self.sniffers) > 0:
pcap_filenames = set(sniffer.pcapwriter.filename for sniffer in self.sniffers.values())
pcap_filenames.add(filename_pcap)
log.msg("pcap files %s can be messed up because several netTests are being executed in parallel." %
','.join(pcap_filenames))
sniffer = ScapySniffer(filename_pcap)
self.sniffers[testDetails['test_name']] = sniffer
config.scapyFactory.registerProtocol(sniffer)
log.msg("Starting packet capture to: %s" % filename_pcap)
@defer.inlineCallbacks
def getTorState(self):
connection = TCP4ClientEndpoint(reactor, '127.0.0.1',
config.tor.control_port)
config.tor_state = yield build_tor_connection(connection)
def startTor(self):
""" Starts Tor
Launches a Tor with :param: socks_port :param: control_port
:param: tor_binary set in ooniprobe.conf
"""
log.msg("Starting Tor...")
@defer.inlineCallbacks
def state_complete(state):
config.tor_state = state
log.msg("Successfully bootstrapped Tor")
log.debug("We now have the following circuits: ")
for circuit in state.circuits.values():
log.debug(" * %s" % circuit)
socks_port = yield state.protocol.get_conf("SocksPort")
control_port = yield state.protocol.get_conf("ControlPort")
config.tor.socks_port = int(socks_port.values()[0])
config.tor.control_port = int(control_port.values()[0])
def setup_failed(failure):
log.exception(failure)
raise errors.UnableToStartTor
def setup_complete(proto):
"""
Called when we read from stdout that Tor has reached 100%.
"""
log.debug("Building a TorState")
config.tor.protocol = proto
state = TorState(proto.tor_protocol)
state.post_bootstrap.addCallback(state_complete)
state.post_bootstrap.addErrback(setup_failed)
return state.post_bootstrap
def updates(prog, tag, summary):
log.msg("%d%%: %s" % (prog, summary))
tor_config = TorConfig()
if config.tor.control_port:
tor_config.ControlPort = config.tor.control_port
if config.tor.socks_port:
tor_config.SocksPort = config.tor.socks_port
if config.tor.data_dir:
data_dir = os.path.expanduser(config.tor.data_dir)
if not os.path.exists(data_dir):
log.msg("%s does not exist. Creating it." % data_dir)
os.makedirs(data_dir)
tor_config.DataDirectory = data_dir
if config.tor.bridges:
tor_config.UseBridges = 1
if config.advanced.obfsproxy_binary:
tor_config.ClientTransportPlugin = (
'obfs2,obfs3 exec %s managed' %
config.advanced.obfsproxy_binary
)
bridges = []
with open(config.tor.bridges) as f:
for bridge in f:
if 'obfs' in bridge:
if config.advanced.obfsproxy_binary:
bridges.append(bridge.strip())
else:
bridges.append(bridge.strip())
tor_config.Bridge = bridges
if config.tor.torrc:
for i in config.tor.torrc.keys():
setattr(tor_config, i, config.tor.torrc[i])
if os.geteuid() == 0:
tor_config.User = pwd.getpwuid(os.geteuid()).pw_name
tor_config.save()
if not hasattr(tor_config, 'ControlPort'):
control_port = int(randomFreePort())
tor_config.ControlPort = control_port
config.tor.control_port = control_port
if not hasattr(tor_config, 'SocksPort'):
socks_port = int(randomFreePort())
tor_config.SocksPort = socks_port
config.tor.socks_port = socks_port
tor_config.save()
log.debug("Setting control port as %s" % tor_config.ControlPort)
log.debug("Setting SOCKS port as %s" % tor_config.SocksPort)
if config.advanced.tor_binary:
d = launch_tor(tor_config, reactor,
tor_binary=config.advanced.tor_binary,
progress_updates=updates)
else:
d = launch_tor(tor_config, reactor,
progress_updates=updates)
d.addCallback(setup_complete)
d.addErrback(setup_failed)
return d
ooniprobe-1.3.2/ooni/reporter.py 0000644 0001750 0001750 00000055274 12623613431 015045 0 ustar irl irl import yaml
import json
import os
import re
from datetime import datetime
from contextlib import contextmanager
from yaml.representer import SafeRepresenter
from yaml.emitter import Emitter
from yaml.serializer import Serializer
from yaml.resolver import Resolver
from twisted.python.util import untilConcludes
from twisted.internet import defer
from twisted.internet.error import ConnectionRefusedError
from twisted.python.failure import Failure
from twisted.internet.endpoints import TCP4ClientEndpoint
from ooni.utils import log
from ooni.tasks import Measurement
try:
from scapy.packet import Packet
except ImportError:
log.err("Scapy is not installed.")
class Packet(object):
pass
from ooni import errors
from ooni import otime
from ooni.utils import pushFilenameStack, generate_filename
from ooni.utils.net import BodyReceiver, StringProducer
from ooni.settings import config
from ooni.tasks import ReportEntry
def createPacketReport(packet_list):
"""
Takes as input a packet a list.
Returns a dict containing a dict with the packet
summary and the raw packet.
"""
report = []
for packet in packet_list:
report.append({'raw_packet': str(packet),
'summary': str([packet])})
return report
class OSafeRepresenter(SafeRepresenter):
"""
This is a custom YAML representer that allows us to represent reports
safely.
It extends the SafeRepresenter to be able to also represent complex
numbers and scapy packet.
"""
def represent_data(self, data):
"""
This is very hackish. There is for sure a better way either by using
the add_multi_representer or add_representer, the issue though lies in
the fact that Scapy packets are metaclasses that leads to
yaml.representer.get_classobj_bases to not be able to properly get the
base of class of a Scapy packet.
XXX fully debug this problem
"""
if isinstance(data, Packet):
data = createPacketReport(data)
return SafeRepresenter.represent_data(self, data)
def represent_complex(self, data):
if data.imag == 0.0:
data = u'%r' % data.real
elif data.real == 0.0:
data = u'%rj' % data.imag
elif data.imag > 0:
data = u'%r+%rj' % (data.real, data.imag)
else:
data = u'%r%rj' % (data.real, data.imag)
return self.represent_scalar(u'tag:yaml.org,2002:python/complex', data)
OSafeRepresenter.add_representer(complex,
OSafeRepresenter.represent_complex)
class OSafeDumper(Emitter, Serializer, OSafeRepresenter, Resolver):
"""
This is a modification of the YAML Safe Dumper to use our own Safe
Representer that supports complex numbers.
"""
def __init__(self, stream,
default_style=None, default_flow_style=None,
canonical=None, indent=None, width=None,
allow_unicode=None, line_break=None,
encoding=None, explicit_start=None, explicit_end=None,
version=None, tags=None):
Emitter.__init__(self, stream, canonical=canonical,
indent=indent, width=width,
allow_unicode=allow_unicode, line_break=line_break)
Serializer.__init__(self, encoding=encoding,
explicit_start=explicit_start,
explicit_end=explicit_end,
version=version, tags=tags)
OSafeRepresenter.__init__(self, default_style=default_style,
default_flow_style=default_flow_style)
Resolver.__init__(self)
def safe_dump(data, stream=None, **kw):
"""
Safely dump to a yaml file the specified data.
"""
return yaml.dump_all([data], stream, Dumper=OSafeDumper, **kw)
class OReporter(object):
def __init__(self, test_details):
self.testDetails = test_details
def createReport(self):
"""
Override this with your own logic to implement tests.
"""
raise NotImplemented
def writeReportEntry(self, entry):
"""
Takes as input an entry and writes a report for it.
"""
raise NotImplemented
def finish(self):
pass
class YAMLReporter(OReporter):
"""
These are useful functions for reporting to YAML format.
report_destination:
the destination directory of the report
"""
def __init__(self, test_details, report_destination='.', report_filename=None):
self.reportDestination = report_destination
if not os.path.isdir(report_destination):
raise errors.InvalidDestination
report_filename = generate_filename(test_details, filename=report_filename, prefix='report', extension='yamloo')
report_path = os.path.join(self.reportDestination, report_filename)
if os.path.exists(report_path):
log.msg("Report already exists with filename %s" % report_path)
pushFilenameStack(report_path)
self.report_path = os.path.abspath(report_path)
OReporter.__init__(self, test_details)
def _writeln(self, line):
self._write("%s\n" % line)
def _write(self, format_string, *args):
if not self._stream:
raise errors.ReportNotCreated
if self._stream.closed:
raise errors.ReportAlreadyClosed
s = str(format_string)
assert isinstance(s, type(''))
if args:
self._stream.write(s % args)
else:
self._stream.write(s)
untilConcludes(self._stream.flush)
def writeReportEntry(self, entry):
log.debug("Writing report with YAML reporter")
self._write('---\n')
if isinstance(entry, Measurement):
self._write(safe_dump(entry.testInstance.report))
elif isinstance(entry, Failure):
self._write(entry.value)
elif isinstance(entry, dict):
self._write(safe_dump(entry))
self._write('...\n')
def createReport(self):
"""
Writes the report header and fire callbacks on self.created
"""
log.debug("Creating %s" % self.report_path)
self._stream = open(self.report_path, 'w+')
self._writeln("###########################################")
self._writeln("# OONI Probe Report for %s (%s)" % (
self.testDetails['test_name'],
self.testDetails['test_version'])
)
self._writeln("# %s" % otime.prettyDateNow())
self._writeln("###########################################")
self.writeReportEntry(self.testDetails)
def finish(self):
self._stream.close()
def collector_supported(collector_address):
if collector_address.startswith('httpo') \
and (not (config.tor_state or config.tor.socks_port)):
return False
return True
class OONIBReporter(OReporter):
def __init__(self, test_details, collector_address):
self.collectorAddress = collector_address
self.validateCollectorAddress()
self.reportID = None
OReporter.__init__(self, test_details)
def validateCollectorAddress(self):
"""
Will raise :class:ooni.errors.InvalidOONIBCollectorAddress an exception
if the oonib reporter is not valid.
"""
regexp = '^(http|httpo):\/\/[a-zA-Z0-9\-\.]+(:\d+)?$'
if not re.match(regexp, self.collectorAddress):
raise errors.InvalidOONIBCollectorAddress
@defer.inlineCallbacks
def writeReportEntry(self, entry):
log.debug("Writing report with OONIB reporter")
content = '---\n'
if isinstance(entry, Measurement):
content += safe_dump(entry.testInstance.report)
elif isinstance(entry, Failure):
content += entry.value
elif isinstance(entry, dict):
content += safe_dump(entry)
content += '...\n'
url = self.collectorAddress + '/report'
request = {'report_id': self.reportID,
'content': content}
log.debug("Updating report with id %s (%s)" % (self.reportID, url))
request_json = json.dumps(request)
log.debug("Sending %s" % request_json)
bodyProducer = StringProducer(json.dumps(request))
try:
yield self.agent.request("PUT", url,
bodyProducer=bodyProducer)
except:
# XXX we must trap this in the runner and make sure to report the
# data later.
log.err("Error in writing report entry")
raise errors.OONIBReportUpdateError
@defer.inlineCallbacks
def createReport(self):
"""
Creates a report on the oonib collector.
"""
# XXX we should probably be setting this inside of the constructor,
# however config.tor.socks_port is not set until Tor is started and the
# reporter is instantiated before Tor is started. We probably want to
# do this with some deferred kung foo or instantiate the reporter after
# tor is started.
from txsocksx.http import SOCKS5Agent
from twisted.internet import reactor
if self.collectorAddress.startswith('httpo://'):
self.collectorAddress = \
self.collectorAddress.replace('httpo://', 'http://')
proxyEndpoint = TCP4ClientEndpoint(reactor, '127.0.0.1',
config.tor.socks_port)
self.agent = SOCKS5Agent(reactor, proxyEndpoint=proxyEndpoint)
elif self.collectorAddress.startswith('https://'):
# XXX add support for securely reporting to HTTPS collectors.
log.err("HTTPS based collectors are currently not supported.")
url = self.collectorAddress + '/report'
content = '---\n'
content += safe_dump(self.testDetails)
content += '...\n'
request = {
'software_name': self.testDetails['software_name'],
'software_version': self.testDetails['software_version'],
'probe_asn': self.testDetails['probe_asn'],
'test_name': self.testDetails['test_name'],
'test_version': self.testDetails['test_version'],
'input_hashes': self.testDetails['input_hashes'],
# XXX there is a bunch of redundancy in the arguments getting sent
# to the backend. This may need to get changed in the client and
# the backend.
'content': content
}
log.msg("Reporting %s" % url)
request_json = json.dumps(request)
log.debug("Sending %s" % request_json)
bodyProducer = StringProducer(json.dumps(request))
log.msg("Creating report with OONIB Reporter. Please be patient.")
log.msg("This may take up to 1-2 minutes...")
try:
response = yield self.agent.request("POST", url,
bodyProducer=bodyProducer)
except ConnectionRefusedError:
log.err("Connection to reporting backend failed "
"(ConnectionRefusedError)")
raise errors.OONIBReportCreationError
except errors.HostUnreachable:
log.err("Host is not reachable (HostUnreachable error")
raise errors.OONIBReportCreationError
except Exception, e:
log.err("Failed to connect to reporter backend")
log.exception(e)
raise errors.OONIBReportCreationError
# This is a little trix to allow us to unspool the response. We create
# a deferred and call yield on it.
response_body = defer.Deferred()
response.deliverBody(BodyReceiver(response_body))
backend_response = yield response_body
try:
parsed_response = json.loads(backend_response)
except Exception, e:
log.err("Failed to parse collector response %s" % backend_response)
log.exception(e)
raise errors.OONIBReportCreationError
if response.code == 406:
# XXX make this more strict
log.err("The specified input or nettests cannot be submitted to "
"this collector.")
log.msg("Try running a different test or try reporting to a "
"different collector.")
raise errors.OONIBReportCreationError
self.reportID = parsed_response['report_id']
self.backendVersion = parsed_response['backend_version']
log.debug("Created report with id %s" % parsed_response['report_id'])
defer.returnValue(parsed_response['report_id'])
def finish(self):
url = self.collectorAddress + '/report/' + self.reportID + '/close'
log.debug("Closing the report %s" % url)
return self.agent.request("POST", str(url))
class OONIBReportLog(object):
"""
Used to keep track of report creation on a collector backend.
"""
def __init__(self, file_name=None):
if file_name is None:
file_name = config.report_log_file
self.file_name = file_name
self.create_report_log()
def get_report_log(self):
with open(self.file_name) as f:
report_log = yaml.safe_load(f)
if not report_log:
report_log = {} # consumers expect dictionary structure
return report_log
@property
def reports_incomplete(self):
reports = []
report_log = self.get_report_log()
for report_file, value in report_log.items():
if value['status'] in ('created'):
try:
os.kill(value['pid'], 0)
except:
reports.append((report_file, value))
elif value['status'] in ('incomplete'):
reports.append((report_file, value))
return reports
@property
def reports_in_progress(self):
reports = []
report_log = self.get_report_log()
for report_file, value in report_log.items():
if value['status'] in ('created'):
try:
os.kill(value['pid'], 0)
reports.append((report_file, value))
except:
pass
return reports
@property
def reports_to_upload(self):
reports = []
report_log = self.get_report_log()
for report_file, value in report_log.items():
if value['status'] in ('creation-failed', 'not-created'):
reports.append((report_file, value))
return reports
def run(self, f, *arg, **kw):
lock = defer.DeferredFilesystemLock(self.file_name + '.lock')
d = lock.deferUntilLocked()
def unlockAndReturn(r):
lock.unlock()
return r
def execute(_):
d = defer.maybeDeferred(f, *arg, **kw)
d.addBoth(unlockAndReturn)
return d
d.addCallback(execute)
return d
def create_report_log(self):
if not os.path.exists(self.file_name):
with open(self.file_name, 'w+') as f:
f.write(yaml.safe_dump({}))
@contextmanager
def edit_log(self):
with open(self.file_name) as rfp:
report = yaml.safe_load(rfp)
# This should never happen.
if report is None:
report = {}
with open(self.file_name, 'w+') as wfp:
try:
yield report
finally:
wfp.write(yaml.safe_dump(report))
def _not_created(self, report_file):
with self.edit_log() as report:
report[report_file] = {
'pid': os.getpid(),
'created_at': datetime.now(),
'status': 'not-created',
'collector': None
}
def not_created(self, report_file):
return self.run(self._not_created, report_file)
def _created(self, report_file, collector_address, report_id):
with self.edit_log() as report:
report[report_file] = {
'pid': os.getpid(),
'created_at': datetime.now(),
'status': 'created',
'collector': collector_address,
'report_id': report_id
}
def created(self, report_file, collector_address, report_id):
return self.run(self._created, report_file,
collector_address, report_id)
def _creation_failed(self, report_file, collector_address):
with self.edit_log() as report:
report[report_file] = {
'pid': os.getpid(),
'created_at': datetime.now(),
'status': 'creation-failed',
'collector': collector_address
}
def creation_failed(self, report_file, collector_address):
return self.run(self._creation_failed, report_file,
collector_address)
def _incomplete(self, report_file):
with self.edit_log() as report:
if report[report_file]['status'] != "created":
raise errors.ReportNotCreated()
report[report_file]['status'] = 'incomplete'
def incomplete(self, report_file):
return self.run(self._incomplete, report_file)
def _closed(self, report_file):
with self.edit_log() as report:
if report[report_file]['status'] != "created":
raise errors.ReportNotCreated()
del report[report_file]
def closed(self, report_file):
return self.run(self._closed, report_file)
class Report(object):
def __init__(self, test_details, report_filename,
reportEntryManager, collector_address=None):
"""
This is an abstraction layer on top of all the configured reporters.
It allows to lazily write to the reporters that are to be used.
Args:
test_details:
A dictionary containing the test details.
report_filename:
The file path for the report to be written.
reportEntryManager:
an instance of :class:ooni.tasks.ReportEntryManager
collector:
The address of the oonib collector for this report.
"""
self.test_details = test_details
self.collector_address = collector_address
self.report_log = OONIBReportLog()
self.yaml_reporter = YAMLReporter(test_details, report_filename=report_filename)
self.report_filename = self.yaml_reporter.report_path
self.oonib_reporter = None
if collector_address:
self.oonib_reporter = OONIBReporter(test_details,
collector_address)
self.done = defer.Deferred()
self.reportEntryManager = reportEntryManager
def open_oonib_reporter(self):
def creation_failed(failure):
self.oonib_reporter = None
return self.report_log.creation_failed(self.report_filename,
self.collector_address)
def created(report_id):
if not self.oonib_reporter:
return
return self.report_log.created(self.report_filename,
self.collector_address,
report_id)
d = self.oonib_reporter.createReport()
d.addErrback(creation_failed)
d.addCallback(created)
return d
def open(self):
"""
This will create all the reports that need to be created and fires the
created callback of the reporter whose report got created.
"""
d = defer.Deferred()
deferreds = []
def yaml_report_failed(failure):
d.errback(failure)
def all_reports_openned(result):
if not d.called:
d.callback(None)
if self.oonib_reporter:
deferreds.append(self.open_oonib_reporter())
else:
deferreds.append(self.report_log.not_created(self.report_filename))
yaml_report_created = \
defer.maybeDeferred(self.yaml_reporter.createReport)
yaml_report_created.addErrback(yaml_report_failed)
dl = defer.DeferredList(deferreds)
dl.addCallback(all_reports_openned)
return d
def write(self, measurement):
"""
Will return a deferred that will fire once the report for the specified
measurement have been written to all the reporters.
Args:
measurement:
an instance of :class:ooni.tasks.Measurement
Returns:
a deferred that will fire once all the report entries have
been written or errbacks when no more reporters
"""
d = defer.Deferred()
deferreds = []
def yaml_report_failed(failure):
d.errback(failure)
def oonib_report_failed(failure):
return self.report_log.incomplete(self.report_filename)
def all_reports_written(_):
if not d.called:
d.callback(None)
write_yaml_report = ReportEntry(self.yaml_reporter, measurement)
self.reportEntryManager.schedule(write_yaml_report)
write_yaml_report.done.addErrback(yaml_report_failed)
deferreds.append(write_yaml_report.done)
if self.oonib_reporter:
write_oonib_report = ReportEntry(self.oonib_reporter, measurement)
self.reportEntryManager.schedule(write_oonib_report)
write_oonib_report.done.addErrback(oonib_report_failed)
deferreds.append(write_oonib_report.done)
dl = defer.DeferredList(deferreds)
dl.addCallback(all_reports_written)
return d
def close(self):
"""
Close the report by calling it's finish method.
Returns:
a :class:twisted.internet.defer.DeferredList that will fire when
all the reports have been closed.
"""
d = defer.Deferred()
deferreds = []
def yaml_report_failed(failure):
d.errback(failure)
def oonib_report_closed(result):
return self.report_log.closed(self.report_filename)
def oonib_report_failed(result):
log.err("Failed to close oonib report.")
def all_reports_closed(_):
if not d.called:
d.callback(None)
close_yaml = defer.maybeDeferred(self.yaml_reporter.finish)
close_yaml.addErrback(yaml_report_failed)
deferreds.append(close_yaml)
if self.oonib_reporter:
close_oonib = self.oonib_reporter.finish()
close_oonib.addCallback(oonib_report_closed)
close_oonib.addErrback(oonib_report_failed)
deferreds.append(close_oonib)
dl = defer.DeferredList(deferreds)
dl.addCallback(all_reports_closed)
return d
ooniprobe-1.3.2/setup.cfg 0000644 0001750 0001750 00000000073 12623630152 013470 0 ustar irl irl [egg_info]
tag_build =
tag_date = 0
tag_svn_revision = 0
ooniprobe-1.3.2/data/ 0000755 0001750 0001750 00000000000 12623630152 012560 5 ustar irl irl ooniprobe-1.3.2/data/decks/ 0000755 0001750 0001750 00000000000 12623630152 013651 5 ustar irl irl ooniprobe-1.3.2/data/decks/complete_no_root.deck 0000644 0001750 0001750 00000003011 12447563404 020054 0 ustar irl irl - options:
collector: null
help: 0
logfile: null
no-default-reporter: 0
parallelism: null
pcapfile: null
reportfile: null
resume: 0
subargs: [-f, 'httpo://ihiderha53f36lsd.onion/input/37e60e13536f6afe47a830bfb6b371b5cf65da66d7ad65137344679b24fdccd1']
test_file: blocking/http_requests
testdeck: null
- options:
collector: null
help: 0
logfile: null
no-default-reporter: 0
parallelism: null
pcapfile: null
reportfile: null
resume: 0
subargs: [-f, 'httpo://ihiderha53f36lsd.onion/input/37e60e13536f6afe47a830bfb6b371b5cf65da66d7ad65137344679b24fdccd1']
test_file: blocking/dns_consistency
testdeck: null
- options:
collector: null
help: 0
logfile: null
no-default-reporter: 0
parallelism: null
pcapfile: null
reportfile: null
resume: 0
subargs: []
test_file: manipulation/http_invalid_request_line
testdeck: null
- options:
collector: null
help: 0
logfile: null
no-default-reporter: 0
parallelism: null
pcapfile: null
reportfile: null
resume: 0
subargs: []
test_file: manipulation/http_header_field_manipulation
testdeck: null
- options:
collector: null
help: 0
logfile: null
no-default-reporter: 0
parallelism: null
pcapfile: null
reportfile: null
resume: 0
subargs: [-f, 'httpo://ihiderha53f36lsd.onion/input/37e60e13536f6afe47a830bfb6b371b5cf65da66d7ad65137344679b24fdccd1']
test_file: manipulation/http_host
testdeck: null
ooniprobe-1.3.2/data/decks/complete.deck 0000644 0001750 0001750 00000003367 12447563404 016333 0 ustar irl irl - options:
collector: null
help: 0
logfile: null
no-default-reporter: 0
parallelism: null
pcapfile: null
reportfile: null
resume: 0
subargs: [-f, 'httpo://ihiderha53f36lsd.onion/input/37e60e13536f6afe47a830bfb6b371b5cf65da66d7ad65137344679b24fdccd1']
test_file: blocking/http_requests
testdeck: null
- options:
collector: null
help: 0
logfile: null
no-default-reporter: 0
parallelism: null
pcapfile: null
reportfile: null
resume: 0
subargs: [-f, 'httpo://ihiderha53f36lsd.onion/input/37e60e13536f6afe47a830bfb6b371b5cf65da66d7ad65137344679b24fdccd1']
test_file: blocking/dns_consistency
testdeck: null
- options:
collector: null
help: 0
logfile: null
no-default-reporter: 0
parallelism: null
pcapfile: null
reportfile: null
resume: 0
subargs: []
test_file: manipulation/http_invalid_request_line
testdeck: null
- options:
collector: null
help: 0
logfile: null
no-default-reporter: 0
parallelism: null
pcapfile: null
reportfile: null
resume: 0
subargs: []
test_file: manipulation/http_header_field_manipulation
testdeck: null
- options:
collector: null
help: 0
logfile: null
no-default-reporter: 0
parallelism: null
pcapfile: null
reportfile: null
resume: 0
subargs: []
test_file: manipulation/traceroute
testdeck: null
- options:
collector: null
help: 0
logfile: null
no-default-reporter: 0
parallelism: null
pcapfile: null
reportfile: null
resume: 0
subargs: [-f, 'httpo://ihiderha53f36lsd.onion/input/37e60e13536f6afe47a830bfb6b371b5cf65da66d7ad65137344679b24fdccd1']
test_file: manipulation/http_host
testdeck: null
ooniprobe-1.3.2/data/decks/fast_no_root.deck 0000644 0001750 0001750 00000000777 12447563404 017221 0 ustar irl irl - options:
collector: null
help: 0
logfile: null
no-default-reporter: 0
parallelism: null
pcapfile: null
reportfile: null
resume: 0
subargs: []
test_file: manipulation/http_invalid_request_line
testdeck: null
- options:
collector: null
help: 0
logfile: null
no-default-reporter: 0
parallelism: null
pcapfile: null
reportfile: null
resume: 0
subargs: []
test_file: manipulation/http_header_field_manipulation
testdeck: null
ooniprobe-1.3.2/data/decks/mlab.deck 0000644 0001750 0001750 00000000402 12447563404 015421 0 ustar irl irl - options:
collector: null
help: 0
logfile: null
no-default-reporter: 0
parallelism: null
pcapfile: null
reportfile: null
resume: 0
subargs: []
test_file: manipulation/http_header_field_manipulation
testdeck: null
ooniprobe-1.3.2/data/decks/fast.deck 0000644 0001750 0001750 00000001355 12447563404 015453 0 ustar irl irl - options:
collector: null
help: 0
logfile: null
no-default-reporter: 0
parallelism: null
pcapfile: null
reportfile: null
resume: 0
subargs: []
test_file: manipulation/traceroute
testdeck: null
- options:
collector: null
help: 0
logfile: null
no-default-reporter: 0
parallelism: null
pcapfile: null
reportfile: null
resume: 0
subargs: []
test_file: manipulation/http_invalid_request_line
testdeck: null
- options:
collector: null
help: 0
logfile: null
no-default-reporter: 0
parallelism: null
pcapfile: null
reportfile: null
resume: 0
subargs: []
test_file: manipulation/http_header_field_manipulation
testdeck: null
ooniprobe-1.3.2/data/oonireport.1 0000644 0001750 0001750 00000004704 12447563404 015060 0 ustar irl irl .\" Man page generated from reStructuredText.
.
.TH "oonireport" "1" "June 26, 2014" "1.0.2" "oonireport"
.SH NAME
oonireport - tool for managing ooniprobe reports.
.
.nr rst2man-indent-level 0
.
.de1 rstReportMargin
\\$1 \\n[an-margin]
level \\n[rst2man-indent-level]
level margin: \\n[rst2man-indent\\n[rst2man-indent-level]]
-
\\n[rst2man-indent0]
\\n[rst2man-indent1]
\\n[rst2man-indent2]
..
.de1 INDENT
.\" .rstReportMargin pre:
. RS \\$1
. nr rst2man-indent\\n[rst2man-indent-level] \\n[an-margin]
. nr rst2man-indent-level +1
.\" .rstReportMargin post:
..
.de UNINDENT
. RE
.\" indent \\n[an-margin]
.\" old: \\n[rst2man-indent\\n[rst2man-indent-level]]
.nr rst2man-indent-level -1
.\" new: \\n[rst2man-indent\\n[rst2man-indent-level]]
.in \\n[rst2man-indent\\n[rst2man-indent-level]]u
..
.SH SYNOPSIS
.B oonireport
.RB [ --version ]
.RB [ --help ]
.RB [ \-c
.IR collector ]
.I "status | upload"
.I [ report file ]
.SH DESCRIPTION
.sp
oonireport is a tool for viewing which ooniprobe reports have not been
submitted to a collector and upload them.
.SH OPTIONS
.TP
.BR \-\^h " or " \-\-help
Display this help and exit.
.TP
.BR \-\^c " or " \-\-collector
Specify the address of the collector to use for uploading reports.
.TP
.BR upload
If no argument is specified all the reports that have not been submitted will
be submitted to either the collector specified as an option (\-c) or to the
collector that was previously used.
If a report file is specified then only that report file will be uploaded
.TP
.BR status
Outputs all of the reports that are either "not uploaded", "incomplete" or
"pending".
The possible statuses are:
.BR not-created
.sp
If it has not been created, because the user specified to not use a
collector.
.BR creation-failed
.sp
If we attempted to create the report, but failed to do so either because
the collector does not accept our report or because it was unreachable at the
time.
.BR created
.sp
If the report has been successfully created, but has not yet been closed.
.BR incomplete
.sp
If the report has been created, but we have failed to update the report
with some data.
.SH OONIREPORT
.sp
This tool is used to upload reports that are stored on the users filesystem and
have not been submitted to a collector. This can be either because the
collector was unreachable when we attempted to submit the report or because the policy
of the collector did not support the specified test.
.sp
.SH AUTHOR
The Tor Project
.SH COPYRIGHT
2014, The Tor Project
.
ooniprobe-1.3.2/data/ooniresources.1 0000644 0001750 0001750 00000003645 12447563404 015562 0 ustar irl irl .\" Man page generated from reStructuredText.
.
.TH "ooniresources" "1" "October 1, 2014" "1.1.4" "ooniresources"
.SH NAME
ooniresources - tool for generating ooniprobe test decks
.
.nr rst2man-indent-level 0
.
.de1 rstReportMargin
\\$1 \\n[an-margin]
level \\n[rst2man-indent-level]
level margin: \\n[rst2man-indent\\n[rst2man-indent-level]]
-
\\n[rst2man-indent0]
\\n[rst2man-indent1]
\\n[rst2man-indent2]
..
.de1 INDENT
.\" .rstReportMargin pre:
. RS \\$1
. nr rst2man-indent\\n[rst2man-indent-level] \\n[an-margin]
. nr rst2man-indent-level +1
.\" .rstReportMargin post:
..
.de UNINDENT
. RE
.\" indent \\n[an-margin]
.\" old: \\n[rst2man-indent\\n[rst2man-indent-level]]
.nr rst2man-indent-level -1
.\" new: \\n[rst2man-indent\\n[rst2man-indent-level]]
.in \\n[rst2man-indent\\n[rst2man-indent-level]]u
..
.SH SYNOPSIS
.B ooniresources
.RB [ --version ]
.RB [ --help ]
.RB [ --update-inputs ]
.RB [ --update-geoip ]
.SH DESCRIPTION
.sp
ooniresources will update the resources that are needed for the functioning of
ooniprobe and related tools.
.SH OPTIONS
.TP
.BR \-\^h " or " \-\-help
Display this help and exit.
.TP
.BR \-\-update\-inputs
This will update the inputs that are used for generating test decks with
oonideckgen.
.TP
.BR \-\-update\-geoip
This will update the maxmind geoip databases that are used for detecting the
probes country and ASN.
.TP
.SH OONIRESOURCES
.sp
This tools will update the resources needed for running ooniprobe and
oonideckgen.
You don't need to run it very often, but it's just sufficient to run it once
every month of so as the data usually does not update or change that often.
If you are seeing some errors related to the geoip subsystem not having some
Maxmind databases, you probably want to run:
ooniresources --update-geoip.
If on the other hand you are interested in using oonideckgen, you should run:
ooniresources --update-inputs.
.sp
.SH AUTHOR
The Tor Project
.SH COPYRIGHT
2014, The Tor Project
.
ooniprobe-1.3.2/data/inputs/ 0000755 0001750 0001750 00000000000 12623630152 014102 5 ustar irl irl ooniprobe-1.3.2/data/inputs/README 0000644 0001750 0001750 00000002227 12502236531 014764 0 ustar irl irl In here you will find some very simple input lists that are useful for testing
the correct functionality of the various OONIProbe tests.
# DNS Consistency
./ooniprobe-dev -o dns_tamper_test.yamloo data/nettests/blocking/dns_consistency.py -t
example_inputs/dns_tamper_test_resolvers.txt -f example_inputs/dns_tamper_file.txt
less dns_tamper_test.yamloo
# Captive Portal
./ooniprobe-dev -o captive_portal_test.yamloo data/nettests/core/captiveportal.py
less captive_portal_test.yamloo
# HTTP Host
./ooniprobe-dev -o http_host.yamloo data/nettests/manipulation/http_host.py -b http://ooni.nu/test -f example_inputs/http_host_file.txt
less http_host.yamloo
# Keyword filtering
./ooniprobe-dev -o keyword_filtering.yamloo data/nettests/core/keyword_filtering.py -b http://ooni.nu/test/ -f test_inputs/keyword_filtering_file.txt
less keyword_filtering.yamloo
# URL List
./ooniprobe-dev -o url_lists.yamloo data/nettests/core/url_list.py -f test_inputs/url_lists_file.txt
less url_lists.yamloo
# Squid transparent proxy
./ooniprobe-dev -o squid.yamloo data/nettests/core/squid.py
less squid.yamloo
# HTTP Requests
Not Implemented
# Traceroute
Not Implemented
ooniprobe-1.3.2/data/inputs/Makefile 0000644 0001750 0001750 00000000625 12352774073 015557 0 ustar irl irl # This file will fetch a set of inputs that are of use to ooni-probe
all: whatheaders lists
lists:
curl -O https://ooni.torproject.org/inputs/input-pack.tar.gz
tar xzf input-pack.tar.gz
rm input-pack.tar.gz
whatheaders:
wget http://s3.amazonaws.com/data.whatheaders.com/whatheaders-latest.xml.zip
unzip whatheaders-latest.xml.zip
mv whatheaders*.xml whatheaders.xml
rm whatheaders-latest.xml.zip
ooniprobe-1.3.2/data/ooniprobe.1 0000644 0001750 0001750 00000076276 12623613431 014661 0 ustar irl irl .\" Man page generated from reStructuredText.
.
.TH "ooniprobe" "1" "April 29, 2014" "1.0.2" "ooniprobe"
.SH NAME
ooniprobe - a network censorship measurement tool.
.
.nr rst2man-indent-level 0
.
.de1 rstReportMargin
\\$1 \\n[an-margin]
level \\n[rst2man-indent-level]
level margin: \\n[rst2man-indent\\n[rst2man-indent-level]]
-
\\n[rst2man-indent0]
\\n[rst2man-indent1]
\\n[rst2man-indent2]
..
.de1 INDENT
.\" .rstReportMargin pre:
. RS \\$1
. nr rst2man-indent\\n[rst2man-indent-level] \\n[an-margin]
. nr rst2man-indent-level +1
.\" .rstReportMargin post:
..
.de UNINDENT
. RE
.\" indent \\n[an-margin]
.\" old: \\n[rst2man-indent\\n[rst2man-indent-level]]
.nr rst2man-indent-level -1
.\" new: \\n[rst2man-indent\\n[rst2man-indent-level]]
.in \\n[rst2man-indent\\n[rst2man-indent-level]]u
..
.SH SYNOPSIS
.B ooniprobe
.RB [ \-hnsp ]
.RB [ --version ]
.RB [ --spew ]
.RB [ \-o
.IR reportfile ]
.RB [ \-i
.IR testdeck ]
.RB [ \-c
.IR collector ]
.RB [ \-b
.IR bouncer ]
.RB [ \-l
.IR logfile ]
.RB [ \-O
.IR pcapfile ]
.RB [ \-f
.IR configfile ]
.RB [ \-d
.IR datadir ]
.I "test_name"
.SH DESCRIPTION
.sp
ooniprobe is tool for performing internet censorship measurements. Our goal is
to achieve a common data format and set of methodologies for conducting
censorship related research.
.SH OPTIONS
.TP
.BR \-\^h " or " \-\-help
Display this help and exit.
.TP
.BR \-\^n " or " \-\-no\-collector
Disable reporting the net test results to an oonib collector.
.TP
.BR \-\^s " or " \-\-list
List all of the available net tests.
.TP
.BR \-\^p " or " \-\-printdeck
Print to standard output the specified command line options as an ooniprobe test deck.
.TP
.BR \-\^o " or " \-\-reportfile
Specify the path to report file to write.
.TP
.BR \-\^i " or " \-\-testdeck
Specify as input a test deck: a yaml file containing the tests to run and their
arguments.
.TP
.BR \-\^c " or " \-\-collector
Specify the address of the collector of net test results. It is advisable to
always specify and bouncer and let it return a collector for the test or test
deck you are running.
.TP
.BR \-\^b " or " \-\-bouncer
Address of the bouncer that will inform the probe of which collector to use and
the addresses of test helpers. default: httpo://nkvphnp3p6agi5qq.onion
.TP
.BR \-\^l " or " \-\-logfile
Path to the log file to write
.TP
.BR \-\^O " or " \-\-pcapfile
Prefix to the pcap file name.
.TP
.BR \-\^f " or " \-\-configfile
Specify a path to the ooniprobe configuration file.
.TP
.BR \-\^d " or " \-\-datadir
Specify a path to the ooniprobe data directory
.TP
.BR \-\-spew
Print an insanely verbose log of everything that happens.
Useful when debugging freezes or locks in complex code.
.TP
.BR \-\-version
Display the ooniprobe version and exit.
.SH OONIPROBE
.sp
Is the tool that volunteers and researches interested in contributing data to
the project should be running.
.sp
ooniprobe allows the user to select what test should be run and what backend
should be used for storing the test report and/or assisting them in the running
of the test.
.sp
ooniprobe tests are divided into two categories: \fBTraffic Manipulation\fP and
\fBContent Blocking\fP\&.
.sp
\fBTraffic Manipulation\fP tests aim to detect the presence of some sort of
tampering with the internet traffic between the probe and a remote test helper
backend. As such they usually require the selection of a oonib backend
component for running the test.
.sp
\fBContent Blocking\fP are aimed at enumerating the kind of content that is
blocked from the probes network point of view. As such they usually require to
have specified an input list for running the test.
.SS Threat Model
.sp
Our adversary is capable of doing country wide network surveillance and
manipulation of network traffic.
.sp
The goals of our adversary are:
.INDENT 0.0
.INDENT 3.5
.INDENT 0.0
.IP \(bu 2
Restrict access to certain content, while not degrading overall quality of
the network
.IP \(bu 2
Monitor the network in a way that they are able to identify misuse of it in
real time
.UNINDENT
.UNINDENT
.UNINDENT
.sp
More specifc to the running of network filtering detection tests:
.INDENT 0.0
.IP 1. 3
Detect actors performing censorship detection tests
.IP 2. 3
Fool people running such tests into believing that the network is
unrestricted
.UNINDENT
.sp
\fINote\fP that while 2) => 1) it is not true that 1) => 2) as the identification of
such actors does not necessarily have to happen in real time.
While our intention is to minimize the risk of users running OONI probe to be
identified, this comes with a tradeoff in accuracy. It is therefore necessary in
certain tests to trade\-off fingerprintability in favour of tests accuracy.
.sp
This is why we divide tests based on what risk the user running it can face,
allowing the user to freely choose what threat model they wish to adere to.
.SS Installation
.sp
\fBRead this before running ooniprobe!\fP
.sp
Running ooniprobe is a potentially risky activity. This greatly depends on the
jurisdiction in which you are in and which test you are running. It is
technically possible for a person observing your internet connection to be
aware of the fact that you are running ooniprobe. This means that if running
network measurement tests is something considered to be illegal in your country
then you could be spotted.
.sp
Furthermore, ooniprobe takes no precautions to protect the install target machine
from forensics analysis. If the fact that you have installed or used ooni
probe is a liability for you, please be aware of this risk.
.SS Debian based systems
.sp
\fIsudo sh \-c \(aqecho "deb http://deb.ooni.nu/ooni wheezy main" >> /etc/apt/sources.list\(aq\fP
.sp
\fIgpg \-\-keyserver pgp.mit.edu \-\-recv\-key 0x49B8CDF4\fP
.sp
\fIgpg \-\-export 89AB86D4788F3785FE9EDA31F9E2D9B049B8CDF4 | sudo apt\-key add \-\fP
.sp
\fIsudo apt\-get update && sudo apt\-get install ooniprobe\fP
.SS Linux
.sp
We believe that ooniprobe runs reasonably well on Debian GNU/Linux wheezy as
well as versions of Ubuntu such as natty and later releases. Running ooniprobe
without installing it is supported with the following commands:
.sp
\fIgit clone https://git.torproject.org/ooni\-probe.git\fP
.sp
\fIcd ooni\-probe\fP
.sp
\fI\&./setup\-dependencies.sh\fP
.sp
\fIpython setup.py install\fP
.SS Setting up development environment
.sp
On debian based systems this can be done with:
.sp
\fIVsudo apt\-get install libgeoip\-dev python\-virtualenv virtualenvwrapper\fP
.sp
\fImkvirtualenv ooniprobe\fP
.sp
\fIpython setup.py install\fP
.sp
\fIpip install \-r requirements\-dev.txt\fP
.SS Other platforms (with Vagrant)
.sp
\fI\%Install Vagrant\fP
and \fI\%Install Virtualbox\fP
.sp
\fBOn OSX:\fP
.sp
If you don\(aqt have it install \fI\%homebrew\fP
.sp
\fIbrew install git\fP
.sp
\fBOn debian/ubuntu:\fP
.sp
\fIsudo apt\-get install git\fP
.INDENT 0.0
.IP 1. 3
Open a Terminal and run:
.UNINDENT
.sp
\fIgit clone https://git.torproject.org/ooni\-probe.git\fP
.sp
\fIcd ooni\-probe/\fP
.sp
\fIvagrant up\fP
.INDENT 0.0
.IP 2. 3
Login to the box with:
.UNINDENT
.sp
\fIvagrant ssh\fP
.sp
ooniprobe will be installed in \fI/ooni\fP\&.
.INDENT 0.0
.IP 3. 3
You can run tests with:
.UNINDENT
.sp
\fIooniprobe blocking/http_requests \-f /ooni/inputs/input\-pack/alexa\-top\-1k.txt\fP
.SS Using ooniprobe
.sp
\fBNet test\fP is a set of measurements to assess what kind of internet censorship is occurring.
.sp
\fBDecks\fP are collections of ooniprobe nettests with some associated inputs.
.sp
\fBCollector\fP is a service used to report the results of measurements.
.sp
\fBTest helper\fP is a service used by a probe for successfully performing its measurements.
.sp
\fBBouncer\fP is a service used to discover the addresses of test helpers and collectors.
.SS Configuring ooniprobe
.sp
You may edit the configuration for ooniprobe by editing the configuration file
found inside of \fI~/.ooni/ooniprobe.conf\fP\&.
.sp
By default ooniprobe will not include personal identifying information in the
test result, nor create a pcap file. This behavior can be personalized.
.SS Running decks
.sp
You will find all the installed decks inside of \fI/usr/share/ooni/decks\fP\&.
.sp
You may then run a deck by using the command line option \fI\-i\fP:
.sp
As root:
.sp
\fIooniprobe \-i /usr/share/ooni/decks/mlab.deck\fP
.sp
Or as a user:
.sp
\fIooniprobe \-i /usr/share/ooni/decks/mlab_no_root.deck\fP
.sp
Or:
.sp
As root:
.sp
\fIooniprobe \-i /usr/share/ooni/decks/complete.deck\fP
.sp
Or as a user:
.sp
\fIooniprobe \-i /usr/share/ooni/decks/complete_no_root.deck\fP
.sp
The above tests will require around 20\-30 minutes to complete depending on your network speed.
.sp
If you would prefer to run some faster tests you should run:
As root:
.sp
\fIooniprobe \-i /usr/share/ooni/decks/fast.deck\fP
.sp
Or as a user:
.sp
\fIooniprobe \-i /usr/share/ooni/decks/fast_no_root.deck\fP
.SS Running net tests
.sp
You may list all the installed stable net tests with:
.sp
\fIooniprobe \-s\fP
.sp
You may then run a nettest by specifying its name for example:
.sp
\fIooniprobe manipulation/http_header_field_manipulation\fP
.sp
It is also possible to specify inputs to tests as URLs:
.sp
\fIooniprobe blocking/http_requests \-f httpo://ihiderha53f36lsd.onion/input/37e60e13536f6afe47a830bfb6b371b5cf65da66d7ad65137344679b24fdccd1\fP
.sp
You can find the result of the test in your current working directory.
.sp
By default the report result will be collected by the default ooni collector
and the addresses of test helpers will be obtained from the default bouncer.
.sp
You may also specify your own collector or bouncer with the options \fI\-c\fP and
\fI\-b\fP\&.
.SS (Optional) Install obfsproxy
.sp
Install the latest version of obfsproxy for your platform.
.sp
\fI\%Download Obfsproxy\fP
.SS Bridges and obfsproxy bridges
.sp
ooniprobe submits reports to oonib report collectors through Tor to a hidden
service endpoint. By default, ooniprobe uses the installed system Tor, but can
also be configured to launch Tor (see the advanced.start_tor option in
ooniprobe.conf), and ooniprobe supports bridges (and obfsproxy bridges, if
obfsproxy is installed). The tor.bridges option in ooniprobe.conf sets the path
to a file that should contain a set of "bridge" lines (of the same format as
used in torrc, and as returned by \fI\%https://bridges.torproject.org\fP). If obfsproxy
bridges are to be used, the path to the obfsproxy binary must be configured.
See option advanced.obfsproxy_binary, in ooniprobe.conf.
.SS Setting capabilities on your virtualenv python binary
.sp
If your distributation supports capabilities you can avoid needing to run OONI as root:
.sp
\fIsetcap cap_net_admin,cap_net_raw+eip /path/to/your/virtualenv\(aqs/python\fP
.SS Core ooniprobe Tests
.sp
The source for \fI\%Content blocking tests\fP
and \fI\%Traffic Manipulation tests\fP
can be found in the nettests/blocking and nettests/manipulation directories
respectively.
.SS Content Blocking Tests
.INDENT 0.0
.INDENT 3.5
.INDENT 0.0
.IP \(bu 2
\fI\%DNSConsistency\fP
.IP \(bu 2
\fI\%HTTP Requests\fP
.IP \(bu 2
\fI\%TCP Connect\fP
.UNINDENT
.UNINDENT
.UNINDENT
.SS Traffic Manipulation Tests
.INDENT 0.0
.INDENT 3.5
.INDENT 0.0
.IP \(bu 2
\fI\%HTTP Invalid Request Line:\fP
.IP \(bu 2
\fI\%DNS Spoof\fP
.IP \(bu 2
\fI\%HTTP Header Field Manipulation\fP
.IP \(bu 2
\fI\%Traceroute\fP
.IP \(bu 2
\fI\%HTTP Host\fP
.UNINDENT
.UNINDENT
.UNINDENT
.SS Other tests
.sp
We also have some other tests that are currently not fully supported or still
being experimented with.
.sp
You can find these in:
.INDENT 0.0
.INDENT 3.5
.INDENT 0.0
.IP \(bu 2
\fI\%ooni/nettests/experimental\fP
.UNINDENT
.UNINDENT
.UNINDENT
.sp
Tests that don\(aqt do a measurement but are useful for scanning can be found in:
.INDENT 0.0
.INDENT 3.5
.INDENT 0.0
.IP \(bu 2
\fI\%ooni/nettests/scanning\fP
.UNINDENT
.UNINDENT
.UNINDENT
.sp
Tests that involve running third party tools may be found in:
.INDENT 0.0
.INDENT 3.5
.INDENT 0.0
.IP \(bu 2
\fI\%ooni/nettests/third_party\fP
.UNINDENT
.UNINDENT
.UNINDENT
.SS Reports
.sp
The reports collected by ooniprobe are stored on
\fI\%https://ooni.torproject.org/reports/0.1/\fP \fBCC\fP /
.sp
Where \fBCC\fP is the two letter country code as specified by \fI\%ISO 31666\-2\fP\&.
.sp
For example the reports for Italy (\fBCC\fP is \fBit\fP) of the may be found in:
.sp
\fI\%https://ooni.torproject.org/reports/0.1/IT/\fP
.sp
This directory shall contain the various reports for the test using the
following convention:
.sp
\fBtestName\fP \- \fBdateInISO8601Format\fP \- \fBprobeASNumber\fP .yamloo
.sp
The date is expressed using \fI\%ISO 8601\fP
including seconds and with no \fB:\fP to delimit hours, minutes, days.
.sp
Like so:
.sp
\fBYEAR\fP \- \fBMONTH\fP \- \fBDAY\fP T \fBHOURS\fP \fBMINUTES\fP \fBSECONDS\fP Z
.sp
Look \fI\%here for the up to date list of ISO 8601 country codes\fP
.sp
The time is \fBalways\fP expressed in UTC.
.sp
If a collision is detected then an int (starting with 1) will get appended to
the test.
.sp
For example if two report that are created on the first of January 2012 at Noon
(UTC time) sharp from MIT (AS3) will be stored here:
.INDENT 0.0
.INDENT 3.5
.sp
.nf
.ft C
https://ooni.torproject.org/reports/0.1/US/2012\-01\-01T120000Z_AS3.yamloo
https://ooni.torproject.org/reports/0.1/US/2012\-01\-01T120000Z_AS3.1.yamloo
.ft P
.fi
.UNINDENT
.UNINDENT
.sp
Note: it is highly unlikely that reports get created with the same exact
timestamp from the same exact ASN. If this does happen it could be index of
some malicious report poisoning attack in progress.
.SS Report format version changelog
.sp
In here shall go details about the major changes to the reporting format.
.SS version 0.1
.sp
Initial format version.
.SS Writing OONI tests
.sp
The OONI testing API is heavily influenced and partially based on the python
\fBunittest\fP module and \fBtwisted.trial\fP\&.
.SS Test Cases
.sp
The atom of OONI Testing is called a Test Case. A test case class may contain
multiple Test Methods.
.INDENT 0.0
.TP
.B class ooni.nettest.NetTestCase
This is the base of the OONI nettest universe. When you write a nettest
you will subclass this object.
.INDENT 7.0
.IP \(bu 2
inputs: can be set to a static set of inputs. All the tests (the methods
starting with the "test" prefix) will be run once per input. At every run
the _input_ attribute of the TestCase instance will be set to the value of
the current iteration over inputs. Any python iterable object can be set
to inputs.
.IP \(bu 2
inputFile: attribute should be set to an array containing the command line
argument that should be used as the input file. Such array looks like
this:
.INDENT 2.0
.INDENT 3.5
\fB["commandlinearg", "c", "default value" "The description"]\fP
.UNINDENT
.UNINDENT
.sp
The second value of such arrray is the shorthand for the command line arg.
The user will then be able to specify inputs to the test via:
.INDENT 2.0
.INDENT 3.5
\fBooniprobe mytest.py \-\-commandlinearg path/to/file.txt\fP
.UNINDENT
.UNINDENT
.sp
or
.INDENT 2.0
.INDENT 3.5
\fBooniprobe mytest.py \-c path/to/file.txt\fP
.UNINDENT
.UNINDENT
.IP \(bu 2
inputProcessor: should be set to a function that takes as argument a
filename and it will return the input to be passed to the test
instance.
.IP \(bu 2
name: should be set to the name of the test.
.IP \(bu 2
author: should contain the name and contact details for the test author.
The format for such string is as follows:
.INDENT 2.0
.INDENT 3.5
\fBThe Name \fP
.UNINDENT
.UNINDENT
.IP \(bu 2
version: is the version string of the test.
.IP \(bu 2
requiresRoot: set to True if the test must be run as root.
.IP \(bu 2
usageOptions: a subclass of twisted.python.usage.Options for processing of command line arguments
.IP \(bu 2
localOptions: contains the parsed command line arguments.
.UNINDENT
.sp
Quirks:
Every class that is prefixed with test \fImust\fP return a twisted.internet.defer.Deferred.
.UNINDENT
.sp
If the test you plan to write is not listed on the \fI\%Tor OONI trac page\fP, you should
add it to the list and then add a description about it following the \fI\%Test
Template\fP
.sp
Tests are driven by inputs. For every input a new test instance is created,
internally the _setUp method is called that is defined inside of test
templates, then the setUp method that is overwritable by users.
.sp
Gotchas:
\fBnever\fP call reactor.start of reactor.stop inside of your test method and all
will be good.
.SS Inputs
.sp
Inputs are what is given as input to every iteration of the Test Case.
Iflyou have 100 inputs, then every test case will be run 100 times.
.sp
To configure a static set of inputs you should define the
\fBooni.nettest.NetTestCase\fP attribute \fBinputs\fP\&. The test will be
run \fBlen(inputs)\fP times. Any iterable object is a valid \fBinputs\fP
attribute.
.sp
If you would like to have inputs be determined from a user specified input
file, then you must set the \fBinputFile\fP attribute. This is an array that
specifies what command line option may be used to control this value.
.sp
By default the \fBinputProcessor\fP is set to read the file line by line and
strip newline characters. To change this behavior you must set the
\fBinputProcessor\fP attribute to a function that takes as argument a file
descriptor and yield the next item. The default \fBinputProcessor\fP looks like
this:
.INDENT 0.0
.INDENT 3.5
.sp
.nf
.ft C
def lineByLine(filename):
fp = open(filename)
for x in fp.xreadlines():
yield x.strip()
fp.close()
.ft P
.fi
.UNINDENT
.UNINDENT
.SS Setup and command line passing
.sp
Tests may define the \fIsetUp\fP method that will be called every time the
Test Case object is instantiated, in here you may place some common logic
to all your Test Methods that should be run before any testing occurs.
.sp
Command line arguments can be parsed thanks to the twisted
\fBtwisted.python.usage.UsageOptions\fP class.
.sp
You will have to subclass this and define the NetTestCase attribute
usageOptions to point to a subclass of this.
.INDENT 0.0
.INDENT 3.5
.sp
.nf
.ft C
class UsageOptions(usage.Options):
optParameters = [[\(aqbackend\(aq, \(aqb\(aq, \(aqhttp://127.0.0.1:57001\(aq,
\(aqURL of the test backend to use\(aq]
]
class MyTestCase(nettest.NetTestCase):
usageOptions = UsageOptions
inputFile = [\(aqfile\(aq, \(aqf\(aq, None, "Some foo file"]
requiredOptions = [\(aqbackend\(aq]
def test_my_test(self):
self.localOptions[\(aqbackend\(aq]
.ft P
.fi
.UNINDENT
.UNINDENT
.sp
You will then be able to access the parsed command line arguments via the
class attribute localOptions.
.sp
The \fIrequiredOptions\fP attributes specifies an array of parameters that are
required for the test to run properly.
.sp
\fIinputFile\fP is a special class attribute that will be used for processing
of the inputFile. The filename that is read here will be given to the
\fBooni.nettest.NetTestCase.inputProcessor\fP method that will yield, by
default, one line of the file at a time.
.SS Test Methods
.sp
These shall be defined inside of your \fBooni.nettest.NetTestCase\fP
subclass. These will be class methods.
.sp
All class methods that are prefixed with test_ shall be run. Functions
that are relevant to your test should be all lowercase separated by
underscore.
.sp
To add data to the test report you may write directly to the report object
like so:
.INDENT 0.0
.INDENT 3.5
.sp
.nf
.ft C
def test_my_function():
result = do_something()
self.report[\(aqsomething\(aq] = result
.ft P
.fi
.UNINDENT
.UNINDENT
.sp
OONI will then handle the writing of the data to the final test report.
.sp
To access the current input you can use the \fBinput\fP attribute, for example:
.INDENT 0.0
.INDENT 3.5
.sp
.nf
.ft C
def test_with_input():
do_something_with_input(self.input)
.ft P
.fi
.UNINDENT
.UNINDENT
.sp
This will at each iteration over the list of inputs do something with the
input.
.SS Test Templates
.sp
Test templates assist you in writing tests. They already contain all the
common functionality that is useful to running a test of that type. They
also take care of writing the data they collect that is relevant to the
test run to the report file.
.sp
Currently implemented test templates are \fBooni.templates.scapyt\fP for
tests based on Scapy, \fBooni.templates.tcpt\fP for tests based on TCP,
\fBooni.templates.httpt\fP for tests based on HTTP, and
\fBooni.templates.dnst\fP for tests based on DNS.
.SS Scapy based tests
.sp
Scapy based tests will be a subclass of \fBooni.templates.scapyt.BaseScapyTest\fP\&.
.sp
It provides a wrapper around the scapy send and receive function that will
write the sent and received packets to the report with sanitization of the
src and destination IP addresses.
.sp
It has the same syntax as the Scapy sr function, except that it will
return a deferred.
.sp
To implement a simple ICMP ping based on this function you can do like so
(Taken from \fBnettest.examples.example_scapyt.ExampleICMPPingScapy\fP)
.INDENT 0.0
.INDENT 3.5
.sp
.nf
.ft C
from twisted.python import usage
from scapy.all import IP, ICMP
from ooni.templates import scapyt
class UsageOptions(usage.Options):
optParameters = [[\(aqtarget\(aq, \(aqt\(aq, \(aq127.0.0.1\(aq, "Specify the target to ping"]]
class ExampleICMPPingScapy(scapyt.BaseScapyTest):
name = "Example ICMP Ping Test"
usageOptions = UsageOptions
def test_icmp_ping(self):
def finished(packets):
print packets
answered, unanswered = packets
for snd, rcv in answered:
rcv.show()
packets = IP(dst=self.localOptions[\(aqtarget\(aq])/ICMP()
d = self.sr(packets)
d.addCallback(finished)
return d
.ft P
.fi
.UNINDENT
.UNINDENT
.sp
The arguments taken by self.sr() are exactly the same as the scapy send and
receive function, the only difference is that instead of using the regular
scapy super socket it uses our twisted driven wrapper around it.
.sp
Alternatively this test can also be written using the
\fBtwisted.defer.inlineCallbacks()\fP decorator, that makes it look more similar to
regular sequential code.
.INDENT 0.0
.INDENT 3.5
.sp
.nf
.ft C
from twisted.python import usage
from twisted.internet import defer
from scapy.all import IP, ICMP
from ooni.templates import scapyt
class UsageOptions(usage.Options):
optParameters = [[\(aqtarget\(aq, \(aqt\(aq, \(aq127.0.0.1\(aq, "Specify the target to ping"]]
class ExampleICMPPingScapyYield(scapyt.BaseScapyTest):
name = "Example ICMP Ping Test"
usageOptions = UsageOptions
@defer.inlineCallbacks
def test_icmp_ping(self):
packets = IP(dst=self.localOptions[\(aqtarget\(aq])/ICMP()
answered, unanswered = yield self.sr(packets)
for snd, rcv in answered:
rcv.show()
.ft P
.fi
.UNINDENT
.UNINDENT
.SS Report Format
.INDENT 0.0
.INDENT 3.5
.sp
.nf
.ft C
###########################################
# OONI Probe Report for Example ICMP Ping Test test
# Thu Nov 22 18:20:43 2012
###########################################
\-\-\-
{probe_asn: null, probe_cc: null, probe_ip: 127.0.0.1, software_name: ooniprobe, software_version: 0.0.7.1\-alpha,
start_time: 1353601243.0, test_name: Example ICMP Ping Test, test_version: 0.1}
\&...
\-\-\-
input: null
report:
answer_flags: [ipsrc]
answered_packets:
\- \- raw_packet: !!binary |
RQAAHAEdAAAuAbjKCAgICH8AAAEAAAAAAAAAAA==
summary: IP / ICMP 8.8.8.8 > 127.0.0.1 echo\-reply 0
sent_packets:
\- \- raw_packet: !!binary |
RQAAHAABAABAAevPfwAAAQgICAgIAPf/AAAAAA==
summary: IP / ICMP 127.0.0.1 > 8.8.8.8 echo\-request 0
test_name: test_icmp_ping
test_started: 1353604843.553605
\&...
.ft P
.fi
.UNINDENT
.UNINDENT
.SS TCP based tests
.sp
TCP based tests will subclass \fBooni.templates.tcpt.TCPTest\fP\&.
.sp
This test template facilitates the sending of TCP payloads to the wire and
recording the response.
.INDENT 0.0
.INDENT 3.5
.sp
.nf
.ft C
from twisted.internet.error import ConnectionRefusedError
from ooni.utils import log
from ooni.templates import tcpt
class ExampleTCPT(tcpt.TCPTest):
def test_hello_world(self):
def got_response(response):
print "Got this data %s" % response
def connection_failed(failure):
failure.trap(ConnectionRefusedError)
print "Connection Refused"
self.address = "127.0.0.1"
self.port = 57002
payload = "Hello World!\en\er"
d = self.sendPayload(payload)
d.addErrback(connection_failed)
d.addCallback(got_response)
return d
.ft P
.fi
.UNINDENT
.UNINDENT
.sp
The possible failures for a TCP connection are:
.sp
\fBtwisted.internet.error.NoRouteError\fP that corresponds to errno.ENETUNREACH
.sp
\fBtwisted.internet.error.ConnectionRefusedError\fP that corresponds to
errno.ECONNREFUSED
.sp
\fBtwisted.internet.error.TCPTimedOutError\fP that corresponds to errno.ETIMEDOUT
.SS Report format
.sp
The basic report of a TCP test looks like the following (this is an report
generated by running the above example against a TCP echo server).
.INDENT 0.0
.INDENT 3.5
.sp
.nf
.ft C
###########################################
# OONI Probe Report for Base TCP Test test
# Thu Nov 22 18:18:28 2012
###########################################
\-\-\-
{probe_asn: null, probe_cc: null, probe_ip: 127.0.0.1, software_name: ooniprobe, software_version: 0.0.7.1\-alpha,
start_time: 1353601108.0, test_name: Base TCP Test, test_version: \(aq0.1\(aq}
\&...
\-\-\-
input: null
report:
errors: []
received: ["Hello World!\en\er"]
sent: ["Hello World!\en\er"]
test_name: test_hello_world
test_started: 1353604708.705081
\&...
.ft P
.fi
.UNINDENT
.UNINDENT
.sp
TODO finish this with more details
.SS HTTP based tests
.sp
HTTP based tests will be a subclass of \fBooni.templates.httpt.HTTPTest\fP\&.
.sp
It provides methods \fBooni.templates.httpt.HTTPTest.processResponseBody()\fP and
\fBooni.templates.httpt.HTTPTest.processResponseHeaders()\fP for interacting with the
response body and headers respectively.
.sp
For example, to implement a HTTP test that returns the sha256 hash of the
response body (based on \fBnettests.examples.example_httpt\fP):
.INDENT 0.0
.INDENT 3.5
.sp
.nf
.ft C
from ooni.utils import log
from ooni.templates import httpt
from hashlib import sha256
class SHA256HTTPBodyTest(httpt.HTTPTest):
name = "ChecksumHTTPBodyTest"
author = "Aaron Gibson"
version = 0.1
inputFile = [\(aqurl file\(aq, \(aqf\(aq, None,
\(aqList of URLS to perform GET requests to\(aq]
requiredOptions = [\(aqurl file\(aq]
def test_http(self):
if self.input:
url = self.input
return self.doRequest(url)
else:
raise Exception("No input specified")
def processResponseBody(self, body):
body_sha256sum = sha256(body).hexdigest()
self.report[\(aqchecksum\(aq] = body_sha256sum
.ft P
.fi
.UNINDENT
.UNINDENT
.SS Report format
.INDENT 0.0
.INDENT 3.5
.sp
.nf
.ft C
###########################################
# OONI Probe Report for ChecksumHTTPBodyTest test
# Thu Dec 6 17:31:57 2012
###########################################
\-\-\-
options:
collector: null
help: 0
logfile: null
pcapfile: null
reportfile: null
resume: 0
subargs: [\-f, hosts]
test: nettests/examples/example_http_checksum.py
probe_asn: null
probe_cc: null
probe_ip: 127.0.0.1
software_name: ooniprobe
software_version: 0.0.7.1\-alpha
start_time: 1354786317.0
test_name: ChecksumHTTPBodyTest
test_version: 0.1
\&...
\-\-\-
input: http://www.google.com
report:
agent: agent
checksum: d630fa2efd547d3656e349e96ff7af5496889dad959e8e29212af1ff843e7aa1
requests:
\- request:
body: null
headers:
\- \- User\-Agent
\- \- [Opera/9.00 (Windows NT 5.1; U; en), \(aqOpera 9.0, Windows XP\(aq]
method: GET
url: http://www.google.com
response:
body: \(aq\(aq
code: 200
headers:
\- \- X\-XSS\-Protection
\- [1; mode=block]
\- \- Set\-Cookie
\- [\(aqPREF=ID=fada4216eb3684f9:FF=0:TM=1354800717:LM=1354800717:S=IT\-2GCkNAocyXlVa;
expires=Sat, 06\-Dec\-2014 13:31:57 GMT; path=/; domain=.google.com\(aq, \(aqNID=66=KWaLbNQumuGuYf0HrWlGm54u9l\-DKJwhFCMQXfhQPZM\-qniRhmF6QRGXUKXb_8CIUuCOHnyoC5oAX5jWNrsfk\-LLJLW530UiMp6hemTtDMh_e6GSiEB4GR3yOP_E0TCN;
expires=Fri, 07\-Jun\-2013 13:31:57 GMT; path=/; domain=.google.com; HttpOnly\(aq]
\- \- Expires
\- [\(aq\-1\(aq]
\- \- Server
\- [gws]
\- \- Connection
\- [close]
\- \- Cache\-Control
\- [\(aqprivate, max\-age=0\(aq]
\- \- Date
\- [\(aqThu, 06 Dec 2012 13:31:57 GMT\(aq]
\- \- P3P
\- [\(aqCP="This is not a P3P policy! See http://www.google.com/support/accounts/bin/answer.py?hl=en&answer=151657
for more info."\(aq]
\- \- Content\-Type
\- [text/html; charset=UTF\-8]
\- \- X\-Frame\-Options
\- [SAMEORIGIN]
socksproxy: null
test_name: test_http
test_runtime: 0.08298492431640625
test_started: 1354800717.478403
\&...
.ft P
.fi
.UNINDENT
.UNINDENT
.SS DNS based tests
.sp
DNS based tests will be a subclass of \fBooni.templates.dnst.DNSTest\fP\&.
.sp
It provides methods \fBooni.templates.dnst.DNSTest.performPTRLookup()\fP
and \fBooni.templates.dnst.DNSTest.performALookup()\fP
.sp
For example (taken from \fBnettests.examples.example_dnst\fP):
.INDENT 0.0
.INDENT 3.5
.sp
.nf
.ft C
from ooni.templates.dnst import DNSTest
class ExampleDNSTest(DNSTest):
def test_a_lookup(self):
def gotResult(result):
# Result is an array containing all the A record lookup results
print result
d = self.performALookup(\(aqtorproject.org\(aq, (\(aq8.8.8.8\(aq, 53))
d.addCallback(gotResult)
return d
.ft P
.fi
.UNINDENT
.UNINDENT
.SS Report format
.INDENT 0.0
.INDENT 3.5
.sp
.nf
.ft C
###########################################
# OONI Probe Report for Base DNS Test test
# Thu Dec 6 17:42:51 2012
###########################################
\-\-\-
options:
collector: null
help: 0
logfile: null
pcapfile: null
reportfile: null
resume: 0
subargs: []
test: nettests/examples/example_dnst.py
probe_asn: null
probe_cc: null
probe_ip: 127.0.0.1
software_name: ooniprobe
software_version: 0.0.7.1\-alpha
start_time: 1354786971.0
test_name: Base DNS Test
test_version: 0.1
\&...
\-\-\-
input: null
report:
queries:
\- addrs: [82.195.75.101, 86.59.30.40, 38.229.72.14, 38.229.72.16]
answers:
\- [, ]
\- [, ]
\- [, ]
\- [, ]
query: \(aq[Query(\(aq\(aqtorproject.org\(aq\(aq, 1, 1)]\(aq
query_type: A
resolver: [8.8.8.8, 53]
test_name: test_a_lookup
test_runtime: 0.028924942016601562
test_started: 1354801371.980114
\&...
.ft P
.fi
.UNINDENT
.UNINDENT
.sp
For a more complex example, see: \fBnettests.blocking.dnsconsistency\fP
.SH GLOSSARY
.sp
Here we will summarize some of the jargon that is unique to OONI.
.sp
\fBTest Case\fP: a set of measurements performed on a to be tested network that
are logically grouped together
.sp
\fBReport\fP: is the output of a test run containing all the information that is
require for a researcher to assess what is the output of a test.
.sp
\fBYamlooni\fP: The format we use for Reports, that is based on YAML.
.sp
\fBInput\fP: What is given as input to a TestCase to perform a measurement.
.SH AUTHOR
The Tor Project
.SH COPYRIGHT
2014, The Tor Project
.
ooniprobe-1.3.2/data/oonideckgen.1 0000644 0001750 0001750 00000003762 12447563404 015150 0 ustar irl irl .\" Man page generated from reStructuredText.
.
.TH "oonideckgen" "1" "October 1, 2014" "1.1.4" "oonideckgen"
.SH NAME
oonideckgen - tool for generating ooniprobe test decks
.
.nr rst2man-indent-level 0
.
.de1 rstReportMargin
\\$1 \\n[an-margin]
level \\n[rst2man-indent-level]
level margin: \\n[rst2man-indent\\n[rst2man-indent-level]]
-
\\n[rst2man-indent0]
\\n[rst2man-indent1]
\\n[rst2man-indent2]
..
.de1 INDENT
.\" .rstReportMargin pre:
. RS \\$1
. nr rst2man-indent\\n[rst2man-indent-level] \\n[an-margin]
. nr rst2man-indent-level +1
.\" .rstReportMargin post:
..
.de UNINDENT
. RE
.\" indent \\n[an-margin]
.\" old: \\n[rst2man-indent\\n[rst2man-indent-level]]
.nr rst2man-indent-level -1
.\" new: \\n[rst2man-indent\\n[rst2man-indent-level]]
.in \\n[rst2man-indent\\n[rst2man-indent-level]]u
..
.SH SYNOPSIS
.B oonideckgen
.RB [ --version ]
.RB [ --help ]
.RB [ \-c
.IR country-code]
.RB [ \-o
.IR output]
.SH DESCRIPTION
.sp
oonideckgen will generate a ooniprobe test deck for the specified country. This
will then allow the user to conduct measurements useful to assess internet
censorship in that country.
.SH OPTIONS
.TP
.BR \-\^h " or " \-\-help
Display this help and exit.
.TP
.BR \-\^c " or " \-\-country-code
Specify the country code for which a test deck should be geneated
.TP
.BR \-\^o " or " \-\-output
Specify in which directory the deck should be written.
.TP
.SH OONIDECKGEN
.sp
This tools will generate a test deck that include the following tests:
* blocking/http_requests, with a set of relevant sites for the country to be analysed
* blocking/dns_consistency, with all the DNS resolver of the country to be analysed and a set of relevant sites
* manipulation/http_invalid_request_line
* manipulation/http_header_field_manipulation
If deckgen is run with no arugment a geoip lookup is done on the IP of the user
and that country is chosen.
In order to run deckgen you must have first have run ooniresources --update-inputs.
.sp
.SH AUTHOR
The Tor Project
.SH COPYRIGHT
2014, The Tor Project
.
ooniprobe-1.3.2/data/ooniprobe.conf.sample 0000644 0001750 0001750 00000004616 12531110611 016701 0 ustar irl irl # This is the configuration file for OONIProbe
# This file follows the YAML markup format: http://yaml.org/spec/1.2/spec.html
# Keep in mind that indentation matters.
basic:
# Where OONIProbe should be writing it's log file
logfile: /var/log/ooniprobe.log
privacy:
# Should we include the IP address of the probe in the report?
includeip: false
# Should we include the ASN of the probe in the report?
includeasn: true
# Should we include the country as reported by GeoIP in the report?
includecountry: true
# Should we include the city as reported by GeoIP in the report?
includecity: false
# Should we collect a full packet capture on the client?
includepcap: false
reports:
# Should we place a unique ID inside of every report
unique_id: true
# This is a prefix for each packet capture file (.pcap) per test:
pcap: null
collector: null
advanced:
geoip_data_dir: /usr/share/GeoIP
debug: false
# enable if auto detection fails
#tor_binary: /usr/sbin/tor
#obfsproxy_binary: /usr/bin/obfsproxy
# For auto detection
interface: auto
# Of specify a specific interface
#interface: wlan0
# If you do not specify start_tor, you will have to have Tor running and
# explicitly set the control port and SOCKS port
start_tor: true
# After how many seconds we should give up on a particular measurement
measurement_timeout: 60
# After how many retries we should give up on a measurement
measurement_retries: 2
# How many measurements to perform concurrently
measurement_concurrency: 10
# After how may seconds we should give up reporting
reporting_timeout: 80
# After how many retries to give up on reporting
reporting_retries: 3
# How many reports to perform concurrently
reporting_concurrency: 15
oonid_api_port: 8042
report_log_file: null
inputs_dir: null
decks_dir: null
tor:
#socks_port: 8801
#control_port: 8802
# Specify the absolute path to the Tor bridges to use for testing
#bridges: bridges.list
# Specify path of the tor datadirectory.
# This should be set to something to avoid having Tor download each time
# the descriptors and consensus data.
#data_dir: ~/.tor/
torrc:
#HTTPProxy: host:port
#HTTPProxyAuthenticator: user:password
#HTTPSProxy: host:port
#HTTPSProxyAuthenticator: user:password
ooniprobe-1.3.2/ooniprobe.egg-info/ 0000755 0001750 0001750 00000000000 12623630152 015335 5 ustar irl irl ooniprobe-1.3.2/ooniprobe.egg-info/not-zip-safe 0000644 0001750 0001750 00000000001 12623627754 017601 0 ustar irl irl
ooniprobe-1.3.2/ooniprobe.egg-info/top_level.txt 0000644 0001750 0001750 00000000005 12623630151 020061 0 ustar irl irl ooni
ooniprobe-1.3.2/ooniprobe.egg-info/SOURCES.txt 0000644 0001750 0001750 00000007330 12623630152 017224 0 ustar irl irl ChangeLog.rst
LICENSE
MANIFEST.in
README.rst
requirements.txt
setup.cfg
setup.py
bin/oonideckgen
bin/ooniprobe
bin/oonireport
bin/ooniresources
data/oonideckgen.1
data/ooniprobe.1
data/ooniprobe.conf.sample
data/oonireport.1
data/ooniresources.1
data/decks/complete.deck
data/decks/complete_no_root.deck
data/decks/fast.deck
data/decks/fast_no_root.deck
data/decks/mlab.deck
data/inputs/Makefile
data/inputs/README
ooni/__init__.py
ooni/deck.py
ooni/director.py
ooni/errors.py
ooni/geoip.py
ooni/managers.py
ooni/nettest.py
ooni/ngdeck.py
ooni/oonibclient.py
ooni/oonicli.py
ooni/oonid.py
ooni/otime.py
ooni/reporter.py
ooni/settings.ini
ooni/settings.py
ooni/tasks.py
ooni/api/__init__.py
ooni/api/spec.py
ooni/deckgen/__init__.py
ooni/deckgen/cli.py
ooni/deckgen/processors/__init__.py
ooni/deckgen/processors/citizenlab_test_lists.py
ooni/deckgen/processors/namebench_dns_servers.py
ooni/kit/__init__.py
ooni/kit/daphn3.py
ooni/kit/domclass.py
ooni/nettests/__init__.py
ooni/nettests/blocking/__init__.py
ooni/nettests/blocking/bridge_reachability.py
ooni/nettests/blocking/dns_consistency.py
ooni/nettests/blocking/dns_n_http.py
ooni/nettests/blocking/http_requests.py
ooni/nettests/blocking/meek_fronted_requests.py
ooni/nettests/blocking/tcp_connect.py
ooni/nettests/blocking/top_ports.py
ooni/nettests/experimental/__init__.py
ooni/nettests/experimental/chinatrigger.py
ooni/nettests/experimental/dns_injection.py
ooni/nettests/experimental/domclass_collector.py
ooni/nettests/experimental/http_filtering_bypassing.py
ooni/nettests/experimental/http_keyword_filtering.py
ooni/nettests/experimental/http_trix.py
ooni/nettests/experimental/http_uk_mobile_networks.py
ooni/nettests/experimental/keyword_filtering.py
ooni/nettests/experimental/lizard_mafia_fuck.py
ooni/nettests/experimental/parasitictraceroute.py
ooni/nettests/experimental/script.py
ooni/nettests/experimental/squid.py
ooni/nettests/experimental/tls_handshake.py
ooni/nettests/experimental/tor_censorship.py
ooni/nettests/manipulation/__init__.py
ooni/nettests/manipulation/captiveportal.py
ooni/nettests/manipulation/daphne.py
ooni/nettests/manipulation/dns_spoof.py
ooni/nettests/manipulation/http_header_field_manipulation.py
ooni/nettests/manipulation/http_host.py
ooni/nettests/manipulation/http_invalid_request_line.py
ooni/nettests/manipulation/traceroute.py
ooni/nettests/scanning/__init__.py
ooni/nettests/scanning/http_url_list.py
ooni/nettests/third_party/__init__.py
ooni/nettests/third_party/lantern.py
ooni/nettests/third_party/netalyzr.py
ooni/nettests/third_party/psiphon.py
ooni/report/__init__.py
ooni/report/cli.py
ooni/report/parser.py
ooni/report/tool.py
ooni/resources/__init__.py
ooni/resources/cli.py
ooni/resources/update.py
ooni/templates/__init__.py
ooni/templates/dnst.py
ooni/templates/httpt.py
ooni/templates/process.py
ooni/templates/scapyt.py
ooni/templates/tcpt.py
ooni/tests/__init__.py
ooni/tests/bases.py
ooni/tests/disable_test_dns.py
ooni/tests/mocks.py
ooni/tests/test_deck.py
ooni/tests/test_director.py
ooni/tests/test_geoip.py
ooni/tests/test_managers.py
ooni/tests/test_mutate.py
ooni/tests/test_nettest.py
ooni/tests/test_ngdeck.py
ooni/tests/test_onion.py
ooni/tests/test_oonibclient.py
ooni/tests/test_oonicli.py
ooni/tests/test_otime.py
ooni/tests/test_reporter.py
ooni/tests/test_safe_represent.py
ooni/tests/test_settings.py
ooni/tests/test_templates.py
ooni/tests/test_trueheaders.py
ooni/tests/test_txscapy.py
ooni/tests/test_utils.py
ooni/utils/__init__.py
ooni/utils/hacks.py
ooni/utils/log.py
ooni/utils/net.py
ooni/utils/onion.py
ooni/utils/trueheaders.py
ooni/utils/txscapy.py
ooniprobe.egg-info/PKG-INFO
ooniprobe.egg-info/SOURCES.txt
ooniprobe.egg-info/dependency_links.txt
ooniprobe.egg-info/not-zip-safe
ooniprobe.egg-info/requires.txt
ooniprobe.egg-info/top_level.txt ooniprobe-1.3.2/ooniprobe.egg-info/requires.txt 0000644 0001750 0001750 00000000353 12623630151 017735 0 ustar irl irl PyYAML>=3.10
Twisted>=12.2.0
ipaddr>=2.1.10
pyOpenSSL>=0.13
geoip
txtorcon>=0.7
txsocksx>=0.0.2
parsley>=1.1
scapy-real>=2.2.0-dev
pypcap>=1.1
service-identity
pydumbnet
pyasn1
pyasn1-modules
characteristic
zope.interface
cryptography
ooniprobe-1.3.2/ooniprobe.egg-info/dependency_links.txt 0000644 0001750 0001750 00000000001 12623630151 021402 0 ustar irl irl
ooniprobe-1.3.2/ooniprobe.egg-info/PKG-INFO 0000644 0001750 0001750 00000011440 12623630151 016431 0 ustar irl irl Metadata-Version: 1.1
Name: ooniprobe
Version: 1.3.2
Summary: Network measurement tool foridentifying traffic manipulation and blocking.
Home-page: https://ooni.torproject.org/
Author: Open Observatory of Network Interference
Author-email: ooni-dev@lists.torproject.org
License: BSD 2 clause
Description:
ooniprobe: a network interference detection tool
================================================
.. image:: https://travis-ci.org/TheTorProject/ooni-probe.png?branch=master
:target: https://travis-ci.org/TheTorProject/ooni-probe
.. image:: https://coveralls.io/repos/TheTorProject/ooni-probe/badge.png
:target: https://coveralls.io/r/TheTorProject/ooni-probe
___________________________________________________________________________
.. image:: https://ooni.torproject.org/theme/img/ooni-logo.png
:target: https:://ooni.torproject.org/
OONI, the Open Observatory of Network Interference, is a global observation
network which aims is to collect high quality data using open methodologies,
using Free and Open Source Software (FL/OSS) to share observations and data
about the various types, methods, and amounts of network tampering in the
world.
Read this before running ooniprobe!
-----------------------------------
Running ooniprobe is a potentially risky activity. This greatly depends on the
jurisdiction in which you are in and which test you are running. It is
technically possible for a person observing your internet connection to be
aware of the fact that you are running ooniprobe. This means that if running
network measurement tests is something considered to be illegal in your country
then you could be spotted.
Furthermore, ooniprobe takes no precautions to protect the install target machine
from forensics analysis. If the fact that you have installed or used ooni
probe is a liability for you, please be aware of this risk.
Setup ooniprobe
-------------------
To install ooniprobe you will need the following dependencies:
* python
* python-dev
* python-setuptools
* build-essential
* libdumbnet1
* python-dumbnet
* python-libpcap
* tor
* libgeoip-dev
* libpcap0.8-dev
* libssl-dev
* libffi-dev
* libdumbnet-dev
When you got them run:
.. code:: bash
sudo pip install ooniprobe
Using ooniprobe
---------------
To generate a test deck for your country, cd to the directory where you want it
and run:
.. code:: bash
oonideckgen
To setup a daily cronjob run this:
.. code:: bash
(crontab -l 2>/dev/null; echo "@daily ooniprobe `oonideckgen | grep -e '^ooniprobe'`") | crontab -
Have fun!
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Environment :: Console
Classifier: Framework :: Twisted
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Education
Classifier: Intended Audience :: End Users/Desktop
Classifier: Intended Audience :: Information Technology
Classifier: Intended Audience :: Science/Research
Classifier: Intended Audience :: Telecommunications Industry
Classifier: License :: OSI Approved :: BSD LicenseProgramming Language :: Python
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2 :: Only
Classifier: Programming Language :: Python :: 2.6
Classifier: Programming Language :: Python :: 2.7
Classifier: Operating System :: MacOS :: MacOS X
Classifier: Operating System :: POSIX
Classifier: Operating System :: POSIX :: BSD
Classifier: Operating System :: POSIX :: BSD :: BSD/OS
Classifier: Operating System :: POSIX :: BSD :: FreeBSD
Classifier: Operating System :: POSIX :: BSD :: NetBSD
Classifier: Operating System :: POSIX :: BSD :: OpenBSD
Classifier: Operating System :: POSIX :: Linux
Classifier: Operating System :: Unix
Classifier: Topic :: Scientific/Engineering :: Information Analysis
Classifier: Topic :: Security
Classifier: Topic :: Security :: Cryptography
Classifier: Topic :: Software Development :: Libraries :: Application Frameworks
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Software Development :: Testing
Classifier: Topic :: Software Development :: Testing :: Traffic Generation
Classifier: Topic :: System :: Networking :: Monitoring
ooniprobe-1.3.2/MANIFEST.in 0000644 0001750 0001750 00000000441 12474100761 013406 0 ustar irl irl include README.rst ChangeLog.rst requirements.txt LICENSE
recursive-include data/decks *
recursive-include data/inputs *
include data/oonideckgen.1
include data/ooniprobe.1
include data/oonireport.1
include data/ooniresources.1
include data/ooniprobe.conf.sample
include ooni/settings.ini
ooniprobe-1.3.2/ChangeLog.rst 0000644 0001750 0001750 00000031545 12623614370 014244 0 ustar irl irl Changelog
=========
v1.3.2 (Fri, 20 Nov 2015)
-------------------------
* Implement third party test template
* Add tutorial for using TCP test
* Add tests for censorship resistance
* Add meek test
* Add lantern test
* Support for Twisted 15.0
* Various stability and bug fixes
v1.3.1 (Fri, 3 Apr 2015)
------------------------
* Fix bug with --help of oonireport
* Read the home directory from an environement variable
* Expose the inputs_dir and decks_dir from the config file
* Fix bug that leads to some incomplete reports not showing up with oonireport
v1.3.0 (Fri, 27 Mar 2015)
-------------------------
* Add obfs4 bridge reachability support
* Avoid hacking sys.path in bin/* scripts to support running ooniprobe from
non-root.
* Point to the new citizenlab test lists directory
* Add support for report_id inside of reports
* Add the list of test helper addresses to the report
* Handle also unhandled exceptions inside of ooni(deckgen|report|resources)
v1.2.3-rc1 (Wed, 4 Feb 2015)
----------------------------
* Restructure directories where ooni software writes/reads from
https://trac.torproject.org/projects/tor/ticket/14086
* Properly set exit codes of oonideckgen
* Exit cleanly if we can't find the probes IP address
* Make the DNS Consistency test handle errors better
v1.2.2 (Fri, 17 Oct 2014)
-------------------------
Who said friday 17th is only bad luck?
* Add two new report entry keys test_start_time and test_runtime
* Fix bug that lead to ooniresources not working properly
v1.2.0 (Wed, 1 Oct 2014)
-------------------------
* Introduce a new tool for generating ooniprobe test decks called oonideckgen.
* Introduce a new tool for updating resources used for geoip lookup and deck
generation.
* Add support for policy aware bouncing in the client.
https://trac.torproject.org/projects/tor/ticket/12579
* Various improvements to the bridge_reachability test (enable better tor
logging and also log obfsproxy)
* Fix backward compatibility with twisted 13.1 and add regression tests for
this.
https://trac.torproject.org/projects/tor/ticket/13139
v1.1.1 (Sun, 24 Aug 2014)
-------------------------
* Update MANIFEST.in to include the manpages for ooniprobe and oonireport.
* Raise a more specific exception when multiple test cases are in a single
nettest file and the usageOptions are incoherent.
v1.1.0 (Tue, 19 Aug 2014)
-------------------------
In this new release of ooniprobe we have added a new command line tool for
listing the reports that have not been published to a collector and that allows
the probe operator to choose which ones they would like to upload.
We have also made some privacy improvements to the reports (we will sanitize
all things that may look like file paths) and added metadata associated with
the maxmind database being used by the probe operator.
Here is a more detailed list of what has been done:
* Annotate on disk which reports we have submitted and which ones we have not:
https://trac.torproject.org/projects/tor/ticket/11860
* Add tool called oonireport for publishing unpublished ooniprobe reports to a
collector: https://trac.torproject.org/projects/tor/ticket/11862
* Probe Report does not leak filepaths anymore:
https://trac.torproject.org/projects/tor/ticket/12706
* Reports now include version information about the maxmind database being
used: https://trac.torproject.org/projects/tor/ticket/12771
* We now do integrity checks on the ooniprobe.conf file so that we don't start
the tool if the config file is missing some settings or is not consistent:
https://trac.torproject.org/projects/tor/ticket/11983
(thanks to Alejandro López (kudrom))
* Improvements have been made to the sniffer subsystem (thanks to Alejandro
López (kudrom))
* Fix the multi protocol traceroute test.
https://trac.torproject.org/projects/tor/ticket/12883
Minor bug fixes:
* Fix dns_spoof test (by kudrom)
https://trac.torproject.org/projects/tor/ticket/12486
* ooni might not look at requiresTor:
https://trac.torproject.org/projects/tor/ticket/11858
* ooni spits out gobs of tracebacks if Tor is not running and the OONI config
says it will be:
https://trac.torproject.org/projects/tor/ticket/11859
* The README for ooni-probe should mention the bugtracker and repository
https://trac.torproject.org/projects/tor/ticket/11980
v1.0.2 (Fri, 9 May 2014)
------------------------
* Add ooniprobe manpage.
* Fix various security issues raised by the least authority audit.
* Add a test that checks for Tor bridge reachability.
* Record the IP address of the exit node being used in torified requests.
* Captive portal test now uses the ooni-probe test templates.
* Have better test naming consistency.
v1.0.1 (Fri, 14 Mar 2014)
-------------------------
* Fix bugs in the traceroute test that lead to not all packets being collected.
* All values inside of http_requests test are now initialized inside of setUp.
* Fix a bug that lead to the input value of the report not being set in some
circumstances.
* Add bridge_reachability test
v1.0.0 (Thu, 20 Feb 2014)
-------------------------
* Add bouncer support for discovering test helpers and collectors
* Fix bug that lead to HTTP tests to stall
* Add support for connect_error and connection_lost_error error types
* Add support for additional Tor configuration keys
* Add disclaimer when running ooniprobe
v0.1.0 (Mon, 17 Jun 2013)
-------------------------
Improvements to HTML/JS based user interface:
* XSRF protection
* user supplied input specification
Bugfixing and improvements to scheduler.
v0.0.12 (Sat, 8 Jun 2013)
-------------------------
Implement JS/HTML based user interface.
Supports:
* Starting and stopping of tests
* Monitoring of test progress
v0.0.11 (Thu, 11 Apr 2013)
--------------------------
* Parametrize task timeout and retry count
* Set the default collector via the command line option
* Add option to disable the default collector
* Add continuous integration with travis
v0.0.10 (Wed, 26 Dec 2012)
--------------------------
ooniprobe:
* Fix bug that made HTTP based tests stall
* Update DNS Test example to not import the DNS Test template If you import the
DNS Test template it will be considered a valid test case and command line
argument parsing will not work as expected. see:
#7795 for more details
* Fix major bug in DNS test template that prevented PTR lookups from working
properly I was calling the queryUDP function with the arguments in the wrong
order. Twisted, why you API no consistent?
* Add support for specifying the level of parallelism in tests (aka router
melt mode)
* Do not swallow failures when a test instance fails to run fixes #7714
scripts:
* Add report archival script
Fix bug in TCP connect test that made it not properly log errors
* Refactor failure handling code in nettest Add function that traps all the
supported failure and outputs the failure string representing it.
documentation:
* Add birdseye view of the ooniprobe architecture
* Add details on the current implementation status of ooni*
* Add draft ooniprobe API specification
* Add instructions for supervisord configuration and clean up README.md
0.0.9 (Tue, 11 Dec 2012)
------------------------
ooniprobe:
* Set the default ASN to 0
* Make Beautiful soup a soft depedency
* Add support for sending the ASN number of the probe:
the ASN number will get sent when creating a new report
* Add support for obtaining the probes IP address via getinfo address as per
https://trac.torproject.org/projects/tor/ticket/7447
* Fix bug in ooniprobe test decks
https://trac.torproject.org/projects/tor/ticket/7664
oonib:
* Use twisted fdesc when writing to files
* Add support for processing the ASN number of the probe
* Test reports shall follow the specification detailed inside of docs/reports.rst
* Add support for setting the tor binary path in oonib/config.py
scripts:
* Add a very simple example on how to securely parse the ooniprobe reports
documentation:
* Add documentation for the DNSSpoof test
* Add documentation for HTTPHeaderFieldManipulation
* Clean up writing_tests.rst
* Properly use the power of sphinx!
Tests:
* fixup Netalyzr third party plugin
v0.0.8-alpha (Sun, 2 Dec 2012)
------------------------------
ooniprobe:
* Allow test resolver file to have comments.
* Autostart Tor in default configuration.
* Add support for starting Tor via txtorcon.
* Make the sniffer not run in a separate thread, but use a non blocking fdesc.
Do some refactoring of scapy testing, following Factory creational pattern
and a pub-sub pattern for the readers and writers.
* Extend TrueHeaders to support calculation of difference between two HTTP headers respectful of
capitalization
* Implement test deck system for automating the specification of command line
arguments for tests
* Implement sr1 in txscapy
* Include socksproxy address in HTTP based tests
* Include the resolver IP:Port in the report
* Changes to the report format of HTTP Test template derived tests:
Requests are now stored inside of an array to allow
the storing of multiple request/response pairs.
* Fix bug that lead to httpt based reports to not have the url attribute set
properly.
* twisted Headers() class edited to avoid header fix in reference to:
https://trac.torproject.org/projects/tor/ticket/7432
* Parametrize tor socksport for usage with modified HTTP Agent
* Update URL List test to take as input also a single URL
* Clean up filenames of reports generated by ooni-probe:
they now follow the format $testName_report_$timestamp.yamloo
* Add ooniprobe prefix to logs
* Respect the includeip = false option in ooniprobe.conf for scapyt derivate
tests:
If the option to not include the IP address of the probe is set,
change the source and destination ip address of the sent and received
packets to 127.0.0.1.
tests:
* Implement basic keyword filtering detection test.
* Add ICMP support to multi protocol traceroute test
* parametrize max_ttl and timeout
* make max_ttl and timeout be included in the report
* Port UK Mobile Network test to new API
* Port daphn3 test
* Randomize source port by default in traceroute test and include source port in
report
* Test and Implement HTTP Header Field Manipulation Test (rename it to what we
had originally called it since it made most sense)
* Implement test that detects DNS spoofing
* Implement TCP payload sending test template:
Example test based on this test template
* Make report IDs include the timestamp of the report
* Add test that detects censorship in HTTP pages based on HTTP body length
* Add socks proxy support to HTTP Test
* Create DNS Test template:
Use such template for DNS Tamper test.
Add example usage of DNS Test Template.
* Refactor captive portal test to run tests in threads
oonib:
* Implement basic collector for ooniprobe reports.
Reports can be submitted over the network via http to a remote collector.
Implement the backend component of the collector that writes submitted
reports to flat files, following the report_id naming convention.
* Implement very simple HTTP Request backend that does only the part of HTTP we
need for testing
* Make oonib a daemon
* Loosen up the oonib regexp to support the timestamp report format
* Add Tor Hidden Service support
* Make the reporting directory of the collector configurable
* Implement TCP Echo test helper.
scripts:
* Add fabfile for automatic deployment of ooni-probe to remote sites
documentation:
* Update documentation on how to setup ooniprobe.
v0.0.7.1-alpha (Sun, 11 Nov 2012)
---------------------------------
* Add software version to the report
* Implement basic oonib reporting to flat files containing the report ID.
* Improve HTTP Host test to work with the HTTP Requests test backend
v0.0.7-alpha (Sat, 10 Nov 2012)
-------------------------------
* Add test_name key to ooniprobe reports
* Port TCP connect test to the new API
v0.0.4-alpha (Sat, 10 Nov 2012)
-------------------------------
* Add multi protocol multi port traceroute for UDP and TCP
* Implement basic HTTP request test that does capitalization variations on the
HTTP method.
* Bugfixing and refactoring of txscapy for sending and receiving of scapy
packets.
v0.0.3-alpha (Fri, 9 Nov 2012)
------------------------------
* Implement logging to PCAP file support
* Remove dependency on trial
* Port china trigger to new API
* Rename keyword filtering test to HTTP keyword filtering
* Refactor install documentation.
* Convert header of ooniprobe script to a non docstring
* Add Makefile to fetch Maxmind geoip database files
* Implement GeoIP lookup support
* From configuration options it is possible to choice what level of privacy
the prober is willing to accept. Implement config file support You are able
to specify basic and advanced options in YAML format
* Remove raw inputs and move them to a separate repository and add Makefile to
fetch such lists
0.0.1-alpha (Tue, 6 Nov 2012)
-----------------------------
First release of ooni-probe. woot!
ooniprobe-1.3.2/README.rst 0000644 0001750 0001750 00000030122 12623613456 013344 0 ustar irl irl ooniprobe: a network interference detection tool
================================================
.. image:: https://travis-ci.org/TheTorProject/ooni-probe.png?branch=master
:target: https://travis-ci.org/TheTorProject/ooni-probe
.. image:: https://coveralls.io/repos/TheTorProject/ooni-probe/badge.png
:target: https://coveralls.io/r/TheTorProject/ooni-probe
___________________________________________________________________________
.. image:: https://ooni.torproject.org/images/ooni-header-mascot.svg
:target: https:://ooni.torproject.org/
OONI, the Open Observatory of Network Interference, is a global observation
network which aims is to collect high quality data using open methodologies,
using Free and Open Source Software (FL/OSS) to share observations and data
about the various types, methods, and amounts of network tampering in the
world.
"The Net interprets censorship as damage and routes around it."
- John Gilmore; TIME magazine (6 December 1993)
ooniprobe is the first program that users run to probe their network and to
collect data for the OONI project. Are you interested in testing your network
for signs of surveillance and censorship? Do you want to collect data to share
with others, so that you and others may better understand your network? If so,
please read this document and we hope ooniprobe will help you to gather
network data that will assist you with your endeavors!
Read this before running ooniprobe!
-----------------------------------
Running ooniprobe is a potentially risky activity. This greatly depends on the
jurisdiction in which you are in and which test you are running. It is
technically possible for a person observing your internet connection to be
aware of the fact that you are running ooniprobe. This means that if running
network measurement tests is something considered to be illegal in your country
then you could be spotted.
Furthermore, ooniprobe takes no precautions to protect the install target machine
from forensics analysis. If the fact that you have installed or used ooni
probe is a liability for you, please be aware of this risk.
OONI in 5 minutes
=================
On debian testing or unstable::
sudo apt-get install ooniprobe
If you are running debian stable you can get it from backports via::
sudo sh -c 'echo "deb http://http.debian.net/debian jessie-backports main" >> /etc/apt/sources.list'
sudo apt-get update && sudo apt-get install ooniprobe
On unix systems::
sudo pip install ooniprobe
To install it from the current master run::
sudo pip install https://github.com/TheTorProject/ooni-probe/archive/master.zip
Then run::
mkdir my_decks
sudo ooniresources --update-inputs --update-geoip
oonideckgen -o my_decks/
**BUG** Note:
ooniprobe version 1.2.2 when installed from the debian repository will not
properly create the ooni home folder and if you run into an error in accessing
`~/.ooni/` run::
ooniprobe -n blocking/http_requests -u http://google.com/
This should generate the home and allow you to run oonideckgen.
The output from the last command will tell you how to run ooniprobe to perform
the measurement.
If you would like to contribute measurements to OONI daily you can also add
this to your crontab::
@daily ooniprobe $THE_OONI_COMMAND
Run this command to automatically update your crontab::
(crontab -l 2>/dev/null; echo "@daily ooniprobe $THE_OONI_COMMAND") | crontab -
Installation
============
Debian based systems
--------------------
If you are running Debian testing or Debian unstable you can install ooniprobe
simply with::
apt-get install ooniprobe
If you are running Debian stable you can get it from backports via::
sudo sh -c 'echo "deb http://http.debian.net/debian wheezy-backports main" >> /etc/apt/sources.list'
sudo apt-get update && sudo apt-get install ooniprobe
If you are running Ubuntu 14.04 LTS, 14.10 or 15.04 you can install it from the PPA
(https://launchpad.net/~irl/+archive/ubuntu/ooni/)::
sudo add-apt-repository ppa:irl/ooni
sudo apt-get update && sudo apt-get install ooniprobe
You will be warned that the packages are unauthenticated. This is due to the
PPA not being signed and is normal behaviour. If you would prefer to verify the
integrity of the package, use our private Debian repository below.
Mac OS X
--------
You can install ooniprobe on OSX if you have installed homebrew (http://mxcl.github.io/homebrew) with::
brew install ooniprobe
Unix systems (with pip)
-----------------------
Make sure you have installed the following depedencies:
* build-essential
* python (>=2.7)
* python-dev
* pip
* libgeoip-dev
* libdumbnet-dev
* libpcap-dev
* libssl-dev
* libffi-dev
* tor (>=0.2.5.1 to run all the tor related tests)
Then you should be able to install ooniprobe by running::
pip install ooniprobe
Other platforms (with Vagrant)
------------------------------
1. Install Vagrant (https://www.vagrantup.com/downloads.html) and Install Virtualbox (https://www.virtualbox.org/wiki/Downloads)
2. On OSX:
If you don't have it install homebrew http://mxcl.github.io/homebrew/::
brew install git
On debian/ubuntu::
sudo apt-get install git
3. Open a Terminal and run::
git clone https://git.torproject.org/ooni-probe.git
cd ooni-probe/
vagrant up
4. Login to the box with::
vagrant ssh
ooniprobe will be installed in ``/ooni``.
5. You can run tests with::
ooniprobe blocking/http_requests -f /ooni/example_inputs/alexa-top-1k.txt
Using ooniprobe
===============
**Net test** is a set of measurements to assess what kind of internet censorship is occurring.
**Decks** are collections of ooniprobe nettests with some associated inputs.
**Collector** is a service used to report the results of measurements.
**Test helper** is a service used by a probe for successfully performing its measurements.
**Bouncer** is a service used to discover the addresses of test helpers and collectors.
Configuring ooniprobe
---------------------
You may edit the configuration for ooniprobe by editing the configuration file
found inside of ``~/.ooni/ooniprobe.conf``.
By default ooniprobe will not include personal identifying information in the
test result, nor create a pcap file. This behavior can be personalized.
Updating resources
------------------
To generate decks you will have to update the input resources of ooniprobe.
This can be done with::
ooniresources --update-inputs
If you get a permission error, you may have to run the command as root or
change the ooniprobe data directory inside of `ooniprobe.conf`.
On some platforms, for example debian contrib, you will not get all the geoip
related files needed. In that case it is possible to manually download them
with ``ooniresources``::
ooniresources --update-geoip
Generating decks
----------------
You can generate decks for your country thanks to the oonideckgen command.
If you wish, for example, to generate a deck to be run in the country of Italy,
you can do so (be sure to have updated the input resources first) by running::
oonideckgen --country-code IT --output ~/
You will now have in your home a folder called `deck-it`, containing the ooni
deck (ends with .deck) and the inputs.
Note: that you should not move the `deck-*` directory once it has been
generated as the paths to the inputs referenced by the test in the deck are
absolute. If you want your deck to live in another directory you must
regenerated it.
Running decks
-------------
You will find all the installed decks inside of ``/usr/share/ooni/decks``.
You may then run a deck by using the command line option ``-i``:
As root::
ooniprobe -i /usr/share/ooni/decks/mlab.deck
Or as a user::
ooniprobe -i /usr/share/ooni/decks/mlab_no_root.deck
Or:
As root::
ooniprobe -i /usr/share/ooni/decks/complete.deck
Or as a user::
ooniprobe -i /usr/share/ooni/decks/complete_no_root.deck
The above tests will require around 20-30 minutes to complete depending on your network speed.
If you would prefer to run some faster tests you should run:
As root::
ooniprobe -i /usr/share/ooni/decks/fast.deck
Or as a user::
ooniprobe -i /usr/share/ooni/decks/fast_no_root.deck
Running net tests
-----------------
You may list all the installed stable net tests with::
ooniprobe -s
You may then run a nettest by specifying its name for example::
ooniprobe manipulation/http_header_field_manipulation
It is also possible to specify inputs to tests as URLs::
ooniprobe blocking/http_requests -f httpo://ihiderha53f36lsd.onion/input/37e60e13536f6afe47a830bfb6b371b5cf65da66d7ad65137344679b24fdccd1
You can find the result of the test in your current working directory.
By default the report result will be collected by the default ooni collector
and the addresses of test helpers will be obtained from the default bouncer.
You may also specify your own collector or bouncer with the options ``-c`` and
``-b``.
Bridges and obfsproxy bridges
=============================
ooniprobe submits reports to oonib report collectors through Tor to a hidden
service endpoint. By default, ooniprobe uses the installed system Tor, but can
also be configured to launch Tor (see the advanced.start_tor option in
ooniprobe.conf), and ooniprobe supports bridges (and obfsproxy bridges, if
obfsproxy is installed). The tor.bridges option in ooniprobe.conf sets the path
to a file that should contain a set of "bridge" lines (of the same format as
used in torrc, and as returned by https://bridges.torproject.org). If obfsproxy
bridges are to be used, the path to the obfsproxy binary must be configured.
See option advanced.obfsproxy_binary, in ooniprobe.conf.
(Optional) Install obfsproxy
----------------------------
Install the latest version of obfsproxy for your platform.
Download Obfsproxy: https://www.torproject.org/projects/obfsproxy.html.en
Setting capabilities on your virtualenv python binary
=====================================================
If your distributation supports capabilities you can avoid needing to run OONI as root::
setcap cap_net_admin,cap_net_raw+eip /path/to/your/virtualenv's/python
Reporting bugs
==============
You can report bugs and issues you find with ooni-probe on The Tor Projec issue
tracker filing them under the "Ooni" component: https://trac.torproject.org/projects/tor/newticket?component=Ooni.
You can either register an account or use the group account "cypherpunks" with
password "writecode".
Contributing
============
You can download the code for ooniprobe from the following git repository::
git clone https://git.torproject.org/ooni-probe.git
It is also viewable on the web via: https://gitweb.torproject.org/ooni-probe.git.
You should then submit patches for review as pull requests to this github repository:
https://github.com/TheTorProject/ooni-probe
Read this article to learn how to create a pull request on github (https://help.github.com/articles/creating-a-pull-request).
If you prefer not to use github (or don't have an account), you may also submit
patches as attachments to tickets.
Be sure to format the patch (given that you are working on a feature branch
that is different from master) with::
git format-patch master --stdout > my_first_ooniprobe.patch
Setting up development environment
----------------------------------
On Debian based systems a development environment can be setup as follows: (prerequisites include build essentials, python-dev, and tor; for tor see https://www.torproject.org/docs/debian.html.en)::
sudo apt-get install python-pip python-virtualenv virtualenv virtualenvwrapper
sudo apt-get install libgeoip-dev libffi-dev libdumbnet-dev libssl-dev libpcap-dev
git clone https://github.com/TheTorProject/ooni-probe
cd ooni-probe
mkvirtualenv ooniprobe # . ~/.virtualenvs/ooniprobe/bin/activate to access later
pip install -r requirements.txt
pip install -r requirements-dev.txt
python setup.py install
ooniprobe -s # if all went well, lists available tests
Donate
-------
Send bitcoins to
.. image:: http://i.imgur.com/CIWHb5R.png
:target: http://www.coindesk.com/information/how-can-i-buy-bitcoins/
1Ai9d4dhDBjxYVkKKf1pFXptEGfM1vxFBf
ooniprobe-1.3.2/PKG-INFO 0000644 0001750 0001750 00000011440 12623630152 012744 0 ustar irl irl Metadata-Version: 1.1
Name: ooniprobe
Version: 1.3.2
Summary: Network measurement tool foridentifying traffic manipulation and blocking.
Home-page: https://ooni.torproject.org/
Author: Open Observatory of Network Interference
Author-email: ooni-dev@lists.torproject.org
License: BSD 2 clause
Description:
ooniprobe: a network interference detection tool
================================================
.. image:: https://travis-ci.org/TheTorProject/ooni-probe.png?branch=master
:target: https://travis-ci.org/TheTorProject/ooni-probe
.. image:: https://coveralls.io/repos/TheTorProject/ooni-probe/badge.png
:target: https://coveralls.io/r/TheTorProject/ooni-probe
___________________________________________________________________________
.. image:: https://ooni.torproject.org/theme/img/ooni-logo.png
:target: https:://ooni.torproject.org/
OONI, the Open Observatory of Network Interference, is a global observation
network which aims is to collect high quality data using open methodologies,
using Free and Open Source Software (FL/OSS) to share observations and data
about the various types, methods, and amounts of network tampering in the
world.
Read this before running ooniprobe!
-----------------------------------
Running ooniprobe is a potentially risky activity. This greatly depends on the
jurisdiction in which you are in and which test you are running. It is
technically possible for a person observing your internet connection to be
aware of the fact that you are running ooniprobe. This means that if running
network measurement tests is something considered to be illegal in your country
then you could be spotted.
Furthermore, ooniprobe takes no precautions to protect the install target machine
from forensics analysis. If the fact that you have installed or used ooni
probe is a liability for you, please be aware of this risk.
Setup ooniprobe
-------------------
To install ooniprobe you will need the following dependencies:
* python
* python-dev
* python-setuptools
* build-essential
* libdumbnet1
* python-dumbnet
* python-libpcap
* tor
* libgeoip-dev
* libpcap0.8-dev
* libssl-dev
* libffi-dev
* libdumbnet-dev
When you got them run:
.. code:: bash
sudo pip install ooniprobe
Using ooniprobe
---------------
To generate a test deck for your country, cd to the directory where you want it
and run:
.. code:: bash
oonideckgen
To setup a daily cronjob run this:
.. code:: bash
(crontab -l 2>/dev/null; echo "@daily ooniprobe `oonideckgen | grep -e '^ooniprobe'`") | crontab -
Have fun!
Platform: UNKNOWN
Classifier: Development Status :: 5 - Production/Stable
Classifier: Environment :: Console
Classifier: Framework :: Twisted
Classifier: Intended Audience :: Developers
Classifier: Intended Audience :: Education
Classifier: Intended Audience :: End Users/Desktop
Classifier: Intended Audience :: Information Technology
Classifier: Intended Audience :: Science/Research
Classifier: Intended Audience :: Telecommunications Industry
Classifier: License :: OSI Approved :: BSD LicenseProgramming Language :: Python
Classifier: Programming Language :: Python :: 2
Classifier: Programming Language :: Python :: 2 :: Only
Classifier: Programming Language :: Python :: 2.6
Classifier: Programming Language :: Python :: 2.7
Classifier: Operating System :: MacOS :: MacOS X
Classifier: Operating System :: POSIX
Classifier: Operating System :: POSIX :: BSD
Classifier: Operating System :: POSIX :: BSD :: BSD/OS
Classifier: Operating System :: POSIX :: BSD :: FreeBSD
Classifier: Operating System :: POSIX :: BSD :: NetBSD
Classifier: Operating System :: POSIX :: BSD :: OpenBSD
Classifier: Operating System :: POSIX :: Linux
Classifier: Operating System :: Unix
Classifier: Topic :: Scientific/Engineering :: Information Analysis
Classifier: Topic :: Security
Classifier: Topic :: Security :: Cryptography
Classifier: Topic :: Software Development :: Libraries :: Application Frameworks
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Software Development :: Testing
Classifier: Topic :: Software Development :: Testing :: Traffic Generation
Classifier: Topic :: System :: Networking :: Monitoring