fdroidserver-0.6.0/ 0000775 0000765 0000765 00000000000 12661321517 014137 5 ustar hans hans 0000000 0000000 fdroidserver-0.6.0/fdroidserver.egg-info/ 0000775 0000765 0000765 00000000000 12661321517 020327 5 ustar hans hans 0000000 0000000 fdroidserver-0.6.0/fdroidserver.egg-info/dependency_links.txt 0000664 0000765 0000765 00000000001 12661321474 024377 0 ustar hans hans 0000000 0000000
fdroidserver-0.6.0/fdroidserver.egg-info/SOURCES.txt 0000664 0000765 0000765 00000003212 12661321475 022214 0 ustar hans hans 0000000 0000000 LICENSE
MANIFEST.in
README.md
fd-commit
fdroid
jenkins-build
makebuildserver
setup.cfg
setup.py
buildserver/config.buildserver.py
buildserver/fixpaths.sh
buildserver/cookbooks/android-ndk/recipes/default.rb
buildserver/cookbooks/android-sdk/recipes/default.rb
buildserver/cookbooks/fdroidbuild-general/recipes/default.rb
buildserver/cookbooks/gradle/recipes/default.rb
buildserver/cookbooks/gradle/recipes/gradle
buildserver/cookbooks/kivy/recipes/default.rb
completion/bash-completion
docs/fdl.texi
docs/fdroid.texi
docs/gendocs.sh
docs/gendocs_template
docs/index_versions.md
docs/update.sh
examples/config.py
examples/fdroid-icon.png
examples/makebuildserver.config.py
examples/opensc-fdroid.cfg
fdroidserver/__init__.py
fdroidserver/build.py
fdroidserver/checkupdates.py
fdroidserver/common.py
fdroidserver/gpgsign.py
fdroidserver/import.py
fdroidserver/init.py
fdroidserver/install.py
fdroidserver/lint.py
fdroidserver/metadata.py
fdroidserver/net.py
fdroidserver/publish.py
fdroidserver/readmeta.py
fdroidserver/rewritemeta.py
fdroidserver/scanner.py
fdroidserver/server.py
fdroidserver/signindex.py
fdroidserver/stats.py
fdroidserver/update.py
fdroidserver/verify.py
fdroidserver.egg-info/PKG-INFO
fdroidserver.egg-info/SOURCES.txt
fdroidserver.egg-info/dependency_links.txt
fdroidserver.egg-info/requires.txt
fdroidserver.egg-info/top_level.txt
fdroidserver/asynchronousfilereader/__init__.py
tests/run-tests
tests/update.TestCase
tests/urzip-badsig.apk
tests/urzip.apk
tests/getsig/getsig.java
tests/getsig/make.sh
tests/getsig/run.sh
wp-fdroid/AndroidManifest.xml
wp-fdroid/android-permissions.php
wp-fdroid/readme.txt
wp-fdroid/strings.xml
wp-fdroid/wp-fdroid.php fdroidserver-0.6.0/fdroidserver.egg-info/requires.txt 0000664 0000765 0000765 00000000130 12661321474 022723 0 ustar hans hans 0000000 0000000 mwclient
paramiko
Pillow
apache-libcloud >= 0.14.1
pyasn1
pyasn1-modules
PyYAML
requests fdroidserver-0.6.0/fdroidserver.egg-info/PKG-INFO 0000664 0000765 0000765 00000007273 12661321474 021437 0 ustar hans hans 0000000 0000000 Metadata-Version: 1.1
Name: fdroidserver
Version: 0.6.0
Summary: F-Droid Server Tools
Home-page: https://f-droid.org
Author: The F-Droid Project
Author-email: team@f-droid.org
License: UNKNOWN
Description: # F-Droid Server
[](https://gitlab.com/ci/projects/6642?ref=master)
Server for [F-Droid](https://f-droid.org), the Free Software repository system
for Android.
The F-Droid server tools provide various scripts and tools that are used to
maintain the main [F-Droid application repository](https://f-droid.org/repository/browse).
You can use these same tools to create your own additional or alternative
repository for publishing, or to assist in creating, testing and submitting
metadata to the main repository.
For documentation, please see the docs directory.
Alternatively, visit [https://f-droid.org/manual/](https://f-droid.org/manual/).
### What is F-Droid?
F-Droid is an installable catalogue of FOSS (Free and Open Source Software)
applications for the Android platform. The client makes it easy to browse,
install, and keep track of updates on your device.
### Installing
Note that only Python 2 is supported. We recommend version 2.7.7 or
later.
The easiest way to install the `fdroidserver` tools is on Ubuntu, Mint, or
other Ubuntu-based distributions, where you can install it using:
sudo apt-get install fdroidserver
For older Ubuntu releases or to get the latest version, you can get
`fdroidserver` from the Guardian Project PPA (the signing key
fingerprint is `6B80 A842 07B3 0AC9 DEE2 35FE F50E ADDD 2234 F563`)
sudo add-apt-repository ppa:guardianproject/ppa
sudo apt-get update
sudo apt-get install fdroidserver
On OSX, `fdroidserver` is available from third party package managers,
like Homebrew, MacPorts, and Fink:
brew install fdroidserver
For Arch Linux, a package is available in the AUR. If you have installed
`yaourt` or something similar, you can do:
yaourt -S fdroidserver
For any platform where Python's `easy_install` is an option (e.g. OSX
or Cygwin), you can use it:
sudo easy_install fdroidserver
Python's `pip` also works:
sudo pip install fdroidserver
The combination of `virtualenv` and `pip` is great for testing out the
latest versions of `fdroidserver`. Using `pip`, `fdroidserver` can
even be installed straight from git. First, make sure you have
installed the python header files, virtualenv and pip. They should be
included in your OS's default package manager or you can install them
via other mechanisms like Brew/dnf/pacman/emerge/Fink/MacPorts.
For Debian-based distributions:
apt-get install python-dev python-pip python-virtualenv
Then here's how to install:
git clone https://gitlab.com/fdroid/fdroidserver.git
cd fdroidserver
virtualenv env/
source env/bin/activate
pip install -e .
python2 setup.py install
Platform: UNKNOWN
Classifier: Development Status :: 3 - Alpha
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: GNU Affero General Public License v3 or later (AGPLv3+)
Classifier: Operating System :: POSIX
Classifier: Topic :: Utilities
fdroidserver-0.6.0/fdroidserver.egg-info/top_level.txt 0000664 0000765 0000765 00000000015 12661321474 023057 0 ustar hans hans 0000000 0000000 fdroidserver
fdroidserver-0.6.0/README.md 0000664 0000765 0000765 00000005146 12660365727 015436 0 ustar hans hans 0000000 0000000 # F-Droid Server
[](https://gitlab.com/ci/projects/6642?ref=master)
Server for [F-Droid](https://f-droid.org), the Free Software repository system
for Android.
The F-Droid server tools provide various scripts and tools that are used to
maintain the main [F-Droid application repository](https://f-droid.org/repository/browse).
You can use these same tools to create your own additional or alternative
repository for publishing, or to assist in creating, testing and submitting
metadata to the main repository.
For documentation, please see the docs directory.
Alternatively, visit [https://f-droid.org/manual/](https://f-droid.org/manual/).
### What is F-Droid?
F-Droid is an installable catalogue of FOSS (Free and Open Source Software)
applications for the Android platform. The client makes it easy to browse,
install, and keep track of updates on your device.
### Installing
Note that only Python 2 is supported. We recommend version 2.7.7 or
later.
The easiest way to install the `fdroidserver` tools is on Ubuntu, Mint, or
other Ubuntu-based distributions, where you can install it using:
sudo apt-get install fdroidserver
For older Ubuntu releases or to get the latest version, you can get
`fdroidserver` from the Guardian Project PPA (the signing key
fingerprint is `6B80 A842 07B3 0AC9 DEE2 35FE F50E ADDD 2234 F563`)
sudo add-apt-repository ppa:guardianproject/ppa
sudo apt-get update
sudo apt-get install fdroidserver
On OSX, `fdroidserver` is available from third party package managers,
like Homebrew, MacPorts, and Fink:
brew install fdroidserver
For Arch Linux, a package is available in the AUR. If you have installed
`yaourt` or something similar, you can do:
yaourt -S fdroidserver
For any platform where Python's `easy_install` is an option (e.g. OSX
or Cygwin), you can use it:
sudo easy_install fdroidserver
Python's `pip` also works:
sudo pip install fdroidserver
The combination of `virtualenv` and `pip` is great for testing out the
latest versions of `fdroidserver`. Using `pip`, `fdroidserver` can
even be installed straight from git. First, make sure you have
installed the python header files, virtualenv and pip. They should be
included in your OS's default package manager or you can install them
via other mechanisms like Brew/dnf/pacman/emerge/Fink/MacPorts.
For Debian-based distributions:
apt-get install python-dev python-pip python-virtualenv
Then here's how to install:
git clone https://gitlab.com/fdroid/fdroidserver.git
cd fdroidserver
virtualenv env/
source env/bin/activate
pip install -e .
python2 setup.py install
fdroidserver-0.6.0/setup.py 0000664 0000765 0000765 00000002706 12661320440 015650 0 ustar hans hans 0000000 0000000 #!/usr/bin/env python2
from setuptools import setup
import sys
# workaround issue on OSX, where sys.prefix is not an installable location
if sys.platform == 'darwin' and sys.prefix.startswith('/System'):
data_prefix = '.'
else:
data_prefix = sys.prefix
setup(name='fdroidserver',
version='0.6.0',
description='F-Droid Server Tools',
long_description=open('README.md').read(),
author='The F-Droid Project',
author_email='team@f-droid.org',
url='https://f-droid.org',
packages=['fdroidserver', 'fdroidserver.asynchronousfilereader'],
scripts=['fdroid', 'fd-commit'],
data_files=[
(data_prefix + '/share/doc/fdroidserver/examples',
['buildserver/config.buildserver.py',
'examples/config.py',
'examples/makebuildserver.config.py',
'examples/opensc-fdroid.cfg',
'examples/fdroid-icon.png']),
],
install_requires=[
'mwclient',
'paramiko',
'Pillow',
'apache-libcloud >= 0.14.1',
'pyasn1',
'pyasn1-modules',
'PyYAML',
'requests',
],
classifiers=[
'Development Status :: 3 - Alpha',
'Intended Audience :: Developers',
'License :: OSI Approved :: GNU Affero General Public License v3 or later (AGPLv3+)',
'Operating System :: POSIX',
'Topic :: Utilities',
],
)
fdroidserver-0.6.0/completion/ 0000775 0000765 0000765 00000000000 12661321517 016310 5 ustar hans hans 0000000 0000000 fdroidserver-0.6.0/completion/bash-completion 0000664 0000765 0000765 00000013274 12657174252 021335 0 ustar hans hans 0000000 0000000 #!/bin/bash
#
# bash-completion - part of the FDroid server tools
# Bash completion for the fdroid main tools
#
# Copyright (C) 2013, 2014 Daniel Martí
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
# 'fdroid' is completed automatically, but aliases to it are not.
# For instance, to alias 'fd' to 'fdroid' and have completion available:
#
# alias fd='fdroid'
# complete -F _fdroid fd
#
# One can use completion on aliased subcommands as follows:
#
# alias fbuild='fdroid build'
# complete -F _fdroid_build fbuild
__fdroid_init() {
COMPREPLY=()
cur="${COMP_WORDS[COMP_CWORD]}"
prev="${COMP_WORDS[COMP_CWORD-1]}"
(( $# >= 1 )) && __complete_${1}
}
__by_ext() {
local ext="$1"
files=( metadata/*.$ext )
files=( ${files[@]#metadata/} )
files=${files[@]%.$ext}
echo "$files"
}
__package() {
files="$(__by_ext txt) $(__by_ext yaml) $(__by_ext json) $(__by_ext xml)"
COMPREPLY=( $( compgen -W "$files" -- $cur ) )
}
__apk_package() {
files=( ${1}/*.apk )
[ -f "${files[0]}" ] || return
files=( ${files[@]#*/} )
files=${files[@]%_*}
COMPREPLY=( $( compgen -W "$files" -- $cur ) )
}
__apk_vercode() {
local p=${cur:0:-1}
files=( ${1}/${p}_*.apk )
[ -f "${files[0]}" ] || return
files=( ${files[@]#*_} )
files=${files[@]%.apk}
COMPREPLY=( $( compgen -P "${p}:" -W "$files" -- $cur ) )
}
__vercode() {
local p v
IFS=':' read -r p v <<< "$cur"
COMPREPLY=( $( compgen -P "${p}:" -W "$( while read line; do
if [[ "$line" == "Build Version:"* ]]
then
line="${line#*,}"
printf "${line%%,*} "
elif [[ "$line" == "Build:"* ]]
then
line="${line#*,}"
printf "${line%%,*} "
fi
done < "metadata/${p}.txt" )" -- $cur ) )
}
__complete_options() {
case "${cur}" in
--*)
COMPREPLY=( $( compgen -W "--help --version ${lopts}" -- $cur ) )
return 0;;
*)
COMPREPLY=( $( compgen -W "-h ${opts} --help --version ${lopts}" -- $cur ) )
return 0;;
esac
}
__complete_build() {
opts="-v -q -l -s -t -f -a -w"
lopts="--verbose --quiet --latest --stop --test --server --resetserver
--on-server --skip-scan --no-tarball --force --all --wiki --no-refresh"
case "${cur}" in
-*)
__complete_options
return 0;;
*:*)
__vercode
return 0;;
*)
__package
return 0;;
esac
}
__complete_install() {
opts="-v -q"
lopts="--verbose --quiet --all"
case "${cur}" in
-*)
__complete_options
return 0;;
*:)
__apk_vercode repo
return 0;;
*)
__apk_package repo
return 0;;
esac
}
__complete_update() {
opts="-c -v -q -b -i -I -e -w"
lopts="--create-metadata --verbose --quiet --buildreport
--interactive --icons --editor --wiki --pretty --clean --delete-unknown
--nosign"
case "${prev}" in
-e|--editor)
_filedir
return 0;;
esac
__complete_options
}
__complete_publish() {
opts="-v -q"
lopts="--verbose --quiet"
case "${cur}" in
-*)
__complete_options
return 0;;
*:)
__apk_vercode unsigned
return 0;;
*)
__apk_package unsigned
return 0;;
esac
}
__complete_checkupdates() {
opts="-v -q"
lopts="--verbose --quiet --auto --autoonly --commit --gplay"
case "${cur}" in
-*)
__complete_options
return 0;;
*)
__package
return 0;;
esac
}
__complete_import() {
opts="-u -s -q"
lopts="--url --subdir --rev --quiet"
case "${prev}" in
-u|--url|-s|--subdir|--rev) return 0;;
esac
__complete_options
}
__complete_readmeta() {
opts="-v -q"
lopts="--verbose --quiet"
__complete_options
}
__complete_rewritemeta() {
opts="-v -q -l"
lopts="--verbose --quiet --list"
case "${cur}" in
-*)
__complete_options
return 0;;
*)
__package
return 0;;
esac
}
__complete_lint() {
opts="-v -q"
lopts="--verbose --quiet"
case "${cur}" in
-*)
__complete_options
return 0;;
*)
__package
return 0;;
esac
}
__complete_scanner() {
opts="-v -q"
lopts="--verbose --quiet"
case "${cur}" in
-*)
__complete_options
return 0;;
*:)
__vercode
return 0;;
*)
__package
return 0;;
esac
}
__complete_verify() {
opts="-v -q -p"
lopts="--verbose --quiet"
case "${cur}" in
-*)
__complete_options
return 0;;
*:)
__vercode
return 0;;
*)
__package
return 0;;
esac
}
__complete_stats() {
opts="-v -q -d"
lopts="--verbose --quiet --download"
__complete_options
}
__complete_server() {
opts="-i -v -q"
lopts="--identity-file --local-copy-dir --sync-from-local-copy-dir
--verbose --quiet --no-checksum update"
__complete_options
}
__complete_signindex() {
opts="-v -q"
lopts="--verbose --quiet"
__complete_options
}
__complete_init() {
opts="-v -q -d"
lopts="--verbose --quiet --distinguished-name --keystore
--repo-keyalias --android-home --no-prompt"
__complete_options
}
__cmds=" build init install update publish checkupdates import readmeta \
rewritemeta lint scanner verify stats server signindex "
for c in $__cmds; do
eval "_fdroid_${c} () {
local cur prev opts lopts
__fdroid_init ${c}
}"
done
_fdroid() {
local cmd
cmd=${COMP_WORDS[1]}
[[ $__cmds == *\ $cmd\ * ]] && _fdroid_${cmd} || {
(($COMP_CWORD == 1)) && COMPREPLY=( $( compgen -W "${__cmds}" -- $cmd ) )
}
}
_fd-commit() {
__package
}
complete -F _fdroid fdroid
complete -F _fd-commit fd-commit
return 0
fdroidserver-0.6.0/fdroidserver/ 0000775 0000765 0000765 00000000000 12661321517 016635 5 ustar hans hans 0000000 0000000 fdroidserver-0.6.0/fdroidserver/asynchronousfilereader/ 0000775 0000765 0000765 00000000000 12661321517 023413 5 ustar hans hans 0000000 0000000 fdroidserver-0.6.0/fdroidserver/asynchronousfilereader/__init__.py 0000664 0000765 0000765 00000002604 12657174252 025535 0 ustar hans hans 0000000 0000000 """
AsynchronousFileReader
======================
Simple thread based asynchronous file reader for Python.
see https://github.com/soxofaan/asynchronousfilereader
MIT License
Copyright (c) 2014 Stefaan Lippens
"""
__version__ = '0.2.1'
import threading
try:
# Python 2
from Queue import Queue
except ImportError:
# Python 3
from queue import Queue
class AsynchronousFileReader(threading.Thread):
"""
Helper class to implement asynchronous reading of a file
in a separate thread. Pushes read lines on a queue to
be consumed in another thread.
"""
def __init__(self, fd, queue=None, autostart=True):
self._fd = fd
if queue is None:
queue = Queue()
self.queue = queue
threading.Thread.__init__(self)
if autostart:
self.start()
def run(self):
"""
The body of the thread: read lines and put them on the queue.
"""
while True:
line = self._fd.readline()
if not line:
break
self.queue.put(line)
def eof(self):
"""
Check whether there is no more content to expect.
"""
return not self.is_alive() and self.queue.empty()
def readlines(self):
"""
Get currently available lines.
"""
while not self.queue.empty():
yield self.queue.get()
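The class above follows a simple pattern: a worker thread drains the file descriptor and pushes each line onto a queue, which the main thread consumes at its leisure. A minimal self-contained sketch of that pattern (Python 3 spellings of the imports, and a `BytesIO` stand-in for a real file descriptor):

```python
import io
import queue
import threading

def read_lines_async(fd, q):
    # Mirrors AsynchronousFileReader.run(): read until EOF,
    # pushing each line onto the queue.
    while True:
        line = fd.readline()
        if not line:
            break
        q.put(line)

fd = io.BytesIO(b"one\ntwo\nthree\n")
q = queue.Queue()
t = threading.Thread(target=read_lines_async, args=(fd, q))
t.start()
t.join()

lines = []
while not q.empty():
    lines.append(q.get())
print(lines)  # [b'one\n', b'two\n', b'three\n']
```

With a real subprocess pipe instead of `BytesIO`, the same loop lets the caller poll the queue without ever blocking on `readline()` itself.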
fdroidserver-0.6.0/fdroidserver/server.py 0000664 0000765 0000765 00000033052 12657174252 020527 0 ustar hans hans 0000000 0000000 #!/usr/bin/env python2
# -*- coding: utf-8 -*-
#
# server.py - part of the FDroid server tools
# Copyright (C) 2010-15, Ciaran Gultnieks, ciaran@ciarang.com
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import sys
import glob
import hashlib
import os
import paramiko
import pwd
import subprocess
from argparse import ArgumentParser
import logging
import common
config = None
options = None
def update_awsbucket(repo_section):
'''
Upload the contents of the directory `repo_section` (including
subdirectories) to the AWS S3 "bucket". The contents of that subdir of the
bucket will first be deleted.
Requires AWS credentials set in config.py: awsaccesskeyid, awssecretkey
'''
logging.debug('Syncing "' + repo_section + '" to Amazon S3 bucket "'
+ config['awsbucket'] + '"')
import libcloud.security
libcloud.security.VERIFY_SSL_CERT = True
from libcloud.storage.types import Provider, ContainerDoesNotExistError
from libcloud.storage.providers import get_driver
if not config.get('awsaccesskeyid') or not config.get('awssecretkey'):
logging.error('To use awsbucket, you must set awssecretkey and awsaccesskeyid in config.py!')
sys.exit(1)
awsbucket = config['awsbucket']
cls = get_driver(Provider.S3)
driver = cls(config['awsaccesskeyid'], config['awssecretkey'])
try:
container = driver.get_container(container_name=awsbucket)
except ContainerDoesNotExistError:
container = driver.create_container(container_name=awsbucket)
logging.info('Created new container "' + container.name + '"')
upload_dir = 'fdroid/' + repo_section
objs = dict()
for obj in container.list_objects():
if obj.name.startswith(upload_dir + '/'):
objs[obj.name] = obj
for root, _, files in os.walk(os.path.join(os.getcwd(), repo_section)):
for name in files:
upload = False
file_to_upload = os.path.join(root, name)
object_name = 'fdroid/' + os.path.relpath(file_to_upload, os.getcwd())
if object_name not in objs:
upload = True
else:
obj = objs.pop(object_name)
if obj.size != os.path.getsize(file_to_upload):
upload = True
else:
# if the sizes match, then compare by MD5
md5 = hashlib.md5()
with open(file_to_upload, 'rb') as f:
while True:
data = f.read(8192)
if not data:
break
md5.update(data)
if obj.hash != md5.hexdigest():
s3url = 's3://' + awsbucket + '/' + obj.name
logging.info(' deleting ' + s3url)
if not driver.delete_object(obj):
logging.warn('Could not delete ' + s3url)
upload = True
if upload:
logging.debug(' uploading "' + file_to_upload + '"...')
extra = {'acl': 'public-read'}
if file_to_upload.endswith('.sig'):
extra['content_type'] = 'application/pgp-signature'
elif file_to_upload.endswith('.asc'):
extra['content_type'] = 'application/pgp-signature'
logging.info(' uploading ' + os.path.relpath(file_to_upload)
+ ' to s3://' + awsbucket + '/' + object_name)
with open(file_to_upload, 'rb') as iterator:
obj = driver.upload_object_via_stream(iterator=iterator,
container=container,
object_name=object_name,
extra=extra)
# delete the remnants in the bucket, they do not exist locally
while objs:
object_name, obj = objs.popitem()
s3url = 's3://' + awsbucket + '/' + object_name
if object_name.startswith(upload_dir):
logging.warn(' deleting ' + s3url)
driver.delete_object(obj)
else:
logging.info(' skipping ' + s3url)
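The size-then-MD5 comparison in `update_awsbucket()` above hashes files in 8192-byte chunks so large APKs are never read into memory at once. That hashing step can be factored into a small helper; this is just a sketch (`md5_of_file` is a name invented here, not part of fdroidserver):

```python
import hashlib
import os
import tempfile

def md5_of_file(path, chunk_size=8192):
    # Hash the file incrementally, chunk by chunk, exactly as the
    # upload loop above does before comparing against obj.hash.
    md5 = hashlib.md5()
    with open(path, 'rb') as f:
        while True:
            data = f.read(chunk_size)
            if not data:
                break
            md5.update(data)
    return md5.hexdigest()

# quick sanity check against hashing the whole buffer at once
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b'x' * 20000)
    path = tmp.name
digest = md5_of_file(path)
os.unlink(path)
print(digest)
```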
def update_serverwebroot(serverwebroot, repo_section):
# use a checksum comparison for accurate comparisons on different
# filesystems, for example, FAT has a low resolution timestamp
rsyncargs = ['rsync', '--archive', '--delete-after', '--safe-links']
if not options.no_checksum:
rsyncargs.append('--checksum')
if options.verbose:
rsyncargs += ['--verbose']
if options.quiet:
rsyncargs += ['--quiet']
if options.identity_file is not None:
rsyncargs += ['-e', 'ssh -i ' + options.identity_file]
if 'identity_file' in config:
rsyncargs += ['-e', 'ssh -i ' + config['identity_file']]
indexxml = os.path.join(repo_section, 'index.xml')
indexjar = os.path.join(repo_section, 'index.jar')
# Upload the first time without the index files and delay the deletion as
# much as possible, that keeps the repo functional while this update is
# running. Then once it is complete, rerun the command again to upload
# the index files. Always using the same target with rsync allows for
# very strict settings on the receiving server, you can literally specify
# the one rsync command that is allowed to run in ~/.ssh/authorized_keys.
# (serverwebroot is guaranteed to have a trailing slash in common.py)
logging.info('rsyncing ' + repo_section + ' to ' + serverwebroot)
if subprocess.call(rsyncargs +
['--exclude', indexxml, '--exclude', indexjar,
repo_section, serverwebroot]) != 0:
sys.exit(1)
if subprocess.call(rsyncargs + [repo_section, serverwebroot]) != 0:
sys.exit(1)
# upload "current version" symlinks if requested
if config['make_current_version_link'] and repo_section == 'repo':
links_to_upload = []
for f in glob.glob('*.apk') \
+ glob.glob('*.apk.asc') + glob.glob('*.apk.sig'):
if os.path.islink(f):
links_to_upload.append(f)
if len(links_to_upload) > 0:
if subprocess.call(rsyncargs + links_to_upload + [serverwebroot]) != 0:
sys.exit(1)
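The two-pass upload described in the comment in `update_serverwebroot()` above can be sketched as pure argument construction (hypothetical helper name; only the ordering matters — index files are excluded on the first pass and included on the second, so the repo stays consistent for clients mid-upload):

```python
import os

def build_rsync_passes(repo_section, serverwebroot, checksum=True):
    rsyncargs = ['rsync', '--archive', '--delete-after', '--safe-links']
    if checksum:
        rsyncargs.append('--checksum')
    indexxml = os.path.join(repo_section, 'index.xml')
    indexjar = os.path.join(repo_section, 'index.jar')
    # pass 1: everything except the index, so clients keep a working repo
    first = rsyncargs + ['--exclude', indexxml, '--exclude', indexjar,
                         repo_section, serverwebroot]
    # pass 2: the full tree, which publishes the fresh index last
    second = rsyncargs + [repo_section, serverwebroot]
    return first, second

first, second = build_rsync_passes('repo', 'example.com:/var/www/fdroid/')
print('--exclude' in first, '--exclude' in second)
```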
def _local_sync(fromdir, todir):
rsyncargs = ['rsync', '--recursive', '--safe-links', '--times', '--perms',
'--one-file-system', '--delete', '--chmod=Da+rx,Fa-x,a+r,u+w']
# use stricter rsync checking on all files since people using offline mode
# are already prioritizing security above ease and speed
if not options.no_checksum:
rsyncargs.append('--checksum')
if options.verbose:
rsyncargs += ['--verbose']
if options.quiet:
rsyncargs += ['--quiet']
logging.debug(' '.join(rsyncargs + [fromdir, todir]))
if subprocess.call(rsyncargs + [fromdir, todir]) != 0:
sys.exit(1)
def sync_from_localcopy(repo_section, local_copy_dir):
logging.info('Syncing from local_copy_dir to this repo.')
# trailing slashes have a meaning in rsync which is not needed here, so
# make sure both paths have exactly one trailing slash
_local_sync(os.path.join(local_copy_dir, repo_section).rstrip('/') + '/',
repo_section.rstrip('/') + '/')
def update_localcopy(repo_section, local_copy_dir):
# local_copy_dir is guaranteed to have a trailing slash in main() below
_local_sync(repo_section, local_copy_dir)
def main():
global config, options
# Parse command line...
parser = ArgumentParser()
common.setup_global_opts(parser)
parser.add_argument("command", help="command to execute, either 'init' or 'update'")
parser.add_argument("-i", "--identity-file", default=None,
help="Specify an identity file to provide to SSH for rsyncing")
parser.add_argument("--local-copy-dir", default=None,
help="Specify a local folder to sync the repo to")
parser.add_argument("--sync-from-local-copy-dir", action="store_true", default=False,
help="Before uploading to servers, sync from local copy dir")
parser.add_argument("--no-checksum", action="store_true", default=False,
help="Don't use rsync checksums")
options = parser.parse_args()
config = common.read_config(options)
if options.command != 'init' and options.command != 'update':
logging.critical("The only commands currently supported are 'init' and 'update'")
sys.exit(1)
if config.get('nonstandardwebroot') is True:
standardwebroot = False
else:
standardwebroot = True
for serverwebroot in config.get('serverwebroot', []):
# this supports both an ssh host:path and just a path
s = serverwebroot.rstrip('/').split(':')
if len(s) == 1:
fdroiddir = s[0]
elif len(s) == 2:
host, fdroiddir = s
else:
logging.error('Malformed serverwebroot line: ' + serverwebroot)
sys.exit(1)
repobase = os.path.basename(fdroiddir)
if standardwebroot and repobase != 'fdroid':
logging.error('serverwebroot path does not end with "fdroid", '
+ 'perhaps you meant one of these:\n\t'
+ serverwebroot.rstrip('/') + '/fdroid\n\t'
+ serverwebroot.rstrip('/').rstrip(repobase) + 'fdroid')
sys.exit(1)
if options.local_copy_dir is not None:
local_copy_dir = options.local_copy_dir
elif config.get('local_copy_dir'):
local_copy_dir = config['local_copy_dir']
else:
local_copy_dir = None
if local_copy_dir is not None:
fdroiddir = local_copy_dir.rstrip('/')
if os.path.exists(fdroiddir) and not os.path.isdir(fdroiddir):
logging.error('local_copy_dir must be directory, not a file!')
sys.exit(1)
if not os.path.exists(os.path.dirname(fdroiddir)):
logging.error('The root dir for local_copy_dir "'
+ os.path.dirname(fdroiddir)
+ '" does not exist!')
sys.exit(1)
if not os.path.isabs(fdroiddir):
logging.error('local_copy_dir must be an absolute path!')
sys.exit(1)
repobase = os.path.basename(fdroiddir)
if standardwebroot and repobase != 'fdroid':
logging.error('local_copy_dir does not end with "fdroid", '
+ 'perhaps you meant: ' + fdroiddir + '/fdroid')
sys.exit(1)
if local_copy_dir[-1] != '/':
local_copy_dir += '/'
local_copy_dir = local_copy_dir.replace('//', '/')
if not os.path.exists(fdroiddir):
os.mkdir(fdroiddir)
if not config.get('awsbucket') \
and not config.get('serverwebroot') \
and local_copy_dir is None:
logging.warn('No serverwebroot, local_copy_dir, or awsbucket set!'
+ ' Edit your config.py to set at least one.')
sys.exit(1)
repo_sections = ['repo']
if config['archive_older'] != 0:
repo_sections.append('archive')
if not os.path.exists('archive'):
os.mkdir('archive')
if config['per_app_repos']:
repo_sections += common.get_per_app_repos()
if options.command == 'init':
ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
for serverwebroot in config.get('serverwebroot', []):
sshstr, remotepath = serverwebroot.rstrip('/').split(':')
if sshstr.find('@') >= 0:
username, hostname = sshstr.split('@')
else:
username = pwd.getpwuid(os.getuid())[0] # get effective uid
hostname = sshstr
ssh.connect(hostname, username=username)
sftp = ssh.open_sftp()
if os.path.basename(remotepath) \
not in sftp.listdir(os.path.dirname(remotepath)):
sftp.mkdir(remotepath, mode=0755)
for repo_section in repo_sections:
repo_path = os.path.join(remotepath, repo_section)
if os.path.basename(repo_path) \
not in sftp.listdir(remotepath):
sftp.mkdir(repo_path, mode=0755)
sftp.close()
ssh.close()
elif options.command == 'update':
for repo_section in repo_sections:
if local_copy_dir is not None:
if config['sync_from_local_copy_dir'] and os.path.exists(repo_section):
sync_from_localcopy(repo_section, local_copy_dir)
else:
update_localcopy(repo_section, local_copy_dir)
for serverwebroot in config.get('serverwebroot', []):
update_serverwebroot(serverwebroot, repo_section)
if config.get('awsbucket'):
update_awsbucket(repo_section)
sys.exit(0)
if __name__ == "__main__":
main()
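The `serverwebroot` handling in `main()` above accepts both a bare local path and an ssh-style `host:path`. A minimal sketch of that split (hypothetical helper name, mirroring the `len(s)` checks in the loop):

```python
def split_serverwebroot(serverwebroot):
    # One colon means host:path, no colon means a plain local path;
    # anything else is malformed, just as main() reports.
    s = serverwebroot.rstrip('/').split(':')
    if len(s) == 1:
        return None, s[0]
    elif len(s) == 2:
        return s[0], s[1]
    raise ValueError('Malformed serverwebroot line: ' + serverwebroot)

print(split_serverwebroot('user@host:/var/www/fdroid/'))
print(split_serverwebroot('/var/www/fdroid'))
```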
fdroidserver-0.6.0/fdroidserver/verify.py 0000664 0000765 0000765 00000006043 12657174252 020525 0 ustar hans hans 0000000 0000000 #!/usr/bin/env python2
# -*- coding: utf-8 -*-
#
# verify.py - part of the FDroid server tools
# Copyright (C) 2013, Ciaran Gultnieks, ciaran@ciarang.com
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import sys
import os
import glob
from argparse import ArgumentParser
import logging
import common
import net
from common import FDroidException
options = None
config = None
def main():
global options, config
# Parse command line...
parser = ArgumentParser(usage="%(prog)s [options] [APPID[:VERCODE] [APPID[:VERCODE] ...]]")
common.setup_global_opts(parser)
parser.add_argument("appid", nargs='*', help="app-id with optional versioncode in the form APPID[:VERCODE]")
options = parser.parse_args()
config = common.read_config(options)
tmp_dir = 'tmp'
if not os.path.isdir(tmp_dir):
logging.info("Creating temporary directory")
os.makedirs(tmp_dir)
unsigned_dir = 'unsigned'
if not os.path.isdir(unsigned_dir):
logging.error("No unsigned directory - nothing to do")
sys.exit(0)
verified = 0
notverified = 0
vercodes = common.read_pkg_args(options.appid, True)
for apkfile in sorted(glob.glob(os.path.join(unsigned_dir, '*.apk'))):
apkfilename = os.path.basename(apkfile)
appid, vercode = common.apknameinfo(apkfile)
if vercodes and appid not in vercodes:
continue
if vercodes.get(appid) and vercode not in vercodes[appid]:
continue
try:
logging.info("Processing " + apkfilename)
remoteapk = os.path.join(tmp_dir, apkfilename)
if os.path.exists(remoteapk):
os.remove(remoteapk)
url = 'https://f-droid.org/repo/' + apkfilename
logging.info("...retrieving " + url)
net.download_file(url, dldir=tmp_dir)
compare_result = common.compare_apks(
os.path.join(unsigned_dir, apkfilename),
remoteapk,
tmp_dir)
if compare_result:
raise FDroidException(compare_result)
logging.info("...successfully verified")
verified += 1
except FDroidException as e:
logging.info("...NOT verified - {0}".format(e))
notverified += 1
logging.info("Finished")
logging.info("{0} successfully verified".format(verified))
logging.info("{0} NOT verified".format(notverified))
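`common.read_pkg_args` is defined elsewhere in fdroidserver; as a rough sketch of the `APPID[:VERCODE]` syntax it consumes (a hypothetical simplified version, not the real implementation):

```python
def parse_pkg_args(args):
    # Map each appid to the set of version codes requested for it;
    # an empty set means "all versions of this app".
    vercodes = {}
    for arg in args:
        if ':' in arg:
            appid, vercode = arg.split(':', 1)
            vercodes.setdefault(appid, set()).add(vercode)
        else:
            vercodes.setdefault(arg, set())
    return vercodes

print(parse_pkg_args(['org.example.app:42', 'org.example.app:43', 'org.other']))
```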
if __name__ == "__main__":
main()
fdroidserver-0.6.0/fdroidserver/gpgsign.py 0000664 0000765 0000765 00000004712 12657174252 020660 0 ustar hans hans 0000000 0000000 #!/usr/bin/env python2
# -*- coding: utf-8 -*-
#
# gpgsign.py - part of the FDroid server tools
# Copyright (C) 2014, Ciaran Gultnieks, ciaran@ciarang.com
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import sys
import os
import glob
from argparse import ArgumentParser
import logging
import common
from common import FDroidPopen
config = None
options = None
def main():

    global config, options

    # Parse command line...
    parser = ArgumentParser(usage="%(prog)s [options]")
    common.setup_global_opts(parser)
    options = parser.parse_args()

    config = common.read_config(options)

    repodirs = ['repo']
    if config['archive_older'] != 0:
        repodirs.append('archive')

    for output_dir in repodirs:
        if not os.path.isdir(output_dir):
            logging.error("Missing output directory '" + output_dir + "'")
            sys.exit(1)

        # Process any apks that are waiting to be signed...
        for apkfile in sorted(glob.glob(os.path.join(output_dir, '*.apk'))):
            apkfilename = os.path.basename(apkfile)
            sigfilename = apkfilename + ".asc"
            sigpath = os.path.join(output_dir, sigfilename)

            if not os.path.exists(sigpath):
                gpgargs = ['gpg', '-a',
                           '--output', sigpath,
                           '--detach-sig']
                if 'gpghome' in config:
                    gpgargs.extend(['--homedir', config['gpghome']])
                if 'gpgkey' in config:
                    gpgargs.extend(['--local-user', config['gpgkey']])
                gpgargs.append(os.path.join(output_dir, apkfilename))
                p = FDroidPopen(gpgargs)
                if p.returncode != 0:
                    logging.error("Signing failed.")
                    sys.exit(1)

                logging.info('Signed ' + apkfilename)


if __name__ == "__main__":
    main()
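The gpg command line above is assembled incrementally from the config. As an illustration, here is a minimal sketch of that assembly, using a plain dict in place of the real config object and made-up sample values (the helper name `build_gpg_args` is invented; `gpghome` and `gpgkey` are the real config keys used by gpgsign.py):

```python
def build_gpg_args(config, apk_path, sig_path):
    # -a: ASCII-armored output; --detach-sig: write the signature to a
    # separate .asc file instead of wrapping the apk itself.
    gpgargs = ['gpg', '-a', '--output', sig_path, '--detach-sig']
    if 'gpghome' in config:
        gpgargs.extend(['--homedir', config['gpghome']])
    if 'gpgkey' in config:
        gpgargs.extend(['--local-user', config['gpgkey']])
    gpgargs.append(apk_path)
    return gpgargs

# Sample values for illustration only
args = build_gpg_args({'gpgkey': 'CE71F7FB'},
                      'repo/app.apk', 'repo/app.apk.asc')
```

With no `gpghome`/`gpgkey` set, gpg falls back to its default keyring and key, so the base argument list alone is already a valid invocation.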
fdroidserver-0.6.0/fdroidserver/import.py
#!/usr/bin/env python2
# -*- coding: utf-8 -*-
#
# import.py - part of the FDroid server tools
# Copyright (C) 2010-13, Ciaran Gultnieks, ciaran@ciarang.com
# Copyright (C) 2013-2014 Daniel Martí
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
import sys
import os
import shutil
import urllib
from argparse import ArgumentParser
from ConfigParser import ConfigParser
import logging
import common
import metadata
# Get the repo type and address from the given web page. The page is scanned
# in a rather naive manner for 'git clone xxxx', 'hg clone xxxx', etc, and
# when one of these is found it's assumed that's the information we want.
# Returns repotype, address, or None, reason
def getrepofrompage(url):
req = urllib.urlopen(url)
if req.getcode() != 200:
return (None, 'Unable to get ' + url + ' - return code ' + str(req.getcode()))
page = req.read()
# Works for BitBucket
index = page.find('hg clone')
if index != -1:
repotype = 'hg'
repo = page[index + 9:]
index = repo.find('<')
if index == -1:
return (None, "Error while getting repo address")
repo = repo[:index]
repo = repo.split('"')[0]
return (repotype, repo)
# Works for BitBucket
index = page.find('git clone')
if index != -1:
repotype = 'git'
repo = page[index + 10:]
index = repo.find('<')
if index == -1:
return (None, "Error while getting repo address")
repo = repo[:index]
repo = repo.split('"')[0]
return (repotype, repo)
return (None, "No information found." + page)
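The naive page scan in getrepofrompage() can be condensed into one loop over the clone commands it looks for. A minimal sketch (the helper name `find_clone_command` and the sample HTML are invented):

```python
def find_clone_command(page):
    # Scan for 'hg clone <url>' or 'git clone <url>', mirroring the
    # naive substring search in getrepofrompage() above.
    for vcstype, cmd in (('hg', 'hg clone'), ('git', 'git clone')):
        index = page.find(cmd)
        if index == -1:
            continue
        # Skip the command plus the following space, then cut the URL
        # off at the next tag or quote, as the original does.
        repo = page[index + len(cmd) + 1:]
        end = repo.find('<')
        if end == -1:
            return None, 'Error while getting repo address'
        repo = repo[:end].split('"')[0]
        return vcstype, repo
    return None, 'No information found.'

sample = '<code>git clone https://example.com/foo.git</code>'
```

This only works on pages that render the clone command as literal text (as BitBucket did), which is why the original comments call the approach "rather naive".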
config = None
options = None
def get_metadata_from_url(app, url):
tmp_dir = 'tmp'
if not os.path.isdir(tmp_dir):
logging.info("Creating temporary directory")
os.makedirs(tmp_dir)
# Figure out what kind of project it is...
projecttype = None
app.WebSite = url # by default, we might override it
if url.startswith('git://'):
projecttype = 'git'
repo = url
repotype = 'git'
app.SourceCode = ""
app.WebSite = ""
elif url.startswith('https://github.com'):
projecttype = 'github'
repo = url
repotype = 'git'
app.SourceCode = url
app.IssueTracker = url + '/issues'
app.WebSite = ""
elif url.startswith('https://gitlab.com/'):
projecttype = 'gitlab'
# git can be fussy with gitlab URLs unless they end in .git
if url.endswith('.git'):
repo = url
else:
repo = url + '.git'
repotype = 'git'
app.SourceCode = url + '/tree/HEAD'
app.IssueTracker = url + '/issues'
elif url.startswith('https://bitbucket.org/'):
if url.endswith('/'):
url = url[:-1]
projecttype = 'bitbucket'
app.SourceCode = url + '/src'
app.IssueTracker = url + '/issues'
# Figure out the repo type and address...
repotype, repo = getrepofrompage(app.SourceCode)
if not repotype:
logging.error("Unable to determine vcs type. " + repo)
sys.exit(1)
if not projecttype:
logging.error("Unable to determine the project type.")
logging.error("The URL you supplied was not in one of the supported formats. Please consult")
logging.error("the manual for a list of supported formats, and supply one of those.")
sys.exit(1)
# Ensure we have a sensible-looking repo address at this point. If not, we
# might have got a page format we weren't expecting. (Note that we
# specifically don't want git@...)
if ((repotype != 'bzr' and (not repo.startswith('http://') and
not repo.startswith('https://') and
not repo.startswith('git://'))) or
' ' in repo):
logging.error("Repo address '{0}' does not seem to be valid".format(repo))
sys.exit(1)
# Get a copy of the source so we can extract some info...
logging.info('Getting source from ' + repotype + ' repo at ' + repo)
build_dir = os.path.join(tmp_dir, 'importer')
if os.path.exists(build_dir):
shutil.rmtree(build_dir)
vcs = common.getvcs(repotype, repo, build_dir)
vcs.gotorevision(options.rev)
root_dir = get_subdir(build_dir)
app.RepoType = repotype
app.Repo = repo
return root_dir, build_dir
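The repo-address sanity check above boils down to a small predicate: bzr remotes may use non-HTTP schemes, everything else must start with http(s):// or git://, and spaces (or git@-style SSH addresses, which fail the scheme check) are rejected. A sketch, with the helper name invented:

```python
def looks_like_valid_repo(repotype, repo):
    # Spaces are never allowed in a repo address.
    if ' ' in repo:
        return False
    # bzr URLs may use other schemes (e.g. lp:), so only non-bzr
    # repos are required to use http(s):// or git://.
    if repotype == 'bzr':
        return True
    return repo.startswith(('http://', 'https://', 'git://'))
```

Rejecting `git@...` addresses is deliberate: they require SSH key authentication, which an unattended build server does not have.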
def get_subdir(build_dir):
if options.subdir:
return os.path.join(build_dir, options.subdir)
return build_dir
def main():
global config, options
# Parse command line...
parser = ArgumentParser()
common.setup_global_opts(parser)
parser.add_argument("-u", "--url", default=None,
help="Project URL to import from.")
parser.add_argument("-s", "--subdir", default=None,
help="Path to main android project subdirectory, if not in root.")
parser.add_argument("--rev", default=None,
help="Allows a different revision (or git branch) to be specified for the initial import")
options = parser.parse_args()
config = common.read_config(options)
apps = metadata.read_metadata()
app = metadata.App()
app.UpdateCheckMode = "Tags"
root_dir = None
build_dir = None
if options.url:
root_dir, build_dir = get_metadata_from_url(app, options.url)
elif os.path.isdir('.git'):
if options.url:
app.WebSite = options.url
root_dir = get_subdir(os.getcwd())
else:
logging.error("Specify project url.")
sys.exit(1)
# Extract some information...
paths = common.manifest_paths(root_dir, [])
if paths:
version, vercode, package = common.parse_androidmanifests(paths, app)
if not package:
logging.error("Couldn't find package ID")
sys.exit(1)
if not version:
logging.warn("Couldn't find latest version name")
if not vercode:
logging.warn("Couldn't find latest version code")
else:
spec = os.path.join(root_dir, 'buildozer.spec')
if os.path.exists(spec):
defaults = {'orientation': 'landscape', 'icon': '',
'permissions': '', 'android.api': "18"}
bconfig = ConfigParser(defaults, allow_no_value=True)
bconfig.read(spec)
package = bconfig.get('app', 'package.domain') + '.' + bconfig.get('app', 'package.name')
version = bconfig.get('app', 'version')
vercode = None
else:
logging.error("No android or kivy project could be found. Specify --subdir?")
sys.exit(1)
# Make sure it's actually new...
if package in apps:
logging.error("Package " + package + " already exists")
sys.exit(1)
# Create a build line...
build = metadata.Build()
build.version = version or '?'
build.vercode = vercode or '?'
build.commit = '?'
build.disable = 'Generated by import.py - check/set version fields and commit id'
if options.subdir:
build.subdir = options.subdir
if os.path.exists(os.path.join(root_dir, 'jni')):
build.buildjni = ['yes']
app.builds.append(build)
# Keep the repo directory to save bandwidth...
if not os.path.exists('build'):
os.mkdir('build')
if build_dir is not None:
shutil.move(build_dir, os.path.join('build', package))
with open('build/.fdroidvcs-' + package, 'w') as f:
f.write(app.RepoType + ' ' + app.Repo)
metadatapath = os.path.join('metadata', package + '.txt')
with open(metadatapath, 'w') as f:
metadata.write_metadata('txt', f, app)
logging.info("Wrote " + metadatapath)
if __name__ == "__main__":
main()
fdroidserver-0.6.0/fdroidserver/readmeta.py
#!/usr/bin/env python2
# -*- coding: utf-8 -*-
#
# readmeta.py - part of the FDroid server tools
# Copyright (C) 2014 Daniel Martí
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
from argparse import ArgumentParser
import common
import metadata
def main():

    parser = ArgumentParser(usage="%(prog)s")
    common.setup_global_opts(parser)
    parser.parse_args()

    common.read_config(None)

    metadata.read_metadata(xref=True)


if __name__ == "__main__":
    main()
fdroidserver-0.6.0/fdroidserver/common.py
# -*- coding: utf-8 -*-
#
# common.py - part of the FDroid server tools
# Copyright (C) 2010-13, Ciaran Gultnieks, ciaran@ciarang.com
# Copyright (C) 2013-2014 Daniel Martí
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
# common.py is imported by all modules, so do not import third-party
# libraries here as they will become a requirement for all commands.
import os
import sys
import re
import shutil
import glob
import stat
import subprocess
import time
import operator
import logging
import hashlib
import socket
import xml.etree.ElementTree as XMLElementTree
try:
# Python 2
from Queue import Queue
except ImportError:
# Python 3
from queue import Queue
from zipfile import ZipFile
import metadata
from fdroidserver.asynchronousfilereader import AsynchronousFileReader
XMLElementTree.register_namespace('android', 'http://schemas.android.com/apk/res/android')
config = None
options = None
env = None
orig_path = None
default_config = {
'sdk_path': "$ANDROID_HOME",
'ndk_paths': {
'r9b': None,
'r10e': "$ANDROID_NDK",
},
'build_tools': "23.0.2",
'java_paths': None,
'ant': "ant",
'mvn3': "mvn",
'gradle': 'gradle',
'accepted_formats': ['txt', 'yaml'],
'sync_from_local_copy_dir': False,
'per_app_repos': False,
'make_current_version_link': True,
'current_version_name_source': 'Name',
'update_stats': False,
'stats_ignore': [],
'stats_server': None,
'stats_user': None,
'stats_to_carbon': False,
'repo_maxage': 0,
'build_server_always': False,
'keystore': 'keystore.jks',
'smartcardoptions': [],
'char_limits': {
'Summary': 80,
'Description': 4000,
},
'keyaliases': {},
'repo_url': "https://MyFirstFDroidRepo.org/fdroid/repo",
'repo_name': "My First FDroid Repo Demo",
'repo_icon': "fdroid-icon.png",
'repo_description': '''
This is a repository of apps to be used with FDroid. Applications in this
repository are either official binaries built by the original application
developers, or are binaries built from source by the admin of f-droid.org
using the tools on https://gitlab.com/u/fdroid.
''',
'archive_older': 0,
}
def setup_global_opts(parser):
parser.add_argument("-v", "--verbose", action="store_true", default=False,
help="Spew out even more information than normal")
parser.add_argument("-q", "--quiet", action="store_true", default=False,
help="Restrict output to warnings and errors")
def fill_config_defaults(thisconfig):
for k, v in default_config.items():
if k not in thisconfig:
thisconfig[k] = v
# Expand paths (~users and $vars)
def expand_path(path):
if path is None:
return None
orig = path
path = os.path.expanduser(path)
path = os.path.expandvars(path)
if orig == path:
return None
return path
for k in ['sdk_path', 'ant', 'mvn3', 'gradle', 'keystore', 'repo_icon']:
v = thisconfig[k]
exp = expand_path(v)
if exp is not None:
thisconfig[k] = exp
thisconfig[k + '_orig'] = v
# find all installed JDKs for keytool, jarsigner, and JAVA[6-9]_HOME env vars
if thisconfig['java_paths'] is None:
thisconfig['java_paths'] = dict()
for d in sorted(glob.glob('/usr/lib/jvm/j*[6-9]*')
+ glob.glob('/usr/java/jdk1.[6-9]*')
+ glob.glob('/System/Library/Java/JavaVirtualMachines/1.[6-9].0.jdk')
+ glob.glob('/Library/Java/JavaVirtualMachines/*jdk*[6-9]*')):
if os.path.islink(d):
continue
j = os.path.basename(d)
# the last one found will be the canonical one, so order appropriately
for regex in (r'1\.([6-9])\.0\.jdk', # OSX
r'jdk1\.([6-9])\.0_[0-9]+.jdk', # OSX and Oracle tarball
r'jdk([6-9])-openjdk', # Arch
r'java-([6-9])-openjdk', # Arch
r'java-([6-9])-jdk', # Arch (oracle)
r'java-1\.([6-9])\.0-.*', # RedHat
r'java-([6-9])-oracle', # Debian WebUpd8
r'jdk-([6-9])-oracle-.*', # Debian make-jpkg
r'java-([6-9])-openjdk-[^c][^o][^m].*'): # Debian
m = re.match(regex, j)
if m:
osxhome = os.path.join(d, 'Contents', 'Home')
if os.path.exists(osxhome):
thisconfig['java_paths'][m.group(1)] = osxhome
else:
thisconfig['java_paths'][m.group(1)] = d
for java_version in ('7', '8', '9'):
if java_version not in thisconfig['java_paths']:
continue
java_home = thisconfig['java_paths'][java_version]
jarsigner = os.path.join(java_home, 'bin', 'jarsigner')
if os.path.exists(jarsigner):
thisconfig['jarsigner'] = jarsigner
thisconfig['keytool'] = os.path.join(java_home, 'bin', 'keytool')
break # Java7 is preferred, so quit if found
for k in ['ndk_paths', 'java_paths']:
d = thisconfig[k]
for k2 in d.copy():
v = d[k2]
exp = expand_path(v)
if exp is not None:
thisconfig[k][k2] = exp
thisconfig[k][k2 + '_orig'] = v
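The nested `expand_path` helper above has a slightly unusual contract: it returns the expanded path, or `None` when `~`/`$VAR` expansion changed nothing, so the caller knows whether to stash the original under a `<key>_orig` entry. A standalone sketch of that behavior (`DEMO_SDK` is a made-up variable for illustration):

```python
import os

def expand_path(path):
    # Mirrors the helper in fill_config_defaults(): return the expanded
    # path, or None if ~user / $VAR expansion left it unchanged.
    if path is None:
        return None
    orig = path
    path = os.path.expanduser(path)
    path = os.path.expandvars(path)
    if orig == path:
        return None
    return path

os.environ['DEMO_SDK'] = '/opt/android-sdk'  # sample value
```

Returning `None` rather than the unchanged string lets the config code distinguish "literal path" from "path that came from an environment variable".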
def regsub_file(pattern, repl, path):
with open(path, 'r') as f:
text = f.read()
text = re.sub(pattern, repl, text)
with open(path, 'w') as f:
f.write(text)
def read_config(opts, config_file='config.py'):
"""Read the repository config
The config is read from config_file, which is in the current directory when
any of the repo management commands are used.
"""
global config, options, env, orig_path
if config is not None:
return config
if not os.path.isfile(config_file):
logging.critical("Missing config file - is this a repo directory?")
sys.exit(2)
options = opts
config = {}
logging.debug("Reading %s" % config_file)
execfile(config_file, config)
# smartcardoptions must be a list since its command line args for Popen
if 'smartcardoptions' in config:
config['smartcardoptions'] = config['smartcardoptions'].split(' ')
elif 'keystore' in config and config['keystore'] == 'NONE':
# keystore='NONE' means use smartcard, these are required defaults
config['smartcardoptions'] = ['-storetype', 'PKCS11', '-providerName',
'SunPKCS11-OpenSC', '-providerClass',
'sun.security.pkcs11.SunPKCS11',
'-providerArg', 'opensc-fdroid.cfg']
if any(k in config for k in ["keystore", "keystorepass", "keypass"]):
st = os.stat(config_file)
if st.st_mode & stat.S_IRWXG or st.st_mode & stat.S_IRWXO:
logging.warn("unsafe permissions on {0} (should be 0600)!".format(config_file))
fill_config_defaults(config)
# There is no standard, so just set up the most common environment
# variables
env = os.environ
orig_path = env['PATH']
for n in ['ANDROID_HOME', 'ANDROID_SDK']:
env[n] = config['sdk_path']
for k, v in config['java_paths'].items():
env['JAVA%s_HOME' % k] = v
for k in ["keystorepass", "keypass"]:
if k in config:
write_password_file(k)
for k in ["repo_description", "archive_description"]:
if k in config:
config[k] = clean_description(config[k])
if 'serverwebroot' in config:
if isinstance(config['serverwebroot'], basestring):
roots = [config['serverwebroot']]
elif all(isinstance(item, basestring) for item in config['serverwebroot']):
roots = config['serverwebroot']
else:
raise TypeError('only accepts strings, lists, and tuples')
rootlist = []
for rootstr in roots:
# since this is used with rsync, where trailing slashes have
# meaning, ensure there is always a trailing slash
if rootstr[-1] != '/':
rootstr += '/'
rootlist.append(rootstr.replace('//', '/'))
config['serverwebroot'] = rootlist
return config
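The `serverwebroot` normalization at the end of read_config() guarantees exactly one trailing slash on each root, because trailing slashes change what rsync copies. A minimal Python 3 rendering of just that step (the original uses `basestring` since it targets Python 2; the helper name is invented):

```python
def normalize_roots(serverwebroot):
    # Accept a single string, or a list/tuple of strings.
    if isinstance(serverwebroot, str):
        roots = [serverwebroot]
    elif all(isinstance(item, str) for item in serverwebroot):
        roots = list(serverwebroot)
    else:
        raise TypeError('only accepts strings, lists, and tuples')
    rootlist = []
    for rootstr in roots:
        # rsync treats 'dir' and 'dir/' differently, so always end
        # with exactly one slash, collapsing any accidental '//'.
        if rootstr[-1] != '/':
            rootstr += '/'
        rootlist.append(rootstr.replace('//', '/'))
    return rootlist
```

Note the roots here are rsync destinations like `user@host:/var/www/fdroid`, not URLs, so collapsing `//` is safe.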
def find_sdk_tools_cmd(cmd):
'''find a working path to a tool from the Android SDK'''
tooldirs = []
if config is not None and 'sdk_path' in config and os.path.exists(config['sdk_path']):
# try to find a working path to this command, in all the recent possible paths
if 'build_tools' in config:
build_tools = os.path.join(config['sdk_path'], 'build-tools')
# if 'build_tools' was manually set and exists, check only that one
configed_build_tools = os.path.join(build_tools, config['build_tools'])
if os.path.exists(configed_build_tools):
tooldirs.append(configed_build_tools)
else:
# no configed version, so hunt known paths for it
for f in sorted(os.listdir(build_tools), reverse=True):
if os.path.isdir(os.path.join(build_tools, f)):
tooldirs.append(os.path.join(build_tools, f))
tooldirs.append(build_tools)
sdk_tools = os.path.join(config['sdk_path'], 'tools')
if os.path.exists(sdk_tools):
tooldirs.append(sdk_tools)
sdk_platform_tools = os.path.join(config['sdk_path'], 'platform-tools')
if os.path.exists(sdk_platform_tools):
tooldirs.append(sdk_platform_tools)
tooldirs.append('/usr/bin')
for d in tooldirs:
if os.path.isfile(os.path.join(d, cmd)):
return os.path.join(d, cmd)
# did not find the command, exit with error message
ensure_build_tools_exists(config)
def test_sdk_exists(thisconfig):
if 'sdk_path' not in thisconfig:
if 'aapt' in thisconfig and os.path.isfile(thisconfig['aapt']):
return True
else:
logging.error("'sdk_path' not set in config.py!")
return False
if thisconfig['sdk_path'] == default_config['sdk_path']:
logging.error('No Android SDK found!')
logging.error('You can use ANDROID_HOME to set the path to your SDK, i.e.:')
logging.error('\texport ANDROID_HOME=/opt/android-sdk')
return False
if not os.path.exists(thisconfig['sdk_path']):
logging.critical('Android SDK path "' + thisconfig['sdk_path'] + '" does not exist!')
return False
if not os.path.isdir(thisconfig['sdk_path']):
logging.critical('Android SDK path "' + thisconfig['sdk_path'] + '" is not a directory!')
return False
for d in ['build-tools', 'platform-tools', 'tools']:
if not os.path.isdir(os.path.join(thisconfig['sdk_path'], d)):
logging.critical('Android SDK path "%s" does not contain "%s/"!' % (
thisconfig['sdk_path'], d))
return False
return True
def ensure_build_tools_exists(thisconfig):
if not test_sdk_exists(thisconfig):
sys.exit(3)
build_tools = os.path.join(thisconfig['sdk_path'], 'build-tools')
versioned_build_tools = os.path.join(build_tools, thisconfig['build_tools'])
if not os.path.isdir(versioned_build_tools):
logging.critical('Android Build Tools path "'
+ versioned_build_tools + '" does not exist!')
sys.exit(3)
def write_password_file(pwtype, password=None):
'''
writes out passwords to a protected file instead of passing passwords as
command line arguments
'''
filename = '.fdroid.' + pwtype + '.txt'
fd = os.open(filename, os.O_CREAT | os.O_TRUNC | os.O_WRONLY, 0o600)
if password is None:
os.write(fd, config[pwtype])
else:
os.write(fd, password)
os.close(fd)
config[pwtype + 'file'] = filename
# Given the arguments in the form of multiple appid:[vc] strings, this returns
# a dictionary with the set of vercodes specified for each package.
def read_pkg_args(args, allow_vercodes=False):
vercodes = {}
if not args:
return vercodes
for p in args:
if allow_vercodes and ':' in p:
package, vercode = p.split(':')
else:
package, vercode = p, None
if package not in vercodes:
vercodes[package] = [vercode] if vercode else []
continue
elif vercode and vercode not in vercodes[package]:
vercodes[package] += [vercode] if vercode else []
return vercodes
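To make the appid:vercode convention concrete, here is a condensed, self-contained rendering of read_pkg_args() with sample input; an empty list for a package means "all version codes":

```python
def read_pkg_args(args, allow_vercodes=False):
    # Map each appid to the list of version codes requested for it.
    vercodes = {}
    for p in args or []:
        if allow_vercodes and ':' in p:
            package, vercode = p.split(':')
        else:
            package, vercode = p, None
        if package not in vercodes:
            vercodes[package] = [vercode] if vercode else []
        elif vercode and vercode not in vercodes[package]:
            vercodes[package].append(vercode)
    return vercodes

# e.g. 'fdroid build org.app:10 org.app:11 org.other'
parsed = read_pkg_args(['org.app:10', 'org.app:11', 'org.other'],
                       allow_vercodes=True)
```

Duplicate appid:vercode pairs are collapsed, and a bare appid always resets nothing: once a package is present, later bare mentions are ignored.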
# On top of what read_pkg_args does, this returns the whole app metadata, but
# limiting the builds list to the builds matching the vercodes specified.
def read_app_args(args, allapps, allow_vercodes=False):
vercodes = read_pkg_args(args, allow_vercodes)
if not vercodes:
return allapps
apps = {}
for appid, app in allapps.iteritems():
if appid in vercodes:
apps[appid] = app
if len(apps) != len(vercodes):
for p in vercodes:
if p not in allapps:
logging.critical("No such package: %s" % p)
raise FDroidException("Found invalid app ids in arguments")
if not apps:
raise FDroidException("No packages specified")
error = False
for appid, app in apps.iteritems():
vc = vercodes[appid]
if not vc:
continue
app.builds = [b for b in app.builds if b.vercode in vc]
if len(app.builds) != len(vercodes[appid]):
error = True
allvcs = [b.vercode for b in app.builds]
for v in vercodes[appid]:
if v not in allvcs:
logging.critical("No such vercode %s for app %s" % (v, appid))
if error:
raise FDroidException("Found invalid vercodes for some apps")
return apps
def get_extension(filename):
base, ext = os.path.splitext(filename)
if not ext:
return base, ''
return base, ext.lower()[1:]
def has_extension(filename, ext):
_, f_ext = get_extension(filename)
return ext == f_ext
apk_regex = re.compile(r"^(.+)_([0-9]+)\.apk$")
def clean_description(description):
'Remove unneeded newlines and spaces from a block of description text'
returnstring = ''
# this is split up by paragraph to make removing the newlines easier
for paragraph in re.split(r'\n\n', description):
paragraph = re.sub('\r', '', paragraph)
paragraph = re.sub('\n', ' ', paragraph)
paragraph = re.sub(' {2,}', ' ', paragraph)
paragraph = re.sub('^\s*(\w)', r'\1', paragraph)
returnstring += paragraph + '\n\n'
return returnstring.rstrip('\n')
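The paragraph-by-paragraph approach in clean_description() means single (wrapped) newlines collapse to spaces while blank-line paragraph breaks survive. A self-contained sketch with sample input:

```python
import re

def clean_description(description):
    # Work paragraph by paragraph, as above: within a paragraph,
    # newlines become spaces and runs of spaces are collapsed.
    out = ''
    for paragraph in re.split(r'\n\n', description):
        paragraph = paragraph.replace('\r', '')
        paragraph = re.sub('\n', ' ', paragraph)
        paragraph = re.sub(' {2,}', ' ', paragraph)
        paragraph = re.sub(r'^\s*(\w)', r'\1', paragraph)
        out += paragraph + '\n\n'
    return out.rstrip('\n')

text = "First line\nwrapped.\n\nSecond paragraph."
```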
def apknameinfo(filename):
filename = os.path.basename(filename)
m = apk_regex.match(filename)
try:
result = (m.group(1), m.group(2))
except AttributeError:
raise FDroidException("Invalid apk name: %s" % filename)
return result
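apk_regex splits repo filenames of the form `<appid>_<vercode>.apk` back into their parts. A self-contained sketch of apknameinfo() (raising ValueError here, where the original raises FDroidException):

```python
import re

apk_regex = re.compile(r"^(.+)_([0-9]+)\.apk$")

def apknameinfo(filename):
    # Returns (appid, vercode) from names like 'org.example.app_42.apk'.
    m = apk_regex.match(filename)
    if m is None:
        raise ValueError("Invalid apk name: %s" % filename)
    return m.group(1), m.group(2)
```

The greedy `(.+)` means an appid may itself contain underscores; only the final `_<digits>.apk` is split off as the version code.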
def getapkname(app, build):
return "%s_%s.apk" % (app.id, build.vercode)
def getsrcname(app, build):
return "%s_%s_src.tar.gz" % (app.id, build.vercode)
def getappname(app):
if app.Name:
return app.Name
if app.AutoName:
return app.AutoName
return app.id
def getcvname(app):
return '%s (%s)' % (app.CurrentVersion, app.CurrentVersionCode)
def getvcs(vcstype, remote, local):
if vcstype == 'git':
return vcs_git(remote, local)
if vcstype == 'git-svn':
return vcs_gitsvn(remote, local)
if vcstype == 'hg':
return vcs_hg(remote, local)
if vcstype == 'bzr':
return vcs_bzr(remote, local)
if vcstype == 'srclib':
if local != os.path.join('build', 'srclib', remote):
raise VCSException("Error: srclib paths are hard-coded!")
return getsrclib(remote, os.path.join('build', 'srclib'), raw=True)
if vcstype == 'svn':
raise VCSException("Deprecated vcs type 'svn' - please use 'git-svn' instead")
raise VCSException("Invalid vcs type " + vcstype)
def getsrclibvcs(name):
if name not in metadata.srclibs:
raise VCSException("Missing srclib " + name)
return metadata.srclibs[name]['Repo Type']
class vcs:
def __init__(self, remote, local):
# svn, git-svn and bzr may require auth
self.username = None
if self.repotype() in ('git-svn', 'bzr'):
if '@' in remote:
if self.repotype == 'git-svn':
raise VCSException("Authentication is not supported for git-svn")
self.username, remote = remote.split('@')
if ':' not in self.username:
raise VCSException("Password required with username")
self.username, self.password = self.username.split(':')
self.remote = remote
self.local = local
self.clone_failed = False
self.refreshed = False
self.srclib = None
def repotype(self):
return None
# Take the local repository to a clean version of the given revision, which
# is specified in the VCS's native format. Beforehand, the repository can
# be dirty, or even non-existent. If the repository does already exist
# locally, it will be updated from the origin, but only once in the
# lifetime of the vcs object.
# None is acceptable for 'rev' if you know you are cloning a clean copy of
# the repo - otherwise it must specify a valid revision.
def gotorevision(self, rev, refresh=True):
if self.clone_failed:
raise VCSException("Downloading the repository already failed once, not trying again.")
# The .fdroidvcs-id file for a repo tells us what VCS type
# and remote that directory was created from, allowing us to drop it
# automatically if either of those things changes.
fdpath = os.path.join(self.local, '..',
'.fdroidvcs-' + os.path.basename(self.local))
cdata = self.repotype() + ' ' + self.remote
writeback = True
deleterepo = False
if os.path.exists(self.local):
if os.path.exists(fdpath):
with open(fdpath, 'r') as f:
fsdata = f.read().strip()
if fsdata == cdata:
writeback = False
else:
deleterepo = True
logging.info("Repository details for %s changed - deleting" % (
self.local))
else:
deleterepo = True
logging.info("Repository details for %s missing - deleting" % (
self.local))
if deleterepo:
shutil.rmtree(self.local)
exc = None
if not refresh:
self.refreshed = True
try:
self.gotorevisionx(rev)
except FDroidException as e:
exc = e
# If necessary, write the .fdroidvcs file.
if writeback and not self.clone_failed:
with open(fdpath, 'w') as f:
f.write(cdata)
if exc is not None:
raise exc
# Derived classes need to implement this. It's called once basic checking
# has been performed.
def gotorevisionx(self, rev):
raise VCSException("This VCS type doesn't define gotorevisionx")
# Initialise and update submodules
def initsubmodules(self):
raise VCSException('Submodules not supported for this vcs type')
# Get a list of all known tags
def gettags(self):
if not self._gettags:
raise VCSException('gettags not supported for this vcs type')
rtags = []
for tag in self._gettags():
if re.match('[-A-Za-z0-9_. /]+$', tag):
rtags.append(tag)
return rtags
def latesttags(self, tags, number):
"""Get the most recent tags in a given list.
:param tags: a list of tags
:param number: the number to return
:returns: A list containing the most recent tags in the provided
list, up to the maximum number given.
"""
raise VCSException('latesttags not supported for this vcs type')
# Get current commit reference (hash, revision, etc)
def getref(self):
raise VCSException('getref not supported for this vcs type')
# Returns the srclib (name, path) used in setting up the current
# revision, or None.
def getsrclib(self):
return self.srclib
class vcs_git(vcs):
def repotype(self):
return 'git'
# If the local directory exists, but is somehow not a git repository, git
# will traverse up the directory tree until it finds one that is (i.e.
# fdroidserver) and then we'll proceed to destroy it! This is called as
# a safety check.
def checkrepo(self):
p = FDroidPopen(['git', 'rev-parse', '--show-toplevel'], cwd=self.local, output=False)
result = p.output.rstrip()
if not result.endswith(self.local):
raise VCSException('Repository mismatch')
def gotorevisionx(self, rev):
if not os.path.exists(self.local):
# Brand new checkout
p = FDroidPopen(['git', 'clone', self.remote, self.local])
if p.returncode != 0:
self.clone_failed = True
raise VCSException("Git clone failed", p.output)
self.checkrepo()
else:
self.checkrepo()
# Discard any working tree changes
p = FDroidPopen(['git', 'submodule', 'foreach', '--recursive',
'git', 'reset', '--hard'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git reset failed", p.output)
# Remove untracked files now, in case they're tracked in the target
# revision (it happens!)
p = FDroidPopen(['git', 'submodule', 'foreach', '--recursive',
'git', 'clean', '-dffx'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git clean failed", p.output)
if not self.refreshed:
# Get latest commits and tags from remote
p = FDroidPopen(['git', 'fetch', 'origin'], cwd=self.local)
if p.returncode != 0:
raise VCSException("Git fetch failed", p.output)
p = FDroidPopen(['git', 'fetch', '--prune', '--tags', 'origin'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git fetch failed", p.output)
# Recreate origin/HEAD as git clone would do it, in case it disappeared
p = FDroidPopen(['git', 'remote', 'set-head', 'origin', '--auto'], cwd=self.local, output=False)
if p.returncode != 0:
lines = p.output.splitlines()
if 'Multiple remote HEAD branches' not in lines[0]:
raise VCSException("Git remote set-head failed", p.output)
branch = lines[1].split(' ')[-1]
p2 = FDroidPopen(['git', 'remote', 'set-head', 'origin', branch], cwd=self.local, output=False)
if p2.returncode != 0:
raise VCSException("Git remote set-head failed", p.output + '\n' + p2.output)
self.refreshed = True
# origin/HEAD is the HEAD of the remote, e.g. the "default branch" on
# a github repo. Most of the time this is the same as origin/master.
rev = rev or 'origin/HEAD'
p = FDroidPopen(['git', 'checkout', '-f', rev], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git checkout of '%s' failed" % rev, p.output)
# Get rid of any uncontrolled files left behind
p = FDroidPopen(['git', 'clean', '-dffx'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git clean failed", p.output)
def initsubmodules(self):
self.checkrepo()
submfile = os.path.join(self.local, '.gitmodules')
if not os.path.isfile(submfile):
raise VCSException("No git submodules available")
# fix submodules not accessible without an account and public key auth
with open(submfile, 'r') as f:
lines = f.readlines()
with open(submfile, 'w') as f:
for line in lines:
if 'git@github.com' in line:
line = line.replace('git@github.com:', 'https://github.com/')
if 'git@gitlab.com' in line:
line = line.replace('git@gitlab.com:', 'https://gitlab.com/')
f.write(line)
p = FDroidPopen(['git', 'submodule', 'sync'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git submodule sync failed", p.output)
p = FDroidPopen(['git', 'submodule', 'update', '--init', '--force', '--recursive'], cwd=self.local)
if p.returncode != 0:
raise VCSException("Git submodule update failed", p.output)
def _gettags(self):
self.checkrepo()
p = FDroidPopen(['git', 'tag'], cwd=self.local, output=False)
return p.output.splitlines()
def latesttags(self, tags, number):
self.checkrepo()
tl = []
for tag in tags:
p = FDroidPopen(
['git', 'show', '--format=format:%ct', '-s', tag],
cwd=self.local, output=False)
# Timestamp is on the last line. For a normal tag, it's the only
# line, but for annotated tags, the rest of the info precedes it.
ts = int(p.output.splitlines()[-1])
tl.append((ts, tag))
latest = []
for _, t in sorted(tl)[-number:]:
latest.append(t)
return latest
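latesttags() above orders tags by their commit timestamp (from `git show --format=format:%ct -s <tag>`) and keeps the newest N. The selection step in isolation, with invented sample data:

```python
def latest_n_tags(tag_timestamps, number):
    # tag_timestamps: list of (unix_timestamp, tagname) pairs, as built
    # in latesttags(); sorting tuples orders by timestamp first, and
    # the [-number:] slice keeps the most recent entries, oldest first.
    return [tag for _, tag in sorted(tag_timestamps)[-number:]]

pairs = [(1400000000, 'v1.0'), (1450000000, 'v1.1'), (1300000000, 'v0.9')]
```

Sorting by commit date rather than tag name is what lets checkupdates handle schemes where lexical order and release order disagree (e.g. `v9` vs `v10`).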
class vcs_gitsvn(vcs):
def repotype(self):
return 'git-svn'
# If the local directory exists, but is somehow not a git repository, git
# will traverse up the directory tree until it finds one that is (i.e.
# fdroidserver) and then we'll proceed to destroy it! This is called as
# a safety check.
def checkrepo(self):
p = FDroidPopen(['git', 'rev-parse', '--show-toplevel'], cwd=self.local, output=False)
result = p.output.rstrip()
if not result.endswith(self.local):
raise VCSException('Repository mismatch')
def gotorevisionx(self, rev):
if not os.path.exists(self.local):
# Brand new checkout
gitsvn_args = ['git', 'svn', 'clone']
if ';' in self.remote:
remote_split = self.remote.split(';')
for i in remote_split[1:]:
if i.startswith('trunk='):
gitsvn_args.extend(['-T', i[6:]])
elif i.startswith('tags='):
gitsvn_args.extend(['-t', i[5:]])
elif i.startswith('branches='):
gitsvn_args.extend(['-b', i[9:]])
gitsvn_args.extend([remote_split[0], self.local])
p = FDroidPopen(gitsvn_args, output=False)
if p.returncode != 0:
self.clone_failed = True
raise VCSException("Git svn clone failed", p.output)
else:
gitsvn_args.extend([self.remote, self.local])
p = FDroidPopen(gitsvn_args, output=False)
if p.returncode != 0:
self.clone_failed = True
raise VCSException("Git svn clone failed", p.output)
self.checkrepo()
else:
self.checkrepo()
# Discard any working tree changes
p = FDroidPopen(['git', 'reset', '--hard'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git reset failed", p.output)
# Remove untracked files now, in case they're tracked in the target
# revision (it happens!)
p = FDroidPopen(['git', 'clean', '-dffx'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git clean failed", p.output)
if not self.refreshed:
# Get new commits, branches and tags from repo
p = FDroidPopen(['git', 'svn', 'fetch'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git svn fetch failed", p.output)
p = FDroidPopen(['git', 'svn', 'rebase'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git svn rebase failed", p.output)
self.refreshed = True
rev = rev or 'master'
if rev:
nospaces_rev = rev.replace(' ', '%20')
# Try finding a svn tag
for treeish in ['origin/', '']:
p = FDroidPopen(['git', 'checkout', treeish + 'tags/' + nospaces_rev], cwd=self.local, output=False)
if p.returncode == 0:
break
if p.returncode != 0:
# No tag found, normal svn rev translation
# Translate svn rev into git format
rev_split = rev.split('/')
p = None
for treeish in ['origin/', '']:
if len(rev_split) > 1:
treeish += rev_split[0]
svn_rev = rev_split[1]
else:
# if no branch is specified, then assume trunk (i.e. 'master' branch):
treeish += 'master'
svn_rev = rev
svn_rev = svn_rev if svn_rev[0] == 'r' else 'r' + svn_rev
p = FDroidPopen(['git', 'svn', 'find-rev', '--before', svn_rev, treeish], cwd=self.local, output=False)
git_rev = p.output.rstrip()
if p.returncode == 0 and git_rev:
break
if p.returncode != 0 or not git_rev:
# Try a plain git checkout as a last resort
p = FDroidPopen(['git', 'checkout', rev], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("No git treeish found and direct git checkout of '%s' failed" % rev, p.output)
else:
# Check out the git rev equivalent to the svn rev
p = FDroidPopen(['git', 'checkout', git_rev], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git checkout of '%s' failed" % rev, p.output)
# Get rid of any uncontrolled files left behind
p = FDroidPopen(['git', 'clean', '-dffx'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git clean failed", p.output)
def _gettags(self):
self.checkrepo()
for treeish in ['origin/', '']:
d = os.path.join(self.local, '.git', 'svn', 'refs', 'remotes', treeish, 'tags')
if os.path.isdir(d):
return os.listdir(d)
def getref(self):
self.checkrepo()
p = FDroidPopen(['git', 'svn', 'find-rev', 'HEAD'], cwd=self.local, output=False)
if p.returncode != 0:
return None
return p.output.strip()
class vcs_hg(vcs):
def repotype(self):
return 'hg'
def gotorevisionx(self, rev):
if not os.path.exists(self.local):
p = FDroidPopen(['hg', 'clone', self.remote, self.local], output=False)
if p.returncode != 0:
self.clone_failed = True
raise VCSException("Hg clone failed", p.output)
else:
p = FDroidPopen(['hg', 'status', '-uS'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Hg status failed", p.output)
for line in p.output.splitlines():
if not line.startswith('? '):
raise VCSException("Unexpected output from hg status -uS: " + line)
FDroidPopen(['rm', '-rf', line[2:]], cwd=self.local, output=False)
if not self.refreshed:
p = FDroidPopen(['hg', 'pull'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Hg pull failed", p.output)
self.refreshed = True
rev = rev or 'default'
p = FDroidPopen(['hg', 'update', '-C', rev], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Hg checkout of '%s' failed" % rev, p.output)
p = FDroidPopen(['hg', 'purge', '--all'], cwd=self.local, output=False)
# Also delete untracked files, we have to enable purge extension for that:
if "'purge' is provided by the following extension" in p.output:
with open(os.path.join(self.local, '.hg', 'hgrc'), "a") as myfile:
myfile.write("\n[extensions]\nhgext.purge=\n")
p = FDroidPopen(['hg', 'purge', '--all'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("HG purge failed", p.output)
elif p.returncode != 0:
raise VCSException("HG purge failed", p.output)
def _gettags(self):
p = FDroidPopen(['hg', 'tags', '-q'], cwd=self.local, output=False)
return p.output.splitlines()[1:]
class vcs_bzr(vcs):
def repotype(self):
return 'bzr'
def gotorevisionx(self, rev):
if not os.path.exists(self.local):
p = FDroidPopen(['bzr', 'branch', self.remote, self.local], output=False)
if p.returncode != 0:
self.clone_failed = True
raise VCSException("Bzr branch failed", p.output)
else:
p = FDroidPopen(['bzr', 'clean-tree', '--force', '--unknown', '--ignored'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Bzr revert failed", p.output)
if not self.refreshed:
p = FDroidPopen(['bzr', 'pull'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Bzr update failed", p.output)
self.refreshed = True
revargs = list(['-r', rev] if rev else [])
p = FDroidPopen(['bzr', 'revert'] + revargs, cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Bzr revert of '%s' failed" % rev, p.output)
def _gettags(self):
p = FDroidPopen(['bzr', 'tags'], cwd=self.local, output=False)
return [tag.split(' ')[0].strip() for tag in
p.output.splitlines()]
def unescape_string(string):
if len(string) < 2:
return string
if string[0] == '"' and string[-1] == '"':
return string[1:-1]
return string.replace("\\'", "'")
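The quoting rules above are easy to misread at a glance; this minimal standalone sketch mirrors the function so its behavior can be checked in isolation:

```python
def unescape_string(string):
    # Too short to carry a surrounding pair of quotes
    if len(string) < 2:
        return string
    # A matched pair of double quotes is stripped verbatim
    if string[0] == '"' and string[-1] == '"':
        return string[1:-1]
    # Otherwise only escaped single quotes are unescaped
    return string.replace("\\'", "'")

print(unescape_string('"hello"'))  # hello
print(unescape_string("it\\'s"))   # it's
```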
def retrieve_string(app_dir, string, xmlfiles=None):
if not string.startswith('@string/'):
return unescape_string(string)
if xmlfiles is None:
xmlfiles = []
for res_dir in [
os.path.join(app_dir, 'res'),
os.path.join(app_dir, 'src', 'main', 'res'),
]:
for r, d, f in os.walk(res_dir):
if os.path.basename(r) == 'values':
xmlfiles += [os.path.join(r, x) for x in f if x.endswith('.xml')]
name = string[len('@string/'):]
def element_content(element):
if element.text is None:
return ""
s = XMLElementTree.tostring(element, encoding='utf-8', method='text')
return s.strip()
for path in xmlfiles:
if not os.path.isfile(path):
continue
xml = parse_xml(path)
element = xml.find('string[@name="' + name + '"]')
if element is not None:
content = element_content(element)
return retrieve_string(app_dir, content, xmlfiles)
return ''
def retrieve_string_singleline(app_dir, string, xmlfiles=None):
return retrieve_string(app_dir, string, xmlfiles).replace('\n', ' ').strip()
# Return list of existing files that will be used to find the highest vercode
def manifest_paths(app_dir, flavours):
possible_manifests = \
[os.path.join(app_dir, 'AndroidManifest.xml'),
os.path.join(app_dir, 'src', 'main', 'AndroidManifest.xml'),
os.path.join(app_dir, 'src', 'AndroidManifest.xml'),
os.path.join(app_dir, 'build.gradle')]
for flavour in flavours:
if flavour == 'yes':
continue
possible_manifests.append(
os.path.join(app_dir, 'src', flavour, 'AndroidManifest.xml'))
return [path for path in possible_manifests if os.path.isfile(path)]
# Retrieve the application's real name (label) from the manifest.
# Returns the name, or None if not found.
def fetch_real_name(app_dir, flavours):
for path in manifest_paths(app_dir, flavours):
if not has_extension(path, 'xml') or not os.path.isfile(path):
continue
logging.debug("fetch_real_name: Checking manifest at " + path)
xml = parse_xml(path)
app = xml.find('application')
if app is None:
continue
if "{http://schemas.android.com/apk/res/android}label" not in app.attrib:
continue
label = app.attrib["{http://schemas.android.com/apk/res/android}label"].encode('utf-8')
result = retrieve_string_singleline(app_dir, label)
if result:
result = result.strip()
return result
return None
def get_library_references(root_dir):
libraries = []
proppath = os.path.join(root_dir, 'project.properties')
if not os.path.isfile(proppath):
return libraries
for line in file(proppath):
if not line.startswith('android.library.reference.'):
continue
path = line.split('=')[1].strip()
relpath = os.path.join(root_dir, path)
if not os.path.isdir(relpath):
continue
logging.debug("Found subproject at %s" % path)
libraries.append(path)
return libraries
def ant_subprojects(root_dir):
subprojects = get_library_references(root_dir)
for subpath in subprojects:
subrelpath = os.path.join(root_dir, subpath)
for p in get_library_references(subrelpath):
relp = os.path.normpath(os.path.join(subpath, p))
if relp not in subprojects:
subprojects.insert(0, relp)
return subprojects
def remove_debuggable_flags(root_dir):
# Remove forced debuggable flags
logging.debug("Removing debuggable flags from %s" % root_dir)
for root, dirs, files in os.walk(root_dir):
if 'AndroidManifest.xml' in files:
regsub_file(r'android:debuggable="[^"]*"',
'',
os.path.join(root, 'AndroidManifest.xml'))
vcsearch_g = re.compile(r'.*versionCode *=* *["\']*([0-9]+)["\']*').search
vnsearch_g = re.compile(r'.*versionName *=* *(["\'])((?:(?=(\\?))\3.)*?)\1.*').search
psearch_g = re.compile(r'.*(packageName|applicationId) *=* *["\']([^"]+)["\'].*').search
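To show which capture groups carry the values, this standalone sketch copies the versionCode and applicationId patterns above (assumed unchanged) and runs them against typical build.gradle lines:

```python
import re

# Copies of the patterns defined above; the group numbers match how
# parse_androidmanifests() uses them.
vcsearch = re.compile(r'.*versionCode *=* *["\']*([0-9]+)["\']*').search
psearch = re.compile(r'.*(packageName|applicationId) *=* *["\']([^"]+)["\'].*').search

print(vcsearch('        versionCode 42').group(1))              # 42
print(psearch('    applicationId "org.example.app"').group(2))  # org.example.app
```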
def app_matches_packagename(app, package):
if not package:
return False
appid = app.UpdateCheckName or app.id
if appid is None or appid == "Ignore":
return True
return appid == package
# Extract some information from the AndroidManifest.xml at the given path.
# Returns (version, vercode, package), any or all of which might be None.
# All values returned are strings.
def parse_androidmanifests(paths, app):
ignoreversions = app.UpdateCheckIgnore
ignoresearch = re.compile(ignoreversions).search if ignoreversions else None
if not paths:
return (None, None, None)
max_version = None
max_vercode = None
max_package = None
for path in paths:
if not os.path.isfile(path):
continue
logging.debug("Parsing manifest at {0}".format(path))
gradle = has_extension(path, 'gradle')
version = None
vercode = None
package = None
if gradle:
for line in file(path):
if gradle_comment.match(line):
continue
# Grab the first occurrence of each to avoid running into
# alternative flavours and builds.
if not package:
matches = psearch_g(line)
if matches:
s = matches.group(2)
if app_matches_packagename(app, s):
package = s
if not version:
matches = vnsearch_g(line)
if matches:
version = matches.group(2)
if not vercode:
matches = vcsearch_g(line)
if matches:
vercode = matches.group(1)
else:
try:
xml = parse_xml(path)
if "package" in xml.attrib:
s = xml.attrib["package"].encode('utf-8')
if app_matches_packagename(app, s):
package = s
if "{http://schemas.android.com/apk/res/android}versionName" in xml.attrib:
version = xml.attrib["{http://schemas.android.com/apk/res/android}versionName"].encode('utf-8')
base_dir = os.path.dirname(path)
version = retrieve_string_singleline(base_dir, version)
if "{http://schemas.android.com/apk/res/android}versionCode" in xml.attrib:
a = xml.attrib["{http://schemas.android.com/apk/res/android}versionCode"].encode('utf-8')
if string_is_integer(a):
vercode = a
except Exception:
logging.warning("Problem with xml at {0}".format(path))
# Remember package name, may be defined separately from version+vercode
if package is None:
package = max_package
logging.debug("..got package={0}, version={1}, vercode={2}"
.format(package, version, vercode))
# Always grab the package name and version name in case they are not
# together with the highest version code
if max_package is None and package is not None:
max_package = package
if max_version is None and version is not None:
max_version = version
if max_vercode is None or (vercode is not None and vercode > max_vercode):
if not ignoresearch or not ignoresearch(version):
if version is not None:
max_version = version
if vercode is not None:
max_vercode = vercode
if package is not None:
max_package = package
else:
max_version = "Ignore"
if max_version is None:
max_version = "Unknown"
if max_package and not is_valid_package_name(max_package):
raise FDroidException("Invalid package name {0}".format(max_package))
return (max_version, max_vercode, max_package)
def is_valid_package_name(name):
return re.match("[A-Za-z_][A-Za-z_0-9.]+$", name)
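A quick check of what that pattern accepts; note it requires a leading letter or underscore and, because of the `+`, at least two characters in total:

```python
import re

# Mirror of is_valid_package_name() above
def is_valid_package_name(name):
    return re.match("[A-Za-z_][A-Za-z_0-9.]+$", name)

print(bool(is_valid_package_name("org.fdroid.fdroid")))  # True
print(bool(is_valid_package_name("1badstart")))          # False
```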
class FDroidException(Exception):
def __init__(self, value, detail=None):
self.value = value
self.detail = detail
def shortened_detail(self):
if len(self.detail) < 16000:
return self.detail
return '[...]\n' + self.detail[-16000:]
def get_wikitext(self):
ret = repr(self.value) + "\n"
if self.detail:
ret += "=detail=\n"
ret += "<pre>\n" + self.shortened_detail() + "</pre>\n"
return ret
def __str__(self):
ret = self.value
if self.detail:
ret += "\n==== detail begin ====\n%s\n==== detail end ====" % self.detail.strip()
return ret
class VCSException(FDroidException):
pass
class BuildException(FDroidException):
pass
# Get the specified source library.
# Returns the path to it. Normally this is the path to be used when referencing
# it, which may be a subdirectory of the actual project. If you want the base
# directory of the project, pass 'basepath=True'.
def getsrclib(spec, srclib_dir, subdir=None, basepath=False,
raw=False, prepare=True, preponly=False, refresh=True):
number = None
subdir = None
if raw:
name = spec
ref = None
else:
name, ref = spec.split('@')
if ':' in name:
number, name = name.split(':', 1)
if '/' in name:
name, subdir = name.split('/', 1)
if name not in metadata.srclibs:
raise VCSException('srclib ' + name + ' not found.')
srclib = metadata.srclibs[name]
sdir = os.path.join(srclib_dir, name)
if not preponly:
vcs = getvcs(srclib["Repo Type"], srclib["Repo"], sdir)
vcs.srclib = (name, number, sdir)
if ref:
vcs.gotorevision(ref, refresh)
if raw:
return vcs
libdir = None
if subdir:
libdir = os.path.join(sdir, subdir)
elif srclib["Subdir"]:
for subdir in srclib["Subdir"]:
libdir_candidate = os.path.join(sdir, subdir)
if os.path.exists(libdir_candidate):
libdir = libdir_candidate
break
if libdir is None:
libdir = sdir
remove_signing_keys(sdir)
remove_debuggable_flags(sdir)
if prepare:
if srclib["Prepare"]:
cmd = replace_config_vars(srclib["Prepare"], None)
p = FDroidPopen(['bash', '-x', '-c', cmd], cwd=libdir)
if p.returncode != 0:
raise BuildException("Error running prepare command for srclib %s"
% name, p.output)
if basepath:
libdir = sdir
return (name, number, libdir)
gradle_version_regex = re.compile(r"[^/]*'com\.android\.tools\.build:gradle:([^\.]+\.[^\.]+).*'.*")
# Prepare the source code for a particular build
# 'vcs' - the appropriate vcs object for the application
# 'app' - the application details from the metadata
# 'build' - the build details from the metadata
# 'build_dir' - the path to the build directory, usually
# 'build/app.id'
# 'srclib_dir' - the path to the source libraries directory, usually
# 'build/srclib'
# 'extlib_dir' - the path to the external libraries directory, usually
# 'build/extlib'
# Returns the (root, srclibpaths) where:
# 'root' is the root directory, which may be the same as 'build_dir' or may
# be a subdirectory of it.
# 'srclibpaths' is information on the srclibs being used
def prepare_source(vcs, app, build, build_dir, srclib_dir, extlib_dir, onserver=False, refresh=True):
# Optionally, the actual app source can be in a subdirectory
if build.subdir:
root_dir = os.path.join(build_dir, build.subdir)
else:
root_dir = build_dir
# Get a working copy of the right revision
logging.info("Getting source for revision " + build.commit)
vcs.gotorevision(build.commit, refresh)
# Initialise submodules if required
if build.submodules:
logging.info("Initialising submodules")
vcs.initsubmodules()
# Check that a subdir (if we're using one) exists. This has to happen
# after the checkout, since it might not exist elsewhere
if not os.path.exists(root_dir):
raise BuildException('Missing subdir ' + root_dir)
# Run an init command if one is required
if build.init:
cmd = replace_config_vars(build.init, build)
logging.info("Running 'init' commands in %s" % root_dir)
p = FDroidPopen(['bash', '-x', '-c', cmd], cwd=root_dir)
if p.returncode != 0:
raise BuildException("Error running init command for %s:%s" %
(app.id, build.version), p.output)
# Apply patches if any
if build.patch:
logging.info("Applying patches")
for patch in build.patch:
patch = patch.strip()
logging.info("Applying " + patch)
patch_path = os.path.join('metadata', app.id, patch)
p = FDroidPopen(['patch', '-p1', '-i', os.path.abspath(patch_path)], cwd=build_dir)
if p.returncode != 0:
raise BuildException("Failed to apply patch %s" % patch_path)
# Get required source libraries
srclibpaths = []
if build.srclibs:
logging.info("Collecting source libraries")
for lib in build.srclibs:
srclibpaths.append(getsrclib(lib, srclib_dir, build, preponly=onserver, refresh=refresh))
for name, number, libpath in srclibpaths:
place_srclib(root_dir, int(number) if number else None, libpath)
basesrclib = vcs.getsrclib()
# If one was used for the main source, add that too.
if basesrclib:
srclibpaths.append(basesrclib)
# Update the local.properties file
localprops = [os.path.join(build_dir, 'local.properties')]
if build.subdir:
parts = build.subdir.split(os.sep)
cur = build_dir
for d in parts:
cur = os.path.join(cur, d)
localprops += [os.path.join(cur, 'local.properties')]
for path in localprops:
props = ""
if os.path.isfile(path):
logging.info("Updating local.properties file at %s" % path)
with open(path, 'r') as f:
props += f.read()
props += '\n'
else:
logging.info("Creating local.properties file at %s" % path)
# Fix old-fashioned 'sdk-location' by copying
# from sdk.dir, if necessary
if build.oldsdkloc:
sdkloc = re.match(r".*^sdk.dir=(\S+)$.*", props,
re.S | re.M).group(1)
props += "sdk-location=%s\n" % sdkloc
else:
props += "sdk.dir=%s\n" % config['sdk_path']
props += "sdk-location=%s\n" % config['sdk_path']
ndk_path = build.ndk_path()
if ndk_path:
# Add ndk location
props += "ndk.dir=%s\n" % ndk_path
props += "ndk-location=%s\n" % ndk_path
# Add java.encoding if necessary
if build.encoding:
props += "java.encoding=%s\n" % build.encoding
with open(path, 'w') as f:
f.write(props)
flavours = []
if build.build_method() == 'gradle':
flavours = build.gradle
if build.target:
n = build.target.split('-')[1]
regsub_file(r'compileSdkVersion[ =]+[0-9]+',
r'compileSdkVersion %s' % n,
os.path.join(root_dir, 'build.gradle'))
# Remove forced debuggable flags
remove_debuggable_flags(root_dir)
# Insert version code and number into the manifest if necessary
if build.forceversion:
logging.info("Changing the version name")
for path in manifest_paths(root_dir, flavours):
if not os.path.isfile(path):
continue
if has_extension(path, 'xml'):
regsub_file(r'android:versionName="[^"]*"',
r'android:versionName="%s"' % build.version,
path)
elif has_extension(path, 'gradle'):
regsub_file(r"""(\s*)versionName[\s'"=]+.*""",
r"""\1versionName '%s'""" % build.version,
path)
if build.forcevercode:
logging.info("Changing the version code")
for path in manifest_paths(root_dir, flavours):
if not os.path.isfile(path):
continue
if has_extension(path, 'xml'):
regsub_file(r'android:versionCode="[^"]*"',
r'android:versionCode="%s"' % build.vercode,
path)
elif has_extension(path, 'gradle'):
regsub_file(r'versionCode[ =]+[0-9]+',
r'versionCode %s' % build.vercode,
path)
# Delete unwanted files
if build.rm:
logging.info("Removing specified files")
for part in getpaths(build_dir, build.rm):
dest = os.path.join(build_dir, part)
logging.info("Removing {0}".format(part))
if os.path.lexists(dest):
if os.path.islink(dest):
FDroidPopen(['unlink', dest], output=False)
else:
FDroidPopen(['rm', '-rf', dest], output=False)
else:
logging.info("...but it didn't exist")
remove_signing_keys(build_dir)
# Add required external libraries
if build.extlibs:
logging.info("Collecting prebuilt libraries")
libsdir = os.path.join(root_dir, 'libs')
if not os.path.exists(libsdir):
os.mkdir(libsdir)
for lib in build.extlibs:
lib = lib.strip()
logging.info("...installing extlib {0}".format(lib))
libf = os.path.basename(lib)
libsrc = os.path.join(extlib_dir, lib)
if not os.path.exists(libsrc):
raise BuildException("Missing extlib file {0}".format(libsrc))
shutil.copyfile(libsrc, os.path.join(libsdir, libf))
# Run a pre-build command if one is required
if build.prebuild:
logging.info("Running 'prebuild' commands in %s" % root_dir)
cmd = replace_config_vars(build.prebuild, build)
# Substitute source library paths into prebuild commands
for name, number, libpath in srclibpaths:
libpath = os.path.relpath(libpath, root_dir)
cmd = cmd.replace('$$' + name + '$$', libpath)
p = FDroidPopen(['bash', '-x', '-c', cmd], cwd=root_dir)
if p.returncode != 0:
raise BuildException("Error running prebuild command for %s:%s" %
(app.id, build.version), p.output)
# Generate (or update) the ant build file, build.xml...
if build.build_method() == 'ant' and build.update != ['no']:
parms = ['android', 'update', 'lib-project']
lparms = ['android', 'update', 'project']
if build.target:
parms += ['-t', build.target]
lparms += ['-t', build.target]
if build.update:
update_dirs = build.update
else:
update_dirs = ant_subprojects(root_dir) + ['.']
for d in update_dirs:
subdir = os.path.join(root_dir, d)
if d == '.':
logging.debug("Updating main project")
cmd = parms + ['-p', d]
else:
logging.debug("Updating subproject %s" % d)
cmd = lparms + ['-p', d]
p = SdkToolsPopen(cmd, cwd=root_dir)
# Check to see whether an error was returned without a proper exit
# code (this is the case for the 'no target set or target invalid'
# error)
if p.returncode != 0 or p.output.startswith("Error: "):
raise BuildException("Failed to update project at %s" % d, p.output)
# Clean update dirs via ant
if d != '.':
logging.info("Cleaning subproject %s" % d)
p = FDroidPopen(['ant', 'clean'], cwd=subdir)
return (root_dir, srclibpaths)
# Extend via globbing the paths from a field and return them as a map from
# original path to resulting paths
def getpaths_map(build_dir, globpaths):
paths = dict()
for p in globpaths:
p = p.strip()
full_path = os.path.join(build_dir, p)
full_path = os.path.normpath(full_path)
paths[p] = [r[len(build_dir) + 1:] for r in glob.glob(full_path)]
if not paths[p]:
raise FDroidException("glob path '%s' did not match any files/dirs" % p)
return paths
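The core of that mapping, a glob expanded and then made relative to the build directory again, can be recreated on a throwaway directory; the file names here are purely illustrative:

```python
import glob
import os
import tempfile

# Throwaway directory standing in for a build_dir
build_dir = tempfile.mkdtemp()
for name in ('a.txt', 'b.txt'):
    open(os.path.join(build_dir, name), 'w').close()

pattern = '*.txt'
full_path = os.path.normpath(os.path.join(build_dir, pattern))
# Strip the build_dir prefix plus the path separator, as getpaths_map does
matched = [r[len(build_dir) + 1:] for r in glob.glob(full_path)]
print(sorted(matched))  # ['a.txt', 'b.txt']
```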
# Extend via globbing the paths from a field and return them as a set
def getpaths(build_dir, globpaths):
paths_map = getpaths_map(build_dir, globpaths)
paths = set()
for k, v in paths_map.iteritems():
for p in v:
paths.add(p)
return paths
def natural_key(s):
return [int(sp) if sp.isdigit() else sp for sp in re.split(r'(\d+)', s)]
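The same key as natural_key() above: digit runs become ints, so version-like names sort numerically rather than lexically (e.g. `app-2` before `app-10`):

```python
import re

# Split on digit runs; numeric chunks compare as integers
def natural_key(s):
    return [int(sp) if sp.isdigit() else sp for sp in re.split(r'(\d+)', s)]

names = ['app-10.apk', 'app-2.apk', 'app-1.apk']
print(sorted(names, key=natural_key))  # ['app-1.apk', 'app-2.apk', 'app-10.apk']
```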
class KnownApks:
def __init__(self):
self.path = os.path.join('stats', 'known_apks.txt')
self.apks = {}
if os.path.isfile(self.path):
for line in file(self.path):
t = line.rstrip().split(' ')
if len(t) == 2:
self.apks[t[0]] = (t[1], None)
else:
self.apks[t[0]] = (t[1], time.strptime(t[2], '%Y-%m-%d'))
self.changed = False
def writeifchanged(self):
if not self.changed:
return
if not os.path.exists('stats'):
os.mkdir('stats')
lst = []
for apk, app in self.apks.iteritems():
appid, added = app
line = apk + ' ' + appid
if added:
line += ' ' + time.strftime('%Y-%m-%d', added)
lst.append(line)
with open(self.path, 'w') as f:
for line in sorted(lst, key=natural_key):
f.write(line + '\n')
# Record an apk (if it's new, otherwise does nothing)
# Returns the date it was added.
def recordapk(self, apk, app):
if apk not in self.apks:
self.apks[apk] = (app, time.gmtime(time.time()))
self.changed = True
_, added = self.apks[apk]
return added
# Look up information - given the 'apkname', returns (app id, date added/None).
# Or returns None for an unknown apk.
def getapp(self, apkname):
if apkname in self.apks:
return self.apks[apkname]
return None
# Get the most recent 'num' apps added to the repo, as a list of package ids
# with the most recent first.
def getlatest(self, num):
apps = {}
for apk, app in self.apks.iteritems():
appid, added = app
if added:
if appid in apps:
if apps[appid] > added:
apps[appid] = added
else:
apps[appid] = added
sortedapps = sorted(apps.iteritems(), key=operator.itemgetter(1))[-num:]
lst = [app for app, _ in sortedapps]
lst.reverse()
return lst
def isApkDebuggable(apkfile, config):
"""Returns True if the given apk file is debuggable
:param apkfile: full path to the apk to check"""
p = SdkToolsPopen(['aapt', 'dump', 'xmltree', apkfile, 'AndroidManifest.xml'],
output=False)
if p.returncode != 0:
logging.critical("Failed to get apk manifest information")
sys.exit(1)
for line in p.output.splitlines():
if 'android:debuggable' in line and not line.endswith('0x0'):
return True
return False
class PopenResult:
returncode = None
output = ''
def SdkToolsPopen(commands, cwd=None, output=True):
cmd = commands[0]
if cmd not in config:
config[cmd] = find_sdk_tools_cmd(commands[0])
abscmd = config[cmd]
if abscmd is None:
logging.critical("Could not find '%s' on your system" % cmd)
sys.exit(1)
return FDroidPopen([abscmd] + commands[1:],
cwd=cwd, output=output)
def FDroidPopen(commands, cwd=None, output=True):
"""
Run a command and capture the possibly huge output.
:param commands: command and argument list like in subprocess.Popen
:param cwd: optionally specifies a working directory
:returns: A PopenResult.
"""
global env
if cwd:
cwd = os.path.normpath(cwd)
logging.debug("Directory: %s" % cwd)
logging.debug("> %s" % ' '.join(commands))
result = PopenResult()
p = None
try:
p = subprocess.Popen(commands, cwd=cwd, shell=False, env=env,
stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
except OSError as e:
raise BuildException("OSError while trying to execute " +
' '.join(commands) + ': ' + str(e))
stdout_queue = Queue()
stdout_reader = AsynchronousFileReader(p.stdout, stdout_queue)
# Check the queue for output (until there is no more to get)
while not stdout_reader.eof():
while not stdout_queue.empty():
line = stdout_queue.get()
if output and options.verbose:
# Output directly to console
sys.stderr.write(line)
sys.stderr.flush()
result.output += line
time.sleep(0.1)
result.returncode = p.wait()
return result
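FDroidPopen's asynchronous reader exists to drain the pipe while the child runs, avoiding deadlock on huge output. As a rough sketch only (not a replacement), the shape of the result can be reproduced with the modern stdlib:

```python
import subprocess

# Condensed sketch of FDroidPopen's result shape: stderr merged into
# stdout, whole output captured, returncode returned alongside it.
def run_captured(commands, cwd=None):
    p = subprocess.run(commands, cwd=cwd, stdout=subprocess.PIPE,
                       stderr=subprocess.STDOUT)
    return p.returncode, p.stdout.decode('utf-8', 'replace')

rc, out = run_captured(['echo', 'hello'])
print(rc, out.strip())  # 0 hello
```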
gradle_comment = re.compile(r'[ ]*//')
gradle_signing_configs = re.compile(r'^[\t ]*signingConfigs[ \t]*{[ \t]*$')
gradle_line_matches = [
re.compile(r'^[\t ]*signingConfig [^ ]*$'),
re.compile(r'.*android\.signingConfigs\.[^{]*$'),
re.compile(r'.*\.readLine\(.*'),
]
def remove_signing_keys(build_dir):
for root, dirs, files in os.walk(build_dir):
if 'build.gradle' in files:
path = os.path.join(root, 'build.gradle')
with open(path, "r") as o:
lines = o.readlines()
changed = False
opened = 0
i = 0
with open(path, "w") as o:
while i < len(lines):
line = lines[i]
i += 1
while line.endswith('\\\n'):
line = line.rstrip('\\\n') + lines[i]
i += 1
if gradle_comment.match(line):
o.write(line)
continue
if opened > 0:
opened += line.count('{')
opened -= line.count('}')
continue
if gradle_signing_configs.match(line):
changed = True
opened += 1
continue
if any(s.match(line) for s in gradle_line_matches):
changed = True
continue
if opened == 0:
o.write(line)
if changed:
logging.info("Cleaned build.gradle of keysigning configs at %s" % path)
for propfile in [
'project.properties',
'build.properties',
'default.properties',
'ant.properties', ]:
if propfile in files:
path = os.path.join(root, propfile)
with open(path, "r") as o:
lines = o.readlines()
changed = False
with open(path, "w") as o:
for line in lines:
if any(line.startswith(s) for s in ('key.store', 'key.alias')):
changed = True
continue
o.write(line)
if changed:
logging.info("Cleaned %s of keysigning configs at %s" % (propfile, path))
def reset_env_path():
global env, orig_path
env['PATH'] = orig_path
def add_to_env_path(path):
global env
paths = env['PATH'].split(os.pathsep)
if path in paths:
return
paths.append(path)
env['PATH'] = os.pathsep.join(paths)
def replace_config_vars(cmd, build):
global env
cmd = cmd.replace('$$SDK$$', config['sdk_path'])
# env['ANDROID_NDK'] is set in build_local right before prepare_source
cmd = cmd.replace('$$NDK$$', env['ANDROID_NDK'])
cmd = cmd.replace('$$MVN3$$', config['mvn3'])
if build is not None:
cmd = cmd.replace('$$COMMIT$$', build.commit)
cmd = cmd.replace('$$VERSION$$', build.version)
cmd = cmd.replace('$$VERCODE$$', build.vercode)
return cmd
def place_srclib(root_dir, number, libpath):
if not number:
return
relpath = os.path.relpath(libpath, root_dir)
proppath = os.path.join(root_dir, 'project.properties')
lines = []
if os.path.isfile(proppath):
with open(proppath, "r") as o:
lines = o.readlines()
with open(proppath, "w") as o:
placed = False
for line in lines:
if line.startswith('android.library.reference.%d=' % number):
o.write('android.library.reference.%d=%s\n' % (number, relpath))
placed = True
else:
o.write(line)
if not placed:
o.write('android.library.reference.%d=%s\n' % (number, relpath))
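The rewrite-or-append step above can be sketched on an in-memory properties list; the property values here are made up for illustration:

```python
# In-memory sketch of place_srclib()'s rewrite-or-append behavior
lines = ['target=android-19\n',
         'android.library.reference.1=../old\n']
number, relpath = 1, '../srclib/SomeLib'

out, placed = [], False
for line in lines:
    if line.startswith('android.library.reference.%d=' % number):
        # Existing reference for this slot: overwrite it in place
        out.append('android.library.reference.%d=%s\n' % (number, relpath))
        placed = True
    else:
        out.append(line)
if not placed:
    # No existing reference: append one at the end
    out.append('android.library.reference.%d=%s\n' % (number, relpath))

print(out[1].strip())  # android.library.reference.1=../srclib/SomeLib
```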
apk_sigfile = re.compile(r'META-INF/[0-9A-Za-z]+\.(SF|RSA)')
def verify_apks(signed_apk, unsigned_apk, tmp_dir):
"""Verify that two apks are the same
One of the inputs is signed, the other is unsigned. The signature metadata
is transferred from the signed to the unsigned apk, and then jarsigner is
used to verify that the signature from the signed apk is also valid for
the unsigned one.
:param signed_apk: Path to a signed apk file
:param unsigned_apk: Path to an unsigned apk file expected to match it
:param tmp_dir: Path to directory for temporary files
:returns: None if the verification is successful, otherwise a string
describing what went wrong.
"""
with ZipFile(signed_apk) as signed_apk_as_zip:
meta_inf_files = ['META-INF/MANIFEST.MF']
for f in signed_apk_as_zip.namelist():
if apk_sigfile.match(f):
meta_inf_files.append(f)
if len(meta_inf_files) < 3:
return "Signature files missing from {0}".format(signed_apk)
signed_apk_as_zip.extractall(tmp_dir, meta_inf_files)
with ZipFile(unsigned_apk, mode='a') as unsigned_apk_as_zip:
for meta_inf_file in meta_inf_files:
unsigned_apk_as_zip.write(os.path.join(tmp_dir, meta_inf_file), arcname=meta_inf_file)
if subprocess.call([config['jarsigner'], '-verify', unsigned_apk]) != 0:
logging.info("...NOT verified - {0}".format(signed_apk))
return compare_apks(signed_apk, unsigned_apk, tmp_dir)
logging.info("...successfully verified")
return None
apk_badchars = re.compile('''[/ :;'"]''')
def compare_apks(apk1, apk2, tmp_dir):
"""Compare two apks
Returns None if the apk content is the same (apart from the signing key),
otherwise a string describing what's different, or what went wrong when
trying to do the comparison.
"""
apk1dir = os.path.join(tmp_dir, apk_badchars.sub('_', apk1[0:-4])) # trim .apk
apk2dir = os.path.join(tmp_dir, apk_badchars.sub('_', apk2[0:-4])) # trim .apk
for d in [apk1dir, apk2dir]:
if os.path.exists(d):
shutil.rmtree(d)
os.mkdir(d)
os.mkdir(os.path.join(d, 'jar-xf'))
if subprocess.call(['jar', 'xf',
os.path.abspath(apk1)],
cwd=os.path.join(apk1dir, 'jar-xf')) != 0:
return("Failed to unpack " + apk1)
if subprocess.call(['jar', 'xf',
os.path.abspath(apk2)],
cwd=os.path.join(apk2dir, 'jar-xf')) != 0:
return("Failed to unpack " + apk2)
# try to find apktool in the path, if it hasn't been manually configured
if 'apktool' not in config:
tmp = find_command('apktool')
if tmp is not None:
config['apktool'] = tmp
if 'apktool' in config:
if subprocess.call([config['apktool'], 'd', os.path.abspath(apk1), '--output', 'apktool'],
cwd=apk1dir) != 0:
return("Failed to unpack " + apk1)
if subprocess.call([config['apktool'], 'd', os.path.abspath(apk2), '--output', 'apktool'],
cwd=apk2dir) != 0:
return("Failed to unpack " + apk2)
p = FDroidPopen(['diff', '-r', apk1dir, apk2dir], output=False)
lines = p.output.splitlines()
if len(lines) != 1 or 'META-INF' not in lines[0]:
meld = find_command('meld')
if meld is not None:
p = FDroidPopen(['meld', apk1dir, apk2dir], output=False)
return("Unexpected diff output - " + p.output)
# since everything verifies, delete the comparison to keep cruft down
shutil.rmtree(apk1dir)
shutil.rmtree(apk2dir)
# If we get here, it seems like they're the same!
return None
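# The comparison above diffs two unpacked trees while ignoring signing
# files. A minimal standalone sketch of that idea (hypothetical helper,
# not part of this module): list each tree's files by relative path,
# drop META-INF/, and report paths present in only one tree.

```python
import os

def tree_difference(dir1, dir2):
    """Return sorted relative paths present in only one of the two trees,
    ignoring the signature files under META-INF/."""
    def listing(d):
        found = set()
        for root, _dirs, files in os.walk(d):
            for f in files:
                rel = os.path.relpath(os.path.join(root, f), d)
                if not rel.startswith('META-INF'):
                    found.add(rel)
        return found
    # symmetric difference: files unique to either side
    return sorted(listing(dir1) ^ listing(dir2))
```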
def find_command(command):
'''find the full path of a command, or None if it can't be found in the PATH'''
def is_exe(fpath):
return os.path.isfile(fpath) and os.access(fpath, os.X_OK)
fpath, fname = os.path.split(command)
if fpath:
if is_exe(command):
return command
else:
for path in os.environ["PATH"].split(os.pathsep):
path = path.strip('"')
exe_file = os.path.join(path, command)
if is_exe(exe_file):
return exe_file
return None
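# find_command() above walks PATH and tests os.X_OK, much like Python 3's
# shutil.which(). A testable sketch of the same walk, with the search
# path passed explicitly (an illustrative helper, not this module's API):

```python
import os

def find_in_path(command, path_env):
    """Return the first executable named `command` found on `path_env`
    (an os.pathsep-separated list of directories), or None."""
    for path in path_env.split(os.pathsep):
        candidate = os.path.join(path.strip('"'), command)
        if os.path.isfile(candidate) and os.access(candidate, os.X_OK):
            return candidate
    return None
```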
def genpassword():
'''generate a random password for when generating keys'''
h = hashlib.sha256()
h.update(os.urandom(16)) # salt
h.update(bytes(socket.getfqdn()))
return h.digest().encode('base64').strip()
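# genpassword() above relies on Python 2's str.encode('base64'), which no
# longer exists in Python 3. A sketch of the same recipe in Python 3
# terms (an assumed port, not code from this release): hash 16 random
# bytes plus the host FQDN, then base64-encode the digest.

```python
import base64
import hashlib
import os
import socket

def genpassword_py3():
    """Python 3 equivalent of genpassword(): a random, host-seeded
    password string derived from a SHA-256 digest."""
    h = hashlib.sha256()
    h.update(os.urandom(16))  # salt
    h.update(socket.getfqdn().encode('utf-8'))
    return base64.b64encode(h.digest()).decode('ascii')
```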
def genkeystore(localconfig):
'''Generate a new key with random passwords and add it to new keystore'''
logging.info('Generating a new key in "' + localconfig['keystore'] + '"...')
keystoredir = os.path.dirname(localconfig['keystore'])
    if not keystoredir:  # os.path.dirname() gives '' when the keystore has no directory part
        keystoredir = os.getcwd()
if not os.path.exists(keystoredir):
os.makedirs(keystoredir, mode=0o700)
write_password_file("keystorepass", localconfig['keystorepass'])
write_password_file("keypass", localconfig['keypass'])
p = FDroidPopen([config['keytool'], '-genkey',
'-keystore', localconfig['keystore'],
'-alias', localconfig['repo_keyalias'],
'-keyalg', 'RSA', '-keysize', '4096',
'-sigalg', 'SHA256withRSA',
'-validity', '10000',
'-storepass:file', config['keystorepassfile'],
'-keypass:file', config['keypassfile'],
'-dname', localconfig['keydname']])
# TODO keypass should be sent via stdin
if p.returncode != 0:
raise BuildException("Failed to generate key", p.output)
os.chmod(localconfig['keystore'], 0o0600)
# now show the lovely key that was just generated
p = FDroidPopen([config['keytool'], '-list', '-v',
'-keystore', localconfig['keystore'],
'-alias', localconfig['repo_keyalias'],
'-storepass:file', config['keystorepassfile']])
logging.info(p.output.strip() + '\n\n')
def write_to_config(thisconfig, key, value=None):
'''write a key/value to the local config.py'''
if value is None:
origkey = key + '_orig'
value = thisconfig[origkey] if origkey in thisconfig else thisconfig[key]
with open('config.py', 'r') as f:
data = f.read()
    pattern = r'\n[\s#]*' + re.escape(key) + r'\s*=\s*"[^"]*"'
    repl = '\n' + key + ' = "' + value + '"'
    data = re.sub(pattern, repl, data)
    # if this key is not in the file, append it
    if not re.search(r'^[\s#]*' + re.escape(key) + r'\s*=\s*"', data, flags=re.MULTILINE):
        data += repl
    # make sure the file ends with a newline
    if not data.endswith('\n'):
        data += '\n'
with open('config.py', 'w') as f:
f.writelines(data)
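# The substitution above can be sketched as a pure function over the file
# contents (an editorial example, not part of this module): replace an
# existing, possibly commented-out, `key = "..."` line, or append one.

```python
import re

def set_config_value(data, key, value):
    """Return `data` with the `key = "..."` assignment set to `value`."""
    pattern = r'\n[ \t#]*' + re.escape(key) + r'[ \t]*=[ \t]*"[^"]*"'
    repl = '\n' + key + ' = "' + value + '"'
    new_data, n = re.subn(pattern, repl, data)
    if n == 0:
        new_data += repl  # key was absent: append it
    if not new_data.endswith('\n'):
        new_data += '\n'
    return new_data
```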
def parse_xml(path):
return XMLElementTree.parse(path).getroot()
def string_is_integer(string):
try:
int(string)
return True
except ValueError:
return False
def get_per_app_repos():
'''per-app repos are dirs named with the packageName of a single app'''
# Android packageNames are Java packages, they may contain uppercase or
# lowercase letters ('A' through 'Z'), numbers, and underscores
# ('_'). However, individual package name parts may only start with
# letters. https://developer.android.com/guide/topics/manifest/manifest-element.html#package
p = re.compile('^([a-zA-Z][a-zA-Z0-9_]*(\\.[a-zA-Z][a-zA-Z0-9_]*)*)?$')
repos = []
for root, dirs, files in os.walk(os.getcwd()):
for d in dirs:
print('checking', root, 'for', d)
if d in ('archive', 'metadata', 'repo', 'srclibs', 'tmp'):
# standard parts of an fdroid repo, so never packageNames
continue
elif p.match(d) \
and os.path.exists(os.path.join(d, 'fdroid', 'repo', 'index.jar')):
repos.append(d)
break
return repos
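# The packageName pattern used above, exercised on a few examples: parts
# are dot-separated and each part must start with a letter. (Standalone
# copy of the regex for illustration.)

```python
import re

package_re = re.compile(r'^([a-zA-Z][a-zA-Z0-9_]*(\.[a-zA-Z][a-zA-Z0-9_]*)*)?$')
```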
# fdroidserver-0.6.0/fdroidserver/rewritemeta.py

#!/usr/bin/env python2
# -*- coding: utf-8 -*-
#
# rewritemeta.py - part of the FDroid server tools
# This cleans up the original .txt metadata file format.
# Copyright (C) 2010-12, Ciaran Gultnieks, ciaran@ciarang.com
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
from argparse import ArgumentParser
import os
import logging
try:
    from cStringIO import StringIO
except ImportError:
    from StringIO import StringIO
import common
import metadata
config = None
options = None
def proper_format(app):
s = StringIO()
# TODO: currently reading entire file again, should reuse first
# read in metadata.py
with open(app.metadatapath, 'r') as f:
cur_content = f.read()
metadata.write_txt_metadata(s, app)
content = s.getvalue()
s.close()
return content == cur_content
def main():
global config, options
# Parse command line...
parser = ArgumentParser(usage="%(prog)s [options] [APPID [APPID ...]]")
common.setup_global_opts(parser)
parser.add_argument("-l", "--list", action="store_true", default=False,
help="List files that would be reformatted")
parser.add_argument("-t", "--to", default=None,
help="Rewrite to a specific format")
parser.add_argument("appid", nargs='*', help="app-id in the form APPID")
options = parser.parse_args()
config = common.read_config(options)
# Get all apps...
allapps = metadata.read_metadata(xref=True)
apps = common.read_app_args(options.appid, allapps, False)
if options.list and options.to is not None:
parser.error("Cannot use --list and --to at the same time")
supported = ['txt', 'yaml']
if options.to is not None and options.to not in supported:
parser.error("Must give a valid format to --to")
for appid, app in apps.iteritems():
base, ext = common.get_extension(app.metadatapath)
if not options.to and ext not in supported:
logging.info("Ignoring %s file at '%s'" % (ext, app.metadatapath))
continue
to_ext = ext
if options.to is not None:
to_ext = options.to
if options.list:
if not proper_format(app):
print app.metadatapath
continue
with open(base + '.' + to_ext, 'w') as f:
metadata.write_metadata(to_ext, f, app)
if ext != to_ext:
os.remove(app.metadatapath)
logging.debug("Finished.")
if __name__ == "__main__":
main()
# fdroidserver-0.6.0/fdroidserver/lint.py

#!/usr/bin/env python2
# -*- coding: utf-8 -*-
#
# lint.py - part of the FDroid server tool
# Copyright (C) 2013-2014 Daniel Martí
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
from argparse import ArgumentParser
import re
import sys
from sets import Set
import common
import metadata
import rewritemeta
config = None
options = None
def enforce_https(domain):
return (re.compile(r'.*[^sS]://[^/]*' + re.escape(domain) + r'(/.*)?'),
domain + " URLs should always use https://")
https_enforcings = [
enforce_https('github.com'),
enforce_https('gitlab.com'),
enforce_https('bitbucket.org'),
enforce_https('apache.org'),
enforce_https('google.com'),
enforce_https('svn.code.sf.net'),
]
def forbid_shortener(domain):
return (re.compile(r'https?://[^/]*' + re.escape(domain) + r'/.*'),
"URL shorteners should not be used")
http_url_shorteners = [
forbid_shortener('goo.gl'),
forbid_shortener('t.co'),
forbid_shortener('ur1.ca'),
]
http_checks = https_enforcings + http_url_shorteners + [
(re.compile(r'.*github\.com/[^/]+/[^/]+\.git'),
"Appending .git is not necessary"),
(re.compile(r'.*://[^/]*(github|gitlab|bitbucket|rawgit)[^/]*/([^/]+/){1,3}master'),
"Use /HEAD instead of /master to point at a file in the default branch"),
]
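# The enforce_https() check above matches any URL for a domain whose
# scheme does not end in s/S, i.e. anything other than https. A
# standalone copy of the same shape, to show it in action:

```python
import re

def enforce_https_sketch(domain):
    """Return (regex, message) flagging non-https URLs for `domain`."""
    return (re.compile(r'.*[^sS]://[^/]*' + re.escape(domain) + r'(/.*)?'),
            domain + " URLs should always use https://")
```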
regex_checks = {
'Web Site': http_checks,
'Source Code': http_checks,
'Repo': https_enforcings,
'Issue Tracker': http_checks + [
(re.compile(r'.*github\.com/[^/]+/[^/]+[/]*$'),
"/issues is missing"),
],
'Donate': http_checks + [
(re.compile(r'.*flattr\.com'),
"Flattr donation methods belong in the FlattrID flag"),
],
'Changelog': http_checks,
'Author Name': [
(re.compile(r'^\s'),
"Unnecessary leading space"),
(re.compile(r'.*\s$'),
"Unnecessary trailing space"),
],
'License': [
(re.compile(r'^(|None|Unknown)$'),
"No license specified"),
],
'Summary': [
(re.compile(r'^$'),
"Summary yet to be filled"),
(re.compile(r'.*\b(free software|open source)\b.*', re.IGNORECASE),
"No need to specify that the app is Free Software"),
(re.compile(r'.*((your|for).*android|android.*(app|device|client|port|version))', re.IGNORECASE),
"No need to specify that the app is for Android"),
(re.compile(r'.*[a-z0-9][.!?]( |$)'),
"Punctuation should be avoided"),
(re.compile(r'^\s'),
"Unnecessary leading space"),
(re.compile(r'.*\s$'),
"Unnecessary trailing space"),
],
'Description': [
(re.compile(r'^No description available$'),
"Description yet to be filled"),
(re.compile(r'\s*[*#][^ .]'),
"Invalid bulleted list"),
(re.compile(r'^\s'),
"Unnecessary leading space"),
(re.compile(r'.*\s$'),
"Unnecessary trailing space"),
(re.compile(r'.*([^[]|^)\[[^:[\]]+( |\]|$)'),
"Invalid link - use [http://foo.bar Link title] or [http://foo.bar]"),
(re.compile(r'(^|.* )https?://[^ ]+'),
"Unlinkified link - use [http://foo.bar Link title] or [http://foo.bar]"),
],
}
def check_regexes(app):
for f, checks in regex_checks.iteritems():
for m, r in checks:
v = app.get_field(f)
t = metadata.fieldtype(f)
if t == metadata.TYPE_MULTILINE:
for l in v.splitlines():
if m.match(l):
yield "%s at line '%s': %s" % (f, l, r)
else:
if v is None:
continue
if m.match(v):
yield "%s '%s': %s" % (f, v, r)
def get_lastbuild(builds):
lowest_vercode = -1
lastbuild = None
for build in builds:
if not build.disable:
vercode = int(build.vercode)
if lowest_vercode == -1 or vercode < lowest_vercode:
lowest_vercode = vercode
if not lastbuild or int(build.vercode) > int(lastbuild.vercode):
lastbuild = build
return lastbuild
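# get_lastbuild() above effectively picks the enabled build with the
# highest integer vercode. A compact equivalent over plain dicts (an
# illustrative helper, not this module's Build objects):

```python
def last_enabled_build(builds):
    """Return the non-disabled build dict with the highest vercode, or None."""
    enabled = [b for b in builds if not b.get('disable')]
    if not enabled:
        return None
    return max(enabled, key=lambda b: int(b['vercode']))
```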
def check_ucm_tags(app):
lastbuild = get_lastbuild(app.builds)
if (lastbuild is not None
and lastbuild.commit
and app.UpdateCheckMode == 'RepoManifest'
and not lastbuild.commit.startswith('unknown')
and lastbuild.vercode == app.CurrentVersionCode
and not lastbuild.forcevercode
and any(s in lastbuild.commit for s in '.,_-/')):
yield "Last used commit '%s' looks like a tag, but Update Check Mode is '%s'" % (
lastbuild.commit, app.UpdateCheckMode)
def check_char_limits(app):
limits = config['char_limits']
if len(app.Summary) > limits['Summary']:
yield "Summary of length %s is over the %i char limit" % (
len(app.Summary), limits['Summary'])
if len(app.Description) > limits['Description']:
yield "Description of length %s is over the %i char limit" % (
len(app.Description), limits['Description'])
def check_old_links(app):
usual_sites = [
'github.com',
'gitlab.com',
'bitbucket.org',
]
old_sites = [
'gitorious.org',
'code.google.com',
]
if any(s in app.Repo for s in usual_sites):
for f in ['Web Site', 'Source Code', 'Issue Tracker', 'Changelog']:
v = app.get_field(f)
if any(s in v for s in old_sites):
yield "App is in '%s' but has a link to '%s'" % (app.Repo, v)
def check_useless_fields(app):
if app.UpdateCheckName == app.id:
yield "Update Check Name is set to the known app id - it can be removed"
filling_ucms = re.compile(r'^(Tags.*|RepoManifest.*)')
def check_checkupdates_ran(app):
if filling_ucms.match(app.UpdateCheckMode):
if not app.AutoName and not app.CurrentVersion and app.CurrentVersionCode == '0':
yield "UCM is set but it looks like checkupdates hasn't been run yet"
def check_empty_fields(app):
if not app.Categories:
yield "Categories are not set"
all_categories = Set([
"Connectivity",
"Development",
"Games",
"Graphics",
"Internet",
"Money",
"Multimedia",
"Navigation",
"Phone & SMS",
"Reading",
"Science & Education",
"Security",
"Sports & Health",
"System",
"Theming",
"Time",
"Writing",
])
def check_categories(app):
for categ in app.Categories:
if categ not in all_categories:
yield "Category '%s' is not valid" % categ
def check_duplicates(app):
if app.Name and app.Name == app.AutoName:
yield "Name '%s' is just the auto name - remove it" % app.Name
links_seen = set()
for f in ['Source Code', 'Web Site', 'Issue Tracker', 'Changelog']:
v = app.get_field(f)
if not v:
continue
v = v.lower()
if v in links_seen:
yield "Duplicate link in '%s': %s" % (f, v)
else:
links_seen.add(v)
name = app.Name or app.AutoName
if app.Summary and name:
if app.Summary.lower() == name.lower():
yield "Summary '%s' is just the app's name" % app.Summary
if app.Summary and app.Description and len(app.Description) == 1:
if app.Summary.lower() == app.Description[0].lower():
yield "Description '%s' is just the app's summary" % app.Summary
seenlines = set()
for l in app.Description.splitlines():
if len(l) < 1:
continue
if l in seenlines:
yield "Description has a duplicate line"
seenlines.add(l)
desc_url = re.compile(r'(^|[^[])\[([^ ]+)( |\]|$)')
def check_mediawiki_links(app):
wholedesc = ' '.join(app.Description)
for um in desc_url.finditer(wholedesc):
        url = um.group(2)  # group(2) captures the URL; group(1) is the preceding character
for m, r in http_checks:
if m.match(url):
yield "URL '%s' in Description: %s" % (url, r)
def check_bulleted_lists(app):
validchars = ['*', '#']
lchar = ''
lcount = 0
for l in app.Description.splitlines():
if len(l) < 1:
lcount = 0
continue
        if len(l) > 1 and l[0] == lchar and l[1] == ' ':
lcount += 1
if lcount > 2 and lchar not in validchars:
yield "Description has a list (%s) but it isn't bulleted (*) nor numbered (#)" % lchar
break
else:
lchar = l[0]
lcount = 1
def check_builds(app):
for build in app.builds:
if build.disable:
continue
for s in ['master', 'origin', 'HEAD', 'default', 'trunk']:
if build.commit and build.commit.startswith(s):
yield "Branch '%s' used as commit in build '%s'" % (s, build.version)
for srclib in build.srclibs:
ref = srclib.split('@')[1].split('/')[0]
if ref.startswith(s):
yield "Branch '%s' used as commit in srclib '%s'" % (s, srclib)
if build.target and build.build_method() == 'gradle':
yield "target= has no gradle support"
def main():
global config, options
anywarns = False
# Parse command line...
parser = ArgumentParser(usage="%(prog)s [options] [APPID [APPID ...]]")
common.setup_global_opts(parser)
parser.add_argument("-f", "--format", action="store_true", default=False,
help="Also warn about formatting issues, like rewritemeta -l")
parser.add_argument("appid", nargs='*', help="app-id in the form APPID")
options = parser.parse_args()
config = common.read_config(options)
# Get all apps...
allapps = metadata.read_metadata(xref=True)
apps = common.read_app_args(options.appid, allapps, False)
for appid, app in apps.iteritems():
if app.Disabled:
continue
warns = []
for check_func in [
check_regexes,
check_ucm_tags,
check_char_limits,
check_old_links,
check_checkupdates_ran,
check_useless_fields,
check_empty_fields,
check_categories,
check_duplicates,
check_mediawiki_links,
check_bulleted_lists,
check_builds,
]:
warns += check_func(app)
if options.format:
if not rewritemeta.proper_format(app):
warns.append("Run rewritemeta to fix formatting")
if warns:
anywarns = True
for warn in warns:
print("%s: %s" % (appid, warn))
if anywarns:
sys.exit(1)
if __name__ == "__main__":
main()
# fdroidserver-0.6.0/fdroidserver/net.py

# -*- coding: utf-8 -*-
#
# net.py - part of the FDroid server tools
# Copyright (C) 2015 Hans-Christoph Steiner
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import os
import requests
def download_file(url, local_filename=None, dldir='tmp'):
filename = url.split('/')[-1]
if local_filename is None:
local_filename = os.path.join(dldir, filename)
# the stream=True parameter keeps memory usage low
r = requests.get(url, stream=True)
with open(local_filename, 'wb') as f:
for chunk in r.iter_content(chunk_size=1024):
if chunk: # filter out keep-alive new chunks
f.write(chunk)
f.flush()
return local_filename
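# The chunked loop above keeps memory bounded regardless of file size.
# The same pattern with any file-like source, sketched so it can run
# without a network (illustrative helper, not part of this module):

```python
def copy_in_chunks(src, dst, chunk_size=1024):
    """Copy src to dst chunk by chunk; return the number of bytes written."""
    written = 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:  # EOF
            break
        dst.write(chunk)
        written += len(chunk)
    return written
```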
# fdroidserver-0.6.0/fdroidserver/stats.py

#!/usr/bin/env python2
# -*- coding: utf-8 -*-
#
# stats.py - part of the FDroid server tools
# Copyright (C) 2010-13, Ciaran Gultnieks, ciaran@ciarang.com
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import sys
import os
import re
import time
import traceback
import glob
import json
from argparse import ArgumentParser
import paramiko
import socket
import logging
import common
import metadata
import subprocess
from collections import Counter
def carbon_send(key, value):
s = socket.socket()
s.connect((config['carbon_host'], config['carbon_port']))
msg = '%s %d %d\n' % (key, value, int(time.time()))
s.sendall(msg)
s.close()
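# carbon_send() above writes one line of the Carbon/Graphite plaintext
# protocol: "key value timestamp\n". The formatting step, factored out
# for illustration (hypothetical helper, not this module's API):

```python
import time

def carbon_message(key, value, timestamp=None):
    """Format a single Carbon plaintext-protocol line."""
    if timestamp is None:
        timestamp = int(time.time())
    return '%s %d %d\n' % (key, value, timestamp)
```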
options = None
config = None
def most_common_stable(counts):
pairs = []
for s in counts:
pairs.append((s, counts[s]))
return sorted(pairs, key=lambda t: (-t[1], t[0]))
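# most_common_stable() above sorts by descending count and breaks ties
# alphabetically, so the ordering is deterministic (unlike
# Counter.most_common(), where ties keep insertion order). Equivalent
# one-liner for illustration:

```python
def most_common_stable_sketch(counts):
    """Return (key, count) pairs sorted by count desc, then key asc."""
    return sorted(counts.items(), key=lambda t: (-t[1], t[0]))
```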
def main():
global options, config
# Parse command line...
parser = ArgumentParser()
common.setup_global_opts(parser)
parser.add_argument("-d", "--download", action="store_true", default=False,
help="Download logs we don't have")
parser.add_argument("--recalc", action="store_true", default=False,
help="Recalculate aggregate stats - use when changes "
"have been made that would invalidate old cached data.")
parser.add_argument("--nologs", action="store_true", default=False,
help="Don't do anything logs-related")
options = parser.parse_args()
config = common.read_config(options)
if not config['update_stats']:
logging.info("Stats are disabled - set \"update_stats = True\" in your config.py")
sys.exit(1)
# Get all metadata-defined apps...
allmetaapps = [app for app in metadata.read_metadata().itervalues()]
metaapps = [app for app in allmetaapps if not app.Disabled]
statsdir = 'stats'
logsdir = os.path.join(statsdir, 'logs')
datadir = os.path.join(statsdir, 'data')
if not os.path.exists(statsdir):
os.mkdir(statsdir)
if not os.path.exists(logsdir):
os.mkdir(logsdir)
if not os.path.exists(datadir):
os.mkdir(datadir)
if options.download:
# Get any access logs we don't have...
ssh = None
ftp = None
try:
logging.info('Retrieving logs')
ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.connect(config['stats_server'], username=config['stats_user'],
timeout=10, key_filename=config['webserver_keyfile'])
ftp = ssh.open_sftp()
ftp.get_channel().settimeout(60)
logging.info("...connected")
ftp.chdir('logs')
files = ftp.listdir()
for f in files:
if f.startswith('access-') and f.endswith('.log.gz'):
destpath = os.path.join(logsdir, f)
destsize = ftp.stat(f).st_size
if (not os.path.exists(destpath) or
os.path.getsize(destpath) != destsize):
logging.debug("...retrieving " + f)
ftp.get(f, destpath)
except Exception:
traceback.print_exc()
sys.exit(1)
finally:
# Disconnect
if ftp is not None:
ftp.close()
if ssh is not None:
ssh.close()
knownapks = common.KnownApks()
unknownapks = []
if not options.nologs:
# Process logs
logging.info('Processing logs...')
appscount = Counter()
appsvercount = Counter()
        logexpr = '(?P<ip>[.:0-9a-fA-F]+) - - \[(?P<time>
        self.html.write('</p>')
del self.para_lines[:]
def endul(self):
        self.html.write('</ul>')
self.laststate = self.state
self.state = self.stNONE
def endol(self):
        self.html.write('</ol>')
self.laststate = self.state
self.state = self.stNONE
def formatted(self, txt, html):
res = ''
if html:
txt = cgi.escape(txt)
while True:
index = txt.find("''")
if index == -1:
return res + txt
res += txt[:index]
txt = txt[index:]
if txt.startswith("'''"):
if html:
                    if self.bold:
                        res += '</b>'
                    else:
                        res += '<b>'
self.bold = not self.bold
txt = txt[3:]
else:
if html:
                    if self.ital:
                        res += '</i>'
                    else:
                        res += '<i>'
self.ital = not self.ital
txt = txt[2:]
def linkify(self, txt):
res_plain = ''
res_html = ''
while True:
index = txt.find("[")
if index == -1:
return (res_plain + self.formatted(txt, False), res_html + self.formatted(txt, True))
res_plain += self.formatted(txt[:index], False)
res_html += self.formatted(txt[:index], True)
txt = txt[index:]
if txt.startswith("[["):
index = txt.find("]]")
if index == -1:
raise MetaDataException("Unterminated ]]")
url = txt[2:index]
if self.linkResolver:
url, urltext = self.linkResolver(url)
else:
urltext = url
                res_html += '<a href="' + url + '">' + cgi.escape(urltext) + '</a>'
res_plain += urltext
txt = txt[index + 2:]
else:
index = txt.find("]")
if index == -1:
raise MetaDataException("Unterminated ]")
url = txt[1:index]
index2 = url.find(' ')
if index2 == -1:
urltxt = url
else:
urltxt = url[index2 + 1:]
url = url[:index2]
if url == urltxt:
raise MetaDataException("Url title is just the URL - use [url]")
                res_html += '<a href="' + url + '">' + cgi.escape(urltxt) + '</a>'
res_plain += urltxt
if urltxt != url:
res_plain += ' (' + url + ')'
txt = txt[index + 1:]
def addtext(self, txt):
p, h = self.linkify(txt)
self.html.write(h)
def parseline(self, line):
if not line:
self.endcur()
elif line.startswith('* '):
self.endcur([self.stUL])
if self.state != self.stUL:
                self.html.write('<ul>')
else:
self.para_lines.append(line)
self.endcur([self.stPARA])
if self.state == self.stNONE:
self.state = self.stPARA
if self.laststate != self.stNONE:
self.text.write('\n\n')
                self.html.write('<p>')
def end(self):
self.endcur()
self.text_txt = self.text.getvalue()
self.text_html = self.html.getvalue()
self.text.close()
self.html.close()
# Parse multiple lines of description as written in a metadata file, returning
# a single string in text format and wrapped to 80 columns.
def description_txt(s):
ps = DescriptionFormatter(None)
for line in s.splitlines():
ps.parseline(line)
ps.end()
return ps.text_txt
# Parse multiple lines of description as written in a metadata file, returning
# a single string in wiki format. Used for the Maintainer Notes field as well,
# because it's the same format.
def description_wiki(s):
return s
# Parse multiple lines of description as written in a metadata file, returning
# a single string in HTML format.
def description_html(s, linkres):
ps = DescriptionFormatter(linkres)
for line in s.splitlines():
ps.parseline(line)
ps.end()
return ps.text_html
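# The description syntax handled above uses MediaWiki-style links of the
# form [http://url Title]. A reduced, self-contained sketch of just that
# conversion (a hypothetical helper, not the class method itself):

```python
import re

def linkify_sketch(text):
    """Convert '[url Title]' spans into HTML anchors."""
    def repl(m):
        url, title = m.group(1), m.group(2)
        return '<a href="%s">%s</a>' % (url, title)
    return re.sub(r'\[(\S+) ([^\]]+)\]', repl, text)
```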
def parse_srclib(metadatapath):
thisinfo = {}
# Defaults for fields that come from metadata
thisinfo['Repo Type'] = ''
thisinfo['Repo'] = ''
thisinfo['Subdir'] = None
thisinfo['Prepare'] = None
if not os.path.exists(metadatapath):
return thisinfo
metafile = open(metadatapath, "r")
n = 0
for line in metafile:
n += 1
line = line.rstrip('\r\n')
if not line or line.startswith("#"):
continue
try:
f, v = line.split(':', 1)
except ValueError:
            raise MetaDataException("Invalid metadata in %s:%d" % (metadatapath, n))
if f == "Subdir":
thisinfo[f] = v.split(',')
else:
thisinfo[f] = v
metafile.close()
return thisinfo
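# The srclib file format parsed above is simple 'Field:value' lines, with
# blanks and '#' comments skipped and Subdir treated as a comma list. A
# standalone sketch of that parse (illustrative, not this module's API):

```python
def parse_field_lines(lines):
    """Parse 'Field:value' lines into a dict, splitting Subdir on commas."""
    info = {}
    for line in lines:
        line = line.rstrip('\r\n')
        if not line or line.startswith('#'):
            continue
        f, v = line.split(':', 1)
        if f == 'Subdir':
            v = v.split(',')
        info[f] = v
    return info
```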
def read_srclibs():
"""Read all srclib metadata.
The information read will be accessible as metadata.srclibs, which is a
dictionary, keyed on srclib name, with the values each being a dictionary
in the same format as that returned by the parse_srclib function.
A MetaDataException is raised if there are any problems with the srclib
metadata.
"""
global srclibs
# They were already loaded
if srclibs is not None:
return
srclibs = {}
srcdir = 'srclibs'
if not os.path.exists(srcdir):
os.makedirs(srcdir)
for metadatapath in sorted(glob.glob(os.path.join(srcdir, '*.txt'))):
srclibname = os.path.basename(metadatapath[:-4])
srclibs[srclibname] = parse_srclib(metadatapath)
# Read all metadata. Returns a list of 'app' objects (which are dictionaries as
# returned by the parse_txt_metadata function.
def read_metadata(xref=True):
# Always read the srclibs before the apps, since they can use a srlib as
# their source repository.
read_srclibs()
apps = {}
for basedir in ('metadata', 'tmp'):
if not os.path.exists(basedir):
os.makedirs(basedir)
# If there are multiple metadata files for a single appid, then the first
# file that is parsed wins over all the others, and the rest throw an
# exception. So the original .txt format is parsed first, at least until
# newer formats stabilize.
for metadatapath in sorted(glob.glob(os.path.join('metadata', '*.txt'))
+ glob.glob(os.path.join('metadata', '*.json'))
+ glob.glob(os.path.join('metadata', '*.xml'))
+ glob.glob(os.path.join('metadata', '*.yaml'))):
app = parse_metadata(metadatapath)
if app.id in apps:
raise MetaDataException("Found multiple metadata files for " + app.id)
check_metadata(app)
apps[app.id] = app
if xref:
# Parse all descriptions at load time, just to ensure cross-referencing
# errors are caught early rather than when they hit the build server.
def linkres(appid):
if appid in apps:
return ("fdroid.app:" + appid, "Dummy name - don't know yet")
raise MetaDataException("Cannot resolve app id " + appid)
for appid, app in apps.iteritems():
try:
description_html(app.Description, linkres)
except MetaDataException as e:
raise MetaDataException("Problem with description of " + appid +
" - " + str(e))
return apps
# Port legacy ';' separators
list_sep = re.compile(r'[,;]')
def split_list_values(s):
res = []
for v in re.split(list_sep, s):
if not v:
continue
v = v.strip()
if not v:
continue
res.append(v)
return res
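# split_list_values() above accepts both ',' and the legacy ';' as
# separators and drops empty entries. The same behaviour as a one-liner
# (illustrative equivalent):

```python
import re

def split_list_values_sketch(s):
    """Split on ',' or ';' and drop empty or whitespace-only entries."""
    return [v.strip() for v in re.split(r'[,;]', s) if v.strip()]
```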
def get_default_app_info(metadatapath=None):
if metadatapath is None:
appid = None
else:
appid, _ = common.get_extension(os.path.basename(metadatapath))
app = App()
app.metadatapath = metadatapath
if appid is not None:
app.id = appid
return app
def sorted_builds(builds):
return sorted(builds, key=lambda build: int(build.vercode))
esc_newlines = re.compile(r'\\( |\n)')
# This function uses __dict__ to be faster
def post_metadata_parse(app):
for k, v in app.__dict__.iteritems():
if k not in app._modified:
continue
if type(v) in (float, int):
app.__dict__[k] = str(v)
for build in app.builds:
for k, v in build.__dict__.iteritems():
if k not in build._modified:
continue
if type(v) in (float, int):
build.__dict__[k] = str(v)
continue
ftype = flagtype(k)
if ftype == TYPE_SCRIPT:
build.__dict__[k] = re.sub(esc_newlines, '', v).lstrip().rstrip()
elif ftype == TYPE_BOOL:
# TODO handle this using ')
for child in root:
if child.tag != 'builds':
# builds does not have name="" attrib
name = child.attrib['name']
if child.tag == 'string':
app.set_field(name, child.text)
elif child.tag == 'string-array':
for item in child:
app.append_field(name, item.text)
elif child.tag == 'builds':
for b in child:
build = Build()
for key in b:
build.set_flag(key.tag, key.text)
app.builds.append(build)
            if len(cmds) > 0:
cmds[-1] = cmds[-1][:-len('&& \\')]
w_field(f, cmds, prefix, 'multiline')
return
else:
v = ' ' + escape(v) + '\n'
mf.write(prefix)
mf.write(f)
mf.write(":")
mf.write(v)
global first_build
first_build = True
def w_build(build):
global first_build
if first_build:
mf.write("builds:\n")
first_build = False
w_field('versionName', build.version, ' - ', TYPE_STRING)
w_field('versionCode', build.vercode, ' ', TYPE_STRING)
for f in build_flags_order:
v = build.get_flag(f)
if not v:
continue
w_field(f, v, ' ', flagtype(f))
write_plaintext_metadata(mf, app, w_comment, w_field, w_build)
def write_metadata(fmt, mf, app):
if fmt == 'txt':
return write_txt_metadata(mf, app)
if fmt == 'yaml':
return write_yaml_metadata(mf, app)
raise MetaDataException("Unknown metadata format given")
# fdroidserver-0.6.0/fdroidserver/publish.py

#!/usr/bin/env python2
# -*- coding: utf-8 -*-
#
# publish.py - part of the FDroid server tools
# Copyright (C) 2010-13, Ciaran Gultnieks, ciaran@ciarang.com
# Copyright (C) 2013-2014 Daniel Martí
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import sys
import os
import shutil
import md5
import glob
from argparse import ArgumentParser
import logging
import common
import metadata
from common import FDroidPopen, SdkToolsPopen, BuildException
config = None
options = None
def main():
global config, options
# Parse command line...
parser = ArgumentParser(usage="%(prog)s [options] "
"[APPID[:VERCODE] [APPID[:VERCODE] ...]]")
common.setup_global_opts(parser)
parser.add_argument("appid", nargs='*', help="app-id with optional versioncode in the form APPID[:VERCODE]")
options = parser.parse_args()
config = common.read_config(options)
if not ('jarsigner' in config and 'keytool' in config):
logging.critical('Java JDK not found! Install in standard location or set java_paths!')
sys.exit(1)
log_dir = 'logs'
if not os.path.isdir(log_dir):
logging.info("Creating log directory")
os.makedirs(log_dir)
tmp_dir = 'tmp'
if not os.path.isdir(tmp_dir):
logging.info("Creating temporary directory")
os.makedirs(tmp_dir)
output_dir = 'repo'
if not os.path.isdir(output_dir):
logging.info("Creating output directory")
os.makedirs(output_dir)
unsigned_dir = 'unsigned'
if not os.path.isdir(unsigned_dir):
logging.warning("No unsigned directory - nothing to do")
sys.exit(1)
for f in [config['keystorepassfile'],
config['keystore'],
config['keypassfile']]:
if not os.path.exists(f):
logging.error("Config error - missing '{0}'".format(f))
sys.exit(1)
# It was suggested at
# https://dev.guardianproject.info/projects/bazaar/wiki/FDroid_Audit
# that a package could be crafted, such that it would use the same signing
# key as an existing app. While it may be theoretically possible for such a
# colliding package ID to be generated, it seems virtually impossible that
# the colliding ID would be something that would be a) a valid package ID,
# and b) a sane-looking ID that would make its way into the repo.
# Nonetheless, to be sure, before publishing we check that there are no
# collisions, and refuse to do any publishing if that's the case...
allapps = metadata.read_metadata()
vercodes = common.read_pkg_args(options.appid, True)
allaliases = []
for appid in allapps:
m = md5.new()
m.update(appid)
keyalias = m.hexdigest()[:8]
if keyalias in allaliases:
logging.error("There is a keyalias collision - publishing halted")
sys.exit(1)
allaliases.append(keyalias)
logging.info("{0} apps, {1} key aliases".format(len(allapps),
len(allaliases)))
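The alias derivation used in the collision check above is small enough to sketch on its own. This standalone version uses hashlib instead of the legacy md5 module used here; both produce the same digest.

```python
import hashlib

def key_alias(appid):
    # First 8 hex chars of the MD5 of the app ID, as used for
    # keystore aliases above (legacy md5 module and hashlib agree).
    return hashlib.md5(appid.encode('utf-8')).hexdigest()[:8]
```

Only the first 8 characters are significant to the keystore, hence the truncation and the need for the collision check.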
# Process any apks that are waiting to be signed...
for apkfile in sorted(glob.glob(os.path.join(unsigned_dir, '*.apk'))):
appid, vercode = common.apknameinfo(apkfile)
apkfilename = os.path.basename(apkfile)
if vercodes and appid not in vercodes:
continue
if appid in vercodes and vercodes[appid]:
if vercode not in vercodes[appid]:
continue
logging.info("Processing " + apkfile)
# There ought to be valid metadata for this app, otherwise why are we
# trying to publish it?
if appid not in allapps:
logging.error("Unexpected {0} found in unsigned directory"
.format(apkfilename))
sys.exit(1)
app = allapps[appid]
if app.Binaries is not None:
# It's an app where we build from source, and verify the apk
# contents against a developer's binary, and then publish their
# version if everything checks out.
# The binary should already have been retrieved during the build
# process.
srcapk = apkfile + ".binary"
# Compare our unsigned one with the downloaded one...
compare_result = common.verify_apks(srcapk, apkfile, tmp_dir)
if compare_result:
logging.error("...verification failed - publish skipped : "
+ compare_result)
continue
# Success! So move the downloaded file to the repo, and remove
# our built version.
shutil.move(srcapk, os.path.join(output_dir, apkfilename))
os.remove(apkfile)
else:
# It's a 'normal' app, i.e. we sign and publish it...
# Figure out the key alias name we'll use. Only the first 8
# characters are significant, so we'll use the first 8 from
# the MD5 of the app's ID and hope there are no collisions.
# If a collision does occur later, we're going to have to
# come up with a new algorithm, AND rename all existing keys
# in the keystore!
if appid in config['keyaliases']:
# For this particular app, the key alias is overridden...
keyalias = config['keyaliases'][appid]
if keyalias.startswith('@'):
m = md5.new()
m.update(keyalias[1:])
keyalias = m.hexdigest()[:8]
else:
m = md5.new()
m.update(appid)
keyalias = m.hexdigest()[:8]
logging.info("Key alias: " + keyalias)
# See if we already have a key for this application, and
# if not generate one...
p = FDroidPopen([config['keytool'], '-list',
'-alias', keyalias, '-keystore', config['keystore'],
'-storepass:file', config['keystorepassfile']])
if p.returncode != 0:
logging.info("Key does not exist - generating...")
p = FDroidPopen([config['keytool'], '-genkey',
'-keystore', config['keystore'],
'-alias', keyalias,
'-keyalg', 'RSA', '-keysize', '2048',
'-validity', '10000',
'-storepass:file', config['keystorepassfile'],
'-keypass:file', config['keypassfile'],
'-dname', config['keydname']])
# TODO keypass should be sent via stdin
if p.returncode != 0:
raise BuildException("Failed to generate key")
# Sign the application...
p = FDroidPopen([config['jarsigner'], '-keystore', config['keystore'],
'-storepass:file', config['keystorepassfile'],
'-keypass:file', config['keypassfile'], '-sigalg',
'SHA1withRSA', '-digestalg', 'SHA1',
apkfile, keyalias])
# TODO keypass should be sent via stdin
if p.returncode != 0:
raise BuildException("Failed to sign application")
# Zipalign it...
p = SdkToolsPopen(['zipalign', '-v', '4', apkfile,
os.path.join(output_dir, apkfilename)])
if p.returncode != 0:
raise BuildException("Failed to align application")
os.remove(apkfile)
# Move the source tarball into the output directory...
tarfilename = apkfilename[:-4] + '_src.tar.gz'
tarfile = os.path.join(unsigned_dir, tarfilename)
if os.path.exists(tarfile):
shutil.move(tarfile, os.path.join(output_dir, tarfilename))
logging.info('Published ' + apkfilename)
if __name__ == "__main__":
main()
fdroidserver-0.6.0/fdroidserver/build.py 0000664 0000765 0000765 00000132623 12660445747 020320 0 ustar hans hans 0000000 0000000
#!/usr/bin/env python2
# -*- coding: utf-8 -*-
#
# build.py - part of the FDroid server tools
# Copyright (C) 2010-2014, Ciaran Gultnieks, ciaran@ciarang.com
# Copyright (C) 2013-2014 Daniel Martí
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import sys
import os
import shutil
import glob
import subprocess
import re
import tarfile
import traceback
import time
import json
from ConfigParser import ConfigParser
from argparse import ArgumentParser
import logging
import common
import net
import metadata
import scanner
from common import FDroidException, BuildException, VCSException, FDroidPopen, SdkToolsPopen
try:
import paramiko
except ImportError:
pass
def get_builder_vm_id():
vd = os.path.join('builder', '.vagrant')
if os.path.isdir(vd):
# Vagrant 1.2 (and maybe 1.1?) - it's a directory tree...
with open(os.path.join(vd, 'machines', 'default',
'virtualbox', 'id')) as vf:
id = vf.read()
return id
else:
# Vagrant 1.0 - it's a json file...
with open(os.path.join('builder', '.vagrant')) as vf:
v = json.load(vf)
return v['active']['default']
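The two on-disk layouts handled by get_builder_vm_id() can be exercised in isolation; this self-contained sketch takes the `.vagrant` path as a parameter (paths and IDs in the test are illustrative).

```python
import json
import os

def builder_vm_id(vd):
    # Vagrant >= 1.2: .vagrant is a directory tree with the machine id
    # stored at machines/default/virtualbox/id.
    if os.path.isdir(vd):
        with open(os.path.join(vd, 'machines', 'default',
                               'virtualbox', 'id')) as vf:
            return vf.read()
    # Vagrant 1.0: .vagrant is a single JSON file.
    with open(vd) as vf:
        return json.load(vf)['active']['default']
```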
def got_valid_builder_vm():
"""Returns True if we have a valid-looking builder vm
"""
if not os.path.exists(os.path.join('builder', 'Vagrantfile')):
return False
vd = os.path.join('builder', '.vagrant')
if not os.path.exists(vd):
return False
if not os.path.isdir(vd):
# Vagrant 1.0 - if the directory is there, it's valid...
return True
# Vagrant 1.2 - the directory can exist, but the id can be missing...
if not os.path.exists(os.path.join(vd, 'machines', 'default',
'virtualbox', 'id')):
return False
return True
def vagrant(params, cwd=None, printout=False):
"""Run a vagrant command.
:param params: list of parameters to pass to vagrant
:param cwd: directory to run in, or None for current directory
:returns: (ret, out) where ret is the return code, and out
is the stdout (and stderr) from vagrant
"""
p = FDroidPopen(['vagrant'] + params, cwd=cwd)
return (p.returncode, p.output)
def get_vagrant_sshinfo():
"""Get ssh connection info for a vagrant VM
:returns: A dictionary containing 'hostname', 'port', 'user'
and 'idfile'
"""
if subprocess.call('vagrant ssh-config >sshconfig',
cwd='builder', shell=True) != 0:
raise BuildException("Error getting ssh config")
vagranthost = 'default' # Host in ssh config file
sshconfig = paramiko.SSHConfig()
sshf = open(os.path.join('builder', 'sshconfig'), 'r')
sshconfig.parse(sshf)
sshf.close()
sshconfig = sshconfig.lookup(vagranthost)
idfile = sshconfig['identityfile']
if isinstance(idfile, list):
idfile = idfile[0]
elif idfile.startswith('"') and idfile.endswith('"'):
idfile = idfile[1:-1]
return {'hostname': sshconfig['hostname'],
'port': int(sshconfig['port']),
'user': sshconfig['user'],
'idfile': idfile}
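get_vagrant_sshinfo() leans on paramiko's SSHConfig to parse the `vagrant ssh-config` output. Purely as an illustration of what gets extracted (this simplified parser is not what the code above runs), the four fields can be pulled with plain string handling:

```python
def parse_sshconfig(text):
    # Extract the fields used above from `vagrant ssh-config` output:
    # HostName, Port, User and IdentityFile (surrounding quotes stripped).
    keys = {'hostname': 'hostname', 'port': 'port',
            'user': 'user', 'identityfile': 'idfile'}
    info = {}
    for line in text.splitlines():
        parts = line.strip().split(None, 1)
        if len(parts) != 2 or parts[0].lower() not in keys:
            continue
        val = parts[1].strip('"')
        key = keys[parts[0].lower()]
        info[key] = int(val) if key == 'port' else val
    return info
```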
def get_clean_vm(reset=False):
"""Get a clean VM ready to do a buildserver build.
This might involve creating and starting a new virtual machine from
scratch, or it might be as simple (unless overridden by the reset
parameter) as re-using a snapshot created previously.
A BuildException will be raised if anything goes wrong.
:reset: True to force creating from scratch.
:returns: A dictionary containing 'hostname', 'port', 'user'
and 'idfile'
"""
# Reset existing builder machine to a clean state if possible.
vm_ok = False
if not reset:
logging.info("Checking for valid existing build server")
if got_valid_builder_vm():
logging.info("...VM is present")
p = FDroidPopen(['VBoxManage', 'snapshot',
get_builder_vm_id(), 'list',
'--details'], cwd='builder')
if 'fdroidclean' in p.output:
logging.info("...snapshot exists - resetting build server to "
"clean state")
retcode, output = vagrant(['status'], cwd='builder')
if 'running' in output:
logging.info("...suspending")
vagrant(['suspend'], cwd='builder')
logging.info("...waiting a sec...")
time.sleep(10)
p = FDroidPopen(['VBoxManage', 'snapshot', get_builder_vm_id(),
'restore', 'fdroidclean'],
cwd='builder')
if p.returncode == 0:
logging.info("...reset to snapshot - server is valid")
retcode, output = vagrant(['up'], cwd='builder')
if retcode != 0:
raise BuildException("Failed to start build server")
logging.info("...waiting a sec...")
time.sleep(10)
sshinfo = get_vagrant_sshinfo()
vm_ok = True
else:
logging.info("...failed to reset to snapshot")
else:
logging.info("...snapshot doesn't exist - "
"VBoxManage snapshot list:\n" + p.output)
# If we can't use the existing machine for any reason, make a
# new one from scratch.
if not vm_ok:
if os.path.exists('builder'):
logging.info("Removing broken/incomplete/unwanted build server")
vagrant(['destroy', '-f'], cwd='builder')
shutil.rmtree('builder')
os.mkdir('builder')
p = subprocess.Popen(['vagrant', '--version'],
stdout=subprocess.PIPE)
vver = p.communicate()[0].strip().split(' ')[1]
if vver.split('.')[0] != '1' or int(vver.split('.')[1]) < 4:
raise BuildException("Unsupported vagrant version {0}".format(vver))
with open(os.path.join('builder', 'Vagrantfile'), 'w') as vf:
vf.write('Vagrant.configure("2") do |config|\n')
vf.write('config.vm.box = "buildserver"\n')
vf.write('config.vm.synced_folder ".", "/vagrant", disabled: true\n')
vf.write('end\n')
logging.info("Starting new build server")
retcode, _ = vagrant(['up'], cwd='builder')
if retcode != 0:
raise BuildException("Failed to start build server")
# Open SSH connection to make sure it's working and ready...
logging.info("Connecting to virtual machine...")
sshinfo = get_vagrant_sshinfo()
sshs = paramiko.SSHClient()
sshs.set_missing_host_key_policy(paramiko.AutoAddPolicy())
sshs.connect(sshinfo['hostname'], username=sshinfo['user'],
port=sshinfo['port'], timeout=300,
look_for_keys=False,
key_filename=sshinfo['idfile'])
sshs.close()
logging.info("Saving clean state of new build server")
retcode, _ = vagrant(['suspend'], cwd='builder')
if retcode != 0:
raise BuildException("Failed to suspend build server")
logging.info("...waiting a sec...")
time.sleep(10)
p = FDroidPopen(['VBoxManage', 'snapshot', get_builder_vm_id(),
'take', 'fdroidclean'],
cwd='builder')
if p.returncode != 0:
raise BuildException("Failed to take snapshot")
logging.info("...waiting a sec...")
time.sleep(10)
logging.info("Restarting new build server")
retcode, _ = vagrant(['up'], cwd='builder')
if retcode != 0:
raise BuildException("Failed to start build server")
logging.info("...waiting a sec...")
time.sleep(10)
# Make sure it worked...
p = FDroidPopen(['VBoxManage', 'snapshot', get_builder_vm_id(),
'list', '--details'],
cwd='builder')
if 'fdroidclean' not in p.output:
raise BuildException("Failed to take snapshot.")
return sshinfo
def release_vm():
"""Release the VM previously started with get_clean_vm().
This should always be called.
"""
logging.info("Suspending build server")
subprocess.call(['vagrant', 'suspend'], cwd='builder')
# Note that 'force' here also implies test mode.
def build_server(app, build, vcs, build_dir, output_dir, force):
"""Do a build on the build server."""
try:
paramiko
except NameError:
raise BuildException("Paramiko is required to use the buildserver")
if options.verbose:
logging.getLogger("paramiko").setLevel(logging.INFO)
else:
logging.getLogger("paramiko").setLevel(logging.WARN)
sshinfo = get_clean_vm()
try:
# Open SSH connection...
logging.info("Connecting to virtual machine...")
sshs = paramiko.SSHClient()
sshs.set_missing_host_key_policy(paramiko.AutoAddPolicy())
sshs.connect(sshinfo['hostname'], username=sshinfo['user'],
port=sshinfo['port'], timeout=300,
look_for_keys=False, key_filename=sshinfo['idfile'])
homedir = '/home/' + sshinfo['user']
# Get an SFTP connection...
ftp = sshs.open_sftp()
ftp.get_channel().settimeout(15)
# Put all the necessary files in place...
ftp.chdir(homedir)
# Helper to copy the contents of a directory to the server...
def send_dir(path):
root = os.path.dirname(path)
main = os.path.basename(path)
ftp.mkdir(main)
for r, d, f in os.walk(path):
rr = os.path.relpath(r, root)
ftp.chdir(rr)
for dd in d:
ftp.mkdir(dd)
for ff in f:
lfile = os.path.join(root, rr, ff)
if not os.path.islink(lfile):
ftp.put(lfile, ff)
ftp.chmod(ff, os.stat(lfile).st_mode)
for i in range(len(rr.split('/'))):
ftp.chdir('..')
ftp.chdir('..')
logging.info("Preparing server for build...")
serverpath = os.path.abspath(os.path.dirname(__file__))
ftp.mkdir('fdroidserver')
ftp.chdir('fdroidserver')
ftp.put(os.path.join(serverpath, '..', 'fdroid'), 'fdroid')
ftp.chmod('fdroid', 0o755)
send_dir(os.path.join(serverpath))
ftp.chdir(homedir)
ftp.put(os.path.join(serverpath, '..', 'buildserver',
'config.buildserver.py'), 'config.py')
ftp.chmod('config.py', 0o600)
# Copy over the ID (head commit hash) of the fdroidserver in use...
subprocess.call('git rev-parse HEAD >' +
os.path.join(os.getcwd(), 'tmp', 'fdroidserverid'),
shell=True, cwd=serverpath)
ftp.put('tmp/fdroidserverid', 'fdroidserverid')
# Copy the metadata - just the file for this app...
ftp.mkdir('metadata')
ftp.mkdir('srclibs')
ftp.chdir('metadata')
ftp.put(os.path.join('metadata', app.id + '.txt'),
app.id + '.txt')
# And patches if there are any...
if os.path.exists(os.path.join('metadata', app.id)):
send_dir(os.path.join('metadata', app.id))
ftp.chdir(homedir)
# Create the build directory...
ftp.mkdir('build')
ftp.chdir('build')
ftp.mkdir('extlib')
ftp.mkdir('srclib')
# Copy any extlibs that are required...
if build.extlibs:
ftp.chdir(homedir + '/build/extlib')
for lib in build.extlibs:
lib = lib.strip()
libsrc = os.path.join('build/extlib', lib)
if not os.path.exists(libsrc):
raise BuildException("Missing extlib {0}".format(libsrc))
lp = lib.split('/')
for d in lp[:-1]:
if d not in ftp.listdir():
ftp.mkdir(d)
ftp.chdir(d)
ftp.put(libsrc, lp[-1])
for _ in lp[:-1]:
ftp.chdir('..')
# Copy any srclibs that are required...
srclibpaths = []
if build.srclibs:
for lib in build.srclibs:
srclibpaths.append(
common.getsrclib(lib, 'build/srclib', basepath=True, prepare=False))
# If one was used for the main source, add that too.
basesrclib = vcs.getsrclib()
if basesrclib:
srclibpaths.append(basesrclib)
for name, number, lib in srclibpaths:
logging.info("Sending srclib '%s'" % lib)
ftp.chdir(homedir + '/build/srclib')
if not os.path.exists(lib):
raise BuildException("Missing srclib directory '" + lib + "'")
fv = '.fdroidvcs-' + name
ftp.put(os.path.join('build/srclib', fv), fv)
send_dir(lib)
# Copy the metadata file too...
ftp.chdir(homedir + '/srclibs')
ftp.put(os.path.join('srclibs', name + '.txt'),
name + '.txt')
# Copy the main app source code
# (no need if it's a srclib)
if (not basesrclib) and os.path.exists(build_dir):
ftp.chdir(homedir + '/build')
fv = '.fdroidvcs-' + app.id
ftp.put(os.path.join('build', fv), fv)
send_dir(build_dir)
# Execute the build script...
logging.info("Starting build...")
chan = sshs.get_transport().open_session()
chan.get_pty()
cmdline = os.path.join(homedir, 'fdroidserver', 'fdroid')
cmdline += ' build --on-server'
if force:
cmdline += ' --force --test'
if options.verbose:
cmdline += ' --verbose'
cmdline += " %s:%s" % (app.id, build.vercode)
chan.exec_command('bash -c ". ~/.bsenv && ' + cmdline + '"')
output = ''
while not chan.exit_status_ready():
while chan.recv_ready():
output += chan.recv(1024)
time.sleep(0.1)
logging.info("...getting exit status")
returncode = chan.recv_exit_status()
while True:
get = chan.recv(1024)
if len(get) == 0:
break
output += get
if returncode != 0:
raise BuildException(
"Build.py failed on server for {0}:{1}".format(
app.id, build.version), output)
# Retrieve the built files...
logging.info("Retrieving build output...")
if force:
ftp.chdir(homedir + '/tmp')
else:
ftp.chdir(homedir + '/unsigned')
apkfile = common.getapkname(app, build)
tarball = common.getsrcname(app, build)
try:
ftp.get(apkfile, os.path.join(output_dir, apkfile))
if not options.notarball:
ftp.get(tarball, os.path.join(output_dir, tarball))
except:
raise BuildException(
"Build failed for {0}:{1} - missing output files".format(
app.id, build.version), output)
ftp.close()
finally:
# Suspend the build server.
release_vm()
def adapt_gradle(build_dir):
filename = 'build.gradle'
for root, dirs, files in os.walk(build_dir):
for filename in files:
if not filename.endswith('.gradle'):
continue
path = os.path.join(root, filename)
if not os.path.isfile(path):
continue
logging.debug("Adapting %s at %s" % (filename, path))
common.regsub_file(r"""(\s*)buildToolsVersion([\s=]+).*""",
r"""\1buildToolsVersion\2'%s'""" % config['build_tools'],
path)
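The substitution adapt_gradle() applies to each gradle file can be seen in isolation; the version string here is just an example value for `config['build_tools']`.

```python
import re

line = '    buildToolsVersion "21.1.2"'
fixed = re.sub(r"""(\s*)buildToolsVersion([\s=]+).*""",
               r"""\1buildToolsVersion\2'23.0.3'""",
               line)
# The leading whitespace and the separator are preserved via the two
# capture groups; only the version string is replaced (and normalized
# to single quotes).
```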
def capitalize_intact(string):
"""Like str.capitalize(), but leave the rest of the string intact without
switching it to lowercase."""
if len(string) == 0:
return string
if len(string) == 1:
return string.upper()
return string[0].upper() + string[1:]
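The difference from str.capitalize() matters for gradle flavour names, which are case-sensitive. A quick contrast, using an equivalent reimplementation of the helper above:

```python
def capitalize_intact(string):
    # Same behaviour as the helper above: upper-case the first
    # character, leave the rest of the string untouched.
    if len(string) == 0:
        return string
    return string[0].upper() + string[1:]

# str.capitalize() would also lower-case the tail:
#   'proDebug'.capitalize()      -> 'Prodebug'
#   capitalize_intact('proDebug') -> 'ProDebug'
```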
def build_local(app, build, vcs, build_dir, output_dir, srclib_dir, extlib_dir, tmp_dir, force, onserver, refresh):
"""Do a build locally."""
ndk_path = build.ndk_path()
if build.buildjni and build.buildjni != ['no']:
if not ndk_path:
logging.critical("Android NDK version '%s' could not be found!" % (build.ndk or 'r10e'))
logging.critical("Configured versions:")
for k, v in config['ndk_paths'].iteritems():
if k.endswith("_orig"):
continue
logging.critical(" %s: %s" % (k, v))
sys.exit(3)
elif not os.path.isdir(ndk_path):
logging.critical("Android NDK '%s' is not a directory!" % ndk_path)
sys.exit(3)
# Set up environment vars that depend on each build
for n in ['ANDROID_NDK', 'NDK', 'ANDROID_NDK_HOME']:
common.env[n] = ndk_path
common.reset_env_path()
# Set up the current NDK to the PATH
common.add_to_env_path(ndk_path)
# Prepare the source code...
root_dir, srclibpaths = common.prepare_source(vcs, app, build,
build_dir, srclib_dir,
extlib_dir, onserver, refresh)
# We need to clean via the build tool in case the binary dirs are
# different from the default ones
p = None
gradletasks = []
bmethod = build.build_method()
if bmethod == 'maven':
logging.info("Cleaning Maven project...")
cmd = [config['mvn3'], 'clean', '-Dandroid.sdk.path=' + config['sdk_path']]
if '@' in build.maven:
maven_dir = os.path.join(root_dir, build.maven.split('@', 1)[1])
maven_dir = os.path.normpath(maven_dir)
else:
maven_dir = root_dir
p = FDroidPopen(cmd, cwd=maven_dir)
elif bmethod == 'gradle':
logging.info("Cleaning Gradle project...")
if build.preassemble:
gradletasks += build.preassemble
flavours = build.gradle
if flavours == ['yes']:
flavours = []
flavours_cmd = ''.join([capitalize_intact(f) for f in flavours])
gradletasks += ['assemble' + flavours_cmd + 'Release']
adapt_gradle(build_dir)
for name, number, libpath in srclibpaths:
adapt_gradle(libpath)
cmd = [config['gradle']]
if build.gradleprops:
cmd += ['-P' + kv for kv in build.gradleprops]
cmd += ['clean']
p = FDroidPopen(cmd, cwd=root_dir)
elif bmethod == 'kivy':
pass
elif bmethod == 'ant':
logging.info("Cleaning Ant project...")
p = FDroidPopen(['ant', 'clean'], cwd=root_dir)
if p is not None and p.returncode != 0:
raise BuildException("Error cleaning %s:%s" %
(app.id, build.version), p.output)
for root, dirs, files in os.walk(build_dir):
def del_dirs(dl):
for d in dl:
if d in dirs:
shutil.rmtree(os.path.join(root, d))
def del_files(fl):
for f in fl:
if f in files:
os.remove(os.path.join(root, f))
if 'build.gradle' in files:
# Even when running clean, gradle stores task/artifact caches in
# .gradle/ as binary files. To avoid overcomplicating the scanner,
# manually delete them, just like `gradle clean` should have removed
# the build/ dirs.
del_dirs(['build', '.gradle', 'gradle'])
del_files(['gradlew', 'gradlew.bat'])
if 'pom.xml' in files:
del_dirs(['target'])
if any(f in files for f in ['ant.properties', 'project.properties', 'build.xml']):
del_dirs(['bin', 'gen'])
if 'jni' in dirs:
del_dirs(['obj'])
if options.skipscan:
if build.scandelete:
raise BuildException("Refusing to skip source scan since scandelete is present")
else:
# Scan before building...
logging.info("Scanning source for common problems...")
count = scanner.scan_source(build_dir, root_dir, build)
if count > 0:
if force:
logging.warn('Scanner found %d problems' % count)
else:
raise BuildException("Can't build due to %d errors while scanning" % count)
if not options.notarball:
# Build the source tarball right before we build the release...
logging.info("Creating source tarball...")
tarname = common.getsrcname(app, build)
tarball = tarfile.open(os.path.join(tmp_dir, tarname), "w:gz")
def tarexc(f):
return any(f.endswith(s) for s in ['.svn', '.git', '.hg', '.bzr'])
tarball.add(build_dir, tarname, exclude=tarexc)
tarball.close()
# Run a build command if one is required...
if build.build:
logging.info("Running 'build' commands in %s" % root_dir)
cmd = common.replace_config_vars(build.build, build)
# Substitute source library paths into commands...
for name, number, libpath in srclibpaths:
libpath = os.path.relpath(libpath, root_dir)
cmd = cmd.replace('$$' + name + '$$', libpath)
p = FDroidPopen(['bash', '-x', '-c', cmd], cwd=root_dir)
if p.returncode != 0:
raise BuildException("Error running build command for %s:%s" %
(app.id, build.version), p.output)
# Build native stuff if required...
if build.buildjni and build.buildjni != ['no']:
logging.info("Building the native code")
jni_components = build.buildjni
if jni_components == ['yes']:
jni_components = ['']
cmd = [os.path.join(ndk_path, "ndk-build"), "-j1"]
for d in jni_components:
if d:
logging.info("Building native code in '%s'" % d)
else:
logging.info("Building native code in the main project")
manifest = os.path.join(root_dir, d, 'AndroidManifest.xml')
if os.path.exists(manifest):
# Read and write the whole AM.xml to fix newlines and avoid
# the ndk r8c or later 'wordlist' errors. The outcome of this
# under gnu/linux is the same as when using tools like
# dos2unix, but the native python way is faster and will
# work in non-unix systems.
manifest_text = open(manifest, 'U').read()
open(manifest, 'w').write(manifest_text)
# In case the AM.xml read was big, free the memory
del manifest_text
p = FDroidPopen(cmd, cwd=os.path.join(root_dir, d))
if p.returncode != 0:
raise BuildException("NDK build failed for %s:%s" % (app.id, build.version), p.output)
p = None
# Build the release...
if bmethod == 'maven':
logging.info("Building Maven project...")
if '@' in build.maven:
maven_dir = os.path.join(root_dir, build.maven.split('@', 1)[1])
else:
maven_dir = root_dir
mvncmd = [config['mvn3'], '-Dandroid.sdk.path=' + config['sdk_path'],
'-Dmaven.jar.sign.skip=true', '-Dmaven.test.skip=true',
'-Dandroid.sign.debug=false', '-Dandroid.release=true',
'package']
if build.target:
target = build.target.split('-')[1]
common.regsub_file(r'[0-9]*',
r'%s' % target,
os.path.join(root_dir, 'pom.xml'))
if '@' in build.maven:
common.regsub_file(r'[0-9]*',
r'%s' % target,
os.path.join(maven_dir, 'pom.xml'))
p = FDroidPopen(mvncmd, cwd=maven_dir)
bindir = os.path.join(root_dir, 'target')
elif bmethod == 'kivy':
logging.info("Building Kivy project...")
spec = os.path.join(root_dir, 'buildozer.spec')
if not os.path.exists(spec):
raise BuildException("Expected to find buildozer-compatible spec at {0}"
.format(spec))
defaults = {'orientation': 'landscape', 'icon': '',
'permissions': '', 'android.api': "18"}
bconfig = ConfigParser(defaults, allow_no_value=True)
bconfig.read(spec)
distdir = os.path.join('python-for-android', 'dist', 'fdroid')
if os.path.exists(distdir):
shutil.rmtree(distdir)
modules = bconfig.get('app', 'requirements').split(',')
cmd = 'ANDROIDSDK=' + config['sdk_path']
cmd += ' ANDROIDNDK=' + ndk_path
cmd += ' ANDROIDNDKVER=' + build.ndk
cmd += ' ANDROIDAPI=' + str(bconfig.get('app', 'android.api'))
cmd += ' VIRTUALENV=virtualenv'
cmd += ' ./distribute.sh'
cmd += ' -m ' + "'" + ' '.join(modules) + "'"
cmd += ' -d fdroid'
p = subprocess.Popen(cmd, cwd='python-for-android', shell=True)
if p.returncode != 0:
raise BuildException("Distribute build failed")
cid = bconfig.get('app', 'package.domain') + '.' + bconfig.get('app', 'package.name')
if cid != app.id:
raise BuildException("Package ID mismatch between metadata and spec")
orientation = bconfig.get('app', 'orientation')
if orientation == 'all':
orientation = 'sensor'
cmd = ['./build.py',
'--dir', root_dir,
'--name', bconfig.get('app', 'title'),
'--package', app.id,
'--version', bconfig.get('app', 'version'),
'--orientation', orientation
]
perms = bconfig.get('app', 'permissions')
for perm in perms.split(','):
cmd.extend(['--permission', perm])
if bconfig.get('app', 'fullscreen') == 0:
cmd.append('--window')
icon = bconfig.get('app', 'icon.filename')
if icon:
cmd.extend(['--icon', os.path.join(root_dir, icon)])
cmd.append('release')
p = FDroidPopen(cmd, cwd=distdir)
elif bmethod == 'gradle':
logging.info("Building Gradle project...")
cmd = [config['gradle']]
if build.gradleprops:
cmd += ['-P' + kv for kv in build.gradleprops]
cmd += gradletasks
p = FDroidPopen(cmd, cwd=root_dir)
elif bmethod == 'ant':
logging.info("Building Ant project...")
cmd = ['ant']
if build.antcommands:
cmd += build.antcommands
else:
cmd += ['release']
p = FDroidPopen(cmd, cwd=root_dir)
bindir = os.path.join(root_dir, 'bin')
if p is not None and p.returncode != 0:
raise BuildException("Build failed for %s:%s" % (app.id, build.version), p.output)
logging.info("Successfully built version " + build.version + ' of ' + app.id)
omethod = build.output_method()
if omethod == 'maven':
stdout_apk = '\n'.join([
line for line in p.output.splitlines() if any(
a in line for a in ('.apk', '.ap_', '.jar'))])
m = re.match(r".*^\[INFO\] .*apkbuilder.*/([^/]*)\.apk",
stdout_apk, re.S | re.M)
if not m:
m = re.match(r".*^\[INFO\] Creating additional unsigned apk file .*/([^/]+)\.apk[^l]",
stdout_apk, re.S | re.M)
if not m:
m = re.match(r'.*^\[INFO\] [^$]*aapt \[package,[^$]*' + bindir + r'/([^/]+)\.ap[_k][,\]]',
stdout_apk, re.S | re.M)
if not m:
m = re.match(r".*^\[INFO\] Building jar: .*/" + bindir + r"/(.+)\.jar",
stdout_apk, re.S | re.M)
if not m:
raise BuildException('Failed to find output')
src = m.group(1)
src = os.path.join(bindir, src) + '.apk'
elif omethod == 'kivy':
src = os.path.join('python-for-android', 'dist', 'default', 'bin',
'{0}-{1}-release.apk'.format(
bconfig.get('app', 'title'),
bconfig.get('app', 'version')))
elif omethod == 'gradle':
src = None
for apks_dir in [
os.path.join(root_dir, 'build', 'outputs', 'apk'),
os.path.join(root_dir, 'build', 'apk'),
]:
for apkglob in ['*-release-unsigned.apk', '*-unsigned.apk', '*.apk']:
apks = glob.glob(os.path.join(apks_dir, apkglob))
if len(apks) > 1:
raise BuildException('More than one resulting apk found in %s' % apks_dir,
'\n'.join(apks))
if len(apks) == 1:
src = apks[0]
break
if src is not None:
break
if src is None:
raise BuildException('Failed to find any output apks')
elif omethod == 'ant':
stdout_apk = '\n'.join([
line for line in p.output.splitlines() if '.apk' in line])
src = re.match(r".*^.*Creating (.+) for release.*$.*", stdout_apk,
re.S | re.M).group(1)
src = os.path.join(bindir, src)
elif omethod == 'raw':
globpath = os.path.join(root_dir, build.output)
apks = glob.glob(globpath)
if len(apks) > 1:
raise BuildException('Multiple apks match %s' % globpath, '\n'.join(apks))
if len(apks) < 1:
raise BuildException('No apks match %s' % globpath)
src = os.path.normpath(apks[0])
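The gradle branch above tries progressively looser globs until exactly one apk matches. The preference order is easier to see in a self-contained sketch; fnmatch stands in for glob here since we match bare names rather than filesystem paths.

```python
import fnmatch

def pick_apk(names):
    # Prefer an explicitly unsigned release apk, then any unsigned apk,
    # then any apk at all -- mirroring the search order above. Like the
    # code above, more than one hit for a pattern is an error.
    for pattern in ['*-release-unsigned.apk', '*-unsigned.apk', '*.apk']:
        hits = fnmatch.filter(names, pattern)
        if len(hits) > 1:
            raise ValueError('ambiguous apks: %s' % hits)
        if hits:
            return hits[0]
    return None
```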
# Make sure it's not debuggable...
if common.isApkDebuggable(src, config):
raise BuildException("APK is debuggable")
# By way of a sanity check, make sure the version and version
# code in our new apk match what we expect...
logging.debug("Checking " + src)
if not os.path.exists(src):
raise BuildException("Unsigned apk is not at expected location of " + src)
p = SdkToolsPopen(['aapt', 'dump', 'badging', src], output=False)
vercode = None
version = None
foundid = None
nativecode = None
for line in p.output.splitlines():
if line.startswith("package:"):
pat = re.compile(".*name='([a-zA-Z0-9._]*)'.*")
m = pat.match(line)
if m:
foundid = m.group(1)
pat = re.compile(".*versionCode='([0-9]*)'.*")
m = pat.match(line)
if m:
vercode = m.group(1)
pat = re.compile(".*versionName='([^']*)'.*")
m = pat.match(line)
if m:
version = m.group(1)
elif line.startswith("native-code:"):
nativecode = line[12:]
# Ignore empty strings or any kind of space/newline chars that we don't
# care about
if nativecode is not None:
nativecode = nativecode.strip()
nativecode = None if not nativecode else nativecode
if build.buildjni and build.buildjni != ['no']:
if nativecode is None:
raise BuildException("Native code should have been built but none was packaged")
if build.novcheck:
vercode = build.vercode
version = build.version
if not version or not vercode:
raise BuildException("Could not find version information in build output")
if not foundid:
raise BuildException("Could not find package ID in output")
if foundid != app.id:
raise BuildException("Wrong package ID - build " + foundid + " but expected " + app.id)
# Some apps (e.g. Timeriffic) have had the bonkers idea of
# including the entire changelog in the version number. Remove
# it so we can compare. (TODO: might be better to remove it
# before we compile, in fact)
index = version.find(" //")
if index != -1:
version = version[:index]
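The trim above is straightforward: everything from the first ' //' onwards is dropped before comparing against the metadata version. As a standalone helper:

```python
def strip_changelog(version):
    # Drop an embedded changelog (everything from the first ' //'),
    # as done above before the version comparison.
    index = version.find(' //')
    return version[:index] if index != -1 else version
```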
if (version != build.version or
vercode != build.vercode):
raise BuildException(("Unexpected version/version code in output;"
" APK: '%s' / '%s', "
" Expected: '%s' / '%s'")
% (version, str(vercode), build.version,
str(build.vercode))
)
# Add information for 'fdroid verify' to be able to reproduce the build
# environment.
if onserver:
metadir = os.path.join(tmp_dir, 'META-INF')
if not os.path.exists(metadir):
os.mkdir(metadir)
homedir = os.path.expanduser('~')
for fn in ['buildserverid', 'fdroidserverid']:
shutil.copyfile(os.path.join(homedir, fn),
os.path.join(metadir, fn))
subprocess.call(['jar', 'uf', os.path.abspath(src),
'META-INF/' + fn], cwd=tmp_dir)
# Copy the unsigned apk to our destination directory for further
# processing (by publish.py)...
dest = os.path.join(output_dir, common.getapkname(app, build))
shutil.copyfile(src, dest)
# Move the source tarball into the output directory...
if output_dir != tmp_dir and not options.notarball:
shutil.move(os.path.join(tmp_dir, tarname),
os.path.join(output_dir, tarname))
def trybuild(app, build, build_dir, output_dir, also_check_dir, srclib_dir, extlib_dir,
tmp_dir, repo_dir, vcs, test, server, force, onserver, refresh):
"""
Build a particular version of an application, if it needs building.
:param output_dir: The directory where the build output will go. Usually
this is the 'unsigned' directory.
:param repo_dir: The repo directory - used for checking if the build is
necessary.
:param also_check_dir: An additional location for checking if the build
is necessary (usually the archive repo)
:param test: True if building in test mode, in which case the build will
always happen, even if the output already exists. In test mode, the
output directory should be a temporary location, not any of the real
ones.
:returns: True if the build was done, False if it wasn't necessary.
"""
dest_apk = common.getapkname(app, build)
dest = os.path.join(output_dir, dest_apk)
dest_repo = os.path.join(repo_dir, dest_apk)
if not test:
if os.path.exists(dest) or os.path.exists(dest_repo):
return False
if also_check_dir:
dest_also = os.path.join(also_check_dir, dest_apk)
if os.path.exists(dest_also):
return False
if build.disable and not options.force:
return False
logging.info("Building version %s (%s) of %s" % (
build.version, build.vercode, app.id))
if server:
# When using server mode, still keep a local cache of the repo, by
# grabbing the source now.
vcs.gotorevision(build.commit)
build_server(app, build, vcs, build_dir, output_dir, force)
else:
build_local(app, build, vcs, build_dir, output_dir, srclib_dir, extlib_dir, tmp_dir, force, onserver, refresh)
return True
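The skip logic at the top of trybuild() can be illustrated in isolation: outside test mode, a build is skipped as soon as the target apk already exists in any of the checked directories. A minimal sketch, using a hypothetical apk filename in place of the real common.getapkname() result:

```python
import os
import tempfile

# Hypothetical apk filename; the real one comes from common.getapkname(app, build)
dest_apk = "org.example.app_7.apk"
output_dir = tempfile.mkdtemp()
repo_dir = tempfile.mkdtemp()

def build_needed(test):
    # Mirrors the checks at the top of trybuild(): test mode always builds
    if test:
        return True
    for d in (output_dir, repo_dir):
        if os.path.exists(os.path.join(d, dest_apk)):
            return False
    return True

assert build_needed(test=False)       # nothing built yet
open(os.path.join(repo_dir, dest_apk), 'w').close()
assert not build_needed(test=False)   # apk already present in the repo
assert build_needed(test=True)        # test mode ignores existing output
```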
def parse_commandline():
"""Parse the command line. Returns options, parser."""
parser = ArgumentParser(usage="%(prog)s [options] [APPID[:VERCODE] [APPID[:VERCODE] ...]]")
common.setup_global_opts(parser)
parser.add_argument("appid", nargs='*', help="app-id with optional versioncode in the form APPID[:VERCODE]")
parser.add_argument("-l", "--latest", action="store_true", default=False,
help="Build only the latest version of each package")
parser.add_argument("-s", "--stop", action="store_true", default=False,
help="Make the build stop on exceptions")
parser.add_argument("-t", "--test", action="store_true", default=False,
help="Test mode - put output in the tmp directory only, and always build, even if the output already exists.")
parser.add_argument("--server", action="store_true", default=False,
help="Use build server")
parser.add_argument("--resetserver", action="store_true", default=False,
help="Reset and create a brand new build server, even if the existing one appears to be ok.")
parser.add_argument("--on-server", dest="onserver", action="store_true", default=False,
help="Specify that we're running on the build server")
parser.add_argument("--skip-scan", dest="skipscan", action="store_true", default=False,
help="Skip scanning the source code for binaries and other problems")
parser.add_argument("--no-tarball", dest="notarball", action="store_true", default=False,
help="Don't create a source tarball, useful when testing a build")
parser.add_argument("--no-refresh", dest="refresh", action="store_false", default=True,
help="Don't refresh the repository, useful when testing a build with no internet connection")
parser.add_argument("-f", "--force", action="store_true", default=False,
help="Force build of disabled apps, and carries on regardless of scan problems. Only allowed in test mode.")
parser.add_argument("-a", "--all", action="store_true", default=False,
help="Build all applications available")
parser.add_argument("-w", "--wiki", default=False, action="store_true",
help="Update the wiki")
options = parser.parse_args()
# Force --stop with --on-server to get correct exit code
if options.onserver:
options.stop = True
if options.force and not options.test:
parser.error("option %s: Force is only allowed in test mode" % "force")
return options, parser
options = None
config = None
def main():
global options, config
options, parser = parse_commandline()
metadata_files = glob.glob('.fdroid.*[a-z]') # ignore files ending in ~
if len(metadata_files) > 1:
raise FDroidException("Only one local metadata file allowed! Found: "
+ " ".join(metadata_files))
if not os.path.isdir('metadata') and len(metadata_files) == 0:
raise FDroidException("No app metadata found, nothing to process!")
if not options.appid and not options.all:
parser.error("option %s: If you really want to build all the apps, use --all" % "all")
config = common.read_config(options)
if config['build_server_always']:
options.server = True
if options.resetserver and not options.server:
parser.error("option %s: Using --resetserver without --server makes no sense" % "resetserver")
log_dir = 'logs'
if not os.path.isdir(log_dir):
logging.info("Creating log directory")
os.makedirs(log_dir)
tmp_dir = 'tmp'
if not os.path.isdir(tmp_dir):
logging.info("Creating temporary directory")
os.makedirs(tmp_dir)
if options.test:
output_dir = tmp_dir
else:
output_dir = 'unsigned'
if not os.path.isdir(output_dir):
logging.info("Creating output directory")
os.makedirs(output_dir)
if config['archive_older'] != 0:
also_check_dir = 'archive'
else:
also_check_dir = None
repo_dir = 'repo'
build_dir = 'build'
if not os.path.isdir(build_dir):
logging.info("Creating build directory")
os.makedirs(build_dir)
srclib_dir = os.path.join(build_dir, 'srclib')
extlib_dir = os.path.join(build_dir, 'extlib')
# Read all app and srclib metadata
allapps = metadata.read_metadata(xref=not options.onserver)
apps = common.read_app_args(options.appid, allapps, True)
for appid, app in apps.items():
if (app.Disabled and not options.force) or not app.RepoType or not app.builds:
del apps[appid]
if not apps:
raise FDroidException("No apps to process.")
if options.latest:
for app in apps.itervalues():
for build in reversed(app.builds):
if build.disable and not options.force:
continue
app.builds = [build]
break
if options.wiki:
import mwclient
site = mwclient.Site((config['wiki_protocol'], config['wiki_server']),
path=config['wiki_path'])
site.login(config['wiki_user'], config['wiki_password'])
# Build applications...
failed_apps = {}
build_succeeded = []
for appid, app in apps.iteritems():
first = True
for build in app.builds:
wikilog = None
try:
# For the first build of a particular app, we need to set up
# the source repo. We can reuse it on subsequent builds, if
# there are any.
if first:
if app.RepoType == 'srclib':
build_dir = os.path.join('build', 'srclib', app.Repo)
else:
build_dir = os.path.join('build', appid)
# Set up vcs interface and make sure we have the latest code...
logging.debug("Getting {0} vcs interface for {1}"
.format(app.RepoType, app.Repo))
vcs = common.getvcs(app.RepoType, app.Repo, build_dir)
first = False
logging.debug("Checking " + build.version)
if trybuild(app, build, build_dir, output_dir,
also_check_dir, srclib_dir, extlib_dir,
tmp_dir, repo_dir, vcs, options.test,
options.server, options.force,
options.onserver, options.refresh):
if app.Binaries is not None:
# This is an app where we build from source, and
# verify the apk contents against a developer's
# binary. We get that binary now, and save it
# alongside our built one in the 'unsigned'
# directory.
url = app.Binaries
url = url.replace('%v', build.version)
url = url.replace('%c', str(build.vercode))
logging.info("...retrieving " + url)
of = "{0}_{1}.apk.binary".format(app.id, build.vercode)
of = os.path.join(output_dir, of)
net.download_file(url, local_filename=of)
build_succeeded.append(app)
wikilog = "Build succeeded"
except VCSException as vcse:
reason = str(vcse).split('\n', 1)[0] if options.verbose else str(vcse)
logging.error("VCS error while building app %s: %s" % (
appid, reason))
if options.stop:
sys.exit(1)
failed_apps[appid] = vcse
wikilog = str(vcse)
except FDroidException as e:
with open(os.path.join(log_dir, appid + '.log'), 'a+') as f:
f.write(str(e))
logging.error("Could not build app %s: %s" % (appid, e))
if options.stop:
sys.exit(1)
failed_apps[appid] = e
wikilog = e.get_wikitext()
except Exception as e:
logging.error("Could not build app %s due to unknown error: %s" % (
appid, traceback.format_exc()))
if options.stop:
sys.exit(1)
failed_apps[appid] = e
wikilog = str(e)
if options.wiki and wikilog:
try:
# Write a page with the last build log for this version code
lastbuildpage = appid + '/lastbuild_' + build.vercode
newpage = site.Pages[lastbuildpage]
txt = "Build completed at " + time.strftime("%Y-%m-%d %H:%M:%SZ", time.gmtime()) + "\n\n" + wikilog
newpage.save(txt, summary='Build log')
# Redirect from /lastbuild to the most recent build log
newpage = site.Pages[appid + '/lastbuild']
newpage.save('#REDIRECT [[' + lastbuildpage + ']]', summary='Update redirect')
except Exception:
logging.error("Error while attempting to publish build log")
for app in build_succeeded:
logging.info("success: %s" % (app.id))
if not options.verbose:
for fa in failed_apps:
logging.info("Build for app %s failed:\n%s" % (fa, failed_apps[fa]))
logging.info("Finished.")
if len(build_succeeded) > 0:
logging.info(str(len(build_succeeded)) + ' builds succeeded')
if len(failed_apps) > 0:
logging.info(str(len(failed_apps)) + ' builds failed')
sys.exit(0)
if __name__ == "__main__":
main()
fdroidserver-0.6.0/fdroidserver/checkupdates.py
#!/usr/bin/env python2
# -*- coding: utf-8 -*-
#
# checkupdates.py - part of the FDroid server tools
# Copyright (C) 2010-2015, Ciaran Gultnieks, ciaran@ciarang.com
# Copyright (C) 2013-2014 Daniel Martí
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import sys
import os
import re
import urllib2
import time
import subprocess
from argparse import ArgumentParser
import traceback
import HTMLParser
from distutils.version import LooseVersion
import logging
import copy
import common
import metadata
from common import VCSException, FDroidException
from metadata import MetaDataException
# Check for a new version by looking at a document retrieved via HTTP.
# The app's Update Check Data field is used to provide the information
# required.
def check_http(app):
try:
if not app.UpdateCheckData:
raise FDroidException('Missing Update Check Data')
urlcode, codeex, urlver, verex = app.UpdateCheckData.split('|')
vercode = "99999999"
if len(urlcode) > 0:
logging.debug("...requesting {0}".format(urlcode))
req = urllib2.Request(urlcode, None)
resp = urllib2.urlopen(req, None, 20)
page = resp.read()
m = re.search(codeex, page)
if not m:
raise FDroidException("No RE match for version code")
vercode = m.group(1)
version = "??"
if len(urlver) > 0:
if urlver != '.':
logging.debug("...requesting {0}".format(urlver))
req = urllib2.Request(urlver, None)
resp = urllib2.urlopen(req, None, 20)
page = resp.read()
m = re.search(verex, page)
if not m:
raise FDroidException("No RE match for version")
version = m.group(1)
return (version, vercode)
except Exception:
msg = "Could not complete http check for app {0} due to unknown error: {1}".format(app.id, traceback.format_exc())
return (None, msg)
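The UpdateCheckData field that check_http() consumes is four '|'-separated parts: a URL to fetch the version code from, a regex for the code, a URL for the version name ('.' meaning reuse the already-fetched page), and a regex for the name. A sketch with made-up URLs, regexes, and page content:

```python
import re

# Hypothetical UpdateCheckData value; all four parts are illustrative only
data = r"https://example.com/ver|code=(\d+)|.|name=([\d.]+)"
urlcode, codeex, urlver, verex = data.split('|')

page = "code=42 name=1.2.3"  # pretend this was fetched from urlcode
vercode = re.search(codeex, page).group(1)
version = re.search(verex, page).group(1)
assert (version, vercode) == ("1.2.3", "42")
```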
# Check for a new version by looking at the tags in the source repo.
# Whether this can be used reliably or not depends on
# the development procedures used by the project's developers. Use it with
# caution, because it's inappropriate for many projects.
# Returns (None, "a message") if this didn't work, or (version, vercode, tag) for
# the details of the current version.
def check_tags(app, pattern):
try:
if app.RepoType == 'srclib':
build_dir = os.path.join('build', 'srclib', app.Repo)
repotype = common.getsrclibvcs(app.Repo)
else:
build_dir = os.path.join('build', app.id)
repotype = app.RepoType
if repotype not in ('git', 'git-svn', 'hg', 'bzr'):
return (None, 'Tags update mode only works for git, hg, bzr and git-svn repositories currently', None)
if repotype == 'git-svn' and ';' not in app.Repo:
return (None, 'Tags update mode used in git-svn, but the repo was not set up with tags', None)
# Set up vcs interface and make sure we have the latest code...
vcs = common.getvcs(app.RepoType, app.Repo, build_dir)
vcs.gotorevision(None)
last_build = metadata.Build()
if len(app.builds) > 0:
last_build = app.builds[-1]
if last_build.submodules:
vcs.initsubmodules()
hpak = None
htag = None
hver = None
hcode = "0"
tags = vcs.gettags()
if not tags:
return (None, "No tags found", None)
logging.debug("All tags: " + ','.join(tags))
if pattern:
pat = re.compile(pattern)
tags = [tag for tag in tags if pat.match(tag)]
if not tags:
return (None, "No matching tags found", None)
logging.debug("Matching tags: " + ','.join(tags))
if len(tags) > 5 and repotype in ('git',):
tags = vcs.latesttags(tags, 5)
logging.debug("Latest tags: " + ','.join(tags))
for tag in tags:
logging.debug("Check tag: '{0}'".format(tag))
vcs.gotorevision(tag)
for subdir in possible_subdirs(app):
if subdir == '.':
root_dir = build_dir
else:
root_dir = os.path.join(build_dir, subdir)
paths = common.manifest_paths(root_dir, last_build.gradle)
version, vercode, package = common.parse_androidmanifests(paths, app)
if vercode:
logging.debug("Manifest exists in subdir '{0}'. Found version {1} ({2})"
.format(subdir, version, vercode))
if int(vercode) > int(hcode):
hpak = package
htag = tag
hcode = str(int(vercode))
hver = version
if not hpak:
return (None, "Couldn't find package ID", None)
if hver:
return (hver, hcode, htag)
return (None, "Couldn't find any version information", None)
except VCSException as vcse:
msg = "VCS error while scanning app {0}: {1}".format(app.id, vcse)
return (None, msg, None)
except Exception:
msg = "Could not scan app {0} due to unknown error: {1}".format(app.id, traceback.format_exc())
return (None, msg, None)
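The optional pattern argument of check_tags() (everything after 'Tags ' in the update check mode) filters the tag list with re.match before the per-tag manifest scan. For example, with a hypothetical tag list and pattern:

```python
import re

tags = ['v1.0', 'v1.1', 'beta-2', 'v2.0']  # hypothetical output of vcs.gettags()
pattern = r'v[\d.]+'                       # e.g. from UpdateCheckMode "Tags v[\d.]+"
pat = re.compile(pattern)
matching = [tag for tag in tags if pat.match(tag)]
assert matching == ['v1.0', 'v1.1', 'v2.0']
```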
# Check for a new version by looking at the AndroidManifest.xml at the HEAD
# of the source repo. Whether this can be used reliably or not depends on
# the development procedures used by the project's developers. Use it with
# caution, because it's inappropriate for many projects.
# Returns (None, "a message") if this didn't work, or (version, vercode) for
# the details of the current version.
def check_repomanifest(app, branch=None):
try:
if app.RepoType == 'srclib':
build_dir = os.path.join('build', 'srclib', app.Repo)
repotype = common.getsrclibvcs(app.Repo)
else:
build_dir = os.path.join('build', app.id)
repotype = app.RepoType
# Set up vcs interface and make sure we have the latest code...
vcs = common.getvcs(app.RepoType, app.Repo, build_dir)
if repotype == 'git':
if branch:
branch = 'origin/' + branch
vcs.gotorevision(branch)
elif repotype == 'git-svn':
vcs.gotorevision(branch)
elif repotype == 'hg':
vcs.gotorevision(branch)
elif repotype == 'bzr':
vcs.gotorevision(None)
last_build = metadata.Build()
if len(app.builds) > 0:
last_build = app.builds[-1]
if last_build.submodules:
vcs.initsubmodules()
hpak = None
hver = None
hcode = "0"
for subdir in possible_subdirs(app):
if subdir == '.':
root_dir = build_dir
else:
root_dir = os.path.join(build_dir, subdir)
paths = common.manifest_paths(root_dir, last_build.gradle)
version, vercode, package = common.parse_androidmanifests(paths, app)
if vercode:
logging.debug("Manifest exists in subdir '{0}'. Found version {1} ({2})"
.format(subdir, version, vercode))
if int(vercode) > int(hcode):
hpak = package
hcode = str(int(vercode))
hver = version
if not hpak:
return (None, "Couldn't find package ID")
if hver:
return (hver, hcode)
return (None, "Couldn't find any version information")
except VCSException as vcse:
msg = "VCS error while scanning app {0}: {1}".format(app.id, vcse)
return (None, msg)
except Exception:
msg = "Could not scan app {0} due to unknown error: {1}".format(app.id, traceback.format_exc())
return (None, msg)
def check_repotrunk(app, branch=None):
try:
if app.RepoType == 'srclib':
build_dir = os.path.join('build', 'srclib', app.Repo)
repotype = common.getsrclibvcs(app.Repo)
else:
build_dir = os.path.join('build', app.id)
repotype = app.RepoType
if repotype not in ('git-svn', ):
return (None, 'RepoTrunk update mode only makes sense in git-svn repositories')
# Set up vcs interface and make sure we have the latest code...
vcs = common.getvcs(app.RepoType, app.Repo, build_dir)
vcs.gotorevision(None)
ref = vcs.getref()
return (ref, ref)
except VCSException as vcse:
msg = "VCS error while scanning app {0}: {1}".format(app.id, vcse)
return (None, msg)
except Exception:
msg = "Could not scan app {0} due to unknown error: {1}".format(app.id, traceback.format_exc())
return (None, msg)
# Check for a new version by looking at the Google Play Store.
# Returns (None, "a message") if this didn't work, or (version, None) for
# the details of the current version.
def check_gplay(app):
time.sleep(15)
url = 'https://play.google.com/store/apps/details?id=' + app.id
headers = {'User-Agent': 'Mozilla/5.0 (X11; Linux i686; rv:18.0) Gecko/20100101 Firefox/18.0'}
req = urllib2.Request(url, None, headers)
try:
resp = urllib2.urlopen(req, None, 20)
page = resp.read()
except urllib2.HTTPError as e:
return (None, str(e.code))
except Exception as e:
return (None, 'Failed:' + str(e))
version = None
m = re.search('itemprop="softwareVersion">[ ]*([^<]+)[ ]*</div>', page)
if m:
html_parser = HTMLParser.HTMLParser()
version = html_parser.unescape(m.group(1))
if version == 'Varies with device':
return (None, 'Device-variable version, cannot use this method')
if not version:
return (None, "Couldn't find version")
return (version.strip(), None)
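The scraping step in check_gplay() can be exercised against a fabricated page snippet. This sketch uses html.unescape(), the Python 3 counterpart of the HTMLParser.unescape() call in the Python 2 code above:

```python
import re
import html

page = '<div itemprop="softwareVersion"> 1.2.3 </div>'  # fabricated snippet
m = re.search('itemprop="softwareVersion">[ ]*([^<]+)[ ]*', page)
version = html.unescape(m.group(1)).strip()
assert version == '1.2.3'
```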
# Return all directories under startdir that contain any of the manifest
# files, and thus are probably an Android project.
def dirs_with_manifest(startdir):
for r, d, f in os.walk(startdir):
if any(m in f for m in [
'AndroidManifest.xml', 'pom.xml', 'build.gradle']):
yield r
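dirs_with_manifest() is a plain os.walk() filter; a self-contained sketch over a throwaway directory tree shows which directories qualify:

```python
import os
import tempfile

root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, 'app'))
os.makedirs(os.path.join(root, 'docs'))
open(os.path.join(root, 'app', 'build.gradle'), 'w').close()

# Same test as dirs_with_manifest(): keep dirs containing any manifest file
found = [r for r, d, f in os.walk(root)
         if any(m in f for m in ['AndroidManifest.xml', 'pom.xml', 'build.gradle'])]
assert found == [os.path.join(root, 'app')]
```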
# Generate all subdirs under the app's build dir that contain a manifest,
# each yielded as a path relative to the build dir ('.' for the root itself).
def possible_subdirs(app):
if app.RepoType == 'srclib':
build_dir = os.path.join('build', 'srclib', app.Repo)
else:
build_dir = os.path.join('build', app.id)
last_build = metadata.Build()
if len(app.builds) > 0:
last_build = app.builds[-1]
for d in dirs_with_manifest(build_dir):
m_paths = common.manifest_paths(d, last_build.gradle)
package = common.parse_androidmanifests(m_paths, app)[2]
if package is not None:
subdir = os.path.relpath(d, build_dir)
logging.debug("Adding possible subdir %s" % subdir)
yield subdir
def fetch_autoname(app, tag):
if not app.RepoType or app.UpdateCheckMode in ('None', 'Static'):
return None
if app.RepoType == 'srclib':
build_dir = os.path.join('build', 'srclib', app.Repo)
else:
build_dir = os.path.join('build', app.id)
try:
vcs = common.getvcs(app.RepoType, app.Repo, build_dir)
vcs.gotorevision(tag)
except VCSException:
return None
last_build = metadata.Build()
if len(app.builds) > 0:
last_build = app.builds[-1]
logging.debug("...fetch auto name from " + build_dir)
new_name = None
for subdir in possible_subdirs(app):
if subdir == '.':
root_dir = build_dir
else:
root_dir = os.path.join(build_dir, subdir)
new_name = common.fetch_real_name(root_dir, last_build.gradle)
if new_name is not None:
break
commitmsg = None
if new_name:
logging.debug("...got autoname '" + new_name + "'")
if new_name != app.AutoName:
app.AutoName = new_name
if not commitmsg:
commitmsg = "Set autoname of {0}".format(common.getappname(app))
else:
logging.debug("...couldn't get autoname")
return commitmsg
def checkupdates_app(app, first=True):
# If a change is made, commitmsg should be set to a description of it.
# Only if this is set will changes be written back to the metadata.
commitmsg = None
tag = None
msg = None
vercode = None
noverok = False
mode = app.UpdateCheckMode
if mode.startswith('Tags'):
pattern = mode[5:] if len(mode) > 4 else None
(version, vercode, tag) = check_tags(app, pattern)
if version == 'Unknown':
version = tag
msg = vercode
elif mode == 'RepoManifest':
(version, vercode) = check_repomanifest(app)
msg = vercode
elif mode.startswith('RepoManifest/'):
tag = mode[13:]
(version, vercode) = check_repomanifest(app, tag)
msg = vercode
elif mode == 'RepoTrunk':
(version, vercode) = check_repotrunk(app)
msg = vercode
elif mode == 'HTTP':
(version, vercode) = check_http(app)
msg = vercode
elif mode in ('None', 'Static'):
version = None
msg = 'Checking disabled'
noverok = True
else:
version = None
msg = 'Invalid update check method'
if version and vercode and app.VercodeOperation:
oldvercode = str(int(vercode))
op = app.VercodeOperation.replace("%c", oldvercode)
vercode = str(eval(op))
logging.debug("Applied vercode operation: %s -> %s" % (oldvercode, vercode))
if version and any(version.startswith(s) for s in [
'${', # Gradle variable names
'@string/', # Strings we could not resolve
]):
version = "Unknown"
updating = False
if version is None:
logmsg = "...{0} : {1}".format(app.id, msg)
if noverok:
logging.info(logmsg)
else:
logging.warn(logmsg)
elif vercode == app.CurrentVersionCode:
logging.info("...up to date")
else:
app.CurrentVersion = version
app.CurrentVersionCode = str(int(vercode))
updating = True
commitmsg = fetch_autoname(app, tag)
if updating:
name = common.getappname(app)
ver = common.getcvname(app)
logging.info('...updating to version %s' % ver)
commitmsg = 'Update CV of %s to %s' % (name, ver)
if options.auto:
mode = app.AutoUpdateMode
if mode in ('None', 'Static'):
pass
elif mode.startswith('Version '):
pattern = mode[8:]
if pattern.startswith('+'):
try:
suffix, pattern = pattern.split(' ', 1)
except ValueError:
raise MetaDataException("Invalid AUM: " + mode)
else:
suffix = ''
gotcur = False
latest = None
for build in app.builds:
if int(build.vercode) >= int(app.CurrentVersionCode):
gotcur = True
if not latest or int(build.vercode) > int(latest.vercode):
latest = build
if int(latest.vercode) > int(app.CurrentVersionCode):
logging.info("Refusing to auto update, since the latest build is newer")
if not gotcur:
newbuild = copy.deepcopy(latest)
newbuild.disable = False
newbuild.vercode = app.CurrentVersionCode
newbuild.version = app.CurrentVersion + suffix
logging.info("...auto-generating build for " + newbuild.version)
commit = pattern.replace('%v', newbuild.version)
commit = commit.replace('%c', newbuild.vercode)
newbuild.commit = commit
app.builds.append(newbuild)
name = common.getappname(app)
ver = common.getcvname(app)
commitmsg = "Update %s to %s" % (name, ver)
else:
logging.warn('Invalid auto update mode "' + mode + '" on ' + app.id)
if commitmsg:
metadatapath = os.path.join('metadata', app.id + '.txt')
with open(metadatapath, 'w') as f:
metadata.write_metadata('txt', f, app)
if options.commit:
logging.info("Committing update for " + metadatapath)
gitcmd = ["git", "commit", "-m", commitmsg]
if 'auto_author' in config:
gitcmd.extend(['--author', config['auto_author']])
gitcmd.extend(["--", metadatapath])
if subprocess.call(gitcmd) != 0:
logging.error("Git commit failed")
sys.exit(1)
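Two of the string-template mechanics in checkupdates_app() are easy to demonstrate standalone: VercodeOperation substitutes %c with the detected version code and eval()s the result, and a 'Version' AutoUpdateMode splits an optional '+suffix' off the commit pattern, whose %v is later replaced by the version name. Both field values below are hypothetical:

```python
# VercodeOperation: "%c" is replaced by the detected code, then eval()ed
op = "%c * 10 + 3"              # hypothetical VercodeOperation
vercode = str(eval(op.replace("%c", str(int("214")))))
assert vercode == "2143"

# AutoUpdateMode "Version +<suffix> <pattern>": optional suffix, then pattern
mode = "Version +-fdroid v%v"   # hypothetical AutoUpdateMode
pattern = mode[8:]
if pattern.startswith('+'):
    suffix, pattern = pattern.split(' ', 1)
else:
    suffix = ''
assert (suffix, pattern) == ('+-fdroid', 'v%v')
assert pattern.replace('%v', '1.2.3') == 'v1.2.3'
```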
config = None
options = None
def main():
global config, options
# Parse command line...
parser = ArgumentParser(usage="%(prog)s [options] [APPID [APPID ...]]")
common.setup_global_opts(parser)
parser.add_argument("appid", nargs='*', help="app-id to check for updates")
parser.add_argument("--auto", action="store_true", default=False,
help="Process auto-updates")
parser.add_argument("--autoonly", action="store_true", default=False,
help="Only process apps with auto-updates")
parser.add_argument("--commit", action="store_true", default=False,
help="Commit changes")
parser.add_argument("--gplay", action="store_true", default=False,
help="Only print differences with the Play Store")
options = parser.parse_args()
config = common.read_config(options)
# Get all apps...
allapps = metadata.read_metadata()
apps = common.read_app_args(options.appid, allapps, False)
if options.gplay:
for app in apps:
version, reason = check_gplay(app)
if version is None:
if reason == '404':
logging.info("{0} is not in the Play Store".format(common.getappname(app)))
else:
logging.info("{0} encountered a problem: {1}".format(common.getappname(app), reason))
if version is not None:
stored = app.CurrentVersion
if not stored:
logging.info("{0} has no Current Version but has version {1} on the Play Store"
.format(common.getappname(app), version))
elif LooseVersion(stored) < LooseVersion(version):
logging.info("{0} has version {1} on the Play Store, which is bigger than {2}"
.format(common.getappname(app), version, stored))
else:
if stored != version:
logging.info("{0} has version {1} on the Play Store, which differs from {2}"
.format(common.getappname(app), version, stored))
else:
logging.info("{0} has the same version {1} on the Play Store"
.format(common.getappname(app), version))
return
for appid, app in apps.iteritems():
if options.autoonly and app.AutoUpdateMode in ('None', 'Static'):
logging.debug("Nothing to do for {0}...".format(appid))
continue
logging.info("Processing " + appid + '...')
checkupdates_app(app)
logging.info("Finished.")
if __name__ == "__main__":
main()
fdroidserver-0.6.0/examples/makebuildserver.config.py
#!/usr/bin/env python2
#
# You may want to alter these before running ./makebuildserver
# Name of the base box to use
# basebox = "jessie32"
# Location where testing32.box can be found, if you don't already have
# it. For security reasons, it's recommended that you make your own
# in a secure environment using trusted media (see the manual) but
# you can use this default if you like...
# baseboxurl = "https://f-droid.org/jessie32.box"
#
# or if you have a cached local copy, you can use that first:
# baseboxurl = ["file:///home/fdroid/fdroidserver/cache/jessie32.box", "https://f-droid.org/jessie32.box"]
# In the process of setting up the build server, many gigs of files
# are downloaded (Android SDK components, gradle, etc). These are
# cached so that they are not redownloaded each time. By default,
# these are stored in ~/.cache/fdroidserver
#
# cachedir = 'buildserver/cache'
# A big part of creating a new instance is downloading packages from Debian.
# This sets up a folder in ~/.cache/fdroidserver to cache the downloaded
# packages when rebuilding the build server from scratch. This requires
# that virtualbox-guest-utils is installed.
#
# apt_package_cache = True
# Specify which Debian mirror the build server VM should use. By
# default it uses http.debian.net, which auto-detects the
# best mirror to use.
#
# debian_mirror = 'http://ftp.uk.debian.org/debian/'
# The amount of RAM the build server will have (default: 1024)
# memory = 3584
# The number of CPUs the build server will have
# cpus = 1
# Debian package proxy server - if you have one
# aptproxy = "http://192.168.0.19:8000"
# Set to True if your base box is 64 bit (e.g. testing32.box isn't)
# arch64 = True
# If this is running on an older machine or on a virtualized system,
# it can run a lot slower. If the provisioning fails with a warning
# about the timeout, extend the timeout here. (default: 600 seconds)
#
# boot_timeout = 1200
fdroidserver-0.6.0/examples/opensc-fdroid.cfg
name = OpenSC
description = SunPKCS11 w/ OpenSC Smart card Framework
library = /usr/lib/opensc-pkcs11.so
slotListIndex = 1
fdroidserver-0.6.0/examples/config.py
#!/usr/bin/env python2
# -*- coding: utf-8 -*-
# Copy this file to config.py, then amend the settings below according to
# your system configuration.
# Custom path to the Android SDK, defaults to $ANDROID_HOME
# sdk_path = "$ANDROID_HOME"
# Custom paths to various versions of the Android NDK, defaults to 'r10e' set
# to $ANDROID_NDK. Most users will have the latest at $ANDROID_NDK, which is
# used by default. If a version is missing or assigned to None, it is assumed
# not installed.
# ndk_paths = {
# 'r9b': None,
# 'r10e': "$ANDROID_NDK",
# }
# If you want to build apps that use retrolambda and Java 1.8, you'll need to
# have both 1.7 and 1.8 installed.
# java_paths = {
# '1.7': "/usr/lib/jvm/java-7-openjdk",
# '1.8': None,
# }
# Build tools version to be used
# build_tools = "23.0.2"
# Command or path to binary for running Ant
# ant = "ant"
# Command or path to binary for running maven 3
# mvn3 = "mvn"
# Command or path to binary for running Gradle
# gradle = "gradle"
# Set the maximum age (in days) of an index that a client should accept from
# this repo. Setting it to 0 or not setting it at all disables this
# functionality. If you do set this to a non-zero value, you need to ensure
# that your index is updated much more frequently than the specified interval.
# The same policy is applied to the archive repo, if there is one.
# repo_maxage = 0
repo_url = "https://MyFirstFDroidRepo.org/fdroid/repo"
repo_name = "My First F-Droid Repo Demo"
repo_icon = "fdroid-icon.png"
repo_description = """
This is a repository of apps to be used with F-Droid. Applications in this
repository are either official binaries built by the original application
developers, or are binaries built from source by the admin of f-droid.org
using the tools on https://gitlab.com/u/fdroid.
"""
# As above, but for the archive repo.
# archive_older sets the number of versions kept in the main repo, with all
# older ones going to the archive. Set it to 0, and there will be no archive
# repository, and no need to define the other archive_ values.
archive_older = 3
archive_url = "https://f-droid.org/archive"
archive_name = "My First F-Droid Archive Demo"
archive_icon = "fdroid-icon.png"
archive_description = """
The repository of older versions of applications from the main demo repository.
"""
# Normally, all apps are collected into a single app repository, like on
# https://f-droid.org. For certain situations, it is better to make a repo
# that is made up of APKs only from a single app. For example, an automated
# build server that publishes nightly builds.
# per_app_repos = True
# `fdroid update` will create a link to the current version of a given app.
# This provides a static path to the current APK. To disable the creation of
# this link, uncomment this:
# make_current_version_link = False
# By default, the "current version" link will be based on the "Name" of the
# app from the metadata. You can change it to use a different field from the
# metadata here:
# current_version_name_source = 'id'
# Optionally, override home directory for gpg
# gpghome = /home/fdroid/somewhere/else/.gnupg
# The ID of a GPG key for making detached signatures for apks. Optional.
# gpgkey = '1DBA2E89'
# The key (from the keystore defined below) to be used for signing the
# repository itself. This is the same name you would give to keytool or
# jarsigner using -alias. (Not needed in an unsigned repository).
# repo_keyalias = "fdroidrepo"
# Optionally, the public key for the key defined by repo_keyalias above can
# be specified here. There is no need to do this, as the public key can and
# will be retrieved from the keystore when needed. However, specifying it
# manually can allow some processing to take place without access to the
# keystore.
# repo_pubkey = "..."
# The keystore to use for release keys when building. This needs to be
# somewhere safe and secure, and backed up! The best way to manage these
# sensitive keys is to use a "smartcard" (aka Hardware Security Module). To
# configure F-Droid to use a smartcard, set the keystore file using the keyword
# "NONE" (i.e. keystore = "NONE"). That makes Java find the keystore on the
# smartcard based on 'smartcardoptions' below.
# keystore = "~/.local/share/fdroidserver/keystore.jks"
# You should not need to change these at all, unless you have a very
# customized setup for using smartcards in Java with keytool/jarsigner
# smartcardoptions = "-storetype PKCS11 -providerName SunPKCS11-OpenSC \
# -providerClass sun.security.pkcs11.SunPKCS11 \
# -providerArg opensc-fdroid.cfg"
# The password for the keystore (at least 6 characters). If this password is
# different than the keypass below, it can be OK to store the password in this
# file for real use. But in general, sensitive passwords should not be stored
# in text files!
# keystorepass = "password1"
# The password for keys - the same is used for each auto-generated key as well
# as for the repository key. You should not normally store this password in a
# file since it is a sensitive password.
# keypass = "password2"
# The distinguished name used for all keys.
# keydname = "CN=Birdman, OU=Cell, O=Alcatraz, L=Alcatraz, S=California, C=US"
# Use this to override the auto-generated key aliases with specific ones
# for particular applications. Normally, just leave it empty.
# keyaliases = {}
# keyaliases['com.example.app'] = 'example'
# You can also force an app to use the same key alias as another one, using
# the @ prefix.
# keyaliases['com.example.another.plugin'] = '@com.example.another'
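Putting the two commented examples above together, a populated keyaliases
mapping might look like the following sketch. The app IDs and alias names
are illustrative placeholders only:

```python
# Hypothetical key alias overrides; app IDs and alias names are examples only.
keyaliases = {}
keyaliases['com.example.app'] = 'example'  # explicit alias for one app
# The '@' prefix means: reuse the key alias of another app ID.
keyaliases['com.example.another.plugin'] = '@com.example.another'
```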
# The full path to the root of the repository. It must be specified in
# rsync/ssh format for a remote host/path. This is used for syncing a locally
# generated repo to the server that it is hosted on. It must end in the
# standard public repo name of "/fdroid", but can be in up to three levels of
# sub-directories (e.g. /var/www/packagerepos/fdroid). You can sync to
# multiple servers by wrapping the whole thing in {} or [], and including the
# serverwebroot strings in a comma-separated list.
#
# serverwebroot = 'user@example:/var/www/fdroid'
# serverwebroot = {
# 'foo.com:/usr/share/nginx/www/fdroid',
# 'bar.info:/var/www/fdroid',
# }
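As noted above, the multi-server form can also use [] list syntax. A sketch
with placeholder hosts and paths, each ending in the standard '/fdroid':

```python
# Multiple sync targets as a Python list; hosts and paths are placeholders.
serverwebroot = [
    'foo.com:/usr/share/nginx/www/fdroid',
    'bar.info:/var/www/fdroid',
]
```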
# Any mirrors of this repo, for example all of the servers declared in
# serverwebroot, will automatically be used by the client. If one
# mirror is not working, then the client will try another. If the
# client has Tor enabled, then the client will prefer mirrors with
# .onion addresses. This base URL will be used for both the main repo
# and the archive, if it is enabled. So these URLs should end in the
# 'fdroid' base of the F-Droid part of the web server like serverwebroot.
#
# mirrors = {
# 'https://foo.bar/fdroid',
# 'http://foobarfoobarfoobar.onion/fdroid',
# }
# Optionally specify which identity file to use when running rsync over SSH
#
# identity_file = '~/.ssh/fdroid_id_rsa'
# If you are running the repo signing process on a completely offline machine,
# which provides the best security, then you can specify a folder to sync the
# repo to when running `fdroid server update`. This is most likely going to
# be a USB thumb drive, SD Card, or some other kind of removable media. Make
# sure it is mounted before running `fdroid server update`. Using the
# standard folder called 'fdroid' as the specified folder is recommended, like
# with serverwebroot.
#
# local_copy_dir = '/media/MyUSBThumbDrive/fdroid'
# If you are using local_copy_dir on an offline build/signing server, once the
# thumb drive has been plugged into the online machine, it will need to be
# synced to the copy on the online machine. To make that happen
# automatically, set sync_from_local_copy_dir to True:
#
# sync_from_local_copy_dir = True
# To upload the repo to an Amazon S3 bucket using `fdroid server update`,
# set the following options. Warning: this deletes and recreates the whole
# fdroid/ directory each time. This is based on apache-libcloud, which
# supports most cloud storage services, so it should be easy to port the
# fdroid server tools to any of them.
#
# awsbucket = 'myawsfdroid'
# awsaccesskeyid = 'SEE0CHAITHEIMAUR2USA'
# awssecretkey = 'yourverysecretkeywordpassphraserighthere'
# If you want to force 'fdroid server' to use a non-standard serverwebroot,
# set this to True.
#
# nonstandardwebroot = False
# The build logs can be posted to a mediawiki instance, like on f-droid.org.
# wiki_protocol = "http"
# wiki_server = "server"
# wiki_path = "/wiki/"
# wiki_user = "login"
# wiki_password = "1234"
# Only set this to True when running a repository where you want to generate
# stats, and only then on the master build servers, not a development
# machine.
# update_stats = True
# When used with stats, this is a list of IP addresses that are ignored for
# calculation purposes.
# stats_ignore = []
# Server that stats logs are retrieved from. Required when update_stats is True.
# stats_server = "example.com"
# SSH user that stats logs are retrieved as. Required when update_stats is True.
# stats_user = "bob"
# Use the following to push stats to a Carbon instance:
# stats_to_carbon = False
# carbon_host = '0.0.0.0'
# carbon_port = 2003
# Set this to True to always use a build server. This saves specifying the
# --server option on dedicated secure build server hosts.
# build_server_always = True
# By default, fdroid will use YAML and the custom .txt metadata formats. It
# is also possible to have metadata in JSON and XML by adding 'json' and
# 'xml'.
# accepted_formats = ['txt', 'yaml']
# Limit in number of characters that fields can take up
# Only the fields listed here are supported, defaults shown
# char_limits = {
# 'Summary': 80,
# 'Description': 4000,
# }
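Since this file is plain Python, a config loader can simply execute it and
collect the resulting names. The following is a minimal, hypothetical sketch
of that approach; fdroidserver's actual loading code may differ:

```python
# Hypothetical loader: execute a Python-syntax config file and collect
# its top-level variables into a dict.
import os
import tempfile

# Write a tiny sample config to a temp file so this sketch is self-contained.
sample = "keystorepass = 'password1'\nkeypass = 'password2'\n"
path = os.path.join(tempfile.mkdtemp(), 'config.py')
with open(path, 'w') as f:
    f.write(sample)

config = {}
with open(path) as f:
    exec(f.read(), {}, config)  # names defined in the file land in `config`

print(config['keystorepass'])
```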