fdroidserver-2.1/0000755000175000017500000000000014205260750013757 5ustar hanshans00000000000000fdroidserver-2.1/CHANGELOG.md0000644000175000017500000001670614203004041015565 0ustar hanshans00000000000000# Changelog All notable changes to this project will be documented in this file. The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/) ## [Unreleased] ## [2.0.3] - 2021-07-01 ### Fixed * Support AutoUpdateMode: Version without pattern [931](https://gitlab.com/fdroid/fdroidserver/-/merge_requests/931) ## [2.0.2] - 2021-06-01 ### Fixed * fix "ruamel round_trip_dump will be removed" ## [2.0.1] - 2021-03-09 ### Fixed * metadata: stop setting up source repo when running lint/rewritemeta * scanner: show error if scan_binary fails to run apkanalyzer * common: properly parse version from NDK's source.properties * update: stop extracting and storing XML icons, they're useless * index: raise error rather than crash on bad repo file * update: handle large, corrupt, or inaccessible fastlane/triple-t files * Update SPDX License List * checkupdates: set User-Agent to make gitlab.com happy * Run push_binary_transparency only once ## [2.0] - 2021-01-31 For a more complete overview, see the [2.0 milestone](https://gitlab.com/fdroid/fdroidserver/-/milestones/10) ### Added * `fdroid update` inserts donation links based on upstream's _FUNDING.yml_ ([!754](https://gitlab.com/fdroid/fdroidserver/merge_requests/754)) * Stable, public API for most useful functions ([!798](https://gitlab.com/fdroid/fdroidserver/merge_requests/798)) * Load with any YAML lib and use with the API, no more custom parser needed ([!826](https://gitlab.com/fdroid/fdroidserver/merge_requests/826)) ([!838](https://gitlab.com/fdroid/fdroidserver/merge_requests/838)) * _config.yml_ for a safe, easy, standard configuration format ([!663](https://gitlab.com/fdroid/fdroidserver/merge_requests/663)) * Config options can be set from environment variables using this syntax: `keystorepass: {env: keystorepass}` ([!669](https://gitlab.com/fdroid/fdroidserver/merge_requests/669)) * Add SHA256 to filename of repo graphics ([!669](https://gitlab.com/fdroid/fdroidserver/merge_requests/669)) * Support for srclibs metadata in YAML format ([!700](https://gitlab.com/fdroid/fdroidserver/merge_requests/700)) * Check srclibs and app-metadata files with yamllint ([!721](https://gitlab.com/fdroid/fdroidserver/merge_requests/721)) * Added plugin system for adding subcommands to `fdroid` ([!709](https://gitlab.com/fdroid/fdroidserver/merge_requests/709)) * `fdroid update`, `fdroid publish`, and `fdroid signindex` now work with SmartCard HSMs, specifically the NitroKey HSM ([!779](https://gitlab.com/fdroid/fdroidserver/merge_requests/779)) ([!782](https://gitlab.com/fdroid/fdroidserver/merge_requests/782)) * `fdroid update` support for Triple-T Gradle Play Publisher v2.x ([!683](https://gitlab.com/fdroid/fdroidserver/merge_requests/683)) * Translated into: bo de es fr hu it ko nb_NO pl pt pt_BR pt_PT ru sq tr uk zh_Hans zh_Hant ### Fixed * Smoother process for signing APKs with `apksigner` ([!736](https://gitlab.com/fdroid/fdroidserver/merge_requests/736)) ([!821](https://gitlab.com/fdroid/fdroidserver/merge_requests/821)) * `apksigner` is used by default on new repos * All parts except _build_ and _publish_ work without the Android SDK ([!821](https://gitlab.com/fdroid/fdroidserver/merge_requests/821)) * Description: is now passed to clients unchanged, no HTML conversion ([!828](https://gitlab.com/fdroid/fdroidserver/merge_requests/828)) * Lots of improvements for scanning for proprietary code and trackers ([!748](https://gitlab.com/fdroid/fdroidserver/merge_requests/748)) ([!REPLACE](https://gitlab.com/fdroid/fdroidserver/merge_requests/REPLACE)) ([!844](https://gitlab.com/fdroid/fdroidserver/merge_requests/844)) * `fdroid mirror` now generates complete, working local mirror repos * fix build-logs dissapearing when deploying ([!685](https://gitlab.com/fdroid/fdroidserver/merge_requests/685)) * do not crash when system encoding can not be retrieved ([!671](https://gitlab.com/fdroid/fdroidserver/merge_requests/671)) * checkupdates: UpdateCheckIngore gets properly observed now ([!659](https://gitlab.com/fdroid/fdroidserver/merge_requests/659), [!660](https://gitlab.com/fdroid/fdroidserver/merge_requests/660)) * keep yaml metadata when rewrite failed ([!658](https://gitlab.com/fdroid/fdroidserver/merge_requests/658)) * import: `template.yml` now supports omitting values ([!657](https://gitlab.com/fdroid/fdroidserver/merge_requests/657)) * build: deploying buildlogs with rsync ([!651](https://gitlab.com/fdroid/fdroidserver/merge_requests/651)) * `fdroid init` generates PKCS12 keystores, drop Java < 8 support ([!801](https://gitlab.com/fdroid/fdroidserver/-/merge_requests/801)) * Parse Version Codes specified in hex ([!692](https://gitlab.com/fdroid/fdroidserver/-/merge_requests/692)) * Major refactoring on core parts of code to be more Pythonic ([!756](https://gitlab.com/fdroid/fdroidserver/-/merge_requests/756)) * `fdroid init` now works when installed with pip ### Removed * Removed all support for _.txt_ and _.json_ metadata ([!772](https://gitlab.com/fdroid/fdroidserver/-/merge_requests/772)) * dropped support for Debian 8 _jessie_ and 9 _stretch_ * dropped support for Ubuntu releases older than bionic 18.04 * dropped `fdroid server update` and `fdroid server init`, use `fdroid deploy` * `fdroid dscanner` was removed. ([!711](https://gitlab.com/fdroid/fdroidserver/-/merge_requests/711)) * `make_current_version_link` is now off by default * Dropped `force_build_tools` config option ([!797](https://gitlab.com/fdroid/fdroidserver/-/merge_requests/797)) * Dropped `accepted_formats` config option, there is only _.yml_ now ([!818](https://gitlab.com/fdroid/fdroidserver/-/merge_requests/818)) * `Provides:` was removed as a metadata field ([!654](https://gitlab.com/fdroid/fdroidserver/-/merge_requests/654)) * Remove unused `latestapps.dat` ([!794](https://gitlab.com/fdroid/fdroidserver/-/merge_requests/794)) ## [1.1.4] - 2019-08-15 ### Fixed * include bitcoin validation regex required by fdroiddata * merged Debian patches to fix test suite there ## [1.1.3] - 2019-07-03 ### Fixed * fixed test suite when run from source tarball * fixed test runs in Debian ## [1.1.2] - 2019-03-29 ### Fixed * fix bug while downloading repo index ([!636](https://gitlab.com/fdroid/fdroidserver/merge_requests/636)) ## [1.1.1] - 2019-02-03 ### Fixed * support APK Signature v2 and v3 * all SDK Version values are output as integers in the index JSON * take graphics from Fastlane dirs using any valid RFC5646 locale * print warning if not running in UTF-8 encoding * fdroid build: hide --on-server cli flag ## [1.1] - 2019-01-28 ### Fixed * a huge update with many fixes and new features: https://gitlab.com/fdroid/fdroidserver/milestones/7 * can run without and Android SDK installed * much more reliable operation with large binary APK collections * sync all translations, including newly added languages: hu it ko pl pt_PT ru * many security fixes, based on the security audit * NoSourceSince automatically adds SourceGone Anti-Feature * aapt scraping works with all known aapt versions * smoother mirror setups * much faster `fdroid update` when using androguard [Unreleased]: https://gitlab.com/fdroid/fdroidserver/compare/1.1.4...master [1.1.4]: https://gitlab.com/fdroid/fdroidserver/compare/1.1.3...1.1.4 [1.1.3]: https://gitlab.com/fdroid/fdroidserver/compare/1.1.2...1.1.3 [1.1.2]: https://gitlab.com/fdroid/fdroidserver/compare/1.1.1...1.1.2 [1.1.1]: https://gitlab.com/fdroid/fdroidserver/compare/1.1...1.1.1 [1.1]: https://gitlab.com/fdroid/fdroidserver/tags/1.1 fdroidserver-2.1/LICENSE0000644000175000017500000010333014203004041014747 0ustar hanshans00000000000000 GNU AFFERO GENERAL PUBLIC LICENSE Version 3, 19 November 2007 Copyright (C) 2007 Free Software Foundation, Inc. Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed. Preamble The GNU Affero General Public License is a free, copyleft license for software and other kinds of works, specifically designed to ensure cooperation with the community in the case of network server software. The licenses for most software and other practical works are designed to take away your freedom to share and change the works. By contrast, our General Public Licenses are intended to guarantee your freedom to share and change all versions of a program--to make sure it remains free software for all its users. When we speak of free software, we are referring to freedom, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for them if you wish), that you receive source code or can get it if you want it, that you can change the software or use pieces of it in new free programs, and that you know you can do these things. Developers that use our General Public Licenses protect your rights with two steps: (1) assert copyright on the software, and (2) offer you this License which gives you legal permission to copy, distribute and/or modify the software. A secondary benefit of defending all users' freedom is that improvements made in alternate versions of the program, if they receive widespread use, become available for other developers to incorporate. Many developers of free software are heartened and encouraged by the resulting cooperation. However, in the case of software used on network servers, this result may fail to come about. The GNU General Public License permits making a modified version and letting the public access it on a server without ever releasing its source code to the public. The GNU Affero General Public License is designed specifically to ensure that, in such cases, the modified source code becomes available to the community. It requires the operator of a network server to provide the source code of the modified version running there to the users of that server. Therefore, public use of a modified version, on a publicly accessible server, gives the public access to the source code of the modified version. An older license, called the Affero General Public License and published by Affero, was designed to accomplish similar goals. This is a different license, not a version of the Affero GPL, but Affero has released a new version of the Affero GPL which permits relicensing under this license. The precise terms and conditions for copying, distribution and modification follow. TERMS AND CONDITIONS 0. Definitions. "This License" refers to version 3 of the GNU Affero General Public License. "Copyright" also means copyright-like laws that apply to other kinds of works, such as semiconductor masks. "The Program" refers to any copyrightable work licensed under this License. Each licensee is addressed as "you". "Licensees" and "recipients" may be individuals or organizations. To "modify" a work means to copy from or adapt all or part of the work in a fashion requiring copyright permission, other than the making of an exact copy. The resulting work is called a "modified version" of the earlier work or a work "based on" the earlier work. A "covered work" means either the unmodified Program or a work based on the Program. To "propagate" a work means to do anything with it that, without permission, would make you directly or secondarily liable for infringement under applicable copyright law, except executing it on a computer or modifying a private copy. Propagation includes copying, distribution (with or without modification), making available to the public, and in some countries other activities as well. To "convey" a work means any kind of propagation that enables other parties to make or receive copies. Mere interaction with a user through a computer network, with no transfer of a copy, is not conveying. An interactive user interface displays "Appropriate Legal Notices" to the extent that it includes a convenient and prominently visible feature that (1) displays an appropriate copyright notice, and (2) tells the user that there is no warranty for the work (except to the extent that warranties are provided), that licensees may convey the work under this License, and how to view a copy of this License. If the interface presents a list of user commands or options, such as a menu, a prominent item in the list meets this criterion. 1. Source Code. The "source code" for a work means the preferred form of the work for making modifications to it. "Object code" means any non-source form of a work. A "Standard Interface" means an interface that either is an official standard defined by a recognized standards body, or, in the case of interfaces specified for a particular programming language, one that is widely used among developers working in that language. The "System Libraries" of an executable work include anything, other than the work as a whole, that (a) is included in the normal form of packaging a Major Component, but which is not part of that Major Component, and (b) serves only to enable use of the work with that Major Component, or to implement a Standard Interface for which an implementation is available to the public in source code form. A "Major Component", in this context, means a major essential component (kernel, window system, and so on) of the specific operating system (if any) on which the executable work runs, or a compiler used to produce the work, or an object code interpreter used to run it. The "Corresponding Source" for a work in object code form means all the source code needed to generate, install, and (for an executable work) run the object code and to modify the work, including scripts to control those activities. However, it does not include the work's System Libraries, or general-purpose tools or generally available free programs which are used unmodified in performing those activities but which are not part of the work. For example, Corresponding Source includes interface definition files associated with source files for the work, and the source code for shared libraries and dynamically linked subprograms that the work is specifically designed to require, such as by intimate data communication or control flow between those subprograms and other parts of the work. The Corresponding Source need not include anything that users can regenerate automatically from other parts of the Corresponding Source. The Corresponding Source for a work in source code form is that same work. 2. Basic Permissions. All rights granted under this License are granted for the term of copyright on the Program, and are irrevocable provided the stated conditions are met. This License explicitly affirms your unlimited permission to run the unmodified Program. The output from running a covered work is covered by this License only if the output, given its content, constitutes a covered work. This License acknowledges your rights of fair use or other equivalent, as provided by copyright law. You may make, run and propagate covered works that you do not convey, without conditions so long as your license otherwise remains in force. You may convey covered works to others for the sole purpose of having them make modifications exclusively for you, or provide you with facilities for running those works, provided that you comply with the terms of this License in conveying all material for which you do not control copyright. Those thus making or running the covered works for you must do so exclusively on your behalf, under your direction and control, on terms that prohibit them from making any copies of your copyrighted material outside their relationship with you. Conveying under any other circumstances is permitted solely under the conditions stated below. Sublicensing is not allowed; section 10 makes it unnecessary. 3. Protecting Users' Legal Rights From Anti-Circumvention Law. No covered work shall be deemed part of an effective technological measure under any applicable law fulfilling obligations under article 11 of the WIPO copyright treaty adopted on 20 December 1996, or similar laws prohibiting or restricting circumvention of such measures. When you convey a covered work, you waive any legal power to forbid circumvention of technological measures to the extent such circumvention is effected by exercising rights under this License with respect to the covered work, and you disclaim any intention to limit operation or modification of the work as a means of enforcing, against the work's users, your or third parties' legal rights to forbid circumvention of technological measures. 4. Conveying Verbatim Copies. You may convey verbatim copies of the Program's source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice; keep intact all notices stating that this License and any non-permissive terms added in accord with section 7 apply to the code; keep intact all notices of the absence of any warranty; and give all recipients a copy of this License along with the Program. You may charge any price or no price for each copy that you convey, and you may offer support or warranty protection for a fee. 5. Conveying Modified Source Versions. You may convey a work based on the Program, or the modifications to produce it from the Program, in the form of source code under the terms of section 4, provided that you also meet all of these conditions: a) The work must carry prominent notices stating that you modified it, and giving a relevant date. b) The work must carry prominent notices stating that it is released under this License and any conditions added under section 7. This requirement modifies the requirement in section 4 to "keep intact all notices". c) You must license the entire work, as a whole, under this License to anyone who comes into possession of a copy. This License will therefore apply, along with any applicable section 7 additional terms, to the whole of the work, and all its parts, regardless of how they are packaged. This License gives no permission to license the work in any other way, but it does not invalidate such permission if you have separately received it. d) If the work has interactive user interfaces, each must display Appropriate Legal Notices; however, if the Program has interactive interfaces that do not display Appropriate Legal Notices, your work need not make them do so. A compilation of a covered work with other separate and independent works, which are not by their nature extensions of the covered work, and which are not combined with it such as to form a larger program, in or on a volume of a storage or distribution medium, is called an "aggregate" if the compilation and its resulting copyright are not used to limit the access or legal rights of the compilation's users beyond what the individual works permit. Inclusion of a covered work in an aggregate does not cause this License to apply to the other parts of the aggregate. 6. Conveying Non-Source Forms. You may convey a covered work in object code form under the terms of sections 4 and 5, provided that you also convey the machine-readable Corresponding Source under the terms of this License, in one of these ways: a) Convey the object code in, or embodied in, a physical product (including a physical distribution medium), accompanied by the Corresponding Source fixed on a durable physical medium customarily used for software interchange. b) Convey the object code in, or embodied in, a physical product (including a physical distribution medium), accompanied by a written offer, valid for at least three years and valid for as long as you offer spare parts or customer support for that product model, to give anyone who possesses the object code either (1) a copy of the Corresponding Source for all the software in the product that is covered by this License, on a durable physical medium customarily used for software interchange, for a price no more than your reasonable cost of physically performing this conveying of source, or (2) access to copy the Corresponding Source from a network server at no charge. c) Convey individual copies of the object code with a copy of the written offer to provide the Corresponding Source. This alternative is allowed only occasionally and noncommercially, and only if you received the object code with such an offer, in accord with subsection 6b. d) Convey the object code by offering access from a designated place (gratis or for a charge), and offer equivalent access to the Corresponding Source in the same way through the same place at no further charge. You need not require recipients to copy the Corresponding Source along with the object code. If the place to copy the object code is a network server, the Corresponding Source may be on a different server (operated by you or a third party) that supports equivalent copying facilities, provided you maintain clear directions next to the object code saying where to find the Corresponding Source. Regardless of what server hosts the Corresponding Source, you remain obligated to ensure that it is available for as long as needed to satisfy these requirements. e) Convey the object code using peer-to-peer transmission, provided you inform other peers where the object code and Corresponding Source of the work are being offered to the general public at no charge under subsection 6d. A separable portion of the object code, whose source code is excluded from the Corresponding Source as a System Library, need not be included in conveying the object code work. A "User Product" is either (1) a "consumer product", which means any tangible personal property which is normally used for personal, family, or household purposes, or (2) anything designed or sold for incorporation into a dwelling. In determining whether a product is a consumer product, doubtful cases shall be resolved in favor of coverage. For a particular product received by a particular user, "normally used" refers to a typical or common use of that class of product, regardless of the status of the particular user or of the way in which the particular user actually uses, or expects or is expected to use, the product. A product is a consumer product regardless of whether the product has substantial commercial, industrial or non-consumer uses, unless such uses represent the only significant mode of use of the product. "Installation Information" for a User Product means any methods, procedures, authorization keys, or other information required to install and execute modified versions of a covered work in that User Product from a modified version of its Corresponding Source. The information must suffice to ensure that the continued functioning of the modified object code is in no case prevented or interfered with solely because modification has been made. If you convey an object code work under this section in, or with, or specifically for use in, a User Product, and the conveying occurs as part of a transaction in which the right of possession and use of the User Product is transferred to the recipient in perpetuity or for a fixed term (regardless of how the transaction is characterized), the Corresponding Source conveyed under this section must be accompanied by the Installation Information. But this requirement does not apply if neither you nor any third party retains the ability to install modified object code on the User Product (for example, the work has been installed in ROM). The requirement to provide Installation Information does not include a requirement to continue to provide support service, warranty, or updates for a work that has been modified or installed by the recipient, or for the User Product in which it has been modified or installed. Access to a network may be denied when the modification itself materially and adversely affects the operation of the network or violates the rules and protocols for communication across the network. Corresponding Source conveyed, and Installation Information provided, in accord with this section must be in a format that is publicly documented (and with an implementation available to the public in source code form), and must require no special password or key for unpacking, reading or copying. 7. Additional Terms. "Additional permissions" are terms that supplement the terms of this License by making exceptions from one or more of its conditions. Additional permissions that are applicable to the entire Program shall be treated as though they were included in this License, to the extent that they are valid under applicable law. If additional permissions apply only to part of the Program, that part may be used separately under those permissions, but the entire Program remains governed by this License without regard to the additional permissions. When you convey a copy of a covered work, you may at your option remove any additional permissions from that copy, or from any part of it. (Additional permissions may be written to require their own removal in certain cases when you modify the work.) You may place additional permissions on material, added by you to a covered work, for which you have or can give appropriate copyright permission. Notwithstanding any other provision of this License, for material you add to a covered work, you may (if authorized by the copyright holders of that material) supplement the terms of this License with terms: a) Disclaiming warranty or limiting liability differently from the terms of sections 15 and 16 of this License; or b) Requiring preservation of specified reasonable legal notices or author attributions in that material or in the Appropriate Legal Notices displayed by works containing it; or c) Prohibiting misrepresentation of the origin of that material, or requiring that modified versions of such material be marked in reasonable ways as different from the original version; or d) Limiting the use for publicity purposes of names of licensors or authors of the material; or e) Declining to grant rights under trademark law for use of some trade names, trademarks, or service marks; or f) Requiring indemnification of licensors and authors of that material by anyone who conveys the material (or modified versions of it) with contractual assumptions of liability to the recipient, for any liability that these contractual assumptions directly impose on those licensors and authors. All other non-permissive additional terms are considered "further restrictions" within the meaning of section 10. If the Program as you received it, or any part of it, contains a notice stating that it is governed by this License along with a term that is a further restriction, you may remove that term. If a license document contains a further restriction but permits relicensing or conveying under this License, you may add to a covered work material governed by the terms of that license document, provided that the further restriction does not survive such relicensing or conveying. If you add terms to a covered work in accord with this section, you must place, in the relevant source files, a statement of the additional terms that apply to those files, or a notice indicating where to find the applicable terms. Additional terms, permissive or non-permissive, may be stated in the form of a separately written license, or stated as exceptions; the above requirements apply either way. 8. Termination. You may not propagate or modify a covered work except as expressly provided under this License. Any attempt otherwise to propagate or modify it is void, and will automatically terminate your rights under this License (including any patent licenses granted under the third paragraph of section 11). However, if you cease all violation of this License, then your license from a particular copyright holder is reinstated (a) provisionally, unless and until the copyright holder explicitly and finally terminates your license, and (b) permanently, if the copyright holder fails to notify you of the violation by some reasonable means prior to 60 days after the cessation. Moreover, your license from a particular copyright holder is reinstated permanently if the copyright holder notifies you of the violation by some reasonable means, this is the first time you have received notice of violation of this License (for any work) from that copyright holder, and you cure the violation prior to 30 days after your receipt of the notice. Termination of your rights under this section does not terminate the licenses of parties who have received copies or rights from you under this License. If your rights have been terminated and not permanently reinstated, you do not qualify to receive new licenses for the same material under section 10. 9. Acceptance Not Required for Having Copies. You are not required to accept this License in order to receive or run a copy of the Program. Ancillary propagation of a covered work occurring solely as a consequence of using peer-to-peer transmission to receive a copy likewise does not require acceptance. However, nothing other than this License grants you permission to propagate or modify any covered work. These actions infringe copyright if you do not accept this License. Therefore, by modifying or propagating a covered work, you indicate your acceptance of this License to do so. 10. Automatic Licensing of Downstream Recipients. Each time you convey a covered work, the recipient automatically receives a license from the original licensors, to run, modify and propagate that work, subject to this License. You are not responsible for enforcing compliance by third parties with this License. An "entity transaction" is a transaction transferring control of an organization, or substantially all assets of one, or subdividing an organization, or merging organizations. If propagation of a covered work results from an entity transaction, each party to that transaction who receives a copy of the work also receives whatever licenses to the work the party's predecessor in interest had or could give under the previous paragraph, plus a right to possession of the Corresponding Source of the work from the predecessor in interest, if the predecessor has it or can get it with reasonable efforts. You may not impose any further restrictions on the exercise of the rights granted or affirmed under this License. For example, you may not impose a license fee, royalty, or other charge for exercise of rights granted under this License, and you may not initiate litigation (including a cross-claim or counterclaim in a lawsuit) alleging that any patent claim is infringed by making, using, selling, offering for sale, or importing the Program or any portion of it. 11. Patents. A "contributor" is a copyright holder who authorizes use under this License of the Program or a work on which the Program is based. The work thus licensed is called the contributor's "contributor version". A contributor's "essential patent claims" are all patent claims owned or controlled by the contributor, whether already acquired or hereafter acquired, that would be infringed by some manner, permitted by this License, of making, using, or selling its contributor version, but do not include claims that would be infringed only as a consequence of further modification of the contributor version. For purposes of this definition, "control" includes the right to grant patent sublicenses in a manner consistent with the requirements of this License. Each contributor grants you a non-exclusive, worldwide, royalty-free patent license under the contributor's essential patent claims, to make, use, sell, offer for sale, import and otherwise run, modify and propagate the contents of its contributor version. In the following three paragraphs, a "patent license" is any express agreement or commitment, however denominated, not to enforce a patent (such as an express permission to practice a patent or covenant not to sue for patent infringement). To "grant" such a patent license to a party means to make such an agreement or commitment not to enforce a patent against the party. If you convey a covered work, knowingly relying on a patent license, and the Corresponding Source of the work is not available for anyone to copy, free of charge and under the terms of this License, through a publicly available network server or other readily accessible means, then you must either (1) cause the Corresponding Source to be so available, or (2) arrange to deprive yourself of the benefit of the patent license for this particular work, or (3) arrange, in a manner consistent with the requirements of this License, to extend the patent license to downstream recipients. "Knowingly relying" means you have actual knowledge that, but for the patent license, your conveying the covered work in a country, or your recipient's use of the covered work in a country, would infringe one or more identifiable patents in that country that you have reason to believe are valid. If, pursuant to or in connection with a single transaction or arrangement, you convey, or propagate by procuring conveyance of, a covered work, and grant a patent license to some of the parties receiving the covered work authorizing them to use, propagate, modify or convey a specific copy of the covered work, then the patent license you grant is automatically extended to all recipients of the covered work and works based on it. A patent license is "discriminatory" if it does not include within the scope of its coverage, prohibits the exercise of, or is conditioned on the non-exercise of one or more of the rights that are specifically granted under this License. You may not convey a covered work if you are a party to an arrangement with a third party that is in the business of distributing software, under which you make payment to the third party based on the extent of your activity of conveying the work, and under which the third party grants, to any of the parties who would receive the covered work from you, a discriminatory patent license (a) in connection with copies of the covered work conveyed by you (or copies made from those copies), or (b) primarily for and in connection with specific products or compilations that contain the covered work, unless you entered into that arrangement, or that patent license was granted, prior to 28 March 2007. Nothing in this License shall be construed as excluding or limiting any implied license or other defenses to infringement that may otherwise be available to you under applicable patent law. 12. No Surrender of Others' Freedom. If conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. If you cannot convey a covered work so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not convey it at all. For example, if you agree to terms that obligate you to collect a royalty for further conveying from those to whom you convey the Program, the only way you could satisfy both those terms and this License would be to refrain entirely from conveying the Program. 13. Remote Network Interaction; Use with the GNU General Public License. Notwithstanding any other provision of this License, if you modify the Program, your modified version must prominently offer all users interacting with it remotely through a computer network (if your version supports such interaction) an opportunity to receive the Corresponding Source of your version by providing access to the Corresponding Source from a network server at no charge, through some standard or customary means of facilitating copying of software. This Corresponding Source shall include the Corresponding Source for any work covered by version 3 of the GNU General Public License that is incorporated pursuant to the following paragraph. Notwithstanding any other provision of this License, you have permission to link or combine any covered work with a work licensed under version 3 of the GNU General Public License into a single combined work, and to convey the resulting work. The terms of this License will continue to apply to the part which is the covered work, but the work with which it is combined will remain governed by version 3 of the GNU General Public License. 14. Revised Versions of this License. The Free Software Foundation may publish revised and/or new versions of the GNU Affero General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns. Each version is given a distinguishing version number. If the Program specifies that a certain numbered version of the GNU Affero General Public License "or any later version" applies to it, you have the option of following the terms and conditions either of that numbered version or of any later version published by the Free Software Foundation. If the Program does not specify a version number of the GNU Affero General Public License, you may choose any version ever published by the Free Software Foundation. If the Program specifies that a proxy can decide which future versions of the GNU Affero General Public License can be used, that proxy's public statement of acceptance of a version permanently authorizes you to choose that version for the Program. Later license versions may give you additional or different permissions. However, no additional obligations are imposed on any author or copyright holder as a result of your choosing to follow a later version. 15. Disclaimer of Warranty. THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION. 16. Limitation of Liability. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. 17. Interpretation of Sections 15 and 16. If the disclaimer of warranty and limitation of liability provided above cannot be given local legal effect according to their terms, reviewing courts shall apply local law that most closely approximates an absolute waiver of all civil liability in connection with the Program, unless a warranty or assumption of liability accompanies a copy of the Program in return for a fee. END OF TERMS AND CONDITIONS How to Apply These Terms to Your New Programs If you develop a new program, and you want it to be of the greatest possible use to the public, the best way to achieve this is to make it free software which everyone can redistribute and change under these terms. To do so, attach the following notices to the program. It is safest to attach them to the start of each source file to most effectively state the exclusion of warranty; and each file should have at least the "copyright" line and a pointer to where the full notice is found. Copyright (C) This program is free software: you can redistribute it and/or modify it under the terms of the GNU Affero General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Affero General Public License for more details. You should have received a copy of the GNU Affero General Public License along with this program. If not, see . Also add information on how to contact you by electronic and paper mail. If your software can interact with users remotely through a computer network, you should also make sure that it provides a way for users to get its source. For example, if your program is a web application, its interface could display a "Source" link that leads users to an archive of the code. There are many ways you could offer source, and different solutions will be better for different programs; see section 13 for the specific requirements. You should also get your employer (if you work as a programmer) or school, if any, to sign a "copyright disclaimer" for the program, if necessary. For more information on this, and how to apply and follow the GNU AGPL, see . fdroidserver-2.1/MANIFEST.in0000644000175000017500000016406514203004041015514 0ustar hanshans00000000000000include buildserver/config.buildserver.yml include buildserver/provision-android-ndk include buildserver/provision-android-sdk include buildserver/provision-apt-get-install include buildserver/provision-apt-proxy include buildserver/provision-gradle include buildserver/setup-env-vars include buildserver/Vagrantfile include CHANGELOG.md include completion/bash-completion include examples/config.yml include examples/fdroid_exportkeystore.py include examples/fdroid_export_keystore_to_nitrokey.py include examples/fdroid_extract_repo_pubkey.py include examples/fdroid_fetchsrclibs.py include examples/fdroid_nitrokeyimport.py include examples/makebuildserver.config.py include examples/opensc-fdroid.cfg include examples/public-read-only-s3-bucket-policy.json include examples/template.yml include gradlew-fdroid include LICENSE include locale/bo/LC_MESSAGES/fdroidserver.po include locale/de/LC_MESSAGES/fdroidserver.po include locale/es/LC_MESSAGES/fdroidserver.po include locale/fr/LC_MESSAGES/fdroidserver.po include locale/hu/LC_MESSAGES/fdroidserver.po include locale/it/LC_MESSAGES/fdroidserver.po include locale/ko/LC_MESSAGES/fdroidserver.po include locale/nb_NO/LC_MESSAGES/fdroidserver.po include locale/pl/LC_MESSAGES/fdroidserver.po include locale/pt/LC_MESSAGES/fdroidserver.po include locale/pt_BR/LC_MESSAGES/fdroidserver.po include locale/pt_PT/LC_MESSAGES/fdroidserver.po include locale/ro/LC_MESSAGES/fdroidserver.po include locale/ru/LC_MESSAGES/fdroidserver.po include locale/sq/LC_MESSAGES/fdroidserver.po include locale/tr/LC_MESSAGES/fdroidserver.po include locale/uk/LC_MESSAGES/fdroidserver.po include locale/zh_Hans/LC_MESSAGES/fdroidserver.po include locale/zh_Hant/LC_MESSAGES/fdroidserver.po include makebuildserver include README.md include tests/androguard_test.py include tests/bad-unicode-*.apk include tests/build.TestCase include tests/build-tools/17.0.0/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/17.0.0/aapt-output-com.politedroid_3.txt include tests/build-tools/17.0.0/aapt-output-com.politedroid_4.txt include tests/build-tools/17.0.0/aapt-output-com.politedroid_5.txt include tests/build-tools/17.0.0/aapt-output-com.politedroid_6.txt include tests/build-tools/17.0.0/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/17.0.0/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/17.0.0/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/17.0.0/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/17.0.0/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/17.0.0/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/17.0.0/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/17.0.0/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/17.0.0/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/17.0.0/aapt-output-souch.smsbypass_9.txt include tests/build-tools/18.1.1/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/18.1.1/aapt-output-com.politedroid_3.txt include tests/build-tools/18.1.1/aapt-output-com.politedroid_4.txt include tests/build-tools/18.1.1/aapt-output-com.politedroid_5.txt include tests/build-tools/18.1.1/aapt-output-com.politedroid_6.txt include tests/build-tools/18.1.1/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/18.1.1/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/18.1.1/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/18.1.1/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/18.1.1/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/18.1.1/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/18.1.1/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/18.1.1/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/18.1.1/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/18.1.1/aapt-output-souch.smsbypass_9.txt include tests/build-tools/19.0.0/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/19.0.0/aapt-output-com.politedroid_3.txt include tests/build-tools/19.0.0/aapt-output-com.politedroid_4.txt include tests/build-tools/19.0.0/aapt-output-com.politedroid_5.txt include tests/build-tools/19.0.0/aapt-output-com.politedroid_6.txt include tests/build-tools/19.0.0/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/19.0.0/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/19.0.0/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/19.0.0/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/19.0.0/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/19.0.0/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/19.0.0/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/19.0.0/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/19.0.0/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/19.0.0/aapt-output-souch.smsbypass_9.txt include tests/build-tools/19.1.0/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/19.1.0/aapt-output-com.politedroid_3.txt include tests/build-tools/19.1.0/aapt-output-com.politedroid_4.txt include tests/build-tools/19.1.0/aapt-output-com.politedroid_5.txt include tests/build-tools/19.1.0/aapt-output-com.politedroid_6.txt include tests/build-tools/19.1.0/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/19.1.0/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/19.1.0/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/19.1.0/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/19.1.0/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/19.1.0/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/19.1.0/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/19.1.0/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/19.1.0/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/19.1.0/aapt-output-souch.smsbypass_9.txt include tests/build-tools/20.0.0/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/20.0.0/aapt-output-com.politedroid_3.txt include tests/build-tools/20.0.0/aapt-output-com.politedroid_4.txt include tests/build-tools/20.0.0/aapt-output-com.politedroid_5.txt include tests/build-tools/20.0.0/aapt-output-com.politedroid_6.txt include tests/build-tools/20.0.0/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/20.0.0/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/20.0.0/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/20.0.0/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/20.0.0/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/20.0.0/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/20.0.0/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/20.0.0/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/20.0.0/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/20.0.0/aapt-output-souch.smsbypass_9.txt include tests/build-tools/21.1.1/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/21.1.1/aapt-output-com.politedroid_3.txt include tests/build-tools/21.1.1/aapt-output-com.politedroid_4.txt include tests/build-tools/21.1.1/aapt-output-com.politedroid_5.txt include tests/build-tools/21.1.1/aapt-output-com.politedroid_6.txt include tests/build-tools/21.1.1/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/21.1.1/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/21.1.1/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/21.1.1/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/21.1.1/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/21.1.1/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/21.1.1/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/21.1.1/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/21.1.1/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/21.1.1/aapt-output-souch.smsbypass_9.txt include tests/build-tools/21.1.2/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/21.1.2/aapt-output-com.politedroid_3.txt include tests/build-tools/21.1.2/aapt-output-com.politedroid_4.txt include tests/build-tools/21.1.2/aapt-output-com.politedroid_5.txt include tests/build-tools/21.1.2/aapt-output-com.politedroid_6.txt include tests/build-tools/21.1.2/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/21.1.2/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/21.1.2/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/21.1.2/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/21.1.2/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/21.1.2/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/21.1.2/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/21.1.2/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/21.1.2/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/21.1.2/aapt-output-souch.smsbypass_9.txt include tests/build-tools/22.0.0/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/22.0.0/aapt-output-com.politedroid_3.txt include tests/build-tools/22.0.0/aapt-output-com.politedroid_4.txt include tests/build-tools/22.0.0/aapt-output-com.politedroid_5.txt include tests/build-tools/22.0.0/aapt-output-com.politedroid_6.txt include tests/build-tools/22.0.0/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/22.0.0/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/22.0.0/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/22.0.0/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/22.0.0/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/22.0.0/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/22.0.0/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/22.0.0/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/22.0.0/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/22.0.0/aapt-output-souch.smsbypass_9.txt include tests/build-tools/22.0.1/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/22.0.1/aapt-output-com.politedroid_3.txt include tests/build-tools/22.0.1/aapt-output-com.politedroid_4.txt include tests/build-tools/22.0.1/aapt-output-com.politedroid_5.txt include tests/build-tools/22.0.1/aapt-output-com.politedroid_6.txt include tests/build-tools/22.0.1/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/22.0.1/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/22.0.1/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/22.0.1/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/22.0.1/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/22.0.1/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/22.0.1/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/22.0.1/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/22.0.1/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/22.0.1/aapt-output-souch.smsbypass_9.txt include tests/build-tools/23.0.0/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/23.0.0/aapt-output-com.politedroid_3.txt include tests/build-tools/23.0.0/aapt-output-com.politedroid_4.txt include tests/build-tools/23.0.0/aapt-output-com.politedroid_5.txt include tests/build-tools/23.0.0/aapt-output-com.politedroid_6.txt include tests/build-tools/23.0.0/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/23.0.0/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/23.0.0/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/23.0.0/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/23.0.0/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/23.0.0/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/23.0.0/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/23.0.0/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/23.0.0/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/23.0.0/aapt-output-souch.smsbypass_9.txt include tests/build-tools/23.0.1/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/23.0.1/aapt-output-com.politedroid_3.txt include tests/build-tools/23.0.1/aapt-output-com.politedroid_4.txt include tests/build-tools/23.0.1/aapt-output-com.politedroid_5.txt include tests/build-tools/23.0.1/aapt-output-com.politedroid_6.txt include tests/build-tools/23.0.1/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/23.0.1/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/23.0.1/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/23.0.1/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/23.0.1/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/23.0.1/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/23.0.1/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/23.0.1/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/23.0.1/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/23.0.1/aapt-output-souch.smsbypass_9.txt include tests/build-tools/23.0.2/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/23.0.2/aapt-output-com.politedroid_3.txt include tests/build-tools/23.0.2/aapt-output-com.politedroid_4.txt include tests/build-tools/23.0.2/aapt-output-com.politedroid_5.txt include tests/build-tools/23.0.2/aapt-output-com.politedroid_6.txt include tests/build-tools/23.0.2/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/23.0.2/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/23.0.2/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/23.0.2/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/23.0.2/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/23.0.2/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/23.0.2/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/23.0.2/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/23.0.2/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/23.0.2/aapt-output-souch.smsbypass_9.txt include tests/build-tools/23.0.3/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/23.0.3/aapt-output-com.politedroid_3.txt include tests/build-tools/23.0.3/aapt-output-com.politedroid_4.txt include tests/build-tools/23.0.3/aapt-output-com.politedroid_5.txt include tests/build-tools/23.0.3/aapt-output-com.politedroid_6.txt include tests/build-tools/23.0.3/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/23.0.3/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/23.0.3/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/23.0.3/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/23.0.3/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/23.0.3/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/23.0.3/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/23.0.3/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/23.0.3/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/23.0.3/aapt-output-souch.smsbypass_9.txt include tests/build-tools/24.0.0/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/24.0.0/aapt-output-com.politedroid_3.txt include tests/build-tools/24.0.0/aapt-output-com.politedroid_4.txt include tests/build-tools/24.0.0/aapt-output-com.politedroid_5.txt include tests/build-tools/24.0.0/aapt-output-com.politedroid_6.txt include tests/build-tools/24.0.0/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/24.0.0/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/24.0.0/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/24.0.0/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/24.0.0/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/24.0.0/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/24.0.0/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/24.0.0/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/24.0.0/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/24.0.0/aapt-output-souch.smsbypass_9.txt include tests/build-tools/24.0.1/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/24.0.1/aapt-output-com.politedroid_3.txt include tests/build-tools/24.0.1/aapt-output-com.politedroid_4.txt include tests/build-tools/24.0.1/aapt-output-com.politedroid_5.txt include tests/build-tools/24.0.1/aapt-output-com.politedroid_6.txt include tests/build-tools/24.0.1/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/24.0.1/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/24.0.1/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/24.0.1/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/24.0.1/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/24.0.1/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/24.0.1/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/24.0.1/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/24.0.1/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/24.0.1/aapt-output-souch.smsbypass_9.txt include tests/build-tools/24.0.2/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/24.0.2/aapt-output-com.politedroid_3.txt include tests/build-tools/24.0.2/aapt-output-com.politedroid_4.txt include tests/build-tools/24.0.2/aapt-output-com.politedroid_5.txt include tests/build-tools/24.0.2/aapt-output-com.politedroid_6.txt include tests/build-tools/24.0.2/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/24.0.2/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/24.0.2/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/24.0.2/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/24.0.2/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/24.0.2/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/24.0.2/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/24.0.2/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/24.0.2/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/24.0.2/aapt-output-souch.smsbypass_9.txt include tests/build-tools/24.0.3/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/24.0.3/aapt-output-com.politedroid_3.txt include tests/build-tools/24.0.3/aapt-output-com.politedroid_4.txt include tests/build-tools/24.0.3/aapt-output-com.politedroid_5.txt include tests/build-tools/24.0.3/aapt-output-com.politedroid_6.txt include tests/build-tools/24.0.3/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/24.0.3/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/24.0.3/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/24.0.3/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/24.0.3/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/24.0.3/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/24.0.3/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/24.0.3/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/24.0.3/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/24.0.3/aapt-output-souch.smsbypass_9.txt include tests/build-tools/25.0.0/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/25.0.0/aapt-output-com.politedroid_3.txt include tests/build-tools/25.0.0/aapt-output-com.politedroid_4.txt include tests/build-tools/25.0.0/aapt-output-com.politedroid_5.txt include tests/build-tools/25.0.0/aapt-output-com.politedroid_6.txt include tests/build-tools/25.0.0/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/25.0.0/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/25.0.0/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/25.0.0/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/25.0.0/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/25.0.0/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/25.0.0/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/25.0.0/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/25.0.0/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/25.0.0/aapt-output-souch.smsbypass_9.txt include tests/build-tools/25.0.1/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/25.0.1/aapt-output-com.politedroid_3.txt include tests/build-tools/25.0.1/aapt-output-com.politedroid_4.txt include tests/build-tools/25.0.1/aapt-output-com.politedroid_5.txt include tests/build-tools/25.0.1/aapt-output-com.politedroid_6.txt include tests/build-tools/25.0.1/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/25.0.1/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/25.0.1/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/25.0.1/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/25.0.1/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/25.0.1/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/25.0.1/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/25.0.1/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/25.0.1/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/25.0.1/aapt-output-souch.smsbypass_9.txt include tests/build-tools/25.0.2/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/25.0.2/aapt-output-com.politedroid_3.txt include tests/build-tools/25.0.2/aapt-output-com.politedroid_4.txt include tests/build-tools/25.0.2/aapt-output-com.politedroid_5.txt include tests/build-tools/25.0.2/aapt-output-com.politedroid_6.txt include tests/build-tools/25.0.2/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/25.0.2/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/25.0.2/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/25.0.2/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/25.0.2/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/25.0.2/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/25.0.2/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/25.0.2/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/25.0.2/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/25.0.2/aapt-output-souch.smsbypass_9.txt include tests/build-tools/25.0.3/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/25.0.3/aapt-output-com.politedroid_3.txt include tests/build-tools/25.0.3/aapt-output-com.politedroid_4.txt include tests/build-tools/25.0.3/aapt-output-com.politedroid_5.txt include tests/build-tools/25.0.3/aapt-output-com.politedroid_6.txt include tests/build-tools/25.0.3/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/25.0.3/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/25.0.3/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/25.0.3/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/25.0.3/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/25.0.3/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/25.0.3/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/25.0.3/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/25.0.3/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/25.0.3/aapt-output-souch.smsbypass_9.txt include tests/build-tools/26.0.0/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/26.0.0/aapt-output-com.politedroid_3.txt include tests/build-tools/26.0.0/aapt-output-com.politedroid_4.txt include tests/build-tools/26.0.0/aapt-output-com.politedroid_5.txt include tests/build-tools/26.0.0/aapt-output-com.politedroid_6.txt include tests/build-tools/26.0.0/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/26.0.0/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/26.0.0/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/26.0.0/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/26.0.0/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/26.0.0/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/26.0.0/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/26.0.0/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/26.0.0/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/26.0.0/aapt-output-souch.smsbypass_9.txt include tests/build-tools/26.0.1/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/26.0.1/aapt-output-com.politedroid_3.txt include tests/build-tools/26.0.1/aapt-output-com.politedroid_4.txt include tests/build-tools/26.0.1/aapt-output-com.politedroid_5.txt include tests/build-tools/26.0.1/aapt-output-com.politedroid_6.txt include tests/build-tools/26.0.1/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/26.0.1/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/26.0.1/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/26.0.1/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/26.0.1/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/26.0.1/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/26.0.1/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/26.0.1/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/26.0.1/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/26.0.1/aapt-output-souch.smsbypass_9.txt include tests/build-tools/26.0.2/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/26.0.2/aapt-output-com.politedroid_3.txt include tests/build-tools/26.0.2/aapt-output-com.politedroid_4.txt include tests/build-tools/26.0.2/aapt-output-com.politedroid_5.txt include tests/build-tools/26.0.2/aapt-output-com.politedroid_6.txt include tests/build-tools/26.0.2/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/26.0.2/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/26.0.2/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/26.0.2/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/26.0.2/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/26.0.2/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/26.0.2/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/26.0.2/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/26.0.2/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/26.0.2/aapt-output-souch.smsbypass_9.txt include tests/build-tools/26.0.3/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/26.0.3/aapt-output-com.politedroid_3.txt include tests/build-tools/26.0.3/aapt-output-com.politedroid_4.txt include tests/build-tools/26.0.3/aapt-output-com.politedroid_5.txt include tests/build-tools/26.0.3/aapt-output-com.politedroid_6.txt include tests/build-tools/26.0.3/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/26.0.3/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/26.0.3/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/26.0.3/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/26.0.3/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/26.0.3/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/26.0.3/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/26.0.3/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/26.0.3/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/26.0.3/aapt-output-souch.smsbypass_9.txt include tests/build-tools/27.0.0/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/27.0.0/aapt-output-com.politedroid_3.txt include tests/build-tools/27.0.0/aapt-output-com.politedroid_4.txt include tests/build-tools/27.0.0/aapt-output-com.politedroid_5.txt include tests/build-tools/27.0.0/aapt-output-com.politedroid_6.txt include tests/build-tools/27.0.0/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/27.0.0/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/27.0.0/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/27.0.0/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/27.0.0/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/27.0.0/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/27.0.0/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/27.0.0/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/27.0.0/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/27.0.0/aapt-output-souch.smsbypass_9.txt include tests/build-tools/27.0.1/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/27.0.1/aapt-output-com.politedroid_3.txt include tests/build-tools/27.0.1/aapt-output-com.politedroid_4.txt include tests/build-tools/27.0.1/aapt-output-com.politedroid_5.txt include tests/build-tools/27.0.1/aapt-output-com.politedroid_6.txt include tests/build-tools/27.0.1/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/27.0.1/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/27.0.1/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/27.0.1/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/27.0.1/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/27.0.1/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/27.0.1/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/27.0.1/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/27.0.1/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/27.0.1/aapt-output-souch.smsbypass_9.txt include tests/build-tools/27.0.2/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/27.0.2/aapt-output-com.politedroid_3.txt include tests/build-tools/27.0.2/aapt-output-com.politedroid_4.txt include tests/build-tools/27.0.2/aapt-output-com.politedroid_5.txt include tests/build-tools/27.0.2/aapt-output-com.politedroid_6.txt include tests/build-tools/27.0.2/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/27.0.2/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/27.0.2/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/27.0.2/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/27.0.2/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/27.0.2/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/27.0.2/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/27.0.2/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/27.0.2/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/27.0.2/aapt-output-souch.smsbypass_9.txt include tests/build-tools/27.0.3/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/27.0.3/aapt-output-com.politedroid_3.txt include tests/build-tools/27.0.3/aapt-output-com.politedroid_4.txt include tests/build-tools/27.0.3/aapt-output-com.politedroid_5.txt include tests/build-tools/27.0.3/aapt-output-com.politedroid_6.txt include tests/build-tools/27.0.3/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/27.0.3/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/27.0.3/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/27.0.3/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/27.0.3/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/27.0.3/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/27.0.3/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/27.0.3/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/27.0.3/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/27.0.3/aapt-output-souch.smsbypass_9.txt include tests/build-tools/28.0.0/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/28.0.0/aapt-output-com.politedroid_3.txt include tests/build-tools/28.0.0/aapt-output-com.politedroid_4.txt include tests/build-tools/28.0.0/aapt-output-com.politedroid_5.txt include tests/build-tools/28.0.0/aapt-output-com.politedroid_6.txt include tests/build-tools/28.0.0/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/28.0.0/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/28.0.0/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/28.0.0/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/28.0.0/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/28.0.0/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/28.0.0/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/28.0.0/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/28.0.0/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/28.0.0/aapt-output-souch.smsbypass_9.txt include tests/build-tools/28.0.1/aapt-output-com.moez.QKSMS_182.txt include tests/build-tools/28.0.1/aapt-output-com.politedroid_3.txt include tests/build-tools/28.0.1/aapt-output-com.politedroid_4.txt include tests/build-tools/28.0.1/aapt-output-com.politedroid_5.txt include tests/build-tools/28.0.1/aapt-output-com.politedroid_6.txt include tests/build-tools/28.0.1/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/28.0.1/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/28.0.1/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/28.0.1/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/28.0.1/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/28.0.1/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/28.0.1/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/28.0.1/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/28.0.1/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/28.0.1/aapt-output-souch.smsbypass_9.txt include tests/build-tools/28.0.2/aapt-output-com.politedroid_3.txt include tests/build-tools/28.0.2/aapt-output-com.politedroid_4.txt include tests/build-tools/28.0.2/aapt-output-com.politedroid_5.txt include tests/build-tools/28.0.2/aapt-output-com.politedroid_6.txt include tests/build-tools/28.0.2/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/28.0.2/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/28.0.2/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/28.0.2/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/28.0.2/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/28.0.2/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/28.0.2/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/28.0.2/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/28.0.2/aapt-output-org.droidtr.keyboard_34.txt include tests/build-tools/28.0.2/aapt-output-souch.smsbypass_9.txt include tests/build-tools/28.0.3/aapt-output-com.example.test.helloworld_1.txt include tests/build-tools/28.0.3/aapt-output-com.politedroid_3.txt include tests/build-tools/28.0.3/aapt-output-com.politedroid_4.txt include tests/build-tools/28.0.3/aapt-output-com.politedroid_5.txt include tests/build-tools/28.0.3/aapt-output-com.politedroid_6.txt include tests/build-tools/28.0.3/aapt-output-duplicate.permisssions_9999999.txt include tests/build-tools/28.0.3/aapt-output-info.guardianproject.urzip_100.txt include tests/build-tools/28.0.3/aapt-output-info.zwanenburg.caffeinetile_4.txt include tests/build-tools/28.0.3/aapt-output-no.min.target.sdk_987.txt include tests/build-tools/28.0.3/aapt-output-obb.main.oldversion_1444412523.txt include tests/build-tools/28.0.3/aapt-output-obb.mainpatch.current_1619.txt include tests/build-tools/28.0.3/aapt-output-obb.main.twoversions_1101613.txt include tests/build-tools/28.0.3/aapt-output-obb.main.twoversions_1101615.txt include tests/build-tools/28.0.3/aapt-output-obb.main.twoversions_1101617.txt include tests/build-tools/28.0.3/aapt-output-souch.smsbypass_9.txt include tests/build-tools/generate.sh include tests/check-fdroid-apk include tests/checkupdates.TestCase include tests/common.TestCase include tests/config.py include tests/corrupt-featureGraphic.png include tests/deploy.TestCase include tests/dummy-keystore.jks include tests/dump_internal_metadata_format.py include tests/exception.TestCase include tests/extra/manual-vmtools-test.py include tests/funding-usernames.yaml include tests/get_android_tools_versions/android-ndk/android-ndk-r21d/source.properties include tests/get_android_tools_versions/android-ndk/r11c/source.properties include tests/get_android_tools_versions/android-ndk/r17c/source.properties include tests/get_android_tools_versions/android-sdk/patcher/v4/source.properties include tests/get_android_tools_versions/android-sdk/platforms/android-30/source.properties include tests/get_android_tools_versions/android-sdk/skiaparser/1/source.properties include tests/get_android_tools_versions/android-sdk/tools/source.properties include tests/getsig/getsig.java include tests/getsig/make.sh include tests/getsig/run.sh include tests/gnupghome/pubring.gpg include tests/gnupghome/random_seed include tests/gnupghome/secring.gpg include tests/gnupghome/trustdb.gpg include tests/gradle-maven-blocks.yaml include tests/gradle-release-checksums.py include tests/import_proxy.py include tests/import.TestCase include tests/index.TestCase include tests/init.TestCase include tests/install.TestCase include tests/IsMD5Disabled.java include tests/janus.apk include tests/keystore.jks include tests/key-tricks.py include tests/lint.TestCase include tests/main.TestCase include tests/metadata/apk/info.guardianproject.urzip.yaml include tests/metadata/apk/org.dyndns.fules.ck.yaml include tests/metadata/app.with.special.build.params.yml include tests/metadata/com.politedroid.yml include tests/metadata/dump/com.politedroid.yaml include tests/metadata/dump/org.adaway.yaml include tests/metadata/dump/org.smssecure.smssecure.yaml include tests/metadata/dump/org.videolan.vlc.yaml include tests/metadata/duplicate.permisssions.yml include tests/metadata/fake.ota.update.yml include tests/metadata/info.guardianproject.checkey/en-US/description.txt include tests/metadata/info.guardianproject.checkey/en-US/name.txt include tests/metadata/info.guardianproject.checkey/en-US/phoneScreenshots/checkey-phone.png include tests/metadata/info.guardianproject.checkey/en-US/phoneScreenshots/checkey.png include tests/metadata/info.guardianproject.checkey/en-US/summary.txt include tests/metadata/info.guardianproject.checkey/ja-JP/name.txt include tests/metadata/info.guardianproject.checkey.yml include tests/metadata/info.guardianproject.urzip/en-US/changelogs/100.txt include tests/metadata/info.guardianproject.urzip/en-US/full_description.txt include tests/metadata/info.guardianproject.urzip/en-US/images/featureGraphic.png include tests/metadata/info.guardianproject.urzip/en-US/images/icon.png include tests/metadata/info.guardianproject.urzip/en-US/short_description.txt include tests/metadata/info.guardianproject.urzip/en-US/title.txt include tests/metadata/info.guardianproject.urzip/en-US/video.txt include tests/metadata/info.guardianproject.urzip.yml include tests/metadata/info.zwanenburg.caffeinetile.yml include tests/metadata/no.min.target.sdk.yml include tests/metadata/obb.main.oldversion.yml include tests/metadata/obb.mainpatch.current.yml include tests/metadata/obb.main.twoversions.yml include tests/metadata/org.adaway.yml include tests/metadata/org.fdroid.ci.test.app.yml include tests/metadata/org.fdroid.fdroid.yml include tests/metadata/org.smssecure.smssecure/signatures/134/28969C09.RSA include tests/metadata/org.smssecure.smssecure/signatures/134/28969C09.SF include tests/metadata/org.smssecure.smssecure/signatures/134/MANIFEST.MF include tests/metadata/org.smssecure.smssecure/signatures/135/28969C09.RSA include tests/metadata/org.smssecure.smssecure/signatures/135/28969C09.SF include tests/metadata/org.smssecure.smssecure/signatures/135/MANIFEST.MF include tests/metadata/org.smssecure.smssecure.yml include tests/metadata/org.videolan.vlc.yml include tests/metadata/raw.template.yml include tests/metadata-rewrite-yml/app.with.special.build.params.yml include tests/metadata-rewrite-yml/fake.ota.update.yml include tests/metadata-rewrite-yml/org.fdroid.fdroid.yml include tests/metadata/souch.smsbypass.yml include tests/metadata.TestCase include tests/minimal_targetsdk_30_unsigned.apk include tests/Norway_bouvet_europe_2.obf.zip include tests/no_targetsdk_minsdk1_unsigned.apk include tests/no_targetsdk_minsdk30_unsigned.apk include tests/openssl-version-check-test.py include tests/org.bitbucket.tickytacky.mirrormirror_1.apk include tests/org.bitbucket.tickytacky.mirrormirror_2.apk include tests/org.bitbucket.tickytacky.mirrormirror_3.apk include tests/org.bitbucket.tickytacky.mirrormirror_4.apk include tests/org.dyndns.fules.ck_20.apk include tests/publish.TestCase include tests/repo/categories.txt include tests/repo/com.example.test.helloworld_1.apk include tests/repo/com.politedroid_3.apk include tests/repo/com.politedroid_4.apk include tests/repo/com.politedroid_5.apk include tests/repo/com.politedroid_6.apk include tests/repo/duplicate.permisssions_9999999.apk include tests/repo/fake.ota.update_1234.zip include tests/repo/index-v1.json include tests/repo/index.xml include tests/repo/info.zwanenburg.caffeinetile_4.apk include tests/repo/main.1101613.obb.main.twoversions.obb include tests/repo/main.1101615.obb.main.twoversions.obb include tests/repo/main.1434483388.obb.main.oldversion.obb include tests/repo/main.1619.obb.mainpatch.current.obb include tests/repo/no.min.target.sdk_987.apk include tests/repo/obb.main.oldversion_1444412523.apk include tests/repo/obb.mainpatch.current_1619_another-release-key.apk include tests/repo/obb.mainpatch.current_1619.apk include tests/repo/obb.mainpatch.current/en-US/featureGraphic.png include tests/repo/obb.mainpatch.current/en-US/icon.png include tests/repo/obb.mainpatch.current/en-US/phoneScreenshots/screenshot-main.png include tests/repo/obb.mainpatch.current/en-US/sevenInchScreenshots/screenshot-tablet-main.png include tests/repo/obb.main.twoversions_1101613.apk include tests/repo/obb.main.twoversions_1101615.apk include tests/repo/obb.main.twoversions_1101617.apk include tests/repo/obb.main.twoversions_1101617_src.tar.gz include tests/repo/org.videolan.vlc/en-US/icon.png include tests/repo/org.videolan.vlc/en-US/phoneScreenshots/screenshot10.png include tests/repo/org.videolan.vlc/en-US/phoneScreenshots/screenshot12.png include tests/repo/org.videolan.vlc/en-US/phoneScreenshots/screenshot15.png include tests/repo/org.videolan.vlc/en-US/phoneScreenshots/screenshot18.png include tests/repo/org.videolan.vlc/en-US/phoneScreenshots/screenshot20.png include tests/repo/org.videolan.vlc/en-US/phoneScreenshots/screenshot22.png include tests/repo/org.videolan.vlc/en-US/phoneScreenshots/screenshot4.png include tests/repo/org.videolan.vlc/en-US/phoneScreenshots/screenshot7.png include tests/repo/org.videolan.vlc/en-US/phoneScreenshots/screenshot9.png include tests/repo/org.videolan.vlc/en-US/sevenInchScreenshots/screenshot0.png include tests/repo/org.videolan.vlc/en-US/sevenInchScreenshots/screenshot11.png include tests/repo/org.videolan.vlc/en-US/sevenInchScreenshots/screenshot13.png include tests/repo/org.videolan.vlc/en-US/sevenInchScreenshots/screenshot14.png include tests/repo/org.videolan.vlc/en-US/sevenInchScreenshots/screenshot16.png include tests/repo/org.videolan.vlc/en-US/sevenInchScreenshots/screenshot17.png include tests/repo/org.videolan.vlc/en-US/sevenInchScreenshots/screenshot19.png include tests/repo/org.videolan.vlc/en-US/sevenInchScreenshots/screenshot1.png include tests/repo/org.videolan.vlc/en-US/sevenInchScreenshots/screenshot21.png include tests/repo/org.videolan.vlc/en-US/sevenInchScreenshots/screenshot23.png include tests/repo/org.videolan.vlc/en-US/sevenInchScreenshots/screenshot2.png include tests/repo/org.videolan.vlc/en-US/sevenInchScreenshots/screenshot3.png include tests/repo/org.videolan.vlc/en-US/sevenInchScreenshots/screenshot5.png include tests/repo/org.videolan.vlc/en-US/sevenInchScreenshots/screenshot6.png include tests/repo/org.videolan.vlc/en-US/sevenInchScreenshots/screenshot8.png include tests/repo/patch.1619.obb.mainpatch.current.obb include tests/repo/souch.smsbypass_9.apk include tests/repo/urzip-*.apk include tests/repo/v1.v2.sig_1020.apk include tests/rewritemeta.TestCase include tests/run-tests include tests/scanner.TestCase include tests/signatures.TestCase include tests/signindex/guardianproject.jar include tests/signindex/guardianproject-v1.jar include tests/signindex/testy.jar include tests/signindex/unsigned.jar include tests/source-files/at.bitfire.davdroid/build.gradle include tests/source-files/cn.wildfirechat.chat/avenginekit/build.gradle include tests/source-files/cn.wildfirechat.chat/build.gradle include tests/source-files/cn.wildfirechat.chat/chat/build.gradle include tests/source-files/cn.wildfirechat.chat/client/build.gradle include tests/source-files/cn.wildfirechat.chat/client/src/main/AndroidManifest.xml include tests/source-files/cn.wildfirechat.chat/emojilibrary/build.gradle include tests/source-files/cn.wildfirechat.chat/gradle/build_libraries.gradle include tests/source-files/cn.wildfirechat.chat/imagepicker/build.gradle include tests/source-files/cn.wildfirechat.chat/mars-core-release/build.gradle include tests/source-files/cn.wildfirechat.chat/push/build.gradle include tests/source-files/cn.wildfirechat.chat/settings.gradle include tests/source-files/com.anpmech.launcher/app/build.gradle include tests/source-files/com.anpmech.launcher/app/src/main/AndroidManifest.xml include tests/source-files/com.anpmech.launcher/build.gradle include tests/source-files/com.anpmech.launcher/settings.gradle include tests/source-files/com.github.jameshnsears.quoteunquote/build.gradle include tests/source-files/com.integreight.onesheeld/build.gradle include tests/source-files/com.integreight.onesheeld/gradle/wrapper/gradle-wrapper.properties include tests/source-files/com.integreight.onesheeld/localeapi/build.gradle include tests/source-files/com.integreight.onesheeld/localeapi/src/main/AndroidManifest.xml include tests/source-files/com.integreight.onesheeld/oneSheeld/build.gradle include tests/source-files/com.integreight.onesheeld/oneSheeld/src/main/AndroidManifest.xml include tests/source-files/com.integreight.onesheeld/pagerIndicator/build.gradle include tests/source-files/com.integreight.onesheeld/pagerIndicator/src/main/AndroidManifest.xml include tests/source-files/com.integreight.onesheeld/pullToRefreshlibrary/build.gradle include tests/source-files/com.integreight.onesheeld/pullToRefreshlibrary/src/main/AndroidManifest.xml include tests/source-files/com.integreight.onesheeld/quickReturnHeader/build.gradle include tests/source-files/com.integreight.onesheeld/quickReturnHeader/src/main/AndroidManifest.xml include tests/source-files/com.integreight.onesheeld/settings.gradle include tests/source-files/com.jens.automation2/build.gradle include tests/source-files/com.jens.automation2/app/build.gradle include tests/source-files/com.kunzisoft.testcase/build.gradle include tests/source-files/com.nextcloud.client/build.gradle include tests/source-files/com.nextcloud.client.dev/src/generic/fastlane/metadata/android/en-US/full_description.txt include tests/source-files/com.nextcloud.client.dev/src/generic/fastlane/metadata/android/en-US/short_description.txt include tests/source-files/com.nextcloud.client.dev/src/generic/fastlane/metadata/android/en-US/title.txt include tests/source-files/com.nextcloud.client.dev/src/versionDev/fastlane/metadata/android/en-US/full_description.txt include tests/source-files/com.nextcloud.client.dev/src/versionDev/fastlane/metadata/android/en-US/short_description.txt include tests/source-files/com.nextcloud.client.dev/src/versionDev/fastlane/metadata/android/en-US/title.txt include tests/source-files/com.nextcloud.client/src/generic/fastlane/metadata/android/en-US/full_description.txt include tests/source-files/com.nextcloud.client/src/generic/fastlane/metadata/android/en-US/short_description.txt include tests/source-files/com.nextcloud.client/src/generic/fastlane/metadata/android/en-US/title.txt include tests/source-files/com.nextcloud.client/src/versionDev/fastlane/metadata/android/en-US/full_description.txt include tests/source-files/com.nextcloud.client/src/versionDev/fastlane/metadata/android/en-US/short_description.txt include tests/source-files/com.nextcloud.client/src/versionDev/fastlane/metadata/android/en-US/title.txt include tests/source-files/com.seafile.seadroid2/app/build.gradle include tests/source-files/de.varengold.activeTAN/build.gradle include tests/source-files/dev.patrickgold.florisboard/app/build.gradle.kts include tests/source-files/eu.siacs.conversations/build.gradle include tests/source-files/eu.siacs.conversations/metadata/en-US/name.txt include tests/source-files/fdroid/fdroidclient/AndroidManifest.xml include tests/source-files/fdroid/fdroidclient/build.gradle include tests/source-files/firebase-suspect/app/build.gradle include tests/source-files/firebase-suspect/build.gradle include tests/source-files/firebase-allowlisted/app/build.gradle include tests/source-files/firebase-allowlisted/build.gradle include tests/source-files/info.guardianproject.ripple/build.gradle include tests/source-files/open-keychain/open-keychain/build.gradle include tests/source-files/open-keychain/open-keychain/OpenKeychain/build.gradle include tests/source-files/org.mozilla.rocket/app/build.gradle include tests/source-files/org.tasks/app/build.gradle.kts include tests/source-files/org.tasks/build.gradle include tests/source-files/org.tasks/build.gradle.kts include tests/source-files/org.tasks/buildSrc/build.gradle.kts include tests/source-files/org.tasks/settings.gradle.kts include tests/source-files/org.noise_planet.noisecapture/app/build.gradle include tests/source-files/org.noise_planet.noisecapture/settings.gradle include tests/source-files/org.noise_planet.noisecapture/sosfilter/build.gradle include tests/source-files/osmandapp/osmand/build.gradle include tests/source-files/osmandapp/osmand/gradle/wrapper/gradle-wrapper.properties include tests/source-files/realm/react-native/android/build.gradle include tests/source-files/se.manyver/android/app/build.gradle include tests/source-files/se.manyver/android/build.gradle include tests/source-files/se.manyver/android/gradle.properties include tests/source-files/se.manyver/android/gradle/wrapper/gradle-wrapper.properties include tests/source-files/se.manyver/android/settings.gradle include tests/source-files/se.manyver/app.json include tests/source-files/se.manyver/index.android.js include tests/source-files/se.manyver/package.json include tests/source-files/se.manyver/react-native.config.js include tests/source-files/ut.ewh.audiometrytest/app/build.gradle include tests/source-files/ut.ewh.audiometrytest/app/src/main/AndroidManifest.xml include tests/source-files/ut.ewh.audiometrytest/build.gradle include tests/source-files/ut.ewh.audiometrytest/settings.gradle include tests/source-files/yuriykulikov/AlarmClock/gradle/wrapper/gradle-wrapper.properties include tests/source-files/Zillode/syncthing-silk/build.gradle include tests/SpeedoMeterApp.main_1.apk include tests/stats/known_apks.txt include tests/testcommon.py include tests/test-gradlew-fdroid include tests/triple-t-2/build/org.piwigo.android/app/build.gradle include tests/triple-t-2/build/org.piwigo.android/app/.gitignore include tests/triple-t-2/build/org.piwigo.android/app/src/debug/res/values/constants.xml include tests/triple-t-2/build/org.piwigo.android/app/src/debug/res/values/strings.xml include tests/triple-t-2/build/org.piwigo.android/app/src/main/java/org/piwigo/PiwigoApplication.java include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/contact-email.txt include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/contact-website.txt include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/default-language.txt include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/listings/de-DE/full-description.txt include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/listings/de-DE/short-description.txt include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/listings/de-DE/title.txt include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/listings/en-US/full-description.txt include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/listings/en-US/graphics/feature-graphic/piwigo-full.png include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/listings/en-US/graphics/icon/piwigo-icon.png include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/listings/en-US/graphics/phone-screenshots/01_Login.jpg include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/listings/en-US/graphics/phone-screenshots/02_Albums.jpg include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/listings/en-US/graphics/phone-screenshots/03_Photos.jpg include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/listings/en-US/graphics/phone-screenshots/04_Albums_horizontal.jpg include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/listings/en-US/graphics/phone-screenshots/05_Menu.jpg include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/listings/en-US/graphics/tablet-screenshots/01_Login.png include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/listings/en-US/short-description.txt include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/listings/en-US/title.txt include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/listings/fr-FR/full-description.txt include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/listings/fr-FR/short-description.txt include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/listings/fr-FR/title.txt include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/listings/kn-IN/full-description.txt include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/listings/kn-IN/short-description.txt include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/listings/kn-IN/title.txt include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/release-notes/de-DE/default.txt include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/release-notes/en-US/default.txt include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/release-notes/fr-FR/default.txt include tests/triple-t-2/build/org.piwigo.android/app/src/main/play/release-notes/kn-IN/default.txt include tests/triple-t-2/build/org.piwigo.android/build.gradle include tests/triple-t-2/build/org.piwigo.android/settings.gradle include tests/triple-t-2/metadata/org.piwigo.android.yml include tests/triple-t-anysoftkeyboard/build/com.anysoftkeyboard.languagepack.dutch/addons/languages/dutch/apk/src/main/play/listings/en-US/title.txt include tests/triple-t-anysoftkeyboard/build/com.anysoftkeyboard.languagepack.dutch/ime/app/src/main/play/listings/en-US/title.txt include tests/triple-t-anysoftkeyboard/build/com.anysoftkeyboard.languagepack.dutch/settings.gradle include tests/triple-t-anysoftkeyboard/build/com.menny.android.anysoftkeyboard/addons/languages/dutch/apk/src/main/play/listings/en-US/title.txt include tests/triple-t-anysoftkeyboard/build/com.menny.android.anysoftkeyboard/ime/app/src/main/play/listings/en-US/title.txt include tests/triple-t-anysoftkeyboard/build/com.menny.android.anysoftkeyboard/settings.gradle include tests/triple-t-anysoftkeyboard/metadata/com.anysoftkeyboard.languagepack.dutch.yml include tests/triple-t-anysoftkeyboard/metadata/com.menny.android.anysoftkeyboard.yml include tests/triple-t-multiple/build/ch.admin.bag.covidcertificate.verifier/settings.gradle include tests/triple-t-multiple/build/ch.admin.bag.covidcertificate.verifier/verifier/src/main/play/listings/en-US/title.txt include tests/triple-t-multiple/build/ch.admin.bag.covidcertificate.verifier/wallet/src/main/play/listings/en-US/title.txt include tests/triple-t-multiple/build/ch.admin.bag.covidcertificate.wallet/settings.gradle include tests/triple-t-multiple/build/ch.admin.bag.covidcertificate.wallet/verifier/src/main/play/listings/en-US/title.txt include tests/triple-t-multiple/build/ch.admin.bag.covidcertificate.wallet/wallet/src/main/play/listings/en-US/title.txt include tests/triple-t-multiple/metadata/ch.admin.bag.covidcertificate.verifier.yml include tests/triple-t-multiple/metadata/ch.admin.bag.covidcertificate.wallet.yml include tests/update.TestCase include tests/urzip.apk include tests/urzip-badcert.apk include tests/urzip-badsig.apk include tests/urzip-release.apk include tests/urzip-release-unsigned.apk include tests/v2.only.sig_2.apk include tests/valid-package-names/random-package-names include tests/valid-package-names/RandomPackageNames.java include tests/valid-package-names/test.py include tests/vcs.TestCase fdroidserver-2.1/PKG-INFO0000644000175000017500000001602314205260750015056 0ustar hanshans00000000000000Metadata-Version: 2.1 Name: fdroidserver Version: 2.1 Summary: F-Droid Server Tools Home-page: https://f-droid.org Author: The F-Droid Project Author-email: team@f-droid.org License: AGPL-3.0 Description:

# F-Droid Server ### Server tools for maintaining an F-Droid repository system.
--- ## What is F-Droid? F-Droid is an installable catalogue of FOSS (Free and Open Source Software) applications for the Android platform. The client makes it easy to browse, install, and keep track of updates on your device. ## What is F-Droid Server? The F-Droid server tools provide various scripts and tools that are used to maintain the main [F-Droid application repository](https://f-droid.org/packages). You can use these same tools to create your own additional or alternative repository for publishing, or to assist in creating, testing and submitting metadata to the main repository. For documentation, please see , or you can find the source for the documentation in [fdroid/fdroid-website](https://gitlab.com/fdroid/fdroid-website). ## CI/CD status | | fdroidserver | buildserver | fdroid build --all | publishing tools | |--------------------------|:-------------:|:-----------:|:------------------:|:----------------:| | GNU/Linux | [![fdroidserver status on GNU/Linux](https://gitlab.com/fdroid/fdroidserver/badges/master/pipeline.svg)](https://gitlab.com/fdroid/fdroidserver/-/jobs) | [![buildserver status](https://jenkins.debian.net/job/reproducible_setup_fdroid_build_environment/badge/icon)](https://jenkins.debian.net/job/reproducible_setup_fdroid_build_environment) | [![fdroid build all status](https://jenkins.debian.net/job/reproducible_fdroid_build_apps/badge/icon)](https://jenkins.debian.net/job/reproducible_fdroid_build_apps/) | [![fdroid test status](https://jenkins.debian.net/job/reproducible_fdroid_test/badge/icon)](https://jenkins.debian.net/job/reproducible_fdroid_test/) | | macOS | [![fdroidserver status on macOS](https://travis-ci.org/f-droid/fdroidserver.svg?branch=master)](https://travis-ci.org/f-droid/fdroidserver) | | | | ## Installing There are many ways to install _fdroidserver_, they are documented on the website: https://f-droid.org/docs/Installing_the_Server_and_Repo_Tools All sorts of other documentation lives there as well. ## Tests There are many components to all the tests for the components in this git repository. The most commonly used parts of well tested, while some parts still lack tests. This test suite has built over time a bit haphazardly, so it is not as clean, organized, or complete as it could be. We welcome contributions. Before rearchitecting any parts of it, be sure to [contact us](https://f-droid.org/about) to discuss the changes beforehand. ### `fdroid` commands The test suite for all of the `fdroid` commands is in the _tests/_ subdir. _.gitlab-ci.yml_ and _.travis.yml_ run this test suite on various configurations. - _tests/run-tests_ runs the whole test suite - _tests/*.TestCase_ are individual unit tests for all of the `fdroid` commands, which can be run separately, e.g. `./update.TestCase`. - run one test: `tests/common.TestCase CommonTest.test_get_apk_id` ### Additional tests for different linux distributions These tests are also run on various distributions through GitLab CI. This is only enabled for `master@fdroid/fdroidserver` because it takes longer to complete than the regular CI tests. Most of the time you won't need to worry about them, but sometimes it might make sense to also run them for your merge request. In that case you need to remove [these lines from .gitlab-ci.yml](https://gitlab.com/fdroid/fdroidserver/blob/master/.gitlab-ci.yml#L34-35) and push this to a new branch of your fork. Alternatively [run them locally](https://docs.gitlab.com/runner/commands/README.html#gitlab-runner-exec) like this: `gitlab-runner exec docker ubuntu_lts` ### Buildserver The tests for the whole build server setup are entirely separate because they require at least 200 GB of disk space, and 8 GB of RAM. These test scripts are in the root of the project, all starting with _jenkins-_ since they are run on https://jenkins.debian.net. ## Documentation The API documentation based on the docstrings gets automatically published [here](https://fdroid.gitlab.io/fdroidserver) on every commit on the `master` branch. It can be built locally via ```bash pip install -e .[docs] cd docs sphinx-apidoc -o ./source ../fdroidserver -M -e sphinx-autogen -o generated source/*.rst make html ``` To additionally lint the code call ```bash pydocstyle fdroidserver --count ``` When writing docstrings you should follow the [numpy style guide](https://numpydoc.readthedocs.io/en/latest/format.html). ## Translation Everything can be translated. See [Translation and Localization](https://f-droid.org/docs/Translation_and_Localization) for more info.
[![](https://hosted.weblate.org/widgets/f-droid/-/287x66-white.png)](https://hosted.weblate.org/engage/f-droid)
View translation status for all languages. [![](https://hosted.weblate.org/widgets/f-droid/-/fdroidserver/multi-auto.svg)](https://hosted.weblate.org/engage/f-droid/?utm_source=widget)
Platform: UNKNOWN Classifier: Development Status :: 4 - Beta Classifier: Intended Audience :: Developers Classifier: Intended Audience :: Information Technology Classifier: Intended Audience :: System Administrators Classifier: Intended Audience :: Telecommunications Industry Classifier: License :: OSI Approved :: GNU Affero General Public License v3 or later (AGPLv3+) Classifier: Operating System :: POSIX Classifier: Operating System :: MacOS :: MacOS X Classifier: Operating System :: Unix Classifier: Topic :: Utilities Requires-Python: >=3.5 Description-Content-Type: text/markdown Provides-Extra: docs Provides-Extra: test fdroidserver-2.1/README.md0000644000175000017500000001222514203004041015223 0ustar hanshans00000000000000

# F-Droid Server ### Server tools for maintaining an F-Droid repository system.
--- ## What is F-Droid? F-Droid is an installable catalogue of FOSS (Free and Open Source Software) applications for the Android platform. The client makes it easy to browse, install, and keep track of updates on your device. ## What is F-Droid Server? The F-Droid server tools provide various scripts and tools that are used to maintain the main [F-Droid application repository](https://f-droid.org/packages). You can use these same tools to create your own additional or alternative repository for publishing, or to assist in creating, testing and submitting metadata to the main repository. For documentation, please see , or you can find the source for the documentation in [fdroid/fdroid-website](https://gitlab.com/fdroid/fdroid-website). ## CI/CD status | | fdroidserver | buildserver | fdroid build --all | publishing tools | |--------------------------|:-------------:|:-----------:|:------------------:|:----------------:| | GNU/Linux | [![fdroidserver status on GNU/Linux](https://gitlab.com/fdroid/fdroidserver/badges/master/pipeline.svg)](https://gitlab.com/fdroid/fdroidserver/-/jobs) | [![buildserver status](https://jenkins.debian.net/job/reproducible_setup_fdroid_build_environment/badge/icon)](https://jenkins.debian.net/job/reproducible_setup_fdroid_build_environment) | [![fdroid build all status](https://jenkins.debian.net/job/reproducible_fdroid_build_apps/badge/icon)](https://jenkins.debian.net/job/reproducible_fdroid_build_apps/) | [![fdroid test status](https://jenkins.debian.net/job/reproducible_fdroid_test/badge/icon)](https://jenkins.debian.net/job/reproducible_fdroid_test/) | | macOS | [![fdroidserver status on macOS](https://travis-ci.org/f-droid/fdroidserver.svg?branch=master)](https://travis-ci.org/f-droid/fdroidserver) | | | | ## Installing There are many ways to install _fdroidserver_, they are documented on the website: https://f-droid.org/docs/Installing_the_Server_and_Repo_Tools All sorts of other documentation lives there as well. ## Tests There are many components to all the tests for the components in this git repository. The most commonly used parts of well tested, while some parts still lack tests. This test suite has built over time a bit haphazardly, so it is not as clean, organized, or complete as it could be. We welcome contributions. Before rearchitecting any parts of it, be sure to [contact us](https://f-droid.org/about) to discuss the changes beforehand. ### `fdroid` commands The test suite for all of the `fdroid` commands is in the _tests/_ subdir. _.gitlab-ci.yml_ and _.travis.yml_ run this test suite on various configurations. - _tests/run-tests_ runs the whole test suite - _tests/*.TestCase_ are individual unit tests for all of the `fdroid` commands, which can be run separately, e.g. `./update.TestCase`. - run one test: `tests/common.TestCase CommonTest.test_get_apk_id` ### Additional tests for different linux distributions These tests are also run on various distributions through GitLab CI. This is only enabled for `master@fdroid/fdroidserver` because it takes longer to complete than the regular CI tests. Most of the time you won't need to worry about them, but sometimes it might make sense to also run them for your merge request. In that case you need to remove [these lines from .gitlab-ci.yml](https://gitlab.com/fdroid/fdroidserver/blob/master/.gitlab-ci.yml#L34-35) and push this to a new branch of your fork. Alternatively [run them locally](https://docs.gitlab.com/runner/commands/README.html#gitlab-runner-exec) like this: `gitlab-runner exec docker ubuntu_lts` ### Buildserver The tests for the whole build server setup are entirely separate because they require at least 200 GB of disk space, and 8 GB of RAM. These test scripts are in the root of the project, all starting with _jenkins-_ since they are run on https://jenkins.debian.net. ## Documentation The API documentation based on the docstrings gets automatically published [here](https://fdroid.gitlab.io/fdroidserver) on every commit on the `master` branch. It can be built locally via ```bash pip install -e .[docs] cd docs sphinx-apidoc -o ./source ../fdroidserver -M -e sphinx-autogen -o generated source/*.rst make html ``` To additionally lint the code call ```bash pydocstyle fdroidserver --count ``` When writing docstrings you should follow the [numpy style guide](https://numpydoc.readthedocs.io/en/latest/format.html). ## Translation Everything can be translated. See [Translation and Localization](https://f-droid.org/docs/Translation_and_Localization) for more info.
[![](https://hosted.weblate.org/widgets/f-droid/-/287x66-white.png)](https://hosted.weblate.org/engage/f-droid)
View translation status for all languages. [![](https://hosted.weblate.org/widgets/f-droid/-/fdroidserver/multi-auto.svg)](https://hosted.weblate.org/engage/f-droid/?utm_source=widget)
fdroidserver-2.1/buildserver/0000755000175000017500000000000014205260750016305 5ustar hanshans00000000000000fdroidserver-2.1/buildserver/Vagrantfile0000644000175000017500000000711014203004041020454 0ustar hanshans00000000000000 require 'yaml' require 'pathname' srvpath = Pathname.new(File.dirname(__FILE__)).realpath configfile = YAML.load_file(File.join(srvpath, "/Vagrantfile.yaml")) Vagrant.configure("2") do |config| # these two caching methods conflict, so only use one at a time if Vagrant.has_plugin?("vagrant-cachier") and not configfile.has_key? "aptcachedir" config.cache.scope = :box config.cache.auto_detect = false config.cache.enable :apt config.cache.enable :chef end config.vm.box = configfile['basebox'] if configfile.has_key? "basebox_version" config.vm.box_version = configfile['basebox_version'] end if not configfile.has_key? "vm_provider" or configfile["vm_provider"] == "virtualbox" # default to VirtualBox if not set config.vm.provider "virtualbox" do |v| v.customize ["modifyvm", :id, "--memory", configfile['memory']] v.customize ["modifyvm", :id, "--cpus", configfile['cpus']] v.customize ["modifyvm", :id, "--hwvirtex", configfile['hwvirtex']] end synced_folder_type = 'virtualbox' elsif configfile["vm_provider"] == "libvirt" # use KVM/QEMU if this is running in KVM/QEMU config.vm.provider :libvirt do |libvirt| libvirt.driver = configfile["hwvirtex"] == "on" ? "kvm" : "qemu" libvirt.host = "localhost" libvirt.uri = "qemu:///system" libvirt.cpus = configfile["cpus"] libvirt.memory = configfile["memory"] if configfile.has_key? "libvirt_disk_bus" libvirt.disk_bus = configfile["libvirt_disk_bus"] end if configfile.has_key? "libvirt_nic_model_type" libvirt.nic_model_type = configfile["libvirt_nic_model_type"] end end if configfile.has_key? "synced_folder_type" synced_folder_type = configfile["synced_folder_type"] else synced_folder_type = '9p' end config.vm.synced_folder './', '/vagrant', type: synced_folder_type, SharedFoldersEnableSymlinksCreate: false else abort("No supported VM Provider found, set vm_provider in Vagrantfile.yaml!") end config.vm.boot_timeout = configfile['boot_timeout'] if configfile.has_key? "aptproxy" config.vm.provision :shell, path: "provision-apt-proxy", args: [configfile["aptproxy"]] end # buildserver/ is shared to the VM's /vagrant by default so the old # default does not need a custom mount if configfile["cachedir"] != "buildserver/cache" config.vm.synced_folder configfile["cachedir"], '/vagrant/cache', create: true, type: synced_folder_type end # Make sure dir exists to mount to, since buildserver/ is # automatically mounted as /vagrant in the guest VM. This is more # necessary with 9p synced folders Dir.mkdir('cache') unless File.exists?('cache') # cache .deb packages on the host via a mount trick if configfile.has_key? "aptcachedir" config.vm.synced_folder configfile["aptcachedir"], "/var/cache/apt/archives", owner: 'root', group: 'root', create: true end config.vm.provision "shell", name: "setup-env-vars", path: "setup-env-vars", args: ["/opt/android-sdk"] config.vm.provision "shell", name: "apt-get-install", path: "provision-apt-get-install", args: [configfile['debian_mirror']] config.vm.provision "shell", name: "android-sdk", path: "provision-android-sdk" config.vm.provision "shell", name: "android-ndk", path: "provision-android-ndk", args: ["/opt/android-sdk/ndk", "r21e", "r23b"] config.vm.provision "shell", name: "gradle", path: "provision-gradle" config.vm.provision "shell", name: "buildserverid", path: "provision-buildserverid", args: [`git rev-parse HEAD`] end fdroidserver-2.1/buildserver/config.buildserver.yml0000644000175000017500000000017114203004041022604 0ustar hanshans00000000000000sdk_path: /opt/android-sdk java_paths: 8: /usr/lib/jvm/java-8-openjdk-amd64 gradle_version_dir: /opt/gradle/versions fdroidserver-2.1/buildserver/provision-android-ndk0000644000175000017500000000130714203004041022434 0ustar hanshans00000000000000#!/bin/bash # # $1 is the root dir to install the NDKs into # $2 and after are the NDK releases to install echo $0 set -e set -x NDK_BASE=$1 shift test -e $NDK_BASE || mkdir -p $NDK_BASE cd $NDK_BASE for version in $@; do if [ ! -e ${NDK_BASE}/${version} ]; then unzip /vagrant/cache/android-ndk-${version}-linux*.zip > /dev/null mv android-ndk-${version} \ `sed -En 's,^Pkg.Revision *= *(.+),\1,p' android-ndk-${version}/source.properties` fi done # allow gradle/etc to install missing NDK versions chgrp vagrant $NDK_BASE chmod g+w $NDK_BASE # ensure all users can read and execute the NDK chmod -R a+rX $NDK_BASE/ find $NDK_BASE/ -type f -executable -exec chmod a+x -- {} + fdroidserver-2.1/buildserver/provision-android-sdk0000644000175000017500000001137314203004041022445 0ustar hanshans00000000000000#!/bin/bash # echo $0 set -e set -x if [ -z $ANDROID_HOME ]; then echo "ANDROID_HOME env var must be set!" exit 1 fi # TODO remove the rm, this should work with an existing ANDROID_HOME if [ ! -x $ANDROID_HOME/tools/android ]; then rm -rf $ANDROID_HOME mkdir ${ANDROID_HOME} mkdir ${ANDROID_HOME}/temp mkdir ${ANDROID_HOME}/platforms mkdir ${ANDROID_HOME}/build-tools cd $ANDROID_HOME tools=`ls -1 /vagrant/cache/tools_*.zip | sort -n | tail -1` unzip -qq $tools fi # disable the repositories of proprietary stuff disabled=" @version@=1 @disabled@https\://dl.google.com/android/repository/extras/intel/addon.xml=disabled @disabled@https\://dl.google.com/android/repository/glass/addon.xml=disabled @disabled@https\://dl.google.com/android/repository/sys-img/android/sys-img.xml=disabled @disabled@https\://dl.google.com/android/repository/sys-img/android-tv/sys-img.xml=disabled @disabled@https\://dl.google.com/android/repository/sys-img/android-wear/sys-img.xml=disabled @disabled@https\://dl.google.com/android/repository/sys-img/google_apis/sys-img.xml=disabled " test -d ${HOME}/.android || mkdir ${HOME}/.android # there are currently zero user repos echo 'count=0' > ${HOME}/.android/repositories.cfg for line in $disabled; do echo $line >> ${HOME}/.android/sites-settings.cfg done cd /vagrant/cache # make links for `android update sdk` to use and delete blocklist="build-tools_r17-linux.zip build-tools_r18.0.1-linux.zip build-tools_r18.1-linux.zip build-tools_r18.1.1-linux.zip build-tools_r19-linux.zip build-tools_r19.0.1-linux.zip build-tools_r19.0.2-linux.zip build-tools_r19.0.3-linux.zip build-tools_r21-linux.zip build-tools_r21.0.1-linux.zip build-tools_r21.0.2-linux.zip build-tools_r21.1-linux.zip build-tools_r21.1.1-linux.zip build-tools_r22-linux.zip build-tools_r23-linux.zip android-1.5_r04-linux.zip android-1.6_r03-linux.zip android-2.0_r01-linux.zip android-2.0.1_r01-linux.zip" latestm2=`ls -1 android_m2repository*.zip | sort -n | tail -1` for f in $latestm2 android-[0-9]*.zip platform-[0-9]*.zip build-tools_r*-linux.zip; do rm -f ${ANDROID_HOME}/temp/$f if [[ $blocklist != *$f* ]]; then ln -s /vagrant/cache/$f ${ANDROID_HOME}/temp/ fi done # install all cached platforms cached="" for f in `ls -1 android-[0-9]*.zip platform-[0-9]*.zip`; do sdk=`unzip -c $f "*/build.prop" | sed -n 's,^ro.build.version.sdk=,,p'` cached=,android-${sdk}${cached} done # install all cached build-tools for f in `ls -1 build-tools*.zip`; do ver=`unzip -c $f "*/source.properties" | sed -n 's,^Pkg.Revision=,,p'` if [[ $ver == 24.0.0 ]] && [[ $f =~ .*r24\.0\.1.* ]]; then # 24.0.1 has the wrong revision in the zip ver=24.0.1 fi cached=,build-tools-${ver}${cached} done ${ANDROID_HOME}/tools/android update sdk --no-ui --all \ --filter platform-tools,extra-android-m2repository${cached} < $ANDROID_HOME/licenses/android-sdk-license 8933bad161af4178b1185d1a37fbf41ea5269c55 d56f5187479451eabf01fb78af6dfcb131a6481e 24333f8a63b6825ea9c5514f83c2829b004d1fee EOF cat < $ANDROID_HOME/licenses/android-sdk-preview-license 84831b9409646a918e30573bab4c9c91346d8abd EOF cat < $ANDROID_HOME/licenses/android-sdk-preview-license-old 79120722343a6f314e0719f863036c702b0e6b2a 84831b9409646a918e30573bab4c9c91346d8abd EOF cat < $ANDROID_HOME/licenses/intel-android-extra-license d975f751698a77b662f1254ddbeed3901e976f5a EOF echo y | $ANDROID_HOME/tools/bin/sdkmanager "extras;m2repository;com;android;support;constraint;constraint-layout;1.0.1" echo y | $ANDROID_HOME/tools/bin/sdkmanager "extras;m2repository;com;android;support;constraint;constraint-layout-solver;1.0.1" echo y | $ANDROID_HOME/tools/bin/sdkmanager "extras;m2repository;com;android;support;constraint;constraint-layout;1.0.2" echo y | $ANDROID_HOME/tools/bin/sdkmanager "extras;m2repository;com;android;support;constraint;constraint-layout-solver;1.0.2" chmod a+X $(dirname $ANDROID_HOME/) chmod -R a+rX $ANDROID_HOME/ chgrp vagrant $ANDROID_HOME chmod g+w $ANDROID_HOME find $ANDROID_HOME/ -type f -executable -print0 | xargs -0 chmod a+x # allow gradle to install newer build-tools and platforms chgrp vagrant $ANDROID_HOME/{build-tools,platforms} chmod g+w $ANDROID_HOME/{build-tools,platforms} # allow gradle/sdkmanager to install into the new m2repository test -d $ANDROID_HOME/extras/m2repository || mkdir -p $ANDROID_HOME/extras/m2repository find $ANDROID_HOME/extras/m2repository -type d | xargs chgrp vagrant find $ANDROID_HOME/extras/m2repository -type d | xargs chmod g+w fdroidserver-2.1/buildserver/provision-apt-get-install0000644000175000017500000000617314203004041023255 0ustar hanshans00000000000000#!/bin/bash echo $0 set -e set -x debian_mirror=$1 export DEBIAN_FRONTEND=noninteractive printf 'APT::Install-Recommends "0";\nAPT::Install-Suggests "0";\n' \ > /etc/apt/apt.conf.d/99no-install-recommends printf 'APT::Acquire::Retries "20";\n' \ > /etc/apt/apt.conf.d/99acquire-retries cat < /etc/apt/apt.conf.d/99no-auto-updates APT::Periodic::Enable "0"; APT::Periodic::Update-Package-Lists "0"; APT::Periodic::Unattended-Upgrade "0"; EOF printf 'APT::Get::Assume-Yes "true";\n' \ > /etc/apt/apt.conf.d/99assumeyes cat < /etc/apt/apt.conf.d/99quiet Dpkg::Use-Pty "0"; quiet "1"; EOF cat < /etc/apt/apt.conf.d/99confdef Dpkg::Options { "--force-confdef"; }; EOF if echo $debian_mirror | grep '^https' 2>&1 > /dev/null; then apt-get update || apt-get update apt-get install apt-transport-https ca-certificates fi cat << EOF > /etc/apt/sources.list deb ${debian_mirror} stretch main deb http://security.debian.org/debian-security stretch/updates main deb ${debian_mirror} stretch-updates main EOF echo "deb ${debian_mirror} stretch-backports main" > /etc/apt/sources.list.d/stretch-backports.list echo "deb ${debian_mirror} stretch-backports-sloppy main" > /etc/apt/sources.list.d/stretch-backports-sloppy.list echo "deb ${debian_mirror} testing main" > /etc/apt/sources.list.d/testing.list printf "Package: *\nPin: release o=Debian,a=testing\nPin-Priority: -300\n" > /etc/apt/preferences.d/debian-testing dpkg --add-architecture i386 apt-get update || apt-get update apt-get upgrade --download-only apt-get upgrade # again after upgrade in case of keyring changes apt-get update || apt-get update packages=" androguard/stretch-backports ant asn1c ant-contrib autoconf autoconf2.13 automake automake1.11 autopoint bison bzr ca-certificates-java cmake curl disorderfs expect faketime flex gettext gettext-base git-core git-svn gperf gpg/stretch-backports-sloppy gpgconf/stretch-backports-sloppy libassuan0/stretch-backports libgpg-error0/stretch-backports javacc libarchive-zip-perl libexpat1-dev libgcc1:i386 libglib2.0-dev liblzma-dev libncurses5:i386 librsvg2-bin libsaxonb-java libssl-dev libstdc++6:i386 libtool libtool-bin make maven mercurial nasm openjdk-8-jre-headless openjdk-8-jdk-headless optipng pkg-config python-gnupg python-lxml python-magic python-pip python-setuptools python3-asn1crypto/stretch-backports python3-defusedxml python3-git python3-gitdb python3-gnupg python3-pip python3-pyasn1 python3-pyasn1-modules python3-qrcode python3-requests python3-setuptools python3-smmap python3-yaml python3-ruamel.yaml python3-pil python3-paramiko quilt rsync scons sqlite3 subversion sudo swig unzip xsltproc yasm zip zlib1g:i386 " apt-get install $packages --download-only apt-get install $packages highestjava=`update-java-alternatives --list | sort -n | tail -1 | cut -d ' ' -f 1` update-java-alternatives --set $highestjava # configure headless openjdk to work without gtk accessability dependencies sed -i -e 's@\(assistive_technologies=org.GNOME.Accessibility.AtkWrapper\)@#\1@' /etc/java-8-openjdk/accessibility.properties fdroidserver-2.1/buildserver/provision-apt-proxy0000644000175000017500000000045214203004041022205 0ustar hanshans00000000000000#!/bin/bash echo $0 set -e rm -f /etc/apt/apt.conf.d/02proxy echo "Acquire::ftp::Proxy \"$1\";" >> /etc/apt/apt.conf.d/02proxy echo "Acquire::http::Proxy \"$1\";" >> /etc/apt/apt.conf.d/02proxy echo "Acquire::https::Proxy \"$1\";" >> /etc/apt/apt.conf.d/02proxy apt-get update || apt-get update fdroidserver-2.1/buildserver/provision-gradle0000644000175000017500000000266714203004041021512 0ustar hanshans00000000000000#!/bin/bash set -ex # version compare magic vergte() { printf '%s\n%s' "$1" "$2" | sort -C -V -r } test -e /opt/gradle/versions || mkdir -p /opt/gradle/versions cd /opt/gradle/versions glob="/vagrant/cache/gradle-*.zip" if compgen -G $glob; then # test if glob matches anything f=$(ls -1 --sort=version --group-directories-first $glob | tail -1) ver=`echo $f | sed 's,.*gradle-\([0-9][0-9.]*\).*\.zip,\1,'` # only use versions greater or equal 2.2.1 if vergte $ver 2.2.1 && [ ! -d /opt/gradle/versions/${ver} ]; then unzip -qq $f mv gradle-${ver} /opt/gradle/versions/${ver} fi fi chmod -R a+rX /opt/gradle test -e /opt/gradle/bin || mkdir -p /opt/gradle/bin ln -fs /home/vagrant/fdroidserver/gradlew-fdroid /opt/gradle/bin/gradle chown -h vagrant.vagrant /opt/gradle/bin/gradle chown vagrant.vagrant /opt/gradle/versions chmod 0755 /opt/gradle/versions GRADLE_HOME=/home/vagrant/.gradle test -d $GRADLE_HOME/ || mkdir $GRADLE_HOME/ cat < $GRADLE_HOME/gradle.properties # builds are not reused, so the daemon is a waste of time org.gradle.daemon=false # set network timeouts to 10 minutes # https://github.com/gradle/gradle/pull/3371/files systemProp.http.connectionTimeout=600000 systemProp.http.socketTimeout=600000 systemProp.org.gradle.internal.http.connectionTimeout=600000 systemProp.org.gradle.internal.http.socketTimeout=600000 EOF chown -R vagrant.vagrant $GRADLE_HOME/ chmod -R a+rX $GRADLE_HOME/ fdroidserver-2.1/buildserver/setup-env-vars0000644000175000017500000000123214203004041021110 0ustar hanshans00000000000000#!/bin/sh # # sets up the environment vars needed by the build process set -e set -x bsenv=/etc/profile.d/bsenv.sh echo "# generated on "`date` > $bsenv echo export ANDROID_HOME=$1 >> $bsenv echo 'export PATH=$PATH:${ANDROID_HOME}/tools:${ANDROID_HOME}/platform-tools:/opt/gradle/bin' >> $bsenv echo "export DEBIAN_FRONTEND=noninteractive" >> $bsenv echo 'export home_vagrant=/home/vagrant' >> $bsenv echo 'export fdroidserver=$home_vagrant/fdroidserver' >> $bsenv chmod 0644 $bsenv # make sure that SSH never hangs at a password or key prompt printf ' StrictHostKeyChecking yes\n' >> /etc/ssh/ssh_config printf ' BatchMode yes\n' >> /etc/ssh/ssh_config fdroidserver-2.1/completion/0000755000175000017500000000000014205260750016130 5ustar hanshans00000000000000fdroidserver-2.1/completion/bash-completion0000644000175000017500000001430014203004041021120 0ustar hanshans00000000000000# fdroid(1) completion -*- shell-script -*- # # bash-completion - part of the FDroid server tools # # Copyright (C) 2013-2017 Hans-Christoph Steiner # Copyright (C) 2013, 2014 Daniel Martí # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . __fdroid_init() { COMPREPLY=() cur="${COMP_WORDS[COMP_CWORD]}" prev="${COMP_WORDS[COMP_CWORD-1]}" (( $# >= 1 )) && __complete_${1} } __get_appid() { files=( metadata/*.yml ) files=( ${files[@]#metadata/} ) files=${files[@]%.yml} echo "$files" } __package() { files="$(__get_appid)" COMPREPLY=( $( compgen -W "$files" -- $cur ) ) } __apk_package() { files=( ${1}/*.apk ) [ -f "${files[0]}" ] || return files=( ${files[@]#*/} ) files=${files[@]%_*} COMPREPLY=( $( compgen -W "$files" -- $cur ) ) } __apk_vercode() { local p=${cur:0:-1} files=( ${1}/${p}_*.apk ) [ -f "${files[0]}" ] || return files=( ${files[@]#*_} ) files=${files[@]%.apk} COMPREPLY=( $( compgen -P "${p}:" -W "$files" -- $cur ) ) } __vercode() { if [ $prev = ":" ]; then appid="${COMP_WORDS[COMP_CWORD-2]}" elif [ $cur = ":" ]; then appid=$prev cur="" fi versionCodes=`sed -En 's,^ +versionCode: +([0-9]+) *$,\1,p' metadata/${appid}.yml` COMPREPLY=( $( compgen -W "$versionCodes" -- $cur ) ) } __complete_options() { case "${cur}" in --*) COMPREPLY=( $( compgen -W "--help --version ${lopts}" -- $cur ) ) return 0;; *) COMPREPLY=( $( compgen -W "-h ${opts} --help --version ${lopts}" -- $cur ) ) return 0;; esac } __complete_build() { opts="-v -q -l -s -t -f -a" lopts="--verbose --quiet --latest --stop --test --server --reset-server --skip-scan --scan-binary --no-tarball --force --all --no-refresh" case "${prev}" in :) __vercode return 0;; esac case "${cur}" in -*) __complete_options return 0;; :) __vercode return 0;; *) __package return 0;; esac } __complete_gpgsign() { opts="-v -q" lopts="--verbose --quiet" __complete_options } __complete_install() { opts="-v -q" lopts="--verbose --quiet --all" case "${cur}" in -*) __complete_options return 0;; *:) __apk_vercode repo return 0;; *) __apk_package repo return 0;; esac } __complete_update() { opts="-c -v -q -i -I -e" lopts="--create-metadata --verbose --quiet --icons --pretty --clean --delete-unknown --nosign --rename-apks --use-date-from-apk" case "${prev}" in -e|--editor) _filedir return 0;; esac __complete_options } __complete_publish() { opts="-v -q" lopts="--verbose --quiet" case "${cur}" in -*) __complete_options return 0;; *:) __apk_vercode unsigned return 0;; *) __apk_package unsigned return 0;; esac } __complete_checkupdates() { opts="-v -q" lopts="--verbose --quiet --auto --autoonly --commit --gplay --allow-dirty" case "${cur}" in -*) __complete_options return 0;; *) __package return 0;; esac } __complete_import() { opts="-c -h -l -q -s -u -v -W" lopts="--categories --license --quiet --rev --subdir --url" case "${prev}" in -c|-l|-s|-u|--categories|--license|--quiet|--rev|--subdir|--url) return 0;; -W) COMPREPLY=( $( compgen -W "error warn ignore" -- $cur ) ) return 0;; esac __complete_options } __complete_readmeta() { opts="-v -q" lopts="--verbose --quiet" __complete_options } __complete_rewritemeta() { opts="-v -q -l" lopts="--verbose --quiet --list" case "${cur}" in -*) __complete_options return 0;; *) __package return 0;; esac } __complete_lint() { opts="-v -q -f" lopts="--verbose --quiet --force-yamllint --format" case "${cur}" in -*) __complete_options return 0;; *) __package return 0;; esac } __complete_scanner() { opts="-v -q" lopts="--verbose --quiet" case "${cur}" in -*) __complete_options return 0;; *:) __vercode return 0;; *) __package return 0;; esac } __complete_verify() { opts="-v -q -p" lopts="--verbose --quiet" case "${cur}" in -*) __complete_options return 0;; *:) __vercode return 0;; *) __package return 0;; esac } __complete_btlog() { opts="-u" lopts="--git-remote --git-repo --url" __complete_options } __complete_mirror() { opts="-v" lopts="--all --archive --build-logs --pgp-signatures --src-tarballs --output-dir" __complete_options } __complete_nightly() { opts="-v -q" lopts="--show-secret-var --archive-older" __complete_options } __complete_stats() { opts="-v -q -d" lopts="--verbose --quiet --download" __complete_options } __complete_deploy() { opts="-i -v -q" lopts="--identity-file --local-copy-dir --sync-from-local-copy-dir --verbose --quiet --no-checksum --no-keep-git-mirror-archive" __complete_options } __complete_signatures() { opts="-v -q" lopts="--verbose --no-check-https" case "${cur}" in -*) __complete_options return 0;; esac _filedir 'apk' return 0 } __complete_signindex() { opts="-v -q" lopts="--verbose" __complete_options } __complete_init() { opts="-v -q -d" lopts="--verbose --quiet --distinguished-name --keystore --repo-keyalias --android-home --no-prompt" __complete_options } __cmds=" \ btlog \ build \ checkupdates \ deploy \ gpgsign \ import \ init \ install \ lint \ mirror \ nightly \ publish \ readmeta \ rewritemeta \ scanner \ signatures \ signindex \ stats \ update \ verify \ " for c in $__cmds; do eval "_fdroid_${c} () { local cur prev opts lopts __fdroid_init ${c} }" done _fdroid() { local cmd cmd=${COMP_WORDS[1]} [[ $__cmds == *\ $cmd\ * ]] && _fdroid_${cmd} || { (($COMP_CWORD == 1)) && COMPREPLY=( $( compgen -W "${__cmds}" -- $cmd ) ) } } complete -F _fdroid fdroid return 0 fdroidserver-2.1/examples/0000755000175000017500000000000014205260750015575 5ustar hanshans00000000000000fdroidserver-2.1/examples/config.yml0000644000175000017500000003364114203004041017557 0ustar hanshans00000000000000--- # Copy this file to config.yml, then amend the settings below according to # your system configuration. # Custom path to the Android SDK, defaults to $ANDROID_HOME # sdk_path: $ANDROID_HOME # Paths to installed versions of the Android NDK. This will be # automatically filled out from well known sources like # $ANDROID_HOME/ndk-bundle and $ANDROID_HOME/ndk/*. If a required # version is missing in the buildserver VM, it will be automatically # downloaded and installed into the standard $ANDROID_HOME/ndk/ # directory. Manually setting it here will override the auto-detected # values. The keys can either be the "release" (e.g. r21e) or the # "revision" (e.g. 21.4.7075529). # # ndk_paths: # r10e: $ANDROID_HOME/android-ndk-r10e # r17: "" # 21.4.7075529: ~/Android/Ndk # r22b: null # Directory to store downloaded tools in (i.e. gradle versions) # By default, these are stored in ~/.cache/fdroidserver # cachedir: cache # Specify paths to each major Java release that you want to support # java_paths: # 8: /usr/lib/jvm/java-8-openjdk # Command or path to binary for running Ant # ant: ant # Command or path to binary for running maven 3 # mvn3: mvn # Command or path to binary for running Gradle # Defaults to using an internal gradle wrapper (gradlew-fdroid). # gradle: gradle # Always scan the APKs produced by `fdroid build` for known non-free classes # scan_binary: true # Set the maximum age (in days) of an index that a client should accept from # this repo. Setting it to 0 or not setting it at all disables this # functionality. If you do set this to a non-zero value, you need to ensure # that your index is updated much more frequently than the specified interval. # The same policy is applied to the archive repo, if there is one. # repo_maxage: 0 # repo_url: https://MyFirstFDroidRepo.org/fdroid/repo # repo_name: My First F-Droid Repo Demo # repo_description: >- # This is a repository of apps to be used with F-Droid. Applications # in this repository are either official binaries built by the # original application developers, or are binaries built from source # by the admin of f-droid.org using the tools on # https://gitlab.com/fdroid. # As above, but for the archive repo. # # archive_url: https://f-droid.org/archive # archive_name: My First F-Droid Archive Demo # archive_description: >- # The repository of older versions of packages from the main demo repository. # archive_older sets the number of versions kept in the main repo, with all # older ones going to the archive. Set it to 0, and there will be no archive # repository, and no need to define the other archive_ values. # # archive_older: 3 # The repo's icon defaults to a file called 'icon.png' in the 'icons' # folder for each section, e.g. repo/icons/icon.png and # archive/icons/icon.png. To use a different filename for the icons, # set the filename here. You must still copy it into place in # repo/icons/ and/or archive/icons/. # # repo_icon: myicon.png # archive_icon: myicon.png # This allows a specific kind of insecure APK to be included in the # 'repo' section. Since April 2017, APK signatures that use MD5 are # no longer considered valid, jarsigner and apksigner will return an # error when verifying. `fdroid update` will move APKs with these # disabled signatures to the archive. This option stops that # behavior, and lets those APKs stay part of 'repo'. # # allow_disabled_algorithms: true # Normally, all apps are collected into a single app repository, like on # https://f-droid.org. For certain situations, it is better to make a repo # that is made up of APKs only from a single app. For example, an automated # build server that publishes nightly builds. # per_app_repos: true # `fdroid update` will create a link to the current version of a given app. # This provides a static path to the current APK. To disable the creation of # this link, uncomment this: # make_current_version_link: false # By default, the "current version" link will be based on the "Name" of the # app from the metadata. You can change it to use a different field from the # metadata here: # current_version_name_source: packageName # Optionally, override home directory for gpg # gpghome: /home/fdroid/somewhere/else/.gnupg # The ID of a GPG key for making detached signatures for APKs. Optional. # gpgkey: 1DBA2E89 # The key (from the keystore defined below) to be used for signing the # repository itself. This is the same name you would give to keytool or # jarsigner using -alias. (Not needed in an unsigned repository). # repo_keyalias: fdroidrepo # Optionally, the public key for the key defined by repo_keyalias above can # be specified here. There is no need to do this, as the public key can and # will be retrieved from the keystore when needed. However, specifying it # manually can allow some processing to take place without access to the # keystore. # repo_pubkey: ... # The keystore to use for release keys when building. This needs to be # somewhere safe and secure, and backed up! The best way to manage these # sensitive keys is to use a "smartcard" (aka Hardware Security Module). To # configure F-Droid to use a smartcard, set the keystore file using the keyword # "NONE" (i.e. keystore: "NONE"). That makes Java find the keystore on the # smartcard based on 'smartcardoptions' below. # keystore: ~/.local/share/fdroidserver/keystore.jks # You should not need to change these at all, unless you have a very # customized setup for using smartcards in Java with keytool/jarsigner # smartcardoptions: | # -storetype PKCS11 -providerName SunPKCS11-OpenSC # -providerClass sun.security.pkcs11.SunPKCS11 # -providerArg opensc-fdroid.cfg # The password for the keystore (at least 6 characters). If this password is # different than the keypass below, it can be OK to store the password in this # file for real use. But in general, sensitive passwords should not be stored # in text files! # keystorepass: password1 # The password for keys - the same is used for each auto-generated key as well # as for the repository key. You should not normally store this password in a # file since it is a sensitive password. # keypass: password2 # The distinguished name used for all keys. # keydname: CN=Birdman, OU=Cell, O=Alcatraz, L=Alcatraz, S=California, C=US # Use this to override the auto-generated key aliases with specific ones # for particular applications. Normally, just leave it empty. # # keyaliases: # com.example.app: example # # You can also force an app to use the same key alias as another one, using # the @ prefix. # # keyaliases: # com.example.another.plugin: "@com.example.another" # The full path to the root of the repository. It must be specified in # rsync/ssh format for a remote host/path. This is used for syncing a locally # generated repo to the server that is it hosted on. It must end in the # standard public repo name of "/fdroid", but can be in up to three levels of # sub-directories (i.e. /var/www/packagerepos/fdroid). You can include # multiple servers to sync to by wrapping the whole thing in {} or [], and # including the serverwebroot strings in a comma-separated list. # # serverwebroot: user@example:/var/www/fdroid # serverwebroot: # - foo.com:/usr/share/nginx/www/fdroid # - bar.info:/var/www/fdroid # When running fdroid processes on a remote server, it is possible to # publish extra information about the status. Each fdroid sub-command # can create repo/status/running.json when it starts, then a # repo/status/.json when it completes. The builds logs # and other processes will also get published, if they are running in # a buildserver VM. The build logs name scheme is: # .../repo/$APPID_$VERCODE.log.gz. These files are also pushed to all # servers configured in 'serverwebroot'. # # deploy_process_logs: true # The full URL to a git remote repository. You can include # multiple servers to mirror to by wrapping the whole thing in {} or [], and # including the servergitmirrors strings in a comma-separated list. # Servers listed here will also be automatically inserted in the mirrors list. # # servergitmirrors: https://github.com/user/repo # servergitmirrors: # - https://github.com/user/repo # - https://gitlab.com/user/repo # Most git hosting services have hard size limits for each git repo. # `fdroid deploy` will delete the git history when the git mirror repo # approaches this limit to ensure that the repo will still fit when # pushed. GitHub recommends 1GB, gitlab.com recommends 10GB. # # git_mirror_size_limit: 10GB # Any mirrors of this repo, for example all of the servers declared in # serverwebroot and all the servers declared in servergitmirrors, # will automatically be used by the client. If one # mirror is not working, then the client will try another. If the # client has Tor enabled, then the client will prefer mirrors with # .onion addresses. This base URL will be used for both the main repo # and the archive, if it is enabled. So these URLs should end in the # 'fdroid' base of the F-Droid part of the web server like serverwebroot. # # mirrors: # - https://foo.bar/fdroid # - http://foobarfoobarfoobar.onion/fdroid # optionally specify which identity file to use when using rsync or git over SSH # # identity_file: ~/.ssh/fdroid_id_rsa # If you are running the repo signing process on a completely offline machine, # which provides the best security, then you can specify a folder to sync the # repo to when running `fdroid deploy`. This is most likely going to # be a USB thumb drive, SD Card, or some other kind of removable media. Make # sure it is mounted before running `fdroid deploy`. Using the # standard folder called 'fdroid' as the specified folder is recommended, like # with serverwebroot. # # local_copy_dir: /media/MyUSBThumbDrive/fdroid # If you are using local_copy_dir on an offline build/signing server, once the # thumb drive has been plugged into the online machine, it will need to be # synced to the copy on the online machine. To make that happen # automatically, set sync_from_local_copy_dir to True: # # sync_from_local_copy_dir: true # To upload the repo to an Amazon S3 bucket using `fdroid server # update`. Warning, this deletes and recreates the whole fdroid/ # directory each time. This prefers s3cmd, but can also use # apache-libcloud. To customize how s3cmd interacts with the cloud # provider, create a 's3cfg' file next to this file (config.yml), and # those settings will be used instead of any 'aws' variable below. # Secrets can be fetched from environment variables to ensure that # they are not leaked as part of this file. # # awsbucket: myawsfdroid # awsaccesskeyid: SEE0CHAITHEIMAUR2USA # awssecretkey: {env: awssecretkey} # If you want to force 'fdroid server' to use a non-standard serverwebroot. # This will allow you to have 'serverwebroot' entries which do not end in # '/fdroid'. (Please note that some client features expect repository URLs # to end in '/fdroid/repo'.) # # nonstandardwebroot: false # If you want to upload the release APK file to androidobservatory.org # # androidobservatory: false # If you want to upload the release APK file to virustotal.com # You have to enter your profile apikey to enable the upload. # # virustotal_apikey: 9872987234982734 # # Or get it from an environment variable: # # virustotal_apikey: {env: virustotal_apikey} # Keep a log of all generated index files in a git repo to provide a # "binary transparency" log for anyone to check the history of the # binaries that are published. This is in the form of a "git remote", # which this machine where `fdroid update` is run has already been # configured to allow push access (e.g. ssh key, username/password, etc) # binary_transparency_remote: git@gitlab.com:fdroid/binary-transparency-log.git # Only set this to true when running a repository where you want to generate # stats, and only then on the master build servers, not a development # machine. If you want to keep the "added" and "last updated" dates for each # app and APK in your repo, then you should enable this. # update_stats: true # When used with stats, this is a list of IP addresses that are ignored for # calculation purposes. # stats_ignore: [] # Server stats logs are retrieved from. Required when update_stats is True. # stats_server: example.com # User stats logs are retrieved from. Required when update_stats is True. # stats_user: bob # Use the following to push stats to a Carbon instance: # stats_to_carbon: false # carbon_host: 0.0.0.0 # carbon_port: 2003 # Set this to true to always use a build server. This saves specifying the # --server option on dedicated secure build server hosts. # build_server_always: true # Limit in number of characters that fields can take up # Only the fields listed here are supported, defaults shown # char_limits: # author: 256 # name: 50 # summary: 80 # description: 4000 # video: 256 # whatsNew: 500 # It is possible for the server operator to specify lists of apps that # must be installed or uninstalled on the client (aka "push installs). # If the user has opted in, or the device is already setup to respond # to these requests, then F-Droid will automatically install/uninstall # the packageNames listed. This is protected by the same signing key # as the app index metadata. # # install_list: # - at.bitfire.davdroid # - com.fsck.k9 # - us.replicant # # uninstall_list: # - com.facebook.orca # - com.android.vending # `fdroid lint` checks licenses in metadata against a built white list. By # default we will require license metadata to be present and only allow # licenses approved either by FSF or OSI. We're using the standardized SPDX # license IDs. (https://spdx.org/licenses/) # # We use `python3 -m spdx-license-list print --filter-fsf-or-osi` for # generating our default list. (https://pypi.org/project/spdx-license-list) # # You can override our default list of allowed licenes by setting this option. # Just supply a custom list of licene names you would like to allow. To disable # checking licenses by the linter, assign an empty value to lint_licenses. # # lint_licenses: # - Custom-License-A # - Another-License fdroidserver-2.1/examples/fdroid_export_keystore_to_nitrokey.py0000644000175000017500000000434414203004041025362 0ustar hanshans00000000000000#!/usr/bin/env python3 # # an fdroid plugin for exporting a repo's keystore in standard PEM format import os from argparse import ArgumentParser from fdroidserver import common from fdroidserver.common import FDroidPopen from fdroidserver.exception import BuildException fdroid_summary = "export the repo's keystore file to a NitroKey HSM" def run(cmd, error): envs = {'LC_ALL': 'C.UTF-8', 'PIN': config['smartcard_pin'], 'FDROID_KEY_STORE_PASS': config['keystorepass'], 'FDROID_KEY_PASS': config['keypass']} p = FDroidPopen(cmd, envs=envs) if p.returncode != 0: raise BuildException(error, p.output) def main(): global config parser = ArgumentParser() common.setup_global_opts(parser) options = parser.parse_args() config = common.read_config(options) destkeystore = config['keystore'].replace('.jks', '.p12').replace('/', '_') exportkeystore = config['keystore'].replace('.jks', '.pem').replace('/', '_') if os.path.exists(destkeystore) or os.path.exists(exportkeystore): raise BuildException('%s exists!' % exportkeystore) run([config['keytool'], '-importkeystore', '-srckeystore', config['keystore'], '-srcalias', config['repo_keyalias'], '-srcstorepass:env', 'FDROID_KEY_STORE_PASS', '-srckeypass:env', 'FDROID_KEY_PASS', '-destkeystore', destkeystore, '-deststorepass:env', 'FDROID_KEY_STORE_PASS', '-deststoretype', 'PKCS12'], 'Failed to convert to PKCS12!') # run(['openssl', 'pkcs12', '-in', destkeystore, # '-passin', 'env:FDROID_KEY_STORE_PASS', '-nokeys', # '-out', exportkeystore, # '-passout', 'env:FDROID_KEY_STORE_PASS'], # 'Failed to convert to PEM!') run(['pkcs15-init', '--delete-objects', 'privkey,pubkey', '--id', '3', '--store-private-key', destkeystore, '--format', 'pkcs12', '--auth-id', '3', '--verify-pin', '--pin', 'env:PIN'], '') run(['pkcs15-init', '--delete-objects', 'privkey,pubkey', '--id', '2', '--store-private-key', destkeystore, '--format', 'pkcs12', '--auth-id', '3', '--verify-pin', '--pin', 'env:PIN'], '') if __name__ == "__main__": main() fdroidserver-2.1/examples/fdroid_exportkeystore.py0000644000175000017500000000373714203004041022602 0ustar hanshans00000000000000#!/usr/bin/env python3 # # an fdroid plugin for exporting a repo's keystore in standard PEM format import os from argparse import ArgumentParser from fdroidserver import common from fdroidserver.common import FDroidPopen from fdroidserver.exception import BuildException fdroid_summary = 'export the keystore in standard PEM format' def main(): parser = ArgumentParser() common.setup_global_opts(parser) options = parser.parse_args() config = common.read_config(options) env_vars = {'LC_ALL': 'C.UTF-8', 'FDROID_KEY_STORE_PASS': config['keystorepass'], 'FDROID_KEY_PASS': config['keypass']} destkeystore = config['keystore'].replace('.jks', '.p12').replace('/', '_') exportkeystore = config['keystore'].replace('.jks', '.pem').replace('/', '_') if os.path.exists(destkeystore) or os.path.exists(exportkeystore): raise BuildException('%s exists!' % exportkeystore) p = FDroidPopen([config['keytool'], '-importkeystore', '-srckeystore', config['keystore'], '-srcalias', config['repo_keyalias'], '-srcstorepass:env', 'FDROID_KEY_STORE_PASS', '-srckeypass:env', 'FDROID_KEY_PASS', '-destkeystore', destkeystore, '-deststoretype', 'PKCS12', '-deststorepass:env', 'FDROID_KEY_STORE_PASS', '-destkeypass:env', 'FDROID_KEY_PASS'], envs=env_vars) if p.returncode != 0: raise BuildException("Failed to convert to PKCS12!", p.output) p = FDroidPopen(['openssl', 'pkcs12', '-in', destkeystore, '-passin', 'env:FDROID_KEY_STORE_PASS', '-nokeys', '-out', exportkeystore, '-passout', 'env:FDROID_KEY_STORE_PASS'], envs=env_vars) if p.returncode != 0: raise BuildException("Failed to convert to PEM!", p.output) if __name__ == "__main__": main() fdroidserver-2.1/examples/fdroid_extract_repo_pubkey.py0000644000175000017500000000104014203004041023532 0ustar hanshans00000000000000#!/usr/bin/env python3 # # an fdroid plugin print the repo_pubkey from a repo's keystore # from argparse import ArgumentParser from fdroidserver import common, index fdroid_summary = 'export the keystore in standard PEM format' def main(): parser = ArgumentParser() common.setup_global_opts(parser) options = parser.parse_args() common.config = common.read_config(options) pubkey, repo_pubkey_fingerprint = index.extract_pubkey() print('repo_pubkey = "%s"' % pubkey.decode()) if __name__ == "__main__": main() fdroidserver-2.1/examples/fdroid_fetchsrclibs.py0000644000175000017500000000265414203004041022143 0ustar hanshans00000000000000#!/usr/bin/env python3 # # an fdroid plugin for setting up srclibs # # The 'fdroid build' gitlab-ci job uses --on-server, which does not # set up the srclibs. This plugin does the missing setup. import argparse import os import pprint from fdroidserver import _, common, metadata fdroid_summary = 'prepare the srclibs for `fdroid build --on-server`' def main(): parser = argparse.ArgumentParser(usage="%(prog)s [options] [APPID[:VERCODE] [APPID[:VERCODE] ...]]") common.setup_global_opts(parser) parser.add_argument("appid", nargs='*', help=_("applicationId with optional versionCode in the form APPID[:VERCODE]")) metadata.add_metadata_arguments(parser) options = parser.parse_args() common.options = options pkgs = common.read_pkg_args(options.appid, True) allapps = metadata.read_metadata(pkgs) apps = common.read_app_args(options.appid, allapps, True) common.read_config(options) srclib_dir = os.path.join('build', 'srclib') os.makedirs(srclib_dir, exist_ok=True) srclibpaths = [] for appid, app in apps.items(): vcs, _ignored = common.setup_vcs(app) vcs.gotorevision('HEAD', refresh=False) for build in app.get('Builds', []): for lib in build.srclibs: srclibpaths.append(common.getsrclib(lib, srclib_dir, prepare=False, build=build)) print('Set up srclibs:') pprint.pprint(srclibpaths) if __name__ == "__main__": main() fdroidserver-2.1/examples/fdroid_nitrokeyimport.py0000644000175000017500000000302014203004041022553 0ustar hanshans00000000000000#!/usr/bin/env python3 from argparse import ArgumentParser from fdroidserver import common from fdroidserver.common import FDroidPopen from fdroidserver.exception import BuildException fdroid_summary = 'import the local keystore into a SmartCard HSM' def main(): parser = ArgumentParser() common.setup_global_opts(parser) options = parser.parse_args() config = common.read_config(options) env_vars = { 'LC_ALL': 'C.UTF-8', 'FDROID_KEY_STORE_PASS': config['keystorepass'], 'FDROID_KEY_PASS': config['keypass'], 'SMARTCARD_PIN': str(config['smartcard_pin']), } p = FDroidPopen([config['keytool'], '-importkeystore', '-srcalias', config['repo_keyalias'], '-srckeystore', config['keystore'], '-srcstorepass:env', 'FDROID_KEY_STORE_PASS', '-srckeypass:env', 'FDROID_KEY_PASS', '-destalias', config['repo_keyalias'], '-destkeystore', 'NONE', '-deststoretype', 'PKCS11', '-providerName', 'SunPKCS11-OpenSC', '-providerClass', 'sun.security.pkcs11.SunPKCS11', '-providerArg', 'opensc-fdroid.cfg', '-deststorepass:env', 'SMARTCARD_PIN', '-J-Djava.security.debug=sunpkcs11'], envs=env_vars) if p.returncode != 0: raise BuildException("Failed to import into HSM!", p.output) if __name__ == "__main__": main() fdroidserver-2.1/examples/makebuildserver.config.py0000644000175000017500000000711314203004041022564 0ustar hanshans00000000000000#!/usr/bin/env python3 # # You may want to alter these before running ./makebuildserver # Name of the Vagrant basebox to use, by default it will be downloaded # from Vagrant Cloud. For release builds setup, generate the basebox # locally using https://gitlab.com/fdroid/basebox, add it to Vagrant, # then set this to the local basebox name. # This defaults to "fdroid/basebox-stretch64" which will download a # prebuilt basebox from https://app.vagrantup.com/fdroid. # # (If you change this value you have to supply the `--clean` option on # your next `makebuildserver` run.) # # basebox = "basebox-stretch64" # This allows you to pin your basebox to a specific versions. It defaults # the most recent basebox version which can be aumotaically verifyed by # `makebuildserver`. # Please note that vagrant does not support versioning of locally added # boxes, so we can't support that either. # # (If you change this value you have to supply the `--clean` option on # your next `makebuildserver` run.) # # basebox_version = "0.1" # In the process of setting up the build server, many gigs of files # are downloaded (Android SDK components, gradle, etc). These are # cached so that they are not redownloaded each time. By default, # these are stored in ~/.cache/fdroidserver # # cachedir = 'buildserver/cache' # A big part of creating a new instance is downloading packages from Debian. # This setups up a folder in ~/.cache/fdroidserver to cache the downloaded # packages when rebuilding the build server from scratch. This requires # that virtualbox-guest-utils is installed. # # apt_package_cache = True # The buildserver can use some local caches to speed up builds, # especially when the internet connection is slow and/or expensive. # If enabled, the buildserver setup will look for standard caches in # your HOME dir and copy them to the buildserver VM. Be aware: this # will reduce the isolation of the buildserver from your host machine, # so the buildserver will provide an environment only as trustworthy # as the host machine's environment. # # copy_caches_from_host = True # To specify which Debian mirror the build server VM should use, by # default it uses http.debian.net, which auto-detects which is the # best mirror to use. # # debian_mirror = 'http://ftp.uk.debian.org/debian/' # The amount of RAM the build server will have (default: 2048) # memory = 3584 # The number of CPUs the build server will have # cpus = 1 # Debian package proxy server - if you have one # aptproxy = "http://192.168.0.19:8000" # If this is running on an older machine or on a virtualized system, # it can run a lot slower. If the provisioning fails with a warning # about the timeout, extend the timeout here. (default: 600 seconds) # # boot_timeout = 1200 # By default, this whole process uses VirtualBox as the provider, but # QEMU+KVM is also supported via the libvirt plugin to vagrant. If # this is run within a KVM guest, then libvirt's QEMU+KVM will be used # automatically. It can also be manually enabled by uncommenting # below: # # vm_provider = 'libvirt' # By default libvirt uses 'virtio' for both network and disk drivers. # Some systems (eg. nesting VMware ESXi) do not support virtio. As a # workaround for such rare cases, this setting allows to configure # KVM/libvirt to emulate hardware rather than using virtio. # # libvirt_disk_bus = 'sata' # libvirt_nic_model_type = 'rtl8139' # Sometimes, it is not possible to use the 9p synced folder type with # libvirt, like if running a KVM buildserver instance inside of a # VMware ESXi guest. In that case, using NFS or another method is # required. # # synced_folder_type = 'nfs' fdroidserver-2.1/examples/opensc-fdroid.cfg0000644000175000017500000000017314203004041020776 0ustar hanshans00000000000000name = OpenSC description = SunPKCS11 w/ OpenSC Smart card Framework library = /usr/lib/opensc-pkcs11.so slotListIndex = 1 fdroidserver-2.1/examples/public-read-only-s3-bucket-policy.json0000644000175000017500000000035514203004041024717 0ustar hanshans00000000000000{ "Version":"2012-10-17", "Statement":[ {"Sid":"AddPerm", "Effect":"Allow", "Principal":"*", "Action":"s3:GetObject", "Resource":"arn:aws:s3:::examplebucket/fdroid/*" } ] } fdroidserver-2.1/examples/template.yml0000644000175000017500000000036414203004041020121 0ustar hanshans00000000000000AuthorName: . WebSite: '' Bitcoin: null Litecoin: null Donate: null License: Unknown Categories: - Internet IssueTracker: '' SourceCode: '' Changelog: '' Name: . Summary: . Description: | . ArchivePolicy: 2 versions RequiresRoot: false fdroidserver-2.1/fdroidserver/0000755000175000017500000000000014205260750016455 5ustar hanshans00000000000000fdroidserver-2.1/fdroidserver/__init__.py0000644000175000017500000000372614203004041020561 0ustar hanshans00000000000000import gettext import glob import os import sys # support running straight from git and standard installs rootpaths = [ os.path.realpath(os.path.join(os.path.dirname(__file__), '..')), os.path.realpath( os.path.join(os.path.dirname(__file__), '..', '..', '..', '..', 'share') ), os.path.join(sys.prefix, 'share'), ] localedir = None for rootpath in rootpaths: if len(glob.glob(os.path.join(rootpath, 'locale', '*', 'LC_MESSAGES', 'fdroidserver.mo'))) > 0: localedir = os.path.join(rootpath, 'locale') break gettext.bindtextdomain('fdroidserver', localedir) gettext.textdomain('fdroidserver') _ = gettext.gettext from fdroidserver.exception import (FDroidException, MetaDataException, VerificationException) # NOQA: E402 FDroidException # NOQA: B101 MetaDataException # NOQA: B101 VerificationException # NOQA: B101 from fdroidserver.common import (verify_apk_signature, genkeystore as generate_keystore) # NOQA: E402 verify_apk_signature # NOQA: B101 generate_keystore # NOQA: B101 from fdroidserver.index import (download_repo_index, get_mirror_service_urls, make as make_index) # NOQA: E402 download_repo_index # NOQA: B101 get_mirror_service_urls # NOQA: B101 make_index # NOQA: B101 from fdroidserver.update import (process_apk, process_apks, scan_apk, scan_repo_files) # NOQA: E402 process_apk # NOQA: B101 process_apks # NOQA: B101 scan_apk # NOQA: B101 scan_repo_files # NOQA: B101 from fdroidserver.deploy import (update_awsbucket, update_servergitmirrors, update_serverwebroot) # NOQA: E402 update_awsbucket # NOQA: B101 update_servergitmirrors # NOQA: B101 update_serverwebroot # NOQA: B101 fdroidserver-2.1/fdroidserver/__main__.py0000755000175000017500000002267614203004041020552 0ustar hanshans00000000000000#!/usr/bin/env python3 # # fdroidserver/__main__.py - part of the FDroid server tools # Copyright (C) 2020 Michael Pöhn # Copyright (C) 2010-2015, Ciaran Gultnieks, ciaran@ciarang.com # Copyright (C) 2013-2014 Daniel Marti # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . import re import sys import os import pkgutil import logging import fdroidserver.common import fdroidserver.metadata from fdroidserver import _ from argparse import ArgumentError from collections import OrderedDict COMMANDS = OrderedDict([ ("build", _("Build a package from source")), ("init", _("Quickly start a new repository")), ("publish", _("Sign and place packages in the repo")), ("gpgsign", _("Add PGP signatures using GnuPG for packages in repo")), ("update", _("Update repo information for new packages")), ("deploy", _("Interact with the repo HTTP server")), ("verify", _("Verify the integrity of downloaded packages")), ("checkupdates", _("Check for updates to applications")), ("import", _("Add a new application from its source code")), ("install", _("Install built packages on devices")), ("readmeta", _("Read all the metadata files and exit")), ("rewritemeta", _("Rewrite all the metadata files")), ("lint", _("Warn about possible metadata errors")), ("scanner", _("Scan the source code of a package")), ("stats", _("Update the stats of the repo")), ("signindex", _("Sign indexes created using update --nosign")), ("btlog", _("Update the binary transparency log for a URL")), ("signatures", _("Extract signatures from APKs")), ("nightly", _("Set up an app build for a nightly build repo")), ("mirror", _("Download complete mirrors of small repos")), ]) def print_help(available_plugins=None): print(_("usage: ") + _("fdroid [] [-h|--help|--version|]")) print("") print(_("Valid commands are:")) for cmd, summary in COMMANDS.items(): print(" " + cmd + ' ' * (15 - len(cmd)) + summary) if available_plugins: print(_('commands from plugin modules:')) for command in sorted(available_plugins.keys()): print(' {:15}{}'.format(command, available_plugins[command]['summary'])) print("") def preparse_plugin(module_name, module_dir): """No summary. Simple regex based parsing for plugin scripts. So we don't have to import them when we just need the summary, but not plan on executing this particular plugin. """ if '.' in module_name: raise ValueError("No '.' allowed in fdroid plugin modules: '{}'" .format(module_name)) path = os.path.join(module_dir, module_name + '.py') if not os.path.isfile(path): path = os.path.join(module_dir, module_name, '__main__.py') if not os.path.isfile(path): raise ValueError("unable to find main plugin script " "for module '{n}' ('{d}')" .format(n=module_name, d=module_dir)) summary = None main = None with open(path, 'r', encoding='utf-8') as f: re_main = re.compile(r'^(\s*def\s+main\s*\(.*\)\s*:' r'|\s*main\s*=\s*lambda\s*:.+)$') re_summary = re.compile(r'^\s*fdroid_summary\s*=\s["\'](?P.+)["\']$') for line in f: m_summary = re_summary.match(line) if m_summary: summary = m_summary.group('text') if re_main.match(line): main = True if summary is None: raise NameError("could not find 'fdroid_summary' in: '{}' plugin" .format(module_name)) if main is None: raise NameError("could not find 'main' function in: '{}' plugin" .format(module_name)) return {'name': module_name, 'summary': summary} def find_plugins(): found_plugins = [{'name': x[1], 'dir': x[0].path} for x in pkgutil.iter_modules() if x[1].startswith('fdroid_')] plugin_infos = {} for plugin_def in found_plugins: command_name = plugin_def['name'][7:] try: plugin_infos[command_name] = preparse_plugin(plugin_def['name'], plugin_def['dir']) except Exception as e: # We need to keep module lookup fault tolerant because buggy # modules must not prevent fdroidserver from functioning if len(sys.argv) > 1 and sys.argv[1] == command_name: # only raise exeption when a user specifies the broken # plugin in explicitly in command line raise e return plugin_infos def main(): available_plugins = find_plugins() if len(sys.argv) <= 1: print_help(available_plugins=available_plugins) sys.exit(0) command = sys.argv[1] if command not in COMMANDS and command not in available_plugins.keys(): if command in ('-h', '--help'): print_help(available_plugins=available_plugins) sys.exit(0) elif command == 'server': print(_("""ERROR: The "server" subcommand has been removed, use "deploy"!""")) sys.exit(1) elif command == '--version': output = _('no version info found!') cmddir = os.path.realpath(os.path.dirname(os.path.dirname(__file__))) moduledir = os.path.realpath(os.path.dirname(fdroidserver.common.__file__) + '/..') if cmddir == moduledir: # running from git os.chdir(cmddir) if os.path.isdir('.git'): import subprocess try: output = subprocess.check_output(['git', 'describe'], stderr=subprocess.STDOUT, universal_newlines=True) except subprocess.CalledProcessError: output = 'git commit ' + subprocess.check_output(['git', 'rev-parse', 'HEAD'], universal_newlines=True) elif os.path.exists('setup.py'): import re m = re.search(r'''.*[\s,\(]+version\s*=\s*["']([0-9a-z.]+)["'].*''', open('setup.py').read(), flags=re.MULTILINE) if m: output = m.group(1) + '\n' else: from pkg_resources import get_distribution output = get_distribution('fdroidserver').version + '\n' print(output) sys.exit(0) else: print(_("Command '%s' not recognised.\n" % command)) print_help(available_plugins=available_plugins) sys.exit(1) verbose = any(s in sys.argv for s in ['-v', '--verbose']) quiet = any(s in sys.argv for s in ['-q', '--quiet']) # Helpful to differentiate warnings from errors even when on quiet logformat = '%(asctime)s %(levelname)s: %(message)s' loglevel = logging.INFO if verbose: loglevel = logging.DEBUG elif quiet: loglevel = logging.WARN logging.basicConfig(format=logformat, level=loglevel) if verbose and quiet: logging.critical(_("Conflicting arguments: '--verbose' and '--quiet' " "can not be specified at the same time.")) sys.exit(1) # Trick optparse into displaying the right usage when --help is used. sys.argv[0] += ' ' + command del sys.argv[1] if command in COMMANDS.keys(): mod = __import__('fdroidserver.' + command, None, None, [command]) else: mod = __import__(available_plugins[command]['name'], None, None, [command]) system_encoding = sys.getdefaultencoding() if system_encoding is None or system_encoding.lower() not in ('utf-8', 'utf8'): logging.warning(_("Encoding is set to '{enc}' fdroid might run " "into encoding issues. Please set it to 'UTF-8' " "for best results.".format(enc=system_encoding))) try: mod.main() # These are ours, contain a proper message and are "expected" except (fdroidserver.common.FDroidException, fdroidserver.metadata.MetaDataException) as e: if verbose: raise else: logging.critical(str(e)) sys.exit(1) except ArgumentError as e: logging.critical(str(e)) sys.exit(1) except KeyboardInterrupt: print('') fdroidserver.common.force_exit(1) # These should only be unexpected crashes due to bugs in the code # str(e) often doesn't contain a reason, so just show the backtrace except Exception as e: logging.critical(_("Unknown exception found!")) raise e sys.exit(0) if __name__ == "__main__": main() fdroidserver-2.1/fdroidserver/apksigcopier.py0000644000175000017500000004377614203004041021513 0ustar hanshans00000000000000#!/usr/bin/python3 # encoding: utf-8 # -- ; {{{1 # # File : apksigcopier # Maintainer : Felix C. Stegerman # Date : 2021-04-14 # # Copyright : Copyright (C) 2021 Felix C. Stegerman # Version : v0.4.0 # License : GPLv3+ # # -- ; }}}1 """Copy/extract/patch apk signatures. apksigcopier is a tool for copying APK signatures from a signed APK to an unsigned one (in order to verify reproducible builds). CLI === $ apksigcopier extract [OPTIONS] SIGNED_APK OUTPUT_DIR $ apksigcopier patch [OPTIONS] METADATA_DIR UNSIGNED_APK OUTPUT_APK $ apksigcopier copy [OPTIONS] SIGNED_APK UNSIGNED_APK OUTPUT_APK The following environment variables can be set to 1, yes, or true to overide the default behaviour: * set APKSIGCOPIER_EXCLUDE_ALL_META=1 to exclude all metadata files * set APKSIGCOPIER_COPY_EXTRA_BYTES=1 to copy extra bytes after data (e.g. a v2 sig) API === >> from apksigcopier import do_extract, do_patch, do_copy >> do_extract(signed_apk, output_dir, v1_only=NO) >> do_patch(metadata_dir, unsigned_apk, output_apk, v1_only=NO) >> do_copy(signed_apk, unsigned_apk, output_apk, v1_only=NO) You can use False, None, and True instead of NO, AUTO, and YES respectively. The following global variables (which default to False), can be set to override the default behaviour: * set exclude_all_meta=True to exclude all metadata files * set copy_extra_bytes=True to copy extra bytes after data (e.g. a v2 sig) """ import glob import os import re import struct import sys import zipfile import zlib from collections import namedtuple from typing import Dict, Tuple, Union __version__ = "0.4.0" NAME = "apksigcopier" SIGBLOCK, SIGOFFSET = "APKSigningBlock", "APKSigningBlockOffset" NOAUTOYES = NO, AUTO, YES = ("no", "auto", "yes") APK_META = re.compile(r"^META-INF/([0-9A-Za-z_-]+\.(SF|RSA|DSA|EC)|MANIFEST\.MF)$") META_EXT = ("SF", "RSA|DSA|EC", "MF") COPY_EXCLUDE = ("META-INF/MANIFEST.MF",) DATETIMEZERO = (1980, 0, 0, 0, 0, 0) ZipData = namedtuple("ZipData", ("cd_offset", "eocd_offset", "cd_and_eocd")) copy_extra_bytes = False # copy extra bytes after data in copy_apk() class APKSigCopierError(Exception): """Base class for errors.""" class APKSigningBlockError(APKSigCopierError): """Something wrong with the APK Signing Block.""" class NoAPKSigningBlock(APKSigningBlockError): """APK Signing Block Missing.""" class ZipError(APKSigCopierError): """Something wrong with ZIP file.""" # FIXME: is there a better alternative? class ReproducibleZipInfo(zipfile.ZipInfo): """Reproducible ZipInfo hack.""" _override = {} # type: Dict[str, Union[int, Tuple[int, ...]]] def __init__(self, zinfo, **override): if override: self._override = {**self._override, **override} for k in self.__slots__: if hasattr(zinfo, k): setattr(self, k, getattr(zinfo, k)) def __getattribute__(self, name): if name != "_override": try: return self._override[name] except KeyError: pass return object.__getattribute__(self, name) class APKZipInfo(ReproducibleZipInfo): """Reproducible ZipInfo for APK files.""" _override = dict( compress_type=8, create_system=0, create_version=20, date_time=DATETIMEZERO, external_attr=0, extract_version=20, flag_bits=0x800, ) def noautoyes(value): """Turn False into NO, None into AUTO, and True into YES. >>> from apksigcopier import noautoyes, NO, AUTO, YES >>> noautoyes(False) == NO == noautoyes(NO) True >>> noautoyes(None) == AUTO == noautoyes(AUTO) True >>> noautoyes(True) == YES == noautoyes(YES) True """ if isinstance(value, str): if value not in NOAUTOYES: raise ValueError("expected NO, AUTO, or YES") return value try: return {False: NO, None: AUTO, True: YES}[value] except KeyError: raise ValueError("expected False, None, or True") def is_meta(filename): """No summary. Returns whether filename is a v1 (JAR) signature file (.SF), signature block file (.RSA, .DSA, or .EC), or manifest (MANIFEST.MF). See https://docs.oracle.com/javase/tutorial/deployment/jar/intro.html """ return APK_META.fullmatch(filename) is not None def exclude_from_copying(filename): """Fdroidserver always wants JAR Signature files to be excluded.""" return is_meta(filename) ################################################################################ # # https://en.wikipedia.org/wiki/ZIP_(file_format) # https://source.android.com/security/apksigning/v2#apk-signing-block-format # # ================================= # | Contents of ZIP entries | # ================================= # | APK Signing Block | # | ----------------------------- | # | | size (w/o this) uint64 LE | | # | | ... | | # | | size (again) uint64 LE | | # | | "APK Sig Block 42" (16B) | | # | ----------------------------- | # ================================= # | ZIP Central Directory | # ================================= # | ZIP End of Central Directory | # | ----------------------------- | # | | 0x06054b50 ( 4B) | | # | | ... (12B) | | # | | CD Offset ( 4B) | | # | | ... | | # | ----------------------------- | # ================================= # ################################################################################ # FIXME: makes certain assumptions and doesn't handle all valid ZIP files! def copy_apk(unsigned_apk, output_apk): """Copy APK like apksigner would, excluding files matched by exclude_from_copying(). The following global variables (which default to False), can be set to override the default behaviour: * set exclude_all_meta=True to exclude all metadata files * set copy_extra_bytes=True to copy extra bytes after data (e.g. a v2 sig) Returns ------- max date_time. """ with zipfile.ZipFile(unsigned_apk, "r") as zf: infos = zf.infolist() zdata = zip_data(unsigned_apk) offsets = {} with open(unsigned_apk, "rb") as fhi, open(output_apk, "w+b") as fho: for info in sorted(infos, key=lambda info: info.header_offset): off_i = fhi.tell() if info.header_offset > off_i: # copy extra bytes fho.write(fhi.read(info.header_offset - off_i)) hdr = fhi.read(30) if hdr[:4] != b"\x50\x4b\x03\x04": raise ZipError("Expected local file header signature") n, m = struct.unpack("= 4: hdr_id, size = struct.unpack(" len(old_xtr) - 4: break if not (hdr_id == 0 and size == 0): if hdr_id == 0xd935: if size >= 2: align = int.from_bytes(old_xtr[4:6], "little") else: new_xtr += old_xtr[:size + 4] old_xtr = old_xtr[size + 4:] if old_off % align == 0 and new_off % align != 0: pad = (align - (new_off - m + len(new_xtr) + 6) % align) % align xtr = new_xtr + struct.pack(" 0: data = fhi.read(min(size, blocksize)) if not data: break size -= len(data) fho.write(data) if size != 0: raise ZipError("Unexpected EOF") def extract_meta(signed_apk): """ Extract v1 signature metadata files from signed APK. Yields (ZipInfo, data) pairs. """ with zipfile.ZipFile(signed_apk, "r") as zf_sig: for info in zf_sig.infolist(): if is_meta(info.filename): yield info, zf_sig.read(info.filename) def patch_meta(extracted_meta, output_apk, date_time=DATETIMEZERO): """Add v1 signature metadata to APK (removes v2 sig block, if any).""" with zipfile.ZipFile(output_apk, "r") as zf_out: for info in zf_out.infolist(): if is_meta(info.filename): raise ZipError("Unexpected metadata") with zipfile.ZipFile(output_apk, "a") as zf_out: info_data = [(APKZipInfo(info, date_time=date_time), data) for info, data in extracted_meta] _write_to_zip(info_data, zf_out) if sys.version_info >= (3, 7): def _write_to_zip(info_data, zf_out): for info, data in info_data: zf_out.writestr(info, data, compresslevel=9) else: def _write_to_zip(info_data, zf_out): old = zipfile._get_compressor zipfile._get_compressor = lambda _: zlib.compressobj(9, 8, -15) try: for info, data in info_data: zf_out.writestr(info, data) finally: zipfile._get_compressor = old def extract_v2_sig(apkfile, expected=True): """ Extract APK Signing Block and offset from APK. When successful, returns (sb_offset, sig_block); otherwise raises NoAPKSigningBlock when expected is True, else returns None. """ cd_offset = zip_data(apkfile).cd_offset with open(apkfile, "rb") as fh: fh.seek(cd_offset - 16) if fh.read(16) != b"APK Sig Block 42": if expected: raise NoAPKSigningBlock("No APK Signing Block") return None fh.seek(-24, os.SEEK_CUR) sb_size2 = int.from_bytes(fh.read(8), "little") fh.seek(-sb_size2 + 8, os.SEEK_CUR) sb_size1 = int.from_bytes(fh.read(8), "little") if sb_size1 != sb_size2: raise APKSigningBlockError("APK Signing Block sizes not equal") fh.seek(-8, os.SEEK_CUR) sb_offset = fh.tell() sig_block = fh.read(sb_size2 + 8) return sb_offset, sig_block def zip_data(apkfile, count=1024): """ Extract central directory, EOCD, and offsets from ZIP. Returns ------- ZipData """ with open(apkfile, "rb") as fh: fh.seek(-count, os.SEEK_END) data = fh.read() pos = data.rfind(b"\x50\x4b\x05\x06") if pos == -1: raise ZipError("Expected end of central directory record (EOCD)") fh.seek(pos - len(data), os.SEEK_CUR) eocd_offset = fh.tell() fh.seek(16, os.SEEK_CUR) cd_offset = int.from_bytes(fh.read(4), "little") fh.seek(cd_offset) cd_and_eocd = fh.read() return ZipData(cd_offset, eocd_offset, cd_and_eocd) # FIXME: can we determine signed_sb_offset? def patch_v2_sig(extracted_v2_sig, output_apk): """Implant extracted v2/v3 signature into APK.""" signed_sb_offset, signed_sb = extracted_v2_sig data_out = zip_data(output_apk) if signed_sb_offset < data_out.cd_offset: raise APKSigningBlockError("APK Signing Block offset < central directory offset") padding = b"\x00" * (signed_sb_offset - data_out.cd_offset) offset = len(signed_sb) + len(padding) with open(output_apk, "r+b") as fh: fh.seek(data_out.cd_offset) fh.write(padding) fh.write(signed_sb) fh.write(data_out.cd_and_eocd) fh.seek(data_out.eocd_offset + offset + 16) fh.write(int.to_bytes(data_out.cd_offset + offset, 4, "little")) def patch_apk(extracted_meta, extracted_v2_sig, unsigned_apk, output_apk): """Patch extracted_meta + extracted_v2_sig. Patches extracted_meta + extracted_v2_sig (if not None) onto unsigned_apk and save as output_apk. """ date_time = copy_apk(unsigned_apk, output_apk) patch_meta(extracted_meta, output_apk, date_time=date_time) if extracted_v2_sig is not None: patch_v2_sig(extracted_v2_sig, output_apk) def do_extract(signed_apk, output_dir, v1_only=NO): """Extract signatures from signed_apk and save in output_dir. The v1_only parameter controls whether the absence of a v1 signature is considered an error or not: * use v1_only=NO (or v1_only=False) to only accept (v1+)v2/v3 signatures; * use v1_only=AUTO (or v1_only=None) to automatically detect v2/v3 signatures; * use v1_only=YES (or v1_only=True) to ignore any v2/v3 signatures. """ v1_only = noautoyes(v1_only) extracted_meta = tuple(extract_meta(signed_apk)) if len(extracted_meta) not in (len(META_EXT), 0): raise APKSigCopierError("Unexpected or missing metadata files in signed_apk") for info, data in extracted_meta: name = os.path.basename(info.filename) with open(os.path.join(output_dir, name), "wb") as fh: fh.write(data) if v1_only == YES: if not extracted_meta: raise APKSigCopierError("Expected v1 signature") return expected = v1_only == NO extracted_v2_sig = extract_v2_sig(signed_apk, expected=expected) if extracted_v2_sig is None: if not extracted_meta: raise APKSigCopierError("Expected v1 and/or v2/v3 signature, found neither") return signed_sb_offset, signed_sb = extracted_v2_sig with open(os.path.join(output_dir, SIGOFFSET), "w") as fh: fh.write(str(signed_sb_offset) + "\n") with open(os.path.join(output_dir, SIGBLOCK), "wb") as fh: fh.write(signed_sb) def do_patch(metadata_dir, unsigned_apk, output_apk, v1_only=NO): """Patch signatures from metadata_dir onto unsigned_apk and save as output_apk. The v1_only parameter controls whether the absence of a v1 signature is considered an error or not: * use v1_only=NO (or v1_only=False) to only accept (v1+)v2/v3 signatures; * use v1_only=AUTO (or v1_only=None) to automatically detect v2/v3 signatures; * use v1_only=YES (or v1_only=True) to ignore any v2/v3 signatures. """ v1_only = noautoyes(v1_only) extracted_meta = [] for pat in META_EXT: files = [fn for ext in pat.split("|") for fn in glob.glob(os.path.join(metadata_dir, "*." + ext))] if len(files) != 1: continue info = zipfile.ZipInfo("META-INF/" + os.path.basename(files[0])) with open(files[0], "rb") as fh: extracted_meta.append((info, fh.read())) if len(extracted_meta) not in (len(META_EXT), 0): raise APKSigCopierError("Unexpected or missing files in metadata_dir") if v1_only == YES: extracted_v2_sig = None else: sigoffset_file = os.path.join(metadata_dir, SIGOFFSET) sigblock_file = os.path.join(metadata_dir, SIGBLOCK) if v1_only == AUTO and not os.path.exists(sigblock_file): extracted_v2_sig = None else: with open(sigoffset_file, "r") as fh: signed_sb_offset = int(fh.read()) with open(sigblock_file, "rb") as fh: signed_sb = fh.read() extracted_v2_sig = signed_sb_offset, signed_sb if not extracted_meta and extracted_v2_sig is None: raise APKSigCopierError("Expected v1 and/or v2/v3 signature, found neither") patch_apk(extracted_meta, extracted_v2_sig, unsigned_apk, output_apk) def do_copy(signed_apk, unsigned_apk, output_apk, v1_only=NO): """Copy signatures from signed_apk onto unsigned_apk and save as output_apk. The v1_only parameter controls whether the absence of a v1 signature is considered an error or not: * use v1_only=NO (or v1_only=False) to only accept (v1+)v2/v3 signatures; * use v1_only=AUTO (or v1_only=None) to automatically detect v2/v3 signatures; * use v1_only=YES (or v1_only=True) to ignore any v2/v3 signatures. """ v1_only = noautoyes(v1_only) extracted_meta = extract_meta(signed_apk) if v1_only == YES: extracted_v2_sig = None else: extracted_v2_sig = extract_v2_sig(signed_apk, expected=v1_only == NO) patch_apk(extracted_meta, extracted_v2_sig, unsigned_apk, output_apk) # vim: set tw=80 sw=4 sts=4 et fdm=marker : fdroidserver-2.1/fdroidserver/asynchronousfilereader/0000755000175000017500000000000014205260750023233 5ustar hanshans00000000000000fdroidserver-2.1/fdroidserver/asynchronousfilereader/__init__.py0000644000175000017500000000251214203004041025327 0ustar hanshans00000000000000"""Simple thread based asynchronous file reader for Python. AsynchronousFileReader ====================== see https://github.com/soxofaan/asynchronousfilereader MIT License Copyright (c) 2014 Stefaan Lippens """ __version__ = '0.2.1' import threading try: # Python 2 from Queue import Queue except ImportError: # Python 3 from queue import Queue class AsynchronousFileReader(threading.Thread): """Helper class to implement asynchronous reading of a file in a separate thread. Pushes read lines on a queue to be consumed in another thread. """ def __init__(self, fd, queue=None, autostart=True): self._fd = fd if queue is None: queue = Queue() self.queue = queue threading.Thread.__init__(self) if autostart: self.start() def run(self): """Read lines and put them on the queue (the body of the tread).""" while True: line = self._fd.readline() if not line: break self.queue.put(line) def eof(self): """Check whether there is no more content to expect.""" return not self.is_alive() and self.queue.empty() def readlines(self): """Get currently available lines.""" while not self.queue.empty(): yield self.queue.get() fdroidserver-2.1/fdroidserver/btlog.py0000755000175000017500000002070414203004041020127 0ustar hanshans00000000000000#!/usr/bin/env python3 # # btlog.py - part of the FDroid server tools # Copyright (C) 2017, Hans-Christoph Steiner # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . # This is for creating a binary transparency log in a git repo for any # F-Droid repo accessible via HTTP. It is meant to run very often, # even once a minute in a cronjob, so it uses HEAD requests and the # HTTP ETag to check if the file has changed. HEAD requests should # not count against the download counts. This pattern of a HEAD then # a GET is what fdroidclient uses to avoid ETags being abused as # cookies. This also uses the same HTTP User Agent as the F-Droid # client app so its not easy for the server to distinguish this from # the F-Droid client. import collections import defusedxml.minidom import git import glob import os import json import logging import requests import shutil import tempfile import zipfile from argparse import ArgumentParser from . import _ from . import common from . import deploy from .exception import FDroidException options = None def make_binary_transparency_log( repodirs, btrepo='binary_transparency', url=None, commit_title='fdroid update' ): """Log the indexes in a standalone git repo to serve as a "binary transparency" log. References ---------- https://www.eff.org/deeplinks/2014/02/open-letter-to-tech-companies """ logging.info('Committing indexes to ' + btrepo) if os.path.exists(os.path.join(btrepo, '.git')): gitrepo = git.Repo(btrepo) else: if not os.path.exists(btrepo): os.mkdir(btrepo) gitrepo = git.Repo.init(btrepo) if not url: url = common.config['repo_url'].rstrip('/') with open(os.path.join(btrepo, 'README.md'), 'w') as fp: fp.write(""" # Binary Transparency Log for %s This is a log of the signed app index metadata. This is stored in a git repo, which serves as an imperfect append-only storage mechanism. People can then check that any file that they received from that F-Droid repository was a publicly released file. For more info on this idea: * https://wiki.mozilla.org/Security/Binary_Transparency """ % url[:url.rindex('/')]) # strip '/repo' gitrepo.index.add(['README.md', ]) gitrepo.index.commit('add README') for repodir in repodirs: cpdir = os.path.join(btrepo, repodir) if not os.path.exists(cpdir): os.mkdir(cpdir) for f in ('index.xml', 'index-v1.json'): repof = os.path.join(repodir, f) if not os.path.exists(repof): continue dest = os.path.join(cpdir, f) if f.endswith('.xml'): doc = defusedxml.minidom.parse(repof) output = doc.toprettyxml(encoding='utf-8') with open(dest, 'wb') as f: f.write(output) elif f.endswith('.json'): with open(repof) as fp: output = json.load(fp, object_pairs_hook=collections.OrderedDict) with open(dest, 'w') as fp: json.dump(output, fp, indent=2) gitrepo.index.add([repof]) for f in ('index.jar', 'index-v1.jar'): repof = os.path.join(repodir, f) if not os.path.exists(repof): continue dest = os.path.join(cpdir, f) jarin = zipfile.ZipFile(repof, 'r') jarout = zipfile.ZipFile(dest, 'w') for info in jarin.infolist(): if info.filename.startswith('META-INF/'): jarout.writestr(info, jarin.read(info.filename)) jarout.close() jarin.close() gitrepo.index.add([repof]) output_files = [] for root, dirs, files in os.walk(repodir): for f in files: output_files.append(os.path.relpath(os.path.join(root, f), repodir)) output = collections.OrderedDict() for f in sorted(output_files): repofile = os.path.join(repodir, f) stat = os.stat(repofile) output[f] = ( stat.st_size, stat.st_ctime_ns, stat.st_mtime_ns, stat.st_mode, stat.st_uid, stat.st_gid, ) fslogfile = os.path.join(cpdir, 'filesystemlog.json') with open(fslogfile, 'w') as fp: json.dump(output, fp, indent=2) gitrepo.index.add([os.path.join(repodir, 'filesystemlog.json')]) for f in glob.glob(os.path.join(cpdir, '*.HTTP-headers.json')): gitrepo.index.add([os.path.join(repodir, os.path.basename(f))]) gitrepo.index.commit(commit_title) def main(): global options parser = ArgumentParser() common.setup_global_opts(parser) parser.add_argument("--git-repo", default=os.path.join(os.getcwd(), 'binary_transparency'), help=_("Path to the git repo to use as the log")) parser.add_argument("-u", "--url", default='https://f-droid.org', help=_("The base URL for the repo to log (default: https://f-droid.org)")) parser.add_argument("--git-remote", default=None, help=_("Push the log to this git remote repository")) options = parser.parse_args() if options.verbose: logging.getLogger("requests").setLevel(logging.INFO) logging.getLogger("urllib3").setLevel(logging.INFO) else: logging.getLogger("requests").setLevel(logging.WARNING) logging.getLogger("urllib3").setLevel(logging.WARNING) if not os.path.exists(options.git_repo): raise FDroidException( '"%s" does not exist! Create it, or use --git-repo' % options.git_repo ) session = requests.Session() new_files = False repodirs = ('repo', 'archive') tempdirbase = tempfile.mkdtemp(prefix='.fdroid-btlog-') for repodir in repodirs: # TODO read HTTP headers for etag from git repo tempdir = os.path.join(tempdirbase, repodir) os.makedirs(tempdir, exist_ok=True) gitrepodir = os.path.join(options.git_repo, repodir) os.makedirs(gitrepodir, exist_ok=True) for f in ('index.jar', 'index.xml', 'index-v1.jar', 'index-v1.json'): dlfile = os.path.join(tempdir, f) dlurl = options.url + '/' + repodir + '/' + f http_headers_file = os.path.join(gitrepodir, f + '.HTTP-headers.json') headers = {'User-Agent': 'F-Droid 0.102.3'} etag = None if os.path.exists(http_headers_file): with open(http_headers_file) as fp: etag = json.load(fp)['ETag'] r = session.head(dlurl, headers=headers, allow_redirects=False) if r.status_code != 200: logging.debug( 'HTTP Response (' + str(r.status_code) + '), did not download ' + dlurl ) continue if etag and etag == r.headers.get('ETag'): logging.debug('ETag matches, did not download ' + dlurl) continue r = session.get(dlurl, headers=headers, allow_redirects=False) if r.status_code == 200: with open(dlfile, 'wb') as f: for chunk in r: f.write(chunk) dump = dict() for k, v in r.headers.items(): dump[k] = v with open(http_headers_file, 'w') as fp: json.dump(dump, fp, indent=2, sort_keys=True) new_files = True if new_files: os.chdir(tempdirbase) make_binary_transparency_log(repodirs, options.git_repo, options.url, 'fdroid btlog') if options.git_remote: deploy.push_binary_transparency(options.git_repo, options.git_remote) shutil.rmtree(tempdirbase, ignore_errors=True) if __name__ == "__main__": main() fdroidserver-2.1/fdroidserver/build.py0000644000175000017500000015350114203004041020116 0ustar hanshans00000000000000#!/usr/bin/env python3 # # build.py - part of the FDroid server tools # Copyright (C) 2010-2014, Ciaran Gultnieks, ciaran@ciarang.com # Copyright (C) 2013-2014 Daniel Martí # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . import os import shutil import glob import subprocess import posixpath import re import tarfile import threading import traceback import time import requests import tempfile import argparse from configparser import ConfigParser import logging from gettext import ngettext from . import _ from . import common from . import net from . import metadata from . import scanner from . import vmtools from .common import FDroidPopen from .exception import FDroidException, BuildException, VCSException try: import paramiko except ImportError: pass # Note that 'force' here also implies test mode. def build_server(app, build, vcs, build_dir, output_dir, log_dir, force): """Do a build on the builder vm. Parameters ---------- app app metadata dict build vcs version control system controller object build_dir local source-code checkout of app output_dir target folder for the build result force """ global buildserverid try: paramiko except NameError as e: raise BuildException("Paramiko is required to use the buildserver") from e if options.verbose: logging.getLogger("paramiko").setLevel(logging.INFO) else: logging.getLogger("paramiko").setLevel(logging.WARN) sshinfo = vmtools.get_clean_builder('builder') output = None try: if not buildserverid: try: buildserverid = subprocess.check_output(['vagrant', 'ssh', '-c', 'cat /home/vagrant/buildserverid'], cwd='builder').strip().decode() logging.debug(_('Fetched buildserverid from VM: {buildserverid}') .format(buildserverid=buildserverid)) except Exception as e: if type(buildserverid) is not str or not re.match('^[0-9a-f]{40}$', buildserverid): logging.info(subprocess.check_output(['vagrant', 'status'], cwd="builder")) raise FDroidException("Could not obtain buildserverid from buldserver VM. " "(stored inside the buildserver VM at '/home/vagrant/buildserverid') " "Please reset your buildserver, the setup VM is broken.") from e # Open SSH connection... logging.info("Connecting to virtual machine...") sshs = paramiko.SSHClient() sshs.set_missing_host_key_policy(paramiko.AutoAddPolicy()) sshs.connect(sshinfo['hostname'], username=sshinfo['user'], port=sshinfo['port'], timeout=300, look_for_keys=False, key_filename=sshinfo['idfile']) homedir = posixpath.join('/home', sshinfo['user']) # Get an SFTP connection... ftp = sshs.open_sftp() ftp.get_channel().settimeout(60) # Put all the necessary files in place... ftp.chdir(homedir) # Helper to copy the contents of a directory to the server... def send_dir(path): logging.debug("rsyncing " + path + " to " + ftp.getcwd()) # TODO this should move to `vagrant rsync` from >= v1.5 try: subprocess.check_output(['rsync', '--recursive', '--perms', '--links', '--quiet', '--rsh=' + 'ssh -o StrictHostKeyChecking=no' + ' -o UserKnownHostsFile=/dev/null' + ' -o LogLevel=FATAL' + ' -o IdentitiesOnly=yes' + ' -o PasswordAuthentication=no' + ' -p ' + str(sshinfo['port']) + ' -i ' + sshinfo['idfile'], path, sshinfo['user'] + "@" + sshinfo['hostname'] + ":" + ftp.getcwd()], stderr=subprocess.STDOUT) except subprocess.CalledProcessError as e: raise FDroidException(str(e), e.output.decode()) logging.info("Preparing server for build...") serverpath = os.path.abspath(os.path.dirname(__file__)) ftp.mkdir('fdroidserver') ftp.chdir('fdroidserver') ftp.put(os.path.join(serverpath, '..', 'fdroid'), 'fdroid') ftp.put(os.path.join(serverpath, '..', 'gradlew-fdroid'), 'gradlew-fdroid') ftp.chmod('fdroid', 0o755) # nosec B103 permissions are appropriate ftp.chmod('gradlew-fdroid', 0o755) # nosec B103 permissions are appropriate send_dir(os.path.join(serverpath)) ftp.chdir(homedir) ftp.put(os.path.join(serverpath, '..', 'buildserver', 'config.buildserver.yml'), 'config.yml') ftp.chmod('config.yml', 0o600) # Copy over the ID (head commit hash) of the fdroidserver in use... with open(os.path.join(os.getcwd(), 'tmp', 'fdroidserverid'), 'wb') as fp: fp.write(subprocess.check_output(['git', 'rev-parse', 'HEAD'], cwd=serverpath)) ftp.put('tmp/fdroidserverid', 'fdroidserverid') # Copy the metadata - just the file for this app... ftp.mkdir('metadata') ftp.mkdir('srclibs') ftp.chdir('metadata') ftp.put(app.metadatapath, os.path.basename(app.metadatapath)) # And patches if there are any... if os.path.exists(os.path.join('metadata', app.id)): send_dir(os.path.join('metadata', app.id)) ftp.chdir(homedir) # Create the build directory... ftp.mkdir('build') ftp.chdir('build') ftp.mkdir('extlib') ftp.mkdir('srclib') # Copy any extlibs that are required... if build.extlibs: ftp.chdir(posixpath.join(homedir, 'build', 'extlib')) for lib in build.extlibs: lib = lib.strip() libsrc = os.path.join('build/extlib', lib) if not os.path.exists(libsrc): raise BuildException("Missing extlib {0}".format(libsrc)) lp = lib.split('/') for d in lp[:-1]: if d not in ftp.listdir(): ftp.mkdir(d) ftp.chdir(d) ftp.put(libsrc, lp[-1]) for _ignored in lp[:-1]: ftp.chdir('..') # Copy any srclibs that are required... srclibpaths = [] if build.srclibs: for lib in build.srclibs: srclibpaths.append( common.getsrclib(lib, 'build/srclib', basepath=True, prepare=False)) # If one was used for the main source, add that too. basesrclib = vcs.getsrclib() if basesrclib: srclibpaths.append(basesrclib) for name, number, lib in srclibpaths: logging.info("Sending srclib '%s'" % lib) ftp.chdir(posixpath.join(homedir, 'build', 'srclib')) if not os.path.exists(lib): raise BuildException("Missing srclib directory '" + lib + "'") fv = '.fdroidvcs-' + name ftp.put(os.path.join('build/srclib', fv), fv) send_dir(lib) # Copy the metadata file too... ftp.chdir(posixpath.join(homedir, 'srclibs')) srclibsfile = os.path.join('srclibs', name + '.yml') if os.path.isfile(srclibsfile): ftp.put(srclibsfile, os.path.basename(srclibsfile)) else: raise BuildException(_('cannot find required srclibs: "{path}"') .format(path=srclibsfile)) # Copy the main app source code # (no need if it's a srclib) if (not basesrclib) and os.path.exists(build_dir): ftp.chdir(posixpath.join(homedir, 'build')) fv = '.fdroidvcs-' + app.id ftp.put(os.path.join('build', fv), fv) send_dir(build_dir) # Execute the build script... logging.info("Starting build...") chan = sshs.get_transport().open_session() chan.get_pty() cmdline = posixpath.join(homedir, 'fdroidserver', 'fdroid') cmdline += ' build --on-server' if force: cmdline += ' --force --test' if options.verbose: cmdline += ' --verbose' if options.skipscan: cmdline += ' --skip-scan' if options.notarball: cmdline += ' --no-tarball' cmdline += " %s:%s" % (app.id, build.versionCode) chan.exec_command('bash --login -c "' + cmdline + '"') # nosec B601 inputs are sanitized # Fetch build process output ... try: cmd_stdout = chan.makefile('rb', 1024) output = bytes() output += common.get_android_tools_version_log().encode() while not chan.exit_status_ready(): line = cmd_stdout.readline() if line: if options.verbose: logging.debug("buildserver > " + str(line, 'utf-8').rstrip()) output += line else: time.sleep(0.05) for line in cmd_stdout.readlines(): if options.verbose: logging.debug("buildserver > " + str(line, 'utf-8').rstrip()) output += line finally: cmd_stdout.close() # Check build process exit status ... logging.info("...getting exit status") returncode = chan.recv_exit_status() if returncode != 0: if timeout_event.is_set(): message = "Timeout exceeded! Build VM force-stopped for {0}:{1}" else: message = "Build.py failed on server for {0}:{1}" raise BuildException(message.format(app.id, build.versionName), str(output, 'utf-8')) # Retreive logs... toolsversion_log = common.get_toolsversion_logname(app, build) try: ftp.chdir(posixpath.join(homedir, log_dir)) ftp.get(toolsversion_log, os.path.join(log_dir, toolsversion_log)) logging.debug('retrieved %s', toolsversion_log) except Exception as e: logging.warning('could not get %s from builder vm: %s' % (toolsversion_log, e)) # Retrieve the built files... logging.info("Retrieving build output...") if force: ftp.chdir(posixpath.join(homedir, 'tmp')) else: ftp.chdir(posixpath.join(homedir, 'unsigned')) apkfile = common.get_release_filename(app, build) tarball = common.getsrcname(app, build) try: ftp.get(apkfile, os.path.join(output_dir, apkfile)) if not options.notarball: ftp.get(tarball, os.path.join(output_dir, tarball)) except Exception: raise BuildException( "Build failed for {0}:{1} - missing output files".format( app.id, build.versionName), str(output, 'utf-8')) ftp.close() finally: # Suspend the build server. vm = vmtools.get_build_vm('builder') logging.info('destroying buildserver after build') vm.destroy() # deploy logfile to repository web server if output: common.deploy_build_log_with_rsync(app.id, build.versionCode, output) else: logging.debug('skip publishing full build logs: ' 'no output present') def force_gradle_build_tools(build_dir, build_tools): for root, dirs, files in os.walk(build_dir): for filename in files: if not filename.endswith('.gradle'): continue path = os.path.join(root, filename) if not os.path.isfile(path): continue logging.debug("Forcing build-tools %s in %s" % (build_tools, path)) common.regsub_file(r"""(\s*)buildToolsVersion([\s=]+).*""", r"""\1buildToolsVersion\2'%s'""" % build_tools, path) def transform_first_char(string, method): """Use method() on the first character of string.""" if len(string) == 0: return string if len(string) == 1: return method(string) return method(string[0]) + string[1:] def add_failed_builds_entry(failed_builds, appid, build, entry): failed_builds.append([appid, int(build.versionCode), str(entry)]) def get_metadata_from_apk(app, build, apkfile): """Get the required metadata from the built APK. VersionName is allowed to be a blank string, i.e. '' """ appid, versionCode, versionName = common.get_apk_id(apkfile) native_code = common.get_native_code(apkfile) if build.buildjni and build.buildjni != ['no'] and not native_code: raise BuildException("Native code should have been built but none was packaged") if build.novcheck: versionCode = build.versionCode versionName = build.versionName if not versionCode or versionName is None: raise BuildException("Could not find version information in build in output") if not appid: raise BuildException("Could not find package ID in output") if appid != app.id: raise BuildException("Wrong package ID - build " + appid + " but expected " + app.id) return versionCode, versionName def build_local(app, build, vcs, build_dir, output_dir, log_dir, srclib_dir, extlib_dir, tmp_dir, force, onserver, refresh): """Do a build locally.""" ndk_path = build.ndk_path() if build.ndk or (build.buildjni and build.buildjni != ['no']): if not ndk_path: logging.warning("Android NDK version '%s' could not be found!" % build.ndk) logging.warning("Configured versions:") for k, v in config['ndk_paths'].items(): if k.endswith("_orig"): continue logging.warning(" %s: %s" % (k, v)) if onserver: common.auto_install_ndk(build) else: raise FDroidException() elif not os.path.isdir(ndk_path): logging.critical("Android NDK '%s' is not a directory!" % ndk_path) raise FDroidException() common.set_FDroidPopen_env(build) # create ..._toolsversion.log when running in builder vm if onserver: # before doing anything, run the sudo commands to setup the VM if build.sudo: logging.info("Running 'sudo' commands in %s" % os.getcwd()) p = FDroidPopen(['sudo', 'DEBIAN_FRONTEND=noninteractive', 'bash', '-x', '-c', build.sudo]) if p.returncode != 0: raise BuildException("Error running sudo command for %s:%s" % (app.id, build.versionName), p.output) p = FDroidPopen(['sudo', 'passwd', '--lock', 'root']) if p.returncode != 0: raise BuildException("Error locking root account for %s:%s" % (app.id, build.versionName), p.output) p = FDroidPopen(['sudo', 'SUDO_FORCE_REMOVE=yes', 'dpkg', '--purge', 'sudo']) if p.returncode != 0: raise BuildException("Error removing sudo for %s:%s" % (app.id, build.versionName), p.output) log_path = os.path.join(log_dir, common.get_toolsversion_logname(app, build)) with open(log_path, 'w') as f: f.write(common.get_android_tools_version_log()) else: if build.sudo: logging.warning('%s:%s runs this on the buildserver with sudo:\n\t%s\nThese commands were skipped because fdroid build is not running on a dedicated build server.' % (app.id, build.versionName, build.sudo)) # Prepare the source code... root_dir, srclibpaths = common.prepare_source(vcs, app, build, build_dir, srclib_dir, extlib_dir, onserver, refresh) # We need to clean via the build tool in case the binary dirs are # different from the default ones p = None gradletasks = [] bmethod = build.build_method() if bmethod == 'maven': logging.info("Cleaning Maven project...") cmd = [config['mvn3'], 'clean', '-Dandroid.sdk.path=' + config['sdk_path']] if '@' in build.maven: maven_dir = os.path.join(root_dir, build.maven.split('@', 1)[1]) maven_dir = os.path.normpath(maven_dir) else: maven_dir = root_dir p = FDroidPopen(cmd, cwd=maven_dir) elif bmethod == 'gradle': logging.info("Cleaning Gradle project...") if build.preassemble: gradletasks += build.preassemble flavours = build.gradle if flavours == ['yes']: flavours = [] flavours_cmd = ''.join([transform_first_char(flav, str.upper) for flav in flavours]) gradletasks += ['assemble' + flavours_cmd + 'Release'] cmd = [config['gradle']] if build.gradleprops: cmd += ['-P' + kv for kv in build.gradleprops] cmd += ['clean'] p = FDroidPopen(cmd, cwd=root_dir, envs={"GRADLE_VERSION_DIR": config['gradle_version_dir'], "CACHEDIR": config['cachedir']}) elif bmethod == 'buildozer': pass elif bmethod == 'ant': logging.info("Cleaning Ant project...") p = FDroidPopen(['ant', 'clean'], cwd=root_dir) if p is not None and p.returncode != 0: raise BuildException("Error cleaning %s:%s" % (app.id, build.versionName), p.output) for root, dirs, files in os.walk(build_dir): def del_dirs(dl): for d in dl: shutil.rmtree(os.path.join(root, d), ignore_errors=True) def del_files(fl): for f in fl: if f in files: os.remove(os.path.join(root, f)) if any(f in files for f in ['build.gradle', 'build.gradle.kts', 'settings.gradle', 'settings.gradle.kts']): # Even when running clean, gradle stores task/artifact caches in # .gradle/ as binary files. To avoid overcomplicating the scanner, # manually delete them, just like `gradle clean` should have removed # the build/* dirs. del_dirs([os.path.join('build', 'android-profile'), os.path.join('build', 'generated'), os.path.join('build', 'intermediates'), os.path.join('build', 'outputs'), os.path.join('build', 'reports'), os.path.join('build', 'tmp'), os.path.join('buildSrc', 'build'), '.gradle']) del_files(['gradlew', 'gradlew.bat']) if 'pom.xml' in files: del_dirs(['target']) if any(f in files for f in ['ant.properties', 'project.properties', 'build.xml']): del_dirs(['bin', 'gen']) if 'jni' in dirs: del_dirs(['obj']) if options.skipscan: if build.scandelete: raise BuildException("Refusing to skip source scan since scandelete is present") else: # Scan before building... logging.info("Scanning source for common problems...") scanner.options = options # pass verbose through count = scanner.scan_source(build_dir, build) if count > 0: if force: logging.warning(ngettext('Scanner found {} problem', 'Scanner found {} problems', count).format(count)) else: raise BuildException(ngettext( "Can't build due to {} error while scanning", "Can't build due to {} errors while scanning", count).format(count)) if not options.notarball: # Build the source tarball right before we build the release... logging.info("Creating source tarball...") tarname = common.getsrcname(app, build) tarball = tarfile.open(os.path.join(tmp_dir, tarname), "w:gz") def tarexc(t): return None if any(t.name.endswith(s) for s in ['.svn', '.git', '.hg', '.bzr']) else t tarball.add(build_dir, tarname, filter=tarexc) tarball.close() # Run a build command if one is required... if build.build: logging.info("Running 'build' commands in %s" % root_dir) cmd = common.replace_config_vars(build.build, build) # Substitute source library paths into commands... for name, number, libpath in srclibpaths: cmd = cmd.replace('$$' + name + '$$', os.path.join(os.getcwd(), libpath)) p = FDroidPopen(['bash', '-x', '-c', cmd], cwd=root_dir) if p.returncode != 0: raise BuildException("Error running build command for %s:%s" % (app.id, build.versionName), p.output) # Build native stuff if required... if build.buildjni and build.buildjni != ['no']: logging.info("Building the native code") jni_components = build.buildjni if jni_components == ['yes']: jni_components = [''] cmd = [os.path.join(ndk_path, "ndk-build"), "-j1"] for d in jni_components: if d: logging.info("Building native code in '%s'" % d) else: logging.info("Building native code in the main project") manifest = os.path.join(root_dir, d, 'AndroidManifest.xml') if os.path.exists(manifest): # Read and write the whole AM.xml to fix newlines and avoid # the ndk r8c or later 'wordlist' errors. The outcome of this # under gnu/linux is the same as when using tools like # dos2unix, but the native python way is faster and will # work in non-unix systems. manifest_text = open(manifest, 'U').read() open(manifest, 'w').write(manifest_text) # In case the AM.xml read was big, free the memory del manifest_text p = FDroidPopen(cmd, cwd=os.path.join(root_dir, d)) if p.returncode != 0: raise BuildException("NDK build failed for %s:%s" % (app.id, build.versionName), p.output) p = None # Build the release... if bmethod == 'maven': logging.info("Building Maven project...") if '@' in build.maven: maven_dir = os.path.join(root_dir, build.maven.split('@', 1)[1]) else: maven_dir = root_dir mvncmd = [config['mvn3'], '-Dandroid.sdk.path=' + config['sdk_path'], '-Dmaven.jar.sign.skip=true', '-Dmaven.test.skip=true', '-Dandroid.sign.debug=false', '-Dandroid.release=true', 'package'] if build.target: target = build.target.split('-')[1] common.regsub_file(r'[0-9]*', r'%s' % target, os.path.join(root_dir, 'pom.xml')) if '@' in build.maven: common.regsub_file(r'[0-9]*', r'%s' % target, os.path.join(maven_dir, 'pom.xml')) p = FDroidPopen(mvncmd, cwd=maven_dir) bindir = os.path.join(root_dir, 'target') elif bmethod == 'buildozer': logging.info("Building Kivy project using buildozer...") # parse buildozer.spez spec = os.path.join(root_dir, 'buildozer.spec') if not os.path.exists(spec): raise BuildException("Expected to find buildozer-compatible spec at {0}" .format(spec)) defaults = {'orientation': 'landscape', 'icon': '', 'permissions': '', 'android.api': "19"} bconfig = ConfigParser(defaults, allow_no_value=True) bconfig.read(spec) # update spec with sdk and ndk locations to prevent buildozer from # downloading. loc_ndk = common.env['ANDROID_NDK'] loc_sdk = common.env['ANDROID_SDK'] if loc_ndk == '$ANDROID_NDK': loc_ndk = loc_sdk + '/ndk-bundle' bc_ndk = None bc_sdk = None try: bc_ndk = bconfig.get('app', 'android.sdk_path') except Exception: pass try: bc_sdk = bconfig.get('app', 'android.ndk_path') except Exception: pass if bc_sdk is None: bconfig.set('app', 'android.sdk_path', loc_sdk) if bc_ndk is None: bconfig.set('app', 'android.ndk_path', loc_ndk) fspec = open(spec, 'w') bconfig.write(fspec) fspec.close() logging.info("sdk_path = %s" % loc_sdk) logging.info("ndk_path = %s" % loc_ndk) p = None # execute buildozer cmd = ['buildozer', 'android', 'release'] try: p = FDroidPopen(cmd, cwd=root_dir) except Exception: pass # buidozer not installed ? clone repo and run if (p is None or p.returncode != 0): cmd = ['git', 'clone', 'https://github.com/kivy/buildozer.git'] p = subprocess.Popen(cmd, cwd=root_dir, shell=False) p.wait() if p.returncode != 0: raise BuildException("Distribute build failed") cmd = ['python', 'buildozer/buildozer/scripts/client.py', 'android', 'release'] p = FDroidPopen(cmd, cwd=root_dir) # expected to fail. # Signing will fail if not set by environnment vars (cf. p4a docs). # But the unsigned APK will be ok. p.returncode = 0 elif bmethod == 'gradle': logging.info("Building Gradle project...") cmd = [config['gradle']] if build.gradleprops: cmd += ['-P' + kv for kv in build.gradleprops] cmd += gradletasks p = FDroidPopen(cmd, cwd=root_dir, envs={"GRADLE_VERSION_DIR": config['gradle_version_dir'], "CACHEDIR": config['cachedir']}) elif bmethod == 'ant': logging.info("Building Ant project...") cmd = ['ant'] if build.antcommands: cmd += build.antcommands else: cmd += ['release'] p = FDroidPopen(cmd, cwd=root_dir) bindir = os.path.join(root_dir, 'bin') if os.path.isdir(os.path.join(build_dir, '.git')): import git commit_id = common.get_head_commit_id(git.repo.Repo(build_dir)) else: commit_id = build.commit if p is not None and p.returncode != 0: raise BuildException("Build failed for %s:%s@%s" % (app.id, build.versionName, commit_id), p.output) logging.info("Successfully built version {versionName} of {appid} from {commit_id}" .format(versionName=build.versionName, appid=app.id, commit_id=commit_id)) omethod = build.output_method() if omethod == 'maven': stdout_apk = '\n'.join([ line for line in p.output.splitlines() if any( a in line for a in ('.apk', '.ap_', '.jar'))]) m = re.match(r".*^\[INFO\] .*apkbuilder.*/([^/]*)\.apk", stdout_apk, re.S | re.M) if not m: m = re.match(r".*^\[INFO\] Creating additional unsigned apk file .*/([^/]+)\.apk[^l]", stdout_apk, re.S | re.M) if not m: m = re.match(r'.*^\[INFO\] [^$]*aapt \[package,[^$]*' + bindir + r'/([^/]+)\.ap[_k][,\]]', stdout_apk, re.S | re.M) if not m: m = re.match(r".*^\[INFO\] Building jar: .*/" + bindir + r"/(.+)\.jar", stdout_apk, re.S | re.M) if not m: raise BuildException('Failed to find output') src = m.group(1) src = os.path.join(bindir, src) + '.apk' elif omethod == 'buildozer': src = None for apks_dir in [ os.path.join(root_dir, '.buildozer', 'android', 'platform', 'build', 'dists', bconfig.get('app', 'title'), 'bin'), ]: for apkglob in ['*-release-unsigned.apk', '*-unsigned.apk', '*.apk']: apks = glob.glob(os.path.join(apks_dir, apkglob)) if len(apks) > 1: raise BuildException('More than one resulting apks found in %s' % apks_dir, '\n'.join(apks)) if len(apks) == 1: src = apks[0] break if src is not None: break if src is None: raise BuildException('Failed to find any output apks') elif omethod == 'gradle': src = None apk_dirs = [ # gradle plugin >= 3.0 os.path.join(root_dir, 'build', 'outputs', 'apk', 'release'), # gradle plugin < 3.0 and >= 0.11 os.path.join(root_dir, 'build', 'outputs', 'apk'), # really old path os.path.join(root_dir, 'build', 'apk'), ] # If we build with gradle flavours with gradle plugin >= 3.0 the APK will be in # a subdirectory corresponding to the flavour command used, but with different # capitalization. if flavours_cmd: apk_dirs.append(os.path.join(root_dir, 'build', 'outputs', 'apk', transform_first_char(flavours_cmd, str.lower), 'release')) for apks_dir in apk_dirs: for apkglob in ['*-release-unsigned.apk', '*-unsigned.apk', '*.apk']: apks = glob.glob(os.path.join(apks_dir, apkglob)) if len(apks) > 1: raise BuildException('More than one resulting apks found in %s' % apks_dir, '\n'.join(apks)) if len(apks) == 1: src = apks[0] break if src is not None: break if src is None: raise BuildException('Failed to find any output apks') elif omethod == 'ant': stdout_apk = '\n'.join([ line for line in p.output.splitlines() if '.apk' in line]) src = re.match(r".*^.*Creating (.+) for release.*$.*", stdout_apk, re.S | re.M).group(1) src = os.path.join(bindir, src) elif omethod == 'raw': output_path = common.replace_build_vars(build.output, build) globpath = os.path.join(root_dir, output_path) apks = glob.glob(globpath) if len(apks) > 1: raise BuildException('Multiple apks match %s' % globpath, '\n'.join(apks)) if len(apks) < 1: raise BuildException('No apks match %s' % globpath) src = os.path.normpath(apks[0]) # Make sure it's not debuggable... if common.is_apk_and_debuggable(src): raise BuildException("APK is debuggable") # By way of a sanity check, make sure the version and version # code in our new APK match what we expect... logging.debug("Checking " + src) if not os.path.exists(src): raise BuildException("Unsigned APK is not at expected location of " + src) if common.get_file_extension(src) == 'apk': vercode, version = get_metadata_from_apk(app, build, src) if version != build.versionName or vercode != build.versionCode: raise BuildException(("Unexpected version/version code in output;" " APK: '%s' / '%s', " " Expected: '%s' / '%s'") % (version, str(vercode), build.versionName, str(build.versionCode))) if (options.scan_binary or config.get('scan_binary')) and not options.skipscan: if scanner.scan_binary(src): raise BuildException("Found blocklisted packages in final apk!") # Copy the unsigned APK to our destination directory for further # processing (by publish.py)... dest = os.path.join( output_dir, common.get_release_filename( app, build, common.get_file_extension(src) ) ) shutil.copyfile(src, dest) # Move the source tarball into the output directory... if output_dir != tmp_dir and not options.notarball: shutil.move(os.path.join(tmp_dir, tarname), os.path.join(output_dir, tarname)) def trybuild(app, build, build_dir, output_dir, log_dir, also_check_dir, srclib_dir, extlib_dir, tmp_dir, repo_dir, vcs, test, server, force, onserver, refresh): """Build a particular version of an application, if it needs building. Parameters ---------- output_dir The directory where the build output will go. Usually this is the 'unsigned' directory. repo_dir The repo directory - used for checking if the build is necessary. also_check_dir An additional location for checking if the build is necessary (usually the archive repo) test True if building in test mode, in which case the build will always happen, even if the output already exists. In test mode, the output directory should be a temporary location, not any of the real ones. Returns ------- Boolean True if the build was done, False if it wasn't necessary. """ dest_file = common.get_release_filename(app, build) dest = os.path.join(output_dir, dest_file) dest_repo = os.path.join(repo_dir, dest_file) if not test: if os.path.exists(dest) or os.path.exists(dest_repo): return False if also_check_dir: dest_also = os.path.join(also_check_dir, dest_file) if os.path.exists(dest_also): return False if build.disable and not options.force: return False logging.info("Building version %s (%s) of %s" % ( build.versionName, build.versionCode, app.id)) if server: # When using server mode, still keep a local cache of the repo, by # grabbing the source now. vcs.gotorevision(build.commit, refresh) # Initialise submodules if required if build.submodules: vcs.initsubmodules() build_server(app, build, vcs, build_dir, output_dir, log_dir, force) else: build_local(app, build, vcs, build_dir, output_dir, log_dir, srclib_dir, extlib_dir, tmp_dir, force, onserver, refresh) return True def force_halt_build(timeout): """Halt the currently running Vagrant VM, to be called from a Timer.""" logging.error(_('Force halting build after {0} sec timeout!').format(timeout)) timeout_event.set() vm = vmtools.get_build_vm('builder') vm.halt() def parse_commandline(): """Parse the command line. Returns ------- options parser """ parser = argparse.ArgumentParser(usage="%(prog)s [options] [APPID[:VERCODE] [APPID[:VERCODE] ...]]") common.setup_global_opts(parser) parser.add_argument("appid", nargs='*', help=_("application ID with optional versionCode in the form APPID[:VERCODE]")) parser.add_argument("-l", "--latest", action="store_true", default=False, help=_("Build only the latest version of each package")) parser.add_argument("-s", "--stop", action="store_true", default=False, help=_("Make the build stop on exceptions")) parser.add_argument("-t", "--test", action="store_true", default=False, help=_("Test mode - put output in the tmp directory only, and always build, even if the output already exists.")) parser.add_argument("--server", action="store_true", default=False, help=_("Use build server")) parser.add_argument("--reset-server", action="store_true", default=False, help=_("Reset and create a brand new build server, even if the existing one appears to be ok.")) # this option is internal API for telling fdroid that # it's running inside a buildserver vm. parser.add_argument("--on-server", dest="onserver", action="store_true", default=False, help=argparse.SUPPRESS) parser.add_argument("--skip-scan", dest="skipscan", action="store_true", default=False, help=_("Skip scanning the source code for binaries and other problems")) parser.add_argument("--scan-binary", action="store_true", default=False, help=_("Scan the resulting APK(s) for known non-free classes.")) parser.add_argument("--no-tarball", dest="notarball", action="store_true", default=False, help=_("Don't create a source tarball, useful when testing a build")) parser.add_argument("--no-refresh", dest="refresh", action="store_false", default=True, help=_("Don't refresh the repository, useful when testing a build with no internet connection")) parser.add_argument("-f", "--force", action="store_true", default=False, help=_("Force build of disabled apps, and carries on regardless of scan problems. Only allowed in test mode.")) parser.add_argument("-a", "--all", action="store_true", default=False, help=_("Build all applications available")) parser.add_argument("-w", "--wiki", default=False, action="store_true", help=argparse.SUPPRESS) metadata.add_metadata_arguments(parser) options = parser.parse_args() metadata.warnings_action = options.W # Force --stop with --on-server to get correct exit code if options.onserver: options.stop = True if options.force and not options.test: parser.error("option %s: Force is only allowed in test mode" % "force") return options, parser options = None config = None buildserverid = None fdroidserverid = None start_timestamp = time.gmtime() status_output = None timeout_event = threading.Event() def main(): global options, config, buildserverid, fdroidserverid options, parser = parse_commandline() # The defaults for .fdroid.* metadata that is included in a git repo are # different than for the standard metadata/ layout because expectations # are different. In this case, the most common user will be the app # developer working on the latest update of the app on their own machine. local_metadata_files = common.get_local_metadata_files() if len(local_metadata_files) == 1: # there is local metadata in an app's source config = dict(common.default_config) # `fdroid build` should build only the latest version by default since # most of the time the user will be building the most recent update if not options.all: options.latest = True elif len(local_metadata_files) > 1: raise FDroidException("Only one local metadata file allowed! Found: " + " ".join(local_metadata_files)) else: if not os.path.isdir('metadata') and len(local_metadata_files) == 0: raise FDroidException("No app metadata found, nothing to process!") if not options.appid and not options.all: parser.error("option %s: If you really want to build all the apps, use --all" % "all") config = common.read_config(options) if config['build_server_always']: options.server = True if options.reset_server and not options.server: parser.error("option %s: Using --reset-server without --server makes no sense" % "reset-server") if options.onserver or not options.server: for d in ['build-tools', 'platform-tools', 'tools']: if not os.path.isdir(os.path.join(config['sdk_path'], d)): raise FDroidException(_("Android SDK '{path}' does not have '{dirname}' installed!") .format(path=config['sdk_path'], dirname=d)) log_dir = 'logs' if not os.path.isdir(log_dir): logging.info("Creating log directory") os.makedirs(log_dir) tmp_dir = 'tmp' if not os.path.isdir(tmp_dir): logging.info("Creating temporary directory") os.makedirs(tmp_dir) if options.test: output_dir = tmp_dir else: output_dir = 'unsigned' if not os.path.isdir(output_dir): logging.info("Creating output directory") os.makedirs(output_dir) binaries_dir = os.path.join(output_dir, 'binaries') if config['archive_older'] != 0: also_check_dir = 'archive' else: also_check_dir = None if options.onserver: status_output = dict() # HACK dummy placeholder else: status_output = common.setup_status_output(start_timestamp) repo_dir = 'repo' build_dir = 'build' if not os.path.isdir(build_dir): logging.info("Creating build directory") os.makedirs(build_dir) srclib_dir = os.path.join(build_dir, 'srclib') extlib_dir = os.path.join(build_dir, 'extlib') # Read all app and srclib metadata pkgs = common.read_pkg_args(options.appid, True) allapps = metadata.read_metadata(pkgs, sort_by_time=True) apps = common.read_app_args(options.appid, allapps, True) for appid, app in list(apps.items()): if (app.get('Disabled') and not options.force) or not app.get('RepoType') or not app.get('Builds', []): del apps[appid] if not apps: raise FDroidException("No apps to process.") # make sure enough open files are allowed to process everything try: import resource # not available on Windows soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE) if len(apps) > soft: try: soft = len(apps) * 2 if soft > hard: soft = hard resource.setrlimit(resource.RLIMIT_NOFILE, (soft, hard)) logging.debug(_('Set open file limit to {integer}') .format(integer=soft)) except (OSError, ValueError) as e: logging.warning(_('Setting open file limit failed: ') + str(e)) except ImportError: pass if options.latest: for app in apps.values(): for build in reversed(app.get('Builds', [])): if build.disable and not options.force: continue app['Builds'] = [build] break # Build applications... failed_builds = [] build_succeeded = [] build_succeeded_ids = [] status_output['failedBuilds'] = failed_builds status_output['successfulBuilds'] = build_succeeded status_output['successfulBuildIds'] = build_succeeded_ids # Only build for 72 hours, then stop gracefully. endtime = time.time() + 72 * 60 * 60 max_build_time_reached = False for appid, app in apps.items(): first = True for build in app.get('Builds', []): if time.time() > endtime: max_build_time_reached = True break # Enable watchdog timer (2 hours by default). if build.timeout is None: timeout = 7200 else: timeout = int(build.timeout) if options.server and timeout > 0: logging.debug(_('Setting {0} sec timeout for this build').format(timeout)) timer = threading.Timer(timeout, force_halt_build, [timeout]) timeout_event.clear() timer.start() else: timer = None tools_version_log = '' if not options.onserver: tools_version_log = common.get_android_tools_version_log() common.write_running_status_json(status_output) try: # For the first build of a particular app, we need to set up # the source repo. We can reuse it on subsequent builds, if # there are any. if first: vcs, build_dir = common.setup_vcs(app) first = False logging.info("Using %s" % vcs.clientversion()) logging.debug("Checking " + build.versionName) if trybuild(app, build, build_dir, output_dir, log_dir, also_check_dir, srclib_dir, extlib_dir, tmp_dir, repo_dir, vcs, options.test, options.server, options.force, options.onserver, options.refresh): toolslog = os.path.join(log_dir, common.get_toolsversion_logname(app, build)) if not options.onserver and os.path.exists(toolslog): with open(toolslog, 'r') as f: tools_version_log = ''.join(f.readlines()) os.remove(toolslog) if app.Binaries is not None: # This is an app where we build from source, and # verify the APK contents against a developer's # binary. We get that binary now, and save it # alongside our built one in the 'unsigend' # directory. if not os.path.isdir(binaries_dir): os.makedirs(binaries_dir) logging.info("Created directory for storing " "developer supplied reference " "binaries: '{path}'" .format(path=binaries_dir)) url = app.Binaries url = url.replace('%v', build.versionName) url = url.replace('%c', str(build.versionCode)) logging.info("...retrieving " + url) of = re.sub(r'\.apk$', '.binary.apk', common.get_release_filename(app, build)) of = os.path.join(binaries_dir, of) try: net.download_file(url, local_filename=of) except requests.exceptions.HTTPError as e: raise FDroidException( 'Downloading Binaries from %s failed.' % url) from e # Now we check whether the build can be verified to # match the supplied binary or not. Should the # comparison fail, we mark this build as a failure # and remove everything from the unsigend folder. with tempfile.TemporaryDirectory() as tmpdir: unsigned_apk = \ common.get_release_filename(app, build) unsigned_apk = \ os.path.join(output_dir, unsigned_apk) compare_result = \ common.verify_apks(of, unsigned_apk, tmpdir) if compare_result: if options.test: logging.warning(_('Keeping failed build "{apkfilename}"') .format(apkfilename=unsigned_apk)) else: logging.debug('removing %s', unsigned_apk) os.remove(unsigned_apk) logging.debug('removing %s', of) os.remove(of) compare_result = compare_result.split('\n') line_count = len(compare_result) compare_result = compare_result[:299] if line_count > len(compare_result): line_difference = \ line_count - len(compare_result) compare_result.append('%d more lines ...' % line_difference) compare_result = '\n'.join(compare_result) raise FDroidException('compared built binary ' 'to supplied reference ' 'binary but failed', compare_result) else: logging.info('compared built binary to ' 'supplied reference binary ' 'successfully') build_succeeded.append(app) build_succeeded_ids.append([app['id'], build.versionCode]) except VCSException as vcse: reason = str(vcse).split('\n', 1)[0] if options.verbose else str(vcse) logging.error("VCS error while building app %s: %s" % ( appid, reason)) if options.stop: logging.debug("Error encoutered, stopping by user request.") common.force_exit(1) add_failed_builds_entry(failed_builds, appid, build, vcse) common.deploy_build_log_with_rsync( appid, build.versionCode, "".join(traceback.format_exc()) ) except FDroidException as e: tstamp = time.strftime("%Y-%m-%d %H:%M:%SZ", time.gmtime()) with open(os.path.join(log_dir, appid + '.log'), 'a+') as f: f.write('\n\n============================================================\n') f.write('versionCode: %s\nversionName: %s\ncommit: %s\n' % (build.versionCode, build.versionName, build.commit)) f.write('Build completed at ' + tstamp + '\n') f.write('\n' + tools_version_log + '\n') f.write(str(e)) logging.error("Could not build app %s: %s" % (appid, e)) if options.stop: logging.debug("Error encoutered, stopping by user request.") common.force_exit(1) add_failed_builds_entry(failed_builds, appid, build, e) common.deploy_build_log_with_rsync( appid, build.versionCode, "".join(traceback.format_exc()) ) except Exception as e: logging.error("Could not build app %s due to unknown error: %s" % ( appid, traceback.format_exc())) if options.stop: logging.debug("Error encoutered, stopping by user request.") common.force_exit(1) add_failed_builds_entry(failed_builds, appid, build, e) common.deploy_build_log_with_rsync( appid, build.versionCode, "".join(traceback.format_exc()) ) if timer: timer.cancel() # kill the watchdog timer if max_build_time_reached: status_output['maxBuildTimeReached'] = True logging.info("Stopping after global build timeout...") break for app in build_succeeded: logging.info("success: %s" % (app.id)) if not options.verbose: for fb in failed_builds: logging.info('Build for app {}:{} failed:\n{}'.format(*fb)) logging.info(_("Finished")) if len(build_succeeded) > 0: logging.info(ngettext("{} build succeeded", "{} builds succeeded", len(build_succeeded)).format(len(build_succeeded))) if len(failed_builds) > 0: logging.info(ngettext("{} build failed", "{} builds failed", len(failed_builds)).format(len(failed_builds))) if options.server: if os.cpu_count(): status_output['hostOsCpuCount'] = os.cpu_count() if os.path.isfile('/proc/meminfo') and os.access('/proc/meminfo', os.R_OK): with open('/proc/meminfo') as fp: for line in fp: m = re.search(r'MemTotal:\s*([0-9].*)', line) if m: status_output['hostProcMeminfoMemTotal'] = m.group(1) break fdroid_path = os.path.realpath(os.path.join(os.path.dirname(__file__), '..')) buildserver_config = os.path.join(fdroid_path, 'makebuildserver.config.py') if os.path.isfile(buildserver_config) and os.access(buildserver_config, os.R_OK): with open(buildserver_config) as configfile: for line in configfile: m = re.search(r'cpus\s*=\s*([0-9].*)', line) if m: status_output['guestVagrantVmCpus'] = m.group(1) m = re.search(r'memory\s*=\s*([0-9].*)', line) if m: status_output['guestVagrantVmMemory'] = m.group(1) if buildserverid: status_output['buildserver'] = {'commitId': buildserverid} if not options.onserver: common.write_status_json(status_output) # hack to ensure this exits, even is some threads are still running common.force_exit() if __name__ == "__main__": main() fdroidserver-2.1/fdroidserver/checkupdates.py0000644000175000017500000005676314203004041021476 0ustar hanshans00000000000000#!/usr/bin/env python3 # # checkupdates.py - part of the FDroid server tools # Copyright (C) 2010-2015, Ciaran Gultnieks, ciaran@ciarang.com # Copyright (C) 2013-2014 Daniel Martí # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . import os import re import urllib.request import urllib.error import time import subprocess import sys from argparse import ArgumentParser import traceback import html from distutils.version import LooseVersion import logging import copy import urllib.parse from pathlib import Path from . import _ from . import common from . import metadata from . import net from .exception import VCSException, NoSubmodulesException, FDroidException, MetaDataException # Check for a new version by looking at a document retrieved via HTTP. # The app's Update Check Data field is used to provide the information # required. def check_http(app): if not app.UpdateCheckData: raise FDroidException('Missing Update Check Data') urlcode, codeex, urlver, verex = app.UpdateCheckData.split('|') parsed = urllib.parse.urlparse(urlcode) if not parsed.netloc or not parsed.scheme or parsed.scheme != 'https': raise FDroidException(_('UpdateCheckData has invalid URL: {url}').format(url=urlcode)) if urlver != '.': parsed = urllib.parse.urlparse(urlver) if not parsed.netloc or not parsed.scheme or parsed.scheme != 'https': raise FDroidException(_('UpdateCheckData has invalid URL: {url}').format(url=urlcode)) logging.debug("...requesting {0}".format(urlcode)) req = urllib.request.Request(urlcode, None, headers=net.HEADERS) resp = urllib.request.urlopen(req, None, 20) # nosec B310 scheme is filtered above page = resp.read().decode('utf-8') m = re.search(codeex, page) if not m: raise FDroidException("No RE match for version code") vercode = m.group(1).strip() if urlver != '.': logging.debug("...requesting {0}".format(urlver)) req = urllib.request.Request(urlver, None) resp = urllib.request.urlopen(req, None, 20) # nosec B310 scheme is filtered above page = resp.read().decode('utf-8') m = re.search(verex, page) if not m: raise FDroidException("No RE match for version") version = m.group(1) if app.UpdateCheckIgnore and re.search(app.UpdateCheckIgnore, version): logging.info("Version {version} for {appid} is ignored".format(version=version, appid=app.id)) return (None, None) return (version, vercode) def check_tags(app, pattern): """Check for a new version by looking at the tags in the source repo. Whether this can be used reliably or not depends on the development procedures used by the project's developers. Use it with caution, because it's inappropriate for many projects. """ if app.RepoType == 'srclib': build_dir = Path('build/srclib') / app.Repo repotype = common.getsrclibvcs(app.Repo) else: build_dir = Path('build') / app.id repotype = app.RepoType if repotype not in ('git', 'git-svn', 'hg', 'bzr'): raise MetaDataException(_('Tags update mode only works for git, hg, bzr and git-svn repositories currently')) if repotype == 'git-svn' and ';' not in app.Repo: raise MetaDataException(_('Tags update mode used in git-svn, but the repo was not set up with tags')) # Set up vcs interface and make sure we have the latest code... vcs = common.getvcs(app.RepoType, app.Repo, build_dir) vcs.gotorevision(None) last_build = app.get_last_build() try_init_submodules(app, last_build, vcs) htag = None hver = None hcode = "0" tags = [] if repotype == 'git': tags = vcs.latesttags() else: tags = vcs.gettags() if not tags: raise FDroidException(_('No tags found')) logging.debug("All tags: " + ','.join(tags)) if pattern: pat = re.compile(pattern) tags = [tag for tag in tags if pat.match(tag)] if not tags: raise FDroidException(_('No matching tags found')) logging.debug("Matching tags: " + ','.join(tags)) if len(tags) > 5 and repotype == 'git': tags = tags[:5] logging.debug("Latest tags: " + ','.join(tags)) for tag in tags: logging.debug("Check tag: '{0}'".format(tag)) vcs.gotorevision(tag) if app.UpdateCheckData: filecode, codeex, filever, verex = app.UpdateCheckData.split('|') if filecode: filecode = build_dir / filecode if not filecode.is_file(): logging.debug("UpdateCheckData file {0} not found in tag {1}".format(filecode, tag)) continue filecontent = filecode.read_text() else: filecontent = tag vercode = tag if codeex: m = re.search(codeex, filecontent) if not m: continue vercode = m.group(1).strip() if filever: if filever != '.': filever = build_dir / filever if filever.is_file(): filecontent = filever.read_text() else: logging.debug("UpdateCheckData file {0} not found in tag {1}".format(filever, tag)) else: filecontent = tag version = tag if verex: m = re.search(verex, filecontent) if m: version = m.group(1) logging.debug("UpdateCheckData found version {0} ({1})" .format(version, vercode)) i_vercode = common.version_code_string_to_int(vercode) if i_vercode > common.version_code_string_to_int(hcode): htag = tag hcode = str(i_vercode) hver = version else: for subdir in possible_subdirs(app): root_dir = build_dir / subdir paths = common.manifest_paths(root_dir, last_build.gradle) version, vercode, _package = common.parse_androidmanifests(paths, app) if version == 'Unknown' or version == 'Ignore': version = tag if vercode: logging.debug("Manifest exists in subdir '{0}'. Found version {1} ({2})" .format(subdir, version, vercode)) i_vercode = common.version_code_string_to_int(vercode) if i_vercode > common.version_code_string_to_int(hcode): htag = tag hcode = str(i_vercode) hver = version if hver: if htag != tags[0]: logging.warning( "{appid}: latest tag {tag} does not contain highest version {version}".format( appid=app.id, tag=tags[0], version=hver ) ) try: commit = vcs.getref(htag) if commit: return (hver, hcode, commit) except VCSException: pass return (hver, hcode, htag) raise FDroidException(_("Couldn't find any version information")) def check_repomanifest(app, branch=None): """Check for a new version by looking at the AndroidManifest.xml at the HEAD of the source repo. Whether this can be used reliably or not depends on the development procedures used by the project's developers. Use it with caution, because it's inappropriate for many projects. """ if app.RepoType == 'srclib': build_dir = Path('build/srclib') / app.Repo repotype = common.getsrclibvcs(app.Repo) else: build_dir = Path('build') / app.id repotype = app.RepoType # Set up vcs interface and make sure we have the latest code... vcs = common.getvcs(app.RepoType, app.Repo, build_dir) if repotype == 'git': if branch: branch = 'origin/' + branch vcs.gotorevision(branch) elif repotype == 'git-svn': vcs.gotorevision(branch) elif repotype == 'hg': vcs.gotorevision(branch) elif repotype == 'bzr': vcs.gotorevision(None) last_build = metadata.Build() if app.get('Builds', []): last_build = app.get('Builds', [])[-1] try_init_submodules(app, last_build, vcs) hpak = None hver = None hcode = "0" for subdir in possible_subdirs(app): root_dir = build_dir / subdir paths = common.manifest_paths(root_dir, last_build.gradle) version, vercode, package = common.parse_androidmanifests(paths, app) if vercode: logging.debug("Manifest exists in subdir '{0}'. Found version {1} ({2})" .format(subdir, version, vercode)) i_vercode = common.version_code_string_to_int(vercode) if i_vercode > common.version_code_string_to_int(hcode): hpak = package hcode = str(i_vercode) hver = version if not hpak: raise FDroidException(_("Couldn't find package ID")) if hver: return (hver, hcode) raise FDroidException(_("Couldn't find any version information")) def check_repotrunk(app): if app.RepoType == 'srclib': build_dir = Path('build/srclib') / app.Repo repotype = common.getsrclibvcs(app.Repo) else: build_dir = Path('build') / app.id repotype = app.RepoType if repotype not in ('git-svn', ): raise MetaDataException(_('RepoTrunk update mode only makes sense in git-svn repositories')) # Set up vcs interface and make sure we have the latest code... vcs = common.getvcs(app.RepoType, app.Repo, build_dir) vcs.gotorevision(None) ref = vcs.getref() return (ref, ref) # Check for a new version by looking at the Google Play Store. # Returns (None, "a message") if this didn't work, or (version, None) for # the details of the current version. def check_gplay(app): time.sleep(15) url = 'https://play.google.com/store/apps/details?id=' + app.id headers = {'User-Agent': 'Mozilla/5.0 (X11; Linux i686; rv:18.0) Gecko/20100101 Firefox/18.0'} req = urllib.request.Request(url, None, headers) try: resp = urllib.request.urlopen(req, None, 20) # nosec B310 URL base is hardcoded above page = resp.read().decode() except urllib.error.HTTPError as e: return (None, str(e.code)) except Exception as e: return (None, 'Failed:' + str(e)) version = None m = re.search('itemprop="softwareVersion">[ ]*([^<]+)[ ]*', page) if m: version = html.unescape(m.group(1)) if version == 'Varies with device': return (None, 'Device-variable version, cannot use this method') if not version: return (None, "Couldn't find version") return (version.strip(), None) def try_init_submodules(app, last_build, vcs): """Try to init submodules if the last build entry used them. They might have been removed from the app's repo in the meantime, so if we can't find any submodules we continue with the updates check. If there is any other error in initializing them then we stop the check. """ if last_build.submodules: try: vcs.initsubmodules() except NoSubmodulesException: logging.info("No submodules present for {}".format(_getappname(app))) except VCSException: logging.info("submodule broken for {}".format(_getappname(app))) # Return all directories under startdir that contain any of the manifest # files, and thus are probably an Android project. def dirs_with_manifest(startdir): # TODO: Python3.6: Accepts a path-like object. for root, _dirs, files in os.walk(str(startdir)): if any(m in files for m in [ 'AndroidManifest.xml', 'pom.xml', 'build.gradle', 'build.gradle.kts']): yield Path(root) # Tries to find a new subdir starting from the root build_dir. Returns said # subdir relative to the build dir if found, None otherwise. def possible_subdirs(app): if app.RepoType == 'srclib': build_dir = Path('build/srclib') / app.Repo else: build_dir = Path('build') / app.id last_build = app.get_last_build() for d in dirs_with_manifest(build_dir): m_paths = common.manifest_paths(d, last_build.gradle) package = common.parse_androidmanifests(m_paths, app)[2] if package is not None: subdir = d.relative_to(build_dir) logging.debug("Adding possible subdir %s" % subdir) yield subdir def _getappname(app): return common.get_app_display_name(app) def _getcvname(app): return '%s (%s)' % (app.CurrentVersion, app.CurrentVersionCode) def fetch_autoname(app, tag): if not app.RepoType or app.UpdateCheckMode in ('None', 'Static') \ or app.UpdateCheckName == "Ignore": return None if app.RepoType == 'srclib': build_dir = Path('build/srclib') / app.Repo else: build_dir = Path('build') / app.id try: vcs = common.getvcs(app.RepoType, app.Repo, build_dir) vcs.gotorevision(tag) except VCSException: return None last_build = app.get_last_build() logging.debug("...fetch auto name from " + str(build_dir)) new_name = None for subdir in possible_subdirs(app): root_dir = build_dir / subdir new_name = common.fetch_real_name(root_dir, last_build.gradle) if new_name is not None: break commitmsg = None if new_name: logging.debug("...got autoname '" + new_name + "'") if new_name != app.AutoName: app.AutoName = new_name if not commitmsg: commitmsg = "Set autoname of {0}".format(_getappname(app)) else: logging.debug("...couldn't get autoname") return commitmsg def checkupdates_app(app): # If a change is made, commitmsg should be set to a description of it. # Only if this is set will changes be written back to the metadata. commitmsg = None tag = None vercode = None mode = app.UpdateCheckMode if mode.startswith('Tags'): pattern = mode[5:] if len(mode) > 4 else None (version, vercode, tag) = check_tags(app, pattern) elif mode == 'RepoManifest': (version, vercode) = check_repomanifest(app) elif mode.startswith('RepoManifest/'): tag = mode[13:] (version, vercode) = check_repomanifest(app, tag) elif mode == 'RepoTrunk': (version, vercode) = check_repotrunk(app) elif mode == 'HTTP': (version, vercode) = check_http(app) elif mode in ('None', 'Static'): logging.debug('Checking disabled') return else: raise MetaDataException(_('Invalid UpdateCheckMode: {mode}').format(mode=mode)) if version and vercode and app.VercodeOperation: if not common.VERCODE_OPERATION_RE.match(app.VercodeOperation): raise MetaDataException(_('Invalid VercodeOperation: {field}') .format(field=app.VercodeOperation)) oldvercode = str(int(vercode)) op = app.VercodeOperation.replace("%c", oldvercode) vercode = str(common.calculate_math_string(op)) logging.debug("Applied vercode operation: %s -> %s" % (oldvercode, vercode)) if version and any(version.startswith(s) for s in [ '${', # Gradle variable names '@string/', # Strings we could not resolve ]): version = "Unknown" updating = False if version is None: raise FDroidException(_('no version information found')) elif vercode == app.CurrentVersionCode: logging.debug("...up to date") elif int(vercode) > int(app.CurrentVersionCode): logging.debug("...updating - old vercode={0}, new vercode={1}".format( app.CurrentVersionCode, vercode)) app.CurrentVersion = version app.CurrentVersionCode = str(int(vercode)) updating = True else: raise FDroidException( _('current version is newer: old vercode={old}, new vercode={new}').format( old=app.CurrentVersionCode, new=vercode ) ) commitmsg = fetch_autoname(app, tag) if updating: name = _getappname(app) ver = _getcvname(app) logging.info('...updating to version %s' % ver) commitmsg = 'Update CurrentVersion of %s to %s' % (name, ver) if options.auto: mode = app.AutoUpdateMode if not app.CurrentVersionCode: raise MetaDataException( _("Can't auto-update app with no CurrentVersionCode") ) elif mode in ('None', 'Static'): pass elif mode.startswith('Version'): pattern = mode[8:] suffix = '' if pattern.startswith('+'): try: suffix, pattern = pattern[1:].split(' ', 1) except ValueError: raise MetaDataException("Invalid AutoUpdateMode: " + mode) gotcur = False latest = None for build in app.get('Builds', []): if int(build.versionCode) >= int(app.CurrentVersionCode): gotcur = True if not latest or int(build.versionCode) > int(latest.versionCode): latest = build if int(latest.versionCode) > int(app.CurrentVersionCode): raise FDroidException( _( 'latest build recipe is newer: old vercode={old}, new vercode={new}' ).format(old=latest.versionCode, new=app.CurrentVersionCode) ) if not gotcur: newbuild = copy.deepcopy(latest) newbuild.disable = False newbuild.versionCode = app.CurrentVersionCode newbuild.versionName = app.CurrentVersion + suffix.replace('%c', newbuild.versionCode) logging.info("...auto-generating build for " + newbuild.versionName) if tag: newbuild.commit = tag else: commit = pattern.replace('%v', app.CurrentVersion) commit = commit.replace('%c', newbuild.versionCode) newbuild.commit = commit app['Builds'].append(newbuild) name = _getappname(app) ver = _getcvname(app) commitmsg = "Update %s to %s" % (name, ver) else: raise MetaDataException( _('Invalid AutoUpdateMode: {mode}').format(mode=mode) ) if commitmsg: metadata.write_metadata(app.metadatapath, app) if options.commit: logging.info("Commiting update for " + app.metadatapath) gitcmd = ["git", "commit", "-m", commitmsg] if 'auto_author' in config: gitcmd.extend(['--author', config['auto_author']]) gitcmd.extend(["--", app.metadatapath]) if subprocess.call(gitcmd) != 0: raise FDroidException("Git commit failed") def status_update_json(processed, failed): """Output a JSON file with metadata about this run.""" logging.debug(_('Outputting JSON')) output = common.setup_status_output(start_timestamp) if processed: output['processed'] = processed if failed: output['failed'] = failed common.write_status_json(output) config = None options = None start_timestamp = time.gmtime() def main(): global config, options # Parse command line... parser = ArgumentParser() common.setup_global_opts(parser) parser.add_argument("appid", nargs='*', help=_("application ID of file to operate on")) parser.add_argument("--auto", action="store_true", default=False, help=_("Process auto-updates")) parser.add_argument("--autoonly", action="store_true", default=False, help=_("Only process apps with auto-updates")) parser.add_argument("--commit", action="store_true", default=False, help=_("Commit changes")) parser.add_argument("--allow-dirty", action="store_true", default=False, help=_("Run on git repo that has uncommitted changes")) parser.add_argument("--gplay", action="store_true", default=False, help=_("Only print differences with the Play Store")) metadata.add_metadata_arguments(parser) options = parser.parse_args() metadata.warnings_action = options.W config = common.read_config(options) if not options.allow_dirty: status = subprocess.check_output(['git', 'status', '--porcelain']) if status: logging.error(_('Build metadata git repo has uncommited changes!')) sys.exit(1) # Get all apps... allapps = metadata.read_metadata() apps = common.read_app_args(options.appid, allapps, False) if options.gplay: for appid, app in apps.items(): version, reason = check_gplay(app) if version is None: if reason == '404': logging.info("{0} is not in the Play Store".format(_getappname(app))) else: logging.info("{0} encountered a problem: {1}".format(_getappname(app), reason)) if version is not None: stored = app.CurrentVersion if not stored: logging.info("{0} has no Current Version but has version {1} on the Play Store" .format(_getappname(app), version)) elif LooseVersion(stored) < LooseVersion(version): logging.info("{0} has version {1} on the Play Store, which is bigger than {2}" .format(_getappname(app), version, stored)) else: if stored != version: logging.info("{0} has version {1} on the Play Store, which differs from {2}" .format(_getappname(app), version, stored)) else: logging.info("{0} has the same version {1} on the Play Store" .format(_getappname(app), version)) return processed = [] failed = dict() exit_code = 0 for appid, app in apps.items(): if options.autoonly and app.AutoUpdateMode in ('None', 'Static'): logging.debug(_("Nothing to do for {appid}.").format(appid=appid)) continue msg = _("Processing {appid}").format(appid=appid) logging.info(msg) try: checkupdates_app(app) processed.append(appid) except Exception as e: msg = _("...checkupdate failed for {appid} : {error}").format(appid=appid, error=e) logging.error(msg) logging.debug(traceback.format_exc()) failed[appid] = str(e) exit_code = 1 status_update_json(processed, failed) sys.exit(exit_code) if __name__ == "__main__": main() fdroidserver-2.1/fdroidserver/common.py0000644000175000017500000051533614203004041020317 0ustar hanshans00000000000000#!/usr/bin/env python3 # # common.py - part of the FDroid server tools # # Copyright (C) 2010-2016, Ciaran Gultnieks, ciaran@ciarang.com # Copyright (C) 2013-2017, Daniel Martí # Copyright (C) 2013-2021, Hans-Christoph Steiner # Copyright (C) 2017-2018, Torsten Grote # Copyright (C) 2017, tobiasKaminsky # Copyright (C) 2017-2021, Michael Pöhn # Copyright (C) 2017,2021, mimi89999 # Copyright (C) 2019-2021, Jochen Sprickerhof # Copyright (C) 2021, Felix C. Stegerman # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . # common.py is imported by all modules, so do not import third-party # libraries here as they will become a requirement for all commands. import git import io import os import sys import re import ast import gzip import shutil import glob import stat import subprocess import time import operator import logging import hashlib import socket import base64 import urllib.parse import urllib.request import yaml import zipfile import tempfile import json from pathlib import Path # TODO change to only import defusedxml once its installed everywhere try: import defusedxml.ElementTree as XMLElementTree except ImportError: import xml.etree.ElementTree as XMLElementTree # nosec this is a fallback only from base64 import urlsafe_b64encode from binascii import hexlify from datetime import datetime, timedelta, timezone from distutils.version import LooseVersion from queue import Queue from zipfile import ZipFile from pyasn1.codec.der import decoder, encoder from pyasn1_modules import rfc2315 from pyasn1.error import PyAsn1Error from . import net import fdroidserver.metadata import fdroidserver.lint from fdroidserver import _ from fdroidserver.exception import FDroidException, VCSException, NoSubmodulesException,\ BuildException, VerificationException, MetaDataException from .asynchronousfilereader import AsynchronousFileReader from . import apksigcopier # The path to this fdroidserver distribution FDROID_PATH = os.path.realpath(os.path.join(os.path.dirname(__file__), '..')) # this is the build-tools version, aapt has a separate version that # has to be manually set in test_aapt_version() MINIMUM_AAPT_BUILD_TOOLS_VERSION = '26.0.0' # 26.0.2 is the first version recognizing md5 based signatures as valid again # (as does android, so we want that) MINIMUM_APKSIGNER_BUILD_TOOLS_VERSION = '26.0.2' VERCODE_OPERATION_RE = re.compile(r'^([ 0-9/*+-]|%c)+$') # A signature block file with a .DSA, .RSA, or .EC extension SIGNATURE_BLOCK_FILE_REGEX = re.compile(r'^META-INF/.*\.(DSA|EC|RSA)$') APK_NAME_REGEX = re.compile(r'^([a-zA-Z][\w.]*)_(-?[0-9]+)_?([0-9a-f]{7})?\.apk') APK_ID_TRIPLET_REGEX = re.compile(r"^package: name='(\w[^']*)' versionCode='([^']+)' versionName='([^']*)'") STANDARD_FILE_NAME_REGEX = re.compile(r'^(\w[\w.]*)_(-?[0-9]+)\.\w+') FDROID_PACKAGE_NAME_REGEX = re.compile(r'''^[a-f0-9]+$''', re.IGNORECASE) STRICT_APPLICATION_ID_REGEX = re.compile(r'''(?:^[a-zA-Z]+(?:\d*[a-zA-Z_]*)*)(?:\.[a-zA-Z]+(?:\d*[a-zA-Z_]*)*)+$''') VALID_APPLICATION_ID_REGEX = re.compile(r'''(?:^[a-z_]+(?:\d*[a-zA-Z_]*)*)(?:\.[a-z_]+(?:\d*[a-zA-Z_]*)*)*$''', re.IGNORECASE) ANDROID_PLUGIN_REGEX = re.compile(r'''\s*(:?apply plugin:|id)\(?\s*['"](android|com\.android\.application)['"]\s*\)?''') SETTINGS_GRADLE_REGEX = re.compile(r'settings\.gradle(?:\.kts)?') GRADLE_SUBPROJECT_REGEX = re.compile(r'''['"]:?([^'"]+)['"]''') MAX_VERSION_CODE = 0x7fffffff # Java's Integer.MAX_VALUE (2147483647) XMLNS_ANDROID = '{http://schemas.android.com/apk/res/android}' # https://docs.gitlab.com/ee/user/gitlab_com/#gitlab-pages GITLAB_COM_PAGES_MAX_SIZE = 1000000000 config = None options = None env = None orig_path = None default_config = { 'sdk_path': "$ANDROID_HOME", 'ndk_paths': {}, 'cachedir': str(Path.home() / '.cache/fdroidserver'), 'java_paths': None, 'scan_binary': False, 'ant': "ant", 'mvn3': "mvn", 'gradle': os.path.join(FDROID_PATH, 'gradlew-fdroid'), 'gradle_version_dir': str(Path.home() / '.cache/fdroidserver/gradle'), 'sync_from_local_copy_dir': False, 'allow_disabled_algorithms': False, 'per_app_repos': False, 'make_current_version_link': False, 'current_version_name_source': 'Name', 'deploy_process_logs': False, 'update_stats': False, 'stats_ignore': [], 'stats_server': None, 'stats_user': None, 'stats_to_carbon': False, 'repo_maxage': 0, 'build_server_always': False, 'keystore': 'keystore.p12', 'smartcardoptions': [], 'char_limits': { 'author': 256, 'name': 50, 'summary': 80, 'description': 4000, 'video': 256, 'whatsNew': 500, }, 'keyaliases': {}, 'repo_url': "https://MyFirstFDroidRepo.org/fdroid/repo", 'repo_name': "My First F-Droid Repo Demo", 'repo_icon': "icon.png", 'repo_description': _("""This is a repository of apps to be used with F-Droid. Applications in this repository are either official binaries built by the original application developers, or are binaries built from source by the admin of f-droid.org using the tools on https://gitlab.com/fdroid."""), # type: ignore 'archive_name': 'My First F-Droid Archive Demo', 'archive_description': _('These are the apps that have been archived from the main repo.'), # type: ignore 'archive_older': 0, 'lint_licenses': fdroidserver.lint.APPROVED_LICENSES, # type: ignore 'git_mirror_size_limit': 10000000000, } def setup_global_opts(parser): try: # the buildserver VM might not have PIL installed from PIL import PngImagePlugin logger = logging.getLogger(PngImagePlugin.__name__) logger.setLevel(logging.INFO) # tame the "STREAM" debug messages except ImportError: pass parser.add_argument("-v", "--verbose", action="store_true", default=False, help=_("Spew out even more information than normal")) parser.add_argument("-q", "--quiet", action="store_true", default=False, help=_("Restrict output to warnings and errors")) def _add_java_paths_to_config(pathlist, thisconfig): def path_version_key(s): versionlist = [] for u in re.split('[^0-9]+', s): try: versionlist.append(int(u)) except ValueError: pass return versionlist for d in sorted(pathlist, key=path_version_key): if os.path.islink(d): continue j = os.path.basename(d) # the last one found will be the canonical one, so order appropriately for regex in [ r'^1\.([16-9][0-9]?)\.0\.jdk$', # OSX r'^jdk1\.([16-9][0-9]?)\.0_[0-9]+.jdk$', # OSX and Oracle tarball r'^jdk1\.([16-9][0-9]?)\.0_[0-9]+$', # Oracle Windows r'^jdk([16-9][0-9]?)-openjdk$', # Arch r'^java-([16-9][0-9]?)-openjdk$', # Arch r'^java-([16-9][0-9]?)-jdk$', # Arch (oracle) r'^java-1\.([16-9][0-9]?)\.0-.*$', # RedHat r'^java-([16-9][0-9]?)-oracle$', # Debian WebUpd8 r'^jdk-([16-9][0-9]?)-oracle-.*$', # Debian make-jpkg r'^java-([16-9][0-9]?)-openjdk-[^c][^o][^m].*$', # Debian r'^oracle-jdk-bin-1\.([17-9][0-9]?).*$', # Gentoo (oracle) r'^icedtea-bin-([17-9][0-9]?).*$', # Gentoo (openjdk) ]: m = re.match(regex, j) if not m: continue for p in [d, os.path.join(d, 'Contents', 'Home')]: if os.path.exists(os.path.join(p, 'bin', 'javac')): thisconfig['java_paths'][m.group(1)] = p def fill_config_defaults(thisconfig): """Fill in the global config dict with relevant defaults. For config values that have a path that can be expanded, e.g. an env var or a ~/, this will store the original value using "_orig" appended to the key name so that if the config gets written out, it will preserve the original, unexpanded string. """ for k, v in default_config.items(): if k not in thisconfig: if isinstance(v, dict) or isinstance(v, list): thisconfig[k] = v.copy() else: thisconfig[k] = v # Expand paths (~users and $vars) def expand_path(path): if path is None: return None orig = path path = os.path.expanduser(path) path = os.path.expandvars(path) if orig == path: return None return path for k in ['sdk_path', 'ant', 'mvn3', 'gradle', 'keystore']: v = thisconfig[k] exp = expand_path(v) if exp is not None: thisconfig[k] = exp thisconfig[k + '_orig'] = v # find all installed JDKs for keytool, jarsigner, and JAVA[6-9]_HOME env vars if thisconfig['java_paths'] is None: thisconfig['java_paths'] = dict() pathlist = [] pathlist += glob.glob('/usr/lib/jvm/j*[16-9]*') pathlist += glob.glob('/usr/java/jdk1.[16-9]*') pathlist += glob.glob('/System/Library/Java/JavaVirtualMachines/1.[16-9][0-9]?.0.jdk') pathlist += glob.glob('/Library/Java/JavaVirtualMachines/*jdk*[0-9]*') pathlist += glob.glob('/opt/oracle-jdk-*1.[0-9]*') pathlist += glob.glob('/opt/icedtea-*[0-9]*') if os.getenv('JAVA_HOME') is not None: pathlist.append(os.getenv('JAVA_HOME')) if os.getenv('PROGRAMFILES') is not None: pathlist += glob.glob(os.path.join(os.getenv('PROGRAMFILES'), 'Java', 'jdk1.[16-9][0-9]?.*')) _add_java_paths_to_config(pathlist, thisconfig) for java_version in ('14', '13', '12', '11', '10', '9', '8', '7'): if java_version not in thisconfig['java_paths']: continue java_home = thisconfig['java_paths'][java_version] jarsigner = os.path.join(java_home, 'bin', 'jarsigner') if os.path.exists(jarsigner): thisconfig['jarsigner'] = jarsigner thisconfig['keytool'] = os.path.join(java_home, 'bin', 'keytool') break if 'jarsigner' not in thisconfig and shutil.which('jarsigner'): thisconfig['jarsigner'] = shutil.which('jarsigner') if 'keytool' not in thisconfig and shutil.which('keytool'): thisconfig['keytool'] = shutil.which('keytool') # enable apksigner by default so v2/v3 APK signatures validate find_apksigner(thisconfig) if not thisconfig.get('apksigner'): logging.warning(_('apksigner not found! Cannot sign or verify modern APKs')) for k in ['ndk_paths', 'java_paths']: d = thisconfig[k] for k2 in d.copy(): v = d[k2] exp = expand_path(v) if exp is not None: thisconfig[k][k2] = exp thisconfig[k][k2 + '_orig'] = v ndk_paths = thisconfig.get('ndk_paths', {}) ndk_bundle = os.path.join(thisconfig['sdk_path'], 'ndk-bundle') if os.path.exists(ndk_bundle): version = get_ndk_version(ndk_bundle) if version not in ndk_paths: ndk_paths[version] = ndk_bundle ndk_dir = os.path.join(thisconfig['sdk_path'], 'ndk') if os.path.exists(ndk_dir): for ndk in glob.glob(os.path.join(ndk_dir, '*')): version = get_ndk_version(ndk) if version not in ndk_paths: ndk_paths[version] = ndk for k in list(ndk_paths.keys()): if not re.match(r'r[1-9][0-9]*[a-z]?', k): for ndkdict in NDKS: if k == ndkdict.get('revision'): ndk_paths[ndkdict['release']] = ndk_paths.pop(k) break def regsub_file(pattern, repl, path): with open(path, 'rb') as f: text = f.read() text = re.sub(bytes(pattern, 'utf8'), bytes(repl, 'utf8'), text) with open(path, 'wb') as f: f.write(text) def read_config(opts=None): """Read the repository config. The config is read from config_file, which is in the current directory when any of the repo management commands are used. If there is a local metadata file in the git repo, then the config is not required, just use defaults. config.yml is the preferred form because no code is executed when reading it. config.py is deprecated and supported for backwards compatibility. config.yml requires ASCII or UTF-8 encoding because this code does not auto-detect the file's encoding. That is left up to the YAML library. YAML allows ASCII, UTF-8, UTF-16, and UTF-32 encodings. Since it is a good idea to manage config.yml (WITHOUT PASSWORDS!) in git, it makes sense to use a globally standard encoding. """ global config, options if config is not None: return config options = opts config = {} config_file = 'config.yml' old_config_file = 'config.py' if os.path.exists(config_file) and os.path.exists(old_config_file): logging.error(_("""Conflicting config files! Using {newfile}, ignoring {oldfile}!""") .format(oldfile=old_config_file, newfile=config_file)) if os.path.exists(config_file): logging.debug(_("Reading '{config_file}'").format(config_file=config_file)) with open(config_file, encoding='utf-8') as fp: config = yaml.safe_load(fp) if not config: config = {} elif os.path.exists(old_config_file): logging.warning(_("""{oldfile} is deprecated, use {newfile}""") .format(oldfile=old_config_file, newfile=config_file)) with io.open(old_config_file, "rb") as fp: code = compile(fp.read(), old_config_file, 'exec') exec(code, None, config) # nosec TODO automatically migrate for k in ('mirrors', 'install_list', 'uninstall_list', 'serverwebroot', 'servergitroot'): if k in config: if not type(config[k]) in (str, list, tuple): logging.warning( _("'{field}' will be in random order! Use () or [] brackets if order is important!") .format(field=k)) # smartcardoptions must be a list since its command line args for Popen smartcardoptions = config.get('smartcardoptions') if isinstance(smartcardoptions, str): config['smartcardoptions'] = re.sub(r'\s+', r' ', config['smartcardoptions']).split(' ') elif not smartcardoptions and 'keystore' in config and config['keystore'] == 'NONE': # keystore='NONE' means use smartcard, these are required defaults config['smartcardoptions'] = ['-storetype', 'PKCS11', '-providerName', 'SunPKCS11-OpenSC', '-providerClass', 'sun.security.pkcs11.SunPKCS11', '-providerArg', 'opensc-fdroid.cfg'] if any(k in config for k in ["keystore", "keystorepass", "keypass"]): if os.path.exists(config_file): f = config_file elif os.path.exists(old_config_file): f = old_config_file st = os.stat(f) if st.st_mode & stat.S_IRWXG or st.st_mode & stat.S_IRWXO: logging.warning(_("unsafe permissions on '{config_file}' (should be 0600)!") .format(config_file=f)) fill_config_defaults(config) if 'serverwebroot' in config: if isinstance(config['serverwebroot'], str): roots = [config['serverwebroot']] elif all(isinstance(item, str) for item in config['serverwebroot']): roots = config['serverwebroot'] else: raise TypeError(_('only accepts strings, lists, and tuples')) rootlist = [] for rootstr in roots: # since this is used with rsync, where trailing slashes have # meaning, ensure there is always a trailing slash if rootstr[-1] != '/': rootstr += '/' rootlist.append(rootstr.replace('//', '/')) config['serverwebroot'] = rootlist if 'servergitmirrors' in config: if isinstance(config['servergitmirrors'], str): roots = [config['servergitmirrors']] elif all(isinstance(item, str) for item in config['servergitmirrors']): roots = config['servergitmirrors'] else: raise TypeError(_('only accepts strings, lists, and tuples')) config['servergitmirrors'] = roots limit = config['git_mirror_size_limit'] config['git_mirror_size_limit'] = parse_human_readable_size(limit) confignames_to_delete = set() for configname, dictvalue in config.items(): if configname == 'java_paths': new = dict() for k, v in dictvalue.items(): new[str(k)] = v config[configname] = new elif configname in ('ndk_paths', 'java_paths', 'char_limits', 'keyaliases'): continue elif isinstance(dictvalue, dict): for k, v in dictvalue.items(): if k == 'env': env = os.getenv(v) if env: config[configname] = env else: confignames_to_delete.add(configname) logging.error(_('Environment variable {var} from {configname} is not set!') .format(var=k, configname=configname)) else: confignames_to_delete.add(configname) logging.error(_('Unknown entry {key} in {configname}') .format(key=k, configname=configname)) for configname in confignames_to_delete: del(config[configname]) return config def parse_human_readable_size(size): units = { 'b': 1, 'kb': 1000, 'mb': 1000**2, 'gb': 1000**3, 'tb': 1000**4, 'kib': 1024, 'mib': 1024**2, 'gib': 1024**3, 'tib': 1024**4, } try: return int(float(size)) except (ValueError, TypeError): if type(size) != str: raise ValueError(_('Could not parse size "{size}", wrong type "{type}"') .format(size=size, type=type(size))) s = size.lower().replace(' ', '') m = re.match(r'^(?P[0-9][0-9.]*) *(?P' + r'|'.join(units.keys()) + r')$', s) if not m: raise ValueError(_('Not a valid size definition: "{}"').format(size)) return int(float(m.group("value")) * units[m.group("unit")]) def get_dir_size(path_or_str): """Get the total size of all files in the given directory.""" if isinstance(path_or_str, str): path_or_str = Path(path_or_str) return sum(f.stat().st_size for f in path_or_str.glob('**/*') if f.is_file()) def assert_config_keystore(config): """Check weather keystore is configured correctly and raise exception if not.""" nosigningkey = False if 'repo_keyalias' not in config: nosigningkey = True logging.critical(_("'repo_keyalias' not found in config.yml!")) if 'keystore' not in config: nosigningkey = True logging.critical(_("'keystore' not found in config.yml!")) elif config['keystore'] == 'NONE': if not config.get('smartcardoptions'): nosigningkey = True logging.critical(_("'keystore' is NONE and 'smartcardoptions' is blank!")) elif not os.path.exists(config['keystore']): nosigningkey = True logging.critical("'" + config['keystore'] + "' does not exist!") if 'keystorepass' not in config: nosigningkey = True logging.critical(_("'keystorepass' not found in config.yml!")) if 'keypass' not in config and config.get('keystore') != 'NONE': nosigningkey = True logging.critical(_("'keypass' not found in config.yml!")) if nosigningkey: raise FDroidException("This command requires a signing key, " + "you can create one using: fdroid update --create-key") def find_apksigner(config): """Search for the best version apksigner and adds it to the config. Returns the best version of apksigner following this algorithm: * use config['apksigner'] if set * try to find apksigner in path * find apksigner in build-tools starting from newest installed going down to MINIMUM_APKSIGNER_BUILD_TOOLS_VERSION Returns ------- str path to apksigner or None if no version is found """ command = 'apksigner' if command in config: return tmp = find_command(command) if tmp is not None: config[command] = tmp return build_tools_path = os.path.join(config.get('sdk_path', ''), 'build-tools') if not os.path.isdir(build_tools_path): return for f in sorted(os.listdir(build_tools_path), reverse=True): if not os.path.isdir(os.path.join(build_tools_path, f)): continue try: if LooseVersion(f) < LooseVersion(MINIMUM_APKSIGNER_BUILD_TOOLS_VERSION): logging.debug("Local Android SDK only has outdated apksigner versions") return except TypeError: continue if os.path.exists(os.path.join(build_tools_path, f, 'apksigner')): apksigner = os.path.join(build_tools_path, f, 'apksigner') logging.info("Using %s " % apksigner) config['apksigner'] = apksigner return def find_sdk_tools_cmd(cmd): """Find a working path to a tool from the Android SDK.""" tooldirs = [] if config is not None and 'sdk_path' in config and os.path.exists(config['sdk_path']): # try to find a working path to this command, in all the recent possible paths build_tools = os.path.join(config['sdk_path'], 'build-tools') if os.path.isdir(build_tools): for f in sorted(os.listdir(build_tools), reverse=True): if os.path.isdir(os.path.join(build_tools, f)): tooldirs.append(os.path.join(build_tools, f)) sdk_tools = os.path.join(config['sdk_path'], 'tools') if os.path.exists(sdk_tools): tooldirs.append(sdk_tools) tooldirs.append(os.path.join(sdk_tools, 'bin')) sdk_platform_tools = os.path.join(config['sdk_path'], 'platform-tools') if os.path.exists(sdk_platform_tools): tooldirs.append(sdk_platform_tools) if os.path.exists('/usr/bin'): tooldirs.append('/usr/bin') for d in tooldirs: path = os.path.join(d, cmd) if not os.path.isfile(path): path += '.exe' if os.path.isfile(path): if cmd == 'aapt': test_aapt_version(path) return path # did not find the command, exit with error message test_sdk_exists(config) # ignore result so None is never returned raise FDroidException(_("Android SDK tool {cmd} not found!").format(cmd=cmd)) def test_aapt_version(aapt): """Check whether the version of aapt is new enough.""" output = subprocess.check_output([aapt, 'version'], universal_newlines=True) if output is None or output == '': logging.error(_("'{path}' failed to execute!").format(path=aapt)) else: m = re.match(r'.*v([0-9]+)\.([0-9]+)[.-]?([0-9.-]*)', output) if m: major = m.group(1) minor = m.group(2) bugfix = m.group(3) # the Debian package has the version string like "v0.2-23.0.2" too_old = False if '.' in bugfix: if LooseVersion(bugfix) < LooseVersion(MINIMUM_AAPT_BUILD_TOOLS_VERSION): too_old = True elif LooseVersion('.'.join((major, minor, bugfix))) < LooseVersion('0.2.4062713'): too_old = True if too_old: logging.warning(_("'{aapt}' is too old, fdroid requires build-tools-{version} or newer!") .format(aapt=aapt, version=MINIMUM_AAPT_BUILD_TOOLS_VERSION)) else: logging.warning(_('Unknown version of aapt, might cause problems: ') + output) def test_sdk_exists(thisconfig): if 'sdk_path' not in thisconfig: # TODO convert this to apksigner once it is required if 'aapt' in thisconfig and os.path.isfile(thisconfig['aapt']): test_aapt_version(thisconfig['aapt']) return True else: logging.error(_("'sdk_path' not set in config.yml!")) return False if thisconfig['sdk_path'] == default_config['sdk_path']: logging.error(_('No Android SDK found!')) logging.error(_('You can use ANDROID_HOME to set the path to your SDK, i.e.:')) logging.error('\texport ANDROID_HOME=/opt/android-sdk') return False if not os.path.exists(thisconfig['sdk_path']): logging.critical(_("Android SDK path '{path}' does not exist!") .format(path=thisconfig['sdk_path'])) return False if not os.path.isdir(thisconfig['sdk_path']): logging.critical(_("Android SDK path '{path}' is not a directory!") .format(path=thisconfig['sdk_path'])) return False return True def get_local_metadata_files(): """Get any metadata files local to an app's source repo. This tries to ignore anything that does not count as app metdata, including emacs cruft ending in ~ """ return glob.glob('.fdroid.[a-jl-z]*[a-rt-z]') def read_pkg_args(appid_versionCode_pairs, allow_vercodes=False): """No summary. Parameters ---------- appids arguments in the form of multiple appid:[vc] strings Returns ------- a dictionary with the set of vercodes specified for each package """ vercodes = {} if not appid_versionCode_pairs: return vercodes apk_regex = re.compile(r'_(\d+)\.apk$') for p in appid_versionCode_pairs: p = apk_regex.sub(r':\1', p) if allow_vercodes and ':' in p: package, vercode = p.split(':') try: i_vercode = int(vercode, 0) except ValueError: i_vercode = int(vercode) vercode = str(i_vercode) else: package, vercode = p, None if package not in vercodes: vercodes[package] = [vercode] if vercode else [] continue elif vercode and vercode not in vercodes[package]: vercodes[package] += [vercode] if vercode else [] return vercodes def get_metadata_files(vercodes): """ Build a list of metadata files and raise an exception for invalid appids. Parameters ---------- vercodes version codes as returned by read_pkg_args() Returns ------- List a list of corresponding metadata/*.yml files """ found_invalid = False metadatafiles = [] for appid in vercodes.keys(): f = Path('metadata') / ('%s.yml' % appid) if f.exists(): metadatafiles.append(f) else: found_invalid = True logging.critical(_("No such package: %s") % appid) if found_invalid: raise FDroidException(_("Found invalid appids in arguments")) return metadatafiles def read_app_args(appid_versionCode_pairs, allapps, allow_vercodes=False): """Build a list of App instances for processing. On top of what read_pkg_args does, this returns the whole app metadata, but limiting the builds list to the builds matching the appid_versionCode_pairs and vercodes specified. If no appid_versionCode_pairs are specified, then all App and Build instances are returned. """ vercodes = read_pkg_args(appid_versionCode_pairs, allow_vercodes) if not vercodes: return allapps apps = {} for appid, app in allapps.items(): if appid in vercodes: apps[appid] = app if len(apps) != len(vercodes): for p in vercodes: if p not in allapps: logging.critical(_("No such package: %s") % p) raise FDroidException(_("Found invalid appids in arguments")) if not apps: raise FDroidException(_("No packages specified")) error = False for appid, app in apps.items(): vc = vercodes[appid] if not vc: continue app['Builds'] = [b for b in app.get('Builds', []) if b.versionCode in vc] if len(app.get('Builds', [])) != len(vercodes[appid]): error = True allvcs = [b.versionCode for b in app.get('Builds', [])] for v in vercodes[appid]: if v not in allvcs: logging.critical(_("No such versionCode {versionCode} for app {appid}") .format(versionCode=v, appid=appid)) if error: raise FDroidException(_("Found invalid versionCodes for some apps")) return apps def get_extension(filename): """Get name and extension of filename, with extension always lower case.""" base, ext = os.path.splitext(filename) if not ext: return base, '' return base, ext.lower()[1:] publish_name_regex = re.compile(r"^(.+)_([0-9]+)\.(apk|zip)$") def publishednameinfo(filename): filename = os.path.basename(filename) m = publish_name_regex.match(filename) try: result = (m.group(1), m.group(2)) except AttributeError: raise FDroidException(_("Invalid name for published file: %s") % filename) return result apk_release_filename = re.compile(r'(?P[a-zA-Z0-9_\.]+)_(?P[0-9]+)\.apk') apk_release_filename_with_sigfp = re.compile(r'(?P[a-zA-Z0-9_\.]+)_(?P[0-9]+)_(?P[0-9a-f]{7})\.apk') def apk_parse_release_filename(apkname): """Parse the name of an APK file according the F-Droids APK naming scheme. WARNING: Returned values don't necessarily represent the APKs actual properties, the are just paresed from the file name. Returns ------- Tuple A triplet containing (appid, versionCode, signer), where appid should be the package name, versionCode should be the integer represion of the APKs version and signer should be the first 7 hex digists of the sha256 signing key fingerprint which was used to sign this APK. """ m = apk_release_filename_with_sigfp.match(apkname) if m: return m.group('appid'), m.group('vercode'), m.group('sigfp') m = apk_release_filename.match(apkname) if m: return m.group('appid'), m.group('vercode'), None return None, None, None def get_release_filename(app, build, extension=None): if extension: return "%s_%s.%s" % (app.id, build.versionCode, extension) if build.output and get_file_extension(build.output): return "%s_%s.%s" % (app.id, build.versionCode, get_file_extension(build.output)) else: return "%s_%s.apk" % (app.id, build.versionCode) def get_toolsversion_logname(app, build): return "%s_%s_toolsversion.log" % (app.id, build.versionCode) def getsrcname(app, build): return "%s_%s_src.tar.gz" % (app.id, build.versionCode) def get_build_dir(app): """Get the dir that this app will be built in.""" if app.RepoType == 'srclib': return Path('build/srclib') / app.Repo return Path('build') / app.id class Encoder(json.JSONEncoder): def default(self, obj): if isinstance(obj, set): return sorted(obj) return super().default(obj) def setup_status_output(start_timestamp): """Create the common output dictionary for public status updates.""" output = { 'commandLine': sys.argv, 'startTimestamp': int(time.mktime(start_timestamp) * 1000), 'subcommand': sys.argv[0].split()[1], } if os.path.isdir('.git'): git_repo = git.repo.Repo(os.getcwd()) output['fdroiddata'] = { 'commitId': get_head_commit_id(git_repo), 'isDirty': git_repo.is_dirty(), 'modifiedFiles': git_repo.git().ls_files(modified=True).split(), 'untrackedFiles': git_repo.untracked_files, } fdroidserver_dir = os.path.dirname(sys.argv[0]) if os.path.isdir(os.path.join(fdroidserver_dir, '.git')): git_repo = git.repo.Repo(fdroidserver_dir) output['fdroidserver'] = { 'commitId': get_head_commit_id(git_repo), 'isDirty': git_repo.is_dirty(), 'modifiedFiles': git_repo.git().ls_files(modified=True).split(), 'untrackedFiles': git_repo.untracked_files, } etc_issue_net = '/etc/issue.net' if os.path.exists(etc_issue_net): with open(etc_issue_net) as fp: output[etc_issue_net] = fp.read(100).strip() write_running_status_json(output) return output def write_running_status_json(output): write_status_json(output, pretty=True, name='running') def write_status_json(output, pretty=False, name=None): """Write status out as JSON, and rsync it to the repo server.""" status_dir = os.path.join('repo', 'status') if not os.path.exists(status_dir): os.makedirs(status_dir) if not name: output['endTimestamp'] = int(datetime.now(timezone.utc).timestamp() * 1000) name = sys.argv[0].split()[1] # fdroid subcommand path = os.path.join(status_dir, name + '.json') with open(path, 'w') as fp: if pretty: json.dump(output, fp, sort_keys=True, cls=Encoder, indent=2) else: json.dump(output, fp, sort_keys=True, cls=Encoder, separators=(',', ':')) rsync_status_file_to_repo(path, repo_subdir='status') def get_head_commit_id(git_repo): """Get git commit ID for HEAD as a str.""" try: return git_repo.head.commit.hexsha except ValueError: return "None" def setup_vcs(app): """Checkout code from VCS and return instance of vcs and the build dir.""" build_dir = get_build_dir(app) # TODO: Remove this build_dir = str(build_dir) # Set up vcs interface and make sure we have the latest code... logging.debug("Getting {0} vcs interface for {1}" .format(app.RepoType, app.Repo)) if app.RepoType == 'git' and os.path.exists('.fdroid.yml'): remote = os.getcwd() else: remote = app.Repo vcs = getvcs(app.RepoType, remote, build_dir) return vcs, build_dir def getvcs(vcstype, remote, local): # TODO: Remove this in Python3.6 local = str(local) if vcstype == 'git': return vcs_git(remote, local) if vcstype == 'git-svn': return vcs_gitsvn(remote, local) if vcstype == 'hg': return vcs_hg(remote, local) if vcstype == 'bzr': return vcs_bzr(remote, local) if vcstype == 'srclib': if local != os.path.join('build', 'srclib', remote): raise VCSException("Error: srclib paths are hard-coded!") return getsrclib(remote, os.path.join('build', 'srclib'), raw=True) if vcstype == 'svn': raise VCSException("Deprecated vcs type 'svn' - please use 'git-svn' instead") raise VCSException("Invalid vcs type " + vcstype) def getsrclibvcs(name): if name not in fdroidserver.metadata.srclibs: raise VCSException("Missing srclib " + name) return fdroidserver.metadata.srclibs[name]['RepoType'] class vcs: def __init__(self, remote, local): # TODO: Remove this in Python3.6 local = str(local) # svn, git-svn and bzr may require auth self.username = None if self.repotype() in ('git-svn', 'bzr'): if '@' in remote: if self.repotype == 'git-svn': raise VCSException("Authentication is not supported for git-svn") self.username, remote = remote.split('@') if ':' not in self.username: raise VCSException(_("Password required with username")) self.username, self.password = self.username.split(':') self.remote = remote self.local = local self.clone_failed = False self.refreshed = False self.srclib = None def repotype(self): return None def clientversion(self): versionstr = FDroidPopen(self.clientversioncmd()).output return versionstr[0:versionstr.find('\n')] def clientversioncmd(self): return None def gotorevision(self, rev, refresh=True): """Take the local repository to a clean version of the given revision. Take the local repository to a clean version of the given revision, which is specificed in the VCS's native format. Beforehand, the repository can be dirty, or even non-existent. If the repository does already exist locally, it will be updated from the origin, but only once in the lifetime of the vcs object. None is acceptable for 'rev' if you know you are cloning a clean copy of the repo - otherwise it must specify a valid revision. """ if self.clone_failed: raise VCSException(_("Downloading the repository already failed once, not trying again.")) # The .fdroidvcs-id file for a repo tells us what VCS type # and remote that directory was created from, allowing us to drop it # automatically if either of those things changes. fdpath = os.path.join(self.local, '..', '.fdroidvcs-' + os.path.basename(self.local)) fdpath = os.path.normpath(fdpath) cdata = self.repotype() + ' ' + self.remote writeback = True deleterepo = False if os.path.exists(self.local): if os.path.exists(fdpath): with open(fdpath, 'r') as f: fsdata = f.read().strip() if fsdata == cdata: writeback = False else: deleterepo = True logging.info("Repository details for %s changed - deleting" % ( self.local)) else: deleterepo = True logging.info("Repository details for %s missing - deleting" % ( self.local)) if deleterepo: shutil.rmtree(self.local) exc = None if not refresh: self.refreshed = True try: self.gotorevisionx(rev) except FDroidException as e: exc = e # If necessary, write the .fdroidvcs file. if writeback and not self.clone_failed: os.makedirs(os.path.dirname(fdpath), exist_ok=True) with open(fdpath, 'w+') as f: f.write(cdata) if exc is not None: raise exc def gotorevisionx(self, rev): # pylint: disable=unused-argument """No summary. Derived classes need to implement this. It's called once basic checking has been performed. """ raise VCSException("This VCS type doesn't define gotorevisionx") # Initialise and update submodules def initsubmodules(self): raise VCSException('Submodules not supported for this vcs type') # Get a list of all known tags def gettags(self): if not self._gettags: raise VCSException('gettags not supported for this vcs type') rtags = [] for tag in self._gettags(): if re.match('[-A-Za-z0-9_. /]+$', tag): rtags.append(tag) return rtags def latesttags(self): """Get a list of all the known tags, sorted from newest to oldest.""" raise VCSException('latesttags not supported for this vcs type') def getref(self, revname=None): """Get current commit reference (hash, revision, etc).""" raise VCSException('getref not supported for this vcs type') def getsrclib(self): """Return the srclib (name, path) used in setting up the current revision, or None.""" return self.srclib class vcs_git(vcs): def repotype(self): return 'git' def clientversioncmd(self): return ['git', '--version'] def git(self, args, envs=dict(), cwd=None, output=True): """Prevent git fetch/clone/submodule from hanging at the username/password prompt. While fetch/pull/clone respect the command line option flags, it seems that submodule commands do not. They do seem to follow whatever is in env vars, if the version of git is new enough. So we just throw the kitchen sink at it to see what sticks. Also, because of CVE-2017-1000117, block all SSH URLs. """ # # supported in git >= 2.3 git_config = [ '-c', 'core.askpass=/bin/true', '-c', 'core.sshCommand=/bin/false', '-c', 'url.https://.insteadOf=ssh://', ] for domain in ('bitbucket.org', 'github.com', 'gitlab.com'): git_config.append('-c') git_config.append('url.https://u:p@' + domain + '/.insteadOf=git@' + domain + ':') git_config.append('-c') git_config.append('url.https://u:p@' + domain + '.insteadOf=git://' + domain) git_config.append('-c') git_config.append('url.https://u:p@' + domain + '.insteadOf=https://' + domain) envs.update({ 'GIT_TERMINAL_PROMPT': '0', 'GIT_ASKPASS': '/bin/true', 'SSH_ASKPASS': '/bin/true', 'GIT_SSH': '/bin/false', # for git < 2.3 }) return FDroidPopen(['git', ] + git_config + args, envs=envs, cwd=cwd, output=output) def checkrepo(self): """No summary. If the local directory exists, but is somehow not a git repository, git will traverse up the directory tree until it finds one that is (i.e. fdroidserver) and then we'll proceed to destroy it! This is called as a safety check. """ p = FDroidPopen(['git', 'rev-parse', '--show-toplevel'], cwd=self.local, output=False) result = p.output.rstrip() if Path(result) != Path(self.local).resolve(): raise VCSException('Repository mismatch') def gotorevisionx(self, rev): if not os.path.exists(self.local): # Brand new checkout p = self.git(['clone', '--', self.remote, str(self.local)]) if p.returncode != 0: self.clone_failed = True raise VCSException("Git clone failed", p.output) self.checkrepo() else: self.checkrepo() # Discard any working tree changes p = FDroidPopen(['git', 'submodule', 'foreach', '--recursive', 'git', 'reset', '--hard'], cwd=self.local, output=False) if p.returncode != 0: logging.debug("Git submodule reset failed (ignored) {output}".format(output=p.output)) p = FDroidPopen(['git', 'reset', '--hard'], cwd=self.local, output=False) if p.returncode != 0: raise VCSException(_("Git reset failed"), p.output) # Remove untracked files now, in case they're tracked in the target # revision (it happens!) p = FDroidPopen(['git', 'submodule', 'foreach', '--recursive', 'git', 'clean', '-dffx'], cwd=self.local, output=False) if p.returncode != 0: logging.debug("Git submodule cleanup failed (ignored) {output}".format(output=p.output)) p = FDroidPopen(['git', 'clean', '-dffx'], cwd=self.local, output=False) if p.returncode != 0: raise VCSException(_("Git clean failed"), p.output) if not self.refreshed: # Get latest commits and tags from remote p = self.git(['fetch', 'origin'], cwd=self.local) if p.returncode != 0: raise VCSException(_("Git fetch failed"), p.output) p = self.git(['remote', 'prune', 'origin'], output=False, cwd=self.local) if p.returncode != 0: raise VCSException(_("Git prune failed"), p.output) p = self.git(['fetch', '--prune', '--tags', '--force', 'origin'], output=False, cwd=self.local) if p.returncode != 0: raise VCSException(_("Git fetch failed"), p.output) # Recreate origin/HEAD as git clone would do it, in case it disappeared p = FDroidPopen(['git', 'remote', 'set-head', 'origin', '--auto'], cwd=self.local, output=False) if p.returncode != 0: lines = p.output.splitlines() if 'Multiple remote HEAD branches' not in lines[0]: logging.warning(_("Git remote set-head failed: \"%s\"") % p.output.strip()) else: branch = lines[1].split(' ')[-1] p2 = FDroidPopen(['git', 'remote', 'set-head', 'origin', '--', branch], cwd=self.local, output=False) if p2.returncode != 0: logging.warning(_("Git remote set-head failed: \"%s\"") % p.output.strip() + '\n' + p2.output.strip()) self.refreshed = True # origin/HEAD is the HEAD of the remote, e.g. the "default branch" on # a github repo. Most of the time this is the same as origin/master. rev = rev or 'origin/HEAD' p = FDroidPopen(['git', 'checkout', '-f', rev], cwd=self.local, output=False) if p.returncode != 0: raise VCSException(_("Git checkout of '%s' failed") % rev, p.output) # Get rid of any uncontrolled files left behind p = FDroidPopen(['git', 'clean', '-dffx'], cwd=self.local, output=False) if p.returncode != 0: raise VCSException(_("Git clean failed"), p.output) def initsubmodules(self): self.checkrepo() submfile = os.path.join(self.local, '.gitmodules') if not os.path.isfile(submfile): raise NoSubmodulesException(_("No git submodules available")) # fix submodules not accessible without an account and public key auth with open(submfile, 'r') as f: lines = f.readlines() with open(submfile, 'w') as f: for line in lines: for domain in ('bitbucket.org', 'github.com', 'gitlab.com'): line = re.sub('git@' + domain + ':', 'https://u:p@' + domain + '/', line) f.write(line) p = FDroidPopen(['git', 'submodule', 'sync'], cwd=self.local, output=False) if p.returncode != 0: raise VCSException(_("Git submodule sync failed"), p.output) p = self.git(['submodule', 'update', '--init', '--force', '--recursive'], cwd=self.local) if p.returncode != 0: raise VCSException(_("Git submodule update failed"), p.output) def _gettags(self): self.checkrepo() p = FDroidPopen(['git', 'tag'], cwd=self.local, output=False) return p.output.splitlines() tag_format = re.compile(r'tag: ([^) ]*)') def latesttags(self): """Return a list of latest tags. The definition is a little blurry here, Android does not care for the version name of an app as normally used as the tag name so versions do not need to follow strverscmp() or similar. Also they can be rather arbitrary so git tag --sort=-version:refname does not work. On the other side sorting them by creation date, i.e. git tag --sort=-authordate does not work either as there are a lot of repos where older tags were created later. So git log preserves the graph order and only sorts by date afterwards. This results in tags of beta versions being sorted earlier then the latest tag as long as they are part of the graph below the latest tag or are created earlier. """ self.checkrepo() p = FDroidPopen(['git', 'log', '--tags', '--simplify-by-decoration', '--pretty=format:%d'], cwd=self.local, output=False) tags = [] for line in p.output.splitlines(): for entry in line.split(', '): for tag in self.tag_format.findall(entry): tags.append(tag) return tags def getref(self, revname='HEAD'): self.checkrepo() p = FDroidPopen(['git', 'rev-parse', '--verify', '{revname}^{{commit}}'.format(revname=revname)], cwd=self.local, output=False) if p.returncode != 0: return None return p.output.strip() class vcs_gitsvn(vcs): def repotype(self): return 'git-svn' def clientversioncmd(self): return ['git', 'svn', '--version'] def checkrepo(self): """No summary. If the local directory exists, but is somehow not a git repository, git will traverse up the directory tree until it finds one that is (i.e. fdroidserver) and then we'll proceed to destory it! This is called as a safety check. """ p = FDroidPopen(['git', 'rev-parse', '--show-toplevel'], cwd=self.local, output=False) result = p.output.rstrip() if Path(result) != Path(self.local).resolve(): raise VCSException('Repository mismatch') def git(self, args, envs=dict(), cwd=None, output=True): """Prevent git fetch/clone/submodule from hanging at the username/password prompt. AskPass is set to /bin/true to let the process try to connect without a username/password. The SSH command is set to /bin/false to block all SSH URLs (supported in git >= 2.3). This protects against CVE-2017-1000117. """ git_config = [ '-c', 'core.askpass=/bin/true', '-c', 'core.sshCommand=/bin/false', ] envs.update({ 'GIT_TERMINAL_PROMPT': '0', 'GIT_ASKPASS': '/bin/true', 'SSH_ASKPASS': '/bin/true', 'GIT_SSH': '/bin/false', # for git < 2.3 'SVN_SSH': '/bin/false', }) return FDroidPopen(['git', ] + git_config + args, envs=envs, cwd=cwd, output=output) def gotorevisionx(self, rev): if not os.path.exists(self.local): # Brand new checkout gitsvn_args = ['svn', 'clone'] remote = None if ';' in self.remote: remote_split = self.remote.split(';') for i in remote_split[1:]: if i.startswith('trunk='): gitsvn_args.extend(['-T', i[6:]]) elif i.startswith('tags='): gitsvn_args.extend(['-t', i[5:]]) elif i.startswith('branches='): gitsvn_args.extend(['-b', i[9:]]) remote = remote_split[0] else: remote = self.remote if not remote.startswith('https://'): raise VCSException(_('HTTPS must be used with Subversion URLs!')) # git-svn sucks at certificate validation, this throws useful errors: try: import requests r = requests.head(remote) r.raise_for_status() except Exception as e: raise VCSException('SVN certificate pre-validation failed: ' + str(e)) location = r.headers.get('location') if location and not location.startswith('https://'): raise VCSException(_('Invalid redirect to non-HTTPS: {before} -> {after} ') .format(before=remote, after=location)) gitsvn_args.extend(['--', remote, str(self.local)]) p = self.git(gitsvn_args) if p.returncode != 0: self.clone_failed = True raise VCSException(_('git svn clone failed'), p.output) self.checkrepo() else: self.checkrepo() # Discard any working tree changes p = self.git(['reset', '--hard'], cwd=self.local, output=False) if p.returncode != 0: raise VCSException("Git reset failed", p.output) # Remove untracked files now, in case they're tracked in the target # revision (it happens!) p = self.git(['clean', '-dffx'], cwd=self.local, output=False) if p.returncode != 0: raise VCSException("Git clean failed", p.output) if not self.refreshed: # Get new commits, branches and tags from repo p = self.git(['svn', 'fetch'], cwd=self.local, output=False) if p.returncode != 0: raise VCSException("Git svn fetch failed") p = self.git(['svn', 'rebase'], cwd=self.local, output=False) if p.returncode != 0: raise VCSException("Git svn rebase failed", p.output) self.refreshed = True rev = rev or 'master' if rev: nospaces_rev = rev.replace(' ', '%20') # Try finding a svn tag for treeish in ['origin/', '']: p = self.git(['checkout', treeish + 'tags/' + nospaces_rev], cwd=self.local, output=False) if p.returncode == 0: break if p.returncode != 0: # No tag found, normal svn rev translation # Translate svn rev into git format rev_split = rev.split('/') p = None for treeish in ['origin/', '']: if len(rev_split) > 1: treeish += rev_split[0] svn_rev = rev_split[1] else: # if no branch is specified, then assume trunk (i.e. 'master' branch): treeish += 'master' svn_rev = rev svn_rev = svn_rev if svn_rev[0] == 'r' else 'r' + svn_rev p = self.git(['svn', 'find-rev', '--before', svn_rev, treeish], cwd=self.local, output=False) git_rev = p.output.rstrip() if p.returncode == 0 and git_rev: break if p.returncode != 0 or not git_rev: # Try a plain git checkout as a last resort p = self.git(['checkout', rev], cwd=self.local, output=False) if p.returncode != 0: raise VCSException("No git treeish found and direct git checkout of '%s' failed" % rev, p.output) else: # Check out the git rev equivalent to the svn rev p = self.git(['checkout', git_rev], cwd=self.local, output=False) if p.returncode != 0: raise VCSException(_("Git checkout of '%s' failed") % rev, p.output) # Get rid of any uncontrolled files left behind p = self.git(['clean', '-dffx'], cwd=self.local, output=False) if p.returncode != 0: raise VCSException(_("Git clean failed"), p.output) def _gettags(self): self.checkrepo() for treeish in ['origin/', '']: d = os.path.join(self.local, '.git', 'svn', 'refs', 'remotes', treeish, 'tags') if os.path.isdir(d): return os.listdir(d) def getref(self, revname='HEAD'): self.checkrepo() p = FDroidPopen(['git', 'svn', 'find-rev', revname], cwd=self.local, output=False) if p.returncode != 0: return None return p.output.strip() class vcs_hg(vcs): def repotype(self): return 'hg' def clientversioncmd(self): return ['hg', '--version'] def gotorevisionx(self, rev): if not os.path.exists(self.local): p = FDroidPopen(['hg', 'clone', '--ssh', '/bin/false', '--', self.remote, str(self.local)], output=False) if p.returncode != 0: self.clone_failed = True raise VCSException("Hg clone failed", p.output) else: p = FDroidPopen(['hg', 'status', '-uS'], cwd=self.local, output=False) if p.returncode != 0: raise VCSException("Hg status failed", p.output) for line in p.output.splitlines(): if not line.startswith('? '): raise VCSException("Unexpected output from hg status -uS: " + line) FDroidPopen(['rm', '-rf', '--', line[2:]], cwd=self.local, output=False) if not self.refreshed: p = FDroidPopen(['hg', 'pull', '--ssh', '/bin/false'], cwd=self.local, output=False) if p.returncode != 0: raise VCSException("Hg pull failed", p.output) self.refreshed = True rev = rev or 'default' if not rev: return p = FDroidPopen(['hg', 'update', '-C', '--', rev], cwd=self.local, output=False) if p.returncode != 0: raise VCSException("Hg checkout of '%s' failed" % rev, p.output) p = FDroidPopen(['hg', 'purge', '--all'], cwd=self.local, output=False) # Also delete untracked files, we have to enable purge extension for that: if "'purge' is provided by the following extension" in p.output: with open(os.path.join(self.local, '.hg', 'hgrc'), "a") as myfile: myfile.write("\n[extensions]\nhgext.purge=\n") p = FDroidPopen(['hg', 'purge', '--all'], cwd=self.local, output=False) if p.returncode != 0: raise VCSException("HG purge failed", p.output) elif p.returncode != 0: raise VCSException("HG purge failed", p.output) def _gettags(self): p = FDroidPopen(['hg', 'tags', '-q'], cwd=self.local, output=False) return p.output.splitlines()[1:] class vcs_bzr(vcs): def repotype(self): return 'bzr' def clientversioncmd(self): return ['bzr', '--version'] def bzr(self, args, envs=dict(), cwd=None, output=True): """Prevent bzr from ever using SSH to avoid security vulns.""" envs.update({ 'BZR_SSH': 'false', }) return FDroidPopen(['bzr', ] + args, envs=envs, cwd=cwd, output=output) def gotorevisionx(self, rev): if not os.path.exists(self.local): p = self.bzr(['branch', self.remote, str(self.local)], output=False) if p.returncode != 0: self.clone_failed = True raise VCSException("Bzr branch failed", p.output) else: p = self.bzr(['clean-tree', '--force', '--unknown', '--ignored'], cwd=self.local, output=False) if p.returncode != 0: raise VCSException("Bzr revert failed", p.output) if not self.refreshed: p = self.bzr(['pull'], cwd=self.local, output=False) if p.returncode != 0: raise VCSException("Bzr update failed", p.output) self.refreshed = True revargs = list(['-r', rev] if rev else []) p = self.bzr(['revert'] + revargs, cwd=self.local, output=False) if p.returncode != 0: raise VCSException("Bzr revert of '%s' failed" % rev, p.output) def _gettags(self): p = self.bzr(['tags'], cwd=self.local, output=False) return [tag.split(' ')[0].strip() for tag in p.output.splitlines()] def unescape_string(string): if len(string) < 2: return string if string[0] == '"' and string[-1] == '"': return string[1:-1] return string.replace("\\'", "'") def retrieve_string(app_dir, string, xmlfiles=None): if string.startswith('@string/'): name = string[len('@string/'):] elif string.startswith('${'): return '' # Gradle variable else: return unescape_string(string) if xmlfiles is None: xmlfiles = [] for res_dir in [ os.path.join(app_dir, 'res'), os.path.join(app_dir, 'src', 'main', 'res'), ]: for root, dirs, files in os.walk(res_dir): if os.path.basename(root) == 'values': xmlfiles += [os.path.join(root, x) for x in files if x.endswith('.xml')] def element_content(element): if element.text is None: return "" s = XMLElementTree.tostring(element, encoding='utf-8', method='text') return s.decode('utf-8').strip() for path in xmlfiles: if not os.path.isfile(path): continue xml = parse_xml(path) element = xml.find('string[@name="' + name + '"]') if element is not None: content = element_content(element) return retrieve_string(app_dir, content, xmlfiles) return '' def retrieve_string_singleline(app_dir, string, xmlfiles=None): return retrieve_string(app_dir, string, xmlfiles).replace('\n', ' ').strip() def manifest_paths(app_dir, flavours): """Return list of existing files that will be used to find the highest vercode.""" # TODO: Remove this in Python3.6 app_dir = str(app_dir) possible_manifests = \ [os.path.join(app_dir, 'AndroidManifest.xml'), os.path.join(app_dir, 'src', 'main', 'AndroidManifest.xml'), os.path.join(app_dir, 'src', 'AndroidManifest.xml'), os.path.join(app_dir, 'build.gradle'), os.path.join(app_dir, 'build-extras.gradle'), os.path.join(app_dir, 'build.gradle.kts')] for flavour in flavours: if flavour == 'yes': continue possible_manifests.append( os.path.join(app_dir, 'src', flavour, 'AndroidManifest.xml')) return [path for path in possible_manifests if os.path.isfile(path)] def fetch_real_name(app_dir, flavours): """Retrieve the package name. Returns the name, or None if not found.""" # TODO: Remove this in Python3.6 app_dir = str(app_dir) for path in manifest_paths(app_dir, flavours): if not path.endswith('.xml') or not os.path.isfile(path): continue logging.debug("fetch_real_name: Checking manifest at " + path) xml = parse_xml(path) app = xml.find('application') if app is None: continue if XMLNS_ANDROID + "label" not in app.attrib: continue label = app.attrib[XMLNS_ANDROID + "label"] result = retrieve_string_singleline(app_dir, label) if result: result = result.strip() return result return None def get_library_references(root_dir): libraries = [] proppath = os.path.join(root_dir, 'project.properties') if not os.path.isfile(proppath): return libraries with open(proppath, 'r', encoding='iso-8859-1') as f: for line in f: if not line.startswith('android.library.reference.'): continue path = line.split('=')[1].strip() relpath = os.path.join(root_dir, path) if not os.path.isdir(relpath): continue logging.debug("Found subproject at %s" % path) libraries.append(path) return libraries def ant_subprojects(root_dir): subprojects = get_library_references(root_dir) for subpath in subprojects: subrelpath = os.path.join(root_dir, subpath) for p in get_library_references(subrelpath): relp = os.path.normpath(os.path.join(subpath, p)) if relp not in subprojects: subprojects.insert(0, relp) return subprojects def remove_debuggable_flags(root_dir): # Remove forced debuggable flags logging.debug("Removing debuggable flags from %s" % root_dir) for root, dirs, files in os.walk(root_dir): if 'AndroidManifest.xml' in files and os.path.isfile(os.path.join(root, 'AndroidManifest.xml')): regsub_file(r'android:debuggable="[^"]*"', '', os.path.join(root, 'AndroidManifest.xml')) vcsearch_g = re.compile(r'''\b[Vv]ersionCode\s*=?\s*["'(]*([0-9][0-9_]*)["')]*''').search vnsearch_g = re.compile(r'''\b[Vv]ersionName\s*=?\s*\(?(["'])((?:(?=(\\?))\3.)*?)\1''').search vnssearch_g = re.compile(r'''\b[Vv]ersionNameSuffix\s*=?\s*(["'])((?:(?=(\\?))\3.)*?)\1''').search psearch_g = re.compile(r'''\b(packageName|applicationId)\s*=*\s*["']([^"']+)["']''').search fsearch_g = re.compile(r'''\b(applicationIdSuffix)\s*=*\s*["']([^"']+)["']''').search def app_matches_packagename(app, package): if not package: return False appid = app.UpdateCheckName or app.id if appid is None or appid == "Ignore": return True return appid == package def parse_androidmanifests(paths, app): """Extract some information from the AndroidManifest.xml at the given path. Returns (version, vercode, package), any or all of which might be None. All values returned are strings. Android Studio recommends "you use UTF-8 encoding whenever possible", so this code assumes the files use UTF-8. https://sites.google.com/a/android.com/tools/knownissues/encoding """ ignoreversions = app.UpdateCheckIgnore ignoresearch = re.compile(ignoreversions).search if ignoreversions else None if not paths: return (None, None, None) max_version = None max_vercode = None max_package = None for path in paths: # TODO: Remove this in Python3.6 path = str(path) if not os.path.isfile(path): continue logging.debug(_("Parsing manifest at '{path}'").format(path=path)) version = None vercode = None package = None flavours = None temp_app_id = None temp_version_name = None if len(app.get('Builds', [])) > 0 and 'gradle' in app['Builds'][-1] and app['Builds'][-1].gradle: flavours = app['Builds'][-1].gradle if path.endswith('.gradle') or path.endswith('.gradle.kts'): with open(path, 'r', encoding='utf-8') as f: android_plugin_file = False inside_flavour_group = 0 inside_required_flavour = 0 for line in f: if gradle_comment.match(line): continue if "applicationId" in line and not temp_app_id: matches = psearch_g(line) if matches: temp_app_id = matches.group(2) if "versionName" in line and not temp_version_name: matches = vnsearch_g(line) if matches: temp_version_name = matches.group(2) if inside_flavour_group > 0: if inside_required_flavour > 1: matches = psearch_g(line) if matches: s = matches.group(2) if app_matches_packagename(app, s): package = s else: # If build.gradle contains applicationIdSuffix add it to the end of package name matches = fsearch_g(line) if matches and temp_app_id: suffix = matches.group(2) temp_app_id = temp_app_id + suffix if app_matches_packagename(app, temp_app_id): package = temp_app_id matches = vnsearch_g(line) if matches: version = matches.group(2) else: # If build.gradle contains applicationNameSuffix add it to the end of version name matches = vnssearch_g(line) if matches and temp_version_name: name_suffix = matches.group(2) version = temp_version_name + name_suffix matches = vcsearch_g(line) if matches: vercode = matches.group(1) if inside_required_flavour > 0: if '{' in line: inside_required_flavour += 1 if '}' in line: inside_required_flavour -= 1 if inside_required_flavour == 1: inside_required_flavour -= 1 elif flavours: for flavour in flavours: if re.match(r'.*[\'"\s]{flavour}[\'"\s].*\{{.*'.format(flavour=flavour), line): inside_required_flavour = 2 break elif re.match(r'.*[\'"\s]{flavour}[\'"\s].*'.format(flavour=flavour), line): inside_required_flavour = 1 break if '{' in line: inside_flavour_group += 1 if '}' in line: inside_flavour_group -= 1 else: if "productFlavors" in line: inside_flavour_group = 1 if not package: matches = psearch_g(line) if matches: s = matches.group(2) if app_matches_packagename(app, s): package = s if not version: matches = vnsearch_g(line) if matches: version = matches.group(2) if not vercode: matches = vcsearch_g(line) if matches: vercode = matches.group(1) if not android_plugin_file and ANDROID_PLUGIN_REGEX.match(line): android_plugin_file = True if android_plugin_file: if package: max_package = package if version: max_version = version if vercode: max_vercode = vercode if max_package and max_version and max_vercode: break else: try: xml = parse_xml(path) if "package" in xml.attrib: s = xml.attrib["package"] if app_matches_packagename(app, s): package = s if XMLNS_ANDROID + "versionName" in xml.attrib: version = xml.attrib[XMLNS_ANDROID + "versionName"] base_dir = os.path.dirname(path) version = retrieve_string_singleline(base_dir, version) if XMLNS_ANDROID + "versionCode" in xml.attrib: a = xml.attrib[XMLNS_ANDROID + "versionCode"] if string_is_integer(a): vercode = a except Exception: logging.warning(_("Problem with xml at '{path}'").format(path=path)) # Remember package name, may be defined separately from version+vercode if package is None: package = max_package logging.debug("..got package={0}, version={1}, vercode={2}" .format(package, version, vercode)) # Always grab the package name and version name in case they are not # together with the highest version code if max_package is None and package is not None: max_package = package if max_version is None and version is not None: max_version = version if vercode is not None \ and (max_vercode is None or vercode > max_vercode): if version and (not ignoresearch or not ignoresearch(version)): if version is not None: max_version = version if vercode is not None: max_vercode = vercode if package is not None: max_package = package else: max_version = "Ignore" if max_version is None: max_version = "Unknown" if max_package: msg = _("Invalid application ID {appid}").format(appid=max_package) if not is_valid_package_name(max_package): raise FDroidException(msg) elif not is_strict_application_id(max_package): logging.warning(msg) return (max_version, max_vercode, max_package) def is_valid_package_name(name): """Check whether name is a valid fdroid package name. APKs and manually defined package names must use a valid Java Package Name. Automatically generated package names for non-APK files use the SHA-256 sum. """ return VALID_APPLICATION_ID_REGEX.match(name) is not None \ or FDROID_PACKAGE_NAME_REGEX.match(name) is not None def is_strict_application_id(name): """Check whether name is a valid Android Application ID. The Android ApplicationID is basically a Java Package Name, but with more restrictive naming rules: * It must have at least two segments (one or more dots). * Each segment must start with a letter. * All characters must be alphanumeric or an underscore [a-zA-Z0-9_]. References ---------- https://developer.android.com/studio/build/application-id """ return STRICT_APPLICATION_ID_REGEX.match(name) is not None \ and '.' in name def get_all_gradle_and_manifests(build_dir): paths = [] # TODO: Python3.6: Accepts a path-like object. for root, dirs, files in os.walk(str(build_dir)): for f in sorted(files): if f == 'AndroidManifest.xml' \ or f.endswith('.gradle') or f.endswith('.gradle.kts'): full = Path(root) / f paths.append(full) return paths def get_gradle_subdir(build_dir, paths): """Get the subdir where the gradle build is based.""" first_gradle_dir = None for path in paths: if not first_gradle_dir: first_gradle_dir = path.parent.relative_to(build_dir) if path.exists() and SETTINGS_GRADLE_REGEX.match(str(path.name)): for m in GRADLE_SUBPROJECT_REGEX.finditer(path.read_text(encoding='utf-8')): for f in (path.parent / m.group(1)).glob('build.gradle*'): with f.open(encoding='utf-8') as fp: for line in fp.readlines(): if ANDROID_PLUGIN_REGEX.match(line): return f.parent.relative_to(build_dir) if first_gradle_dir and first_gradle_dir != Path('.'): return first_gradle_dir return def getrepofrompage(url): """Get the repo type and address from the given web page. The page is scanned in a rather naive manner for 'git clone xxxx', 'hg clone xxxx', etc, and when one of these is found it's assumed that's the information we want. Returns repotype, address, or None, reason """ if not url.startswith('http'): return (None, _('{url} does not start with "http"!'.format(url=url))) req = urllib.request.urlopen(url) # nosec B310 non-http URLs are filtered out if req.getcode() != 200: return (None, 'Unable to get ' + url + ' - return code ' + str(req.getcode())) page = req.read().decode(req.headers.get_content_charset()) # Works for BitBucket m = re.search('data-fetch-url="(.*)"', page) if m is not None: repo = m.group(1) if repo.endswith('.git'): return ('git', repo) return ('hg', repo) # Works for BitBucket (obsolete) index = page.find('hg clone') if index != -1: repotype = 'hg' repo = page[index + 9:] index = repo.find('<') if index == -1: return (None, _("Error while getting repo address")) repo = repo[:index] repo = repo.split('"')[0] return (repotype, repo) # Works for BitBucket (obsolete) index = page.find('git clone') if index != -1: repotype = 'git' repo = page[index + 10:] index = repo.find('<') if index == -1: return (None, _("Error while getting repo address")) repo = repo[:index] repo = repo.split('"')[0] return (repotype, repo) return (None, _("No information found.") + page) def get_app_from_url(url): """Guess basic app metadata from the URL. The URL must include a network hostname, unless it is an lp:, file:, or git/ssh URL. This throws ValueError on bad URLs to match urlparse(). """ parsed = urllib.parse.urlparse(url) invalid_url = False if not parsed.scheme or not parsed.path: invalid_url = True app = fdroidserver.metadata.App() app.Repo = url if url.startswith('git://') or url.startswith('git@'): app.RepoType = 'git' elif parsed.netloc == 'github.com': app.RepoType = 'git' app.SourceCode = url app.IssueTracker = url + '/issues' elif parsed.netloc == 'gitlab.com' or parsed.netloc == 'framagit.org': # git can be fussy with gitlab URLs unless they end in .git if url.endswith('.git'): url = url[:-4] app.Repo = url + '.git' app.RepoType = 'git' app.SourceCode = url app.IssueTracker = url + '/issues' elif parsed.netloc == 'notabug.org': if url.endswith('.git'): url = url[:-4] app.Repo = url + '.git' app.RepoType = 'git' app.SourceCode = url app.IssueTracker = url + '/issues' elif parsed.netloc == 'bitbucket.org': if url.endswith('/'): url = url[:-1] app.SourceCode = url + '/src' app.IssueTracker = url + '/issues' # Figure out the repo type and adddress... app.RepoType, app.Repo = getrepofrompage(url) elif parsed.netloc == 'codeberg.org': app.RepoType = 'git' app.SourceCode = url app.IssueTracker = url + '/issues' elif url.startswith('https://') and url.endswith('.git'): app.RepoType = 'git' if not parsed.netloc and parsed.scheme in ('git', 'http', 'https', 'ssh'): invalid_url = True if invalid_url: raise ValueError(_('"{url}" is not a valid URL!'.format(url=url))) if not app.RepoType: raise FDroidException("Unable to determine vcs type. " + app.Repo) return app def parse_srclib_spec(spec): if type(spec) != str: raise MetaDataException(_("can not parse scrlib spec " "(not a string): '{}'") .format(spec)) tokens = spec.split('@') if len(tokens) > 2: raise MetaDataException(_("could not parse srclib spec " "(too many '@' signs): '{}'") .format(spec)) elif len(tokens) < 2: raise MetaDataException(_("could not parse srclib spec " "(no ref specified): '{}'") .format(spec)) name = tokens[0] ref = tokens[1] number = None subdir = None if ':' in name: number, name = name.split(':', 1) if '/' in name: name, subdir = name.split('/', 1) return (name, ref, number, subdir) def getsrclib(spec, srclib_dir, basepath=False, raw=False, prepare=True, preponly=False, refresh=True, build=None): """Get the specified source library. Returns the path to it. Normally this is the path to be used when referencing it, which may be a subdirectory of the actual project. If you want the base directory of the project, pass 'basepath=True'. """ number = None subdir = None if raw: name = spec ref = None else: name, ref = spec.split('@') if ':' in name: number, name = name.split(':', 1) if '/' in name: name, subdir = name.split('/', 1) if name not in fdroidserver.metadata.srclibs: raise VCSException('srclib ' + name + ' not found.') srclib = fdroidserver.metadata.srclibs[name] sdir = os.path.join(srclib_dir, name) if not preponly: vcs = getvcs(srclib["RepoType"], srclib["Repo"], sdir) vcs.srclib = (name, number, sdir) if ref: vcs.gotorevision(ref, refresh) if raw: return vcs libdir = None if subdir: libdir = os.path.join(sdir, subdir) elif srclib["Subdir"]: for subdir in srclib["Subdir"]: libdir_candidate = os.path.join(sdir, subdir) if os.path.exists(libdir_candidate): libdir = libdir_candidate break if libdir is None: libdir = sdir remove_signing_keys(sdir) remove_debuggable_flags(sdir) if prepare: if srclib["Prepare"]: cmd = replace_config_vars(srclib["Prepare"], build) p = FDroidPopen(['bash', '-x', '-c', '--', cmd], cwd=libdir) if p.returncode != 0: raise BuildException("Error running prepare command for srclib %s" % name, p.output) if basepath: libdir = sdir return (name, number, libdir) gradle_version_regex = re.compile(r"[^/]*'com\.android\.tools\.build:gradle:([^\.]+\.[^\.]+).*'.*") def prepare_source(vcs, app, build, build_dir, srclib_dir, extlib_dir, onserver=False, refresh=True): """Prepare the source code for a particular build. Parameters ---------- vcs the appropriate vcs object for the application app the application details from the metadata build the build details from the metadata build_dir the path to the build directory, usually 'build/app.id' srclib_dir the path to the source libraries directory, usually 'build/srclib' extlib_dir the path to the external libraries directory, usually 'build/extlib' Returns ------- root is the root directory, which may be the same as 'build_dir' or may be a subdirectory of it. srclibpaths is information on the srclibs being used """ # Optionally, the actual app source can be in a subdirectory if build.subdir: root_dir = os.path.join(build_dir, build.subdir) else: root_dir = build_dir # Get a working copy of the right revision logging.info("Getting source for revision " + build.commit) vcs.gotorevision(build.commit, refresh) # Initialise submodules if required if build.submodules: logging.info(_("Initialising submodules")) vcs.initsubmodules() # Check that a subdir (if we're using one) exists. This has to happen # after the checkout, since it might not exist elsewhere if not os.path.exists(root_dir): raise BuildException('Missing subdir ' + root_dir) # Run an init command if one is required if build.init: cmd = replace_config_vars(build.init, build) logging.info("Running 'init' commands in %s" % root_dir) p = FDroidPopen(['bash', '-x', '-c', '--', cmd], cwd=root_dir) if p.returncode != 0: raise BuildException("Error running init command for %s:%s" % (app.id, build.versionName), p.output) # Apply patches if any if build.patch: logging.info("Applying patches") for patch in build.patch: patch = patch.strip() logging.info("Applying " + patch) patch_path = os.path.join('metadata', app.id, patch) p = FDroidPopen(['patch', '-p1', '-i', os.path.abspath(patch_path)], cwd=build_dir) if p.returncode != 0: raise BuildException("Failed to apply patch %s" % patch_path) # Get required source libraries srclibpaths = [] if build.srclibs: logging.info("Collecting source libraries") for lib in build.srclibs: srclibpaths.append(getsrclib(lib, srclib_dir, preponly=onserver, refresh=refresh, build=build)) for name, number, libpath in srclibpaths: place_srclib(root_dir, int(number) if number else None, libpath) basesrclib = vcs.getsrclib() # If one was used for the main source, add that too. if basesrclib: srclibpaths.append(basesrclib) # Update the local.properties file localprops = [os.path.join(build_dir, 'local.properties')] if build.subdir: parts = build.subdir.split(os.sep) cur = build_dir for d in parts: cur = os.path.join(cur, d) localprops += [os.path.join(cur, 'local.properties')] for path in localprops: props = "" if os.path.isfile(path): logging.info("Updating local.properties file at %s" % path) with open(path, 'r', encoding='iso-8859-1') as f: props += f.read() props += '\n' else: logging.info("Creating local.properties file at %s" % path) # Fix old-fashioned 'sdk-location' by copying # from sdk.dir, if necessary if build.oldsdkloc: sdkloc = re.match(r".*^sdk.dir=(\S+)$.*", props, re.S | re.M).group(1) props += "sdk-location=%s\n" % sdkloc else: props += "sdk.dir=%s\n" % config['sdk_path'] props += "sdk-location=%s\n" % config['sdk_path'] ndk_path = build.ndk_path() # if for any reason the path isn't valid or the directory # doesn't exist, some versions of Gradle will error with a # cryptic message (even if the NDK is not even necessary). # https://gitlab.com/fdroid/fdroidserver/issues/171 if ndk_path and os.path.exists(ndk_path): # Add ndk location props += "ndk.dir=%s\n" % ndk_path props += "ndk-location=%s\n" % ndk_path # Add java.encoding if necessary if build.encoding: props += "java.encoding=%s\n" % build.encoding with open(path, 'w', encoding='iso-8859-1') as f: f.write(props) flavours = [] if build.build_method() == 'gradle': flavours = build.gradle if build.target: n = build.target.split('-')[1] build_gradle = os.path.join(root_dir, "build.gradle") build_gradle_kts = build_gradle + ".kts" if os.path.exists(build_gradle): gradlefile = build_gradle elif os.path.exists(build_gradle_kts): gradlefile = build_gradle_kts regsub_file(r'compileSdkVersion[ =]+[0-9]+', r'compileSdkVersion %s' % n, gradlefile) # Remove forced debuggable flags remove_debuggable_flags(root_dir) # Insert version code and number into the manifest if necessary if build.forceversion: logging.info("Changing the version name") for path in manifest_paths(root_dir, flavours): if not os.path.isfile(path): continue if path.endswith('.xml'): regsub_file(r'android:versionName="[^"]*"', r'android:versionName="%s"' % build.versionName, path) elif path.endswith('.gradle'): regsub_file(r"""(\s*)versionName[\s'"=]+.*""", r"""\1versionName '%s'""" % build.versionName, path) if build.forcevercode: logging.info("Changing the version code") for path in manifest_paths(root_dir, flavours): if not os.path.isfile(path): continue if path.endswith('.xml'): regsub_file(r'android:versionCode="[^"]*"', r'android:versionCode="%s"' % build.versionCode, path) elif path.endswith('.gradle'): regsub_file(r'versionCode[ =]+[0-9]+', r'versionCode %s' % build.versionCode, path) # Delete unwanted files if build.rm: logging.info(_("Removing specified files")) for part in getpaths(build_dir, build.rm): dest = os.path.join(build_dir, part) logging.info("Removing {0}".format(part)) if os.path.lexists(dest): # rmtree can only handle directories that are not symlinks, so catch anything else if not os.path.isdir(dest) or os.path.islink(dest): os.remove(dest) else: shutil.rmtree(dest) else: logging.info("...but it didn't exist") remove_signing_keys(build_dir) # Add required external libraries if build.extlibs: logging.info("Collecting prebuilt libraries") libsdir = os.path.join(root_dir, 'libs') if not os.path.exists(libsdir): os.mkdir(libsdir) for lib in build.extlibs: lib = lib.strip() logging.info("...installing extlib {0}".format(lib)) libf = os.path.basename(lib) libsrc = os.path.join(extlib_dir, lib) if not os.path.exists(libsrc): raise BuildException("Missing extlib file {0}".format(libsrc)) shutil.copyfile(libsrc, os.path.join(libsdir, libf)) # Add extlibs to scanignore (this is relative to the build dir root, *sigh*) if build.subdir: scanignorepath = os.path.join(build.subdir, 'libs', libf) else: scanignorepath = os.path.join('libs', libf) if scanignorepath not in build.scanignore: build.scanignore.append(scanignorepath) # Run a pre-build command if one is required if build.prebuild: logging.info("Running 'prebuild' commands in %s" % root_dir) cmd = replace_config_vars(build.prebuild, build) # Substitute source library paths into prebuild commands for name, number, libpath in srclibpaths: cmd = cmd.replace('$$' + name + '$$', os.path.join(os.getcwd(), libpath)) p = FDroidPopen(['bash', '-x', '-c', '--', cmd], cwd=root_dir) if p.returncode != 0: raise BuildException("Error running prebuild command for %s:%s" % (app.id, build.versionName), p.output) # Generate (or update) the ant build file, build.xml... if build.build_method() == 'ant' and build.androidupdate != ['no']: parms = ['android', 'update', 'lib-project'] lparms = ['android', 'update', 'project'] if build.target: parms += ['-t', build.target] lparms += ['-t', build.target] if build.androidupdate: update_dirs = build.androidupdate else: update_dirs = ant_subprojects(root_dir) + ['.'] for d in update_dirs: subdir = os.path.join(root_dir, d) if d == '.': logging.debug("Updating main project") cmd = parms + ['-p', d] else: logging.debug("Updating subproject %s" % d) cmd = lparms + ['-p', d] p = SdkToolsPopen(cmd, cwd=root_dir) # Check to see whether an error was returned without a proper exit # code (this is the case for the 'no target set or target invalid' # error) if p.returncode != 0 or p.output.startswith("Error: "): raise BuildException("Failed to update project at %s" % d, p.output) # Clean update dirs via ant if d != '.': logging.info("Cleaning subproject %s" % d) p = FDroidPopen(['ant', 'clean'], cwd=subdir) return (root_dir, srclibpaths) def getpaths_map(build_dir, globpaths): """Extend via globbing the paths from a field and return them as a map from original path to resulting paths.""" paths = dict() for p in globpaths: p = p.strip() full_path = os.path.join(build_dir, p) full_path = os.path.normpath(full_path) paths[p] = [r[len(build_dir) + 1:] for r in glob.glob(full_path)] if not paths[p]: raise FDroidException("glob path '%s' did not match any files/dirs" % p) return paths def getpaths(build_dir, globpaths): """Extend via globbing the paths from a field and return them as a set.""" paths_map = getpaths_map(build_dir, globpaths) paths = set() for k, v in paths_map.items(): for p in v: paths.add(p) return paths def natural_key(s): return [int(sp) if sp.isdigit() else sp for sp in re.split(r'(\d+)', s)] def check_system_clock(dt_obj, path): """Check if system clock is updated based on provided date. If an APK has files newer than the system time, suggest updating the system clock. This is useful for offline systems, used for signing, which do not have another source of clock sync info. It has to be more than 24 hours newer because ZIP/APK files do not store timezone info """ checkdt = dt_obj - timedelta(1) if datetime.today() < checkdt: logging.warning(_('System clock is older than date in {path}!').format(path=path) + '\n' + _('Set clock to that time using:') + '\n' + 'sudo date -s "' + str(dt_obj) + '"') class KnownApks: """Permanent store of existing APKs with the date they were added. This is currently the only way to permanently store the "updated" date of APKs. """ def __init__(self): """Load filename/date info about previously seen APKs. Since the appid and date strings both will never have spaces, this is parsed as a list from the end to allow the filename to have any combo of spaces. """ self.path = os.path.join('stats', 'known_apks.txt') self.apks = {} if os.path.isfile(self.path): with open(self.path, 'r', encoding='utf-8') as f: for line in f: t = line.rstrip().split(' ') if len(t) == 2: self.apks[t[0]] = (t[1], None) else: appid = t[-2] date = datetime.strptime(t[-1], '%Y-%m-%d') filename = line[0:line.rfind(appid) - 1] self.apks[filename] = (appid, date) check_system_clock(date, self.path) self.changed = False def writeifchanged(self): if not self.changed: return if not os.path.exists('stats'): os.mkdir('stats') lst = [] for apk, app in self.apks.items(): appid, added = app line = apk + ' ' + appid if added: line += ' ' + added.strftime('%Y-%m-%d') lst.append(line) with open(self.path, 'w') as f: for line in sorted(lst, key=natural_key): f.write(line + '\n') def recordapk(self, apkName, app, default_date=None): """ Record an APK (if it's new, otherwise does nothing). Returns ------- datetime the date it was added as a datetime instance. """ if apkName not in self.apks: if default_date is None: default_date = datetime.utcnow() self.apks[apkName] = (app, default_date) self.changed = True _ignored, added = self.apks[apkName] return added def getapp(self, apkname): """Look up information - given the 'apkname'. Returns (app id, date added/None). Or returns None for an unknown apk. """ if apkname in self.apks: return self.apks[apkname] return None def getlatest(self, num): """Get the most recent 'num' apps added to the repo, as a list of package ids with the most recent first.""" apps = {} for apk, app in self.apks.items(): appid, added = app if added: if appid in apps: if apps[appid] > added: apps[appid] = added else: apps[appid] = added sortedapps = sorted(apps.items(), key=operator.itemgetter(1))[-num:] lst = [app for app, _ignored in sortedapps] lst.reverse() return lst def get_file_extension(filename): """Get the normalized file extension, can be blank string but never None.""" if isinstance(filename, bytes): filename = filename.decode('utf-8') return os.path.splitext(filename)[1].lower()[1:] def use_androguard(): """Report if androguard is available, and config its debug logging.""" try: import androguard if use_androguard.show_path: logging.debug(_('Using androguard from "{path}"').format(path=androguard.__file__)) use_androguard.show_path = False if options and options.verbose: logging.getLogger("androguard.axml").setLevel(logging.INFO) return True except ImportError: return False use_androguard.show_path = True # type: ignore def _get_androguard_APK(apkfile): try: from androguard.core.bytecodes.apk import APK except ImportError: raise FDroidException("androguard library is not installed") return APK(apkfile) def ensure_final_value(packageName, arsc, value): """Ensure incoming value is always the value, not the resid. androguard will sometimes return the Android "resId" aka Resource ID instead of the actual value. This checks whether the value is actually a resId, then performs the Android Resource lookup as needed. """ if value: returnValue = value if value[0] == '@': try: # can be a literal value or a resId res_id = int('0x' + value[1:], 16) res_id = arsc.get_id(packageName, res_id)[1] returnValue = arsc.get_string(packageName, res_id)[1] except (ValueError, TypeError): pass return returnValue return '' def is_apk_and_debuggable(apkfile): """Return True if the given file is an APK and is debuggable. Parse only from the APK. Parameters ---------- apkfile full path to the APK to check """ if get_file_extension(apkfile) != 'apk': return False from androguard.core.bytecodes.axml import AXMLParser, format_value, START_TAG with ZipFile(apkfile) as apk: with apk.open('AndroidManifest.xml') as manifest: axml = AXMLParser(manifest.read()) while axml.is_valid(): _type = next(axml) if _type == START_TAG and axml.getName() == 'application': for i in range(0, axml.getAttributeCount()): name = axml.getAttributeName(i) if name == 'debuggable': _type = axml.getAttributeValueType(i) _data = axml.getAttributeValueData(i) value = format_value(_type, _data, lambda _: axml.getAttributeValue(i)) if value == 'true': return True else: return False break return False def get_apk_id(apkfile): """Extract identification information from APK. Androguard is preferred since it is more reliable and a lot faster. Occasionally, when androguard can't get the info from the APK, aapt still can. So aapt is also used as the final fallback method. Parameters ---------- apkfile path to an APK file. Returns ------- appid version code version name """ try: return get_apk_id_androguard(apkfile) except zipfile.BadZipFile as e: logging.error(apkfile + ': ' + str(e)) if 'aapt' in config: return get_apk_id_aapt(apkfile) def get_apk_id_androguard(apkfile): """Read (appid, versionCode, versionName) from an APK. This first tries to do quick binary XML parsing to just get the values that are needed. It will fallback to full androguard parsing, which is slow, if it can't find the versionName value or versionName is set to a Android String Resource (e.g. an integer hex value that starts with @). """ if not os.path.exists(apkfile): raise FDroidException(_("Reading packageName/versionCode/versionName failed, APK invalid: '{apkfilename}'") .format(apkfilename=apkfile)) from androguard.core.bytecodes.axml import AXMLParser, format_value, START_TAG, END_TAG, TEXT, END_DOCUMENT appid = None versionCode = None versionName = None with zipfile.ZipFile(apkfile) as apk: with apk.open('AndroidManifest.xml') as manifest: axml = AXMLParser(manifest.read()) count = 0 while axml.is_valid(): _type = next(axml) count += 1 if _type == START_TAG: for i in range(0, axml.getAttributeCount()): name = axml.getAttributeName(i) _type = axml.getAttributeValueType(i) _data = axml.getAttributeValueData(i) value = format_value(_type, _data, lambda _: axml.getAttributeValue(i)) if appid is None and name == 'package': appid = value elif versionCode is None and name == 'versionCode': if value.startswith('0x'): versionCode = str(int(value, 16)) else: versionCode = value elif versionName is None and name == 'versionName': versionName = value if axml.getName() == 'manifest': break elif _type == END_TAG or _type == TEXT or _type == END_DOCUMENT: raise RuntimeError('{path}: must be the first element in AndroidManifest.xml' .format(path=apkfile)) if not versionName or versionName[0] == '@': a = _get_androguard_APK(apkfile) versionName = ensure_final_value(a.package, a.get_android_resources(), a.get_androidversion_name()) if not versionName: versionName = '' # versionName is expected to always be a str return appid, versionCode, versionName.strip('\0') def get_apk_id_aapt(apkfile): p = SdkToolsPopen(['aapt', 'dump', 'badging', apkfile], output=False) m = APK_ID_TRIPLET_REGEX.match(p.output[0:p.output.index('\n')]) if m: return m.group(1), m.group(2), m.group(3) raise FDroidException(_("Reading packageName/versionCode/versionName failed, APK invalid: '{apkfilename}'") .format(apkfilename=apkfile)) def get_native_code(apkfile): """Aapt checks if there are architecture folders under the lib/ folder. We are simulating the same behaviour. """ arch_re = re.compile("^lib/(.*)/.*$") archset = set() with ZipFile(apkfile) as apk: for filename in apk.namelist(): m = arch_re.match(filename) if m: archset.add(m.group(1)) return sorted(list(archset)) class PopenResult: def __init__(self): self.returncode = None self.output = None def SdkToolsPopen(commands, cwd=None, output=True): cmd = commands[0] if cmd not in config: config[cmd] = find_sdk_tools_cmd(commands[0]) abscmd = config[cmd] if abscmd is None: raise FDroidException(_("Could not find '{command}' on your system").format(command=cmd)) if cmd == 'aapt': test_aapt_version(config['aapt']) return FDroidPopen([abscmd] + commands[1:], cwd=cwd, output=output) def FDroidPopenBytes(commands, cwd=None, envs=None, output=True, stderr_to_stdout=True): """ Run a command and capture the possibly huge output as bytes. Parameters ---------- commands command and argument list like in subprocess.Popen cwd optionally specifies a working directory envs a optional dictionary of environment variables and their values Returns ------- A PopenResult. """ global env if env is None: set_FDroidPopen_env() process_env = env.copy() if envs is not None and len(envs) > 0: process_env.update(envs) if cwd: cwd = os.path.normpath(cwd) logging.debug("Directory: %s" % cwd) logging.debug("> %s" % ' '.join(commands)) stderr_param = subprocess.STDOUT if stderr_to_stdout else subprocess.PIPE result = PopenResult() p = None try: p = subprocess.Popen(commands, cwd=cwd, shell=False, env=process_env, stdin=subprocess.DEVNULL, stdout=subprocess.PIPE, stderr=stderr_param) except OSError as e: raise BuildException("OSError while trying to execute " + ' '.join(commands) + ': ' + str(e)) # TODO are these AsynchronousFileReader threads always exiting? if not stderr_to_stdout and options.verbose: stderr_queue = Queue() stderr_reader = AsynchronousFileReader(p.stderr, stderr_queue) while not stderr_reader.eof(): while not stderr_queue.empty(): line = stderr_queue.get() sys.stderr.buffer.write(line) sys.stderr.flush() time.sleep(0.1) stdout_queue = Queue() stdout_reader = AsynchronousFileReader(p.stdout, stdout_queue) buf = io.BytesIO() # Check the queue for output (until there is no more to get) while not stdout_reader.eof(): while not stdout_queue.empty(): line = stdout_queue.get() if output and options.verbose: # Output directly to console sys.stderr.buffer.write(line) sys.stderr.flush() buf.write(line) time.sleep(0.1) result.returncode = p.wait() result.output = buf.getvalue() buf.close() # make sure all filestreams of the subprocess are closed for streamvar in ['stdin', 'stdout', 'stderr']: if hasattr(p, streamvar): stream = getattr(p, streamvar) if stream: stream.close() return result def FDroidPopen(commands, cwd=None, envs=None, output=True, stderr_to_stdout=True): """ Run a command and capture the possibly huge output as a str. Parameters ---------- commands command and argument list like in subprocess.Popen cwd optionally specifies a working directory envs a optional dictionary of environment variables and their values Returns ------- A PopenResult. """ result = FDroidPopenBytes(commands, cwd, envs, output, stderr_to_stdout) result.output = result.output.decode('utf-8', 'ignore') return result gradle_comment = re.compile(r'[ ]*//') gradle_signing_configs = re.compile(r'^[\t ]*signingConfigs[ \t]*{[ \t]*$') gradle_line_matches = [ re.compile(r'^[\t ]*signingConfig\s*[= ]\s*[^ ]*$'), re.compile(r'.*android\.signingConfigs\.[^{]*$'), re.compile(r'.*release\.signingConfig *= *'), ] def remove_signing_keys(build_dir): for root, dirs, files in os.walk(build_dir): gradlefile = None if 'build.gradle' in files: gradlefile = "build.gradle" elif 'build.gradle.kts' in files: gradlefile = "build.gradle.kts" if gradlefile: path = os.path.join(root, gradlefile) with open(path, "r") as o: lines = o.readlines() changed = False opened = 0 i = 0 with open(path, "w") as o: while i < len(lines): line = lines[i] i += 1 while line.endswith('\\\n'): line = line.rstrip('\\\n') + lines[i] i += 1 if gradle_comment.match(line): o.write(line) continue if opened > 0: opened += line.count('{') opened -= line.count('}') continue if gradle_signing_configs.match(line): changed = True opened += 1 continue if any(s.match(line) for s in gradle_line_matches): changed = True continue if opened == 0: o.write(line) if changed: logging.info("Cleaned %s of keysigning configs at %s" % (gradlefile, path)) for propfile in [ 'project.properties', 'build.properties', 'default.properties', 'ant.properties', ]: if propfile in files: path = os.path.join(root, propfile) with open(path, "r", encoding='iso-8859-1') as o: lines = o.readlines() changed = False with open(path, "w", encoding='iso-8859-1') as o: for line in lines: if any(line.startswith(s) for s in ('key.store', 'key.alias')): changed = True continue o.write(line) if changed: logging.info("Cleaned %s of keysigning configs at %s" % (propfile, path)) def set_FDroidPopen_env(build=None): """Set up the environment variables for the build environment. There is only a weak standard, the variables used by gradle, so also set up the most commonly used environment variables for SDK and NDK. Also, if there is no locale set, this will set the locale (e.g. LANG) to en_US.UTF-8. """ global env, orig_path if env is None: env = os.environ orig_path = env['PATH'] if config: if config.get('sdk_path'): for n in ['ANDROID_HOME', 'ANDROID_SDK', 'ANDROID_SDK_ROOT']: env[n] = config['sdk_path'] for k, v in config.get('java_paths', {}).items(): env['JAVA%s_HOME' % k] = v missinglocale = True for k, v in env.items(): if k == 'LANG' and v != 'C': missinglocale = False elif k == 'LC_ALL': missinglocale = False if missinglocale: env['LANG'] = 'en_US.UTF-8' if build is not None: path = build.ndk_path() paths = orig_path.split(os.pathsep) if path not in paths: paths = [path] + paths env['PATH'] = os.pathsep.join(paths) for n in ['ANDROID_NDK', 'NDK', 'ANDROID_NDK_HOME']: env[n] = build.ndk_path() def replace_build_vars(cmd, build): cmd = cmd.replace('$$COMMIT$$', build.commit) cmd = cmd.replace('$$VERSION$$', build.versionName) cmd = cmd.replace('$$VERCODE$$', str(build.versionCode)) return cmd def replace_config_vars(cmd, build): cmd = cmd.replace('$$SDK$$', config['sdk_path']) cmd = cmd.replace('$$NDK$$', build.ndk_path()) cmd = cmd.replace('$$MVN3$$', config['mvn3']) if build is not None: cmd = replace_build_vars(cmd, build) return cmd def place_srclib(root_dir, number, libpath): if not number: return relpath = os.path.relpath(libpath, root_dir) proppath = os.path.join(root_dir, 'project.properties') lines = [] if os.path.isfile(proppath): with open(proppath, "r", encoding='iso-8859-1') as o: lines = o.readlines() with open(proppath, "w", encoding='iso-8859-1') as o: placed = False for line in lines: if line.startswith('android.library.reference.%d=' % number): o.write('android.library.reference.%d=%s\n' % (number, relpath)) placed = True else: o.write(line) if not placed: o.write('android.library.reference.%d=%s\n' % (number, relpath)) APK_SIGNATURE_FILES = re.compile(r'META-INF/[0-9A-Za-z_\-]+\.(SF|RSA|DSA|EC)') def signer_fingerprint_short(cert_encoded): """Obtain shortened sha256 signing-key fingerprint for pkcs7 DER certficate. Extracts the first 7 hexadecimal digits of sha256 signing-key fingerprint for a given pkcs7 signature. Parameters ---------- cert_encoded Contents of an APK signing certificate. Returns ------- shortened signing-key fingerprint. """ return signer_fingerprint(cert_encoded)[:7] def signer_fingerprint(cert_encoded): """Obtain sha256 signing-key fingerprint for pkcs7 DER certificate. Extracts hexadecimal sha256 signing-key fingerprint string for a given pkcs7 signature. Parameters ---------- Contents of an APK signature. Returns ------- shortened signature fingerprint. """ return hashlib.sha256(cert_encoded).hexdigest() def get_first_signer_certificate(apkpath): """Get the first signing certificate from the APK, DER-encoded.""" certs = None cert_encoded = None with zipfile.ZipFile(apkpath, 'r') as apk: cert_files = [n for n in apk.namelist() if SIGNATURE_BLOCK_FILE_REGEX.match(n)] if len(cert_files) > 1: logging.error(_("Found multiple JAR Signature Block Files in {path}").format(path=apkpath)) return None elif len(cert_files) == 1: cert_encoded = get_certificate(apk.read(cert_files[0])) if not cert_encoded and use_androguard(): apkobject = _get_androguard_APK(apkpath) certs = apkobject.get_certificates_der_v2() if len(certs) > 0: logging.debug(_('Using APK Signature v2')) cert_encoded = certs[0] if not cert_encoded: certs = apkobject.get_certificates_der_v3() if len(certs) > 0: logging.debug(_('Using APK Signature v3')) cert_encoded = certs[0] if not cert_encoded: logging.error(_("No signing certificates found in {path}").format(path=apkpath)) return None return cert_encoded def apk_signer_fingerprint(apk_path): """Obtain sha256 signing-key fingerprint for APK. Extracts hexadecimal sha256 signing-key fingerprint string for a given APK. Parameters ---------- apk_path path to APK Returns ------- signature fingerprint """ cert_encoded = get_first_signer_certificate(apk_path) if not cert_encoded: return None return signer_fingerprint(cert_encoded) def apk_signer_fingerprint_short(apk_path): """Obtain shortened sha256 signing-key fingerprint for APK. Extracts the first 7 hexadecimal digits of sha256 signing-key fingerprint for a given pkcs7 APK. Parameters ---------- apk_path path to APK Returns ------- shortened signing-key fingerprint """ return apk_signer_fingerprint(apk_path)[:7] def metadata_get_sigdir(appid, vercode=None): """Get signature directory for app.""" if vercode: return os.path.join('metadata', appid, 'signatures', str(vercode)) else: return os.path.join('metadata', appid, 'signatures') def metadata_find_developer_signature(appid, vercode=None): """Try to find the developer signature for given appid. This picks the first signature file found in metadata an returns its signature. Returns ------- sha256 signing key fingerprint of the developer signing key. None in case no signature can not be found. """ # fetch list of dirs for all versions of signatures appversigdirs = [] if vercode: appversigdirs.append(metadata_get_sigdir(appid, vercode)) else: appsigdir = metadata_get_sigdir(appid) if os.path.isdir(appsigdir): numre = re.compile('[0-9]+') for ver in os.listdir(appsigdir): if numre.match(ver): appversigdir = os.path.join(appsigdir, ver) appversigdirs.append(appversigdir) for sigdir in appversigdirs: signature_block_files = ( glob.glob(os.path.join(sigdir, '*.DSA')) + glob.glob(os.path.join(sigdir, '*.EC')) + glob.glob(os.path.join(sigdir, '*.RSA')) ) if len(signature_block_files) > 1: raise FDroidException('ambiguous signatures, please make sure there is only one signature in \'{}\'. (The signature has to be the App maintainers signature for version of the APK.)'.format(sigdir)) for signature_block_file in signature_block_files: with open(signature_block_file, 'rb') as f: return signer_fingerprint(get_certificate(f.read())) return None def metadata_find_signing_files(appid, vercode): """Get a list of signed manifests and signatures. Parameters ---------- appid app id string vercode app version code Returns ------- List of 4-tuples for each signing key with following paths: (signature_file, signature_block_file, manifest, v2_files), where v2_files is either a (apk_signing_block_offset_file, apk_signing_block_file) pair or None References ---------- * https://docs.oracle.com/javase/tutorial/deployment/jar/intro.html * https://source.android.com/security/apksigning/v2 * https://source.android.com/security/apksigning/v3 """ ret = [] sigdir = metadata_get_sigdir(appid, vercode) signature_block_files = ( glob.glob(os.path.join(sigdir, '*.DSA')) + glob.glob(os.path.join(sigdir, '*.EC')) + glob.glob(os.path.join(sigdir, '*.RSA')) ) signature_block_pat = re.compile(r'(\.DSA|\.EC|\.RSA)$') apk_signing_block = os.path.join(sigdir, "APKSigningBlock") apk_signing_block_offset = os.path.join(sigdir, "APKSigningBlockOffset") if os.path.isfile(apk_signing_block) and os.path.isfile(apk_signing_block_offset): v2_files = apk_signing_block, apk_signing_block_offset else: v2_files = None for signature_block_file in signature_block_files: signature_file = signature_block_pat.sub('.SF', signature_block_file) if os.path.isfile(signature_file): manifest = os.path.join(sigdir, 'MANIFEST.MF') if os.path.isfile(manifest): ret.append((signature_block_file, signature_file, manifest, v2_files)) return ret def metadata_find_developer_signing_files(appid, vercode): """Get developer signature files for specified app from metadata. Returns ------- List of 4-tuples for each signing key with following paths: (signature_file, signature_block_file, manifest, v2_files), where v2_files is either a (apk_signing_block_offset_file, apk_signing_block_file) pair or None """ allsigningfiles = metadata_find_signing_files(appid, vercode) if allsigningfiles and len(allsigningfiles) == 1: return allsigningfiles[0] else: return None class ClonedZipInfo(zipfile.ZipInfo): """Hack to allow fully cloning ZipInfo instances. The zipfile library has some bugs that prevent it from fully cloning ZipInfo entries. https://bugs.python.org/issue43547 """ def __init__(self, zinfo): self.original = zinfo for k in self.__slots__: try: setattr(self, k, getattr(zinfo, k)) except AttributeError: pass def __getattribute__(self, name): if name in ("date_time", "external_attr", "flag_bits"): return getattr(self.original, name) return object.__getattribute__(self, name) def apk_has_v1_signatures(apkfile): """Test whether an APK has v1 signature files.""" with ZipFile(apkfile, 'r') as apk: for info in apk.infolist(): if APK_SIGNATURE_FILES.match(info.filename): return True return False def apk_strip_v1_signatures(signed_apk, strip_manifest=False): """Remove signatures from APK. Parameters ---------- signed_apk path to APK file. strip_manifest when set to True also the manifest file will be removed from the APK. """ with tempfile.TemporaryDirectory() as tmpdir: tmp_apk = os.path.join(tmpdir, 'tmp.apk') shutil.move(signed_apk, tmp_apk) with ZipFile(tmp_apk, 'r') as in_apk: with ZipFile(signed_apk, 'w') as out_apk: for info in in_apk.infolist(): if not APK_SIGNATURE_FILES.match(info.filename): if strip_manifest: if info.filename != 'META-INF/MANIFEST.MF': buf = in_apk.read(info.filename) out_apk.writestr(ClonedZipInfo(info), buf) else: buf = in_apk.read(info.filename) out_apk.writestr(ClonedZipInfo(info), buf) def _zipalign(unsigned_apk, aligned_apk): """Run 'zipalign' using standard flags used by Gradle Android Plugin. -p was added in build-tools-23.0.0 References ---------- https://developer.android.com/studio/publish/app-signing#sign-manually """ p = SdkToolsPopen(['zipalign', '-v', '-p', '4', unsigned_apk, aligned_apk]) if p.returncode != 0: raise BuildException("Failed to align application") def apk_implant_signatures(apkpath, outpath, manifest): """Implant a signature from metadata into an APK. Note: this changes there supplied APK in place. So copy it if you need the original to be preserved. Parameters ---------- apkpath location of the unsigned apk outpath location of the output apk References ---------- * https://docs.oracle.com/javase/tutorial/deployment/jar/intro.html * https://source.android.com/security/apksigning/v2 * https://source.android.com/security/apksigning/v3 """ sigdir = os.path.dirname(manifest) # FIXME apksigcopier.do_patch(sigdir, apkpath, outpath, v1_only=None) def apk_extract_signatures(apkpath, outdir): """Extract a signature files from APK and puts them into target directory. Parameters ---------- apkpath location of the apk outdir older where the extracted signature files will be stored References ---------- * https://docs.oracle.com/javase/tutorial/deployment/jar/intro.html * https://source.android.com/security/apksigning/v2 * https://source.android.com/security/apksigning/v3 """ apksigcopier.do_extract(apkpath, outdir, v1_only=None) def get_min_sdk_version(apk): """Wrap the androguard function to always return and int. Fall back to 1 if we can't get a valid minsdk version. Parameters ---------- apk androguard APK object Returns ------- minsdk: int """ try: return int(apk.get_min_sdk_version()) except TypeError: return 1 def sign_apk(unsigned_path, signed_path, keyalias): """Sign and zipalign an unsigned APK, then save to a new file, deleting the unsigned. NONE is a Java keyword used to configure smartcards as the keystore. Otherwise, the keystore is a local file. https://docs.oracle.com/javase/7/docs/technotes/guides/security/p11guide.html#KeyToolJarSigner When using smartcards, apksigner does not use the same options has Java/keytool/jarsigner (-providerName, -providerClass, -providerArg, -storetype). apksigner documents the options as --ks-provider-class and --ks-provider-arg. Those seem to be accepted but fail when actually making a signature with weird internal exceptions. We use the options that actually work. From: https://geoffreymetais.github.io/code/key-signing/#scripting """ if config['keystore'] == 'NONE': apksigner_smartcardoptions = config['smartcardoptions'].copy() if '-providerName' in apksigner_smartcardoptions: pos = config['smartcardoptions'].index('-providerName') # remove -providerName and it's argument del apksigner_smartcardoptions[pos] del apksigner_smartcardoptions[pos] replacements = {'-storetype': '--ks-type', '-providerClass': '--provider-class', '-providerArg': '--provider-arg'} signing_args = [replacements.get(n, n) for n in apksigner_smartcardoptions] else: signing_args = ['--key-pass', 'env:FDROID_KEY_PASS'] apksigner = config.get('apksigner', '') if not shutil.which(apksigner): raise BuildException(_("apksigner not found, it's required for signing!")) cmd = [apksigner, 'sign', '--ks', config['keystore'], '--ks-pass', 'env:FDROID_KEY_STORE_PASS'] cmd += signing_args cmd += ['--ks-key-alias', keyalias, '--in', unsigned_path, '--out', signed_path] p = FDroidPopen(cmd, envs={ 'FDROID_KEY_STORE_PASS': config['keystorepass'], 'FDROID_KEY_PASS': config.get('keypass', "")}) if p.returncode != 0: raise BuildException(_("Failed to sign application"), p.output) os.remove(unsigned_path) def verify_apks(signed_apk, unsigned_apk, tmp_dir, v1_only=None): """Verify that two apks are the same. One of the inputs is signed, the other is unsigned. The signature metadata is transferred from the signed to the unsigned apk, and then apksigner is used to verify that the signature from the signed APK is also valid for the unsigned one. If the APK given as unsigned actually does have a signature, it will be stripped out and ignored. Parameters ---------- signed_apk Path to a signed APK file unsigned_apk Path to an unsigned APK file expected to match it tmp_dir Path to directory for temporary files v1_only True for v1-only signatures, False for v1 and v2 signatures, or None for autodetection Returns ------- None if the verification is successful, otherwise a string describing what went wrong. """ if not verify_apk_signature(signed_apk): logging.info('...NOT verified - {0}'.format(signed_apk)) return 'verification of signed APK failed' if not os.path.isfile(signed_apk): return 'can not verify: file does not exists: {}'.format(signed_apk) if not os.path.isfile(unsigned_apk): return 'can not verify: file does not exists: {}'.format(unsigned_apk) tmp_apk = os.path.join(tmp_dir, 'sigcp_' + os.path.basename(unsigned_apk)) try: apksigcopier.do_copy(signed_apk, unsigned_apk, tmp_apk, v1_only=v1_only) except apksigcopier.APKSigCopierError as e: logging.info('...NOT verified - {0}'.format(tmp_apk)) return 'signature copying failed: {}'.format(str(e)) if not verify_apk_signature(tmp_apk): logging.info('...NOT verified - {0}'.format(tmp_apk)) result = compare_apks(signed_apk, tmp_apk, tmp_dir, os.path.dirname(unsigned_apk)) if result is not None: return result return 'verification of APK with copied signature failed' logging.info('...successfully verified') return None def verify_jar_signature(jar): """Verify the signature of a given JAR file. jarsigner is very shitty: unsigned JARs pass as "verified"! So this has to turn on -strict then check for result 4, since this does not expect the signature to be from a CA-signed certificate. Raises ------ VerificationException If the JAR's signature could not be verified. """ error = _('JAR signature failed to verify: {path}').format(path=jar) try: output = subprocess.check_output([config['jarsigner'], '-strict', '-verify', jar], stderr=subprocess.STDOUT) raise VerificationException(error + '\n' + output.decode('utf-8')) except subprocess.CalledProcessError as e: if e.returncode == 4: logging.debug(_('JAR signature verified: {path}').format(path=jar)) else: raise VerificationException(error + '\n' + e.output.decode('utf-8')) def verify_apk_signature(apk, min_sdk_version=None): """Verify the signature on an APK. Try to use apksigner whenever possible since jarsigner is very shitty: unsigned APKs pass as "verified"! Warning, this does not work on JARs with apksigner >= 0.7 (build-tools 26.0.1) Returns ------- Boolean whether the APK was verified """ if set_command_in_config('apksigner'): args = [config['apksigner'], 'verify'] if min_sdk_version: args += ['--min-sdk-version=' + min_sdk_version] if options.verbose: args += ['--verbose'] try: output = subprocess.check_output(args + [apk]) if options.verbose: logging.debug(apk + ': ' + output.decode('utf-8')) return True except subprocess.CalledProcessError as e: logging.error('\n' + apk + ': ' + e.output.decode('utf-8')) else: if not config.get('jarsigner_warning_displayed'): config['jarsigner_warning_displayed'] = True logging.warning(_("Using Java's jarsigner, not recommended for verifying APKs! Use apksigner")) try: verify_jar_signature(apk) return True except Exception as e: logging.error(e) return False def verify_old_apk_signature(apk): """Verify the signature on an archived APK, supporting deprecated algorithms. F-Droid aims to keep every single binary that it ever published. Therefore, it needs to be able to verify APK signatures that include deprecated/removed algorithms. For example, jarsigner treats an MD5 signature as unsigned. jarsigner passes unsigned APKs as "verified"! So this has to turn on -strict then check for result 4. Just to be safe, this never reuses the file, and locks down the file permissions while in use. That should prevent a bad actor from changing the settings during operation. Returns ------- Boolean whether the APK was verified """ _java_security = os.path.join(os.getcwd(), '.java.security') if os.path.exists(_java_security): os.remove(_java_security) with open(_java_security, 'w') as fp: fp.write('jdk.jar.disabledAlgorithms=MD2, RSA keySize < 1024') os.chmod(_java_security, 0o400) try: cmd = [ config['jarsigner'], '-J-Djava.security.properties=' + _java_security, '-strict', '-verify', apk ] output = subprocess.check_output(cmd, stderr=subprocess.STDOUT) except subprocess.CalledProcessError as e: if e.returncode != 4: output = e.output else: logging.debug(_('JAR signature verified: {path}').format(path=apk)) return True finally: if os.path.exists(_java_security): os.chmod(_java_security, 0o600) os.remove(_java_security) logging.error(_('Old APK signature failed to verify: {path}').format(path=apk) + '\n' + output.decode('utf-8')) return False apk_badchars = re.compile('''[/ :;'"]''') def compare_apks(apk1, apk2, tmp_dir, log_dir=None): """Compare two apks. Returns ------- None if the APK content is the same (apart from the signing key), otherwise a string describing what's different, or what went wrong when trying to do the comparison. """ if not log_dir: log_dir = tmp_dir absapk1 = os.path.abspath(apk1) absapk2 = os.path.abspath(apk2) if set_command_in_config('diffoscope'): logfilename = os.path.join(log_dir, os.path.basename(absapk1)) htmlfile = logfilename + '.diffoscope.html' textfile = logfilename + '.diffoscope.txt' if subprocess.call([config['diffoscope'], '--max-report-size', '12345678', '--max-diff-block-lines', '128', '--html', htmlfile, '--text', textfile, absapk1, absapk2]) != 0: return("Failed to run diffoscope " + apk1) apk1dir = os.path.join(tmp_dir, apk_badchars.sub('_', apk1[0:-4])) # trim .apk apk2dir = os.path.join(tmp_dir, apk_badchars.sub('_', apk2[0:-4])) # trim .apk for d in [apk1dir, apk2dir]: if os.path.exists(d): shutil.rmtree(d) os.mkdir(d) os.mkdir(os.path.join(d, 'content')) # extract APK contents for comparision with ZipFile(absapk1, 'r') as f: f.extractall(path=os.path.join(apk1dir, 'content')) with ZipFile(absapk2, 'r') as f: f.extractall(path=os.path.join(apk2dir, 'content')) if set_command_in_config('apktool'): if subprocess.call([config['apktool'], 'd', absapk1, '--output', 'apktool'], cwd=apk1dir) != 0: return("Failed to run apktool " + apk1) if subprocess.call([config['apktool'], 'd', absapk2, '--output', 'apktool'], cwd=apk2dir) != 0: return("Failed to run apktool " + apk2) p = FDroidPopen(['diff', '-r', apk1dir, apk2dir], output=False) lines = p.output.splitlines() if len(lines) != 1 or 'META-INF' not in lines[0]: if set_command_in_config('meld'): p = FDroidPopen([config['meld'], apk1dir, apk2dir], output=False) return("Unexpected diff output:\n" + p.output) # since everything verifies, delete the comparison to keep cruft down shutil.rmtree(apk1dir) shutil.rmtree(apk2dir) # If we get here, it seems like they're the same! return None def set_command_in_config(command): """Try to find specified command in the path, if it hasn't been manually set in config.yml. If found, it is added to the config dict. The return value says whether the command is available. """ if command in config: return True else: tmp = find_command(command) if tmp is not None: config[command] = tmp return True return False def find_command(command): """Find the full path of a command, or None if it can't be found in the PATH.""" def is_exe(fpath): return os.path.isfile(fpath) and os.access(fpath, os.X_OK) fpath, fname = os.path.split(command) if fpath: if is_exe(command): return command else: for path in os.environ["PATH"].split(os.pathsep): path = path.strip('"') exe_file = os.path.join(path, command) if is_exe(exe_file): return exe_file return None def genpassword(): """Generate a random password for when generating keys.""" h = hashlib.sha256() h.update(os.urandom(16)) # salt h.update(socket.getfqdn().encode('utf-8')) passwd = base64.b64encode(h.digest()).strip() return passwd.decode('utf-8') def genkeystore(localconfig): """Generate a new key with password provided in localconfig and add it to new keystore. Parameters ---------- localconfig Returns ------- hexed public key, public key fingerprint """ logging.info('Generating a new key in "' + localconfig['keystore'] + '"...') keystoredir = os.path.dirname(localconfig['keystore']) if keystoredir is None or keystoredir == '': keystoredir = os.path.join(os.getcwd(), keystoredir) if not os.path.exists(keystoredir): os.makedirs(keystoredir, mode=0o700) env_vars = {'LC_ALL': 'C.UTF-8', 'FDROID_KEY_STORE_PASS': localconfig['keystorepass'], 'FDROID_KEY_PASS': localconfig.get('keypass', "")} cmd = [config['keytool'], '-genkey', '-keystore', localconfig['keystore'], '-alias', localconfig['repo_keyalias'], '-keyalg', 'RSA', '-keysize', '4096', '-sigalg', 'SHA256withRSA', '-validity', '10000', '-storetype', 'pkcs12', '-storepass:env', 'FDROID_KEY_STORE_PASS', '-dname', localconfig['keydname'], '-J-Duser.language=en'] if localconfig['keystore'] == "NONE": cmd += localconfig['smartcardoptions'] else: cmd += '-keypass:env', 'FDROID_KEY_PASS' p = FDroidPopen(cmd, envs=env_vars) if p.returncode != 0: raise BuildException("Failed to generate key", p.output) if localconfig['keystore'] != "NONE": os.chmod(localconfig['keystore'], 0o0600) if not options.quiet: # now show the lovely key that was just generated p = FDroidPopen([config['keytool'], '-list', '-v', '-keystore', localconfig['keystore'], '-alias', localconfig['repo_keyalias'], '-storepass:env', 'FDROID_KEY_STORE_PASS', '-J-Duser.language=en'] + config['smartcardoptions'], envs=env_vars) logging.info(p.output.strip() + '\n\n') # get the public key p = FDroidPopenBytes([config['keytool'], '-exportcert', '-keystore', localconfig['keystore'], '-alias', localconfig['repo_keyalias'], '-storepass:env', 'FDROID_KEY_STORE_PASS'] + config['smartcardoptions'], envs=env_vars, output=False, stderr_to_stdout=False) if p.returncode != 0 or len(p.output) < 20: raise BuildException("Failed to get public key", p.output) pubkey = p.output fingerprint = get_cert_fingerprint(pubkey) return hexlify(pubkey), fingerprint def get_cert_fingerprint(pubkey): """Generate a certificate fingerprint the same way keytool does it (but with slightly different formatting).""" digest = hashlib.sha256(pubkey).digest() ret = [' '.join("%02X" % b for b in bytearray(digest))] return " ".join(ret) def get_certificate(signature_block_file): """Extract a DER certificate from JAR Signature's "Signature Block File". Parameters ---------- signature_block_file file bytes (as string) representing the certificate, as read directly out of the APK/ZIP Returns ------- A binary representation of the certificate's public key, or None in case of error """ content = decoder.decode(signature_block_file, asn1Spec=rfc2315.ContentInfo())[0] if content.getComponentByName('contentType') != rfc2315.signedData: return None content = decoder.decode(content.getComponentByName('content'), asn1Spec=rfc2315.SignedData())[0] try: certificates = content.getComponentByName('certificates') cert = certificates[0].getComponentByName('certificate') except PyAsn1Error: logging.error("Certificates not found.") return None return encoder.encode(cert) def load_stats_fdroid_signing_key_fingerprints(): """Load signing-key fingerprints stored in file generated by fdroid publish. Returns ------- dict containing the signing-key fingerprints. """ jar_file = os.path.join('stats', 'publishsigkeys.jar') if not os.path.isfile(jar_file): return {} cmd = [config['jarsigner'], '-strict', '-verify', jar_file] p = FDroidPopen(cmd, output=False) if p.returncode != 4: raise FDroidException("Signature validation of '{}' failed! " "Please run publish again to rebuild this file.".format(jar_file)) jar_sigkey = apk_signer_fingerprint(jar_file) repo_key_sig = config.get('repo_key_sha256') if repo_key_sig: if jar_sigkey != repo_key_sig: raise FDroidException("Signature key fingerprint of file '{}' does not match repo_key_sha256 in config.yml (found fingerprint: '{}')".format(jar_file, jar_sigkey)) else: logging.warning("repo_key_sha256 not in config.yml, setting it to the signature key fingerprint of '{}'".format(jar_file)) config['repo_key_sha256'] = jar_sigkey write_to_config(config, 'repo_key_sha256') with zipfile.ZipFile(jar_file, 'r') as f: return json.loads(str(f.read('publishsigkeys.json'), 'utf-8')) def write_to_config(thisconfig, key, value=None, config_file=None): """Write a key/value to the local config.yml or config.py. NOTE: only supports writing string variables. Parameters ---------- thisconfig config dictionary key variable name in config to be overwritten/added value optional value to be written, instead of fetched from 'thisconfig' dictionary. """ if value is None: origkey = key + '_orig' value = thisconfig[origkey] if origkey in thisconfig else thisconfig[key] if config_file: cfg = config_file elif os.path.exists('config.py') and not os.path.exists('config.yml'): cfg = 'config.py' else: cfg = 'config.yml' # load config file, create one if it doesn't exist if not os.path.exists(cfg): open(cfg, 'a').close() logging.info("Creating empty " + cfg) with open(cfg, 'r') as f: lines = f.readlines() # make sure the file ends with a carraige return if len(lines) > 0: if not lines[-1].endswith('\n'): lines[-1] += '\n' # regex for finding and replacing python string variable # definitions/initializations if cfg.endswith('.py'): pattern = re.compile(r'^[\s#]*' + key + r'\s*=\s*"[^"]*"') repl = key + ' = "' + value + '"' pattern2 = re.compile(r'^[\s#]*' + key + r"\s*=\s*'[^']*'") repl2 = key + " = '" + value + "'" else: # assume .yml as default pattern = re.compile(r'^[\s#]*' + key + r':.*') repl = yaml.dump({key: value}, default_flow_style=False) pattern2 = pattern repl2 = repl # If we replaced this line once, we make sure won't be a # second instance of this line for this key in the document. didRepl = False # edit config file with open(cfg, 'w') as f: for line in lines: if pattern.match(line) or pattern2.match(line): if not didRepl: line = pattern.sub(repl, line) line = pattern2.sub(repl2, line) f.write(line) didRepl = True else: f.write(line) if not didRepl: f.write('\n') f.write(repl) f.write('\n') def parse_xml(path): return XMLElementTree.parse(path).getroot() def string_is_integer(string): try: int(string, 0) return True except ValueError: try: int(string) return True except ValueError: return False def version_code_string_to_int(vercode): """Convert an version code string of any base into an int.""" try: return int(vercode, 0) except ValueError: return int(vercode) def get_app_display_name(app): """Get a human readable name for the app for logging and sorting. When trying to find a localized name, this first tries en-US since that his the historical language used for sorting. """ if app.get('Name'): return app['Name'] if app.get('localized'): localized = app['localized'].get('en-US') if not localized: for v in app['localized'].values(): localized = v break if localized.get('name'): return localized['name'] return app.get('AutoName') or app['id'] def local_rsync(options, fromdir, todir): """Rsync method for local to local copying of things. This is an rsync wrapper with all the settings for safe use within the various fdroidserver use cases. This uses stricter rsync checking on all files since people using offline mode are already prioritizing security above ease and speed. """ rsyncargs = ['rsync', '--recursive', '--safe-links', '--times', '--perms', '--one-file-system', '--delete', '--chmod=Da+rx,Fa-x,a+r,u+w'] if not options.no_checksum: rsyncargs.append('--checksum') if options.verbose: rsyncargs += ['--verbose'] if options.quiet: rsyncargs += ['--quiet'] logging.debug(' '.join(rsyncargs + [fromdir, todir])) if subprocess.call(rsyncargs + [fromdir, todir]) != 0: raise FDroidException() def deploy_build_log_with_rsync(appid, vercode, log_content): """Upload build log of one individual app build to an fdroid repository. Parameters ---------- appid package name for dientifying to which app this log belongs. vercode version of the app to which this build belongs. log_content Content of the log which is about to be posted. Should be either a string or bytes. (bytes will be decoded as 'utf-8') """ if not log_content: logging.warning(_('skip deploying full build logs: log content is empty')) return if not os.path.exists('repo'): os.mkdir('repo') # gzip compress log file log_gz_path = os.path.join('repo', '{appid}_{versionCode}.log.gz'.format(appid=appid, versionCode=vercode)) with gzip.open(log_gz_path, 'wb') as f: if isinstance(log_content, str): f.write(bytes(log_content, 'utf-8')) else: f.write(log_content) rsync_status_file_to_repo(log_gz_path) def rsync_status_file_to_repo(path, repo_subdir=None): """Copy a build log or status JSON to the repo using rsync.""" if not config.get('deploy_process_logs', False): logging.debug(_('skip deploying full build logs: not enabled in config')) return for webroot in config.get('serverwebroot', []): cmd = ['rsync', '--archive', '--delete-after', '--safe-links'] if options.verbose: cmd += ['--verbose'] if options.quiet: cmd += ['--quiet'] if 'identity_file' in config: cmd += ['-e', 'ssh -oBatchMode=yes -oIdentitiesOnly=yes -i ' + config['identity_file']] dest_path = os.path.join(webroot, "repo") if repo_subdir is not None: dest_path = os.path.join(dest_path, repo_subdir) if not dest_path.endswith('/'): dest_path += '/' # make sure rsync knows this is a directory cmd += [path, dest_path] retcode = subprocess.call(cmd) if retcode: logging.error(_('process log deploy {path} to {dest} failed!') .format(path=path, dest=webroot)) else: logging.debug(_('deployed process log {path} to {dest}') .format(path=path, dest=webroot)) def get_per_app_repos(): """Per-app repos are dirs named with the packageName of a single app.""" # Android packageNames are Java packages, they may contain uppercase or # lowercase letters ('A' through 'Z'), numbers, and underscores # ('_'). However, individual package name parts may only start with # letters. https://developer.android.com/guide/topics/manifest/manifest-element.html#package p = re.compile('^([a-zA-Z][a-zA-Z0-9_]*(\\.[a-zA-Z][a-zA-Z0-9_]*)*)?$') repos = [] for root, dirs, files in os.walk(os.getcwd()): for d in dirs: print('checking', root, 'for', d) if d in ('archive', 'metadata', 'repo', 'srclibs', 'tmp'): # standard parts of an fdroid repo, so never packageNames continue elif p.match(d) \ and os.path.exists(os.path.join(d, 'fdroid', 'repo', 'index.jar')): repos.append(d) break return repos def is_repo_file(filename): """Whether the file in a repo is a build product to be delivered to users.""" if isinstance(filename, str): filename = filename.encode('utf-8', errors="surrogateescape") return os.path.isfile(filename) \ and not filename.endswith(b'.asc') \ and not filename.endswith(b'.sig') \ and not filename.endswith(b'.idsig') \ and not filename.endswith(b'.log.gz') \ and os.path.basename(filename) not in [ b'index.css', b'index.jar', b'index_unsigned.jar', b'index.xml', b'index.html', b'index.png', b'index-v1.jar', b'index-v1.json', b'categories.txt', ] def get_examples_dir(): """Return the dir where the fdroidserver example files are available.""" examplesdir = None tmp = os.path.dirname(sys.argv[0]) if os.path.basename(tmp) == 'bin': egg_links = glob.glob(os.path.join(tmp, '..', 'local/lib/python3.*/site-packages/fdroidserver.egg-link')) if egg_links: # installed from local git repo examplesdir = os.path.join(open(egg_links[0]).readline().rstrip(), 'examples') else: # try .egg layout examplesdir = os.path.dirname(os.path.dirname(__file__)) + '/share/doc/fdroidserver/examples' if not os.path.exists(examplesdir): # use UNIX layout examplesdir = os.path.dirname(tmp) + '/share/doc/fdroidserver/examples' else: # we're running straight out of the git repo prefix = os.path.normpath(os.path.join(os.path.dirname(__file__), '..')) examplesdir = prefix + '/examples' return examplesdir def get_android_tools_versions(): """Get a list of the versions of all installed Android SDK/NDK components.""" global config sdk_path = config['sdk_path'] if sdk_path[-1] != '/': sdk_path += '/' components = [] for ndk_path in config.get('ndk_paths', []): version = get_ndk_version(ndk_path) components.append((os.path.basename(ndk_path), str(version))) pattern = re.compile(r'^Pkg.Revision *= *(.+)', re.MULTILINE) for root, dirs, files in os.walk(sdk_path): if 'source.properties' in files: source_properties = os.path.join(root, 'source.properties') with open(source_properties, 'r') as fp: m = pattern.search(fp.read()) if m: components.append((root[len(sdk_path):], m.group(1))) return components def get_android_tools_version_log(): """Get a list of the versions of all installed Android SDK/NDK components.""" log = '== Installed Android Tools ==\n\n' components = get_android_tools_versions() for name, version in sorted(components): log += '* ' + name + ' (' + version + ')\n' return log def calculate_math_string(expr): ops = { ast.Add: operator.add, ast.Mult: operator.mul, ast.Sub: operator.sub, ast.USub: operator.neg, } def execute_ast(node): if isinstance(node, ast.Num): # return node.n elif isinstance(node, ast.BinOp): # return ops[type(node.op)](execute_ast(node.left), execute_ast(node.right)) elif isinstance(node, ast.UnaryOp): # e.g., -1 return ops[type(node.op)](ast.literal_eval(node.operand)) else: raise SyntaxError(node) try: if '#' in expr: raise SyntaxError('no comments allowed') return execute_ast(ast.parse(expr, mode='eval').body) except SyntaxError: raise SyntaxError("could not parse expression '{expr}', " "only basic math operations are allowed (+, -, *)" .format(expr=expr)) def force_exit(exitvalue=0): """Force exit when thread operations could block the exit. The build command has to use some threading stuff to handle the timeout and locks. This seems to prevent the command from exiting, unless this hack is used. """ sys.stdout.flush() sys.stderr.flush() os._exit(exitvalue) YAML_LINT_CONFIG = {'extends': 'default', 'rules': {'document-start': 'disable', 'line-length': 'disable', 'truthy': 'disable'}} def run_yamllint(path, indent=0): path = Path(path) try: import yamllint.config import yamllint.linter except ImportError: return '' result = [] with path.open('r', encoding='utf-8') as f: problems = yamllint.linter.run(f, yamllint.config.YamlLintConfig(json.dumps(YAML_LINT_CONFIG))) for problem in problems: result.append(' ' * indent + str(path) + ':' + str(problem.line) + ': ' + problem.message) return '\n'.join(result) def sha256sum(filename): """Calculate the sha256 of the given file.""" sha = hashlib.sha256() with open(filename, 'rb') as f: while True: t = f.read(16384) if len(t) == 0: break sha.update(t) return sha.hexdigest() def sha256base64(filename): """Calculate the sha256 of the given file as URL-safe base64.""" hasher = hashlib.sha256() with open(filename, 'rb') as f: while True: t = f.read(16384) if len(t) == 0: break hasher.update(t) return urlsafe_b64encode(hasher.digest()).decode() def get_ndk_version(ndk_path): """Get the version info from the metadata in the NDK package. Since r11, the info is nice and easy to find in sources.properties. Before, there was a kludgey format in RELEASE.txt. This is only needed for r10e. """ source_properties = os.path.join(ndk_path, 'source.properties') release_txt = os.path.join(ndk_path, 'RELEASE.TXT') if os.path.exists(source_properties): with open(source_properties) as fp: m = re.search(r'^Pkg.Revision *= *(.+)', fp.read(), flags=re.MULTILINE) if m: return m.group(1) elif os.path.exists(release_txt): with open(release_txt) as fp: return fp.read().split('-')[0] def auto_install_ndk(build): """Auto-install the NDK in the build, this assumes its in a buildserver guest VM. Download, verify, and install the NDK version as specified via the "ndk:" field in the build entry. As it uncompresses the zipball, this forces the permissions to work for all users, since this might uncompress as root and then be used from a different user. This needs to be able to install multiple versions of the NDK, since this is also used in CI builds, where multiple `fdroid build --onserver` calls can run in a single session. The production buildserver is reset between every build. The default ANDROID_SDK_ROOT base dir of /opt/android-sdk is hard-coded in buildserver/Vagrantfile. The $ANDROID_HOME/ndk subdir is where Android Studio will install the NDK into versioned subdirs. https://developer.android.com/studio/projects/configure-agp-ndk#agp_version_41 Also, r10e and older cannot be handled via this mechanism because they are packaged differently. """ global config if build.get('disable'): return ndk = build.get('ndk') if not ndk: return if isinstance(ndk, str): _install_ndk(ndk) elif isinstance(ndk, list): for n in ndk: _install_ndk(n) else: BuildException(_('Invalid ndk: entry in build: "{ndk}"') .format(ndk=str(ndk))) def _install_ndk(ndk): """Install specified NDK if it is not already installed. Parameters ---------- ndk The NDK version to install, either in "release" form (r21e) or "revision" form (21.4.7075529). """ if re.match(r'[1-9][0-9.]+[0-9]', ndk): for ndkdict in NDKS: if ndk == ndkdict.get('revision'): ndk = ndkdict['release'] break ndk_path = config.get(ndk) if ndk_path and os.path.isdir(ndk_path): return for ndkdict in NDKS: if ndk == ndkdict['release']: url = ndkdict['url'] sha256 = ndkdict['sha256'] break else: raise FDroidException("NDK %s not found" % ndk) ndk_base = os.path.join(config['sdk_path'], 'ndk') logging.info(_('Downloading %s') % url) zipball = os.path.join( tempfile.mkdtemp(prefix='android-ndk-'), os.path.basename(url) ) net.download_file(url, zipball) if sha256 != sha256sum(zipball): raise FDroidException('SHA-256 %s does not match expected for %s' % (sha256, url)) logging.info(_('Unzipping to %s') % ndk_base) with zipfile.ZipFile(zipball) as zipfp: for info in zipfp.infolist(): permbits = info.external_attr >> 16 if stat.S_ISLNK(permbits): link = os.path.join(ndk_base, info.filename) link_target = zipfp.read(info).decode() link_dir = os.path.dirname(link) os.makedirs(link_dir, 0o755, True) # ensure intermediate directories are created os.symlink(link_target, link) real_target = os.path.realpath(link) if not real_target.startswith(ndk_base): os.remove(link) logging.error(_('Unexpected symlink target: {link} -> {target}') .format(link=link, target=real_target)) elif stat.S_ISDIR(permbits) or stat.S_IXUSR & permbits: zipfp.extract(info.filename, path=ndk_base) os.chmod(os.path.join(ndk_base, info.filename), 0o755) # nosec bandit B103 else: zipfp.extract(info.filename, path=ndk_base) os.chmod(os.path.join(ndk_base, info.filename), 0o644) # nosec bandit B103 os.remove(zipball) for extracted in glob.glob(os.path.join(ndk_base, '*')): version = get_ndk_version(extracted) if os.path.basename(extracted) != version: ndk_dir = os.path.join(ndk_base, version) os.rename(extracted, ndk_dir) if 'ndk_paths' not in config: config['ndk_paths'] = dict() config['ndk_paths'][ndk] = ndk_dir logging.info(_('Set NDK {release} ({version}) up') .format(release=ndk, version=version)) """Derived from https://gitlab.com/fdroid/android-sdk-transparency-log/-/blob/master/checksums.json""" NDKS = [ { "release": "r10e", "sha256": "ee5f405f3b57c4f5c3b3b8b5d495ae12b660e03d2112e4ed5c728d349f1e520c", "url": "https://dl.google.com/android/repository/android-ndk-r10e-linux-x86_64.zip" }, { "release": "r11", "revision": "11.0.2655954", "sha256": "59ab44f7ee6201df4381844736fdc456134c7f7660151003944a3017a0dcce97", "url": "https://dl.google.com/android/repository/android-ndk-r11-linux-x86_64.zip" }, { "release": "r11b", "revision": "11.1.2683735", "sha256": "51d429bfda8bbe038683ed7ae7acc03b39604b84711901b555fe18c698867e53", "url": "https://dl.google.com/android/repository/android-ndk-r11b-linux-x86_64.zip" }, { "release": "r11c", "revision": "11.2.2725575", "sha256": "ba85dbe4d370e4de567222f73a3e034d85fc3011b3cbd90697f3e8dcace3ad94", "url": "https://dl.google.com/android/repository/android-ndk-r11c-linux-x86_64.zip" }, { "release": "r12", "revision": "12.0.2931149", "sha256": "7876e3b99f3596a3215ecf4e9f152d24b82dfdf2bbe7d3a38c423ae6a3edee79", "url": "https://dl.google.com/android/repository/android-ndk-r12-linux-x86_64.zip" }, { "release": "r12b", "revision": "12.1.2977051", "sha256": "eafae2d614e5475a3bcfd7c5f201db5b963cc1290ee3e8ae791ff0c66757781e", "url": "https://dl.google.com/android/repository/android-ndk-r12b-linux-x86_64.zip" }, { "release": "r13", "revision": "13.0.3315539", "sha256": "0a1dbd216386399e2979c17a48f65b962bf7ddc0c2311ef35d902b90c298c400", "url": "https://dl.google.com/android/repository/android-ndk-r13-linux-x86_64.zip" }, { "release": "r13b", "revision": "13.1.3345770", "sha256": "3524d7f8fca6dc0d8e7073a7ab7f76888780a22841a6641927123146c3ffd29c", "url": "https://dl.google.com/android/repository/android-ndk-r13b-linux-x86_64.zip" }, { "release": "r14", "revision": "14.0.3770861", "sha256": "3e622c2c9943964ea44cd56317d0769ed4c811bb4b40dc45b1f6965e4db9aa44", "url": "https://dl.google.com/android/repository/android-ndk-r14-linux-x86_64.zip" }, { "release": "r14b", "revision": "14.1.3816874", "sha256": "0ecc2017802924cf81fffc0f51d342e3e69de6343da892ac9fa1cd79bc106024", "url": "https://dl.google.com/android/repository/android-ndk-r14b-linux-x86_64.zip" }, { "release": "r15", "revision": "15.0.4075724", "sha256": "078eb7d28c3fcf45841f5baf6e6582e7fd5b73d8e8c4e0101df490f51abd37b6", "url": "https://dl.google.com/android/repository/android-ndk-r15-linux-x86_64.zip" }, { "release": "r15b", "revision": "15.1.4119039", "sha256": "d1ce63f68cd806b5a992d4e5aa60defde131c243bf523cdfc5b67990ef0ee0d3", "url": "https://dl.google.com/android/repository/android-ndk-r15b-linux-x86_64.zip" }, { "release": "r15c", "revision": "15.2.4203891", "sha256": "f01788946733bf6294a36727b99366a18369904eb068a599dde8cca2c1d2ba3c", "url": "https://dl.google.com/android/repository/android-ndk-r15c-linux-x86_64.zip" }, { "release": "r16", "revision": "16.0.4442984", "sha256": "a8550b81771c67cc6ab7b479a6918d29aa78de3482901762b4f9e0132cd9672e", "url": "https://dl.google.com/android/repository/android-ndk-r16-linux-x86_64.zip" }, { "release": "r16b", "revision": "16.1.4479499", "sha256": "bcdea4f5353773b2ffa85b5a9a2ae35544ce88ec5b507301d8cf6a76b765d901", "url": "https://dl.google.com/android/repository/android-ndk-r16b-linux-x86_64.zip" }, { "release": "r17", "revision": "17.0.4754217", "sha256": "ba3d813b47de75bc32a2f3de087f72599c6cb36fdc9686b96f517f5492ff43ca", "url": "https://dl.google.com/android/repository/android-ndk-r17-linux-x86_64.zip" }, { "release": "r17b", "revision": "17.1.4828580", "sha256": "5dfbbdc2d3ba859fed90d0e978af87c71a91a5be1f6e1c40ba697503d48ccecd", "url": "https://dl.google.com/android/repository/android-ndk-r17b-linux-x86_64.zip" }, { "release": "r17c", "revision": "17.2.4988734", "sha256": "3f541adbd0330a9205ba12697f6d04ec90752c53d6b622101a2a8a856e816589", "url": "https://dl.google.com/android/repository/android-ndk-r17c-linux-x86_64.zip" }, { "release": "r18b", "revision": "18.1.5063045", "sha256": "4f61cbe4bbf6406aa5ef2ae871def78010eed6271af72de83f8bd0b07a9fd3fd", "url": "https://dl.google.com/android/repository/android-ndk-r18b-linux-x86_64.zip" }, { "release": "r19", "revision": "19.0.5232133", "sha256": "c0a2425206191252197b97ea5fcc7eab9f693a576e69ef4773a9ed1690feed53", "url": "https://dl.google.com/android/repository/android-ndk-r19-linux-x86_64.zip" }, { "release": "r19b", "revision": "19.1.5304403", "sha256": "0fbb1645d0f1de4dde90a4ff79ca5ec4899c835e729d692f433fda501623257a", "url": "https://dl.google.com/android/repository/android-ndk-r19b-linux-x86_64.zip" }, { "release": "r19c", "revision": "19.2.5345600", "sha256": "4c62514ec9c2309315fd84da6d52465651cdb68605058f231f1e480fcf2692e1", "url": "https://dl.google.com/android/repository/android-ndk-r19c-linux-x86_64.zip" }, { "release": "r20", "revision": "20.0.5594570", "sha256": "57435158f109162f41f2f43d5563d2164e4d5d0364783a9a6fab3ef12cb06ce0", "url": "https://dl.google.com/android/repository/android-ndk-r20-linux-x86_64.zip" }, { "release": "r20b", "revision": "20.1.5948944", "sha256": "8381c440fe61fcbb01e209211ac01b519cd6adf51ab1c2281d5daad6ca4c8c8c", "url": "https://dl.google.com/android/repository/android-ndk-r20b-linux-x86_64.zip" }, { "release": "r21", "revision": "21.0.6113669", "sha256": "b65ea2d5c5b68fb603626adcbcea6e4d12c68eb8a73e373bbb9d23c252fc647b", "url": "https://dl.google.com/android/repository/android-ndk-r21-linux-x86_64.zip" }, { "release": "r21b", "revision": "21.1.6352462", "sha256": "0c7af5dd23c5d2564915194e71b1053578438ac992958904703161c7672cbed7", "url": "https://dl.google.com/android/repository/android-ndk-r21b-linux-x86_64.zip" }, { "release": "r21c", "revision": "21.2.6472646", "sha256": "214ebfcfa5108ba78f5b2cc8db4d575068f9c973ac7f27d2fa1987dfdb76c9e7", "url": "https://dl.google.com/android/repository/android-ndk-r21c-linux-x86_64.zip" }, { "release": "r21d", "revision": "21.3.6528147", "sha256": "dd6dc090b6e2580206c64bcee499bc16509a5d017c6952dcd2bed9072af67cbd", "url": "https://dl.google.com/android/repository/android-ndk-r21d-linux-x86_64.zip" }, { "release": "r21e", "revision": "21.4.7075529", "sha256": "ad7ce5467e18d40050dc51b8e7affc3e635c85bd8c59be62de32352328ed467e", "url": "https://dl.google.com/android/repository/android-ndk-r21e-linux-x86_64.zip" }, { "release": "r22", "revision": "22.0.7026061", "sha256": "d37fc69cd81e5660234a686e20adef39bc0244086e4d66525a40af771c020718", "url": "https://dl.google.com/android/repository/android-ndk-r22-linux-x86_64.zip" }, { "release": "r22b", "revision": "22.1.7171670", "sha256": "ac3a0421e76f71dd330d0cd55f9d99b9ac864c4c034fc67e0d671d022d4e806b", "url": "https://dl.google.com/android/repository/android-ndk-r22b-linux-x86_64.zip" }, { "release": "r23", "revision": "23.0.7599858", "sha256": "e3eacf80016b91d4cd2c8ca9f34eebd32df912bb799c859cc5450b6b19277b4f", "url": "https://dl.google.com/android/repository/android-ndk-r23-linux.zip" }, { "release": "r23b", "revision": "23.1.7779620", "sha256": "c6e97f9c8cfe5b7be0a9e6c15af8e7a179475b7ded23e2d1c1fa0945d6fb4382", "url": "https://dl.google.com/android/repository/android-ndk-r23b-linux.zip" } ] def handle_retree_error_on_windows(function, path, excinfo): """Python can't remove a readonly file on Windows so chmod first.""" if function in (os.unlink, os.rmdir, os.remove) and excinfo[0] == PermissionError: os.chmod(path, stat.S_IWRITE) function(path) fdroidserver-2.1/fdroidserver/deploy.py0000644000175000017500000010421314205260731020323 0ustar hanshans00000000000000#!/usr/bin/env python3 # # deploy.py - part of the FDroid server tools # Copyright (C) 2010-15, Ciaran Gultnieks, ciaran@ciarang.com # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . import sys import glob import hashlib import json import os import re import subprocess import time import urllib from argparse import ArgumentParser import logging import shutil from . import _ from . import common from . import index from . import update from .exception import FDroidException config = None options = None start_timestamp = time.gmtime() BINARY_TRANSPARENCY_DIR = 'binary_transparency' AUTO_S3CFG = '.fdroid-deploy-s3cfg' USER_S3CFG = 's3cfg' REMOTE_HOSTNAME_REGEX = re.compile(r'\W*\w+\W+(\w+).*') def update_awsbucket(repo_section): """Upload the contents of the directory `repo_section` (including subdirectories) to the AWS S3 "bucket". The contents of that subdir of the bucket will first be deleted. Requires AWS credentials set in config.yml: awsaccesskeyid, awssecretkey """ logging.debug('Syncing "' + repo_section + '" to Amazon S3 bucket "' + config['awsbucket'] + '"') if common.set_command_in_config('s3cmd'): update_awsbucket_s3cmd(repo_section) else: update_awsbucket_libcloud(repo_section) def update_awsbucket_s3cmd(repo_section): """Upload using the CLI tool s3cmd, which provides rsync-like sync. The upload is done in multiple passes to reduce the chance of interfering with an existing client-server interaction. In the first pass, only new files are uploaded. In the second pass, changed files are uploaded, overwriting what is on the server. On the third/last pass, the indexes are uploaded, and any removed files are deleted from the server. The last pass is the only pass to use a full MD5 checksum of all files to detect changes. """ logging.debug(_('Using s3cmd to sync with: {url}') .format(url=config['awsbucket'])) if os.path.exists(USER_S3CFG): logging.info(_('Using "{path}" for configuring s3cmd.').format(path=USER_S3CFG)) configfilename = USER_S3CFG else: fd = os.open(AUTO_S3CFG, os.O_CREAT | os.O_TRUNC | os.O_WRONLY, 0o600) logging.debug(_('Creating "{path}" for configuring s3cmd.').format(path=AUTO_S3CFG)) os.write(fd, '[default]\n'.encode('utf-8')) os.write(fd, ('access_key = ' + config['awsaccesskeyid'] + '\n').encode('utf-8')) os.write(fd, ('secret_key = ' + config['awssecretkey'] + '\n').encode('utf-8')) os.close(fd) configfilename = AUTO_S3CFG s3bucketurl = 's3://' + config['awsbucket'] s3cmd = [config['s3cmd'], '--config=' + configfilename] if subprocess.call(s3cmd + ['info', s3bucketurl]) != 0: logging.warning(_('Creating new S3 bucket: {url}') .format(url=s3bucketurl)) if subprocess.call(s3cmd + ['mb', s3bucketurl]) != 0: logging.error(_('Failed to create S3 bucket: {url}') .format(url=s3bucketurl)) raise FDroidException() s3cmd_sync = s3cmd + ['sync', '--acl-public'] if options.verbose: s3cmd_sync += ['--verbose'] if options.quiet: s3cmd_sync += ['--quiet'] indexxml = os.path.join(repo_section, 'index.xml') indexjar = os.path.join(repo_section, 'index.jar') indexv1jar = os.path.join(repo_section, 'index-v1.jar') indexv1json = os.path.join(repo_section, 'index-v1.json') indexv1jsonasc = os.path.join(repo_section, 'index-v1.json.asc') s3url = s3bucketurl + '/fdroid/' logging.debug('s3cmd sync new files in ' + repo_section + ' to ' + s3url) logging.debug(_('Running first pass with MD5 checking disabled')) if subprocess.call(s3cmd_sync + ['--no-check-md5', '--skip-existing', '--exclude', indexxml, '--exclude', indexjar, '--exclude', indexv1jar, '--exclude', indexv1json, '--exclude', indexv1jsonasc, repo_section, s3url]) != 0: raise FDroidException() logging.debug('s3cmd sync all files in ' + repo_section + ' to ' + s3url) if subprocess.call(s3cmd_sync + ['--no-check-md5', '--exclude', indexxml, '--exclude', indexjar, '--exclude', indexv1jar, '--exclude', indexv1json, '--exclude', indexv1jsonasc, repo_section, s3url]) != 0: raise FDroidException() logging.debug(_('s3cmd sync indexes {path} to {url} and delete') .format(path=repo_section, url=s3url)) s3cmd_sync.append('--delete-removed') s3cmd_sync.append('--delete-after') if options.no_checksum: s3cmd_sync.append('--no-check-md5') else: s3cmd_sync.append('--check-md5') if subprocess.call(s3cmd_sync + [repo_section, s3url]) != 0: raise FDroidException() def update_awsbucket_libcloud(repo_section): """No summary. Upload the contents of the directory `repo_section` (including subdirectories) to the AWS S3 "bucket". The contents of that subdir of the bucket will first be deleted. Requires AWS credentials set in config.yml: awsaccesskeyid, awssecretkey """ logging.debug(_('using Apache libcloud to sync with {url}') .format(url=config['awsbucket'])) import libcloud.security libcloud.security.VERIFY_SSL_CERT = True from libcloud.storage.types import Provider, ContainerDoesNotExistError from libcloud.storage.providers import get_driver if not config.get('awsaccesskeyid') or not config.get('awssecretkey'): raise FDroidException( _('To use awsbucket, awssecretkey and awsaccesskeyid must also be set in config.yml!')) awsbucket = config['awsbucket'] if os.path.exists(USER_S3CFG): raise FDroidException(_('"{path}" exists but s3cmd is not installed!') .format(path=USER_S3CFG)) cls = get_driver(Provider.S3) driver = cls(config['awsaccesskeyid'], config['awssecretkey']) try: container = driver.get_container(container_name=awsbucket) except ContainerDoesNotExistError: container = driver.create_container(container_name=awsbucket) logging.info(_('Created new container "{name}"') .format(name=container.name)) upload_dir = 'fdroid/' + repo_section objs = dict() for obj in container.list_objects(): if obj.name.startswith(upload_dir + '/'): objs[obj.name] = obj for root, dirs, files in os.walk(os.path.join(os.getcwd(), repo_section)): for name in files: upload = False file_to_upload = os.path.join(root, name) object_name = 'fdroid/' + os.path.relpath(file_to_upload, os.getcwd()) if object_name not in objs: upload = True else: obj = objs.pop(object_name) if obj.size != os.path.getsize(file_to_upload): upload = True else: # if the sizes match, then compare by MD5 md5 = hashlib.md5() # nosec AWS uses MD5 with open(file_to_upload, 'rb') as f: while True: data = f.read(8192) if not data: break md5.update(data) if obj.hash != md5.hexdigest(): s3url = 's3://' + awsbucket + '/' + obj.name logging.info(' deleting ' + s3url) if not driver.delete_object(obj): logging.warning('Could not delete ' + s3url) upload = True if upload: logging.debug(' uploading "' + file_to_upload + '"...') extra = {'acl': 'public-read'} if file_to_upload.endswith('.sig'): extra['content_type'] = 'application/pgp-signature' elif file_to_upload.endswith('.asc'): extra['content_type'] = 'application/pgp-signature' logging.info(' uploading ' + os.path.relpath(file_to_upload) + ' to s3://' + awsbucket + '/' + object_name) with open(file_to_upload, 'rb') as iterator: obj = driver.upload_object_via_stream(iterator=iterator, container=container, object_name=object_name, extra=extra) # delete the remnants in the bucket, they do not exist locally while objs: object_name, obj = objs.popitem() s3url = 's3://' + awsbucket + '/' + object_name if object_name.startswith(upload_dir): logging.warning(' deleting ' + s3url) driver.delete_object(obj) else: logging.info(' skipping ' + s3url) def update_serverwebroot(serverwebroot, repo_section): # use a checksum comparison for accurate comparisons on different # filesystems, for example, FAT has a low resolution timestamp rsyncargs = ['rsync', '--archive', '--delete-after', '--safe-links'] if not options.no_checksum: rsyncargs.append('--checksum') if options.verbose: rsyncargs += ['--verbose'] if options.quiet: rsyncargs += ['--quiet'] if options.identity_file is not None: rsyncargs += ['-e', 'ssh -oBatchMode=yes -oIdentitiesOnly=yes -i ' + options.identity_file] elif 'identity_file' in config: rsyncargs += ['-e', 'ssh -oBatchMode=yes -oIdentitiesOnly=yes -i ' + config['identity_file']] indexxml = os.path.join(repo_section, 'index.xml') indexjar = os.path.join(repo_section, 'index.jar') indexv1jar = os.path.join(repo_section, 'index-v1.jar') indexv1json = os.path.join(repo_section, 'index-v1.json') indexv1jsonasc = os.path.join(repo_section, 'index-v1.json.asc') # Upload the first time without the index files and delay the deletion as # much as possible, that keeps the repo functional while this update is # running. Then once it is complete, rerun the command again to upload # the index files. Always using the same target with rsync allows for # very strict settings on the receiving server, you can literally specify # the one rsync command that is allowed to run in ~/.ssh/authorized_keys. # (serverwebroot is guaranteed to have a trailing slash in common.py) logging.info('rsyncing ' + repo_section + ' to ' + serverwebroot) if subprocess.call(rsyncargs + ['--exclude', indexxml, '--exclude', indexjar, '--exclude', indexv1jar, '--exclude', indexv1json, '--exclude', indexv1jsonasc, repo_section, serverwebroot]) != 0: raise FDroidException() if subprocess.call(rsyncargs + [repo_section, serverwebroot]) != 0: raise FDroidException() # upload "current version" symlinks if requested if config['make_current_version_link'] and repo_section == 'repo': links_to_upload = [] for f in glob.glob('*.apk') \ + glob.glob('*.apk.asc') + glob.glob('*.apk.sig'): if os.path.islink(f): links_to_upload.append(f) if len(links_to_upload) > 0: if subprocess.call(rsyncargs + links_to_upload + [serverwebroot]) != 0: raise FDroidException() def sync_from_localcopy(repo_section, local_copy_dir): """Sync the repo from "local copy dir" filesystem to this box. In setups that use offline signing, this is the last step that syncs the repo from the "local copy dir" e.g. a thumb drive to the repo on the local filesystem. That local repo is then used to push to all the servers that are configured. """ logging.info('Syncing from local_copy_dir to this repo.') # trailing slashes have a meaning in rsync which is not needed here, so # make sure both paths have exactly one trailing slash common.local_rsync(options, os.path.join(local_copy_dir, repo_section).rstrip('/') + '/', repo_section.rstrip('/') + '/') offline_copy = os.path.join(local_copy_dir, BINARY_TRANSPARENCY_DIR) if os.path.exists(os.path.join(offline_copy, '.git')): online_copy = os.path.join(os.getcwd(), BINARY_TRANSPARENCY_DIR) push_binary_transparency(offline_copy, online_copy) def update_localcopy(repo_section, local_copy_dir): """Copy data from offline to the "local copy dir" filesystem. This updates the copy of this repo used to shuttle data from an offline signing machine to the online machine, e.g. on a thumb drive. """ # local_copy_dir is guaranteed to have a trailing slash in main() below common.local_rsync(options, repo_section, local_copy_dir) offline_copy = os.path.join(os.getcwd(), BINARY_TRANSPARENCY_DIR) if os.path.isdir(os.path.join(offline_copy, '.git')): online_copy = os.path.join(local_copy_dir, BINARY_TRANSPARENCY_DIR) push_binary_transparency(offline_copy, online_copy) def _get_size(start_path='.'): """Get size of all files in a dir https://stackoverflow.com/a/1392549.""" total_size = 0 for root, dirs, files in os.walk(start_path): for f in files: fp = os.path.join(root, f) total_size += os.path.getsize(fp) return total_size def update_servergitmirrors(servergitmirrors, repo_section): """Update repo mirrors stored in git repos. This is a hack to use public git repos as F-Droid repos. It recreates the git repo from scratch each time, so that there is no history. That keeps the size of the git repo small. Services like GitHub or GitLab have a size limit of something like 1 gig. This git repo is only a git repo for the purpose of being hosted. For history, there is the archive section, and there is the binary transparency log. """ import git from clint.textui import progress if config.get('local_copy_dir') \ and not config.get('sync_from_local_copy_dir'): logging.debug(_('Offline machine, skipping git mirror generation until `fdroid deploy`')) return # right now we support only 'repo' git-mirroring if repo_section == 'repo': git_mirror_path = 'git-mirror' dotgit = os.path.join(git_mirror_path, '.git') git_repodir = os.path.join(git_mirror_path, 'fdroid', repo_section) if not os.path.isdir(git_repodir): os.makedirs(git_repodir) # github/gitlab use bare git repos, so only count the .git folder # test: generate giant APKs by including AndroidManifest.xml and and large # file from /dev/urandom, then sign it. Then add those to the git repo. dotgit_size = _get_size(dotgit) dotgit_over_limit = dotgit_size > config['git_mirror_size_limit'] if os.path.isdir(dotgit) and dotgit_over_limit: logging.warning(_('Deleting git-mirror history, repo is too big ({size} max {limit})') .format(size=dotgit_size, limit=config['git_mirror_size_limit'])) shutil.rmtree(dotgit) if options.no_keep_git_mirror_archive and dotgit_over_limit: logging.warning(_('Deleting archive, repo is too big ({size} max {limit})') .format(size=dotgit_size, limit=config['git_mirror_size_limit'])) archive_path = os.path.join(git_mirror_path, 'fdroid', 'archive') shutil.rmtree(archive_path, ignore_errors=True) # rsync is very particular about trailing slashes common.local_rsync(options, repo_section.rstrip('/') + '/', git_repodir.rstrip('/') + '/') # use custom SSH command if identity_file specified ssh_cmd = 'ssh -oBatchMode=yes' if options.identity_file is not None: ssh_cmd += ' -oIdentitiesOnly=yes -i "%s"' % options.identity_file elif 'identity_file' in config: ssh_cmd += ' -oIdentitiesOnly=yes -i "%s"' % config['identity_file'] repo = git.Repo.init(git_mirror_path) enabled_remotes = [] for remote_url in servergitmirrors: name = REMOTE_HOSTNAME_REGEX.sub(r'\1', remote_url) enabled_remotes.append(name) r = git.remote.Remote(repo, name) if r in repo.remotes: r = repo.remote(name) if 'set_url' in dir(r): # force remote URL if using GitPython 2.x r.set_url(remote_url) else: repo.create_remote(name, remote_url) logging.info('Mirroring to: ' + remote_url) # sadly index.add don't allow the --all parameter logging.debug('Adding all files to git mirror') repo.git.add(all=True) logging.debug('Committing all files into git mirror') repo.index.commit("fdroidserver git-mirror") if options.verbose: bar = progress.Bar() class MyProgressPrinter(git.RemoteProgress): def update(self, op_code, current, maximum=None, message=None): if isinstance(maximum, float): bar.show(current, maximum) progress = MyProgressPrinter() else: progress = None # push for every remote. This will overwrite the git history for remote in repo.remotes: if remote.name not in enabled_remotes: repo.delete_remote(remote) continue if remote.name == 'gitlab': logging.debug('Writing .gitlab-ci.yml to deploy to GitLab Pages') with open(os.path.join(git_mirror_path, ".gitlab-ci.yml"), "wt") as out_file: out_file.write("""pages: script: - mkdir .public - cp -r * .public/ - mv .public public artifacts: paths: - public """) repo.git.add(all=True) repo.index.commit("fdroidserver git-mirror: Deploy to GitLab Pages") logging.debug(_('Pushing to {url}').format(url=remote.url)) with repo.git.custom_environment(GIT_SSH_COMMAND=ssh_cmd): pushinfos = remote.push('master', force=True, set_upstream=True, progress=progress) for pushinfo in pushinfos: if pushinfo.flags & (git.remote.PushInfo.ERROR | git.remote.PushInfo.REJECTED | git.remote.PushInfo.REMOTE_FAILURE | git.remote.PushInfo.REMOTE_REJECTED): # Show potentially useful messages from git remote for line in progress.other_lines: if line.startswith('remote:'): logging.debug(line) raise FDroidException(remote.url + ' push failed: ' + str(pushinfo.flags) + ' ' + pushinfo.summary) else: logging.debug(remote.url + ': ' + pushinfo.summary) if progress: bar.done() def upload_to_android_observatory(repo_section): import requests requests # stop unused import warning if options.verbose: logging.getLogger("requests").setLevel(logging.INFO) logging.getLogger("urllib3").setLevel(logging.INFO) else: logging.getLogger("requests").setLevel(logging.WARNING) logging.getLogger("urllib3").setLevel(logging.WARNING) if repo_section == 'repo': for f in sorted(glob.glob(os.path.join(repo_section, '*.apk'))): upload_apk_to_android_observatory(f) def upload_apk_to_android_observatory(path): # depend on requests and lxml only if users enable AO import requests from . import net from lxml.html import fromstring apkfilename = os.path.basename(path) r = requests.post('https://androidobservatory.org/', data={'q': update.sha256sum(path), 'searchby': 'hash'}, headers=net.HEADERS) if r.status_code == 200: # from now on XPath will be used to retrieve the message in the HTML # androidobservatory doesn't have a nice API to talk with # so we must scrape the page content tree = fromstring(r.text) href = None for element in tree.xpath("//html/body/div/div/table/tbody/tr/td/a"): a = element.attrib.get('href') if a: m = re.match(r'^/app/[0-9A-F]{40}$', a) if m: href = m.group() page = 'https://androidobservatory.org' if href: message = (_('Found {apkfilename} at {url}') .format(apkfilename=apkfilename, url=(page + href))) logging.debug(message) return # upload the file with a post request logging.info(_('Uploading {apkfilename} to androidobservatory.org') .format(apkfilename=apkfilename)) r = requests.post('https://androidobservatory.org/upload', files={'apk': (apkfilename, open(path, 'rb'))}, headers=net.HEADERS, allow_redirects=False) def upload_to_virustotal(repo_section, virustotal_apikey): import requests requests # stop unused import warning if repo_section == 'repo': if not os.path.exists('virustotal'): os.mkdir('virustotal') if os.path.exists(os.path.join(repo_section, 'index-v1.json')): with open(os.path.join(repo_section, 'index-v1.json')) as fp: data = json.load(fp) else: data, _ignored, _ignored = index.get_index_from_jar(os.path.join(repo_section, 'index-v1.jar')) for packageName, packages in data['packages'].items(): for package in packages: upload_apk_to_virustotal(virustotal_apikey, **package) def upload_apk_to_virustotal(virustotal_apikey, packageName, apkName, hash, versionCode, **kwargs): import requests logging.getLogger("urllib3").setLevel(logging.WARNING) logging.getLogger("requests").setLevel(logging.WARNING) outputfilename = os.path.join('virustotal', packageName + '_' + str(versionCode) + '_' + hash + '.json') if os.path.exists(outputfilename): logging.debug(apkName + ' results are in ' + outputfilename) return outputfilename repofilename = os.path.join('repo', apkName) logging.info('Checking if ' + repofilename + ' is on virustotal') headers = { "User-Agent": "F-Droid" } if 'headers' in kwargs: for k, v in kwargs['headers'].items(): headers[k] = v data = { 'apikey': virustotal_apikey, 'resource': hash, } needs_file_upload = False while True: r = requests.get('https://www.virustotal.com/vtapi/v2/file/report?' + urllib.parse.urlencode(data), headers=headers) if r.status_code == 200: response = r.json() if response['response_code'] == 0: needs_file_upload = True else: response['filename'] = apkName response['packageName'] = packageName response['versionCode'] = versionCode if kwargs.get('versionName'): response['versionName'] = kwargs.get('versionName') with open(outputfilename, 'w') as fp: json.dump(response, fp, indent=2, sort_keys=True) if response.get('positives', 0) > 0: logging.warning(repofilename + ' has been flagged by virustotal ' + str(response['positives']) + ' times:' + '\n\t' + response['permalink']) break elif r.status_code == 204: logging.warning(_('virustotal.com is rate limiting, waiting to retry...')) time.sleep(30) # wait for public API rate limiting upload_url = None if needs_file_upload: manual_url = 'https://www.virustotal.com/' size = os.path.getsize(repofilename) if size > 200000000: # VirusTotal API 200MB hard limit logging.error(_('{path} more than 200MB, manually upload: {url}') .format(path=repofilename, url=manual_url)) elif size > 32000000: # VirusTotal API requires fetching a URL to upload bigger files r = requests.get('https://www.virustotal.com/vtapi/v2/file/scan/upload_url?' + urllib.parse.urlencode(data), headers=headers) if r.status_code == 200: upload_url = r.json().get('upload_url') elif r.status_code == 403: logging.error(_('VirusTotal API key cannot upload files larger than 32MB, ' + 'use {url} to upload {path}.') .format(path=repofilename, url=manual_url)) else: r.raise_for_status() else: upload_url = 'https://www.virustotal.com/vtapi/v2/file/scan' if upload_url: logging.info(_('Uploading {apkfilename} to virustotal') .format(apkfilename=repofilename)) files = { 'file': (apkName, open(repofilename, 'rb')) } r = requests.post(upload_url, data=data, headers=headers, files=files) logging.debug(_('If this upload fails, try manually uploading to {url}') .format(url=manual_url)) r.raise_for_status() response = r.json() logging.info(response['verbose_msg'] + " " + response['permalink']) return outputfilename def push_binary_transparency(git_repo_path, git_remote): """Push the binary transparency git repo to the specifed remote. If the remote is a local directory, make sure it exists, and is a git repo. This is used to move this git repo from an offline machine onto a flash drive, then onto the online machine. Also, this pulls because pushing to a non-bare git repo is error prone. This is also used in offline signing setups, where it then also creates a "local copy dir" git repo that serves to shuttle the git data from the offline machine to the online machine. In that case, git_remote is a dir on the local file system, e.g. a thumb drive. """ import git logging.info(_('Pushing binary transparency log to {url}') .format(url=git_remote)) if os.path.isdir(os.path.dirname(git_remote)): # from offline machine to thumbdrive remote_path = os.path.abspath(git_repo_path) if not os.path.isdir(os.path.join(git_remote, '.git')): os.makedirs(git_remote, exist_ok=True) thumbdriverepo = git.Repo.init(git_remote) local = thumbdriverepo.create_remote('local', remote_path) else: thumbdriverepo = git.Repo(git_remote) local = git.remote.Remote(thumbdriverepo, 'local') if local in thumbdriverepo.remotes: local = thumbdriverepo.remote('local') if 'set_url' in dir(local): # force remote URL if using GitPython 2.x local.set_url(remote_path) else: local = thumbdriverepo.create_remote('local', remote_path) local.pull('master') else: # from online machine to remote on a server on the internet gitrepo = git.Repo(git_repo_path) origin = git.remote.Remote(gitrepo, 'origin') if origin in gitrepo.remotes: origin = gitrepo.remote('origin') if 'set_url' in dir(origin): # added in GitPython 2.x origin.set_url(git_remote) else: origin = gitrepo.create_remote('origin', git_remote) origin.push('master') def main(): global config, options parser = ArgumentParser() common.setup_global_opts(parser) parser.add_argument("-i", "--identity-file", default=None, help=_("Specify an identity file to provide to SSH for rsyncing")) parser.add_argument("--local-copy-dir", default=None, help=_("Specify a local folder to sync the repo to")) parser.add_argument("--no-checksum", action="store_true", default=False, help=_("Don't use rsync checksums")) parser.add_argument("--no-keep-git-mirror-archive", action="store_true", default=False, help=_("If a git mirror gets to big, allow the archive to be deleted")) options = parser.parse_args() config = common.read_config(options) if config.get('nonstandardwebroot') is True: standardwebroot = False else: standardwebroot = True for serverwebroot in config.get('serverwebroot', []): # this supports both an ssh host:path and just a path s = serverwebroot.rstrip('/').split(':') if len(s) == 1: fdroiddir = s[0] elif len(s) == 2: host, fdroiddir = s else: logging.error(_('Malformed serverwebroot line:') + ' ' + serverwebroot) sys.exit(1) repobase = os.path.basename(fdroiddir) if standardwebroot and repobase != 'fdroid': logging.error('serverwebroot path does not end with "fdroid", ' + 'perhaps you meant one of these:\n\t' + serverwebroot.rstrip('/') + '/fdroid\n\t' + serverwebroot.rstrip('/').rstrip(repobase) + 'fdroid') sys.exit(1) if options.local_copy_dir is not None: local_copy_dir = options.local_copy_dir elif config.get('local_copy_dir'): local_copy_dir = config['local_copy_dir'] else: local_copy_dir = None if local_copy_dir is not None: fdroiddir = local_copy_dir.rstrip('/') if os.path.exists(fdroiddir) and not os.path.isdir(fdroiddir): logging.error(_('local_copy_dir must be directory, not a file!')) sys.exit(1) if not os.path.exists(os.path.dirname(fdroiddir)): logging.error(_('The root dir for local_copy_dir "{path}" does not exist!') .format(path=os.path.dirname(fdroiddir))) sys.exit(1) if not os.path.isabs(fdroiddir): logging.error(_('local_copy_dir must be an absolute path!')) sys.exit(1) repobase = os.path.basename(fdroiddir) if standardwebroot and repobase != 'fdroid': logging.error(_('local_copy_dir does not end with "fdroid", ' + 'perhaps you meant: "{path}"') .format(path=fdroiddir + '/fdroid')) sys.exit(1) if local_copy_dir[-1] != '/': local_copy_dir += '/' local_copy_dir = local_copy_dir.replace('//', '/') if not os.path.exists(fdroiddir): os.mkdir(fdroiddir) if not config.get('awsbucket') \ and not config.get('serverwebroot') \ and not config.get('servergitmirrors') \ and not config.get('androidobservatory') \ and not config.get('binary_transparency_remote') \ and not config.get('virustotal_apikey') \ and local_copy_dir is None: logging.warning(_('No option set! Edit your config.yml to set at least one of these:') + '\nserverwebroot, servergitmirrors, local_copy_dir, awsbucket, ' + 'virustotal_apikey, androidobservatory, or binary_transparency_remote') sys.exit(1) repo_sections = ['repo'] if config['archive_older'] != 0: repo_sections.append('archive') if not os.path.exists('archive'): os.mkdir('archive') if config['per_app_repos']: repo_sections += common.get_per_app_repos() if os.path.isdir('unsigned') or (local_copy_dir is not None and os.path.isdir(os.path.join(local_copy_dir, 'unsigned'))): repo_sections.append('unsigned') for repo_section in repo_sections: if local_copy_dir is not None: if config['sync_from_local_copy_dir']: sync_from_localcopy(repo_section, local_copy_dir) else: update_localcopy(repo_section, local_copy_dir) for serverwebroot in config.get('serverwebroot', []): update_serverwebroot(serverwebroot, repo_section) if config.get('servergitmirrors', []): # update_servergitmirrors will take care of multiple mirrors so don't need a foreach servergitmirrors = config.get('servergitmirrors', []) update_servergitmirrors(servergitmirrors, repo_section) if config.get('awsbucket'): update_awsbucket(repo_section) if config.get('androidobservatory'): upload_to_android_observatory(repo_section) if config.get('virustotal_apikey'): upload_to_virustotal(repo_section, config.get('virustotal_apikey')) binary_transparency_remote = config.get('binary_transparency_remote') if binary_transparency_remote: push_binary_transparency(BINARY_TRANSPARENCY_DIR, binary_transparency_remote) common.write_status_json(common.setup_status_output(start_timestamp)) sys.exit(0) if __name__ == "__main__": main() fdroidserver-2.1/fdroidserver/exception.py0000644000175000017500000000170714203004041021015 0ustar hanshans00000000000000class FDroidException(Exception): def __init__(self, value=None, detail=None): self.value = value self.detail = detail def shortened_detail(self): if len(self.detail) < 16000: return self.detail return '[...]\n' + self.detail[-16000:] def __str__(self): if self.value is None: ret = __name__ else: ret = str(self.value) if self.detail: ret += ( "\n==== detail begin ====\n%s\n==== detail end ====" % ''.join(self.detail).strip() ) return ret class MetaDataException(Exception): def __init__(self, value): self.value = value def __str__(self): return self.value class VCSException(FDroidException): pass class NoSubmodulesException(VCSException): pass class BuildException(FDroidException): pass class VerificationException(FDroidException): pass fdroidserver-2.1/fdroidserver/gpgsign.py0000644000175000017500000000571014205260731020467 0ustar hanshans00000000000000#!/usr/bin/env python3 # # gpgsign.py - part of the FDroid server tools # Copyright (C) 2014, Ciaran Gultnieks, ciaran@ciarang.com # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . import os import glob from argparse import ArgumentParser import logging import time from . import _ from . import common from .common import FDroidPopen from .exception import FDroidException config = None options = None start_timestamp = time.gmtime() def status_update_json(signed): """Output a JSON file with metadata about this run.""" logging.debug(_('Outputting JSON')) output = common.setup_status_output(start_timestamp) if signed: output['signed'] = signed common.write_status_json(output) def main(): global config, options # Parse command line... parser = ArgumentParser() common.setup_global_opts(parser) options = parser.parse_args() config = common.read_config(options) repodirs = ['repo'] if config['archive_older'] != 0: repodirs.append('archive') signed = [] for output_dir in repodirs: if not os.path.isdir(output_dir): raise FDroidException( _("Missing output directory") + " '" + output_dir + "'" ) # Process any apks that are waiting to be signed... for f in sorted(glob.glob(os.path.join(output_dir, '*.*'))): if common.get_file_extension(f) == 'asc': continue if not common.is_repo_file(f) and not f.endswith('/index-v1.json'): continue filename = os.path.basename(f) sigfilename = filename + ".asc" sigpath = os.path.join(output_dir, sigfilename) if not os.path.exists(sigpath): gpgargs = ['gpg', '-a', '--output', sigpath, '--detach-sig'] if 'gpghome' in config: gpgargs.extend(['--homedir', config['gpghome']]) if 'gpgkey' in config: gpgargs.extend(['--local-user', config['gpgkey']]) gpgargs.append(os.path.join(output_dir, filename)) p = FDroidPopen(gpgargs) if p.returncode != 0: raise FDroidException("Signing failed.") signed.append(filename) logging.info('Signed ' + filename) status_update_json(signed) if __name__ == "__main__": main() fdroidserver-2.1/fdroidserver/import.py0000644000175000017500000002462014203004041020330 0ustar hanshans00000000000000#!/usr/bin/env python3 # # import.py - part of the FDroid server tools # Copyright (C) 2010-13, Ciaran Gultnieks, ciaran@ciarang.com # Copyright (C) 2013-2014 Daniel Martí # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . import configparser import git import json import shutil import sys import yaml from argparse import ArgumentParser import logging from pathlib import Path try: from yaml import CSafeLoader as SafeLoader except ImportError: from yaml import SafeLoader from . import _ from . import common from . import metadata from .exception import FDroidException config = None options = None # WARNING! This cannot be imported as a Python module, so reuseable functions need to go into common.py! def clone_to_tmp_dir(app): tmp_dir = Path('tmp') tmp_dir.mkdir(exist_ok=True) tmp_dir = tmp_dir / 'importer' if tmp_dir.exists(): shutil.rmtree(str(tmp_dir), onerror=common.handle_retree_error_on_windows) vcs = common.getvcs(app.RepoType, app.Repo, tmp_dir) vcs.gotorevision(options.rev) return tmp_dir def check_for_kivy_buildozer(tmp_importer_dir, app, build): versionCode = None buildozer_spec = tmp_importer_dir / 'buildozer.spec' if buildozer_spec.exists(): config = configparser.ConfigParser() config.read(buildozer_spec) import pprint pprint.pprint(sorted(config['app'].keys())) app.id = config['app'].get('package.domain') print(app.id) app.AutoName = config['app'].get('package.name', app.AutoName) app.License = config['app'].get('license', app.License) app.Description = config['app'].get('description', app.Description) build.versionName = config['app'].get('version') build.output = 'bin/%s-$$VERSION$$-release-unsigned.apk' % app.AutoName build.ndk = 'r17c' build.srclibs = [ 'buildozer@586152c', 'python-for-android@ccb0f8e1', ] build.sudo = [ 'apt-get update', 'apt-get install -y build-essential libffi-dev libltdl-dev', ] build.prebuild = [ 'sed -iE "/^[# ]*android\\.(ant|ndk|sdk)_path[ =]/d" buildozer.spec', 'sed -iE "/^[# ]*android.accept_sdk_license[ =]+.*/d" buildozer.spec', 'sed -iE "/^[# ]*android.skip_update[ =]+.*/d" buildozer.spec', 'sed -iE "/^[# ]*p4a.source_dir[ =]+.*/d" buildozer.spec', 'sed -i "s,\\[app\\],[app]\\n\\nandroid.sdk_path = $$SDK$$\\nandroid.ndk_path = $$NDK$$\\np4a.source_dir = $$python-for-android$$\\nandroid.accept_sdk_license = False\\nandroid.skip_update = True\\nandroid.ant_path = /usr/bin/ant\\n," buildozer.spec', 'pip3 install --user --upgrade $$buildozer$$ Cython==0.28.6', ] build.build = [ 'PATH="$HOME/.local/bin:$PATH" buildozer android release', ] return build.get('versionName'), versionCode, app.get('id') def main(): global config, options # Parse command line... parser = ArgumentParser() common.setup_global_opts(parser) parser.add_argument("-u", "--url", default=None, help=_("Project URL to import from.")) parser.add_argument("-s", "--subdir", default=None, help=_("Path to main Android project subdirectory, if not in root.")) parser.add_argument("-c", "--categories", default=None, help=_("Comma separated list of categories.")) parser.add_argument("-l", "--license", default=None, help=_("Overall license of the project.")) parser.add_argument("--omit-disable", action="store_true", default=False, help=_("Do not add 'disable:' to the generated build entries")) parser.add_argument("--rev", default=None, help=_("Allows a different revision (or git branch) to be specified for the initial import")) metadata.add_metadata_arguments(parser) options = parser.parse_args() metadata.warnings_action = options.W config = common.read_config(options) apps = metadata.read_metadata() app = None tmp_importer_dir = None local_metadata_files = common.get_local_metadata_files() if local_metadata_files != []: raise FDroidException(_("This repo already has local metadata: %s") % local_metadata_files[0]) build = metadata.Build() if options.url is None and Path('.git').is_dir(): app = metadata.App() app.AutoName = Path.cwd().name app.RepoType = 'git' if Path('build.gradle').exists() or Path('build.gradle.kts').exists(): build.gradle = ['yes'] # TODO: Python3.6: Should accept path-like git_repo = git.Repo(str(Path.cwd())) for remote in git.Remote.iter_items(git_repo): if remote.name == 'origin': url = git_repo.remotes.origin.url if url.startswith('https://git'): # github, gitlab app.SourceCode = url.rstrip('.git') app.Repo = url break write_local_file = True elif options.url: app = common.get_app_from_url(options.url) tmp_importer_dir = clone_to_tmp_dir(app) # TODO: Python3.6: Should accept path-like git_repo = git.Repo(str(tmp_importer_dir)) if not options.omit_disable: build.disable = 'Generated by import.py - check/set version fields and commit id' write_local_file = False else: raise FDroidException("Specify project url.") app.UpdateCheckMode = 'Tags' build.commit = common.get_head_commit_id(git_repo) versionName, versionCode, appid = check_for_kivy_buildozer(tmp_importer_dir, app, build) # Extract some information... paths = common.get_all_gradle_and_manifests(tmp_importer_dir) subdir = common.get_gradle_subdir(tmp_importer_dir, paths) if paths: versionName, versionCode, appid = common.parse_androidmanifests(paths, app) if not appid: raise FDroidException(_("Couldn't find Application ID")) if not versionName: logging.warning(_('Could not find latest version name')) if not versionCode: logging.warning(_('Could not find latest version code')) elif not appid: raise FDroidException(_("No gradle project could be found. Specify --subdir?")) # Make sure it's actually new... if appid in apps: raise FDroidException(_('Package "{appid}" already exists').format(appid=appid)) # Create a build line... build.versionName = versionName or 'Unknown' build.versionCode = versionCode or '0' # TODO heinous but this is still a str if options.subdir: build.subdir = options.subdir build.gradle = ['yes'] elif subdir: build.subdir = subdir.as_posix() build.gradle = ['yes'] if options.license: app.License = options.license if options.categories: app.Categories = options.categories.split(',') if (subdir / 'jni').exists(): build.buildjni = ['yes'] if (subdir / 'build.gradle').exists() or (subdir / 'build.gradle').exists(): build.gradle = ['yes'] package_json = tmp_importer_dir / 'package.json' # react-native pubspec_yaml = tmp_importer_dir / 'pubspec.yaml' # flutter if package_json.exists(): build.sudo = ['apt-get update || apt-get update', 'apt-get install -t stretch-backports npm', 'npm install -g react-native-cli'] build.init = ['npm install'] with package_json.open() as fp: data = json.load(fp) app.AutoName = data.get('name', app.AutoName) app.License = data.get('license', app.License) app.Description = data.get('description', app.Description) app.WebSite = data.get('homepage', app.WebSite) app_json = tmp_importer_dir / 'app.json' if app_json.exists(): with app_json.open() as fp: data = json.load(fp) app.AutoName = data.get('name', app.AutoName) if pubspec_yaml.exists(): with pubspec_yaml.open() as fp: data = yaml.load(fp, Loader=SafeLoader) app.AutoName = data.get('name', app.AutoName) app.License = data.get('license', app.License) app.Description = data.get('description', app.Description) build.srclibs = ['flutter@stable'] build.output = 'build/app/outputs/apk/release/app-release.apk' build.build = [ '$$flutter$$/bin/flutter config --no-analytics', '$$flutter$$/bin/flutter packages pub get', '$$flutter$$/bin/flutter build apk', ] git_modules = tmp_importer_dir / '.gitmodules' if git_modules.exists(): build.submodules = True metadata.post_metadata_parse(app) app['Builds'].append(build) if write_local_file: metadata.write_metadata(Path('.fdroid.yml'), app) else: # Keep the repo directory to save bandwidth... Path('build').mkdir(exist_ok=True) build_dir = Path('build') / appid if build_dir.exists(): logging.warning(_('{path} already exists, ignoring import results!') .format(path=build_dir)) sys.exit(1) elif tmp_importer_dir: # For Windows: Close the repo or a git.exe instance holds handles to repo try: git_repo.close() except AttributeError: # Debian/stretch's version does not have close() pass # TODO: Python3.9: Accepts a path-like object for both src and dst. shutil.move(str(tmp_importer_dir), str(build_dir)) Path('build/.fdroidvcs-' + appid).write_text(app.RepoType + ' ' + app.Repo) metadatapath = Path('metadata') / (appid + '.yml') metadata.write_metadata(metadatapath, app) logging.info("Wrote " + str(metadatapath)) if __name__ == "__main__": main() fdroidserver-2.1/fdroidserver/index.py0000644000175000017500000012522014203004041020123 0ustar hanshans00000000000000#!/usr/bin/env python3 # # update.py - part of the FDroid server tools # Copyright (C) 2017, Torsten Grote # Copyright (C) 2016, Blue Jay Wireless # Copyright (C) 2014-2016, Hans-Christoph Steiner # Copyright (C) 2010-2015, Ciaran Gultnieks # Copyright (C) 2013-2014, Daniel Martí # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . import collections import json import logging import os import re import shutil import tempfile import urllib.parse import zipfile import calendar import qrcode from binascii import hexlify, unhexlify from datetime import datetime, timezone from xml.dom.minidom import Document from . import _ from . import common from . import metadata from . import net from . import signindex from fdroidserver.common import FDroidPopen, FDroidPopenBytes, load_stats_fdroid_signing_key_fingerprints from fdroidserver.exception import FDroidException, VerificationException def make(apps, apks, repodir, archive): """Generate the repo index files. This requires properly initialized options and config objects. Parameters ---------- apps OrderedDict of apps to go into the index, each app should have at least one associated apk apks list of apks to go into the index repodir the repo directory archive True if this is the archive repo, False if it's the main one. """ from fdroidserver.update import METADATA_VERSION if hasattr(common.options, 'nosign') and common.options.nosign: if 'keystore' not in common.config and 'repo_pubkey' not in common.config: raise FDroidException(_('"repo_pubkey" must be present in config.yml when using --nosign!')) else: common.assert_config_keystore(common.config) # Historically the index has been sorted by App Name, so we enforce this ordering here sortedids = sorted(apps, key=lambda appid: common.get_app_display_name(apps[appid]).upper()) sortedapps = collections.OrderedDict() for appid in sortedids: sortedapps[appid] = apps[appid] repodict = collections.OrderedDict() repodict['timestamp'] = datetime.utcnow().replace(tzinfo=timezone.utc) repodict['version'] = METADATA_VERSION if common.config['repo_maxage'] != 0: repodict['maxage'] = common.config['repo_maxage'] if archive: repodict['name'] = common.config['archive_name'] repodict['icon'] = common.config.get('archive_icon', common.default_config['repo_icon']) repodict['description'] = common.config['archive_description'] archive_url = common.config.get('archive_url', common.config['repo_url'][:-4] + 'archive') repodict['address'] = archive_url urlbasepath = os.path.basename(urllib.parse.urlparse(archive_url).path) else: repodict['name'] = common.config['repo_name'] repodict['icon'] = common.config.get('repo_icon', common.default_config['repo_icon']) repodict['address'] = common.config['repo_url'] repodict['description'] = common.config['repo_description'] urlbasepath = os.path.basename(urllib.parse.urlparse(common.config['repo_url']).path) mirrorcheckfailed = False mirrors = [] for mirror in common.config.get('mirrors', []): base = os.path.basename(urllib.parse.urlparse(mirror).path.rstrip('/')) if common.config.get('nonstandardwebroot') is not True and base != 'fdroid': logging.error(_("mirror '%s' does not end with 'fdroid'!") % mirror) mirrorcheckfailed = True # must end with / or urljoin strips a whole path segment if mirror.endswith('/'): mirrors.append(urllib.parse.urljoin(mirror, urlbasepath)) else: mirrors.append(urllib.parse.urljoin(mirror + '/', urlbasepath)) for mirror in common.config.get('servergitmirrors', []): for url in get_mirror_service_urls(mirror): mirrors.append(url + '/' + repodir) if mirrorcheckfailed: raise FDroidException(_("Malformed repository mirrors.")) if mirrors: repodict['mirrors'] = mirrors requestsdict = collections.OrderedDict() for command in ('install', 'uninstall'): packageNames = [] key = command + '_list' if key in common.config: if isinstance(common.config[key], str): packageNames = [common.config[key]] elif all(isinstance(item, str) for item in common.config[key]): packageNames = common.config[key] else: raise TypeError(_('only accepts strings, lists, and tuples')) requestsdict[command] = packageNames fdroid_signing_key_fingerprints = load_stats_fdroid_signing_key_fingerprints() make_v0(sortedapps, apks, repodir, repodict, requestsdict, fdroid_signing_key_fingerprints) make_v1(sortedapps, apks, repodir, repodict, requestsdict, fdroid_signing_key_fingerprints) make_website(sortedapps, repodir, repodict) def _should_file_be_generated(path, magic_string): if os.path.exists(path): with open(path) as f: # if the magic_string is not in the first line the file should be overwritten if magic_string not in f.readline(): return False return True def make_website(apps, repodir, repodict): _ignored, repo_pubkey_fingerprint = extract_pubkey() repo_pubkey_fingerprint_stripped = repo_pubkey_fingerprint.replace(" ", "") link = repodict["address"] link_fingerprinted = ('{link}?fingerprint={fingerprint}' .format(link=link, fingerprint=repo_pubkey_fingerprint_stripped)) # do not change this string, as it will break updates for files with older versions of this string autogenerate_comment = "auto-generated - fdroid index updates will overwrite this file" if not os.path.exists(repodir): os.makedirs(repodir) qrcode.make(link_fingerprinted).save(os.path.join(repodir, "index.png")) html_name = 'index.html' html_file = os.path.join(repodir, html_name) if _should_file_be_generated(html_file, autogenerate_comment): with open(html_file, 'w') as f: name = repodict["name"] description = repodict["description"] icon = repodict["icon"] f.write(""" {name}

{name}

QR: test {description}

Currently it serves {number_of_apps} apps. To add it to your F-Droid client, scan the QR code (click it to enlarge) or use this URL:

{link}

If you would like to manually verify the fingerprint (SHA-256) of the repository signing key, here it is:
{fingerprint}

""".format(autogenerate_comment=autogenerate_comment, description=description, fingerprint=repo_pubkey_fingerprint, icon=icon, link=link, link_fingerprinted=link_fingerprinted, name=name, number_of_apps=str(len(apps)))) css_file = os.path.join(repodir, "index.css") if _should_file_be_generated(css_file, autogenerate_comment): with open(css_file, "w") as f: # this auto generated comment was not included via .format(), as python seems to have problems with css files in combination with .format() f.write("""/* auto-generated - fdroid index updates will overwrite this file */ BODY { font-family : Arial, Helvetica, Sans-Serif; color : #0000ee; background-color : #ffffff; } p { text-align : justify; } p.center { text-align : center; } TD { font-family : Arial, Helvetica, Sans-Serif; color : #0000ee; } body,td { font-size : 14px; } TH { font-family : Arial, Helvetica, Sans-Serif; color : #0000ee; background-color : #F5EAD4; } a:link { color : #bb0000; } a:visited { color : #ff0000; } .zitat { margin-left : 1cm; margin-right : 1cm; font-style : italic; } #intro { border-spacing : 1em; border : 1px solid gray; border-radius : 0.5em; box-shadow : 10px 10px 5px #888; margin : 1.5em; font-size : .9em; width : 600px; max-width : 90%; display : table; margin-left : auto; margin-right : auto; font-size : .8em; color : #555555; } #intro > p { margin-top : 0; } #intro p:last-child { margin-bottom : 0; } .last { border-bottom : 1px solid black; padding-bottom : .5em; text-align : center; } table { border-collapse : collapse; } h2 { text-align : center; } .perms { font-family : monospace; font-size : .8em; } .repoapplist { display : table; border-collapse : collapse; margin-left : auto; margin-right : auto; width : 600px; max-width : 90%; } .approw, appdetailrow { display : table-row; } .appdetailrow { display : flex; padding : .5em; } .appiconbig, .appdetailblock, .appdetailcell { display : table-cell } .appiconbig { vertical-align : middle; text-align : center; } .appdetailinner { width : 100%; } .applinkcell { text-align : center; float : right; width : 100%; margin-bottom : .1em; } .paddedlink { margin : 1em; } .approw { border-spacing : 1em; border : 1px solid gray; border-radius : 0.5em; padding : 0.5em; margin : 1.5em; } .appdetailinner .appdetailrow:first-child { background-color : #d5d5d5; } .appdetailinner .appdetailrow:first-child .appdetailcell { min-width : 33%; flex : 1 33%; text-align : center; } .appdetailinner .appdetailrow:first-child .appdetailcell:first-child { text-align : left; } .appdetailinner .appdetailrow:first-child .appdetailcell:last-child { float : none; text-align : right; } .minor-details { font-size : .8em; color : #555555; } .boldname { font-weight : bold; } #appcount { text-align : center; margin-bottom : .5em; } kbd { padding : 0.1em 0.6em; border : 1px solid #CCC; background-color : #F7F7F7; color : #333; box-shadow : 0px 1px 0px rgba(0, 0, 0, 0.2), 0px 0px 0px 2px #FFF inset; border-radius : 3px; display : inline-block; margin : 0px 0.1em; text-shadow : 0px 1px 0px #FFF; white-space : nowrap; } div.filterline, div.repoline { display : table; margin-left : auto; margin-right : auto; margin-bottom : 1em; vertical-align : middle; display : table; font-size : .8em; } .filterline form { display : table-row; } .filterline .filtercell { display : table-cell; vertical-align : middle; } fieldset { float : left; } fieldset select, fieldset input, #reposelect select, #reposelect input { font-size : .9em; } .pager { display : table; margin-left : auto; margin-right : auto; width : 600px; max-width : 90%; padding-top : .6em; } /* should correspond to .repoapplist */ .pagerrow { display : table-row; } .pagercell { display : table-cell; } .pagercell.left { text-align : left; padding-right : 1em; } .pagercell.middle { text-align : center; font-size : .9em; color : #555; } .pagercell.right { text-align : right; padding-left : 1em; } .anti { color : peru; } .antibold { color : crimson; } #footer { text-align : center; margin-top : 1em; font-size : 11px; color : #555; } #footer img { vertical-align : middle; } @media (max-width: 600px) { .repoapplist { display : block; } .appdetailinner, .appdetailrow { display : block; } .appdetailcell { display : block; float : left; line-height : 1.5em; } }""") def make_v1(apps, packages, repodir, repodict, requestsdict, fdroid_signing_key_fingerprints): def _index_encoder_default(obj): if isinstance(obj, set): return sorted(list(obj)) if isinstance(obj, datetime): # Java prefers milliseconds # we also need to account for time zone/daylight saving time return int(calendar.timegm(obj.timetuple()) * 1000) if isinstance(obj, dict): d = collections.OrderedDict() for key in sorted(obj.keys()): d[key] = obj[key] return d raise TypeError(repr(obj) + " is not JSON serializable") output = collections.OrderedDict() output['repo'] = repodict output['requests'] = requestsdict # establish sort order of the index v1_sort_packages(packages, fdroid_signing_key_fingerprints) appslist = [] output['apps'] = appslist for packageName, appdict in apps.items(): d = collections.OrderedDict() appslist.append(d) for k, v in sorted(appdict.items()): if not v: continue if k in ('Builds', 'comments', 'metadatapath', 'ArchivePolicy', 'AutoName', 'AutoUpdateMode', 'MaintainerNotes', 'Provides', 'Repo', 'RepoType', 'RequiresRoot', 'UpdateCheckData', 'UpdateCheckIgnore', 'UpdateCheckMode', 'UpdateCheckName', 'NoSourceSince', 'VercodeOperation'): continue # name things after the App class fields in fdroidclient if k == 'id': k = 'packageName' elif k == 'CurrentVersionCode': # TODO make SuggestedVersionCode the canonical name k = 'suggestedVersionCode' elif k == 'CurrentVersion': # TODO make SuggestedVersionName the canonical name k = 'suggestedVersionName' else: k = k[:1].lower() + k[1:] d[k] = v # establish sort order in localized dicts for app in output['apps']: localized = app.get('localized') if localized: lordered = collections.OrderedDict() for lkey, lvalue in sorted(localized.items()): lordered[lkey] = collections.OrderedDict() for ikey, iname in sorted(lvalue.items()): lordered[lkey][ikey] = iname app['localized'] = lordered output_packages = collections.OrderedDict() output['packages'] = output_packages for package in packages: packageName = package['packageName'] if packageName not in apps: logging.info(_('Ignoring package without metadata: ') + package['apkName']) continue if not package.get('versionName'): app = apps[packageName] versionCodeStr = str(package['versionCode']) # TODO build.versionCode should be int! for build in app.get('Builds', []): if build['versionCode'] == versionCodeStr: versionName = build.get('versionName') logging.info(_('Overriding blank versionName in {apkfilename} from metadata: {version}') .format(apkfilename=package['apkName'], version=versionName)) package['versionName'] = versionName break if packageName in output_packages: packagelist = output_packages[packageName] else: packagelist = [] output_packages[packageName] = packagelist d = collections.OrderedDict() packagelist.append(d) for k, v in sorted(package.items()): if not v: continue if k in ('icon', 'icons', 'icons_src', 'name', ): continue d[k] = v json_name = 'index-v1.json' index_file = os.path.join(repodir, json_name) with open(index_file, 'w') as fp: if common.options.pretty: json.dump(output, fp, default=_index_encoder_default, indent=2) else: json.dump(output, fp, default=_index_encoder_default) if common.options.nosign: _copy_to_local_copy_dir(repodir, index_file) logging.debug(_('index-v1 must have a signature, use `fdroid signindex` to create it!')) else: signindex.config = common.config signindex.sign_index_v1(repodir, json_name) def _copy_to_local_copy_dir(repodir, f): local_copy_dir = common.config.get('local_copy_dir', '') if os.path.exists(local_copy_dir): destdir = os.path.join(local_copy_dir, repodir) if not os.path.exists(destdir): os.mkdir(destdir) shutil.copy2(f, destdir, follow_symlinks=False) elif local_copy_dir: raise FDroidException(_('"local_copy_dir" {path} does not exist!') .format(path=local_copy_dir)) def v1_sort_packages(packages, fdroid_signing_key_fingerprints): """Sort the supplied list to ensure a deterministic sort order for package entries in the index file. This sort-order also expresses installation preference to the clients. (First in this list = first to install) Parameters ---------- packages list of packages which need to be sorted before but into index file. """ GROUP_DEV_SIGNED = 1 GROUP_FDROID_SIGNED = 2 GROUP_OTHER_SIGNED = 3 def v1_sort_keys(package): packageName = package.get('packageName', None) signer = package.get('signer', None) dev_signer = common.metadata_find_developer_signature(packageName) group = GROUP_OTHER_SIGNED if dev_signer and dev_signer == signer: group = GROUP_DEV_SIGNED else: fdroid_signer = fdroid_signing_key_fingerprints.get(packageName, {}).get('signer') if fdroid_signer and fdroid_signer == signer: group = GROUP_FDROID_SIGNED versionCode = None if package.get('versionCode', None): versionCode = -int(package['versionCode']) return(packageName, group, signer, versionCode) packages.sort(key=v1_sort_keys) def make_v0(apps, apks, repodir, repodict, requestsdict, fdroid_signing_key_fingerprints): """Aka index.jar aka index.xml.""" doc = Document() def addElement(name, value, doc, parent): el = doc.createElement(name) el.appendChild(doc.createTextNode(value)) parent.appendChild(el) def addElementNonEmpty(name, value, doc, parent): if not value: return addElement(name, value, doc, parent) def addElementIfInApk(name, apk, key, doc, parent): if key not in apk: return value = str(apk[key]) addElement(name, value, doc, parent) def addElementCheckLocalized(name, app, key, doc, parent, default=''): """Fill in field from metadata or localized block. For name/summary/description, they can come only from the app source, or from a dir in fdroiddata. They can be entirely missing from the metadata file if there is localized versions. This will fetch those from the localized version if its not available in the metadata file. Attributes should be alpha-sorted, so they must be added in alpha- sort order. """ el = doc.createElement(name) value = app.get(key) lkey = key[:1].lower() + key[1:] localized = app.get('localized') if not value and localized: for lang in ['en-US'] + [x for x in localized.keys()]: if not lang.startswith('en'): continue if lang in localized: value = localized[lang].get(lkey) if value: break if not value and localized and len(localized) > 1: lang = list(localized.keys())[0] value = localized[lang].get(lkey) if not value: value = default if not value and name == 'name' and app.get('AutoName'): value = app['AutoName'] el.appendChild(doc.createTextNode(value)) parent.appendChild(el) root = doc.createElement("fdroid") doc.appendChild(root) repoel = doc.createElement("repo") repoel.setAttribute("icon", repodict['icon']) if 'maxage' in repodict: repoel.setAttribute("maxage", str(repodict['maxage'])) repoel.setAttribute("name", repodict['name']) pubkey, repo_pubkey_fingerprint = extract_pubkey() repoel.setAttribute("pubkey", pubkey.decode('utf-8')) repoel.setAttribute("timestamp", '%d' % repodict['timestamp'].timestamp()) repoel.setAttribute("url", repodict['address']) repoel.setAttribute("version", str(repodict['version'])) addElement('description', repodict['description'], doc, repoel) for mirror in repodict.get('mirrors', []): addElement('mirror', mirror, doc, repoel) root.appendChild(repoel) for command in ('install', 'uninstall'): for packageName in requestsdict[command]: element = doc.createElement(command) root.appendChild(element) element.setAttribute('packageName', packageName) for appid, appdict in apps.items(): app = metadata.App(appdict) if app.get('Disabled') is not None: continue # Get a list of the apks for this app... apklist = [] name_from_apk = None apksbyversion = collections.defaultdict(lambda: []) for apk in apks: if apk.get('versionCode') and apk.get('packageName') == appid: apksbyversion[apk['versionCode']].append(apk) if name_from_apk is None: name_from_apk = apk.get('name') for versionCode, apksforver in apksbyversion.items(): fdroid_signer = fdroid_signing_key_fingerprints.get(appid, {}).get('signer') fdroid_signed_apk = None name_match_apk = None for x in apksforver: if fdroid_signer and x.get('signer', None) == fdroid_signer: fdroid_signed_apk = x if common.apk_release_filename.match(x.get('apkName', '')): name_match_apk = x # choose which of the available versions is most # suiteable for index v0 if fdroid_signed_apk: apklist.append(fdroid_signed_apk) elif name_match_apk: apklist.append(name_match_apk) else: apklist.append(apksforver[0]) if len(apklist) == 0: continue apel = doc.createElement("application") apel.setAttribute("id", app.id) root.appendChild(apel) addElement('id', app.id, doc, apel) if app.added: addElement('added', app.added.strftime('%Y-%m-%d'), doc, apel) if app.lastUpdated: addElement('lastupdated', app.lastUpdated.strftime('%Y-%m-%d'), doc, apel) addElementCheckLocalized('name', app, 'Name', doc, apel, name_from_apk) addElementCheckLocalized('summary', app, 'Summary', doc, apel) if app.icon: addElement('icon', app.icon, doc, apel) addElementCheckLocalized('desc', app, 'Description', doc, apel, 'No description available') addElement('license', app.License, doc, apel) if app.Categories: addElement('categories', ','.join(app.Categories), doc, apel) # We put the first (primary) category in LAST, which will have # the desired effect of making clients that only understand one # category see that one. addElement('category', app.Categories[0], doc, apel) addElement('web', app.WebSite, doc, apel) addElement('source', app.SourceCode, doc, apel) addElement('tracker', app.IssueTracker, doc, apel) addElementNonEmpty('changelog', app.Changelog, doc, apel) addElementNonEmpty('author', app.AuthorName, doc, apel) addElementNonEmpty('email', app.AuthorEmail, doc, apel) addElementNonEmpty('donate', app.Donate, doc, apel) addElementNonEmpty('bitcoin', app.Bitcoin, doc, apel) addElementNonEmpty('litecoin', app.Litecoin, doc, apel) addElementNonEmpty('flattr', app.FlattrID, doc, apel) addElementNonEmpty('liberapay', app.LiberapayID, doc, apel) addElementNonEmpty('openCollective', app.OpenCollective, doc, apel) # These elements actually refer to the current version (i.e. which # one is recommended. They are historically mis-named, and need # changing, but stay like this for now to support existing clients. addElement('marketversion', app.CurrentVersion, doc, apel) addElement('marketvercode', app.CurrentVersionCode, doc, apel) if app.Provides: pv = app.Provides.split(',') addElementNonEmpty('provides', ','.join(pv), doc, apel) if app.RequiresRoot: addElement('requirements', 'root', doc, apel) # Sort the APK list into version order, just so the web site # doesn't have to do any work by default... apklist = sorted(apklist, key=lambda apk: apk['versionCode'], reverse=True) if 'antiFeatures' in apklist[0]: app.AntiFeatures.extend(apklist[0]['antiFeatures']) if app.AntiFeatures: addElementNonEmpty('antifeatures', ','.join(app.AntiFeatures), doc, apel) # Check for duplicates - they will make the client unhappy... for i in range(len(apklist) - 1): first = apklist[i] second = apklist[i + 1] if first['versionCode'] == second['versionCode'] \ and first['sig'] == second['sig']: if first['hash'] == second['hash']: raise FDroidException('"{0}/{1}" and "{0}/{2}" are exact duplicates!'.format( repodir, first['apkName'], second['apkName'])) else: raise FDroidException('duplicates: "{0}/{1}" - "{0}/{2}"'.format( repodir, first['apkName'], second['apkName'])) current_version_code = 0 current_version_file = None for apk in apklist: file_extension = common.get_file_extension(apk['apkName']) # find the APK for the "Current Version" if current_version_code < int(app.CurrentVersionCode): current_version_file = apk['apkName'] if current_version_code < apk['versionCode']: current_version_code = apk['versionCode'] apkel = doc.createElement("package") apel.appendChild(apkel) versionName = apk.get('versionName') if not versionName: versionCodeStr = str(apk['versionCode']) # TODO build.versionCode should be int! for build in app.get('Builds', []): if build['versionCode'] == versionCodeStr and 'versionName' in build: versionName = build['versionName'] break if versionName: addElement('version', versionName, doc, apkel) addElement('versioncode', str(apk['versionCode']), doc, apkel) addElement('apkname', apk['apkName'], doc, apkel) addElementIfInApk('srcname', apk, 'srcname', doc, apkel) hashel = doc.createElement("hash") hashel.setAttribute('type', 'sha256') hashel.appendChild(doc.createTextNode(apk['hash'])) apkel.appendChild(hashel) addElement('size', str(apk['size']), doc, apkel) addElementIfInApk('sdkver', apk, 'minSdkVersion', doc, apkel) addElementIfInApk('targetSdkVersion', apk, 'targetSdkVersion', doc, apkel) addElementIfInApk('maxsdkver', apk, 'maxSdkVersion', doc, apkel) addElementIfInApk('obbMainFile', apk, 'obbMainFile', doc, apkel) addElementIfInApk('obbMainFileSha256', apk, 'obbMainFileSha256', doc, apkel) addElementIfInApk('obbPatchFile', apk, 'obbPatchFile', doc, apkel) addElementIfInApk('obbPatchFileSha256', apk, 'obbPatchFileSha256', doc, apkel) if 'added' in apk: addElement('added', apk['added'].strftime('%Y-%m-%d'), doc, apkel) if file_extension == 'apk': # sig is required for APKs, but only APKs addElement('sig', apk['sig'], doc, apkel) old_permissions = set() sorted_permissions = sorted(apk['uses-permission']) for perm in sorted_permissions: perm_name = perm[0] if perm_name.startswith("android.permission."): perm_name = perm_name[19:] old_permissions.add(perm_name) addElementNonEmpty('permissions', ','.join(sorted(old_permissions)), doc, apkel) for permission in sorted_permissions: permel = doc.createElement('uses-permission') if permission[1] is not None: permel.setAttribute('maxSdkVersion', '%d' % permission[1]) apkel.appendChild(permel) permel.setAttribute('name', permission[0]) for permission_sdk_23 in sorted(apk['uses-permission-sdk-23']): permel = doc.createElement('uses-permission-sdk-23') if permission_sdk_23[1] is not None: permel.setAttribute('maxSdkVersion', '%d' % permission_sdk_23[1]) apkel.appendChild(permel) permel.setAttribute('name', permission_sdk_23[0]) if 'nativecode' in apk: addElement('nativecode', ','.join(sorted(apk['nativecode'])), doc, apkel) addElementNonEmpty('features', ','.join(sorted(apk['features'])), doc, apkel) if current_version_file is not None \ and common.config['make_current_version_link'] \ and repodir == 'repo': # only create these namefield = common.config['current_version_name_source'] name = app.get(namefield) if not name and namefield == 'Name': name = app.get('localized', {}).get('en-US', {}).get('name') if not name: name = app.id sanitized_name = re.sub(b'''[ '"&%?+=/]''', b'', name.encode('utf-8')) apklinkname = sanitized_name + os.path.splitext(current_version_file)[1].encode('utf-8') current_version_path = os.path.join(repodir, current_version_file).encode('utf-8', 'surrogateescape') if os.path.islink(apklinkname): os.remove(apklinkname) os.symlink(current_version_path, apklinkname) # also symlink gpg signature, if it exists for extension in (b'.asc', b'.sig'): sigfile_path = current_version_path + extension if os.path.exists(sigfile_path): siglinkname = apklinkname + extension if os.path.islink(siglinkname): os.remove(siglinkname) os.symlink(sigfile_path, siglinkname) if common.options.pretty: output = doc.toprettyxml(encoding='utf-8') else: output = doc.toxml(encoding='utf-8') with open(os.path.join(repodir, 'index.xml'), 'wb') as f: f.write(output) if 'repo_keyalias' in common.config \ or (common.options.nosign and 'repo_pubkey' in common.config): if common.options.nosign: logging.info(_("Creating unsigned index in preparation for signing")) else: logging.info(_("Creating signed index with this key (SHA256):")) logging.info("%s" % repo_pubkey_fingerprint) # Create a jar of the index... jar_output = 'index_unsigned.jar' if common.options.nosign else 'index.jar' p = FDroidPopen(['jar', 'cf', jar_output, 'index.xml'], cwd=repodir) if p.returncode != 0: raise FDroidException("Failed to create {0}".format(jar_output)) # Sign the index... signed = os.path.join(repodir, 'index.jar') if common.options.nosign: _copy_to_local_copy_dir(repodir, os.path.join(repodir, jar_output)) # Remove old signed index if not signing if os.path.exists(signed): os.remove(signed) else: signindex.config = common.config signindex.sign_jar(signed) # Copy the repo icon into the repo directory... icon_dir = os.path.join(repodir, 'icons') repo_icon = common.config.get('repo_icon', common.default_config['repo_icon']) iconfilename = os.path.join(icon_dir, os.path.basename(repo_icon)) if os.path.exists(repo_icon): shutil.copyfile(common.config['repo_icon'], iconfilename) else: logging.warning(_('repo_icon "repo/icons/%s" does not exist, generating placeholder.') % repo_icon) os.makedirs(os.path.dirname(iconfilename), exist_ok=True) try: qrcode.make(common.config['repo_url']).save(iconfilename) except Exception: exampleicon = os.path.join(common.get_examples_dir(), common.default_config['repo_icon']) shutil.copy(exampleicon, iconfilename) def extract_pubkey(): """Extract and return the repository's public key from the keystore. Returns ------- public key in hex repository fingerprint """ if 'repo_pubkey' in common.config: pubkey = unhexlify(common.config['repo_pubkey']) else: env_vars = {'LC_ALL': 'C.UTF-8', 'FDROID_KEY_STORE_PASS': common.config['keystorepass']} p = FDroidPopenBytes([common.config['keytool'], '-exportcert', '-alias', common.config['repo_keyalias'], '-keystore', common.config['keystore'], '-storepass:env', 'FDROID_KEY_STORE_PASS'] + list(common.config['smartcardoptions']), envs=env_vars, output=False, stderr_to_stdout=False) if p.returncode != 0 or len(p.output) < 20: msg = "Failed to get repo pubkey!" if common.config['keystore'] == 'NONE': msg += ' Is your crypto smartcard plugged in?' raise FDroidException(msg) pubkey = p.output repo_pubkey_fingerprint = common.get_cert_fingerprint(pubkey) return hexlify(pubkey), repo_pubkey_fingerprint def get_mirror_service_urls(url): """Get direct URLs from git service for use by fdroidclient. Via 'servergitmirrors', fdroidserver can create and push a mirror to certain well known git services like GitLab or GitHub. This will always use the 'master' branch since that is the default branch in git. The files are then accessible via alternate URLs, where they are served in their raw format via a CDN rather than from git. Both of the GitLab URLs will work with F-Droid, but only the GitLab Pages will work in the browser This is because the "raw" URLs are not served with the correct mime types, so any index.html which is put in the repo will not be rendered. Putting an index.html file in the repo root is a common way for to make information about the repo available to end user. """ if url.startswith('git@'): url = re.sub(r'^git@([^:]+):(.+)', r'https://\1/\2', url) segments = url.split("/") if segments[4].endswith('.git'): segments[4] = segments[4][:-4] hostname = segments[2] user = segments[3] repo = segments[4] branch = "master" folder = "fdroid" urls = [] if hostname == "github.com": # Github-like RAW segments "https://raw.githubusercontent.com/user/repo/branch/folder" segments[2] = "raw.githubusercontent.com" segments.extend([branch, folder]) urls.append('/'.join(segments)) elif hostname == "gitlab.com": if common.get_dir_size(folder) <= common.GITLAB_COM_PAGES_MAX_SIZE: # Gitlab-like Pages segments "https://user.gitlab.io/repo/folder" gitlab_pages = ["https:", "", user + ".gitlab.io", repo, folder] urls.append('/'.join(gitlab_pages)) else: logging.warning( _( 'Skipping GitLab Pages mirror because the repo is too large (>%.2fGB)!' ) % (common.GITLAB_COM_PAGES_MAX_SIZE / 1000000000) ) # GitLab Raw "https://gitlab.com/user/repo/-/raw/branch/folder" gitlab_raw = segments + ['-', 'raw', branch, folder] urls.append('/'.join(gitlab_raw)) return urls def download_repo_index(url_str, etag=None, verify_fingerprint=True, timeout=600): """Download and verifies index file, then returns its data. Downloads the repository index from the given :param url_str and verifies the repository's fingerprint if :param verify_fingerprint is not False. Raises ------ VerificationException() if the repository could not be verified Returns ------- A tuple consisting of: - The index in JSON format or None if the index did not change - The new eTag as returned by the HTTP request """ url = urllib.parse.urlsplit(url_str) fingerprint = None if verify_fingerprint: query = urllib.parse.parse_qs(url.query) if 'fingerprint' not in query: raise VerificationException(_("No fingerprint in URL.")) fingerprint = query['fingerprint'][0] if url.path.endswith('/index-v1.jar'): path = url.path[:-13].rstrip('/') else: path = url.path.rstrip('/') url = urllib.parse.SplitResult(url.scheme, url.netloc, path + '/index-v1.jar', '', '') download, new_etag = net.http_get(url.geturl(), etag, timeout) if download is None: return None, new_etag with tempfile.NamedTemporaryFile() as fp: fp.write(download) fp.flush() index, public_key, public_key_fingerprint = get_index_from_jar(fp.name, fingerprint) index["repo"]["pubkey"] = hexlify(public_key).decode() index["repo"]["fingerprint"] = public_key_fingerprint index["apps"] = [metadata.App(app) for app in index["apps"]] return index, new_etag def get_index_from_jar(jarfile, fingerprint=None): """Return the data, public key, and fingerprint from index-v1.jar. Parameters ---------- fingerprint is the SHA-256 fingerprint of signing key. Only hex digits count, all other chars will can be discarded. Raises ------ VerificationException() if the repository could not be verified """ logging.debug(_('Verifying index signature:')) common.verify_jar_signature(jarfile) with zipfile.ZipFile(jarfile) as jar: public_key, public_key_fingerprint = get_public_key_from_jar(jar) if fingerprint is not None: fingerprint = re.sub(r'[^0-9A-F]', r'', fingerprint.upper()) if fingerprint != public_key_fingerprint: raise VerificationException(_("The repository's fingerprint does not match.")) data = json.loads(jar.read('index-v1.json').decode()) return data, public_key, public_key_fingerprint def get_public_key_from_jar(jar): """Get the public key and its fingerprint from a JAR file. Raises ------ VerificationException() if the JAR was not signed exactly once Parameters ---------- jar a zipfile.ZipFile object Returns ------- the public key from the jar and its fingerprint """ # extract certificate from jar certs = [n for n in jar.namelist() if common.SIGNATURE_BLOCK_FILE_REGEX.match(n)] if len(certs) < 1: raise VerificationException(_("Found no signing certificates for repository.")) if len(certs) > 1: raise VerificationException(_("Found multiple signing certificates for repository.")) # extract public key from certificate public_key = common.get_certificate(jar.read(certs[0])) public_key_fingerprint = common.get_cert_fingerprint(public_key).replace(' ', '') return public_key, public_key_fingerprint fdroidserver-2.1/fdroidserver/init.py0000644000175000017500000002632214203004041017762 0ustar hanshans00000000000000#!/usr/bin/env python3 # # init.py - part of the FDroid server tools # Copyright (C) 2010-2013, Ciaran Gultnieks, ciaran@ciarang.com # Copyright (C) 2013-2014 Daniel Martí # Copyright (C) 2013 Hans-Christoph Steiner # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . import glob import os import re import shutil import socket import sys from argparse import ArgumentParser import logging from . import _ from . import common from .exception import FDroidException config = {} options = None def disable_in_config(key, value): """Write a key/value to the local config.yml, then comment it out.""" import yaml with open('config.yml') as f: data = f.read() pattern = r'\n[\s#]*' + key + r':.*' repl = '\n#' + yaml.dump({key: value}, default_flow_style=False) data = re.sub(pattern, repl, data) with open('config.yml', 'w') as f: f.writelines(data) def main(): global options, config # Parse command line... parser = ArgumentParser() common.setup_global_opts(parser) parser.add_argument( "-d", "--distinguished-name", default=None, help=_("X.509 'Distinguished Name' used when generating keys"), ) parser.add_argument( "--keystore", default=None, help=_("Path to the keystore for the repo signing key"), ) parser.add_argument( "--repo-keyalias", default=None, help=_("Alias of the repo signing key in the keystore"), ) parser.add_argument( "--android-home", default=None, help=_("Path to the Android SDK (sometimes set in ANDROID_HOME)"), ) parser.add_argument( "--no-prompt", action="store_true", default=False, help=_("Do not prompt for Android SDK path, just fail"), ) options = parser.parse_args() fdroiddir = os.getcwd() test_config = dict() examplesdir = common.get_examples_dir() common.fill_config_defaults(test_config) # track down where the Android SDK is, the default is to use the path set # in ANDROID_HOME if that exists, otherwise None if options.android_home is not None: test_config['sdk_path'] = options.android_home elif not common.test_sdk_exists(test_config): # if neither --android-home nor the default sdk_path # exist, prompt the user using platform-specific default # and if the user leaves it blank, ignore and move on. default_sdk_path = '' if sys.platform == 'win32' or sys.platform == 'cygwin': p = os.path.join( os.getenv('USERPROFILE'), 'AppData', 'Local', 'Android', 'android-sdk' ) elif sys.platform == 'darwin': # on OSX, Homebrew is common and has an easy path to detect p = '/usr/local/opt/android-sdk' elif os.path.isdir('/usr/lib/android-sdk'): # if the Debian packages are installed, suggest them p = '/usr/lib/android-sdk' else: p = '/opt/android-sdk' if os.path.exists(p): default_sdk_path = p test_config['sdk_path'] = default_sdk_path if not common.test_sdk_exists(test_config): del (test_config['sdk_path']) while not options.no_prompt: try: s = input( _('Enter the path to the Android SDK (%s) here:\n> ') % default_sdk_path ) except KeyboardInterrupt: print('') sys.exit(1) if re.match(r'^\s*$', s) is not None: test_config['sdk_path'] = default_sdk_path else: test_config['sdk_path'] = s if common.test_sdk_exists(test_config): break default_sdk_path = '' if test_config.get('sdk_path') and not common.test_sdk_exists(test_config): raise FDroidException( _("Android SDK not found at {path}!").format(path=test_config['sdk_path']) ) if not os.path.exists('config.yml') and not os.path.exists('config.py'): # 'metadata' and 'tmp' are created in fdroid if not os.path.exists('repo'): os.mkdir('repo') example_config_yml = os.path.join(examplesdir, 'config.yml') if os.path.exists(example_config_yml): shutil.copyfile(example_config_yml, 'config.yml') else: from pkg_resources import get_distribution versionstr = get_distribution('fdroidserver').version if not versionstr: versionstr = 'master' with open('config.yml', 'w') as fp: fp.write('# see https://gitlab.com/fdroid/fdroidserver/blob/') fp.write(versionstr) fp.write('/examples/config.yml\n') os.chmod('config.yml', 0o0600) # If android_home is None, test_config['sdk_path'] will be used and # "$ANDROID_HOME" may be used if the env var is set up correctly. # If android_home is not None, the path given from the command line # will be directly written in the config. if 'sdk_path' in test_config: common.write_to_config(test_config, 'sdk_path', options.android_home) else: logging.warning( 'Looks like this is already an F-Droid repo, cowardly refusing to overwrite it...' ) logging.info('Try running `fdroid init` in an empty directory.') raise FDroidException('Repository already exists.') # now that we have a local config.yml, read configuration... config = common.read_config(options) # the NDK is optional and there may be multiple versions of it, so it's # left for the user to configure # find or generate the keystore for the repo signing key. First try the # path written in the default config.yml. Then check if the user has # specified a path from the command line, which will trump all others. # Otherwise, create ~/.local/share/fdroidserver and stick it in there. If # keystore is set to NONE, that means that Java will look for keys in a # Hardware Security Module aka Smartcard. keystore = config['keystore'] if options.keystore: keystore = os.path.abspath(options.keystore) if options.keystore == 'NONE': keystore = options.keystore else: keystore = os.path.abspath(options.keystore) if not os.path.exists(keystore): logging.info( '"' + keystore + '" does not exist, creating a new keystore there.' ) common.write_to_config(test_config, 'keystore', keystore) repo_keyalias = None keydname = None if options.repo_keyalias: repo_keyalias = options.repo_keyalias common.write_to_config(test_config, 'repo_keyalias', repo_keyalias) if options.distinguished_name: keydname = options.distinguished_name common.write_to_config(test_config, 'keydname', keydname) if keystore == 'NONE': # we're using a smartcard common.write_to_config( test_config, 'repo_keyalias', '1' ) # seems to be the default disable_in_config('keypass', 'never used with smartcard') common.write_to_config( test_config, 'smartcardoptions', ( '-storetype PKCS11 ' + '-providerClass sun.security.pkcs11.SunPKCS11 ' + '-providerArg opensc-fdroid.cfg' ), ) # find opensc-pkcs11.so if not os.path.exists('opensc-fdroid.cfg'): if os.path.exists('/usr/lib/opensc-pkcs11.so'): opensc_so = '/usr/lib/opensc-pkcs11.so' elif os.path.exists('/usr/lib64/opensc-pkcs11.so'): opensc_so = '/usr/lib64/opensc-pkcs11.so' else: files = glob.glob( '/usr/lib/' + os.uname()[4] + '-*-gnu/opensc-pkcs11.so' ) if len(files) > 0: opensc_so = files[0] else: opensc_so = '/usr/lib/opensc-pkcs11.so' logging.warning( 'No OpenSC PKCS#11 module found, ' + 'install OpenSC then edit "opensc-fdroid.cfg"!' ) with open('opensc-fdroid.cfg', 'w') as f: f.write('name = OpenSC\nlibrary = ') f.write(opensc_so) f.write('\n') logging.info( "Repo setup using a smartcard HSM. Please edit keystorepass and repo_keyalias in config.yml." ) logging.info( "If you want to generate a new repo signing key in the HSM you can do that with 'fdroid update " "--create-key'." ) elif os.path.exists(keystore): to_set = ['keystorepass', 'keypass', 'repo_keyalias', 'keydname'] if repo_keyalias: to_set.remove('repo_keyalias') if keydname: to_set.remove('keydname') logging.warning( '\n' + _('Using existing keystore "{path}"').format(path=keystore) + '\n' + _('Now set these in config.yml:') + ' ' + ', '.join(to_set) + '\n' ) else: password = common.genpassword() c = dict(test_config) c['keystorepass'] = password c['keypass'] = password c['repo_keyalias'] = socket.getfqdn() c['keydname'] = 'CN=' + c['repo_keyalias'] + ', OU=F-Droid' common.write_to_config(test_config, 'keystorepass', password) common.write_to_config(test_config, 'keypass', password) common.write_to_config(test_config, 'repo_keyalias', c['repo_keyalias']) common.write_to_config(test_config, 'keydname', c['keydname']) common.genkeystore(c) msg = '\n' msg += _('Built repo based in "%s" with this config:') % fdroiddir msg += '\n\n Android SDK:\t\t\t' + config['sdk_path'] msg += '\n ' + _('Keystore for signing key:\t') + keystore if repo_keyalias is not None: msg += '\n Alias for key in store:\t' + repo_keyalias msg += '\n\n' msg += ( _( '''To complete the setup, add your APKs to "%s" then run "fdroid update -c; fdroid update". You might also want to edit "config.yml" to set the URL, repo name, and more. You should also set up a signing key (a temporary one might have been automatically generated). For more info: https://f-droid.org/docs/Setup_an_F-Droid_App_Repo and https://f-droid.org/docs/Signing_Process''' ) % os.path.join(fdroiddir, 'repo') ) logging.info(msg) fdroidserver-2.1/fdroidserver/install.py0000644000175000017500000001051114203004041020456 0ustar hanshans00000000000000#!/usr/bin/env python3 # # install.py - part of the FDroid server tools # Copyright (C) 2013, Ciaran Gultnieks, ciaran@ciarang.com # Copyright (C) 2013-2014 Daniel Martí # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . import sys import os import glob from argparse import ArgumentParser import logging from . import _ from . import common from .common import SdkToolsPopen from .exception import FDroidException options = None config = None def devices(): p = SdkToolsPopen(['adb', "devices"]) if p.returncode != 0: raise FDroidException("An error occured when finding devices: %s" % p.output) lines = [line for line in p.output.splitlines() if not line.startswith('* ')] if len(lines) < 3: return [] lines = lines[1:-1] return [line.split()[0] for line in lines] def main(): global options, config # Parse command line... parser = ArgumentParser(usage="%(prog)s [options] [APPID[:VERCODE] [APPID[:VERCODE] ...]]") common.setup_global_opts(parser) parser.add_argument("appid", nargs='*', help=_("application ID with optional versionCode in the form APPID[:VERCODE]")) parser.add_argument("-a", "--all", action="store_true", default=False, help=_("Install all signed applications available")) options = parser.parse_args() if not options.appid and not options.all: parser.error(_("option %s: If you really want to install all the signed apps, use --all") % "all") config = common.read_config(options) output_dir = 'repo' if not os.path.isdir(output_dir): logging.info(_("No signed output directory - nothing to do")) sys.exit(0) if options.appid: vercodes = common.read_pkg_args(options.appid, True) common.get_metadata_files(vercodes) # only check appids apks = {appid: None for appid in vercodes} # Get the signed APK with the highest vercode for apkfile in sorted(glob.glob(os.path.join(output_dir, '*.apk'))): try: appid, vercode = common.publishednameinfo(apkfile) except FDroidException: continue if appid not in apks: continue if vercodes[appid] and vercode not in vercodes[appid]: continue apks[appid] = apkfile for appid, apk in apks.items(): if not apk: raise FDroidException(_("No signed APK available for %s") % appid) else: apks = {common.publishednameinfo(apkfile)[0]: apkfile for apkfile in sorted(glob.glob(os.path.join(output_dir, '*.apk')))} for appid, apk in apks.items(): # Get device list each time to avoid device not found errors devs = devices() if not devs: raise FDroidException(_("No attached devices found")) logging.info(_("Installing %s...") % apk) for dev in devs: logging.info(_("Installing '{apkfilename}' on {dev}...").format(apkfilename=apk, dev=dev)) p = SdkToolsPopen(['adb', "-s", dev, "install", apk]) fail = "" for line in p.output.splitlines(): if line.startswith("Failure"): fail = line[9:-1] if not fail: continue if fail == "INSTALL_FAILED_ALREADY_EXISTS": logging.warning(_('"{apkfilename}" is already installed on {dev}.') .format(apkfilename=apk, dev=dev)) else: raise FDroidException(_("Failed to install '{apkfilename}' on {dev}: {error}") .format(apkfilename=apk, dev=dev, error=fail)) logging.info('\n' + _('Finished')) if __name__ == "__main__": main() fdroidserver-2.1/fdroidserver/lint.py0000644000175000017500000006756014203004041017776 0ustar hanshans00000000000000#!/usr/bin/env python3 # # lint.py - part of the FDroid server tool # Copyright (C) 2013-2014 Daniel Martí # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See th # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public Licen # along with this program. If not, see . from argparse import ArgumentParser import re import sys import platform import urllib.parse from pathlib import Path from . import _ from . import common from . import metadata from . import rewritemeta config = None options = None def enforce_https(domain): return ( re.compile( r'^http://([^/]*\.)?' + re.escape(domain) + r'(/.*)?', re.IGNORECASE ), domain + " URLs should always use https://", ) https_enforcings = [ enforce_https('github.com'), enforce_https('gitlab.com'), enforce_https('bitbucket.org'), enforce_https('apache.org'), enforce_https('google.com'), enforce_https('git.code.sf.net'), enforce_https('svn.code.sf.net'), enforce_https('anongit.kde.org'), enforce_https('savannah.nongnu.org'), enforce_https('git.savannah.nongnu.org'), enforce_https('download.savannah.nongnu.org'), enforce_https('savannah.gnu.org'), enforce_https('git.savannah.gnu.org'), enforce_https('download.savannah.gnu.org'), enforce_https('github.io'), enforce_https('gitlab.io'), enforce_https('githubusercontent.com'), ] def forbid_shortener(domain): return ( re.compile(r'https?://[^/]*' + re.escape(domain) + r'/.*'), _("URL shorteners should not be used"), ) http_url_shorteners = [ forbid_shortener('1url.com'), forbid_shortener('adf.ly'), forbid_shortener('bc.vc'), forbid_shortener('bit.do'), forbid_shortener('bit.ly'), forbid_shortener('bitly.com'), forbid_shortener('budurl.com'), forbid_shortener('buzurl.com'), forbid_shortener('cli.gs'), forbid_shortener('cur.lv'), forbid_shortener('cutt.us'), forbid_shortener('db.tt'), forbid_shortener('filoops.info'), forbid_shortener('goo.gl'), forbid_shortener('is.gd'), forbid_shortener('ity.im'), forbid_shortener('j.mp'), forbid_shortener('l.gg'), forbid_shortener('lnkd.in'), forbid_shortener('moourl.com'), forbid_shortener('ow.ly'), forbid_shortener('para.pt'), forbid_shortener('po.st'), forbid_shortener('q.gs'), forbid_shortener('qr.ae'), forbid_shortener('qr.net'), forbid_shortener('rdlnk.com'), forbid_shortener('scrnch.me'), forbid_shortener('short.nr'), forbid_shortener('sn.im'), forbid_shortener('snipurl.com'), forbid_shortener('su.pr'), forbid_shortener('t.co'), forbid_shortener('tiny.cc'), forbid_shortener('tinyarrows.com'), forbid_shortener('tinyurl.com'), forbid_shortener('tr.im'), forbid_shortener('tweez.me'), forbid_shortener('twitthis.com'), forbid_shortener('twurl.nl'), forbid_shortener('tyn.ee'), forbid_shortener('u.bb'), forbid_shortener('u.to'), forbid_shortener('ur1.ca'), forbid_shortener('urlof.site'), forbid_shortener('v.gd'), forbid_shortener('vzturl.com'), forbid_shortener('x.co'), forbid_shortener('xrl.us'), forbid_shortener('yourls.org'), forbid_shortener('zip.net'), forbid_shortener('✩.ws'), forbid_shortener('➡.ws'), ] http_checks = ( https_enforcings + http_url_shorteners + [ ( re.compile(r'^(?!https?://)[^/]+'), _("URL must start with https:// or http://"), ), ( re.compile(r'^https://(github|gitlab)\.com(/[^/]+){2,3}\.git'), _("Appending .git is not necessary"), ), ( re.compile( r'^https://[^/]*(github|gitlab|bitbucket|rawgit|githubusercontent)\.[a-zA-Z]+/([^/]+/){2,3}master/' ), _("Use /HEAD instead of /master to point at a file in the default branch"), ), ] ) regex_checks = { 'WebSite': http_checks, 'SourceCode': http_checks, 'Repo': https_enforcings, 'UpdateCheckMode': https_enforcings, 'IssueTracker': http_checks + [ (re.compile(r'.*github\.com/[^/]+/[^/]+/*$'), _("/issues is missing")), (re.compile(r'.*gitlab\.com/[^/]+/[^/]+/*$'), _("/issues is missing")), ], 'Donate': http_checks + [ ( re.compile(r'.*flattr\.com'), _("Flattr donation methods belong in the FlattrID: field"), ), ( re.compile(r'.*liberapay\.com'), _("Liberapay donation methods belong in the Liberapay: field"), ), ( re.compile(r'.*opencollective\.com'), _("OpenCollective donation methods belong in the OpenCollective: field"), ), ], 'Changelog': http_checks, 'Author Name': [ (re.compile(r'^\s'), _("Unnecessary leading space")), (re.compile(r'.*\s$'), _("Unnecessary trailing space")), ], 'Summary': [ ( re.compile(r'.*\b(free software|open source)\b.*', re.IGNORECASE), _("No need to specify that the app is Free Software"), ), ( re.compile( r'.*((your|for).*android|android.*(app|device|client|port|version))', re.IGNORECASE, ), _("No need to specify that the app is for Android"), ), (re.compile(r'.*[a-z0-9][.!?]( |$)'), _("Punctuation should be avoided")), (re.compile(r'^\s'), _("Unnecessary leading space")), (re.compile(r'.*\s$'), _("Unnecessary trailing space")), ], 'Description': https_enforcings + http_url_shorteners + [ (re.compile(r'\s*[*#][^ .]'), _("Invalid bulleted list")), ( re.compile(r'https://f-droid.org/[a-z][a-z](_[A-Za-z]{2,4})?/'), _("Locale included in f-droid.org URL"), ), (re.compile(r'^\s'), _("Unnecessary leading space")), (re.compile(r'.*\s$'), _("Unnecessary trailing space")), ( re.compile( r'.*<(applet|base|body|button|embed|form|head|html|iframe|img|input|link|object|picture|script|source|style|svg|video).*', re.IGNORECASE, ), _("Forbidden HTML tags"), ), ( re.compile(r'''.*\s+src=["']javascript:.*'''), _("Javascript in HTML src attributes"), ), ], } locale_pattern = re.compile(r"[a-z]{2,3}(-([A-Z][a-zA-Z]+|\d+|[a-z]+))*") def check_regexes(app): for f, checks in regex_checks.items(): for m, r in checks: v = app.get(f) t = metadata.fieldtype(f) if t == metadata.TYPE_MULTILINE: for line in v.splitlines(): if m.match(line): yield "%s at line '%s': %s" % (f, line, r) else: if v is None: continue if m.match(v): yield "%s '%s': %s" % (f, v, r) def get_lastbuild(builds): lowest_vercode = -1 lastbuild = None for build in builds: if not build.disable: vercode = int(build.versionCode) if lowest_vercode == -1 or vercode < lowest_vercode: lowest_vercode = vercode if not lastbuild or int(build.versionCode) > int(lastbuild.versionCode): lastbuild = build return lastbuild def check_update_check_data_url(app): # noqa: D403 """UpdateCheckData must have a valid HTTPS URL to protect checkupdates runs.""" if app.UpdateCheckData and app.UpdateCheckMode == 'HTTP': urlcode, codeex, urlver, verex = app.UpdateCheckData.split('|') for url in (urlcode, urlver): if url != '.': parsed = urllib.parse.urlparse(url) if not parsed.scheme or not parsed.netloc: yield _('UpdateCheckData not a valid URL: {url}').format(url=url) if parsed.scheme != 'https': yield _('UpdateCheckData must use HTTPS URL: {url}').format(url=url) def check_vercode_operation(app): if app.VercodeOperation and not common.VERCODE_OPERATION_RE.match( app.VercodeOperation ): yield _('Invalid VercodeOperation: {field}').format(field=app.VercodeOperation) def check_ucm_tags(app): lastbuild = get_lastbuild(app.get('Builds', [])) if ( lastbuild is not None and lastbuild.commit and app.UpdateCheckMode == 'RepoManifest' and not lastbuild.commit.startswith('unknown') and lastbuild.versionCode == app.CurrentVersionCode and not lastbuild.forcevercode and any(s in lastbuild.commit for s in '.,_-/') ): yield _( "Last used commit '{commit}' looks like a tag, but UpdateCheckMode is '{ucm}'" ).format(commit=lastbuild.commit, ucm=app.UpdateCheckMode) def check_char_limits(app): limits = config['char_limits'] if len(app.Summary) > limits['summary']: yield _("Summary of length {length} is over the {limit} char limit").format( length=len(app.Summary), limit=limits['summary'] ) if len(app.Description) > limits['description']: yield _("Description of length {length} is over the {limit} char limit").format( length=len(app.Description), limit=limits['description'] ) def check_old_links(app): usual_sites = [ 'github.com', 'gitlab.com', 'bitbucket.org', ] old_sites = [ 'gitorious.org', 'code.google.com', ] if any(s in app.Repo for s in usual_sites): for f in ['WebSite', 'SourceCode', 'IssueTracker', 'Changelog']: v = app.get(f) if any(s in v for s in old_sites): yield _("App is in '{repo}' but has a link to {url}").format( repo=app.Repo, url=v ) def check_useless_fields(app): if app.UpdateCheckName == app.id: yield _("UpdateCheckName is set to the known application ID, it can be removed") filling_ucms = re.compile(r'^(Tags.*|RepoManifest.*)') def check_checkupdates_ran(app): if filling_ucms.match(app.UpdateCheckMode): if ( not app.AutoName and not app.CurrentVersion and app.CurrentVersionCode == '0' ): yield _( "UpdateCheckMode is set but it looks like checkupdates hasn't been run yet" ) def check_empty_fields(app): if not app.Categories: yield _("Categories are not set") all_categories = set( [ "Connectivity", "Development", "Games", "Graphics", "Internet", "Money", "Multimedia", "Navigation", "Phone & SMS", "Reading", "Science & Education", "Security", "Sports & Health", "System", "Theming", "Time", "Writing", ] ) def check_categories(app): for categ in app.Categories: if categ not in all_categories: yield _("Categories '%s' is not valid" % categ) def check_duplicates(app): links_seen = set() for f in ['Source Code', 'Web Site', 'Issue Tracker', 'Changelog']: v = app.get(f) if not v: continue v = v.lower() if v in links_seen: yield _("Duplicate link in '{field}': {url}").format(field=f, url=v) else: links_seen.add(v) name = common.get_app_display_name(app) if app.Summary and name: if app.Summary.lower() == name.lower(): yield _("Summary '%s' is just the app's name") % app.Summary if app.Summary and app.Description and len(app.Description) == 1: if app.Summary.lower() == app.Description[0].lower(): yield _("Description '%s' is just the app's summary") % app.Summary seenlines = set() for line in app.Description.splitlines(): if len(line) < 1: continue if line in seenlines: yield _("Description has a duplicate line") seenlines.add(line) desc_url = re.compile(r'(^|[^[])\[([^ ]+)( |\]|$)') def check_mediawiki_links(app): wholedesc = ' '.join(app.Description) for um in desc_url.finditer(wholedesc): url = um.group(1) for m, r in http_checks: if m.match(url): yield _("URL {url} in Description: {error}").format(url=url, error=r) def check_bulleted_lists(app): validchars = ['*', '#'] lchar = '' lcount = 0 for line in app.Description.splitlines(): if len(line) < 1: lcount = 0 continue if line[0] == lchar and line[1] == ' ': lcount += 1 if lcount > 2 and lchar not in validchars: yield _( "Description has a list (%s) but it isn't bulleted (*) nor numbered (#)" ) % lchar break else: lchar = line[0] lcount = 1 def check_builds(app): supported_flags = set(metadata.build_flags) # needed for YAML and JSON for build in app.get('Builds', []): if build.disable: if build.disable.startswith('Generated by import.py'): yield _( "Build generated by `fdroid import` - remove disable line once ready" ) continue for s in ['master', 'origin', 'HEAD', 'default', 'trunk']: if build.commit and build.commit.startswith(s): yield _( "Branch '{branch}' used as commit in build '{versionName}'" ).format(branch=s, versionName=build.versionName) for srclib in build.srclibs: if '@' in srclib: ref = srclib.split('@')[1].split('/')[0] if ref.startswith(s): yield _( "Branch '{branch}' used as commit in srclib '{srclib}'" ).format(branch=s, srclib=srclib) else: yield _( 'srclibs missing name and/or @' ) + ' (srclibs: ' + srclib + ')' for key in build.keys(): if key not in supported_flags: yield _('%s is not an accepted build field') % key def check_files_dir(app): dir_path = Path('metadata') / app.id if not dir_path.is_dir(): return files = set() for path in dir_path.iterdir(): name = path.name if not ( path.is_file() or name == 'signatures' or locale_pattern.fullmatch(name) ): yield _("Found non-file at %s") % path continue files.add(name) used = { 'signatures', } for build in app.get('Builds', []): for fname in build.patch: if fname not in files: yield _("Unknown file '{filename}' in build '{versionName}'").format( filename=fname, versionName=build.versionName ) else: used.add(fname) for name in files.difference(used): if locale_pattern.fullmatch(name): continue yield _("Unused file at %s") % (dir_path / name) def check_format(app): if options.format and not rewritemeta.proper_format(app): yield _("Run rewritemeta to fix formatting") def check_license_tag(app): """Ensure all license tags contain only valid/approved values.""" if config['lint_licenses'] is None: return if app.License not in config['lint_licenses']: if config['lint_licenses'] == APPROVED_LICENSES: yield _( 'Unexpected license tag "{}"! Only use FSF or OSI ' 'approved tags from https://spdx.org/license-list' ).format(app.License) else: yield _( 'Unexpected license tag "{}"! Only use license tags ' 'configured in your config file' ).format(app.License) def check_extlib_dir(apps): dir_path = Path('build/extlib') extlib_files = set() for path in dir_path.glob('**/*'): if path.is_file(): extlib_files.add(path.relative_to(dir_path)) used = set() for app in apps: for build in app.get('Builds', []): for path in build.extlibs: path = Path(path) if path not in extlib_files: yield _( "{appid}: Unknown extlib {path} in build '{versionName}'" ).format(appid=app.id, path=path, versionName=build.versionName) else: used.add(path) for path in extlib_files.difference(used): if path.name not in [ '.gitignore', 'source.txt', 'origin.txt', 'md5.txt', 'LICENSE', 'LICENSE.txt', 'COPYING', 'COPYING.txt', 'NOTICE', 'NOTICE.txt', ]: yield _("Unused extlib at %s") % (dir_path / path) def check_app_field_types(app): """Check the fields have valid data types.""" for field in app.keys(): v = app.get(field) t = metadata.fieldtype(field) if v is None: continue elif field == 'Builds': if not isinstance(v, list): yield ( _( "{appid}: {field} must be a '{type}', but it is a '{fieldtype}'!" ).format( appid=app.id, field=field, type='list', fieldtype=v.__class__.__name__, ) ) elif t == metadata.TYPE_LIST and not isinstance(v, list): yield ( _( "{appid}: {field} must be a '{type}', but it is a '{fieldtype}!'" ).format( appid=app.id, field=field, type='list', fieldtype=v.__class__.__name__, ) ) elif t == metadata.TYPE_STRING and not type(v) in (str, bool, dict): yield ( _( "{appid}: {field} must be a '{type}', but it is a '{fieldtype}'!" ).format( appid=app.id, field=field, type='str', fieldtype=v.__class__.__name__, ) ) def check_for_unsupported_metadata_files(basedir=""): """Check whether any non-metadata files are in metadata/.""" basedir = Path(basedir) global config if not (basedir / 'metadata').exists(): return False return_value = False for f in (basedir / 'metadata').iterdir(): if f.is_dir(): if not Path(str(f) + '.yml').exists(): print(_('"%s/" has no matching metadata file!') % f) return_value = True elif f.suffix == '.yml': packageName = f.stem if not common.is_valid_package_name(packageName): print( '"' + packageName + '" is an invalid package name!\n' + 'https://developer.android.com/studio/build/application-id' ) return_value = True else: print( _( '"{path}" is not a supported file format (use: metadata/*.yml)' ).format(path=f.relative_to(basedir)) ) return_value = True return return_value def check_current_version_code(app): """Check that the CurrentVersionCode is currently available.""" archive_policy = app.get('ArchivePolicy') if archive_policy and archive_policy.split()[0] == "0": return cv = app.get('CurrentVersionCode') if cv is not None and int(cv) == 0: return builds = app.get('Builds') active_builds = 0 min_versionCode = None if builds: for build in builds: vc = int(build['versionCode']) if min_versionCode is None or min_versionCode > vc: min_versionCode = vc if not build.get('disable'): active_builds += 1 if cv == build['versionCode']: break if active_builds == 0: return # all builds are disabled if cv is not None and int(cv) < min_versionCode: yield ( _( 'CurrentVersionCode {cv} is less than oldest build entry {versionCode}' ).format(cv=cv, versionCode=min_versionCode) ) def main(): global config, options # Parse command line... parser = ArgumentParser() common.setup_global_opts(parser) parser.add_argument( "-f", "--format", action="store_true", default=False, help=_("Also warn about formatting issues, like rewritemeta -l"), ) parser.add_argument( '--force-yamllint', action="store_true", default=False, help=_( "When linting the entire repository yamllint is disabled by default. " "This option forces yamllint regardless." ), ) parser.add_argument( "appid", nargs='*', help=_("application ID of file to operate on") ) metadata.add_metadata_arguments(parser) options = parser.parse_args() metadata.warnings_action = options.W config = common.read_config(options) # Get all apps... allapps = metadata.read_metadata(options.appid) apps = common.read_app_args(options.appid, allapps, False) anywarns = check_for_unsupported_metadata_files() apps_check_funcs = [] if not options.appid: # otherwise it finds tons of unused extlibs apps_check_funcs.append(check_extlib_dir) for check_func in apps_check_funcs: for warn in check_func(apps.values()): anywarns = True print(warn) for appid, app in apps.items(): if app.Disabled: continue if options.force_yamllint: import yamllint # throw error if it is not installed yamllint # make pyflakes ignore this # only run yamllint when linting individual apps. if options.appid or options.force_yamllint: # run yamllint on app metadata ymlpath = Path('metadata') / (appid + '.yml') if ymlpath.is_file(): yamllintresult = common.run_yamllint(ymlpath) if yamllintresult: print(yamllintresult) # run yamllint on srclib metadata srclibs = set() for build in app.get('Builds', []): for srclib in build.srclibs: name, _ref, _number, _subdir = common.parse_srclib_spec(srclib) srclibs.add(name + '.yml') for srclib in srclibs: srclibpath = Path('srclibs') / srclib if srclibpath.is_file(): if platform.system() == 'Windows': # Handle symlink on Windows symlink = srclibpath.read_text() if symlink in srclibs: continue elif (srclibpath.parent / symlink).is_file(): srclibpath = srclibpath.parent / symlink yamllintresult = common.run_yamllint(srclibpath) if yamllintresult: print(yamllintresult) app_check_funcs = [ check_app_field_types, check_regexes, check_update_check_data_url, check_vercode_operation, check_ucm_tags, check_char_limits, check_old_links, check_checkupdates_ran, check_useless_fields, check_empty_fields, check_categories, check_duplicates, check_mediawiki_links, check_bulleted_lists, check_builds, check_files_dir, check_format, check_license_tag, check_current_version_code, ] for check_func in app_check_funcs: for warn in check_func(app): anywarns = True print("%s: %s" % (appid, warn)) if anywarns: sys.exit(1) # A compiled, public domain list of official SPDX license tags. generated # using: `python3 -m spdx_license_list print --filter-fsf-or-osi` Only contains # licenes approved by either FSF to be free/libre software or OSI to be open # source APPROVED_LICENSES = [ '0BSD', 'AAL', 'AFL-1.1', 'AFL-1.2', 'AFL-2.0', 'AFL-2.1', 'AFL-3.0', 'AGPL-3.0-only', 'AGPL-3.0-or-later', 'APL-1.0', 'APSL-1.0', 'APSL-1.1', 'APSL-1.2', 'APSL-2.0', 'Apache-1.0', 'Apache-1.1', 'Apache-2.0', 'Artistic-1.0', 'Artistic-1.0-Perl', 'Artistic-1.0-cl8', 'Artistic-2.0', 'Beerware', 'BSD-1-Clause', 'BSD-2-Clause', 'BSD-2-Clause-FreeBSD', 'BSD-2-Clause-Patent', 'BSD-3-Clause', 'BSD-3-Clause-Clear', 'BSD-3-Clause-LBNL', 'BSD-4-Clause', 'BSL-1.0', 'BitTorrent-1.1', 'CAL-1.0', 'CAL-1.0-Combined-Work-Exception', 'CATOSL-1.1', 'CC-BY-4.0', 'CC-BY-SA-4.0', 'CC0-1.0', 'CDDL-1.0', 'CECILL-2.0', 'CECILL-2.1', 'CECILL-B', 'CECILL-C', 'CNRI-Python', 'CPAL-1.0', 'CPL-1.0', 'CUA-OPL-1.0', 'ClArtistic', 'Condor-1.1', 'ECL-1.0', 'ECL-2.0', 'EFL-1.0', 'EFL-2.0', 'EPL-1.0', 'EPL-2.0', 'EUDatagrid', 'EUPL-1.1', 'EUPL-1.2', 'Entessa', 'FSFAP', 'FTL', 'Fair', 'Frameworx-1.0', 'GFDL-1.1-only', 'GFDL-1.1-or-later', 'GFDL-1.2-only', 'GFDL-1.2-or-later', 'GFDL-1.3-only', 'GFDL-1.3-or-later', 'GPL-2.0-only', 'GPL-2.0-or-later', 'GPL-3.0-only', 'GPL-3.0-or-later', 'HPND', 'IJG', 'IPA', 'IPL-1.0', 'ISC', 'Imlib2', 'Intel', 'LGPL-2.0-only', 'LGPL-2.0-or-later', 'LGPL-2.1-only', 'LGPL-2.1-or-later', 'LGPL-3.0-only', 'LGPL-3.0-or-later', 'LPL-1.0', 'LPL-1.02', 'LPPL-1.2', 'LPPL-1.3a', 'LPPL-1.3c', 'LiLiQ-P-1.1', 'LiLiQ-R-1.1', 'LiLiQ-Rplus-1.1', 'MIT', 'MIT-0', 'MIT-CMU', 'MPL-1.0', 'MPL-1.1', 'MPL-2.0', 'MPL-2.0-no-copyleft-exception', 'MS-PL', 'MS-RL', 'MirOS', 'Motosoto', 'MulanPSL-2.0', 'Multics', 'NASA-1.3', 'NCSA', 'NGPL', 'NOSL', 'NPL-1.0', 'NPL-1.1', 'NPOSL-3.0', 'NTP', 'Naumen', 'Nokia', 'OCLC-2.0', 'ODbL-1.0', 'OFL-1.0', 'OFL-1.1', 'OFL-1.1-RFN', 'OFL-1.1-no-RFN', 'OGTSL', 'OLDAP-2.3', 'OLDAP-2.7', 'OLDAP-2.8', 'OSET-PL-2.1', 'OSL-1.0', 'OSL-1.1', 'OSL-2.0', 'OSL-2.1', 'OSL-3.0', 'OpenSSL', 'PHP-3.0', 'PHP-3.01', 'PostgreSQL', 'Python-2.0', 'QPL-1.0', 'RPL-1.1', 'RPL-1.5', 'RPSL-1.0', 'RSCPL', 'Ruby', 'SGI-B-2.0', 'SISSL', 'SMLNJ', 'SPL-1.0', 'SimPL-2.0', 'Sleepycat', 'UCL-1.0', 'UPL-1.0', 'Unicode-DFS-2016', 'Unlicense', 'VSL-1.0', 'Vim', 'W3C', 'WTFPL', 'Watcom-1.0', 'X11', 'XFree86-1.1', 'Xnet', 'XSkat', 'YPL-1.1', 'ZPL-2.0', 'ZPL-2.1', 'Zend-2.0', 'Zimbra-1.3', 'Zlib', 'gnuplot', 'iMatix', 'xinetd', ] # an F-Droid addition, until we can enforce a better option APPROVED_LICENSES.append("PublicDomain") if __name__ == "__main__": main() fdroidserver-2.1/fdroidserver/metadata.py0000644000175000017500000010071014203004041020571 0ustar hanshans00000000000000#!/usr/bin/env python3 # # metadata.py - part of the FDroid server tools # Copyright (C) 2013, Ciaran Gultnieks, ciaran@ciarang.com # Copyright (C) 2013-2014 Daniel Martí # Copyright (C) 2017-2018 Michael Pöhn # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . import git from pathlib import Path import platform import re import logging import yaml try: from yaml import CSafeLoader as SafeLoader except ImportError: from yaml import SafeLoader import importlib from collections import OrderedDict from . import common from . import _ from .exception import MetaDataException, FDroidException srclibs = None warnings_action = None # validates usernames based on a loose collection of rules from GitHub, GitLab, # Liberapay and issuehunt. This is mostly to block abuse. VALID_USERNAME_REGEX = re.compile(r'^[a-z\d](?:[a-z\d/._-]){0,38}$', re.IGNORECASE) def _warn_or_exception(value, cause=None): """Output warning or Exception depending on -W.""" if warnings_action == 'ignore': pass elif warnings_action == 'error': if cause: raise MetaDataException(value) from cause else: raise MetaDataException(value) else: logging.warning(value) yaml_app_field_order = [ 'Disabled', 'AntiFeatures', 'Categories', 'License', 'AuthorName', 'AuthorEmail', 'AuthorWebSite', 'WebSite', 'SourceCode', 'IssueTracker', 'Translation', 'Changelog', 'Donate', 'FlattrID', 'Liberapay', 'LiberapayID', 'OpenCollective', 'Bitcoin', 'Litecoin', '\n', 'Name', 'AutoName', 'Summary', 'Description', '\n', 'RequiresRoot', '\n', 'RepoType', 'Repo', 'Binaries', '\n', 'Builds', '\n', 'AllowedAPKSigningKeys', '\n', 'MaintainerNotes', '\n', 'ArchivePolicy', 'AutoUpdateMode', 'UpdateCheckMode', 'UpdateCheckIgnore', 'VercodeOperation', 'UpdateCheckName', 'UpdateCheckData', 'CurrentVersion', 'CurrentVersionCode', '\n', 'NoSourceSince', ] yaml_app_fields = [x for x in yaml_app_field_order if x != '\n'] class App(dict): def __init__(self, copydict=None): if copydict: super().__init__(copydict) return super().__init__() self.Disabled = None self.AntiFeatures = [] self.Provides = None self.Categories = [] self.License = 'Unknown' self.AuthorName = None self.AuthorEmail = None self.AuthorWebSite = None self.WebSite = '' self.SourceCode = '' self.IssueTracker = '' self.Translation = '' self.Changelog = '' self.Donate = None self.FlattrID = None self.Liberapay = None self.LiberapayID = None self.OpenCollective = None self.Bitcoin = None self.Litecoin = None self.Name = None self.AutoName = '' self.Summary = '' self.Description = '' self.RequiresRoot = False self.RepoType = '' self.Repo = '' self.Binaries = None self.AllowedAPKSigningKeys = [] self.MaintainerNotes = '' self.ArchivePolicy = None self.AutoUpdateMode = 'None' self.UpdateCheckMode = 'None' self.UpdateCheckIgnore = None self.VercodeOperation = None self.UpdateCheckName = None self.UpdateCheckData = None self.CurrentVersion = '' self.CurrentVersionCode = None self.NoSourceSince = '' self.id = None self.metadatapath = None self.Builds = [] self.comments = {} self.added = None self.lastUpdated = None def __getattr__(self, name): if name in self: return self[name] else: raise AttributeError("No such attribute: " + name) def __setattr__(self, name, value): self[name] = value def __delattr__(self, name): if name in self: del self[name] else: raise AttributeError("No such attribute: " + name) def get_last_build(self): if len(self.Builds) > 0: return self.Builds[-1] else: return Build() TYPE_STRING = 2 TYPE_BOOL = 3 TYPE_LIST = 4 TYPE_SCRIPT = 5 TYPE_MULTILINE = 6 TYPE_BUILD = 7 TYPE_INT = 8 fieldtypes = { 'Description': TYPE_MULTILINE, 'MaintainerNotes': TYPE_MULTILINE, 'Categories': TYPE_LIST, 'AntiFeatures': TYPE_LIST, 'AllowedAPKSigningKeys': TYPE_LIST, 'Build': TYPE_BUILD, } def fieldtype(name): name = name.replace(' ', '') if name in fieldtypes: return fieldtypes[name] return TYPE_STRING # In the order in which they are laid out on files build_flags = [ 'versionName', 'versionCode', 'disable', 'commit', 'timeout', 'subdir', 'submodules', 'sudo', 'init', 'patch', 'gradle', 'maven', 'buildozer', 'output', 'srclibs', 'oldsdkloc', 'encoding', 'forceversion', 'forcevercode', 'rm', 'extlibs', 'prebuild', 'androidupdate', 'target', 'scanignore', 'scandelete', 'build', 'buildjni', 'ndk', 'preassemble', 'gradleprops', 'antcommands', 'novcheck', 'antifeatures', ] class Build(dict): def __init__(self, copydict=None): super().__init__() self.disable = '' self.commit = None self.timeout = None self.subdir = None self.submodules = False self.sudo = '' self.init = '' self.patch = [] self.gradle = [] self.maven = False self.buildozer = False self.output = None self.srclibs = [] self.oldsdkloc = False self.encoding = None self.forceversion = False self.forcevercode = False self.rm = [] self.extlibs = [] self.prebuild = '' self.androidupdate = [] self.target = None self.scanignore = [] self.scandelete = [] self.build = '' self.buildjni = [] self.ndk = None self.preassemble = [] self.gradleprops = [] self.antcommands = [] self.novcheck = False self.antifeatures = [] if copydict: super().__init__(copydict) return def __getattr__(self, name): if name in self: return self[name] else: raise AttributeError("No such attribute: " + name) def __setattr__(self, name, value): self[name] = value def __delattr__(self, name): if name in self: del self[name] else: raise AttributeError("No such attribute: " + name) def build_method(self): for f in ['maven', 'gradle', 'buildozer']: if self.get(f): return f if self.output: return 'raw' return 'ant' # like build_method, but prioritize output= def output_method(self): if self.output: return 'raw' for f in ['maven', 'gradle', 'buildozer']: if self.get(f): return f return 'ant' def ndk_path(self): """Return the path to the first configured NDK or an empty string.""" ndk = self.ndk if isinstance(ndk, list): ndk = self.ndk[0] return common.config['ndk_paths'].get(ndk, '') flagtypes = { 'versionCode': TYPE_INT, 'extlibs': TYPE_LIST, 'srclibs': TYPE_LIST, 'patch': TYPE_LIST, 'rm': TYPE_LIST, 'buildjni': TYPE_LIST, 'preassemble': TYPE_LIST, 'androidupdate': TYPE_LIST, 'scanignore': TYPE_LIST, 'scandelete': TYPE_LIST, 'gradle': TYPE_LIST, 'antcommands': TYPE_LIST, 'gradleprops': TYPE_LIST, 'sudo': TYPE_SCRIPT, 'init': TYPE_SCRIPT, 'prebuild': TYPE_SCRIPT, 'build': TYPE_SCRIPT, 'submodules': TYPE_BOOL, 'oldsdkloc': TYPE_BOOL, 'forceversion': TYPE_BOOL, 'forcevercode': TYPE_BOOL, 'novcheck': TYPE_BOOL, 'antifeatures': TYPE_LIST, 'timeout': TYPE_INT, } def flagtype(name): if name in flagtypes: return flagtypes[name] return TYPE_STRING class FieldValidator(): """Designate App metadata field types and checks that it matches. 'name' - The long name of the field type 'matching' - List of possible values or regex expression 'sep' - Separator to use if value may be a list 'fields' - Metadata fields (Field:Value) of this type """ def __init__(self, name, matching, fields): self.name = name self.matching = matching self.compiled = re.compile(matching) self.fields = fields def check(self, v, appid): if not v: return if type(v) == list: values = v else: values = [v] for v in values: if not self.compiled.match(v): _warn_or_exception(_("'{value}' is not a valid {field} in {appid}. Regex pattern: {pattern}") .format(value=v, field=self.name, appid=appid, pattern=self.matching)) # Generic value types valuetypes = { FieldValidator("Flattr ID", r'^[0-9a-z]+$', ['FlattrID']), FieldValidator("Liberapay", VALID_USERNAME_REGEX, ['Liberapay']), FieldValidator("Liberapay ID", r'^[0-9]+$', ['LiberapayID']), FieldValidator("Open Collective", VALID_USERNAME_REGEX, ['OpenCollective']), FieldValidator("HTTP link", r'^http[s]?://', ["WebSite", "SourceCode", "IssueTracker", "Translation", "Changelog", "Donate"]), FieldValidator("Email", r'^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$', ["AuthorEmail"]), FieldValidator("Bitcoin address", r'^(bc1|[13])[a-zA-HJ-NP-Z0-9]{25,39}$', ["Bitcoin"]), FieldValidator("Litecoin address", r'^[LM3][a-km-zA-HJ-NP-Z1-9]{26,33}$', ["Litecoin"]), FieldValidator("Repo Type", r'^(git|git-svn|svn|hg|bzr|srclib)$', ["RepoType"]), FieldValidator("Binaries", r'^http[s]?://', ["Binaries"]), FieldValidator("AllowedAPKSigningKeys", r'^[a-fA-F0-9]{64}$', ["AllowedAPKSigningKeys"]), FieldValidator("Archive Policy", r'^[0-9]+ versions$', ["ArchivePolicy"]), FieldValidator("Anti-Feature", r'^(Ads|Tracking|NonFreeNet|NonFreeDep|NonFreeAdd|UpstreamNonFree|NonFreeAssets|KnownVuln|ApplicationDebuggable|NoSourceSince|NSFW)$', ["AntiFeatures"]), FieldValidator("Auto Update Mode", r"^(Version.*|None)$", ["AutoUpdateMode"]), FieldValidator("Update Check Mode", r"^(Tags|Tags .+|RepoManifest|RepoManifest/.+|RepoTrunk|HTTP|Static|None)$", ["UpdateCheckMode"]) } # Check an app's metadata information for integrity errors def check_metadata(app): for v in valuetypes: for k in v.fields: v.check(app[k], app.id) def parse_yaml_srclib(metadatapath): thisinfo = {'RepoType': '', 'Repo': '', 'Subdir': None, 'Prepare': None} if not metadatapath.exists(): _warn_or_exception(_("Invalid scrlib metadata: '{file}' " "does not exist" .format(file=metadatapath))) return thisinfo with metadatapath.open("r", encoding="utf-8") as f: try: data = yaml.load(f, Loader=SafeLoader) if type(data) is not dict: if platform.system() == 'Windows': # Handle symlink on Windows symlink = metadatapath.parent / metadatapath.read_text(encoding='utf-8') if symlink.is_file(): with symlink.open("r", encoding="utf-8") as s: data = yaml.load(s, Loader=SafeLoader) if type(data) is not dict: raise yaml.error.YAMLError(_('{file} is blank or corrupt!') .format(file=metadatapath)) except yaml.error.YAMLError as e: _warn_or_exception(_("Invalid srclib metadata: could not " "parse '{file}'") .format(file=metadatapath) + '\n' + common.run_yamllint(metadatapath, indent=4), cause=e) return thisinfo for key in data.keys(): if key not in thisinfo.keys(): _warn_or_exception(_("Invalid srclib metadata: unknown key " "'{key}' in '{file}'") .format(key=key, file=metadatapath)) return thisinfo else: if key == 'Subdir': if isinstance(data[key], str): thisinfo[key] = data[key].split(',') elif isinstance(data[key], list): thisinfo[key] = data[key] elif data[key] is None: thisinfo[key] = [''] elif key == 'Prepare' and isinstance(data[key], list): thisinfo[key] = ' && '.join(data[key]) else: thisinfo[key] = str(data[key] or '') return thisinfo def read_srclibs(): """Read all srclib metadata. The information read will be accessible as metadata.srclibs, which is a dictionary, keyed on srclib name, with the values each being a dictionary in the same format as that returned by the parse_yaml_srclib function. A MetaDataException is raised if there are any problems with the srclib metadata. """ global srclibs # They were already loaded if srclibs is not None: return srclibs = {} srcdir = Path('srclibs') srcdir.mkdir(exist_ok=True) for metadatapath in sorted(srcdir.glob('*.yml')): srclibs[metadatapath.stem] = parse_yaml_srclib(metadatapath) def read_metadata(appids={}, sort_by_time=False): """Return a list of App instances sorted newest first. This reads all of the metadata files in a 'data' repository, then builds a list of App instances from those files. The list is sorted based on creation time, newest first. Most of the time, the newer files are the most interesting. appids is a dict with appids a keys and versionCodes as values. """ # Always read the srclibs before the apps, since they can use a srlib as # their source repository. read_srclibs() apps = OrderedDict() for basedir in ('metadata', 'tmp'): Path(basedir).mkdir(exist_ok=True) if appids: vercodes = common.read_pkg_args(appids) metadatafiles = common.get_metadata_files(vercodes) else: metadatafiles = list(Path('metadata').glob('*.yml')) + list( Path('.').glob('.fdroid.yml')) if sort_by_time: entries = ((path.stat().st_mtime, path) for path in metadatafiles) metadatafiles = [] for _ignored, path in sorted(entries, reverse=True): metadatafiles.append(path) else: # most things want the index alpha sorted for stability metadatafiles = sorted(metadatafiles) for metadatapath in metadatafiles: appid = metadatapath.stem if appid != '.fdroid' and not common.is_valid_package_name(appid): _warn_or_exception(_("{appid} from {path} is not a valid Java Package Name!") .format(appid=appid, path=metadatapath)) if appid in apps: _warn_or_exception(_("Found multiple metadata files for {appid}") .format(appid=appid)) app = parse_metadata(metadatapath) check_metadata(app) apps[app.id] = app return apps # Port legacy ';' separators list_sep = re.compile(r'[,;]') def split_list_values(s): res = [] for v in re.split(list_sep, s): if not v: continue v = v.strip() if not v: continue res.append(v) return res def sorted_builds(builds): return sorted(builds, key=lambda build: int(build.versionCode)) esc_newlines = re.compile(r'\\( |\n)') def post_metadata_parse(app): # TODO keep native types, convert only for .txt metadata for k, v in app.items(): if type(v) in (float, int): app[k] = str(v) if 'flavours' in app and app['flavours'] == [True]: app['flavours'] = 'yes' for field, fieldtype in fieldtypes.items(): if fieldtype != TYPE_LIST: continue value = app.get(field) if isinstance(value, str): app[field] = [value, ] elif value is not None: app[field] = [str(i) for i in value] def _yaml_bool_unmapable(v): return v in (True, False, [True], [False]) def _yaml_bool_unmap(v): if v is True: return 'yes' elif v is False: return 'no' elif v == [True]: return ['yes'] elif v == [False]: return ['no'] _bool_allowed = ('maven', 'buildozer') builds = [] if 'Builds' in app: for build in app.get('Builds', []): if not isinstance(build, Build): build = Build(build) for k, v in build.items(): if not (v is None): if flagtype(k) == TYPE_LIST: if _yaml_bool_unmapable(v): build[k] = _yaml_bool_unmap(v) if isinstance(v, str): build[k] = [v] elif isinstance(v, bool): if v: build[k] = ['yes'] else: build[k] = [] elif flagtype(k) is TYPE_INT: build[k] = str(v) elif flagtype(k) is TYPE_STRING: if isinstance(v, bool) and k in _bool_allowed: build[k] = v else: if _yaml_bool_unmapable(v): build[k] = _yaml_bool_unmap(v) else: build[k] = str(v) builds.append(build) app['Builds'] = sorted_builds(builds) # Parse metadata for a single application. # # 'metadatapath' - the file path to read. The "Application ID" aka # "Package Name" for the application comes from this # filename. Pass None to get a blank entry. # # Returns a dictionary containing all the details of the application. There are # two major kinds of information in the dictionary. Keys beginning with capital # letters correspond directory to identically named keys in the metadata file. # Keys beginning with lower case letters are generated in one way or another, # and are not found verbatim in the metadata. # # Known keys not originating from the metadata are: # # 'comments' - a list of comments from the metadata file. Each is # a list of the form [field, comment] where field is # the name of the field it preceded in the metadata # file. Where field is None, the comment goes at the # end of the file. Alternatively, 'build:version' is # for a comment before a particular build version. # 'descriptionlines' - original lines of description as formatted in the # metadata file. # bool_true = re.compile(r'([Yy]es|[Tt]rue)') bool_false = re.compile(r'([Nn]o|[Ff]alse)') def _decode_bool(s): if bool_true.match(s): return True if bool_false.match(s): return False _warn_or_exception(_("Invalid boolean '%s'") % s) def parse_metadata(metadatapath): """Parse metadata file, also checking the source repo for .fdroid.yml. If this is a metadata file from fdroiddata, it will first load the source repo type and URL from fdroiddata, then read .fdroid.yml if it exists, then include the rest of the metadata as specified in fdroiddata, so that fdroiddata has precedence over the metadata in the source code. """ metadatapath = Path(metadatapath) app = App() app.metadatapath = metadatapath.as_posix() name = metadatapath.stem if name != '.fdroid': app.id = name if metadatapath.suffix == '.yml': with metadatapath.open('r', encoding='utf-8') as mf: parse_yaml_metadata(mf, app) else: _warn_or_exception(_('Unknown metadata format: {path} (use: *.yml)') .format(path=metadatapath)) if metadatapath.name != '.fdroid.yml' and app.Repo: build_dir = common.get_build_dir(app) metadata_in_repo = build_dir / '.fdroid.yml' if metadata_in_repo.is_file(): try: # TODO: Python3.6: Should accept path-like commit_id = common.get_head_commit_id(git.Repo(str(build_dir))) logging.debug(_('Including metadata from %s@%s') % (metadata_in_repo, commit_id)) except git.exc.InvalidGitRepositoryError: logging.debug(_('Including metadata from {path}').format(metadata_in_repo)) app_in_repo = parse_metadata(metadata_in_repo) for k, v in app_in_repo.items(): if k not in app: app[k] = v post_metadata_parse(app) if not app.id: if app.get('Builds'): build = app['Builds'][-1] if build.subdir: root_dir = Path(build.subdir) else: root_dir = Path('.') paths = common.manifest_paths(root_dir, build.gradle) _ignored, _ignored, app.id = common.parse_androidmanifests(paths, app) return app def parse_yaml_metadata(mf, app): """Parse the .yml file and post-process it. Clean metadata .yml files can be used directly, but in order to make a better user experience for people editing .yml files, there is post processing. .fdroid.yml is embedded in the app's source repo, so it is "user-generated". That means that it can have weird things in it that need to be removed so they don't break the overall process. """ try: yamldata = yaml.load(mf, Loader=SafeLoader) except yaml.YAMLError as e: _warn_or_exception(_("could not parse '{path}'") .format(path=mf.name) + '\n' + common.run_yamllint(mf.name, indent=4), cause=e) deprecated_in_yaml = ['Provides'] if yamldata: for field in tuple(yamldata.keys()): if field not in yaml_app_fields + deprecated_in_yaml: msg = (_("Unrecognised app field '{fieldname}' in '{path}'") .format(fieldname=field, path=mf.name)) if Path(mf.name).name == '.fdroid.yml': logging.error(msg) del yamldata[field] else: _warn_or_exception(msg) for deprecated_field in deprecated_in_yaml: if deprecated_field in yamldata: logging.warning(_("Ignoring '{field}' in '{metapath}' " "metadata because it is deprecated.") .format(field=deprecated_field, metapath=mf.name)) del(yamldata[deprecated_field]) if yamldata.get('Builds', None): for build in yamldata.get('Builds', []): # put all build flag keywords into a set to avoid # excessive looping action build_flag_set = set() for build_flag in build.keys(): build_flag_set.add(build_flag) for build_flag in build_flag_set: if build_flag not in build_flags: _warn_or_exception( _("Unrecognised build flag '{build_flag}' " "in '{path}'").format(build_flag=build_flag, path=mf.name)) post_parse_yaml_metadata(yamldata) app.update(yamldata) return app def post_parse_yaml_metadata(yamldata): """Transform yaml metadata to our internal data format.""" for build in yamldata.get('Builds', []): for flag in build.keys(): _flagtype = flagtype(flag) if _flagtype is TYPE_SCRIPT: # concatenate script flags into a single string if they are stored as list if isinstance(build[flag], list): build[flag] = ' && '.join(build[flag]) elif _flagtype is TYPE_STRING: # things like versionNames are strings, but without quotes can be numbers if isinstance(build[flag], float) or isinstance(build[flag], int): build[flag] = str(build[flag]) elif _flagtype is TYPE_INT: # versionCode must be int if not isinstance(build[flag], int): _warn_or_exception(_('{build_flag} must be an integer, found: {value}') .format(build_flag=flag, value=build[flag])) def write_yaml(mf, app): """Write metadata in yaml format. Parameters ---------- mf active file discriptor for writing app app metadata to written to the yaml file """ # import rumael.yaml and check version try: import ruamel.yaml except ImportError as e: raise FDroidException('ruamel.yaml not installed, can not write metadata.') from e if not ruamel.yaml.__version__: raise FDroidException('ruamel.yaml.__version__ not accessible. Please make sure a ruamel.yaml >= 0.13 is installed..') m = re.match(r'(?P[0-9]+)\.(?P[0-9]+)\.(?P[0-9]+)(-.+)?', ruamel.yaml.__version__) if not m: raise FDroidException('ruamel.yaml version malfored, please install an upstream version of ruamel.yaml') if int(m.group('major')) < 0 or int(m.group('minor')) < 13: raise FDroidException('currently installed version of ruamel.yaml ({}) is too old, >= 1.13 required.'.format(ruamel.yaml.__version__)) # suiteable version ruamel.yaml imported successfully _yaml_bools_true = ('y', 'Y', 'yes', 'Yes', 'YES', 'true', 'True', 'TRUE', 'on', 'On', 'ON') _yaml_bools_false = ('n', 'N', 'no', 'No', 'NO', 'false', 'False', 'FALSE', 'off', 'Off', 'OFF') _yaml_bools_plus_lists = [] _yaml_bools_plus_lists.extend(_yaml_bools_true) _yaml_bools_plus_lists.extend([[x] for x in _yaml_bools_true]) _yaml_bools_plus_lists.extend(_yaml_bools_false) _yaml_bools_plus_lists.extend([[x] for x in _yaml_bools_false]) def _field_to_yaml(typ, value): if typ is TYPE_STRING: if value in _yaml_bools_plus_lists: return ruamel.yaml.scalarstring.SingleQuotedScalarString(str(value)) return str(value) elif typ is TYPE_INT: return int(value) elif typ is TYPE_MULTILINE: if '\n' in value: return ruamel.yaml.scalarstring.preserve_literal(str(value)) else: return str(value) elif typ is TYPE_SCRIPT: if type(value) == list: if len(value) == 1: return value[0] else: return value else: script_lines = value.split(' && ') if len(script_lines) > 1: return script_lines else: return value else: return value def _app_to_yaml(app): cm = ruamel.yaml.comments.CommentedMap() insert_newline = False for field in yaml_app_field_order: if field == '\n': # next iteration will need to insert a newline insert_newline = True else: if app.get(field) or field == 'Builds': if field == 'Builds': if app.get('Builds'): cm.update({field: _builds_to_yaml(app)}) elif field == 'CurrentVersionCode': cm.update({field: _field_to_yaml(TYPE_INT, getattr(app, field))}) elif field == 'AllowedAPKSigningKeys': value = getattr(app, field) if value: value = [str(i).lower() for i in value] if len(value) == 1: cm.update({field: _field_to_yaml(TYPE_STRING, value[0])}) else: cm.update({field: _field_to_yaml(TYPE_LIST, value)}) else: cm.update({field: _field_to_yaml(fieldtype(field), getattr(app, field))}) if insert_newline: # we need to prepend a newline in front of this field insert_newline = False # inserting empty lines is not supported so we add a # bogus comment and over-write its value cm.yaml_set_comment_before_after_key(field, 'bogus') cm.ca.items[field][1][-1].value = '\n' return cm def _builds_to_yaml(app): builds = ruamel.yaml.comments.CommentedSeq() for build in app.get('Builds', []): if not isinstance(build, Build): build = Build(build) b = ruamel.yaml.comments.CommentedMap() for field in build_flags: value = getattr(build, field) if hasattr(build, field) and value: if field == 'gradle' and value == ['off']: value = [ruamel.yaml.scalarstring.SingleQuotedScalarString('off')] if field in ('maven', 'buildozer'): if value == 'no': continue elif value == 'yes': value = 'yes' b.update({field: _field_to_yaml(flagtype(field), value)}) builds.append(b) # insert extra empty lines between build entries for i in range(1, len(builds)): builds.yaml_set_comment_before_after_key(i, 'bogus') builds.ca.items[i][1][-1].value = '\n' return builds yaml_app = _app_to_yaml(app) try: yaml = ruamel.yaml.YAML() yaml.indent(mapping=4, sequence=4, offset=2) yaml.dump(yaml_app, stream=mf) except AttributeError: # Debian/stretch's version does not have YAML() ruamel.yaml.round_trip_dump(yaml_app, mf, indent=4, block_seq_indent=2) build_line_sep = re.compile(r'(? # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . import os import requests HEADERS = {'User-Agent': 'F-Droid'} def download_file(url, local_filename=None, dldir='tmp'): filename = url.split('/')[-1] if local_filename is None: local_filename = os.path.join(dldir, filename) # the stream=True parameter keeps memory usage low r = requests.get(url, stream=True, allow_redirects=True, headers=HEADERS) r.raise_for_status() with open(local_filename, 'wb') as f: for chunk in r.iter_content(chunk_size=1024): if chunk: # filter out keep-alive new chunks f.write(chunk) f.flush() return local_filename def http_get(url, etag=None, timeout=600): """Download the content from the given URL by making a GET request. If an ETag is given, it will do a HEAD request first, to see if the content changed. Parameters ---------- url The URL to download from. etag The last ETag to be used for the request (optional). Returns ------- A tuple consisting of: - The raw content that was downloaded or None if it did not change - The new eTag as returned by the HTTP request """ # TODO disable TLS Session IDs and TLS Session Tickets # (plain text cookie visible to anyone who can see the network traffic) if etag: r = requests.head(url, headers=HEADERS, timeout=timeout) r.raise_for_status() if 'ETag' in r.headers and etag == r.headers['ETag']: return None, etag r = requests.get(url, headers=HEADERS, timeout=timeout) r.raise_for_status() new_etag = None if 'ETag' in r.headers: new_etag = r.headers['ETag'] return r.content, new_etag fdroidserver-2.1/fdroidserver/nightly.py0000644000175000017500000004251614205260731020514 0ustar hanshans00000000000000#!/usr/bin/env python3 # # nightly.py - part of the FDroid server tools # Copyright (C) 2017 Hans-Christoph Steiner # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . import base64 import datetime import git import hashlib import logging import os import paramiko import platform import shutil import subprocess import sys import tempfile import yaml from urllib.parse import urlparse from argparse import ArgumentParser from . import _ from . import common # hard coded defaults for Android ~/.android/debug.keystore files # https://developers.google.com/android/guides/client-auth KEYSTORE_FILE = os.path.join(os.getenv('HOME'), '.android', 'debug.keystore') PASSWORD = 'android' # nosec B105 standard hardcoded password for debug keystores KEY_ALIAS = 'androiddebugkey' DISTINGUISHED_NAME = 'CN=Android Debug,O=Android,C=US' # standard suffix for naming fdroid git repos NIGHTLY = '-nightly' def _ssh_key_from_debug_keystore(keystore=KEYSTORE_FILE): tmp_dir = tempfile.mkdtemp(prefix='.') privkey = os.path.join(tmp_dir, '.privkey') key_pem = os.path.join(tmp_dir, '.key.pem') p12 = os.path.join(tmp_dir, '.keystore.p12') _config = dict() common.fill_config_defaults(_config) subprocess.check_call( [ _config['keytool'], '-importkeystore', '-srckeystore', keystore, '-srcalias', KEY_ALIAS, '-srcstorepass', PASSWORD, '-srckeypass', PASSWORD, '-destkeystore', p12, '-destalias', KEY_ALIAS, '-deststorepass', PASSWORD, '-destkeypass', PASSWORD, '-deststoretype', 'PKCS12', ], env={'LC_ALL': 'C.UTF-8'}, ) subprocess.check_call( [ 'openssl', 'pkcs12', '-in', p12, '-out', key_pem, '-passin', 'pass:' + PASSWORD, '-passout', 'pass:' + PASSWORD, ], env={'LC_ALL': 'C.UTF-8'}, ) subprocess.check_call( [ 'openssl', 'rsa', '-in', key_pem, '-out', privkey, '-passin', 'pass:' + PASSWORD, ], env={'LC_ALL': 'C.UTF-8'}, ) os.remove(key_pem) os.remove(p12) os.chmod(privkey, 0o600) # os.umask() should cover this, but just in case rsakey = paramiko.RSAKey.from_private_key_file(privkey) fingerprint = ( base64.b64encode(hashlib.sha256(rsakey.asbytes()).digest()) .decode('ascii') .rstrip('=') ) ssh_private_key_file = os.path.join( tmp_dir, 'debug_keystore_' + fingerprint.replace('/', '_') + '_id_rsa' ) shutil.move(privkey, ssh_private_key_file) pub = rsakey.get_name() + ' ' + rsakey.get_base64() + ' ' + ssh_private_key_file with open(ssh_private_key_file + '.pub', 'w') as fp: fp.write(pub) logging.info(_('\nSSH public key to be used as deploy key:') + '\n' + pub) return ssh_private_key_file def main(): parser = ArgumentParser() common.setup_global_opts(parser) parser.add_argument( "--keystore", default=KEYSTORE_FILE, help=_("Specify which debug keystore file to use."), ) parser.add_argument( "--show-secret-var", action="store_true", default=False, help=_("Print the secret variable to the terminal for easy copy/paste"), ) parser.add_argument( "--keep-private-keys", action="store_true", default=False, help=_("Do not remove the private keys generated from the keystore"), ) parser.add_argument( "--no-deploy", action="store_true", default=False, help=_("Do not deploy the new files to the repo"), ) parser.add_argument( "--file", default='app/build/outputs/apk/*.apk', help=_('The file to be included in the repo (path or glob)'), ) parser.add_argument( "--no-checksum", action="store_true", default=False, help=_("Don't use rsync checksums"), ) parser.add_argument( "--archive-older", type=int, default=20, help=_("Set maximum releases in repo before older ones are archived"), ) # TODO add --with-btlog options = parser.parse_args() # force a tighter umask since this writes private key material umask = os.umask(0o077) if 'CI' in os.environ: v = os.getenv('DEBUG_KEYSTORE') debug_keystore = None if v: debug_keystore = base64.b64decode(v) if not debug_keystore: logging.error(_('DEBUG_KEYSTORE is not set or the value is incomplete')) sys.exit(1) os.makedirs(os.path.dirname(KEYSTORE_FILE), exist_ok=True) if os.path.exists(KEYSTORE_FILE): logging.warning(_('overwriting existing {path}').format(path=KEYSTORE_FILE)) with open(KEYSTORE_FILE, 'wb') as fp: fp.write(debug_keystore) repo_basedir = os.path.join(os.getcwd(), 'fdroid') repodir = os.path.join(repo_basedir, 'repo') cibase = os.getcwd() os.makedirs(repodir, exist_ok=True) if 'CI_PROJECT_PATH' in os.environ and 'CI_PROJECT_URL' in os.environ: # we are in GitLab CI repo_git_base = os.getenv('CI_PROJECT_PATH') + NIGHTLY clone_url = os.getenv('CI_PROJECT_URL') + NIGHTLY repo_base = clone_url + '/raw/master/fdroid' servergitmirror = 'git@' + urlparse(clone_url).netloc + ':' + repo_git_base deploy_key_url = clone_url + '/-/settings/repository#js-deploy-keys-settings' git_user_name = os.getenv('GITLAB_USER_NAME') git_user_email = os.getenv('GITLAB_USER_EMAIL') elif 'TRAVIS_REPO_SLUG' in os.environ: # we are in Travis CI repo_git_base = os.getenv('TRAVIS_REPO_SLUG') + NIGHTLY clone_url = 'https://github.com/' + repo_git_base _branch = os.getenv('TRAVIS_BRANCH') repo_base = 'https://raw.githubusercontent.com/' + repo_git_base + '/' + _branch + '/fdroid' servergitmirror = 'git@github.com:' + repo_git_base deploy_key_url = ('https://github.com/' + repo_git_base + '/settings/keys' + '\nhttps://developer.github.com/v3/guides/managing-deploy-keys/#deploy-keys') git_user_name = repo_git_base git_user_email = os.getenv('USER') + '@' + platform.node() elif ( 'CIRCLE_REPOSITORY_URL' in os.environ and 'CIRCLE_PROJECT_USERNAME' in os.environ and 'CIRCLE_PROJECT_REPONAME' in os.environ ): # we are in Circle CI repo_git_base = (os.getenv('CIRCLE_PROJECT_USERNAME') + '/' + os.getenv('CIRCLE_PROJECT_REPONAME') + NIGHTLY) clone_url = os.getenv('CIRCLE_REPOSITORY_URL') + NIGHTLY repo_base = clone_url + '/raw/master/fdroid' servergitmirror = 'git@' + urlparse(clone_url).netloc + ':' + repo_git_base deploy_key_url = ('https://github.com/' + repo_git_base + '/settings/keys' + '\nhttps://developer.github.com/v3/guides/managing-deploy-keys/#deploy-keys') git_user_name = os.getenv('CIRCLE_USERNAME') git_user_email = git_user_name + '@' + platform.node() elif 'GITHUB_ACTIONS' in os.environ: # we are in Github actions repo_git_base = (os.getenv('GITHUB_REPOSITORY') + NIGHTLY) clone_url = (os.getenv('GITHUB_SERVER_URL') + '/' + repo_git_base) repo_base = clone_url + '/raw/master/fdroid' servergitmirror = 'git@' + urlparse(clone_url).netloc + ':' + repo_git_base deploy_key_url = ('https://github.com/' + repo_git_base + '/settings/keys' + '\nhttps://developer.github.com/v3/guides/managing-deploy-keys/#deploy-keys') git_user_name = os.getenv('GITHUB_ACTOR') git_user_email = git_user_name + '@' + platform.node() else: print(_('ERROR: unsupported CI type, patches welcome!')) sys.exit(1) repo_url = repo_base + '/repo' git_mirror_path = os.path.join(repo_basedir, 'git-mirror') git_mirror_repodir = os.path.join(git_mirror_path, 'fdroid', 'repo') git_mirror_metadatadir = os.path.join(git_mirror_path, 'fdroid', 'metadata') git_mirror_statsdir = os.path.join(git_mirror_path, 'fdroid', 'stats') if not os.path.isdir(git_mirror_repodir): logging.debug(_('cloning {url}').format(url=clone_url)) try: git.Repo.clone_from(clone_url, git_mirror_path) except Exception: pass if not os.path.isdir(git_mirror_repodir): os.makedirs(git_mirror_repodir, mode=0o755) mirror_git_repo = git.Repo.init(git_mirror_path) writer = mirror_git_repo.config_writer() writer.set_value('user', 'name', git_user_name) writer.set_value('user', 'email', git_user_email) writer.release() for remote in mirror_git_repo.remotes: mirror_git_repo.delete_remote(remote) readme_path = os.path.join(git_mirror_path, 'README.md') readme = ''' # {repo_git_base} [![{repo_url}]({repo_url}/icons/icon.png)]({repo_url}) Last updated: {date}'''.format(repo_git_base=repo_git_base, repo_url=repo_url, date=datetime.datetime.utcnow().strftime('%Y-%m-%d %H:%M:%S UTC')) with open(readme_path, 'w') as fp: fp.write(readme) mirror_git_repo.git.add(all=True) mirror_git_repo.index.commit("update README") mirror_git_repo.git.add(all=True) mirror_git_repo.index.commit("update repo/website icon") os.chdir(repo_basedir) if os.path.isdir(git_mirror_repodir): common.local_rsync(options, git_mirror_repodir + '/', 'repo/') if os.path.isdir(git_mirror_metadatadir): common.local_rsync(options, git_mirror_metadatadir + '/', 'metadata/') if os.path.isdir(git_mirror_statsdir): common.local_rsync(options, git_mirror_statsdir + '/', 'stats/') ssh_private_key_file = _ssh_key_from_debug_keystore() # this is needed for GitPython to find the SSH key ssh_dir = os.path.join(os.getenv('HOME'), '.ssh') os.makedirs(ssh_dir, exist_ok=True) ssh_config = os.path.join(ssh_dir, 'config') logging.debug(_('adding IdentityFile to {path}').format(path=ssh_config)) with open(ssh_config, 'a') as fp: fp.write('\n\nHost *\n\tIdentityFile %s\n' % ssh_private_key_file) config = '' config += "identity_file = '%s'\n" % ssh_private_key_file config += "repo_name = '%s'\n" % repo_git_base config += "repo_url = '%s'\n" % repo_url config += "repo_description = 'Nightly builds from %s'\n" % git_user_email config += "archive_name = '%s'\n" % (repo_git_base + ' archive') config += "archive_url = '%s'\n" % (repo_base + '/archive') config += ( "archive_description = 'Old nightly builds that have been archived.'\n" ) config += "archive_older = %i\n" % options.archive_older config += "servergitmirrors = '%s'\n" % servergitmirror config += "keystore = '%s'\n" % KEYSTORE_FILE config += "repo_keyalias = '%s'\n" % KEY_ALIAS config += "keystorepass = '%s'\n" % PASSWORD config += "keypass = '%s'\n" % PASSWORD config += "keydname = '%s'\n" % DISTINGUISHED_NAME config += "make_current_version_link = False\n" config += "update_stats = True\n" with open('config.py', 'w') as fp: fp.write(config) os.chmod('config.py', 0o600) config = common.read_config(options) common.assert_config_keystore(config) for root, dirs, files in os.walk(cibase): for d in dirs: if d == '.git' or d == '.gradle' or (d == 'fdroid' and root == cibase): dirs.remove(d) for f in files: if f.endswith('-debug.apk'): apkfilename = os.path.join(root, f) logging.debug( _('Stripping mystery signature from {apkfilename}').format( apkfilename=apkfilename ) ) destapk = os.path.join(repodir, os.path.basename(f)) os.chmod(apkfilename, 0o644) logging.debug( _( 'Resigning {apkfilename} with provided debug.keystore' ).format(apkfilename=os.path.basename(apkfilename)) ) common.sign_apk(apkfilename, destapk, KEY_ALIAS) if options.verbose: logging.debug(_('attempting bare SSH connection to test deploy key:')) try: subprocess.check_call( [ 'ssh', '-Tvi', ssh_private_key_file, '-oIdentitiesOnly=yes', '-oStrictHostKeyChecking=no', servergitmirror.split(':')[0], ] ) except subprocess.CalledProcessError: pass app_url = clone_url[: -len(NIGHTLY)] template = dict() template['AuthorName'] = clone_url.split('/')[4] template['AuthorWebSite'] = '/'.join(clone_url.split('/')[:4]) template['Categories'] = ['nightly'] template['SourceCode'] = app_url template['IssueTracker'] = app_url + '/issues' template['Summary'] = 'Nightly build of ' + urlparse(app_url).path[1:] template['Description'] = template['Summary'] with open('template.yml', 'w') as fp: yaml.dump(template, fp) subprocess.check_call( ['fdroid', 'update', '--rename-apks', '--create-metadata', '--verbose'], cwd=repo_basedir, ) common.local_rsync( options, repo_basedir + '/metadata/', git_mirror_metadatadir + '/' ) common.local_rsync(options, repo_basedir + '/stats/', git_mirror_statsdir + '/') mirror_git_repo.git.add(all=True) mirror_git_repo.index.commit("update app metadata") if not options.no_deploy: try: cmd = ['fdroid', 'deploy', '--verbose', '--no-keep-git-mirror-archive'] subprocess.check_call(cmd, cwd=repo_basedir) except subprocess.CalledProcessError: logging.error( _('cannot publish update, did you set the deploy key?') + '\n' + deploy_key_url ) sys.exit(1) if not options.keep_private_keys: os.remove(KEYSTORE_FILE) if shutil.rmtree.avoids_symlink_attacks: shutil.rmtree(os.path.dirname(ssh_private_key_file)) else: if not os.path.isfile(options.keystore): androiddir = os.path.dirname(options.keystore) if not os.path.exists(androiddir): os.mkdir(androiddir) logging.info(_('created {path}').format(path=androiddir)) logging.error(_('{path} does not exist! Create it by running:').format(path=options.keystore) + '\n keytool -genkey -v -keystore ' + options.keystore + ' -storepass android \\' + '\n -alias androiddebugkey -keypass android -keyalg RSA -keysize 2048 -validity 10000 \\' + '\n -dname "CN=Android Debug,O=Android,C=US"') sys.exit(1) ssh_dir = os.path.join(os.getenv('HOME'), '.ssh') os.makedirs(os.path.dirname(ssh_dir), exist_ok=True) privkey = _ssh_key_from_debug_keystore(options.keystore) ssh_private_key_file = os.path.join(ssh_dir, os.path.basename(privkey)) shutil.move(privkey, ssh_private_key_file) shutil.move(privkey + '.pub', ssh_private_key_file + '.pub') if shutil.rmtree.avoids_symlink_attacks: shutil.rmtree(os.path.dirname(privkey)) if options.show_secret_var: with open(options.keystore, 'rb') as fp: debug_keystore = base64.standard_b64encode(fp.read()).decode('ascii') print( _('\n{path} encoded for the DEBUG_KEYSTORE secret variable:').format( path=options.keystore ) ) print(debug_keystore) os.umask(umask) if __name__ == "__main__": main() fdroidserver-2.1/fdroidserver/publish.py0000644000175000017500000004000514203004041020457 0ustar hanshans00000000000000#!/usr/bin/env python3 # # publish.py - part of the FDroid server tools # Copyright (C) 2010-13, Ciaran Gultnieks, ciaran@ciarang.com # Copyright (C) 2013-2014 Daniel Martí # Copyright (C) 2021 Felix C. Stegerman # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . import sys import os import re import shutil import glob import hashlib from argparse import ArgumentParser from collections import OrderedDict import logging from gettext import ngettext import json import time import zipfile from . import _ from . import common from . import metadata from .common import FDroidPopen from .exception import BuildException, FDroidException config = None options = None start_timestamp = time.gmtime() def publish_source_tarball(apkfilename, unsigned_dir, output_dir): """Move the source tarball into the output directory...""" tarfilename = apkfilename[:-4] + '_src.tar.gz' tarfile = os.path.join(unsigned_dir, tarfilename) if os.path.exists(tarfile): shutil.move(tarfile, os.path.join(output_dir, tarfilename)) logging.debug('...published %s', tarfilename) else: logging.debug('...no source tarball for %s', apkfilename) def key_alias(appid): """No summary. Get the alias which F-Droid uses to indentify the singing key for this App in F-Droids keystore. """ if config and 'keyaliases' in config and appid in config['keyaliases']: # For this particular app, the key alias is overridden... keyalias = config['keyaliases'][appid] if keyalias.startswith('@'): m = hashlib.md5() # nosec just used to generate a keyalias m.update(keyalias[1:].encode('utf-8')) keyalias = m.hexdigest()[:8] return keyalias else: m = hashlib.md5() # nosec just used to generate a keyalias m.update(appid.encode('utf-8')) return m.hexdigest()[:8] def read_fingerprints_from_keystore(): """Obtain a dictionary containing all singning-key fingerprints which are managed by F-Droid, grouped by appid.""" env_vars = {'LC_ALL': 'C.UTF-8', 'FDROID_KEY_STORE_PASS': config['keystorepass']} cmd = [ config['keytool'], '-list', '-v', '-keystore', config['keystore'], '-storepass:env', 'FDROID_KEY_STORE_PASS', ] if config['keystore'] == 'NONE': cmd += config['smartcardoptions'] p = FDroidPopen(cmd, envs=env_vars, output=False) if p.returncode != 0: raise FDroidException('could not read keystore {}'.format(config['keystore'])) realias = re.compile('Alias name: (?P.+)' + os.linesep) resha256 = re.compile(r'\s+SHA256: (?P[:0-9A-F]{95})' + os.linesep) fps = {} for block in p.output.split(('*' * 43) + os.linesep + '*' * 43): s_alias = realias.search(block) s_sha256 = resha256.search(block) if s_alias and s_sha256: sigfp = s_sha256.group('sha256').replace(':', '').lower() fps[s_alias.group('alias')] = sigfp return fps def sign_sig_key_fingerprint_list(jar_file): """Sign the list of app-signing key fingerprints. This is used primaryily by fdroid update to determine which APKs where built and signed by F-Droid and which ones were manually added by users. """ cmd = [config['jarsigner']] cmd += '-keystore', config['keystore'] cmd += '-storepass:env', 'FDROID_KEY_STORE_PASS' cmd += '-digestalg', 'SHA1' cmd += '-sigalg', 'SHA1withRSA' cmd += jar_file, config['repo_keyalias'] if config['keystore'] == 'NONE': cmd += config['smartcardoptions'] else: # smardcards never use -keypass cmd += '-keypass:env', 'FDROID_KEY_PASS' env_vars = { 'FDROID_KEY_STORE_PASS': config['keystorepass'], 'FDROID_KEY_PASS': config.get('keypass', ""), } p = common.FDroidPopen(cmd, envs=env_vars) if p.returncode != 0: raise FDroidException("Failed to sign '{}'!".format(jar_file)) def store_stats_fdroid_signing_key_fingerprints(appids, indent=None): """Store list of all signing-key fingerprints for given appids to HD. This list will later on be needed by fdroid update. """ if not os.path.exists('stats'): os.makedirs('stats') data = OrderedDict() fps = read_fingerprints_from_keystore() for appid in sorted(appids): alias = key_alias(appid) if alias in fps: data[appid] = {'signer': fps[key_alias(appid)]} jar_file = os.path.join('stats', 'publishsigkeys.jar') with zipfile.ZipFile(jar_file, 'w', zipfile.ZIP_DEFLATED) as jar: jar.writestr('publishsigkeys.json', json.dumps(data, indent=indent)) sign_sig_key_fingerprint_list(jar_file) def status_update_json(generatedKeys, signedApks): """Output a JSON file with metadata about this run.""" logging.debug(_('Outputting JSON')) output = common.setup_status_output(start_timestamp) output['apksigner'] = shutil.which(config.get('apksigner', '')) output['jarsigner'] = shutil.which(config.get('jarsigner', '')) output['keytool'] = shutil.which(config.get('keytool', '')) if generatedKeys: output['generatedKeys'] = generatedKeys if signedApks: output['signedApks'] = signedApks common.write_status_json(output) def check_for_key_collisions(allapps): """Make sure there's no collision in keyaliases from apps. It was suggested at https://dev.guardianproject.info/projects/bazaar/wiki/FDroid_Audit that a package could be crafted, such that it would use the same signing key as an existing app. While it may be theoretically possible for such a colliding package ID to be generated, it seems virtually impossible that the colliding ID would be something that would be a) a valid package ID, and b) a sane-looking ID that would make its way into the repo. Nonetheless, to be sure, before publishing we check that there are no collisions, and refuse to do any publishing if that's the case. Parameters ---------- allapps a dict of all apps to process Returns ------- a list of all aliases corresponding to allapps """ allaliases = [] for appid in allapps: m = hashlib.md5() # nosec just used to generate a keyalias m.update(appid.encode('utf-8')) keyalias = m.hexdigest()[:8] if keyalias in allaliases: logging.error(_("There is a keyalias collision - publishing halted")) sys.exit(1) allaliases.append(keyalias) return allaliases def create_key_if_not_existing(keyalias): """Ensure a signing key with the given keyalias exists. Returns ------- boolean True if a new key was created, False otherwise """ # See if we already have a key for this application, and # if not generate one... env_vars = { 'LC_ALL': 'C.UTF-8', 'FDROID_KEY_STORE_PASS': config['keystorepass'], 'FDROID_KEY_PASS': config.get('keypass', ""), } cmd = [ config['keytool'], '-list', '-alias', keyalias, '-keystore', config['keystore'], '-storepass:env', 'FDROID_KEY_STORE_PASS', ] if config['keystore'] == 'NONE': cmd += config['smartcardoptions'] p = FDroidPopen(cmd, envs=env_vars) if p.returncode != 0: logging.info("Key does not exist - generating...") cmd = [ config['keytool'], '-genkey', '-keystore', config['keystore'], '-alias', keyalias, '-keyalg', 'RSA', '-keysize', '2048', '-validity', '10000', '-storepass:env', 'FDROID_KEY_STORE_PASS', '-dname', config['keydname'], ] if config['keystore'] == 'NONE': cmd += config['smartcardoptions'] else: cmd += '-keypass:env', 'FDROID_KEY_PASS' p = FDroidPopen(cmd, envs=env_vars) if p.returncode != 0: raise BuildException("Failed to generate key", p.output) return True else: return False def main(): global config, options # Parse command line... parser = ArgumentParser( usage="%(prog)s [options] " "[APPID[:VERCODE] [APPID[:VERCODE] ...]]" ) common.setup_global_opts(parser) parser.add_argument( "appid", nargs='*', help=_("application ID with optional versionCode in the form APPID[:VERCODE]"), ) metadata.add_metadata_arguments(parser) options = parser.parse_args() metadata.warnings_action = options.W config = common.read_config(options) if not ('jarsigner' in config and 'keytool' in config): logging.critical( _('Java JDK not found! Install in standard location or set java_paths!') ) sys.exit(1) common.assert_config_keystore(config) log_dir = 'logs' if not os.path.isdir(log_dir): logging.info(_("Creating log directory")) os.makedirs(log_dir) tmp_dir = 'tmp' if not os.path.isdir(tmp_dir): logging.info(_("Creating temporary directory")) os.makedirs(tmp_dir) output_dir = 'repo' if not os.path.isdir(output_dir): logging.info(_("Creating output directory")) os.makedirs(output_dir) unsigned_dir = 'unsigned' if not os.path.isdir(unsigned_dir): logging.warning(_("No unsigned directory - nothing to do")) sys.exit(1) binaries_dir = os.path.join(unsigned_dir, 'binaries') if not config['keystore'] == "NONE" and not os.path.exists(config['keystore']): logging.error("Config error - missing '{0}'".format(config['keystore'])) sys.exit(1) allapps = metadata.read_metadata() vercodes = common.read_pkg_args(options.appid, True) common.get_metadata_files(vercodes) # only check appids signed_apks = dict() generated_keys = dict() allaliases = check_for_key_collisions(allapps) logging.info( ngettext( '{0} app, {1} key aliases', '{0} apps, {1} key aliases', len(allapps) ).format(len(allapps), len(allaliases)) ) # Process any APKs or ZIPs that are waiting to be signed... for apkfile in sorted( glob.glob(os.path.join(unsigned_dir, '*.apk')) + glob.glob(os.path.join(unsigned_dir, '*.zip')) ): appid, vercode = common.publishednameinfo(apkfile) apkfilename = os.path.basename(apkfile) if vercodes and appid not in vercodes: continue if appid in vercodes and vercodes[appid]: if vercode not in vercodes[appid]: continue logging.info(_("Processing {apkfilename}").format(apkfilename=apkfile)) # There ought to be valid metadata for this app, otherwise why are we # trying to publish it? if appid not in allapps: logging.error( "Unexpected {0} found in unsigned directory".format(apkfilename) ) sys.exit(1) app = allapps[appid] if app.Binaries: # It's an app where we build from source, and verify the apk # contents against a developer's binary, and then publish their # version if everything checks out. # The binary should already have been retrieved during the build # process. srcapk = re.sub(r'\.apk$', '.binary.apk', apkfile) srcapk = srcapk.replace(unsigned_dir, binaries_dir) if not os.path.isfile(srcapk): logging.error("...reference binary missing - publish skipped: " "'{refpath}'".format(refpath=srcapk)) else: # Compare our unsigned one with the downloaded one... compare_result = common.verify_apks(srcapk, apkfile, tmp_dir) if compare_result: logging.error("...verification failed - publish skipped : " "{result}".format(result=compare_result)) else: # Success! So move the downloaded file to the repo, and remove # our built version. shutil.move(srcapk, os.path.join(output_dir, apkfilename)) os.remove(apkfile) publish_source_tarball(apkfilename, unsigned_dir, output_dir) logging.info('Published ' + apkfilename) elif apkfile.endswith('.zip'): # OTA ZIPs built by fdroid do not need to be signed by jarsigner, # just to be moved into place in the repo shutil.move(apkfile, os.path.join(output_dir, apkfilename)) publish_source_tarball(apkfilename, unsigned_dir, output_dir) logging.info('Published ' + apkfilename) else: # It's a 'normal' app, i.e. we sign and publish it... skipsigning = False # First we handle signatures for this app from local metadata signingfiles = common.metadata_find_developer_signing_files(appid, vercode) if signingfiles: # There's a signature of the app developer present in our # metadata. This means we're going to prepare both a locally # signed APK and a version signed with the developers key. signature_file, _ignored, manifest, v2_files = signingfiles with open(signature_file, 'rb') as f: devfp = common.signer_fingerprint_short( common.get_certificate(f.read()) ) devsigned = '{}_{}_{}.apk'.format(appid, vercode, devfp) devsignedtmp = os.path.join(tmp_dir, devsigned) common.apk_implant_signatures(apkfile, devsignedtmp, manifest=manifest) if common.verify_apk_signature(devsignedtmp): shutil.move(devsignedtmp, os.path.join(output_dir, devsigned)) else: os.remove(devsignedtmp) logging.error('...verification failed - skipping: %s', devsigned) skipsigning = True # Now we sign with the F-Droid key. if not skipsigning: keyalias = key_alias(appid) logging.info("Key alias: " + keyalias) if create_key_if_not_existing(keyalias): generated_keys[appid] = keyalias signed_apk_path = os.path.join(output_dir, apkfilename) if os.path.exists(signed_apk_path): raise BuildException("Refusing to sign '{0}' file exists in both " "{1} and {2} folder.".format(apkfilename, unsigned_dir, output_dir)) # Sign and zipalign the application... common.sign_apk(apkfile, signed_apk_path, keyalias) if appid not in signed_apks: signed_apks[appid] = [] signed_apks[appid].append({"keyalias": keyalias, "filename": apkfile}) publish_source_tarball(apkfilename, unsigned_dir, output_dir) logging.info('Published ' + apkfilename) store_stats_fdroid_signing_key_fingerprints(allapps.keys()) status_update_json(generated_keys, signed_apks) logging.info('published list signing-key fingerprints') if __name__ == "__main__": main() fdroidserver-2.1/fdroidserver/readmeta.py0000644000175000017500000000223314203004041020574 0ustar hanshans00000000000000#!/usr/bin/env python3 # # readmeta.py - part of the FDroid server tools # Copyright (C) 2014 Daniel Martí # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . from argparse import ArgumentParser from . import common from . import metadata options = None def main(): parser = ArgumentParser() common.setup_global_opts(parser) metadata.add_metadata_arguments(parser) options = parser.parse_args() metadata.warnings_action = options.W common.read_config(None) metadata.read_metadata() if __name__ == "__main__": main() fdroidserver-2.1/fdroidserver/rewritemeta.py0000644000175000017500000000635014203004041021346 0ustar hanshans00000000000000#!/usr/bin/env python3 # # rewritemeta.py - part of the FDroid server tools # This cleans up the original .yml metadata file format. # Copyright (C) 2010-12, Ciaran Gultnieks, ciaran@ciarang.com # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . from argparse import ArgumentParser import logging import io import tempfile import shutil from pathlib import Path from . import _ from . import common from . import metadata config = None options = None def proper_format(app): s = io.StringIO() # TODO: currently reading entire file again, should reuse first # read in metadata.py cur_content = Path(app.metadatapath).read_text(encoding='utf-8') if Path(app.metadatapath).suffix == '.yml': metadata.write_yaml(s, app) content = s.getvalue() s.close() return content == cur_content def main(): global config, options parser = ArgumentParser() common.setup_global_opts(parser) parser.add_argument("-l", "--list", action="store_true", default=False, help=_("List files that would be reformatted (dry run)")) parser.add_argument("appid", nargs='*', help=_("application ID of file to operate on")) metadata.add_metadata_arguments(parser) options = parser.parse_args() metadata.warnings_action = options.W config = common.read_config(options) # Get all apps... allapps = metadata.read_metadata(options.appid) apps = common.read_app_args(options.appid, allapps, False) for appid, app in apps.items(): path = Path(app.metadatapath) if path.suffix == '.yml': logging.info(_("Rewriting '{appid}'").format(appid=appid)) else: logging.warning(_('Cannot rewrite "{path}"').format(path=path)) continue if options.list: if not proper_format(app): print(path) continue newbuilds = [] for build in app.get('Builds', []): new = metadata.Build() for k in metadata.build_flags: v = build[k] if v is None or v is False or v == [] or v == '': continue new[k] = v newbuilds.append(new) app['Builds'] = newbuilds # rewrite to temporary file before overwriting existsing # file in case there's a bug in write_metadata with tempfile.TemporaryDirectory() as tmpdir: tmp_path = Path(tmpdir) / path.name metadata.write_metadata(tmp_path, app) # TODO: Python3.6: Accept path-lik shutil.move(str(tmp_path), str(path)) logging.debug(_("Finished")) if __name__ == "__main__": main() fdroidserver-2.1/fdroidserver/scanner.py0000644000175000017500000005016614205260731020467 0ustar hanshans00000000000000#!/usr/bin/env python3 # # scanner.py - part of the FDroid server tools # Copyright (C) 2010-13, Ciaran Gultnieks, ciaran@ciarang.com # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . import imghdr import json import os import re import sys import traceback from argparse import ArgumentParser import logging import itertools from . import _ from . import common from . import metadata from .exception import BuildException, VCSException config = None options = None DEFAULT_JSON_PER_BUILD = {'errors': [], 'warnings': [], 'infos': []} # type: ignore json_per_build = DEFAULT_JSON_PER_BUILD MAVEN_URL_REGEX = re.compile(r"""\smaven\s*{.*?(?:setUrl|url)\s*=?\s*(?:uri)?\(?\s*["']?([^\s"']+)["']?[^}]*}""", re.DOTALL) CODE_SIGNATURES = { # The `apkanalyzer dex packages` output looks like this: # M d 1 1 93 # The first column has P/C/M/F for package, class, method or field # The second column has x/k/r/d for removed, kept, referenced and defined. # We already filter for defined only in the apkanalyzer call. 'r' will be # for things referenced but not distributed in the apk. exp: re.compile(r'.[\s]*d[\s]*[0-9]*[\s]*[0-9*][\s]*[0-9]*[\s]*' + exp, re.IGNORECASE) for exp in [ r'(com\.google\.firebase[^\s]*)', r'(com\.google\.android\.gms[^\s]*)', r'(com\.google\.android\.play\.core[^\s]*)', r'(com\.google\.tagmanager[^\s]*)', r'(com\.google\.analytics[^\s]*)', r'(com\.android\.billing[^\s]*)', ] } # Common known non-free blobs (always lower case): NON_FREE_GRADLE_LINES = { exp: re.compile(r'.*' + exp, re.IGNORECASE) for exp in [ r'flurryagent', r'paypal.*mpl', r'admob.*sdk.*android', r'google.*ad.*view', r'google.*admob', r'google.*play.*services', r'com.google.android.play:core.*', r'com.google.mlkit', r'com.android.billingclient', r'androidx.work:work-gcm', r'crittercism', r'heyzap', r'jpct.*ae', r'youtube.*android.*player.*api', r'bugsense', r'crashlytics', r'ouya.*sdk', r'libspen23', r'firebase', r'''["']com.facebook.android['":]''', r'cloudrail', r'com.tencent.bugly', r'appcenter-push', ] } def get_gradle_compile_commands(build): compileCommands = ['compile', 'provided', 'apk', 'implementation', 'api', 'compileOnly', 'runtimeOnly'] buildTypes = ['', 'release'] flavors = [''] if build.gradle and build.gradle != ['yes']: flavors += build.gradle commands = [''.join(c) for c in itertools.product(flavors, buildTypes, compileCommands)] return [re.compile(r'\s*' + c, re.IGNORECASE) for c in commands] def scan_binary(apkfile): """Scan output of apkanalyzer for known non-free classes. apkanalyzer produces useful output when it can run, but it does not support all recent JDK versions, and also some DEX versions, so this cannot count on it to always produce useful output or even to run without exiting with an error. """ logging.info(_('Scanning APK with apkanalyzer for known non-free classes.')) result = common.SdkToolsPopen(["apkanalyzer", "dex", "packages", "--defined-only", apkfile], output=False) if result.returncode != 0: logging.warning(_('scanner not cleanly run apkanalyzer: %s') % result.output) problems = 0 for suspect, regexp in CODE_SIGNATURES.items(): matches = regexp.findall(result.output) if matches: for m in set(matches): logging.debug("Found class '%s'" % m) problems += 1 if problems: logging.critical("Found problems in %s" % apkfile) return problems def scan_source(build_dir, build=metadata.Build()): """Scan the source code in the given directory (and all subdirectories). Returns ------- the number of fatal problems encountered. """ count = 0 allowlisted = [ 'firebase-jobdispatcher', # https://github.com/firebase/firebase-jobdispatcher-android/blob/master/LICENSE 'com.firebaseui', # https://github.com/firebase/FirebaseUI-Android/blob/master/LICENSE 'geofire-android' # https://github.com/firebase/geofire-java/blob/master/LICENSE ] def is_allowlisted(s): return any(al in s for al in allowlisted) def suspects_found(s): for n, r in NON_FREE_GRADLE_LINES.items(): if r.match(s) and not is_allowlisted(s): yield n allowed_repos = [re.compile(r'^https://' + re.escape(repo) + r'/*') for repo in [ 'repo1.maven.org/maven2', # mavenCentral() 'jcenter.bintray.com', # jcenter() 'jitpack.io', 'www.jitpack.io', 'repo.maven.apache.org/maven2', 'oss.jfrog.org/artifactory/oss-snapshot-local', 'oss.sonatype.org/content/repositories/snapshots', 'oss.sonatype.org/content/repositories/releases', 'oss.sonatype.org/content/groups/public', 'clojars.org/repo', # Clojure free software libs 's3.amazonaws.com/repo.commonsware.com', # CommonsWare 'plugins.gradle.org/m2', # Gradle plugin repo 'maven.google.com', # Google Maven Repo, https://developer.android.com/studio/build/dependencies.html#google-maven ] ] + [re.compile(r'^file://' + re.escape(repo) + r'/*') for repo in [ '/usr/share/maven-repo', # local repo on Debian installs ] ] scanignore = common.getpaths_map(build_dir, build.scanignore) scandelete = common.getpaths_map(build_dir, build.scandelete) scanignore_worked = set() scandelete_worked = set() def toignore(path_in_build_dir): for k, paths in scanignore.items(): for p in paths: if path_in_build_dir.startswith(p): scanignore_worked.add(k) return True return False def todelete(path_in_build_dir): for k, paths in scandelete.items(): for p in paths: if path_in_build_dir.startswith(p): scandelete_worked.add(k) return True return False def ignoreproblem(what, path_in_build_dir): """No summary. Parameters ---------- what: string describing the problem, will be printed in log messages path_in_build_dir path to the file relative to `build`-dir Returns ------- 0 as we explicitly ignore the file, so don't count an error """ msg = ('Ignoring %s at %s' % (what, path_in_build_dir)) logging.info(msg) if json_per_build is not None: json_per_build['infos'].append([msg, path_in_build_dir]) return 0 def removeproblem(what, path_in_build_dir, filepath): """No summary. Parameters ---------- what: string describing the problem, will be printed in log messages path_in_build_dir path to the file relative to `build`-dir filepath Path (relative to our current path) to the file Returns ------- 0 as we deleted the offending file """ msg = ('Removing %s at %s' % (what, path_in_build_dir)) logging.info(msg) if json_per_build is not None: json_per_build['infos'].append([msg, path_in_build_dir]) try: os.remove(filepath) except FileNotFoundError: # File is already gone, nothing to do. # This can happen if we find multiple problems in one file that is setup for scandelete # I.e. build.gradle files containig multiple unknown maven repos. pass return 0 def warnproblem(what, path_in_build_dir): """No summary. Parameters ---------- what: string describing the problem, will be printed in log messages path_in_build_dir path to the file relative to `build`-dir Returns ------- 0, as warnings don't count as errors """ if toignore(path_in_build_dir): return 0 logging.warning('Found %s at %s' % (what, path_in_build_dir)) if json_per_build is not None: json_per_build['warnings'].append([what, path_in_build_dir]) return 0 def handleproblem(what, path_in_build_dir, filepath): """Dispatches to problem handlers (ignore, delete, warn). Or returns 1 for increasing the error count. Parameters ---------- what: string describing the problem, will be printed in log messages path_in_build_dir path to the file relative to `build`-dir filepath Path (relative to our current path) to the file Returns ------- 0 if the problem was ignored/deleted/is only a warning, 1 otherwise """ if toignore(path_in_build_dir): return ignoreproblem(what, path_in_build_dir) if todelete(path_in_build_dir): return removeproblem(what, path_in_build_dir, filepath) if 'src/test' in path_in_build_dir or '/test/' in path_in_build_dir: return warnproblem(what, path_in_build_dir) if options and 'json' in vars(options) and options.json: json_per_build['errors'].append([what, path_in_build_dir]) if options and (options.verbose or not ('json' in vars(options) and options.json)): logging.error('Found %s at %s' % (what, path_in_build_dir)) return 1 def is_executable(path): return os.path.exists(path) and os.access(path, os.X_OK) textchars = bytearray({7, 8, 9, 10, 12, 13, 27} | set(range(0x20, 0x100)) - {0x7f}) def is_binary(path): d = None with open(path, 'rb') as f: d = f.read(1024) return bool(d.translate(None, textchars)) # False positives patterns for files that are binary and executable. safe_paths = [re.compile(r) for r in [ r".*/drawable[^/]*/.*\.png$", # png drawables r".*/mipmap[^/]*/.*\.png$", # png mipmaps ] ] def is_image_file(path): if imghdr.what(path) is not None: return True def safe_path(path_in_build_dir): for sp in safe_paths: if sp.match(path_in_build_dir): return True return False gradle_compile_commands = get_gradle_compile_commands(build) def is_used_by_gradle(line): return any(command.match(line) for command in gradle_compile_commands) # Iterate through all files in the source code for root, dirs, files in os.walk(build_dir, topdown=True): # It's topdown, so checking the basename is enough for ignoredir in ('.hg', '.git', '.svn', '.bzr'): if ignoredir in dirs: dirs.remove(ignoredir) for curfile in files: if curfile in ['.DS_Store']: continue # Path (relative) to the file filepath = os.path.join(root, curfile) if os.path.islink(filepath): continue path_in_build_dir = os.path.relpath(filepath, build_dir) if curfile in ('gradle-wrapper.jar', 'gradlew', 'gradlew.bat'): removeproblem(curfile, path_in_build_dir, filepath) elif curfile.endswith('.apk'): removeproblem(_('Android APK file'), path_in_build_dir, filepath) elif curfile.endswith('.a'): count += handleproblem(_('static library'), path_in_build_dir, filepath) elif curfile.endswith('.aar'): count += handleproblem(_('Android AAR library'), path_in_build_dir, filepath) elif curfile.endswith('.class'): count += handleproblem(_('Java compiled class'), path_in_build_dir, filepath) elif curfile.endswith('.dex'): count += handleproblem(_('Android DEX code'), path_in_build_dir, filepath) elif curfile.endswith('.gz'): count += handleproblem(_('gzip file archive'), path_in_build_dir, filepath) # We use a regular expression here to also match versioned shared objects like .so.0.0.0 elif re.match(r'.*\.so(\..+)*$', curfile): count += handleproblem(_('shared library'), path_in_build_dir, filepath) elif curfile.endswith('.zip'): count += handleproblem(_('ZIP file archive'), path_in_build_dir, filepath) elif curfile.endswith('.jar'): for name in suspects_found(curfile): count += handleproblem('usual suspect \'%s\'' % name, path_in_build_dir, filepath) count += handleproblem(_('Java JAR file'), path_in_build_dir, filepath) elif curfile.endswith('.java'): if not os.path.isfile(filepath): continue with open(filepath, 'r', errors='replace') as f: for line in f: if 'DexClassLoader' in line: count += handleproblem('DexClassLoader', path_in_build_dir, filepath) break elif curfile.endswith('.gradle'): if not os.path.isfile(filepath): continue with open(filepath, 'r', errors='replace') as f: lines = f.readlines() for i, line in enumerate(lines): if is_used_by_gradle(line): for name in suspects_found(line): count += handleproblem("usual suspect \'%s\'" % (name), path_in_build_dir, filepath) noncomment_lines = [line for line in lines if not common.gradle_comment.match(line)] no_comments = re.sub(r'/\*.*?\*/', '', ''.join(noncomment_lines), flags=re.DOTALL) for url in MAVEN_URL_REGEX.findall(no_comments): if not any(r.match(url) for r in allowed_repos): count += handleproblem('unknown maven repo \'%s\'' % url, path_in_build_dir, filepath) elif curfile.endswith(('.', '.bin', '.out', '.exe')): if is_binary(filepath): count += handleproblem('binary', path_in_build_dir, filepath) elif is_executable(filepath): if is_binary(filepath) and not (safe_path(path_in_build_dir) or is_image_file(filepath)): warnproblem(_('executable binary, possibly code'), path_in_build_dir) for p in scanignore: if p not in scanignore_worked: logging.error(_('Unused scanignore path: %s') % p) count += 1 for p in scandelete: if p not in scandelete_worked: logging.error(_('Unused scandelete path: %s') % p) count += 1 return count def main(): global config, options, json_per_build # Parse command line... parser = ArgumentParser(usage="%(prog)s [options] [APPID[:VERCODE] [APPID[:VERCODE] ...]]") common.setup_global_opts(parser) parser.add_argument("appid", nargs='*', help=_("application ID with optional versionCode in the form APPID[:VERCODE]")) parser.add_argument("-f", "--force", action="store_true", default=False, help=_("Force scan of disabled apps and builds.")) parser.add_argument("--json", action="store_true", default=False, help=_("Output JSON to stdout.")) metadata.add_metadata_arguments(parser) options = parser.parse_args() metadata.warnings_action = options.W json_output = dict() if options.json: if options.verbose: logging.basicConfig(stream=sys.stderr, level=logging.DEBUG) else: logging.getLogger().setLevel(logging.ERROR) config = common.read_config(options) # Read all app and srclib metadata allapps = metadata.read_metadata() apps = common.read_app_args(options.appid, allapps, True) probcount = 0 build_dir = 'build' if not os.path.isdir(build_dir): logging.info("Creating build directory") os.makedirs(build_dir) srclib_dir = os.path.join(build_dir, 'srclib') extlib_dir = os.path.join(build_dir, 'extlib') for appid, app in apps.items(): json_per_appid = dict() if app.Disabled and not options.force: logging.info(_("Skipping {appid}: disabled").format(appid=appid)) json_per_appid['disabled'] = json_per_build['infos'].append('Skipping: disabled') continue try: if app.RepoType == 'srclib': build_dir = os.path.join('build', 'srclib', app.Repo) else: build_dir = os.path.join('build', appid) if app.get('Builds'): logging.info(_("Processing {appid}").format(appid=appid)) # Set up vcs interface and make sure we have the latest code... vcs = common.getvcs(app.RepoType, app.Repo, build_dir) else: logging.info(_("{appid}: no builds specified, running on current source state") .format(appid=appid)) json_per_build = DEFAULT_JSON_PER_BUILD json_per_appid['current-source-state'] = json_per_build count = scan_source(build_dir) if count > 0: logging.warning(_('Scanner found {count} problems in {appid}:') .format(count=count, appid=appid)) probcount += count app['Builds'] = [] for build in app.get('Builds', []): json_per_build = DEFAULT_JSON_PER_BUILD json_per_appid[build.versionCode] = json_per_build if build.disable and not options.force: logging.info("...skipping version %s - %s" % ( build.versionName, build.get('disable', build.commit[1:]))) continue logging.info("...scanning version " + build.versionName) # Prepare the source code... common.prepare_source(vcs, app, build, build_dir, srclib_dir, extlib_dir, False) count = scan_source(build_dir, build) if count > 0: logging.warning(_('Scanner found {count} problems in {appid}:{versionCode}:') .format(count=count, appid=appid, versionCode=build.versionCode)) probcount += count except BuildException as be: logging.warning('Could not scan app %s due to BuildException: %s' % ( appid, be)) probcount += 1 except VCSException as vcse: logging.warning('VCS error while scanning app %s: %s' % (appid, vcse)) probcount += 1 except Exception: logging.warning('Could not scan app %s due to unknown error: %s' % ( appid, traceback.format_exc())) probcount += 1 for k, v in json_per_appid.items(): if len(v['errors']) or len(v['warnings']) or len(v['infos']): json_output[appid] = json_per_appid break logging.info(_("Finished")) if options.json: print(json.dumps(json_output)) else: print(_("%d problems found") % probcount) if __name__ == "__main__": main() fdroidserver-2.1/fdroidserver/signatures.py0000644000175000017500000000713614203004041021205 0ustar hanshans00000000000000#!/usr/bin/env python3 # # Copyright (C) 2017, Michael Poehn # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . from argparse import ArgumentParser import re import os import sys import logging from . import _ from . import common from . import net from .exception import FDroidException def extract_signature(apkpath): if not os.path.exists(apkpath): raise FDroidException("file APK does not exists '{}'".format(apkpath)) if not common.verify_apk_signature(apkpath): raise FDroidException("no valid signature in '{}'".format(apkpath)) logging.debug('signature okay: %s', apkpath) appid, vercode, _ignored = common.get_apk_id(apkpath) sigdir = common.metadata_get_sigdir(appid, vercode) if not os.path.exists(sigdir): os.makedirs(sigdir) common.apk_extract_signatures(apkpath, sigdir) return sigdir def extract(options): # Create tmp dir if missing… tmp_dir = 'tmp' if not os.path.exists(tmp_dir): os.mkdir(tmp_dir) if not options.APK or len(options.APK) <= 0: logging.critical(_('no APK supplied')) sys.exit(1) # iterate over supplied APKs downlaod and extract them… httpre = re.compile(r'https?:\/\/') for apk in options.APK: try: if os.path.isfile(apk): sigdir = extract_signature(apk) logging.info(_("Fetched signatures for '{apkfilename}' -> '{sigdir}'") .format(apkfilename=apk, sigdir=sigdir)) elif httpre.match(apk): if apk.startswith('https') or options.no_check_https: try: tmp_apk = os.path.join(tmp_dir, 'signed.apk') net.download_file(apk, tmp_apk) sigdir = extract_signature(tmp_apk) logging.info(_("Fetched signatures for '{apkfilename}' -> '{sigdir}'") .format(apkfilename=apk, sigdir=sigdir)) finally: if tmp_apk and os.path.exists(tmp_apk): os.remove(tmp_apk) else: logging.warning(_('refuse downloading via insecure HTTP connection ' '(use HTTPS or specify --no-https-check): {apkfilename}') .format(apkfilename=apk)) except FDroidException as e: logging.warning(_("Failed fetching signatures for '{apkfilename}': {error}") .format(apkfilename=apk, error=e)) if e.detail: logging.debug(e.detail) def main(): parser = ArgumentParser() common.setup_global_opts(parser) parser.add_argument( "APK", nargs='*', help=_("signed APK, either a file-path or HTTPS URL.") ) parser.add_argument("--no-check-https", action="store_true", default=False) options = parser.parse_args() # Read config.py... common.read_config(options) extract(options) fdroidserver-2.1/fdroidserver/signindex.py0000644000175000017500000001075114203004041021006 0ustar hanshans00000000000000#!/usr/bin/env python3 # # gpgsign.py - part of the FDroid server tools # Copyright (C) 2015, Ciaran Gultnieks, ciaran@ciarang.com # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . import os import time import zipfile from argparse import ArgumentParser import logging from . import _ from . import common from .exception import FDroidException config = None options = None start_timestamp = time.gmtime() def sign_jar(jar): """Sign a JAR file with Java's jarsigner. This method requires a properly initialized config object. This does use old hashing algorithms, i.e. SHA1, but that's not broken yet for file verification. This could be set to SHA256, but then Android < 4.3 would not be able to verify it. https://code.google.com/p/android/issues/detail?id=38321 """ args = [ config['jarsigner'], '-keystore', config['keystore'], '-storepass:env', 'FDROID_KEY_STORE_PASS', '-digestalg', 'SHA1', '-sigalg', 'SHA1withRSA', jar, config['repo_keyalias'], ] if config['keystore'] == 'NONE': args += config['smartcardoptions'] else: # smardcards never use -keypass args += ['-keypass:env', 'FDROID_KEY_PASS'] env_vars = { 'FDROID_KEY_STORE_PASS': config['keystorepass'], 'FDROID_KEY_PASS': config.get('keypass', ""), } p = common.FDroidPopen(args, envs=env_vars) if p.returncode != 0: raise FDroidException("Failed to sign %s!" % jar) def sign_index_v1(repodir, json_name): """Sign index-v1.json to make index-v1.jar. This is a bit different than index.jar: instead of their being index.xml and index_unsigned.jar, the presence of index-v1.json means that there is unsigned data. That file is then stuck into a jar and signed by the signing process. index-v1.json is never published to the repo. It is included in the binary transparency log, if that is enabled. """ name, ext = common.get_extension(json_name) index_file = os.path.join(repodir, json_name) jar_file = os.path.join(repodir, name + '.jar') with zipfile.ZipFile(jar_file, 'w', zipfile.ZIP_DEFLATED) as jar: jar.write(index_file, json_name) sign_jar(jar_file) def status_update_json(signed): """Output a JSON file with metadata about this run.""" logging.debug(_('Outputting JSON')) output = common.setup_status_output(start_timestamp) if signed: output['signed'] = signed common.write_status_json(output) def main(): global config, options parser = ArgumentParser() common.setup_global_opts(parser) options = parser.parse_args() config = common.read_config(options) if 'jarsigner' not in config: raise FDroidException( _( 'Java jarsigner not found! Install in standard location or set java_paths!' ) ) repodirs = ['repo'] if config['archive_older'] != 0: repodirs.append('archive') signed = [] for output_dir in repodirs: if not os.path.isdir(output_dir): raise FDroidException("Missing output directory '" + output_dir + "'") unsigned = os.path.join(output_dir, 'index_unsigned.jar') if os.path.exists(unsigned): sign_jar(unsigned) index_jar = os.path.join(output_dir, 'index.jar') os.rename(unsigned, index_jar) logging.info('Signed index in ' + output_dir) signed.append(index_jar) json_name = 'index-v1.json' index_file = os.path.join(output_dir, json_name) if os.path.exists(index_file): sign_index_v1(output_dir, json_name) os.remove(index_file) logging.info('Signed ' + index_file) signed.append(index_file) if not signed: logging.info(_("Nothing to do")) status_update_json(signed) if __name__ == "__main__": main() fdroidserver-2.1/fdroidserver/stats.py0000644000175000017500000002553514203004041020162 0ustar hanshans00000000000000#!/usr/bin/env python3 # # stats.py - part of the FDroid server tools # Copyright (C) 2010-13, Ciaran Gultnieks, ciaran@ciarang.com # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU Affero General Public License as published by # the Free Software Foundation, either version 3 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Affero General Public License for more details. # # You should have received a copy of the GNU Affero General Public License # along with this program. If not, see . import sys import os import re import time import traceback import glob import json from argparse import ArgumentParser import paramiko import socket import logging import subprocess from collections import Counter from . import _ from . import common from . import metadata def carbon_send(key, value): s = socket.socket() s.connect((config['carbon_host'], config['carbon_port'])) msg = '%s %d %d\n' % (key, value, int(time.time())) s.sendall(msg) s.close() options = None config = None def most_common_stable(counts): pairs = [] for s in counts: pairs.append((s, counts[s])) return sorted(pairs, key=lambda t: (-t[1], t[0])) def main(): global options, config # Parse command line... parser = ArgumentParser() common.setup_global_opts(parser) parser.add_argument("-d", "--download", action="store_true", default=False, help=_("Download logs we don't have")) parser.add_argument("--recalc", action="store_true", default=False, help=_("Recalculate aggregate stats - use when changes " "have been made that would invalidate old cached data.")) parser.add_argument("--nologs", action="store_true", default=False, help=_("Don't do anything logs-related")) metadata.add_metadata_arguments(parser) options = parser.parse_args() metadata.warnings_action = options.W config = common.read_config(options) if not config['update_stats']: logging.info("Stats are disabled - set \"update_stats = True\" in your config.yml") sys.exit(1) # Get all metadata-defined apps... allmetaapps = [app for app in metadata.read_metadata().values()] metaapps = [app for app in allmetaapps if not app.Disabled] statsdir = 'stats' logsdir = os.path.join(statsdir, 'logs') datadir = os.path.join(statsdir, 'data') if not os.path.exists(statsdir): os.mkdir(statsdir) if not os.path.exists(logsdir): os.mkdir(logsdir) if not os.path.exists(datadir): os.mkdir(datadir) if options.download: # Get any access logs we don't have... ssh = None ftp = None try: logging.info('Retrieving logs') ssh = paramiko.SSHClient() ssh.load_system_host_keys() ssh.connect(config['stats_server'], username=config['stats_user'], timeout=10, key_filename=config['webserver_keyfile']) ftp = ssh.open_sftp() ftp.get_channel().settimeout(60) logging.info("...connected") ftp.chdir('logs') files = ftp.listdir() for f in files: if f.startswith('access-') and f.endswith('.log.gz'): destpath = os.path.join(logsdir, f) destsize = ftp.stat(f).st_size if not os.path.exists(destpath) \ or os.path.getsize(destpath) != destsize: logging.debug("...retrieving " + f) ftp.get(f, destpath) except Exception: traceback.print_exc() sys.exit(1) finally: # Disconnect if ftp is not None: ftp.close() if ssh is not None: ssh.close() knownapks = common.KnownApks() unknownapks = [] if not options.nologs: # Process logs logging.info('Processing logs...') appscount = Counter() appsvercount = Counter() logexpr = r'(?P[.:0-9a-fA-F]+) - - \[(?P