arriero-0.7~20161228/000077500000000000000000000000001306715713600140605ustar00rootroot00000000000000arriero-0.7~20161228/.coveragerc000066400000000000000000000002461306715713600162030ustar00rootroot00000000000000; vim: ft=dosini [paths] source = setup.py arriero/ tests/ [run] branch = True source = setup.py arriero/ tests/ omit = arriero/cementery.py arriero-0.7~20161228/.gitignore000066400000000000000000000000771306715713600160540ustar00rootroot00000000000000/build/ /dist/ *.egg-info/ *.egg-info/** *.pyc *.swp .coverage arriero-0.7~20161228/AUTHORS000066400000000000000000000001131306715713600151230ustar00rootroot00000000000000Maximiliano Curia Margarita Manterola arriero-0.7~20161228/COPYING000066400000000000000000000432541306715713600151230ustar00rootroot00000000000000 GNU GENERAL PUBLIC LICENSE Version 2, June 1991 Copyright (C) 1989, 1991 Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed. Preamble The licenses for most software are designed to take away your freedom to share and change it. By contrast, the GNU General Public License is intended to guarantee your freedom to share and change free software--to make sure the software is free for all its users. This General Public License applies to most of the Free Software Foundation's software and to any other program whose authors commit to using it. (Some other Free Software Foundation software is covered by the GNU Lesser General Public License instead.) You can apply it to your programs, too. When we speak of free software, we are referring to freedom, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for this service if you wish), that you receive source code or can get it if you want it, that you can change the software or use pieces of it in new free programs; and that you know you can do these things. To protect your rights, we need to make restrictions that forbid anyone to deny you these rights or to ask you to surrender the rights. These restrictions translate to certain responsibilities for you if you distribute copies of the software, or if you modify it. For example, if you distribute copies of such a program, whether gratis or for a fee, you must give the recipients all the rights that you have. You must make sure that they, too, receive or can get the source code. And you must show them these terms so they know their rights. We protect your rights with two steps: (1) copyright the software, and (2) offer you this license which gives you legal permission to copy, distribute and/or modify the software. Also, for each author's protection and ours, we want to make certain that everyone understands that there is no warranty for this free software. If the software is modified by someone else and passed on, we want its recipients to know that what they have is not the original, so that any problems introduced by others will not reflect on the original authors' reputations. Finally, any free program is threatened constantly by software patents. We wish to avoid the danger that redistributors of a free program will individually obtain patent licenses, in effect making the program proprietary. To prevent this, we have made it clear that any patent must be licensed for everyone's free use or not licensed at all. 
The precise terms and conditions for copying, distribution and modification follow. GNU GENERAL PUBLIC LICENSE TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION 0. This License applies to any program or other work which contains a notice placed by the copyright holder saying it may be distributed under the terms of this General Public License. The "Program", below, refers to any such program or work, and a "work based on the Program" means either the Program or any derivative work under copyright law: that is to say, a work containing the Program or a portion of it, either verbatim or with modifications and/or translated into another language. (Hereinafter, translation is included without limitation in the term "modification".) Each licensee is addressed as "you". Activities other than copying, distribution and modification are not covered by this License; they are outside its scope. The act of running the Program is not restricted, and the output from the Program is covered only if its contents constitute a work based on the Program (independent of having been made by running the Program). Whether that is true depends on what the Program does. 1. You may copy and distribute verbatim copies of the Program's source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice and disclaimer of warranty; keep intact all the notices that refer to this License and to the absence of any warranty; and give any other recipients of the Program a copy of this License along with the Program. You may charge a fee for the physical act of transferring a copy, and you may at your option offer warranty protection in exchange for a fee. 2. You may modify your copy or copies of the Program or any portion of it, thus forming a work based on the Program, and copy and distribute such modifications or work under the terms of Section 1 above, provided that you also meet all of these conditions: a) You must cause the modified files to carry prominent notices stating that you changed the files and the date of any change. b) You must cause any work that you distribute or publish, that in whole or in part contains or is derived from the Program or any part thereof, to be licensed as a whole at no charge to all third parties under the terms of this License. c) If the modified program normally reads commands interactively when run, you must cause it, when started running for such interactive use in the most ordinary way, to print or display an announcement including an appropriate copyright notice and a notice that there is no warranty (or else, saying that you provide a warranty) and that users may redistribute the program under these conditions, and telling the user how to view a copy of this License. (Exception: if the Program itself is interactive but does not normally print such an announcement, your work based on the Program is not required to print an announcement.) These requirements apply to the modified work as a whole. If identifiable sections of that work are not derived from the Program, and can be reasonably considered independent and separate works in themselves, then this License, and its terms, do not apply to those sections when you distribute them as separate works. 
But when you distribute the same sections as part of a whole which is a work based on the Program, the distribution of the whole must be on the terms of this License, whose permissions for other licensees extend to the entire whole, and thus to each and every part regardless of who wrote it. Thus, it is not the intent of this section to claim rights or contest your rights to work written entirely by you; rather, the intent is to exercise the right to control the distribution of derivative or collective works based on the Program. In addition, mere aggregation of another work not based on the Program with the Program (or with a work based on the Program) on a volume of a storage or distribution medium does not bring the other work under the scope of this License. 3. You may copy and distribute the Program (or a work based on it, under Section 2) in object code or executable form under the terms of Sections 1 and 2 above provided that you also do one of the following: a) Accompany it with the complete corresponding machine-readable source code, which must be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange; or, b) Accompany it with a written offer, valid for at least three years, to give any third party, for a charge no more than your cost of physically performing source distribution, a complete machine-readable copy of the corresponding source code, to be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange; or, c) Accompany it with the information you received as to the offer to distribute corresponding source code. (This alternative is allowed only for noncommercial distribution and only if you received the program in object code or executable form with such an offer, in accord with Subsection b above.) The source code for a work means the preferred form of the work for making modifications to it. For an executable work, complete source code means all the source code for all modules it contains, plus any associated interface definition files, plus the scripts used to control compilation and installation of the executable. However, as a special exception, the source code distributed need not include anything that is normally distributed (in either source or binary form) with the major components (compiler, kernel, and so on) of the operating system on which the executable runs, unless that component itself accompanies the executable. If distribution of executable or object code is made by offering access to copy from a designated place, then offering equivalent access to copy the source code from the same place counts as distribution of the source code, even though third parties are not compelled to copy the source along with the object code. 4. You may not copy, modify, sublicense, or distribute the Program except as expressly provided under this License. Any attempt otherwise to copy, modify, sublicense or distribute the Program is void, and will automatically terminate your rights under this License. However, parties who have received copies, or rights, from you under this License will not have their licenses terminated so long as such parties remain in full compliance. 5. You are not required to accept this License, since you have not signed it. However, nothing else grants you permission to modify or distribute the Program or its derivative works. These actions are prohibited by law if you do not accept this License. 
Therefore, by modifying or distributing the Program (or any work based on the Program), you indicate your acceptance of this License to do so, and all its terms and conditions for copying, distributing or modifying the Program or works based on it. 6. Each time you redistribute the Program (or any work based on the Program), the recipient automatically receives a license from the original licensor to copy, distribute or modify the Program subject to these terms and conditions. You may not impose any further restrictions on the recipients' exercise of the rights granted herein. You are not responsible for enforcing compliance by third parties to this License. 7. If, as a consequence of a court judgment or allegation of patent infringement or for any other reason (not limited to patent issues), conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. If you cannot distribute so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not distribute the Program at all. For example, if a patent license would not permit royalty-free redistribution of the Program by all those who receive copies directly or indirectly through you, then the only way you could satisfy both it and this License would be to refrain entirely from distribution of the Program. If any portion of this section is held invalid or unenforceable under any particular circumstance, the balance of the section is intended to apply and the section as a whole is intended to apply in other circumstances. It is not the purpose of this section to induce you to infringe any patents or other property right claims or to contest validity of any such claims; this section has the sole purpose of protecting the integrity of the free software distribution system, which is implemented by public license practices. Many people have made generous contributions to the wide range of software distributed through that system in reliance on consistent application of that system; it is up to the author/donor to decide if he or she is willing to distribute software through any other system and a licensee cannot impose that choice. This section is intended to make thoroughly clear what is believed to be a consequence of the rest of this License. 8. If the distribution and/or use of the Program is restricted in certain countries either by patents or by copyrighted interfaces, the original copyright holder who places the Program under this License may add an explicit geographical distribution limitation excluding those countries, so that distribution is permitted only in or among countries not thus excluded. In such case, this License incorporates the limitation as if written in the body of this License. 9. The Free Software Foundation may publish revised and/or new versions of the General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns. Each version is given a distinguishing version number. If the Program specifies a version number of this License which applies to it and "any later version", you have the option of following the terms and conditions either of that version or of any later version published by the Free Software Foundation. 
If the Program does not specify a version number of this License, you may choose any version ever published by the Free Software Foundation. 10. If you wish to incorporate parts of the Program into other free programs whose distribution conditions are different, write to the author to ask for permission. For software which is copyrighted by the Free Software Foundation, write to the Free Software Foundation; we sometimes make exceptions for this. Our decision will be guided by the two goals of preserving the free status of all derivatives of our free software and of promoting the sharing and reuse of software generally. NO WARRANTY 11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION. 12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. END OF TERMS AND CONDITIONS How to Apply These Terms to Your New Programs If you develop a new program, and you want it to be of the greatest possible use to the public, the best way to achieve this is to make it free software which everyone can redistribute and change under these terms. To do so, attach the following notices to the program. It is safest to attach them to the start of each source file to most effectively convey the exclusion of warranty; and each file should have at least the "copyright" line and a pointer to where the full notice is found. Copyright (C) This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program; if not, write to the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. Also add information on how to contact you by electronic and paper mail. If the program is interactive, make it output a short notice like this when it starts in an interactive mode: Gnomovision version 69, Copyright (C) year name of author Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'. This is free software, and you are welcome to redistribute it under certain conditions; type `show c' for details. 
The hypothetical commands `show w' and `show c' should show the appropriate parts of the General Public License. Of course, the commands you use may be called something other than `show w' and `show c'; they could even be mouse-clicks or menu items--whatever suits your program. You should also get your employer (if you work as a programmer) or your school, if any, to sign a "copyright disclaimer" for the program, if necessary. Here is a sample; alter the names: Yoyodyne, Inc., hereby disclaims all copyright interest in the program `Gnomovision' (which makes passes at compilers) written by James Hacker. , 1 April 1989 Ty Coon, President of Vice This General Public License does not permit incorporating your program into proprietary programs. If your program is a subroutine library, you may consider it more useful to permit linking proprietary applications with the library. If this is what you want to do, use the GNU Lesser General Public License instead of this License. arriero-0.7~20161228/Changelog000066400000000000000000000036261306715713600157010ustar00rootroot00000000000000Version 0.7~20161228: * Migrate to python 3 * Use the debian module whenever possible * Use the dependencies declared in the packages, no need to declare them in the configuration file. * Update setup.py * Run commands with pexpect * Add --download-version option to update Version 0.6.1: * Add source-only build option (-S) * Add dsc_file and source_changes_file field (for exec and list) * Add binary-only build option (-B) * Add arch-all/no-arch-all build option * Add update-symbols action * Add dist and package as field aliases * Select builder in a cli option * Add fields last_changelog_distribution last_changelog_version and version_at_distribution * Start external-uploads actions. * Add new script upstream_diff.sh * Add new script wrap_and_sort.sh * Add logdir option * Add (sbuild) hook list-missing to list not installed files * Add update_vcs_browser.sh script * Avoid calling create chroot when update fails * Add has_symbols field * Use the new gbp command names. * Handle debian/copyright excluded-files uscan mangle * Add a meta script for the control file * Add --run-autopkgtest={true|false} build option. * Add InternalDependencies action * Add a generic meta script (update.sh) * Use the current version by default in fetch_upstream * Push the upstream tags if the upstream branch is pushed Version 0.6: * Make filter-orig a pattern list. * Allow the user to specify extra git-buildpackage options when building (-o). * Add feature to uscan class that allows to download arbitrary versions. * Ignore upstream releases when they have no changes. * Use downloaded tarballs, if present. * Create debian branch on pull, if missing. * Use current directory if no package name is received. * Several bugfixes regarding handling of configuration file * Added 'urgency' property for changelog, 'source_name' for package * Improve status output Version 0.5: * Initial release. arriero-0.7~20161228/TODO000066400000000000000000000066761306715713600145670ustar00rootroot00000000000000This list doesn't pretend to be complete, it's sort of a roadmap, or a pensieve for some bothering issues. Fetch upstream and friends: - Fix Uscan uversion and version. upstream-version, latest upstream version debian-uversion, upstream version of current package debian-mangled-version, after removing the dfsg part - Fix Uscan, no tarball found - Fix Uscan, download current version - Fix Uscan, dfsg rule If there is not a new version the output is confusing. 
- Add support of multiple upstream tarballs: http://raphaelhertzog.com/2010/09/07/how-to-use-multiple-upstream-tarballs-in-debian-source-packages/ The support was added to uscan. Some packages don't have a watch file, and some might need a feature not present in uscan (xawtv, for example), or upstream might not have any releases (cinnamon-themes), another example is kde-l10n, which uses a multiwatch file, sort of supported by uscan and arriero. Gbp configuration: - Per package configuration: + Read gbp config to know: upstream-branch, debian-branch - Use the gbp configuration, when possible Package cloning: - Add the branches to sync in .git/config (in clone) +++ IMPORTANT +++ - Download the current version instead of new version (if needed). - Add autoconfigure to clone: + if there is an origin/upstream branch, upstream is pushed + if there is a origin/pristine-tar branch add pristine-tar + configure this in the .git/gbp.conf if not configured in the debian/gbp.conf - Optionally do not ignore out of scope packages - Check packages bugs - If there's no basedir anywhere, we should use the current dir - Add a configuration for a rule that generates the dfsg version + arriero users prune-nonfree (it should be configurable) + uscan supports copyright format 1.0 exclude-files (need to test this with arriero) + gbp uses filters (filter-orig in arriero) - Add a logcheck like functionality - Obtain a set of packages for a given module version - Make sure to apply the packaging changes (commits) always in the "unmerged" branch. - Make the "unmerged" branch to work with command line configurable, allow aliases (codename/releases) and make it configurable per package - Export build dir to work outside the git repo - Allow an upstream remote - Add a Log class, to handle lintian style reporting - Move the Graph usage to it's own class - Clone needs to fetch the upstream branch - Verify that the command used to clone exists - Allow to choose between pbuilder/cowbuilder/qemubuilder/debuild/sbuild - Document the default setup requirements - Parse build log and extract errors. - Clean up after overlay Addons ------ * Allow having extra files that define extra actions. These files should list the actions that they define, with a help string. Arriero would then read from the files according to the configuration. * Also add the possibility of running external scripts that receive some specified environment. Distro branches --------------- * Allow having different branches for different distributions (stable, unstable, experimental). * Could be achieved by having a file that defines different profiles (kde4.11, kde4.10, etc), with each profile including distro, branch, and maybe other stuff. * Unresolved: how to associate each package with the available profiles. Status ------ * Add a status that indicates if the package needs to be uploaded or not (if it's not UNRELEASED), using rmadison: e.g. rmadison -a amd64 -s unstable cinnamon -u debian arriero-0.7~20161228/arriero.1000066400000000000000000000152331306715713600156110ustar00rootroot00000000000000.TH "arriero" 1 "2014 Mar 11" "Debian" "arriero" .SH NAME arriero \- simplifies management of several Debian packages .SH SYNOPSIS .BI "arriero [" "--config FILE" "] [" --verbose "] [" --quiet "] command [" "options" "] [" "package names" "]" .SH "DESCRIPTION" Arriero is a tool that allows simplifying the management of Debian packages, particularly useful when having to make new upstream releases, builds and uploads of similar packages. . 
It relies heavily on the use of
.B git-buildpackage
and general
.B git
practices, so it's only useful for packages currently maintained through git.
.SH "GENERAL OPTIONS"
.TP
.BR -c , "--config " \fIFILE\fR
Specifies the location of the config file to use.
The config file holds all information related to packages.
It's recommended to have different config files in order to work with
different groups of packages.
If not specified, the default config file
.I ~/.config/arriero.conf
is read.
.TP
.BR -v , " --verbose "
Show info and debug messages.
By default, only warnings and errors are shown.
.TP
.BR -q , " --quiet "
Only show critical errors.
If both \fBquiet\fR and \fBverbose\fR are specified, \fBverbose\fR is honored.
.TP
.BR -a , " --all "
Work with all packages.
When this option is not specified, package names need to be specified
following the command, separated by spaces.
.SH "COMMANDS"
The main action that arriero performs is determined by the command it
receives.
Each command may have its own specific options that modify its behavior.
.SS
.B "build"
Build each package in a pbuilder.
This will call
.B git-pbuilder
which will read local configurations from
.IR /etc/pbuilderrc " and " ~/.pbuilderrc .
.TP
.BR -D , " --distribution " , " --dist " \fIdist-name\fR
Build the package for the specified distribution.
.TP
.BR -A , " --architecture " , " --arch " \fIarch-name\fR
Build the package for the specified architecture.
.TP
.BR -U , " --local-upload "
After a successful build finishes, the package is uploaded with the
\fIupload-command\fR, using \fIlocal\fR as the host to upload to.
.SS
.B "clone"
Obtain the repository for each package.
This command can receive either a list of package names or a git URL to
clone from.
When specifying a URL, it will create a new entry in the configuration file;
when specifying a package name, it needs to already be present in the
configuration.
.TP
.BR --basedir
The base directory in which to create the clone.
After making the clone successfully, the package will be located in
\fIbasedir/package_name\fR.
.TP
.BR --upstream-branch
The branch where the upstream code is located.
.TP
.BR --debian-branch
The branch where the Debian code is located.
When performing a clone from a URL, if the branches are not manually
specified, arriero will try to guess their names, and store the guessed
names in the configuration file.
.SS
.B "exec"
Execute one or more scripts for each package.
The scripts invoked will receive the properties of the packages as
environment variables, and will be executed inside the package directory.
.TP
.BR -x , " --script " \fIscript_name\fR
The name of the script to be executed.
This option can be present multiple times.
In that case, each script will be called in the same order as presented on
the command line.
If one of the scripts fails for a certain package, the following ones will
not be executed for that package.
.SS
.B "fetch-upstream"
Fetch the current upstream tarball for each package.
.SS
.B "list"
List packages matching some criteria, with a specific format.
This command allows specifying the format with which each package is
displayed.
.TP
.BR -f , " --fields " \fIfield_list\fR
Fields to include when generating the list.
The list of fields should be comma separated.
The fields available are:
.RS
.TP
basedir
.TP
branch
.TP
build_file
.TP
changes_file
.TP
debian_branch
.TP
depends
.TP
distribution
.TP
export_dir
.TP
is_dfsg
.TP
is_native
.TP
is_merged
.TP
name
.TP
path
.TP
pristine_tar_branch
.TP
tarball_dir
.TP
upstream_branch
.TP
upstream_version
.TP
vcs_git
.TP
version
.RE
.TP
.BR -F , " --format " \fIfield_format\fR
The format to use may include fields by name or order, as specified in the
\fI--fields\fR parameter.
.TP
.BR -e , " --include-empty-results"
By default, results where nothing would be listed are skipped.
If this option is specified, they are shown even when there is no string to
show.
.SS
.B "overlay"
Combine upstream and debian branches into either the original debian branch
or a new branch.
This command is intended to be used when the debian branch doesn't include
the upstream code and the user needs to have them together in order to work
on the package (for example, to create a quilt package).
.B Important:
this command does not handle cleaning up the branch after the work is done.
This has to be done manually by the user.
.TP
.BR -b , " --branch " \fIbranch-name\fR
The name of the new branch to create with the overlay.
If specified and the branch already exists, the command will fail without
modifying anything.
If not specified, the debian branch for the package will be used.
.SS
.B "pull"
Obtain any new changes from the packages' repositories.
.SS
.B "push"
Push local changes to the packages' repositories.
.SS
.B "release"
Change the distribution in the changelog, committing the change to the local
git.
This command only has effect if the distribution in the changelog is either
UNRELEASED or different from the one passed here.
.TP
.BR -D , "--distribution " \fIdistribution-name\fR
The distribution to make the release to.
.TP
.BR -P , --pre-release
If this option is received, the release will contain a ~ after the debian
version.
The number after the ~ will get incremented each time the release command is
called.
This allows maintainers to keep track of internal tests until it's time to
actually release the package.
If this option is not passed, but the version in the changelog was already a
pre-release (i.e. it contained a ~), then it is modified to be a final
release (without ~).
.SS
.B "status"
Show the status of each package.
This command checks both the repository state (by using git to query any
local/remote changes) and the upstream state (by using uscan).
.SS
.B "update"
Get the new upstream release for each package.
This command not only downloads the new upstream tarball, but also updates
the debian/changelog with a new entry for the new release, with distribution
set to UNRELEASED.
.SS
.B "upload"
Upload each package.
This command uses the upload-command set in the config file to upload each
built package (packages that have not been built are ignored).
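.SH "EXAMPLES"
The following invocations sketch a typical workflow; the package name
.I foo
and the configuration file path are placeholders, not names shipped with
arriero.
.nf
# check the repository and upstream state of one package
arriero \-c ~/.config/arriero.conf status foo

# fetch a new upstream release and update debian/changelog
arriero update foo

# build for unstable and upload to the "local" host if the build succeeds
arriero build \-D unstable \-U foo

# list the name and version of every configured package, tab separated
arriero list \-a \-f name,version
.fi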
\" .SH "EXAMPLES" \" TODO .SH "AUTHORS" Maximiliano Curia , Margarita Manterola arriero-0.7~20161228/arriero/000077500000000000000000000000001306715713600155235ustar00rootroot00000000000000arriero-0.7~20161228/arriero/__init__.py000066400000000000000000000001241306715713600176310ustar00rootroot00000000000000#!/usr/bin/env python3 # -*- coding: utf-8 -*- from .arriero import Arriero, main arriero-0.7~20161228/arriero/actions.py000066400000000000000000001215111306715713600175360ustar00rootroot00000000000000# -*- coding: utf8 -*- # Copyright: 2013-2015, Maximiliano Curia # # This program is free software; you can redistribute it and/or modify it # under the terms of the GNU General Public License as published by the Free # Software Foundation; either version 2 of the License, or (at your option) # any later version. # # This program is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or # FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for # more details. # # You should have received a copy of the GNU General Public License along with # this program; if not, write to the Free Software Foundation, Inc., 51 # Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. import argparse import collections import glob import itertools import logging import os import re import subprocess import tempfile import debian.deb822 as deb822 # Own imports from . import util from .errors import ArrieroError, ActionError from .package import ERROR, IGNORE, OK from .version import Version class Action(object): '''Abstract class that all actions should subclass.''' def __init__(self, arriero): '''Basic constructor for the class that calls the necessary functions. It follows the Template Pattern. ''' self._arriero = arriero self._process_options() @property def config(self): return self._arriero.config def _get_packages(self, names): '''Expands the groups in names, returning the package names.''' packages = util.OrderedSet() groups = self.config.groups for name in names: if name in groups: group_packages = self.config.get_packages(name) packages.extend(group_packages) else: packages.add(name) return packages @classmethod def add_options(cls, arriero, parser): '''Add arguments/config options''' def _process_options(self): '''Process the options after they were parsed.''' def run(self): '''Execute this action's main goal. 
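        Concrete actions override this; ActionPackages, for example, iterates
        over the selected packages and calls _package_action on each one.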
Returns True if successful.''' def print_status(self): '''Print a status report of the run.''' def actionpackage_print_status(self): log_files = [ util.AttrDict( level=ERROR, filename='error', f=None, log=lambda p: logging.error('%s: failed', p)), util.AttrDict( level=IGNORE, filename='ignore', f=None, log=lambda p: logging.info('%s: ignored', p)), util.AttrDict( level=OK, filename='success', f=None, log=lambda p: None), ] logdir = self._arriero.logdir for item in log_files: if logdir: item.f = open(os.path.join(logdir, item.filename), 'w') for item in log_files: for package in self._results[item.level]: if item.f: item.f.write('{}\n'.format(package)) item.log(package) if item.f: item.f.close() # Methods related to parsing arguments @classmethod def _add_option_all(cls, arriero, parser): parser.add_argument('-a', '--all', action='store_true') # names @classmethod def _add_option_names(cls, arriero, parser, strict=True): if strict: names_choices = set(['']) | set(arriero.config.list_all()) else: names_choices = None parser.add_argument('names', nargs='*', default='', choices=names_choices) def _process_option_names(self): if self.config.get('all'): self.names = self.config.packages return self.names = self.config.get('names') if not self.names: # Try to find out if we are standing on a current package cwd = os.getcwd() for pkg_name in self.config.packages: pkg = self._arriero.get_package(pkg_name) if pkg.path == cwd: self.names = [pkg_name] return raise argparse.ArgumentError(None, 'No package names received.') @classmethod def _add_boolean_argument(cls, parser, name, default=False, dest=None): '''Helper to add options --name --no-name and set the default''' if dest is None: dest = name.replace('-', '_') parser.add_argument('--{}'.format(name), action='store_true', dest=dest) parser.add_argument('--no-{}'.format(name), action='store_false', dest=dest) parser.set_defaults(**{dest: default}) @classmethod def _add_option_ignore_branch(cls, arriero, parser): cls._add_boolean_argument(parser, 'ignore-branch', default=None) class ActionPackages(Action): def _get_name(self): return self.__class__.__name__ @classmethod def add_options(cls, arriero, parser): super(ActionPackages, cls).add_options(arriero, parser) cls._add_option_all(arriero, parser) cls._add_option_names(arriero, parser, strict=True) cls._add_option_ignore_branch(arriero, parser) def _process_options(self): self._process_option_names() def run(self): '''Method for iterating packages and applying a function.''' packages = self._get_packages(self.names) self._results = collections.defaultdict(set) for package_name in packages: self._arriero.switch_log(to=package_name) logging.info('%s: executing %s action.', package_name, self._get_name()) package = self._arriero.get_package(package_name) try: status = self._package_action(package) except ArrieroError as e: status = ERROR self._arriero.switch_log() logging.error('%s: action %s FAILED with %s', package_name, self._get_name(), e) else: if status == ERROR: self._arriero.switch_log() logging.error('%s: action %s FAILED', package_name, self._get_name()) self._results[status].add(package.name) self._arriero.switch_log() if self._results[ERROR]: return False return True print_status = Action.actionpackage_print_status def _package_action(self, package): '''To override.''' class ActionFields(ActionPackages): '''Common class for List and Exec''' available_fields = set([ 'basedir', 'branch', 'build_file', 'changes_file', 'debian_branch', 'depends', 'dist', 'distribution', 'dsc_file', 
'export_dir', 'has_symbols', 'i', 'is_dfsg', 'is_native', 'is_merged', 'last_changelog_distribution', 'last_changelog_version', 'name', 'package', 'path', 'source_name', 'source_changes_file', 'tarball_dir', 'upstream_branch', 'upstream_version', 'urgency', 'vcs_git', 'version', 'version_at_distribution', ]) available_sorts = set(('raw', 'alpha', 'build')) aliases_fields = { 'dist': 'distribution', 'package': 'name', } @classmethod def add_options(cls, arriero, parser): super(ActionFields, cls).add_options(arriero, parser) parser.add_argument('-f', '--fields', default='name') parser.add_argument('-s', '--sort', choices=cls.available_sorts, default='raw') def _process_options(self, *formats): super(ActionFields, self)._process_options() self.fields = util.split(self.config.get('fields')) self.field_names = {} required_fields = [] for format_string in formats: if not format_string: continue _, f = util.expansions_needed(format_string) required_fields.extend(f) for field_name in itertools.chain(self.fields, required_fields): if field_name not in self.available_fields: raise ActionError( 'action %s: Field %s not known' % ( self._get_name(), field_name ) ) self.field_names[field_name] = None def _get_field(self, package, field): obj = getattr(package, field) if hasattr(obj, '__call__'): result = obj() else: result = obj return (result if isinstance(result, str) else '' if result is None else str(tuple(result)) if ( isinstance(result, (set, deb822.OrderedSet, collections.Set))) else str(result)) def _resolve_aliases(self, field, visited=None): if field not in self.aliases_fields: return field if not visited: visited = set() if field in visited: raise ActionError('action %s: Invalid alias %s' % (self.__class__.__name__, field)) visited.add(field) return self._resolve_aliases(self.aliases_fields[field], visited) def _get_field_values(self, package, known=None): values = [] by_name = {} for field in self.field_names: lookup = self._resolve_aliases(field) if known and field in known: field_value = known[field] else: field_value = self._get_field(package, lookup) by_name[field] = field_value for field in self.fields: values.append(by_name[field]) return values, by_name def _get_packages(self, names): return self._sorted(super(ActionFields, self)._get_packages(names)) def run(self): '''Method for iterating packages and applying a function.''' packages = self._get_packages(self.names) self._results = collections.defaultdict(set) for i, package_name in enumerate(packages): self._arriero.switch_log(to=package_name) logging.info('%s: executing %s action.', package_name, self._get_name()) package = self._arriero.get_package(package_name) try: if hasattr(self, '_package_action_i'): status = self._package_action_i(package, i) else: status = self._package_action(package) except ArrieroError as e: status = ERROR self._arriero.switch_log() logging.error('%s: action %s FAILED with %s', package_name, self._get_name(), e) else: if status == ERROR: self._arriero.switch_log() logging.error('%s: action %s FAILED', package_name, self._get_name()) self._results[status].add(package.name) if self._results[ERROR]: return False return True # Sort methods def raw_sort(self, packages): return packages def alpha_sort(self, packages): return sorted(packages) def build_sort(self, packages): return self._arriero.sort_by_depends(packages) def _sorted(self, packages): method_name = self.config.get('sort') + '_sort' method = getattr(self, method_name) return method(packages) class List(ActionFields): '''Query the available packages 
with formatting.''' @classmethod def add_options(cls, arriero, parser): super(List, cls).add_options(arriero, parser) parser.add_argument('-F', '--format', default=None, dest='output_format') parser.add_argument('-e', '--include-empty-results', action='store_true') def _process_options(self): self.output_format = self.config.get('output_format', raw=True) super(List, self)._process_options(self.output_format) self.include_empty_results = self.config.get('include_empty_results') def _is_empty(self, iterable): '''Returns True if the iterables has all empty elements.''' if not iterable: return True for value in iterable: if value: return False return True def _package_action_i(self, package, i): values, by_name = self._get_field_values(package, {'i': str(i)}) if self.include_empty_results or not self._is_empty(values): if self.output_format: print (self.output_format.format(*values, **by_name)) else: print ('\t'.join(values)) return OK class Exec(ActionFields): '''Run a command on each package.''' @classmethod def add_options(cls, arriero, parser): super(Exec, cls).add_options(arriero, parser) parser.add_argument('-x', '--script', action='append') parser.add_argument('--no-env', action='store_false', dest='env') parser.add_argument('--no-chdir', action='store_false', dest='chdir') def _process_options(self): self._scripts = self.config.get('script', raw=True) super(Exec, self)._process_options(*self._scripts) def _package_action_i(self, package, i): status = OK kwargs = {'interactive': True, 'shell': True} if self.config.get('chdir'): kwargs['cwd'] = package.path values, by_name = self._get_field_values(package, {'i': str(i)}) if self.config.get('env'): kwargs['env'] = dict(os.environ, **by_name) for script in self._scripts: script_formatted = script.format(*values, **by_name) try: util.log_check_call(script_formatted, **kwargs) except (ArrieroError, subprocess.CalledProcessError) as e: logging.error('%s: %s', package.name, e) status = ERROR break return status class Clone(Action): '''Clone upstream repositories.''' @classmethod def add_options(cls, arriero, parser): cls._add_option_all(arriero, parser) cls._add_option_names(arriero, parser, strict=False) parser.add_argument('--basedir', default=None) parser.add_argument('--upstream-branch', default=None) parser.add_argument('--debian-branch', default=None) parser.add_argument('--vcs-git', default=None) def _process_options(self): self._process_option_names() # Split the names into URLs and packages. 
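        # Anything containing ':' is treated as a git URL (ssh, https, etc.);
        # bare names must already be present in the configuration file.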
self._packages = set() self._urls = set() for name in self.config.get('names'): if ':' in name: self._urls.add(name) else: self._packages.add(self._arriero.get_package(name)) def run(self): self._not_ok = set() for url in self._urls: try: if not self.url_clone(url): self._not_ok.add(url) except Exception as e: logging.error('{}: Failed to clone.'.format(url)) logging.error('{}: {}'.format(url, e)) self._not_ok.add(url) for package in self._packages: self._arriero.switch_log(to=package.name) try: if not self.package_clone(package): self._not_ok.add(package.name) except Exception as e: logging.error('{}: Failed to clone.'.format(package.name)) logging.error('{}: {}'.format(package.name, e)) self._not_ok.add(package.name) # Run status if self._not_ok: return False return True def get_remote_heads(self, url): heads = set() cmd = ['git', 'ls-remote', '--heads', url] p = subprocess.Popen(cmd, universal_newlines=True, stdout=subprocess.PIPE) for line in p.stdout: m = re.search(r'\srefs/heads/(.*)$', line) if m: heads.add(m.group(1)) return heads def guess_branches(self, url): '''Use git ls-remote to check which branches are there.''' heads = self.get_remote_heads(url) upstream_branch = 'upstream' debian_branch = 'master' # Review this: Lucky guess? if 'debian' in heads: debian_branch = 'debian' if 'upstream' not in heads: if 'master' in heads: upstream_branch = 'master' elif 'unstable' in heads: debian_branch = 'unstable' pristine_tar = False if 'pristine-tar' in heads: pristine_tar = True return debian_branch, upstream_branch, pristine_tar def url_clone(self, url): '''Clone a package from the provided url.''' # Check if this URL is already configured for package_name in self.config.packages: package = self._arriero.get_package(package_name) if package.vcs_git == url: logging.warning( 'The URL %s is already configured by package %s.', package.vcs_git, package.name) logging.warning('Switching to cloning from configuration file.') self._packages.add(package) return True # Get basedir for this package basedir = self.config.get('basedir') # Guess destdir for this package name = os.path.basename(url) if name.endswith('.git'): name = name[:-4] destdir = os.path.join(basedir, name) # Guess the branches debian_branch, upstream_branch, pristine_tar = self.guess_branches(url) self.clone(basedir, destdir, url, debian_branch, upstream_branch, pristine_tar) # Obtain package name from control file destdir = os.path.expanduser(destdir) control_filepath = os.path.join(destdir, 'debian', 'control') if not os.path.exists(control_filepath): logging.error('Unable to find debian/control while cloning %s', url) return False control_file = open(control_filepath) # Deb822 will parse just the first paragraph, which is ok. control = deb822.Deb822(control_file) package_name = control['Source'] if not self._arriero.add_new_package(package_name, url, destdir, debian_branch, upstream_branch, pristine_tar): logging.error('Clone successful for package not in configuration. 
' 'You will not be able to use arriero with it.') return False package = self._arriero.get_package(package_name) return package.fetch_upstream() # TODO: this method should probably be in the package and not here def clone(self, basedir, destdir, url, debian_branch, upstream_branch, pristine_tar): '''Verify the directories for the clone, and clone.''' basedir = os.path.expanduser(basedir) destdir = os.path.expanduser(destdir) util.ensure_path(basedir) logging.debug('basedir: %s', basedir) logging.debug('destdir: %s', destdir) if os.path.exists(destdir): logging.error('Cloning %s, directory already exists: %s', url, destdir) return False dirname, basename = os.path.split(destdir) cmd = ['gbp', 'clone'] if debian_branch: cmd.append('--debian-branch=%s' % debian_branch) if upstream_branch: cmd.append('--upstream-branch=%s' % upstream_branch) if pristine_tar: cmd.append('--pristine-tar') else: cmd.append('--no-pristine-tar') cmd.append(url) cmd.append(basename) util.log_check_call(cmd, interactive=True, cwd=dirname) logging.info('Successfully cloned %s', url) return True def package_clone(self, package): """Clone a package that is already in the config file.""" # TODO: shouldn't we check if this returned true or false? self.clone(package.basedir, package.path, package.vcs_git, package.debian_branch, package.upstream_branch, package.pristine_tar) if package.name not in self.config.list_all(): success = self._arriero.add_new_package( package.name, package.vcs_git, package.path, package.debian_branch, package.upstream_branch, package.pristine_tar) if not success: logging.error('Clone successful for package not in configuration. ' 'You will not be able to use arriero with it.') return package.fetch_upstream() class Build(ActionPackages): '''Merge and compile the received packages.''' @classmethod def add_options(cls, arriero, parser): super(Build, cls).add_options(arriero, parser) parser.add_argument('-D', '--distribution', '--dist', default=None, dest='target_distribution') parser.add_argument('-A', '--architecture', '--arch', default=None) parser.add_argument('-U', '--local-upload', action='store_true') parser.add_argument('-S', '--source-only', action='store_true') parser.add_argument('-B', '--binary-only', action='store_true') cls._add_boolean_argument(parser, 'arch-all', default=True) parser.add_argument('--force-orig-source', action='store_true') parser.add_argument('-o', '--builder-options', action='append') parser.add_argument('--builder', default=None) parser.add_argument('--hooks', default=None, dest='builder_hooks') parser.add_argument('--run-autopkgtest', default=None) def _build_package(self, name): package = self._arriero.get_package(name) # TODO: why is build catching the exception? 
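        # Catching here means a failed build is logged and reported while
        # run() keeps building the remaining packages in dependency order.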
try: package.build() except (ArrieroError, subprocess.CalledProcessError) as e: logging.error('%s: Build failed.', package.name) logging.error('%s: %s', package.name, str(e)) return False if package.get('local_upload'): try: package.local_upload() except subprocess.CalledProcessError as e: logging.error('%s: Upload failed.', package.name) logging.error('%s: %s', package.name, str(e)) return False return True def run(self): def _get_dependencies(package): return package.build_depends package_names = self._get_packages(self.names) self._results = collections.defaultdict(set) for name in self._arriero.sort_by_depends( package_names, error=self._results[ERROR], get_dependencies=_get_dependencies): self._arriero.switch_log(to=name) if self._build_package(name): self._results[OK].add(name) continue self._results[ERROR].add(name) self._arriero.switch_log() logging.error('%s: Build FAILED', name) self._results[IGNORE] = ((package_names - self._results[OK]) - self._results[ERROR]) if self._results[ERROR]: return False return True print_status = Action.actionpackage_print_status class Upload(ActionPackages): @classmethod def add_options(cls, arriero, parser): super(Upload, cls).add_options(arriero, parser) parser.add_argument('--host', default='local', dest='upload_host') parser.add_argument('-f', '--force', action='store_true') def _package_action(self, package): try: return package.upload() except Exception as e: logging.error('%s: Failed to upload.', package.name) logging.error('%s: %s', package.name, e) return ERROR class Pull(ActionPackages): def _package_action(self, package): status = OK try: package.pull() logging.info('%s: Successfully pulled.', package.name) except ArrieroError as e: logging.error('%s: Failed to pull.', package.name) logging.error('%s: %s', package.name, e) status = ERROR return status class Push(ActionPackages): @classmethod def add_options(cls, arriero, parser): super().add_options(arriero, parser) cls._add_boolean_argument(parser, 'debian-tags', None) def _package_action(self, package): status = OK if not package.switch_branches(package.debian_branch): status = ERROR else: if not package.push(): status = ERROR return status class Release(ActionPackages): @classmethod def add_options(cls, arriero, parser): super(Release, cls).add_options(arriero, parser) parser.add_argument('-D', '--distribution', default=None) parser.add_argument('-P', '--pre-release', action='store_true') def _package_action(self, package): if not package.switch_branches(package.debian_branch): logging.error('%s: can\'t switch to branch %s.', package.name, package.debian_branch) return ERROR return package.release() class Update(ActionPackages): @classmethod def add_options(cls, arriero, parser): super(Update, cls).add_options(arriero, parser) parser.add_argument('-V', '--download-version', default=None) def _package_action(self, package): status = OK try: package.new_upstream_release() except (ArrieroError, subprocess.CalledProcessError) as e: logging.error('%s: Failed to get new upstream release.', package.name) logging.error('%s: %s', package.name, e) status = ERROR return status class Status(ActionPackages): def _package_action(self, package): print ('\n'.join(package.get_status())) class PrepareOverlay(ActionPackages): @classmethod def add_options(cls, arriero, parser): super(PrepareOverlay, cls).add_options(arriero, parser) parser.add_argument('-b', '--branch', default=None, dest='overlay_branch') def _package_action(self, package): status = OK try: package.prepare_overlay( 
overlay_branch=package.get('overlay_branch')) except Exception as e: logging.error('{}: failure while preparing overlay'.format( package.name)) logging.error('{}: {}'.format(package.name, e)) status = ERROR return status class FetchUpstream(ActionPackages): def _package_action(self, package): status = OK try: package.fetch_upstream() except Exception as e: logging.error('{}: unable to fetch upstream release'.format( package.name)) logging.error('{}: {}'.format(package.name, e)) status = ERROR return status class CheckIfChanged(ActionPackages): def _package_action(self, package): status = OK if package.is_native(): return status current = previous = package.upstream_version for block in package.changelog: version = Version(str(block.version)) if version.upstream_version != current: previous = version.upstream_version break if current == previous: return status changes = package.upstream_changes(old_version=previous) if changes: msg = 'changes since {}'.format(previous) else: msg = '{} = {}'.format(current, previous) print ('{}: {}'.format(package.name, msg)) return status class UpdateSymbols(ActionPackages): @classmethod def add_options(cls, arriero, parser): super(UpdateSymbols, cls).add_options(arriero, parser) parser.add_argument('-D', '--distribution', default=None) parser.add_argument('--from-build-log', action='store_true') parser.add_argument('--from-buildds-logs', action='store_true') @staticmethod def _has_symbols_changes(package, since): release_tag = package.tag_template('debian', version=since) repo = package.repo repo_tag = repo.tags[release_tag] return bool(repo.index.diff(repo_tag.commit, 'debian/*.symbols')) @staticmethod def _symbols_helper(package, version, files): upstream_version, _ = str(version).split('-') cmd = ['pkgkde-symbolshelper', 'batchpatch', '-v', upstream_version] cmd.extend(files) try: util.log_check_call(cmd, cwd=package.path) except subprocess.CalledProcessError: return None # Check index with working dir diffs = package.repo.index.diff(None) return [diff.a_blob.path for diff in diffs] @staticmethod def _process_buildd_logs(package, version): cmd = ['getbuildlog', package.source_name, version] util.log_check_call(cmd, cwd=package.export_dir) filename_glob = '{}_{}_*.log*'.format(package.source_name, version) downloaded_glob = os.path.join(package.export_dir, filename_glob) downloaded_files = glob.glob(downloaded_glob) return UpdateSymbols._symbols_helper(package, version, downloaded_files) @staticmethod def _process_build_log(package): files = [package.build_file] return UpdateSymbols._symbols_helper(package, package.version, files) @staticmethod def _has_missing_symbols(package, changes): missing_re = re.compile(r'^\s*#(DEPRECATED|MISSING)') for filename in changes: fullpath = os.path.join(package.path, filename) f = open(fullpath) for line in f: if missing_re.match(line): return True return False @staticmethod def _get_symbols_mtime(package): return max(os.path.getmtime(f) for f in package.symbols_files()) def _buildds_logs_action(self, package): status = OK # Obtain last distribution changelog_dist = self.config.get('distribution') if not changelog_dist: changelog_dist = package.last_changelog_distribution if not changelog_dist: logging.info( '{}: no previous upload (dist), ignoring.'.format( package.name)) return IGNORE # Obtain last uploaded version version = util.version_at_distribution(package.source_name, changelog_dist) if not version: logging.info( '{}: no previous upload, ignoring.'.format( package.name)) return IGNORE # Check if symbols have 
changed since last upload if self._has_symbols_changes(package, since=version): logging.info( '{}: there were changes since last upload, ignoring.'.format( package.name)) return IGNORE # Obtain the buildds logs and # Update the symbols with the buildds results changes = self._process_buildd_logs(package, str(version)) if not changes: logging.info('{}: no changes needed.'.format(package.name)) return status # Check for missing symbols, let the user fix those if self._has_missing_symbols(package, changes): logging.error('{}: missing symbols'.format(package.name)) return ERROR # Commit the symbols changes package.commit( msg='Update symbols files from buildds logs ({}).'.format(version), files=changes) return status @staticmethod def _build_log_action(package): status = OK # Obtain build log if not package.build_file: logging.info('{}: no build file, ignoring.'.format(package.name)) return IGNORE build_mtime = os.path.getmtime(package.build_file) # Obtain newer mtime of the symbols files symbols_mtime = UpdateSymbols._get_symbols_mtime(package) # Skip if symbols mtime is newer than the build log mtime if symbols_mtime > build_mtime: logging.info('{}: build is previous to the last symbols change' ', ignoring'.format(package.name)) return IGNORE # Process build log changes = UpdateSymbols._process_build_log(package) if not changes: logging.info('{}: no changes needed.'.format(package.name)) return status # Check for missing symbols, let the user fix those if UpdateSymbols._has_missing_symbols(package, changes): logging.error('{}: missing symbols'.format(package.name)) return ERROR # Commit the symbols changes package.commit(msg='Update symbols files.', files=changes) return status def _package_action(self, package): status = OK if package.repo.is_dirty(untracked_files=True): logging.warning( '{}: branch {} has uncommitted changes.'.format( package.name, package.repo.active_branch.name)) return ERROR if not package.has_symbols(): logging.info( '{}: no symbols files, ignoring.'.format(package.name)) return IGNORE buildds = self.config.get('from_buildds_logs') build = self.config.get('from_build_log') if buildds or build: if buildds: status = self._buildds_logs_action(package) if status != ERROR and build: status = self._build_log_action(package) else: if not package.build_file: return self._buildds_logs_action(package) return self._build_log_action(package) class ExternalUploads(ActionPackages): @classmethod def add_options(cls, arriero, parser): super(ExternalUploads, cls).add_options(arriero, parser) cls._add_boolean_argument(parser, 'import', dest='do_import') def _check_external_uploads(self, package): def drop_binnmu_part(version): m = re.match('^(.*)\+b[0-9]+$', version) if m: return m.group(1) return version status = OK archive_versions = util.rmadison(package.source_name) pending = collections.defaultdict(set) for dist in archive_versions: for version in dist: version = drop_binnmu_part(version) pending[version].add(dist) for block in package.changelog: if block.version in pending: pending.pop(block.version) for version, dists in pending.items(): print ('{}: missing version {} available in {}'.format( package.name, version, dists)) if pending: status = ERROR return pending, status def _import_external_uploads_get_import_dsc_cmd(self, package, tmpdir): # gbp import-dsc dsc_file = glob.glob('{}/*.dsc'.format(tmpdir))[0] cmd = ['gbp', 'import-dsc', dsc_file] if package.debian_branch: cmd.append('--debian-branch=%s' % package.debian_branch) if package.upstream_branch: 
cmd.append('--upstream-branch=%s' % package.upstream_branch) if package.pristine_tar: cmd.append('--pristine-tar') if package.filter_orig: cmd.append('--filter-pristine-tar') for pattern in package.filter_orig: cmd.append('--filter=%s' % pattern) else: cmd.append('--no-pristine-tar') return cmd def _import_external_uploads(self, package): pending, status = self._check_external_uploads(package) for version, dists in pending.items(): # Do something if 'new' in dists: logging.info('{}: ignoring version {} in new'.format( package.name, version)) dists.remove('new') status = IGNORE incoming = set() for dist in dists: if dist.startswith('buildd-'): incoming.add(dist) if dists - incoming: # In the archive, using debsnap with tempfile.TemporaryDirectory(prefix=package.name) as tmpdir: logging.warn('{}: gbp import-dsc always merges the ' 'upstream source into the debian branch'.format( package.name)) cmd = ['debsnap', '--destdir={}'.format(tmpdir), '--force', package.source_name, version] util.log_check_call(cmd) cmd = self._import_external_uploads_get_import_dsc_cmd( package, tmpdir) util.log_check_call(cmd) status = OK elif incoming: logging.info('{}: ignoring version {} in incoming'.format( package.name, version)) status = IGNORE return status def _package_action(self, package): status = OK try: package.pull() except Exception: return ERROR if not self.config.get('do_import'): versions, status = self._check_external_uploads(package) else: status = self._import_external_uploads(package) return status class InternalDependencies(ActionFields): def __init__(self, *args, **kwargs): super(InternalDependencies, self).__init__(*args, **kwargs) self.binaries = {} for package_name in self.config.packages: package = self._arriero.get_package(package_name) binaries = package.get_packages() for binary_name in binaries: self.binaries[binary_name] = package_name def _package_action(self, package): internal_dep = package.internal_dependencies(self.binaries) internal_dep = { key: {'due_to': value} for (key, value) in internal_dep.items() } for dep, value in internal_dep.items(): value['groups'] = \ sorted(self.config.list_parents(dep)) nicer = sorted(internal_dep.items(), key=lambda x: (x[1]['groups'], x[0])) current_group = None groups = [] for item in nicer: if item[1]['groups'] != current_group: current_group = item[1]['groups'] groups.append([]) groups[-1].append(item[0]) print ('# {:32} : due to {} ({})'.format( item[0], ', '.join(item[1]['due_to']), ' '.join(item[1]['groups']))) print ('; {}'.format( ' '.join(sorted(self.config.list_parents(package.name))) )) print ('[{}]'.format(package.name)) print ('depends: ' + '\n '.join(' '.join(g) for g in groups)) # print # for item in nicer: # print ('{}'.format(item[0])) # Dictionary containing the available actions in this module and some help # about what they are for. 
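# Each entry maps a command-line name to an (Action subclass, help text) pair.
# A hypothetical extra action would follow the same pattern as the classes
# above: subclass ActionPackages, implement _package_action() and register it
# here.  A minimal sketch (not an actual arriero action):
#
#     class Hello(ActionPackages):
#         def _package_action(self, package):
#             print ('{}: hello'.format(package.name))
#             return OK
#
#     AVAILABLE_ACTIONS['hello'] = (Hello, 'Greet the package/s')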
AVAILABLE_ACTIONS = { 'status': (Status, 'Show the status of the package/s'), 'update': (Update, 'Get the new upstream release of the package/s'), 'build': (Build, 'Build the package/s in a pbuilder'), 'list': (List, 'List package/s with a specific format'), 'exec': (Exec, 'Run commands for each package/s'), 'clone': (Clone, 'Obtain the repository for the package/s'), 'upload': (Upload, 'Upload the package/s'), 'pull': (Pull, 'Pull new changes to the package/s repositories'), 'push': (Push, 'Push local changes to the package/s repositories'), 'release': (Release, 'Change the distribution in the changelog'), 'overlay': (PrepareOverlay, 'Combine upstream and debian branches'), 'fetch-upstream': (FetchUpstream, 'Fetch upstream tarball'), 'check-if-changed': (CheckIfChanged, 'Check changes in the upstream tarball'), 'update-symbols': (UpdateSymbols, 'Update symbols files with previous build'), 'external-uploads': (ExternalUploads, 'Break stuff'), 'internal-dependencies': (InternalDependencies, 'List dependencies between packages'), } # vi:expandtab:softtabstop=4:shiftwidth=4:smarttab arriero-0.7~20161228/arriero/arriero.py000077500000000000000000000337531306715713600175560ustar00rootroot00000000000000#!/usr/bin/env python3 # -*- coding: utf8 -*- # Copyright: 2013-2015, Maximiliano Curia # # This program is free software; you can redistribute it and/or modify it # under the terms of the GNU General Public License as published by the Free # Software Foundation; either version 2 of the License, or (at your option) # any later version. # # This program is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or # FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for # more details. # # You should have received a copy of the GNU General Public License along with # this program; if not, write to the Free Software Foundation, Inc., 51 # Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. ''' Arriero is a tool for simplifying maintaining many debian packages. It allows to quickly update, build, push and upload many packages at the same time. ''' # System imports import argparse import logging import os import subprocess import sys # Own imports from . import util from .actions import AVAILABLE_ACTIONS, ActionError from .builder import AVAILABLE_TESTERS, AVAILABLE_BUILDERS from .config import Configuration, Schema from .graph import TSortGraph from .package import Package class ArrieroHandler(logging.FileHandler): def __init__(self, logdir, *args, **kwargs): self.logdir = logdir self.default_target = 'arriero' if not os.path.isdir(self.logdir): os.makedirs(self.logdir) filename = os.path.join(self.logdir, '{}.log'.format(self.default_target)) super(ArrieroHandler, self).__init__(filename, *args, **kwargs) def change_file(self, name=None): if not name: name = self.default_target filename = os.path.join(self.logdir, '{}.log'.format(name)) self.baseFilename = filename if self.stream: self.stream.close() self.stream = None if not self.delay: self.stream = self._open() class Arriero(object): '''Main class to interface with the user. This class handles the CLI, the config values. The individual actions are delegated to the actions module. 
''' defaults = { 'basedir': '~', 'builder': 'sbuild', 'builder_hooks': '', 'builder_options': '', 'config_files': ('/etc/arriero.conf', '~/.config/arriero.conf'), 'cowbuilder_basepath': '/var/cache/pbuilder/base-{dist}-{arch}.cow', 'debian_branch': 'master', 'debian_tag': 'debian/{epochless_version}', 'debian_tags': 'True', 'depends': '', 'export_dir': '{path}/../build-area', 'filter_orig': '', 'force': 'False', 'force_orig_source': 'False', 'ignore_branch': 'False', 'is_merged': 'False', 'path': '{basedir}/{name}', 'pbuilder_basetgz': '/var/cache/pbuilder/{dist}-{arch}-base.tgz', 'pre_release': 'False', 'pristine_tar': 'False', 'run_autopkgtest': 'True', 'run_lintian': 'True', 'run_piuparts': 'False', 'sbuild_chroot': '{dist}-{arch}-sbuild', 'sbuild_chrootpath': '/var/lib/schroot/chroots/sbuild-{dist}-{arch}', 'sbuild_test_chroot': '{sbuild_chroot}', 'source_only': 'False', 'tarball_dir': '{path}/../tarballs', 'target_distribution': '{distribution}', 'tester': 'schroot', 'tester_name': 'adt-{dist}-{arch}', 'upload_command': 'dput -u {upload_host} {changes_file}', 'upload_host': 'local', 'upstream_branch': 'upstream', 'upstream_push': 'True', 'upstream_tag': 'upstream/{upstream_version}', 'upstream_vcs_tag': '', 'vcs_git': 'https://anonscm.debian.org/git/collab-maint/{name}.git', } aliases = { 'arch': 'architecture', 'dist': 'distribution', } schema = { 'basedir': Schema(type='path'), 'build_depends': Schema(type='multivalue'), 'build_file': Schema(type='path'), 'builder_hooks': Schema(type='multivalue'), 'builder_options': Schema(type='multistring'), 'changes_file': Schema(type='path'), 'config_files': Schema(type='multivalue'), 'cowbuilder_basepath': Schema(type='path'), 'depends': Schema(type='multivalue'), 'debian_tag': Schema(type='rawstring'), 'debian_tags': Schema(type='bool'), 'dsc_file': Schema(type='path'), 'export_dir': Schema(type='path'), 'filter_orig': Schema(type='multivalue'), 'force': Schema(type='bool'), 'force_orig_source': Schema(type='bool'), 'has_symbols': Schema(type='bool'), 'i': Schema(type='int'), 'ignore_branch': Schema(type='bool'), 'is_dfsg': Schema(type='bool'), 'is_merged': Schema(type='bool'), 'is_native': Schema(type='bool'), 'packages': Schema(type='multivalue', inherit=False), 'path': Schema(type='path'), 'pbuilder_basetgz': Schema(type='path'), 'pre_release': Schema(type='bool'), 'pristine_tar': Schema(type='bool'), 'run_autopkgtest': Schema(type='bool'), 'run_lintian': Schema(type='bool'), 'run_piuparts': Schema(type='bool'), 'runtime_depends': Schema(type='multivalue'), 'sbuild_chrootpath': Schema(type='path'), 'source_only': Schema(type='bool'), 'source_changes_file': Schema(type='path'), 'tarball_dir': Schema(type='path'), 'tests_depends': Schema(type='multivalue'), 'upstream_push': Schema(type='bool'), 'upstream_tag': Schema(type='rawstring'), 'upstream_vcs_tag': Schema(type='rawstring'), } def __init__(self): self.commands = {} # TODO: dynamically read other files and their actions self.commands.update(AVAILABLE_ACTIONS) self.config = Configuration( defaults=self.defaults, aliases=self.aliases, schema=self.schema, argparse_kwargs=util.chain_map( description=__doc__, formatter_class=argparse.RawTextHelpFormatter, ) ) self.logger = logging.getLogger() self.packages = {} self._architecture = None self._builder = {} self._tester = {} self._log_handler = None self._add_options() read_files = self.config.read_config_files() self.config.update(partial=True) self._process_log_args() for orig_file in self.config.config_files: if orig_file not in read_files: 
logging.warning('Could not parse config file: %s', orig_file) self.config.update(partial=True) self._add_actions_options() # Delayed so it doesn't clash the ones in the actions self.config.add_help() self.config.update() # log related def add_log_handler(self, handler): self._log_handler = handler def switch_log(self, to=None): if self._log_handler: self._log_handler.change_file(name=to) @property def logdir(self): if self._log_handler: return self._log_handler.logdir def _process_log_args(self): self.logger.setLevel(logging.DEBUG) console_handler = logging.StreamHandler() console_formatter = logging.Formatter('[%(levelname)s] %(message)s') console_handler.setFormatter(console_formatter) if self.config.get('verbose'): console_handler.setLevel(logging.DEBUG) elif self.config.get('quiet'): console_handler.setLevel(logging.CRITICAL) else: console_handler.setLevel(logging.WARNING) self.logger.addHandler(console_handler) if self.config.get('logdir'): self._log_handler = ArrieroHandler( self.config.get('logdir')) file_formatter = logging.Formatter( '%(asctime)s [%(levelname)s] %(message)s') self._log_handler.setFormatter(file_formatter) self._log_handler.setLevel(logging.DEBUG) self.logger.addHandler(self._log_handler) # arguments/config related def show_commands_help(self): result = ['\nAvailable commands:'] commands = sorted(AVAILABLE_ACTIONS.items()) for command, (classname, helptext) in commands: result.append(' %-22s%s.' % (command, helptext)) return '\n'.join(result) def _add_options(self): '''Add arguments/config options''' self.config.argparser.epilog = self.show_commands_help() self.config.arg_add('-v', '--verbose', action='store_true', help='Show more information.') self.config.arg_add('-q', '--quiet', action='store_true', help='Show only critical errors.') self.config.arg_add('--logdir', default=None, help='Directory to store verbose logs') def _add_actions_options(self): subparsers = self.config.argparser.add_subparsers( title='command', description='valid action', help='Command to execute.', dest='command', ) for cmd, (cls, _) in self.commands.items(): subparser = subparsers.add_parser(cmd) subparser.set_defaults(cmdcls=cls) cls.add_options(self, subparser) @property def architecture(self): if not self._architecture: self._architecture = subprocess.check_output( ['dpkg-architecture', '-qDEB_BUILD_ARCH'], universal_newlines=True).rstrip('\n') return self._architecture def add_new_package(self, package_name, git_url, path, debian_branch, upstream_branch, pristine_tar): '''Adds a new package to the configuration.''' if package_name in self.config.list_all(): logging.error( 'Package %s definition already in the config file. 
' 'Not adding it.', package_name) return False basedir = self.config.get('basedir') basedir = os.path.expanduser(basedir) if path.startswith(basedir): path = '{basedir}' + path[len(basedir):] self.config.add_section(package_name) # TODO: Need to make this general, deb-src package have no git repo if git_url: self.config.set(package_name, 'vcs_git', git_url) self.config.set(package_name, 'path', path) self.config.set(package_name, 'debian_branch', debian_branch) self.config.set(package_name, 'pristine_tar', str(pristine_tar)) if (upstream_branch): self.config.set(package_name, 'upstream-branch', upstream_branch) self.config.write() return True def call_command(self): '''Execute the command the user requested.''' exit_code = 0 if 'cmdcls' not in self.config.args: self.config.argparser.error('No command specified') action = self.config.args.cmdcls try: instance = action(self) success = instance.run() exit_code = (0 if success is True else 1 if success is False else success) instance.print_status() return exit_code except ActionError as e: logging.critical(e) return 255 def get_package(self, name): '''Returns a Package object.''' if name not in self.packages: self.packages[name] = Package(name, self) return self.packages[name] def sort_by_depends(self, package_names, error=None, binaries=None, get_dependencies=None): def _config_depends(package): return package.depends def _dependencies(package): return util.OrderedSet( package.internal_dependencies( binaries, get_dependencies(package)).keys() ) def _get_inputs(package_name): package = self.get_package(package_name) return get_depends(package) error = set() if error is None else error binaries = (self.prepare_binaries_map(package_names) if binaries is None and get_dependencies is not None else binaries) get_depends = (_config_depends if get_dependencies is None else _dependencies) graph = TSortGraph(package_names, _get_inputs) return graph.sort_generator(skip=error) def _get_builder(self, name, distribution, architecture, known, cache): distribution = distribution.lower() key = (name, distribution, architecture) if key in cache: return cache[key] item = known[name] Class = item[0] _instance = Class(self, distribution, architecture) cache[key] = _instance return _instance def get_tester(self, tester_name, distribution, architecture): '''Get a tester''' return self._get_builder(tester_name, distribution, architecture, AVAILABLE_TESTERS, self._tester) def get_builder(self, builder_name, distribution, architecture): '''Get a builder''' return self._get_builder(builder_name, distribution, architecture, AVAILABLE_BUILDERS, self._builder) def prepare_binaries_map(self, package_names): 'Map each binpkg with corresponding source.' 
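        # A sketch of the expected shape, with hypothetical names: for a
        # source package "libfoo" that builds "libfoo1" and "libfoo-dev",
        #
        #     self.prepare_binaries_map(['libfoo'])
        #
        # returns {'libfoo1': 'libfoo', 'libfoo-dev': 'libfoo'}, the mapping
        # that internal_dependencies() and sort_by_depends() consume.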
binaries = {} for name in package_names: package = self.get_package(name) for binary_name in package.get_packages(): binaries[binary_name] = name return binaries def main(): arriero = Arriero() try: exit_code = arriero.call_command() except argparse.ArgumentError as error: logging.error('Error while parsing arguments: %s', error) exit_code = 100 return exit_code if __name__ == '__main__': sys.exit(main()) # vi:expandtab:softtabstop=4:shiftwidth=4:smarttab arriero-0.7~20161228/arriero/builder.py000066400000000000000000000566061306715713600175400ustar00rootroot00000000000000#!/usr/bin/env python3 # -*- coding: utf8 -*- # Copyright: 2013-2014, Maximiliano Curia # # This program is free software; you can redistribute it and/or modify it # under the terms of the GNU General Public License as published by the Free # Software Foundation; either version 2 of the License, or (at your option) # any later version. # # This program is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or # FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for # more details. # # You should have received a copy of the GNU General Public License along with # this program; if not, write to the Free Software Foundation, Inc., 51 # Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. import getpass import logging import os import re import shutil import subprocess import sys import tempfile from pkg_resources import Requirement, resource_filename from . import util from .errors import TestFailed NO = 0 ON_ERROR = 1 ALWAYS = 2 class Common(object): def __init__(self, arriero, dist, arch): self._arriero = arriero self.dist = dist.lower() self.arch = arch # Update the chroot just once (if we are planning to make this a long # running process, then it might make sense to set a ttl. self._updated = False def update(self, force=False): self._updated = True @property def config(self): return self._arriero.config class Builder(Common): def create(self): pass def build(self, package): pass def ensure_image(self): try: self.update() except Exception: # FIXME: this gets called even if a package fails to be installed # we need to avoid calling create if the builder is already # created # self.create() pass def get_hooks_dir(self): # TODO: use the installed path if installed, is there a way to set a # variable after installing? dirname = resource_filename(Requirement.parse("Arriero"), "hooks") return dirname def _get_shell_enum(self, hooks): shell = NO if 'shell' in hooks: shell = ALWAYS elif 'shell_on_error' in hooks: shell = ON_ERROR return shell class CowBuilder(object): updated = set() def __init__(self, *a, **kw): self.path = None def needs_update(self): if not self.path: return return self.path not in self.updated def mark_updated(self): return self.updated.add(self.path) class GitPBuilder(Builder, CowBuilder): def __init__(self, *args, **kwargs): super(GitPBuilder, self).__init__(*args, **kwargs) self.path = self.config.get('cowbuilder_basepath') def create(self): self._updated = True env = {'ARCH': self.arch, 'DIST': self.dist} if self.dist == 'unreleased': options = [] if 'GIT_PBUILDER_OPTIONS' in os.environ: options.append(os.environ['GIT_PBUILDER_OPTIONS']) # TODO: Fix this options.append('--distribution=unstable') ' '.join(options) env['GIT_PBUILDER_OPTIONS'] = ' '.join(options) logging.warning('Build image for {}-{} not found. 
Creating...'.format( self.dist, self.arch)) cmd = ['git-pbuilder', 'create'] util.log_check_call(cmd, interactive=True, env=dict(os.environ, **env)) def update(self, force=False): if not force and self._updated: return self._updated = True if not force and not self.needs_update(): return self.mark_updated() env = {'ARCH': self.arch, 'DIST': self.dist} cmd = ['git-pbuilder', 'update'] logging.info('Updating build image for {}-{}'.format( self.dist, self.arch)) util.log_check_call(cmd, interactive=True, env=dict(os.environ, **env)) def build(self, package, ignore_branch=False): env = {} cmd, source_only = self._build_cmd(package, ignore_branch) tmp_hooks_dir, shell = self._prepare_hooks(package) if tmp_hooks_dir: gbp_options = [] if 'GIT_PBUILDER_OPTIONS' in os.environ: gbp_options.append(os.environ['GIT_PBUILDER_OPTIONS']) gbp_options.append('--hookdir={}'.format(tmp_hooks_dir)) env['GIT_PBUILDER_OPTIONS'] = ' '.join(gbp_options) builder_options = package.get('builder_options') if builder_options: cmd.extend(builder_options) try: util.log_check_call(cmd, cwd=package.path, env=dict(os.environ, **env), interactive=True) finally: if tmp_hooks_dir and os.path.exists(tmp_hooks_dir): shutil.rmtree(tmp_hooks_dir) if source_only: return self._run_tests(package, shell) def _build_cmd(self, package, ignore_branch): cmd = [ 'gbp', 'buildpackage', '--git-export-dir={}'.format(package.export_dir), '--git-arch={}'.format(self.arch), '--git-dist={}'.format(self.dist), ] force_orig_source = package.get('force_orig_source') if force_orig_source: cmd.append('-sa') source_only = package.get('source_only') if source_only: cmd.append('-S') cmd.append('--git-builder=debuild -I -i -us -uc -d') else: cmd.append('--git-pbuilder') arch_all = package.get('arch_all') if not arch_all: cmd.append('-B') if self.config.get('verbose'): cmd.append('--git-verbose') if package.pristine_tar: cmd.append('--git-pristine-tar') # If already generated, use that cmd.append('--git-tarball-dir={}'.format(package.export_dir)) if not ignore_branch: cmd.append('--git-debian-branch={}'.format(package.debian_branch)) if not package.is_merged and not package.is_native(): cmd.append('--git-overlay') return cmd, source_only def _prepare_hooks(self, package): hooks = package.get('builder_hooks') user_hooks_dir = package.get('builder-user-hooks-dir') shell = self._get_shell_enum(hooks) if not user_hooks_dir and not hooks: return None, shell tmpdir = tempfile.mkdtemp(prefix='builder', suffix='hooks') os.chmod(tmpdir, 0o755) if user_hooks_dir: cmd = ['cp', '-Lrx', os.path.join(user_hooks_dir, '.'), tmpdir] util.quiet(cmd) if hooks: hooks_dir = self.get_hooks_dir() for hook in hooks: src = os.path.join(hooks_dir, hook) if os.path.isdir(src): cmd = ['cp', '-Lrx', os.path.join(src, '.'), tmpdir] util.quiet(cmd) else: # TODO: ERROR ? 
# ignoring invalid hook logging.warning('{}: Ignoring invalid hook: {}'.format( package.name, hook)) pass return tmpdir, shell def _run_tests(self, package, shell): # TODO: launch shell when requested if package.get('run_lintian'): lintian_cmd = [ 'lintian', '-I', '--pedantic', '--show-overrides', package.changes_file] util.log_check_call(lintian_cmd, cwd=package.path) if package.get('run_piuparts'): # --defaults=flavor could use the vendor chroot = package.get('cowbuilder_basepath') puiparts_cmd = [ 'sudo', 'piuparts', '--arch={}'.format(self.arch), '--existing-chroot={}'.format(chroot), '--warn-on-debsums-errors', package.changes_file] if self.dist != 'unreleased': # TODO: it will fail if it depends on something from the # users local repositories. puiparts_cmd.append('--distribution={}'.format(self.dist)) util.log_check_call(puiparts_cmd, cwd=package.path) run_autopkgtest = (package.get('run_autopkgtest') and package.has_tests()) if run_autopkgtest: tester_name = package.get('tester') tester = self._arriero.get_tester(tester_name, self.dist, self.arch) tester.update() tester.test(package, package.changes_file) class SChRoot(object): updated = set() def __init__(self, *a, **kw): self.schroot = None def needs_update(self): if not self.schroot: return return self.schroot not in self.updated def mark_updated(self): return self.updated.add(self.schroot) def shell_in_session(self, session, start_dir='/'): logging.info('Starting shell in session: {} ' 'directory: {}'.format(session, start_dir)) if not session: # No session, probably it could not even be started return cmd = ['schroot', '-c', session, '--shell=/bin/bash', '--directory={}'.format(start_dir), '-r'] util.log_check_call(cmd, cwd='/', interactive=True) def end_session(self, session): if not session: # No session, probably it could not even be started return cmd = ['schroot', '-c', session, '-e'] util.log_check_call(cmd, interactive=True) class SBuild(Builder, SChRoot): class SBuildSession(object): def __init__(self, sbuild, package, cmd): self.sbuild = sbuild self.package = package self.cmd = cmd self._session_path = None self._build_dir = None self.exc_info = None def __enter__(self): try: util.log_check_call(self.cmd, cwd=self.package.path, interactive=True) except subprocess.CalledProcessError: self.exc_info = sys.exc_info() return self def process_buildlog(self): build_dir_re = re.compile( r"I: NOTICE: Log filtering will replace '(.*)' with " r"'(?:«|<<)PKGBUILDDIR(?:»|>>)'") session_path_re = re.compile( r"I: NOTICE: Log filtering will replace '(.*)' with " r"'(?:«|<<)CHROOT(?:»|>>)'") try: buildlog = open(self.package.build_file) except IOError: # No build file, most probably the session could not be # started return for line in buildlog: match = session_path_re.match(line) if match: self._session_path = match.group(1) match = build_dir_re.match(line) if match: self._build_dir = match.group(1) if self._session_path and self._build_dir: break @property def session_path(self): if not self._session_path: self.process_buildlog() if self._session_path and not self._session_path.startswith('/'): self._session_path = '/' + self._session_path return self._session_path @property def build_dir(self): if not self._build_dir: self.process_buildlog() return self._build_dir @property def chroot_build_dir(self): if self.full_build_dir and os.path.exists(self.full_build_dir): return os.path.join('/', self.build_dir) return '/' @property def session(self): path = self.session_path if path is None: return return os.path.basename(path) @property 
def full_build_dir(self): path = self.session_path if path is None: return return os.path.join(self.session_path, self.build_dir) def __exit__(self, exc_type, exc_value, traceback): self.sbuild.end_session(self.session) def __init__(self, arriero, dist, arch, *args, **kwargs): super(SBuild, self).__init__(arriero, dist, arch, *args, **kwargs) self.schroot = self.config.get('sbuild_chroot', distribution=dist, architecture=arch) def create(self): self._updated = True # TODO: remove hardcoded mirror chrootpath = self.config.get('sbuild_chrootpath') if self.dist == 'unreleased': distribution = 'unstable' else: distribution = self.dist cmd = ['sudo', 'sbuild-createchroot', '--arch={}'.format(self.arch), distribution, chrootpath, 'http://http.debian.net/debian'] # TODO: fix image: # - add local repositories # - set fstab pkg_archive / apt-cacher-ng # - configure ccache # - configure tmpfs util.log_check_call(cmd, interactive=True) def update(self, force=False): if not force and self._updated: return self._updated = True if not force and not self.needs_update(): return self.mark_updated() cmd = ['sudo', 'sbuild-update', '-uagdr', self.schroot] util.log_check_call(cmd, interactive=True) def list_missing_hook(self, builddir): hook = 'list-missing' hooks_dir = self.get_hooks_dir() src = os.path.join(hooks_dir, hook) with tempfile.TemporaryDirectory(prefix=hook) as tmpdir: logging.debug('src: {}, dst: {}'.format(src, tmpdir)) cmd = ['cp', '-Lrx', os.path.join(src, '.'), tmpdir] util.log_check_call(cmd) cmd = [os.path.join(tmpdir, hook)] process = util.log_popen(cmd, stdout=subprocess.PIPE, cwd=builddir) first_line = '=== Start list-missing' last_line = '=== End list-missing' end = False num_files = 0 for i, line in enumerate(process.stdout): print (line), if end or (i == 0 and line.startswith(first_line)): pass elif not end and line.startswith(last_line): # The doors are playing somewhere end = True else: num_files += 1 returncode = process.wait() if returncode or num_files > 0: raise TestFailed( "Hook {} failed, returned {}, num of files: {}".format( hook, returncode, num_files)) def check_lintian(self, package): error_re = re.compile(r'Lintian: fail') try: buildlog = open(package.build_file) for line in buildlog: match = error_re.match(line) if match: raise TestFailed("Lintian failed") except Exception: raise TestFailed("Couldn't open build log") def run_sbuild(self, package, builder_options, ignore_branch, binary_only, arch_all): sbuild_cmd = self._run_sbuild_sbuild_cmd(package, arch_all, binary_only, builder_options) gbp_cmd = [ 'gbp', 'buildpackage', '--git-export-dir={}'.format(package.export_dir), '--git-arch={}'.format(self.arch), '--git-dist={}'.format(self.dist), ] if self.config.get('verbose'): gbp_cmd.append('--git-verbose') if package.pristine_tar: gbp_cmd.append('--git-pristine-tar') # If already generated, use that gbp_cmd.append('--git-tarball-dir={}'.format(package.export_dir)) else: gbp_cmd.append('--git-tarball-dir={}'.format(package.tarball_dir)) if not ignore_branch: gbp_cmd.append('--git-debian-branch={}'.format( package.debian_branch)) if not package.is_merged and not package.is_native(): gbp_cmd.append('--git-overlay') gbp_cmd.append('--git-builder={}'.format(' '.join(sbuild_cmd))) return self.SBuildSession(self, package, gbp_cmd) def build(self, package, ignore_branch=False): source_only = package.get('source_only') if source_only: git_pbuilder = GitPBuilder(self._arriero, self.dist, self.arch) git_pbuilder.build(package, ignore_branch) return run_autopkgtest = 
(package.get('run_autopkgtest') and package.has_tests()) hooks = package.get('builder_hooks') shell = self._get_shell_enum(hooks) builder_options = package.get('builder_options') binary_only = package.get('binary_only') arch_all = package.get('arch_all') with self.run_sbuild(package, builder_options, ignore_branch, binary_only, arch_all) as session: exc_info = session.exc_info try: if exc_info: raise exc_info[0].with_traceback(exc_info[1], exc_info[2]) self._build_run_tests(package, session, run_autopkgtest, run_listmissing='list-missing' in hooks, run_lintian=package.get('run_lintian')) except (subprocess.CalledProcessError, TestFailed): if shell in (ON_ERROR, ALWAYS): self.shell_in_session(session.session, start_dir=session.chroot_build_dir) raise if shell == ALWAYS: self.shell_in_session(session.session, start_dir=session.chroot_build_dir) def _run_sbuild_sbuild_cmd(self, package, arch_all, binary_only, builder_options): sbuild_cmd = [ 'sbuild', '--dist={}'.format(self.dist), '--arch={}'.format(self.arch), '--chroot={}'.format(self.schroot), ] if arch_all: sbuild_cmd.append('--arch-all') else: sbuild_cmd.append('--no-arch-all') if not binary_only: sbuild_cmd.append('--source') if self.config.get('verbose'): sbuild_cmd.append('--verbose') if package.get('run_lintian'): sbuild_cmd.extend([ '--run-lintian', '--lintian-opts="-I --pedantic --show-overrides"']) else: sbuild_cmd.append('--no-run-lintian') if package.get('run_piuparts'): sbuild_cmd.append('--run-piuparts') piuparts_opts = '--schroot=chroot:{}'.format(self.schroot) if self.dist != 'unreleased': piuparts_opts = '-d {} {}'.format(self.dist, piuparts_opts) sbuild_cmd.append('--piuparts-opts="{}"'.format(piuparts_opts)) else: sbuild_cmd.append('--no-run-piuparts') sbuild_cmd.append('--purge=never') sbuild_cmd.extend(builder_options) return sbuild_cmd def _build_run_tests(self, package, session, run_autopkgtest, run_listmissing, run_lintian): if run_autopkgtest: tester_name = package.get('tester') tester = self._arriero.get_tester(tester_name, self.dist, self.arch) tester.update() tester.test(package, package.changes_file) # FIXME: starting a shell in adt send the current terminal # to the background # tester.test(package, ON_ERROR if shell else NO) if run_listmissing: self.list_missing_hook(session.full_build_dir) if run_lintian: self.check_lintian(package) class Tester(Common): def test(self, package, shell): pass class CowBuilderTester(Tester, CowBuilder): def __init__(self, arriero, dist, arch): super(CowBuilderTester, self).__init__(arriero, dist, arch) self.path = self.config.get('tester_name', distribution=dist, architecture=arch) def update(self, force=False): if not force and self._updated: return self._updated = True env = {'ARCH': self.arch, 'DIST': self.dist} cmd = ['git-pbuilder', 'update'] logging.info('Updating build image for {}-{}'.format( self.dist, self.arch)) util.log_check_call(cmd, env=dict(os.environ, interactive=True, **env)) def test(self, package, changes_file, shell=NO): cmd = ['sudo', 'autopkgtest', '-U', changes_file, '--user={}'.format(getpass.getuser())] if shell: cmd.append('--shell-fail') if shell == ALWAYS: cmd.append('--shell') cmd.extend(['--', 'chroot', self.path]) util.log_check_call(cmd, interactive=True, cwd=package.path) class LXCTester(Tester): def __init__(self, arriero, dist, arch): super(LXCTester, self).__init__(arriero, dist, arch) self.name = self.config.get('tester_name', distribution=dist, architecture=arch) def update(self, force=False): if not force and self._updated: return 
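        # All of the testers assemble essentially the same autopkgtest call
        # and only differ in the virt-server arguments after '--'.  For a
        # hypothetical foo_1.0-1_amd64.changes on sid/amd64 with the default
        # option values this is roughly:
        #
        #     sudo autopkgtest -U foo_1.0-1_amd64.changes --user=$USER \
        #         -- schroot sid-amd64-sbuild      (SChRootTester)
        #     ... -- lxc adt-sid-amd64             (LXCTester)
        #     ... -- chroot adt-sid-amd64          (CowBuilderTester)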
self._updated = True # TODO: The update script needs to be generated first cmd = ['sudo', 'lxc-start', '-n', self.name, '-F', '--', '/usr/local/sbin/update'] util.log_check_call(cmd, interactive=True) def test(self, package, changes_file, shell=NO): cmd = ['sudo', 'autopkgtest', '-U', changes_file, '--user={}'.format(getpass.getuser())] if shell: cmd.append('--shell-fail') if shell == ALWAYS: cmd.append('--shell') cmd.extend(['--', 'lxc', self.name]) util.log_check_call(cmd, interactive=True, cwd=package.path) class SChRootTester(Tester, SChRoot): def __init__(self, arriero, dist, arch): super(SChRootTester, self).__init__(arriero, dist, arch) self.schroot = self.config.get('sbuild_test_chroot', distribution=dist, architecture=arch) def update(self, force=False): if not force and self._updated: return self._updated = True if not force and not self.needs_update(): return self.mark_updated() cmd = ['sudo', 'sbuild-update', '-uagdr', self.schroot] util.log_check_call(cmd, interactive=True) def test(self, package, changes_file, shell=NO): cmd = ['sudo', 'autopkgtest', '-U', changes_file, '--user={}'.format(getpass.getuser())] if shell: cmd.append('--shell-fail') if shell == ALWAYS: cmd.append('--shell') cmd.extend(['--', 'schroot', self.schroot]) util.log_check_call(cmd, interactive=True, cwd=package.path) AVAILABLE_BUILDERS = { 'git-pbuilder': (GitPBuilder, 'Use git-pbuilder cowbuilder'), 'sbuild': (SBuild, 'Use sbuild'), } AVAILABLE_TESTERS = { 'cowbuilder': (CowBuilderTester, 'Use cowbuilder'), 'lxc': (LXCTester, 'Use lxc'), 'schroot': (SChRootTester, 'Use schroot'), } arriero-0.7~20161228/arriero/cementery.py000066400000000000000000000047471306715713600201040ustar00rootroot00000000000000#!/usr/bin/env python3 # -*- coding: utf-8 -*- from __future__ import print_function import logging import sys import git ## # Misc functions ## def action_report(msg, action): ''' Executes action and prints the success state. ''' print('{}: '.format(msg), end='') if action(): print('done.') return True else: print('fail.') return False def check_report(msg, action): if not action(): print('%s' % (msg,), file=sys.stderr) return False return True def git_checkout(repo, branch): ''' Switch repo to branch. ''' try: repo.git.checkout(branch) except git.exc.GitCommandError: return False return True def list_module(self, name, order='build'): def raw(ps): return ps def build(ps): return self.sort_buildable(ps) module = self.get_module(name) if order not in set(('raw', 'alpha', 'build')): raise ValueError('Invalid value for order') s = raw if order == 'alpha': s = sorted elif order == 'build': s = build for package in s(module.packages): print(package) class Arriero_attic(object): # Old commands, no longer used. 
def set_debian_push(self, name): module = self.get_module(name) # TODO: fix, ugly ugly for package_name in module.packages: package = self.get_package(package_name) try: remote = package.git.config('branch.%s.remote' % ( package.debian_branch,)) except git.exc.GitCommandError: remote = 'origin' try: ref = package.git.config('branch.%s.merge' % ( package.debian_branch,)) except git.exc.GitCommandError: ref = 'refs/heads/%s' % (package.debian_branch,) try: package.git.config('--get', 'remote.%s.push' % (remote,), ref) except git.exc.GitCommandError: package.git.config('--add', 'remote.%s.push' % (remote,), ref) def checkout_debian(self, name): module = self.get_module(name) for package_name in module.packages: package = self.get_package(package_name) if not package.switch_branches(package.debian_branch): logging.error('Failure while switching branches for: %s', package_name) arriero-0.7~20161228/arriero/config.py000066400000000000000000000365661306715713600173620ustar00rootroot00000000000000#!/usr/bin/env python3 # -*- coding: utf-8 -*- import argparse import configparser import os import shutil from collections import OrderedDict, defaultdict, namedtuple from . import util # types: string (default), # multistring (split('\n'), each term is expanded), # rawstring (do not expand format strings), # multiraw (split('\n'), each term is not expanded), # multivalue (util.split, not expanded), # int, bool (util.str2bool), # path (string with os.path.expanduser and normpath) # multipath (multistring with os.path.expanduser and normpath) Schema = namedtuple('Schema', ['type', 'inherit']) Schema.__new__.__defaults__ = ('string', True) class Configuration(object): 'Unified interface to configparser and argparse' _schema_types = set(('string', 'multistring', 'rawstring', 'multiraw', 'multivalue', 'int', 'bool', 'path', 'multipath')) _schema_raw_types = set(('rawstring', 'multiraw', 'multivalue')) def __init__(self, defaults=None, schema=None, aliases=None, argparse_kwargs=None, configparser_kwargs=None): self.defaults = defaults if defaults else {} self.defaults.setdefault('config_files', []) self.schema = schema if schema else {} self.aliases = aliases if aliases else {} argparse_kwargs = argparse_kwargs if argparse_kwargs else {} configparser_kwargs = ( configparser_kwargs if configparser_kwargs else {}) self.argparser = argparse.ArgumentParser( **util.chain_map(argparse_kwargs, fromfile_prefix_chars='@', add_help=False)) self.argparser.add_argument('-c', '--config', help='Specify a config file.', metavar='FILE', action='append', dest='config_files', default=list(self.defaults['config_files'])) self.argparser.add_argument('--with', help='Override a configuration option.', nargs=3, metavar=('SECTION', 'KEY', 'VALUE'), action='append') self.config_files = None # delay till we have to read the files # Let's handle the default section ourselves kwargs = util.chain_map(configparser_kwargs, default_section='') self.cfgparser = configparser.ConfigParser(**kwargs) self._args = None self._overrides = None self._packages = None self._groups = None self._parents = None @property def args(self): if not self._args: self._args = self.argparser.parse_args() return self._args @property def partial_args(self): if not self._args: self._args, _ = self.argparser.parse_known_args() return self._args @property def overrides(self): if not self._overrides: self._overrides = defaultdict(lambda: defaultdict(None)) overrides = vars(self.partial_args).get('with') if overrides: for (s, k, v) in overrides: self._overrides[s][k] = 
v return self._overrides def add_help(self): # Taken from the constructor of argparse self.argparser.add_argument( '-h', '--help', action='help', default=argparse.SUPPRESS, help=argparse._('show this help message and exit')) def read_config_files(self): if not self.config_files: self.config_files = list( map(os.path.expanduser, self.partial_args.config_files)) return self.cfgparser.read(self.config_files) def update(self, partial=False): self._args = None if partial: return self.partial_args return self.args def _get_raw_inherit(self, option, raw, inherit): if raw is None: raw = (option in self.schema and self.schema[option].type in self._schema_raw_types) if inherit is None: inherit = (option not in self.schema or self.schema[option].inherit) return raw, inherit def _get_neighbors(self, name): return reversed(self.list_parents(name)) def _get_maps(self, option, section, raw, inherit, known, instance_dict): def visit(section): yield self.overrides.get(section, {}) if section in self.cfgparser: d = dict(self.cfgparser.items(section, raw=True)) yield d # compatibility mangling, the keys are mangled into attributes # names by argparse, but previous default configurations contained # dashes that are now "invalid". mangled = option.replace('_', '-') if '_' in option and mangled in d: d[option] = d[mangled] yield d def queue_maps(): # trivial queries yield known # configurations of the section for mapping in visit(section): yield mapping # ask the package for internal values like the ones that need to query # the files yield instance_dict # ask the parents if inherit: # bfs over the parents for visit_result in \ util.bfs_gen(section, get_neighbors=self._get_neighbors, visit=visit): for mapping in visit_result: yield mapping # command line options yield vars(self.partial_args) for mapping in visit('DEFAULT'): yield mapping yield self.defaults return queue_maps() def _resolve_aliases(self, field, visited=None): if field not in self.aliases: return field if not visited: visited = set() if field in visited: raise ValueError( '{}: Invalid alias {}'.format(self.__class__.__name__, field)) visited.add(field) return self._resolve_aliases(self.aliases[field], visited) def _get(self, option, active, default_value=None, section='', raw=None, inherit=None, known=None, instance_dict=None): 'Retrieve the configuration option' raw, inherit = self._get_raw_inherit(option, raw, inherit) maps = util.CachingIterable( self._get_maps(option, section, raw, inherit, known, instance_dict)) chain = util.ChainMap() chain.maps = maps chain.default_value = default_value lookup = self._resolve_aliases(option) value = chain.get(lookup) return self._follow_schema(value, option, section, raw, known, instance_dict, active) def get(self, option, default_value=None, section='', raw=None, inherit=None, instance=None, **kw): 'Interpolate values in the configuration value' # Initialize known values if default_value is None and option in self.defaults: default_value = self.defaults[option] known = util.chain_map(kw, option=option, section=section, name=section, default_value=default_value) instance_dict = None if instance and hasattr(instance, '_getter'): instance_dict = instance._getter() else: instance_dict = {} # avoid loops in the interpolation active = OrderedDict() value = self._get(option, active, default_value, section=section, raw=raw, inherit=inherit, known=known, instance_dict=instance_dict) return value def _follow_schema(self, value, option, section, raw, known, instance_dict, active): if value is None: return value 
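        # Every schema type is dispatched to a matching _get_<type>() handler
        # below.  A sketch with hypothetical values: given
        #
        #     schema = {'filter_orig': Schema(type='multivalue')}
        #
        # a configuration entry "filter_orig = .git .pc" comes back as the
        # list ['.git', '.pc'], while Schema(type='bool') values go through
        # util.str2bool() and Schema(type='path') values through
        # os.path.expanduser() and os.path.normpath().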
schema = self.schema.get(option, Schema()) # call the corresponding handler attr_name = '_get_{}'.format(schema.type) obj = getattr(self, attr_name) return obj(value, option, section, raw, known, instance_dict, active) def _get_rawstring(self, value, *a, **kw): return value def _get_multiraw(self, value, *a, **kw): if isinstance(value, str): value = filter(lambda x: x is not '', value.split('\n')) return value def _get_string(self, *a, **kw): return self._interpolate(*a, **kw) def _get_multistring(self, value, *a, **kw): value = self._interpolate(value, *a, **kw) result = [] for part in self._get_multiraw(value, *a, **kw): aux = self._interpolate(part, *a, **kw) if aux: result.append(aux) return result def _get_multivalue(self, value, *a, **kw): value = self._interpolate(value, *a, **kw) if isinstance(value, str): value = util.split(value) return value def _get_int(self, value, *a, **kw): value = self._interpolate(value, *a, **kw) return int(value) def _get_bool(self, value, *a, **kw): value = self._interpolate(value, *a, **kw) if isinstance(value, str): value = util.str2bool(value) return value def _get_path(self, value, *a, **kw): value = self._interpolate(value, *a, **kw) value = os.path.expanduser(value) value = os.path.normpath(value) return value def _get_multipath(self, value, *a, **kw): xs = self._get_multistring(value, *a, **kw) for i, v in enumerate(xs): xs[i] = os.path.expanduser(v) xs[i] = os.path.normpath(xs[i]) return xs def _interpolate(self, value, option, section, raw, known, instance_dict, active): def learn_fields(required_fields): if any(f in active for f in required_fields): raise(ValueError('Circular values dependency')) for field in required_fields: if field in known: continue value = self._get(field, active, '', section=section, known=known, instance_dict=instance_dict) if isinstance(value, str): _, fields = util.expansions_needed(value) if fields: value = interpolate(field, value, fields) known[field] = value def interpolate(option, format_string, required_fields): active[option] = (format_string, required_fields) learn_fields(required_fields) assert(all(f in known for f in required_fields)) # Apply the format value = format_string.format(**known) del(active[option]) return value if raw or not isinstance(value, str): return value # Interpolate the values n, fields = util.expansions_needed(value) assert(n == 0) if fields: value = interpolate(option, value, fields) return value def _init_inheritance(self): if self._groups is None: self._groups = OrderedDict() if self._packages is None: self._packages = OrderedDict() if self._parents is None: self._parents = defaultdict(OrderedDict) @property def groups(self): if self._groups is None: self._init_inheritance() self.update_inheritance() return self._groups.keys() @property def packages(self): if self._packages is None: self._init_inheritance() self.update_inheritance() return self._packages.keys() @property def parents(self): if self._parents is None: self._init_inheritance() self.update_inheritance() return self._parents def list_all(self): return list(self.groups) + list(self.packages) def list_parents(self, name): if name in self.parents: return self.parents[name].keys() return [] def get_packages(self, group, expanded_groups=None): def _get_packages(group, expanded_groups): expanded_groups.add(group) packages = util.OrderedSet() for element in self._groups[group]: if element not in self.groups: packages.add(element) elif element not in expanded_groups: packages.extend(_get_packages(element, expanded_groups)) return 
packages if group in self.packages: return util.OrderedSet([group]) if group not in self.groups: return util.OrderedSet() # Expand package groups in the packages list if expanded_groups is None: expanded_groups = set() return _get_packages(group, expanded_groups) def update_inheritance(self): def _add(x, to=None, parent=None, value=None): to[x] = value if parent: self._parents[x][parent] = True sections = self.cfgparser.sections() sections.extend(self.overrides.keys()) for section in sections: if section == 'DEFAULT': # The cost of handling the default ourselves continue section_packages = self.get('packages', section=section, inherit=False) # If it contains no packages it's a package if section_packages is None: _add(section, to=self._packages) continue # If it contains packages it's a group _add(section, to=self._groups, value=section_packages) for package in section_packages: value = self.get('packages', section=package, inherit=False) if value is None: to = self._packages else: to = self._groups _add(package, to=to, parent=section, value=value) def write(self): '''Writes any changes to the config file to disk.''' config_file = self.config_files[-1] if os.path.exists(config_file): shutil.copyfile(config_file, config_file + '.bak') # Write all the overrides for section_name, section in self.overrides.items(): for option, value in section.items(): if value is None: continue self.cfgparser.set(section_name, option, value) with open(config_file, 'w') as f: self.cfgparser.write(f) self.update_inheritance() def arg_add(self, *args, **kwargs): return self.argparser.add_argument(*args, **kwargs) def add_section(self, *args, **kwargs): return self.cfgparser.add_section(*args, **kwargs) def set(self, section, option, value): self.overrides[section][option] = value return value def main(): config = Configuration() print (config.partial_args) config.read_config_files() print (config.get('test')) print ('{} {}'.format(config.get('test', section='foo'))) print (config.packages) if __name__ == '__main__': main() arriero-0.7~20161228/arriero/errors.py000066400000000000000000000023431306715713600174130ustar00rootroot00000000000000#!/usr/bin/env python3 # -*- coding: utf-8 -*- # Copyright: 2015, Maximiliano Curia # # This program is free software; you can redistribute it and/or modify it # under the terms of the GNU General Public License as published by the Free # Software Foundation; either version 2 of the License, or (at your option) # any later version. # # This program is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or # FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for # more details. # # You should have received a copy of the GNU General Public License along with # this program; if not, write to the Free Software Foundation, Inc., 51 # Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. 
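# All of the exceptions defined below derive from ArrieroError, so callers can
# trap the whole family with a single handler.  A minimal sketch (hypothetical
# caller, not part of arriero):
#
#     try:
#         package.update_changelog()
#     except PackageError as error:   # e.g. missing or unparsable changelog
#         logging.error(error)
#     except ArrieroError as error:   # any other arriero-specific failure
#         logging.critical(error)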
## # Exceptions ## class ArrieroError(Exception): pass class ActionError(ArrieroError): pass class GitError(ArrieroError): pass class GitBranchError(GitError): pass class GitDirty(GitError): pass class GitDiverge(GitError): pass class GitRemoteNotFound(GitError): pass class PackageError(ArrieroError): pass class TestFailed(ArrieroError): pass class UscanError(ArrieroError): pass arriero-0.7~20161228/arriero/graph.py000066400000000000000000000061041306715713600171770ustar00rootroot00000000000000#!/usr/bin/env python3 # -*- coding: utf8 -*- # Copyright: 2013-2014, Maximiliano Curia # # This program is free software; you can redistribute it and/or modify it # under the terms of the GNU General Public License as published by the Free # Software Foundation; either version 2 of the License, or (at your option) # any later version. # # This program is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or # FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for # more details. # # You should have received a copy of the GNU General Public License along with # this program; if not, write to the Free Software Foundation, Inc., 51 # Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. import collections from . import util # Graph node Node = collections.namedtuple('Node', ['input', 'output']) class GraphError(Exception): pass class TSortGraph(object): '''Graph for topological sorts.''' def __init__(self, keys, get_inputs): ''' :keys: sequence of hashable items. :get_inputs: function of key that returns the list of inputs that the node depends on. ''' self.nodes = collections.defaultdict( lambda: Node(util.OrderedSet(), util.OrderedSet()) ) self.ready = collections.deque() self._done = set() visited = set() node_keys = util.OrderedSet(keys) for key in node_keys: if key in visited: continue inputs = get_inputs(key) # Reduce the inputs to the set of nodes we are working with inputs &= node_keys self.nodes[key].input.extend(inputs) for input in inputs: self.nodes[input].output.add(key) if not inputs: self.ready.append(key) visited.add(key) def done(self, key): '''Remove key from the inputs of the depending nodes''' self._done.add(key) for child in self.nodes[key].output: self.nodes[child].input.remove(key) # if there are no more dependencies, the element is ready if not self.nodes[child].input: self.ready.append(child) return self.ready def __repr__(self): items = ('{key}: in={node.inputs}, out={node.outputs}'.format( key=key, node=node) for key, node in self.nodes.items()) return '{}({})'.format(self.__class__.__name__, ', '.join(items)) def sort_generator(self, skip=None): '''Topological sort as a generator :skip: is a set of values that are not going to be considered as done. ''' while self.ready: key = self.ready.popleft() yield key if skip and key in skip: continue self.done(key) if len(self._done) != len(self.nodes) and not skip: raise GraphError('Not a DAG?: done {}, ready {}, graph {}'.format( self._done, self.ready, self)) arriero-0.7~20161228/arriero/package.py000066400000000000000000001517351306715713600175040ustar00rootroot00000000000000#!/usr/bin/env python3 # -*- coding: utf8 -*- # Copyright: 2013-2014, Maximiliano Curia # # This program is free software; you can redistribute it and/or modify it # under the terms of the GNU General Public License as published by the Free # Software Foundation; either version 2 of the License, or (at your option) # any later version. 
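# (Illustration for graph.TSortGraph above, with hypothetical data.  The
# get_inputs callback is expected to return a util.OrderedSet, which is what
# Arriero.sort_by_depends() hands it.)
#
#     deps = {'a': [], 'b': ['a'], 'c': ['a', 'b']}
#     graph = TSortGraph(deps, lambda key: util.OrderedSet(deps[key]))
#     list(graph.sort_generator())    # -> ['a', 'b', 'c']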
# # This program is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or # FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for # more details. # # You should have received a copy of the GNU General Public License along with # this program; if not, write to the Free Software Foundation, Inc., 51 # Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. import collections import glob import logging import os import re import debian.changelog as changelog import debian.deb822 as deb822 import debian.debian_support as ds import git # Own imports from . import util from .errors import (GitBranchError, GitDirty, GitDiverge, GitRemoteNotFound, PackageError, UscanError) from .uscan import Uscan from .version import Version ## # Constants ## OK = 'Ok' IGNORE = 'Ignored' ERROR = 'Error' MISS_DEP = 'Missing dependencies' # Precompiled regexps dfsg_re = re.compile(r'(.*)[+.]dfsg(?:.\d+)?$') class Package(object): class Getter(object): '''Provides a mapping face to the internal values The sole purpose of this class is to provide a mapping like interface to be used from the Configuration class. ''' def __init__(self, package): self.package = package def __contains__(self, key): if not isinstance(key, str): return False attr_name = '_field_{}'.format(key) return hasattr(self.package, attr_name) def __getitem__(self, key): if self.__contains__(key): attr_name = '_field_{}'.format(key) return getattr(self.package, attr_name) raise KeyError() _field_pristine_tar_branch = 'pristine-tar' def __init__(self, name, arriero): self._field_name = name self._field_depends = None self._field_vcs_git = None self._field_path = None self._arriero = arriero self._field_architecture = self._arriero.architecture self._repo = None self._changelog = None self._control = None self._tests_control = None self._last_changelog = None @property def config(self): return self._arriero.config def get(self, *a, **kw): kw.update({'section': self._field_name, 'instance': self}) return self.config.get(*a, **kw) def set(self, option, value): return self.config.set(self._field_name, option, value) def _getter(self): return self.Getter(self) @property def depends(self): if self._field_depends is not None: return self._field_depends raw_depends = self.get('depends') not_expanded = set() if raw_depends: not_expanded.update(raw_depends) if self.name in not_expanded: not_expanded.remove(self.name) depends = util.OrderedSet() for dependency in not_expanded: # Do not expand parents if self.name in self.config.parents and \ dependency in self.config.list_parents(self.name): # Add it as a package if dependency in self.config.packages: depends.add(dependency) continue depends.extend(self.config.get_packages(dependency)) if self.name in depends: depends.remove(self.name) self._field_depends = depends logging.debug('{}: depends on: {}'.format(self.name, depends)) return depends @property def name(self): return self._field_name @property def architecture(self): return self.get('architecture') @property def basedir(self): return self.get('basedir') @property def path(self): if not self._field_path: self._field_path = self.get('path') return self._field_path @property def target_distribution(self): return self.get('target_distribution') @target_distribution.setter def target_distribution(self, distribution): self.set('target_distribution', distribution) @property def upstream_branch(self): return self.get('upstream_branch') @property def 
debian_branch(self): return self.get('debian_branch') @property def filter_orig(self): return self.get('filter_orig') @property def pristine_tar(self): return self.get('pristine_tar') @property def pristine_tar_branch(self): return self._field_pristine_tar_branch @property def is_merged(self): return self.get('is_merged') @property def _field_vsc_git(self): try: value = self.source_control.get('Vcs-Git') if value is not None: self.vcs_git = value except PackageError: # No control file. return None @property def vcs_git(self): return self.get('vcs_git') @vcs_git.setter def set_vcs_git(self, value): return self.set('vcs_git', value) def check_path(self): if os.path.isdir(self.path): return self.path @property def tarball_dir(self): return self.get('tarball_dir') @property def export_dir(self): return self.get('export_dir') @property def repo(self): if not self._repo: self._repo = git.Repo(self.path) return self._repo @property def git(self): return self.repo.git @property def _field_branch(self): if self.repo.head_is_detached: return None return self.repo.head.shorthand @property def branch(self): if self.repo.head.is_detached: return None return self.repo.active_branch.name @property def changelog(self): if not self._changelog: self.update_changelog() return self._changelog def update_changelog(self): changelog_filename = os.path.join(self.path, 'debian', 'changelog') try: with open(changelog_filename) as f: self._changelog = changelog.Changelog(f) except EnvironmentError: raise PackageError('Could not open the changelog file') if not len(self._changelog): raise PackageError('Changelog file could not be parsed') @property def control(self): if not self._control: self._control = self.update_control( os.path.join(self.path, 'debian', 'control')) return self._control @property def source_control(self): return self.control[0] @property def binary_controls(self): return self.control[1:] @property def tests_control(self): if not self.has_tests(): return if not self._tests_control: self._tests_control = self.update_control( os.path.join(self.path, 'debian', 'tests', 'control')) return self._tests_control def update_control(self, filename): try: with open(filename) as f: return list(deb822.Deb822.iter_paragraphs(f)) except EnvironmentError: raise PackageError('Could not open the control file') @property def source_name(self): return self.source_control['Source'] @property def build_file(self, check=True): build_file = os.path.join( self.export_dir, '{}_{}_{}.build'.format( self.source_name, self.epochless_version, self.architecture) ) if check and not os.path.exists(build_file): return '' return build_file @property def changes_file(self, check=True): changes_file = os.path.join( self.export_dir, '{}_{}_{}.changes'.format( self.source_name, self.epochless_version, self.architecture) ) if check and not os.path.exists(changes_file): return '' return changes_file @property def source_changes_file(self, check=True): changes_file = os.path.join( self.export_dir, '{}_{}_source.changes'.format( self.source_name, self.epochless_version) ) if check and not os.path.exists(changes_file): return '' return changes_file @property def upload_file(self, check=True): # TODO, dput specific, ignoring source and multi uploads upload_file = os.path.join( self.export_dir, '{}_{}_{}.ftp-master.upload'.format( self.source_name, self.epochless_version, self.architecture) ) if check and not os.path.exists(upload_file): return '' return upload_file @property def dsc_file(self, check=True): dsc_file = os.path.join( 
self.export_dir, '{}_{}.dsc'.format(self.source_name, self.epochless_version) ) if check and not os.path.exists(dsc_file): return '' return dsc_file @property def version(self): return Version(self.changelog.version) @property def epochless_version(self): epochless_version = self.version.upstream_version if self.version.debian_revision: epochless_version += '-{}'.format(self.version.debian_revision) return Version(epochless_version) @property def epoch(self): return self.changelog.epoch @property def upstream_version(self): if self.changelog: v = self.changelog.upstream_version return v if v else '' @property def debian_version(self): return self.changelog.debian_version @property def _field_distribution(self): return self.changelog.distributions @property def distribution(self): return self._field_distribution @property def urgency(self): return self.changelog.urgency @property def last_changelog(self): if self._last_changelog is None: for i, block in enumerate(self.changelog): if i == 0: # Skip the first one continue if block.distributions.lower() == 'UNRELEASED': # Ignore unreleased continue self._last_changelog = block break return self._last_changelog @property def last_changelog_distribution(self): if self.last_changelog: return self.last_changelog.distributions @property def last_changelog_version(self): if self.last_changelog: return Version(self.last_changelog.version) @property def version_at_distribution(self): if self.last_changelog_distribution: return Version(util.version_at_distribution( self.source_name, self.last_changelog_distribution)) def tag_template(self, name, version=None, **values): '''Returns a tag version correctly formatted according to the name.''' if version is None: version = self.debian_version try: v = Version(version) upstream_version = v.upstream_version v.epoch = None epochless_version = str(v) except ValueError: # version can be a glob like '*' upstream_version = version epochless_version = version values.setdefault('version', version) values.setdefault('debian_version', version) values.setdefault('upstream_version', upstream_version) values.setdefault('epochless_version', epochless_version) values = {k: Version.to_tag(v) for k, v in values.items()} if name == 'upstream': return self.get('upstream_tag').format(**values) if name == 'debian': return self.get('debian_tag').format(**values) def uscan(self): return Uscan(self.path, destdir=self.tarball_dir) def is_native(self): return self.version and not self.version.debian_revision def is_dfsg(self): if self.is_native(): return False if self.upstream_version and dfsg_re.match(self.upstream_version): return True return False def _append_field(self, control, dest, field): if field in control: dest.append(control[field]) return dest @property def build_depends(self): control = self.source_control deps = [] self._append_field(control, deps, 'Build-Depends') self._append_field(control, deps, 'Build-Depends-Indep') return ', '.join(deps) @property def runtime_depends(self): deps = [] for control in self.binary_controls: self._append_field(control, deps, 'Pre-Depends') self._append_field(control, deps, 'Depends') return ', '.join(deps) @property def runtime_recommends(self): deps = [] for control in self.binary_controls: self._append_field(control, deps, 'Recommends') return ', '.join(deps) @property def tests_depends(self): deps = [] if not self.has_tests(): return '' for control in self.tests_control: depends = control.get('Depends', '') depends_list = re.split(r'\s*,\s*', depends) restrictions = 
control.get('Restrictions', '') restrictions_list = re.split(r'\s*,\s*', restrictions) if '@' in depends_list: depends_list.remove('@') deps.append(self.runtime_depends) if 'needs-recommends' in restrictions_list: deps.append(self.runtime_recommends) if '@builddeps@' in depends_list: depends_list.remove('@builddeps@') deps.append(self.build_depends) deps.extend(depends_list) return ', '.join(deps) def _all_depends(self): # return ', '.join([self.build_depends, self.runtime_depends, self.tests_depends]) # Test dependencies might break the DAG return ', '.join([self.build_depends, self.runtime_depends]) def internal_dependencies(self, binaries, dependencies=None): '''Process dependencies against binaries to take into account. :binaries: is a dictionary that maps binaries to the package_name that generates it. :dependencies: is a dpkg depends string :returns: a mapping, of the package_names that generate the needed binaries, and the binaries that cause the dependency as the value. ''' bin_deps = util.OrderedSet() if dependencies is None: dependencies = self._all_depends() rels = deb822.PkgRelation.parse_relations(dependencies) for or_part in rels: for part in or_part: bin_deps.add(part['name']) internal_dep = collections.OrderedDict() for dep in bin_deps: if dep not in binaries: continue if self.name == binaries[dep]: continue internal_dep.setdefault(binaries[dep], []).append(dep) return internal_dep def get_packages(self): 'A simple dh_listpackages' packages = [] for part in self.control: if 'Package' in part: packages.append(part['Package']) return packages def commit(self, msg, files): cmd = ['dch', '--', msg] util.log_check_call(cmd, cwd=self.path) self.git.add(os.path.join('debian', 'changelog')) self.git.add(*files) self.git.commit('-m', msg) def create_branch(self, branch, tracking=None): if not tracking: # Search a remote to track trackable = [] for remote in self.repo.remotes: remote.fetch(tags=True) if self.vcs_git and self.vcs_git == remote.url: logging.debug('%s: remote %s configured as vcs-git', self.name, remote.name) tracking = remote break if branch in remote.refs: logging.debug('%s: found %s/%s', self.name, remote.name, branch) trackable.append(remote) if not tracking and trackable: tracking = trackable[0] if tracking and branch in tracking.refs: # Create a new branch with tracking head = self.repo.create_head(branch, commit=tracking.refs[branch]) head.set_tracking_branch(tracking.refs[branch]) return head else: return False def switch_branches(self, branch): if self.repo.is_dirty(untracked_files=True): logging.warning('{name}: branch {curr} has uncommitted changes.' 'Can\'t switch to {dst} from {curr}'.format( name=self.name, curr=self.branch, dst=branch)) return False try: self.git.checkout(branch) except git.exc.GitCommandError: # Does the branch even exists? 
logging.error('{}: Can\'t switch to {} from {}.'.format( name=self.name, curr=self.branch, dst=branch)) return False # TODO: Check if really needed self._changelog = None return True def _ahead_behind_rev_list(self, ref1, ref2): out = self.git.rev_list('--left-right', '--count', '%s...%s' % (ref1, ref2)) return map(int, out.split()) def _ahead_behind_guess_local(self, local): if not local and not self.repo.head.is_detached: local = self.repo.active_branch if not local: # FIXME: it might be wiser to use HEAD, but then we have to deal # with fake references local = self.debian_branch if isinstance(local, str): local = self.repo.references[local] return local def _get_default_remote(self): # repo.remote() fallsback to origin, even if there is no origin if not self.repo.remotes: return if (len(self.repo.remotes) == 1 or 'origin' not in {r.name for r in self.repo.remotes}): return self.repo.remotes[0] return self.repo.remote() def _guess_tracking(self, local, tracking=None): if not tracking: tracking = local.tracking_branch() if not tracking: git_remote = self._get_default_remote() if not git_remote: return tracking = git_remote.refs[local.name] if isinstance(tracking, str): tracking = self.repo.references[tracking] return tracking def ahead_behind(self, local=None, remote=None): # Hacky check between two refs # returns two numbers, commits ahead and commits behind local_ref = self._ahead_behind_guess_local(local) remote_ref = self._guess_tracking(local_ref, remote) if not remote_ref: # No remote to compare to return 0, 0 return self._ahead_behind_rev_list(local_ref.path, remote_ref.path) def prepare_overlay(self, overlay_branch=None): '''Combines the upstream and debian branches into one working tree. If the debian_branch of the package already contains the upstream source (stored in self.merge attribute), then that branch is checked out, nothing else happens. If the package is a native package, the debian branch is checked out, nothing else happens. Args: overlay_branch: if received, the result is checked into a new branch instead of a dettached HEAD. ''' # Try to switch to debian branch. If this fails, there are probably # uncommitted changes. logging.debug( '{}: switching to branch {} for overlay creation.'.format( self.name, self.debian_branch)) if not self.switch_branches(self.debian_branch): raise GitBranchError( 'Could not change to branch {}'.format(self.debian_branch)) if overlay_branch is None: overlay_branch = self.get( 'overlay_branch', overlay_branch=vars(self.config.args).get('overlay_branch')) # Create working branch, if the user so requested if overlay_branch: logging.debug( '{}: creating new overlay branch {}.'.format( self.name, overlay_branch)) self.repo.create_branch('overlay_branch', self.repo.head.get_object()) if self.is_merged: logging.info( '{}: skipping overlay creation, already merged.'.format( self.name)) return if self.is_native(): logging.info( '{}: native package, no overlay needed.'.format( self.name)) return # Checkout the upstream tree into the current branch, then reset the # index so that the files are there, but not scheduled to be committed. logging.info( '{}: copying tree from latest upstream tag.'.format( self.name)) self.git.checkout( self.tag_template('upstream', self.upstream_version), '.', ':!.gitignore', ours=True) # TODO: add :!.gitignore only if the git version supports it. 
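# Taken together, the checkout above and the reset below amount to, in plain
# git terms (a sketch; the tag and branch names here are hypothetical):
#
#   git checkout debian/master                       # switch_branches() above
#   git checkout upstream/1.2.3 -- . ':!.gitignore'  # copy the upstream tree
#   git reset                                        # leave it unstaged
#
# so the upstream files land in the working copy without being scheduled for
# commit, which is why the log message below warns that the branch still
# needs manual cleaning.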
self.git.reset() logging.info( '{}: overlay created, branch needs to be manually cleaned'.format( self.name)) def _get_builder(self): distribution = self.target_distribution architecture = self.architecture builder_name = self.get('builder') return self._arriero.get_builder(builder_name, distribution, architecture) def build(self, ignore_branch=None): if ignore_branch is None: ignore_branch = self.get('ignore_branch') if not ignore_branch and self.debian_branch != self.branch: if not self.switch_branches(self.debian_branch): raise(GitBranchError('Could not change to branch %s' % ( self.debian_branch,))) builder = self._get_builder() if not self.get('source_only'): builder.ensure_image() builder.build(package=self, ignore_branch=ignore_branch) def _release_prepare_dch_cmd(self, distribution, pre_release, unreleased): cmd = ['dch'] msg = '' if unreleased: version = Version(str(self.version)) if pre_release: if not version.is_pre_release(): cmd.append('-b') new_version = version.pre_release() cmd.extend(['-v', str(new_version)]) elif version.is_pre_release(): new_version = version.release() # First we need to remove ~ part util.log_check_call( ['dch', '--release-heuristic', 'changelog', '-v', str(new_version), ''], cwd=self.path) cmd.append('-r') else: cmd.append('-r') else: if pre_release: cmd.append('-i') elif distribution: if self.distribution == distribution: logging.error('%s: Already released for %s', self.name, distribution) return [] msg = 'Release to %s' % distribution cmd.append('-i') else: # There is no distribution not even in the changelog # reachable? cmd.append('-r') if distribution and distribution.lower() != 'unreleased': cmd.append('-D') cmd.append(distribution) cmd.append(msg) return cmd def _release_args(self, distribution, pre_release, ignore_branch): if pre_release is None: pre_release = self.get('pre_release') if pre_release: distribution = 'UNRELEASED' if distribution is None: # Use the cli option first distribution = self.get( 'distribution', distribution=vars(self.config.args).get('distribution')) if ignore_branch is None: ignore_branch = self.get('ignore_branch') return distribution, pre_release, ignore_branch def release(self, distribution=None, pre_release=None, ignore_branch=None): distribution, pre_release, ignore_branch = \ self._release_args(distribution, pre_release, ignore_branch) if not ignore_branch and self.debian_branch != self.branch: if not self.switch_branches(self.debian_branch): raise(GitBranchError('Could not change to branch %s' % ( self.debian_branch,))) unreleased = self.distribution.lower() == 'unreleased' dch_cmd = self._release_prepare_dch_cmd(distribution, pre_release, unreleased) if not dch_cmd: return ERROR util.log_check_call(dch_cmd, cwd=self.path) self.update_changelog() if pre_release and not unreleased and \ self.distribution.lower() == 'unreleased': # We called dch -i dch_cmd = self._release_prepare_dch_cmd(distribution, pre_release, True) if not dch_cmd: return ERROR util.log_check_call(dch_cmd, cwd=self.path) self.update_changelog() # If there were any changes to the changelog, commit them if self.repo.index.diff(None, 'debian/changelog'): if pre_release: msg = 'Pre-release %s' % self.version else: msg = 'Release to %s' % self.distribution self.git.commit('debian/changelog', '-m', msg) return OK def local_upload(self): self.upload('local') def upload(self, host=None, force=None): if host is None: host = self.get('upload_host') if force is None: force = self.get('force') upload_command = self.config.get( 'upload_command', 
section=self.name, raw=True) if not upload_command: return ERROR if not self.changes_file: return IGNORE cmd_variables = { 'changes_file': self.changes_file, 'package': self.name, 'version': self.version, 'distribution': self.distribution, 'dist': self.distribution, 'upload_host': host, } try: full_command = upload_command.format(**cmd_variables) except (ValueError, KeyError) as e: logging.error('%s: unable to format upload-command: %s', self.name, upload_command) logging.error('%s: %s' % (e.__class__.__name__, e.message)) return ERROR util.log_check_call(full_command, interactive=True, shell=True) @staticmethod def _branch_to_ref(branch): return 'refs/heads/{}'.format(branch) def _get_tracking_remote_name(self, branch, fallback_to_default=True): try: if isinstance(branch, str): if branch not in self.repo.branches: return branch = self.repo.branches[branch] tracked = branch.tracking_branch() if tracked: remote_name = tracked.remote_name else: if not fallback_to_default: return remote = self._get_default_remote() if not remote: return remote_name = remote.name return remote_name except git.exc.GitCommandError as e: logging.debug('%s: No remote associated.\n%s', self.name, e) def _push(self, branch, tag_template=None, fallback=True): remote_name = self._get_tracking_remote_name(branch, fallback_to_default=fallback) if not remote_name: return True ref = self._branch_to_ref(branch) try: ref += ":{}".format( self.git.config('branch.%s.merge' % branch)) except git.exc.GitCommandError as e: logging.debug('%s: No remote merge ref associated.\n%s', self.name, e) if tag_template: tag_refs = 'refs/tags/%s' % self.tag_template(tag_template, '*') try: self.git.push(remote_name, ref) if tag_template: self.git.push(remote_name, tag_refs) except git.exc.GitCommandError as e: logging.error('%s: Failed to push.\n%s', self.name, e) return False return True def push(self, debian_tags=None, upstream_push=None): logging.debug('Pushing %s', self.name) if debian_tags is None: debian_tags = self.get('debian_tags') if upstream_push is None: upstream_push = self.get('upstream_push') if not self._push(self.debian_branch, 'debian' if debian_tags else ''): return False if not self._push(self.upstream_branch, 'upstream', fallback=upstream_push): return False if self.pristine_tar: if not self._push(self.pristine_tar_branch, fallback=upstream_push): return False return True def safe_pull(self): # update to check ahead/behind # TODO: fix ugly hack if not self.branch: logging.info('Index is in a detached head') return False remote_name = self._get_tracking_remote_name(self.repo.active_branch) if not remote_name: return True try: self.git.fetch(remote_name, tags=True) except git.exc.GitCommandError as e: logging.info('Error on fetch: {}'.format(e.message)) return False try: ahead, behind = self.ahead_behind(self.repo.active_branch) if ahead > 0 and behind > 0: raise(GitDiverge('Needs to merge with head.')) except GitRemoteNotFound as e: logging.info('Branch not associated with a remote: %s' % e.message) return False # TODO: return True if there were changes try: self.git.pull() except git.exc.GitCommandError as e: logging.error('%s: Failed to pull.\n%s', self.name, e) raise(GitDirty(e.message)) def _pull_branch(self, branch): # FIXME: Know issue # if we have local changes in a branch that's not the # current one it fails with rejected * (non-fast-forward) # We could detect if we only have local changes and ignore the # pull, but we will eventually hit the case when both the remote # and the local branch have changes. 
# For that could either change to the branch (which might be tough if # the worktree has uncommitted changes, ignored files, etc), use a # different worktree for the merge. Both of these are quite # frail. try: branch_ref = self.repo.branches[branch] tracking_ref = self._guess_tracking(branch_ref) if not tracking_ref: raise IndexError('No remote asociated with {}'.format(branch_ref)) remote_ref = self.repo.remotes[tracking_ref.remote_name] tracking = self._branch_to_ref(tracking_ref.remote_head) logging.debug('{} {} {}:{}'.format( 'pull' if branch == self.branch else 'fetch', remote_ref.name, tracking, branch_ref.path)) remote_ref.fetch(tags=True) if branch == self.branch: remote_ref.pull() else: ahead, behind = self.ahead_behind(branch_ref, tracking_ref) if ahead == 0: remote_ref.fetch('{}:{}'.format(tracking, branch_ref.path)) if ahead and behind: raise(GitDirty('failed to pull {}, needs to be ' 'manually merged'.format(self.name, branch))) except IndexError as e: logging.warn('{}: Failed to access remote branch asociated with ' '{}'.format(self.name, branch)) logging.warn('{}: {}'.format(self.name, e)) return False except git.exc.GitCommandError as e: logging.error('{}: Failed to pull.'.format(self.name)) logging.error('{}: {}'.format(self.name, e)) return False return True def pull(self): logging.debug('Pulling %s', self.name) if not self._pull_branch(self.debian_branch): return False if not self._pull_branch(self.upstream_branch): return False if self.pristine_tar: if not self._pull_branch(self.pristine_tar_branch): return False return True def get_new_version(self, upstream): ''' Obtains a new version. ''' old = Version(self.version) new = old.new_upstream_version(upstream) return new.full_version def new_dfsg_version(self, upstream_version): # copyright excluded files rules are handled by uscan if upstream_version.rstrip('0123456789').endswith('+dfsg'): return upstream_version branch = self.branch rules = os.path.join(self.path, 'debian', 'rules') dfsg_version = upstream_version + '+dfsg' dfsg_tag = self.tag_template('upstream', dfsg_version) if dfsg_tag in self.repo.tags: # Already there return dfsg_version self.git.checkout(self.tag_template('upstream', upstream_version)) self.git.checkout('heads/%s' % branch, '--', 'debian') self.git.reset() util.log_check_call( ['fakeroot', rules, 'prune-nonfree'], cwd=self.path) util.log_check_call(['rm', '-rf', 'debian'], cwd=self.path) if (not self.repo.is_dirty(untracked_files=True)): # No changes made by the prune-nonfree call, we are free \o/ self.switch_branches(branch) return upstream_version self.git.commit('-a', '-m', 'DFSG version %s' % (dfsg_version,)) self.repo.create_tag(dfsg_tag) self.switch_branches(branch) return dfsg_version def _create_upstream_branch(self): if not self.upstream_branch or self.is_native(): return IGNORE if self.upstream_branch not in self.repo.branches: logging.debug('Creating upstream branch for %s.', self.name) original_branch = self.branch # Create upstream branch self.git.checkout('--orphan', self.upstream_branch) self.git.reset() self.git.clean('-xdf') self.git.commit('--allow-empty', '-m', 'Upstream branch') self.switch_branches(original_branch) return OK def fetch_upstream(self, ignore_branch=None): '''Fetch upstream tarball when there is no upstream_branch.''' if ignore_branch is None: ignore_branch = self.get('ignore_branch') status = self._create_upstream_branch() if status == OK: status = self._get_upstream_release(current=True, version=self.upstream_version, ignore_branch=ignore_branch) return 
status def get_upstream_release_uscan(self, current=False, version=None): '''Download the upstream tarball with uscan. Args: current: if True downloads the tarball even if it's the same as the one in the changelog file. ''' try: pkg_scan = self.uscan() pkg_scan.scan(download=True, force_download=current, version=version) except UscanError as e: logging.error( '%s: Could not download upstream tarball: %s', self.name, str(e)) return False # ignore uptodate status if we forced the download if not current and pkg_scan.uptodate: return False return (pkg_scan.uversion, pkg_scan.tarball) def _get_local(self, file_glob, version_re, requested_version): files = [] for filename in glob.iglob( os.path.join(self.tarball_dir, file_glob)): basename = os.path.basename(filename) # Skip .gpg files if basename.endswith('.gpg'): continue m = re.search(version_re, basename) if not m: continue version = m.group(1) if requested_version: if ds.version_compare(version, requested_version) == 0: logging.debug('%s: %s found', self.name, filename) return (version, filename) elif ds.version_compare(version, self.upstream_version) > 0: logging.debug('%s: %s found', self.name, filename) # is already there? upstream_tag = self.tag_template('upstream', version) if upstream_tag in self.repo.tags: continue files.append((version, filename)) if files: files.sort(key=lambda x: Version(x[0]), reverse=True) return files[0] return None def get_upstream_release_local(self, version=None): '''Check if new upstream release file is already downloaded.''' requested_version = version logging.debug('%s: looking for .orig files.', self.name) file_glob_orig = self.source_name + '_[0-9]*.orig.tar.*' version_orig_re = r'_([0-9.]+)\.' found = self._get_local(file_glob_orig, version_orig_re, requested_version) if found: return found logging.debug('%s: looking for non .orig files.', self.name) # not found, let's see if the file is downloaded without the .orig name copyright_file = open(os.path.join(self.path, 'debian', 'copyright')) copyright_deb822 = deb822.Deb822(copyright_file) if copyright_deb822 and 'Upstream-Name' in copyright_deb822: upstream_name = copyright_deb822['Upstream-Name'] else: upstream_name = self.source_name file_glob_other = upstream_name + '[-_][0-9]*.tar.*' version_re = r'[-_]([0-9.]+)\.tar' found = self._get_local(file_glob_other, version_re, requested_version) if found: return found if upstream_name != self.source_name: logging.debug('%s: looking for non .orig files using source_name.', self.name) file_glob_other = self.source_name + '[-_][0-9]*.tar.*' version_re = r'[-_]([0-9.]+)\.tar' found = self._get_local(file_glob_other, version_re, requested_version) if found: return found return False def _get_upstream_release_download(self, version, current): download = self.get_upstream_release_local(version=version) if not download: if not current: # if we don't have the current version, fetch at least that upstream_tag = self.tag_template('upstream', self.upstream_version) current = upstream_tag not in self.repo.tags download = self.get_upstream_release_uscan(current=current, version=version) return download def _get_upstream_release_call_import_orig(self, upstream_version, tarball): cmd = ['gbp', 'import-orig', '--upstream-version=%s' % upstream_version] # TODO: this should not be duplicated if self.debian_branch: cmd.append('--debian-branch=%s' % self.debian_branch) if self.upstream_branch: cmd.append('--upstream-branch=%s' % self.upstream_branch) if self.pristine_tar: cmd.append('--pristine-tar') if self.filter_orig: 
cmd.append('--filter-pristine-tar') for pattern in self.filter_orig: cmd.append('--filter=%s' % pattern) else: cmd.append('--no-pristine-tar') if self.is_merged: cmd.append('--merge') else: cmd.append('--no-merge') upstream_vcs_tag = self.get('upstream_vcs_tag') if upstream_vcs_tag: # We need the tags from the upstream vcs self.git.fetch('--all', '--tags') cmd.append('--upstream-vcs-tag={}'.format(upstream_vcs_tag)) cmd.append(tarball) return util.log_check_call(cmd, cwd=self.path) def _get_upstream_release(self, ignore_branch, current=False, version=None): if self.is_native(): return False logging.debug('%s: Fetching upstream tarball.', self.name) if self.repo.is_dirty(untracked_files=True): raise(GitDirty('Uncommited changes')) if not ignore_branch and self.debian_branch != self.branch: if not self.switch_branches(self.debian_branch): raise(GitDirty('Could not switch branches')) self.safe_pull() # Need to reread changelog self.update_changelog() if version and dfsg_re.match(version): version = dfsg_re.sub(r'\1', version) # Let's check if it's already downloaded download = self._get_upstream_release_download(version, current) if not download: return False upstream_version, tarball = download tag = self.tag_template('upstream', upstream_version) if tag not in self.repo.tags: self._get_upstream_release_call_import_orig(upstream_version, tarball) # TODO: if requested version is dfsg, it should apply the fixes # corresponding to that version if self.is_dfsg(): return self.new_dfsg_version(upstream_version) return upstream_version def _upstream_changes_process_args(self, old_version, new_version, ignore_branch): if ignore_branch is None: ignore_branch = self.get('ignore_branch') if not old_version and not new_version: sections = self.changelog.sections new_version = self.upstream_version for section in sections: version = Version(section.version) if version.upstream_version != new_version: old_version = version.upstream_version break elif not old_version: old_version = self.upstream_version elif not new_version: new_version = self.upstream_version return old_version, new_version, ignore_branch def upstream_changes(self, old_version=None, new_version=None, ignore_branch=None): logging.debug('%s: Checking changes between upstream releases', self.name) old_version, new_version, ignore_branch = \ self._upstream_changes_process_args( old_version, new_version, ignore_branch) if not old_version: return True old_tag = self.tag_template('upstream', old_version) new_tag = self.tag_template('upstream', new_version) # Are this versions imported and tagged? 
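# (Once both tags exist, the comparison at the end of this method boils down
#  to roughly "git diff upstream/1.2.2 upstream/1.2.3" -- tag names here are
#  hypothetical; an empty diff means the two releases ship identical trees,
#  and new_upstream_release() below then treats the candidate as not being a
#  real new upstream release.)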
if old_tag not in self.repo.tags: self._get_upstream_release(ignore_branch=ignore_branch, version=old_version) if new_tag not in self.repo.tags: self._get_upstream_release(ignore_branch=ignore_branch, version=new_version) if old_tag not in self.repo.tags: # still not there, nothing to compare return True return self.repo.tags[old_tag].commit.tree.diff(new_tag) def new_upstream_release(self, ignore_branch=None): logging.debug( '{}: Searching for a new upstream release'.format(self.name)) if ignore_branch is None: ignore_branch = self.get('ignore_branch') request_version = self.get('request_version') if not self.check_path(): logging.error('{}: {} doesn\'t exist'.format(self.name, self.path)) return False status = self._create_upstream_branch() if status == OK: upstream_version = self._get_upstream_release( ignore_branch, current=False, version=request_version) else: return status == IGNORE if not upstream_version: logging.debug('{}: No upstream version found.'.format(self.name)) return False # upstream_changes imports the corresponding upstream tags if not self.upstream_changes(new_version=upstream_version): return False if self.upstream_version == upstream_version: return False version = self.get_new_version(upstream_version) msg = 'New upstream release ({}).'.format(upstream_version) cmd = ['dch', '-p', '-v', version, msg] util.log_check_call(cmd, cwd=self.path) # Just changed the changelog, but its probably not going to be used # anymore. self._changelog = None self.git.add(os.path.join('debian', 'changelog')) self.git.commit('-m', msg) return True def get_status(self): status = [] status.append('Package: {}'.format(self.name)) status.append('+ Directory: {}'.format(self.path)) if not self.check_path(): status.append('! Status: Error, "{}" doesn\'t exist'.format( self.path)) return status status.extend(self._get_status_branch()) return status def _get_status_switch_branches(self, status): # Check if there are non commited changes dirty = self.repo.is_dirty(untracked_files=True) if dirty: status.append('! Status: Uncommited changes') status.append(self.git.status()) # Check is head is detached if not self.branch: status.append('! Status: HEAD is detached') # Check current version (only in debian) if not self.branch or self.branch not in self.debian_branch: if dirty: status.append('! Status: Can\'t check version, dirty branch') return False # If it's not dirty we can change branches, right? if not self.switch_branches(self.debian_branch): status.append( '! Status: Error, change to branch {} failed'.format( self.debian_branch)) return False else: status.append('* Switched to branch: {}'.format(self.branch)) return True def _get_status_branch(self): status = [] status.append('+ Branch: {}'.format(self.branch)) error = False branches = {'debian': self.debian_branch} if self.pristine_tar: branches['pristine-tar'] = self.pristine_tar_branch if not self._get_status_switch_branches(status): return status status.append('+ Version: {}'.format(self.version)) # Now that we are in the debian branch we can check if the package is # native or not. if not self.is_native(): branches['upstream'] = self.upstream_branch for k, v in branches.items(): if v not in self.repo.heads: status.append('! 
Status: Error, Missing {} branch: {}'.format( k, v)) error = True if error: return status # Check upstream tag with current version # TODO: # status.append(self.tag_template('upstream')) # Check if released # released = (self.distribution.lower() != 'unreleased') status.append('+ Distribution: {}'.format(self.distribution)) # if released, check if tagged in debian repo # TODO: # status.append(self.tag_template('debian')) # TODO: it should be possible to obtain the state without actually # changing branches status.extend(self._get_status_uscan()) status.extend(self._get_status_repo()) status.extend(self._get_status_build()) return status def _get_status_uscan(self): '''Check if the package is up to date with uscan.''' # Native packages don't have uscan status if self.is_native(): return [] try: uscan_status = self.uscan() uscan_status.scan(download=False) except UscanError as e: return ['! Status: Error while running uscan: {}'.format(str(e))] status = [] if not uscan_status.uptodate: status.append( '! Status: New upstream release available.\n' 'Local version: {0.version}.\n' '! Upstream version: {0.uversion}.\n' 'Status: Source URL: {0.url}'.format(uscan_status) ) if uscan_status.tarball: status.append('- Source: Already downloaded in {}'.format( uscan_status.tarball) ) return status def _get_status_repo(self): # Check if up to date with git repo status = [] remote_name = self._get_tracking_remote_name(self.repo.active_branch) if not remote_name: status.append('! Status: Branch not associated with a remote') return status try: self.git.fetch(remote_name, tags=True) ahead, behind = self.ahead_behind() if behind > 0: status.append( '! Status: Remote changes commited ({}), pull them'.format( behind)) if ahead > 0: status.append( '! Status: Local changes commited ({})'.format(ahead)) except git.exc.GitCommandError: status.append('! Status: Branch not associated with a remote') return status def _get_status_build(self): status = [] if not self.changes_file: status.append('- Status: The package has not been built') signed = False if self.changes_file: ret = util.quiet(['gpg', '--verify', self.changes_file]) signed = (ret == 0) unreleased = (self.distribution.lower() == 'unreleased') if signed and not unreleased: if self.upload_file: status.append('+ Status: Package signed for {}, uploaded'.format( self.distribution)) else: status.append( '! 
Status: Package signed for {}, not uploaded'.format( self.distribution)) if self.changes_file and not signed: status.append('- Status: Package has been built but not signed') # Check lintian # Check for reported errors # Check errors reported upstream # tarball_dir # export_dir # self.changelog.version # self.changelog.epochless_version return status def symbols_files(self, ignore_branch=None): if ignore_branch is None: ignore_branch = self.get('ignore_branch') if not ignore_branch and self.debian_branch != self.branch: if not self.switch_branches(self.debian_branch): raise(GitBranchError('Could not change to branch {}'.format( self.debian_branch))) return glob.glob(os.path.join(self.path, 'debian', '*.symbols')) def has_symbols(self, ignore_branch=True): return bool(self.symbols_files(ignore_branch=ignore_branch)) def has_tests(self): dirname = os.path.join(self.path, 'debian', 'tests') filename = os.path.join(dirname, 'control') return os.path.isdir(dirname) and os.path.exists(filename) # vi:expandtab:softtabstop=4:shiftwidth=4:smarttab arriero-0.7~20161228/arriero/uscan.py000066400000000000000000000244201306715713600172100ustar00rootroot00000000000000# -*- coding: utf8 -*- # Copyright: 2013-2014, Maximiliano Curia # # This program is free software; you can redistribute it and/or modify it # under the terms of the GNU General Public License as published by the Free # Software Foundation; either version 2 of the License, or (at your option) # any later version. # # This program is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or # FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for # more details. # # You should have received a copy of the GNU General Public License along with # this program; if not, write to the Free Software Foundation, Inc., 51 # Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. import collections import glob import logging import os import re import subprocess import debian.debian_support as ds import lxml.etree from . 
import util from .errors import UscanError # Taken from gbp.deb.uscan and modified to suit this program # Copyright: 2012, Guido Günther # License: GPL-2+ '''Interface to uscan''' class Uscan(object): cmd = '/usr/bin/uscan' class Package(object): upstream_version = None debian_upstream_version = None @property def version(self): return self.upstream_version \ if self.upstream_version else self.debian_upstream_version def __init__(self, dir='.', destdir='..'): self._uptodate = False self._tarball = None self._version = None self._uversion = None self._url = None self._dir = os.path.abspath(dir) self._destdir = destdir @property def uptodate(self): return self._uptodate @property def tarball(self): return self._tarball @property def version(self): return self._version if self._version else self._uversion @property def uversion(self): return self._uversion @property def url(self): return self._url def _process_package_check_repacked(self, entry, package, filename): m = re.search(r'Successfully repacked (?:.+) as (.+),', entry['messages']) if m: filename = m.group(1) # Use the repacked version m = re.match(r'(?:.*)_([^_]+)\.orig\.', filename) if m: package.upstream_version = m.group(1) return filename def _process_package_check_symlink(self, entry, filename): if not filename: m = re.match(r'.*symlinked ([^\s]+) to it', entry['messages']) if m: filename = m.group(1) return filename def _process_package_check_downloaded(self, entry, filename): if not filename: m = re.match(r'Successfully downloaded updated package ' r'(.+)', entry['messages']) if m: filename = m.group(1) return filename def _process_package_check_orig(self, package, filename): def _get_ext(package): r = os.path.splitext(package.url) if len(r) > 1: return r[1] r = os.path.splitext(package.tarball) if len(r) > 1: return r[1] return '' if not filename: ext = _get_ext(package) if package.package and package.version and ext: filename = '{0.package}_{0.version}.orig.tar{1}'.format( package, ext) return filename def _process_package_set_tarball(self, package, destdir, filename): if filename: fullpath = os.path.join(destdir, filename) if os.path.exists(fullpath): package.tarball = fullpath if (not package.tarball) and package.package and package.version: filename_glob = '{0.package}_{0.version}.orig.tar.*'.format( package) wild = os.path.join(destdir, filename_glob) files = glob.glob(wild) if len(files): filename = os.path.basename(files[0]) package.tarball = files[0] if (not package.tarball) and package.url: filename = package.url.rsplit('/', 1)[1] fullpath = os.path.join(destdir, filename) if os.path.exists(fullpath): package.tarball = fullpath return filename def _process_package(self, entry, destdir): package = self.Package() package.package = entry['package'] package.status = entry['status'] package.upstream_version = entry['upstream-version'] package.debian_upstream_version = entry['debian-uversion'] package.url = entry['upstream-url'] package.tarball = entry['target-path'] filename = entry['target'] filename = self._process_package_check_repacked(entry, package, filename) filename = self._process_package_check_symlink(entry, filename) filename = self._process_package_check_downloaded(entry, filename) filename = self._process_package_check_downloaded(entry, filename) filename = self._process_package_check_orig(package, filename) filename = self._process_package_set_tarball(package, destdir, filename) package.filename = filename return package def _parse(self, out, destdir=None): r''' Parse the uscan output return and update the 
object's properties @param out: uscan output @type out: string >>> u = Uscan('http://example.com/') >>> u._parse('virt-viewer_0.4.0.orig.tar.gz') >>> u.tarball '../virt-viewer_0.4.0.orig.tar.gz' >>> u.uptodate False >>> u._parse('') Traceback (most recent call last): ... UscanError: Couldn't find 'upstream-url' in uscan output >>> u._parse('uscan: no watch file found') Traceback (most recent call last): ... UscanError: Uscan warning: uscan: no watch file found ''' xml_root = lxml.etree.fromstring(out) if xml_root.tag != 'dehs': raise UscanError( 'Unexpected uscan output, missing dehs tag: {}'.format(out)) packages = [] current = None for i, child in enumerate(xml_root): # logging.info('_parse: {}'.format(child.tag)) if child.tag == 'package': current = collections.defaultdict(str) packages.append(current) if current is None: if child.tag == 'warnings': raise UscanError('Uscan warning: {}'.format(child.text)) raise UscanError('Unexpected uscan output: {}'.format(out)) current[child.tag] = child.text # logging.info('%s', str(packages)) if not destdir: destdir = self._destdir latest = None for entry in packages: package = self._process_package(entry, destdir) if not latest or \ ds.version_compare( latest.version, package.version) < 0: latest = package if not latest: raise UscanError('Unexpected uscan output: {}'.format(out)) self._uptodate = (latest.status == 'up to date') self._tarball = latest.tarball self._version = latest.debian_upstream_version self._uversion = latest.upstream_version self._url = latest.url def _raise_error(self, out): r''' Parse the uscan output for errors and warnings and raise a L{UscanError} exception based on this. If no error detail is found a generic error message is used. @param out: uscan output @type out: string @raises UscanError: exception raised >>> u = Uscan('http://example.com/') >>> u._raise_error("uscan warning: " ... "In watchfile debian/watch, reading webpage\n" ... "http://a.b/ failed: 500 Cant connect " ... "to example.com:80 (Bad hostname)") Traceback (most recent call last): ... UscanError: Uscan failed: uscan warning: In watchfile debian/watch, reading webpage http://a.b/ failed: 500 Cant connect to example.com:80 (Bad hostname) >>> u._raise_error("uscan: Can't use --verbose if " ... "you're using --dehs!") Traceback (most recent call last): ... UscanError: Uscan failed: uscan: Can't use --verbose if you're using --dehs! >>> u = u._raise_error('') Traceback (most recent call last): ... UscanError: Uscan failed - debug by running 'uscan --verbose' ''' msg = None for n in ('errors', 'warnings'): m = re.search('<{0}>(.*)'.format(n), out, re.DOTALL) if m: msg = 'Uscan failed: {}'.format(m.group(1)) break if not msg: msg = "Uscan failed - debug by running 'uscan --verbose'" raise UscanError(msg) def scan(self, destdir=None, download=True, force_download=False, version=None): '''Invoke uscan to fetch a new upstream version''' if not destdir: destdir = self._destdir util.ensure_path(destdir) cmd = [self.cmd, '--symlink', '--destdir={}'.format(destdir), '--dehs', '--watchfile', 'debian/watch'] if not download: cmd.append('--report') if download and force_download or download and version: cmd.append('--force-download') if version: cmd.append('--download-version={}'.format(version)) logging.debug('Calling uscan: {}'.format(cmd)) p = subprocess.Popen(cmd, cwd=self._dir, universal_newlines=True, stdout=subprocess.PIPE) out = p.communicate()[0] # uscan exits with 1 in case of uptodate and when an error occured. 
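# For reference, a call like scan(download=True, version='1.2.3') ends up
# running roughly the following (destdir and version here are hypothetical):
#
#   uscan --symlink --destdir=../tarballs --dehs --watchfile debian/watch \
#         --force-download --download-version=1.2.3
#
# and the DEHS XML it prints on stdout is what _parse() consumes below.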
# Don't fail in the uptodate case: self._parse(out, destdir) if not self.uptodate and p.returncode: self._raise_error(out) if download and not self._tarball: raise UscanError("Couldn't find tarball") # vi:expandtab:softtabstop=4:shiftwidth=4:smarttab arriero-0.7~20161228/arriero/util.py000066400000000000000000000200601306715713600170500ustar00rootroot00000000000000#!/usr/bin/env python3 # -*- coding: utf8 -*- # Copyright: 2013-2015, Maximiliano Curia # # This program is free software; you can redistribute it and/or modify it # under the terms of the GNU General Public License as published by the Free # Software Foundation; either version 2 of the License, or (at your option) # any later version. # # This program is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or # FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for # more details. # # You should have received a copy of the GNU General Public License along with # this program; if not, write to the Free Software Foundation, Inc., 51 # Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. import collections import fcntl import io import itertools import logging import os import re import signal import string import struct import subprocess import sys import termios import pexpect import debian.deb822 as deb822 from .version import Version # subprocess/pexpect wrappers def pexpect_interact(cmd, **kwargs): '''Run command in a pty''' def sigwinch_passthrough(sig, data): s = struct.pack('HHHH', 0, 0, 0, 0) try: a = struct.unpack('hhhh', fcntl.ioctl(sys.stdout.fileno(), termios.TIOCGWINSZ, s)) p.setwinsize(a[0], a[1]) except io.UnsupportedOperation: pass if isinstance(cmd, str): cmd = [cmd] if 'shell' in kwargs: v = kwargs['shell'] del kwargs['shell'] if v: cmd = ['/bin/sh', '-c'] + cmd # Note this 'p' used in sigwinch_passthrough. 
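# spawn() runs the command on a new pseudo-terminal and interact() wires the
# user's real terminal through to it; sigwinch_passthrough() above keeps the
# child's window size in sync when the terminal is resized. A stripped-down
# standalone equivalent would look like (hypothetical command and arguments):
#
#   child = pexpect.spawn('dput', ['local', 'foo_1.0_amd64.changes'])
#   child.interact()
#   child.close()
#   exit_status = child.exitstatus
#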
p = pexpect.spawn(cmd[0], cmd[1:], **kwargs) old_handler = signal.signal(signal.SIGWINCH, sigwinch_passthrough) try: sigwinch_passthrough(signal.SIGWINCH, None) p.interact(None) finally: signal.signal(signal.SIGWINCH, old_handler) p.close() return p.exitstatus def quiet(cmd, *argv, **kwargs): '''Make an OS call without generating any output.''' with open(os.devnull, 'r+') as devnull: kw = chain_map(kwargs, universal_newlines=True) return subprocess.call(cmd, *argv, stdin=devnull, stdout=devnull, stderr=devnull, **kw) def log_popen(cmd, **kwargs): '''Equivalent to Popen, but with logging.''' str_cmd = cmd if isinstance(cmd, str) else ' '.join(cmd) kw = chain_map(kwargs, universal_newlines=True) logging.debug('Executing: %s with %s', str_cmd, kw) popen = subprocess.Popen(cmd, **kw) return popen def log_check_call(cmd, interactive=False, **kwargs): '''Equivalent to check_call, but logging before and after.''' str_cmd = cmd if isinstance(cmd, str) else ' '.join(cmd) kw = chain_map(kwargs) if not interactive: kw['universal_newlines'] = True logging.debug('Executing: {} with {}'.format(str_cmd, kw)) if interactive: returncode = pexpect_interact(cmd, **kw) else: p = subprocess.run(cmd, **kw) returncode = p.returncode logging.debug('\t{} ended with returncode {}'.format(str_cmd, returncode)) if returncode: raise subprocess.CalledProcessError(returncode, cmd) return returncode # Debian specific def version_at_distribution(source_name, distribution): result = rmadison(source_name, distribution=distribution) version = max(v for k, d in result.items() for v in d) return version def rmadison(source_name, url='debian', distribution=None): cmd = ['rmadison', '--url={}'.format(url), source_name] if distribution: cmd.extend(['-s', distribution]) logging.info('{}: {}'.format(source_name, cmd)) output = subprocess.check_output(cmd, universal_newlines=True) logging.info('{}: {}'.format(source_name, output)) result = {} for line in output.split('\n'): if '|' not in line: continue fields = line.split('|') version = fields[1].strip() # keep in mind, this is uses oldstable, stable, testing, unstable. # FIXME: We would need a way to learn this mappings. 
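# A typical rmadison output line looks like (illustrative values only):
#
#   arriero    | 0.6-1 | unstable | source, all
#
# i.e. fields[1] carries the version and fields[2] the suite name that the
# FIXME above would like to map back to a codename.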
dist = fields[2].strip() result.setdefault(dist, []).append(Version(version)) return result # Simple helpers class AttrDict(dict): def __init__(self, *args, **kwargs): super(AttrDict, self).__init__(*args, **kwargs) self.__dict__ = self class ChainMap(collections.ChainMap): default_value = None def __getitem__(self, key): for mapping in self.maps: try: value = mapping[key] if value is not None: return value except KeyError: pass return self.default_value # https://stackoverflow.com/questions/19503455/caching-a-generator/19504173 class CachingIterable(object): def __init__(self, iterable): self.iterable = iterable self.iter = iter(iterable) self.done = False self.vals = [] def __iter__(self): if self.done: return iter(self.vals) # chain vals so far & then gen the rest return itertools.chain(self.vals, self._gen_iter()) def _gen_iter(self): # gen new vals, appending as it goes for new_val in self.iter: self.vals.append(new_val) yield new_val self.done = True def chain_map(*ds, **kw): return ChainMap(*itertools.chain(ds, [kw])) class OrderedSet(deb822.OrderedSet, collections.MutableSet): discard = deb822.OrderedSet.remove def __reversed__(self): # Return an iterator of items in the order they were added return reversed(self.__order) def pop(self, last=True): if not self: raise KeyError('pop from an empty set') key = self.__order.pop() if last else self.__order.pop(0) self.__set.remove(key) return key def __repr__(self): return '{}({})'.format(self.__class__.__name__, ', '.join(self)) def __eq__(self, other): if isinstance(other, deb822.OrderedSet): return len(self) == len(other) and list(self) == list(other) return set(self) == set(other) # Filesystem def ensure_path(path): '''Create path if it doesn't exist.''' if not os.path.exists(path): logging.info('Creating path: %s', path) os.makedirs(path) # Configuration related def str2bool(s): return s and s.lower() in ('true', 'on', 'yes') def split(values): '''Split a comma separated string of values into a list.''' if values is None: values = '' return list(filter(lambda x: x is not '', re.split(r'[\s,]+', values))) # String mangling def expansions_needed(format_string): '''Get the amount and name of fields requested by a format string. Given a formated string in the "{}" syntax, returns the amount of positional arguments and named values it needs. ''' formatter = string.Formatter() pos = 0 # positional arguments required fields = collections.OrderedDict() for (_, field_name, _, _) in formatter.parse(format_string): if field_name is None: continue # just text if not field_name: pos += 1 # pos var continue fields[field_name] = None return pos, fields.keys() # bfs def bfs_gen(start, get_neighbors=None, visit=None): '''BFS generator implementation The starting position is not visited, to track the paths use a path as a starting point. :start: starting node :get_neighbors: function to obtain a sequence of reachable nodes :visit: function to be called on every visited node :returns: the node on the goal state or None. 
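A minimal illustrative run (hypothetical graph):

>>> graph = {'a': ['b', 'c'], 'b': ['d'], 'c': [], 'd': []}
>>> list(bfs_gen('a', get_neighbors=lambda n: graph.get(n, []),
...              visit=lambda n: n))
['b', 'c', 'd']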
''' done = set([start]) queue = collections.deque() if get_neighbors: queue.extend(get_neighbors(start)) while queue: node = queue.popleft() if node in done: continue yield visit(node) done.add(node) queue.extend(get_neighbors(node)) # vi:expandtab:softtabstop=4:shiftwidth=4:smarttab arriero-0.7~20161228/arriero/version.py000066400000000000000000000113741306715713600175700ustar00rootroot00000000000000#!/usr/bin/env python3 # -*- coding: utf8 -*- # Copyright: 2016, Maximiliano Curia # # This program is free software; you can redistribute it and/or modify it # under the terms of the GNU General Public License as published by the Free # Software Foundation; either version 2 of the License, or (at your option) # any later version. # # This program is distributed in the hope that it will be useful, but WITHOUT # ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or # FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for # more details. # # You should have received a copy of the GNU General Public License along with # this program; if not, write to the Free Software Foundation, Inc., 51 # Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. import re import debian.debian_support as ds class Version(ds.Version): '''Represents the different parts of the version string. Attributes: full_version : Full version string epoch : Epoch if any, None otherwise upstream_version: Upstream version debian_revision : Debian revision if any, None otherwise ''' known_vendors = set(['ubuntu']) vendor_re = re.compile(r'\d(?P[[:alpha:]]*)?[1-9][0-9.~]*$') def __init__(self, version_string): super(ds.Version, self).__init__(version_string) @staticmethod def from_parts(epoch=None, upstream_version='0', debian_revision=None): version_string = '' if epoch: version_string += '%s:' % epoch version_string += upstream_version if debian_revision: version_string += '-%s' % debian_revision return Version(version_string) @staticmethod def from_tag(tag): return Version(tag.replace('_', '~').replace('%', ':').replace('#', '')) @staticmethod def to_tag(version): tag = str(version).replace('~', '_').replace(':', '%') return re.sub(r'\.(?=\.|$|lock$)', '.#', tag) def is_native(self): return not self.debian_revision @property def vendor(self): if self.is_native(): version = self.upstream_version else: version = self.debian_revision match = self.vendor_re.search(version) if match and match.group('vendor') in self.known_vendors: return match.group('vendor') return '' def new_upstream_version(self, upstream_version): '''Returns a Version for the new upstream release.''' # Can be '' epoch = '%s:' % self.epoch if self.epoch else '' # Let's trust upstream # if dfsg_re.match(self.upstream) and dfsg_re.match(upstream): # upstream += '+dfsg' debian_revision = '-1~' if self.debian_revision else '' new_version = epoch + upstream_version + debian_revision return Version(new_version) @staticmethod def _bump(revision): '''Bump a revision so dpkg --compare-version considers it greater.''' match = re.match('(.*?)([0-9]*)$', revision) numeric_part = 0 if match.group(2): numeric_part = int(match.group(2)) numeric_part += 1 return match.group(1) + str(numeric_part) def is_pre_release(self): '''Check if its a pre release.''' return (self.debian_revision and '~' in self.debian_revision) or \ (not self.debian_revision and '~' in self.upstream_version) def pre_release(self): '''Returns a pre release Version from the current one.''' def bump_pre(revision): if '~' not in revision: return revision + '~' base, pre = 
revision.split('~', 1) return base + '~' + Version._bump(pre) if self.is_native(): new_upstream_version = bump_pre(self.upstream_version) new_debian_revision = self.debian_revision else: new_upstream_version = self.upstream_version new_debian_revision = bump_pre(self.debian_revision) return Version.from_parts(self.epoch, new_upstream_version, new_debian_revision) def release(self): '''Returns a release Version from a pre release one.''' def remove_pre(revision): pos = revision.find('~') if pos == -1: return revision return revision[:pos] if self.is_native(): new_upstream_version = remove_pre(self.upstream_version) new_debian_revision = self.debian else: new_upstream_version = self.upstream_version new_debian_revision = remove_pre(self.debian_revision) return Version.from_parts(self.epoch, new_upstream_version, new_debian_revision) arriero-0.7~20161228/bin/000077500000000000000000000000001306715713600146305ustar00rootroot00000000000000arriero-0.7~20161228/bin/arriero000077500000000000000000000003311306715713600162160ustar00rootroot00000000000000#!/usr/bin/env python3 # -*- coding: utf8 -*- import sys from pkg_resources import load_entry_point if __name__ == '__main__': sys.exit( load_entry_point('Arriero', 'console_scripts', 'arriero')() ) arriero-0.7~20161228/examples/000077500000000000000000000000001306715713600156765ustar00rootroot00000000000000arriero-0.7~20161228/examples/arriero.conf000066400000000000000000000121341306715713600202110ustar00rootroot00000000000000[DEFAULT] basedir=~/local/kde upload-command = reprepro -Vb ~/share/reprepro include %(distribution)s %(changes_file)s [kde-req] packages: akonadi soprano # packages: akonadi automoc eigen2 phonon polkit-qt-1 soprano path: %(basedir)s/kde-req/%(package)s [soprano] debian-branch: experimental #[polkit-kde] #packages: polkit-kde-1 #path: %(basedir)s/kde-req/%(package)s #depends: kde4libs [kde_libs] packages: kactivities kde4libs nepomuk-core nepomuk-widgets path: %(basedir)s/kde-sc/%(package)s depends: kde-req debian-branch: kde4.10 [kactivities] depends: kde4libs [nepomuk-core] depends: kde4libs [nepomuk-widgets] depends: nepomuk-core [kdepimlibs] packages: kdepimlibs path: %(basedir)s/kde-sc/%(package)s depends: kde_libs debian-branch: kde4.10 [kdegames] packages: bomber bovo granatier kajongg kapman katomic kblackbox kblocks kbounce kbreakout kdiamond kfourinline kgoldrunner kigo killbots kiriki kjumpingcube klickety klines kmahjongg kmines knavalbattle knetwalk kolf kollision konquest kpat kreversi kshisen ksirk ksnakeduel kspaceduel ksquares ksudoku ktuberling kubrick libkdegames libkmahjongg lskat palapeli picmi path: %(basedir)s/kde-sc/%(package)s depends: kde_libs libkdegames [kajongg] depends: libkdegames libkmahjongg [kmahjongg] depends: libkdegames libkmahjongg [kshisen] depends: libkdegames libkmahjongg [kdemultimedia] packages: audiocd-kio dragon ffmpegthumbs juk kmix kscd libkcddb libkcompactdisc mplayerthumbs path: %(basedir)s/kde-sc/%(package)s depends: kde_libs [audiocd-kio] depends: libkcddb libkcompactdisc [kdeaccessibility] packages: jovie kaccessible kmag kmouth kmousetool path: %(basedir)s/kde-sc/%(package)s depends: kde_libs debian-branch: kde4.10 [kde_base_artwork] packages: kde-base-artwork kde-wallpapers oxygen-icons path: %(basedir)s/kde-sc/%(package)s depends: kde_libs [kde-wallpapers] debian-branch: kde4.10 [oxygen-icons] debian-branch: kde4.10 [kde_baseapps] packages: kde-baseapps kate konsole path: %(basedir)s/kde-sc/%(package)s depends: kde_libs debian-branch: kde4.10 [konsole] depends: 
kde-baseapps [kdeedu] packages: analitza blinken cantor kalgebra kalzium kanagram kbruch kgeography khangman kig kiten klettres kmplot kstars ktouch kturtle kwordquiz libkdeedu marble pairs parley rocs step path: %(basedir)s/kde-sc/%(package)s depends: kde_libs libkdeedu debian-branch: kde4.10 [cantor] depends: cantor [kalgebra] depends: analitza [kmplot] debian-branch: master [pairs] debian-branch: master [kde-workspace] packages: kde-workspace path: %(basedir)s/kde-sc/%(package)s depends: kdepimlibs debian-branch: kde4.10 [kde-runtime] packages: kde-runtime path: %(basedir)s/kde-sc/%(package)s depends: kdepimlibs debian-branch: kde4.10 [kdegraphics] packages: gwenview kamera kcolorchooser kdegraphics-mobipocket kdegraphics-strigi-analyzer kdegraphics-thumbnailers kgamma kolourpaint kruler ksaneplugin ksnapshot libkdcraw libkexiv2 libkipi libksane okular svgpart path: %(basedir)s/kde-sc/%(package)s # gwenview build-depends on libkonq5-dev depends: kde-baseapps debian-branch: kde4.10 [gwenview] depends: libkipi [kdegraphics-mobipocket] depends: okular [kdegraphics-thumbnailers] depends: libkdcraw libkexiv2 [ksaneplugin] depends: libksane debian-branch: master [ksnapshot] depends: libkipi [kdesdk] packages: kdesdk path: %(basedir)s/kde-sc/%(package)s depends: kde-baseapps debian-branch: kde4.10 [kdeartwork] packages: kdeartwork path: %(basedir)s/kde-sc/%(package)s depends: kde-workspace kdegraphics debian-branch: kde4.10 [kdebindings] # perlqt and perlkde not packaged yet # qyoto -> uics is licensed under a qt preview license. :( # (is it really needed?) packages: korundum kross-interpreters pykde4 qtruby smokegen smokekde smokeqt path: %(basedir)s/kde-sc/%(package)s depends: kdegraphics kdepimlibs smokegen debian-branch: kde4.10 [korundum] depends: qtruby [qtruby] depends: smokekde [qyoto] debian-branch: master [smokekde] depends: smokeqt [kdepim] packages: kdepim kdepim-runtime path: %(basedir)s/kde-sc/%(package)s depends: kdepimlibs debian-branch: experimental [kdewebdev] packages: kdewebdev path: %(basedir)s/kde-sc/%(package)s depends: kdepimlibs debian-branch: kde4.10 [kdeadmin] packages: kdeadmin path: %(basedir)s/kde-sc/%(package)s depends: kdepimlibs debian-branch: kde4.10 [kdeutils] packages: ark filelight kcalc kcharselect kdf kfloppy kgpg kremotecontrol ktimer kwallet print-manager superkaramba sweeper path: %(basedir)s/kde-sc/%(package)s depends: kde-baseapps debian-branch: kde4.10 [kcharselect] debian-branch: master [print-manager] debian-branch: master [kdenetwork] packages: kdenetwork path: %(basedir)s/kde-sc/%(package)s depends: kde-baseapps kde-workspace debian-branch: kde4.10 [kdeplasma-addons] packages: kdeplasma-addons path: %(basedir)s/kde-sc/%(package)s depends: kdegraphics kde-workspace debian-branch: kde4.10 [kdetoys] packages: kdetoys path: %(basedir)s/kde-sc/%(package)s depends: kde-workspace debian-branch: kde4.10 [kde-extras] packages: amarok path: %(basedir)s/kde-extras/%(package)s depends: kde_libs arriero-0.7~20161228/examples/gbp.example.conf000066400000000000000000000010321306715713600207430ustar00rootroot00000000000000[git-buildpackage] postbuild = lintian -I --show-overrides $GBP_CHANGES_FILE export-dir = ../build-area/ tarball-dir = ../tarballs/ arch = amd64 [git-import-orig] merge = False filter = ['.svn', '.hg', '.bzr', 'CVS', 'debian/*'] [git-import-dsc] filter = [ 'CVS', '.cvsignore', '.hg', '.hgignore' '.bzr', '.bzrignore', '.gitignore' ] [remote-config pkg-libvirt] remote-url-pattern = ssh://git.debian.org/git/pkg-libvirt/%(pkg)s template-dir = 
/srv/alioth.debian.org/chroot/home/groups/pkg-libvirt/git-template arriero-0.7~20161228/hooks/000077500000000000000000000000001306715713600152035ustar00rootroot00000000000000arriero-0.7~20161228/hooks/autopkgtest/000077500000000000000000000000001306715713600175555ustar00rootroot00000000000000arriero-0.7~20161228/hooks/autopkgtest/B10autopkgtest000077500000000000000000000016271306715713600223260ustar00rootroot00000000000000#!/bin/sh export LANG=C.UTF-8 export LC_ALL=C.UTF-8 cd /tmp/buildd/*/debian/.. if [ ! -f debian/tests/control ]; then # No tests to run echo "Package does not have autopkgtest support, debian/tests/control is missing" exit 0 fi if [ ! -f debian/files ]; then echo "Package source is not built, debian/files is missing" >&2 exit 1 fi # runner/autopkgtest uses apt-utils's apt-ftparchive and # pbuilder's pbuilder-satisfydepends-classic apt-get install -y --force-yes autopkgtest exuberant-ctags apt-utils pbuilder rm /dev/random ln /dev/urandom /dev/random TMPADT=/tmp/adt mkdir -p "$TMPADT/out" binaries=$(awk '/\.deb / { print "--binary ../" $1 }' debian/files) autopkgtest \ --shell-fail --timeout-factor=2.0 \ --user $BUILDUSERNAME $binaries \ --built-tree "$PWD" -- adt-virt-null ret=$? if [ $ret -ne 0 ]; then /bin/bash < /dev/tty > /dev/tty 2> /dev/tty fi exit $ret arriero-0.7~20161228/hooks/list-missing/000077500000000000000000000000001306715713600176255ustar00rootroot00000000000000arriero-0.7~20161228/hooks/list-missing/list-missing000077500000000000000000000015071306715713600222000ustar00rootroot00000000000000#!/bin/bash echo "=== Start list-missing" if test -d debian/tmp; then (cd debian/tmp && find . -type f -o -type l | grep -v '/DEBIAN/' | sort) > debian/dhmk-install-list (for package in $(dh_listpackages); do (cd debian/${package} && find . -type f -o -type l) done; test -e debian/not-installed && sed '/^#/d;/^$/d;s|/$||;/^\.\//!s|^|./|' debian/not-installed | while read glob_patt; do (cd debian/tmp; find . 
'(' -path "${glob_patt}" -o -path "${glob_patt}"'/*' ')' '(' -type f -o -type l ')') done; ) | sort -u > debian/dhmk-package-list diff -u debian/dhmk-install-list debian/dhmk-package-list | sed '1,2d' | egrep '^-' || true echo "=== End list-missing" rm -f debian/dhmk-install-list debian/dhmk-package-list else echo "=== End list-missing" fi arriero-0.7~20161228/hooks/shell/000077500000000000000000000000001306715713600163125ustar00rootroot00000000000000arriero-0.7~20161228/hooks/shell/B20shell000077500000000000000000000000661306715713600176150ustar00rootroot00000000000000#!/bin/sh /bin/bash < /dev/tty > /dev/tty 2> /dev/tty arriero-0.7~20161228/hooks/shell/C20shell000077500000000000000000000000661306715713600176160ustar00rootroot00000000000000#!/bin/sh /bin/bash < /dev/tty > /dev/tty 2> /dev/tty arriero-0.7~20161228/hooks/shell_on_error/000077500000000000000000000000001306715713600202175ustar00rootroot00000000000000arriero-0.7~20161228/hooks/shell_on_error/C20shell000077500000000000000000000000661306715713600215230ustar00rootroot00000000000000#!/bin/sh /bin/bash < /dev/tty > /dev/tty 2> /dev/tty arriero-0.7~20161228/scripts/000077500000000000000000000000001306715713600155475ustar00rootroot00000000000000arriero-0.7~20161228/scripts/add_myself_to_uploaders.sh000077500000000000000000000016121306715713600227750ustar00rootroot00000000000000#!/bin/bash for path in $(arriero list -f path "$@"); do echo $path ( cd $path changes=0 uploaders=$(sed -n -r '/^Uploaders:/,/^[^[:space:]]/{ /^(Uploaders:|[[:space:]])/p }' debian/control) if echo "$uploaders" | grep -q 'Maximiliano Curia'; then : else sed -i -r '/^Uploaders:/,/^[^[:space:]]/{ /^(Uploaders:|[[:space:]])/ s|([^,])\s*$|\1,| /^[[:space:]]/ s|^[[:space:]]+| | /^(Uploaders:|[[:space:]])/! i \ Maximiliano Curia }' debian/control changes=1 fi if [ $changes -gt 0 ]; then dch 'Add myself to uploaders.' git commit -a -m 'Add myself to uploaders.' fi ) done arriero-0.7~20161228/scripts/bump_debhelper.sh000077500000000000000000000012111306715713600210560ustar00rootroot00000000000000#!/bin/bash for path in $(arriero list -f path "$@"); do echo $path ( cd $path changes=0 if [ $(cat debian/compat) -lt 9 ]; then echo 9 > debian/compat changes=1 fi dhv=$(sed -n -r 's!.*\W(debhelper\s+\(>= ([^)]+)\)).*!\2!p' debian/control) if [ $dhv != "9" ]; then sed -i -r 's!(\Wdebhelper\s+\(>=) [^)]+\)!\1 9)!' debian/control changes=1 fi if [ $changes -gt 0 ]; then dch 'Bump debhelper build-dep and compat to 9.' git commit -a -m 'Bump debhelper build-dep and compat to 9.' fi ) done arriero-0.7~20161228/scripts/bump_kde-sc.sh000077500000000000000000000012201306715713600202720ustar00rootroot00000000000000#!/bin/bash MSG='Bump kde-sc-dev-latest build dependency.' for path in $(arriero -c ~/.config/arriero-kde4.12.conf list -f path \ $(sed -n '/^[^# ]/s/:.*//p' ~/tmp/kde4.12)); do echo $path; ( cd $path change=0 if grep -q 'kde-sc-dev-latest ([^)]*)' debian/control && \ ! 
grep -q 'kde-sc-dev-latest (>= 4:4\.12[^)]*)' debian/control; then sed -i 's/kde-sc-dev-latest ([^)][^)]*)/kde-sc-dev-latest (>= 4:4.12)/' debian/control change=1 fi if [ $change -gt 0 ]; then dch "$MSG"; git commit -a -m "$MSG" fi ) done arriero-0.7~20161228/scripts/bump_standards.sh000077500000000000000000000012161306715713600211140ustar00rootroot00000000000000#!/bin/bash # This script is intended to be run as: # arriero exec -x 'bump_standards.sh 3.9.6' [packages] set -e if [ $# -lt 1 ]; then echo "usage: $0 STANDARDS-VERSION" > /dev/stderr exit 1 fi new_version="$1" old_version=$(sed -n 's/^Standards-Version:\s*\(\S*\)/\1/p' debian/control) newer=$(sort -rV <(echo ${new_version}) <(echo ${old_version}) | head -n1) if [ "${newer}" = "${old_version}" ]; then # No changes needed exit 0 fi sed -i 's/^Standards-Version:.*$/Standards-Version: '"${new_version}"'/' debian/control dch "Bump Standards-Version to ${new_version}." git commit -a -m "Bump Standards-Version to ${new_version}." arriero-0.7~20161228/scripts/drop_x-testsuite_field.sh000077500000000000000000000000721306715713600225720ustar00rootroot00000000000000#!/bin/sh sed -i '/^X[A-Z]*-Testsuite:/d' debian/control arriero-0.7~20161228/scripts/need_to_merge.sh000077500000000000000000000006731306715713600207100ustar00rootroot00000000000000#!/bin/bash for path in $(arriero list -f path "$@"); do # echo $path ( cd $path branch=$(git rev-parse --abbrev-ref HEAD) if [ "$branch" != "master" ]; then out=$(git rev-list --left-right --count master..."$branch") master=$(echo $out | sed 's/\s.*//') if [ "$master" -gt 0 ]; then echo $path echo $out fi fi ) done arriero-0.7~20161228/scripts/new_repos.sh000077500000000000000000000024311306715713600201070ustar00rootroot00000000000000#!/bin/sh path="$1" vcs_pull="git://git.debian.org/collab-maint/cinnamon" vcs_push="git.debian.org:/git/collab-maint/cinnamon" vcs_upstream="https://github.com/linuxmint" if [ -e "$path" ]; then echo "Directory already exists" > /dev/stderr exit 1 fi name=$(basename "$path") mkdir "$path" cd "$path" git init git remote add origin "${vcs_pull}/${name}.git" git remote set-url --push origin "${vcs_push}/${name}.git" git remote add "$name" "${vcs_upstream}/${name}.git" git checkout --orphan master git reset; git commit --allow-empty -m 'Initial debian branch' git checkout --orphan upstream git reset; git commit --allow-empty -m 'Initial upstream branch' git checkout --orphan pristine-tar git reset; git commit --allow-empty -m 'Initial pristine-tar branch' git push origin master upstream pristine-tar --tags git branch --set-upstream-to origin/master master git branch --set-upstream-to origin/upstream upstream git branch --set-upstream-to origin/pristine-tar pristine-tar git config remote.origin.push refs/heads/master git config --add remote.origin.push refs/heads/upstream git config --add remote.origin.push refs/heads/pristine-tar git config --add remote.origin.push refs/tags/debian/* git config --add remote.origin.push refs/tags/upstream/* git pull --all arriero-0.7~20161228/scripts/new_upstream_release.py000077500000000000000000000045701306715713600223430ustar00rootroot00000000000000#!/usr/bin/env python3 from collections import defaultdict import re import subprocess import sys COMMIT = 'New upstream release.' 
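# Tidy the newest debian/changelog block after a 'New upstream release.'
# commit: drop duplicated ' * New upstream release.' bullets (and any
# maintainer sub-sections left empty by that), re-add a single entry at the
# top of the block, then amend the git commit in place.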
MSG = ' * New upstream release.\n' changelog_trail = re.compile(r'^ --') changelog_multimaint = re.compile(r'^\s*\[\s*(\b[^]]*\b)\s*\]\s*$') changelog_empty = re.compile(r'^\s*$') entry = re.compile(r' \* New upstream release\.') TRAIL = ('', 0) for path in subprocess.check_output(['arriero', 'list', '-f', 'path'] + sys.argv[1:], universal_newlines=True).split('\n'): if not path: continue print(path) output = subprocess.check_output(['git', 'log', 'HEAD~1..HEAD', '--oneline'], cwd=path, universal_newlines=True) last = re.sub('^[^\s]+\s+', '', output.rstrip('\n')) if last != COMMIT: continue filename = '%s/debian/changelog' % (path,) f = open(filename) lines = f.readlines() skip = 0 first_block = [] maintainers = defaultdict(set) maintainer = TRAIL entries = [] for i, line in enumerate(lines): first_block.append(line) # print(line, end='') m = changelog_multimaint.match(line) if m: maintainer = (m.group(1), i) elif changelog_empty.match(line): maintainer = TRAIL else: maintainers[maintainer].add(i) if entry.match(line): entries.append((i, maintainer)) if changelog_trail.match(line): break skip = len(first_block) delete = set() # print(maintainers) for e in entries: delete.add(e[0]) if e[1] == TRAIL: continue maintainers[e[1]].remove(e[0]) if not maintainers[e[1]]: i = e[1][1] delete.add(i) if i > 2 and changelog_empty.match(first_block[i - 1]): delete.add(i - 1) f = open(filename, 'w') # f = sys.stdout for i, line in enumerate(first_block): if i == 2: f.write(MSG) if changelog_multimaint.match(line) and i not in delete: f.write('\n') if i in delete: continue f.write(line) for line in lines[skip:]: f.write(line) f.close() subprocess.call(['git', 'commit', '-a', '--amend', '-m', 'New upstream release.'], cwd=path, universal_newlines=True) arriero-0.7~20161228/scripts/tag_uploaded.sh000077500000000000000000000004261306715713600205400ustar00rootroot00000000000000#!/bin/bash # This script is intended to be run as: # arriero exec -x tag_uploaded.sh -f version,distribution,urgency [packages] TAG_VERSION=$(echo $version | tr ':~' '%_') TAG="debian/$TAG_VERSION" DESC="$version $distribution; urgency=$urgency" git tag -s -m "$DESC" "$TAG" arriero-0.7~20161228/scripts/update.sh000077500000000000000000000016051306715713600173720ustar00rootroot00000000000000#!/bin/sh # This script is intended to be run as: # arriero exec -x "update.sh -m 'Commit message' script [args...]" [packages] commit_message='' for i in "$@"; do case "$i" in -m) shift commit_message="$1" shift ;; *) break ;; esac done if [ $# -lt 1 ]; then echo "usage: $0 [-m commit_message] command [args...]" > /dev/stderr exit 1 fi status=$(git status --porcelain) if [ -n "${status}" ]; then echo 'Git repository is dirty' > /dev/stderr exit 1 fi script="$1" "${@}" ret=$? 
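# The wrapped command has just run inside the package tree; propagate its
# failure instead of committing a partial change.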
if [ $ret -ne 0 ]; then exit $ret fi status=$(git status --porcelain) if [ -z "${status}" ]; then # No changes needed exit 0 fi wrap-and-sort if [ -z "${commit_message}" ]; then commit_message='Automatic update with '"$(basename "${script}")" fi git commit -a -m "${commit_message}" arriero-0.7~20161228/scripts/update_control.sh000077500000000000000000000007001306715713600211250ustar00rootroot00000000000000#!/bin/sh # This script is intended to be run as: # arriero exec -x "update_control.sh script" [packages] if [ $# -lt 1 ]; then echo "usage: $0 cmake_parser" > /dev/stderr exit 1 fi script="$1" "${@}" status=$(git status --porcelain) if [ -z "${status}" ]; then # No changes needed exit 0 fi wrap-and-sort -f debian/control git add debian/control git commit -m 'Automatic debian/control update with '"$(basename "${script}")" arriero-0.7~20161228/scripts/update_deps_cmake.sh000077500000000000000000000007151306715713600215460ustar00rootroot00000000000000#!/bin/sh # This script is intended to be run as: # arriero exec -x "update_deps_cmake.sh cmake_parser" [packages] if [ $# -lt 1 ]; then echo "usage: $0 cmake_parser" > /dev/stderr exit 1 fi cmake_parser="$1" ${cmake_parser} status=$(git status --porcelain) if [ -z "${status}" ]; then # No changes needed exit 0 fi wrap-and-sort -f debian/control git add debian/control git commit -m 'Update build-deps and deps with the info from cmake' arriero-0.7~20161228/scripts/update_vcs_browser.sh000077500000000000000000000010571306715713600220110ustar00rootroot00000000000000#!/bin/bash # This script is intended to be run as: # arriero exec -x update_vcs_fields.sh [packages] set -e MSG='debian/control: Update Vcs-Browser: field' sed -r -i '/^Vcs-Browser:/{ s|^Vcs-Browser:\s*https?://git\.debian\.org|Vcs-Browser: http://anonscm.debian.org/gitweb| s|^Vcs-Browser:\s*https?://anonscm\.debian\.org/gitweb/\?p=|Vcs-Browser: http://anonscm.debian.org/cgit/| }' debian/control status=$(git status --porcelain) if [ -z "${status}" ]; then # No changes needed exit 0 fi git add debian/control git commit -m "${MSG}" arriero-0.7~20161228/scripts/update_vcs_fields.sh000077500000000000000000000017311306715713600215730ustar00rootroot00000000000000#!/bin/bash # This script is intended to be run as: # arriero exec -x update_vcs_fields.sh [packages] set -e status=$(git status --porcelain) if [ -n "${status}" ]; then echo 'Git repository is dirty' > /dev/stderr exit 1 fi MSG='debian/control: Update Vcs-Browser and Vcs-Git fields' sed -r -i ' /^Vcs-Browser:/{ s|^Vcs-Browser:\s*https?://git\.debian\.org|Vcs-Browser: http://anonscm.debian.org/gitweb| s&^Vcs-Browser:\s*https?://anonscm\.debian\.org/(cgit/|gitweb/\?p=)&Vcs-Browser: https://anonscm.debian.org/git/& }; /^Vcs-Git:/{ s&^Vcs-Git:\s*(git|http)://anonscm\.debian\.org&Vcs-Git: https://anonscm.debian.org& \&Vcs-Git: https://anonscm.debian.org/[^g][^i][^t][^/]&{ s|^Vcs-Git:\s*https://anonscm\.debian\.org|Vcs-Git: https://anonscm.debian.org/git| } }; ' debian/control status=$(git status --porcelain) if [ -z "${status}" ]; then # No changes needed exit 0 fi wrap-and-sort git add debian/control git commit -m "${MSG}" arriero-0.7~20161228/scripts/upstream_diff.sh000077500000000000000000000012521306715713600207360ustar00rootroot00000000000000#!/bin/bash # This script is intended to be run as: # arriero exec -s upstream_diff.sh -f upstream_version,last_changelog_version [packages] last_upstream=$(echo "$last_changelog_version" | sed -r 's/^.*?://;s/-.*?$//') if [ -z "$upstream_version" -o -z "$last_upstream" ]; then # Nothing to 
compare exit 0 fi old=$(echo "$last_upstream" | tr '~:' '_%') new=$(echo "$upstream_version" | tr '~:' '_%') if [ "$new" = "$old" ]; then # No version change exit 0 fi echo '=================================================================' echo $PWD echo git diff "upstream/${last_upstream}...upstream/${upstream_version}" git diff "upstream/${old}...upstream/${new}" arriero-0.7~20161228/scripts/wrap_and_sort.sh000077500000000000000000000004161306715713600207510ustar00rootroot00000000000000#!/bin/bash # This script is intended to be run as: # arriero exec -x wrap_and_sort.sh [packages] set -e MSG='wrap-and-sort' wrap-and-sort status=$(git status --porcelain) if [ -z "${status}" ]; then # No changes needed exit 0 fi git commit -a -m "${MSG}" arriero-0.7~20161228/setup.py000077500000000000000000000035211306715713600155760ustar00rootroot00000000000000#!/usr/bin/env python3 from setuptools import setup, find_packages import glob import os def find_files(directory, target=None): ''' Recursively search for files, return value is in data_files format data_files expects a list of tuples, each tuple containing directory, and a list of files to copy to that directory. ''' files = [] if not target: target = directory for path, directories, filenames in os.walk(directory): path = path.replace(directory, target) files.append((path, [os.path.join(path, filename) for filename in filenames])) return files setup( name='Arriero', version='0.7', author='Maximiliano Curia', author_email='maxy@debian.org', description='Arriero Package Helper', long_description=''' Arriero is a tool that allows simplifying the management of *Debian* packages, particularly useful when having to make new upstream releases, builds and uploads of similar packages. It relies heavily in the use of *git-buildpackage* and general *git* practices, so it's only useful for packages currently maintained through git. ''', url='http://anonscm.debian.org/git/collab-maint/arriero.git', license='GPLv2+', requires=['debian', 'git', 'lxml', 'pexpect'], classifiers=[ 'Environment :: Console', 'Programming Language :: Python :: 3', 'Topic :: Software Development :: Version Control :: Git', 'Operating System :: POSIX :: Linux', ], packages=find_packages(exclude=['tests', 'tests.*']), entry_points={ 'console_scripts': ['arriero = arriero.arriero:main'], }, data_files=[ ('scripts', glob.glob('scripts/*')), ('examples', glob.glob('examples/*')), ] + find_files('hooks'), test_suite='tests', ) arriero-0.7~20161228/tests/000077500000000000000000000000001306715713600152225ustar00rootroot00000000000000arriero-0.7~20161228/tests/__init__.py000066400000000000000000000013651306715713600173400ustar00rootroot00000000000000#!/usr/bin/env python # encoding: utf-8 # Unit tests for arriero # Copyright (C) 2015 Maximiliano Curia # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, see . 
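To make the data_files shape described in find_files()'s docstring concrete, here is a rough, hand-written sketch of the value find_files('hooks') is expected to return for the hooks/ tree shipped above (ordering depends on os.walk; this listing is illustrative, not generated output):

# Illustrative only: approximate return value of setup.py's find_files('hooks').
# Each tuple pairs an install directory with the files copied into it, which is
# the shape setuptools expects for data_files; directories that only contain
# subdirectories still show up, paired with an empty list.
expected_data_files = [
    ('hooks', []),
    ('hooks/autopkgtest', ['hooks/autopkgtest/B10autopkgtest']),
    ('hooks/list-missing', ['hooks/list-missing/list-missing']),
    ('hooks/shell', ['hooks/shell/B20shell', 'hooks/shell/C20shell']),
    ('hooks/shell_on_error', ['hooks/shell_on_error/C20shell']),
]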
arriero-0.7~20161228/tests/test_arriero.py000066400000000000000000001034121306715713600202770ustar00rootroot00000000000000#!/usr/bin/env python3 # encoding: utf-8 import io import os import shutil import subprocess import sys import tempfile import unittest from unittest.mock import DEFAULT, patch # Own imports import arriero def create_basic_pkg(dirname): debian_dir = os.path.join(dirname, 'debian') os.mkdir(os.path.join(dirname, 'debian')) # The \N{space} is used to avoid confusing the indenter basic_content = { 'changelog': '''\ basic-pkg (0.1-1) UNRELEASED; urgency=low \N{space} * Initial release. \N{space}-- Foo Bar Thu, 01 Jan 1970 00:00:00 +0000 ''', 'compat': '9', 'control': '''\ Source: basic-pkg Section: misc Priority: extra Maintainer: Foo Bar Build-Depends: debhelper (>= 9) Standards-Version: 3.9.6 Package: basic-pkg Architecture: any Depends: ${misc:Depends}, ${shlibs:Depends} Description: Null package used to test arriero \N{space}Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor \N{space}incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis \N{space}nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. \N{space}Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore \N{space}eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt \N{space}in culpa qui officia deserunt mollit anim id est laborum. ''', 'copyright': '''\ Format: http://www.debian.org/doc/packaging-manuals/copyright-format/1.0/ Upstream-Name: basic-pkg Files: debian/* Copyright: 1970, Foo Bar License: CC0 \ To the extent possible under law, the person who associated CC0 with \ this work has waived all copyright and related or neighboring rights \ to this work. 
''', 'rules': '''\ #!/usr/bin/make -f %: \tdh $@ ''', 'source/format': '3.0 (quilt)', } for filename, content in basic_content.items(): parents = filename.split('/')[:-1] for i, parent in enumerate(parents): parent_dir = os.path.join(*([debian_dir] + parents[:i + 1])) os.mkdir(parent_dir) full_filename = os.path.join(debian_dir, filename) with open(full_filename, 'w') as f: f.write(content) def create_basic_git_pkg(dirname): create_basic_pkg(dirname) subprocess.call(['git', 'init'], cwd=dirname) subprocess.call(['git', 'add', '-A', '.'], cwd=dirname) subprocess.call(['git', 'commit', '-m', 'Initial commit'], cwd=dirname) class TestArrieroRun(unittest.TestCase): def test_run(self): # No parameters argv = ['arriero'] with self.assertRaises(SystemExit) as sys_exit, \ patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() self.assertEqual(sys_exit.exception.code, 2, 'Unexpected exit code: {}'.format( sys_exit.exception.code)) out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertFalse(len(out), 'Unexpected stdout: {}'.format(out)) self.assertTrue(len(err), 'Missing usage output') # Help argv = ['arriero', '--help'] with self.assertRaises(SystemExit) as sys_exit, \ patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() self.assertEqual(sys_exit.exception.code, 0, 'Unexpected exit code: {}'.format( sys_exit.exception.code)) out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertTrue(len(out), 'Missing help output') self.assertFalse(len(err), 'Unexpected stderr: {}'.format(out)) class TestArrieroList(unittest.TestCase): def setUp(self): configuration = ''' [DEFAULT] basedir: /foo [_test_group_1] packages: atestpkg ztestpkg vcs_git: git://{{name}} export_dir: /nonexistent [_test_group_2] packages: atestpkg ztestpkg export_dir: /tmp/{{name}}/export_dir [ztestpkg] path: {path} depends: atestpkg [atestpkg] path: {path} depends: testpkg [testpkg] path: {path} ''' self.config_file = tempfile.NamedTemporaryFile( prefix='arriero.', suffix='.conf') self.pkg_path = tempfile.mkdtemp(prefix='arriero.') self.config_file.write( configuration.format(path=self.pkg_path).encode('utf8') ) self.config_file.flush() self.base_argv = ['arriero', '--config', self.config_file.name] def tearDown(self): self.config_file.close() if os.path.exists(self.pkg_path): shutil.rmtree(self.pkg_path) def test_list_basic_pkg(self): create_basic_pkg(self.pkg_path) argv = self.base_argv + ['list', '-F', '{branch}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertFalse(len(out), 'Unexpected stdout: {}'.format(out)) self.assertTrue(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{changes_file}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, '\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{dist}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, 
new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, 'UNRELEASED\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{distribution}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, 'UNRELEASED\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{dsc_file}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, '\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{is_dfsg}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, 'False\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{is_native}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, 'False\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{is_merged}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, 'False\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{last_changelog_distribution}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, '\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{last_changelog_version}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, '\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{source_name}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, 'basic-pkg\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) 
argv = self.base_argv + ['list', '-F', '{source_changes_file}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, '\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{upstream_version}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, '0.1\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{urgency}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, 'low\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{vcs_git}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, '\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{version}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, '0.1-1\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{version_at_distribution}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, '\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) def test_list_basic_git_pkg(self): create_basic_git_pkg(self.pkg_path) argv = self.base_argv + ['list', '-F', '{branch}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, 'master\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) def test_list_pkg_not_created(self): if os.path.exists(self.pkg_path): shutil.rmtree(self.pkg_path) argv = self.base_argv + ['list', '-F', '{name} {path}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, 'testpkg {path}\n'.format(path=self.pkg_path), 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{basedir}', 'testpkg'] with 
patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, '/foo\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{branch}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertFalse(len(out), 'Unexpected stdout: {}'.format(out)) self.assertTrue(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{changes_file}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertFalse(len(out), 'Unexpected stdout: {}'.format(out)) self.assertTrue(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{debian_branch} {depends}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual('master ()\n', out, 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{dist}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertFalse(len(out), 'Unexpected stdout: {}'.format(out)) self.assertTrue(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{distribution}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertFalse(len(out), 'Unexpected stdout: {}'.format(out)) self.assertTrue(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{dsc_file}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertFalse(len(out), 'Unexpected stdout: {}'.format(out)) self.assertTrue(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{export_dir}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() expected_path = os.path.join(os.path.dirname(self.pkg_path), 'build-area') self.assertEqual(out, '{path}\n'.format(path=expected_path), 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{has_symbols}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = 
_sys['stderr'].getvalue() self.assertEqual(out, 'False\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{i}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, '0\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{is_dfsg}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertFalse(len(out), 'Unexpected stdout: {}'.format(out)) self.assertTrue(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{is_native}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertFalse(len(out), 'Unexpected stdout: {}'.format(out)) self.assertTrue(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{is_merged}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, 'False\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{last_changelog_distribution}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertFalse(len(out), 'Unexpected stdout: {}'.format(out)) self.assertTrue(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{last_changelog_version}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertFalse(len(out), 'Unexpected stdout: {}'.format(out)) self.assertTrue(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{package}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, 'testpkg\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{source_name}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertFalse(len(out), 'Unexpected stdout: {}'.format(out)) self.assertTrue(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{source_changes_file}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, 
stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertFalse(len(out), 'Unexpected stdout: {}'.format(out)) self.assertTrue(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{upstream_branch}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, 'upstream\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{upstream_version}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertFalse(len(out), 'Unexpected stdout: {}'.format(out)) self.assertTrue(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{urgency}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertFalse(len(out), 'Unexpected stdout: {}'.format(out)) self.assertTrue(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{vcs_git}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, '\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{version}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertFalse(len(out), 'Unexpected stdout: {}'.format(out)) self.assertTrue(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-F', '{version_at_distribution}', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertFalse(len(out), 'Unexpected stdout: {}'.format(out)) self.assertTrue(len(err), 'Unexpected stderr: {}'.format(err)) def test_list_fields(self): # An empty result, no new line argv = self.base_argv + ['list', '-f', 'vcs_git', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertFalse(len(out), 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) # Test with a list of fields argv = self.base_argv + ['list', '-f', 'name,path package, debian_branch', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, 
'testpkg\t{path}\ttestpkg\tmaster\n'.format( path=self.pkg_path), 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) def test_list_sort(self): argv = self.base_argv + ['list', '-s', 'alpha', 'testpkg', 'ztestpkg', 'atestpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, 'atestpkg\ntestpkg\nztestpkg\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-s', 'raw', 'testpkg', 'ztestpkg', 'atestpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, 'testpkg\nztestpkg\natestpkg\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['list', '-s', 'build', 'testpkg', 'ztestpkg', 'atestpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, 'testpkg\natestpkg\nztestpkg\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) def test_list_groups(self): argv = self.base_argv + ['list', '_test_group_1'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, 'atestpkg\nztestpkg\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) # Test inheritance argv = self.base_argv + ['list', '-f', 'export_dir, vcs_git', '_test_group_1'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, '/tmp/atestpkg/export_dir\tgit://atestpkg\n' '/tmp/ztestpkg/export_dir\tgit://ztestpkg\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) class TestArrieroExec(unittest.TestCase): def setUp(self): configuration = ''' [DEFAULT] basedir: /foo [testpkg] path: {path} ''' self.config_file = tempfile.NamedTemporaryFile( prefix='arriero.', suffix='.conf') self.pkg_path = tempfile.mkdtemp(prefix='arriero.') self.config_file.write( configuration.format(path=self.pkg_path).encode('utf8') ) self.config_file.flush() self.base_argv = ['arriero', '--config', self.config_file.name] create_basic_git_pkg(self.pkg_path) def tearDown(self): self.config_file.close() if os.path.exists(self.pkg_path): shutil.rmtree(self.pkg_path) def test_exec_basic_pkg(self): argv = self.base_argv + ['exec', '-x', 'echo \'{name}\'', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, 'testpkg\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['exec', 
'-x', 'echo \'{name}\'', '-x', 'echo \'{basedir}\'', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, 'testpkg\n/foo\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) argv = self.base_argv + ['exec', '-f', 'name', '-x', 'set | grep -a \'^name\'', 'testpkg'] with patch.object(sys, 'argv', new=argv), \ patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT, new_callable=io.StringIO) as _sys: arriero.main() out = _sys['stdout'].getvalue() err = _sys['stderr'].getvalue() self.assertEqual(out, 'name=testpkg\n', 'Unexpected stdout: {}'.format(out)) self.assertFalse(len(err), 'Unexpected stderr: {}'.format(err)) if __name__ == '__main__': unittest.main()
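The test cases above all build on one idiom: temporarily replacing sys.argv and swapping sys.stdout/sys.stderr for io.StringIO buffers with unittest.mock before calling arriero.main(). A minimal, self-contained sketch of that idiom, with a hypothetical stand-in main() instead of arriero's, looks like this:

#!/usr/bin/env python3
# Minimal illustration of the argv/stdout/stderr capture idiom used by the
# test cases above; main() here is a hypothetical stand-in, not arriero's.
import io
import sys
import unittest
from unittest.mock import DEFAULT, patch


def main():
    # Toy command line entry point: echo the first argument.
    print(sys.argv[1])
    return 0


class TestCapturePattern(unittest.TestCase):

    def test_echo(self):
        argv = ['tool', 'hello']
        with patch.object(sys, 'argv', new=argv), \
                patch.multiple(sys, stdout=DEFAULT, stderr=DEFAULT,
                               new_callable=io.StringIO) as _sys:
            main()
        # patch.multiple hands back a dict of the replacement objects.
        out = _sys['stdout'].getvalue()
        err = _sys['stderr'].getvalue()
        self.assertEqual(out, 'hello\n')
        self.assertFalse(err)


if __name__ == '__main__':
    unittest.main()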