bzr-git-0.6.13+bzr1649/.testr.conf0000644000000000000000000000025613165530605014517 0ustar 00000000000000[DEFAULT] test_command=BZR_PLUGINS_AT=git@`pwd` bzr selftest ^bzrlib.plugins.git. Git --subunit $IDOPTION $LISTOPT test_id_option=--load-list $IDFILE test_list_option=--list bzr-git-0.6.13+bzr1649/COPYING0000644000000000000000000004310513165530605013464 0ustar 00000000000000 GNU GENERAL PUBLIC LICENSE Version 2, June 1991 Copyright (C) 1989, 1991 Free Software Foundation, Inc. 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed. Preamble The licenses for most software are designed to take away your freedom to share and change it. By contrast, the GNU General Public License is intended to guarantee your freedom to share and change free software--to make sure the software is free for all its users. This General Public License applies to most of the Free Software Foundation's software and to any other program whose authors commit to using it. (Some other Free Software Foundation software is covered by the GNU Library General Public License instead.) You can apply it to your programs, too. When we speak of free software, we are referring to freedom, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for this service if you wish), that you receive source code or can get it if you want it, that you can change the software or use pieces of it in new free programs; and that you know you can do these things. To protect your rights, we need to make restrictions that forbid anyone to deny you these rights or to ask you to surrender the rights. These restrictions translate to certain responsibilities for you if you distribute copies of the software, or if you modify it. For example, if you distribute copies of such a program, whether gratis or for a fee, you must give the recipients all the rights that you have. You must make sure that they, too, receive or can get the source code. And you must show them these terms so they know their rights. We protect your rights with two steps: (1) copyright the software, and (2) offer you this license which gives you legal permission to copy, distribute and/or modify the software. Also, for each author's protection and ours, we want to make certain that everyone understands that there is no warranty for this free software. If the software is modified by someone else and passed on, we want its recipients to know that what they have is not the original, so that any problems introduced by others will not reflect on the original authors' reputations. Finally, any free program is threatened constantly by software patents. We wish to avoid the danger that redistributors of a free program will individually obtain patent licenses, in effect making the program proprietary. To prevent this, we have made it clear that any patent must be licensed for everyone's free use or not licensed at all. The precise terms and conditions for copying, distribution and modification follow. GNU GENERAL PUBLIC LICENSE TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION 0. This License applies to any program or other work which contains a notice placed by the copyright holder saying it may be distributed under the terms of this General Public License. 
The "Program", below, refers to any such program or work, and a "work based on the Program" means either the Program or any derivative work under copyright law: that is to say, a work containing the Program or a portion of it, either verbatim or with modifications and/or translated into another language. (Hereinafter, translation is included without limitation in the term "modification".) Each licensee is addressed as "you". Activities other than copying, distribution and modification are not covered by this License; they are outside its scope. The act of running the Program is not restricted, and the output from the Program is covered only if its contents constitute a work based on the Program (independent of having been made by running the Program). Whether that is true depends on what the Program does. 1. You may copy and distribute verbatim copies of the Program's source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice and disclaimer of warranty; keep intact all the notices that refer to this License and to the absence of any warranty; and give any other recipients of the Program a copy of this License along with the Program. You may charge a fee for the physical act of transferring a copy, and you may at your option offer warranty protection in exchange for a fee. 2. You may modify your copy or copies of the Program or any portion of it, thus forming a work based on the Program, and copy and distribute such modifications or work under the terms of Section 1 above, provided that you also meet all of these conditions: a) You must cause the modified files to carry prominent notices stating that you changed the files and the date of any change. b) You must cause any work that you distribute or publish, that in whole or in part contains or is derived from the Program or any part thereof, to be licensed as a whole at no charge to all third parties under the terms of this License. c) If the modified program normally reads commands interactively when run, you must cause it, when started running for such interactive use in the most ordinary way, to print or display an announcement including an appropriate copyright notice and a notice that there is no warranty (or else, saying that you provide a warranty) and that users may redistribute the program under these conditions, and telling the user how to view a copy of this License. (Exception: if the Program itself is interactive but does not normally print such an announcement, your work based on the Program is not required to print an announcement.) These requirements apply to the modified work as a whole. If identifiable sections of that work are not derived from the Program, and can be reasonably considered independent and separate works in themselves, then this License, and its terms, do not apply to those sections when you distribute them as separate works. But when you distribute the same sections as part of a whole which is a work based on the Program, the distribution of the whole must be on the terms of this License, whose permissions for other licensees extend to the entire whole, and thus to each and every part regardless of who wrote it. Thus, it is not the intent of this section to claim rights or contest your rights to work written entirely by you; rather, the intent is to exercise the right to control the distribution of derivative or collective works based on the Program. 
In addition, mere aggregation of another work not based on the Program with the Program (or with a work based on the Program) on a volume of a storage or distribution medium does not bring the other work under the scope of this License. 3. You may copy and distribute the Program (or a work based on it, under Section 2) in object code or executable form under the terms of Sections 1 and 2 above provided that you also do one of the following: a) Accompany it with the complete corresponding machine-readable source code, which must be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange; or, b) Accompany it with a written offer, valid for at least three years, to give any third party, for a charge no more than your cost of physically performing source distribution, a complete machine-readable copy of the corresponding source code, to be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange; or, c) Accompany it with the information you received as to the offer to distribute corresponding source code. (This alternative is allowed only for noncommercial distribution and only if you received the program in object code or executable form with such an offer, in accord with Subsection b above.) The source code for a work means the preferred form of the work for making modifications to it. For an executable work, complete source code means all the source code for all modules it contains, plus any associated interface definition files, plus the scripts used to control compilation and installation of the executable. However, as a special exception, the source code distributed need not include anything that is normally distributed (in either source or binary form) with the major components (compiler, kernel, and so on) of the operating system on which the executable runs, unless that component itself accompanies the executable. If distribution of executable or object code is made by offering access to copy from a designated place, then offering equivalent access to copy the source code from the same place counts as distribution of the source code, even though third parties are not compelled to copy the source along with the object code. 4. You may not copy, modify, sublicense, or distribute the Program except as expressly provided under this License. Any attempt otherwise to copy, modify, sublicense or distribute the Program is void, and will automatically terminate your rights under this License. However, parties who have received copies, or rights, from you under this License will not have their licenses terminated so long as such parties remain in full compliance. 5. You are not required to accept this License, since you have not signed it. However, nothing else grants you permission to modify or distribute the Program or its derivative works. These actions are prohibited by law if you do not accept this License. Therefore, by modifying or distributing the Program (or any work based on the Program), you indicate your acceptance of this License to do so, and all its terms and conditions for copying, distributing or modifying the Program or works based on it. 6. Each time you redistribute the Program (or any work based on the Program), the recipient automatically receives a license from the original licensor to copy, distribute or modify the Program subject to these terms and conditions. You may not impose any further restrictions on the recipients' exercise of the rights granted herein. 
You are not responsible for enforcing compliance by third parties to this License. 7. If, as a consequence of a court judgment or allegation of patent infringement or for any other reason (not limited to patent issues), conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. If you cannot distribute so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not distribute the Program at all. For example, if a patent license would not permit royalty-free redistribution of the Program by all those who receive copies directly or indirectly through you, then the only way you could satisfy both it and this License would be to refrain entirely from distribution of the Program. If any portion of this section is held invalid or unenforceable under any particular circumstance, the balance of the section is intended to apply and the section as a whole is intended to apply in other circumstances. It is not the purpose of this section to induce you to infringe any patents or other property right claims or to contest validity of any such claims; this section has the sole purpose of protecting the integrity of the free software distribution system, which is implemented by public license practices. Many people have made generous contributions to the wide range of software distributed through that system in reliance on consistent application of that system; it is up to the author/donor to decide if he or she is willing to distribute software through any other system and a licensee cannot impose that choice. This section is intended to make thoroughly clear what is believed to be a consequence of the rest of this License. 8. If the distribution and/or use of the Program is restricted in certain countries either by patents or by copyrighted interfaces, the original copyright holder who places the Program under this License may add an explicit geographical distribution limitation excluding those countries, so that distribution is permitted only in or among countries not thus excluded. In such case, this License incorporates the limitation as if written in the body of this License. 9. The Free Software Foundation may publish revised and/or new versions of the General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns. Each version is given a distinguishing version number. If the Program specifies a version number of this License which applies to it and "any later version", you have the option of following the terms and conditions either of that version or of any later version published by the Free Software Foundation. If the Program does not specify a version number of this License, you may choose any version ever published by the Free Software Foundation. 10. If you wish to incorporate parts of the Program into other free programs whose distribution conditions are different, write to the author to ask for permission. For software which is copyrighted by the Free Software Foundation, write to the Free Software Foundation; we sometimes make exceptions for this. Our decision will be guided by the two goals of preserving the free status of all derivatives of our free software and of promoting the sharing and reuse of software generally. NO WARRANTY 11. 
BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION. 12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. END OF TERMS AND CONDITIONS How to Apply These Terms to Your New Programs If you develop a new program, and you want it to be of the greatest possible use to the public, the best way to achieve this is to make it free software which everyone can redistribute and change under these terms. To do so, attach the following notices to the program. It is safest to attach them to the start of each source file to most effectively convey the exclusion of warranty; and each file should have at least the "copyright" line and a pointer to where the full notice is found. Copyright (C) This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program; if not, write to the Free Software Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA Also add information on how to contact you by electronic and paper mail. If the program is interactive, make it output a short notice like this when it starts in an interactive mode: Gnomovision version 69, Copyright (C) year name of author Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'. This is free software, and you are welcome to redistribute it under certain conditions; type `show c' for details. The hypothetical commands `show w' and `show c' should show the appropriate parts of the General Public License. Of course, the commands you use may be called something other than `show w' and `show c'; they could even be mouse-clicks or menu items--whatever suits your program. You should also get your employer (if you work as a programmer) or your school, if any, to sign a "copyright disclaimer" for the program, if necessary. Here is a sample; alter the names: Yoyodyne, Inc., hereby disclaims all copyright interest in the program `Gnomovision' (which makes passes at compilers) written by James Hacker. 
<signature of Ty Coon>, 1 April 1989
Ty Coon, President of Vice

This General Public License does not permit incorporating your program into proprietary programs. If your program is a subroutine library, you may consider it more useful to permit linking proprietary applications with the library. If this is what you want to do, use the GNU Library General Public License instead of this License.

bzr-git-0.6.13+bzr1649/HACKING

Please refer to HACKING in the Bazaar source distribution for topics not covered here.

bzr-git-0.6.13+bzr1649/INSTALL

You need the Dulwich Python module installed (or in PYTHONPATH) and a fairly recent version of bzr. You will also need either the python tdb or sqlite bindings; the latter are included with Python.

For more information on Dulwich, see http://samba.org/~jelmer/dulwich/

Installing bzr-git can be done by putting it into the ~/.bazaar/plugins directory, and renaming its directory to "git".

bzr-git-0.6.13+bzr1649/Makefile

DEBUGGER ?=
BZR_OPTIONS ?=
BZR ?= $(shell which bzr)
PYTHON ?= $(shell which python)
SETUP ?= ./setup.py
PYDOCTOR ?= pydoctor
CTAGS ?= ctags
PYLINT ?= pylint
RST2HTML ?= rst2html
TESTS ?= -s bp.git

all:: build

build::
	$(SETUP) build

build-inplace::

install::
	$(SETUP) install

clean::
	$(SETUP) clean
	rm -f *.so

check:: build-inplace
	BZR_PLUGINS_AT=git@$(shell pwd) $(DEBUGGER) $(PYTHON) $(PYTHON_OPTIONS) $(BZR) $(BZR_OPTIONS) selftest $(TEST_OPTIONS) $(TESTS)

check-all::
	$(MAKE) check TESTS="^bzrlib.plugins.git. Git"

check-verbose::
	$(MAKE) check TEST_OPTIONS=-v

check-one::
	$(MAKE) check TEST_OPTIONS=--one

check-random::
	$(MAKE) check TEST_OPTIONS="--random=now --verbose --one"

show-plugins::
	BZR_PLUGINS_AT=git@$(shell pwd) $(BZR) plugins -v

lint::
	$(PYLINT) -f parseable *.py */*.py

tags::
	$(CTAGS) -R .

ctags:: tags

coverage::
	$(MAKE) check BZR_OPTIONS="--coverage coverage"

.PHONY: update-pot po/bzr-git.pot
update-pot: po/bzr-git.pot

TRANSLATABLE_PYFILES:=$(shell find . -name '*.py' \
		| grep -v 'tests/' \
		)

po/bzr-git.pot: $(PYFILES) $(DOCFILES)
	BZR_PLUGINS_AT=git@$(shell pwd) bzr export-pot \
		--plugin=git > po/bzr-git.pot
	echo $(TRANSLATABLE_PYFILES) | xargs \
		xgettext --package-name "bzr-git" \
		--msgid-bugs-address "" \
		--copyright-holder "Canonical Ltd " \
		--from-code ISO-8859-1 --sort-by-file --join --add-comments=i18n: \
		-d bzr-git -p po -o bzr-git.pot

bzr-git-0.6.13+bzr1649/NEWS

0.6.13	UNRELEASED

 BUG FIXES

  * Fix compatibility with newer versions of Dulwich, which now require Repo._determine_file_mode. (Jelmer Vernooij)
  * "Support" empty repositories; print an appropriate error. (Jelmer Vernooij, #1219424)
  * Fix compatibility with and depend on dulwich 0.9.6. (William Grant)
  * Correctly handle all moves when converting bzr trees to git. (William Grant, #818318)
  * Fix compatibility with dulwich 0.14. (Jelmer Vernooij)

 FEATURES

  * Mark bzr 2.7 as supported.
  * Support 'HG:rename-source' fields.
  * Support HG extra 'amend_source'.

0.6.12	2013-09-22

 FEATURES

  * Support thin packs when pushing and fetching to/from remote repositories. (William Grant, #878085)

 CHANGES

  * Fix compatibility with and depend on dulwich 0.9.1. (Jelmer Vernooij)

0.6.11	2013-08-04

 BUG FIXES

  * Add support for Bazaar 2.6.0.
    (Jelmer Vernooij)

0.6.10	2012-12-13

 FEATURES

  * New command 'bzr git-push-pristine-tar', which pushes pristine tar deltas to a git repository. (Jelmer Vernooij)

 BUG FIXES

  * Implement ``GitRevisionTree.is_executable``. (Jelmer Vernooij)

0.6.9	2012-05-29

 BUG FIXES

  * ``bzr git-import`` only creates colocated branches now if the --colocated option is specified. (Jelmer Vernooij)
  * Convert `~` to `_` when pushing into git, as `~` is not allowed in refs. (Jelmer Vernooij)
  * Support optional timeout argument to ``bzr serve --git``. (Jelmer Vernooij)
  * Handle encoding better in working tree iter changes. (Jelmer Vernooij, #1019978)

0.6.8	2012-03-28

 BUG FIXES

  * Fixes duplicate tag warnings in 'git-remote-bzr' helper. (Jelmer Vernooij, #905275)
  * Don't suggest development-subtree when submodules are encountered. (Jelmer Vernooij, #951494)
  * Print proper error when encountering data that can't be roundtripped. (Jelmer Vernooij)
  * Ignore control directory filenames on Windows, too. (Jelmer Vernooij, #967054)
  * Fix 'Unable to obtain lock' error when dpushing from a bound branch. (Jelmer Vernooij, #949557)
  * Cope with commits with a completely empty tree. (Jelmer Vernooij, #933132)

 TESTS

  * Add test to verify that certain invalid timezones ("--700") are roundtripped correctly. (Jelmer Vernooij, #697828)

0.6.7	2012-01-25

 CHANGES

  * Switch to supporting bzr 2.5 only. (Jelmer Vernooij)

 PERFORMANCE

  * "git:" revision specifier now avoids full branch history access if it can. (Jelmer Vernooij)

 DOCUMENTATION

  * Added manual page for git-remote-bzr. (Jelmer Vernooij)

 BUG FIXES

  * Support setting branch nicks. (Jelmer Vernooij, #731239)
  * Support Repository.set_make_working_trees(). (Jelmer Vernooij, #777065)

 FEATURES

  * Add 'github:' directory service. (Jelmer Vernooij)

0.6.6	2011-12-15

 BUG FIXES

  * Warn about ignoring path segment parameters when using bzr 2.4. (Jelmer Vernooij, #887785)
  * Don't request unpeeled objects, newer versions of github refuse it. (Jelmer Vernooij, #897951)
  * Specify proper number of arguments to action() handler. (Jelmer Vernooij, #894195)
  * Fix compatibility with beta 4 of bzr 2.5. (Jelmer Vernooij)

 FEATURES

  * New options '--git-receive-pack' and '--git-upload-pack' for 'bzr serve', providing support for inetd. (Jelmer Vernooij)

0.6.5	2011-11-08

 CHANGES

  * git-import no longer creates a deep hierarchy to store branches, instead it now strips the refs/heads/ prefix from ref names. (Jelmer Vernooij)
  * Bzr 2.3 is no longer supported. (Jelmer Vernooij)

 FIXES

  * Fix compatibility with bzr < 2.5 when used with remote repositories. (Jelmer Vernooij, #885566)
  * Fix git-import after branch refactoring. (Jelmer Vernooij, #886161)
  * git-import now creates colocated branches if the target bzrdir supports them. (Jelmer Vernooij)

0.6.4	2011-11-03

 FIXES

  * Fix and test fetching from "bzr serve --git". (Jelmer Vernooij)
  * Raise UnsupportedOperation for `Branch.revision_id_to_dotted_revno`, making it possible to run ``bzr tags`` against remote git repositories. (Jelmer Vernooij, #858942)
  * Fix compatibility with bzr < 2.5. (Jelmer Vernooij, #885566)

 FEATURES

  * Allow downloading branches over HTTP from loggerhead using the Git protocol. To enable, set 'http_git=True' in the branch configuration. (Jelmer Vernooij, #585822)

0.6.3	2011-11-01

 FIXES

  * Only actually fetch tags if "branch.fetch_tags" is set to true. (Jelmer Vernooij, #771184)
  * Add basic support for alternates. (Jelmer Vernooij)
  * Support addressing branches by ref where the name can't be mapped back to a branch.
(Jelmer Vernooij, #829481) * Skip post commit hook when dulwich is not available. (Jelmer Vernooij, #853974) * Fix pushing from git repository to git repository. (Jelmer Vernooij, #731270) * Friendlier error message when HEAD can not be found in remote repository. (Jelmer Vernooij, #778920) * Support updating tags in remote branches. (Jelmer Vernooij, #706990) * Fix compatibility with tags API after changes in bzr. (Jelmer Vernooij, #861592) * Cope with tags pointing at trees when cloning local git repositories. (Jelmer Vernooij, #861973) * Remove pending entries when converting directory into tree reference. (Jelmer Vernooij, #871595) * Fix fetching into repositories with fallback repositories. (Jelmer Vernooij, #866028) FEATURES * Support the git http smart server protocol. (Jelmer Vernooij, #581933) * Support removal of remote branches. (Jelmer Vernooij, #855993) * Add i18n support. (Jelmer Vernooij) * Now includes git remote helper ``git-remote-bzr``. (Jelmer Vernooij) 0.6.2 2011-08-07 FEATURES * Provide Repository.get_known_graph_ancestry(). (Jelmer Vernooij) * Provide Repository.get_file_graph(). (Jelmer Vernooij, #677363) * Provide GitRevisionTree.get_file_mtime(). (Jelmer Vernooij) * Provide GitRevisionTree.get_file_revision(). (Jelmer Vernooij, #780953) * Add post-commit hook to update the git cache. (Jelmer Vernooij, #814651) PERFORMANCE * Significantly improve performance of WorkingTree.extras(). (Jelmer Vernooij) CHANGES * Require Dulwich 0.8.0. (Jelmer Vernooij) 0.6.1 2011-06-18 BUG FIXES * Support git repositories without a branches directory in their control directory. (Jelmer Vernooij, #780239) * Fix two mistakes in 'bzr help git'. (Jelmer Vernooij, #791047) * Now raises a proper exception when receiving an "Unknown repository" error from GitHub. (Jelmer Vernooij, #798295) * Support the new limit argument to InterBranch.fetch. (Jelmer Vernooij, #750175) * Support the new testament API that accepts a tree rather than an inventory. (Jelmer Vernooij, #762608) * Remove InterBranch.update_revisions. (Jelmer Vernooij, #771765) * Implement Repository.set_make_working_trees(). (Jelmer Vernooij, #777065) 0.6.0 2011-04-12 BUG FIXES * Fix encoding handling in Git working trees. (Jelmer Vernooij, #393038) * Use transports internally in "bzr git-import". (Jelmer Vernooij, #733919) * Provide custom GitDir.sprout() implementation for compatibility with bzr 2.4. (Jelmer Vernooij, #717937) * Revisions attached to tags that are not in the tips ancestry are now fetched. (Jelmer Vernooij, #309682) * Fix recursion error merging tags for bound branches. (Jelmer Vernooij, #742833) * Fix fetching from remote git repositories during merge. (Jelmer Vernooij, #741760) * Properly raise RootMissing if no root is specified to an empty tree in the commit builder. (Jelmer Vernooij, #731360) * Return proper conflict list from WorkingTree.conflicts. (Jelmer Vernooij, #741397) * Fix dpush of certain branches. (Jelmer Vernooij, #705807) API COMPLETENESS * Implement LocalGitControlDir.clone_on_transport. (Jelmer Vernooij, #721899) COMPATIBILITY * Drop support for Bazaar < 2.3. (Jelmer Vernooij) 0.5.4 2011-02-10 BUG FIXES * Fix test suite compatibility with Bazaar 2.2. (Max Bowsher, #707434) * Fix compatibility with older versions of python-tdb. (Jelmer Vernooij, #707735) * Fix 'bzr git-import' from remote repositories. (Jelmer Vernooij, #706990) * Cope with tags when doing local fetches. (Jelmer Vernooij, #675637) 0.5.3 2011-01-21 BUG FIXES * Add in an empty git repository now works. 
(Jelmer Vernooij, #603823) * Support opening of repositories over HTTP where the HTTP server doesn't allow directory access. (Jelmer Vernooij, #617078) * Support non-ascii characters in tag names. (Jelmer Vernooij, #616995) * Mark as compatible with bzr 2.3, 2.4. (Jelmer Vernooij) * Cope with unknown refs. (Jelmer Vernooij, #666443) * Don't peel tags automatically when pushing back. (Jelmer Vernooij, #675231) * Fix `bzr-receive-pack` and `bzr-upload-pack`. (Jelmer Vernooij, #681193) FEATURES * Remove all remaining dependencies on C git. (Jelmer Vernooij, #348238) * Add some basic documentation in 'bzr help git'. (Jelmer Vernooij, #605394) * Add --signoff option to 'bzr git-apply'. (Jelmer Vernooij) * Add --force option to 'bzr git-apply'. (Jelmer Vernooij) 0.5.2 2010-07-30 COMPATIBILITY * Drop support for Bazaar < 2.0. (Jelmer Vernooij) BUG FIXES * Cope with kind changes when generating git objects from Bazaar revisions that contain kind changes where a directory is changed into a file and its (file/symlink) children are removed (rather than moved). (#597758, Jelmer Vernooij) * Fix reading pack files over http. (#588724, Jelmer Vernooij) * Fix 'bzr status' after 'bzr add' in native git working trees. (#603800, Chadrik) * Provide VersionedFiles.get_annotator(). (#508288, Jelmer Vernooij) * Handle non-ascii characters in filenames. (#612291, Jelmer Vernooij) FEATURES * Support specifying alternative paths for git-upload-pack and git-receive-pack. (Ross Light, #585204) 0.5.1 2010-05-22 BUG FIXES * Mark as compatible with Bazaar 2.2 (Jelmer Vernooij) * Use host specified rather than localhost in `bzr serve`. (David Coles, #543998) * Handle working trees without valid HEAD branch. (Jelmer Vernooij, #501385) * Default to non-bare repositories when initializing a control directory. (Jelmer Vernooij) * Cope with -0000 as timezone in Git commits. (Jelmer Vernooij, #539978) FEATURES * Support 'bzr diff --format=git'. (Jelmer Vernooij, #555994) PERFORMANCE * Avoid the use of InventoryDirectory.children. This speeds up imports significantly. (Jelmer Vernooij) * Use Bazaar index files to store the sha map and git objects to cache certain objects. (#520694, Jelmer Vernooij) 0.5.0 2010-03-18 BUG FIXES * Fix compatibility with newer versions of Python2.6, which change the behaviour of urlparse.urlsplit. (Jelmer Vernooij, #561351) * Avoid storing texts of symlinks, which causes checksum errors in `bzr check`. (#512323, INADA Naoki, Jelmer Vernooij) * Support committing to a git branch from a bzr working tree. (#506174, Jelmer Vernooij) * Support executable symlinks. (#512871, INADA Naoki, Jelmer Vernooij) * When unpacking URLs, strip leftmost slash to match gits behaviour. (#445156, Jelmer Vernooij) * Support merging tags to a local Git repository. (#4445230, Jelmer Vernooij) * InterFromGitBranch.pull() supports an optional limit argument to limit how many revisions to import in one go. (Michael Hudson) * Cope with different encodings better, rather than just stripping out unknown characters. (#529460, Jelmer Vernooij) * Support ``run_hooks`` argument to ``InterGitRemoteLocalBranch.pull()``. (#524843, Jelmer Vernooij) * Properly ignore directories when creating bundles, deal with new files. (#456849, Jelmer Vernooij) PERFORMANCE * Don't import head revision twice when pulling from Git. (Jelmer Vernooij) FEATURES * Support (dumb) HTTP repositories. (#373688, Jelmer Vernooij) * Implement API for colocated branches. 
(#380871, Jelmer Vernooij) 0.4.3 2010-01-19 BUG FIXES * Fix warning about unclosed files on Windows. (#441978, INADA Naoki) * Support creating working tree for existing repository. (Jelmer Vernooij) * Fix base url of Git branches - use the working tree path rather than the control directory path. (Jelmer Vernooij) * Fix fetching between git repositories. (#449507, Jelmer Vernooij) * Refuse pulling into non-rich-root branches rather than erroring out with an AttributeError. (#449507, Jelmer Vernooij) * Unquote paths extracted from URLs. (#445156, Jelmer Vernooij) PERFORMANCE * Improve performance of WorkingTree.extras() by not looking up the SHA1, kind and stat data of each file. (Jelmer Vernooij) * Provide custom InterTree for faster deltas between git working trees and revision trees. (Jelmer Vernooij) * Provide custom InterTree for faster deltas between git revision trees. (Jelmer Vernooij) * Fix several places where a lot of memory was being consumed, especially for repositories with a large number of revisions or big trees. (#486076, Jelmer Vernooij FEATURES * Support for parsing --HG-- metadata in git commit messages, for better interoperability with bzr-hg. * Submodules are now imported. This requires the use of the development-subtree format in bzrlib though. (#402814, Jelmer Vernooij) 0.4.2 2009-10-01 BUG FIXES * Cope with ghosts a bit better during "bzr dpush". (Jelmer Vernooij) * Better error message when Dulwich is missing. (#427276, Jelmer Vernooij) * Support checkouts. (#427310, Jelmer Vernooij) * Don't break "bzr info -v" when Dulwich is not installed. (#429394, Jelmer Vernooij) * Mark as compatible with Bazaar 2.1. (Jelmer Vernooij) * Fix fetching of remote repositories on Windows. (INADA Naoki, Jelmer Vernooij, #382125) * Ignore directories in WorkingTree.extras(). (Jelmer Vernooij, #373902) FEATURES * New "git" format supported by "bzr send". All revisions are currently sent as one concatenated file, rather than as separate files because of limitations in Bazaar. (Jelmer Vernooij, Lukas Lalinsky) PERFORMANCE * Avoid re-fetching basis inventory during fetch. (Jelmer Vernooij) 0.4.1 2009-07-24 BUG FIXES * Avoid "No such revision" error when encountering submodules. (#400598) * Avoid creating empty trees in Git during dpush, as they are not officially allowed. (#393706) FEATURES * Progress bars will now show results from the remote git server. 0.4.0 2009-06-18 BUG FIXES * Fix handling of not-executable files becoming executable without any other changes. (#382609) * XML-invalid characters are now no longer squashed if not required by the target repository serializer format. The only non-XML based format at the moment is the "2a" development format supported since bzr 1.16. * Unusual file modes that could be created in Git repositories using older versions of Git are now stored in Bazaar revision properties. This means it's now possible to import the Git repository and the Linux kernel repository. * Mark as compatible with bzr 1.16. 0.3.2 2009-05-20 FEATURES * "bzr commit" in git working trees works to some extent. * "bzr push" from local git repositories to remote git repositories works. 0.3.1 2009-05-13 FEATURES * Alternative (faster) storage for SHA map using the TDB library (http://tdb.samba.org/). This will automatically be used if you have TDB and its Python bindings installed. In all other situations the previous (slower) Sqlite database format will be used. * Now warns when escaping XML-invalid characters to work around a bug in the Bazaar revision serializer. 
* Now allows "unusual" file modes (100664, etc) but will warn the user about them. BUG FIXES * Fixed git-import. * Fixed handling kind changes (tree -> blob) during fetch. 0.3.0 2009-05-10 FEATURES * Support parsing .gitignore * Support dpushing to remote repositories. bzr-git-0.6.13+bzr1649/README0000644000000000000000000000071613165530605013312 0ustar 00000000000000bzr-git, a plugin for bzr that adds git support. This was originally written as a proof of concept at Europython 2006 by Robert Collins, using stgit's convenience methods for accessing gits head and parsing git output. Later, it was adapted to use James Westby's Python Git module (which had by then be renamed to "Dulwich") and extended to support push and pull by Jelmer Vernooij. Please see INSTALL for installation instructions, and TODO for future plans. bzr-git-0.6.13+bzr1649/TODO0000644000000000000000000000005513165530605013116 0ustar 00000000000000- "Roundtripping" push into git - More tests bzr-git-0.6.13+bzr1649/__init__.py0000644000000000000000000004103313165530605014540 0ustar 00000000000000# Copyright (C) 2006-2009 Canonical Ltd # Authors: Robert Collins # Jelmer Vernooij # John Carr # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """A GIT branch and repository format implementation for bzr.""" from __future__ import absolute_import import os import sys import bzrlib import bzrlib.api from .info import ( bzr_compatible_versions, bzr_plugin_version as version_info, dulwich_minimum_version, ) if version_info[3] == 'final': version_string = '%d.%d.%d' % version_info[:3] else: version_string = '%d.%d.%d%s%d' % version_info __version__ = version_string bzrlib.api.require_any_api(bzrlib, bzr_compatible_versions) try: from ...i18n import load_plugin_translations except ImportError: # No translations for bzr < 2.5 gettext = lambda x: x else: translation = load_plugin_translations("bzr-git") gettext = translation.gettext from ... 
import ( errors as bzr_errors, trace, ) from ...controldir import ( ControlDirFormat, Prober, format_registry, network_format_registry as controldir_network_format_registry, ) from ...transport import ( register_lazy_transport, register_transport_proto, transport_server_registry, ) from ...commands import ( plugin_cmds, ) if getattr(sys, "frozen", None): # allow import additional libs from ./_lib for bzr.exe only sys.path.append(os.path.normpath( os.path.join(os.path.dirname(__file__), '_lib'))) def import_dulwich(): try: from dulwich import __version__ as dulwich_version except ImportError: raise bzr_errors.DependencyNotPresent("dulwich", "bzr-git: Please install dulwich, https://launchpad.net/dulwich") else: if dulwich_version < dulwich_minimum_version: raise bzr_errors.DependencyNotPresent("dulwich", "bzr-git: Dulwich is too old; at least %d.%d.%d is required" % dulwich_minimum_version) _versions_checked = False def lazy_check_versions(): global _versions_checked if _versions_checked: return import_dulwich() _versions_checked = True format_registry.register_lazy('git', "bzrlib.plugins.git.dir", "LocalGitControlDirFormat", help='GIT repository.', native=False, experimental=False, ) format_registry.register_lazy('git-bare', "bzrlib.plugins.git.dir", "BareLocalGitControlDirFormat", help='Bare GIT repository (no working tree).', native=False, experimental=False, ) from ...revisionspec import (RevisionSpec_dwim, revspec_registry) revspec_registry.register_lazy("git:", "bzrlib.plugins.git.revspec", "RevisionSpec_git") RevisionSpec_dwim.append_possible_lazy_revspec( "bzrlib.plugins.git.revspec", "RevisionSpec_git") class LocalGitProber(Prober): def probe_transport(self, transport): try: external_url = transport.external_url() except bzr_errors.InProcessTransport: raise bzr_errors.NotBranchError(path=transport.base) if (external_url.startswith("http:") or external_url.startswith("https:")): # Already handled by RemoteGitProber raise bzr_errors.NotBranchError(path=transport.base) from ... import urlutils if urlutils.split(transport.base)[1] == ".git": raise bzr_errors.NotBranchError(path=transport.base) if not transport.has_any(['objects', '.git/objects']): raise bzr_errors.NotBranchError(path=transport.base) lazy_check_versions() from .dir import ( BareLocalGitControlDirFormat, LocalGitControlDirFormat, ) if transport.has_any(['.git/objects']): return LocalGitControlDirFormat() if transport.has('info') and transport.has('objects'): return BareLocalGitControlDirFormat() raise bzr_errors.NotBranchError(path=transport.base) @classmethod def known_formats(cls): from .dir import ( BareLocalGitControlDirFormat, LocalGitControlDirFormat, ) return set([BareLocalGitControlDirFormat(), LocalGitControlDirFormat()]) class RemoteGitProber(Prober): def probe_http_transport(self, transport): from ... 
import urlutils base_url, _ = urlutils.split_segment_parameters(transport.external_url()) url = urlutils.join(base_url, "info/refs") + "?service=git-upload-pack" from ...transport.http._urllib import HttpTransport_urllib, Request if isinstance(transport, HttpTransport_urllib): req = Request('GET', url, accepted_errors=[200, 403, 404, 405], headers={"Content-Type": "application/x-git-upload-pack-request"}) req.follow_redirections = True resp = transport._perform(req) if resp.code in (404, 405): raise bzr_errors.NotBranchError(transport.base) headers = resp.headers refs_text = resp.read() else: try: from ...transport.http._pycurl import PyCurlTransport except bzr_errors.DependencyNotPresent: raise bzr_errors.NotBranchError(transport.base) else: import pycurl from cStringIO import StringIO if isinstance(transport, PyCurlTransport): conn = transport._get_curl() conn.setopt(pycurl.URL, url) conn.setopt(pycurl.FOLLOWLOCATION, 1) transport._set_curl_options(conn) conn.setopt(pycurl.HTTPGET, 1) header = StringIO() data = StringIO() conn.setopt(pycurl.HEADERFUNCTION, header.write) conn.setopt(pycurl.WRITEFUNCTION, data.write) transport._curl_perform(conn, header, ["Content-Type: application/x-git-upload-pack-request"]) code = conn.getinfo(pycurl.HTTP_CODE) if code in (404, 405): raise bzr_errors.NotBranchError(transport.base) if code != 200: raise bzr_errors.InvalidHttpResponse(transport._path, str(code)) headers = transport._parse_headers(header) else: raise bzr_errors.NotBranchError(transport.base) refs_text = data.getvalue() ct = headers.getheader("Content-Type") if ct is None: raise bzr_errors.NotBranchError(transport.base) if ct.startswith("application/x-git"): from .remote import RemoteGitControlDirFormat return RemoteGitControlDirFormat() else: from .dir import ( BareLocalGitControlDirFormat, ) ret = BareLocalGitControlDirFormat() ret._refs_text = refs_text return ret def probe_transport(self, transport): try: external_url = transport.external_url() except bzr_errors.InProcessTransport: raise bzr_errors.NotBranchError(path=transport.base) if (external_url.startswith("http:") or external_url.startswith("https:")): return self.probe_http_transport(transport) if (not external_url.startswith("git://") and not external_url.startswith("git+")): raise bzr_errors.NotBranchError(transport.base) # little ugly, but works from .remote import ( GitSmartTransport, RemoteGitControlDirFormat, ) if isinstance(transport, GitSmartTransport): return RemoteGitControlDirFormat() raise bzr_errors.NotBranchError(path=transport.base) @classmethod def known_formats(cls): from .remote import RemoteGitControlDirFormat return set([RemoteGitControlDirFormat()]) ControlDirFormat.register_prober(LocalGitProber) ControlDirFormat._server_probers.append(RemoteGitProber) register_transport_proto('git://', help="Access using the Git smart server protocol.") register_transport_proto('git+ssh://', help="Access using the Git smart server protocol over SSH.") register_lazy_transport("git://", __name__ + '.remote', 'TCPGitSmartTransport') register_lazy_transport("git+ssh://", __name__ + '.remote', 'SSHGitSmartTransport') plugin_cmds.register_lazy("cmd_git_import", [], __name__ + ".commands") plugin_cmds.register_lazy("cmd_git_object", ["git-objects", "git-cat"], __name__ + ".commands") plugin_cmds.register_lazy("cmd_git_refs", [], __name__ + ".commands") plugin_cmds.register_lazy("cmd_git_apply", [], __name__ + ".commands") plugin_cmds.register_lazy("cmd_git_push_pristine_tar_deltas", ['git-push-pristine-tar', 'git-push-pristine'], 
__name__ + ".commands") def extract_git_foreign_revid(rev): try: foreign_revid = rev.foreign_revid except AttributeError: from .mapping import mapping_registry foreign_revid, mapping = \ mapping_registry.parse_revision_id(rev.revision_id) return foreign_revid else: from .mapping import foreign_vcs_git if rev.mapping.vcs == foreign_vcs_git: return foreign_revid else: raise bzr_errors.InvalidRevisionId(rev.revision_id, None) def update_stanza(rev, stanza): mapping = getattr(rev, "mapping", None) try: git_commit = extract_git_foreign_revid(rev) except bzr_errors.InvalidRevisionId: pass else: stanza.add("git-commit", git_commit) from ...hooks import install_lazy_named_hook install_lazy_named_hook("bzrlib.version_info_formats.format_rio", "RioVersionInfoBuilder.hooks", "revision", update_stanza, "git commits") transport_server_registry.register_lazy('git', __name__ + '.server', 'serve_git', 'Git Smart server protocol over TCP. (default port: 9418)') transport_server_registry.register_lazy('git-receive-pack', __name__ + '.server', 'serve_git_receive_pack', help='Git Smart server receive pack command. (inetd mode only)') transport_server_registry.register_lazy('git-upload-pack', __name__ + 'git.server', 'serve_git_upload_pack', help='Git Smart server upload pack command. (inetd mode only)') from ...repository import ( format_registry as repository_format_registry, network_format_registry as repository_network_format_registry, ) repository_network_format_registry.register_lazy('git', __name__ + '.repository', 'GitRepositoryFormat') register_extra_lazy_repository_format = getattr(repository_format_registry, "register_extra_lazy") register_extra_lazy_repository_format(__name__ + '.repository', 'GitRepositoryFormat') from ...branch import ( network_format_registry as branch_network_format_registry, ) branch_network_format_registry.register_lazy('git', __name__ + '.branch', 'GitBranchFormat') from ...branch import ( format_registry as branch_format_registry, ) branch_format_registry.register_extra_lazy( __name__ + '.branch', 'GitBranchFormat', ) from ...workingtree import ( format_registry as workingtree_format_registry, ) workingtree_format_registry.register_extra_lazy( __name__ + '.workingtree', 'GitWorkingTreeFormat', ) controldir_network_format_registry.register_lazy('git', __name__ + ".dir", "GitControlDirFormat") try: from ...registry import register_lazy except ImportError: from ...diff import format_registry as diff_format_registry diff_format_registry.register_lazy('git', __name__ + '.send', 'GitDiffTree', 'Git am-style diff format') from ...send import ( format_registry as send_format_registry, ) send_format_registry.register_lazy('git', __name__ + '.send', 'send_git', 'Git am-style diff format') from ...directory_service import directories directories.register_lazy('github:', __name__ + '.directory', 'GitHubDirectory', 'GitHub directory.') directories.register_lazy('git@github.com:', __name__ + '.directory', 'GitHubDirectory', 'GitHub directory.') from ...help_topics import ( topic_registry, ) topic_registry.register_lazy('git', __name__ + '.help', 'help_git', 'Using Bazaar with Git') from ...foreign import ( foreign_vcs_registry, ) foreign_vcs_registry.register_lazy("git", __name__ + ".mapping", "foreign_vcs_git", "Stupid content tracker") else: register_lazy("bzrlib.diff", "format_registry", 'git', __name__ + '.send', 'GitDiffTree', 'Git am-style diff format') register_lazy("bzrlib.send", "format_registry", 'git', __name__ + '.send', 'send_git', 'Git am-style diff format') 
register_lazy('bzrlib.directory_service', 'directories', 'github:', __name__ + '.directory', 'GitHubDirectory', 'GitHub directory.') register_lazy('bzrlib.directory_service', 'directories', 'git@github.com:', 'bzrlib.plugins.git.directory', 'GitHubDirectory', 'GitHub directory.') register_lazy('bzrlib.help_topics', 'topic_registry', 'git', __name__ + '.help', 'help_git', 'Using Bazaar with Git') register_lazy('bzrlib.foreign', 'foreign_vcs_registry', "git", __name__ + ".mapping", "foreign_vcs_git", "Stupid content tracker") def update_git_cache(repository, revid): """Update the git cache after a local commit.""" if getattr(repository, "_git", None) is not None: return # No need to update cache for git repositories if not repository.control_transport.has("git"): return # No existing cache, don't bother updating try: lazy_check_versions() except bzr_errors.DependencyNotPresent, e: # dulwich is probably missing. silently ignore trace.mutter("not updating git map for %r: %s", repository, e) from .object_store import BazaarObjectStore store = BazaarObjectStore(repository) store.lock_write() try: try: parent_revisions = set(repository.get_parent_map([revid])[revid]) except KeyError: # Isn't this a bit odd - how can a revision that was just committed be missing? return missing_revisions = store._missing_revisions(parent_revisions) if not missing_revisions: # Only update if the cache was up to date previously store._update_sha_map_revision(revid) finally: store.unlock() def post_commit_update_cache(local_branch, master_branch, old_revno, old_revid, new_revno, new_revid): if local_branch is not None: update_git_cache(local_branch.repository, new_revid) update_git_cache(master_branch.repository, new_revid) def loggerhead_git_hook(branch_app, environ): branch = branch_app.branch config_stack = branch.get_config_stack() if config_stack.get('http_git'): return None from .server import git_http_hook return git_http_hook(branch, environ['REQUEST_METHOD'], environ['PATH_INFO']) install_lazy_named_hook("bzrlib.branch", "Branch.hooks", "post_commit", post_commit_update_cache, "git cache") install_lazy_named_hook("bzrlib.plugins.loggerhead.apps.branch", "BranchWSGIApp.hooks", "controller", loggerhead_git_hook, "git support") from ...config import ( option_registry, Option, bool_from_store, ) option_registry.register( Option('git.http', default=None, from_unicode=bool_from_store, invalid='warning', help='''\ Allow fetching of Git packs over HTTP. This enables support for fetching Git packs over HTTP in Loggerhead. ''')) def test_suite(): from . import tests return tests.test_suite() bzr-git-0.6.13+bzr1649/branch.py0000644000000000000000000012427713165530605014252 0ustar 00000000000000# Copyright (C) 2007,2012 Canonical Ltd # Copyright (C) 2009-2012 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. 
# # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """An adapter between a Git Branch and a Bazaar Branch""" from __future__ import absolute_import from cStringIO import StringIO from collections import defaultdict from dulwich.objects import ( ZERO_SHA, ) from dulwich.repo import check_ref_format from ... import ( branch, bzrdir, config, errors, repository as _mod_repository, revision, tag, transport, urlutils, ) from ...decorators import ( needs_read_lock, ) from ...revision import ( NULL_REVISION, ) from ...trace import ( is_quiet, mutter, warning, ) from .config import ( GitBranchConfig, GitBranchStack, ) from .errors import ( NoPushSupport, NoSuchRef, ) from .refs import ( is_tag, ref_to_branch_name, ref_to_tag_name, tag_name_to_ref, ) from .unpeel_map import ( UnpeelMap, ) from ...foreign import ForeignBranch class GitPullResult(branch.PullResult): """Result of a pull from a Git branch.""" def _lookup_revno(self, revid): assert isinstance(revid, str), "was %r" % revid # Try in source branch first, it'll be faster self.target_branch.lock_read() try: return self.target_branch.revision_id_to_revno(revid) finally: self.target_branch.unlock() @property def old_revno(self): return self._lookup_revno(self.old_revid) @property def new_revno(self): return self._lookup_revno(self.new_revid) class GitTags(tag.BasicTags): """Ref-based tag dictionary.""" def __init__(self, branch): self.branch = branch self.repository = branch.repository def get_refs_container(self): raise NotImplementedError(self.get_refs_container) def _iter_tag_refs(self, refs): """Iterate over the tag refs. :param refs: Refs dictionary (name -> git sha1) :return: iterator over (name, peeled_sha1, unpeeled_sha1, bzr_revid) """ for k, unpeeled in refs.as_dict().iteritems(): try: tag_name = ref_to_tag_name(k) except (ValueError, UnicodeDecodeError): continue peeled = refs.get_peeled(k) if peeled is None: peeled = self.repository.bzrdir._git.object_store.peel_sha(unpeeled).id assert type(tag_name) is unicode yield (tag_name, peeled, unpeeled, self.branch.lookup_foreign_revision_id(peeled)) def _merge_to_remote_git(self, target_repo, new_refs, overwrite=False): updates = {} conflicts = [] def get_changed_refs(old_refs): ret = dict(old_refs) for k, v in new_refs.iteritems(): if not is_tag(k): continue name = ref_to_tag_name(k) if old_refs.get(k) == v: pass elif overwrite or not k in old_refs: ret[k] = v updates[name] = target_repo.lookup_foreign_revision_id(v) else: conflicts.append((name, v, old_refs[k])) return ret target_repo.bzrdir.send_pack(get_changed_refs, lambda have, want: []) return updates, conflicts def _merge_to_local_git(self, target_repo, refs, overwrite=False): conflicts = [] updates = {} for k, unpeeled in refs.as_dict().iteritems(): if not is_tag(k): continue name = ref_to_tag_name(k) peeled = self.repository.bzrdir.get_peeled(k) if target_repo._git.refs.get(k) == unpeeled: pass elif overwrite or not k in target_repo._git.refs: target_repo._git.refs[k] = unpeeled or peeled updates[name] = target_repo.lookup_foreign_revision_id(peeled) else: conflicts.append((name, peeled, target_repo._git.refs[k])) return updates, conflicts def _merge_to_git(self, to_tags, refs, overwrite=False): target_repo = to_tags.repository if self.repository.has_same_location(target_repo): return {}, [] if getattr(target_repo, "_git", None): return self._merge_to_local_git(target_repo, 
refs, overwrite) else: return self._merge_to_remote_git(target_repo, refs, overwrite) def _merge_to_non_git(self, to_tags, refs, overwrite=False): unpeeled_map = defaultdict(set) conflicts = [] updates = {} result = dict(to_tags.get_tag_dict()) for n, peeled, unpeeled, bzr_revid in self._iter_tag_refs(refs): if unpeeled is not None: unpeeled_map[peeled].add(unpeeled) if result.get(n) == bzr_revid: pass elif n not in result or overwrite: result[n] = bzr_revid updates[n] = bzr_revid else: conflicts.append((n, result[n], bzr_revid)) to_tags._set_tag_dict(result) if len(unpeeled_map) > 0: map_file = UnpeelMap.from_repository(to_tags.branch.repository) map_file.update(unpeeled_map) map_file.save_in_repository(to_tags.branch.repository) return updates, conflicts def merge_to(self, to_tags, overwrite=False, ignore_master=False, source_refs=None): """See Tags.merge_to.""" if source_refs is None: source_refs = self.get_refs_container() if self == to_tags: return {}, [] if isinstance(to_tags, GitTags): return self._merge_to_git(to_tags, source_refs, overwrite=overwrite) else: if ignore_master: master = None else: master = to_tags.branch.get_master_branch() updates, conflicts = self._merge_to_non_git(to_tags, source_refs, overwrite=overwrite) if master is not None: extra_updates, extra_conflicts = self.merge_to( master.tags, overwrite=overwrite, source_refs=source_refs, ignore_master=ignore_master) updates.update(extra_updates) conflicts += extra_conflicts return updates, conflicts def get_tag_dict(self): ret = {} refs = self.get_refs_container() for (name, peeled, unpeeled, bzr_revid) in self._iter_tag_refs(refs): ret[name] = bzr_revid return ret class LocalGitTagDict(GitTags): """Dictionary with tags in a local repository.""" def __init__(self, branch): super(LocalGitTagDict, self).__init__(branch) self.refs = self.repository.bzrdir._git.refs def get_refs_container(self): return self.refs def _set_tag_dict(self, to_dict): extra = set(self.refs.allkeys()) for k, revid in to_dict.iteritems(): name = tag_name_to_ref(k) if name in extra: extra.remove(name) self.set_tag(k, revid) for name in extra: if is_tag(name): del self.repository._git[name] def set_tag(self, name, revid): try: git_sha, mapping = self.branch.lookup_bzr_revision_id(revid) except errors.NoSuchRevision: raise errors.GhostTagsNotSupported(self) self.refs[tag_name_to_ref(name)] = git_sha class DictTagDict(tag.BasicTags): def __init__(self, branch, tags): super(DictTagDict, self).__init__(branch) self._tags = tags def get_tag_dict(self): return self._tags class GitSymrefBranchFormat(branch.BranchFormat): def get_format_description(self): return 'Git Symbolic Reference Branch' def network_name(self): return "git" def get_reference(self, controldir, name=None): return controldir.get_branch_reference(name) def set_reference(self, controldir, name, target): return controldir.set_branch_reference(target, name) class GitBranchFormat(branch.BranchFormat): def get_format_description(self): return 'Git Branch' def network_name(self): return "git" def supports_tags(self): return True def supports_leaving_lock(self): return False def supports_tags_referencing_ghosts(self): return False def tags_are_versioned(self): return False @property def _matchingbzrdir(self): from .dir import LocalGitControlDirFormat return LocalGitControlDirFormat() def get_foreign_tests_branch_factory(self): from .tests.test_branch import ForeignTestsBranchFactory return ForeignTestsBranchFactory() def make_tags(self, branch): try: return branch.tags except AttributeError: 
pass if getattr(branch.repository, "_git", None) is None: from .remote import RemoteGitTagDict return RemoteGitTagDict(branch) else: return LocalGitTagDict(branch) def initialize(self, a_bzrdir, name=None, repository=None, append_revisions_only=None): from .dir import LocalGitDir if not isinstance(a_bzrdir, LocalGitDir): raise errors.IncompatibleFormat(self, a_bzrdir._format) return a_bzrdir.create_branch(repository=repository, name=name, append_revisions_only=append_revisions_only) class GitReadLock(object): def __init__(self, unlock): self.unlock = unlock class GitWriteLock(object): def __init__(self, unlock): self.branch_token = None self.unlock = unlock class GitBranch(ForeignBranch): """An adapter to git repositories for bzr Branch objects.""" @property def control_transport(self): return self.bzrdir.control_transport def __init__(self, bzrdir, repository, ref): self.base = bzrdir.root_transport.base self.repository = repository self._format = GitBranchFormat() self.bzrdir = bzrdir self._lock_mode = None self._lock_count = 0 super(GitBranch, self).__init__(repository.get_mapping()) self.ref = ref try: self.name = ref_to_branch_name(ref) except ValueError: self.name = None self._head = None def _get_checkout_format(self, lightweight=False): """Return the most suitable metadir for a checkout of this branch. Weaves are used if this branch's repository uses weaves. """ return bzrdir.format_registry.make_bzrdir("default") def get_child_submit_format(self): """Return the preferred format of submissions to this branch.""" ret = self.get_config_stack().get("child_submit_format") if ret is not None: return ret return "git" def get_config(self): return GitBranchConfig(self) def get_config_stack(self): return GitBranchStack(self) def _get_nick(self, local=False, possible_master_transports=None): """Find the nick name for this branch. :return: Branch nick """ cs = self.repository._git.get_config_stack() try: return cs.get(("branch", self.name), "nick") except KeyError: pass return self.name.encode('utf-8') or "HEAD" def _set_nick(self, nick): cf = self.repository._git.get_config() cf.set(("branch", self.name), "nick", nick) f = StringIO() cf.write_to_file(f) self.bzrdir.control_transport.put_bytes('config', f.getvalue()) nick = property(_get_nick, _set_nick) def __repr__(self): return "<%s(%r, %r)>" % (self.__class__.__name__, self.repository.base, self.name) def generate_revision_history(self, revid, old_revid=None): if revid == NULL_REVISION: newhead = ZERO_SHA else: # FIXME: Check that old_revid is in the ancestry of revid newhead, self.mapping = self.repository.lookup_bzr_revision_id(revid) if self.mapping is None: raise AssertionError self._set_head(newhead) def lock_write(self, token=None): if token is not None: raise errors.TokenLockingNotSupported(self) if self._lock_mode: if self._lock_mode == 'r': raise errors.ReadOnlyError(self) self._lock_count += 1 else: self._lock_mode = 'w' self._lock_count = 1 self.repository.lock_write() return GitWriteLock(self.unlock) def leave_lock_in_place(self): raise NotImplementedError(self.leave_lock_in_place) def dont_leave_lock_in_place(self): raise NotImplementedError(self.dont_leave_lock_in_place) def get_stacked_on_url(self): # Git doesn't do stacking (yet...) raise errors.UnstackableBranchFormat(self._format, self.base) def get_parent(self): """See Branch.get_parent().""" # FIXME: Set "origin" url from .git/config ? return None def set_parent(self, url): # FIXME: Set "origin" url in .git/config ? 
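# --- Illustrative sketch, not part of the original branch.py ---------------
# _get_nick() above resolves the branch nick from the dulwich config stack:
# a "nick" entry in the [branch "<name>"] section wins, otherwise the branch
# name (or "HEAD" for an unnamed branch) is used.  The helper below mirrors
# that lookup order; config_stack and branch_name stand in for
# self.repository._git.get_config_stack() and self.name.
def nick_from_git_config(config_stack, branch_name):
    try:
        return config_stack.get(("branch", branch_name), "nick")
    except KeyError:
        return branch_name or "HEAD"
# ---------------------------------------------------------------------------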
pass def break_lock(self): raise NotImplementedError(self.break_lock) def lock_read(self): if self._lock_mode: assert self._lock_mode in ('r', 'w') self._lock_count += 1 else: self._lock_mode = 'r' self._lock_count = 1 self.repository.lock_read() return GitReadLock(self.unlock) def peek_lock_mode(self): return self._lock_mode def is_locked(self): return (self._lock_mode is not None) def unlock(self): """See Branch.unlock().""" self._lock_count -= 1 if self._lock_count == 0: self._lock_mode = None self._clear_cached_state() self.repository.unlock() def get_physical_lock_status(self): return False @needs_read_lock def last_revision(self): # perhaps should escape this ? if self.head is None: return revision.NULL_REVISION return self.lookup_foreign_revision_id(self.head) def _basic_push(self, target, overwrite=False, stop_revision=None): return branch.InterBranch.get(self, target)._basic_push( overwrite, stop_revision) def lookup_foreign_revision_id(self, foreign_revid): return self.repository.lookup_foreign_revision_id(foreign_revid, self.mapping) def lookup_bzr_revision_id(self, revid): return self.repository.lookup_bzr_revision_id( revid, mapping=self.mapping) class LocalGitBranch(GitBranch): """A local Git branch.""" def __init__(self, bzrdir, repository, ref): super(LocalGitBranch, self).__init__(bzrdir, repository, ref) refs = bzrdir.get_refs_container() if not (ref in refs or "HEAD" in refs): raise errors.NotBranchError(self.base) def create_checkout(self, to_location, revision_id=None, lightweight=False, accelerator_tree=None, hardlink=False): if lightweight: t = transport.get_transport(to_location) t.ensure_base() format = self._get_checkout_format(lightweight=True) checkout = format.initialize_on_transport(t) from_branch = branch.BranchReferenceFormat().initialize(checkout, self) tree = checkout.create_workingtree(revision_id, from_branch=from_branch, hardlink=hardlink) return tree else: return self._create_heavyweight_checkout(to_location, revision_id, hardlink) def _create_heavyweight_checkout(self, to_location, revision_id=None, hardlink=False): """Create a new heavyweight checkout of this branch. :param to_location: URL of location to create the new checkout in. :param revision_id: Revision that should be the tip of the checkout. :param hardlink: Whether to hardlink :return: WorkingTree object of checkout. """ checkout_branch = bzrdir.BzrDir.create_branch_convenience( to_location, force_new_tree=False, format=self._get_checkout_format(lightweight=False)) checkout = checkout_branch.bzrdir checkout_branch.bind(self) # pull up to the specified revision_id to set the initial # branch tip correctly, and seed it with history. 
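# --- Usage sketch, not part of the original source (paths illustrative) ----
# create_checkout() above either builds a lightweight checkout that keeps
# referring back to this git branch, or (the heavyweight path just below)
# a bound bzr branch that is pulled up to the requested revision and given
# its own working tree.
def make_example_checkouts(git_branch):
    light = git_branch.create_checkout('/tmp/light-co', lightweight=True)
    heavy = git_branch.create_checkout('/tmp/heavy-co', lightweight=False)
    return light, heavy
# ---------------------------------------------------------------------------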
checkout_branch.pull(self, stop_revision=revision_id) return checkout.create_workingtree(revision_id, hardlink=hardlink) def fetch(self, from_branch, last_revision=None, limit=None): return branch.InterBranch.get(from_branch, self).fetch( stop_revision=last_revision, limit=limit) def _gen_revision_history(self): if self.head is None: return [] graph = self.repository.get_graph() ret = list(graph.iter_lefthand_ancestry(self.last_revision(), (revision.NULL_REVISION, ))) ret.reverse() return ret def _get_head(self): try: return self.repository._git.refs[self.ref or "HEAD"] except KeyError: return None def _read_last_revision_info(self): last_revid = self.last_revision() graph = self.repository.get_graph() revno = graph.find_distance_to_null(last_revid, [(revision.NULL_REVISION, 0)]) return revno, last_revid def set_last_revision_info(self, revno, revision_id): self.set_last_revision(revision_id) self._last_revision_info_cache = revno, revision_id def set_last_revision(self, revid): if not revid or not isinstance(revid, basestring): raise errors.InvalidRevisionId(revision_id=revid, branch=self) if revid == NULL_REVISION: newhead = ZERO_SHA else: (newhead, self.mapping) = self.repository.lookup_bzr_revision_id(revid) if self.mapping is None: raise AssertionError self._set_head(newhead) def _set_head(self, value): self._head = value self.repository._git.refs[self.ref or "HEAD"] = self._head self._clear_cached_state() head = property(_get_head, _set_head) def get_push_location(self): """See Branch.get_push_location.""" push_loc = self.get_config_stack().get('push_location') return push_loc def set_push_location(self, location): """See Branch.set_push_location.""" self.get_config().set_user_option('push_location', location, store=config.STORE_LOCATION) def supports_tags(self): return True def _quick_lookup_revno(local_branch, remote_branch, revid): assert isinstance(revid, str), "was %r" % revid # Try in source branch first, it'll be faster local_branch.lock_read() try: try: return local_branch.revision_id_to_revno(revid) except errors.NoSuchRevision: graph = local_branch.repository.get_graph() try: return graph.find_distance_to_null(revid, [(revision.NULL_REVISION, 0)]) except errors.GhostRevisionsHaveNoRevno: # FIXME: Check using graph.find_distance_to_null() ? 
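# --- Simplified sketch, not part of the original source --------------------
# _read_last_revision_info() and _quick_lookup_revno() above derive a
# revision number by asking the repository graph for the distance to the
# null revision.  The toy helper below shows the underlying idea over a
# plain {revid: first_parent_revid} dict; the real code asks
# graph.find_distance_to_null() against the repository's revision graph.
def lefthand_distance_to_null(first_parents, revid):
    distance = 0
    while revid != revision.NULL_REVISION:
        revid = first_parents.get(revid, revision.NULL_REVISION)
        distance += 1
    return distance
# ---------------------------------------------------------------------------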
remote_branch.lock_read() try: return remote_branch.revision_id_to_revno(revid) finally: remote_branch.unlock() finally: local_branch.unlock() class GitBranchPullResult(branch.PullResult): def __init__(self): super(GitBranchPullResult, self).__init__() self.new_git_head = None self._old_revno = None self._new_revno = None def report(self, to_file): if not is_quiet(): if self.old_revid == self.new_revid: to_file.write('No revisions to pull.\n') elif self.new_git_head is not None: to_file.write('Now on revision %d (git sha: %s).\n' % (self.new_revno, self.new_git_head)) else: to_file.write('Now on revision %d.\n' % (self.new_revno,)) self._show_tag_conficts(to_file) def _lookup_revno(self, revid): return _quick_lookup_revno(self.target_branch, self.source_branch, revid) def _get_old_revno(self): if self._old_revno is not None: return self._old_revno return self._lookup_revno(self.old_revid) def _set_old_revno(self, revno): self._old_revno = revno old_revno = property(_get_old_revno, _set_old_revno) def _get_new_revno(self): if self._new_revno is not None: return self._new_revno return self._lookup_revno(self.new_revid) def _set_new_revno(self, revno): self._new_revno = revno new_revno = property(_get_new_revno, _set_new_revno) class GitBranchPushResult(branch.BranchPushResult): def _lookup_revno(self, revid): return _quick_lookup_revno(self.source_branch, self.target_branch, revid) @property def old_revno(self): return self._lookup_revno(self.old_revid) @property def new_revno(self): new_original_revno = getattr(self, "new_original_revno", None) if new_original_revno: return new_original_revno if getattr(self, "new_original_revid", None) is not None: return self._lookup_revno(self.new_original_revid) return self._lookup_revno(self.new_revid) class InterFromGitBranch(branch.GenericInterBranch): """InterBranch implementation that pulls from Git into bzr.""" @staticmethod def _get_branch_formats_to_test(): try: default_format = branch.format_registry.get_default() except AttributeError: default_format = branch.BranchFormat._default_format return [ (GitBranchFormat(), GitBranchFormat()), (GitBranchFormat(), default_format)] @classmethod def _get_interrepo(self, source, target): return _mod_repository.InterRepository.get(source.repository, target.repository) @classmethod def is_compatible(cls, source, target): if not isinstance(source, GitBranch): return False if isinstance(target, GitBranch): # InterLocalGitRemoteGitBranch or InterToGitBranch should be used return False if getattr(cls._get_interrepo(source, target), "fetch_objects", None) is None: # fetch_objects is necessary for this to work return False return True def fetch(self, stop_revision=None, fetch_tags=None, limit=None): self.fetch_objects(stop_revision, fetch_tags=fetch_tags, limit=limit) def fetch_objects(self, stop_revision, fetch_tags, limit=None): interrepo = self._get_interrepo(self.source, self.target) if fetch_tags is None: c = self.source.get_config_stack() fetch_tags = c.get('branch.fetch_tags') def determine_wants(heads): if self.source.ref is not None and not self.source.ref in heads: raise NoSuchRef(self.source.ref, self.source.user_url, heads.keys()) if stop_revision is None: if self.source.ref is not None: head = heads[self.source.ref] else: head = heads["HEAD"] self._last_revid = self.source.lookup_foreign_revision_id(head) else: self._last_revid = stop_revision real = interrepo.get_determine_wants_revids( [self._last_revid], include_tags=fetch_tags) return real(heads) pack_hint, head, refs = interrepo.fetch_objects( 
determine_wants, self.source.mapping, limit=limit) if (pack_hint is not None and self.target.repository._format.pack_compresses): self.target.repository.pack(hint=pack_hint) return head, refs def _update_revisions(self, stop_revision=None, overwrite=False): head, refs = self.fetch_objects(stop_revision, fetch_tags=None) if overwrite: prev_last_revid = None else: prev_last_revid = self.target.last_revision() self.target.generate_revision_history(self._last_revid, prev_last_revid, self.source) return head, refs def _basic_pull(self, stop_revision, overwrite, run_hooks, _override_hook_target, _hook_master): result = GitBranchPullResult() result.source_branch = self.source if _override_hook_target is None: result.target_branch = self.target else: result.target_branch = _override_hook_target self.source.lock_read() try: self.target.lock_write() try: # We assume that during 'pull' the target repository is closer than # the source one. (result.old_revno, result.old_revid) = \ self.target.last_revision_info() result.new_git_head, remote_refs = self._update_revisions( stop_revision, overwrite=overwrite) tags_ret = self.source.tags.merge_to( self.target.tags, overwrite, ignore_master=True) if isinstance(tags_ret, tuple): result.tag_updates, result.tag_conflicts = tags_ret else: result.tag_conflicts = tags_ret (result.new_revno, result.new_revid) = \ self.target.last_revision_info() if _hook_master: result.master_branch = _hook_master result.local_branch = result.target_branch else: result.master_branch = result.target_branch result.local_branch = None if run_hooks: for hook in branch.Branch.hooks['post_pull']: hook(result) return result finally: self.target.unlock() finally: self.source.unlock() def pull(self, overwrite=False, stop_revision=None, possible_transports=None, _hook_master=None, run_hooks=True, _override_hook_target=None, local=False): """See Branch.pull. :param _hook_master: Private parameter - set the branch to be supplied as the master to pull hooks. :param run_hooks: Private parameter - if false, this branch is being called because it's the master of the primary branch, so it should not run its hooks. :param _override_hook_target: Private parameter - set the branch to be supplied as the target_branch to pull hooks. """ # This type of branch can't be bound. bound_location = self.target.get_bound_location() if local and not bound_location: raise errors.LocalRequiresBoundBranch() master_branch = None source_is_master = False self.source.lock_read() if bound_location: # bound_location comes from a config file, some care has to be # taken to relate it to source.user_url normalized = urlutils.normalize_url(bound_location) try: relpath = self.source.user_transport.relpath(normalized) source_is_master = (relpath == '') except (errors.PathNotChild, errors.InvalidURL): source_is_master = False if not local and bound_location and not source_is_master: # not pulling from master, so we need to update master. master_branch = self.target.get_master_branch(possible_transports) master_branch.lock_write() try: try: if master_branch: # pull from source into master. 
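# --- Usage sketch, not part of the original source (paths/URLs assumed) ----
# When the source branch is a git branch and the target a native bzr branch,
# InterBranch.get() should select InterFromGitBranch (see the
# register_optimiser() calls at the end of this module), so an ordinary
# Branch.pull() ends up in this pull() implementation.
def pull_git_into_bzr(git_location, bzr_location):
    source = branch.Branch.open(git_location)   # e.g. a local git checkout
    target = branch.Branch.open(bzr_location)
    result = target.pull(source)                 # a GitBranchPullResult
    return result.old_revid, result.new_revid
# ---------------------------------------------------------------------------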
master_branch.pull(self.source, overwrite, stop_revision, run_hooks=False) result = self._basic_pull(stop_revision, overwrite, run_hooks, _override_hook_target, _hook_master=master_branch) finally: self.source.unlock() finally: if master_branch: master_branch.unlock() return result def _basic_push(self, overwrite=False, stop_revision=None): result = branch.BranchPushResult() result.source_branch = self.source result.target_branch = self.target result.old_revno, result.old_revid = self.target.last_revision_info() result.new_git_head, remote_refs = self._update_revisions( stop_revision, overwrite=overwrite) tags_ret = self.source.tags.merge_to(self.target.tags, overwrite) if isinstance(tags_ret, tuple): (result.tag_updates, result.tag_conflicts) = tags_ret else: result.tag_conflicts = tags_ret result.new_revno, result.new_revid = self.target.last_revision_info() return result class InterGitBranch(branch.GenericInterBranch): """InterBranch implementation that pulls between Git branches.""" def fetch(self, stop_revision=None, fetch_tags=None, limit=None): raise NotImplementedError(self.fetch) class InterLocalGitRemoteGitBranch(InterGitBranch): """InterBranch that copies from a local to a remote git branch.""" @staticmethod def _get_branch_formats_to_test(): # FIXME return [] @classmethod def is_compatible(self, source, target): from .remote import RemoteGitBranch return (isinstance(source, LocalGitBranch) and isinstance(target, RemoteGitBranch)) def _basic_push(self, overwrite=False, stop_revision=None): result = GitBranchPushResult() result.source_branch = self.source result.target_branch = self.target if stop_revision is None: stop_revision = self.source.last_revision() # FIXME: Check for diverged branches def get_changed_refs(old_refs): old_ref = old_refs.get(self.target.ref, ZERO_SHA) result.old_revid = self.target.lookup_foreign_revision_id(old_ref) refs = { self.target.ref: self.source.repository.lookup_bzr_revision_id(stop_revision)[0] } result.new_revid = stop_revision for name, sha in self.source.repository._git.refs.as_dict("refs/tags").iteritems(): refs[tag_name_to_ref(name)] = sha return refs self.target.repository.send_pack(get_changed_refs, self.source.repository._git.object_store.generate_pack_contents) return result class InterGitLocalGitBranch(InterGitBranch): """InterBranch that copies from a remote to a local git branch.""" @staticmethod def _get_branch_formats_to_test(): # FIXME return [] @classmethod def is_compatible(self, source, target): return (isinstance(source, GitBranch) and isinstance(target, LocalGitBranch)) def fetch(self, stop_revision=None, fetch_tags=None, limit=None): interrepo = _mod_repository.InterRepository.get(self.source.repository, self.target.repository) if stop_revision is None: stop_revision = self.source.last_revision() determine_wants = interrepo.get_determine_wants_revids( [stop_revision], include_tags=fetch_tags) interrepo.fetch_objects(determine_wants, limit=limit) def _basic_push(self, overwrite=False, stop_revision=None): result = GitBranchPushResult() result.source_branch = self.source result.target_branch = self.target result.old_revid = self.target.last_revision() refs, stop_revision = self.update_refs(stop_revision) self.target.generate_revision_history(stop_revision, result.old_revid) tags_ret = self.source.tags.merge_to(self.target.tags, source_refs=refs, overwrite=overwrite) if isinstance(tags_ret, tuple): (result.tag_updates, result.tag_conflicts) = tags_ret else: result.tag_conflicts = tags_ret result.new_revid = 
self.target.last_revision() return result def update_refs(self, stop_revision=None): interrepo = _mod_repository.InterRepository.get(self.source.repository, self.target.repository) if stop_revision is None: refs = interrepo.fetch(branches=["HEAD"]) stop_revision = self.target.lookup_foreign_revision_id(refs["HEAD"]) else: refs = interrepo.fetch(revision_id=stop_revision) return refs, stop_revision def pull(self, stop_revision=None, overwrite=False, possible_transports=None, run_hooks=True,local=False): # This type of branch can't be bound. if local: raise errors.LocalRequiresBoundBranch() result = GitPullResult() result.source_branch = self.source result.target_branch = self.target self.source.lock_read() try: self.target.lock_write() try: result.old_revid = self.target.last_revision() refs, stop_revision = self.update_refs(stop_revision) self.target.generate_revision_history(stop_revision, result.old_revid) tags_ret = self.source.tags.merge_to(self.target.tags, overwrite=overwrite, source_refs=refs) if isinstance(tags_ret, tuple): (result.tag_updates, result.tag_conflicts) = tags_ret else: result.tag_conflicts = tags_ret result.new_revid = self.target.last_revision() result.local_branch = None result.master_branch = result.target_branch if run_hooks: for hook in branch.Branch.hooks['post_pull']: hook(result) finally: self.target.unlock() finally: self.source.unlock() return result class InterToGitBranch(branch.GenericInterBranch): """InterBranch implementation that pulls into a Git branch.""" def __init__(self, source, target): super(InterToGitBranch, self).__init__(source, target) self.interrepo = _mod_repository.InterRepository.get(source.repository, target.repository) @staticmethod def _get_branch_formats_to_test(): try: default_format = branch.format_registry.get_default() except AttributeError: default_format = branch.BranchFormat._default_format return [(default_format, GitBranchFormat())] @classmethod def is_compatible(self, source, target): return (not isinstance(source, GitBranch) and isinstance(target, GitBranch)) def _get_new_refs(self, stop_revision=None, fetch_tags=None): assert self.source.is_locked() if stop_revision is None: (stop_revno, stop_revision) = self.source.last_revision_info() else: stop_revno = self.source.revision_id_to_revno(stop_revision) assert type(stop_revision) is str main_ref = self.target.ref or "refs/heads/master" refs = { main_ref: (None, stop_revision) } if fetch_tags is None: c = self.source.get_config_stack() fetch_tags = c.get('branch.fetch_tags') for name, revid in self.source.tags.get_tag_dict().iteritems(): if self.source.repository.has_revision(revid): ref = tag_name_to_ref(name) if not check_ref_format(ref): warning("skipping tag with invalid characters %s (%s)", name, ref) continue if fetch_tags: # FIXME: Skip tags that are not in the ancestry refs[ref] = (None, revid) return refs, main_ref, (stop_revno, stop_revision) def _update_refs(self, result, old_refs, new_refs, overwrite): mutter("updating refs. 
old refs: %r, new refs: %r", old_refs, new_refs) result.tag_updates = {} result.tag_conflicts = [] ret = dict(old_refs) def ref_equals(refs, ref, git_sha, revid): try: value = refs[ref] except KeyError: return False if (value[0] is not None and git_sha is not None and value[0] == git_sha): return True if (value[1] is not None and revid is not None and value[1] == revid): return True # FIXME: If one side only has the git sha available and the other only # has the bzr revid, then this will cause us to show a tag as updated # that hasn't actually been updated. return False # FIXME: Check for diverged branches for ref, (git_sha, revid) in new_refs.iteritems(): if ref_equals(ret, ref, git_sha, revid): # Already up to date if git_sha is None: git_sha = old_refs[ref][0] if revid is None: revid = old_refs[ref][1] ret[ref] = new_refs[ref] = (git_sha, revid) elif ref not in ret or overwrite: try: tag_name = ref_to_tag_name(ref) except ValueError: pass else: result.tag_updates[tag_name] = revid ret[ref] = (git_sha, revid) else: # FIXME: Check diverged diverged = False if diverged: try: name = ref_to_tag_name(ref) except ValueError: pass else: result.tag_conflicts.append((name, revid, ret[name][1])) else: ret[ref] = (git_sha, revid) return ret def fetch(self, stop_revision=None, fetch_tags=None, lossy=False, limit=None): assert limit is None if stop_revision is None: stop_revision = self.source.last_revision() ret = [] if fetch_tags: for k, v in self.source.tags.get_tag_dict().iteritems(): ret.append((None, v)) ret.append((None, stop_revision)) self.interrepo.fetch_objects(ret, lossy=lossy) def pull(self, overwrite=False, stop_revision=None, local=False, possible_transports=None, run_hooks=True): result = GitBranchPullResult() result.source_branch = self.source result.target_branch = self.target self.source.lock_read() try: self.target.lock_write() try: new_refs, main_ref, stop_revinfo = self._get_new_refs( stop_revision) def update_refs(old_refs): return self._update_refs(result, old_refs, new_refs, overwrite) try: result.revidmap, old_refs, new_refs = self.interrepo.fetch_refs( update_refs, lossy=False) except NoPushSupport: raise errors.NoRoundtrippingSupport(self.source, self.target) (old_sha1, result.old_revid) = old_refs.get(main_ref, (ZERO_SHA, NULL_REVISION)) if result.old_revid is None: result.old_revid = self.target.lookup_foreign_revision_id(old_sha1) result.new_revid = new_refs[main_ref][1] result.local_branch = None result.master_branch = self.target if run_hooks: for hook in branch.Branch.hooks['post_pull']: hook(result) finally: self.target.unlock() finally: self.source.unlock() return result def push(self, overwrite=False, stop_revision=None, lossy=False, _override_hook_source_branch=None): result = GitBranchPushResult() result.source_branch = self.source result.target_branch = self.target result.local_branch = None result.master_branch = result.target_branch self.source.lock_read() try: new_refs, main_ref, stop_revinfo = self._get_new_refs(stop_revision) def update_refs(old_refs): return self._update_refs(result, old_refs, new_refs, overwrite) try: result.revidmap, old_refs, new_refs = self.interrepo.fetch_refs( update_refs, lossy=lossy) except NoPushSupport: raise errors.NoRoundtrippingSupport(self.source, self.target) (old_sha1, result.old_revid) = old_refs.get(main_ref, (ZERO_SHA, NULL_REVISION)) if result.old_revid is None: result.old_revid = self.target.lookup_foreign_revision_id(old_sha1) result.new_revid = new_refs[main_ref][1] (result.new_original_revno, 
result.new_original_revid) = stop_revinfo for hook in branch.Branch.hooks['post_push']: hook(result) finally: self.source.unlock() return result branch.InterBranch.register_optimiser(InterGitLocalGitBranch) branch.InterBranch.register_optimiser(InterFromGitBranch) branch.InterBranch.register_optimiser(InterToGitBranch) branch.InterBranch.register_optimiser(InterLocalGitRemoteGitBranch) bzr-git-0.6.13+bzr1649/bzr-receive-pack0000755000000000000000000000070513165530605015507 0ustar 00000000000000#!/usr/bin/env python import bzrlib from bzrlib.plugin import load_plugins load_plugins() from bzrlib.plugins.git.server import BzrBackend from dulwich.server import ReceivePackHandler, serve_command import sys, os if len(sys.argv) < 2: print >>sys.stderr, "usage: %s " % os.path.basename(sys.argv[0]) sys.exit(1) backend = BzrBackend(bzrlib.transport.get_transport("/")) sys.exit(serve_command(ReceivePackHandler, backend=backend)) bzr-git-0.6.13+bzr1649/bzr-upload-pack0000755000000000000000000000066613165530605015357 0ustar 00000000000000#!/usr/bin/env python import bzrlib from bzrlib.plugin import load_plugins load_plugins () from bzrlib.plugins.git.server import BzrBackend from dulwich.server import UploadPackHandler, serve_command import sys, os if len(sys.argv) < 2: print "usage: %s " % os.path.basename(sys.argv[0]) sys.exit(1) backend = BzrBackend(bzrlib.transport.get_transport("/")) sys.exit(serve_command(UploadPackHandler, backend=backend)) bzr-git-0.6.13+bzr1649/cache.py0000644000000000000000000010172013165530605014044 0ustar 00000000000000# Copyright (C) 2009 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Map from Git sha's to Bazaar objects.""" from __future__ import absolute_import from dulwich.objects import ( sha_to_hex, hex_to_sha, ) import os import threading from dulwich.objects import ( ShaFile, ) from ... import ( btree_index as _mod_btree_index, errors as bzr_errors, index as _mod_index, osutils, registry, trace, versionedfile, ) from ...transport import ( get_transport, ) def get_cache_dir(): try: from xdg.BaseDirectory import xdg_cache_home except ImportError: from ...config import config_dir ret = os.path.join(config_dir(), "git") else: ret = os.path.join(xdg_cache_home, "bazaar", "git") if not os.path.isdir(ret): os.makedirs(ret) return ret def get_remote_cache_transport(repository): """Retrieve the transport to use when accessing (unwritable) remote repositories. """ uuid = getattr(repository, "uuid", None) if uuid is None: path = get_cache_dir() else: path = os.path.join(get_cache_dir(), uuid) if not os.path.isdir(path): os.mkdir(path) return get_transport(path) def check_pysqlite_version(sqlite3): """Check that sqlite library is compatible. 
""" if (sqlite3.sqlite_version_info[0] < 3 or (sqlite3.sqlite_version_info[0] == 3 and sqlite3.sqlite_version_info[1] < 3)): trace.warning('Needs at least sqlite 3.3.x') raise bzr_errors.BzrError("incompatible sqlite library") try: try: import sqlite3 check_pysqlite_version(sqlite3) except (ImportError, bzr_errors.BzrError), e: from pysqlite2 import dbapi2 as sqlite3 check_pysqlite_version(sqlite3) except: trace.warning('Needs at least Python2.5 or Python2.4 with the pysqlite2 ' 'module') raise bzr_errors.BzrError("missing sqlite library") _mapdbs = threading.local() def mapdbs(): """Get a cache for this thread's db connections.""" try: return _mapdbs.cache except AttributeError: _mapdbs.cache = {} return _mapdbs.cache class GitShaMap(object): """Git<->Bzr revision id mapping database.""" def lookup_git_sha(self, sha): """Lookup a Git sha in the database. :param sha: Git object sha :return: list with (type, type_data) tuples with type_data: commit: revid, tree_sha, verifiers blob: fileid, revid tree: fileid, revid """ raise NotImplementedError(self.lookup_git_sha) def lookup_blob_id(self, file_id, revision): """Retrieve a Git blob SHA by file id. :param file_id: File id of the file/symlink :param revision: revision in which the file was last changed. """ raise NotImplementedError(self.lookup_blob_id) def lookup_tree_id(self, file_id, revision): """Retrieve a Git tree SHA by file id. """ raise NotImplementedError(self.lookup_tree_id) def lookup_commit(self, revid): """Retrieve a Git commit SHA by Bazaar revision id. """ raise NotImplementedError(self.lookup_commit) def revids(self): """List the revision ids known.""" raise NotImplementedError(self.revids) def missing_revisions(self, revids): """Return set of all the revisions that are not present.""" present_revids = set(self.revids()) if not isinstance(revids, set): revids = set(revids) return revids - present_revids def sha1s(self): """List the SHA1s.""" raise NotImplementedError(self.sha1s) def start_write_group(self): """Start writing changes.""" def commit_write_group(self): """Commit any pending changes.""" def abort_write_group(self): """Abort any pending changes.""" class ContentCache(object): """Object that can cache Git objects.""" def add(self, object): """Add an object.""" raise NotImplementedError(self.add) def add_multi(self, objects): """Add multiple objects.""" for obj in objects: self.add(obj) def __getitem__(self, sha): """Retrieve an item, by SHA.""" raise NotImplementedError(self.__getitem__) class BzrGitCacheFormat(object): """Bazaar-Git Cache Format.""" def get_format_string(self): """Return a single-line unique format string for this cache format.""" raise NotImplementedError(self.get_format_string) def open(self, transport): """Open this format on a transport.""" raise NotImplementedError(self.open) def initialize(self, transport): """Create a new instance of this cache format at transport.""" transport.put_bytes('format', self.get_format_string()) @classmethod def from_transport(self, transport): """Open a cache file present on a transport, or initialize one. :param transport: Transport to use :return: A BzrGitCache instance """ try: format_name = transport.get_bytes('format') format = formats.get(format_name) except bzr_errors.NoSuchFile: format = formats.get('default') format.initialize(transport) return format.open(transport) @classmethod def from_repository(cls, repository): """Open a cache file for a repository. 
This will use the repository's transport to store the cache file, or use the users global cache directory if the repository has no transport associated with it. :param repository: Repository to open the cache for :return: A `BzrGitCache` """ from ...transport.local import LocalTransport repo_transport = getattr(repository, "_transport", None) if (repo_transport is not None and isinstance(repo_transport, LocalTransport)): # Even if we don't write to this repo, we should be able # to update its cache. try: repo_transport = remove_readonly_transport_decorator(repo_transport) except bzr_errors.ReadOnlyError: transport = None else: try: repo_transport.mkdir('git') except bzr_errors.FileExists: pass transport = repo_transport.clone('git') else: transport = None if transport is None: transport = get_remote_cache_transport(repository) return cls.from_transport(transport) class CacheUpdater(object): """Base class for objects that can update a bzr-git cache.""" def add_object(self, obj, bzr_key_data, path): """Add an object. :param obj: Object type ("commit", "blob" or "tree") :param bzr_key_data: bzr key store data or testament_sha in case of commit :param path: Path of the object (optional) """ raise NotImplementedError(self.add_object) def finish(self): raise NotImplementedError(self.finish) class BzrGitCache(object): """Caching backend.""" def __init__(self, idmap, content_cache, cache_updater_klass): self.idmap = idmap self.content_cache = content_cache self._cache_updater_klass = cache_updater_klass def get_updater(self, rev): """Update an object that implements the CacheUpdater interface for updating this cache. """ return self._cache_updater_klass(self, rev) DictBzrGitCache = lambda: BzrGitCache(DictGitShaMap(), None, DictCacheUpdater) class DictCacheUpdater(CacheUpdater): """Cache updater for dict-based caches.""" def __init__(self, cache, rev): self.cache = cache self.revid = rev.revision_id self.parent_revids = rev.parent_ids self._commit = None self._entries = [] def add_object(self, obj, bzr_key_data, path): if obj.type_name == "commit": self._commit = obj assert type(bzr_key_data) is dict key = self.revid type_data = (self.revid, self._commit.tree, bzr_key_data) self.cache.idmap._by_revid[self.revid] = obj.id elif obj.type_name in ("blob", "tree"): if bzr_key_data is not None: if obj.type_name == "blob": revision = bzr_key_data[1] else: revision = self.revid key = type_data = (bzr_key_data[0], revision) self.cache.idmap._by_fileid.setdefault(type_data[1], {})[type_data[0]] = obj.id else: raise AssertionError entry = (obj.type_name, type_data) self.cache.idmap._by_sha.setdefault(obj.id, {})[key] = entry def finish(self): if self._commit is None: raise AssertionError("No commit object added") return self._commit class DictGitShaMap(GitShaMap): """Git SHA map that uses a dictionary.""" def __init__(self): self._by_sha = {} self._by_fileid = {} self._by_revid = {} def lookup_blob_id(self, fileid, revision): return self._by_fileid[revision][fileid] def lookup_git_sha(self, sha): for entry in self._by_sha[sha].itervalues(): yield entry def lookup_tree_id(self, fileid, revision): return self._by_fileid[revision][fileid] def lookup_commit(self, revid): return self._by_revid[revid] def revids(self): for key, entries in self._by_sha.iteritems(): for (type, type_data) in entries.values(): if type == "commit": yield type_data[0] def sha1s(self): return self._by_sha.iterkeys() class SqliteCacheUpdater(CacheUpdater): def __init__(self, cache, rev): self.cache = cache self.db = self.cache.idmap.db 
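# --- Illustrative sketch, not part of the original cache.py ----------------
# The CacheUpdater protocol above (get_updater() -> add_object() -> finish())
# is easiest to see with the in-memory dict cache.  ExampleRev and
# ExampleCommit are stand-ins for a bzrlib Revision and a dulwich Commit;
# only the attributes the updater actually reads are provided, and the
# commented import path is just the assumed install location of this plugin.
def populate_dict_cache_example():
    from collections import namedtuple
    # from bzrlib.plugins.git.cache import DictBzrGitCache  # assumed path
    ExampleRev = namedtuple("ExampleRev", ["revision_id", "parent_ids"])
    ExampleCommit = namedtuple("ExampleCommit", ["type_name", "id", "tree"])
    cache = DictBzrGitCache()
    updater = cache.get_updater(ExampleRev("example-revid", []))
    commit = ExampleCommit("commit", "a" * 40, "b" * 40)
    # For commit objects bzr_key_data is a dict of verifiers.
    updater.add_object(commit, {"testament3-sha1": "c" * 40}, None)
    updater.finish()
    # The id map now resolves in both directions.
    assert cache.idmap.lookup_commit("example-revid") == "a" * 40
    assert list(cache.idmap.lookup_git_sha("a" * 40)) == [
        ("commit", ("example-revid", "b" * 40, {"testament3-sha1": "c" * 40}))]
    return cache
# ---------------------------------------------------------------------------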
self.revid = rev.revision_id self._commit = None self._trees = [] self._blobs = [] def add_object(self, obj, bzr_key_data, path): if obj.type_name == "commit": self._commit = obj assert type(bzr_key_data) is dict self._testament3_sha1 = bzr_key_data.get("testament3-sha1") elif obj.type_name == "tree": if bzr_key_data is not None: self._trees.append((obj.id, bzr_key_data[0], self.revid)) elif obj.type_name == "blob": if bzr_key_data is not None: self._blobs.append((obj.id, bzr_key_data[0], bzr_key_data[1])) else: raise AssertionError def finish(self): if self._commit is None: raise AssertionError("No commit object added") self.db.executemany( "replace into trees (sha1, fileid, revid) values (?, ?, ?)", self._trees) self.db.executemany( "replace into blobs (sha1, fileid, revid) values (?, ?, ?)", self._blobs) self.db.execute( "replace into commits (sha1, revid, tree_sha, testament3_sha1) values (?, ?, ?, ?)", (self._commit.id, self.revid, self._commit.tree, self._testament3_sha1)) return self._commit SqliteBzrGitCache = lambda p: BzrGitCache(SqliteGitShaMap(p), None, SqliteCacheUpdater) class SqliteGitCacheFormat(BzrGitCacheFormat): def get_format_string(self): return 'bzr-git sha map version 1 using sqlite\n' def open(self, transport): try: basepath = transport.local_abspath(".") except bzr_errors.NotLocalUrl: basepath = get_cache_dir() return SqliteBzrGitCache(os.path.join(basepath, "idmap.db")) class SqliteGitShaMap(GitShaMap): """Bazaar GIT Sha map that uses a sqlite database for storage.""" def __init__(self, path=None): self.path = path if path is None: self.db = sqlite3.connect(":memory:") else: if not mapdbs().has_key(path): mapdbs()[path] = sqlite3.connect(path) self.db = mapdbs()[path] self.db.text_factory = str self.db.executescript(""" create table if not exists commits( sha1 text not null check(length(sha1) == 40), revid text not null, tree_sha text not null check(length(tree_sha) == 40) ); create index if not exists commit_sha1 on commits(sha1); create unique index if not exists commit_revid on commits(revid); create table if not exists blobs( sha1 text not null check(length(sha1) == 40), fileid text not null, revid text not null ); create index if not exists blobs_sha1 on blobs(sha1); create unique index if not exists blobs_fileid_revid on blobs(fileid, revid); create table if not exists trees( sha1 text unique not null check(length(sha1) == 40), fileid text not null, revid text not null ); create unique index if not exists trees_sha1 on trees(sha1); create unique index if not exists trees_fileid_revid on trees(fileid, revid); """) try: self.db.executescript( "ALTER TABLE commits ADD testament3_sha1 TEXT;") except sqlite3.OperationalError: pass # Column already exists. def __repr__(self): return "%s(%r)" % (self.__class__.__name__, self.path) def lookup_commit(self, revid): cursor = self.db.execute("select sha1 from commits where revid = ?", (revid,)) row = cursor.fetchone() if row is not None: return row[0] raise KeyError def commit_write_group(self): self.db.commit() def lookup_blob_id(self, fileid, revision): row = self.db.execute("select sha1 from blobs where fileid = ? and revid = ?", (fileid, revision)).fetchone() if row is not None: return row[0] raise KeyError(fileid) def lookup_tree_id(self, fileid, revision): row = self.db.execute("select sha1 from trees where fileid = ? and revid = ?", (fileid, revision)).fetchone() if row is not None: return row[0] raise KeyError(fileid) def lookup_git_sha(self, sha): """Lookup a Git sha in the database. 
:param sha: Git object sha :return: (type, type_data) with type_data: commit: revid, tree sha, verifiers tree: fileid, revid blob: fileid, revid """ found = False cursor = self.db.execute("select revid, tree_sha, testament3_sha1 from commits where sha1 = ?", (sha,)) for row in cursor.fetchall(): found = True if row[2] is not None: verifiers = {"testament3-sha1": row[2]} else: verifiers = {} yield ("commit", (row[0], row[1], verifiers)) cursor = self.db.execute("select fileid, revid from blobs where sha1 = ?", (sha,)) for row in cursor.fetchall(): found = True yield ("blob", row) cursor = self.db.execute("select fileid, revid from trees where sha1 = ?", (sha,)) for row in cursor.fetchall(): found = True yield ("tree", row) if not found: raise KeyError(sha) def revids(self): """List the revision ids known.""" return (row for (row,) in self.db.execute("select revid from commits")) def sha1s(self): """List the SHA1s.""" for table in ("blobs", "commits", "trees"): for (sha,) in self.db.execute("select sha1 from %s" % table): yield sha class TdbCacheUpdater(CacheUpdater): """Cache updater for tdb-based caches.""" def __init__(self, cache, rev): self.cache = cache self.db = cache.idmap.db self.revid = rev.revision_id self.parent_revids = rev.parent_ids self._commit = None self._entries = [] def add_object(self, obj, bzr_key_data, path): sha = obj.sha().digest() if obj.type_name == "commit": self.db["commit\0" + self.revid] = "\0".join((sha, obj.tree)) assert type(bzr_key_data) is dict, "was %r" % bzr_key_data type_data = (self.revid, obj.tree) try: type_data += (bzr_key_data["testament3-sha1"],) except KeyError: pass self._commit = obj elif obj.type_name == "blob": if bzr_key_data is None: return self.db["\0".join(("blob", bzr_key_data[0], bzr_key_data[1]))] = sha type_data = bzr_key_data elif obj.type_name == "tree": if bzr_key_data is None: return (file_id, ) = bzr_key_data type_data = (file_id, self.revid) else: raise AssertionError entry = "\0".join((obj.type_name, ) + type_data) + "\n" key = "git\0" + sha try: oldval = self.db[key] except KeyError: self.db[key] = entry else: if oldval[-1] != "\n": self.db[key] = "".join([oldval, "\n", entry]) else: self.db[key] = "".join([oldval, entry]) def finish(self): if self._commit is None: raise AssertionError("No commit object added") return self._commit TdbBzrGitCache = lambda p: BzrGitCache(TdbGitShaMap(p), None, TdbCacheUpdater) class TdbGitCacheFormat(BzrGitCacheFormat): """Cache format for tdb-based caches.""" def get_format_string(self): return 'bzr-git sha map version 3 using tdb\n' def open(self, transport): try: basepath = transport.local_abspath(".").encode(osutils._fs_enc) except bzr_errors.NotLocalUrl: basepath = get_cache_dir() assert isinstance(basepath, str) try: return TdbBzrGitCache(os.path.join(basepath, "idmap.tdb")) except ImportError: raise ImportError( "Unable to open existing bzr-git cache because 'tdb' is not " "installed.") class TdbGitShaMap(GitShaMap): """SHA Map that uses a TDB database. 
Entries: "git " -> " " "commit revid" -> " " "tree fileid revid" -> "" "blob fileid revid" -> "" """ TDB_MAP_VERSION = 3 TDB_HASH_SIZE = 50000 def __init__(self, path=None): import tdb self.path = path if path is None: self.db = {} else: assert isinstance(path, str) if not mapdbs().has_key(path): mapdbs()[path] = tdb.Tdb(path, self.TDB_HASH_SIZE, tdb.DEFAULT, os.O_RDWR|os.O_CREAT) self.db = mapdbs()[path] try: if int(self.db["version"]) not in (2, 3): trace.warning("SHA Map is incompatible (%s -> %d), rebuilding database.", self.db["version"], self.TDB_MAP_VERSION) self.db.clear() except KeyError: pass self.db["version"] = str(self.TDB_MAP_VERSION) def start_write_group(self): """Start writing changes.""" self.db.transaction_start() def commit_write_group(self): """Commit any pending changes.""" self.db.transaction_commit() def abort_write_group(self): """Abort any pending changes.""" self.db.transaction_cancel() def __repr__(self): return "%s(%r)" % (self.__class__.__name__, self.path) def lookup_commit(self, revid): try: return sha_to_hex(self.db["commit\0" + revid][:20]) except KeyError: raise KeyError("No cache entry for %r" % revid) def lookup_blob_id(self, fileid, revision): return sha_to_hex(self.db["\0".join(("blob", fileid, revision))]) def lookup_git_sha(self, sha): """Lookup a Git sha in the database. :param sha: Git object sha :return: (type, type_data) with type_data: commit: revid, tree sha blob: fileid, revid tree: fileid, revid """ if len(sha) == 40: sha = hex_to_sha(sha) value = self.db["git\0" + sha] for data in value.splitlines(): data = data.split("\0") if data[0] == "commit": if len(data) == 3: yield (data[0], (data[1], data[2], {})) else: yield (data[0], (data[1], data[2], {"testament3-sha1": data[3]})) elif data[0] in ("tree", "blob"): yield (data[0], tuple(data[1:])) else: raise AssertionError("unknown type %r" % data[0]) def missing_revisions(self, revids): ret = set() for revid in revids: if self.db.get("commit\0" + revid) is None: ret.add(revid) return ret def revids(self): """List the revision ids known.""" for key in self.db.iterkeys(): if key.startswith("commit\0"): yield key[7:] def sha1s(self): """List the SHA1s.""" for key in self.db.iterkeys(): if key.startswith("git\0"): yield sha_to_hex(key[4:]) class VersionedFilesContentCache(ContentCache): def __init__(self, vf): self._vf = vf def add(self, obj): self._vf.insert_record_stream( [versionedfile.ChunkedContentFactory((obj.id,), [], None, obj.as_legacy_object_chunks())]) def __getitem__(self, sha): stream = self._vf.get_record_stream([(sha,)], 'unordered', True) entry = stream.next() if entry.storage_kind == 'absent': raise KeyError(sha) return ShaFile._parse_legacy_object(entry.get_bytes_as('fulltext')) class GitObjectStoreContentCache(ContentCache): def __init__(self, store): self.store = store def add_multi(self, objs): self.store.add_objects(objs) def add(self, obj, path): self.store.add_object(obj) def __getitem__(self, sha): return self.store[sha] class IndexCacheUpdater(CacheUpdater): def __init__(self, cache, rev): self.cache = cache self.revid = rev.revision_id self.parent_revids = rev.parent_ids self._commit = None self._entries = [] self._cache_objs = set() def add_object(self, obj, bzr_key_data, path): if obj.type_name == "commit": self._commit = obj assert type(bzr_key_data) is dict self.cache.idmap._add_git_sha(obj.id, "commit", (self.revid, obj.tree, bzr_key_data)) self.cache.idmap._add_node(("commit", self.revid, "X"), " ".join((obj.id, obj.tree))) self._cache_objs.add((obj, path)) elif 
obj.type_name == "blob": self.cache.idmap._add_git_sha(obj.id, "blob", bzr_key_data) self.cache.idmap._add_node(("blob", bzr_key_data[0], bzr_key_data[1]), obj.id) elif obj.type_name == "tree": self.cache.idmap._add_git_sha(obj.id, "tree", (bzr_key_data[0], self.revid)) self._cache_objs.add((obj, path)) else: raise AssertionError def finish(self): self.cache.content_cache.add_multi(self._cache_objs) return self._commit class IndexBzrGitCache(BzrGitCache): def __init__(self, transport=None): mapper = versionedfile.ConstantMapper("trees") shamap = IndexGitShaMap(transport.clone('index')) #trees_store = knit.make_file_factory(True, mapper)(transport) #content_cache = VersionedFilesContentCache(trees_store) from .transportgit import TransportObjectStore store = TransportObjectStore(transport.clone('objects')) content_cache = GitObjectStoreContentCache(store) super(IndexBzrGitCache, self).__init__(shamap, content_cache, IndexCacheUpdater) class IndexGitCacheFormat(BzrGitCacheFormat): def get_format_string(self): return 'bzr-git sha map with git object cache version 1\n' def initialize(self, transport): super(IndexGitCacheFormat, self).initialize(transport) transport.mkdir('index') transport.mkdir('objects') from .transportgit import TransportObjectStore TransportObjectStore.init(transport.clone('objects')) def open(self, transport): return IndexBzrGitCache(transport) class IndexGitShaMap(GitShaMap): """SHA Map that uses the Bazaar APIs to store a cache. BTree Index file with the following contents: ("git", ) -> " " ("commit", ) -> " " ("blob", , ) -> """ def __init__(self, transport=None): if transport is None: self._transport = None self._index = _mod_index.InMemoryGraphIndex(0, key_elements=3) self._builder = self._index else: self._builder = None self._transport = transport self._index = _mod_index.CombinedGraphIndex([]) for name in self._transport.list_dir("."): if not name.endswith(".rix"): continue x = _mod_btree_index.BTreeGraphIndex(self._transport, name, self._transport.stat(name).st_size) self._index.insert_index(0, x) @classmethod def from_repository(cls, repository): transport = getattr(repository, "_transport", None) if transport is not None: try: transport.mkdir('git') except bzr_errors.FileExists: pass return cls(transport.clone('git')) from ...transport import get_transport return cls(get_transport(get_cache_dir())) def __repr__(self): if self._transport is not None: return "%s(%r)" % (self.__class__.__name__, self._transport.base) else: return "%s()" % (self.__class__.__name__) def repack(self): assert self._builder is None self.start_write_group() for _, key, value in self._index.iter_all_entries(): self._builder.add_node(key, value) to_remove = [] for name in self._transport.list_dir('.'): if name.endswith('.rix'): to_remove.append(name) self.commit_write_group() del self._index.indices[1:] for name in to_remove: self._transport.rename(name, name + '.old') def start_write_group(self): assert self._builder is None self._builder = _mod_btree_index.BTreeBuilder(0, key_elements=3) self._name = osutils.sha() def commit_write_group(self): assert self._builder is not None stream = self._builder.finish() name = self._name.hexdigest() + ".rix" size = self._transport.put_file(name, stream) index = _mod_btree_index.BTreeGraphIndex(self._transport, name, size) self._index.insert_index(0, index) self._builder = None self._name = None def abort_write_group(self): assert self._builder is not None self._builder = None self._name = None def _add_node(self, key, value): try: 
self._builder.add_node(key, value) except bzr_errors.BadIndexDuplicateKey: # Multiple bzr objects can have the same contents return True else: return False def _get_entry(self, key): entries = self._index.iter_entries([key]) try: return entries.next()[2] except StopIteration: if self._builder is None: raise KeyError entries = self._builder.iter_entries([key]) try: return entries.next()[2] except StopIteration: raise KeyError def _iter_entries_prefix(self, prefix): for entry in self._index.iter_entries_prefix([prefix]): yield (entry[1], entry[2]) if self._builder is not None: for entry in self._builder.iter_entries_prefix([prefix]): yield (entry[1], entry[2]) def lookup_commit(self, revid): return self._get_entry(("commit", revid, "X"))[:40] def _add_git_sha(self, hexsha, type, type_data): if hexsha is not None: self._name.update(hexsha) if type == "commit": td = (type_data[0], type_data[1]) try: td += (type_data[2]["testament3-sha1"],) except KeyError: pass else: td = type_data self._add_node(("git", hexsha, "X"), " ".join((type,) + td)) else: # This object is not represented in Git - perhaps an empty # directory? self._name.update(type + " ".join(type_data)) def lookup_blob_id(self, fileid, revision): return self._get_entry(("blob", fileid, revision)) def lookup_git_sha(self, sha): if len(sha) == 20: sha = sha_to_hex(sha) found = False for key, value in self._iter_entries_prefix(("git", sha, None)): found = True data = value.split(" ", 3) if data[0] == "commit": if data[3]: verifiers = {"testament3-sha1": data[3]} else: verifiers = {} yield ("commit", (data[1], data[2], verifiers)) else: yield (data[0], tuple(data[1:])) if not found: raise KeyError(sha) def revids(self): """List the revision ids known.""" for key, value in self._iter_entries_prefix(("commit", None, None)): yield key[1] def missing_revisions(self, revids): """Return set of all the revisions that are not present.""" missing_revids = set(revids) for _, key, value in self._index.iter_entries(( ("commit", revid, "X") for revid in revids)): missing_revids.remove(key[1]) return missing_revids def sha1s(self): """List the SHA1s.""" for key, value in self._iter_entries_prefix(("git", None, None)): yield key[1] formats = registry.Registry() formats.register(TdbGitCacheFormat().get_format_string(), TdbGitCacheFormat()) formats.register(SqliteGitCacheFormat().get_format_string(), SqliteGitCacheFormat()) formats.register(IndexGitCacheFormat().get_format_string(), IndexGitCacheFormat()) # In the future, this will become the default: # formats.register('default', IndexGitCacheFormat()) try: import tdb except ImportError: formats.register('default', SqliteGitCacheFormat()) else: formats.register('default', TdbGitCacheFormat()) def migrate_ancient_formats(repo_transport): # Migrate older cache formats repo_transport = remove_readonly_transport_decorator(repo_transport) has_sqlite = repo_transport.has("git.db") has_tdb = repo_transport.has("git.tdb") if not has_sqlite or has_tdb: return try: repo_transport.mkdir("git") except bzr_errors.FileExists: return # Prefer migrating git.db over git.tdb, since the latter may not # be openable on some platforms. 
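# --- Illustrative sketch, not part of the original cache.py ----------------
# Resolving a cache format the way BzrGitCacheFormat.from_transport() does:
# read the 'format' file and look its contents up in the registry above,
# falling back to the registered 'default' entry (which from_transport()
# would additionally initialize on the transport) when no such file exists.
def describe_cache_format(transport):
    try:
        format_name = transport.get_bytes('format')
    except bzr_errors.NoSuchFile:
        format_name = 'default'
    return formats.get(format_name).get_format_string()
# ---------------------------------------------------------------------------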
if has_sqlite: SqliteGitCacheFormat().initialize(repo_transport.clone("git")) repo_transport.rename("git.db", "git/idmap.db") elif has_tdb: TdbGitCacheFormat().initialize(repo_transport.clone("git")) repo_transport.rename("git.tdb", "git/idmap.tdb") def remove_readonly_transport_decorator(transport): if transport.is_readonly(): try: return transport._decorated except AttributeError: raise bzr_errors.ReadOnlyError(transport) return transport def from_repository(repository): """Open a cache file for a repository. If the repository is remote and there is no transport available from it this will use a local file in the users cache directory (typically ~/.cache/bazaar/git/) :param repository: A repository object """ repo_transport = getattr(repository, "_transport", None) if repo_transport is not None: try: migrate_ancient_formats(repo_transport) except bzr_errors.ReadOnlyError: pass # Not much we can do return BzrGitCacheFormat.from_repository(repository) bzr-git-0.6.13+bzr1649/commands.py0000644000000000000000000003012613165530605014603 0ustar 00000000000000# Copyright (C) 2006-2009 Canonical Ltd # Authors: Robert Collins # Jelmer Vernooij # John Carr # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Git-specific subcommands for Bazaar.""" from __future__ import absolute_import from ...commands import ( Command, display_command, ) from ...option import ( Option, ) class cmd_git_import(Command): """Import all branches from a git repository. """ takes_args = ["src_location", "dest_location?"] takes_options = [ Option('colocated', help='Create colocated branches.'), ] def _get_colocated_branch(self, target_bzrdir, name): from ...errors import NotBranchError try: return target_bzrdir.open_branch(name=name) except NotBranchError: return target_bzrdir.create_branch(name=name) def _get_nested_branch(self, dest_transport, dest_format, name): from ...bzrdir import BzrDir from ...errors import NotBranchError head_transport = dest_transport.clone(name) try: head_bzrdir = BzrDir.open_from_transport(head_transport) except NotBranchError: head_bzrdir = dest_format.initialize_on_transport_ex( head_transport, create_prefix=True)[1] try: return head_bzrdir.open_branch() except NotBranchError: return head_bzrdir.create_branch() def run(self, src_location, dest_location=None, colocated=False): import os import urllib from ... import ( controldir, trace, ui, urlutils, ) from ...bzrdir import ( BzrDir, ) from ...errors import ( BzrCommandError, NoRepositoryPresent, NotBranchError, ) from . 
import gettext from ...repository import ( InterRepository, Repository, ) from ...transport import get_transport from .branch import ( GitBranch, ) from .refs import ( ref_to_branch_name, ) from .repository import GitRepository dest_format = controldir.ControlDirFormat.get_default_format() if dest_location is None: dest_location = os.path.basename(src_location.rstrip("/\\")) dest_transport = get_transport(dest_location) source_repo = Repository.open(src_location) if not isinstance(source_repo, GitRepository): raise BzrCommandError(gettext("%r is not a git repository") % src_location) try: target_bzrdir = BzrDir.open_from_transport(dest_transport) except NotBranchError: target_bzrdir = dest_format.initialize_on_transport_ex( dest_transport, shared_repo=True)[1] try: target_repo = target_bzrdir.find_repository() except NoRepositoryPresent: target_repo = target_bzrdir.create_repository(shared=True) if not target_repo.supports_rich_root(): raise BzrCommandError(gettext("Target repository doesn't support rich roots")) interrepo = InterRepository.get(source_repo, target_repo) mapping = source_repo.get_mapping() refs = interrepo.fetch() refs_dict = refs.as_dict() pb = ui.ui_factory.nested_progress_bar() try: for i, (name, sha) in enumerate(refs_dict.iteritems()): try: branch_name = ref_to_branch_name(name) except ValueError: # Not a branch, ignore continue pb.update(gettext("creating branches"), i, len(refs_dict)) if getattr(target_bzrdir._format, "colocated_branches", False) and colocated: if name == "HEAD": branch_name = None head_branch = self._get_colocated_branch(target_bzrdir, branch_name) else: head_branch = self._get_nested_branch(dest_transport, dest_format, branch_name) revid = mapping.revision_id_foreign_to_bzr(sha) source_branch = GitBranch(source_repo.bzrdir, source_repo, sha) source_branch.head = sha if head_branch.last_revision() != revid: head_branch.generate_revision_history(revid) source_branch.tags.merge_to(head_branch.tags) if not head_branch.get_parent(): url = urlutils.join_segment_parameters( source_branch.base, {"ref": urllib.quote(name, '')}) head_branch.set_parent(url) finally: pb.finished() trace.note(gettext( "Use 'bzr checkout' to create a working tree in " "the newly created branches.")) class cmd_git_object(Command): """List or display Git objects by SHA. Cat a particular object's Git representation if a SHA is specified. List all available SHAs otherwise. """ hidden = True aliases = ["git-objects", "git-cat"] takes_args = ["sha1?"] takes_options = [Option('directory', short_name='d', help='Location of repository.', type=unicode), Option('pretty', help='Pretty-print objects.')] encoding_type = 'exact' @display_command def run(self, sha1=None, directory=".", pretty=False): from ...errors import ( BzrCommandError, ) from ...bzrdir import ( BzrDir, ) from .object_store import ( get_object_store, ) from . import gettext bzrdir, _ = BzrDir.open_containing(directory) repo = bzrdir.find_repository() object_store = get_object_store(repo) object_store.lock_read() try: if sha1 is not None: try: obj = object_store[str(sha1)] except KeyError: raise BzrCommandError(gettext("Object not found: %s") % sha1) if pretty: text = obj.as_pretty_string() else: text = obj.as_raw_string() self.outf.write(text) else: for sha1 in object_store: self.outf.write("%s\n" % sha1) finally: object_store.unlock() class cmd_git_refs(Command): """Output all of the virtual refs for a repository. 
""" hidden = True takes_args = ["location?"] @display_command def run(self, location="."): from ...bzrdir import ( BzrDir, ) from .refs import ( get_refs_container, ) from .object_store import ( get_object_store, ) bzrdir, _ = BzrDir.open_containing(location) repo = bzrdir.find_repository() object_store = get_object_store(repo) object_store.lock_read() try: refs = get_refs_container(bzrdir, object_store) for k, v in refs.as_dict().iteritems(): self.outf.write("%s -> %s\n" % (k, v)) finally: object_store.unlock() class cmd_git_apply(Command): """Apply a series of git-am style patches. This command will in the future probably be integrated into "bzr pull". """ takes_options = [ Option('signoff', short_name='s', help='Add a Signed-off-by line.'), Option('force', help='Apply patches even if tree has uncommitted changes.') ] takes_args = ["patches*"] def _apply_patch(self, wt, f, signoff): """Apply a patch. :param wt: A Bazaar working tree object. :param f: Patch file to read. :param signoff: Add Signed-Off-By flag. """ from . import gettext from ...errors import BzrCommandError from dulwich.patch import git_am_patch_split import subprocess (c, diff, version) = git_am_patch_split(f) # FIXME: Cope with git-specific bits in patch # FIXME: Add new files to working tree p = subprocess.Popen(["patch", "-p1"], stdin=subprocess.PIPE, cwd=wt.basedir) p.communicate(diff) exitcode = p.wait() if exitcode != 0: raise BzrCommandError(gettext("error running patch")) message = c.message if signoff: signed_off_by = wt.branch.get_config().username() message += "Signed-off-by: %s\n" % signed_off_by.encode('utf-8') wt.commit(authors=[c.author], message=message) def run(self, patches_list=None, signoff=False, force=False): from ...errors import UncommittedChanges from ...workingtree import WorkingTree if patches_list is None: patches_list = [] tree, _ = WorkingTree.open_containing(".") if tree.basis_tree().changes_from(tree).has_changed() and not force: raise UncommittedChanges(tree) tree.lock_write() try: for patch in patches_list: f = open(patch, 'r') try: self._apply_patch(tree, f, signoff=signoff) finally: f.close() finally: tree.unlock() class cmd_git_push_pristine_tar_deltas(Command): """Push pristine tar deltas to a git repository.""" takes_options = [Option('directory', short_name='d', help='Location of repository.', type=unicode)] takes_args = ['target', 'package'] def run(self, target, package, directory='.'): from ...branch import Branch from ...errors import ( BzrCommandError, NoSuchRevision, ) from ...trace import warning from ...repository import Repository from .object_store import get_object_store from .pristine_tar import ( revision_pristine_tar_data, store_git_pristine_tar_data, ) source = Branch.open_containing(directory)[0] target_bzr = Repository.open(target) target = getattr(target_bzr, '_git', None) git_store = get_object_store(source.repository) self.add_cleanup(git_store.unlock) git_store.lock_read() if target is None: raise BzrCommandError("Target not a git repository") tag_dict = source.tags.get_tag_dict() for name, revid in tag_dict.iteritems(): try: rev = source.repository.get_revision(revid) except NoSuchRevision: continue try: delta, kind = revision_pristine_tar_data(rev) except KeyError: continue gitid = git_store._lookup_revision_sha1(revid) if not (name.startswith('upstream/') or name.startswith('upstream-')): warning("Unexpected pristine tar revision tagged %s. 
Ignoring.", name) continue upstream_version = name[len("upstream/"):] filename = '%s_%s.orig.tar.%s' % (package, upstream_version, kind) if not gitid in target: warning("base git id %s for %s missing in target repository", gitid, filename) store_git_pristine_tar_data(target, filename.encode('utf-8'), delta, gitid) bzr-git-0.6.13+bzr1649/commit.py0000644000000000000000000002140313165530605014270 0ustar 00000000000000# Copyright (C) 2009-2010 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Support for committing in native Git working trees.""" from __future__ import absolute_import from dulwich.index import ( commit_tree, ) import os import stat from ... import ( revision as _mod_revision, ) from ...errors import ( RootMissing, ) from ...repository import ( CommitBuilder, ) from dulwich.objects import ( S_IFGITLINK, Blob, Commit, ) from dulwich.repo import Repo from .mapping import ( entry_mode, ) from .roundtrip import ( CommitSupplement, inject_bzr_metadata, ) class GitCommitBuilder(CommitBuilder): """Commit builder for Git repositories.""" supports_record_entry_contents = False def __init__(self, *args, **kwargs): super(GitCommitBuilder, self).__init__(*args, **kwargs) self._validate_revprops(self._revprops) self.store = self.repository._git.object_store self._blobs = {} self._any_changes = False self._will_record_deletes = False self._override_fileids = {} self._mapping = self.repository.get_mapping() def any_changes(self): return self._any_changes def record_entry_contents(self, ie, parent_invs, path, tree, content_summary): raise NotImplementedError(self.record_entry_contents) def record_delete(self, kind, path, file_id): assert type(path) == str if kind != 'directory': self._override_fileids[path] = None self._blobs[path] = None self._any_changes = True def record_iter_changes(self, workingtree, basis_revid, iter_changes): def link_sha1(path, file_id): blob = Blob() blob.data = workingtree.get_symlink_target(file_id, path).encode("utf-8") self.store.add_object(blob) return blob.id def text_sha1(path, file_id): blob = Blob() blob.data = workingtree.get_file_text(file_id, path) self.store.add_object(blob) return blob.id def treeref_sha1(path, file_id): return Repo.open(os.path.join(workingtree.basedir, path)).head() seen_root = False for (file_id, path, changed_content, versioned, parent, name, kind, executable) in iter_changes: if kind[1] in ("directory",): if kind[0] in ("file", "symlink"): self.record_delete(kind[0], path[0].encode("utf-8"), file_id) if path[1] == "": seen_root = True continue if path[1] is None: self.record_delete(kind[0], path[0].encode("utf-8"), file_id) continue if kind[1] == "file": mode = stat.S_IFREG sha = text_sha1(path[1], file_id) elif kind[1] == "symlink": mode = stat.S_IFLNK sha = link_sha1(path[1], file_id) elif kind[1] == "tree-reference": mode = S_IFGITLINK sha = treeref_sha1(path[1], file_id) 
else: raise AssertionError("Unknown kind %r" % kind[1]) if executable[1]: mode |= 0111 self._any_changes = True encoded_new_path = path[1].encode("utf-8") self._blobs[encoded_new_path] = (mode, sha) file_sha1 = workingtree.get_file_sha1(file_id, path[1]) if file_sha1 is None: # File no longer exists if path[0] is not None: self.record_delete(kind[0], path[0].encode("utf-8"), file_id) continue _, st = workingtree.get_file_with_stat(file_id, path[1]) yield file_id, path[1], (file_sha1, st) self._override_fileids[encoded_new_path] = file_id if not seen_root and len(self.parents) == 0: raise RootMissing() if getattr(workingtree, "basis_tree", False): basis_tree = workingtree.basis_tree() else: if len(self.parents) == 0: basis_revid = _mod_revision.NULL_REVISION else: basis_revid = self.parents[0] basis_tree = self.repository.revision_tree(basis_revid) # Fill in entries that were not changed for path, entry in basis_tree.iter_entries_by_dir(): if entry.kind not in ("file", "symlink", "tree-reference"): continue if not path in self._blobs: if entry.kind == "symlink": blob = Blob() blob.data = basis_tree.get_symlink_target(entry.file_id, path) self._blobs[path.encode("utf-8")] = (entry_mode(entry), blob.id) elif entry.kind == "file": blob = Blob() blob.data = basis_tree.get_file_text(entry.file_id, path) self._blobs[path.encode("utf-8")] = (entry_mode(entry), blob.id) else: (mode, sha) = workingtree._lookup_entry(path.encode("utf-8"), update_index=True) self._blobs[path.encode("utf-8")] = (sha, mode) if not self._lossy and self._mapping.BZR_FILE_IDS_FILE is not None: try: fileid_map = dict(basis_tree._fileid_map.file_ids) except AttributeError: fileid_map = {} for path, file_id in self._override_fileids.iteritems(): assert type(path) == str if file_id is None: del fileid_map[path] else: assert type(file_id) == str fileid_map[path] = file_id if fileid_map: fileid_blob = self._mapping.export_fileid_map(fileid_map) self.store.add_object(fileid_blob) self._blobs[self._mapping.BZR_FILE_IDS_FILE] = (stat.S_IFREG | 0644, fileid_blob.id) else: self._blobs[self._mapping.BZR_FILE_IDS_FILE] = None self.new_inventory = None def get_basis_delta(self): if not self._will_record_deletes: raise AssertionError # FIXME return [] def finish_inventory(self): # eliminate blobs that were removed for path, entry in iter(self._blobs.items()): if entry is None: del self._blobs[path] def _iterblobs(self): return ((path, sha, mode) for (path, (mode, sha)) in self._blobs.iteritems()) def commit(self, message): self._validate_unicode_text(message, 'commit message') c = Commit() c.parents = [self.repository.lookup_bzr_revision_id(revid)[0] for revid in self.parents] c.tree = commit_tree(self.store, self._iterblobs()) c.committer = self._committer c.author = self._revprops.get('author', self._committer) if c.author != c.committer: self._revprops.remove("author") c.commit_time = int(self._timestamp) c.author_time = int(self._timestamp) c.commit_timezone = self._timezone c.author_timezone = self._timezone c.encoding = 'utf-8' c.message = message.encode("utf-8") if not self._lossy: commit_supplement = CommitSupplement() commit_supplement.revision_id = self._new_revision_id commit_supplement.properties = self._revprops commit_supplement.explicit_parent_ids = self.parents if commit_supplement: c.message = inject_bzr_metadata(c.message, commit_supplement, "utf-8") assert len(c.id) == 40 if self._new_revision_id is None or self._lossy: self._new_revision_id = self._mapping.revision_id_foreign_to_bzr(c.id) self.store.add_object(c) 
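# --- Illustrative aside (not part of commit.py) ---
# The `assert len(c.id) == 40` in GitCommitBuilder.commit() above holds because a
# git object id is the SHA-1 of the serialized object, rendered as 40 hexadecimal
# characters. A minimal sketch of that calculation, assuming git's
# "<type> <size>\0<body>" object header format:
import hashlib

body = b"hello"
payload = b"blob " + str(len(body)).encode("ascii") + b"\x00" + body
object_id = hashlib.sha1(payload).hexdigest()
assert len(object_id) == 40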
self.repository.commit_write_group() return self._new_revision_id def abort(self): self.repository.abort_write_group() def will_record_deletes(self): self._will_record_deletes = True def revision_tree(self): return self.repository.revision_tree(self._new_revision_id) bzr-git-0.6.13+bzr1649/config.py0000644000000000000000000000444513165530605014254 0ustar 00000000000000# Copyright (C) 2009-2010 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Config file handling for Git.""" from __future__ import absolute_import from ... import ( config, ) class GitBranchConfig(config.BranchConfig): """BranchConfig that uses locations.conf in place of branch.conf""" def __init__(self, branch): super(GitBranchConfig, self).__init__(branch) # do not provide a BranchDataConfig self.option_sources = self.option_sources[0], self.option_sources[2] def __repr__(self): return "<%s of %r>" % (self.__class__.__name__, self.branch) def set_user_option(self, name, value, store=config.STORE_BRANCH, warn_masked=False): """Force local to True""" config.BranchConfig.set_user_option(self, name, value, store=config.STORE_LOCATION, warn_masked=warn_masked) def _get_user_id(self): # TODO: Read from ~/.gitconfig return self._get_best_value('_get_user_id') class GitBranchStack(config._CompatibleStack): """GitBranch stack.""" def __init__(self, branch): lstore = config.LocationStore() loc_matcher = config.LocationMatcher(lstore, branch.base) # FIXME: This should also be looking in .git/config for # local git branches. gstore = config.GlobalStore() super(GitBranchStack, self).__init__( [self._get_overrides, loc_matcher.get_sections, gstore.get_sections], # All modifications go to the corresponding section in # locations.conf lstore, branch.base) self.branch = branch bzr-git-0.6.13+bzr1649/dir.py0000644000000000000000000004707113165530605013567 0ustar 00000000000000# Copyright (C) 2007 Canonical Ltd # Copyright (C) 2010 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """An adapter between a Git control dir and a Bazaar ControlDir.""" from __future__ import absolute_import import urllib from ... 
import ( errors as bzr_errors, trace, osutils, revision as _mod_revision, urlutils, ) from ...bzrdir import CreateRepository from ...transport import do_catching_redirections from ...controldir import ( ControlDir, ControlDirFormat, format_registry, ) class GitDirConfig(object): def get_default_stack_on(self): return None def set_default_stack_on(self, value): raise bzr_errors.BzrError("Cannot set configuration") class GitControlDirFormat(ControlDirFormat): colocated_branches = True fixed_components = True def __eq__(self, other): return type(self) == type(other) def is_supported(self): return True def network_name(self): return "git" class GitDir(ControlDir): """An adapter to the '.git' dir used by git.""" def is_supported(self): return True def can_convert_format(self): return False def break_lock(self): pass def cloning_metadir(self, stacked=False): return format_registry.make_bzrdir("default") def checkout_metadir(self, stacked=False): return format_registry.make_bzrdir("default") def _get_default_ref(self): return "HEAD" def _get_selected_ref(self, branch, ref=None): if ref is not None and branch is not None: raise bzr_errors.BzrError("can't specify both ref and branch") if ref is not None: return ref segment_parameters = getattr( self.user_transport, "get_segment_parameters", lambda: {})() ref = segment_parameters.get("ref") if ref is not None: return urlutils.unescape(ref) if branch is None and getattr(self, "_get_selected_branch", False): branch = self._get_selected_branch() if branch is not None: from .refs import branch_name_to_ref return branch_name_to_ref(branch) return self._get_default_ref() def get_config(self): return GitDirConfig() def _available_backup_name(self, base): return osutils.available_backup_name(base, self.root_transport.has) def sprout(self, url, revision_id=None, force_new_repo=False, recurse='down', possible_transports=None, accelerator_tree=None, hardlink=False, stacked=False, source_branch=None, create_tree_if_local=True): from ...repository import InterRepository from ...transport.local import LocalTransport from ...transport import get_transport target_transport = get_transport(url, possible_transports) target_transport.ensure_base() cloning_format = self.cloning_metadir() # Create/update the result branch result = cloning_format.initialize_on_transport(target_transport) source_branch = self.open_branch() source_repository = self.find_repository() try: result_repo = result.find_repository() except bzr_errors.NoRepositoryPresent: result_repo = result.create_repository() target_is_empty = True else: target_is_empty = None # Unknown if stacked: raise bzr_errors.IncompatibleRepositories(source_repository, result_repo) interrepo = InterRepository.get(source_repository, result_repo) if revision_id is not None: determine_wants = interrepo.get_determine_wants_revids( [revision_id], include_tags=False) else: determine_wants = interrepo.determine_wants_all interrepo.fetch_objects(determine_wants=determine_wants, mapping=source_branch.mapping) result_branch = source_branch.sprout(result, revision_id=revision_id, repository=result_repo) if (create_tree_if_local and isinstance(target_transport, LocalTransport) and (result_repo is None or result_repo.make_working_trees())): wt = result.create_workingtree(accelerator_tree=accelerator_tree, hardlink=hardlink, from_branch=result_branch) wt.lock_write() try: if wt.path2id('') is None: try: wt.set_root_id(self.open_workingtree.get_root_id()) except bzr_errors.NoWorkingTree: pass finally: wt.unlock() return result def 
clone_on_transport(self, transport, revision_id=None, force_new_repo=False, preserve_stacking=False, stacked_on=None, create_prefix=False, use_existing_dir=True, no_tree=False): """See ControlDir.clone_on_transport.""" from ...repository import InterRepository from .mapping import default_mapping if no_tree: format = BareLocalGitControlDirFormat() else: format = LocalGitControlDirFormat() (target_repo, target_controldir, stacking, repo_policy) = format.initialize_on_transport_ex(transport, use_existing_dir=use_existing_dir, create_prefix=create_prefix, force_new_repo=force_new_repo) target_git_repo = target_repo._git source_repo = self.open_repository() source_git_repo = source_repo._git interrepo = InterRepository.get(source_repo, target_repo) if revision_id is not None: determine_wants = interrepo.get_determine_wants_revids([revision_id], include_tags=True) else: determine_wants = interrepo.determine_wants_all (pack_hint, _, refs) = interrepo.fetch_objects(determine_wants, mapping=default_mapping) for name, val in refs.iteritems(): target_git_repo.refs[name] = val return self.__class__(transport, target_git_repo, format) def find_repository(self): """Find the repository that should be used. This does not require a branch as we use it to find the repo for new branches as well as to hook existing branches up to their repository. """ return self.open_repository() def get_refs_container(self): """Retrieve the refs container. """ raise NotImplementedError(self.get_refs_container) class LocalGitControlDirFormat(GitControlDirFormat): """The .git directory control format.""" bare = False @classmethod def _known_formats(self): return set([LocalGitControlDirFormat()]) @property def repository_format(self): from .repository import GitRepositoryFormat return GitRepositoryFormat() def get_branch_format(self): from .branch import GitBranchFormat return GitBranchFormat() def open(self, transport, _found=None): """Open this directory. 
""" from .transportgit import TransportRepo gitrepo = TransportRepo(transport, self.bare, refs_text=getattr(self, "_refs_text", None)) return LocalGitDir(transport, gitrepo, self) def get_format_description(self): return "Local Git Repository" def initialize_on_transport(self, transport): from .transportgit import TransportRepo repo = TransportRepo.init(transport, bare=self.bare) del repo.refs["HEAD"] return self.open(transport) def initialize_on_transport_ex(self, transport, use_existing_dir=False, create_prefix=False, force_new_repo=False, stacked_on=None, stack_on_pwd=None, repo_format_name=None, make_working_trees=None, shared_repo=False, vfs_only=False): def make_directory(transport): transport.mkdir('.') return transport def redirected(transport, e, redirection_notice): trace.note(redirection_notice) return transport._redirected_to(e.source, e.target) try: transport = do_catching_redirections(make_directory, transport, redirected) except bzr_errors.FileExists: if not use_existing_dir: raise except bzr_errors.NoSuchFile: if not create_prefix: raise transport.create_prefix() controldir = self.initialize_on_transport(transport) repository = controldir.open_repository() repository.lock_write() return (repository, controldir, False, CreateRepository(controldir)) def is_supported(self): return True def supports_transport(self, transport): try: external_url = transport.external_url() except bzr_errors.InProcessTransport: raise bzr_errors.NotBranchError(path=transport.base) return (external_url.startswith("http:") or external_url.startswith("https:") or external_url.startswith("file:")) class BareLocalGitControlDirFormat(LocalGitControlDirFormat): bare = True supports_workingtrees = False def get_format_description(self): return "Local Git Repository (bare)" class LocalGitDir(GitDir): """An adapter to the '.git' dir used by git.""" def _get_gitrepository_class(self): from .repository import LocalGitRepository return LocalGitRepository def __repr__(self): return "<%s at %r>" % ( self.__class__.__name__, self.root_transport.base) _gitrepository_class = property(_get_gitrepository_class) @property def user_transport(self): return self.root_transport @property def control_transport(self): return self.transport def __init__(self, transport, gitrepo, format): self._format = format self.root_transport = transport self._mode_check_done = False self._git = gitrepo if gitrepo.bare: self.transport = transport else: self.transport = transport.clone('.git') self._mode_check_done = None def is_control_filename(self, filename): return (filename == '.git' or filename.startswith('.git/') or filename.startswith('.git\\')) def _get_symref(self, ref): from dulwich.repo import SYMREF refcontents = self._git.refs.read_ref(ref) if refcontents is None: # no such ref return None if refcontents.startswith(SYMREF): return refcontents[len(SYMREF):].rstrip("\n") return None def set_branch_reference(self, target, name=None): if self.control_transport.base != target.bzrdir.control_transport.base: raise bzr_errors.IncompatibleFormat(target._format, self._format) ref = self._get_selected_ref(name) self._git.refs.set_symbolic_ref(ref, target.ref) def get_branch_reference(self, name=None): ref = self._get_selected_ref(name) target_ref = self._get_symref(ref) if target_ref is not None: return urlutils.join_segment_parameters( self.user_url.rstrip("/"), {"ref": urllib.quote(target_ref, '')}) return None def find_branch_format(self, name=None): from .branch import ( GitBranchFormat, GitSymrefBranchFormat, ) ref = 
self._get_selected_ref(name) if self._get_symref(ref) is not None: return GitSymrefBranchFormat() else: return GitBranchFormat() def get_branch_transport(self, branch_format, name=None): if branch_format is None: return self.transport if isinstance(branch_format, LocalGitControlDirFormat): return self.transport raise bzr_errors.IncompatibleFormat(branch_format, self._format) def get_repository_transport(self, format): if format is None: return self.transport if isinstance(format, LocalGitControlDirFormat): return self.transport raise bzr_errors.IncompatibleFormat(format, self._format) def get_workingtree_transport(self, format): if format is None: return self.transport if isinstance(format, LocalGitControlDirFormat): return self.transport raise bzr_errors.IncompatibleFormat(format, self._format) def open_branch(self, name=None, unsupported=False, ignore_fallbacks=None, ref=None, possible_transports=None): """'create' a branch for this dir.""" repo = self.open_repository() from .branch import LocalGitBranch ref = self._get_selected_ref(name, ref) ref_chain, sha = self._git.refs.follow(ref) if sha is None: raise bzr_errors.NotBranchError(self.root_transport.base, bzrdir=self) return LocalGitBranch(self, repo, ref) def destroy_branch(self, name=None): refname = self._get_selected_ref(name) try: del self._git.refs[refname] except KeyError: raise bzr_errors.NotBranchError(self.root_transport.base, bzrdir=self) def destroy_repository(self): raise bzr_errors.UnsupportedOperation(self.destroy_repository, self) def destroy_workingtree(self): wt = self.open_workingtree(recommend_upgrade=False) repository = wt.branch.repository empty = repository.revision_tree(_mod_revision.NULL_REVISION) # We ignore the conflicts returned by wt.revert since we're about to # delete the wt metadata anyway, all that should be left here are # detritus. But see bug #634470 about subtree .bzr dirs. 
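# --- Illustrative aside (not part of dir.py) ---
# _get_selected_ref()/open_branch() above ultimately rely on the standard git
# convention for naming branch refs: a named branch lives under "refs/heads/",
# while the unnamed default falls back to the symbolic "HEAD" ref. A minimal
# sketch of that mapping (the plugin's real helper is branch_name_to_ref in its
# refs module; the function name below is made up for illustration):
def branch_name_to_ref_sketch(name):
    if name is None:
        return "HEAD"
    return "refs/heads/" + name

assert branch_name_to_ref_sketch(None) == "HEAD"
assert branch_name_to_ref_sketch("trunk") == "refs/heads/trunk"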
conflicts = wt.revert(old_tree=empty) self.destroy_workingtree_metadata() def destroy_workingtree_metadata(self): self.transport.delete('index') def needs_format_conversion(self, format=None): return not isinstance(self._format, format.__class__) def list_branches(self): return self.get_branches().values() def get_branches(self): from .refs import ref_to_branch_name ret = {} for ref in self._git.refs.keys(): try: branch_name = ref_to_branch_name(ref) except ValueError: continue except UnicodeDecodeError: trace.warning("Ignoring branch %r with unicode error ref", ref) continue ret[branch_name] = self.open_branch(ref=ref) return ret def open_repository(self): """'open' a repository for this dir.""" return self._gitrepository_class(self) def open_workingtree(self, recommend_upgrade=True, unsupported=False): if not self._git.bare: from dulwich.errors import NoIndexPresent repo = self.open_repository() try: index = repo._git.open_index() except NoIndexPresent: pass else: from .workingtree import GitWorkingTree try: branch = self.open_branch() except bzr_errors.NotBranchError: pass else: return GitWorkingTree(self, repo, branch, index) loc = urlutils.unescape_for_display(self.root_transport.base, 'ascii') raise bzr_errors.NoWorkingTree(loc) def create_repository(self, shared=False): from .repository import GitRepositoryFormat if shared: raise bzr_errors.IncompatibleFormat(GitRepositoryFormat(), self._format) return self.open_repository() def create_branch(self, name=None, repository=None, append_revisions_only=None, ref=None): refname = self._get_selected_ref(name, ref) from dulwich.protocol import ZERO_SHA if refname in self._git.refs: raise bzr_errors.AlreadyBranchError(self.user_url) self._git.refs[refname] = ZERO_SHA branch = self.open_branch(name) if append_revisions_only: branch.set_append_revisions_only(append_revisions_only) return branch def backup_bzrdir(self): if not self._git.bare: self.root_transport.copy_tree(".git", ".git.backup") return (self.root_transport.abspath(".git"), self.root_transport.abspath(".git.backup")) else: basename = urlutils.basename(self.root_transport.base) parent = self.root_transport.clone('..') parent.copy_tree(basename, basename + ".backup") def create_workingtree(self, revision_id=None, from_branch=None, accelerator_tree=None, hardlink=False): if self._git.bare: raise bzr_errors.UnsupportedOperation(self.create_workingtree, self) from dulwich.index import write_index from dulwich.pack import SHA1Writer f = open(self.transport.local_abspath("index"), 'w+') try: f = SHA1Writer(f) write_index(f, []) finally: f.close() return self.open_workingtree() def _find_or_create_repository(self, force_new_repo=None): return self.create_repository(shared=False) def _find_creation_modes(self): """Determine the appropriate modes for files and directories. They're always set to be consistent with the base directory, assuming that this transport allows setting modes. """ # TODO: Do we need or want an option (maybe a config setting) to turn # this off or override it for particular locations? -- mbp 20080512 if self._mode_check_done: return self._mode_check_done = True try: st = self.transport.stat('.') except bzr_errors.TransportNotPossible: self._dir_mode = None self._file_mode = None else: # Check the directory mode, but also make sure the created # directories and files are read-write for this user. 
This is # mostly a workaround for filesystems which lie about being able to # write to a directory (cygwin & win32) if (st.st_mode & 07777 == 00000): # FTP allows stat but does not return dir/file modes self._dir_mode = None self._file_mode = None else: self._dir_mode = (st.st_mode & 07777) | 00700 # Remove the sticky and execute bits for files self._file_mode = self._dir_mode & ~07111 def _get_file_mode(self): """Return Unix mode for newly created files, or None. """ if not self._mode_check_done: self._find_creation_modes() return self._file_mode def _get_dir_mode(self): """Return Unix mode for newly created directories, or None. """ if not self._mode_check_done: self._find_creation_modes() return self._dir_mode def get_refs_container(self): return self._git.refs def get_peeled(self, ref): return self._git.get_peeled(ref) bzr-git-0.6.13+bzr1649/directory.py0000644000000000000000000000213713165530605015007 0ustar 00000000000000# Copyright (C) 2012 Jelmer Vernooij # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Directory service for GitHub.""" from __future__ import absolute_import from ... import transport transport.register_urlparse_netloc_protocol('github') class GitHubDirectory(object): def look_up(self, name, url): """See DirectoryService.look_up""" return "git+ssh://git@github.com/" + name bzr-git-0.6.13+bzr1649/errors.py0000644000000000000000000000443613165530605014323 0ustar 00000000000000# Copyright (C) 2007 Canonical Ltd # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """A grouping of Exceptions for bzr-git""" from __future__ import absolute_import from dulwich import errors as git_errors from ... import errors as bzr_errors class BzrGitError(bzr_errors.BzrError): """The base-level exception for bzr-git errors.""" class NoSuchRef(BzrGitError): """Raised when a ref can not be found.""" _fmt = "The ref %(ref)s was not found in the repository at %(location)s."
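# --- Illustrative aside (not part of errors.py) ---
# Sketch of how a bzrlib-style error such as NoSuchRef above builds its message:
# the instance attributes set in __init__ are interpolated into the class-level
# _fmt template. The class below is a stand-in for demonstration only, not
# bzrlib's BzrError machinery.
class _NoSuchRefSketch(Exception):
    _fmt = "The ref %(ref)s was not found in the repository at %(location)s."

    def __init__(self, ref, location):
        self.ref = ref
        self.location = location
        Exception.__init__(self, self._fmt % vars(self))

print(_NoSuchRefSketch("refs/heads/missing", "/srv/repo"))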
def __init__(self, ref, location, present_refs=None): self.ref = ref self.location = location self.present_refs = present_refs def convert_dulwich_error(error): """Convert a Dulwich error to a Bazaar error.""" if isinstance(error, git_errors.HangupException): raise bzr_errors.ConnectionReset(error.msg, "") raise error class NoPushSupport(bzr_errors.BzrError): _fmt = "Push is not yet supported for bzr-git. Try dpush instead." class GitSmartRemoteNotSupported(bzr_errors.UnsupportedOperation): _fmt = "This operation is not supported by the Git smart server protocol." class UnknownCommitExtra(bzr_errors.BzrError): _fmt = "Unknown extra fields in %(object)r: %(fields)r." def __init__(self, object, fields): bzr_errors.BzrError.__init__(self) self.object = object self.fields = ",".join(fields) class UnknownMercurialCommitExtra(bzr_errors.BzrError): _fmt = "Unknown mercurial extra fields in %(object)r: %(fields)r." def __init__(self, object, fields): bzr_errors.BzrError.__init__(self) self.object = object self.fields = ",".join(fields) bzr-git-0.6.13+bzr1649/fetch.py0000644000000000000000000010443713165530605014102 0ustar 00000000000000# Copyright (C) 2008-2010 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA from __future__ import absolute_import from dulwich.errors import ( NotCommitError, ) from dulwich.objects import ( Commit, Tag, Tree, S_IFGITLINK, S_ISGITLINK, ZERO_SHA, ) from dulwich.object_store import ( ObjectStoreGraphWalker, tree_lookup_path, ) from dulwich.walk import Walker from itertools import ( imap, ) import posixpath import re import stat from ... import ( debug, errors, osutils, trace, ui, ) from ...errors import ( BzrError, ) from ...inventory import ( InventoryDirectory, InventoryFile, InventoryLink, TreeReference, ) from ...repository import ( InterRepository, ) from ...revision import ( NULL_REVISION, ) from ...revisiontree import InventoryRevisionTree from ...testament import ( StrictTestament3, ) from ...tsort import ( topo_sort, ) from ...versionedfile import ( ChunkedContentFactory, ) from .mapping import ( DEFAULT_FILE_MODE, mode_is_executable, mode_kind, warn_unusual_mode, ) from .object_store import ( BazaarObjectStore, LRUTreeCache, _tree_to_objects, ) from .refs import ( is_tag, ) from .remote import ( RemoteGitRepository, ) from .repository import ( GitRepository, GitRepositoryFormat, LocalGitRepository, ) def import_git_blob(texts, mapping, path, name, (base_hexsha, hexsha), base_bzr_tree, parent_id, revision_id, parent_bzr_trees, lookup_object, (base_mode, mode), store_updater, lookup_file_id): """Import a git blob object into a bzr repository. 
:param texts: VersionedFiles to add to :param path: Path in the tree :param blob: A git blob :return: Inventory delta for this file """ if mapping.is_control_file(path): return [] if base_hexsha == hexsha and base_mode == mode: # If nothing has changed since the base revision, we're done return [] file_id = lookup_file_id(path) if stat.S_ISLNK(mode): cls = InventoryLink else: cls = InventoryFile ie = cls(file_id, name.decode("utf-8"), parent_id) if ie.kind == "file": ie.executable = mode_is_executable(mode) if base_hexsha == hexsha and mode_kind(base_mode) == mode_kind(mode): base_file_id = base_bzr_tree.path2id(path) base_exec = base_bzr_tree.is_executable(base_file_id, path) if ie.kind == "symlink": ie.symlink_target = base_bzr_tree.get_symlink_target( base_file_id, path) else: ie.text_size = base_bzr_tree.get_file_size(base_file_id) ie.text_sha1 = base_bzr_tree.get_file_sha1(base_file_id, path) if ie.kind == "symlink" or ie.executable == base_exec: ie.revision = base_bzr_tree.get_file_revision(base_file_id, path) else: blob = lookup_object(hexsha) else: blob = lookup_object(hexsha) if ie.kind == "symlink": ie.revision = None ie.symlink_target = blob.data.decode("utf-8") else: ie.text_size = sum(imap(len, blob.chunked)) ie.text_sha1 = osutils.sha_strings(blob.chunked) # Check what revision we should store parent_keys = [] for ptree in parent_bzr_trees: try: pkind = ptree.kind(file_id) except errors.NoSuchId: continue if (pkind == ie.kind and ((pkind == "symlink" and ptree.get_symlink_target(file_id) == ie.symlink_target) or (pkind == "file" and ptree.get_file_sha1(file_id) == ie.text_sha1 and ptree.is_executable(file_id) == ie.executable))): # found a revision in one of the parents to use ie.revision = ptree.get_file_revision(file_id) break parent_key = (file_id, ptree.get_file_revision(file_id)) if not parent_key in parent_keys: parent_keys.append(parent_key) if ie.revision is None: # Need to store a new revision ie.revision = revision_id assert ie.revision is not None if ie.kind == 'symlink': chunks = [] else: chunks = blob.chunked texts.insert_record_stream([ ChunkedContentFactory((file_id, ie.revision), tuple(parent_keys), ie.text_sha1, chunks)]) invdelta = [] if base_hexsha is not None: old_path = path.decode("utf-8") # Renames are not supported yet if stat.S_ISDIR(base_mode): invdelta.extend(remove_disappeared_children(base_bzr_tree, old_path, lookup_object(base_hexsha), [], lookup_object)) else: old_path = None new_path = path.decode("utf-8") invdelta.append((old_path, new_path, file_id, ie)) if base_hexsha != hexsha: store_updater.add_object(blob, (ie.file_id, ie.revision), path) return invdelta class SubmodulesRequireSubtrees(BzrError): _fmt = ("The repository you are fetching from contains submodules, " "which are not yet supported.") internal = False def import_git_submodule(texts, mapping, path, name, (base_hexsha, hexsha), base_bzr_tree, parent_id, revision_id, parent_bzr_trees, lookup_object, (base_mode, mode), store_updater, lookup_file_id): """Import a git submodule.""" if base_hexsha == hexsha and base_mode == mode: return [], {} file_id = lookup_file_id(path) invdelta = [] ie = TreeReference(file_id, name.decode("utf-8"), parent_id) ie.revision = revision_id if base_hexsha is not None: old_path = path.decode("utf-8") # Renames are not supported yet if stat.S_ISDIR(base_mode): invdelta.extend(remove_disappeared_children(base_bzr_tree, old_path, lookup_object(base_hexsha), [], lookup_object)) else: old_path = None ie.reference_revision = 
mapping.revision_id_foreign_to_bzr(hexsha) texts.insert_record_stream([ ChunkedContentFactory((file_id, ie.revision), (), None, [])]) invdelta.append((old_path, path, file_id, ie)) return invdelta, {} def remove_disappeared_children(base_bzr_tree, path, base_tree, existing_children, lookup_object): """Generate an inventory delta for removed children. :param base_bzr_tree: Base bzr tree against which to generate the inventory delta. :param path: Path to process (unicode) :param base_tree: Git Tree base object :param existing_children: Children that still exist :param lookup_object: Lookup a git object by its SHA1 :return: Inventory delta, as list """ assert type(path) is unicode ret = [] for name, mode, hexsha in base_tree.iteritems(): if name in existing_children: continue c_path = posixpath.join(path, name.decode("utf-8")) file_id = base_bzr_tree.path2id(c_path) assert file_id is not None ret.append((c_path, None, file_id, None)) if stat.S_ISDIR(mode): ret.extend(remove_disappeared_children( base_bzr_tree, c_path, lookup_object(hexsha), [], lookup_object)) return ret def import_git_tree(texts, mapping, path, name, (base_hexsha, hexsha), base_bzr_tree, parent_id, revision_id, parent_bzr_trees, lookup_object, (base_mode, mode), store_updater, lookup_file_id, allow_submodules=False): """Import a git tree object into a bzr repository. :param texts: VersionedFiles object to add to :param path: Path in the tree (str) :param name: Name of the tree (str) :param tree: A git tree object :param base_bzr_tree: Base inventory against which to return inventory delta :return: Inventory delta for this subtree """ assert type(path) is str assert type(name) is str if base_hexsha == hexsha and base_mode == mode: # If nothing has changed since the base revision, we're done return [], {} invdelta = [] file_id = lookup_file_id(path) # We just have to hope this is indeed utf-8: ie = InventoryDirectory(file_id, name.decode("utf-8"), parent_id) tree = lookup_object(hexsha) if base_hexsha is None: base_tree = None old_path = None # Newly appeared here else: base_tree = lookup_object(base_hexsha) old_path = path.decode("utf-8") # Renames aren't supported yet new_path = path.decode("utf-8") if base_tree is None or type(base_tree) is not Tree: ie.revision = revision_id invdelta.append((old_path, new_path, ie.file_id, ie)) texts.insert_record_stream([ ChunkedContentFactory((ie.file_id, ie.revision), (), None, [])]) # Remember for next time existing_children = set() child_modes = {} for name, child_mode, child_hexsha in tree.iteritems(): existing_children.add(name) child_path = posixpath.join(path, name) if type(base_tree) is Tree: try: child_base_mode, child_base_hexsha = base_tree[name] except KeyError: child_base_hexsha = None child_base_mode = 0 else: child_base_hexsha = None child_base_mode = 0 if stat.S_ISDIR(child_mode): subinvdelta, grandchildmodes = import_git_tree(texts, mapping, child_path, name, (child_base_hexsha, child_hexsha), base_bzr_tree, file_id, revision_id, parent_bzr_trees, lookup_object, (child_base_mode, child_mode), store_updater, lookup_file_id, allow_submodules=allow_submodules) elif S_ISGITLINK(child_mode): # submodule if not allow_submodules: raise SubmodulesRequireSubtrees() subinvdelta, grandchildmodes = import_git_submodule(texts, mapping, child_path, name, (child_base_hexsha, child_hexsha), base_bzr_tree, file_id, revision_id, parent_bzr_trees, lookup_object, (child_base_mode, child_mode), store_updater, lookup_file_id) else: if not mapping.is_special_file(name): subinvdelta = 
import_git_blob(texts, mapping, child_path, name, (child_base_hexsha, child_hexsha), base_bzr_tree, file_id, revision_id, parent_bzr_trees, lookup_object, (child_base_mode, child_mode), store_updater, lookup_file_id) else: subinvdelta = [] grandchildmodes = {} child_modes.update(grandchildmodes) invdelta.extend(subinvdelta) if child_mode not in (stat.S_IFDIR, DEFAULT_FILE_MODE, stat.S_IFLNK, DEFAULT_FILE_MODE|0111, S_IFGITLINK): child_modes[child_path] = child_mode # Remove any children that have disappeared if base_tree is not None and type(base_tree) is Tree: invdelta.extend(remove_disappeared_children(base_bzr_tree, old_path, base_tree, existing_children, lookup_object)) store_updater.add_object(tree, (file_id, ), path) return invdelta, child_modes def verify_commit_reconstruction(target_git_object_retriever, lookup_object, o, rev, ret_tree, parent_trees, mapping, unusual_modes, verifiers): new_unusual_modes = mapping.export_unusual_file_modes(rev) if new_unusual_modes != unusual_modes: raise AssertionError("unusual modes don't match: %r != %r" % ( unusual_modes, new_unusual_modes)) # Verify that we can reconstruct the commit properly rec_o = target_git_object_retriever._reconstruct_commit(rev, o.tree, True, verifiers) if rec_o != o: raise AssertionError("Reconstructed commit differs: %r != %r" % ( rec_o, o)) diff = [] new_objs = {} for path, obj, ie in _tree_to_objects(ret_tree, parent_trees, target_git_object_retriever._cache.idmap, unusual_modes, mapping.BZR_DUMMY_FILE): old_obj_id = tree_lookup_path(lookup_object, o.tree, path)[1] new_objs[path] = obj if obj.id != old_obj_id: diff.append((path, lookup_object(old_obj_id), obj)) for (path, old_obj, new_obj) in diff: while (old_obj.type_name == "tree" and new_obj.type_name == "tree" and sorted(old_obj) == sorted(new_obj)): for name in old_obj: if old_obj[name][0] != new_obj[name][0]: raise AssertionError("Modes for %s differ: %o != %o" % (path, old_obj[name][0], new_obj[name][0])) if old_obj[name][1] != new_obj[name][1]: # Found a differing child, delve deeper path = posixpath.join(path, name) old_obj = lookup_object(old_obj[name][1]) new_obj = new_objs[path] break raise AssertionError("objects differ for %s: %r != %r" % (path, old_obj, new_obj)) def ensure_inventories_in_repo(repo, trees): real_inv_vf = repo.inventories.without_fallbacks() for t in trees: revid = t.get_revision_id() if not real_inv_vf.get_parent_map([(revid, )]): repo.add_inventory(revid, t.inventory, t.get_parent_ids()) def import_git_commit(repo, mapping, head, lookup_object, target_git_object_retriever, trees_cache): o = lookup_object(head) # Note that this uses mapping.revision_id_foreign_to_bzr. If the parents # were bzr roundtripped revisions they would be specified in the # roundtrip data. rev, roundtrip_revid, verifiers = mapping.import_commit( o, mapping.revision_id_foreign_to_bzr) if roundtrip_revid is not None: original_revid = rev.revision_id rev.revision_id = roundtrip_revid # We have to do this here, since we have to walk the tree and # we need to make sure to import the blobs / trees with the right # path; this may involve adding them more than once. 
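# --- Illustrative aside (not part of fetch.py) ---
# The import_git_tree/import_git_blob helpers above build an *inventory delta*:
# only paths whose (mode, sha) pair differs from the base tree are recorded, and
# paths missing from the new tree are emitted as removals (see
# remove_disappeared_children). A minimal sketch of that idea, using plain dicts
# as stand-ins for git trees:
def tree_delta_sketch(base_tree, new_tree):
    delta = []
    for path, entry in sorted(new_tree.items()):
        if base_tree.get(path) != entry:
            delta.append((path, entry))   # added or changed
    for path in sorted(base_tree):
        if path not in new_tree:
            delta.append((path, None))    # disappeared
    return delta

base = {"README": (0o100644, "aaa111"), "old.txt": (0o100644, "bbb222")}
new = {"README": (0o100644, "ccc333")}
print(tree_delta_sketch(base, new))   # README changed, old.txt removed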
parent_trees = trees_cache.revision_trees(rev.parent_ids) ensure_inventories_in_repo(repo, parent_trees) if parent_trees == []: base_bzr_tree = trees_cache.revision_tree(NULL_REVISION) base_tree = None base_mode = None else: base_bzr_tree = parent_trees[0] base_tree = lookup_object(o.parents[0]).tree base_mode = stat.S_IFDIR store_updater = target_git_object_retriever._get_updater(rev) tree_supplement = mapping.get_fileid_map(lookup_object, o.tree) inv_delta, unusual_modes = import_git_tree(repo.texts, mapping, "", "", (base_tree, o.tree), base_bzr_tree, None, rev.revision_id, parent_trees, lookup_object, (base_mode, stat.S_IFDIR), store_updater, tree_supplement.lookup_file_id, allow_submodules=getattr(repo._format, "supports_tree_reference", False)) if unusual_modes != {}: for path, mode in unusual_modes.iteritems(): warn_unusual_mode(rev.foreign_revid, path, mode) mapping.import_unusual_file_modes(rev, unusual_modes) try: basis_id = rev.parent_ids[0] except IndexError: basis_id = NULL_REVISION base_bzr_inventory = None else: try: base_bzr_inventory = base_bzr_tree.root_inventory except AttributeError: # bzr < 2.6 base_bzr_inventory = base_bzr_tree.inventory rev.inventory_sha1, inv = repo.add_inventory_by_delta(basis_id, inv_delta, rev.revision_id, rev.parent_ids, base_bzr_inventory) ret_tree = InventoryRevisionTree(repo, inv, rev.revision_id) # Check verifiers if verifiers and roundtrip_revid is not None: testament = StrictTestament3(rev, ret_tree) calculated_verifiers = { "testament3-sha1": testament.as_sha1() } if calculated_verifiers != verifiers: trace.mutter("Testament SHA1 %r for %r did not match %r.", calculated_verifiers["testament3-sha1"], rev.revision_id, verifiers["testament3-sha1"]) rev.revision_id = original_revid rev.inventory_sha1, inv = repo.add_inventory_by_delta(basis_id, inv_delta, rev.revision_id, rev.parent_ids, base_bzr_tree) ret_tree = InventoryRevisionTree(repo, inv, rev.revision_id) else: calculated_verifiers = {} store_updater.add_object(o, calculated_verifiers, None) store_updater.finish() trees_cache.add(ret_tree) repo.add_revision(rev.revision_id, rev) if "verify" in debug.debug_flags: verify_commit_reconstruction(target_git_object_retriever, lookup_object, o, rev, ret_tree, parent_trees, mapping, unusual_modes, verifiers) def import_git_objects(repo, mapping, object_iter, target_git_object_retriever, heads, pb=None, limit=None): """Import a set of git objects into a bzr repository. :param repo: Target Bazaar repository :param mapping: Mapping to use :param object_iter: Iterator over Git objects. 
:return: Tuple with pack hints and last imported revision id """ def lookup_object(sha): try: return object_iter[sha] except KeyError: return target_git_object_retriever[sha] graph = [] checked = set() heads = list(set(heads)) trees_cache = LRUTreeCache(repo) # Find and convert commit objects while heads: if pb is not None: pb.update("finding revisions to fetch", len(graph), None) head = heads.pop() if head == ZERO_SHA: continue assert isinstance(head, str), "head is %r" % (head,) try: o = lookup_object(head) except KeyError: continue if isinstance(o, Commit): rev, roundtrip_revid, verifiers = mapping.import_commit(o, mapping.revision_id_foreign_to_bzr) if (repo.has_revision(rev.revision_id) or (roundtrip_revid and repo.has_revision(roundtrip_revid))): continue graph.append((o.id, o.parents)) heads.extend([p for p in o.parents if p not in checked]) elif isinstance(o, Tag): if o.object[1] not in checked: heads.append(o.object[1]) else: trace.warning("Unable to import head object %r" % o) checked.add(o.id) del checked # Order the revisions # Create the inventory objects batch_size = 1000 revision_ids = topo_sort(graph) pack_hints = [] if limit is not None: revision_ids = revision_ids[:limit] last_imported = None for offset in range(0, len(revision_ids), batch_size): target_git_object_retriever.start_write_group() try: repo.start_write_group() try: for i, head in enumerate( revision_ids[offset:offset+batch_size]): if pb is not None: pb.update("fetching revisions", offset+i, len(revision_ids)) import_git_commit(repo, mapping, head, lookup_object, target_git_object_retriever, trees_cache) last_imported = head except: repo.abort_write_group() raise else: hint = repo.commit_write_group() if hint is not None: pack_hints.extend(hint) except: target_git_object_retriever.abort_write_group() raise else: target_git_object_retriever.commit_write_group() return pack_hints, last_imported class InterFromGitRepository(InterRepository): _matching_repo_format = GitRepositoryFormat() def _target_has_shas(self, shas): raise NotImplementedError(self._target_has_shas) def get_determine_wants_heads(self, wants, include_tags=False): raise NotImplementedError(self.get_determine_wants_heads) def determine_wants_all(self, refs): raise NotImplementedError(self.determine_wants_all) @staticmethod def _get_repo_format_to_test(): return None def copy_content(self, revision_id=None): """See InterRepository.copy_content.""" self.fetch(revision_id, find_ghosts=False) def search_missing_revision_ids(self, find_ghosts=True, revision_ids=None, if_present_ids=None, limit=None): git_shas = [] todo = [] if revision_ids: todo.extend(revision_ids) if if_present_ids: todo.extend(revision_ids) for revid in revision_ids: if revid == NULL_REVISION: continue git_sha, mapping = self.source.lookup_bzr_revision_id(revid) git_shas.append(git_sha) walker = Walker(self.source._git.object_store, include=git_shas, exclude=[sha for sha in self.target.bzrdir.get_refs_container().as_dict().values() if sha != ZERO_SHA]) missing_revids = set() for entry in walker: missing_revids.add(self.source.lookup_foreign_revision_id(entry.commit.id)) return self.source.revision_ids_to_search_result(missing_revids) class InterGitNonGitRepository(InterFromGitRepository): """Base InterRepository that copies revisions from a Git into a non-Git repository.""" def _target_has_shas(self, shas): revids = {} for sha in shas: try: revid = self.source.lookup_foreign_revision_id(sha) except NotCommitError: # Commit is definitely not present continue else: revids[revid] = sha 
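# --- Illustrative aside (not part of fetch.py) ---
# import_git_objects() above collects (sha, parents) pairs into `graph` and
# relies on a topological sort so that every commit is imported after its
# parents. A small self-contained sketch of that ordering step (bzrlib's real
# implementation is tsort.topo_sort; this one is for illustration only):
def topo_sort_sketch(graph):
    parents = dict(graph)
    order = []
    done = set()

    def visit(sha):
        if sha in done:
            return
        done.add(sha)
        for parent in parents.get(sha, []):
            visit(parent)
        order.append(sha)

    for sha, _ in graph:
        visit(sha)
    return order

print(topo_sort_sketch([("c3", ["c2"]), ("c2", ["c1"]), ("c1", [])]))
# parents come first: ['c1', 'c2', 'c3']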
return set([revids[r] for r in self.target.has_revisions(revids)]) def determine_wants_all(self, refs): potential = set() for k, v in refs.as_dict().iteritems(): # For non-git target repositories, only worry about peeled if v == ZERO_SHA: continue potential.add(self.source.bzrdir.get_peeled(k)) return list(potential - self._target_has_shas(potential)) def get_determine_wants_heads(self, wants, include_tags=False): wants = set(wants) def determine_wants(refs): potential = set(wants) if include_tags: for k, unpeeled in refs.as_dict().iteritems(): if not is_tag(k): continue if unpeeled == ZERO_SHA: continue potential.add(self.source.bzrdir.get_peeled(k)) return list(potential - self._target_has_shas(potential)) return determine_wants def get_determine_wants_revids(self, revids, include_tags=False): wants = set() for revid in set(revids): if self.target.has_revision(revid): continue git_sha, mapping = self.source.lookup_bzr_revision_id(revid) wants.add(git_sha) return self.get_determine_wants_heads(wants, include_tags=include_tags) def fetch_objects(self, determine_wants, mapping, limit=None): """Fetch objects from a remote server. :param determine_wants: determine_wants callback :param mapping: BzrGitMapping to use :param limit: Maximum number of commits to import. :return: Tuple with pack hint, last imported revision id and remote refs """ raise NotImplementedError(self.fetch_objects) def fetch(self, revision_id=None, find_ghosts=False, mapping=None, fetch_spec=None): if mapping is None: mapping = self.source.get_mapping() if revision_id is not None: interesting_heads = [revision_id] elif fetch_spec is not None: recipe = fetch_spec.get_recipe() if recipe[0] in ("search", "proxy-search"): interesting_heads = recipe[1] else: raise AssertionError("Unsupported search result type %s" % recipe[0]) else: interesting_heads = None if interesting_heads is not None: determine_wants = self.get_determine_wants_revids( interesting_heads, include_tags=False) else: determine_wants = self.determine_wants_all (pack_hint, _, remote_refs) = self.fetch_objects(determine_wants, mapping) if pack_hint is not None and self.target._format.pack_compresses: self.target.pack(hint=pack_hint) return remote_refs _GIT_PROGRESS_RE = re.compile(r"(.*?): +(\d+)% \((\d+)/(\d+)\)") def report_git_progress(pb, text): text = text.rstrip("\r\n") g = _GIT_PROGRESS_RE.match(text) if g is not None: (text, pct, current, total) = g.groups() pb.update(text, int(current), int(total)) else: pb.update(text, 0, 0) class DetermineWantsRecorder(object): def __init__(self, actual): self.actual = actual self.wants = [] self.remote_refs = {} def __call__(self, refs): self.remote_refs = refs self.wants = self.actual(refs) return self.wants class InterRemoteGitNonGitRepository(InterGitNonGitRepository): """InterRepository that copies revisions from a remote Git into a non-Git repository.""" def get_target_heads(self): # FIXME: This should be more efficient all_revs = self.target.all_revision_ids() parent_map = self.target.get_parent_map(all_revs) all_parents = set() map(all_parents.update, parent_map.itervalues()) return set(all_revs) - all_parents def fetch_objects(self, determine_wants, mapping, limit=None): """See `InterGitNonGitRepository`.""" store = BazaarObjectStore(self.target, mapping) store.lock_write() try: heads = self.get_target_heads() graph_walker = ObjectStoreGraphWalker( [store._lookup_revision_sha1(head) for head in heads], lambda sha: store[sha].parents) wants_recorder = DetermineWantsRecorder(determine_wants) pb = 
ui.ui_factory.nested_progress_bar() try: objects_iter = self.source.fetch_objects( wants_recorder, graph_walker, store.get_raw, progress=lambda text: report_git_progress(pb, text)) trace.mutter("Importing %d new revisions", len(wants_recorder.wants)) (pack_hint, last_rev) = import_git_objects(self.target, mapping, objects_iter, store, wants_recorder.wants, pb, limit) return (pack_hint, last_rev, wants_recorder.remote_refs) finally: pb.finished() finally: store.unlock() @staticmethod def is_compatible(source, target): """Be compatible with GitRepository.""" if not isinstance(source, RemoteGitRepository): return False if not target.supports_rich_root(): return False if isinstance(target, GitRepository): return False if not getattr(target._format, "supports_full_versioned_files", True): return False return True class InterLocalGitNonGitRepository(InterGitNonGitRepository): """InterRepository that copies revisions from a local Git into a non-Git repository.""" def fetch_objects(self, determine_wants, mapping, limit=None): """See `InterGitNonGitRepository`.""" remote_refs = self.source.bzrdir.get_refs_container() wants = determine_wants(remote_refs) create_pb = None pb = ui.ui_factory.nested_progress_bar() target_git_object_retriever = BazaarObjectStore(self.target, mapping) try: target_git_object_retriever.lock_write() try: (pack_hint, last_rev) = import_git_objects(self.target, mapping, self.source._git.object_store, target_git_object_retriever, wants, pb, limit) return (pack_hint, last_rev, remote_refs) finally: target_git_object_retriever.unlock() finally: pb.finished() @staticmethod def is_compatible(source, target): """Be compatible with GitRepository.""" if not isinstance(source, LocalGitRepository): return False if not target.supports_rich_root(): return False if isinstance(target, GitRepository): return False if not getattr(target._format, "supports_full_versioned_files", True): return False return True class InterGitGitRepository(InterFromGitRepository): """InterRepository that copies between Git repositories.""" def fetch_refs(self, update_refs, lossy=False): if lossy: raise errors.LossyPushToSameVCS(self.source, self.target) old_refs = self.target.bzrdir.get_refs_container() ref_changes = {} def determine_wants(heads): old_refs = dict([(k, (v, None)) for (k, v) in heads.as_dict().iteritems()]) new_refs = update_refs(old_refs) ref_changes.update(new_refs) return [sha1 for (sha1, bzr_revid) in new_refs.itervalues()] self.fetch_objects(determine_wants) for k, (git_sha, bzr_revid) in ref_changes.iteritems(): self.target._git.refs[k] = git_sha new_refs = self.target.bzrdir.get_refs_container() return None, old_refs, new_refs def fetch_objects(self, determine_wants, mapping=None, limit=None): graphwalker = self.target._git.get_graph_walker() if (isinstance(self.source, LocalGitRepository) and isinstance(self.target, LocalGitRepository)): def wrap_determine_wants(refs): return determine_wants(self.source._git.refs) pb = ui.ui_factory.nested_progress_bar() try: refs = self.source._git.fetch(self.target._git, wrap_determine_wants, lambda text: report_git_progress(pb, text)) finally: pb.finished() return (None, None, refs) elif (isinstance(self.source, LocalGitRepository) and isinstance(self.target, RemoteGitRepository)): raise NotImplementedError elif (isinstance(self.source, RemoteGitRepository) and isinstance(self.target, LocalGitRepository)): pb = ui.ui_factory.nested_progress_bar() try: f, commit = self.target._git.object_store.add_pack() try: refs = self.source.bzrdir.fetch_pack( 
determine_wants, graphwalker, f.write, lambda text: report_git_progress(pb, text)) commit() return (None, None, refs) except: f.close() raise finally: pb.finished() else: raise AssertionError("fetching between %r and %r not supported" % (self.source, self.target)) def _target_has_shas(self, shas): return set([sha for sha in shas if sha in self.target._git.object_store]) def fetch(self, revision_id=None, find_ghosts=False, mapping=None, fetch_spec=None, branches=None, limit=None): if mapping is None: mapping = self.source.get_mapping() r = self.target._git if revision_id is not None: args = [self.source.lookup_bzr_revision_id(revision_id)[0]] elif fetch_spec is not None: recipe = fetch_spec.get_recipe() if recipe[0] in ("search", "proxy-search"): heads = recipe[1] else: raise AssertionError( "Unsupported search result type %s" % recipe[0]) args = [self.source.lookup_bzr_revision_id(revid)[0] for revid in heads] if branches is not None: determine_wants = lambda x: [x[y] for y in branches if not x[y] in r.object_store and x[y] != ZERO_SHA] elif fetch_spec is None and revision_id is None: determine_wants = self.determine_wants_all else: determine_wants = lambda x: [y for y in args if not y in r.object_store and y != ZERO_SHA] wants_recorder = DetermineWantsRecorder(determine_wants) self.fetch_objects(wants_recorder, mapping) return wants_recorder.remote_refs @staticmethod def is_compatible(source, target): """Be compatible with GitRepository.""" return (isinstance(source, GitRepository) and isinstance(target, GitRepository)) def get_determine_wants_revids(self, revids, include_tags=False): wants = set() for revid in set(revids): if self.target.has_revision(revid): continue git_sha, mapping = self.source.lookup_bzr_revision_id(revid) wants.add(git_sha) return self.get_determine_wants_heads(wants, include_tags=include_tags) def determine_wants_all(self, refs): potential = set([v for v in refs.as_dict().values() if not v == ZERO_SHA]) return list(potential - self._target_has_shas(potential)) def get_determine_wants_heads(self, wants, include_tags=False): wants = set(wants) def determine_wants(refs): potential = set(wants) if include_tags: for k, unpeeled in refs.as_dict().iteritems(): if not is_tag(k): continue if unpeeled == ZERO_SHA: continue potential.add(unpeeled) return list(potential - self._target_has_shas(potential)) return determine_wants bzr-git-0.6.13+bzr1649/filegraph.py0000644000000000000000000000606113165530605014744 0ustar 00000000000000# Copyright (C) 2011 Canonical Ltd # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. 
# # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """File graph access.""" from __future__ import absolute_import from dulwich.errors import ( NotTreeError, ) from dulwich.object_store import ( tree_lookup_path, ) from ...revision import ( NULL_REVISION, ) class GitFileLastChangeScanner(object): def __init__(self, repository): self.repository = repository self.store = self.repository._git.object_store def find_last_change_revision(self, path, commit_id): commit = self.store[commit_id] target_mode, target_sha = tree_lookup_path(self.store.__getitem__, commit.tree, path) while True: parent_commits = [self.store[c] for c in commit.parents] for parent_commit in parent_commits: try: mode, sha = tree_lookup_path(self.store.__getitem__, parent_commit.tree, path) except (NotTreeError, KeyError): continue if mode != target_mode or sha != target_sha: return (path, commit.id) if parent_commits == []: break commit = parent_commits[0] return (path, commit.id) class GitFileParentProvider(object): def __init__(self, change_scanner): self.change_scanner = change_scanner self.store = self.change_scanner.repository._git.object_store def _get_parents(self, file_id, text_revision): commit_id, mapping = self.change_scanner.repository.lookup_bzr_revision_id( text_revision) path = mapping.parse_file_id(file_id) text_parents = [] for commit_parent in self.store[commit_id].parents: (_, text_parent) = self.change_scanner.find_last_change_revision(path, commit_parent) if text_parent not in text_parents: text_parents.append(text_parent) return tuple([(file_id, self.change_scanner.repository.lookup_foreign_revision_id(p)) for p in text_parents]) def get_parent_map(self, keys): ret = {} for key in keys: (file_id, text_revision) = key if text_revision == NULL_REVISION: continue try: ret[key] = self._get_parents(file_id, text_revision) except KeyError: pass return ret bzr-git-0.6.13+bzr1649/git-remote-bzr0000755000000000000000000000260213165530605015223 0ustar 00000000000000#!/usr/bin/env python # vim: expandtab # Copyright (C) 2011 Jelmer Vernooij # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. 
# # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Remote helper for git for accessing bzr repositories.""" import optparse import signal import sys def handle_sigint(signal, frame): sys.exit(0) signal.signal(signal.SIGINT, handle_sigint) import bzrlib bzrlib.initialize() from bzrlib.plugin import load_plugins load_plugins() from bzrlib.plugins.git.git_remote_helper import ( RemoteHelper, open_local_dir, open_remote_dir, ) parser = optparse.OptionParser() (opts, args) = parser.parse_args() (shortname, url) = args helper = RemoteHelper(open_local_dir(), shortname, open_remote_dir(url)) helper.process(sys.stdin, sys.stdout) bzr-git-0.6.13+bzr1649/git-remote-bzr.10000644000000000000000000000204113165530605015354 0ustar 00000000000000.TH "GIT\-REMOTE\-BZR" "1" "12/17/2011" "bzr-git 0\&.6\&.6" "Git Manual" .ie \n(.g .ds Aq \(aq .el .ds Aq ' .\" disable hyphenation .nh .\" disable justification (adjust text to left margin only) .ad l .SH "NAME" git-remote-bzr \- Git remote support for Bazaar repositories .SH "SYNOPSIS" .sp .nf git clone bzr:: [] git fetch bzr:: [] .fi .sp .SH "DESCRIPTION" .sp This command provides support for using \fIbzr\fR repositories as Git remotes, through the bzr-git plugin. At the moment it supports cloning from, fetching from and pushing into Bazaar repositories. Fetch support is still experimental, and may be slow. .SH "BUGS" .sp Please report bugs at \fUhttps://launchpad.net/bzr-git/+filebug\fR .SH "LICENSE" bzr-git and git-remote-bzr are licensed under the GNU GPL, version 2 or later. .SH "SEE ALSO" .sp \fBgit-remote-helpers\fR(1), \fBbzr\fR(1) .SH "BAZAAR" .sp Part of the \fBbzr\fR(1) suite .SH "AUTHOR" .sp bzr-git, git-remote-bzr and this manual page were written by Jelmer Vernooij. bzr-git-0.6.13+bzr1649/git_remote_helper.py0000644000000000000000000001500613165530605016477 0ustar 00000000000000#!/usr/bin/env python # vim: expandtab # Copyright (C) 2011 Jelmer Vernooij # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Remote helper for git for accessing bzr repositories.""" from __future__ import absolute_import CAPABILITIES = ["fetch", "option", "push"] import os from ...controldir import ControlDir from ...errors import NotBranchError, NoRepositoryPresent from ...repository import InterRepository from ...transport import get_transport_from_path from . 
import ( LocalGitProber, ) from .dir import ( BareLocalGitControlDirFormat, LocalGitControlDirFormat, ) from .object_store import ( get_object_store, ) from .refs import ( get_refs_container, ref_to_branch_name, ) from .repository import ( GitRepository, ) try: from ..fastimport import exporter as fastexporter except ImportError: fastexporter = None else: CAPABILITIES.append("import") def open_remote_dir(url): try: return ControlDir.open(url) except NotBranchError: return ControlDir.create(url) def fetch(outf, wants, shortname, remote_dir, local_dir): remote_repo = remote_dir.find_repository() local_repo = local_dir.find_repository() inter = InterRepository.get(remote_repo, local_repo) revs = [] for (sha1, ref) in wants: revs.append((sha1, None)) if (isinstance(remote_repo, GitRepository) and isinstance(local_repo, GitRepository)): lossy = False else: lossy = True inter.fetch_objects(revs, lossy=lossy) outf.write("\n") def push(outf, wants, shortname, remote_dir, local_dir): for (src_ref, dest_ref) in wants: local_branch = local_dir.open_branch(ref=src_ref) dest_branch_name = ref_to_branch_name(dest_ref) if dest_branch_name == "master": dest_branch_name = None try: remote_branch = remote_dir.open_branch(name=dest_branch_name) except NotBranchError: remote_branch = remote_dir.create_branch(name=dest_branch_name) local_branch.push(remote_branch) outf.write("ok %s\n" % dest_ref) outf.write("\n") class RemoteHelper(object): """Git remote helper.""" def __init__(self, local_dir, shortname, remote_dir): self.local_dir = local_dir self.shortname = shortname self.remote_dir = remote_dir self.batchcmd = None self.wants = [] def cmd_capabilities(self, outf, argv): outf.write("\n".join(CAPABILITIES)+"\n\n") def cmd_list(self, outf, argv): try: repo = self.remote_dir.find_repository() except NoRepositoryPresent: repo = self.remote_dir.create_repository() object_store = get_object_store(repo) object_store.lock_read() try: refs = get_refs_container(self.remote_dir, object_store) for ref, git_sha1 in refs.as_dict().iteritems(): ref = ref.replace("~", "_") outf.write("%s %s\n" % (git_sha1, ref)) outf.write("\n") finally: object_store.unlock() def cmd_option(self, outf, argv): outf.write("unsupported\n") def cmd_fetch(self, outf, argv): if self.batchcmd not in (None, "fetch"): raise Exception("fetch command inside other batch command") self.wants.append(tuple(argv[1:])) self.batchcmd = "fetch" def cmd_push(self, outf, argv): if self.batchcmd not in (None, "push"): raise Exception("push command inside other batch command") self.wants.append(tuple(argv[1].split(":", 1))) self.batchcmd = "push" def cmd_import(self, outf, argv): if fastexporter is None: raise Exception("install bzr-fastimport for 'import' command support") dest_branch_name = ref_to_branch_name(argv[1]) if dest_branch_name == "master": dest_branch_name = None remote_branch = self.remote_dir.open_branch(name=dest_branch_name) exporter = fastexporter.BzrFastExporter(remote_branch, outf=outf, ref=argv[1], checkpoint=None, import_marks_file=None, export_marks_file=None, revision=None, verbose=None, plain_format=True, rewrite_tags=False) exporter.run() commands = { "capabilities": cmd_capabilities, "list": cmd_list, "option": cmd_option, "fetch": cmd_fetch, "push": cmd_push, "import": cmd_import, } def process(self, inf, outf): while True: l = inf.readline() if not l: break self.process_line(l, outf) def process_line(self, l, outf): argv = l.strip().split() if argv == []: if self.batchcmd == "fetch": fetch(outf, self.wants, self.shortname, 
self.remote_dir, self.local_dir) elif self.batchcmd == "push": push(outf, self.wants, self.shortname, self.remote_dir, self.local_dir) elif self.batchcmd is None: return else: raise AssertionError("invalid batch %r" % self.batchcmd) self.batchcmd = None else: try: self.commands[argv[0]](self, outf, argv) except KeyError: raise Exception("Unknown remote command %r" % argv) outf.flush() def open_local_dir(): try: git_path = os.environ["GIT_DIR"] except KeyError: git_transport = get_transport_from_path(".") git_format = LocalGitProber().probe_transport(git_transport) else: if git_path.endswith("/.git"): git_format = LocalGitControlDirFormat() git_path = git_path[:-4] else: git_format = BareLocalGitControlDirFormat() git_transport = get_transport_from_path(git_path) return git_format.open(git_transport) bzr-git-0.6.13+bzr1649/help.py0000644000000000000000000000257213165530605013736 0ustar 00000000000000# Copyright (C) 2010 Jelmer Vernooij # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # You should have received a copy of the GNU General Public License # along with this program. If not, see . """Help information.""" from __future__ import absolute_import help_git = """Using Bazaar with Git. The bzr-git plugin provides support for using Bazaar with local and remote Git repositories, as just another format. You can clone, pull from and push to git repositories as you would with any native Bazaar branch. The bzr-git plugin also adds three new bzr subcommands: * bzr git-objects: Extracts Git objects out of a Bazaar repository * bzr git-refs: Display Git refs from a Bazaar branch or repository * bzr git-import: Imports a local or remote Git repository to a set of Bazaar branches The 'git:' revision specifier can be used to find revisions by short or long GIT SHA1. """ bzr-git-0.6.13+bzr1649/hg.py0000644000000000000000000000540613165530605013403 0ustar 00000000000000# Copyright (C) 2009 Scott Chacon # Copyright (C) 2009 Jelmer Vernooij # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Compatibility for hg-git.""" from __future__ import absolute_import import urllib def format_hg_metadata(renames, branch, extra): """Construct a tail with hg-git metadata. 
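    Illustrative example (branch and file names made up, not from upstream
    documentation): for branch 'foo' and a single rename of old.txt to
    new.txt, the returned tail is:

        --HG--
        branch : foo
        rename : old.txt => new.txt
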
:param renames: List of (oldpath, newpath) tuples with file renames :param branch: Branch name :param extra: Dictionary with extra data :return: Tail for commit message """ extra_message = '' if branch != 'default': extra_message += "branch : " + branch + "\n" if renames: for oldfile, newfile in renames: extra_message += "rename : " + oldfile + " => " + newfile + "\n" for key, value in extra.iteritems(): if key in ('author', 'committer', 'encoding', 'message', 'branch', 'hg-git'): continue else: extra_message += "extra : " + key + " : " + urllib.quote(value) + "\n" if extra_message: return "\n--HG--\n" + extra_message else: return "" def extract_hg_metadata(message): """Extract Mercurial metadata from a commit message. :param message: Commit message to extract from :return: Tuple with original commit message, renames, branch and extra data. """ split = message.split("\n--HG--\n", 1) renames = {} extra = {} branch = None if len(split) == 2: message, meta = split lines = meta.split("\n") for line in lines: if line == '': continue command, data = line.split(" : ", 1) if command == 'rename': before, after = data.split(" => ", 1) renames[after] = before elif command == 'branch': branch = data elif command == 'extra': before, after = data.split(" : ", 1) extra[before] = urllib.unquote(after) else: raise KeyError("unknown hg-git metadata command %s" % command) return (message, renames, branch, extra) bzr-git-0.6.13+bzr1649/info.py0000644000000000000000000000111613165530605013732 0ustar 00000000000000from __future__ import absolute_import bzr_plugin_name = "git" dulwich_minimum_version = (0, 18, 3) # versions ending in 'exp' mean experimental mappings # versions ending in 'dev' mean development version # versions ending in 'final' mean release (well tested, etc) bzr_plugin_version = (0, 6, 12, 'final', 0) bzr_commands = ["git-import", "git-object", "git-refs", "git-apply"] bzr_compatible_versions = [(2, x, 0) for x in [5, 6, 7]] bzr_minimum_version = bzr_compatible_versions[0] bzr_maximum_version = bzr_compatible_versions[-1] bzr_control_formats = {"Git":{'.git/': None}} bzr-git-0.6.13+bzr1649/mapping.py0000644000000000000000000005521313165530605014441 0ustar 00000000000000# Copyright (C) 2007 Canonical Ltd # Copyright (C) 2008-2010 Jelmer Vernooij # Copyright (C) 2008 John Carr # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Converters, etc for going between Bazaar and Git ids.""" from __future__ import absolute_import import base64 import stat from ... 
import ( bencode, errors, foreign, trace, ) from ...inventory import ( ROOT_ID, ) from ...foreign import ( ForeignVcs, VcsMappingRegistry, ForeignRevision, ) from ...revision import ( NULL_REVISION, ) from .errors import ( NoPushSupport, UnknownCommitExtra, UnknownMercurialCommitExtra, ) from .hg import ( format_hg_metadata, extract_hg_metadata, ) from .roundtrip import ( extract_bzr_metadata, inject_bzr_metadata, CommitSupplement, deserialize_fileid_map, serialize_fileid_map, ) DEFAULT_FILE_MODE = stat.S_IFREG | 0644 HG_RENAME_SOURCE = "HG:rename-source" HG_EXTRA = "HG:extra" # This HG extra is used to indicate the commit that this commit was based on. HG_EXTRA_AMEND_SOURCE = "amend_source" def escape_file_id(file_id): return file_id.replace('_', '__').replace(' ', '_s').replace('\x0c', '_c') def unescape_file_id(file_id): ret = [] i = 0 while i < len(file_id): if file_id[i] != '_': ret.append(file_id[i]) else: if file_id[i+1] == '_': ret.append("_") elif file_id[i+1] == 's': ret.append(" ") elif file_id[i+1] == 'c': ret.append("\x0c") else: raise AssertionError("unknown escape character %s" % file_id[i+1]) i += 1 i += 1 return "".join(ret) def fix_person_identifier(text): if "<" in text and ">" in text: if not " <" in text and text.count("<") == 1: text = text.replace("<", " <") return text return "%s <%s>" % (text, text) def warn_escaped(commit, num_escaped): trace.warning("Escaped %d XML-invalid characters in %s. Will be unable " "to regenerate the SHA map.", num_escaped, commit) def warn_unusual_mode(commit, path, mode): trace.mutter("Unusual file mode %o for %s in %s. Storing as revision " "property. ", mode, path, commit) class BzrGitMapping(foreign.VcsMapping): """Class that maps between Git and Bazaar semantics.""" experimental = False BZR_FILE_IDS_FILE = None BZR_DUMMY_FILE = None def is_special_file(self, filename): return (filename in (self.BZR_FILE_IDS_FILE, self.BZR_DUMMY_FILE)) def __init__(self): super(BzrGitMapping, self).__init__(foreign_vcs_git) def __eq__(self, other): return (type(self) == type(other) and self.revid_prefix == other.revid_prefix) @classmethod def revision_id_foreign_to_bzr(cls, git_rev_id): """Convert a git revision id handle to a Bazaar revision id.""" from dulwich.protocol import ZERO_SHA if git_rev_id == ZERO_SHA: return NULL_REVISION return "%s:%s" % (cls.revid_prefix, git_rev_id) @classmethod def revision_id_bzr_to_foreign(cls, bzr_rev_id): """Convert a Bazaar revision id to a git revision id handle.""" if not bzr_rev_id.startswith("%s:" % cls.revid_prefix): raise errors.InvalidRevisionId(bzr_rev_id, cls) return bzr_rev_id[len(cls.revid_prefix)+1:], cls() def generate_file_id(self, path): # Git paths are just bytestrings # We must just hope they are valid UTF-8.. 
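        # Illustrative examples of the escaping performed by escape_file_id()
        # above (explanatory comments, not upstream code):
        #   "foo bar"   -> "foo_sbar"    (space encoded as "_s")
        #   "some_file" -> "some__file"  (underscore doubled)
        #   ""          -> ROOT_ID       (handled by the check just below)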
if path == "": return ROOT_ID if type(path) is unicode: path = path.encode("utf-8") return escape_file_id(path) def is_control_file(self, path): return path in (self.BZR_FILE_IDS_FILE, self.BZR_DUMMY_FILE) def parse_file_id(self, file_id): if file_id == ROOT_ID: return "" return unescape_file_id(file_id) def revid_as_refname(self, revid): import urllib return "refs/bzr/%s" % urllib.quote(revid) def import_unusual_file_modes(self, rev, unusual_file_modes): if unusual_file_modes: ret = [(path, unusual_file_modes[path]) for path in sorted(unusual_file_modes.keys())] rev.properties['file-modes'] = bencode.bencode(ret) def export_unusual_file_modes(self, rev): try: file_modes = rev.properties['file-modes'] except KeyError: return {} else: return dict([(self.generate_file_id(path), mode) for (path, mode) in bencode.bdecode(file_modes.encode("utf-8"))]) def _generate_git_svn_metadata(self, rev, encoding): try: git_svn_id = rev.properties["git-svn-id"] except KeyError: return "" else: return "\ngit-svn-id: %s\n" % git_svn_id.encode(encoding) def _generate_hg_message_tail(self, rev): extra = {} renames = [] branch = 'default' for name in rev.properties: if name == 'hg:extra:branch': branch = rev.properties['hg:extra:branch'] elif name.startswith('hg:extra'): extra[name[len('hg:extra:'):]] = base64.b64decode( rev.properties[name]) elif name == 'hg:renames': renames = bencode.bdecode(base64.b64decode( rev.properties['hg:renames'])) # TODO: Export other properties as 'bzr:' extras? ret = format_hg_metadata(renames, branch, extra) assert isinstance(ret, str) return ret def _extract_git_svn_metadata(self, rev, message): lines = message.split("\n") if not (lines[-1] == "" and len(lines) >= 2 and lines[-2].startswith("git-svn-id:")): return message git_svn_id = lines[-2].split(": ", 1)[1] rev.properties['git-svn-id'] = git_svn_id (url, rev, uuid) = parse_git_svn_id(git_svn_id) # FIXME: Convert this to converted-from property somehow.. ret = "\n".join(lines[:-2]) assert isinstance(ret, str) return ret def _extract_hg_metadata(self, rev, message): (message, renames, branch, extra) = extract_hg_metadata(message) if branch is not None: rev.properties['hg:extra:branch'] = branch for name, value in extra.iteritems(): rev.properties['hg:extra:' + name] = base64.b64encode(value) if renames: rev.properties['hg:renames'] = base64.b64encode(bencode.bencode( [(new, old) for (old, new) in renames.iteritems()])) return message def _extract_bzr_metadata(self, rev, message): (message, metadata) = extract_bzr_metadata(message) return message, metadata def _decode_commit_message(self, rev, message, encoding): return message.decode(encoding), CommitSupplement() def _encode_commit_message(self, rev, message, encoding): return message.encode(encoding) def export_fileid_map(self, fileid_map): """Export a file id map to a fileid map. :param fileid_map: File id map, mapping paths to file ids :return: A Git blob object """ from dulwich.objects import Blob b = Blob() b.set_raw_chunks(serialize_fileid_map(fileid_map)) return b def export_commit(self, rev, tree_sha, parent_lookup, lossy, verifiers): """Turn a Bazaar revision in to a Git commit :param tree_sha: Tree sha for the commit :param parent_lookup: Function for looking up the GIT sha equiv of a bzr revision :param lossy: Whether to store roundtripping information. 
:param verifiers: Verifiers info :return dulwich.objects.Commit represent the revision: """ from dulwich.objects import Commit commit = Commit() commit.tree = tree_sha if not lossy: metadata = CommitSupplement() metadata.verifiers = verifiers else: metadata = None parents = [] for p in rev.parent_ids: try: git_p = parent_lookup(p) except KeyError: git_p = None if metadata is not None: metadata.explicit_parent_ids = rev.parent_ids if git_p is not None: assert len(git_p) == 40, "unexpected length for %r" % git_p parents.append(git_p) commit.parents = parents try: encoding = rev.properties['git-explicit-encoding'] except KeyError: encoding = rev.properties.get('git-implicit-encoding', 'utf-8') commit.encoding = rev.properties.get('git-explicit-encoding') commit.committer = fix_person_identifier(rev.committer.encode( encoding)) commit.author = fix_person_identifier( rev.get_apparent_authors()[0].encode(encoding)) commit.commit_time = long(rev.timestamp) if 'author-timestamp' in rev.properties: commit.author_time = long(rev.properties['author-timestamp']) else: commit.author_time = commit.commit_time commit._commit_timezone_neg_utc = "commit-timezone-neg-utc" in rev.properties commit.commit_timezone = rev.timezone commit._author_timezone_neg_utc = "author-timezone-neg-utc" in rev.properties if 'author-timezone' in rev.properties: commit.author_timezone = int(rev.properties['author-timezone']) else: commit.author_timezone = commit.commit_timezone commit.message = self._encode_commit_message(rev, rev.message, encoding) assert type(commit.message) == str if metadata is not None: try: mapping_registry.parse_revision_id(rev.revision_id) except errors.InvalidRevisionId: metadata.revision_id = rev.revision_id mapping_properties = set( ['author', 'author-timezone', 'author-timezone-neg-utc', 'commit-timezone-neg-utc', 'git-implicit-encoding', 'git-explicit-encoding', 'author-timestamp', 'file-modes']) for k, v in rev.properties.iteritems(): if not k in mapping_properties: metadata.properties[k] = v if not lossy: if self.roundtripping: commit.message = inject_bzr_metadata(commit.message, metadata, encoding) else: raise NoPushSupport() assert type(commit.message) == str if 'git-extra' in rev.properties: commit.extra.extend([l.split(' ', 1) for l in rev.properties['git-extra'].splitlines()]) return commit def import_fileid_map(self, blob): """Convert a git file id map blob. :param blob: Git blob object with fileid map :return: Dictionary mapping paths to file ids """ return deserialize_fileid_map(blob.data) def import_commit(self, commit, lookup_parent_revid): """Convert a git commit to a bzr revision. 
:return: a `bzrlib.revision.Revision` object, foreign revid and a testament sha1 """ if commit is None: raise AssertionError("Commit object can't be None") rev = ForeignRevision(commit.id, self, self.revision_id_foreign_to_bzr(commit.id)) rev.git_metadata = None def decode_using_encoding(rev, commit, encoding): rev.committer = str(commit.committer).decode(encoding) if commit.committer != commit.author: rev.properties['author'] = str(commit.author).decode(encoding) rev.message, rev.git_metadata = self._decode_commit_message( rev, commit.message, encoding) if commit.encoding is not None: rev.properties['git-explicit-encoding'] = commit.encoding decode_using_encoding(rev, commit, commit.encoding) else: for encoding in ('utf-8', 'latin1'): try: decode_using_encoding(rev, commit, encoding) except UnicodeDecodeError: pass else: if encoding != 'utf-8': rev.properties['git-implicit-encoding'] = encoding break if commit.commit_time != commit.author_time: rev.properties['author-timestamp'] = str(commit.author_time) if commit.commit_timezone != commit.author_timezone: rev.properties['author-timezone'] = "%d" % commit.author_timezone if commit._author_timezone_neg_utc: rev.properties['author-timezone-neg-utc'] = "" if commit._commit_timezone_neg_utc: rev.properties['commit-timezone-neg-utc'] = "" rev.timestamp = commit.commit_time rev.timezone = commit.commit_timezone rev.parent_ids = None if rev.git_metadata is not None: md = rev.git_metadata roundtrip_revid = md.revision_id if md.explicit_parent_ids: rev.parent_ids = md.explicit_parent_ids rev.properties.update(md.properties) verifiers = md.verifiers else: roundtrip_revid = None verifiers = {} if rev.parent_ids is None: rev.parent_ids = tuple([lookup_parent_revid(p) for p in commit.parents]) unknown_extra_fields = [] extra_lines = [] for k, v in commit.extra: if k == HG_RENAME_SOURCE: extra_lines.append(k + ' ' + v + '\n') elif k == HG_EXTRA: hgk, hgv = v.split(':', 1) if hgk not in (HG_EXTRA_AMEND_SOURCE, ): raise UnknownMercurialCommitExtra(commit, hgk) extra_lines.append(k + ' ' + v + '\n') else: unknown_extra_fields.append(k) if unknown_extra_fields: raise UnknownCommitExtra(commit, unknown_extra_fields) if extra_lines: rev.properties['git-extra'] = ''.join(extra_lines) return rev, roundtrip_revid, verifiers def get_fileid_map(self, lookup_object, tree_sha): """Obtain a fileid map for a particular tree. 
:param lookup_object: Function for looking up an object :param tree_sha: SHA of the root tree :return: GitFileIdMap instance """ try: file_id_map_sha = lookup_object(tree_sha)[self.BZR_FILE_IDS_FILE][1] except KeyError: file_ids = {} else: file_ids = self.import_fileid_map(lookup_object(file_id_map_sha)) return GitFileIdMap(file_ids, self) class BzrGitMappingv1(BzrGitMapping): revid_prefix = 'git-v1' experimental = False def __str__(self): return self.revid_prefix class BzrGitMappingExperimental(BzrGitMappingv1): revid_prefix = 'git-experimental' experimental = True roundtripping = True BZR_FILE_IDS_FILE = '.bzrfileids' BZR_DUMMY_FILE = '.bzrdummy' def _decode_commit_message(self, rev, message, encoding): message = self._extract_hg_metadata(rev, message) message = self._extract_git_svn_metadata(rev, message) message, metadata = self._extract_bzr_metadata(rev, message) return message.decode(encoding), metadata def _encode_commit_message(self, rev, message, encoding): ret = message.encode(encoding) ret += self._generate_hg_message_tail(rev) ret += self._generate_git_svn_metadata(rev, encoding) return ret def import_commit(self, commit, lookup_parent_revid): rev, roundtrip_revid, verifiers = super(BzrGitMappingExperimental, self).import_commit(commit, lookup_parent_revid) rev.properties['converted_revision'] = "git %s\n" % commit.id return rev, roundtrip_revid, verifiers class GitMappingRegistry(VcsMappingRegistry): """Registry with available git mappings.""" def revision_id_bzr_to_foreign(self, bzr_revid): if bzr_revid == NULL_REVISION: from dulwich.protocol import ZERO_SHA return ZERO_SHA, None if not bzr_revid.startswith("git-"): raise errors.InvalidRevisionId(bzr_revid, None) (mapping_version, git_sha) = bzr_revid.split(":", 1) mapping = self.get(mapping_version) return mapping.revision_id_bzr_to_foreign(bzr_revid) parse_revision_id = revision_id_bzr_to_foreign mapping_registry = GitMappingRegistry() mapping_registry.register_lazy('git-v1', "bzrlib.plugins.git.mapping", "BzrGitMappingv1") mapping_registry.register_lazy('git-experimental', "bzrlib.plugins.git.mapping", "BzrGitMappingExperimental") # Uncomment the next line to enable the experimental bzr-git mappings. # This will make sure all bzr metadata is pushed into git, allowing for # full roundtripping later. # NOTE: THIS IS EXPERIMENTAL. IT MAY EAT YOUR DATA OR CORRUPT # YOUR BZR OR GIT REPOSITORIES. USE WITH CARE. 
#mapping_registry.set_default('git-experimental') mapping_registry.set_default('git-v1') class ForeignGit(ForeignVcs): """The Git Stupid Content Tracker""" @property def branch_format(self): from .branch import GitBranchFormat return GitBranchFormat() @property def repository_format(self): from .repository import GitRepositoryFormat return GitRepositoryFormat() def __init__(self): super(ForeignGit, self).__init__(mapping_registry) self.abbreviation = "git" @classmethod def serialize_foreign_revid(self, foreign_revid): return foreign_revid @classmethod def show_foreign_revid(cls, foreign_revid): return { "git commit": foreign_revid } foreign_vcs_git = ForeignGit() default_mapping = mapping_registry.get_default()() def symlink_to_blob(symlink_target): from dulwich.objects import Blob blob = Blob() if type(symlink_target) == unicode: symlink_target = symlink_target.encode('utf-8') blob.data = symlink_target return blob def mode_is_executable(mode): """Check if mode should be considered executable.""" return bool(mode & 0111) def mode_kind(mode): """Determine the Bazaar inventory kind based on Unix file mode.""" if mode is None: return None entry_kind = (mode & 0700000) / 0100000 if entry_kind == 0: return 'directory' elif entry_kind == 1: file_kind = (mode & 070000) / 010000 if file_kind == 0: return 'file' elif file_kind == 2: return 'symlink' elif file_kind == 6: return 'tree-reference' else: raise AssertionError( "Unknown file kind %d, perms=%o." % (file_kind, mode,)) else: raise AssertionError( "Unknown kind, perms=%r." % (mode,)) def object_mode(kind, executable): if kind == 'directory': return stat.S_IFDIR elif kind == 'symlink': mode = stat.S_IFLNK if executable: mode |= 0111 return mode elif kind == 'file': mode = stat.S_IFREG | 0644 if executable: mode |= 0111 return mode elif kind == 'tree-reference': from dulwich.objects import S_IFGITLINK return S_IFGITLINK else: raise AssertionError def entry_mode(entry): """Determine the git file mode for an inventory entry.""" return object_mode(entry.kind, entry.executable) def directory_to_tree(children, lookup_ie_sha1, unusual_modes, empty_file_name, allow_empty=False): """Create a Git Tree object from a Bazaar directory. :param children: Children inventory entries :param lookup_ie_sha1: Lookup the Git SHA1 for a inventory entry :param unusual_modes: Dictionary with unusual file modes by file ids :param empty_file_name: Name to use for dummy files in empty directories, None to ignore empty directories. 
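    :param allow_empty: If True an empty Tree may be returned (only the tree
        root is allowed to be empty); otherwise an empty directory yields the
        dummy file, or None when empty_file_name is None. (Parameter
        description inferred from the implementation below, not upstream.)

    Illustrative example: with empty_file_name='.bzrdummy', an otherwise
    empty directory becomes a Tree whose only entry is a zero-length
    '.bzrdummy' blob with mode S_IFREG | 0644.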
""" from dulwich.objects import Blob, Tree tree = Tree() for name, value in children.iteritems(): ie = children[name] try: mode = unusual_modes[ie.file_id] except KeyError: mode = entry_mode(ie) hexsha = lookup_ie_sha1(ie) if hexsha is not None: tree.add(name.encode("utf-8"), mode, hexsha) if not allow_empty and len(tree) == 0: # Only the root can be an empty tree if empty_file_name is not None: tree.add(empty_file_name, stat.S_IFREG | 0644, Blob().id) else: return None return tree def extract_unusual_modes(rev): try: foreign_revid, mapping = mapping_registry.parse_revision_id( rev.revision_id) except errors.InvalidRevisionId: return {} else: return mapping.export_unusual_file_modes(rev) def parse_git_svn_id(text): (head, uuid) = text.rsplit(" ", 1) (full_url, rev) = head.rsplit("@", 1) return (full_url, int(rev), uuid) class GitFileIdMap(object): def __init__(self, file_ids, mapping): self.file_ids = file_ids self.paths = None self.mapping = mapping def all_file_ids(self): return self.file_ids.values() def set_file_id(self, path, file_id): assert type(path) is str assert type(file_id) is str self.file_ids[path] = file_id def lookup_file_id(self, path): assert type(path) is str try: file_id = self.file_ids[path] except KeyError: file_id = self.mapping.generate_file_id(path) assert type(file_id) is str return file_id def lookup_path(self, file_id): if self.paths is None: self.paths = {} for k, v in self.file_ids.iteritems(): self.paths[v] = k try: path = self.paths[file_id] except KeyError: return self.mapping.parse_file_id(file_id) else: assert type(path) is str return path def copy(self): return self.__class__(dict(self.file_ids), self.mapping) bzr-git-0.6.13+bzr1649/notes/0000755000000000000000000000000013165530605013556 5ustar 00000000000000bzr-git-0.6.13+bzr1649/object_store.py0000644000000000000000000007103113165530605015464 0ustar 00000000000000# Copyright (C) 2009-2012 Jelmer Vernooij # Copyright (C) 2012 Canonical Ltd # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Map from Git sha's to Bazaar objects.""" from __future__ import absolute_import from dulwich.objects import ( Blob, Commit, Tree, sha_to_hex, ZERO_SHA, ) from dulwich.object_store import ( BaseObjectStore, ) from ... 
import ( errors, lru_cache, trace, ui, urlutils, ) from ...lock import LogicalLockResult from ...revision import ( NULL_REVISION, ) from ...testament import( StrictTestament3, ) from .cache import ( from_repository as cache_from_repository, ) from .mapping import ( default_mapping, directory_to_tree, extract_unusual_modes, mapping_registry, symlink_to_blob, ) from .unpeel_map import ( UnpeelMap, ) import posixpath import stat def get_object_store(repo, mapping=None): git = getattr(repo, "_git", None) if git is not None: git.object_store.unlock = lambda: None git.object_store.lock_read = lambda: LogicalLockResult(lambda: None) git.object_store.lock_write = lambda: LogicalLockResult(lambda: None) return git.object_store return BazaarObjectStore(repo, mapping) MAX_TREE_CACHE_SIZE = 50 * 1024 * 1024 class LRUTreeCache(object): def __init__(self, repository): def approx_tree_size(tree): # Very rough estimate, 250 per inventory entry try: inv = tree.root_inventory except AttributeError: inv = tree.inventory return len(inv) * 250 self.repository = repository self._cache = lru_cache.LRUSizeCache(max_size=MAX_TREE_CACHE_SIZE, after_cleanup_size=None, compute_size=approx_tree_size) def revision_tree(self, revid): try: tree = self._cache[revid] except KeyError: tree = self.repository.revision_tree(revid) self.add(tree) return tree def iter_revision_trees(self, revids): trees = {} todo = [] for revid in revids: try: tree = self._cache[revid] except KeyError: todo.append(revid) else: assert tree.get_revision_id() == revid trees[revid] = tree for tree in self.repository.revision_trees(todo): trees[tree.get_revision_id()] = tree self.add(tree) return (trees[r] for r in revids) def revision_trees(self, revids): return list(self.iter_revision_trees(revids)) def add(self, tree): self._cache[tree.get_revision_id()] = tree def _find_missing_bzr_revids(graph, want, have): """Find the revisions that have to be pushed. :param get_parent_map: Function that returns the parents for a sequence of revisions. :param want: Revisions the target wants :param have: Revisions the target already has :return: Set of revisions to fetch """ handled = set(have) todo = set() for rev in want: extra_todo = graph.find_unique_ancestors(rev, handled) todo.update(extra_todo) handled.update(extra_todo) if NULL_REVISION in todo: todo.remove(NULL_REVISION) return todo def _check_expected_sha(expected_sha, object): """Check whether an object matches an expected SHA. :param expected_sha: None or expected SHA as either binary or as hex digest :param object: Object to verify """ if expected_sha is None: return if len(expected_sha) == 40: if expected_sha != object.sha().hexdigest(): raise AssertionError("Invalid sha for %r: %s" % (object, expected_sha)) elif len(expected_sha) == 20: if expected_sha != object.sha().digest(): raise AssertionError("Invalid sha for %r: %s" % (object, sha_to_hex(expected_sha))) else: raise AssertionError("Unknown length %d for %r" % (len(expected_sha), expected_sha)) def _tree_to_objects(tree, parent_trees, idmap, unusual_modes, dummy_file_name=None): """Iterate over the objects that were introduced in a revision. :param idmap: id map :param parent_trees: Parent revision trees :param unusual_modes: Unusual file modes dictionary :param dummy_file_name: File name to use for dummy files in empty directories. 
None to skip empty directories :return: Yields (path, object, ie) entries """ dirty_dirs = set() new_blobs = [] shamap = {} try: base_tree = parent_trees[0] other_parent_trees = parent_trees[1:] except IndexError: base_tree = tree._repository.revision_tree(NULL_REVISION) other_parent_trees = [] def find_unchanged_parent_ie(file_id, kind, other, parent_trees): for ptree in parent_trees: try: pkind = ptree.kind(file_id) except errors.NoSuchId: pass else: if kind == "file": if (pkind == "file" and ptree.get_file_sha1(file_id) == other): return (file_id, ptree.get_file_revision(file_id)) if kind == "symlink": if (pkind == "symlink" and ptree.get_symlink_target(file_id) == other): return (file_id, ptree.get_file_revision(file_id)) raise KeyError # Find all the changed blobs for (file_id, path, changed_content, versioned, parent, name, kind, executable) in tree.iter_changes(base_tree): if kind[1] == "file": if changed_content: try: (pfile_id, prevision) = find_unchanged_parent_ie(file_id, kind[1], tree.get_file_sha1(file_id), other_parent_trees) except KeyError: pass else: try: shamap[file_id] = idmap.lookup_blob_id( pfile_id, prevision) except KeyError: # no-change merge ? blob = Blob() blob.data = tree.get_file_text(file_id) shamap[file_id] = blob.id if not file_id in shamap: new_blobs.append((path[1], file_id)) elif kind[1] == "symlink": if changed_content: target = tree.get_symlink_target(file_id) blob = symlink_to_blob(target) shamap[file_id] = blob.id try: find_unchanged_parent_ie(file_id, kind[1], target, other_parent_trees) except KeyError: yield path[1], blob, (file_id, tree.get_file_revision(file_id, path[1])) elif kind[1] not in (None, "directory"): raise AssertionError(kind[1]) for p in parent: if p and tree.has_id(p) and tree.kind(p) == "directory": dirty_dirs.add(p) # Fetch contents of the blobs that were changed for (path, file_id), chunks in tree.iter_files_bytes( [(file_id, (path, file_id)) for (path, file_id) in new_blobs]): obj = Blob() obj.chunked = chunks yield path, obj, (file_id, tree.get_file_revision(file_id, path)) shamap[file_id] = obj.id for path in unusual_modes: parent_path = posixpath.dirname(path) file_id = tree.path2id(parent_path) assert file_id is not None, "Unable to find file id for %r" % parent_path dirty_dirs.add(file_id) try: inv = tree.root_inventory except AttributeError: inv = tree.inventory trees = {} while dirty_dirs: new_dirs = set() for file_id in dirty_dirs: if file_id is None or not inv.has_id(file_id): continue trees[inv.id2path(file_id)] = file_id ie = inv[file_id] if ie.parent_id is not None: new_dirs.add(ie.parent_id) dirty_dirs = new_dirs def ie_to_hexsha(ie): try: return shamap[ie.file_id] except KeyError: # FIXME: Should be the same as in parent if ie.kind in ("file", "symlink"): try: return idmap.lookup_blob_id(ie.file_id, ie.revision) except KeyError: # no-change merge ? 
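                # Explanatory comments (not upstream code): when the id map
                # has no blob sha for this (file_id, revision) pair --
                # typically a no-change merge -- a throwaway dulwich Blob is
                # built from the current file text purely to compute its
                # SHA-1; nothing is added to the object store here.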
blob = Blob() blob.data = tree.get_file_text(ie.file_id) return blob.id elif ie.kind == "directory": # Not all cache backends store the tree information, # calculate again from scratch ret = directory_to_tree(ie.children, ie_to_hexsha, unusual_modes, dummy_file_name, ie.parent_id is None) if ret is None: return ret return ret.id else: raise AssertionError for path in sorted(trees.keys(), reverse=True): file_id = trees[path] assert tree.kind(file_id) == 'directory' ie = inv[file_id] obj = directory_to_tree(ie.children, ie_to_hexsha, unusual_modes, dummy_file_name, path == "") if obj is not None: yield path, obj, (file_id, ) shamap[file_id] = obj.id class PackTupleIterable(object): def __init__(self, store): self.store = store self.store.lock_read() self.objects = {} def __del__(self): self.store.unlock() def add(self, sha, path): self.objects[sha] = path def __len__(self): return len(self.objects) def __iter__(self): return ((self.store[object_id], path) for (object_id, path) in self.objects.iteritems()) class BazaarObjectStore(BaseObjectStore): """A Git-style object store backed onto a Bazaar repository.""" def __init__(self, repository, mapping=None): self.repository = repository self._map_updated = False self._locked = None if mapping is None: self.mapping = default_mapping else: self.mapping = mapping self._cache = cache_from_repository(repository) self._content_cache_types = ("tree",) self.start_write_group = self._cache.idmap.start_write_group self.abort_write_group = self._cache.idmap.abort_write_group self.commit_write_group = self._cache.idmap.commit_write_group self.tree_cache = LRUTreeCache(self.repository) self.unpeel_map = UnpeelMap.from_repository(self.repository) def _missing_revisions(self, revisions): return self._cache.idmap.missing_revisions(revisions) def _update_sha_map(self, stop_revision=None): if not self.is_locked(): raise AssertionError() if self._map_updated: return if (stop_revision is not None and not self._missing_revisions([stop_revision])): return graph = self.repository.get_graph() if stop_revision is None: all_revids = self.repository.all_revision_ids() missing_revids = self._missing_revisions(all_revids) else: heads = set([stop_revision]) missing_revids = self._missing_revisions(heads) while heads: parents = graph.get_parent_map(heads) todo = set() for p in parents.values(): todo.update([x for x in p if x not in missing_revids]) heads = self._missing_revisions(todo) missing_revids.update(heads) if NULL_REVISION in missing_revids: missing_revids.remove(NULL_REVISION) missing_revids = self.repository.has_revisions(missing_revids) if not missing_revids: if stop_revision is None: self._map_updated = True return self.start_write_group() try: pb = ui.ui_factory.nested_progress_bar() try: for i, revid in enumerate(graph.iter_topo_order(missing_revids)): trace.mutter('processing %r', revid) pb.update("updating git map", i, len(missing_revids)) self._update_sha_map_revision(revid) finally: pb.finished() if stop_revision is None: self._map_updated = True except: self.abort_write_group() raise else: self.commit_write_group() def __iter__(self): self._update_sha_map() return iter(self._cache.idmap.sha1s()) def _reconstruct_commit(self, rev, tree_sha, lossy, verifiers): """Reconstruct a Commit object. 
:param rev: Revision object :param tree_sha: SHA1 of the root tree object :param lossy: Whether or not to roundtrip bzr metadata :param verifiers: Verifiers for the commits :return: Commit object """ def parent_lookup(revid): try: return self._lookup_revision_sha1(revid) except errors.NoSuchRevision: return None return self.mapping.export_commit(rev, tree_sha, parent_lookup, lossy, verifiers) def _create_fileid_map_blob(self, tree): # FIXME: This can probably be a lot more efficient, # not all files necessarily have to be processed. file_ids = {} for (path, ie) in tree.inventory.iter_entries(): if self.mapping.generate_file_id(path) != ie.file_id: file_ids[path] = ie.file_id return self.mapping.export_fileid_map(file_ids) def _revision_to_objects(self, rev, tree, lossy): """Convert a revision to a set of git objects. :param rev: Bazaar revision object :param tree: Bazaar revision tree :param lossy: Whether to not roundtrip all Bazaar revision data """ unusual_modes = extract_unusual_modes(rev) present_parents = self.repository.has_revisions(rev.parent_ids) parent_trees = self.tree_cache.revision_trees( [p for p in rev.parent_ids if p in present_parents]) root_tree = None for path, obj, bzr_key_data in _tree_to_objects(tree, parent_trees, self._cache.idmap, unusual_modes, self.mapping.BZR_DUMMY_FILE): if path == "": root_tree = obj root_key_data = bzr_key_data # Don't yield just yet else: yield path, obj, bzr_key_data if root_tree is None: # Pointless commit - get the tree sha elsewhere if not rev.parent_ids: root_tree = Tree() else: base_sha1 = self._lookup_revision_sha1(rev.parent_ids[0]) root_tree = self[self[base_sha1].tree] root_key_data = (tree.get_root_id(), ) if not lossy and self.mapping.BZR_FILE_IDS_FILE is not None: b = self._create_fileid_map_blob(tree) if b is not None: root_tree[self.mapping.BZR_FILE_IDS_FILE] = ( (stat.S_IFREG | 0644), b.id) yield self.mapping.BZR_FILE_IDS_FILE, b, None yield "", root_tree, root_key_data if not lossy: testament3 = StrictTestament3(rev, tree) verifiers = { "testament3-sha1": testament3.as_sha1() } else: verifiers = {} commit_obj = self._reconstruct_commit(rev, root_tree.id, lossy=lossy, verifiers=verifiers) try: foreign_revid, mapping = mapping_registry.parse_revision_id( rev.revision_id) except errors.InvalidRevisionId: pass else: _check_expected_sha(foreign_revid, commit_obj) yield None, commit_obj, None def _get_updater(self, rev): return self._cache.get_updater(rev) def _update_sha_map_revision(self, revid): rev = self.repository.get_revision(revid) tree = self.tree_cache.revision_tree(rev.revision_id) updater = self._get_updater(rev) # FIXME JRV 2011-12-15: Shouldn't we try both values for lossy ? for path, obj, ie in self._revision_to_objects(rev, tree, lossy=(not self.mapping.roundtripping)): if isinstance(obj, Commit): testament3 = StrictTestament3(rev, tree) ie = { "testament3-sha1": testament3.as_sha1() } updater.add_object(obj, ie, path) commit_obj = updater.finish() return commit_obj.id def _reconstruct_blobs(self, keys): """Return a Git Blob object from a fileid and revision stored in bzr. :param fileid: File id of the text :param revision: Revision of the text """ stream = self.repository.iter_files_bytes( ((key[0], key[1], key) for key in keys)) for (fileid, revision, expected_sha), chunks in stream: blob = Blob() blob.chunked = chunks if blob.id != expected_sha and blob.data == "": # Perhaps it's a symlink ? 
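            # Explanatory comments (not upstream code): bzr stores no text
            # for symlinks, so a SHA mismatch on empty content is retried by
            # rebuilding the blob from the symlink target before the expected
            # SHA is checked below.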
tree = self.tree_cache.revision_tree(revision) if tree.kind(fileid) == 'symlink': blob = symlink_to_blob(tree.get_symlink_target(fileid)) _check_expected_sha(expected_sha, blob) yield blob def _reconstruct_tree(self, fileid, revid, bzr_tree, unusual_modes, expected_sha=None): """Return a Git Tree object from a file id and a revision stored in bzr. :param fileid: fileid in the tree. :param revision: Revision of the tree. """ def get_ie_sha1(entry): if entry.kind == "directory": try: return self._cache.idmap.lookup_tree_id(entry.file_id, revid) except (NotImplementedError, KeyError): obj = self._reconstruct_tree(entry.file_id, revid, bzr_tree, unusual_modes) if obj is None: return None else: return obj.id elif entry.kind in ("file", "symlink"): try: return self._cache.idmap.lookup_blob_id(entry.file_id, entry.revision) except KeyError: # no-change merge? return self._reconstruct_blobs( [(entry.file_id, entry.revision, None)]).next().id elif entry.kind == 'tree-reference': # FIXME: Make sure the file id is the root id return self._lookup_revision_sha1(entry.reference_revision) else: raise AssertionError("unknown entry kind '%s'" % entry.kind) try: inv = bzr_tree.root_inventory except AttributeError: inv = bzr_tree.inventory tree = directory_to_tree(inv[fileid].children, get_ie_sha1, unusual_modes, self.mapping.BZR_DUMMY_FILE, bzr_tree.get_root_id() == fileid) if (bzr_tree.get_root_id() == fileid and self.mapping.BZR_FILE_IDS_FILE is not None): if tree is None: tree = Tree() b = self._create_fileid_map_blob(bzr_tree) # If this is the root tree, add the file ids tree[self.mapping.BZR_FILE_IDS_FILE] = ( (stat.S_IFREG | 0644), b.id) if tree is not None: _check_expected_sha(expected_sha, tree) return tree def get_parents(self, sha): """Retrieve the parents of a Git commit by SHA1. :param sha: SHA1 of the commit :raises: KeyError, NotCommitError """ return self[sha].parents def _lookup_revision_sha1(self, revid): """Return the SHA1 matching a Bazaar revision.""" if revid == NULL_REVISION: return ZERO_SHA try: return self._cache.idmap.lookup_commit(revid) except KeyError: try: return mapping_registry.parse_revision_id(revid)[0] except errors.InvalidRevisionId: self._update_sha_map(revid) return self._cache.idmap.lookup_commit(revid) def get_raw(self, sha): """Get the raw representation of a Git object by SHA1. 
:param sha: SHA1 of the git object """ if len(sha) == 20: sha = sha_to_hex(sha) obj = self[sha] return (obj.type, obj.as_raw_string()) def __contains__(self, sha): # See if sha is in map try: for (type, type_data) in self.lookup_git_sha(sha): if type == "commit": if self.repository.has_revision(type_data[0]): return True elif type == "blob": if self.repository.texts.has_key(type_data): return True elif type == "tree": if self.repository.has_revision(type_data[1]): return True else: raise AssertionError("Unknown object type '%s'" % type) else: return False except KeyError: return False def lock_read(self): self._locked = 'r' self._map_updated = False self.repository.lock_read() return LogicalLockResult(self.unlock) def lock_write(self): self._locked = 'r' self._map_updated = False self.repository.lock_write() return LogicalLockResult(self.unlock) def is_locked(self): return (self._locked is not None) def unlock(self): self._locked = None self._map_updated = False self.repository.unlock() def lookup_git_shas(self, shas): ret = {} for sha in shas: if sha == ZERO_SHA: ret[sha] = [("commit", (NULL_REVISION, None, {}))] continue try: ret[sha] = list(self._cache.idmap.lookup_git_sha(sha)) except KeyError: # if not, see if there are any unconverted revisions and # add them to the map, search for sha in map again self._update_sha_map() try: ret[sha] = list(self._cache.idmap.lookup_git_sha(sha)) except KeyError: pass return ret def lookup_git_sha(self, sha): return self.lookup_git_shas([sha])[sha] def __getitem__(self, sha): if self._cache.content_cache is not None: try: return self._cache.content_cache[sha] except KeyError: pass for (kind, type_data) in self.lookup_git_sha(sha): # convert object to git object if kind == "commit": (revid, tree_sha, verifiers) = type_data try: rev = self.repository.get_revision(revid) except errors.NoSuchRevision: if revid == NULL_REVISION: raise AssertionError( "should not try to look up NULL_REVISION") trace.mutter('entry for %s %s in shamap: %r, but not ' 'found in repository', kind, sha, type_data) raise KeyError(sha) # FIXME: the type data should say whether conversion was lossless commit = self._reconstruct_commit(rev, tree_sha, lossy=(not self.mapping.roundtripping), verifiers=verifiers) _check_expected_sha(sha, commit) return commit elif kind == "blob": (fileid, revision) = type_data blobs = self._reconstruct_blobs([(fileid, revision, sha)]) return blobs.next() elif kind == "tree": (fileid, revid) = type_data try: tree = self.tree_cache.revision_tree(revid) rev = self.repository.get_revision(revid) except errors.NoSuchRevision: trace.mutter('entry for %s %s in shamap: %r, but not found in ' 'repository', kind, sha, type_data) raise KeyError(sha) unusual_modes = extract_unusual_modes(rev) try: return self._reconstruct_tree(fileid, revid, tree, unusual_modes, expected_sha=sha) except errors.NoSuchRevision: raise KeyError(sha) else: raise AssertionError("Unknown object type '%s'" % kind) else: raise KeyError(sha) def generate_lossy_pack_contents(self, have, want, progress=None, get_tagged=None): return self.generate_pack_contents(have, want, progress, get_tagged, lossy=True) def generate_pack_contents(self, have, want, progress=None, get_tagged=None, lossy=False): """Iterate over the contents of a pack file. 
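        The result is a PackTupleIterable yielding (object, path) pairs for
        every Git object that must be regenerated from the Bazaar repository
        (explanatory note; see PackTupleIterable above).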
:param have: List of SHA1s of objects that should not be sent :param want: List of SHA1s of objects that should be sent """ processed = set() ret = self.lookup_git_shas(have + want) for commit_sha in have: commit_sha = self.unpeel_map.peel_tag(commit_sha, commit_sha) try: for (type, type_data) in ret[commit_sha]: assert type == "commit" processed.add(type_data[0]) except KeyError: trace.mutter("unable to find remote ref %s", commit_sha) pending = set() for commit_sha in want: if commit_sha in have: continue try: for (type, type_data) in ret[commit_sha]: assert type == "commit" pending.add(type_data[0]) except KeyError: pass graph = self.repository.get_graph() todo = _find_missing_bzr_revids(graph, pending, processed) ret = PackTupleIterable(self) pb = ui.ui_factory.nested_progress_bar() try: for i, revid in enumerate(todo): pb.update("generating git objects", i, len(todo)) try: rev = self.repository.get_revision(revid) except errors.NoSuchRevision: continue tree = self.tree_cache.revision_tree(revid) for path, obj, ie in self._revision_to_objects(rev, tree, lossy=lossy): ret.add(obj.id, path) return ret finally: pb.finished() def add_thin_pack(self): import tempfile import os fd, path = tempfile.mkstemp(suffix=".pack") f = os.fdopen(fd, 'wb') def commit(): from dulwich.pack import PackData, Pack from .fetch import import_git_objects os.fsync(fd) f.close() if os.path.getsize(path) == 0: return pd = PackData(path) pd.create_index_v2(path[:-5]+".idx", self.object_store.get_raw) p = Pack(path[:-5]) self.repository.lock_write() try: self.repository.start_write_group() try: import_git_objects(self.repository, self.mapping, p.iterobjects(get_raw=self.get_raw), self.object_store) except: self.repository.abort_write_group() raise else: self.repository.commit_write_group() finally: self.repository.unlock() return f, commit # The pack isn't kept around anyway, so no point # in treating full packs different from thin packs add_pack = add_thin_pack bzr-git-0.6.13+bzr1649/po/0000755000000000000000000000000013165530605013044 5ustar 00000000000000bzr-git-0.6.13+bzr1649/pristine_tar.py0000644000000000000000000000670413165530605015512 0ustar 00000000000000# Copyright (C) 2012 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Support for pristine tar deltas.""" from base64 import ( standard_b64decode, ) from dulwich.objects import ( Blob, Tree, ) import stat README_CONTENTS = """\ This branch contains delta files that pristine-tar can use to regenerate tarballs for its own releases. 
""" def revision_pristine_tar_data(rev): """Export the pristine tar data from a revision.""" if 'deb-pristine-delta' in rev.properties: uuencoded = rev.properties['deb-pristine-delta'] kind = 'gz' elif 'deb-pristine-delta-bz2' in rev.properties: uuencoded = rev.properties['deb-pristine-delta-bz2'] kind = 'bz2' elif 'deb-pristine-delta-xz' in rev.properties: uuencoded = rev.properties['deb-pristine-delta-xz'] kind = 'xz' else: raise KeyError(rev.revision_id) return (standard_b64decode(uuencoded), kind) def get_pristine_tar_tree(repo): """Retrieve the pristine tar tree for a repository. """ try: cid = repo.refs["refs/heads/pristine-tar"] except KeyError: return Tree() tid = repo.object_store[cid].tree return repo.object_store[tid] def read_git_pristine_tar_data(repo, filename): """Read pristine data from a Git repository. :param repo: Git repository to read from :param filename: Name of file to read :return: Tuple with delta and id """ tree = get_pristine_tar_tree(repo) delta = tree[filename + ".delta"][1] gitid = tree[filename + ".id"][1] return (repo.object_store[delta].data, repo.object_store[gitid].data) def store_git_pristine_tar_data(repo, filename, delta, gitid, message=None, **kwargs): """Add pristine tar data to a Git repository. :param repo: Git repository to add data to :param filename: Name of file to store for :param delta: pristine-tar delta :param gitid: Git id the pristine tar delta is generated against """ delta_ob = Blob.from_string(delta) delta_name = filename + ".delta" id_ob = Blob.from_string(gitid) id_name = filename + ".id" objects = [ (delta_ob, delta_name), (id_ob, id_name)] tree = get_pristine_tar_tree(repo) tree.add(delta_name, stat.S_IFREG | 0644, delta_ob.id) tree.add(id_name, stat.S_IFREG | 0644, id_ob.id) if not "README" in tree: readme_ob = Blob.from_string(README_CONTENTS) objects.append((readme_ob, "README")) tree.add("README", stat.S_IFREG | 0644, readme_ob.id) objects.append((tree, "")) repo.object_store.add_objects(objects) if message is None: message = 'pristine-tar data for %s' % filename return repo.do_commit(ref='refs/heads/pristine-tar', tree=tree.id, message=message, **kwargs) bzr-git-0.6.13+bzr1649/push.py0000644000000000000000000003263013165530605013763 0ustar 00000000000000# Copyright (C) 2009-2010 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Push implementation that simply prints message saying push is not supported.""" from __future__ import absolute_import from dulwich.objects import ZERO_SHA from dulwich.walk import Walker from ... 
import ( errors, ui, ) from ...repository import ( InterRepository, ) from ...revision import ( NULL_REVISION, ) from .errors import ( NoPushSupport, ) from .object_store import ( get_object_store, ) from .repository import ( GitRepository, LocalGitRepository, GitRepositoryFormat, ) from .remote import ( RemoteGitRepository, ) from .unpeel_map import ( UnpeelMap, ) class MissingObjectsIterator(object): """Iterate over git objects that are missing from a target repository. """ def __init__(self, store, source, pb=None): """Create a new missing objects iterator. """ self.source = source self._object_store = store self._pending = [] self.pb = pb def import_revisions(self, revids, lossy): """Import a set of revisions into this git repository. :param revids: Revision ids of revisions to import :param lossy: Whether to not roundtrip bzr metadata """ for i, revid in enumerate(revids): if self.pb: self.pb.update("pushing revisions", i, len(revids)) git_commit = self.import_revision(revid, lossy) yield (revid, git_commit) def import_revision(self, revid, lossy): """Import a revision into this Git repository. :param revid: Revision id of the revision :param roundtrip: Whether to roundtrip bzr metadata """ tree = self._object_store.tree_cache.revision_tree(revid) rev = self.source.get_revision(revid) commit = None for path, obj, ie in self._object_store._revision_to_objects(rev, tree, lossy): if obj.type_name == "commit": commit = obj self._pending.append((obj, path)) if commit is None: raise AssertionError("no commit object generated for revision %s" % revid) return commit.id def __len__(self): return len(self._pending) def __iter__(self): return iter(self._pending) class InterToGitRepository(InterRepository): """InterRepository that copies into a Git repository.""" _matching_repo_format = GitRepositoryFormat() def __init__(self, source, target): super(InterToGitRepository, self).__init__(source, target) self.mapping = self.target.get_mapping() self.source_store = get_object_store(self.source, self.mapping) @staticmethod def _get_repo_format_to_test(): return None def copy_content(self, revision_id=None, pb=None): """See InterRepository.copy_content.""" self.fetch(revision_id, pb, find_ghosts=False) def fetch_refs(self, update_refs, lossy): """Fetch possibly roundtripped revisions into the target repository and update refs. :param update_refs: Generate refs to fetch. Receives dictionary with old refs (git shas), returns dictionary of new names to git shas. 
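# MissingObjectsIterator (above) drives a push: import_revisions() converts
# each Bazaar revision to Git objects via the object store, yields
# (bzr_revid, git_commit_id) pairs for bookkeeping, and queues every
# generated object in self._pending so the pack writer can iterate over the
# whole batch afterwards.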
:param lossy: Whether to roundtrip :return: old refs, new refs """ raise NotImplementedError(self.fetch_refs) def search_missing_revision_ids(self, find_ghosts=True, revision_ids=None, if_present_ids=None, limit=None): git_shas = [] todo = [] if revision_ids: todo.extend(revision_ids) if if_present_ids: todo.extend(revision_ids) self.source_store.lock_read() try: for revid in revision_ids: if revid == NULL_REVISION: continue git_sha = self.source_store._lookup_revision_sha1(revid) git_shas.append(git_sha) walker = Walker(self.source_store, include=git_shas, exclude=[sha for sha in self.target.bzrdir.get_refs_container().as_dict().values() if sha != ZERO_SHA]) missing_revids = set() for entry in walker: for (kind, type_data) in self.source_store.lookup_git_sha(entry.commit.id): if kind == "commit": missing_revids.add(type_data[0]) finally: self.source_store.unlock() return self.source.revision_ids_to_search_result(missing_revids) class InterToLocalGitRepository(InterToGitRepository): """InterBranch implementation between a Bazaar and a Git repository.""" def __init__(self, source, target): super(InterToLocalGitRepository, self).__init__(source, target) self.target_store = self.target.bzrdir._git.object_store self.target_refs = self.target.bzrdir._git.refs def _commit_needs_fetching(self, sha_id): try: return (sha_id not in self.target_store) except errors.NoSuchRevision: # Ghost, can't push return False def _revision_needs_fetching(self, sha_id, revid): if revid == NULL_REVISION: return False if sha_id is None: try: sha_id = self.source_store._lookup_revision_sha1(revid) except KeyError: return False return self._commit_needs_fetching(sha_id) def missing_revisions(self, stop_revisions): """Find the revisions that are missing from the target repository. :param stop_revisions: Revisions to check for (tuples with Git SHA1, bzr revid) :return: sequence of missing revisions, in topological order :raise: NoSuchRevision if the stop_revisions are not present in the source """ revid_sha_map = {} stop_revids = [] for (sha1, revid) in stop_revisions: if sha1 is not None and revid is not None: revid_sha_map[revid] = sha1 stop_revids.append(revid) elif sha1 is not None: if self._commit_needs_fetching(sha1): for (kind, (revid, tree_sha, verifiers)) in self.source_store.lookup_git_sha(sha1): revid_sha_map[revid] = sha1 stop_revids.append(revid) else: assert revid is not None stop_revids.append(revid) missing = set() graph = self.source.get_graph() pb = ui.ui_factory.nested_progress_bar() try: while stop_revids: new_stop_revids = [] for revid in stop_revids: sha1 = revid_sha_map.get(revid) if (not revid in missing and self._revision_needs_fetching(sha1, revid)): missing.add(revid) new_stop_revids.append(revid) stop_revids = set() parent_map = graph.get_parent_map(new_stop_revids) for parent_revids in parent_map.itervalues(): stop_revids.update(parent_revids) pb.update("determining revisions to fetch", len(missing)) finally: pb.finished() return graph.iter_topo_order(missing) def _get_target_bzr_refs(self): """Return a dictionary with references. :return: Dictionary with reference names as keys and tuples with Git SHA, Bazaar revid as values. 
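# A minimal sketch of an update_refs callback, in the tuple form used by the
# InterTo*GitRepository implementations below (the ref name and revision id
# here are purely illustrative):
def _example_update_refs(old_refs, revid="example-revid"):
    # old_refs maps ref names to (git sha, bzr revid) tuples; return the refs
    # that should exist after the push.  A git sha of None asks fetch_refs to
    # resolve it from the Bazaar revision id.
    new_refs = dict(old_refs)
    new_refs["refs/heads/master"] = (None, revid)
    return new_refs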
""" bzr_refs = {} refs = {} for k in self.target._git.refs.allkeys(): v = self.target._git.refs[k] try: for (kind, type_data) in self.source_store.lookup_git_sha(v): if kind == "commit" and self.source.has_revision(type_data[0]): revid = type_data[0] break else: revid = None except KeyError: revid = None bzr_refs[k] = (v, revid) return bzr_refs def fetch_refs(self, update_refs, lossy): if not lossy and not self.mapping.roundtripping: raise NoPushSupport() self.source_store.lock_read() try: old_refs = self._get_target_bzr_refs() new_refs = update_refs(old_refs) revidmap = self.fetch_objects( [(git_sha, bzr_revid) for (git_sha, bzr_revid) in new_refs.values() if git_sha is None or not git_sha.startswith('ref:')], lossy=lossy) for name, (gitid, revid) in new_refs.iteritems(): if gitid is None: try: gitid = revidmap[revid][0] except KeyError: gitid = self.source_store._lookup_revision_sha1(revid) assert len(gitid) == 40 or gitid.startswith('ref: ') self.target_refs[name] = gitid finally: self.source_store.unlock() return revidmap, old_refs, new_refs def fetch_objects(self, revs, lossy): if not lossy and not self.mapping.roundtripping: raise NoPushSupport() self.source_store.lock_read() try: todo = list(self.missing_revisions(revs)) revidmap = {} pb = ui.ui_factory.nested_progress_bar() try: object_generator = MissingObjectsIterator( self.source_store, self.source, pb) for (old_revid, git_sha) in object_generator.import_revisions( todo, lossy=lossy): if lossy: new_revid = self.mapping.revision_id_foreign_to_bzr(git_sha) else: new_revid = old_revid try: self.mapping.revision_id_bzr_to_foreign(old_revid) except errors.InvalidRevisionId: refname = self.mapping.revid_as_refname(old_revid) self.target_refs[refname] = git_sha revidmap[old_revid] = (git_sha, new_revid) self.target_store.add_objects(object_generator) return revidmap finally: pb.finished() finally: self.source_store.unlock() def fetch(self, revision_id=None, pb=None, find_ghosts=False, fetch_spec=None, mapped_refs=None): if not self.mapping.roundtripping: raise NoPushSupport() if mapped_refs is not None: stop_revisions = mapped_refs elif revision_id is not None: stop_revisions = [(None, revision_id)] elif fetch_spec is not None: recipe = fetch_spec.get_recipe() if recipe[0] in ("search", "proxy-search"): stop_revisions = [(None, revid) for revid in recipe[1]] else: raise AssertionError("Unsupported search result type %s" % recipe[0]) else: stop_revisions = [(None, revid) for revid in self.source.all_revision_ids()] self.fetch_objects(stop_revisions, lossy=False) @staticmethod def is_compatible(source, target): """Be compatible with GitRepository.""" return (not isinstance(source, GitRepository) and isinstance(target, LocalGitRepository)) class InterToRemoteGitRepository(InterToGitRepository): def fetch_refs(self, update_refs, lossy): """Import the gist of the ancestry of a particular revision.""" if not lossy and not self.mapping.roundtripping: raise NoPushSupport() unpeel_map = UnpeelMap.from_repository(self.source) revidmap = {} def determine_wants(old_refs): ret = {} self.old_refs = dict([(k, (v, None)) for (k, v) in old_refs.iteritems()]) self.new_refs = update_refs(self.old_refs) for name, (gitid, revid) in self.new_refs.iteritems(): if gitid is None: git_sha = self.source_store._lookup_revision_sha1(revid) ret[name] = unpeel_map.re_unpeel_tag(git_sha, old_refs.get(name)) else: ret[name] = gitid return ret self.source_store.lock_read() try: new_refs = self.target.send_pack(determine_wants, 
self.source_store.generate_lossy_pack_contents) finally: self.source_store.unlock() # FIXME: revidmap? return revidmap, self.old_refs, self.new_refs @staticmethod def is_compatible(source, target): """Be compatible with GitRepository.""" return (not isinstance(source, GitRepository) and isinstance(target, RemoteGitRepository)) bzr-git-0.6.13+bzr1649/refs.py0000644000000000000000000001273213165530605013744 0ustar 00000000000000# Copyright (C) 2010 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Conversion between refs and Bazaar revision pointers.""" from __future__ import absolute_import from dulwich.repo import ( RefsContainer, ) from ... import ( errors, osutils, ) is_tag = lambda x: x.startswith("refs/tags/") is_head = lambda x: x.startswith("refs/heads/") is_peeled = lambda x: x.endswith("^{}") def gather_peeled(refs): ret = {} for k, v in refs.iteritems(): if is_peeled(k): continue try: peeled = refs[k+"^{}"] unpeeled = v except KeyError: peeled = v unpeeled = None ret[k] = (peeled, unpeeled) return ret def branch_name_to_ref(name): """Map a branch name to a ref. :param name: Branch name :return: ref string """ if name == "": return "HEAD" if not name.startswith("refs/"): return "refs/heads/%s" % osutils.safe_utf8(name) else: return osutils.safe_utf8(name) def tag_name_to_ref(name): """Map a tag name to a ref. :param name: Tag name :return: ref string """ return "refs/tags/%s" % osutils.safe_utf8(name) def ref_to_branch_name(ref): """Map a ref to a branch name :param ref: Ref :return: A branch name """ if ref == "HEAD": return u"" if ref is None: return ref if ref.startswith("refs/heads/"): return osutils.safe_unicode(ref[len("refs/heads/"):]) raise ValueError("unable to map ref %s back to branch name" % ref) def ref_to_tag_name(ref): if ref.startswith("refs/tags/"): return ref[len('refs/tags/'):].decode("utf-8") raise ValueError("unable to map ref %s back to tag name" % ref) class BazaarRefsContainer(RefsContainer): def __init__(self, dir, object_store): self.dir = dir self.object_store = object_store def set_symbolic_ref(self, name, other): if name == "HEAD": pass # FIXME: Switch default branch else: raise NotImplementedError( "Symbolic references not supported for anything other than " "HEAD") def _get_revid_by_tag_name(self, tag_name): for branch in self.dir.list_branches(): try: # FIXME: This is ambiguous! 
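# A few concrete examples of the name/ref helpers defined above (derived from
# the definitions in this file; illustrative only):
#
#   branch_name_to_ref("")       == "HEAD"
#   branch_name_to_ref("trunk")  == "refs/heads/trunk"
#   tag_name_to_ref("1.0")       == "refs/tags/1.0"
#   ref_to_branch_name("refs/heads/trunk") == u"trunk"
#   ref_to_tag_name("refs/tags/1.0")       == u"1.0"
#   is_peeled("refs/tags/1.0^{}")          is True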
return branch.tags.lookup_tag(tag_name) except errors.NoSuchTag: pass return None def _get_revid_by_branch_name(self, branch_name): try: branch = self.dir.open_branch(branch_name) except errors.NoColocatedBranchSupport: if branch_name in ("HEAD", "master"): branch = self.dir.open_branch() else: raise return branch.last_revision() def read_loose_ref(self, ref): try: branch_name = ref_to_branch_name(ref) except ValueError: tag_name = ref_to_tag_name(ref) revid = self._get_revid_by_tag_name(tag_name) else: revid = self._get_revid_by_branch_name(branch_name) # FIXME: Unpeel if necessary return self.object_store._lookup_revision_sha1(revid) def get_peeled(self, ref): return self.read_loose_ref(ref) def allkeys(self): keys = set() for branch in self.dir.list_branches(): repo = branch.repository if repo.has_revision(branch.last_revision()): ref = branch_name_to_ref(getattr(branch, "name", "")) keys.add(ref) try: for tag_name, revid in branch.tags.get_tag_dict().iteritems(): if repo.has_revision(revid): keys.add(tag_name_to_ref(tag_name)) except errors.TagsNotSupported: pass return keys def __delitem__(self, ref): try: branch_name = ref_to_branch_name(ref) except ValueError: return # FIXME: Cope with tags! self.dir.destroy_branch(branch_name) def __setitem__(self, ref, sha): try: branch_name = ref_to_branch_name(ref) except ValueError: # FIXME: Cope with tags! return try: target_branch = self.repo_dir.open_branch(branch_name) except errors.NotBranchError: target_branch = self.repo.create_branch(branch_name) rev_id = self.mapping.revision_id_foreign_to_bzr(sha) target_branch.lock_write() try: target_branch.generate_revision_history(rev_id) finally: target_branch.unlock() def get_refs_container(controldir, object_store): fn = getattr(controldir, "get_refs_container", None) if fn is not None: return fn() return BazaarRefsContainer(controldir, object_store) bzr-git-0.6.13+bzr1649/remote.py0000644000000000000000000004134013165530605014275 0ustar 00000000000000# Copyright (C) 2007-2012 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA from __future__ import absolute_import from ... import ( config, debug, trace, ui, urlutils, ) from ...errors import ( BzrError, InProcessTransport, InvalidRevisionId, NoSuchFile, NoSuchRevision, NotBranchError, NotLocalUrl, UninitializableFormat, ) from ...transport import ( Transport, ) from . 
import ( lazy_check_versions, ) lazy_check_versions() from .branch import ( GitBranch, GitTags, ) from .dir import ( GitControlDirFormat, GitDir, ) from .errors import ( GitSmartRemoteNotSupported, NoSuchRef, ) from .mapping import ( mapping_registry, ) from .repository import ( GitRepository, ) from .refs import ( branch_name_to_ref, is_peeled, ) import dulwich import dulwich.client from dulwich.errors import ( GitProtocolError, ) from dulwich.pack import ( Pack, ) from dulwich.repo import DictRefsContainer import os import select import tempfile import urllib import urlparse # urlparse only supports a limited number of schemes by default urlparse.uses_netloc.extend(['git', 'git+ssh']) from dulwich.pack import load_pack_index # Don't run any tests on GitSmartTransport as it is not intended to be # a full implementation of Transport def get_test_permutations(): return [] def split_git_url(url): """Split a Git URL. :param url: Git URL :return: Tuple with host, port, username, path. """ (scheme, netloc, loc, _, _) = urlparse.urlsplit(url) path = urllib.unquote(loc) if path.startswith("/~"): path = path[1:] (username, hostport) = urllib.splituser(netloc) (host, port) = urllib.splitnport(hostport, None) return (host, port, username, path) class RemoteGitError(BzrError): _fmt = "Remote server error: %(message)s" def parse_git_error(url, message): """Parse a remote git server error and return a bzr exception. :param url: URL of the remote repository :param message: Message sent by the remote git server """ message = str(message).strip() if message.startswith("Could not find Repository "): return NotBranchError(url, message) if message == "HEAD failed to update": base_url, _ = urlutils.split_segment_parameters(url) raise BzrError( ("Unable to update remote HEAD branch. 
To update the master " "branch, specify the URL %s,branch=master.") % base_url) # Don't know, just return it to the user as-is return RemoteGitError(message) class GitSmartTransport(Transport): def __init__(self, url, _client=None): Transport.__init__(self, url) (self._host, self._port, self._username, self._path) = \ split_git_url(url) if 'transport' in debug.debug_flags: trace.mutter('host: %r, user: %r, port: %r, path: %r', self._host, self._username, self._port, self._path) self._client = _client self._stripped_path = self._path.rsplit(",", 1)[0] def external_url(self): return self.base def has(self, relpath): return False def _get_client(self, thin_packs): raise NotImplementedError(self._get_client) def _get_path(self): return self._stripped_path def get(self, path): raise NoSuchFile(path) def abspath(self, relpath): return urlutils.join(self.base, relpath) def clone(self, offset=None): """See Transport.clone().""" if offset is None: newurl = self.base else: newurl = urlutils.join(self.base, offset) return self.__class__(newurl, self._client) class TCPGitSmartTransport(GitSmartTransport): _scheme = 'git' def _get_client(self, thin_packs): if self._client is not None: ret = self._client self._client = None return ret return dulwich.client.TCPGitClient(self._host, self._port, thin_packs=thin_packs, report_activity=self._report_activity) class SSHSocketWrapper(object): def __init__(self, sock): self.sock = sock def read(self, len=None): return self.sock.recv(len) def write(self, data): return self.sock.write(data) def can_read(self): return len(select.select([self.sock.fileno()], [], [], 0)[0]) > 0 class DulwichSSHVendor(dulwich.client.SSHVendor): def __init__(self): from ...transport import ssh self.bzr_ssh_vendor = ssh._get_ssh_vendor() def run_command(self, host, command, username=None, port=None): connection = self.bzr_ssh_vendor.connect_ssh(username=username, password=None, port=port, host=host, command=command) (kind, io_object) = connection.get_sock_or_pipes() if kind == 'socket': return SSHSocketWrapper(io_object) else: raise AssertionError("Unknown io object kind %r'" % kind) #dulwich.client.get_ssh_vendor = DulwichSSHVendor class SSHGitSmartTransport(GitSmartTransport): _scheme = 'git+ssh' def _get_path(self): path = self._stripped_path if path.startswith("/~/"): return path[3:] return path def _get_client(self, thin_packs): if self._client is not None: ret = self._client self._client = None return ret location_config = config.LocationConfig(self.base) client = dulwich.client.SSHGitClient(self._host, self._port, self._username, thin_packs=thin_packs, report_activity=self._report_activity) # Set up alternate pack program paths upload_pack = location_config.get_user_option('git_upload_pack') if upload_pack: client.alternative_paths["upload-pack"] = upload_pack receive_pack = location_config.get_user_option('git_receive_pack') if receive_pack: client.alternative_paths["receive-pack"] = receive_pack return client class RemoteGitDir(GitDir): def __init__(self, transport, format, get_client, client_path): self._format = format self.root_transport = transport self.transport = transport self._mode_check_done = None self._get_client = get_client self._client_path = client_path self.base = self.root_transport.base self._refs = None def fetch_pack(self, determine_wants, graph_walker, pack_data, progress=None): if progress is None: def progress(text): trace.info("git: %s" % text) def wrap_determine_wants(refs_dict): return determine_wants(remote_refs_dict_to_container(refs_dict)) client = 
self._get_client(thin_packs=True) try: refs_dict = client.fetch_pack(self._client_path, wrap_determine_wants, graph_walker, pack_data, progress) if refs_dict is None: refs_dict = {} self._refs = remote_refs_dict_to_container(refs_dict) return refs_dict except GitProtocolError, e: raise parse_git_error(self.transport.external_url(), e) def send_pack(self, get_changed_refs, generate_pack_contents): client = self._get_client(thin_packs=True) try: return client.send_pack(self._client_path, get_changed_refs, generate_pack_contents) except GitProtocolError, e: raise parse_git_error(self.transport.external_url(), e) def _get_default_ref(self): return "refs/heads/master" def destroy_branch(self, name=None): refname = self._get_selected_ref(name) def get_changed_refs(old_refs): ret = dict(old_refs) if not refname in ret: raise NotBranchError(self.user_url) ret[refname] = "00" * 20 return ret self.send_pack(get_changed_refs, lambda have, want: []) @property def user_url(self): return self.control_url @property def user_transport(self): return self.root_transport @property def control_url(self): return self.control_transport.base @property def control_transport(self): return self.root_transport def open_repository(self): return RemoteGitRepository(self) def open_branch(self, name=None, unsupported=False, ignore_fallbacks=False, ref=None, possible_transports=None): repo = self.open_repository() refname = self._get_selected_ref(name, ref) return RemoteGitBranch(self, repo, refname) def open_workingtree(self, recommend_upgrade=False): raise NotLocalUrl(self.transport.base) def get_peeled(self, name): return self.get_refs_container().get_peeled(name) def get_refs_container(self): if self._refs is not None: return self._refs refs_dict = self.fetch_pack(lambda x: [], None, lambda x: None, lambda x: trace.mutter("git: %s" % x)) self._refs = remote_refs_dict_to_container(refs_dict) return self._refs class EmptyObjectStoreIterator(dict): def iterobjects(self): return [] class TemporaryPackIterator(Pack): def __init__(self, path, resolve_ext_ref): super(TemporaryPackIterator, self).__init__( path, resolve_ext_ref=resolve_ext_ref) self._idx_load = lambda: self._idx_load_or_generate(self._idx_path) def _idx_load_or_generate(self, path): if not os.path.exists(path): pb = ui.ui_factory.nested_progress_bar() try: def report_progress(cur, total): pb.update("generating index", cur, total) self.data.create_index(path, progress=report_progress) finally: pb.finished() return load_pack_index(path) def __del__(self): if self._idx is not None: self._idx.close() os.remove(self._idx_path) if self._data is not None: self._data.close() os.remove(self._data_path) class BzrGitHttpClient(dulwich.client.HttpGitClient): def __init__(self, transport, *args, **kwargs): self.transport = transport super(BzrGitHttpClient, self).__init__(transport.external_url(), *args, **kwargs) import urllib2 self._http_perform = getattr(self.transport, "_perform", urllib2.urlopen) def _perform(self, req): req.accepted_errors = (200, 404) req.follow_redirections = True req.redirected_to = None return self._http_perform(req) class RemoteGitControlDirFormat(GitControlDirFormat): """The .git directory control format.""" supports_workingtrees = False @classmethod def _known_formats(self): return set([RemoteGitControlDirFormat()]) def is_initializable(self): return False def is_supported(self): return True def open(self, transport, _found=None): """Open this directory. """ # we dont grok readonly - git isn't integrated with transport. 
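# What follows strips bzr's "readonly+" decorator from the URL and then picks
# a dulwich client for the transport: git:// and git+ssh:// URLs reuse the
# GitSmartTransport machinery above, http/https URLs get a BzrGitHttpClient
# wrapper around the bzr transport, and anything else is rejected as not a
# Git branch.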
url = transport.base if url.startswith('readonly+'): url = url[len('readonly+'):] if isinstance(transport, GitSmartTransport): get_client = transport._get_client client_path = transport._get_path() elif urlparse.urlsplit(transport.external_url())[0] in ("http", "https"): def get_client(thin_packs): return BzrGitHttpClient(transport, thin_packs=thin_packs) client_path, _ = urlutils.split_segment_parameters(transport._path) else: raise NotBranchError(transport.base) return RemoteGitDir(transport, self, get_client, client_path) def get_format_description(self): return "Remote Git Repository" def initialize_on_transport(self, transport): raise UninitializableFormat(self) def supports_transport(self, transport): try: external_url = transport.external_url() except InProcessTransport: raise NotBranchError(path=transport.base) return (external_url.startswith("http:") or external_url.startswith("https:") or external_url.startswith("git+") or external_url.startswith("git:")) class RemoteGitRepository(GitRepository): @property def user_url(self): return self.control_url def get_parent_map(self, revids): raise GitSmartRemoteNotSupported(self.get_parent_map, self) def fetch_pack(self, determine_wants, graph_walker, pack_data, progress=None): return self.bzrdir.fetch_pack(determine_wants, graph_walker, pack_data, progress) def send_pack(self, get_changed_refs, generate_pack_contents): return self.bzrdir.send_pack(get_changed_refs, generate_pack_contents) def fetch_objects(self, determine_wants, graph_walker, resolve_ext_ref, progress=None): fd, path = tempfile.mkstemp(suffix=".pack") try: self.fetch_pack(determine_wants, graph_walker, lambda x: os.write(fd, x), progress) finally: os.close(fd) if os.path.getsize(path) == 0: return EmptyObjectStoreIterator() return TemporaryPackIterator(path[:-len(".pack")], resolve_ext_ref) def lookup_bzr_revision_id(self, bzr_revid): # This won't work for any round-tripped bzr revisions, but it's a start.. try: return mapping_registry.revision_id_bzr_to_foreign(bzr_revid) except InvalidRevisionId: raise NoSuchRevision(self, bzr_revid) def lookup_foreign_revision_id(self, foreign_revid, mapping=None): """Lookup a revision id. """ if mapping is None: mapping = self.get_mapping() # Not really an easy way to parse foreign revids here.. 
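# The remote case therefore trusts the caller: the git sha is converted
# directly with the mapping into a Bazaar revision id that embeds the hex
# sha, without checking that the object actually exists on the remote side.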
return mapping.revision_id_foreign_to_bzr(foreign_revid) def revision_tree(self, revid): raise GitSmartRemoteNotSupported(self.revision_tree, self) def get_revisions(self, revids): raise GitSmartRemoteNotSupported(self.get_revisions, self) def has_revisions(self, revids): raise GitSmartRemoteNotSupported(self.get_revisions, self) class RemoteGitTagDict(GitTags): def get_refs_container(self): return self.repository.bzrdir.get_refs_container() def set_tag(self, name, revid): # FIXME: Not supported yet, should do a push of a new ref raise NotImplementedError(self.set_tag) class RemoteGitBranch(GitBranch): def __init__(self, bzrdir, repository, name): self._sha = None super(RemoteGitBranch, self).__init__(bzrdir, repository, name) def last_revision_info(self): raise GitSmartRemoteNotSupported(self.last_revision_info, self) @property def user_url(self): return self.control_url @property def control_url(self): return self.base def revision_id_to_revno(self, revision_id): raise GitSmartRemoteNotSupported(self.revision_id_to_revno, self) def last_revision(self): return self.lookup_foreign_revision_id(self.head) @property def head(self): if self._sha is not None: return self._sha refs = self.bzrdir.get_refs_container() name = branch_name_to_ref(self.name) try: self._sha = refs[name] except KeyError: raise NoSuchRef(name, self.repository.user_url, refs) return self._sha def _synchronize_history(self, destination, revision_id): """See Branch._synchronize_history().""" destination.generate_revision_history(self.last_revision()) def get_push_location(self): return None def set_push_location(self, url): pass def remote_refs_dict_to_container(refs_dict): base = {} peeled = {} for k, v in refs_dict.iteritems(): if is_peeled(k): peeled[k[:-3]] = v else: base[k] = v peeled[k] = v ret = DictRefsContainer(base) ret._peeled = peeled return ret bzr-git-0.6.13+bzr1649/repository.py0000644000000000000000000005017413165530605015226 0ustar 00000000000000# Copyright (C) 2007 Canonical Ltd # Copyright (C) 2008-2009 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """An adapter between a Git Repository and a Bazaar Branch""" from __future__ import absolute_import from ... 
import ( check, errors, graph as _mod_graph, inventory, repository, revision, transactions, ui, version_info as bzrlib_version, ) from ...decorators import only_raises from ...revisiontree import InventoryRevisionTree from ...foreign import ( ForeignRepository, ) from .commit import ( GitCommitBuilder, ) from .filegraph import ( GitFileLastChangeScanner, GitFileParentProvider, ) from .mapping import ( default_mapping, foreign_vcs_git, mapping_registry, ) from .tree import ( GitRevisionTree, ) from dulwich.errors import ( NotCommitError, ) from dulwich.objects import ( Commit, ZERO_SHA, ) from dulwich.object_store import ( tree_lookup_path, ) class RepoReconciler(object): """Reconciler that reconciles a repository. """ def __init__(self, repo, other=None, thorough=False): """Construct a RepoReconciler. :param thorough: perform a thorough check which may take longer but will correct non-data loss issues such as incorrect cached data. """ self.repo = repo def reconcile(self): """Perform reconciliation. After reconciliation the following attributes document found issues: inconsistent_parents: The number of revisions in the repository whose ancestry was being reported incorrectly. garbage_inventories: The number of inventory objects without revisions that were garbage collected. """ class GitCheck(check.Check): def __init__(self, repository, check_repo=True): self.repository = repository self.checked_rev_cnt = 0 def check(self, callback_refs=None, check_repo=True): if callback_refs is None: callback_refs = {} self.repository.lock_read() self.repository.unlock() def report_results(self, verbose): pass _optimisers_loaded = False def lazy_load_optimisers(): global _optimisers_loaded if _optimisers_loaded: return from . import fetch, push for optimiser in [fetch.InterRemoteGitNonGitRepository, fetch.InterLocalGitNonGitRepository, fetch.InterGitGitRepository, push.InterToLocalGitRepository, push.InterToRemoteGitRepository]: repository.InterRepository.register_optimiser(optimiser) _optimisers_loaded = True class GitRepository(ForeignRepository): """An adapter to git repositories for bzr.""" _serializer = None vcs = foreign_vcs_git chk_bytes = None def __init__(self, gitdir): self._transport = gitdir.root_transport super(GitRepository, self).__init__(GitRepositoryFormat(), gitdir, control_files=None) self.base = gitdir.root_transport.base lazy_load_optimisers() self._lock_mode = None self._lock_count = 0 def add_fallback_repository(self, basis_url): raise errors.UnstackableRepositoryFormat(self._format, self.control_transport.base) def is_shared(self): return False def get_physical_lock_status(self): return False def lock_write(self): """See Branch.lock_write().""" if self._lock_mode: assert self._lock_mode == 'w' self._lock_count += 1 else: self._lock_mode = 'w' self._lock_count = 1 return GitRepositoryLock(self) def break_lock(self): raise NotImplementedError(self.break_lock) def dont_leave_lock_in_place(self): raise NotImplementedError(self.dont_leave_lock_in_place) def leave_lock_in_place(self): raise NotImplementedError(self.leave_lock_in_place) def lock_read(self): if self._lock_mode: assert self._lock_mode in ('r', 'w') self._lock_count += 1 else: self._lock_mode = 'r' self._lock_count = 1 return self @only_raises(errors.LockNotHeld, errors.LockBroken) def unlock(self): if self._lock_count == 0: raise errors.LockNotHeld(self) if self._lock_count == 1 and self._lock_mode == 'w': if self._write_group is not None: self.abort_write_group() self._lock_count -= 1 self._lock_mode = None raise 
errors.BzrError( 'Must end write groups before releasing write locks.') self._lock_count -= 1 if self._lock_count == 0: self._lock_mode = None def is_write_locked(self): return (self._lock_mode == 'w') def is_locked(self): return (self._lock_mode is not None) def get_transaction(self): """See Repository.get_transaction().""" if self._write_group is None: return transactions.PassThroughTransaction() else: return self._write_group def reconcile(self, other=None, thorough=False): """Reconcile this repository.""" reconciler = RepoReconciler(self, thorough=thorough) reconciler.reconcile() return reconciler def supports_rich_root(self): return True def get_mapping(self): return default_mapping def make_working_trees(self): return not self._git.get_config().get_boolean(("core", ), "bare") def revision_graph_can_have_wrong_parents(self): return False def add_signature_text(self, revid, signature): raise errors.UnsupportedOperation(self.add_signature_text, self) def sign_revision(self, revision_id, gpg_strategy): raise errors.UnsupportedOperation(self.add_signature_text, self) class GitRepositoryLock(object): """Subversion lock.""" def __init__(self, repository): self.repository_token = None self.repository = repository def unlock(self): self.repository.unlock() class LocalGitRepository(GitRepository): """Git repository on the file system.""" def __init__(self, gitdir): GitRepository.__init__(self, gitdir) self._git = gitdir._git self._file_change_scanner = GitFileLastChangeScanner(self) def get_commit_builder(self, branch, parents, config, timestamp=None, timezone=None, committer=None, revprops=None, revision_id=None, lossy=False): """Obtain a CommitBuilder for this repository. :param branch: Branch to commit to. :param parents: Revision ids of the parents of the new revision. :param config: Configuration to use. :param timestamp: Optional timestamp recorded for commit. :param timezone: Optional timezone for timestamp. :param committer: Optional committer to set for commit. :param revprops: Optional dictionary of revision properties. :param revision_id: Optional revision id. :param lossy: Whether to discard data that can not be natively represented, when pushing to a foreign VCS """ self.start_write_group() return GitCommitBuilder(self, parents, config, timestamp, timezone, committer, revprops, revision_id, lossy) def get_file_graph(self): return _mod_graph.Graph(GitFileParentProvider( self._file_change_scanner)) def iter_files_bytes(self, desired_files): """Iterate through file versions. Files will not necessarily be returned in the order they occur in desired_files. No specific order is guaranteed. Yields pairs of identifier, bytes_iterator. identifier is an opaque value supplied by the caller as part of desired_files. It should uniquely identify the file version in the caller's context. (Examples: an index number or a TreeTransform trans_id.) bytes_iterator is an iterable of bytestrings for the file. The kind of iterable and length of the bytestrings are unspecified, but for this implementation, it is a list of bytes produced by VersionedFile.get_record_stream(). 
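# A small usage sketch for iter_files_bytes() below (the identifier is an
# arbitrary caller-chosen token; names are illustrative).  Results may come
# back in any order, keyed by that identifier:
#
#   desired = [(file_id, revid, "token-1")]
#   for identifier, chunks in repo.iter_files_bytes(desired):
#       text = "".join(chunks)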
:param desired_files: a list of (file_id, revision_id, identifier) triples """ per_revision = {} for (file_id, revision_id, identifier) in desired_files: per_revision.setdefault(revision_id, []).append( (file_id, identifier)) for revid, files in per_revision.iteritems(): (commit_id, mapping) = self.lookup_bzr_revision_id(revid) try: commit = self._git.object_store[commit_id] except KeyError: raise errors.RevisionNotPresent(revid, self) root_tree = commit.tree for fileid, identifier in files: path = mapping.parse_file_id(fileid) try: obj = tree_lookup_path( self._git.object_store.__getitem__, root_tree, path) if isinstance(obj, tuple): (mode, item_id) = obj obj = self._git.object_store[item_id] except KeyError: raise errors.RevisionNotPresent((fileid, revid), self) else: if obj.type_name == "tree": yield (identifier, []) elif obj.type_name == "blob": yield (identifier, obj.chunked) else: raise AssertionError("file text resolved to %r" % obj) def gather_stats(self, revid=None, committers=None): """See Repository.gather_stats().""" result = super(LocalGitRepository, self).gather_stats(revid, committers) revs = [] for sha in self._git.object_store: o = self._git.object_store[sha] if o.type_name == "commit": revs.append(o.id) result['revisions'] = len(revs) return result def _iter_revision_ids(self): mapping = self.get_mapping() for sha in self._git.object_store: o = self._git.object_store[sha] if not isinstance(o, Commit): continue rev, roundtrip_revid, verifiers = mapping.import_commit(o, mapping.revision_id_foreign_to_bzr) yield o.id, rev.revision_id, roundtrip_revid def all_revision_ids(self): ret = set([]) for git_sha, revid, roundtrip_revid in self._iter_revision_ids(): if roundtrip_revid: ret.add(roundtrip_revid) else: ret.add(revid) return ret def _get_parents(self, revid, no_alternates=False): if type(revid) != str: raise ValueError try: (hexsha, mapping) = self.lookup_bzr_revision_id(revid) except errors.NoSuchRevision: return None # FIXME: Honor no_alternates setting try: commit = self._git.object_store[hexsha] except KeyError: return None return [ self.lookup_foreign_revision_id(p, mapping) for p in commit.parents] def _get_parent_map_no_fallbacks(self, revids): return self.get_parent_map(revids, no_alternates=True) def get_parent_map(self, revids, no_alternates=False): parent_map = {} for revision_id in revids: parents = self._get_parents(revision_id, no_alternates=no_alternates) if revision_id == revision.NULL_REVISION: parent_map[revision_id] = () continue if parents is None: continue if len(parents) == 0: parents = [revision.NULL_REVISION] parent_map[revision_id] = tuple(parents) return parent_map def get_known_graph_ancestry(self, revision_ids): """Return the known graph for a set of revision ids and their ancestors. 
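# Parent lookups above follow bzr conventions: NULL_REVISION maps to an empty
# tuple, revisions whose commit cannot be found (ghosts) are silently omitted
# from the result, and a parentless commit is reported with NULL_REVISION as
# its only parent.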
""" pending = set(revision_ids) parent_map = {} while pending: this_parent_map = {} for revid in pending: if revid == revision.NULL_REVISION: continue parents = self._get_parents(revid) if parents is not None: this_parent_map[revid] = parents parent_map.update(this_parent_map) pending = set() map(pending.update, this_parent_map.itervalues()) pending = pending.difference(parent_map) return _mod_graph.KnownGraph(parent_map) def get_signature_text(self, revision_id): raise errors.NoSuchRevision(self, revision_id) def check(self, revision_ids=None, callback_refs=None, check_repo=True): result = GitCheck(self, check_repo=check_repo) result.check(callback_refs) return result def pack(self, hint=None, clean_obsolete_packs=False): self._git.object_store.pack_loose_objects() def lookup_foreign_revision_id(self, foreign_revid, mapping=None): """Lookup a revision id. :param foreign_revid: Foreign revision id to look up :param mapping: Mapping to use (use default mapping if not specified) :raise KeyError: If foreign revision was not found :return: bzr revision id """ assert type(foreign_revid) is str if mapping is None: mapping = self.get_mapping() if foreign_revid == ZERO_SHA: return revision.NULL_REVISION commit = self._git.object_store.peel_sha(foreign_revid) if not isinstance(commit, Commit): raise NotCommitError(commit.id) rev, roundtrip_revid, verifiers = mapping.import_commit(commit, mapping.revision_id_foreign_to_bzr) # FIXME: check testament before doing this? if roundtrip_revid: return roundtrip_revid else: return rev.revision_id def has_signature_for_revision_id(self, revision_id): """Check whether a GPG signature is present for this revision. This is never the case for Git repositories. """ return False def lookup_bzr_revision_id(self, bzr_revid, mapping=None): """Lookup a bzr revision id in a Git repository. :param bzr_revid: Bazaar revision id :param mapping: Optional mapping to use :return: Tuple with git commit id, mapping that was used and supplement details """ try: (git_sha, mapping) = mapping_registry.revision_id_bzr_to_foreign(bzr_revid) except errors.InvalidRevisionId: if mapping is None: mapping = self.get_mapping() try: return (self._git.refs[mapping.revid_as_refname(bzr_revid)], mapping) except KeyError: # Update refs from Git commit objects # FIXME: Hitting this a lot will be very inefficient... pb = ui.ui_factory.nested_progress_bar() try: for i, (git_sha, revid, roundtrip_revid) in enumerate(self._iter_revision_ids()): if not roundtrip_revid: continue pb.update("resolving revision id", i) refname = mapping.revid_as_refname(roundtrip_revid) self._git.refs[refname] = git_sha if roundtrip_revid == bzr_revid: return git_sha, mapping finally: pb.finished() raise errors.NoSuchRevision(self, bzr_revid) else: return (git_sha, mapping) def get_revision(self, revision_id): if not isinstance(revision_id, str): raise errors.InvalidRevisionId(revision_id, self) git_commit_id, mapping = self.lookup_bzr_revision_id(revision_id) try: commit = self._git.object_store[git_commit_id] except KeyError: raise errors.NoSuchRevision(self, revision_id) revision, roundtrip_revid, verifiers = mapping.import_commit( commit, self.lookup_foreign_revision_id) assert revision is not None # FIXME: check verifiers ? 
if roundtrip_revid: revision.revision_id = roundtrip_revid return revision def has_revision(self, revision_id): """See Repository.has_revision.""" if revision_id == revision.NULL_REVISION: return True try: git_commit_id, mapping = self.lookup_bzr_revision_id(revision_id) except errors.NoSuchRevision: return False return (git_commit_id in self._git) def has_revisions(self, revision_ids): """See Repository.has_revisions.""" return set(filter(self.has_revision, revision_ids)) def get_revisions(self, revids): """See Repository.get_revisions.""" return [self.get_revision(r) for r in revids] def revision_trees(self, revids): """See Repository.revision_trees.""" for revid in revids: yield self.revision_tree(revid) def revision_tree(self, revision_id): """See Repository.revision_tree.""" revision_id = revision.ensure_null(revision_id) if revision_id == revision.NULL_REVISION: inv = inventory.Inventory(root_id=None) inv.revision_id = revision_id return InventoryRevisionTree(self, inv, revision_id) return GitRevisionTree(self, revision_id) def get_inventory(self, revision_id): raise NotImplementedError(self.get_inventory) def set_make_working_trees(self, trees): if trees: self._git.get_config().set(("core", ), "bare", "false") else: self._git.get_config().set(("core", ), "bare", "true") def fetch_objects(self, determine_wants, graph_walker, resolve_ext_ref, progress=None): return self._git.fetch_objects(determine_wants, graph_walker, progress) class GitRepositoryFormat(repository.RepositoryFormat): """Git repository format.""" supports_versioned_directories = False supports_tree_reference = False rich_root_data = True supports_leaving_lock = False fast_deltas = True supports_funky_characters = True supports_external_lookups = False supports_full_versioned_files = False supports_revision_signatures = False supports_nesting_repositories = False revision_graph_can_have_wrong_parents = False supports_unreferenced_revisions = True @property def _matchingbzrdir(self): from .dir import LocalGitControlDirFormat return LocalGitControlDirFormat() def get_format_description(self): return "Git Repository" def initialize(self, controldir, shared=False, _internal=False): from .dir import GitDir if not isinstance(controldir, GitDir): raise errors.UninitializableFormat(self) return controldir.open_repository() def check_conversion_target(self, target_repo_format): return target_repo_format.rich_root_data def get_foreign_tests_repository_factory(self): from ...tests.test_repository import ( ForeignTestsRepositoryFactory, ) return ForeignTestsRepositoryFactory() def network_name(self): return "git" bzr-git-0.6.13+bzr1649/revspec.py0000644000000000000000000001031713165530605014451 0ustar 00000000000000# Copyright (C) 2009 Jelmer Vernooij # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. 
# You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Custom revision specifier for Subversion.""" from __future__ import absolute_import # Please note that imports are delayed as much as possible here since # if DWIM revspecs are supported this module is imported by __init__.py. from ... import version_info as bzrlib_version from ...errors import ( InvalidRevisionId, InvalidRevisionSpec, ) from ...revision import ( NULL_REVISION, ) from ...revisionspec import ( RevisionInfo, RevisionSpec, ) def valid_git_sha1(hex): """Check if `hex` is a validly formatted Git SHA1. :param hex: Hex string to validate :return: Boolean """ import binascii try: binascii.unhexlify(hex) except TypeError: return False else: return True class RevisionSpec_git(RevisionSpec): """Selects a revision using a Subversion revision number.""" help_txt = """Selects a revision using a Git revision sha1. """ prefix = 'git:' wants_revision_history = False def _lookup_git_sha1(self, branch, sha1): from .errors import ( GitSmartRemoteNotSupported, ) from .mapping import ( default_mapping, ) bzr_revid = getattr(branch.repository, "lookup_foreign_revision_id", default_mapping.revision_id_foreign_to_bzr)(sha1) try: if branch.repository.has_revision(bzr_revid): return RevisionInfo.from_revision_id(branch, bzr_revid) except GitSmartRemoteNotSupported: return RevisionInfo(branch, None, bzr_revid) raise InvalidRevisionSpec(self.user_spec, branch) def __nonzero__(self): # The default implementation uses branch.repository.has_revision() if self.rev_id is None: return False if self.rev_id == NULL_REVISION: return False return True def _find_short_git_sha1(self, branch, sha1): from .mapping import ( ForeignGit, mapping_registry, ) parse_revid = getattr(branch.repository, "lookup_bzr_revision_id", mapping_registry.parse_revision_id) branch.repository.lock_read() try: graph = branch.repository.get_graph() for revid, _ in graph.iter_ancestry([branch.last_revision()]): if revid == NULL_REVISION: continue try: foreign_revid, mapping = parse_revid(revid) except InvalidRevisionId: continue if not isinstance(mapping.vcs, ForeignGit): continue if foreign_revid.startswith(sha1): return RevisionInfo.from_revision_id(branch, revid) raise InvalidRevisionSpec(self.user_spec, branch) finally: branch.repository.unlock() def _match_on(self, branch, revs): loc = self.spec.find(':') git_sha1 = self.spec[loc+1:].encode("utf-8") if len(git_sha1) > 40 or not valid_git_sha1(git_sha1): raise InvalidRevisionSpec(self.user_spec, branch) from . import ( lazy_check_versions, ) lazy_check_versions() if len(git_sha1) == 40: return self._lookup_git_sha1(branch, git_sha1) else: return self._find_short_git_sha1(branch, git_sha1) def needs_branch(self): return True def get_branch(self): return None bzr-git-0.6.13+bzr1649/roundtrip.py0000644000000000000000000001251013165530605015025 0ustar 00000000000000# Copyright (C) 2010 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. 
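# With the revision specifier above registered, a Git sha can be used
# anywhere bzr accepts a revision; both full and abbreviated sha1s are
# handled (the shas below are purely illustrative):
#
#   bzr log -r git:0123456789abcdef0123456789abcdef01234567
#   bzr diff -r git:0123456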
# # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Roundtripping support. Bazaar stores more data than Git, which means that in order to preserve a commit when it is pushed from Bazaar into Git we have to stash that extra metadata somewhere. There are two kinds of metadata relevant here: * per-file metadata (stored by revision+path) - usually stored per tree * per-revision metadata (stored by git commit id) Bazaar revisions have the following information that is not present in Git commits: * revision ids * revision properties * ghost parents Tree content: * empty directories * path file ids * path last changed revisions [1] [1] path last changed revision information can usually be induced from the existing history, unless ghost revisions are involved. This extra metadata is stored in so-called "supplements": * CommitSupplement * TreeSupplement """ from __future__ import absolute_import from ... import osutils from cStringIO import StringIO class CommitSupplement(object): """Supplement for a Bazaar revision roundtripped into Git. :ivar revision_id: Revision id, as string :ivar properties: Revision properties, as dictionary :ivar explicit_parent_ids: Parent ids (needed if there are ghosts) :ivar verifiers: Verifier information """ revision_id = None explicit_parent_ids = None def __init__(self): self.properties = {} self.verifiers = {} def __nonzero__(self): return bool(self.revision_id or self.properties or self.explicit_parent_ids) class TreeSupplement(object): """Supplement for a Bazaar tree roundtripped into Git. This provides file ids (if they are different from the mapping default) and can provide text revisions. """ def parse_roundtripping_metadata(text): """Parse Bazaar roundtripping metadata.""" ret = CommitSupplement() f = StringIO(text) for l in f.readlines(): (key, value) = l.split(":", 1) if key == "revision-id": ret.revision_id = value.strip() elif key == "parent-ids": ret.explicit_parent_ids = tuple(value.strip().split(" ")) elif key == "testament3-sha1": ret.verifiers["testament3-sha1"] = value.strip() elif key.startswith("property-"): name = key[len("property-"):] if not name in ret.properties: ret.properties[name] = value[1:].rstrip("\n") else: ret.properties[name] += "\n" + value[1:].rstrip("\n") else: raise ValueError return ret def generate_roundtripping_metadata(metadata, encoding): """Serialize the roundtripping metadata. :param metadata: A `CommitSupplement` instance :return: String with revision metadata """ lines = [] if metadata.revision_id: lines.append("revision-id: %s\n" % metadata.revision_id) if metadata.explicit_parent_ids: lines.append("parent-ids: %s\n" % " ".join(metadata.explicit_parent_ids)) for key in sorted(metadata.properties.keys()): for l in metadata.properties[key].split("\n"): lines.append("property-%s: %s\n" % (key.encode(encoding), osutils.safe_utf8(l))) if "testament3-sha1" in metadata.verifiers: lines.append("testament3-sha1: %s\n" % metadata.verifiers["testament3-sha1"]) return "".join(lines) def extract_bzr_metadata(message): """Extract Bazaar metadata from a commit message. 
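# The serialized form produced by generate_roundtripping_metadata() above
# (appended to the commit message after a "--BZR--" separator) looks roughly
# like this; the values shown are illustrative:
#
#   revision-id: joe@example.com-20120101120000-0123456789abcdef
#   parent-ids: <parent-revid-1> <parent-revid-2>
#   property-branch-nick: trunk
#   testament3-sha1: <sha1 of the revision testament>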
:param message: Commit message to extract from :return: Tuple with original commit message and metadata object """ split = message.split("\n--BZR--\n", 1) if len(split) != 2: return message, None return split[0], parse_roundtripping_metadata(split[1]) def inject_bzr_metadata(message, commit_supplement, encoding): if not commit_supplement: return message rt_data = generate_roundtripping_metadata(commit_supplement, encoding) if not rt_data: return message assert type(rt_data) == str return message + "\n--BZR--\n" + rt_data def serialize_fileid_map(file_ids): """Serialize a fileid map. :param file_ids: Path -> fileid map :return: Serialized fileid map, as sequence of chunks """ lines = [] for path in sorted(file_ids.keys()): lines.append("%s\0%s\n" % (path, file_ids[path])) return lines def deserialize_fileid_map(filetext): """Deserialize a file id map. :param file: File :return: Fileid map (path -> fileid) """ ret = {} f = StringIO(filetext) lines = f.readlines() for l in lines: (path, file_id) = l.rstrip("\n").split("\0") ret[path] = file_id return ret bzr-git-0.6.13+bzr1649/send.py0000644000000000000000000001637613165530605013746 0ustar 00000000000000# Copyright (C) 2009 Jelmer Vernooij # Based on the original from bzr-svn: # Copyright (C) 2009 Lukas Lalinsky # Copyright (C) 2009 Jelmer Vernooij # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Support in "bzr send" for git-am style patches.""" import time from ... import __version__ as bzr_version from ... import ( branch as _mod_branch, diff as _mod_diff, errors, osutils, revision as _mod_revision, ) from ...merge_directive import BaseMergeDirective from . 
import ( version_info as bzr_git_version_info, ) from .mapping import ( object_mode, ) from .object_store import ( get_object_store, ) from cStringIO import StringIO from dulwich import ( __version__ as dulwich_version, ) from dulwich.objects import ( Blob, ) version_tail = "bzr %s, bzr-git %d.%d.%d, dulwich %d.%d.%d" % ( (bzr_version, ) + bzr_git_version_info[:3] + dulwich_version[:3]) class GitDiffTree(_mod_diff.DiffTree): """Provides a text representation between two trees, formatted for svn.""" def _show_diff(self, specific_files, extra_trees): from dulwich.patch import write_blob_diff iterator = self.new_tree.iter_changes(self.old_tree, specific_files=specific_files, extra_trees=extra_trees, require_versioned=True) has_changes = 0 def get_encoded_path(path): if path is not None: return path.encode(self.path_encoding, "replace") def get_file_mode(tree, path, kind, executable): if path is None: return 0 return object_mode(kind, executable) def get_blob(present, tree, file_id): if present: return Blob.from_string(tree.get_file(file_id).read()) else: return None trees = (self.old_tree, self.new_tree) for (file_id, paths, changed_content, versioned, parent, name, kind, executable) in iterator: # The root does not get diffed, and items with no known kind (that # is, missing) in both trees are skipped as well. if parent == (None, None) or kind == (None, None): continue path_encoded = (get_encoded_path(paths[0]), get_encoded_path(paths[1])) present = ((kind[0] not in (None, 'directory')), (kind[1] not in (None, 'directory'))) if not present[0] and not present[1]: continue contents = (get_blob(present[0], trees[0], file_id), get_blob(present[1], trees[1], file_id)) renamed = (parent[0], name[0]) != (parent[1], name[1]) mode = (get_file_mode(trees[0], path_encoded[0], kind[0], executable[0]), get_file_mode(trees[1], path_encoded[1], kind[1], executable[1])) write_blob_diff(self.to_file, (path_encoded[0], mode[0], contents[0]), (path_encoded[1], mode[1], contents[1])) has_changes |= (changed_content or renamed) return has_changes def generate_patch_filename(num, summary): return "%04d-%s.patch" % (num, summary.replace("/", "_").rstrip(".")) class GitMergeDirective(BaseMergeDirective): multiple_output_files = True def __init__(self, revision_id, testament_sha1, time, timezone, target_branch, source_branch=None, message=None, patches=None, local_target_branch=None): super(GitMergeDirective, self).__init__(revision_id=revision_id, testament_sha1=testament_sha1, time=time, timezone=timezone, target_branch=target_branch, patch=None, source_branch=source_branch, message=message, bundle=None) self.patches = patches def to_lines(self): return self.patch.splitlines(True) def to_files(self): return self.patches @classmethod def _generate_commit(cls, repository, revision_id, num, total): s = StringIO() store = get_object_store(repository) store.lock_read() try: commit = store[store._lookup_revision_sha1(revision_id)] finally: store.unlock() from dulwich.patch import write_commit_patch, get_summary try: lhs_parent = repository.get_revision(revision_id).parent_ids[0] except IndexError: lhs_parent = _mod_revision.NULL_REVISION tree_1 = repository.revision_tree(lhs_parent) tree_2 = repository.revision_tree(revision_id) contents = StringIO() differ = GitDiffTree.from_trees_options(tree_1, tree_2, contents, 'utf8', None, 'a/', 'b/', None) differ.show_diff(None, None) write_commit_patch(s, commit, contents.getvalue(), (num, total), version_tail) summary = generate_patch_filename(num, get_summary(commit)) return 
summary, s.getvalue() @classmethod def from_objects(cls, repository, revision_id, time, timezone, target_branch, local_target_branch=None, public_branch=None, message=None): patches = [] submit_branch = _mod_branch.Branch.open(target_branch) submit_branch.lock_read() try: submit_revision_id = submit_branch.last_revision() repository.fetch(submit_branch.repository, submit_revision_id) graph = repository.get_graph() todo = graph.find_difference(submit_revision_id, revision_id)[1] total = len(todo) for i, revid in enumerate(graph.iter_topo_order(todo)): patches.append(cls._generate_commit(repository, revid, i+1, total)) finally: submit_branch.unlock() return cls(revision_id, None, time, timezone, target_branch=target_branch, source_branch=public_branch, message=message, patches=patches) def send_git(branch, revision_id, submit_branch, public_branch, no_patch, no_bundle, message, base_revision_id, local_target_branch=None): if no_patch: raise errors.BzrCommandError("no patch not supported for git-am style patches") if no_bundle: raise errors.BzrCommandError("no bundle not supported for git-am style patches") return GitMergeDirective.from_objects( branch.repository, revision_id, time.time(), osutils.local_time_offset(), submit_branch, public_branch=public_branch, message=message, local_target_branch=local_target_branch) bzr-git-0.6.13+bzr1649/server.py0000644000000000000000000001322513165530605014311 0ustar 00000000000000# Copyright (C) 2008-2012 Jelmer Vernooij # Copyright (C) 2008 John Carr # Copyright (C) 2008-2011 Canonical Ltd # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA from __future__ import absolute_import from dulwich.server import TCPGitServer import sys from ... 
import ( errors, trace, ) from ...bzrdir import ( BzrDir, ) from .mapping import ( default_mapping, ) from .object_store import ( get_object_store, ) from .refs import ( get_refs_container, ) from dulwich.protocol import Protocol from dulwich.server import ( Backend, BackendRepo, ReceivePackHandler, UploadPackHandler, ) class BzrBackend(Backend): """A git serve backend that can use a Bazaar repository.""" def __init__(self, transport): self.transport = transport self.mapping = default_mapping def open_repository(self, path): # FIXME: More secure path sanitization transport = self.transport.clone(path.lstrip("/")) trace.mutter('client opens %r: %r', path, transport) return BzrBackendRepo(transport, self.mapping) class BzrBackendRepo(BackendRepo): def __init__(self, transport, mapping): self.mapping = mapping self.repo_dir = BzrDir.open_from_transport(transport) self.repo = self.repo_dir.find_repository() self.object_store = get_object_store(self.repo) self.refs = get_refs_container(self.repo_dir, self.object_store) def get_refs(self): self.object_store.lock_read() try: return self.refs.as_dict() finally: self.object_store.unlock() def get_peeled(self, name): cached = self.refs.get_peeled(name) if cached is not None: return cached return self.object_store.peel_sha(self.refs[name]).id def fetch_objects(self, determine_wants, graph_walker, progress, get_tagged=None): """Yield git objects to send to client """ self.object_store.lock_read() try: wants = determine_wants(self.get_refs()) have = self.object_store.find_common_revisions(graph_walker) if wants is None: return return self.object_store.generate_pack_contents(have, wants, progress, get_tagged, lossy=(not self.mapping.roundtripping)) finally: self.object_store.unlock() class BzrTCPGitServer(TCPGitServer): def handle_error(self, request, client_address): trace.log_exception_quietly() trace.warning('Exception happened during processing of request ' 'from %s', client_address) def serve_git(transport, host=None, port=None, inet=False, timeout=None): backend = BzrBackend(transport) if host is None: host = 'localhost' if port: server = BzrTCPGitServer(backend, host, port) else: server = BzrTCPGitServer(backend, host) server.serve_forever() def git_http_hook(branch, method, path): from dulwich.web import HTTPGitApplication, HTTPGitRequest, DEFAULT_HANDLERS handler = None for (smethod, spath) in HTTPGitApplication.services: if smethod != method: continue mat = spath.search(path) if mat: handler = HTTPGitApplication.services[smethod, spath] break if handler is None: return None backend = BzrBackend(branch.user_transport) def git_call(environ, start_response): req = HTTPGitRequest(environ, start_response, dumb=False, handlers=DEFAULT_HANDLERS) return handler(req, backend, mat) return git_call def serve_command(handler_cls, backend, inf=sys.stdin, outf=sys.stdout): """Serve a single command. This is mostly useful for the implementation of commands used by e.g. git+ssh. :param handler_cls: `Handler` class to use for the request :param argv: execv-style command-line arguments. Defaults to sys.argv. :param backend: `Backend` to use :param inf: File-like object to read from, defaults to standard input. :param outf: File-like object to write to, defaults to standard output. :return: Exit code for use with sys.exit. 0 on success, 1 on failure. """ def send_fn(data): outf.write(data) outf.flush() proto = Protocol(inf.read, send_fn) handler = handler_cls(backend, ["/"], proto) # FIXME: Catch exceptions and write a single-line summary to outf. 
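    # handler_cls reads the client's git protocol requests from inf through
    # proto and writes its responses via send_fn; handle() runs the complete
    # exchange for this single command before we return.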
handler.handle() return 0 def serve_git_receive_pack(transport, host=None, port=None, inet=False): if not inet: raise errors.BzrCommandError( "git-receive-pack only works in inetd mode") backend = BzrBackend(transport) sys.exit(serve_command(ReceivePackHandler, backend=backend)) def serve_git_upload_pack(transport, host=None, port=None, inet=False): if not inet: raise errors.BzrCommandError( "git-receive-pack only works in inetd mode") backend = BzrBackend(transport) sys.exit(serve_command(UploadPackHandler, backend=backend)) bzr-git-0.6.13+bzr1649/setup.py0000755000000000000000000000275713165530605014156 0ustar 00000000000000#!/usr/bin/env python from info import * readme = file('README').read() if __name__ == '__main__': from distutils.core import setup version = bzr_plugin_version[:3] version_string = ".".join([str(x) for x in version]) command_classes = {} try: from bzrlib.bzr_distutils import build_mo except ImportError: pass else: command_classes['build_mo'] = build_mo setup(name='bzr-git', description='Support for Git branches in Bazaar', keywords='plugin bzr git bazaar', version=version_string, url='http://bazaar-vcs.org/BzrForeignBranches/Git', license='GPL', maintainer='Jelmer Vernooij', maintainer_email='jelmer@samba.org', long_description=readme, package_dir={'bzrlib.plugins.git':'.'}, packages=['bzrlib.plugins.git', 'bzrlib.plugins.git.tests'], scripts=['bzr-receive-pack', 'bzr-upload-pack', 'git-remote-bzr'], classifiers=[ 'Topic :: Software Development :: Version Control', 'Environment :: Plugins', 'Development Status :: 4 - Beta', 'License :: OSI Approved :: GNU General Public License (GPL)', 'Natural Language :: English', 'Operating System :: OS Independent', 'Programming Language :: Python', 'Programming Language :: Python :: 2', ], cmdclass=command_classes, ) bzr-git-0.6.13+bzr1649/tests/0000755000000000000000000000000013165530605013570 5ustar 00000000000000bzr-git-0.6.13+bzr1649/transportgit.py0000644000000000000000000005151613165530605015550 0ustar 00000000000000# Copyright (C) 2010-2012 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """A Git repository implementation that uses a Bazaar transport.""" from __future__ import absolute_import from cStringIO import StringIO import os import sys import urllib from dulwich.errors import ( NotGitRepository, NoIndexPresent, ) from dulwich.objects import ( ShaFile, ) from dulwich.object_store import ( PackBasedObjectStore, PACKDIR, ) from dulwich.pack import ( MemoryPackIndex, PackData, Pack, iter_sha1, load_pack_index_file, write_pack_data, write_pack_index_v2, ) from dulwich.repo import ( BaseRepo, RefsContainer, BASE_DIRECTORIES, INDEX_FILENAME, OBJECTDIR, REFSDIR, SYMREF, check_ref_format, read_packed_refs_with_peeled, read_packed_refs, write_packed_refs, ) from ... 
import ( transport as _mod_transport, ) from ...errors import ( FileExists, NoSuchFile, TransportNotPossible, ) class TransportRefsContainer(RefsContainer): """Refs container that reads refs from a transport.""" def __init__(self, transport): self.transport = transport self._packed_refs = None self._peeled_refs = None def __repr__(self): return "%s(%r)" % (self.__class__.__name__, self.transport) def _ensure_dir_exists(self, path): for n in range(path.count("/")): dirname = "/".join(path.split("/")[:n+1]) try: self.transport.mkdir(dirname) except FileExists: pass def subkeys(self, base): keys = set() try: iter_files = self.transport.clone(base).iter_files_recursive() keys.update(("%s/%s" % (base, urllib.unquote(refname))).strip("/") for refname in iter_files if check_ref_format("%s/%s" % (base, refname))) except (TransportNotPossible, NoSuchFile): pass for key in self.get_packed_refs(): if key.startswith(base): keys.add(key[len(base):].strip("/")) return keys def allkeys(self): keys = set() try: self.transport.get_bytes("HEAD") except NoSuchFile: pass else: keys.add("HEAD") try: iter_files = list(self.transport.clone("refs").iter_files_recursive()) for filename in iter_files: refname = "refs/%s" % urllib.unquote(filename) if check_ref_format(refname): keys.add(refname) except (TransportNotPossible, NoSuchFile): pass keys.update(self.get_packed_refs()) return keys def get_packed_refs(self): """Get contents of the packed-refs file. :return: Dictionary mapping ref names to SHA1s :note: Will return an empty dictionary when no packed-refs file is present. """ # TODO: invalidate the cache on repacking if self._packed_refs is None: # set both to empty because we want _peeled_refs to be # None if and only if _packed_refs is also None. self._packed_refs = {} self._peeled_refs = {} try: f = self.transport.get("packed-refs") except NoSuchFile: return {} try: first_line = iter(f).next().rstrip() if (first_line.startswith("# pack-refs") and " peeled" in first_line): for sha, name, peeled in read_packed_refs_with_peeled(f): self._packed_refs[name] = sha if peeled: self._peeled_refs[name] = peeled else: f.seek(0) for sha, name in read_packed_refs(f): self._packed_refs[name] = sha finally: f.close() return self._packed_refs def get_peeled(self, name): """Return the cached peeled value of a ref, if available. :param name: Name of the ref to peel :return: The peeled value of the ref. If the ref is known not point to a tag, this will be the SHA the ref refers to. If the ref may point to a tag, but no cached information is available, None is returned. """ self.get_packed_refs() if self._peeled_refs is None or name not in self._packed_refs: # No cache: no peeled refs were read, or this ref is loose return None if name in self._peeled_refs: return self._peeled_refs[name] else: # Known not peelable return self[name] def read_loose_ref(self, name): """Read a reference file and return its contents. If the reference file a symbolic reference, only read the first line of the file. Otherwise, only read the first 40 bytes. :param name: the refname to read, relative to refpath :return: The contents of the ref file, or None if the file does not exist. 
:raises IOError: if any other error occurs """ try: f = self.transport.get(name) except NoSuchFile: return None f = StringIO(f.read()) try: header = f.read(len(SYMREF)) if header == SYMREF: # Read only the first line return header + iter(f).next().rstrip("\r\n") else: # Read only the first 40 bytes return header + f.read(40-len(SYMREF)) finally: f.close() def _remove_packed_ref(self, name): if self._packed_refs is None: return # reread cached refs from disk, while holding the lock self._packed_refs = None self.get_packed_refs() if name not in self._packed_refs: return del self._packed_refs[name] if name in self._peeled_refs: del self._peeled_refs[name] f = self.transport.open_write_stream("packed-refs") try: write_packed_refs(f, self._packed_refs, self._peeled_refs) finally: f.close() def set_symbolic_ref(self, name, other): """Make a ref point at another ref. :param name: Name of the ref to set :param other: Name of the ref to point at """ self._check_refname(name) self._check_refname(other) self._ensure_dir_exists(name) self.transport.put_bytes(name, SYMREF + other + '\n') def set_if_equals(self, name, old_ref, new_ref): """Set a refname to new_ref only if it currently equals old_ref. This method follows all symbolic references, and can be used to perform an atomic compare-and-swap operation. :param name: The refname to set. :param old_ref: The old sha the refname must refer to, or None to set unconditionally. :param new_ref: The new sha the refname will refer to. :return: True if the set was successful, False otherwise. """ try: realnames, _ = self.follow(name) realname = realnames[-1] except (KeyError, IndexError): realname = name self._ensure_dir_exists(realname) self.transport.put_bytes(realname, new_ref+"\n") return True def add_if_new(self, name, ref): """Add a new reference only if it does not already exist. This method follows symrefs, and only ensures that the last ref in the chain does not exist. :param name: The refname to set. :param ref: The new sha the refname will refer to. :return: True if the add was successful, False otherwise. """ try: realnames, contents = self.follow(name) if contents is not None: return False realname = realnames[-1] except (KeyError, IndexError): realname = name self._check_refname(realname) self._ensure_dir_exists(realname) self.transport.put_bytes(realname, ref+"\n") return True def remove_if_equals(self, name, old_ref): """Remove a refname only if it currently equals old_ref. This method does not follow symbolic references. It can be used to perform an atomic compare-and-delete operation. :param name: The refname to delete. :param old_ref: The old sha the refname must refer to, or None to delete unconditionally. :return: True if the delete was successful, False otherwise. 
""" self._check_refname(name) # may only be packed try: self.transport.delete(name) except NoSuchFile: pass self._remove_packed_ref(name) return True def get(self, name, default=None): try: return self[name] except KeyError: return default class TransportRepo(BaseRepo): def __init__(self, transport, bare, refs_text=None): self.transport = transport self.bare = bare if self.bare: self._controltransport = self.transport else: self._controltransport = self.transport.clone('.git') object_store = TransportObjectStore( self._controltransport.clone(OBJECTDIR)) if refs_text is not None: from dulwich.repo import InfoRefsContainer # dulwich >= 0.8.2 refs_container = InfoRefsContainer(StringIO(refs_text)) try: head = TransportRefsContainer(self._controltransport).read_loose_ref("HEAD") except KeyError: pass else: refs_container._refs["HEAD"] = head else: refs_container = TransportRefsContainer(self._controltransport) super(TransportRepo, self).__init__(object_store, refs_container) def _determine_file_mode(self): # Be consistent with bzr if sys.platform == 'win32': return False return True def get_named_file(self, path): """Get a file from the control dir with a specific name. Although the filename should be interpreted as a filename relative to the control dir in a disk-baked Repo, the object returned need not be pointing to a file in that location. :param path: The path to the file, relative to the control dir. :return: An open file object, or None if the file does not exist. """ try: return self._controltransport.get(path.lstrip('/')) except NoSuchFile: return None def _put_named_file(self, relpath, contents): self._controltransport.put_bytes(relpath, contents) def index_path(self): """Return the path to the index file.""" return self._controltransport.local_abspath(INDEX_FILENAME) def open_index(self): """Open the index for this repository.""" from dulwich.index import Index if not self.has_index(): raise NoIndexPresent() return Index(self.index_path()) def has_index(self): """Check if an index is present.""" # Bare repos must never have index files; non-bare repos may have a # missing index file, which is treated as empty. return not self.bare def get_config(self): from dulwich.config import ConfigFile try: return ConfigFile.from_file(self._controltransport.get('config')) except NoSuchFile: return ConfigFile() def get_config_stack(self): from dulwich.config import StackedConfig backends = [] p = self.get_config() if p is not None: backends.append(p) writable = p else: writable = None backends.extend(StackedConfig.default_backends()) return StackedConfig(backends, writable=writable) def __repr__(self): return "<%s for %r>" % (self.__class__.__name__, self.transport) @classmethod def init(cls, transport, bare=False): if not bare: transport.mkdir(".git") control_transport = transport.clone(".git") else: control_transport = transport for d in BASE_DIRECTORIES: control_transport.mkdir("/".join(d)) control_transport.mkdir(OBJECTDIR) TransportObjectStore.init(control_transport.clone(OBJECTDIR)) ret = cls(transport, bare) ret.refs.set_symbolic_ref("HEAD", "refs/heads/master") ret._init_files(bare) return ret class TransportObjectStore(PackBasedObjectStore): """Git-style object store that exists on disk.""" def __init__(self, transport): """Open an object store. 
:param transport: Transport to open data from """ super(TransportObjectStore, self).__init__() self.transport = transport self.pack_transport = self.transport.clone(PACKDIR) self._alternates = None def __repr__(self): return "%s(%r)" % (self.__class__.__name__, self.transport) @property def alternates(self): if self._alternates is not None: return self._alternates self._alternates = [] for path in self._read_alternate_paths(): # FIXME: Check path t = _mod_transport.get_transport_from_path(path) self._alternates.append(self.__class__(t)) return self._alternates def _read_alternate_paths(self): try: f = self.transport.get("info/alternates") except NoSuchFile: return [] ret = [] try: for l in f.read().splitlines(): if l[0] == "#": continue if os.path.isabs(l): continue ret.append(l) return ret finally: f.close() @property def packs(self): # FIXME: Never invalidates. if not self._pack_cache: self._update_pack_cache() return self._pack_cache.values() def _update_pack_cache(self): for pack in self._load_packs(): self._pack_cache[pack._basename] = pack def _pack_names(self): try: f = self.transport.get('info/packs') except NoSuchFile: return self.pack_transport.list_dir(".") else: ret = [] for line in f.read().splitlines(): if not line: continue (kind, name) = line.split(" ", 1) if kind != "P": continue ret.append(name) return ret def _load_packs(self): ret = [] for name in self._pack_names(): if name.startswith("pack-") and name.endswith(".pack"): try: size = self.pack_transport.stat(name).st_size except TransportNotPossible: # FIXME: This reads the whole pack file at once f = self.pack_transport.get(name) contents = f.read() pd = PackData(name, StringIO(contents), size=len(contents)) else: pd = PackData(name, self.pack_transport.get(name), size=size) idxname = name.replace(".pack", ".idx") idx = load_pack_index_file(idxname, self.pack_transport.get(idxname)) pack = Pack.from_objects(pd, idx) pack._basename = idxname[:-4] ret.append(pack) return ret def _iter_loose_objects(self): for base in self.transport.list_dir('.'): if len(base) != 2: continue for rest in self.transport.list_dir(base): yield base+rest def _split_loose_object(self, sha): return (sha[:2], sha[2:]) def _remove_loose_object(self, sha): path = '%s/%s' % self._split_loose_object(sha) self.transport.delete(path) def _remove_pack(self, pack): self.pack_transport.delete(pack.data.filename) self.pack_transport.delete(pack.index.filename) def _get_loose_object(self, sha): path = '%s/%s' % self._split_loose_object(sha) try: return ShaFile.from_file(self.transport.get(path)) except NoSuchFile: return None def add_object(self, obj): """Add a single object to this object store. :param obj: Object to add """ (dir, file) = self._split_loose_object(obj.id) try: self.transport.mkdir(dir) except FileExists: pass path = "%s/%s" % (dir, file) if self.transport.has(path): return # Already there, no need to write again self.transport.put_bytes(path, obj.as_legacy_object()) def move_in_pack(self, f): """Move a specific file containing a pack into the pack directory. :note: The file should be on the same file system as the packs directory. :param path: Path to the pack file. 
""" f.seek(0) p = PackData("", f, len(f.getvalue())) entries = p.sorted_entries() basename = "pack-%s" % iter_sha1(entry[0] for entry in entries) f.seek(0) self.pack_transport.put_file(basename + ".pack", f) p._filename = basename + ".pack" idxfile = self.pack_transport.open_write_stream(basename + ".idx") try: write_pack_index_v2(idxfile, entries, p.get_stored_checksum()) finally: idxfile.close() idxfile = self.pack_transport.get(basename + ".idx") idx = load_pack_index_file(basename+".idx", idxfile) final_pack = Pack.from_objects(p, idx) final_pack._basename = basename self._add_known_pack(basename, final_pack) return final_pack def add_thin_pack(self): """Add a new thin pack to this object store. Thin packs are packs that contain deltas with parents that exist in a different pack. """ from cStringIO import StringIO f = StringIO() def commit(): if len(f.getvalue()) > 0: return self.move_in_thin_pack(f) else: return None return f, commit def move_in_thin_pack(self, f): """Move a specific file containing a pack into the pack directory. :note: The file should be on the same file system as the packs directory. :param path: Path to the pack file. """ f.seek(0) data = PackData.from_file(self.get_raw, f, len(f.getvalue())) idx = MemoryPackIndex(data.sorted_entries(), data.get_stored_checksum()) p = Pack.from_objects(data, idx) pack_sha = idx.objects_sha1() datafile = self.pack_transport.open_write_stream( "pack-%s.pack" % pack_sha) try: entries, data_sum = write_pack_data(datafile, p.pack_tuples()) finally: datafile.close() entries.sort() idxfile = self.pack_transport.open_write_stream( "pack-%s.idx" % pack_sha) try: write_pack_index_v2(idxfile, data.sorted_entries(), data_sum) finally: idxfile.close() basename = "pack-%s" % pack_sha final_pack = Pack(basename) self._add_known_pack(basename, final_pack) return final_pack def add_pack(self): """Add a new pack to this object store. :return: Fileobject to write to and a commit function to call when the pack is finished. """ from cStringIO import StringIO f = StringIO() def commit(): if len(f.getvalue()) > 0: return self.move_in_pack(f) else: return None def abort(): return None return f, commit, abort @classmethod def init(cls, transport): transport.mkdir('info') transport.mkdir(PACKDIR) return cls(transport) def _remove_pack(self, pack): self.pack_transport.delete(pack.data.filename) self.pack_transport.delete(os.path.basename(pack.index.path)) bzr-git-0.6.13+bzr1649/tree.py0000644000000000000000000003755713165530605013760 0ustar 00000000000000# Copyright (C) 2009 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Git Trees.""" from __future__ import absolute_import from dulwich.object_store import tree_lookup_path import stat import posixpath from ... 
import ( delta, errors, inventory, osutils, revisiontree, tree, ) from .mapping import ( mode_is_executable, mode_kind, ) class GitRevisionTree(revisiontree.RevisionTree): """Revision tree implementation based on Git objects.""" def __init__(self, repository, revision_id): self._revision_id = revision_id self._repository = repository self.store = repository._git.object_store assert isinstance(revision_id, str) self.commit_id, self.mapping = repository.lookup_bzr_revision_id(revision_id) try: commit = self.store[self.commit_id] except KeyError, r: raise errors.NoSuchRevision(repository, revision_id) self.tree = commit.tree self._fileid_map = self.mapping.get_fileid_map(self.store.__getitem__, self.tree) def get_file_revision(self, file_id, path=None): if path is None: path = self.id2path(file_id) change_scanner = self._repository._file_change_scanner (path, commit_id) = change_scanner.find_last_change_revision(path, self.commit_id) return self._repository.lookup_foreign_revision_id(commit_id, self.mapping) def get_file_mtime(self, file_id, path=None): revid = self.get_file_revision(file_id, path) try: rev = self._repository.get_revision(revid) except errors.NoSuchRevision: raise errors.FileTimestampUnavailable(path) return rev.timestamp def id2path(self, file_id): return self._fileid_map.lookup_path(file_id) def path2id(self, path): if self.mapping.is_special_file(path): return None return self._fileid_map.lookup_file_id(path.encode('utf-8')) def all_file_ids(self): return set(self._fileid_map.all_file_ids()) def get_root_id(self): return self.path2id("") def has_or_had_id(self, file_id): return self.has_id(file_id) def has_id(self, file_id): try: path = self.id2path(file_id) except errors.NoSuchId: return False return self.has_filename(path) def is_executable(self, file_id, path=None): if path is None: path = self.id2path(file_id) try: (mode, hexsha) = tree_lookup_path(self.store.__getitem__, self.tree, path) except KeyError: raise errors.NoSuchId(self, file_id) if mode is None: # the tree root is a directory return False return mode_is_executable(mode) def kind(self, file_id, path=None): if path is None: path = self.id2path(file_id) try: (mode, hexsha) = tree_lookup_path(self.store.__getitem__, self.tree, path) except KeyError: raise errors.NoSuchId(self, file_id) if mode is None: # the tree root is a directory return "directory" return mode_kind(mode) def has_filename(self, path): try: tree_lookup_path(self.store.__getitem__, self.tree, path.encode("utf-8")) except KeyError: return False else: return True def list_files(self, include_root=False, from_dir=None, recursive=True): if from_dir is None: from_dir = u"" (mode, hexsha) = tree_lookup_path(self.store.__getitem__, self.tree, from_dir.encode("utf-8")) if mode is None: # Root root_ie = self._get_dir_ie("", None) else: parent_path = posixpath.dirname(from_dir.encode("utf-8")) parent_id = self._fileid_map.lookup_file_id(parent_path) if mode_kind(mode) == 'directory': root_ie = self._get_dir_ie(from_dir.encode("utf-8"), parent_id) else: root_ie = self._get_file_ie(from_dir.encode("utf-8"), posixpath.basename(from_dir), mode, hexsha) if from_dir != "" or include_root: yield (from_dir, "V", root_ie.kind, root_ie.file_id, root_ie) todo = set() if root_ie.kind == 'directory': todo.add((from_dir.encode("utf-8"), hexsha, root_ie.file_id)) while todo: (path, hexsha, parent_id) = todo.pop() tree = self.store[hexsha] for name, mode, hexsha in tree.iteritems(): if self.mapping.is_special_file(name): continue child_path = posixpath.join(path, name) 
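                # Sub-trees become directory entries (queued for traversal
                # when recursing); everything else is turned into an entry
                # via _get_file_ie.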
if stat.S_ISDIR(mode): ie = self._get_dir_ie(child_path, parent_id) if recursive: todo.add((child_path, hexsha, ie.file_id)) else: ie = self._get_file_ie(child_path, name, mode, hexsha, parent_id) yield child_path, "V", ie.kind, ie.file_id, ie def _get_file_ie(self, path, name, mode, hexsha, parent_id): kind = mode_kind(mode) file_id = self._fileid_map.lookup_file_id(path) ie = inventory.entry_factory[kind](file_id, name.decode("utf-8"), parent_id) if kind == 'symlink': ie.symlink_target = self.store[hexsha].data elif kind == 'tree-reference': ie.reference_revision = self.mapping.revision_id_foreign_to_bzr(hexsha) else: data = self.store[hexsha].data ie.text_sha1 = osutils.sha_string(data) ie.text_size = len(data) ie.executable = mode_is_executable(mode) return ie def _get_dir_ie(self, path, parent_id): file_id = self._fileid_map.lookup_file_id(path) return inventory.InventoryDirectory(file_id, posixpath.basename(path).decode("utf-8"), parent_id) def iter_entries_by_dir(self, specific_file_ids=None, yield_parents=False): # FIXME: Support yield parents if specific_file_ids is not None: specific_paths = [self.id2path(file_id) for file_id in specific_file_ids] if specific_paths in ([u""], []): specific_paths = None else: specific_paths = set(specific_paths) else: specific_paths = None todo = set([("", self.tree, None)]) while todo: path, tree_sha, parent_id = todo.pop() ie = self._get_dir_ie(path, parent_id) if specific_paths is None or path in specific_paths: yield path, ie tree = self.store[tree_sha] for name, mode, hexsha in tree.iteritems(): if self.mapping.is_special_file(name): continue child_path = posixpath.join(path, name) if stat.S_ISDIR(mode): if (specific_paths is None or any(filter(lambda p: p.startswith(child_path), specific_paths))): todo.add((child_path, hexsha, ie.file_id)) elif specific_paths is None or child_path in specific_paths: yield (child_path, self._get_file_ie(child_path, name, mode, hexsha, ie.file_id)) def get_revision_id(self): """See RevisionTree.get_revision_id.""" return self._revision_id def get_file_sha1(self, file_id, path=None, stat_value=None): return osutils.sha_string(self.get_file_text(file_id, path)) def get_file_verifier(self, file_id, path=None, stat_value=None): if path is None: path = self.id2path(file_id) (mode, hexsha) = tree_lookup_path(self.store.__getitem__, self.tree, path) return ("GIT", hexsha) def get_file_text(self, file_id, path=None): """See RevisionTree.get_file_text.""" if path is None: path = self.id2path(file_id) (mode, hexsha) = tree_lookup_path(self.store.__getitem__, self.tree, path) if stat.S_ISREG(mode): return self.store[hexsha].data else: return "" def get_symlink_target(self, file_id, path=None): """See RevisionTree.get_symlink_target.""" if path is None: path = self.id2path(file_id) (mode, hexsha) = tree_lookup_path(self.store.__getitem__, self.tree, path) if stat.S_ISLNK(mode): return self.store[hexsha].data else: return None def _comparison_data(self, entry, path): if entry is None: return None, False, None return entry.kind, entry.executable, None def path_content_summary(self, path): """See Tree.path_content_summary.""" try: (mode, hexsha) = tree_lookup_path(self.store.__getitem__, self.tree, path) except KeyError: return ('missing', None, None, None) kind = mode_kind(mode) if kind == 'file': executable = mode_is_executable(mode) contents = self.store[hexsha].data return (kind, len(contents), executable, osutils.sha_string(contents)) elif kind == 'symlink': return (kind, None, None, self.store[hexsha].data) else: 
return (kind, None, None, None) def tree_delta_from_git_changes(changes, mapping, (old_fileid_map, new_fileid_map), specific_file=None, require_versioned=False): """Create a TreeDelta from two git trees. source and target are iterators over tuples with: (filename, sha, mode) """ ret = delta.TreeDelta() for (oldpath, newpath), (oldmode, newmode), (oldsha, newsha) in changes: if mapping.is_control_file(oldpath): oldpath = None if mapping.is_control_file(newpath): newpath = None if oldpath is None and newpath is None: continue if oldpath is None: file_id = new_fileid_map.lookup_file_id(newpath) ret.added.append((newpath.decode('utf-8'), file_id, mode_kind(newmode))) elif newpath is None: file_id = old_fileid_map.lookup_file_id(oldpath) ret.removed.append((oldpath.decode('utf-8'), file_id, mode_kind(oldmode))) elif oldpath != newpath: file_id = old_fileid_map.lookup_file_id(oldpath) ret.renamed.append((oldpath.decode('utf-8'), newpath.decode('utf-8'), file_id, mode_kind(newmode), (oldsha != newsha), (oldmode != newmode))) elif mode_kind(oldmode) != mode_kind(newmode): file_id = new_fileid_map.lookup_file_id(newpath) ret.kind_changed.append((newpath.decode('utf-8'), file_id, mode_kind(oldmode), mode_kind(newmode))) elif oldsha != newsha or oldmode != newmode: file_id = new_fileid_map.lookup_file_id(newpath) ret.modified.append((newpath.decode('utf-8'), file_id, mode_kind(newmode), (oldsha != newsha), (oldmode != newmode))) else: file_id = new_fileid_map.lookup_file_id(newpath) ret.unchanged.append((newpath.decode('utf-8'), file_id, mode_kind(newmode))) return ret def changes_from_git_changes(changes, mapping, specific_file=None, require_versioned=False): """Create a iter_changes-like generator from a git stream. source and target are iterators over tuples with: (filename, sha, mode) """ for (oldpath, newpath), (oldmode, newmode), (oldsha, newsha) in changes: path = (oldpath, newpath) if mapping.is_special_file(oldpath) or mapping.is_special_file(newpath): continue if oldpath is None: fileid = mapping.generate_file_id(newpath) oldexe = None oldkind = None oldname = None oldparent = None else: oldpath = oldpath.decode("utf-8") assert oldmode is not None oldexe = mode_is_executable(oldmode) oldkind = mode_kind(oldmode) try: (oldparentpath, oldname) = oldpath.rsplit("/", 1) except ValueError: oldparent = None oldname = oldpath else: oldparent = mapping.generate_file_id(oldparentpath) fileid = mapping.generate_file_id(oldpath) if newpath is None: newexe = None newkind = None newname = None newparent = None else: newpath = newpath.decode("utf-8") assert newmode is not None if newmode is not None: newexe = mode_is_executable(newmode) newkind = mode_kind(newmode) else: newexe = False newkind = None try: newparentpath, newname = newpath.rsplit("/", 1) except ValueError: newparent = None newname = newpath else: newparent = mapping.generate_file_id(newparentpath) yield (fileid, (oldpath, newpath), (oldsha != newsha), (oldpath is not None, newpath is not None), (oldparent, newparent), (oldname, newname), (oldkind, newkind), (oldexe, newexe)) class InterGitRevisionTrees(tree.InterTree): """InterTree that works between two git revision trees.""" _matching_from_tree_format = None _matching_to_tree_format = None _test_mutable_trees_to_test_trees = None @classmethod def is_compatible(cls, source, target): return (isinstance(source, GitRevisionTree) and isinstance(target, GitRevisionTree)) def compare(self, want_unchanged=False, specific_files=None, extra_trees=None, require_versioned=False, include_root=False, 
want_unversioned=False): if self.source._repository._git.object_store != self.target._repository._git.object_store: raise AssertionError changes = self.source._repository._git.object_store.tree_changes( self.source.tree, self.target.tree, want_unchanged=want_unchanged) source_fileid_map = self.source.mapping.get_fileid_map( self.source._repository._git.object_store.__getitem__, self.source.tree) target_fileid_map = self.target.mapping.get_fileid_map( self.target._repository._git.object_store.__getitem__, self.target.tree) return tree_delta_from_git_changes(changes, self.target.mapping, (source_fileid_map, target_fileid_map), specific_file=specific_files) def iter_changes(self, include_unchanged=False, specific_files=None, pb=None, extra_trees=[], require_versioned=True, want_unversioned=False): if self.source._repository._git.object_store != self.target._repository._git.object_store: raise AssertionError changes = self.source._repository._git.object_store.tree_changes( self.source.tree, self.target.tree, want_unchanged=include_unchanged) return changes_from_git_changes(changes, self.target.mapping, specific_file=specific_files) tree.InterTree.register_optimiser(InterGitRevisionTrees) bzr-git-0.6.13+bzr1649/unpeel_map.py0000644000000000000000000000560413165530605015132 0ustar 00000000000000# Copyright (C) 2011 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Unpeel map storage.""" from __future__ import absolute_import from collections import defaultdict from cStringIO import StringIO from ... import ( errors, trace, ) class UnpeelMap(object): """Unpeel map. Keeps track of the unpeeled object id of tags. """ def __init__(self): self._map = defaultdict(set) self._re_map = {} def update(self, m): for k, v in m.iteritems(): self._map[k].update(v) for i in v: self._re_map[i] = k def load(self, f): firstline = f.readline() if firstline != "unpeel map version 1\n": raise AssertionError("invalid format for unpeel map: %r" % firstline) for l in f.readlines(): (k, v) = l.split(":", 1) k = k.strip() v = v.strip() self._map[k].add(v) self._re_map[v] = k def save(self, f): f.write("unpeel map version 1\n") for k, vs in self._map.iteritems(): for v in vs: f.write("%s: %s\n" % (k, v)) def save_in_repository(self, repository): f = StringIO() try: self.save(f) f.seek(0) repository.control_transport.put_file("git-unpeel-map", f) finally: f.close() def peel_tag(self, git_sha, default=None): """Peel a tag.""" return self._re_map.get(git_sha, default) def re_unpeel_tag(self, new_git_sha, old_git_sha): """Re-unpeel a tag. Bazaar can't store unpeeled refs so in order to prevent peeling existing tags when pushing they are "unpeeled" here. 
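
        A sketch with made-up shas: if the map records that commit "c"
        unpeels to tag object "t" (i.e. self._map["c"] == set(["t"])), then
        re_unpeel_tag("c", "t") returns "t", while re_unpeel_tag("c", None)
        returns "c".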
""" if old_git_sha is not None and old_git_sha in self._map[new_git_sha]: trace.mutter("re-unpeeling %r to %r", new_git_sha, old_git_sha) return old_git_sha return new_git_sha @classmethod def from_repository(cls, repository): """Load the unpeel map for a repository. """ m = UnpeelMap() try: m.load(repository.control_transport.get("git-unpeel-map")) except errors.NoSuchFile: pass return m bzr-git-0.6.13+bzr1649/workingtree.py0000644000000000000000000010051013165530605015335 0ustar 00000000000000# Copyright (C) 2008-2011 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """An adapter between a Git index and a Bazaar Working Tree""" from __future__ import absolute_import from cStringIO import ( StringIO, ) from collections import defaultdict import errno from dulwich.errors import NotGitRepository from dulwich.index import ( Index, changes_from_tree, cleanup_mode, index_entry_from_stat, ) from dulwich.object_store import ( tree_lookup_path, ) from dulwich.objects import ( Blob, S_IFGITLINK, ZERO_SHA, ) from dulwich.repo import Repo import os import posixpath import stat import sys from ... import ( errors, conflicts as _mod_conflicts, ignores, inventory, lock, osutils, trace, tree, workingtree, ) from ...decorators import ( needs_read_lock, ) from ...mutabletree import needs_tree_write_lock from .dir import ( LocalGitDir, ) from .tree import ( changes_from_git_changes, tree_delta_from_git_changes, ) from .mapping import ( GitFileIdMap, mode_kind, ) IGNORE_FILENAME = ".gitignore" class GitWorkingTree(workingtree.WorkingTree): """A Git working tree.""" def __init__(self, bzrdir, repo, branch, index): self.basedir = bzrdir.root_transport.local_abspath('.').encode(osutils._fs_enc) self.bzrdir = bzrdir self.repository = repo self.store = self.repository._git.object_store self.mapping = self.repository.get_mapping() self._branch = branch self._transport = bzrdir.transport self._format = GitWorkingTreeFormat() self.index = index self._versioned_dirs = None self.views = self._make_views() self._rules_searcher = None self._detect_case_handling() self._reset_data() self._fileid_map = self._basis_fileid_map.copy() self._lock_mode = None self._lock_count = 0 def lock_read(self): """Lock the repository for read operations. :return: A bzrlib.lock.LogicalLockResult. 
""" if not self._lock_mode: self._lock_mode = 'r' self._lock_count = 1 self.index.read() else: self._lock_count += 1 self.branch.lock_read() return lock.LogicalLockResult(self.unlock) def lock_tree_write(self): if not self._lock_mode: self._lock_mode = 'w' self._lock_count = 1 self.index.read() elif self._lock_mode == 'r': raise errors.ReadOnlyError(self) else: self._lock_count +=1 self.branch.lock_read() return lock.LogicalLockResult(self.unlock) def lock_write(self, token=None): if not self._lock_mode: self._lock_mode = 'w' self._lock_count = 1 self.index.read() elif self._lock_mode == 'r': raise errors.ReadOnlyError(self) else: self._lock_count +=1 self.branch.lock_write() return lock.LogicalLockResult(self.unlock) def is_locked(self): return self._lock_count >= 1 def get_physical_lock_status(self): return False def unlock(self): if not self._lock_count: return lock.cant_unlock_not_held(self) self.branch.unlock() self._cleanup() self._lock_count -= 1 if self._lock_count > 0: return self._lock_mode = None def _detect_case_handling(self): try: self._transport.stat(".git/cOnFiG") except errors.NoSuchFile: self.case_sensitive = True else: self.case_sensitive = False def merge_modified(self): return {} def set_parent_trees(self, parents_list, allow_leftmost_as_ghost=False): self.set_parent_ids([p for p, t in parents_list]) def iter_children(self, file_id): dpath = self.id2path(file_id) + "/" if dpath in self.index: return for path in self.index: if not path.startswith(dpath): continue if "/" in path[len(dpath):]: # Not a direct child but something further down continue yield self.path2id(path) def _index_add_entry(self, path, file_id, kind): assert self._lock_mode is not None assert isinstance(path, basestring) assert type(file_id) == str or file_id is None if kind == "directory": # Git indexes don't contain directories return if kind == "file": blob = Blob() try: file, stat_val = self.get_file_with_stat(file_id, path) except (errors.NoSuchFile, IOError): # TODO: Rather than come up with something here, use the old index file = StringIO() stat_val = os.stat_result( (stat.S_IFREG | 0644, 0, 0, 0, 0, 0, 0, 0, 0, 0)) blob.set_raw_string(file.read()) elif kind == "symlink": blob = Blob() try: stat_val = os.lstat(self.abspath(path)) except (errors.NoSuchFile, OSError): # TODO: Rather than come up with something here, use the # old index stat_val = os.stat_result( (stat.S_IFLNK, 0, 0, 0, 0, 0, 0, 0, 0, 0)) blob.set_raw_string( self.get_symlink_target(file_id, path).encode("utf-8")) else: raise AssertionError("unknown kind '%s'" % kind) # Add object to the repository if it didn't exist yet if not blob.id in self.store: self.store.add_object(blob) # Add an entry to the index or update the existing entry flags = 0 # FIXME encoded_path = path.encode("utf-8") self.index[encoded_path] = index_entry_from_stat( stat_val, blob.id, flags) if self._versioned_dirs is not None: self._ensure_versioned_dir(encoded_path) def _ensure_versioned_dir(self, dirname): if dirname in self._versioned_dirs: return if dirname != "": self._ensure_versioned_dir(posixpath.dirname(dirname)) self._versioned_dirs.add(dirname) def _load_dirs(self): assert self._lock_mode is not None self._versioned_dirs = set() for p in self.index: self._ensure_versioned_dir(posixpath.dirname(p)) def _unversion_path(self, path): assert self._lock_mode is not None encoded_path = path.encode("utf-8") try: del self.index[encoded_path] except KeyError: # A directory, perhaps? 
for p in list(self.index): if p.startswith(encoded_path+"/"): del self.index[p] # FIXME: remove empty directories @needs_tree_write_lock def unversion(self, file_ids): for file_id in file_ids: path = self.id2path(file_id) self._unversion_path(path) self.flush() def check_state(self): """Check that the working state is/isn't valid.""" pass @needs_tree_write_lock def remove(self, files, verbose=False, to_file=None, keep_files=True, force=False): """Remove nominated files from the working tree metadata. :param files: File paths relative to the basedir. :param keep_files: If true, the files will also be kept. :param force: Delete files and directories, even if they are changed and even if the directories are not empty. """ all_files = set() # specified and nested files if isinstance(files, basestring): files = [files] if to_file is None: to_file = sys.stdout files = list(all_files) if len(files) == 0: return # nothing to do # Sort needed to first handle directory content before the directory files.sort(reverse=True) def backup(file_to_backup): abs_path = self.abspath(file_to_backup) backup_name = self.bzrdir._available_backup_name(file_to_backup) osutils.rename(abs_path, self.abspath(backup_name)) return "removed %s (but kept a copy: %s)" % ( file_to_backup, backup_name) for f in files: fid = self.path2id(f) if not fid: message = "%s is not versioned." % (f,) else: abs_path = self.abspath(f) if verbose: # having removed it, it must be either ignored or unknown if self.is_ignored(f): new_status = 'I' else: new_status = '?' # XXX: Really should be a more abstract reporter interface kind_ch = osutils.kind_marker(self.kind(fid)) to_file.write(new_status + ' ' + f + kind_ch + '\n') # Unversion file # FIXME: _unversion_path() is O(size-of-index) for directories self._unversion_path(f) message = "removed %s" % (f,) if osutils.lexists(abs_path): if (osutils.isdir(abs_path) and len(os.listdir(abs_path)) > 0): if force: osutils.rmtree(abs_path) message = "deleted %s" % (f,) else: message = backup(f) else: if not keep_files: osutils.delete_any(abs_path) message = "deleted %s" % (f,) # print only one message (if any) per file. 
if message is not None: trace.note(message) self.flush() def _add(self, files, ids, kinds): for (path, file_id, kind) in zip(files, ids, kinds): if file_id is not None: self._fileid_map.set_file_id(path.encode("utf-8"), file_id) else: file_id = self._fileid_map.lookup_file_id(path.encode("utf-8")) self._index_add_entry(path, file_id, kind) @needs_tree_write_lock def smart_add(self, file_list, recurse=True, action=None, save=True): added = [] ignored = {} user_dirs = [] for filepath in osutils.canonical_relpaths(self.basedir, file_list): abspath = self.abspath(filepath) kind = osutils.file_kind(abspath) if action is not None: file_id = action(self, None, filepath, kind) else: file_id = None if kind in ("file", "symlink"): if save: self._index_add_entry(filepath, file_id, kind) added.append(filepath) elif kind == "directory": if recurse: user_dirs.append(filepath) else: raise errors.BadFileKindError(filename=abspath, kind=kind) for user_dir in user_dirs: abs_user_dir = self.abspath(user_dir) for name in os.listdir(abs_user_dir): subp = os.path.join(user_dir, name) if self.is_control_filename(subp) or self.mapping.is_special_file(subp): continue ignore_glob = self.is_ignored(subp) if ignore_glob is not None: ignored.setdefault(ignore_glob, []).append(subp) continue abspath = self.abspath(subp) kind = osutils.file_kind(abspath) if kind == "directory": user_dirs.append(subp) else: if action is not None: file_id = action(self, None, filepath, kind) else: file_id = None if save: self._index_add_entry(subp, file_id, kind) if added and save: self.flush() return added, ignored def _set_root_id(self, file_id): self._fileid_map.set_file_id("", file_id) @needs_tree_write_lock def move(self, from_paths, to_dir=None, after=False): rename_tuples = [] to_abs = self.abspath(to_dir) if not os.path.isdir(to_abs): raise errors.BzrMoveFailedError('', to_dir, errors.NotADirectory(to_abs)) for from_rel in from_paths: from_tail = os.path.split(from_rel)[-1] to_rel = os.path.join(to_dir, from_tail) self.rename_one(from_rel, to_rel, after=after) rename_tuples.append((from_rel, to_rel)) self.flush() return rename_tuples @needs_tree_write_lock def rename_one(self, from_rel, to_rel, after=False): from_path = from_rel.encode("utf-8") to_path = to_rel.encode("utf-8") if not self.has_filename(to_rel): raise errors.BzrMoveFailedError(from_rel, to_rel, errors.NoSuchFile(to_rel)) if not from_path in self.index: raise errors.BzrMoveFailedError(from_rel, to_rel, errors.NotVersionedError(path=from_rel)) if not after: os.rename(self.abspath(from_rel), self.abspath(to_rel)) self.index[to_path] = self.index[from_path] del self.index[from_path] self.flush() def get_root_id(self): return self.path2id("") def _has_dir(self, path): if path == "": return True if self._versioned_dirs is None: self._load_dirs() return path in self._versioned_dirs @needs_read_lock def path2id(self, path): if type(path) is list: path = u"/".join(path) encoded_path = path.encode("utf-8") if self._is_versioned(encoded_path): return self._fileid_map.lookup_file_id(encoded_path) return None def _iter_files_recursive(self, from_dir=None): if from_dir is None: from_dir = "" for (dirpath, dirnames, filenames) in os.walk(self.abspath(from_dir)): dir_relpath = dirpath[len(self.basedir):].strip("/") if self.bzrdir.is_control_filename(dir_relpath): continue for filename in filenames: if not self.mapping.is_special_file(filename): yield os.path.join(dir_relpath, filename) @needs_read_lock def extras(self): """Yield all unversioned files in this WorkingTree. 
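
        That is, paths that exist on disk inside the working tree but have
        no entry in the Git index.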
""" return set(self._iter_files_recursive()) - set(self.index) @needs_tree_write_lock def flush(self): # TODO: Maybe this should only write on dirty ? if self._lock_mode != 'w': raise errors.NotWriteLocked(self) self.index.write() @needs_read_lock def __iter__(self): for path in self.index: yield self.path2id(path) self._load_dirs() for path in self._versioned_dirs: yield self.path2id(path) def has_or_had_id(self, file_id): if self.has_id(file_id): return True if self.had_id(file_id): return True return False def had_id(self, file_id): path = self._basis_fileid_map.lookup_file_id(file_id) try: head = self.repository._git.head() except KeyError: # Assume no if basis is not accessible return False if head == ZERO_SHA: return False root_tree = self.store[head].tree try: tree_lookup_path(self.store.__getitem__, root_tree, path) except KeyError: return False else: return True def has_id(self, file_id): try: self.id2path(file_id) except errors.NoSuchId: return False else: return True @needs_read_lock def id2path(self, file_id): assert type(file_id) is str, "file id not a string: %r" % file_id file_id = osutils.safe_utf8(file_id) path = self._fileid_map.lookup_path(file_id) # FIXME: What about directories? if self._is_versioned(path): return path.decode("utf-8") raise errors.NoSuchId(self, file_id) def get_file_mtime(self, file_id, path=None): """See Tree.get_file_mtime.""" if not path: path = self.id2path(file_id) return os.lstat(self.abspath(path)).st_mtime def get_ignore_list(self): ignoreset = getattr(self, '_ignoreset', None) if ignoreset is not None: return ignoreset ignore_globs = set() ignore_globs.update(ignores.get_runtime_ignores()) ignore_globs.update(ignores.get_user_ignores()) if self.has_filename(IGNORE_FILENAME): f = self.get_file_byname(IGNORE_FILENAME) try: # FIXME: Parse git file format, rather than assuming it's # the same as for bzr's native formats. 
ignore_globs.update(ignores.parse_ignore_file(f)) finally: f.close() self._ignoreset = ignore_globs return ignore_globs def set_last_revision(self, revid): self._change_last_revision(revid) def _reset_data(self): try: head = self.repository._git.head() except KeyError, name: raise errors.NotBranchError("branch %s at %s" % (name, self.repository.base)) if head == ZERO_SHA: self._basis_fileid_map = GitFileIdMap({}, self.mapping) else: self._basis_fileid_map = self.mapping.get_fileid_map( self.store.__getitem__, self.store[head].tree) @needs_read_lock def get_file_verifier(self, file_id, path=None, stat_value=None): if path is None: path = self.id2path(file_id) try: return ("GIT", self.index[path][-2]) except KeyError: if self._has_dir(path): return ("GIT", None) raise errors.NoSuchId(self, file_id) @needs_read_lock def get_file_sha1(self, file_id, path=None, stat_value=None): if not path: path = self.id2path(file_id) abspath = self.abspath(path).encode(osutils._fs_enc) try: return osutils.sha_file_by_name(abspath) except OSError, (num, msg): if num in (errno.EISDIR, errno.ENOENT): return None raise def revision_tree(self, revid): return self.repository.revision_tree(revid) def _is_versioned(self, path): assert self._lock_mode is not None return (path in self.index or self._has_dir(path)) def filter_unversioned_files(self, files): return set([p for p in files if not self._is_versioned(p.encode("utf-8"))]) def _get_dir_ie(self, path, parent_id): file_id = self.path2id(path) return inventory.InventoryDirectory(file_id, posixpath.basename(path).strip("/"), parent_id) def _add_missing_parent_ids(self, path, dir_ids): if path in dir_ids: return [] parent = posixpath.dirname(path).strip("/") ret = self._add_missing_parent_ids(parent, dir_ids) parent_id = dir_ids[parent] ie = self._get_dir_ie(path, parent_id) dir_ids[path] = ie.file_id ret.append((path, ie)) return ret def _get_file_ie(self, name, path, value, parent_id): assert isinstance(name, unicode) assert isinstance(path, unicode) assert isinstance(value, tuple) and len(value) == 10 (ctime, mtime, dev, ino, mode, uid, gid, size, sha, flags) = value file_id = self.path2id(path) if type(file_id) != str: raise AssertionError kind = mode_kind(mode) ie = inventory.entry_factory[kind](file_id, name, parent_id) if kind == 'symlink': ie.symlink_target = self.get_symlink_target(file_id) else: data = self.get_file_text(file_id, path) ie.text_sha1 = osutils.sha_string(data) ie.text_size = len(data) ie.executable = self.is_executable(file_id, path) ie.revision = None return ie def _is_executable_from_path_and_stat_from_stat(self, path, stat_result): mode = stat_result.st_mode return bool(stat.S_ISREG(mode) and stat.S_IEXEC & mode) @needs_read_lock def stored_kind(self, file_id, path=None): if path is None: path = self.id2path(file_id) try: return mode_kind(self.index[path.encode("utf-8")][4]) except KeyError: # Maybe it's a directory? 
if self._has_dir(path): return "directory" raise errors.NoSuchId(self, file_id) def is_executable(self, file_id, path=None): if getattr(self, "_supports_executable", osutils.supports_executable)(): if not path: path = self.id2path(file_id) mode = os.lstat(self.abspath(path)).st_mode return bool(stat.S_ISREG(mode) and stat.S_IEXEC & mode) else: basis_tree = self.basis_tree() if file_id in basis_tree: return basis_tree.is_executable(file_id) # Default to not executable return False def _is_executable_from_path_and_stat(self, path, stat_result): if getattr(self, "_supports_executable", osutils.supports_executable)(): return self._is_executable_from_path_and_stat_from_stat(path, stat_result) else: return self._is_executable_from_path_and_stat_from_basis(path, stat_result) @needs_read_lock def list_files(self, include_root=False, from_dir=None, recursive=True): # FIXME: Yield non-versioned files if from_dir is None: from_dir = "" dir_ids = {} fk_entries = {'directory': workingtree.TreeDirectory, 'file': workingtree.TreeFile, 'symlink': workingtree.TreeLink} root_ie = self._get_dir_ie(u"", None) if include_root and not from_dir: yield "", "V", root_ie.kind, root_ie.file_id, root_ie dir_ids[u""] = root_ie.file_id if recursive: path_iterator = self._iter_files_recursive(from_dir) else: if from_dir is None: start = self.basedir else: start = os.path.join(self.basedir, from_dir) path_iterator = sorted([os.path.join(from_dir, name) for name in os.listdir(start) if not self.bzrdir.is_control_filename(name) and not self.mapping.is_special_file(name)]) for path in path_iterator: try: value = self.index[path] except KeyError: value = None path = path.decode("utf-8") parent, name = posixpath.split(path) for dir_path, dir_ie in self._add_missing_parent_ids(parent, dir_ids): yield dir_path, "V", dir_ie.kind, dir_ie.file_id, dir_ie if value is not None: ie = self._get_file_ie(name, path, value, dir_ids[parent]) yield path, "V", ie.kind, ie.file_id, ie else: kind = osutils.file_kind(self.abspath(path)) ie = fk_entries[kind]() yield path, "?", kind, None, ie @needs_read_lock def all_file_ids(self): ids = {u"": self.path2id("")} for path in self.index: if self.mapping.is_special_file(path): continue path = path.decode("utf-8") parent = posixpath.dirname(path).strip("/") for e in self._add_missing_parent_ids(parent, ids): pass ids[path] = self.path2id(path) return set(ids.values()) def _directory_is_tree_reference(self, path): # FIXME: Check .gitsubmodules for path return False @needs_read_lock def iter_entries_by_dir(self, specific_file_ids=None, yield_parents=False): # FIXME: Is return order correct? 
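        # Callers of iter_entries_by_dir() generally expect entries to be
        # yielded directory by directory, with a parent appearing before its
        # children, whereas the git index iterates in plain path-sorted
        # order -- hence the FIXME above.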
if yield_parents: raise NotImplementedError(self.iter_entries_by_dir) if specific_file_ids is not None: specific_paths = [self.id2path(file_id) for file_id in specific_file_ids] if specific_paths in ([u""], []): specific_paths = None else: specific_paths = set(specific_paths) else: specific_paths = None root_ie = self._get_dir_ie(u"", None) if specific_paths is None: yield u"", root_ie dir_ids = {u"": root_ie.file_id} for path, value in self.index.iteritems(): if self.mapping.is_special_file(path): continue path = path.decode("utf-8") if specific_paths is not None and not path in specific_paths: continue (parent, name) = posixpath.split(path) try: file_ie = self._get_file_ie(name, path, value, None) except IOError: continue for (dir_path, dir_ie) in self._add_missing_parent_ids(parent, dir_ids): yield dir_path, dir_ie file_ie.parent_id = self.path2id(parent) yield path, file_ie @needs_read_lock def conflicts(self): # FIXME: return _mod_conflicts.ConflictList() def update_basis_by_delta(self, new_revid, delta): # The index just contains content, which won't have changed. self._reset_data() @needs_read_lock def get_canonical_inventory_path(self, path): for p in self.index: if p.lower() == path.lower(): return p else: return path @needs_read_lock def _walkdirs(self, prefix=""): if prefix != "": prefix += "/" per_dir = defaultdict(list) for path, value in self.index.iteritems(): if self.mapping.is_special_file(path): continue if not path.startswith(prefix): continue (dirname, child_name) = posixpath.split(path) dirname = dirname.decode("utf-8") dir_file_id = self.path2id(dirname) assert isinstance(value, tuple) and len(value) == 10 (ctime, mtime, dev, ino, mode, uid, gid, size, sha, flags) = value stat_result = os.stat_result((mode, ino, dev, 1, uid, gid, size, 0, mtime, ctime)) per_dir[(dirname, dir_file_id)].append( (path.decode("utf-8"), child_name.decode("utf-8"), mode_kind(mode), stat_result, self.path2id(path.decode("utf-8")), mode_kind(mode))) return per_dir.iteritems() def _lookup_entry(self, path, update_index=False): assert type(path) == str entry = self.index[path] index_mode = entry[-6] index_sha = entry[-2] disk_path = os.path.join(self.basedir, path) try: disk_stat = os.lstat(disk_path) except OSError, (num, msg): if num in (errno.EISDIR, errno.ENOENT): raise KeyError(path) raise disk_mtime = disk_stat.st_mtime if isinstance(entry[1], tuple): index_mtime = entry[1][0] else: index_mtime = int(entry[1]) mtime_delta = (disk_mtime - index_mtime) disk_mode = cleanup_mode(disk_stat.st_mode) if mtime_delta > 0 or disk_mode != index_mode: if stat.S_ISDIR(disk_mode): try: subrepo = Repo(disk_path) except NotGitRepository: return (None, None) else: disk_mode = S_IFGITLINK git_id = subrepo.head() elif stat.S_ISLNK(disk_mode): blob = Blob.from_string(os.readlink(disk_path).encode('utf-8')) git_id = blob.id elif stat.S_ISREG(disk_mode): with open(disk_path, 'r') as f: blob = Blob.from_string(f.read()) git_id = blob.id else: raise AssertionError if update_index: flags = 0 # FIXME self.index[path] = index_entry_from_stat(disk_stat, git_id, flags, disk_mode) return (git_id, disk_mode) return (index_sha, index_mode) class GitWorkingTreeFormat(workingtree.WorkingTreeFormat): _tree_class = GitWorkingTree supports_versioned_directories = False @property def _matchingbzrdir(self): from .dir import LocalGitControlDirFormat return LocalGitControlDirFormat() def get_format_description(self): return "Git Working Tree" def initialize(self, a_bzrdir, revision_id=None, from_branch=None, 
accelerator_tree=None, hardlink=False): """See WorkingTreeFormat.initialize().""" if not isinstance(a_bzrdir, LocalGitDir): raise errors.IncompatibleFormat(self, a_bzrdir) index = Index(a_bzrdir.root_transport.local_abspath(".git/index")) index.write() return GitWorkingTree(a_bzrdir, a_bzrdir.open_repository(), a_bzrdir.open_branch(), index) class InterIndexGitTree(tree.InterTree): """InterTree that works between a Git revision tree and an index.""" def __init__(self, source, target): super(InterIndexGitTree, self).__init__(source, target) self._index = target.index @classmethod def is_compatible(cls, source, target): from .repository import GitRevisionTree return (isinstance(source, GitRevisionTree) and isinstance(target, GitWorkingTree)) @needs_read_lock def compare(self, want_unchanged=False, specific_files=None, extra_trees=None, require_versioned=False, include_root=False, want_unversioned=False): # FIXME: Handle include_root changes = changes_between_git_tree_and_index( self.source.store, self.source.tree, self.target, want_unchanged=want_unchanged, want_unversioned=want_unversioned) source_fileid_map = self.source._fileid_map target_fileid_map = self.target._fileid_map ret = tree_delta_from_git_changes(changes, self.target.mapping, (source_fileid_map, target_fileid_map), specific_file=specific_files, require_versioned=require_versioned) if want_unversioned: for e in self.target.extras(): ret.unversioned.append((e, None, osutils.file_kind(self.target.abspath(e)))) return ret @needs_read_lock def iter_changes(self, include_unchanged=False, specific_files=None, pb=None, extra_trees=[], require_versioned=True, want_unversioned=False): changes = changes_between_git_tree_and_index( self.source.store, self.source.tree, self.target, want_unchanged=include_unchanged, want_unversioned=want_unversioned) return changes_from_git_changes(changes, self.target.mapping, specific_file=specific_files) tree.InterTree.register_optimiser(InterIndexGitTree) def changes_between_git_tree_and_index(object_store, tree, target, want_unchanged=False, want_unversioned=False, update_index=False): """Determine the changes between a git tree and a working tree with index. """ names = target.index._byname.keys() for (name, mode, sha) in changes_from_tree(names, target._lookup_entry, object_store, tree, want_unchanged=want_unchanged): if name == (None, None): continue yield (name, mode, sha) bzr-git-0.6.13+bzr1649/notes/git-serve.txt0000644000000000000000000000110113165530605016215 0ustar 00000000000000Git serve Todo: * Fix the pack creation code in Dulwich. It doesn't generate deltas quite well at the moment. (http://pad.lv/562673) * Switch to the new pack-based format once John's work on PackCollections is finished. This should give very nice performance improvements, in particular the caching of Trees. (http://pad.lv/520694) * Support using the cached trees in BazaarObjectStore.generate_pack_contents, rather than calling out to _revision_to_objects as the latter is slow (it uses inventories). * Support roundtripping (http://pad.lv/544776) bzr-git-0.6.13+bzr1649/notes/mapping.txt0000644000000000000000000000277413165530605015764 0ustar 00000000000000Mapping between Git and Bazaar is generally straightforward. Mapping version 1 ================= All revision ids created in this mapping format are prefixed with "git-v1:". This mapping format does not support roundtripped revisions from Bazaar; pushing or pulling from Bazaar into Git is not possible. 
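As a rough illustration, the revision id conversion implied by this scheme can
be sketched as follows (the helper names below are illustrative only, not the
plugin's actual API):

  def git_sha_to_bzr_revid(hex_sha):
      # mapping version 1 simply prefixes the hex commit sha
      return "git-v1:" + hex_sha

  def bzr_revid_to_git_sha(revid):
      if not revid.startswith("git-v1:"):
          raise ValueError("not a git-v1 mapped revision id")
      return revid[len("git-v1:"):]

Only revision ids carrying the "git-v1:" prefix can be converted back to a git
commit sha.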
dpush is possible and more or less does the opposite of the mapping described in this section. Commits ------- Git commits are mapped to Bazaar revisions. Bazaar revision ids are created by prefixing the (hex format) of git commit sha with "git-v1:". Commit properties are as follows: * git committer string: mapped to the Bazaar committer string * git committer timestamp: mapped to the Bazaar commit timestamp * git author string: mapped to the Bazaar 'author' revision property, if it is different from the committer string * git author timestamp: ignored * git commit message: mapped to Bazaar commit message The git committer string, author string and commit message are assumed to be encoded in UTF-8. Any utf-8-invalid characters are ignored. Trees and blobs --------------- Git trees are generally converted to Bazaar directories, Git blobs are generally converted to Bazaar files and symlinks. Since all git trees are mapped *including* the root tree, it is only possible to create mapped rich-root revisions. File ids for all objects are simply created by taking their path and escaping invalid characters in them: * _ is mapped to __ * spaces are mapped to _s * \x0c is mapped to _c bzr-git-0.6.13+bzr1649/notes/roundtripping.txt0000644000000000000000000000035713165530605017230 0ustar 00000000000000Bzr revision metadata that doesn't exist in git: - revision ids - revision properties - ghost parents - file ids * git-sha+path -> fileid mapping refs/bzr/ refs to be able to find Git revisions based on Bazaar revision ids. bzr-git-0.6.13+bzr1649/po/bzr-git.pot0000644000000000000000000000320313165530605015144 0ustar 00000000000000# SOME DESCRIPTIVE TITLE. # Copyright (C) YEAR Canonical Ltd # This file is distributed under the same license as the PACKAGE package. # FIRST AUTHOR , YEAR. # #, fuzzy msgid "" msgstr "" "Project-Id-Version: bzr-git\n" "Report-Msgid-Bugs-To: \n" "POT-Creation-Date: 2011-12-05 02:21+0100\n" "PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n" "Last-Translator: FULL NAME \n" "Language-Team: LANGUAGE \n" "Language: \n" "MIME-Version: 1.0\n" "Content-Type: text/plain; charset=CHARSET\n" "Content-Transfer-Encoding: 8bit\n" #: commands.py:100 #, python-format msgid "%r is not a git repository" msgstr "" #: commands.py:112 msgid "Target repository doesn't support rich roots" msgstr "" #: commands.py:132 msgid "creating branches" msgstr "" #: commands.py:153 msgid "" "Use 'bzr checkout' to create a working tree in the newly created branches." msgstr "" #: commands.py:194 #, python-format msgid "Object not found: %s" msgstr "" #: commands.py:273 msgid "error running patch" msgstr "" #: commands.py:33 msgid "Import all branches from a git repository." msgstr "" #: commands.py:35 msgid " " msgstr "" #: commands.py:242 msgid "Apply a series of git-am style patches." msgstr "" #: commands.py:244 msgid "" "This command will in the future probably be integrated into \n" "\"bzr pull\"." msgstr "" # help of 'signoff' option of 'git-apply' command #: commands.py:249 msgid "Add a Signed-off-by line." msgstr "" # help of 'force' option of 'git-apply' command #: commands.py:251 msgid "Apply patches even if tree has uncommitted changes." 
msgstr "" bzr-git-0.6.13+bzr1649/tests/__init__.py0000644000000000000000000001544713165530605015714 0ustar 00000000000000# Copyright (C) 2006, 2007 Canonical Ltd # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """The basic test suite for bzr-git.""" from cStringIO import StringIO import time from .... import ( errors as bzr_errors, tests, ) try: from ....tests.features import Feature except ImportError: # bzr < 2.5 from ....tests import Feature from .. import ( import_dulwich, ) from fastimport import ( commands, ) TestCase = tests.TestCase TestCaseInTempDir = tests.TestCaseInTempDir TestCaseWithTransport = tests.TestCaseWithTransport TestCaseWithMemoryTransport = tests.TestCaseWithMemoryTransport class _DulwichFeature(Feature): def _probe(self): try: import_dulwich() except bzr_errors.DependencyNotPresent: return False return True def feature_name(self): return 'dulwich' DulwichFeature = _DulwichFeature() class GitBranchBuilder(object): def __init__(self, stream=None): self.commit_info = [] self.orig_stream = stream if stream is None: self.stream = StringIO() else: self.stream = stream self._counter = 0 self._branch = 'refs/heads/master' def set_branch(self, branch): """Set the branch we are committing.""" self._branch = branch def _write(self, text): self.stream.write(text) def _writelines(self, lines): self.stream.writelines(lines) def _create_blob(self, content): self._counter += 1 blob = commands.BlobCommand(str(self._counter), content) self._write(str(blob)+"\n") return self._counter def set_symlink(self, path, content): """Create or update symlink at a given path.""" mark = self._create_blob(content) mode = '120000' self.commit_info.append('M %s :%d %s\n' % (mode, mark, self._encode_path(path))) def set_file(self, path, content, executable): """Create or update content at a given path.""" mark = self._create_blob(content) if executable: mode = '100755' else: mode = '100644' self.commit_info.append('M %s :%d %s\n' % (mode, mark, self._encode_path(path))) def set_link(self, path, link_target): """Create or update a link at a given path.""" mark = self._create_blob(link_target) self.commit_info.append('M 120000 :%d %s\n' % (mark, self._encode_path(path))) def delete_entry(self, path): """This will delete files or symlinks at the given location.""" self.commit_info.append('D %s\n' % (self._encode_path(path),)) @staticmethod def _encode_path(path): if '\n' in path or path[0] == '"': path = path.replace('\\', '\\\\') path = path.replace('\n', '\\n') path = path.replace('"', '\\"') path = '"' + path + '"' return path.encode('utf-8') # TODO: Author # TODO: Author timestamp+timezone def commit(self, committer, message, timestamp=None, timezone='+0000', author=None, merge=None, base=None): """Commit the new content. 
:param committer: The name and address for the committer :param message: The commit message :param timestamp: The timestamp for the commit :param timezone: The timezone of the commit, such as '+0000' or '-1000' :param author: The name and address of the author (if different from committer) :param merge: A list of marks if this should merge in another commit :param base: An id for the base revision (primary parent) if that is not the last commit. :return: A mark which can be used in the future to reference this commit. """ self._counter += 1 mark = str(self._counter) if timestamp is None: timestamp = int(time.time()) self._write('commit %s\n' % (self._branch,)) self._write('mark :%s\n' % (mark,)) self._write('committer %s %s %s\n' % (committer, timestamp, timezone)) message = message.encode('UTF-8') self._write('data %d\n' % (len(message),)) self._write(message) self._write('\n') if base is not None: self._write('from :%s\n' % (base,)) if merge is not None: for m in merge: self._write('merge :%s\n' % (m,)) self._writelines(self.commit_info) self._write('\n') self.commit_info = [] return mark def reset(self, ref=None, mark=None): """Create or recreate the named branch. :param ref: branch name, defaults to the current branch. :param mark: commit the branch will point to. """ if ref is None: ref = self._branch self._write('reset %s\n' % (ref,)) if mark is not None: self._write('from :%s\n' % mark) self._write('\n') def finish(self): """We are finished building, close the stream, get the id mapping""" self.stream.seek(0) if self.orig_stream is None: from dulwich.repo import Repo r = Repo(".") from dulwich.fastexport import GitImportProcessor importer = GitImportProcessor(r) return importer.import_stream(self.stream) def test_suite(): loader = tests.TestUtil.TestLoader() suite = tests.TestUtil.TestSuite() testmod_names = [ 'test_blackbox', 'test_builder', 'test_branch', 'test_cache', 'test_dir', 'test_fetch', 'test_git_remote_helper', 'test_mapping', 'test_object_store', 'test_pristine_tar', 'test_push', 'test_remote', 'test_repository', 'test_refs', 'test_revspec', 'test_roundtrip', 'test_server', 'test_transportgit', 'test_unpeel_map', 'test_workingtree', ] testmod_names = ['%s.%s' % (__name__, t) for t in testmod_names] suite.addTests(loader.loadTestsFromModuleNames(testmod_names)) return suite bzr-git-0.6.13+bzr1649/tests/test_blackbox.py0000644000000000000000000002520013165530605016765 0ustar 00000000000000# Copyright (C) 2007 David Allouche # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Black-box tests for bzr-git.""" from dulwich.repo import ( Repo as GitRepo, ) import os from .... import ( version_info as bzrlib_version, ) from ....bzrdir import ( BzrDir, ) from ....tests.blackbox import ExternalBase from .. import ( tests, ) class TestGitBlackBox(ExternalBase): def simple_commit(self): # Create a git repository with a revision. 
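        # tests.GitBranchBuilder writes a git fast-import stream; finish()
        # replays that stream into the repository via dulwich's
        # GitImportProcessor and returns a dict mapping fast-import marks to
        # the resulting commit SHAs.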
repo = GitRepo.init(self.test_dir) builder = tests.GitBranchBuilder() builder.set_file('a', 'text for a\n', False) r1 = builder.commit('Joe Foo ', u'') return repo, builder.finish()[r1] def test_nick(self): r = GitRepo.init(self.test_dir) del r["HEAD"] dir = BzrDir.open(self.test_dir) dir.create_branch() output, error = self.run_bzr(['nick']) self.assertEquals("HEAD\n", output) def test_info(self): self.simple_commit() output, error = self.run_bzr(['info']) self.assertEqual(error, '') self.assertTrue("Standalone tree (format: git)" in output) def test_branch(self): os.mkdir("gitbranch") GitRepo.init(os.path.join(self.test_dir, "gitbranch")) os.chdir('gitbranch') builder = tests.GitBranchBuilder() builder.set_file('a', 'text for a\n', False) builder.commit('Joe Foo ', u'') builder.finish() os.chdir('..') output, error = self.run_bzr(['branch', 'gitbranch', 'bzrbranch']) self.assertTrue( (error == 'Branched 1 revision(s).\n') or (error == 'Branched 1 revision.\n'), error) def test_checkout(self): os.mkdir("gitbranch") GitRepo.init(os.path.join(self.test_dir, "gitbranch")) os.chdir('gitbranch') builder = tests.GitBranchBuilder() builder.set_file('a', 'text for a\n', False) builder.commit('Joe Foo ', u'') builder.finish() os.chdir('..') output, error = self.run_bzr(['checkout', 'gitbranch', 'bzrbranch']) self.assertEqual(error, '') self.assertEqual(output, '') def test_branch_ls(self): self.simple_commit() output, error = self.run_bzr(['ls', '-r-1']) self.assertEqual(error, '') self.assertEqual(output, "a\n") def test_init(self): self.run_bzr("init --git repo") def test_info_verbose(self): self.simple_commit() output, error = self.run_bzr(['info', '-v']) self.assertEqual(error, '') self.assertTrue("Standalone tree (format: git)" in output) self.assertTrue("control: Local Git Repository" in output) self.assertTrue("branch: Git Branch" in output) self.assertTrue("repository: Git Repository" in output) def test_push_roundtripping(self): self.knownFailure("roundtripping is not yet supported") self.with_roundtripping() os.mkdir("bla") GitRepo.init(os.path.join(self.test_dir, "bla")) self.run_bzr(['init', 'foo']) self.run_bzr(['commit', '--unchanged', '-m', 'bla', 'foo']) # when roundtripping is supported output, error = self.run_bzr(['push', '-d', 'foo', 'bla']) self.assertEquals("", output) self.assertTrue(error.endswith("Created new branch.\n")) def test_log(self): # Smoke test for "bzr log" in a git repository. self.simple_commit() # Check that bzr log does not fail and includes the revision. output, error = self.run_bzr(['log']) self.assertEqual(error, '') self.assertTrue( '' in output, "Commit message was not found in output:\n%s" % (output,)) def test_log_verbose(self): # Smoke test for "bzr log -v" in a git repository. self.simple_commit() # Check that bzr log does not fail and includes the revision. 
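        # Only successful execution is exercised here; the output itself is
        # not asserted.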
output, error = self.run_bzr(['log', '-v']) def test_tags(self): git_repo, commit_sha1 = self.simple_commit() git_repo.refs["refs/tags/foo"] = commit_sha1 output, error = self.run_bzr(['tags']) self.assertEquals(error, '') self.assertEquals(output, "foo 1\n") def test_tag(self): self.simple_commit() output, error = self.run_bzr(["tag", "bar"]) # bzr <= 2.2 emits this message in the output stream # bzr => 2.3 emits this message in the error stream self.assertEquals(error + output, 'Created tag bar.\n') def test_init_repo(self): output, error = self.run_bzr(["init", "--git", "bla.git"]) self.assertEquals(error, '') self.assertEquals(output, 'Created a standalone tree (format: git)\n') def test_diff_format(self): tree = self.make_branch_and_tree('.') self.build_tree(['a']) tree.add(['a']) output, error = self.run_bzr(['diff', '--format=git'], retcode=1) self.assertEqual(error, '') self.assertEqual(output, 'diff --git /dev/null b/a\n' 'old mode 0\n' 'new mode 100644\n' 'index 0000000..c197bd8 100644\n' '--- /dev/null\n' '+++ b/a\n' '@@ -0,0 +1 @@\n' '+contents of a\n') def test_git_import_uncolocated(self): r = GitRepo.init("a", mkdir=True) self.build_tree(["a/file"]) r.stage("file") r.do_commit(ref="refs/heads/abranch", committer="Joe ", message="Dummy") r.do_commit(ref="refs/heads/bbranch", committer="Joe ", message="Dummy") self.run_bzr(["git-import", "a", "b"]) self.assertEquals(set([".bzr", "abranch", "bbranch"]), set(os.listdir("b"))) def test_git_import(self): r = GitRepo.init("a", mkdir=True) self.build_tree(["a/file"]) r.stage("file") r.do_commit(ref="refs/heads/abranch", committer="Joe ", message="Dummy") r.do_commit(ref="refs/heads/bbranch", committer="Joe ", message="Dummy") self.run_bzr(["git-import", "--colocated", "a", "b"]) self.assertEquals(set([".bzr"]), set(os.listdir("b"))) self.assertEquals(set(["abranch", "bbranch"]), set(BzrDir.open("b").get_branches().keys())) def test_git_import_incremental(self): r = GitRepo.init("a", mkdir=True) self.build_tree(["a/file"]) r.stage("file") r.do_commit(ref="refs/heads/abranch", committer="Joe ", message="Dummy") self.run_bzr(["git-import", "--colocated", "a", "b"]) self.run_bzr(["git-import", "--colocated", "a", "b"]) self.assertEquals(set([".bzr"]), set(os.listdir("b"))) b = BzrDir.open("b") self.assertEquals(["abranch"], b.get_branches().keys()) def test_git_import_tags(self): r = GitRepo.init("a", mkdir=True) self.build_tree(["a/file"]) r.stage("file") cid = r.do_commit(ref="refs/heads/abranch", committer="Joe ", message="Dummy") r["refs/tags/atag"] = cid self.run_bzr(["git-import", "--colocated", "a", "b"]) self.assertEquals(set([".bzr"]), set(os.listdir("b"))) b = BzrDir.open("b") self.assertEquals(["abranch"], b.get_branches().keys()) self.assertEquals(["atag"], b.open_branch("abranch").tags.get_tag_dict().keys()) def test_git_import_colo(self): r = GitRepo.init("a", mkdir=True) self.build_tree(["a/file"]) r.stage("file") r.do_commit(ref="refs/heads/abranch", committer="Joe ", message="Dummy") r.do_commit(ref="refs/heads/bbranch", committer="Joe ", message="Dummy") self.make_bzrdir("b", format="development-colo") self.run_bzr(["git-import", "--colocated", "a", "b"]) self.assertEquals( set([b.name for b in BzrDir.open("b").list_branches()]), set(["abranch", "bbranch"])) def test_git_refs_from_git(self): r = GitRepo.init("a", mkdir=True) self.build_tree(["a/file"]) r.stage("file") cid = r.do_commit(ref="refs/heads/abranch", committer="Joe ", message="Dummy") r["refs/tags/atag"] = cid (stdout, stderr) = self.run_bzr(["git-refs", 
"a"]) self.assertEquals(stderr, "") self.assertEquals(stdout, 'refs/tags/atag -> ' + cid + '\n' 'refs/heads/abranch -> ' + cid + '\n') def test_git_refs_from_bzr(self): tree = self.make_branch_and_tree('a') self.build_tree(["a/file"]) tree.add(["file"]) revid = tree.commit(committer="Joe ", message="Dummy") tree.branch.tags.set_tag("atag", revid) (stdout, stderr) = self.run_bzr(["git-refs", "a"]) self.assertEquals(stderr, "") self.assertTrue("refs/tags/atag -> " in stdout) self.assertTrue("HEAD -> " in stdout) def test_dpush(self): r = GitRepo.init("gitr", mkdir=True) self.build_tree_contents([("gitr/foo", "hello from git")]) r.stage("foo") r.do_commit("message", committer="Somebody ") self.run_bzr(["branch", "gitr", "bzrb"]) self.build_tree_contents([("bzrb/foo", "hello from bzr")]) self.run_bzr(["commit", "-m", "msg", "bzrb"]) self.run_bzr(["dpush", "-d", "bzrb", "gitr"]) def test_dpush_from_bound(self): r = GitRepo.init("gitr", mkdir=True) self.build_tree_contents([("gitr/foo", "hello from git")]) r.stage("foo") r.do_commit("message", committer="Somebody ") self.run_bzr(["branch", "gitr", "bzrm"]) self.run_bzr(["checkout", "bzrm", "bzrb"]) self.build_tree_contents([("bzrb/foo", "hello from bzr")]) self.run_bzr(["commit", "-m", "msg", "bzrb"]) self.run_bzr(["dpush", "-d", "bzrb", "gitr"]) bzr-git-0.6.13+bzr1649/tests/test_branch.py0000644000000000000000000002265413165530605016447 0ustar 00000000000000# Copyright (C) 2007 Canonical Ltd # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Tests for interfacing with a Git Branch""" import dulwich from dulwich.objects import ( Commit, Tag, ) from dulwich.repo import ( Repo as GitRepo, ) import os import urllib from .... import ( errors, revision, urlutils, ) from ....branch import ( Branch, InterBranch, ) from ....bzrdir import ( BzrDir, ) from ....repository import ( Repository, ) from .. 
import ( branch, tests, ) from ..dir import ( LocalGitControlDirFormat, ) from ..mapping import ( default_mapping, ) class TestGitBranch(tests.TestCaseInTempDir): def test_open_by_ref(self): GitRepo.init('.') url = "%s,ref=%s" % ( urlutils.local_path_to_url(self.test_dir), urllib.quote("refs/remotes/origin/unstable", safe='') ) d = BzrDir.open(url) b = d.create_branch() self.assertEquals(b.ref, "refs/remotes/origin/unstable") def test_open_existing(self): r = GitRepo.init('.') del r.refs["HEAD"] d = BzrDir.open('.') thebranch = d.create_branch() self.assertIsInstance(thebranch, branch.GitBranch) def test_repr(self): r = GitRepo.init('.') del r.refs["HEAD"] d = BzrDir.open('.') thebranch = d.create_branch() self.assertEquals( "" % ( urlutils.local_path_to_url(self.test_dir),), repr(thebranch)) def test_last_revision_is_null(self): r = GitRepo.init('.') del r.refs["HEAD"] thedir = BzrDir.open('.') thebranch = thedir.create_branch() self.assertEqual(revision.NULL_REVISION, thebranch.last_revision()) self.assertEqual((0, revision.NULL_REVISION), thebranch.last_revision_info()) def simple_commit_a(self): r = GitRepo.init('.') self.build_tree(['a']) r.stage(["a"]) return r.do_commit("a", committer="Somebody ") def test_last_revision_is_valid(self): head = self.simple_commit_a() thebranch = Branch.open('.') self.assertEqual(default_mapping.revision_id_foreign_to_bzr(head), thebranch.last_revision()) def test_last_revision_info(self): reva = self.simple_commit_a() self.build_tree(['b']) r = GitRepo(".") r.stage("b") revb = r.do_commit("b", committer="Somebody ") thebranch = Branch.open('.') self.assertEquals((2, default_mapping.revision_id_foreign_to_bzr(revb)), thebranch.last_revision_info()) def test_tag_annotated(self): reva = self.simple_commit_a() o = Tag() o.name = "foo" o.tagger = "Jelmer " o.message = "add tag" o.object = (Commit, reva) o.tag_timezone = 0 o.tag_time = 42 r = GitRepo(".") r.object_store.add_object(o) r['refs/tags/foo'] = o.id thebranch = Branch.open('.') self.assertEquals({"foo": default_mapping.revision_id_foreign_to_bzr(reva)}, thebranch.tags.get_tag_dict()) def test_tag(self): reva = self.simple_commit_a() r = GitRepo(".") r.refs["refs/tags/foo"] = reva thebranch = Branch.open('.') self.assertEquals({"foo": default_mapping.revision_id_foreign_to_bzr(reva)}, thebranch.tags.get_tag_dict()) class TestWithGitBranch(tests.TestCaseWithTransport): def setUp(self): tests.TestCaseWithTransport.setUp(self) r = dulwich.repo.Repo.create(self.test_dir) del r.refs["HEAD"] d = BzrDir.open(self.test_dir) self.git_branch = d.create_branch() def test_get_parent(self): self.assertIs(None, self.git_branch.get_parent()) def test_get_stacked_on_url(self): self.assertRaises(errors.UnstackableBranchFormat, self.git_branch.get_stacked_on_url) def test_get_physical_lock_status(self): self.assertFalse(self.git_branch.get_physical_lock_status()) class TestGitBranchFormat(tests.TestCase): def setUp(self): super(TestGitBranchFormat, self).setUp() self.format = branch.GitBranchFormat() def test_get_format_description(self): self.assertEquals("Git Branch", self.format.get_format_description()) def test_get_network_name(self): self.assertEquals("git", self.format.network_name()) def test_supports_tags(self): self.assertTrue(self.format.supports_tags()) class BranchTests(tests.TestCaseInTempDir): def make_onerev_branch(self): os.mkdir("d") os.chdir("d") GitRepo.init('.') bb = tests.GitBranchBuilder() bb.set_file("foobar", "foo\nbar\n", False) mark = bb.commit("Somebody ", "mymsg") gitsha = 
bb.finish()[mark] os.chdir("..") return os.path.abspath("d"), gitsha def make_tworev_branch(self): os.mkdir("d") os.chdir("d") GitRepo.init('.') bb = tests.GitBranchBuilder() bb.set_file("foobar", "foo\nbar\n", False) mark1 = bb.commit("Somebody ", "mymsg") mark2 = bb.commit("Somebody ", "mymsg") marks = bb.finish() os.chdir("..") return "d", (marks[mark1], marks[mark2]) def clone_git_branch(self, from_url, to_url): from_dir = BzrDir.open(from_url) to_dir = from_dir.sprout(to_url) return to_dir.open_branch() def test_single_rev(self): path, gitsha = self.make_onerev_branch() oldrepo = Repository.open(path) revid = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha) self.assertEquals(gitsha, oldrepo._git.get_refs()["refs/heads/master"]) newbranch = self.clone_git_branch(path, "f") self.assertEquals([revid], newbranch.repository.all_revision_ids()) def test_sprouted_tags(self): path, gitsha = self.make_onerev_branch() r = GitRepo(path) r.refs["refs/tags/lala"] = r.head() oldrepo = Repository.open(path) revid = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha) newbranch = self.clone_git_branch(path, "f") self.assertEquals({"lala": revid}, newbranch.tags.get_tag_dict()) self.assertEquals([revid], newbranch.repository.all_revision_ids()) def test_interbranch_pull(self): path, (gitsha1, gitsha2) = self.make_tworev_branch() oldrepo = Repository.open(path) revid2 = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha2) newbranch = self.make_branch('g') inter_branch = InterBranch.get(Branch.open(path), newbranch) inter_branch.pull() self.assertEquals(revid2, newbranch.last_revision()) def test_interbranch_pull_noop(self): path, (gitsha1, gitsha2) = self.make_tworev_branch() oldrepo = Repository.open(path) revid2 = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha2) newbranch = self.make_branch('g') inter_branch = InterBranch.get(Branch.open(path), newbranch) inter_branch.pull() # This is basically "assertNotRaises" inter_branch.pull() self.assertEquals(revid2, newbranch.last_revision()) def test_interbranch_pull_stop_revision(self): path, (gitsha1, gitsha2) = self.make_tworev_branch() oldrepo = Repository.open(path) revid1 = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha1) newbranch = self.make_branch('g') inter_branch = InterBranch.get(Branch.open(path), newbranch) inter_branch.pull(stop_revision=revid1) self.assertEquals(revid1, newbranch.last_revision()) def test_interbranch_pull_with_tags(self): path, (gitsha1, gitsha2) = self.make_tworev_branch() gitrepo = GitRepo(path) gitrepo.refs["refs/tags/sometag"] = gitsha2 oldrepo = Repository.open(path) revid1 = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha1) revid2 = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha2) newbranch = self.make_branch('g') source_branch = Branch.open(path) source_branch.get_config().set_user_option("branch.fetch_tags", True) inter_branch = InterBranch.get(source_branch, newbranch) inter_branch.pull(stop_revision=revid1) self.assertEquals(revid1, newbranch.last_revision()) self.assertTrue(newbranch.repository.has_revision(revid2)) class ForeignTestsBranchFactory(object): def make_empty_branch(self, transport): d = LocalGitControlDirFormat().initialize_on_transport(transport) return d.create_branch() make_branch = make_empty_branch bzr-git-0.6.13+bzr1649/tests/test_builder.py0000644000000000000000000002563113165530605016636 0ustar 00000000000000# Copyright (C) 2007 Canonical Ltd # # This program is free software; you can redistribute it and/or modify # it under the terms of the 
GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Test our ability to build up test repositories""" from cStringIO import StringIO from dulwich.repo import Repo as GitRepo from .. import tests class TestGitBranchBuilder(tests.TestCase): def test__create_blob(self): stream = StringIO() builder = tests.GitBranchBuilder(stream) self.assertEqual(1, builder._create_blob('foo\nbar\n')) self.assertEqualDiff('blob\nmark :1\ndata 8\nfoo\nbar\n\n', stream.getvalue()) def test_set_file(self): stream = StringIO() builder = tests.GitBranchBuilder(stream) builder.set_file('foobar', 'foo\nbar\n', False) self.assertEqualDiff('blob\nmark :1\ndata 8\nfoo\nbar\n\n', stream.getvalue()) self.assertEqual(['M 100644 :1 foobar\n'], builder.commit_info) def test_set_file_unicode(self): stream = StringIO() builder = tests.GitBranchBuilder(stream) builder.set_file(u'f\xb5/bar', 'contents\nbar\n', False) self.assertEqualDiff('blob\nmark :1\ndata 13\ncontents\nbar\n\n', stream.getvalue()) self.assertEqual(['M 100644 :1 f\xc2\xb5/bar\n'], builder.commit_info) def test_set_file_newline(self): stream = StringIO() builder = tests.GitBranchBuilder(stream) builder.set_file(u'foo\nbar', 'contents\nbar\n', False) self.assertEqualDiff('blob\nmark :1\ndata 13\ncontents\nbar\n\n', stream.getvalue()) self.assertEqual(['M 100644 :1 "foo\\nbar"\n'], builder.commit_info) def test_set_file_executable(self): stream = StringIO() builder = tests.GitBranchBuilder(stream) builder.set_file(u'f\xb5/bar', 'contents\nbar\n', True) self.assertEqualDiff('blob\nmark :1\ndata 13\ncontents\nbar\n\n', stream.getvalue()) self.assertEqual(['M 100755 :1 f\xc2\xb5/bar\n'], builder.commit_info) def test_set_link(self): stream = StringIO() builder = tests.GitBranchBuilder(stream) builder.set_link(u'f\xb5/bar', 'link/contents') self.assertEqualDiff('blob\nmark :1\ndata 13\nlink/contents\n', stream.getvalue()) self.assertEqual(['M 120000 :1 f\xc2\xb5/bar\n'], builder.commit_info) def test_set_link_newline(self): stream = StringIO() builder = tests.GitBranchBuilder(stream) builder.set_link(u'foo\nbar', 'link/contents') self.assertEqualDiff('blob\nmark :1\ndata 13\nlink/contents\n', stream.getvalue()) self.assertEqual(['M 120000 :1 "foo\\nbar"\n'], builder.commit_info) def test_delete_entry(self): stream = StringIO() builder = tests.GitBranchBuilder(stream) builder.delete_entry(u'path/to/f\xb5') self.assertEqual(['D path/to/f\xc2\xb5\n'], builder.commit_info) def test_delete_entry_newline(self): stream = StringIO() builder = tests.GitBranchBuilder(stream) builder.delete_entry(u'path/to/foo\nbar') self.assertEqual(['D "path/to/foo\\nbar"\n'], builder.commit_info) def test_encode_path(self): encode = tests.GitBranchBuilder._encode_path # Unicode is encoded to utf-8 self.assertEqual(encode(u'f\xb5'), 'f\xc2\xb5') # The name must be quoted if it starts by a double quote or contains a # newline. 
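        # This mirrors the C-style path quoting used by git fast-import.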
self.assertEqual(encode(u'"foo'), '"\\"foo"') self.assertEqual(encode(u'fo\no'), '"fo\\no"') # When the name is quoted, all backslash and quote chars must be # escaped. self.assertEqual(encode(u'fo\\o\nbar'), '"fo\\\\o\\nbar"') self.assertEqual(encode(u'fo"o"\nbar'), '"fo\\"o\\"\\nbar"') # Other control chars, such as \r, need not be escaped. self.assertEqual(encode(u'foo\r\nbar'), '"foo\r\\nbar"') def test_add_and_commit(self): stream = StringIO() builder = tests.GitBranchBuilder(stream) builder.set_file(u'f\xb5/bar', 'contents\nbar\n', False) self.assertEqual('2', builder.commit('Joe Foo ', u'committing f\xb5/bar', timestamp=1194586400, timezone='+0100')) self.assertEqualDiff('blob\nmark :1\ndata 13\ncontents\nbar\n\n' 'commit refs/heads/master\n' 'mark :2\n' 'committer Joe Foo 1194586400 +0100\n' 'data 18\n' 'committing f\xc2\xb5/bar' '\n' 'M 100644 :1 f\xc2\xb5/bar\n' '\n', stream.getvalue()) def test_commit_base(self): stream = StringIO() builder = tests.GitBranchBuilder(stream) builder.set_file(u'foo', 'contents\nfoo\n', False) r1 = builder.commit('Joe Foo ', u'first', timestamp=1194586400) r2 = builder.commit('Joe Foo ', u'second', timestamp=1194586405) r3 = builder.commit('Joe Foo ', u'third', timestamp=1194586410, base=r1) self.assertEqualDiff('blob\nmark :1\ndata 13\ncontents\nfoo\n\n' 'commit refs/heads/master\n' 'mark :2\n' 'committer Joe Foo 1194586400 +0000\n' 'data 5\n' 'first' '\n' 'M 100644 :1 foo\n' '\n' 'commit refs/heads/master\n' 'mark :3\n' 'committer Joe Foo 1194586405 +0000\n' 'data 6\n' 'second' '\n' '\n' 'commit refs/heads/master\n' 'mark :4\n' 'committer Joe Foo 1194586410 +0000\n' 'data 5\n' 'third' '\n' 'from :2\n' '\n', stream.getvalue()) def test_commit_merge(self): stream = StringIO() builder = tests.GitBranchBuilder(stream) builder.set_file(u'foo', 'contents\nfoo\n', False) r1 = builder.commit('Joe Foo ', u'first', timestamp=1194586400) r2 = builder.commit('Joe Foo ', u'second', timestamp=1194586405) r3 = builder.commit('Joe Foo ', u'third', timestamp=1194586410, base=r1) r4 = builder.commit('Joe Foo ', u'Merge', timestamp=1194586415, merge=[r2]) self.assertEqualDiff('blob\nmark :1\ndata 13\ncontents\nfoo\n\n' 'commit refs/heads/master\n' 'mark :2\n' 'committer Joe Foo 1194586400 +0000\n' 'data 5\n' 'first' '\n' 'M 100644 :1 foo\n' '\n' 'commit refs/heads/master\n' 'mark :3\n' 'committer Joe Foo 1194586405 +0000\n' 'data 6\n' 'second' '\n' '\n' 'commit refs/heads/master\n' 'mark :4\n' 'committer Joe Foo 1194586410 +0000\n' 'data 5\n' 'third' '\n' 'from :2\n' '\n' 'commit refs/heads/master\n' 'mark :5\n' 'committer Joe Foo 1194586415 +0000\n' 'data 5\n' 'Merge' '\n' 'merge :3\n' '\n', stream.getvalue()) def test_auto_timestamp(self): stream = StringIO() builder = tests.GitBranchBuilder(stream) builder.commit('Joe Foo ', u'message') self.assertContainsRe(stream.getvalue(), r'committer Joe Foo \d+ \+0000') def test_reset(self): stream = StringIO() builder = tests.GitBranchBuilder(stream) builder.reset() self.assertEqualDiff('reset refs/heads/master\n\n', stream.getvalue()) def test_reset_named_ref(self): stream = StringIO() builder = tests.GitBranchBuilder(stream) builder.reset('refs/heads/branch') self.assertEqualDiff('reset refs/heads/branch\n\n', stream.getvalue()) def test_reset_revision(self): stream = StringIO() builder = tests.GitBranchBuilder(stream) builder.reset(mark=123) self.assertEqualDiff( 'reset refs/heads/master\n' 'from :123\n' '\n', stream.getvalue()) class TestGitBranchBuilderReal(tests.TestCaseInTempDir): def test_create_real_branch(self): 
GitRepo.init(".") builder = tests.GitBranchBuilder() builder.set_file(u'foo', 'contents\nfoo\n', False) r1 = builder.commit('Joe Foo ', u'first', timestamp=1194586400) mapping = builder.finish() self.assertEqual({'1':'44411e8e9202177dd19b6599d7a7991059fa3cb4', '2': 'b0b62e674f67306fddcf72fa888c3b56df100d64', }, mapping) bzr-git-0.6.13+bzr1649/tests/test_cache.py0000644000000000000000000001327313165530605016252 0ustar 00000000000000# Copyright (C) 2009 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Tests for GitShaMap.""" from dulwich.objects import ( Blob, Commit, Tree, ) import os import stat from .... import osutils from ....inventory import ( InventoryFile, InventoryDirectory, ROOT_ID, ) from ....revision import ( Revision, ) from ....tests import ( TestCase, TestCaseInTempDir, UnavailableFeature, ) from ....transport import ( get_transport, ) from ..cache import ( DictBzrGitCache, IndexBzrGitCache, IndexGitCacheFormat, SqliteBzrGitCache, TdbBzrGitCache, ) class TestGitShaMap: def _get_test_commit(self): c = Commit() c.committer = "Jelmer " c.commit_time = 0 c.commit_timezone = 0 c.author = "Jelmer " c.author_time = 0 c.author_timezone = 0 c.message = "Teh foo bar" c.tree = "cc9462f7f8263ef5adfbeff2fb936bb36b504cba" return c def test_commit(self): self.map.start_write_group() updater = self.cache.get_updater(Revision("myrevid")) c = self._get_test_commit() updater.add_object(c, { "testament3-sha1": "cc9462f7f8263ef5adf8eff2fb936bb36b504cba"}, None) updater.finish() self.map.commit_write_group() self.assertEquals( [("commit", ("myrevid", "cc9462f7f8263ef5adfbeff2fb936bb36b504cba", {"testament3-sha1": "cc9462f7f8263ef5adf8eff2fb936bb36b504cba"}, ))], list(self.map.lookup_git_sha(c.id))) self.assertEquals(c.id, self.map.lookup_commit("myrevid")) def test_lookup_notfound(self): self.assertRaises(KeyError, list, self.map.lookup_git_sha("5686645d49063c73d35436192dfc9a160c672301")) def test_blob(self): self.map.start_write_group() updater = self.cache.get_updater(Revision("myrevid")) updater.add_object(self._get_test_commit(), { "testament3-sha1": "Test" }, None) b = Blob() b.data = "TEH BLOB" updater.add_object(b, ("myfileid", "myrevid"), None) updater.finish() self.map.commit_write_group() self.assertEquals( [("blob", ("myfileid", "myrevid"))], list(self.map.lookup_git_sha(b.id))) self.assertEquals(b.id, self.map.lookup_blob_id("myfileid", "myrevid")) def test_tree(self): self.map.start_write_group() updater = self.cache.get_updater(Revision("myrevid")) updater.add_object(self._get_test_commit(), { "testament3-sha1": "mytestamentsha" }, None) t = Tree() t.add("somename", stat.S_IFREG, Blob().id) updater.add_object(t, ("fileid", ), "") updater.finish() self.map.commit_write_group() self.assertEquals([("tree", ("fileid", "myrevid"))], list(self.map.lookup_git_sha(t.id))) # It's possible for a backend to not implement lookup_tree try: 
self.assertEquals(t.id, self.map.lookup_tree_id("fileid", "myrevid")) except NotImplementedError: pass def test_revids(self): self.map.start_write_group() updater = self.cache.get_updater(Revision("myrevid")) c = self._get_test_commit() updater.add_object(c, {"testament3-sha1": "mtestament"}, None) updater.finish() self.map.commit_write_group() self.assertEquals(["myrevid"], list(self.map.revids())) def test_missing_revisions(self): self.map.start_write_group() updater = self.cache.get_updater(Revision("myrevid")) c = self._get_test_commit() updater.add_object(c, {"testament3-sha1": "testament"}, None) updater.finish() self.map.commit_write_group() self.assertEquals(set(["lala", "bla"]), set(self.map.missing_revisions(["myrevid", "lala", "bla"]))) class DictGitShaMapTests(TestCase,TestGitShaMap): def setUp(self): TestCase.setUp(self) self.cache = DictBzrGitCache() self.map = self.cache.idmap class SqliteGitShaMapTests(TestCaseInTempDir,TestGitShaMap): def setUp(self): TestCaseInTempDir.setUp(self) self.cache = SqliteBzrGitCache(os.path.join(self.test_dir, 'foo.db')) self.map = self.cache.idmap class TdbGitShaMapTests(TestCaseInTempDir,TestGitShaMap): def setUp(self): TestCaseInTempDir.setUp(self) try: self.cache = TdbBzrGitCache( os.path.join(self.test_dir, 'foo.tdb').encode(osutils._fs_enc)) except ImportError: raise UnavailableFeature("Missing tdb") self.map = self.cache.idmap class IndexGitShaMapTests(TestCaseInTempDir,TestGitShaMap): def setUp(self): TestCaseInTempDir.setUp(self) transport = get_transport(self.test_dir) IndexGitCacheFormat().initialize(transport) self.cache = IndexBzrGitCache(transport) self.map = self.cache.idmap bzr-git-0.6.13+bzr1649/tests/test_dir.py0000644000000000000000000000466413165530605015771 0ustar 00000000000000# Copyright (C) 2007 Canonical Ltd # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Test the GitDir class""" from dulwich.repo import Repo as GitRepo import os from .... import ( bzrdir, errors, urlutils, ) from ....tests import TestSkipped from .. 
import ( dir, tests, workingtree, ) class TestGitDir(tests.TestCaseInTempDir): def test_get_head_branch_reference(self): GitRepo.init(".") gd = bzrdir.BzrDir.open('.') self.assertEquals( "%s,ref=refs%%2Fheads%%2Fmaster" % urlutils.local_path_to_url(os.path.abspath(".")), gd.get_branch_reference()) def test_open_existing(self): GitRepo.init(".") gd = bzrdir.BzrDir.open('.') self.assertIsInstance(gd, dir.LocalGitDir) def test_open_workingtree(self): GitRepo.init(".") gd = bzrdir.BzrDir.open('.') raise TestSkipped wt = gd.open_workingtree() self.assertIsInstance(wt, workingtree.GitWorkingTree) def test_open_workingtree_bare(self): GitRepo.init_bare(".") gd = bzrdir.BzrDir.open('.') self.assertRaises(errors.NoWorkingTree, gd.open_workingtree) class TestGitDirFormat(tests.TestCase): def setUp(self): super(TestGitDirFormat, self).setUp() self.format = dir.LocalGitControlDirFormat() def test_get_format_description(self): self.assertEquals("Local Git Repository", self.format.get_format_description()) def test_eq(self): format2 = dir.LocalGitControlDirFormat() self.assertEquals(self.format, format2) self.assertEquals(self.format, self.format) bzr_format = bzrdir.format_registry.make_bzrdir("default") self.assertNotEquals(self.format, bzr_format) bzr-git-0.6.13+bzr1649/tests/test_fetch.py0000644000000000000000000004562313165530605016304 0ustar 00000000000000# Copyright (C) 2009 Jelmer Vernooij # -*- coding: utf-8 -*- # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA from dulwich.objects import ( Blob, Tag, Tree, S_IFGITLINK, ) from dulwich.repo import ( Repo as GitRepo, ) import os import stat import time from .... import ( knit, osutils, versionedfile, ) from ....branch import ( Branch, ) from ....bzrdir import ( BzrDir, ) from ....inventory import ( Inventory, ) from ....repository import ( Repository, ) from ....tests import ( TestCaseWithTransport, ) from ..fetch import ( import_git_blob, import_git_tree, import_git_submodule, ) from ..mapping import ( BzrGitMappingv1, DEFAULT_FILE_MODE, ) from . 
import ( GitBranchBuilder, ) class RepositoryFetchTests(object): def make_git_repo(self, path): os.mkdir(path) return GitRepo.init(os.path.abspath(path)) def clone_git_repo(self, from_url, to_url, revision_id=None): oldrepos = self.open_git_repo(from_url) dir = BzrDir.create(to_url) newrepos = dir.create_repository() oldrepos.copy_content_into(newrepos, revision_id=revision_id) return newrepos def test_empty(self): self.make_git_repo("d") newrepos = self.clone_git_repo("d", "f") self.assertEquals([], newrepos.all_revision_ids()) def make_onerev_branch(self): self.make_git_repo("d") os.chdir("d") bb = GitBranchBuilder() bb.set_file("foobar", "foo\nbar\n", False) mark = bb.commit("Somebody ", "mymsg") gitsha = bb.finish()[mark] os.chdir("..") return "d", gitsha def test_single_rev(self): path, gitsha = self.make_onerev_branch() oldrepo = self.open_git_repo(path) newrepo = self.clone_git_repo(path, "f") revid = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha) self.assertEquals([revid], newrepo.all_revision_ids()) def test_single_rev_specific(self): path, gitsha = self.make_onerev_branch() oldrepo = self.open_git_repo(path) revid = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha) newrepo = self.clone_git_repo(path, "f", revision_id=revid) self.assertEquals([revid], newrepo.all_revision_ids()) def test_incremental(self): self.make_git_repo("d") os.chdir("d") bb = GitBranchBuilder() bb.set_file("foobar", "foo\nbar\n", False) mark1 = bb.commit("Somebody ", "mymsg") bb.set_file("foobar", "fooll\nbar\n", False) mark2 = bb.commit("Somebody ", "nextmsg") marks = bb.finish() gitsha1 = marks[mark1] gitsha2 = marks[mark2] os.chdir("..") oldrepo = self.open_git_repo("d") revid1 = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha1) newrepo = self.clone_git_repo("d", "f", revision_id=revid1) self.assertEquals([revid1], newrepo.all_revision_ids()) revid2 = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha2) newrepo.fetch(oldrepo, revision_id=revid2) self.assertEquals(set([revid1, revid2]), set(newrepo.all_revision_ids())) def test_dir_becomes_symlink(self): self.make_git_repo("d") os.chdir("d") bb = GitBranchBuilder() bb.set_file("mylink/somefile", "foo\nbar\n", False) mark1 = bb.commit("Somebody ", "mymsg1") bb.set_symlink("mylink", "target/") mark2 = bb.commit("Somebody ", "mymsg2") marks = bb.finish() gitsha1 = marks[mark1] gitsha2 = marks[mark2] os.chdir("..") oldrepo = self.open_git_repo("d") newrepo = self.clone_git_repo("d", "f") revid1 = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha1) revid2 = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha2) tree1 = newrepo.revision_tree(revid1) tree2 = newrepo.revision_tree(revid2) fileid = tree1.path2id("mylink") self.assertEquals(revid1, tree1.get_file_revision(fileid)) self.assertEquals("directory", tree1.kind(fileid)) self.assertEquals(None, tree1.get_symlink_target(fileid)) self.assertEquals(revid2, tree2.get_file_revision(fileid)) self.assertEquals("symlink", tree2.kind(fileid)) self.assertEquals("target/", tree2.get_symlink_target(fileid)) def test_symlink_becomes_dir(self): self.make_git_repo("d") os.chdir("d") bb = GitBranchBuilder() bb.set_symlink("mylink", "target/") mark1 = bb.commit("Somebody ", "mymsg1") bb.set_file("mylink/somefile", "foo\nbar\n", False) mark2 = bb.commit("Somebody ", "mymsg2") marks = bb.finish() gitsha1 = marks[mark1] gitsha2 = marks[mark2] os.chdir("..") oldrepo = self.open_git_repo("d") newrepo = self.clone_git_repo("d", "f") revid1 = 
oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha1) revid2 = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha2) tree1 = newrepo.revision_tree(revid1) tree2 = newrepo.revision_tree(revid2) fileid = tree1.path2id("mylink") self.assertEquals(revid1, tree1.get_file_revision(fileid)) self.assertEquals("symlink", tree1.kind(fileid)) self.assertEquals("target/", tree1.get_symlink_target(fileid)) self.assertEquals(revid2, tree2.get_file_revision(fileid)) self.assertEquals("directory", tree2.kind(fileid)) self.assertEquals(None, tree2.get_symlink_target(fileid)) def test_changing_symlink(self): self.make_git_repo("d") os.chdir("d") bb = GitBranchBuilder() bb.set_symlink("mylink", "target") mark1 = bb.commit("Somebody ", "mymsg1") bb.set_symlink("mylink", "target/") mark2 = bb.commit("Somebody ", "mymsg2") marks = bb.finish() gitsha1 = marks[mark1] gitsha2 = marks[mark2] os.chdir("..") oldrepo = self.open_git_repo("d") newrepo = self.clone_git_repo("d", "f") revid1 = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha1) revid2 = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha2) tree1 = newrepo.revision_tree(revid1) tree2 = newrepo.revision_tree(revid2) fileid = tree1.path2id("mylink") self.assertEquals(revid1, tree1.get_file_revision(fileid)) self.assertEquals("target", tree1.get_symlink_target(fileid)) self.assertEquals(revid2, tree2.get_file_revision(fileid)) self.assertEquals("target/", tree2.get_symlink_target(fileid)) def test_executable(self): self.make_git_repo("d") os.chdir("d") bb = GitBranchBuilder() bb.set_file("foobar", "foo\nbar\n", True) bb.set_file("notexec", "foo\nbar\n", False) mark = bb.commit("Somebody ", "mymsg") gitsha = bb.finish()[mark] os.chdir("..") oldrepo = self.open_git_repo("d") newrepo = self.clone_git_repo("d", "f") revid = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha) tree = newrepo.revision_tree(revid) self.assertTrue(tree.has_filename("foobar")) self.assertEquals(True, tree.is_executable(tree.path2id("foobar"))) self.assertTrue(tree.has_filename("notexec")) self.assertEquals(False, tree.is_executable(tree.path2id("notexec"))) def test_becomes_executable(self): self.make_git_repo("d") os.chdir("d") bb = GitBranchBuilder() bb.set_file("foobar", "foo\nbar\n", False) mark1 = bb.commit("Somebody ", "mymsg") bb.set_file("foobar", "foo\nbar\n", True) mark2 = bb.commit("Somebody ", "mymsg") gitsha2 = bb.finish()[mark2] os.chdir("..") oldrepo = self.open_git_repo("d") newrepo = self.clone_git_repo("d", "f") revid = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha2) tree = newrepo.revision_tree(revid) self.assertTrue(tree.has_filename("foobar")) fileid = tree.path2id("foobar") self.assertEquals(True, tree.is_executable(fileid)) self.assertEquals(revid, tree.get_file_revision(fileid)) def test_into_stacked_on(self): r = self.make_git_repo("d") os.chdir("d") bb = GitBranchBuilder() bb.set_file(u"foobar", "foo\n", False) mark1 = bb.commit("Somebody ", "mymsg1") gitsha1 = bb.finish()[mark1] os.chdir("..") stacked_on = self.clone_git_repo("d", "stacked-on") oldrepo = Repository.open("d") revid1 = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha1) self.assertEquals([revid1], stacked_on.all_revision_ids()) b = stacked_on.bzrdir.create_branch() b.generate_revision_history(revid1) self.assertEquals(b.last_revision(), revid1) tree = self.make_branch_and_tree("stacked") tree.branch.set_stacked_on_url(b.user_url) os.chdir("d") bb = GitBranchBuilder() bb.set_file(u"barbar", "bar\n", False) bb.set_file(u"foo/blie/bla", "bla\n", False) mark2 
= bb.commit("Somebody ", "mymsg2") gitsha2 = bb.finish()[mark2] revid2 = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha2) os.chdir("..") tree.branch.fetch(Branch.open("d")) tree.branch.repository.check() self.addCleanup(tree.lock_read().unlock) self.assertEquals( set([(revid2,)]), tree.branch.repository.revisions.without_fallbacks().keys()) self.assertEquals( set([revid1, revid2]), set(tree.branch.repository.all_revision_ids())) def test_non_ascii_characters(self): self.make_git_repo("d") os.chdir("d") bb = GitBranchBuilder() bb.set_file(u"foőbar", "foo\nbar\n", False) mark = bb.commit("Somebody ", "mymsg") gitsha = bb.finish()[mark] os.chdir("..") oldrepo = self.open_git_repo("d") newrepo = self.clone_git_repo("d", "f") revid = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha) tree = newrepo.revision_tree(revid) self.assertTrue(tree.has_filename(u"foőbar")) def test_tagged_tree(self): r = self.make_git_repo("d") os.chdir("d") bb = GitBranchBuilder() bb.set_file("foobar", "fooll\nbar\n", False) mark = bb.commit("Somebody ", "nextmsg") marks = bb.finish() gitsha = marks[mark] tag = Tag() tag.name = "sometag" tag.tag_time = int(time.time()) tag.tag_timezone = 0 tag.tagger = "Somebody " tag.message = "Created tag pointed at tree" tag.object = (Tree, r[gitsha].tree) r.object_store.add_object(tag) r["refs/tags/sometag"] = tag os.chdir("..") oldrepo = self.open_git_repo("d") revid = oldrepo.get_mapping().revision_id_foreign_to_bzr(gitsha) newrepo = self.clone_git_repo("d", "f") self.assertEquals(set([revid]), set(newrepo.all_revision_ids())) class LocalRepositoryFetchTests(RepositoryFetchTests, TestCaseWithTransport): def open_git_repo(self, path): return Repository.open(path) class DummyStoreUpdater(object): def add_object(self, obj, ie, path): pass def finish(self): pass class ImportObjects(TestCaseWithTransport): def setUp(self): super(ImportObjects, self).setUp() self._mapping = BzrGitMappingv1() factory = knit.make_file_factory(True, versionedfile.PrefixMapper()) self._texts = factory(self.get_transport('texts')) def test_import_blob_simple(self): blob = Blob.from_string("bar") base_inv = Inventory() objs = { "blobname": blob} ret = import_git_blob(self._texts, self._mapping, "bla", "bla", (None, "blobname"), base_inv, None, "somerevid", [], objs.__getitem__, (None, DEFAULT_FILE_MODE), DummyStoreUpdater(), self._mapping.generate_file_id) self.assertEquals(set([('bla', 'somerevid')]), self._texts.keys()) self.assertEquals(self._texts.get_record_stream([('bla', 'somerevid')], "unordered", True).next().get_bytes_as("fulltext"), "bar") self.assertEquals(1, len(ret)) self.assertEquals(None, ret[0][0]) self.assertEquals("bla", ret[0][1]) ie = ret[0][3] self.assertEquals(False, ie.executable) self.assertEquals("file", ie.kind) self.assertEquals("somerevid", ie.revision) self.assertEquals(osutils.sha_strings(["bar"]), ie.text_sha1) def test_import_tree_empty_root(self): base_inv = Inventory(root_id=None) tree = Tree() ret, child_modes = import_git_tree(self._texts, self._mapping, "", "", (None, tree.id), base_inv, None, "somerevid", [], {tree.id: tree}.__getitem__, (None, stat.S_IFDIR), DummyStoreUpdater(), self._mapping.generate_file_id) self.assertEquals(child_modes, {}) self.assertEquals(set([("TREE_ROOT", 'somerevid')]), self._texts.keys()) self.assertEquals(1, len(ret)) self.assertEquals(None, ret[0][0]) self.assertEquals("", ret[0][1]) ie = ret[0][3] self.assertEquals(False, ie.executable) self.assertEquals("directory", ie.kind) self.assertEquals({}, ie.children) 
self.assertEquals("somerevid", ie.revision) self.assertEquals(None, ie.text_sha1) def test_import_tree_empty(self): base_inv = Inventory() tree = Tree() ret, child_modes = import_git_tree(self._texts, self._mapping, "bla", "bla", (None, tree.id), base_inv, None, "somerevid", [], { tree.id: tree }.__getitem__, (None, stat.S_IFDIR), DummyStoreUpdater(), self._mapping.generate_file_id) self.assertEquals(child_modes, {}) self.assertEquals(set([("bla", 'somerevid')]), self._texts.keys()) self.assertEquals(1, len(ret)) self.assertEquals(None, ret[0][0]) self.assertEquals("bla", ret[0][1]) ie = ret[0][3] self.assertEquals("directory", ie.kind) self.assertEquals(False, ie.executable) self.assertEquals({}, ie.children) self.assertEquals("somerevid", ie.revision) self.assertEquals(None, ie.text_sha1) def test_import_tree_with_file(self): base_inv = Inventory() blob = Blob.from_string("bar1") tree = Tree() tree.add("foo", stat.S_IFREG | 0644, blob.id) objects = { blob.id: blob, tree.id: tree } ret, child_modes = import_git_tree(self._texts, self._mapping, "bla", "bla", (None, tree.id), base_inv, None, "somerevid", [], objects.__getitem__, (None, stat.S_IFDIR), DummyStoreUpdater(), self._mapping.generate_file_id) self.assertEquals(child_modes, {}) self.assertEquals(2, len(ret)) self.assertEquals(None, ret[0][0]) self.assertEquals("bla", ret[0][1]) self.assertEquals(None, ret[1][0]) self.assertEquals("bla/foo", ret[1][1]) ie = ret[0][3] self.assertEquals("directory", ie.kind) ie = ret[1][3] self.assertEquals("file", ie.kind) self.assertEquals("bla/foo", ie.file_id) self.assertEquals("somerevid", ie.revision) self.assertEquals(osutils.sha_strings(["bar1"]), ie.text_sha1) self.assertEquals(False, ie.executable) def test_import_tree_with_unusual_mode_file(self): base_inv = Inventory() blob = Blob.from_string("bar1") tree = Tree() tree.add("foo", stat.S_IFREG | 0664, blob.id) objects = { blob.id: blob, tree.id: tree } ret, child_modes = import_git_tree(self._texts, self._mapping, "bla", "bla", (None, tree.id), base_inv, None, "somerevid", [], objects.__getitem__, (None, stat.S_IFDIR), DummyStoreUpdater(), self._mapping.generate_file_id) self.assertEquals(child_modes, { "bla/foo": stat.S_IFREG | 0664 }) def test_import_tree_with_file_exe(self): base_inv = Inventory(root_id=None) blob = Blob.from_string("bar") tree = Tree() tree.add("foo", 0100755, blob.id) objects = { blob.id: blob, tree.id: tree } ret, child_modes = import_git_tree(self._texts, self._mapping, "", "", (None, tree.id), base_inv, None, "somerevid", [], objects.__getitem__, (None, stat.S_IFDIR), DummyStoreUpdater(), self._mapping.generate_file_id) self.assertEquals(child_modes, {}) self.assertEquals(2, len(ret)) self.assertEquals(None, ret[0][0]) self.assertEquals("", ret[0][1]) self.assertEquals(None, ret[1][0]) self.assertEquals("foo", ret[1][1]) ie = ret[0][3] self.assertEquals("directory", ie.kind) ie = ret[1][3] self.assertEquals("file", ie.kind) self.assertEquals(True, ie.executable) def test_directory_converted_to_submodule(self): base_inv = Inventory() base_inv.add_path("foo", "directory") base_inv.add_path("foo/bar", "file") othertree = Blob.from_string("someotherthing") blob = Blob.from_string("bar") tree = Tree() tree.add("bar", 0160000, blob.id) objects = { tree.id: tree } ret, child_modes = import_git_submodule(self._texts, self._mapping, "foo", "foo", (tree.id, othertree.id), base_inv, base_inv.root.file_id, "somerevid", [], objects.__getitem__, (stat.S_IFDIR | 0755, S_IFGITLINK), DummyStoreUpdater(), 
self._mapping.generate_file_id) self.assertEquals(child_modes, {}) self.assertEquals(2, len(ret)) self.assertEquals(ret[0], ("foo/bar", None, base_inv.path2id("foo/bar"), None)) self.assertEquals(ret[1][:3], ("foo", "foo", self._mapping.generate_file_id("foo"))) ie = ret[1][3] self.assertEquals(ie.kind, "tree-reference") bzr-git-0.6.13+bzr1649/tests/test_git_remote_helper.py0000644000000000000000000001121513165530605020676 0ustar 00000000000000#!/usr/bin/env python # vim: expandtab # Copyright (C) 2011 Jelmer Vernooij # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA from cStringIO import StringIO import os from dulwich.repo import Repo from ....tests import ( TestCaseWithTransport, TestSkipped, ) from ..object_store import get_object_store from ..git_remote_helper import ( RemoteHelper, open_local_dir, fastexporter, fetch, ) def map_to_git_sha1(dir, bzr_revid): object_store = get_object_store(dir.open_repository()) object_store.lock_read() try: return object_store._lookup_revision_sha1(bzr_revid) finally: object_store.unlock() class OpenLocalDirTests(TestCaseWithTransport): def test_from_env(self): self.make_branch_and_tree('bla', format='git') self.overrideEnv('GIT_DIR', os.path.join(self.test_dir, 'bla')) open_local_dir() def test_from_env_dir(self): self.make_branch_and_tree('bla', format='git') self.overrideEnv('GIT_DIR', os.path.join(self.test_dir, 'bla', '.git')) open_local_dir() def test_from_dir(self): self.make_branch_and_tree('.', format='git') open_local_dir() class FetchTests(TestCaseWithTransport): def setUp(self): super(FetchTests, self).setUp() self.local_dir = self.make_branch_and_tree('local', format='git').bzrdir self.remote_tree = self.make_branch_and_tree('remote') self.remote_dir = self.remote_tree.bzrdir self.shortname = 'bzr' def fetch(self, wants): outf = StringIO() fetch(outf, wants, self.shortname, self.remote_dir, self.local_dir) return outf.getvalue() def test_no_wants(self): r = self.fetch([]) self.assertEquals("\n", r) def test_simple(self): self.build_tree(['remote/foo']) self.remote_tree.add("foo") revid = self.remote_tree.commit("msg") git_sha1 = map_to_git_sha1(self.remote_dir, revid) out = self.fetch([(git_sha1, 'HEAD')]) self.assertEquals(out, "\n") r = Repo('local') self.assertTrue(git_sha1 in r.object_store) self.assertEquals({'HEAD': '0000000000000000000000000000000000000000'}, r.get_refs()) class RemoteHelperTests(TestCaseWithTransport): def setUp(self): super(RemoteHelperTests, self).setUp() self.local_dir = self.make_branch_and_tree('local', format='git').bzrdir self.remote_tree = self.make_branch_and_tree('remote') self.remote_dir = self.remote_tree.bzrdir self.shortname = 'bzr' self.helper = RemoteHelper(self.local_dir, self.shortname, self.remote_dir) def test_capabilities(self): f = StringIO() self.helper.cmd_capabilities(f, []) capabs = f.getvalue() base = "fetch\noption\npush\n" self.assertTrue(capabs in 
(base+"\n", base+"import\n\n"), capabs) def test_option(self): f = StringIO() self.helper.cmd_option(f, []) self.assertEquals("unsupported\n", f.getvalue()) def test_list_basic(self): f = StringIO() self.helper.cmd_list(f, []) self.assertEquals( '0000000000000000000000000000000000000000 HEAD\n\n', f.getvalue()) def test_import(self): if fastexporter is None: raise TestSkipped("bzr-fastimport not available") self.build_tree_contents([("remote/afile", "somecontent")]) self.remote_tree.add(["afile"]) self.remote_tree.commit("A commit message", timestamp=1330445983, timezone=0, committer='Somebody ') f = StringIO() self.helper.cmd_import(f, ["import", "refs/heads/master"]) self.assertEquals( 'commit refs/heads/master\n' 'mark :1\n' 'committer Somebody 1330445983 +0000\n' 'data 16\n' 'A commit message\n' 'M 644 inline afile\n' 'data 11\n' 'somecontent\n', f.getvalue()) bzr-git-0.6.13+bzr1649/tests/test_mapping.py0000644000000000000000000003434413165530605016644 0ustar 00000000000000# Copyright (C) 2007 Jelmer Vernooij # -*- encoding: utf-8 -*- # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA from ....inventory import ( InventoryDirectory, InventoryFile, ) from ....revision import ( Revision, ) from dulwich.objects import ( Blob, Commit, Tree, parse_timezone, ) from .. 
import tests from ..errors import UnknownCommitExtra from ..mapping import ( BzrGitMappingv1, directory_to_tree, escape_file_id, unescape_file_id, ) class TestRevidConversionV1(tests.TestCase): def test_simple_git_to_bzr_revision_id(self): self.assertEqual("git-v1:" "c6a4d8f1fa4ac650748e647c4b1b368f589a7356", BzrGitMappingv1().revision_id_foreign_to_bzr( "c6a4d8f1fa4ac650748e647c4b1b368f589a7356")) def test_simple_bzr_to_git_revision_id(self): self.assertEqual(("c6a4d8f1fa4ac650748e647c4b1b368f589a7356", BzrGitMappingv1()), BzrGitMappingv1().revision_id_bzr_to_foreign( "git-v1:" "c6a4d8f1fa4ac650748e647c4b1b368f589a7356")) def test_is_control_file(self): mapping = BzrGitMappingv1() if mapping.roundtripping: self.assertTrue(mapping.is_control_file(".bzrdummy")) self.assertTrue(mapping.is_control_file(".bzrfileids")) self.assertFalse(mapping.is_control_file(".bzrfoo")) def test_generate_file_id(self): mapping = BzrGitMappingv1() self.assertIsInstance(mapping.generate_file_id("la"), str) self.assertIsInstance(mapping.generate_file_id(u"é"), str) class FileidTests(tests.TestCase): def test_escape_space(self): self.assertEquals("bla_s", escape_file_id("bla ")) def test_escape_control_l(self): self.assertEquals("bla_c", escape_file_id("bla\x0c")) def test_unescape_control_l(self): self.assertEquals("bla\x0c", unescape_file_id("bla_c")) def test_escape_underscore(self): self.assertEquals("bla__", escape_file_id("bla_")) def test_escape_underscore_space(self): self.assertEquals("bla___s", escape_file_id("bla_ ")) def test_unescape_underscore(self): self.assertEquals("bla ", unescape_file_id("bla_s")) def test_unescape_underscore_space(self): self.assertEquals("bla _", unescape_file_id("bla_s__")) class TestImportCommit(tests.TestCase): def test_commit(self): c = Commit() c.tree = "cc9462f7f8263ef5adfbeff2fb936bb36b504cba" c.message = "Some message" c.committer = "Committer" c.commit_time = 4 c.author_time = 5 c.commit_timezone = 60 * 5 c.author_timezone = 60 * 3 c.author = "Author" mapping = BzrGitMappingv1() rev, roundtrip_revid, verifiers = mapping.import_commit(c, mapping.revision_id_foreign_to_bzr) self.assertEquals(None, roundtrip_revid) self.assertEquals({}, verifiers) self.assertEquals("Some message", rev.message) self.assertEquals("Committer", rev.committer) self.assertEquals("Author", rev.properties['author']) self.assertEquals(300, rev.timezone) self.assertEquals((), rev.parent_ids) self.assertEquals("5", rev.properties['author-timestamp']) self.assertEquals("180", rev.properties['author-timezone']) self.assertEquals("git-v1:" + c.id, rev.revision_id) def test_explicit_encoding(self): c = Commit() c.tree = "cc9462f7f8263ef5adfbeff2fb936bb36b504cba" c.message = "Some message" c.committer = "Committer" c.commit_time = 4 c.author_time = 5 c.commit_timezone = 60 * 5 c.author_timezone = 60 * 3 c.author = u"Authér".encode("iso8859-1") c.encoding = "iso8859-1" mapping = BzrGitMappingv1() rev, roundtrip_revid, verifiers = mapping.import_commit(c, mapping.revision_id_foreign_to_bzr) self.assertEquals(None, roundtrip_revid) self.assertEquals({}, verifiers) self.assertEquals(u"Authér", rev.properties['author']) self.assertEquals("iso8859-1", rev.properties["git-explicit-encoding"]) self.assertTrue("git-implicit-encoding" not in rev.properties) def test_implicit_encoding_fallback(self): c = Commit() c.tree = "cc9462f7f8263ef5adfbeff2fb936bb36b504cba" c.message = "Some message" c.committer = "Committer" c.commit_time = 4 c.author_time = 5 c.commit_timezone = 60 * 5 c.author_timezone = 60 * 3 c.author 
= u"Authér".encode("latin1") mapping = BzrGitMappingv1() rev, roundtrip_revid, verifiers = mapping.import_commit(c, mapping.revision_id_foreign_to_bzr) self.assertEquals(None, roundtrip_revid) self.assertEquals({}, verifiers) self.assertEquals(u"Authér", rev.properties['author']) self.assertEquals("latin1", rev.properties["git-implicit-encoding"]) self.assertTrue("git-explicit-encoding" not in rev.properties) def test_implicit_encoding_utf8(self): c = Commit() c.tree = "cc9462f7f8263ef5adfbeff2fb936bb36b504cba" c.message = "Some message" c.committer = "Committer" c.commit_time = 4 c.author_time = 5 c.commit_timezone = 60 * 5 c.author_timezone = 60 * 3 c.author = u"Authér".encode("utf-8") mapping = BzrGitMappingv1() rev, roundtrip_revid, verifiers = mapping.import_commit(c, mapping.revision_id_foreign_to_bzr) self.assertEquals(None, roundtrip_revid) self.assertEquals({}, verifiers) self.assertEquals(u"Authér", rev.properties['author']) self.assertTrue("git-explicit-encoding" not in rev.properties) self.assertTrue("git-implicit-encoding" not in rev.properties) def test_unknown_extra(self): c = Commit() c.tree = "cc9462f7f8263ef5adfbeff2fb936bb36b504cba" c.message = "Some message" c.committer = "Committer" c.commit_time = 4 c.author_time = 5 c.commit_timezone = 60 * 5 c.author_timezone = 60 * 3 c.author = "Author" c._extra.append(("iamextra", "foo")) mapping = BzrGitMappingv1() self.assertRaises(UnknownCommitExtra, mapping.import_commit, c, mapping.revision_id_foreign_to_bzr) class RoundtripRevisionsFromBazaar(tests.TestCase): def setUp(self): super(RoundtripRevisionsFromBazaar, self).setUp() self.mapping = BzrGitMappingv1() self._parent_map = {} self._lookup_parent = self._parent_map.__getitem__ def assertRoundtripRevision(self, orig_rev): commit = self.mapping.export_commit(orig_rev, "mysha", self._lookup_parent, True, "testamentsha") rev, roundtrip_revid, verifiers = self.mapping.import_commit( commit, self.mapping.revision_id_foreign_to_bzr) self.assertEquals(rev.revision_id, self.mapping.revision_id_foreign_to_bzr(commit.id)) if self.mapping.roundtripping: self.assertEquals({"testament3-sha1": "testamentsha"} , verifiers) self.assertEquals(orig_rev.revision_id, roundtrip_revid) self.assertEquals(orig_rev.properties, rev.properties) self.assertEquals(orig_rev.committer, rev.committer) self.assertEquals(orig_rev.timestamp, rev.timestamp) self.assertEquals(orig_rev.timezone, rev.timezone) self.assertEquals(orig_rev.message, rev.message) self.assertEquals(list(orig_rev.parent_ids), list(rev.parent_ids)) else: self.assertEquals({}, verifiers) def test_simple_commit(self): r = Revision(self.mapping.revision_id_foreign_to_bzr("edf99e6c56495c620f20d5dacff9859ff7119261")) r.message = "MyCommitMessage" r.parent_ids = [] r.committer = "Jelmer Vernooij " r.timestamp = 453543543 r.timezone = 0 r.properties = {} self.assertRoundtripRevision(r) def test_revision_id(self): r = Revision("myrevid") r.message = "MyCommitMessage" r.parent_ids = [] r.committer = "Jelmer Vernooij " r.timestamp = 453543543 r.timezone = 0 r.properties = {} self.assertRoundtripRevision(r) def test_ghost_parent(self): r = Revision("myrevid") r.message = "MyCommitMessage" r.parent_ids = ["iamaghost"] r.committer = "Jelmer Vernooij " r.timestamp = 453543543 r.timezone = 0 r.properties = {} self.assertRoundtripRevision(r) def test_custom_property(self): r = Revision("myrevid") r.message = "MyCommitMessage" r.parent_ids = [] r.properties = {"fool": "bar"} r.committer = "Jelmer Vernooij " r.timestamp = 453543543 r.timezone = 0 
self.assertRoundtripRevision(r) class RoundtripRevisionsFromGit(tests.TestCase): def setUp(self): super(RoundtripRevisionsFromGit, self).setUp() self.mapping = BzrGitMappingv1() def assertRoundtripTree(self, tree): raise NotImplementedError(self.assertRoundtripTree) def assertRoundtripBlob(self, blob): raise NotImplementedError(self.assertRoundtripBlob) def assertRoundtripCommit(self, commit1): rev, roundtrip_revid, verifiers = self.mapping.import_commit( commit1, self.mapping.revision_id_foreign_to_bzr) commit2 = self.mapping.export_commit(rev, "12341212121212", None, True, None) self.assertEquals(commit1.committer, commit2.committer) self.assertEquals(commit1.commit_time, commit2.commit_time) self.assertEquals(commit1.commit_timezone, commit2.commit_timezone) self.assertEquals(commit1.author, commit2.author) self.assertEquals(commit1.author_time, commit2.author_time) self.assertEquals(commit1.author_timezone, commit2.author_timezone) self.assertEquals(commit1.message, commit2.message) self.assertEquals(commit1.encoding, commit2.encoding) def test_commit(self): c = Commit() c.tree = "cc9462f7f8263ef5adfbeff2fb936bb36b504cba" c.message = "Some message" c.committer = "Committer " c.commit_time = 4 c.commit_timezone = -60 * 3 c.author_time = 5 c.author_timezone = 60 * 2 c.author = "Author " self.assertRoundtripCommit(c) def test_commit_double_negative_timezone(self): c = Commit() c.tree = "cc9462f7f8263ef5adfbeff2fb936bb36b504cba" c.message = "Some message" c.committer = "Committer " c.commit_time = 4 (c.commit_timezone, c._commit_timezone_neg_utc) = parse_timezone("--700") c.author_time = 5 c.author_timezone = 60 * 2 c.author = "Author " self.assertRoundtripCommit(c) def test_commit_zero_utc_timezone(self): c = Commit() c.tree = "cc9462f7f8263ef5adfbeff2fb936bb36b504cba" c.message = "Some message" c.committer = "Committer " c.commit_time = 4 c.commit_timezone = 0 c._commit_timezone_neg_utc = True c.author_time = 5 c.author_timezone = 60 * 2 c.author = "Author " self.assertRoundtripCommit(c) def test_commit_encoding(self): c = Commit() c.tree = "cc9462f7f8263ef5adfbeff2fb936bb36b504cba" c.message = "Some message" c.committer = "Committer " c.encoding = 'iso8859-1' c.commit_time = 4 c.commit_timezone = -60 * 3 c.author_time = 5 c.author_timezone = 60 * 2 c.author = "Author " self.assertRoundtripCommit(c) def test_commit_extra(self): c = Commit() c.tree = "cc9462f7f8263ef5adfbeff2fb936bb36b504cba" c.message = "Some message" c.committer = "Committer " c.commit_time = 4 c.commit_timezone = -60 * 3 c.author_time = 5 c.author_timezone = 60 * 2 c.author = "Author " c._extra = [("HG:rename-source", "hg")] self.assertRoundtripCommit(c) class DirectoryToTreeTests(tests.TestCase): def test_empty(self): t = directory_to_tree({}, None, {}, None, allow_empty=False) self.assertEquals(None, t) def test_empty_dir(self): child_ie = InventoryDirectory('bar', 'bar', 'bar') children = {'bar': child_ie} t = directory_to_tree(children, lambda x: None, {}, None, allow_empty=False) self.assertEquals(None, t) def test_empty_dir_dummy_files(self): child_ie = InventoryDirectory('bar', 'bar', 'bar') children = {'bar':child_ie} t = directory_to_tree(children, lambda x: None, {}, ".mydummy", allow_empty=False) self.assertTrue(".mydummy" in t) def test_empty_root(self): child_ie = InventoryDirectory('bar', 'bar', 'bar') children = {'bar': child_ie} t = directory_to_tree(children, lambda x: None, {}, None, allow_empty=True) self.assertEquals(Tree(), t) def test_with_file(self): child_ie = InventoryFile('bar', 'bar', 'bar') 
children = {"bar": child_ie} b = Blob.from_string("bla") t1 = directory_to_tree(children, lambda x: b.id, {}, None, allow_empty=False) t2 = Tree() t2.add("bar", 0100644, b.id) self.assertEquals(t1, t2) bzr-git-0.6.13+bzr1649/tests/test_object_store.py0000644000000000000000000001513313165530605017666 0ustar 00000000000000# Copyright (C) 2010 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Tests for bzr-git's object store.""" from dulwich.objects import ( Blob, ) from ....branchbuilder import ( BranchBuilder, ) from ....errors import ( NoSuchRevision, ) from ....graph import ( DictParentsProvider, Graph, ) from ....tests import ( TestCase, TestCaseWithTransport, ) from ..cache import ( DictGitShaMap, ) from ..object_store import ( BazaarObjectStore, LRUTreeCache, _check_expected_sha, _find_missing_bzr_revids, _tree_to_objects, ) class ExpectedShaTests(TestCase): def setUp(self): super(ExpectedShaTests, self).setUp() self.obj = Blob() self.obj.data = "foo" def test_none(self): _check_expected_sha(None, self.obj) def test_hex(self): _check_expected_sha(self.obj.sha().hexdigest(), self.obj) self.assertRaises(AssertionError, _check_expected_sha, "0" * 40, self.obj) def test_binary(self): _check_expected_sha(self.obj.sha().digest(), self.obj) self.assertRaises(AssertionError, _check_expected_sha, "x" * 20, self.obj) class FindMissingBzrRevidsTests(TestCase): def _find_missing(self, ancestry, want, have): return _find_missing_bzr_revids( Graph(DictParentsProvider(ancestry)), set(want), set(have)) def test_simple(self): self.assertEquals(set(), self._find_missing({}, [], [])) def test_up_to_date(self): self.assertEquals(set(), self._find_missing({"a": ["b"]}, ["a"], ["a"])) def test_one_missing(self): self.assertEquals(set(["a"]), self._find_missing({"a": ["b"]}, ["a"], ["b"])) def test_two_missing(self): self.assertEquals(set(["a", "b"]), self._find_missing({"a": ["b"], "b": ["c"]}, ["a"], ["c"])) def test_two_missing_history(self): self.assertEquals(set(["a", "b"]), self._find_missing({"a": ["b"], "b": ["c"], "c": ["d"]}, ["a"], ["c"])) class LRUTreeCacheTests(TestCaseWithTransport): def setUp(self): super(LRUTreeCacheTests, self).setUp() self.branch = self.make_branch(".") self.branch.lock_write() self.addCleanup(self.branch.unlock) self.cache = LRUTreeCache(self.branch.repository) def test_get_not_present(self): self.assertRaises(NoSuchRevision, self.cache.revision_tree, "unknown") def test_revision_trees(self): self.assertRaises(NoSuchRevision, self.cache.revision_trees, ["unknown", "la"]) def test_iter_revision_trees(self): self.assertRaises(NoSuchRevision, self.cache.iter_revision_trees, ["unknown", "la"]) def test_get(self): bb = BranchBuilder(branch=self.branch) bb.start_series() bb.build_snapshot('BASE-id', None, [('add', ('', None, 'directory', None)), ('add', ('foo', 'foo-id', 'file', 'a\nb\nc\nd\ne\n')), ]) bb.finish_series() 
tree = self.cache.revision_tree("BASE-id") self.assertEquals("BASE-id", tree.get_revision_id()) class BazaarObjectStoreTests(TestCaseWithTransport): def setUp(self): super(BazaarObjectStoreTests, self).setUp() self.branch = self.make_branch(".") self.branch.lock_write() self.addCleanup(self.branch.unlock) self.store = BazaarObjectStore(self.branch.repository) def test_get_blob(self): b = Blob() b.data = 'a\nb\nc\nd\ne\n' self.store.lock_read() self.addCleanup(self.store.unlock) self.assertRaises(KeyError, self.store.__getitem__, b.id) bb = BranchBuilder(branch=self.branch) bb.start_series() bb.build_snapshot('BASE-id', None, [('add', ('', None, 'directory', None)), ('add', ('foo', 'foo-id', 'file', 'a\nb\nc\nd\ne\n')), ]) bb.finish_series() # read locks cache self.assertRaises(KeyError, self.store.__getitem__, b.id) self.store.unlock() self.store.lock_read() self.assertEquals(b, self.store[b.id]) def test_get_raw(self): b = Blob() b.data = 'a\nb\nc\nd\ne\n' self.store.lock_read() self.addCleanup(self.store.unlock) self.assertRaises(KeyError, self.store.get_raw, b.id) bb = BranchBuilder(branch=self.branch) bb.start_series() bb.build_snapshot('BASE-id', None, [('add', ('', None, 'directory', None)), ('add', ('foo', 'foo-id', 'file', 'a\nb\nc\nd\ne\n')), ]) bb.finish_series() # read locks cache self.assertRaises(KeyError, self.store.get_raw, b.id) self.store.unlock() self.store.lock_read() self.assertEquals(b.as_raw_string(), self.store.get_raw(b.id)[1]) def test_contains(self): b = Blob() b.data = 'a\nb\nc\nd\ne\n' self.store.lock_read() self.addCleanup(self.store.unlock) self.assertFalse(b.id in self.store) bb = BranchBuilder(branch=self.branch) bb.start_series() bb.build_snapshot('BASE-id', None, [('add', ('', None, 'directory', None)), ('add', ('foo', 'foo-id', 'file', 'a\nb\nc\nd\ne\n')), ]) bb.finish_series() # read locks cache self.assertFalse(b.id in self.store) self.store.unlock() self.store.lock_read() self.assertTrue(b.id in self.store) class TreeToObjectsTests(TestCaseWithTransport): def setUp(self): super(TreeToObjectsTests, self).setUp() self.idmap = DictGitShaMap() def test_no_changes(self): tree = self.make_branch_and_tree('.') self.addCleanup(tree.lock_read().unlock) entries = list(_tree_to_objects(tree, [tree], self.idmap, {})) self.assertEquals([], entries) bzr-git-0.6.13+bzr1649/tests/test_pristine_tar.py0000644000000000000000000000675613165530605017722 0ustar 00000000000000# Copyright (C) 2012 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. 
# # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Tests for pristine tar extraction code.""" from base64 import standard_b64encode from ..pristine_tar import ( get_pristine_tar_tree, revision_pristine_tar_data, read_git_pristine_tar_data, store_git_pristine_tar_data, ) from ....revision import Revision from ....tests import TestCase from dulwich.objects import ( Blob, Tree, ) from dulwich.repo import ( MemoryRepo as GitMemoryRepo, ) import stat class RevisionPristineTarDataTests(TestCase): def test_pristine_tar_delta_unknown(self): rev = Revision("myrevid") self.assertRaises(KeyError, revision_pristine_tar_data, rev) def test_pristine_tar_delta_gz(self): rev = Revision("myrevid") rev.properties["deb-pristine-delta"] = standard_b64encode("bla") self.assertEquals(("bla", "gz"), revision_pristine_tar_data(rev)) class ReadPristineTarData(TestCase): def test_read_pristine_tar_data_no_branch(self): r = GitMemoryRepo() self.assertRaises(KeyError, read_git_pristine_tar_data, r, "foo") def test_read_pristine_tar_data_no_file(self): r = GitMemoryRepo() t = Tree() b = Blob.from_string("README") r.object_store.add_object(b) t.add("README", stat.S_IFREG | 0644, b.id) r.object_store.add_object(t) r.do_commit("Add README", tree=t.id, ref='refs/heads/pristine-tar') self.assertRaises(KeyError, read_git_pristine_tar_data, r, "foo") def test_read_pristine_tar_data(self): r = GitMemoryRepo() delta = Blob.from_string("some yummy data") r.object_store.add_object(delta) idfile = Blob.from_string("someid") r.object_store.add_object(idfile) t = Tree() t.add("foo.delta", stat.S_IFREG | 0644, delta.id) t.add("foo.id", stat.S_IFREG | 0644, idfile.id) r.object_store.add_object(t) r.do_commit("pristine tar delta for foo", tree=t.id, ref='refs/heads/pristine-tar') self.assertEquals( ("some yummy data", "someid"), read_git_pristine_tar_data(r, 'foo')) class StoreGitPristineTarData(TestCase): def test_store_new(self): r = GitMemoryRepo() cid = store_git_pristine_tar_data(r, "foo", "mydelta", "myid") tree = get_pristine_tar_tree(r) self.assertEquals( (stat.S_IFREG | 0644, "7b02de8ac4162e64f402c43487d8a40a505482e1"), tree["README"]) self.assertEquals(r[cid].tree, tree.id) self.assertEquals(r[tree["foo.delta"][1]].data, "mydelta") self.assertEquals(r[tree["foo.id"][1]].data, "myid") self.assertEquals(("mydelta", "myid"), read_git_pristine_tar_data(r, "foo")) bzr-git-0.6.13+bzr1649/tests/test_push.py0000644000000000000000000001045713165530605016167 0ustar 00000000000000# Copyright (C) 2010 Jelmer Vernooij # -*- coding: utf-8 -*- # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. 
# # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Tests for pushing revisions from Bazaar into Git.""" from ....bzrdir import ( format_registry, ) from ....repository import ( InterRepository, ) from ....tests import ( TestCaseWithTransport, ) from ..mapping import ( BzrGitMappingExperimental, BzrGitMappingv1, ) from ..errors import NoPushSupport from ..push import ( InterToGitRepository, ) class InterToGitRepositoryTests(TestCaseWithTransport): def setUp(self): super(InterToGitRepositoryTests, self).setUp() self.git_repo = self.make_repository("git", format=format_registry.make_bzrdir("git")) self.bzr_repo = self.make_repository("bzr", shared=True) def _get_interrepo(self, mapping=None): self.bzr_repo.lock_read() self.addCleanup(self.bzr_repo.unlock) interrepo = InterRepository.get(self.bzr_repo, self.git_repo) if mapping is not None: interrepo.mapping = mapping return interrepo def test_instance(self): self.assertIsInstance(self._get_interrepo(), InterToGitRepository) def test_pointless_fetch_refs_old_mapping(self): interrepo = self._get_interrepo(mapping=BzrGitMappingv1()) self.assertRaises(NoPushSupport, interrepo.fetch_refs, lambda x: {}, lossy=False) def test_pointless_fetch_refs(self): interrepo = self._get_interrepo(mapping=BzrGitMappingExperimental()) revidmap, old_refs, new_refs = interrepo.fetch_refs(lambda x: {}, lossy=False) self.assertEquals(old_refs, {}) self.assertEquals(new_refs, {}) def test_pointless_lossy_fetch_refs(self): revidmap, old_refs, new_refs = self._get_interrepo().fetch_refs(lambda x: {}, lossy=True) self.assertEquals(old_refs, {}) self.assertEquals(new_refs, {}) self.assertEquals(revidmap, {}) def test_pointless_missing_revisions(self): interrepo = self._get_interrepo() interrepo.source_store.lock_read() self.addCleanup(interrepo.source_store.unlock) self.assertEquals([], list(interrepo.missing_revisions([]))) def test_missing_revisions_unknown_stop_rev(self): interrepo = self._get_interrepo() interrepo.source_store.lock_read() self.addCleanup(interrepo.source_store.unlock) self.assertEquals([], list(interrepo.missing_revisions([(None, "unknown")]))) def test_odd_rename(self): # Add initial revision to bzr branch. branch = self.bzr_repo.bzrdir.create_branch() tree = branch.bzrdir.create_workingtree() self.build_tree(["bzr/bar/", "bzr/bar/foobar"]) tree.add(["bar", "bar/foobar"]) tree.commit("initial") # Add new directory and perform move in bzr branch. self.build_tree(["bzr/baz/"]) tree.add(["baz"]) tree.rename_one("bar", "baz/IrcDotNet") last_revid = tree.commit("rename") # Push bzr branch to git branch. 
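        # Callback handed to fetch_refs(): it maps each ref to be updated to a
        # (git sha, bzr revid) pair; the sha is left as None here so it gets
        # resolved during the lossy push.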
def decide(x): return { "refs/heads/master": (None, last_revid) } interrepo = self._get_interrepo() revidmap, old_refs, new_refs = interrepo.fetch_refs(decide, lossy=True) gitid = revidmap[last_revid][0] store = self.git_repo._git.object_store commit = store[gitid] tree = store[commit.tree] tree.check() self.assertTrue(tree["baz"][1] in store) baz = store[tree["baz"][1]] baz.check() ircdotnet = store[baz["IrcDotNet"][1]] ircdotnet.check() foobar = store[ircdotnet["foobar"][1]] foobar.check() bzr-git-0.6.13+bzr1649/tests/test_refs.py0000644000000000000000000000247513165530605016150 0ustar 00000000000000# Copyright (C) 2010 Jelmer Vernooij # vim: encoding=utf-8 # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Tests for ref handling.""" from .... import tests from ...git import refs class BranchNameRefConversionTests(tests.TestCase): def test_head(self): self.assertEquals("", refs.ref_to_branch_name("HEAD")) self.assertEquals("HEAD", refs.branch_name_to_ref("")) def test_tag(self): self.assertRaises(ValueError, refs.ref_to_branch_name, "refs/tags/FOO") def test_branch(self): self.assertEquals("frost", refs.ref_to_branch_name("refs/heads/frost")) self.assertEquals("refs/heads/frost", refs.branch_name_to_ref("frost")) bzr-git-0.6.13+bzr1649/tests/test_remote.py0000644000000000000000000000400013165530605016466 0ustar 00000000000000# Copyright (C) 2010 Canonical Ltd # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. 
# # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Test the smart client.""" from ....errors import ( BzrError, NotBranchError, ) from ....tests import TestCase from ..remote import ( split_git_url, parse_git_error, ) class SplitUrlTests(TestCase): def test_simple(self): self.assertEquals(("foo", None, None, "/bar"), split_git_url("git://foo/bar")) def test_port(self): self.assertEquals(("foo", 343, None, "/bar"), split_git_url("git://foo:343/bar")) def test_username(self): self.assertEquals(("foo", None, "la", "/bar"), split_git_url("git://la@foo/bar")) def test_nopath(self): self.assertEquals(("foo", None, None, "/"), split_git_url("git://foo/")) def test_slashpath(self): self.assertEquals(("foo", None, None, "//bar"), split_git_url("git://foo//bar")) def test_homedir(self): self.assertEquals(("foo", None, None, "~bar"), split_git_url("git://foo/~bar")) class ParseGitErrorTests(TestCase): def test_unknown(self): e = parse_git_error("url", "foo") self.assertIsInstance(e, BzrError) def test_notbrancherror(self): e = parse_git_error("url", "\n Could not find Repository foo/bar") self.assertIsInstance(e, NotBranchError) bzr-git-0.6.13+bzr1649/tests/test_repository.py0000644000000000000000000002105013165530605017416 0ustar 00000000000000# Copyright (C) 2007 Canonical Ltd # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Tests for interfacing with a Git Repository""" import dulwich from dulwich.repo import ( Repo as GitRepo, ) import os from .... import ( errors, revision, ) from ....repository import ( InterRepository, Repository, ) from .. 
import ( dir, repository, tests, ) from ..mapping import ( default_mapping, ) from ..object_store import ( BazaarObjectStore, ) from ..push import ( MissingObjectsIterator, ) class TestGitRepositoryFeatures(tests.TestCaseInTempDir): """Feature tests for GitRepository.""" def _do_commit(self): builder = tests.GitBranchBuilder() builder.set_file('a', 'text for a\n', False) commit_handle = builder.commit('Joe Foo ', u'message') mapping = builder.finish() return mapping[commit_handle] def test_open_existing(self): GitRepo.init(self.test_dir) repo = Repository.open('.') self.assertIsInstance(repo, repository.GitRepository) def test_has_git_repo(self): GitRepo.init(self.test_dir) repo = Repository.open('.') self.assertIsInstance(repo._git, dulwich.repo.BaseRepo) def test_has_revision(self): GitRepo.init(self.test_dir) commit_id = self._do_commit() repo = Repository.open('.') self.assertFalse(repo.has_revision('foobar')) revid = default_mapping.revision_id_foreign_to_bzr(commit_id) self.assertTrue(repo.has_revision(revid)) def test_has_revisions(self): GitRepo.init(self.test_dir) commit_id = self._do_commit() repo = Repository.open('.') self.assertEquals(set(), repo.has_revisions(['foobar'])) revid = default_mapping.revision_id_foreign_to_bzr(commit_id) self.assertEquals(set([revid]), repo.has_revisions(['foobar', revid])) def test_get_revision(self): # GitRepository.get_revision gives a Revision object. # Create a git repository with a revision. GitRepo.init(self.test_dir) commit_id = self._do_commit() # Get the corresponding Revision object. revid = default_mapping.revision_id_foreign_to_bzr(commit_id) repo = Repository.open('.') rev = repo.get_revision(revid) self.assertIsInstance(rev, revision.Revision) def test_get_revision_unknown(self): GitRepo.init(self.test_dir) repo = Repository.open('.') self.assertRaises(errors.NoSuchRevision, repo.get_revision, "bla") def simple_commit(self): # Create a git repository with some interesting files in a revision. 
GitRepo.init(self.test_dir) builder = tests.GitBranchBuilder() builder.set_file('data', 'text\n', False) builder.set_file('executable', 'content', True) builder.set_link('link', 'broken') builder.set_file('subdir/subfile', 'subdir text\n', False) commit_handle = builder.commit('Joe Foo ', u'message', timestamp=1205433193) mapping = builder.finish() return mapping[commit_handle] def test_pack(self): commit_id = self.simple_commit() repo = Repository.open('.') repo.pack() def test_revision_tree(self): commit_id = self.simple_commit() revid = default_mapping.revision_id_foreign_to_bzr(commit_id) repo = Repository.open('.') tree = repo.revision_tree(revid) self.assertEquals(tree.get_revision_id(), revid) self.assertEquals("text\n", tree.get_file_text(tree.path2id("data"))) class TestGitRepository(tests.TestCaseWithTransport): def _do_commit(self): builder = tests.GitBranchBuilder() builder.set_file('a', 'text for a\n', False) commit_handle = builder.commit('Joe Foo ', u'message') mapping = builder.finish() return mapping[commit_handle] def setUp(self): tests.TestCaseWithTransport.setUp(self) dulwich.repo.Repo.create(self.test_dir) self.git_repo = Repository.open(self.test_dir) def test_supports_rich_root(self): repo = self.git_repo self.assertEqual(repo.supports_rich_root(), True) def test_get_signature_text(self): self.assertRaises(errors.NoSuchRevision, self.git_repo.get_signature_text, revision.NULL_REVISION) def test_has_signature_for_revision_id(self): self.assertEquals(False, self.git_repo.has_signature_for_revision_id(revision.NULL_REVISION)) def test_all_revision_ids_none(self): self.assertEquals(set([]), self.git_repo.all_revision_ids()) def test_get_known_graph_ancestry(self): cid = self._do_commit() revid = default_mapping.revision_id_foreign_to_bzr(cid) g = self.git_repo.get_known_graph_ancestry([revid]) self.assertEquals(frozenset([revid]), g.heads([revid])) self.assertEqual([(revid, 0, (1,), True)], [(n.key, n.merge_depth, n.revno, n.end_of_merge) for n in g.merge_sort(revid)]) def test_all_revision_ids(self): commit_id = self._do_commit() self.assertEquals( set([default_mapping.revision_id_foreign_to_bzr(commit_id)]), self.git_repo.all_revision_ids()) def assertIsNullInventory(self, inv): self.assertEqual(inv.root, None) self.assertEqual(inv.revision_id, revision.NULL_REVISION) self.assertEqual(list(inv.iter_entries()), []) def test_revision_tree_none(self): # GitRepository.revision_tree(None) returns the null tree. 
repo = self.git_repo tree = repo.revision_tree(revision.NULL_REVISION) self.assertEqual(tree.get_revision_id(), revision.NULL_REVISION) def test_get_parent_map_null(self): self.assertEquals({revision.NULL_REVISION: ()}, self.git_repo.get_parent_map([revision.NULL_REVISION])) class GitRepositoryFormat(tests.TestCase): def setUp(self): super(GitRepositoryFormat, self).setUp() self.format = repository.GitRepositoryFormat() def test_get_format_description(self): self.assertEquals("Git Repository", self.format.get_format_description()) class RevisionGistImportTests(tests.TestCaseWithTransport): def setUp(self): tests.TestCaseWithTransport.setUp(self) self.git_path = os.path.join(self.test_dir, "git") os.mkdir(self.git_path) dulwich.repo.Repo.create(self.git_path) self.git_repo = Repository.open(self.git_path) self.bzr_tree = self.make_branch_and_tree("bzr") def get_inter(self): return InterRepository.get(self.bzr_tree.branch.repository, self.git_repo) def object_iter(self): store = BazaarObjectStore(self.bzr_tree.branch.repository, default_mapping) store_iterator = MissingObjectsIterator(store, self.bzr_tree.branch.repository) return store, store_iterator def import_rev(self, revid, parent_lookup=None): store, store_iter = self.object_iter() store._cache.idmap.start_write_group() try: return store_iter.import_revision(revid, lossy=True) except: store._cache.idmap.abort_write_group() raise else: store._cache.idmap.commit_write_group() def test_pointless(self): revid = self.bzr_tree.commit("pointless", timestamp=1205433193, timezone=0, committer="Jelmer Vernooij ") self.assertEquals("2caa8094a5b794961cd9bf582e3e2bb090db0b14", self.import_rev(revid)) self.assertEquals("2caa8094a5b794961cd9bf582e3e2bb090db0b14", self.import_rev(revid)) class ForeignTestsRepositoryFactory(object): def make_repository(self, transport): return dir.LocalGitControlDirFormat().initialize_on_transport(transport).open_repository() bzr-git-0.6.13+bzr1649/tests/test_revspec.py0000644000000000000000000000207213165530605016651 0ustar 00000000000000# Copyright (C) 2010 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Test the git revision specifiers.""" from ....tests import TestCase from ..revspec import ( valid_git_sha1, ) class Sha1ValidTests(TestCase): def test_invalid(self): self.assertFalse(valid_git_sha1("git-v1:abcde")) def test_valid(self): self.assertTrue(valid_git_sha1("aabbccddee")) bzr-git-0.6.13+bzr1649/tests/test_roundtrip.py0000644000000000000000000000661313165530605017235 0ustar 00000000000000# Copyright (C) 2010 Jelmer Vernooij # -*- encoding: utf-8 -*- # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. 
# # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Tests for roundtripping text parsing.""" from ....tests import TestCase from ..roundtrip import ( CommitSupplement, deserialize_fileid_map, extract_bzr_metadata, generate_roundtripping_metadata, inject_bzr_metadata, parse_roundtripping_metadata, serialize_fileid_map, ) class RoundtripTests(TestCase): def test_revid(self): md = parse_roundtripping_metadata("revision-id: foo\n") self.assertEquals("foo", md.revision_id) def test_parent_ids(self): md = parse_roundtripping_metadata("parent-ids: foo bar\n") self.assertEquals(("foo", "bar"), md.explicit_parent_ids) def test_properties(self): md = parse_roundtripping_metadata("property-foop: blar\n") self.assertEquals({"foop": "blar"}, md.properties) class FormatTests(TestCase): def test_revid(self): metadata = CommitSupplement() metadata.revision_id = "bla" self.assertEquals("revision-id: bla\n", generate_roundtripping_metadata(metadata, "utf-8")) def test_parent_ids(self): metadata = CommitSupplement() metadata.explicit_parent_ids = ("foo", "bar") self.assertEquals("parent-ids: foo bar\n", generate_roundtripping_metadata(metadata, "utf-8")) def test_properties(self): metadata = CommitSupplement() metadata.properties = {"foo": "bar"} self.assertEquals("property-foo: bar\n", generate_roundtripping_metadata(metadata, "utf-8")) def test_empty(self): metadata = CommitSupplement() self.assertEquals("", generate_roundtripping_metadata(metadata, "utf-8")) class ExtractMetadataTests(TestCase): def test_roundtrip(self): (msg, metadata) = extract_bzr_metadata("""Foo --BZR-- revision-id: foo """) self.assertEquals("Foo", msg) self.assertEquals("foo", metadata.revision_id) class GenerateMetadataTests(TestCase): def test_roundtrip(self): metadata = CommitSupplement() metadata.revision_id = "myrevid" msg = inject_bzr_metadata("Foo", metadata, "utf-8") self.assertEquals("""Foo --BZR-- revision-id: myrevid """, msg) def test_no_metadata(self): metadata = CommitSupplement() msg = inject_bzr_metadata("Foo", metadata, "utf-8") self.assertEquals("Foo", msg) class FileIdRoundTripTests(TestCase): def test_deserialize(self): self.assertEquals({"bar/bla": "fid"}, deserialize_fileid_map("bar/bla\0fid\n")) def test_serialize(self): self.assertEquals(["bar/bla\0fid\n"], serialize_fileid_map({"bar/bla": "fid"})) bzr-git-0.6.13+bzr1649/tests/test_server.py0000644000000000000000000000531113165530605016507 0ustar 00000000000000# Copyright (C) 2011 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. 
# # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Test for git server.""" from dulwich.client import TCPGitClient from dulwich.repo import Repo import threading from ....transport import transport_server_registry from ....tests import ( TestCase, TestCaseWithTransport, ) from ..server import ( BzrBackend, BzrTCPGitServer, ) class TestPresent(TestCase): def test_present(self): # Just test that the server is registered. transport_server_registry.get('git') class GitServerTestCase(TestCaseWithTransport): def start_server(self, t): backend = BzrBackend(t) server = BzrTCPGitServer(backend, 'localhost', port=0) self.addCleanup(server.shutdown) thread = threading.Thread(target=server.serve).start() self._server = server _, port = self._server.socket.getsockname() return port class TestPlainFetch(GitServerTestCase): def test_fetch_simple(self): wt = self.make_branch_and_tree('t') self.build_tree(['t/foo']) wt.add('foo') revid = wt.commit(message="some data") wt.branch.tags.set_tag("atag", revid) t = self.get_transport('t') port = self.start_server(t) c = TCPGitClient('localhost', port=port) gitrepo = Repo.init('gitrepo', mkdir=True) ret = c.fetch('/', gitrepo) self.assertEquals( set(ret.refs.keys()), set(["refs/tags/atag", "HEAD"])) def test_fetch_nothing(self): wt = self.make_branch_and_tree('t') self.build_tree(['t/foo']) wt.add('foo') revid = wt.commit(message="some data") wt.branch.tags.set_tag("atag", revid) t = self.get_transport('t') port = self.start_server(t) c = TCPGitClient('localhost', port=port) gitrepo = Repo.init('gitrepo', mkdir=True) ret = c.fetch('/', gitrepo, determine_wants=lambda x: []) self.assertEquals( set(ret.refs.keys()), set(["refs/tags/atag", "HEAD"])) bzr-git-0.6.13+bzr1649/tests/test_transportgit.py0000644000000000000000000000440313165530605017742 0ustar 00000000000000# Copyright (C) 2010 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Tests for bzr-git's object store.""" from dulwich.objects import Blob from dulwich.tests.test_object_store import PackBasedObjectStoreTests from dulwich.tests.utils import make_object from ....tests import TestCaseWithTransport from ..transportgit import TransportObjectStore class TransportObjectStoreTests(PackBasedObjectStoreTests, TestCaseWithTransport): def setUp(self): TestCaseWithTransport.setUp(self) self.store = TransportObjectStore.init(self.get_transport()) def tearDown(self): PackBasedObjectStoreTests.tearDown(self) TestCaseWithTransport.tearDown(self) def test_remembers_packs(self): self.store.add_object(make_object(Blob, data="data")) self.assertEqual(0, len(self.store.packs)) self.store.pack_loose_objects() self.assertEqual(1, len(self.store.packs)) # Packing a second object creates a second pack. 
self.store.add_object(make_object(Blob, data="more data")) self.store.pack_loose_objects() self.assertEqual(2, len(self.store.packs)) # If we reopen the store, it reloads both packs. restore = TransportObjectStore(self.get_transport()) self.assertEqual(2, len(restore.packs)) # FIXME: Unfortunately RefsContainerTests requires on a specific set of refs existing. # class TransportRefContainerTests(RefsContainerTests, TestCaseWithTransport): # # def setUp(self): # TestCaseWithTransport.setUp(self) # self._refs = TransportRefsContainer(self.get_transport()) bzr-git-0.6.13+bzr1649/tests/test_unpeel_map.py0000644000000000000000000000320213165530605017323 0ustar 00000000000000# Copyright (C) 2011 Jelmer Vernooij # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Tests for the unpeel map.""" from cStringIO import StringIO from ....tests import ( TestCaseWithTransport, ) from ..unpeel_map import ( UnpeelMap, ) class TestUnpeelMap(TestCaseWithTransport): def test_new(self): m = UnpeelMap() self.assertIs(None, m.peel_tag("ab"* 20)) def test_load(self): f = StringIO( "unpeel map version 1\n" "0123456789012345678901234567890123456789: aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa\n") m = UnpeelMap() m.load(f) self.assertEquals("0123456789012345678901234567890123456789", m.peel_tag("aa"*20)) def test_update(self): m = UnpeelMap() m.update({ "0123456789012345678901234567890123456789": set(["aa" * 20]), }) self.assertEquals("0123456789012345678901234567890123456789", m.peel_tag("aa"*20)) bzr-git-0.6.13+bzr1649/tests/test_workingtree.py0000644000000000000000000000231313165530605017540 0ustar 00000000000000# Copyright (C) 2010 Jelmer Vernooij # Copyright (C) 2011 Canonical Ltd. # # This program is free software; you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation; either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program; if not, write to the Free Software # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA """Tests for Git working trees.""" from .... import ( conflicts as _mod_conflicts, ) from ....tests import TestCaseWithTransport class GitWorkingTreeTests(TestCaseWithTransport): def setUp(self): super(GitWorkingTreeTests, self).setUp() self.tree = self.make_branch_and_tree('.', format="git") def test_conflicts(self): self.assertIsInstance(self.tree.conflicts(), _mod_conflicts.ConflictList)
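The tests above repeatedly convert between Git commit SHAs and Bazaar revision ids through the v1 mapping. The following is a minimal illustrative sketch of that conversion (not part of the test suite), assuming the plugin is importable as bzrlib.plugins.git, which is how the test runner loads it:

# Sketch only: mirrors the behaviour checked by TestRevidConversionV1.
from bzrlib.plugins.git.mapping import BzrGitMappingv1

mapping = BzrGitMappingv1()
gitsha = "c6a4d8f1fa4ac650748e647c4b1b368f589a7356"

# git -> bzr: the revision id is the hex sha prefixed with the mapping name.
revid = mapping.revision_id_foreign_to_bzr(gitsha)
assert revid == "git-v1:" + gitsha

# bzr -> git: returns the sha together with the mapping that produced it.
sha, used_mapping = mapping.revision_id_bzr_to_foreign(revid)
assert sha == gitsha

Revision ids of this "git-v1:<sha>" form are what the fetch and push tests compare against newrepo.all_revision_ids().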