xapers-0.5.2/.gitignore:

*~
*.pyc
dist
build
xapers.egg-info

xapers-0.5.2/COPYING:

Xapers is free software.  You can redistribute it and/or modify it
under the terms of the GNU General Public License as published by the
Free Software Foundation, either version 3 of the License, or (at your
option) any later version.

This program is distributed in the hope that it will be useful, but
WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
General Public License for more details.

You should have received a copy of the GNU General Public License along
with this program, (in the COPYING-GPL-3 file in this directory).  If
not, see http://www.gnu.org/licenses/

xapers-0.5.2/COPYING-GPL-3:

                    GNU GENERAL PUBLIC LICENSE
                       Version 3, 29 June 2007

 Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.

                            Preamble

The GNU General Public License is a free, copyleft license for
software and other kinds of works.

The licenses for most software and other practical works are designed
to take away your freedom to share and change the works.  By contrast,
the GNU General Public License is intended to guarantee your freedom
to share and change all versions of a program--to make sure it remains
free software for all its users.
We, the Free Software Foundation, use the GNU General Public License for most of our software; it applies also to any other work released this way by its authors. You can apply it to your programs, too. When we speak of free software, we are referring to freedom, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for them if you wish), that you receive source code or can get it if you want it, that you can change the software or use pieces of it in new free programs, and that you know you can do these things. To protect your rights, we need to prevent others from denying you these rights or asking you to surrender the rights. Therefore, you have certain responsibilities if you distribute copies of the software, or if you modify it: responsibilities to respect the freedom of others. For example, if you distribute copies of such a program, whether gratis or for a fee, you must pass on to the recipients the same freedoms that you received. You must make sure that they, too, receive or can get the source code. And you must show them these terms so they know their rights. Developers that use the GNU GPL protect your rights with two steps: (1) assert copyright on the software, and (2) offer you this License giving you legal permission to copy, distribute and/or modify it. For the developers' and authors' protection, the GPL clearly explains that there is no warranty for this free software. For both users' and authors' sake, the GPL requires that modified versions be marked as changed, so that their problems will not be attributed erroneously to authors of previous versions. Some devices are designed to deny users access to install or run modified versions of the software inside them, although the manufacturer can do so. This is fundamentally incompatible with the aim of protecting users' freedom to change the software. 
The systematic pattern of such abuse occurs in the area of products for individuals to use, which is precisely where it is most unacceptable. Therefore, we have designed this version of the GPL to prohibit the practice for those products. If such problems arise substantially in other domains, we stand ready to extend this provision to those domains in future versions of the GPL, as needed to protect the freedom of users. Finally, every program is threatened constantly by software patents. States should not allow patents to restrict development and use of software on general-purpose computers, but in those that do, we wish to avoid the special danger that patents applied to a free program could make it effectively proprietary. To prevent this, the GPL assures that patents cannot be used to render the program non-free. The precise terms and conditions for copying, distribution and modification follow. TERMS AND CONDITIONS 0. Definitions. "This License" refers to version 3 of the GNU General Public License. "Copyright" also means copyright-like laws that apply to other kinds of works, such as semiconductor masks. "The Program" refers to any copyrightable work licensed under this License. Each licensee is addressed as "you". "Licensees" and "recipients" may be individuals or organizations. To "modify" a work means to copy from or adapt all or part of the work in a fashion requiring copyright permission, other than the making of an exact copy. The resulting work is called a "modified version" of the earlier work or a work "based on" the earlier work. A "covered work" means either the unmodified Program or a work based on the Program. To "propagate" a work means to do anything with it that, without permission, would make you directly or secondarily liable for infringement under applicable copyright law, except executing it on a computer or modifying a private copy. 
Propagation includes copying, distribution (with or without modification), making available to the public, and in some countries other activities as well. To "convey" a work means any kind of propagation that enables other parties to make or receive copies. Mere interaction with a user through a computer network, with no transfer of a copy, is not conveying. An interactive user interface displays "Appropriate Legal Notices" to the extent that it includes a convenient and prominently visible feature that (1) displays an appropriate copyright notice, and (2) tells the user that there is no warranty for the work (except to the extent that warranties are provided), that licensees may convey the work under this License, and how to view a copy of this License. If the interface presents a list of user commands or options, such as a menu, a prominent item in the list meets this criterion. 1. Source Code. The "source code" for a work means the preferred form of the work for making modifications to it. "Object code" means any non-source form of a work. A "Standard Interface" means an interface that either is an official standard defined by a recognized standards body, or, in the case of interfaces specified for a particular programming language, one that is widely used among developers working in that language. The "System Libraries" of an executable work include anything, other than the work as a whole, that (a) is included in the normal form of packaging a Major Component, but which is not part of that Major Component, and (b) serves only to enable use of the work with that Major Component, or to implement a Standard Interface for which an implementation is available to the public in source code form. A "Major Component", in this context, means a major essential component (kernel, window system, and so on) of the specific operating system (if any) on which the executable work runs, or a compiler used to produce the work, or an object code interpreter used to run it. 
The "Corresponding Source" for a work in object code form means all the source code needed to generate, install, and (for an executable work) run the object code and to modify the work, including scripts to control those activities. However, it does not include the work's System Libraries, or general-purpose tools or generally available free programs which are used unmodified in performing those activities but which are not part of the work. For example, Corresponding Source includes interface definition files associated with source files for the work, and the source code for shared libraries and dynamically linked subprograms that the work is specifically designed to require, such as by intimate data communication or control flow between those subprograms and other parts of the work. The Corresponding Source need not include anything that users can regenerate automatically from other parts of the Corresponding Source. The Corresponding Source for a work in source code form is that same work. 2. Basic Permissions. All rights granted under this License are granted for the term of copyright on the Program, and are irrevocable provided the stated conditions are met. This License explicitly affirms your unlimited permission to run the unmodified Program. The output from running a covered work is covered by this License only if the output, given its content, constitutes a covered work. This License acknowledges your rights of fair use or other equivalent, as provided by copyright law. You may make, run and propagate covered works that you do not convey, without conditions so long as your license otherwise remains in force. You may convey covered works to others for the sole purpose of having them make modifications exclusively for you, or provide you with facilities for running those works, provided that you comply with the terms of this License in conveying all material for which you do not control copyright. 
Those thus making or running the covered works for you must do so exclusively on your behalf, under your direction and control, on terms that prohibit them from making any copies of your copyrighted material outside their relationship with you. Conveying under any other circumstances is permitted solely under the conditions stated below. Sublicensing is not allowed; section 10 makes it unnecessary. 3. Protecting Users' Legal Rights From Anti-Circumvention Law. No covered work shall be deemed part of an effective technological measure under any applicable law fulfilling obligations under article 11 of the WIPO copyright treaty adopted on 20 December 1996, or similar laws prohibiting or restricting circumvention of such measures. When you convey a covered work, you waive any legal power to forbid circumvention of technological measures to the extent such circumvention is effected by exercising rights under this License with respect to the covered work, and you disclaim any intention to limit operation or modification of the work as a means of enforcing, against the work's users, your or third parties' legal rights to forbid circumvention of technological measures. 4. Conveying Verbatim Copies. You may convey verbatim copies of the Program's source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice; keep intact all notices stating that this License and any non-permissive terms added in accord with section 7 apply to the code; keep intact all notices of the absence of any warranty; and give all recipients a copy of this License along with the Program. You may charge any price or no price for each copy that you convey, and you may offer support or warranty protection for a fee. 5. Conveying Modified Source Versions. 
You may convey a work based on the Program, or the modifications to produce it from the Program, in the form of source code under the terms of section 4, provided that you also meet all of these conditions: a) The work must carry prominent notices stating that you modified it, and giving a relevant date. b) The work must carry prominent notices stating that it is released under this License and any conditions added under section 7. This requirement modifies the requirement in section 4 to "keep intact all notices". c) You must license the entire work, as a whole, under this License to anyone who comes into possession of a copy. This License will therefore apply, along with any applicable section 7 additional terms, to the whole of the work, and all its parts, regardless of how they are packaged. This License gives no permission to license the work in any other way, but it does not invalidate such permission if you have separately received it. d) If the work has interactive user interfaces, each must display Appropriate Legal Notices; however, if the Program has interactive interfaces that do not display Appropriate Legal Notices, your work need not make them do so. A compilation of a covered work with other separate and independent works, which are not by their nature extensions of the covered work, and which are not combined with it such as to form a larger program, in or on a volume of a storage or distribution medium, is called an "aggregate" if the compilation and its resulting copyright are not used to limit the access or legal rights of the compilation's users beyond what the individual works permit. Inclusion of a covered work in an aggregate does not cause this License to apply to the other parts of the aggregate. 6. Conveying Non-Source Forms. 
You may convey a covered work in object code form under the terms of sections 4 and 5, provided that you also convey the machine-readable Corresponding Source under the terms of this License, in one of these ways: a) Convey the object code in, or embodied in, a physical product (including a physical distribution medium), accompanied by the Corresponding Source fixed on a durable physical medium customarily used for software interchange. b) Convey the object code in, or embodied in, a physical product (including a physical distribution medium), accompanied by a written offer, valid for at least three years and valid for as long as you offer spare parts or customer support for that product model, to give anyone who possesses the object code either (1) a copy of the Corresponding Source for all the software in the product that is covered by this License, on a durable physical medium customarily used for software interchange, for a price no more than your reasonable cost of physically performing this conveying of source, or (2) access to copy the Corresponding Source from a network server at no charge. c) Convey individual copies of the object code with a copy of the written offer to provide the Corresponding Source. This alternative is allowed only occasionally and noncommercially, and only if you received the object code with such an offer, in accord with subsection 6b. d) Convey the object code by offering access from a designated place (gratis or for a charge), and offer equivalent access to the Corresponding Source in the same way through the same place at no further charge. You need not require recipients to copy the Corresponding Source along with the object code. If the place to copy the object code is a network server, the Corresponding Source may be on a different server (operated by you or a third party) that supports equivalent copying facilities, provided you maintain clear directions next to the object code saying where to find the Corresponding Source. 
Regardless of what server hosts the Corresponding Source, you remain obligated to ensure that it is available for as long as needed to satisfy these requirements. e) Convey the object code using peer-to-peer transmission, provided you inform other peers where the object code and Corresponding Source of the work are being offered to the general public at no charge under subsection 6d. A separable portion of the object code, whose source code is excluded from the Corresponding Source as a System Library, need not be included in conveying the object code work. A "User Product" is either (1) a "consumer product", which means any tangible personal property which is normally used for personal, family, or household purposes, or (2) anything designed or sold for incorporation into a dwelling. In determining whether a product is a consumer product, doubtful cases shall be resolved in favor of coverage. For a particular product received by a particular user, "normally used" refers to a typical or common use of that class of product, regardless of the status of the particular user or of the way in which the particular user actually uses, or expects or is expected to use, the product. A product is a consumer product regardless of whether the product has substantial commercial, industrial or non-consumer uses, unless such uses represent the only significant mode of use of the product. "Installation Information" for a User Product means any methods, procedures, authorization keys, or other information required to install and execute modified versions of a covered work in that User Product from a modified version of its Corresponding Source. The information must suffice to ensure that the continued functioning of the modified object code is in no case prevented or interfered with solely because modification has been made. 
If you convey an object code work under this section in, or with, or specifically for use in, a User Product, and the conveying occurs as part of a transaction in which the right of possession and use of the User Product is transferred to the recipient in perpetuity or for a fixed term (regardless of how the transaction is characterized), the Corresponding Source conveyed under this section must be accompanied by the Installation Information. But this requirement does not apply if neither you nor any third party retains the ability to install modified object code on the User Product (for example, the work has been installed in ROM). The requirement to provide Installation Information does not include a requirement to continue to provide support service, warranty, or updates for a work that has been modified or installed by the recipient, or for the User Product in which it has been modified or installed. Access to a network may be denied when the modification itself materially and adversely affects the operation of the network or violates the rules and protocols for communication across the network. Corresponding Source conveyed, and Installation Information provided, in accord with this section must be in a format that is publicly documented (and with an implementation available to the public in source code form), and must require no special password or key for unpacking, reading or copying. 7. Additional Terms. "Additional permissions" are terms that supplement the terms of this License by making exceptions from one or more of its conditions. Additional permissions that are applicable to the entire Program shall be treated as though they were included in this License, to the extent that they are valid under applicable law. If additional permissions apply only to part of the Program, that part may be used separately under those permissions, but the entire Program remains governed by this License without regard to the additional permissions. 
When you convey a copy of a covered work, you may at your option remove any additional permissions from that copy, or from any part of it. (Additional permissions may be written to require their own removal in certain cases when you modify the work.) You may place additional permissions on material, added by you to a covered work, for which you have or can give appropriate copyright permission. Notwithstanding any other provision of this License, for material you add to a covered work, you may (if authorized by the copyright holders of that material) supplement the terms of this License with terms: a) Disclaiming warranty or limiting liability differently from the terms of sections 15 and 16 of this License; or b) Requiring preservation of specified reasonable legal notices or author attributions in that material or in the Appropriate Legal Notices displayed by works containing it; or c) Prohibiting misrepresentation of the origin of that material, or requiring that modified versions of such material be marked in reasonable ways as different from the original version; or d) Limiting the use for publicity purposes of names of licensors or authors of the material; or e) Declining to grant rights under trademark law for use of some trade names, trademarks, or service marks; or f) Requiring indemnification of licensors and authors of that material by anyone who conveys the material (or modified versions of it) with contractual assumptions of liability to the recipient, for any liability that these contractual assumptions directly impose on those licensors and authors. All other non-permissive additional terms are considered "further restrictions" within the meaning of section 10. If the Program as you received it, or any part of it, contains a notice stating that it is governed by this License along with a term that is a further restriction, you may remove that term. 
If a license document contains a further restriction but permits relicensing or conveying under this License, you may add to a covered work material governed by the terms of that license document, provided that the further restriction does not survive such relicensing or conveying. If you add terms to a covered work in accord with this section, you must place, in the relevant source files, a statement of the additional terms that apply to those files, or a notice indicating where to find the applicable terms. Additional terms, permissive or non-permissive, may be stated in the form of a separately written license, or stated as exceptions; the above requirements apply either way. 8. Termination. You may not propagate or modify a covered work except as expressly provided under this License. Any attempt otherwise to propagate or modify it is void, and will automatically terminate your rights under this License (including any patent licenses granted under the third paragraph of section 11). However, if you cease all violation of this License, then your license from a particular copyright holder is reinstated (a) provisionally, unless and until the copyright holder explicitly and finally terminates your license, and (b) permanently, if the copyright holder fails to notify you of the violation by some reasonable means prior to 60 days after the cessation. Moreover, your license from a particular copyright holder is reinstated permanently if the copyright holder notifies you of the violation by some reasonable means, this is the first time you have received notice of violation of this License (for any work) from that copyright holder, and you cure the violation prior to 30 days after your receipt of the notice. Termination of your rights under this section does not terminate the licenses of parties who have received copies or rights from you under this License. 
If your rights have been terminated and not permanently reinstated, you do not qualify to receive new licenses for the same material under section 10. 9. Acceptance Not Required for Having Copies. You are not required to accept this License in order to receive or run a copy of the Program. Ancillary propagation of a covered work occurring solely as a consequence of using peer-to-peer transmission to receive a copy likewise does not require acceptance. However, nothing other than this License grants you permission to propagate or modify any covered work. These actions infringe copyright if you do not accept this License. Therefore, by modifying or propagating a covered work, you indicate your acceptance of this License to do so. 10. Automatic Licensing of Downstream Recipients. Each time you convey a covered work, the recipient automatically receives a license from the original licensors, to run, modify and propagate that work, subject to this License. You are not responsible for enforcing compliance by third parties with this License. An "entity transaction" is a transaction transferring control of an organization, or substantially all assets of one, or subdividing an organization, or merging organizations. If propagation of a covered work results from an entity transaction, each party to that transaction who receives a copy of the work also receives whatever licenses to the work the party's predecessor in interest had or could give under the previous paragraph, plus a right to possession of the Corresponding Source of the work from the predecessor in interest, if the predecessor has it or can get it with reasonable efforts. You may not impose any further restrictions on the exercise of the rights granted or affirmed under this License. 
For example, you may not impose a license fee, royalty, or other charge for exercise of rights granted under this License, and you may not initiate litigation (including a cross-claim or counterclaim in a lawsuit) alleging that any patent claim is infringed by making, using, selling, offering for sale, or importing the Program or any portion of it. 11. Patents. A "contributor" is a copyright holder who authorizes use under this License of the Program or a work on which the Program is based. The work thus licensed is called the contributor's "contributor version". A contributor's "essential patent claims" are all patent claims owned or controlled by the contributor, whether already acquired or hereafter acquired, that would be infringed by some manner, permitted by this License, of making, using, or selling its contributor version, but do not include claims that would be infringed only as a consequence of further modification of the contributor version. For purposes of this definition, "control" includes the right to grant patent sublicenses in a manner consistent with the requirements of this License. Each contributor grants you a non-exclusive, worldwide, royalty-free patent license under the contributor's essential patent claims, to make, use, sell, offer for sale, import and otherwise run, modify and propagate the contents of its contributor version. In the following three paragraphs, a "patent license" is any express agreement or commitment, however denominated, not to enforce a patent (such as an express permission to practice a patent or covenant not to sue for patent infringement). To "grant" such a patent license to a party means to make such an agreement or commitment not to enforce a patent against the party. 
If you convey a covered work, knowingly relying on a patent license, and the Corresponding Source of the work is not available for anyone to copy, free of charge and under the terms of this License, through a publicly available network server or other readily accessible means, then you must either (1) cause the Corresponding Source to be so available, or (2) arrange to deprive yourself of the benefit of the patent license for this particular work, or (3) arrange, in a manner consistent with the requirements of this License, to extend the patent license to downstream recipients. "Knowingly relying" means you have actual knowledge that, but for the patent license, your conveying the covered work in a country, or your recipient's use of the covered work in a country, would infringe one or more identifiable patents in that country that you have reason to believe are valid. If, pursuant to or in connection with a single transaction or arrangement, you convey, or propagate by procuring conveyance of, a covered work, and grant a patent license to some of the parties receiving the covered work authorizing them to use, propagate, modify or convey a specific copy of the covered work, then the patent license you grant is automatically extended to all recipients of the covered work and works based on it. A patent license is "discriminatory" if it does not include within the scope of its coverage, prohibits the exercise of, or is conditioned on the non-exercise of one or more of the rights that are specifically granted under this License. 
You may not convey a covered work if you are a party to an arrangement with a third party that is in the business of distributing software, under which you make payment to the third party based on the extent of your activity of conveying the work, and under which the third party grants, to any of the parties who would receive the covered work from you, a discriminatory patent license (a) in connection with copies of the covered work conveyed by you (or copies made from those copies), or (b) primarily for and in connection with specific products or compilations that contain the covered work, unless you entered into that arrangement, or that patent license was granted, prior to 28 March 2007. Nothing in this License shall be construed as excluding or limiting any implied license or other defenses to infringement that may otherwise be available to you under applicable patent law. 12. No Surrender of Others' Freedom. If conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. If you cannot convey a covered work so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not convey it at all. For example, if you agree to terms that obligate you to collect a royalty for further conveying from those to whom you convey the Program, the only way you could satisfy both those terms and this License would be to refrain entirely from conveying the Program. 13. Use with the GNU Affero General Public License. Notwithstanding any other provision of this License, you have permission to link or combine any covered work with a work licensed under version 3 of the GNU Affero General Public License into a single combined work, and to convey the resulting work. 
The terms of this License will continue to apply to the part which is the covered work, but the special requirements of the GNU Affero General Public License, section 13, concerning interaction through a network will apply to the combination as such. 14. Revised Versions of this License. The Free Software Foundation may publish revised and/or new versions of the GNU General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns. Each version is given a distinguishing version number. If the Program specifies that a certain numbered version of the GNU General Public License "or any later version" applies to it, you have the option of following the terms and conditions either of that numbered version or of any later version published by the Free Software Foundation. If the Program does not specify a version number of the GNU General Public License, you may choose any version ever published by the Free Software Foundation. If the Program specifies that a proxy can decide which future versions of the GNU General Public License can be used, that proxy's public statement of acceptance of a version permanently authorizes you to choose that version for the Program. Later license versions may give you additional or different permissions. However, no additional obligations are imposed on any author or copyright holder as a result of your choosing to follow a later version. 15. Disclaimer of Warranty. THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. 
SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL
NECESSARY SERVICING, REPAIR OR CORRECTION.

  16. Limitation of Liability.

IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR
CONVEYS THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES
ARISING OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT
NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR
LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM
TO OPERATE WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER OR OTHER
PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.

  17. Interpretation of Sections 15 and 16.

If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.

                     END OF TERMS AND CONDITIONS

            How to Apply These Terms to Your New Programs

If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these
terms.

To do so, attach the following notices to the program.  It is safest
to attach them to the start of each source file to most effectively
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.

    <one line to give the program's name and a brief idea of what it does.>
    Copyright (C) <year>  <name of author>

    This program is free software: you can redistribute it and/or modify
    it under the terms of the GNU General Public License as published by
    the Free Software Foundation, either version 3 of the License, or
    (at your option) any later version.

    This program is distributed in the hope that it will be useful,
    but WITHOUT ANY WARRANTY; without even the implied warranty of
    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
    GNU General Public License for more details.

    You should have received a copy of the GNU General Public License
    along with this program.  If not, see <http://www.gnu.org/licenses/>.

Also add information on how to contact you by electronic and paper
mail.

If the program does terminal interaction, make it output a short
notice like this when it starts in an interactive mode:

    <program>  Copyright (C) <year>  <name of author>
    This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
    This is free software, and you are welcome to redistribute it
    under certain conditions; type `show c' for details.

The hypothetical commands `show w' and `show c' should show the
appropriate parts of the General Public License.  Of course, your
program's commands might be different; for a GUI interface, you would
use an "about box".

You should also get your employer (if you work as a programmer) or
school, if any, to sign a "copyright disclaimer" for the program, if
necessary.  For more information on this, and how to apply and follow
the GNU GPL, see <http://www.gnu.org/licenses/>.

The GNU General Public License does not permit incorporating your
program into proprietary programs.  If your program is a subroutine
library, you may consider it more useful to permit linking proprietary
applications with the library.  If this is what you want to do, use
the GNU Lesser General Public License instead of this License.  But
first, please read <http://www.gnu.org/philosophy/why-not-lgpl.html>.
xapers-0.5.2/Makefile000066400000000000000000000017331214264575400144760ustar00rootroot00000000000000# -*- makefile -*- VERSION:=$(shell git describe --tags | sed -e s/_/~/ -e s/-/+/ -e s/-/~/) PV_FILE=lib/xapers/version.py .PHONY: all all: .PHONY: test test: ./test/xapers-test .PHONY: update-version update-version: echo "__version__ = '$(VERSION)'" >$(PV_FILE) .PHONY: release ifdef V update-version: VERSION:=$(V) release: VERSION:=$(V) release: update-version make test git commit -m "Update version for release $(VERSION)." $(PV_FILE) git tag --sign -m "Xapers $(VERSION) release." $(VERSION) else release: git tag -l | grep -v debian/ endif .PHONY: debian-snapshot debian-snapshot: rm -rf build/snapshot mkdir -p build/snapshot/debian git archive HEAD | tar -x -C build/snapshot/ git archive debian:debian | tar -x -C build/snapshot/debian/ cd build/snapshot; make update-version cd build/snapshot; dch -b -v $(VERSION) -D UNRELEASED 'test build, not for upload' cd build/snapshot; echo '3.0 (native)' > debian/source/format cd build/snapshot; debuild -us -uc xapers-0.5.2/README000066400000000000000000000107401214264575400137140ustar00rootroot00000000000000Xapers ====== Xapers is a personal document indexing system, geared towards academic journal articles. Think of it as your own personal document search engine, or a local cache of online libraries. It provides fast search of document text and bibliographic data and simple document and bibtex retrieval. Document files (in PDF format) and source identifiers (e.g. DOI) are parsed and indexed into a Xapian search engine [0]. Document text is extracted from the PDF and fully indexed. Bibliographic information downloaded from online libraries is indexed as prefixed search terms. Existing bibtex databases can be easily imported as well, including import of pdf files specified in Jabref/Mendeley format. Documents can be arbitrarily tagged. Original document files are easily retrievable from a simple curses search UI. 
The command line interface allows for exporting bibtex [1] from
arbitrary searches, allowing seamless integration into LaTeX
workflows.

Xapers provides source modules for some common online libraries:

 * DOI: http://www.doi.org/
 * arXiv: http://arxiv.org/

Contributions of additional library interface modules are highly
encouraged.

Xapers is heavily inspired by the notmuch mail indexing system [2].

[0] http://www.xapian.org/
[1] http://www.bibtex.org/
[2] http://notmuchmail.org/

Source
------

Clone the repo:

  $ git clone git://finestructure.net/xapers
  $ cd xapers

Dependencies:

 * python (>= 2.6)
 * python-xapian - Python Xapian search engine bindings
 * poppler-utils - PDF processing tools
 * pycurl - Python bindings to libcurl
 * pybtex - Python bibtex parser

Recommends (for curses UI):

 * python-urwid - Python Urwid curses library
 * xdg-utils - Desktop tools for opening files and URLs
 * xclip - X clipboard support for copying document fields

On Debian:

  $ sudo apt-get install python-xapian poppler-utils python-pycurl pybtex python-urwid xdg-utils xclip

Tests
-----

Run the tests:

  $ make test

Debian
------

Debian/Ubuntu snapshot packages can be easily made from the git
source. You can build the package from any branch, but it requires an
up-to-date local branch of origin/debian, e.g.:

  $ git branch debian origin/debian

Then:

  $ sudo apt-get install build-essential devscripts pkg-config python-all-dev python-setuptools debhelper dpkg-dev fakeroot
  $ make debian-snapshot
  $ sudo dpkg -i build/xapers_0.1_amd64.deb

Using Xapers
============

See the included xapers(1) man page for detailed usage and information
on source modules and searching.

Command line interface
----------------------

The main interface to Xapers is the xapers command line utility. From
this interface you can import documents, search, tag, etc.

The "add" command allows importing or updating single documents with
sources. The "import" command allows importing an entire bibtex
database (.bib file).
If the bibtex entries include "file" fields (a la Mendeley or Jabref),
then those files are retrieved, indexed, and imported as well.

Curses interface
----------------

The curses interface (accessed through 'xapers show ...') provides a
simple way to search the database and retrieve files. Documents
matching searches are displayed with their bibliographic information
and a short text summary. Document tags can be manipulated, files can
be viewed, and source URLs can be opened in a browser.

xapers-adder
------------

xapers-adder is a simple script that helps add individual documents to
your Xapers database. It can be used e.g. as a PDF handler in your
favorite browser. It displays the PDF, then presents the user with the
option to import the document into Xapers. The user is prompted for
any sources to retrieve and any initial tags to add. If the source is
known, bibtex is retrieved and indexed. The resulting xapers entry for
the document is displayed.

Development of more clever import methods is highly encouraged.

Python library
--------------

Xapers is really a python library interface under the hood:

>>> import xapers
>>> db = xapers.Database('~/.xapers/docs')
>>> docs = db.search('tag:new')
>>> for doc in docs: doc.add_tags(['foo'])
...
>>>

Development of new interfaces to the underlying library is highly
encouraged.

Contact
=======

Xapers was written by:

    Jameson Graef Rollins

Feel free to contact him directly with any questions, comments,
feedback, patches, etc. He also hangs out on IRC:

    server: irc.freenode.net
    channel: #xapers
xapers-0.5.2/TODO000066400000000000000000000026161214264575400135270ustar00rootroot00000000000000db:

* add values: publication date, entry creation time, mod time, etc.
cli:

* utilize meta-data pulled from parser
* update should re-pull from existing source if available
* export should produce full mirror of xapers document structure
* needs simplification (script + module is clunky)
* improve command line parsing

nci:

* how to test??
* asynchronously load search results
* update/add in search
* better entry highlighting (full entry)
* fail gracefully if db has changed
* customizable keybindings
* customizable palette

sources:

* allow for user source modules (inherit Source)
* add method to download document file

parser:

* extract any metadata from pdfs
* better handle parse errors
* better pdf parser (native python: https://gist.github.com/pazz/5455090)
* parsers for other document types

bib:

* test for compiling bibtex
* do something about "container-title" from doi
* provide bib output in other formats

emacs UI

* make emacs UI! (need json/sexp output)

?

* rename file when importing and copying into docdir?
* store bib in different form (json instead of bibtex)?
* should something other than summary go in document "data"?
* clear old indexed terms when importing new file/bib?
* vcs integration (git of root)?

packaging:

* set version in setup.py

BUGS
====

* nci: 'b' on doc with no bib produces traceback
* capitalized prefixed terms are not searchable
  - dcc:T00000
  - key:HaEA2009a
xapers-0.5.2/bin/000077500000000000000000000000001214264575400136025ustar00rootroot00000000000000xapers-0.5.2/bin/xapers000077500000000000000000000313751214264575400150420ustar00rootroot00000000000000#!/usr/bin/env python import os import sys import pkg_resources import xapers import xapers.cli import xapers.source ######################################################################## # combine a list of terms with spaces between, so that simple queries # don't have to be quoted at the shell level. def make_query_string(terms, require=True): string = str.join(' ', terms) if string == '': if require: print >>sys.stderr, "Must specify a search term."
sys.exit(1) else: string = '*' return string def import_nci(): try: import xapers.nci except ImportError: print >>sys.stderr, "The python-urwid package does not appear to be installed." print >>sys.stderr, "Please install to be able to use the curses UI." sys.exit(1) ######################################################################## def usage(): prog = os.path.basename(sys.argv[0]) print "Usage:", prog, " [args...]" print """ Commands: add [options] [] Add a new document or update existing. If provided, search should match a single document. --source= source, for retrieving bibtex --file= PDF file to index and archive --tags=[,...] initial tags --prompt prompt for unspecified options --view view entry after adding import Import all entries from a bibtex database. delete Delete documents from database. --noprompt do not prompt to confirm deletion restore Restore database from xapers root. tag +|- [...] [--] Add/remove tags. search [options] Search for documents. --output=[summary|bibtex|tags|sources|keys|files] output format (default is 'summary') --limit=N limit number of results returned (default is 20, use '0' for all) bibtex Short for \"search --output=bibtex\". view View search in curses UI. count Count matches. export Export documents to a directory of files named for document titles. sources List available sources. source2bib Retrieve bibtex for source and print to stdout. scandoc Scan PDF file for source IDs. version Print version number. help This help. The xapers document store is specified by the XAPERS_ROOT environment variable, or defaults to '~/.xapers/docs' if not specified (the directory is allowed to be a symlink). Other definitions: : Documents are assigned unique integer IDs. : A source can be either a URL, a source ID string of the form ':', or a bibtex file. 
Currently recognized sources: doi: arxiv: : Free-form text to match against indexed document text, or the following prefixes can be used to match against specific document metadata: id: Xapers document id author: string in authors (also a:) title: string in title (also t:) tag: specific user tags : specific sid string source: specific source key: specific bibtex citation key The string '*' will match all documents.""" ######################################################################## if __name__ == '__main__': if len(sys.argv) > 1: cmd = sys.argv[1] else: cmd = [] xroot = os.getenv('XAPERS_ROOT', os.path.expanduser(os.path.join('~','.xapers','docs'))) ######################################## if cmd in ['add','a']: cli = xapers.cli.UI(xroot) tags = None infile = None source = None prompt = False view = False query_string = None argc = 2 while True: if argc >= len(sys.argv): break elif '--source=' in sys.argv[argc]: source = sys.argv[argc].split('=',1)[1] elif '--file=' in sys.argv[argc]: infile = sys.argv[argc].split('=',1)[1] elif '--tags=' in sys.argv[argc]: tags = sys.argv[argc].split('=',1)[1].split(',') elif '--prompt' in sys.argv[argc]: prompt = True elif '--view' in sys.argv[argc]: view = True else: break argc += 1 if argc == (len(sys.argv) - 1): query_string = make_query_string(sys.argv[argc:]) try: docid = cli.add(query_string, infile=infile, source=source, tags=tags, prompt=prompt) except KeyboardInterrupt: print >>sys.stderr, '' sys.exit(1) # dereference the cli object so that the database is flushed # FIXME: is there a better way to handle this? 
cli = None if view and docid: import_nci() xapers.nci.UI(xroot, cmd=['search', 'id:'+docid]) ######################################## elif cmd in ['import','i']: cli = xapers.cli.UI(xroot) tags = [] argc = 2 while True: if argc >= len(sys.argv): break elif '--tags=' in sys.argv[argc]: tags = sys.argv[argc].split('=',1)[1].split(',') elif '--overwrite' in sys.argv[argc]: overwrite = True else: break argc += 1 try: bibfile = sys.argv[argc] except IndexError: print >>sys.stderr, "Must specify bibtex file to import." sys.exit(1) if not os.path.exists(bibfile): print >>sys.stderr, "File not found: %s" % bibfile sys.exit(1) try: cli.importbib(bibfile, tags=tags) except KeyboardInterrupt: print >>sys.stderr, '' sys.exit(1) ######################################## elif cmd in ['update-all']: cli = xapers.cli.UI(xroot) cli.update_all() ######################################## elif cmd in ['delete']: prompt = True argc = 2 while True: if argc >= len(sys.argv): break elif '--noprompt' in sys.argv[argc]: prompt = False else: break argc += 1 cli = xapers.cli.UI(xroot) try: cli.delete(make_query_string(sys.argv[argc:]), prompt=prompt) except KeyboardInterrupt: print >>sys.stderr, '' sys.exit(1) ######################################## elif cmd in ['search','s']: cli = xapers.cli.UI(xroot) oformat = 'summary' limit = 20 argc = 2 while True: if argc >= len(sys.argv): break if '--output=' in sys.argv[argc]: oformat = sys.argv[argc].split('=')[1] elif '--limit=' in sys.argv[argc]: limit = int(sys.argv[argc].split('=')[1]) else: break argc += 1 query = make_query_string(sys.argv[argc:]) try: cli.search(query, oformat=oformat, limit=limit) except KeyboardInterrupt: print >>sys.stderr, '' sys.exit(1) ######################################## elif cmd in ['bibtex','bib','b']: cli = xapers.cli.UI(xroot) argc = 2 query = make_query_string(sys.argv[argc:]) try: cli.search(query, oformat='bibtex') except KeyboardInterrupt: print >>sys.stderr, '' sys.exit(1) 
######################################## elif cmd in ['nci','view','show','select']: import_nci() if cmd == 'nci': args = sys.argv[2:] else: query = make_query_string(sys.argv[2:], require=False) args = ['search', query] try: xapers.nci.UI(xroot, cmd=args) except KeyboardInterrupt: print >>sys.stderr, '' sys.exit(1) ######################################## elif cmd in ['tag','t']: cli = xapers.cli.UI(xroot) add_tags = [] remove_tags = [] argc = 2 for arg in sys.argv[argc:]: if argc >= len(sys.argv): break if arg == '--': argc += 1 continue if arg[0] == '+': add_tags.append(arg[1:]) elif arg[0] == '-': remove_tags.append(arg[1:]) else: break argc += 1 if not add_tags and not remove_tags: print >>sys.stderr, "Must specify tags to add or remove." sys.exit(1) if '' in add_tags: print >>sys.stderr, "Null tags not allowed." sys.exit(1) query = make_query_string(sys.argv[argc:]) cli.tag(query, add_tags, remove_tags) ######################################## elif cmd in ['dumpterms']: cli = xapers.cli.UI(xroot) query = make_query_string(sys.argv[2:], require=False) cli.dumpterms(query) ######################################## elif cmd in ['maxid']: db = xapers.cli.initdb(xroot) docid = 0 for doc in db.search('*'): docid = max(docid, int(doc.docid)) print 'id:%d' % docid ######################################## elif cmd in ['count']: cli = xapers.cli.UI(xroot) query = make_query_string(sys.argv[2:], require=False) cli.count(query) ######################################## elif cmd in ['export']: cli = xapers.cli.UI(xroot) outdir = sys.argv[2] query = make_query_string(sys.argv[3:]) cli.export(outdir, query) ######################################## elif cmd in ['restore']: cli = xapers.cli.UI(xroot) cli.restore() ######################################## elif cmd in ['sources']: for source in xapers.source.list_sources(): print source ######################################## elif cmd in ['source2bib','s2b']: try: string = sys.argv[2] except IndexError: print >>sys.stderr, "Must 
specify source to retrieve." sys.exit(1) try: smod = xapers.source.get_source(string) except xapers.source.SourceError as e: print >>sys.stderr, e sys.exit(1) try: print >>sys.stderr, "Retrieving bibtex...", bibtex = smod.get_bibtex() print >>sys.stderr, "done." except Exception, e: print >>sys.stderr, "\n" print >>sys.stderr, "Could not retrieve bibtex: %s" % e sys.exit(1) try: print xapers.bibtex.Bibtex(bibtex)[0].as_string() except Exception, e: print >>sys.stderr, "Error parsing bibtex: %s" % e print >>sys.stderr, "Outputting raw..." print bibtex sys.exit(1) ######################################## elif cmd in ['scandoc','sd']: try: infile = sys.argv[2] except IndexError: print >>sys.stderr, "Must specify document to scan." sys.exit(1) try: sources = xapers.source.scan_file_for_sources(infile) except xapers.parser.ParseError as e: print >>sys.stderr, "Parse error: %s" % e print >>sys.stderr, "Is file '%s' a PDF?" % infile sys.exit(1) for ss in sources: print "%s" % (ss) ######################################## elif cmd in ['version','--version','-v']: print 'xapers', pkg_resources.get_distribution('xapers').version ######################################## elif cmd in ['help','h','--help','-h']: usage() sys.exit(0) ######################################## else: print >>sys.stderr, "Unknown command '%s'." % cmd print >>sys.stderr, "See \"help\" for more information." sys.exit(1) xapers-0.5.2/bin/xapers-adder000077500000000000000000000010261214264575400161060ustar00rootroot00000000000000#!/bin/bash -e if [ -z "$1" ] || [[ "$1" == '--help' ]] || [[ "$1" == '-h' ]]; then echo "usage: $(basename $0) " >&2 exit 1 fi infile="$1" if [ ! -e "$infile" ] ;then echo "File not found: $infile" >&2 exit 1 fi # open the file with preferred application xdg-open "$infile" & # prompt for xapers import x-terminal-emulator \ -title "xapers-adder" \ -e bash -c " echo 'Type C-c at any time to cancel.' 
xapers add --file=\"$infile\" --tags=new --prompt --view read -N1 -p 'any key to quit:' OK " xapers-0.5.2/lib/000077500000000000000000000000001214264575400136005ustar00rootroot00000000000000xapers-0.5.2/lib/xapers/000077500000000000000000000000001214264575400151025ustar00rootroot00000000000000xapers-0.5.2/lib/xapers/__init__.py000066400000000000000000000015711214264575400172170ustar00rootroot00000000000000""" This file is part of xapers. Xapers is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. Xapers is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with notmuch. If not, see . 
Copyright 2012 Jameson Rollins """ from database import Database from database import DatabaseError from database import DatabaseUninitializedError from database import DatabaseLockError from documents import Documents, Document xapers-0.5.2/lib/xapers/bibtex.py000066400000000000000000000121201214264575400167250ustar00rootroot00000000000000import os import sys import io import json import pybtex from pybtex.core import Entry, Person from pybtex.bibtex.utils import split_name_list from pybtex.database.input import bibtex as inparser from pybtex.database.output import bibtex as outparser def clean_bib_string(string): for char in ['{', '}']: string = string.replace(char,'') return string ################################################## class BibtexError(Exception): """Base class for Xapers bibtex exceptions.""" def __init__(self, msg): self.msg = msg def __str__(self): return self.msg ################################################## class Bibtex(): """Represents a bibtex database.""" # http://www.bibtex.org/Format/ def __init__(self, bibtex): parser = inparser.Parser(encoding='utf-8') if os.path.exists(bibtex): try: bibdata = parser.parse_file(bibtex) except Exception, e: raise BibtexError('Error loading bibtex from file: %s' % e ) else: try: with io.StringIO(unicode(bibtex)) as stream: bibdata = parser.parse_stream(stream) except Exception, e: raise BibtexError('Error loading bibtex string: %s' % e ) self.keys = bibdata.entries.keys() self.entries = bibdata.entries.values() self.index = -1 self.max = len(self.entries) def __getitem__(self, index): key = self.keys[index] entry = self.entries[index] return Bibentry(key, entry) def __iter__(self): return self def __len__(self): return self.max def next(self): self.index = self.index + 1 if self.index == self.max: raise StopIteration return self[self.index] ################################################## class Bibentry(): def __init__(self, key, entry): self.key = key self.entry = entry def get_authors(self): 
"""Return a list of authors.""" authors = [] if 'author' in self.entry.persons: for p in self.entry.persons['author']: authors.append(clean_bib_string(unicode(p))) return authors def get_fields(self): """Return a dict of non-author fields.""" bibfields = self.entry.fields # entry.fields is actually already a dict, but we want to # clean the strings first fields = {} for field in bibfields: fields[field] = unicode(clean_bib_string(bibfields[field])) return fields def set_file(self, path): # FIXME: what's the REAL proper format for this self.entry.fields['file'] = ':%s:%s' % (path, 'pdf') def get_file(self): """Returns file path if file field exists. Expects either single path string or Mendeley/Jabref format.""" try: parsed = self.entry.fields['file'].split(':') if len(parsed) > 1: return parsed[1] else: return parsed[0] except KeyError: return None except IndexError: return None def _entry2db(self): db = pybtex.database.BibliographyData() db.add_entry(self.key, self.entry) return db def as_string(self): """Return entry as formatted bibtex string.""" writer = outparser.Writer() with io.StringIO() as stream: writer.write_stream(self._entry2db(), stream) string = stream.getvalue() string = string.strip() return string def to_file(self, path): """Write entry bibtex to file.""" writer = outparser.Writer(encoding='utf-8') writer.write_file(self._entry2db(), path) ################################################## def data2bib(data, key, type='article'): """Convert a python dict into a Bibentry object.""" if not data: return # need to remove authors field from data authors = None if 'authors' in data: authors = data['authors'] if isinstance(authors, str): authors = split_name_list(authors) if len(authors) == 1: authors = authors[0].split(',') del data['authors'] entry = Entry(type, fields=data) if authors: for p in authors: entry.add_person(Person(p), 'author') return Bibentry(key, entry) def json2bib(jsonstring, key, type='article'): """Convert a json string into a 
Bibentry object.""" if not jsonstring: return data = json.loads(jsonstring) # need to remove authors field from data authors = None if 'author' in data: authors = data['author'] del data['author'] if 'issued' in data: data['year'] = str(data['issued']['date-parts'][0][0]) del data['issued'] # delete other problematic fields if 'editor' in data: del data['editor'] entry = Entry(type, fields=data) if authors: for author in authors: entry.add_person(Person(first=author['given'], last=author['family']), 'author') return Bibentry(key, entry) xapers-0.5.2/lib/xapers/cli/000077500000000000000000000000001214264575400156515ustar00rootroot00000000000000xapers-0.5.2/lib/xapers/cli/__init__.py000066400000000000000000000013211214264575400177630ustar00rootroot00000000000000""" This file is part of xapers. Xapers is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. Xapers is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with notmuch. If not, see . Copyright 2012, 2013 Jameson Rollins """ import os import sys import codecs SYS_STDOUT = sys.stdout sys.stdout = codecs.getwriter('utf8')(sys.stdout) import sets import shutil import readline from xapers.database import Database from xapers.documents import Document from xapers.parser import ParseError from xapers.bibtex import Bibtex, BibtexError import xapers.source ############################################################ def initdb(xroot, writable=False, create=False, force=False): try: return Database(xroot, writable=writable, create=create, force=force) except xapers.DatabaseUninitializedError as e: print >>sys.stderr, e.msg print >>sys.stderr, "Import a document to initialize." sys.exit(1) except xapers.DatabaseError as e: print >>sys.stderr, e.msg sys.exit(1) ############################################################ class UI(): """Xapers command-line UI.""" def __init__(self, xroot): self.xroot = xroot self.db = None ########## def prompt_for_file(self, infile): if infile: print >>sys.stderr, 'file: %s' % infile else: readline.set_startup_hook() readline.parse_and_bind('') readline.set_completer() infile = raw_input('file: ') if infile == '': infile = None return infile def prompt_for_source(self, sources): if sources: readline.set_startup_hook(lambda: readline.insert_text(sources[0])) elif self.db: sources = self.db.get_terms('source') readline.parse_and_bind("tab: complete") completer = Completer(sources) readline.set_completer(completer.terms) source = raw_input('source: ') if source == '': source = None return source def prompt_for_tags(self, tags): # always prompt for tags, and append to initial if tags: print >>sys.stderr, 'initial tags: %s' % ' '.join(tags) else: tags = [] if self.db: itags = self.db.get_terms('tag') else: itags = None readline.set_startup_hook() readline.parse_and_bind("tab: 
complete") completer = Completer(itags) readline.set_completer(completer.terms) while True: tag = raw_input('tag: ') if tag and tag != '': tags.append(tag.strip()) else: break return tags ############################################ def add(self, query_string, infile=None, source=None, tags=None, prompt=False): doc = None bibtex = None ################################## # open db and get doc self.db = initdb(self.xroot, writable=True, create=True) # if query provided, find single doc to update if query_string: if self.db.count(query_string) != 1: print >>sys.stderr, "Search did not match a single document. Aborting." sys.exit(1) for doc in self.db.search(query_string): break ################################## # do fancy option prompting if prompt: infile = self.prompt_for_file(infile) if prompt: sources = [] if source: sources = [source] # scan the file for source info if infile: print >>sys.stderr, "Scanning document for source identifiers..." try: ss = xapers.source.scan_file_for_sources(infile) print >>sys.stderr, "%d source ids found:" % (len(sources)) except ParseError, e: print >>sys.stderr, "\n" print >>sys.stderr, "Parse error: %s" % e sys.exit(1) if len(sources) > 0: for sid in ss: print >>sys.stderr, " %s" % (sid) sources += ss source = self.prompt_for_source(sources) tags = self.prompt_for_tags(tags) if not query_string and not infile and not source: print >>sys.stderr, "Must specify file or source to import, or query to update existing document." sys.exit(1) ################################## # process source and get bibtex # check if source is a file, in which case interpret it as bibtex if source and os.path.exists(source): bibtex = source elif source: try: smod = xapers.source.get_source(source) except xapers.source.SourceError as e: print >>sys.stderr, e sys.exit(1) sid = smod.get_sid() if not sid: print >>sys.stderr, "Source ID not specified." 
sys.exit(1) # check that the source doesn't match an existing doc sdoc = self.db.doc_for_source(sid) if sdoc: if sdoc != doc: print >>sys.stderr, "Document already exists for source '%s'. Aborting." % (sid) sys.exit(1) print >>sys.stderr, "Updating existing document..." doc = sdoc try: print >>sys.stderr, "Retrieving bibtex...", bibtex = smod.get_bibtex() print >>sys.stderr, "done." except Exception, e: print >>sys.stderr, "\n" print >>sys.stderr, "Could not retrieve bibtex: %s" % e sys.exit(1) ################################## # if we still don't have a doc, create a new one if not doc: doc = Document(self.db) ################################## # add stuff to the doc if bibtex: try: print >>sys.stderr, "Adding bibtex...", doc.add_bibtex(bibtex) print >>sys.stderr, "done." except BibtexError, e: print >>sys.stderr, "\n" print >>sys.stderr, e print >>sys.stderr, "Bibtex must be a plain text file with a single bibtex entry." sys.exit(1) except: print >>sys.stderr, "\n" raise if infile: path = os.path.abspath(infile) try: print >>sys.stderr, "Adding file '%s'..." % (path), doc.add_file(path) print >>sys.stderr, "done." except ParseError, e: print >>sys.stderr, "\n" print >>sys.stderr, "Parse error: %s" % e sys.exit(1) except: print >>sys.stderr, "\n" raise if tags: try: print >>sys.stderr, "Adding tags...", doc.add_tags(tags) print >>sys.stderr, "done."
except: print >>sys.stderr, "\n" raise ################################## # sync the doc to db and disk try: print >>sys.stderr, "Syncing document...", doc.sync() print >>sys.stderr, "done.\n", except: print >>sys.stderr, "\n" raise print "id:%s" % doc.docid return doc.docid ############################################ def importbib(self, bibfile, tags=[], overwrite=False): self.db = initdb(self.xroot, writable=True, create=True) errors = [] for entry in sorted(Bibtex(bibfile), key=lambda entry: entry.key): print >>sys.stderr, entry.key try: docs = [] # check for doc with this bibkey bdoc = self.db.doc_for_bib(entry.key) if bdoc: docs.append(bdoc) # check for known sids for sid in xapers.source.scan_bibentry_for_sources(entry): sdoc = self.db.doc_for_source(sid) # FIXME: why can't we match docs in list? if sdoc and sdoc.docid not in [doc.docid for doc in docs]: docs.append(sdoc) if len(docs) == 0: doc = Document(self.db) elif len(docs) > 0: if len(docs) > 1: print >>sys.stderr, " Multiple distinct docs found for entry. Using first found." doc = docs[0] print >>sys.stderr, " Updating id:%s..." % (doc.docid) doc.add_bibentry(entry) filepath = entry.get_file() if filepath: print >>sys.stderr, " Adding file: %s" % filepath doc.add_file(filepath) doc.add_tags(tags) doc.sync() except Exception, e: print >>sys.stderr, " Error processing entry %s: %s" % (entry.key, e) print >>sys.stderr errors.append(entry.key) if errors: print >>sys.stderr print >>sys.stderr, "Failed to import %d" % (len(errors)), if len(errors) == 1: print >>sys.stderr, "entry", else: print >>sys.stderr, "entries", print >>sys.stderr, "from bibtex:" for error in errors: print >>sys.stderr, " %s" % (error) sys.exit(1) else: sys.exit(0) ############################################ def delete(self, query_string, prompt=True): self.db = initdb(self.xroot, writable=True) count = self.db.count(query_string) if count == 0: print >>sys.stderr, "No documents found for query." 
sys.exit(1) if prompt: resp = raw_input("Type 'yes' to delete %d documents: " % count) if resp != 'yes': print >>sys.stderr, "Aborting." sys.exit(1) for doc in self.db.search(query_string): doc.purge() ############################################ def update_all(self): self.db = initdb(self.xroot, writable=True) for doc in self.db.search('*', limit=0): try: print >>sys.stderr, "Updating %s..." % doc.docid, doc.update_from_bibtex() doc.sync() print >>sys.stderr, "done." except: print >>sys.stderr, "\n" raise ############################################ def tag(self, query_string, add_tags, remove_tags): self.db = initdb(self.xroot, writable=True) for doc in self.db.search(query_string): doc.add_tags(add_tags) doc.remove_tags(remove_tags) doc.sync() ############################################ def search(self, query_string, oformat='summary', limit=None): if oformat not in ['summary','bibtex','tags','sources','keys','files']: print >>sys.stderr, "Unknown output format." sys.exit(1) self.db = initdb(self.xroot) if oformat == 'tags' and query_string == '*': for tag in self.db.get_terms('tag'): print tag return if oformat == 'sources' and query_string == '*': for source in self.db.get_sids(): print source return if oformat == 'keys' and query_string == '*': for key in self.db.get_terms('key'): print key return otags = set([]) osources = set([]) okeys = set([]) for doc in self.db.search(query_string, limit=limit): docid = doc.get_docid() if oformat in ['file','files']: # FIXME: could this be multiple paths? 
for path in doc.get_fullpaths(): print "%s" % (path) continue tags = doc.get_tags() sources = doc.get_sids() keys = doc.get_keys() if oformat == 'tags': otags = otags | set(tags) continue if oformat == 'sources': osources = osources | set(sources) continue if oformat == 'keys': okeys = okeys | set(keys) continue title = doc.get_title() if not title: title = '' if oformat in ['summary']: print "id:%s [%s] {%s} (%s) \"%s\"" % ( docid, ' '.join(sources), ' '.join(keys), ' '.join(tags), title, ) continue if oformat == 'bibtex': bibtex = doc.get_bibtex() if not bibtex: print >>sys.stderr, "No bibtex for doc id:%s." % docid else: print bibtex print continue if oformat == 'tags': for tag in otags: print tag return if oformat == 'sources': for source in osources: print source return if oformat == 'keys': for key in okeys: print key return ############################################ def count(self, query_string): self.db = initdb(self.xroot) count = self.db.count(query_string) print count ############################################ def dumpterms(self, query_string): self.db = initdb(self.xroot) for doc in self.db.search(query_string): for term in doc.doc: print term.term ############################################ def export(self, outdir, query_string): self.db = initdb(self.xroot) try: os.makedirs(outdir) except: pass for doc in self.db.search(query_string): title = doc.get_title() origpaths = doc.get_fullpaths() nfiles = len(origpaths) for path in origpaths: if not title: name = os.path.basename(os.path.splitext(path)[0]) else: name = '%s' % (title.replace(' ','_')) ind = 0 if nfiles > 1: name += '.%s' % ind ind += 1 name += '.pdf' outpath = os.path.join(outdir,name) print outpath shutil.copyfile(path, outpath.encode('utf-8')) ############################################ def restore(self): self.db = initdb(self.xroot, writable=True, create=True, force=True) self.db.restore(log=True) ############################################################ # readline completion 
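The export() method above derives an output filename from the document title, appending a numeric suffix when a document carries more than one file. A self-contained sketch of that naming scheme (hypothetical helper; the flattened listing obscures whether the counter lives inside or outside the loop, so this sketch keeps it outside, which gives distinct names):

```python
import os

def export_names(title, paths):
    """Mirror of export()'s naming: title-based names, numeric suffix
    for multi-file documents, basename fallback when title is empty."""
    names = []
    ind = 0
    for path in paths:
        if not title:
            # no title: fall back to the original file's basename
            name = os.path.basename(os.path.splitext(path)[0])
        else:
            name = title.replace(' ', '_')
        if len(paths) > 1:
            name += '.%s' % ind
            ind += 1
        name += '.pdf'
        names.append(name)
    return names

assert export_names('A Title', ['/x/a.pdf']) == ['A_Title.pdf']
assert export_names('A Title', ['/x/a.pdf', '/x/b.pdf']) == ['A_Title.0.pdf', 'A_Title.1.pdf']
assert export_names(None, ['/x/orig.pdf']) == ['orig.pdf']
```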
class Completer:
    def __init__(self, words):
        self.words = words

    def terms(self, prefix, index):
        matching_words = [w for w in self.words if w.startswith(prefix)]
        try:
            return matching_words[index]
        except IndexError:
            return None

xapers-0.5.2/lib/xapers/database.py

"""
This file is part of xapers.

Xapers is free software: you can redistribute it and/or modify it
under the terms of the GNU General Public License as published by the
Free Software Foundation, either version 3 of the License, or (at your
option) any later version.

Xapers is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License
for more details.

You should have received a copy of the GNU General Public License
along with this program.  If not, see <http://www.gnu.org/licenses/>.

Copyright 2012, 2013 Jameson Rollins
"""

import os
import sys
import xapian

from source import list_sources
from documents import Documents, Document

# FIXME: add db schema documentation

##################################################

class DatabaseError(Exception):
    """Base class for Xapers database exceptions."""
    def __init__(self, msg):
        self.msg = msg

    def __str__(self):
        return self.msg

class DatabaseUninitializedError(DatabaseError):
    pass

class DatabaseInitializationError(DatabaseError):
    pass

class DatabaseLockError(DatabaseError):
    pass

##################################################

class Database():
    """Represents a Xapers database"""

    # http://xapian.org/docs/omega/termprefixes.html
    BOOLEAN_PREFIX_INTERNAL = {
        # FIXME: use this for doi?

#'url': 'U', 'file': 'P', # FIXME: use this for doc mime type 'type': 'T', } BOOLEAN_PREFIX_EXTERNAL = { 'id': 'Q', 'key': 'XBIB|', 'source': 'XSOURCE|', 'tag': 'K', 'year': 'Y', } PROBABILISTIC_PREFIX = { 'title': 'S', 't': 'S', 'author': 'A', 'a': 'A', } # FIXME: need to set the following value fields: # publication date # added date # modified date # FIXME: need database version def _find_prefix(self, name): if name in self.BOOLEAN_PREFIX_INTERNAL: return self.BOOLEAN_PREFIX_INTERNAL[name] if name in self.BOOLEAN_PREFIX_EXTERNAL: return self.BOOLEAN_PREFIX_EXTERNAL[name] if name in self.PROBABILISTIC_PREFIX: return self.PROBABILISTIC_PREFIX[name] # FIXME: raise internal error for unknown name def _make_source_prefix(self, source): return 'X%s|' % (source.upper()) ######################################## def __init__(self, root, writable=False, create=False, force=False): # xapers root self.root = os.path.abspath(os.path.expanduser(root)) # xapers db directory xapers_path = os.path.join(self.root, '.xapers') # xapes directory initialization if not os.path.exists(xapers_path): if create: if os.path.exists(self.root): if os.listdir(self.root) and not force: raise DatabaseInitializationError('Uninitialized Xapers root directory exists but is not empty.') os.makedirs(xapers_path) else: if os.path.exists(self.root): raise DatabaseUninitializedError("Xapers directory '%s' does not contain database." % (self.root)) else: raise DatabaseUninitializedError("Xapers directory '%s' not found." 
% (self.root)) # the Xapian db xapian_path = os.path.join(xapers_path, 'xapian') if writable: try: self.xapian = xapian.WritableDatabase(xapian_path, xapian.DB_CREATE_OR_OPEN) except xapian.DatabaseLockError: raise DatabaseLockError("Xapers database locked.") else: self.xapian = xapian.Database(xapian_path) stemmer = xapian.Stem("english") # The Xapian TermGenerator # http://trac.xapian.org/wiki/FAQ/TermGenerator self.term_gen = xapian.TermGenerator() self.term_gen.set_stemmer(stemmer) # The Xapian QueryParser self.query_parser = xapian.QueryParser() self.query_parser.set_database(self.xapian) self.query_parser.set_stemmer(stemmer) self.query_parser.set_stemming_strategy(xapian.QueryParser.STEM_SOME) # add boolean internal prefixes for name, prefix in self.BOOLEAN_PREFIX_EXTERNAL.iteritems(): self.query_parser.add_boolean_prefix(name, prefix) # add probabalistic prefixes for name, prefix in self.PROBABILISTIC_PREFIX.iteritems(): self.query_parser.add_prefix(name, prefix) # register known source prefixes # FIXME: can we do this by just finding all XSOURCE terms in # db? Would elliminate dependence on source modules at # search time. for source in list_sources(): self.query_parser.add_boolean_prefix(source, self._make_source_prefix(source)) def __enter__(self): return self def __exit__(self, exc_type, exc_value, traceback): pass def __getitem__(self, docid): if docid.find('id:') == 0: docid = docid.split(':')[1] term = self._find_prefix('id') + str(docid) return self._doc_for_term(term) ######################################## # generate a new doc id, based on the last availabe doc id def _generate_docid(self): return str(self.xapian.get_lastdocid() + 1) # Return the xapers-relative path for a path # If the the specified path is not in the xapers root, return None. def _basename_for_path(self, path): if path.find('/') == 0: if path.find(self.root) == 0: index = len(self.root) + 1 base = path[index:] else: # FIXME: should this be an exception? 
base = None else: base = path full = None if base: full = os.path.join(self.root, base) return base, full def _path_in_db(self, path): base, full = self._basename_for_path(path) if not base: return False else: return True ######################################## # return a list of terms for prefix # FIXME: is this the fastest way to do this? def _get_terms(self, prefix): terms = [] for term in self.xapian: if term.term.find(prefix.encode("utf-8")) == 0: index = len(prefix) terms.append(term.term[index:]) return terms def get_terms(self, name): """Get terms associate with name.""" prefix = self._find_prefix(name) return self._get_terms(prefix) def get_sids(self): """Get all sources in database.""" sids = [] for source in self._get_terms(self._find_prefix('source')): for oid in self._get_terms(self._make_source_prefix(source)): sids.append('%s:%s' % (source, oid)) return sids ######################################## # search for documents based on query string def _search(self, query_string, limit=None): enquire = xapian.Enquire(self.xapian) if query_string == "*": query = xapian.Query.MatchAll else: # parse the query string to produce a Xapian::Query object. 
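`_basename_for_path` above splits a path into a root-relative base and a full path, returning no base when an absolute path lies outside the xapers root. A self-contained sketch of that logic (standalone function with an explicit root argument, rather than the real Database method):

```python
import os

def basename_for_path(root, path):
    """Return (base, full); base is None when an absolute path
    falls outside the xapers root directory."""
    if path.startswith('/'):
        if path.startswith(root):
            base = path[len(root) + 1:]
        else:
            base = None  # absolute path outside the root
    else:
        base = path      # already root-relative
    full = os.path.join(root, base) if base else None
    return base, full

assert basename_for_path('/docs', '/docs/0001/a.pdf') == ('0001/a.pdf', '/docs/0001/a.pdf')
assert basename_for_path('/docs', '/tmp/a.pdf') == (None, None)
assert basename_for_path('/docs', '0001/a.pdf') == ('0001/a.pdf', '/docs/0001/a.pdf')
```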
query = self.query_parser.parse_query(query_string) if os.getenv('XAPERS_DEBUG_QUERY'): print >>sys.stderr, "query string:", query_string print >>sys.stderr, "final query:", query # FIXME: need to catch Xapian::Error when using enquire enquire.set_query(query) # set order of returned docs as newest first # FIXME: make this user specifiable enquire.set_docid_order(xapian.Enquire.DESCENDING) if limit: mset = enquire.get_mset(0, limit) else: mset = enquire.get_mset(0, self.xapian.get_doccount()) return mset def search(self, query_string, limit=0): """Search for documents in the database.""" mset = self._search(query_string, limit) return Documents(self, mset) def count(self, query_string): """Count documents matching search terms.""" return self._search(query_string).get_matches_estimated() def _doc_for_term(self, term): enquire = xapian.Enquire(self.xapian) query = xapian.Query(term) enquire.set_query(query) mset = enquire.get_mset(0, 2) # FIXME: need to throw an exception if more than one match found if mset: return Document(self, mset[0].document) else: return None def doc_for_path(self, path): """Return document for specified path.""" term = self._find_prefix('file') + path return self._doc_for_term(term) def doc_for_source(self, sid): """Return document for source id string.""" source, oid = sid.split(':', 1) term = self._make_source_prefix(source) + oid return self._doc_for_term(term) def doc_for_bib(self, bibkey): """Return document for bibtex key.""" term = self._find_prefix('key') + bibkey return self._doc_for_term(term) ######################################## def replace_document(self, docid, doc): """Replace (sync) document to database.""" docid = int(docid) self.xapian.replace_document(docid, doc) def delete_document(self, docid): """Delete document from database.""" docid = int(docid) self.xapian.delete_document(docid) ######################################## def restore(self, log=False): """Restore a database from an existing root.""" docdirs = 
os.listdir(self.root) docdirs.sort() for ddir in docdirs: if ddir == '.xapers': continue docdir = os.path.join(self.root, ddir) if not os.path.isdir(docdir): # skip things that aren't directories continue if log: print >>sys.stderr, docdir docfiles = os.listdir(docdir) if not docfiles: # skip empty directories continue # if we can't convert the directory name into an integer, # assume it's not relevant to us and continue try: docid = str(int(ddir)) except ValueError: continue if log: print >>sys.stderr, ' docid:', docid doc = self.__getitem__(docid) if not doc: doc = Document(self, docid=docid) for dfile in docfiles: dpath = os.path.join(docdir, dfile) if dfile == 'bibtex': if log: print >>sys.stderr, ' adding bibtex' doc.add_bibtex(dpath) elif os.path.splitext(dpath)[1] == '.pdf': if log: print >>sys.stderr, ' adding file:', dfile doc.add_file(dpath) elif dfile == 'tags': if log: print >>sys.stderr, ' adding tags' with open(dpath, 'r') as f: tags = f.read().strip().split('\n') doc.add_tags(tags) doc.sync() xapers-0.5.2/lib/xapers/documents.py000066400000000000000000000337661214264575400174740ustar00rootroot00000000000000""" This file is part of xapers. Xapers is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. Xapers is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with notmuch. If not, see . 
Copyright 2012, 2013 Jameson Rollins """ import os import shutil import xapian from parser import parse_file from source import get_source, scan_bibentry_for_sources from bibtex import Bibtex ################################################## class DocumentError(Exception): """Base class for Xapers document exceptions.""" def __init__(self, msg): self.msg = msg def __str__(self): return self.msg ################################################## class Documents(): """Represents a set of Xapers documents given a Xapian mset.""" def __init__(self, db, mset): self.db = db self.mset = mset self.index = -1 self.max = len(mset) def __getitem__(self, index): m = self.mset[index] doc = Document(self.db, m.document) doc.matchp = m.percent return doc def __iter__(self): return self def __len__(self): return self.max def next(self): self.index = self.index + 1 if self.index == self.max: raise StopIteration return self[self.index] ################################################## class Document(): """Represents a single Xapers document.""" def __init__(self, db, doc=None, docid=None): # Xapers db self.db = db self.root = self.db.root # if Xapian doc provided, initiate for that document if doc: self.doc = doc self.docid = str(doc.get_docid()) # else, create a new empty document # document won't be added to database until sync is called else: self.doc = xapian.Document() # use specified docid if provided if docid: if self.db[docid]: raise DocumentError('Document already exists for id %s.' 
% docid) self.docid = docid else: self.docid = str(self.db._generate_docid()) self._add_term(self.db._find_prefix('id'), self.docid) # specify a directory in the Xapers root for document data self.docdir = os.path.join(self.root, '%010d' % int(self.docid)) # self.bibentry = None def get_docid(self): """Return document id of document.""" return self.docid def _make_docdir(self): if os.path.exists(self.docdir): if not os.path.isdir(self.docdir): raise DocumentError('File exists at intended docdir location: %s' % self.docdir) else: os.makedirs(self.docdir) def _write_files(self): if '_infiles' in dir(self): for infile, outfile in self._infiles.iteritems(): try: shutil.copyfile(infile, outfile) except shutil.Error: pass def _write_bibfile(self): bibpath = self.get_bibpath() # reload bibtex only if we have new files paths = self.get_fullpaths() if paths: self._load_bib() if self.bibentry: # we put only the first file in the bibtex # FIXME: does jabref/mendeley spec allow for multiple files? if paths and not self.bibentry.get_file(): self.bibentry.set_file(paths[0]) self.bibentry.to_file(bibpath) def _write_tagfile(self): with open(os.path.join(self.docdir, 'tags'), 'w') as f: for tag in self.get_tags(): f.write(tag) f.write('\n') def _rm_docdir(self): if os.path.exists(self.docdir) and os.path.isdir(self.docdir): shutil.rmtree(self.docdir) def sync(self): """Sync document to database.""" # FIXME: add value for modification time # FIXME: catch db not writable errors try: self._make_docdir() self._write_files() self._write_bibfile() self._write_tagfile() self.db.replace_document(self.docid, self.doc) except: self._rm_docdir() raise def purge(self): """Purge document from database and root.""" # FIXME: catch db not writable errors try: self.db.delete_document(self.docid) except xapian.DocNotFoundError: pass self._rm_docdir() self.docid = None ######################################## # internal stuff # add an individual prefix'd term for the document def _add_term(self, 
prefix, value): term = '%s%s' % (prefix, value) self.doc.add_term(term) # remove an individual prefix'd term for the document def _remove_term(self, prefix, value): term = '%s%s' % (prefix, value) try: self.doc.remove_term(term) except xapian.InvalidArgumentError: pass # Parse 'text' and add a term to 'message' for each parsed # word. Each term will be added both prefixed (if prefix_name is # not NULL) and also non-prefixed). # http://xapian.org/docs/bindings/python/ # http://xapian.org/docs/quickstart.html # http://www.flax.co.uk/blog/2009/04/02/xapian-search-architecture/ def _gen_terms(self, prefix, text): term_gen = self.db.term_gen term_gen.set_document(self.doc) if prefix: term_gen.index_text(text, 1, prefix) term_gen.index_text(text) # return a list of terms for prefix # FIXME: is this the fastest way to do this? def _get_terms(self, prefix): list = [] for term in self.doc: if term.term.find(prefix.encode("utf-8")) == 0: index = len(prefix) list.append(term.term[index:]) return list # set the data object for the document def _set_data(self, text): self.doc.set_data(text) def get_data(self): """Get data object for document.""" return self.doc.get_data() ######################################## # files # index file for the document def _index_file(self, path): text = parse_file(path) self._gen_terms(None, text) summary = text[0:997].translate(None, '\n') + '...' 
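`_index_file` above stores a short, newline-free sample of the parsed text as the document data; the Python 2 idiom `text[0:997].translate(None, '\n')` deletes newline characters, which this version-neutral sketch mirrors:

```python
def make_summary(text, limit=997):
    """First `limit` characters of the parsed text, newlines stripped,
    with a trailing ellipsis marker (as in _index_file's data sample)."""
    return text[:limit].replace('\n', '') + '...'

assert make_summary('one\ntwo\nthree') == 'onetwothree...'
assert len(make_summary('x' * 2000)) == 1000  # 997 chars + '...'
```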
return summary def _add_path(self, path): base, full = self.db._basename_for_path(path) prefix = self.db._find_prefix('file') self._add_term(prefix, base) def _get_paths(self): return self._get_terms(self.db._find_prefix('file')) def get_fullpaths(self): """Return fullpaths associated with document.""" list = [] for path in self._get_paths(): # FIXME: this is a hack for old bad path specifications and should be removed if path.find(self.root) == 0: index = len(self.root) + 1 path = path[index:] path = path.lstrip('/') # FIXME base, full = self.db._basename_for_path(path) list.append(full) return list def add_file(self, infile): """Add a file to document. File will not copied in to docdir until sync().""" # FIXME: should load entire file into {name: file} to be # written as file>docdir/name # FIXME: set mime type term summary = self._index_file(infile) # set data to be text sample # FIXME: is this the right thing to put in the data? self._set_data(summary) # FIXME: should files be renamed to something generic (0.pdf)? 
outfile = os.path.join(self.docdir, os.path.basename(infile)) base, full = self.db._basename_for_path(outfile) self._add_path(base) # add it to the cache to be written at sync() if '_infiles' not in dir(self): self._infiles = {} self._infiles[infile] = outfile ######################################## # SOURCES def _purge_sources_prefix(self, source): # purge all terms for a given source prefix prefix = self.db._make_source_prefix(source) for i in self._get_terms(prefix): self._remove_term(prefix, i) self._remove_term(self.db._find_prefix('source'), source) def add_sid(self, sid): """Add source sid to document.""" source, oid = sid.split(':', 1) source = source.lower() # remove any existing terms for this source self._purge_sources_prefix(source) # add a term for the source self._add_term(self.db._find_prefix('source'), source) # add a term for the sid, with source as prefix self._add_term(self.db._make_source_prefix(source), oid) def get_sids(self): """Return a list of sids for document.""" sids = [] for source in self._get_terms(self.db._find_prefix('source')): for oid in self._get_terms(self.db._make_source_prefix(source)): sids.append('%s:%s' % (source, oid)) return sids # BIBTEX KEYS def get_keys(self): """Return a list of bibtex citation keys associated with document.""" prefix = self.db._find_prefix('key') return self._get_terms(prefix) # TAGS def add_tags(self, tags): """Add tags from list to document.""" prefix = self.db._find_prefix('tag') for tag in tags: self._add_term(prefix, tag) def get_tags(self): """Return a list of tags associated with document.""" prefix = self.db._find_prefix('tag') return self._get_terms(prefix) def remove_tags(self, tags): """Remove tags from a document.""" prefix = self.db._find_prefix('tag') for tag in tags: self._remove_term(prefix, tag) # TITLE def _set_title(self, title): pt = self.db._find_prefix('title') for term in self._get_terms(pt): self._remove_term(pt, term) # FIXME: what's the clean way to get these prefixes? 
for term in self._get_terms('ZS'): self._remove_term('ZS', term) self._gen_terms(pt, title) # AUTHOR def _set_authors(self, authors): pa = self.db._find_prefix('author') for term in self._get_terms(pa): self._remove_term(pa, term) # FIXME: what's the clean way to get these prefixes? for term in self._get_terms('ZA'): self._remove_term('ZA', term) self._gen_terms(pa, authors) # YEAR def _set_year(self, year): # FIXME: this should be a value pass ######################################## # bibtex def get_bibpath(self): """Return path to document bibtex file.""" return os.path.join(self.docdir, 'bibtex') def _set_bibkey(self, key): prefix = self.db._find_prefix('key') for term in self._get_terms(prefix): self._remove_term(prefix, term) self._add_term(prefix, key) def _index_bibentry(self, bibentry): authors = bibentry.get_authors() fields = bibentry.get_fields() if 'title' in fields: self._set_title(fields['title']) if 'year' in fields: self._set_year(fields['year']) if authors: # authors should be a list, so we make a single text string # FIXME: better way to do this? 
self._set_authors(' '.join(authors)) # add any sources in the bibtex for sid in scan_bibentry_for_sources(bibentry): self.add_sid(sid) # FIXME: index 'keywords' field as regular terms self._set_bibkey(bibentry.key) def add_bibentry(self, bibentry): """Add bibentry object.""" self.bibentry = bibentry self._index_bibentry(self.bibentry) def add_bibtex(self, bibtex): """Add bibtex to document, as string or file path.""" self.add_bibentry(Bibtex(bibtex)[0]) def _load_bib(self): if self.bibentry: return bibpath = self.get_bibpath() if os.path.exists(bibpath): self.bibentry = Bibtex(bibpath)[0] def get_bibtex(self): """Get the bib for document as a bibtex string.""" self._load_bib() if self.bibentry: return self.bibentry.as_string() else: return None def get_bibdata(self): self._load_bib() if self.bibentry: data = self.bibentry.get_fields() data['authors'] = self.bibentry.get_authors() return data else: return None def update_from_bibtex(self): """Update document metadata from document bibtex.""" self._load_bib() self._index_bibentry(self.bibentry) ######################################## def get_title(self): """Get the title from document bibtex.""" self._load_bib() if not self.bibentry: return None fields = self.bibentry.get_fields() if 'title' in fields: return fields['title'] return None def get_urls(self): """Get all URLs associated with document.""" urls = [] # get urls associated with known sources for sid in self.get_sids(): smod = get_source(sid) urls.append(smod.gen_url()) # get urls from bibtex self._load_bib() if self.bibentry: fields = self.bibentry.get_fields() if 'url' in fields: urls.append(fields['url']) if 'adsurl' in fields: urls.append(fields['adsurl']) return urls xapers-0.5.2/lib/xapers/nci/000077500000000000000000000000001214264575400156535ustar00rootroot00000000000000xapers-0.5.2/lib/xapers/nci/__init__.py000066400000000000000000000013111214264575400177600ustar00rootroot00000000000000""" This file is part of xapers. 
Xapers is free software: you can redistribute it and/or modify it
under the terms of the GNU General Public License as published by the
Free Software Foundation, either version 3 of the License, or (at your
option) any later version.

Xapers is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License
for more details.

You should have received a copy of the GNU General Public License
along with this program.  If not, see <http://www.gnu.org/licenses/>.

Copyright 2012 Jameson Rollins
"""

from ui import UI

xapers-0.5.2/lib/xapers/nci/bibview.py

import urwid

############################################################

class Bibview(urwid.WidgetWrap):
    def __init__(self, ui, query):
        self.ui = ui
        self.ui.set_header("Bibtex: " + query)

        docs = self.ui.db.search(query, limit=20)
        if len(docs) == 0:
            self.ui.set_status('No documents found.')

        string = ''
        for doc in docs:
            string = string + doc.get_bibtex() + '\n'

        self.box = urwid.Filler(urwid.Text(string))
        w = self.box
        self.__super.__init__(w)

    def keypress(self, size, key):
        self.ui.keypress(key)

xapers-0.5.2/lib/xapers/nci/defaults/bindings

[global]
?: help
s: promptSearch
S: promptSearch
A: promptAdd
q: killBuffer
Q: quit

[search]
n: nextEntry
p: prevEntry
enter: viewFile
u: viewURL
b: viewBibtex
+: addTags
-: removeTags
u: update
a: archive
meta i: copyID
meta f: copyPath
meta u: copyURL
meta b: copyBibtex
f: filterSearch

xapers-0.5.2/lib/xapers/nci/help.py

import urwid

############################################################

class Help(urwid.WidgetWrap):
    def
__init__(self, ui, target=None): self.ui = ui self.target = target if self.target: tname = self.target.__class__.__name__ self.ui.set_header("Help: " + tname) else: self.ui.set_header("Help") pile = [] if self.target and hasattr(self.target, 'keys'): pile.append(urwid.Text('%s commands:' % (tname))) pile.append(urwid.Text('')) for key, cmd in sorted(self.target.keys.iteritems()): pile.append(self.row('target', cmd, key)) pile.append(urwid.Text('')) pile.append(urwid.Text('')) pile.append(urwid.Text('Global commands:')) pile.append(urwid.Text('')) for key, cmd in sorted(self.ui.keys.iteritems()): pile.append(self.row('ui', cmd, key)) w = urwid.Filler(urwid.Pile(pile)) self.__super.__init__(w) def row(self, c, cmd, key): hstring = eval('str(self.%s.%s.__doc__)' % (c, cmd)) return urwid.Columns( [ ('fixed', 8, urwid.Text(key)), urwid.Text(hstring), ] ) def keypress(self, size, key): self.ui.keypress(key) xapers-0.5.2/lib/xapers/nci/search.py000066400000000000000000000226231214264575400174770ustar00rootroot00000000000000import os import subprocess import urwid from xapers.database import Database, DatabaseLockError ############################################################ def xclip(text, isfile=False): """Copy text or file contents into X clipboard.""" f = None if isfile: f = open(text, 'r') sin = f else: sin = subprocess.PIPE p = subprocess.Popen(' '.join(["xclip", "-i"]), shell=True, stdin=sin) p.communicate(text) if f: f.close() ############################################################ class DocListItem(urwid.WidgetWrap): def __init__(self, doc): self.doc = doc self.matchp = doc.matchp self.docid = self.doc.docid # fill the default attributes for the fields self.fields = {} for field in ['sources', 'tags', 'title', 'authors', 'year', 'summary']: self.fields[field] = urwid.Text('') self.fields['sources'].set_text(' '.join(self.doc.get_sids())) self.fields['tags'].set_text(' '.join(self.doc.get_tags())) data = self.doc.get_bibdata() if data: if 'title' in data: 
self.fields['title'].set_text(data['title']) if 'authors' in data: astring = ' and '.join(data['authors'][:10]) if len(data['authors']) > 10: astring = astring + ' et al.' self.fields['authors'].set_text(astring) if 'year' in data: self.fields['year'].set_text(data['year']) self.fields['summary'].set_text(self.doc.get_data()) self.c1width = 10 self.rowHeader = urwid.AttrMap( urwid.Text('id:%s (%s)' % (self.docid, self.matchp)), 'head', 'head_focus') # FIXME: how do we hightlight everything in pile during focus? w = urwid.Pile( [ urwid.Divider('-'), self.rowHeader, self.docfield('sources'), self.docfield('tags'), self.docfield('title'), self.docfield('authors'), self.docfield('year'), self.docfield('summary'), ] , focus_item=1) self.__super.__init__(w) def docfield(self, field): attr_map = field return urwid.Columns( [ ('fixed', self.c1width, urwid.AttrMap( urwid.Text(field + ':'), 'field', 'field_focus')), urwid.AttrMap( self.fields[field], attr_map) ] ) def selectable(self): return True def keypress(self, size, key): return key ############################################################ class Search(urwid.WidgetWrap): palette = [ ('head', 'dark blue, bold', ''), ('head_focus', 'white, bold', 'dark blue'), ('field', 'light gray', ''), ('field_focus', '', 'light gray'), ('sources', 'light magenta, bold', ''), ('tags', 'dark green, bold', ''), ('title', 'yellow', ''), ('authors', 'dark cyan, bold', ''), ('year', 'dark red', '',), ] keys = { 'n': "nextEntry", 'p': "prevEntry", 'down': "nextEntry", 'up': "prevEntry", 'enter': "viewFile", 'u': "viewURL", 'b': "viewBibtex", '+': "addTags", '-': "removeTags", 'a': "archive", 'meta i': "copyID", 'meta f': "copyPath", 'meta u': "copyURL", 'meta b': "copyBibtex", } def __init__(self, ui, query=None): self.ui = ui self.ui.set_header("Search: " + query) docs = self.ui.db.search(query, limit=20) if len(docs) == 0: self.ui.set_status('No documents found.') items = [] for doc in docs: items.append(DocListItem(doc)) self.lenitems 
= len(items)
        self.listwalker = urwid.SimpleListWalker(items)
        self.listbox = urwid.ListBox(self.listwalker)
        w = self.listbox
        self.__super.__init__(w)

    ##########

    def nextEntry(self):
        """next entry"""
        entry, pos = self.listbox.get_focus()
        if not entry:
            return
        if pos + 1 >= self.lenitems:
            return
        self.listbox.set_focus(pos + 1)

    def prevEntry(self):
        """previous entry"""
        entry, pos = self.listbox.get_focus()
        if not entry:
            return
        if pos == 0:
            return
        self.listbox.set_focus(pos - 1)

    def viewFile(self):
        """open document file"""
        entry = self.listbox.get_focus()[0]
        if not entry:
            return
        path = entry.doc.get_fullpaths()
        if not path:
            self.ui.set_status('No file for document id:%s.' % entry.docid)
            return
        path = path[0].replace(' ', '\ ')
        if not os.path.exists(path):
            self.ui.set_status('ERROR: id:%s: file not found.' % entry.docid)
            return
        self.ui.set_status('opening file: %s...' % path)
        subprocess.call(' '.join(['nohup', 'xdg-open', path, '&']),
                        shell=True,
                        stdout=open('/dev/null', 'w'),
                        stderr=open('/dev/null', 'w'))

    def viewURL(self):
        """open document URL in browser"""
        entry = self.listbox.get_focus()[0]
        if not entry:
            return
        urls = entry.doc.get_urls()
        if not urls:
            self.ui.set_status('ERROR: id:%s: no URLs found.' % entry.docid)
            return
        # FIXME: open all instead of just first?
        url = urls[0]
        self.ui.set_status('opening url: %s...' % url)
        subprocess.call(' '.join(['nohup', 'xdg-open', url, '&']),
                        shell=True,
                        stdout=open('/dev/null', 'w'),
                        stderr=open('/dev/null', 'w'))

    def viewBibtex(self):
        """view document bibtex"""
        entry = self.listbox.get_focus()[0]
        if not entry:
            return
        self.ui.newbuffer(['bibview', 'id:' + entry.docid])

    def copyID(self):
        """copy document ID to clipboard"""
        entry = self.listbox.get_focus()[0]
        if not entry:
            return
        docid = "id:%s" % entry.docid
        xclip(docid)
        self.ui.set_status('docid yanked: %s' % docid)

    def copyPath(self):
        """copy document file path to clipboard"""
        entry = self.listbox.get_focus()[0]
        if not entry:
            return
        path = entry.doc.get_fullpaths()[0]
        if not path:
            self.ui.set_status('ERROR: id:%s: file path not found.' % entry.docid)
            return
        xclip(path)
        self.ui.set_status('path yanked: %s' % path)

    def copyURL(self):
        """copy document URL to clipboard"""
        entry = self.listbox.get_focus()[0]
        if not entry:
            return
        urls = entry.doc.get_urls()
        if not urls:
            self.ui.set_status('ERROR: id:%s: URL not found.' % entry.docid)
            return
        # FIXME: copy all instead of just first?
        url = urls[0]
        xclip(url)
        self.ui.set_status('url yanked: %s' % url)

    def copyBibtex(self):
        """copy document bibtex to clipboard"""
        entry = self.listbox.get_focus()[0]
        if not entry:
            return
        bibtex = entry.doc.get_bibpath()
        if not bibtex:
            self.ui.set_status('ERROR: id:%s: bibtex not found.' % entry.docid)
            return
        xclip(bibtex, isfile=True)
        self.ui.set_status('bibtex yanked: %s' % bibtex)

    def addTags(self):
        """add tags to document (space separated)"""
        self.promptTag('+')

    def removeTags(self):
        """remove tags from document (space separated)"""
        self.promptTag('-')

    def promptTag(self, sign):
        entry = self.listbox.get_focus()[0]
        if not entry:
            return
        if sign == '+':
            # FIXME: autocomplete to existing tags
            prompt = 'add tags: '
        elif sign == '-':
            # FIXME: autocomplete to doc tags only
            prompt = 'remove tags: '
        urwid.connect_signal(self.ui.prompt(prompt), 'done',
                             self._promptTag_done, sign)

    def _promptTag_done(self, tag_string, sign):
        self.ui.view.set_focus('body')
        urwid.disconnect_signal(self, self.ui.prompt, 'done',
                                self._promptTag_done)
        if not tag_string:
            self.ui.set_status('No tags set.')
            return
        entry = self.listbox.get_focus()[0]
        try:
            with Database(self.ui.xroot, writable=True) as db:
                doc = db[entry.docid]
                tags = tag_string.split()
                if sign == '+':
                    doc.add_tags(tags)
                    msg = "Added tags: %s" % (tag_string)
                elif sign == '-':
                    doc.remove_tags(tags)
                    msg = "Removed tags: %s" % (tag_string)
                doc.sync()
                tags = doc.get_tags()
                entry.fields['tags'].set_text(' '.join(tags))
        except DatabaseLockError as e:
            msg = e.msg
        self.ui.set_status(msg)

    def archive(self):
        """archive document (remove 'new' tag)"""
        self._promptTag_done('new', '-')

    def keypress(self, size, key):
        if key in self.keys:
            cmd = "self.%s()" % (self.keys[key])
            eval(cmd)
        else:
            self.ui.keypress(key)
xapers-0.5.2/lib/xapers/nci/ui.py000066400000000000000000000114621214264575400166460ustar00rootroot00000000000000"""
This file is part of xapers.

Xapers is free software: you can redistribute it and/or modify it
under the terms of the GNU General Public License as published by the
Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
Xapers is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License
for more details.

You should have received a copy of the GNU General Public License
along with notmuch.  If not, see <http://www.gnu.org/licenses/>.

Copyright 2012, 2013
Jameson Rollins
"""

import os
import sys
import urwid
import subprocess

# FIXME: what the hell?  Why do we have to do this?
# it's because of
# sys.stdout = codecs.getwriter('utf8')(sys.stdout)
# in cli.ui, but i don't understand why
from xapers.cli.ui import initdb, SYS_STDOUT
sys.stdout = SYS_STDOUT

from search import Search
from bibview import Bibview
from help import Help

############################################################

class UI():

    palette = [
        ('header', 'white', 'dark blue'),
        ('footer', 'white', 'dark blue'),
        ('prompt', 'black', 'light green'),
    ]

    keys = {
        '?': "help",
        's': "promptSearch",
        'q': "killBuffer",
        'Q': "quit",
    }

    def __init__(self, xroot, db=None, cmd=None):
        self.xroot = xroot
        if db:
            # reuse db if provided
            self.db = db
        else:
            self.db = initdb(self.xroot)

        self.header_string = "Xapers"
        self.status_string = "q: quit buffer, Q: quit Xapers, ?: help"

        self.view = urwid.Frame(urwid.SolidFill())
        self.set_header()
        self.set_status()

        if not cmd:
            cmd = ['search', '*']

        if cmd[0] == 'search':
            query = ' '.join(cmd[1:])
            self.buffer = Search(self, query)
        elif cmd[0] == 'bibview':
            query = ' '.join(cmd[1:])
            self.buffer = Bibview(self, query)
        elif cmd[0] == 'help':
            target = None
            if len(cmd) > 1:
                target = cmd[1]
            if isinstance(target, str):
                target = None
            self.buffer = Help(self, target)
        else:
            self.buffer = Help(self)
            self.set_status("Unknown command '%s'." % (cmd[0]))

        self.merge_palette(self.buffer)
        self.view.body = urwid.AttrMap(self.buffer, 'body')

        self.mainloop = urwid.MainLoop(
            self.view,
            self.palette,
            unhandled_input=self.keypress,
            handle_mouse=False,
        )
        self.mainloop.run()

    ##########

    def merge_palette(self, buffer):
        if hasattr(buffer, 'palette'):
            self.palette = list(set(self.palette) | set(buffer.palette))

    def set_header(self, text=None):
        if text:
            self.header_string = 'Xapers %s' % (text)
        self.view.set_header(urwid.AttrMap(urwid.Text(self.header_string), 'header'))

    def set_status(self, text=None):
        if text:
            self.status_string = '%s' % (text)
        self.view.set_footer(urwid.AttrMap(urwid.Text(self.status_string), 'footer'))

    def newbuffer(self, cmd):
        UI(self.xroot, db=self.db, cmd=cmd)
        self.set_status()

    def prompt(self, string):
        prompt = PromptEdit(string)
        self.view.set_footer(urwid.AttrMap(prompt, 'prompt'))
        self.view.set_focus('footer')
        return prompt

    ##########

    def promptSearch(self):
        """search database"""
        prompt = 'search: '
        urwid.connect_signal(self.prompt(prompt), 'done',
                             self._promptSearch_done)

    def _promptSearch_done(self, query):
        self.view.set_focus('body')
        urwid.disconnect_signal(self, self.prompt, 'done',
                                self._promptSearch_done)
        if not query:
            self.set_status()
            return
        self.newbuffer(['search', query])

    def killBuffer(self):
        """kill current buffer"""
        raise urwid.ExitMainLoop()

    def quit(self):
        """quit Xapers"""
        sys.exit()

    def help(self):
        """help"""
        self.newbuffer(['help', self.buffer])

    def keypress(self, key):
        if key in self.keys:
            cmd = "self.%s()" % (self.keys[key])
            eval(cmd)

############################################################

class PromptEdit(urwid.Edit):
    __metaclass__ = urwid.signals.MetaSignals
    signals = ['done']

    def keypress(self, size, key):
        if key == 'enter':
            urwid.emit_signal(self, 'done', self.get_edit_text())
            return
        elif key == 'esc':
            urwid.emit_signal(self, 'done', None)
            return
        urwid.Edit.keypress(self, size, key)
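The key-to-method dispatch used by `UI.keypress` (and `Search.keypress`) above builds a `"self.%s()"` string and `eval`s it. The same table-driven dispatch can be written with `getattr`, avoiding `eval` entirely. A minimal standalone sketch in plain Python, not the actual urwid-based class — `KeyDispatcher` and its handlers are hypothetical stand-ins:

```python
class KeyDispatcher:
    """Map single keys to handler method names, like UI.keys above."""

    keys = {
        'q': 'quit',
        's': 'search',
    }

    def __init__(self):
        self.log = []

    def quit(self):
        self.log.append('quit')

    def search(self):
        self.log.append('search')

    def keypress(self, key):
        # getattr() replaces the '"self.%s()" % name' + eval() idiom
        name = self.keys.get(key)
        if name is None:
            return key  # unhandled key, pass it through
        getattr(self, name)()

d = KeyDispatcher()
d.keypress('s')
d.keypress('q')
print(d.log)  # ['search', 'quit']
```

Besides dodging `eval`, this makes a typo in the key table fail loudly (`AttributeError`) rather than executing arbitrary generated source.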
xapers-0.5.2/lib/xapers/parser.py000066400000000000000000000022021214264575400167440ustar00rootroot00000000000000import os

##################################################

class ParseError(Exception):
    """Base class for Xapers parser exceptions."""
    def __init__(self, msg):
        self.msg = msg

    def __str__(self):
        return self.msg

##################################################

class ParserBase():
    """Base class for Xapers document parsing."""
    def __init__(self, path):
        self.path = os.path.expanduser(path)

    def extract(self):
        pass

##################################################

def parse_file(path):
    # FIXME: determine mime type
    mimetype = 'pdf'

    try:
        mod = __import__('xapers.parsers.' + mimetype, fromlist=['Parser'])
        pmod = getattr(mod, 'Parser')
    except ImportError:
        raise ParseError("Unknown parser '%s'." % mimetype)

    if not os.path.exists(path):
        raise ParseError("File '%s' not found." % path)
    if not os.path.isfile(path):
        raise ParseError("File '%s' is not a regular file." % path)

    try:
        text = pmod(path).extract()
    except Exception, e:
        raise ParseError("Could not parse file: %s" % e)

    return text
xapers-0.5.2/lib/xapers/parsers/000077500000000000000000000000001214264575400165615ustar00rootroot00000000000000xapers-0.5.2/lib/xapers/parsers/__init__.py000066400000000000000000000000001214264575400206600ustar00rootroot00000000000000xapers-0.5.2/lib/xapers/parsers/pdf.py000066400000000000000000000005071214264575400177060ustar00rootroot00000000000000from xapers.parser import ParserBase
from pipes import quote
from subprocess import check_output

class Parser(ParserBase):
    def extract(self):
        path = quote(self.path)
        cmd = ['pdftotext', path, '-']
        text = check_output(' '.join(cmd), shell=True, stderr=open('/dev/null', 'w'))
        return text
xapers-0.5.2/lib/xapers/source.py000066400000000000000000000063751214264575400167660ustar00rootroot00000000000000import re
from urlparse import urlparse

import xapers.sources
from parser import parse_file
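`parse_file` above picks a parser by building a dotted module name from the mimetype and importing it at runtime with `__import__`. The same plugin-style lookup can be sketched with `importlib` (Python 3). Here the stdlib `json` module stands in for a parser package, and `load_handler` is a hypothetical name, not part of xapers:

```python
import importlib

def load_handler(name):
    """Resolve a handler the way parse_file resolves 'xapers.parsers.<mime>':
    import a module by dotted name, then pull a known attribute out of it.
    The stdlib 'json' module stands in for a real parser package here."""
    try:
        mod = importlib.import_module(name)
    except ImportError:
        # mirrors parse_file raising ParseError for an unknown parser
        raise ValueError("Unknown handler '%s'." % name)
    return getattr(mod, 'loads')

loads = load_handler('json')
print(loads('{"a": 1}'))  # {'a': 1}
```

The point of the pattern, in xapers as here, is that adding a new parser means dropping a module into the package; no central dispatch table has to change.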
##################################################

class SourceError(Exception):
    """Base class for Xapers source exceptions."""
    def __init__(self, msg):
        self.msg = msg

    def __str__(self):
        return self.msg

##################################################

class SourceBase():
    source = None
    netloc = None
    scan_regex = None

    def __init__(self, id=None):
        self.id = id

    def get_sid(self):
        if self.id:
            return '%s:%s' % (self.source, self.id)

    def gen_url(self):
        """Return url string for source ID."""
        if self.netloc and self.id:
            return 'http://%s/%s' % (self.netloc, self.id)

    def match(self, netloc, path):
        """Return True if netloc/path belongs to this source and a sid can be determined."""

    def get_bibtex(self):
        """Download source bibtex, and return as string."""

##################################################

def list_sources():
    """List all available source modules."""
    sources = []
    # FIXME: how do we register sources?
    for s in dir(xapers.sources):
        # skip the __init__ file when finding sources
        if '__' in s:
            continue
        sources.append(s)
    return sources

def _load_source(source):
    try:
        mod = __import__('xapers.sources.' + source, fromlist=['Source'])
        return getattr(mod, 'Source')
    except ImportError:
        raise SourceError("Unknown source '%s'." % source)

def get_source(string):
    """Return Source class object for URL or source identifier string.

    """
    smod = None

    o = urlparse(string)

    # if the scheme is http, look for source match
    if o.scheme in ['http', 'https']:
        for source in list_sources():
            smod = _load_source(source)()
            # if matches, id will be set
            if smod.match(o.netloc, o.path):
                break
            else:
                smod = None
        if not smod:
            raise SourceError('URL matches no known source.')
    elif o.scheme == '':
        source = o.path
        smod = _load_source(source)()
    else:
        source = o.scheme
        oid = o.path
        smod = _load_source(source)(oid)

    return smod

def scan_file_for_sources(file):
    """Scan document file for source identifiers and return list of sid strings."""
    text = parse_file(file)
    sources = []
    for source in list_sources():
        smod = _load_source(source)()
        if 'scan_regex' not in dir(smod):
            continue
        prog = re.compile(smod.scan_regex)
        matches = prog.findall(text)
        if matches:
            for match in matches:
                sources.append('%s:%s' % (smod.source.lower(), match))
    return sources

def scan_bibentry_for_sources(bibentry):
    """Scan bibentry for source identifiers and return list of sid strings."""
    fields = bibentry.get_fields()
    sources = []
    for source in list_sources():
        if source in fields:
            sources.append('%s:%s' % (source.lower(), fields[source]))
    # FIXME: how do we get around special exception for this?
    if 'eprint' in fields:
        sources.append('%s:%s' % ('arxiv', fields['eprint']))
    return sources
xapers-0.5.2/lib/xapers/sources/000077500000000000000000000000001214264575400165655ustar00rootroot00000000000000xapers-0.5.2/lib/xapers/sources/__init__.py000066400000000000000000000000431214264575400206730ustar00rootroot00000000000000import doi
import arxiv
import dcc
xapers-0.5.2/lib/xapers/sources/arxiv.py000066400000000000000000000055231214264575400202750ustar00rootroot00000000000000import urllib
from HTMLParser import HTMLParser

from xapers.bibtex import data2bib

# html parser subclass to override handler methods
class MyHTMLParser(HTMLParser):
    def __init__(self):
        HTMLParser.__init__(self)
        self.lefthead = False
        self.title = None
        self.author = []
        self.year = None
        self.sid = None

    def handle_starttag(self, tag, attrs):
        title = False
        author = False
        date = False
        sid = False
        if self.lefthead:
            return
        if tag != 'meta':
            return
        for attr in attrs:
            if attr[0] == 'name':
                if attr[1] == 'citation_title':
                    title = True
                if attr[1] == 'citation_author':
                    author = True
                if attr[1] == 'citation_date':
                    date = True
                if attr[1] == 'citation_arxiv_id':
                    sid = True
            if attr[0] == 'content':
                if title:
                    self.title = attr[1]
                if author:
                    self.author.append(attr[1])
                if date:
                    self.year = attr[1].split('/')[0]
                if sid:
                    self.sid = attr[1]

    def handle_endtag(self, tag):
        if tag == 'head':
            self.lefthead = True

class Source():
    source = 'arxiv'
    netloc = 'arxiv.org'

    def __init__(self, id=None):
        self.id = id

    def get_sid(self):
        if self.id:
            return '%s:%s' % (self.source, self.id)

    def gen_url(self):
        if self.id:
            return 'http://%s/abs/%s' % (self.netloc, self.id)

    def match(self, netloc, path):
        if netloc.find(self.netloc) < 0:
            return False
        for prefix in ['/abs/', '/pdf/', '/format/']:
            index = path.find(prefix)
            if index == 0:
                break
        index = len(prefix)
        # FIXME: strip anything else?
        self.id = path[index:].strip('/')
        return True

    def get_data(self):
        if 'file' in dir(self):
            url = None
            f = open(self.file, 'r')
        else:
            url = self.gen_url()
            f = urllib.urlopen(url)
        html = f.read()
        f.close()

        # instantiate the parser and feed it some HTML
        try:
            parser = MyHTMLParser()
            parser.feed(html)
        except:
            return None

        data = {
            'arxiv': self.id,
            'title': parser.title,
            'authors': parser.author,
            'year': parser.year,
            'eprint': self.id,
            'url': self.gen_url(),
        }
        return data

    def get_bibtex(self):
        data = self.get_data()
        bibentry = data2bib(data, self.get_sid())
        return bibentry.as_string()
xapers-0.5.2/lib/xapers/sources/dcc.py000066400000000000000000000062551214264575400176770ustar00rootroot00000000000000import sys
import pycurl
import cStringIO
import tempfile

from xapers.bibtex import data2bib

def dccRetrieveXML(docid):
    url = 'https://dcc.ligo.org/Shibboleth.sso/Login?target=https%3A%2F%2Fdcc.ligo.org%2Fcgi-bin%2Fprivate%2FDocDB%2FShowDocument?docid=' + docid + '%26outformat=xml&entityID=https%3A%2F%2Flogin.ligo.org%2Fidp%2Fshibboleth'
    curl = pycurl.Curl()
    cookies = tempfile.NamedTemporaryFile()
    curl.setopt(pycurl.URL, url)
    curl.setopt(pycurl.UNRESTRICTED_AUTH, 1)
    curl.setopt(pycurl.HTTPAUTH, pycurl.HTTPAUTH_GSSNEGOTIATE)
    curl.setopt(pycurl.COOKIEJAR, cookies.name)
    curl.setopt(pycurl.USERPWD, ':')
    curl.setopt(pycurl.FOLLOWLOCATION, 1)
    doc = cStringIO.StringIO()
    curl.setopt(pycurl.WRITEFUNCTION, doc.write)
    try:
        curl.perform()
    except:
        import traceback
        traceback.print_exc(file=sys.stderr)
        sys.stderr.flush()
    xml = doc.getvalue()
    curl.close()
    cookies.close()
    doc.close()
    return xml

def dccXMLExtract(xmlstring):
    from xml.dom.minidom import parse, parseString
    xml = parseString(xmlstring)
    etitle = xml.getElementsByTagName("title")[0].firstChild
    if etitle:
        title = etitle.data
    else:
        title = None
    alist = xml.getElementsByTagName("author")
    authors = []
    for author in alist:
        authors.append(author.getElementsByTagName("fullname")[0].firstChild.data)
    eabstract = xml.getElementsByTagName("abstract")[0].firstChild
    if eabstract:
        abstract = eabstract.data
    else:
        abstract = None
    # FIXME: find year/date
    year = None
    return title, authors, year, abstract

class Source():
    source = 'dcc'
    netloc = 'dcc.ligo.org'

    def __init__(self, id=None):
        self.id = id

    def get_sid(self):
        if self.id:
            return '%s:%s' % (self.source, self.id)

    def gen_url(self):
        if self.id:
            return 'http://%s/%s' % (self.netloc, self.id)

    def match(self, netloc, path):
        if netloc != self.netloc:
            return False
        fullid = path.split('/')[1]
        dccid, vers = fullid.replace('LIGO-', '').split('-')
        self.id = dccid
        if self.id:
            return True
        else:
            return False

    def get_data(self):
        if 'file' in dir(self):
            f = open(self.file, 'r')
            xml = f.read()
            f.close()
        else:
            xml = dccRetrieveXML(self.id)

        try:
            title, authors, year, abstract = dccXMLExtract(xml)
        except:
            print >>sys.stderr, xml
            raise

        data = {
            'institution': 'LIGO Laboratory',
            'number': self.id,
            'dcc': self.id,
            'url': self.gen_url()
        }
        if title:
            data['title'] = title
        if authors:
            data['authors'] = authors
        if abstract:
            data['abstract'] = abstract
        if year:
            data['year'] = year
        return data

    def get_bibtex(self):
        data = self.get_data()
        key = self.get_sid()
        btype = '@techreport'
        bibentry = data2bib(data, key, type=btype)
        return bibentry.as_string()
xapers-0.5.2/lib/xapers/sources/doi.py000066400000000000000000000046361214264575400177170ustar00rootroot00000000000000import io
import urllib2

from xapers.bibtex import json2bib

class Source():
    source = 'doi'
    netloc = 'dx.doi.org'

    #scan_regex = '[doi|DOI][\s\.\:]{0,2}(10\.\d{4}[\d\:\.\-\/a-z]+)[A-Z\s]'
    #scan_regex = '\b(10[.][0-9]{4,}(?:[.][0-9]+)*/(?:(?!["&\'<>])[[:graph:]])+)\b'
    #scan_regex = '(doi|DOI)(10[.][0-9]{4,}(?:[.][0-9]+)*[\/\.](?:(?!["&\'<>])[[:graph:]])+)'
    scan_regex = '(?:doi|DOI)[\s\.\:]{0,2}(10\.\d{4,}[\w\d\:\.\-\/]+)'

    def __init__(self, id=None):
        self.id = id

    def get_sid(self):
        if self.id:
            return '%s:%s' % (self.source, self.id)

    def gen_url(self):
        if self.id:
            return 'http://%s/%s' % (self.netloc, self.id)

    def match(self, netloc, path):
        if netloc.find(self.netloc) >= 0:
            self.id = path.strip('/')
            return True
        else:
            return False

    def _clean_bibtex_key(self, bibtex):
        # FIXME: there must be a better way of doing this
        stream = io.StringIO()
        i = True
        for c in bibtex:
            if c == ',':
                i = False
            if i and c == ' ':
                c = u'_'
            else:
                c = unicode(c)
            stream.write(c)
        bibtex = stream.getvalue()
        stream.close()
        return bibtex

    def _get_bib_doi(self):
        # http://www.crossref.org/CrossTech/2011/11/turning_dois_into_formatted_ci.html
        url = self.gen_url()
        headers = dict(Accept='text/bibliography; style=bibtex')
        req = urllib2.Request(url, headers=headers)
        f = urllib2.urlopen(req)
        bibtex = f.read()
        f.close()
        # FIXME: this is a doi hack
        return self._clean_bibtex_key(bibtex)

    def _get_bib_doi_json(self):
        # http://www.crossref.org/CrossTech/2011/11/turning_dois_into_formatted_ci.html
        url = self.gen_url()
        headers = dict(Accept='application/citeproc+json')
        req = urllib2.Request(url, headers=headers)
        f = urllib2.urlopen(req)
        json = f.read()
        f.close()
        bibentry = json2bib(json, self.get_sid())
        return bibentry.as_string()

    def _get_bib_ads(self):
        req = 'http://adsabs.harvard.edu/cgi-bin/nph-bib_query?bibcode=' + self.id + '&data_type=BIBTEXPLUS'
        f = urllib2.urlopen(req)
        bibtex = f.read()
        f.close()
        return bibtex

    def get_bibtex(self):
        return self._get_bib_doi_json()
xapers-0.5.2/lib/xapers/version.py000066400000000000000000000000261214264575400171370ustar00rootroot00000000000000__version__ = '0.5.2'
xapers-0.5.2/man/000077500000000000000000000000001214264575400136055ustar00rootroot00000000000000xapers-0.5.2/man/man1/000077500000000000000000000000001214264575400144415ustar00rootroot00000000000000xapers-0.5.2/man/man1/xapers-adder.1000066400000000000000000000025321214264575400171040ustar00rootroot00000000000000.\" xapers - journal article indexing system
.\"
.\" Copyright © 2013 Jameson Rollins
.\"
.\" Xapers is free software: you can redistribute it and/or modify
.\" it under the terms of the GNU
General Public License as published by
.\" the Free Software Foundation, either version 3 of the License, or
.\" (at your option) any later version.
.\"
.\" Xapers is distributed in the hope that it will be useful,
.\" but WITHOUT ANY WARRANTY; without even the implied warranty of
.\" MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
.\" GNU General Public License for more details.
.\"
.\" You should have received a copy of the GNU General Public License
.\" along with this program.  If not, see http://www.gnu.org/licenses/ .
.\"
.\" Author: Jameson Rollins
.TH XAPERS 1
.SH NAME
xapers-adder \- "gui" to import individual documents into Xapers database
.SH SYNOPSIS
.B xapers-adder
.IR file.pdf
.SH DESCRIPTION
The specified PDF file is displayed (using \fBxdg-open\fR(1)), then a
terminal is opened (\fBx-terminal-emulator\fR(1)) executing the
following command:
xapers add --file= --tags=new --prompt --view
This program is useful as the PDF handler for your browser.
See \fBxapers\fR(1) for more information.
.SH CONTACT
Feel free to email the author: Jameson Rollins
xapers-0.5.2/man/man1/xapers.1000066400000000000000000000200071214264575400160240ustar00rootroot00000000000000.\" xapers - journal article indexing system
.\"
.\" Copyright © 2013 Jameson Rollins
.\"
.\" Xapers is free software: you can redistribute it and/or modify
.\" it under the terms of the GNU General Public License as published by
.\" the Free Software Foundation, either version 3 of the License, or
.\" (at your option) any later version.
.\"
.\" Xapers is distributed in the hope that it will be useful,
.\" but WITHOUT ANY WARRANTY; without even the implied warranty of
.\" MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
.\" GNU General Public License for more details.
.\"
.\" You should have received a copy of the GNU General Public License
.\" along with this program.  If not, see http://www.gnu.org/licenses/ .
.\"
.\" Author: Jameson Rollins
.TH XAPERS 1
.SH NAME
xapers \- personal journal article indexing system
.SH SYNOPSIS
.B xapers
.IR command " [" args " ...]"
.SH DESCRIPTION
Xapers is a personal document indexing system, geared towards academic
journal articles.  It provides fast search of document text and
bibliographic data (synced from online libraries) and simple document
and bibtex retrieval.
Xapers takes as input document files (as PDF) and source identifiers.
Documents are copied into a local document store (~/.xapers/docs by
default) and text is extracted from the PDF and fully indexed into a
Xapian database.  Source identifiers are used to download document
bibliographic data from online digital libraries (see \fBSOURCES\fR
below), which are then parsed and indexed to prefixed terms in the
database.  The bibliographic data is also stored as bibtex in the
document store for easy retrieval.  Documents can be arbitrarily
tagged.
A curses UI is provided for simple access to documents (see the
\fBview\fR command below).  Xapers is ultimately a document indexing
library, though, so development of alternate user interfaces is
encouraged.
Underlying Xapers is the wonderful Xapian database/search engine.  See
http://xapian.org/ for more information.
.SH MAIN COMMANDS
The following are the main xapers commands.  See \fBSEARCH TERMS\fR
below for details of the supported syntax for .
.SS add [options] []
Add a document, or update an existing document.  Must specify at least
one of --file or --source.  If search terms are provided they must
match exactly one document, and the matching document is updated with
the newly provided information.
Available options:
.RS 4
.TP 4
.BR \-\-source=
Source identifier for document.  See \fBSOURCES\fR below.  This may
also be a path to a file that contains a single bibtex entry.
.RE
.RS 4
.TP 4
.BR \-\-file=
Document file (as PDF) to add.  Text of document will be extracted
from PDF and indexed.  A copy of the file will be placed in the Xapers
document store.
.RE
.RS 4
.TP 4
.BR \-\-tags=[,...]
Initial tags to apply to document.  Multiple tags can be specified,
comma separated.
.RE
.RS 4
.TP 4
.BR \-\-prompt
Prompt user for source/file/tags, if not specified.  When prompting
for source information, input files are automatically scanned for
source IDs and found ids are displayed.
.RE
.RS 4
.TP 4
.BR \-\-view
View resulting entry in curses UI when done.  See the \fBview\fR
command below for more info.
.RE
.SS import [options]
Import an existing bibtex database.  Each bibtex entry will be added
as a new document.  If the bibtex key, or any sources found in the
bibtex, match an existing document, that document is instead updated
(this makes the command effectively idempotent).  Any "file" fields
will be parsed for document files to add.  Files can be specified as a
single path, or in Mendeley/Jabref format.
Available options:
.RS 4
.TP 4
.BR \-\-tags=[,...]
Tags to apply to all imported documents.  Multiple tags can be
specified, comma separated.
.RE
.SS tag +|- [...] [--]
Add/remove tags from documents.  '--' can be used to separate tagging
operations from search terms.
.SS search [options]
Search for documents in the database.  Document information is printed
to stdout.
.RS 4
.TP 4
.BR \-\-output=[summary|bibtex|tags|sources|keys|files]
Specify document information to be output:
.B summary
outputs a single-line summary of the documents (default).
.B bibtex
outputs bibtex for all documents (if available).
.B tags
outputs all tags associated with documents.
.B sources
outputs all sources associated with documents.
.B keys
outputs all bibtex citation keys associated with documents.
.B files
outputs the full paths to all files associated with documents.
Default is
.B summary.
.RE
.RS 4
.TP 4
.BR \-\-limit=N
Limit number of results returned.  Default is 20.  Use 0 to return all
results.
.RE
.SS bibtex
Short for "search --output=bibtex ".
.SS count
Return a simple count of search results.
.SS view []
.SS show []
View search results in curses search UI.  Documents matching the
search are displayed with their bibliographic information and a short
text summary.  The UI allows manipulating document tags, and
retrieving document files and source URLs for viewing (see
.B xdg-open(1)
for more info).  Initial search terms can be provided, but further
searches can be performed from within the UI.  While in the UI type
"?" for available commands.
NOTE: At the moment only the top 20 search results are displayed, due
to synchronous loading restrictions.  This obviously needs to be
fixed.
.SS export
Copy PDF files of resulting documents into , named with document
titles when available.
.SS delete
Delete documents from the database.  All document files will be purged
from the document store.
.RS 4
.TP 4
.BR \-\-noprompt
Do not prompt to confirm deletion of documents.
.RE
.SH SOURCE COMMANDS
These commands provide access to some of the source module methods.
See \fBSOURCES\fR below.
.SS sources
List available sources.
.SS source2bib
Retrieve bibtex from source for a specified URL or source id, and
print to stdout.
.SS scandoc
Scan a document file (PDF) for source IDs.
.SH SOURCES
Sources are online databases from which document bibliographic data
can be retrieved.  In Xapers, online libraries are assigned unique
prefixes.  The online libraries associate unique document identifiers
to individual documents.  Xapers then recognizes document source
information with \fBsid\fR of the form ":".
Xapers currently recognizes the following online sources:
doi: Digital Object Identifier (DOI) (http://www.doi.org/)
arxiv: arXiv (http://arxiv.org/)
When adding documents into Xapers, sources may be specified as either
full URLs (e.g. "http://dx.doi.org/10.1364/JOSAA.29.002092") or sid
strings (e.g. "doi:10.1364/JOSAA.29.002092").
URLs are parsed into sources and source ids when recognized, and this
information is used to retrieve bibtex from the online library
databases.  The sources and sids for a given document are stored as
prefixed terms in the Xapers database (see below).
.SH SEARCH TERMS
Search terms consist of free-form text (and quoted phrases) which will
match all documents that contain all of the given terms/phrases.
As a special case, a search string consisting of a single asterisk
('*') will match all documents in the database.
In addition to free text, the following prefixes can be used to match
text against specific document metadata:
id: Xapers document ID
author: string in authors (also a:)
title: string in title (also t:)
tag: specific user tag
: specific sid string
source: specific source
key: specific bibtex citation key
.SH ENVIRONMENT
The following environment variables can be used to control the
behavior of xapers:
.SS XAPERS_ROOT
Location of the Xapers document store.  Defaults to "~/.xapers/docs"
if not specified.
.SH CONTACT
Feel free to email the author: Jameson Rollins
xapers-0.5.2/screenshot.png [binary PNG image data omitted: screenshot of the xapers curses UI]
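The `scan_regex` defined in `xapers/sources/doi.py` above is what the `scandoc` command uses to pull DOI sids out of extracted document text. A standalone check of that regex, copied from the source but written as a raw string so it also runs cleanly under Python 3:

```python
import re

# Regex copied from xapers.sources.doi.Source.scan_regex (raw string form)
SCAN_REGEX = r'(?:doi|DOI)[\s\.\:]{0,2}(10\.\d{4,}[\w\d\:\.\-\/]+)'

# A "doi:"-prefixed identifier, as produced by pdftotext output
text = "See doi:10.1364/JOSAA.29.002092 for details."
matches = re.findall(SCAN_REGEX, text)
print(matches)  # ['10.1364/JOSAA.29.002092']
```

Note the captured group is the bare DOI; `scan_file_for_sources` then prepends the source prefix to build the sid string, e.g. `doi:10.1364/JOSAA.29.002092`.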
$UnBmyIH~Oj]˅~t!#&4IHHiNi7TkQӞMf23* /3F$@$@@׬R'۟;ViA|[E,'  nvd+Z=>䄲$@$@$տo>he;}WY[f mWv;hxijp=D÷M wS}he$@$@$Y+CFHHHX}ٻIHHHHHHHHH_o=$mٷUԙDlKK[ Oo]9%ڪzګ,Z5ûƖ%ݦg.٫3'wu?@WwGt,jtԶu\JZv_ V 7Ja{ط,ju2˱0 V&j;ԜVwMrV&Lj^Cw\t^&/ e G h<|e޳.=]}wXzΧ*ozm̟NU`6+;n'!0_ԂE,9UB-p:Ȳ*"T$l++T8`$07}*Uפ6xN{Xm<{AĚbe$v=t頻46{t^iՄYN:dvt҉Ǫec}h$њWJYUfb9 ō>9=VRE|~s Lz-JBV~IG9IZ f^>CK=Զ~8VtG{c!y}-b$lh N^`N4sNI*Ht2-wU/e#X2V # S=M{" gTU.oWwSU0^_=3#UV/$ bnӠ \l'fn⬑ OLr LdצU Q)&&5J7BV墡U\''~&رoM:iS`M[tߒJFǑ+z[/{pL8/|h/9:[MaZxVz>r5 HSXv`?)RNͧ6:R4՛[o?xh&Vw]7LК^,]gBcs6+&Sܕ3ksk2w٤gU~Bx9ͻtNw8ueۏYUs-D,SOF((cF]9"125LzT]ɑZ}ѩ69ZL-EWU,Wu#,g5;6)i_󫪞j*hy_Z5o{ jhay*K+cI$ 1#i0Lt&9lTuIE_to^ʞQ[@u^ :+D ;zf 13t 'T++[U6~iwGE+D}iĂ#iHMC32_CTނ`Xy꺧UHّN S<ĈѥTgTN'VU;XyN:iIĆzt':\N | s&$rVUup_V]oP /jEBNC,Y#W[ӯ.ǫKz6D_.ٗ("雊[]Bc $[f*SAe~ޤ6R&gURgUؽ 6-iF;jeo!n}˪&x%?z6YL`JLFx%1ܯYDB|޽Eg~Ot_2wwҙD$9r넳kl%5,T=+%ʰٰ kKm\&UôNNsE~)>LPnYcDDRT%)1K]P#ޡKT_+i>12|b!FR !xV*ۻ^xFSi0Qٔm'4UIHZcCϩ7w+fҦeHBZcB iZZ^ ;-iUз1ߋ3@>VqP򌠂10F40i3HZUѸx$cwhiwB!u|lk_tp@lȪZh N&>Bͳ ]D+BW^ٷG9Nzzût> ݂߰(Ud       #ꞐOLC$z8o;wǸQ-/ߝ}w|i:a.{i;(}F0EIs 0Be$(C$Hadj.reB/kxK bdOJ$LC+ |ۏYSI"w I/!Y̡JQW@2ܤ9o|6.lUwf>,JEϚ-2g[Uo?-ߍ NR:*.`v!b `&-eJpY L='/6sy;3ԋIN?|Y*!{=ؐ#_;;WY_Kg^q~N9Ke{-3%:{j7wi+{vG?ofe_.ٖ/#O*{Uǫߵ\qI#-#۫VW֪U6 `OL{sݍ bCYktPsP-a/,g7G?gb3:@P& '9;MĞCe!( F^M^C;WVifY˪~lp4̱rk 6K N7ut =XWߺ8 SWj>@v=L1g cpը.gL3PU~GZmzi%-R'aƥN*b^\r3^{īp>3/pu]7 zb5!*O"C價;œIk1)LԀYEX'L{Kg*- idvә$W9݃P9MTOv51k";IYгB4<9P g+? 
+f":~҉2V͝_ϼRu[ a/R?):bk3]?f4L(NUr[\[-]_ҙcyY:(Ii\.=ߘp`JžXWBY ^.J;Jdn/-c[#Mw귽# Q ?ID!z%N-rkXeЬFhlUᙦ<̪۪S`V/_~Lջk,TUKH0'mW){皣##Tvk6B3l~bYHE )T<Tʓ~$/D|2,sHHm_g4~ t 7I VOҥ6 | {ۅ L$'WgHHHHHHH~ Gij$>/=L깧ʱeaD=xdՃ)ձq!zxM.éN>"-I=>.eۻil{EΨUWje^'~/jA)ҹ/O+wۻ'Fʅ/!M-G.^UxF͒^6HuKFyv'7U|8P y<펳ca|RUKf+[qv=.:#xpT2M"Qc!-̅(O*h0jͭ6:=`Z{6?‚ ϋjp=thoA@d^ (?cM/ n;RQ9̞)Rԛjt|_uhRM1C`7"uR?A U$']NTH#Ug~Tm|Qa.f#Tk?i-2駨:@s1;ľ ."_w90_ZO\Iߦi+|u6jMi\4ʩ*/5-.)$KUtF'͗NI_ibQƥf3}}qelc]S]NI4gKgx^C>n[m:_֟>a^tΜo,xe&tY &Nxp,&)tSwWt^\Z%nwU~}>I[,*?X8,gg:XUڍ\{Lt42SSW?X;+Ǻ ~&ppzc~ɩhIQq.-O&(ɱs5c) hF'p^I%i!'>ֿ.bMsw>yw~ɉ9]ݥ|  _H }z#QHHHHHHH u!-LJ\8-iǬJfRݲ+*>oq][W.v]]`ygP& g:Z.7gj8 *Ft*UugoҌ9!\{q/--\ɿ]ߥs}?v7듉}%Um}yiU,i.(pw:{t.U)k9akZr]Dϡ\ <7X7 Oz6,~.'чAk&E8):JYJIE4[ \x^ܣ84lb1zsz=jɤT?&U@V'{Q#vp6FKto)G{菋dL ^e *Fo:i`sPGE ӹO:K0&q{a[KW_˱0rWz|Pe FRe |O> "ZU}:f,h ˹&SgR?wu3lhǽS]9+{_{8Tvٟ=dU(f!  LNϨdtAX>S1v9gjAe dQ'u0'EIds*;z*kBeԙ^:ʅ-Nh<ʱd2 UgS(Gv{8{Wz.&TH#rbL̥*y744 MGK?Ϛs': .=ɦU}o% rI1RZ|DH1HP?pjY"$> !*ۘ/P'0D>R+ɱ8E1_L'Uu@ԏ0oP~aC7/j~y0#r…>^]>k-rFt*」$s/x}!KFӜ{TqGЛEpRW9Q&&i}mL}52ѩj8*c$U&EPVnK!K}9 `f:f ~0gס*`F$;6=ђh«z'8$˧a*=~ϤBٲ~+ϱG'$@B`LOw0 bmI%o_>8o&N$Ik4'p%Oz@w̻tGYɆ#v,BDʒ ikN&ʒYG=<<΅R i4m)lʅKw0"HvdO'ܐ_>=yITʧ ,r3 Nv0@*?T3H+.L|pАLĹ9b{&Cz@ZV|{J?(ΰj8WUL"TvAYQx܋TiK&~Py)BIUc AaP ꤑgkag~NV|ӅxPC9 =+ۘv v{Z-w# (/LҪ n<="|k]gªcy0ZzoN*[ª2HD|GTH0mABܖBU;ڨ%$OZF˛䳥-?\wu#| fG>m^V7IM]\>k 'wG򍱆Į?ߖIVa")1|\ٳYՄY/tǪȷ` >|ˆ꫐i>2`I$1taPM*hU=?|"M'9H,29F9jslp42+'5LͤTٞj[xeit+J@?$i57D_eBԴJr,A"D }pTN=bJ`0b*)+LdgsUSx)YY>4Bzv) IDATZt"|ՑϾp. "IO ޳T֦Ncl9AF>M:i3YqL {~FZ =^ϵh# |? ~ȱb |7>{9 NA|{g\T Q=211lt}gapnbV.C+7WDp xb8)Nd8!o SIďMQuF`XNfR-(NhmRzQZ DR?![CTFiN~34jMFc 9̧]>.VSIOֹX}-!HoKe'B>jhi:5 E@uc߽jj}֪ꡉvy(q޷x{/|hj5c?S U7N>'՞WI>Sw }ĮV}D:>1KY aTȆKt,jtԶu%YzN)Wv<\uŻ"r CYDv-/A#^#͋#*r{8a0Y*G}#-+f7LiU-^ NjNsTahW. 
!'"kzU{}UHu) jp p٫_9bNYz?)aMY~F{8U0E+{g "aU3qvzY0-0gA I +=3~Z]ڝNz]'z_%L?wꩃ4դXv`G-u&wr- $@^ gy {׫ZVk`d) /zgt rlҊ0*Tq Tj uo8WFyWCqwߥ}ڿ>VMS"y``̦3y/-$λeV{r|BMh;G i_MSE_Xn:6]D$@[w^%nIJ_RuWy*G^ǒ# c @K&8ZN4ZۤWu$ iAzJr|pjA zSXzdh9(5*u3eUc*F sC,Dg ws׳zPdC*ɍAGN"^L  9]o_ڪ*JW/}di09"^ =ۑ@J7b_;I~![4|)}-塣ˆ~)͡})0G[_[,;>=%oFFF4"&@U _KKa>{җĔUX25:Ĉ ^5҇H' 9ў EM/Y:us夨UW-(K$! %7!I}ű!zhXO v'{H~2$O~ݒw'l]?w/v'o*O"s}f *ܥ˟5r_W8YGeH^Jޗa3  x)IH>o4B$@$;;S | z?f+hHHwDې3 IENDB`xapers-0.5.2/setup.py000077500000000000000000000011121214264575400145420ustar00rootroot00000000000000#!/usr/bin/env python from distutils.core import setup execfile('lib/xapers/version.py') setup( name = 'xapers', version = __version__, description = 'Xapian article indexing system.', author = 'Jameson Rollins', author_email = 'jrollins@finestructure.net', url = '', package_dir = {'': 'lib'}, packages = [ 'xapers', 'xapers.parsers', 'xapers.sources', 'xapers.cli', 'xapers.nci', ], scripts = ['bin/xapers'], requires = [ 'xapian', 'pybtex', 'urwid' ], ) xapers-0.5.2/test/000077500000000000000000000000001214264575400140115ustar00rootroot00000000000000xapers-0.5.2/test/all000077500000000000000000000334761214264575400145240ustar00rootroot00000000000000#!/usr/bin/env bash test_description='basic command line usage.' . 
. ./test-lib.sh

################################################################

# FIXME: update with source already in db
# FIXME: add with prompting

################################################################

test_expect_code 1 'fail search without database' \
    'xapers search tag:foo'

test_expect_code 1 'fail to add without file or source' \
    'xapers add --tags=new'

test_expect_success 'add file without source' \
    'xapers add \
    --file=$DOC_DIR/1.pdf \
    --tags=new,foo'

test_expect_success 'new docdir exists' \
    'test -d $XAPERS_ROOT/0000000001'

test_begin_subtest 'tag file exists'
cat <<EOF >EXPECTED
foo
new
EOF
test_expect_equal_file "$XAPERS_ROOT"/0000000001/tags EXPECTED

test_expect_code 1 'fail to add non-bibtex file as source' \
    'xapers add \
    --source=$DOC_DIR/1.pdf'

test_expect_success 'add bib without file' \
    'xapers add \
    --source=$DOC_DIR/2.bib \
    --tags=new,bar'

test_begin_subtest 'bib file exists and is correct'
cat <<EOF >EXPECTED
@article{ Good_Bad_Up_Down_Left_Right_et_al._2012,
    author = "Good, Bob and Bad, Sam and Up, Steve and Down, Joseph and Left, Aidan and Right, Kate and et al.",
    publisher = "Optical Society of America",
    doi = "10.9999/FOO.1",
    title = "Multicolor cavity sadness",
    url = "http://dx.doi.org/10.9999/FOO.1",
    journal = "Journal of the Color Feelings",
    number = "10",
    month = "Sep",
    volume = "29",
    year = "2012",
    pages = "2092"
}
EOF
test_expect_equal_file "$XAPERS_ROOT"/0000000002/bibtex EXPECTED

test_expect_success 'add with file and bib' \
    'xapers add \
    --file=$DOC_DIR/3.pdf \
    --source=$DOC_DIR/3.bib \
    --tags=qux'

test_expect_code 1 'fail to add non-existent file' \
    'xapers add --file=foo.pdf'

test_expect_code 1 'fail to add non-existent source' \
    'xapers add --source=foo.bib'

test_expect_code 1 'fail to add non-bibtex file as source' \
    'xapers add --source=$DOC_DIR/3.pdf'

test_expect_code 1 'fail to add source doc already associated with different doc' \
    'xapers add --source=doi:10.9999/FOO.1 id:1'

test_begin_subtest 'update doc with bib'
xapers add --source=$DOC_DIR/1.bib id:1
xapers search id:1 >OUTPUT
cat <<EOF >EXPECTED
id:1 [arxiv:1234] {arxiv:1234} (foo new) "Creation of the Universe"
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'update with different bib overwrites previous'
xapers add --source=$DOC_DIR/1a.bib id:1
xapers search id:1 >OUTPUT
cat <<EOF >EXPECTED
id:1 [arxiv:1235] {arxiv:1235} (foo new) "Creation of the γ-verses"
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'previous source no longer in db'
xapers search arxiv:1234 >OUTPUT
cat <<EOF >EXPECTED
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'update doc with file'
xapers add --file=$DOC_DIR/2\ file.pdf doi:10.9999/FOO.1
xapers search pellentesque >OUTPUT
cat <<EOF >EXPECTED
id:2 [doi:10.9999/FOO.1] {Good_Bad_Up_Down_Left_Right_et_al._2012} (bar new) "Multicolor cavity sadness"
EOF
test_expect_equal_file OUTPUT EXPECTED

test_expect_success 'add bib without file' \
    'xapers add \
    --source=$DOC_DIR/4.bib \
    --tags=new'

test_expect_success 'add file without source' \
    'xapers add \
    --file=$DOC_DIR/5.pdf \
    --tags=new'

################################################################

test_begin_subtest 'count all'
output=`xapers count`
test_expect_equal "$output" 5

test_begin_subtest 'count all (*)'
output=`xapers count '*'`
test_expect_equal "$output" 5

test_begin_subtest 'count search'
output=`xapers count tag:new`
test_expect_equal "$output" 4

test_expect_code 1 'fail search without query' \
    'xapers search'

test_begin_subtest 'search all'
xapers search '*' >OUTPUT
cat <<EOF >EXPECTED
id:5 [] {} (new) ""
id:4 [doi:10.9999/FOO.2] {30929234} (new) "The Circle and the Square: Forbidden Love"
id:3 [] {fake:1234} (qux) "When the liver meats the pavement"
id:2 [doi:10.9999/FOO.1] {Good_Bad_Up_Down_Left_Right_et_al._2012} (bar new) "Multicolor cavity sadness"
id:1 [arxiv:1235] {arxiv:1235} (foo new) "Creation of the γ-verses"
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'search all pipe'
xapers search '*' | cat >OUTPUT
cat <<EOF >EXPECTED
id:5 [] {} (new) ""
id:4 [doi:10.9999/FOO.2] {30929234} (new) "The Circle and the Square: Forbidden Love"
id:3 [] {fake:1234} (qux) "When the liver meats the pavement"
id:2 [doi:10.9999/FOO.1] {Good_Bad_Up_Down_Left_Right_et_al._2012} (bar new) "Multicolor cavity sadness"
id:1 [arxiv:1235] {arxiv:1235} (foo new) "Creation of the γ-verses"
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'search all --limit'
xapers search --limit=3 '*' >OUTPUT
cat <<EOF >EXPECTED
id:5 [] {} (new) ""
id:4 [doi:10.9999/FOO.2] {30929234} (new) "The Circle and the Square: Forbidden Love"
id:3 [] {fake:1234} (qux) "When the liver meats the pavement"
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'search text'
xapers search --output=summary lorem >OUTPUT
cat <<EOF >EXPECTED
id:5 [] {} (new) ""
id:1 [arxiv:1235] {arxiv:1235} (foo new) "Creation of the γ-verses"
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'search prefix title:'
xapers search title:cavity >OUTPUT
cat <<EOF >EXPECTED
id:2 [doi:10.9999/FOO.1] {Good_Bad_Up_Down_Left_Right_et_al._2012} (bar new) "Multicolor cavity sadness"
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'search prefix author:'
xapers search author:cruise >OUTPUT
cat <<EOF >EXPECTED
id:1 [arxiv:1235] {arxiv:1235} (foo new) "Creation of the γ-verses"
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'search prefix id:'
xapers search id:3 >OUTPUT
cat <<EOF >EXPECTED
id:3 [] {fake:1234} (qux) "When the liver meats the pavement"
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'search prefix :'
xapers search doi:10.9999/FOO.1 >OUTPUT
cat <<EOF >EXPECTED
id:2 [doi:10.9999/FOO.1] {Good_Bad_Up_Down_Left_Right_et_al._2012} (bar new) "Multicolor cavity sadness"
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'search prefix bib:'
test_subtest_known_broken
xapers search key:Good_Bad_Up_Down_Left_Right_et_al._2012 >OUTPUT
cat <<EOF >EXPECTED
id:2 [doi:10.9999/FOO.1] {Good_Bad_Up_Down_Left_Right_et_al._2012} (bar new) "Multicolor cavity sadness"
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'search different prefix bib:'
xapers search key:fake:1234 >OUTPUT
cat <<EOF >EXPECTED
id:3 [] {fake:1234} (qux) "When the liver meats the pavement"
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'search prefix tag:'
xapers search tag:new >OUTPUT
cat <<EOF >EXPECTED
id:5 [] {} (new) ""
id:4 [doi:10.9999/FOO.2] {30929234} (new) "The Circle and the Square: Forbidden Love"
id:2 [doi:10.9999/FOO.1] {Good_Bad_Up_Down_Left_Right_et_al._2012} (bar new) "Multicolor cavity sadness"
id:1 [arxiv:1235] {arxiv:1235} (foo new) "Creation of the γ-verses"
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'search --output=tags'
xapers search --output=tags tag:foo | sort >OUTPUT
cat <<EOF >EXPECTED
foo
new
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'search --output=tags all'
xapers search --output=tags '*' | sort >OUTPUT
cat <<EOF >EXPECTED
bar
foo
new
qux
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'search --output=sources'
xapers search --output=sources tag:bar >OUTPUT
cat <<EOF >EXPECTED
doi:10.9999/FOO.1
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'search --output=sources all'
xapers search --output=sources '*' >OUTPUT
cat <<EOF >EXPECTED
arxiv:1235
doi:10.9999/FOO.1
doi:10.9999/FOO.2
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'search --output=keys'
xapers search --output=keys tag:bar >OUTPUT
cat <<EOF >EXPECTED
Good_Bad_Up_Down_Left_Right_et_al._2012
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'search --output=keys all'
xapers search --output=keys '*' >OUTPUT
cat <<EOF >EXPECTED
30929234
Good_Bad_Up_Down_Left_Right_et_al._2012
arxiv:1235
fake:1234
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'search --output=files'
xapers search --output=files '*' | sed "s|$XAPERS_ROOT|XAPERS_ROOT|" >OUTPUT
cat <<EOF >EXPECTED
XAPERS_ROOT/0000000005/5.pdf
XAPERS_ROOT/0000000003/3.pdf
XAPERS_ROOT/0000000002/2 file.pdf
XAPERS_ROOT/0000000001/1.pdf
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'search --output=bibtex single'
xapers search --output=bibtex tag:foo | sed "s|$XAPERS_ROOT|XAPERS_ROOT|" >OUTPUT
cat <<EOF >EXPECTED
@article{ arxiv:1235,
    author = "Dole, Bob and Cruise, Tim",
    title = "Creation of the γ-verses",
    eprint = "1235",
    file = ":XAPERS_ROOT/0000000001/1.pdf:pdf",
    year = "2012"
}
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'bibtex multiple'
xapers bibtex tag:new | sed "s|$XAPERS_ROOT|XAPERS_ROOT|" >OUTPUT
cat <<EOF >EXPECTED
@article{ 30929234,
    author = "Me and You and We Know, Everyone",
    url = "http://dx.doi.org/10.9999/FOO.2",
    title = "The Circle and the Square: Forbidden Love",
    journal = "Shaply Letters",
    doi = "10.9999/FOO.2",
    year = "1869"
}

@article{ Good_Bad_Up_Down_Left_Right_et_al._2012,
    author = "Good, Bob and Bad, Sam and Up, Steve and Down, Joseph and Left, Aidan and Right, Kate and et al.",
    publisher = "Optical Society of America",
    doi = "10.9999/FOO.1",
    title = "Multicolor cavity sadness",
    url = "http://dx.doi.org/10.9999/FOO.1",
    journal = "Journal of the Color Feelings",
    number = "10",
    month = "Sep",
    volume = "29",
    file = ":XAPERS_ROOT/0000000002/2 file.pdf:pdf",
    year = "2012",
    pages = "2092"
}

@article{ arxiv:1235,
    author = "Dole, Bob and Cruise, Tim",
    title = "Creation of the γ-verses",
    eprint = "1235",
    file = ":XAPERS_ROOT/0000000001/1.pdf:pdf",
    year = "2012"
}
EOF
test_expect_equal_file OUTPUT EXPECTED

################################################################

test_expect_code 1 'fail tag without operation' \
    'xapers tag tag:foo'

test_expect_code 1 'fail tag without search' \
    'xapers tag +baz'

test_begin_subtest 'add tag'
xapers tag +baz -- tag:foo
xapers search tag:baz >OUTPUT
cat <<EOF >EXPECTED
id:1 [arxiv:1235] {arxiv:1235} (baz foo new) "Creation of the γ-verses"
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'check tags added to tag file'
cat <<EOF >EXPECTED
baz
foo
new
EOF
test_expect_equal_file "$XAPERS_ROOT"/0000000001/tags EXPECTED

test_begin_subtest 'remove tag'
xapers tag -baz -- tag:baz
xapers search tag:baz >OUTPUT
cat <<EOF >EXPECTED
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'check tags removed from tag file'
cat <<EOF >EXPECTED
foo
new
EOF
test_expect_equal_file "$XAPERS_ROOT"/0000000001/tags EXPECTED

test_begin_subtest 'add and remove tags'
xapers tag -foo +zzz -- tag:foo and tag:zzz
xapers search tag:foo and tag:zzz >OUTPUT
cat <<EOF >EXPECTED
EOF
test_expect_equal_file OUTPUT EXPECTED

################################################################

rm -rf "$TMP_DIRECTORY"/export

test_expect_code 1 'fail export no query' \
    'xapers export $TMP_DIRECTORY/export'

test_begin_subtest 'export all'
xapers export "$TMP_DIRECTORY"/export '*'
find "$TMP_DIRECTORY"/export -mindepth 1 | sed "s|$TMP_DIRECTORY|TMP_DIRECTORY|" | sort >OUTPUT
cat <<EOF >EXPECTED
TMP_DIRECTORY/export/5.pdf
TMP_DIRECTORY/export/When_the_liver_meats_the_pavement.pdf
TMP_DIRECTORY/export/Multicolor_cavity_sadness.pdf
TMP_DIRECTORY/export/Creation_of_the_γ-verses.pdf
EOF
test_expect_equal_file OUTPUT EXPECTED

rm -rf "$TMP_DIRECTORY"/export

test_begin_subtest 'export query'
xapers export "$TMP_DIRECTORY"/export lorem
find "$TMP_DIRECTORY"/export -mindepth 1 | sed "s|$TMP_DIRECTORY|TMP_DIRECTORY|" | sort >OUTPUT
cat <<EOF >EXPECTED
TMP_DIRECTORY/export/5.pdf
TMP_DIRECTORY/export/Creation_of_the_γ-verses.pdf
EOF
test_expect_equal_file OUTPUT EXPECTED

test_expect_success 'restore to existing db' \
    "xapers restore"

test_begin_subtest 'database intact after restore'
xapers search '*' >OUTPUT
cat <<EOF >EXPECTED
id:5 [] {} (new) ""
id:4 [doi:10.9999/FOO.2] {30929234} (new) "The Circle and the Square: Forbidden Love"
id:3 [] {fake:1234} (qux) "When the liver meats the pavement"
id:2 [doi:10.9999/FOO.1] {Good_Bad_Up_Down_Left_Right_et_al._2012} (bar new) "Multicolor cavity sadness"
id:1 [arxiv:1235] {arxiv:1235} (foo new) "Creation of the γ-verses"
EOF
test_expect_equal_file OUTPUT EXPECTED

test_expect_code 1 'fail delete with no query' \
    "xapers delete"

# purge the db from the root
rm -rf $XAPERS_ROOT/.xapers

test_expect_success 'restore purged db' \
    "xapers restore"

test_begin_subtest 'database intact after restore'
xapers search '*' >OUTPUT
cat <<EOF >EXPECTED
id:5 [] {} (new) ""
id:4 [doi:10.9999/FOO.2] {30929234} (new) "The Circle and the Square: Forbidden Love"
id:3 [] {fake:1234} (qux) "When the liver meats the pavement"
id:2 [doi:10.9999/FOO.1] {Good_Bad_Up_Down_Left_Right_et_al._2012} (bar new) "Multicolor cavity sadness"
id:1 [arxiv:1235] {arxiv:1235} (foo new) "Creation of the γ-verses"
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'delete single document noprompt'
xapers delete --noprompt id:2
xapers search '*' >OUTPUT
cat <<EOF >EXPECTED
id:5 [] {} (new) ""
id:4 [doi:10.9999/FOO.2] {30929234} (new) "The Circle and the Square: Forbidden Love"
id:3 [] {fake:1234} (qux) "When the liver meats the pavement"
id:1 [arxiv:1235] {arxiv:1235} (foo new) "Creation of the γ-verses"
EOF
test_expect_equal_file OUTPUT EXPECTED

test_begin_subtest 'delete document search w/ prompt'
echo 'yes' | xapers delete lorem
xapers search lorem >OUTPUT
cat <<EOF >EXPECTED
EOF
test_expect_equal_file OUTPUT EXPECTED

test_expect_code 1 'check for deleted docdirs' "
    test -d $XAPERS_ROOT/0000000001 \
    || test -d $XAPERS_ROOT/0000000002 \
    || test -d $XAPERS_ROOT/0000000005
"

################################################################

test_done

xapers-0.5.2/test/basic:
#!/usr/bin/env bash
#
# Copyright (c) 2005 Junio C Hamano
#

test_description='the test framework itself.'
. ./test-lib.sh

################################################################

test_expect_success 'success is reported like this' '
    :
'

test_set_prereq HAVEIT
haveit=no
test_expect_success HAVEIT 'test runs if prerequisite is satisfied' '
    test_have_prereq HAVEIT &&
    haveit=yes
'

clean=no
test_expect_success 'tests clean up after themselves' '
    test_when_finished clean=yes
'

cleaner=no
test_expect_code 1 'tests clean up even after a failure' '
    test_when_finished cleaner=yes &&
    (exit 1)
'

if test $clean$cleaner != yesyes
then
    say "bug in test framework: cleanup commands do not work reliably"
    exit 1
fi

test_expect_code 2 'failure to clean up causes the test to fail' '
    test_when_finished "(exit 2)"
'

# Ensure that all tests are being run
test_begin_subtest 'Ensure that all available tests will be run by xapers-test'
eval $(sed -n -e '/^TESTS="$/,/^"$/p' $TEST_DIRECTORY/xapers-test)
eval $(sed -n -e '/^TESTS_NET="$/,/^"$/p' $TEST_DIRECTORY/xapers-test)
tests_in_suite=$(for i in $TESTS $TESTS_NET; do echo $i; done | sort)
available=$(find "$TEST_DIRECTORY" -maxdepth 1 -type f -perm +111 \
    ! -name '*~' \
    ! -name test-aggregate-results \
    ! -name test-verbose \
    ! -name xapers-test \
    | sed 's,.*/,,' | sort)
test_expect_equal "$tests_in_suite" "$available"

EXPECTED=$TEST_DIRECTORY/test.expected-output
suppress_diff_date() {
    sed -e 's/\(.*\-\-\- test-verbose\.4\.\expected\).*/\1/' \
        -e 's/\(.*\+\+\+ test-verbose\.4\.\output\).*/\1/'
}

test_begin_subtest "Ensure that test output is suppressed unless the test fails"
output=$(cd $TEST_DIRECTORY; ./test-verbose 2>&1 | suppress_diff_date)
expected=$(cat $EXPECTED/test-verbose-no | suppress_diff_date)
test_expect_equal "$output" "$expected"

test_begin_subtest "Ensure that -v does not suppress test output"
output=$(cd $TEST_DIRECTORY; ./test-verbose -v 2>&1 | suppress_diff_date)
expected=$(cat $EXPECTED/test-verbose-yes | suppress_diff_date)
# Do not include the results of test-verbose in totals
rm $TEST_DIRECTORY/test-results/test-verbose-*
rm -r $TEST_DIRECTORY/tmp.test-verbose
test_expect_equal "$output" "$expected"

################################################################

test_done

xapers-0.5.2/test/docs/

xapers-0.5.2/test/docs/1.bib:
@article{ arxiv:1234,
    author = "Dole, Bob and Cruise, Toom",
    title = "Creation of the Universe",
    year = "2012",
    eprint = "1234"
}

xapers-0.5.2/test/docs/1.pdf:
[binary PDF data omitted]
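The `test/all` suite checks that each added document lands in a zero-padded docdir under `$XAPERS_ROOT` (e.g. `0000000001`) containing the document file, a `bibtex` file, and a `tags` file with one tag per line. A minimal stdlib-only sketch of that on-disk layout (an illustration of the convention the fixtures compare against, not xapers' actual implementation):

```python
import os
import tempfile

def docdir(root, docid):
    # document ids are rendered as 10-digit zero-padded directory
    # names, e.g. 1 -> 0000000001
    path = os.path.join(root, '%010d' % docid)
    os.makedirs(path, exist_ok=True)
    return path

def write_tags(root, docid, tags):
    # tags are stored sorted, one per line, matching the 'tag file
    # exists' fixture above
    with open(os.path.join(docdir(root, docid), 'tags'), 'w') as f:
        f.write('\n'.join(sorted(tags)) + '\n')

root = tempfile.mkdtemp()
write_tags(root, 1, {'new', 'foo'})
print(open(os.path.join(root, '0000000001', 'tags')).read())
# foo
# new
```

Storing tags as a plain sorted text file is what lets the suite diff them with `test_expect_equal_file` instead of querying the Xapian index.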
xapers-0.5.2/test/docs/1a.bib:
@article{ arxiv:1235,
    author = "Dole, Bob and Cruise, Tim",
    title = "Creation of the γ-verses",
    year = "2012",
    eprint = "1235"
}

xapers-0.5.2/test/docs/2 file.pdf:
[binary PDF data omitted]
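The 'export all' subtest in `test/all` shows the export naming convention: a document with a title is written out as the title with spaces replaced by underscores (`Multicolor_cavity_sadness.pdf`), while a document with no title keeps its original file name (`5.pdf`). A hedged stdlib sketch of that rule (an assumption drawn from the fixture output, not xapers' own code):

```python
import os

def export_name(title, orig_name):
    # keep the original extension; replace the base name with the
    # underscore-joined title when one is available
    base, ext = os.path.splitext(orig_name)
    if title:
        base = title.replace(' ', '_')
    return base + ext

print(export_name('Multicolor cavity sadness', '2 file.pdf'))
# Multicolor_cavity_sadness.pdf
print(export_name(None, '5.pdf'))
# 5.pdf
```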
6&m@63) P٘r} V ׀?+TWs4v998Bk%mLm6N'qݝuBlLLalǪawJjBfvp@>Nv36g^oO;[;7898=_0;f_`ӿ;@c'2r'eԒ`K}ۺ<98\l>^^,ZAkC(kcj Kk#褐{c{(tL' 36 )g+?t k aCmlN1o!R72-[Alʶ?3|fl8՟.:6ƶ&;7rG;uAMnN65*`jǍrX%0x8 ?5R`UqX5AV"W?`5/|E{Oa5d՛ ҙCZOī /Zտkq@R٬ƾ>rf#vn/-&: /J/S]3o:8>`y~X Ȣ6Fĕy{\hz[3=C:b2}ufÍXpҦ$ݵaK=bhkJoGx6_?&!2x={i[¶@wQ;+ܹI-nTsˣ _Kvo4kcV@&@9O__Akvќ`3؀S(@R>s(izAIB`r^Yo";I7]ir/%\ 13Yl>|I͒pEN(8u{$}j,-1?AgT:nc15gQN(Nv!aOsZn.c!"XҳXd yS9q 4-v6ܢ׹osc12,E FcŭFD['W/+H_۞s"\"qV趲-yxeX 8SUd۸UN!nk41Iz1,FTDȷ PRTđnSzZSwLX֬wpUhr2!P%r0#{hG1)Dh^E#naBMXv,>IMܯ~Mz5Hl*vLs,#Y"shTT60gsL}2z{!rL X_V%RpE!17SvHp,A%u [+oz>hrj*?hrB7))zݘk_ .iSp,Z26Uebpsb<p3,|.tW7Qv²x/6u)JB2V}8a3,R^y>)>#ˍ\N$"N4$Ri75mR, S螄b@*>qFԁj~n\q(v56āW5,[(Pס{4X9_mFTb~q HߏU4gp%3ըJ刖8V~cb`Zz]~16KT; "ST"ij@A҇)[n/E E* tRoXzy.N.ZU ^#2}Nos.pbCLl!R"7Z!U~yHrK iDI=6s]\CoګH%zsKs,~x\P̱ e+Y:`U+S'F*L:SrŞ?N?,1gF)<(E0|ݯvKУl( 2VbqL2(I"qš}P7 Zrk#1&H8$dkVAi_Pd72d,n:̈OcKGG%IYcG|?n9O[^'dVXxE3{(qi r (UNp`@xs=3 twKP~nS^/7wPtʄ-)ת\Ez_{֝( S}&& L?E/5wD/bx\`yU4:ϣHhfIk>T-mg|Rn̎7ƊA tGTd©v.J6[:/gj;b(3` c^m .-wz6m^opEX`iNmE;N G=#xwnY*Fp@R`o`sL2GJ>i4$DE_LLe6TpYE< ; qG֦31Mt 2I%-6>JeG\n+C J:֍6M!NSRDB,=^Dˮ$Lc{Dztv3/%kcF[.㵈}ޭh珞5@q˓u+3+ q k黍,dK^p%V̹ JJY^HIG8 NQ +~^R}Z"TS5H܃oPˤ$BX4xA+C_+n`MM2FGfKŕ[GŠ:ٺY]|5_{ɠ/MLN)LnjxBlcqFʅTs@.N@ǀhYl>̎~SF?v&fpcVRB?%Ugp1tJCj& ]$|Յ*gz{; k+䐈t3">Ch>s/E-q1\ɽFڣ҈b F3֝pPє{4roV.O9XuT=J4USoFFC]W{r~U){=./ N[>] J%ʦEuULY[o"~fd)6c~Dq_K[ąX&+ЊE/ *{?腘|xLy0#-ǀq:M-腻'6`YQUuִok>}Kd!g>4-br'{qС|.K, Ww5bpHjuпtU\#xoS,CONj>SpXe%xb&T0%[ )1:PU[+z d%Я$SCD&E3=:**񄆗E cUä|hx\MS.hr:1ȯ3 aL9x;!ϐ)4̘0zgB MMYjSՂ 2Pi~33|8>1To+OՇ&f(jR^3 [sW :i34+I#Ħ F(_+:m+X mgrliF 0Jp*DuQۍEl/UF ;*^B{:FlZO ģ:rbCuWߤ30cMK!BmW&.҆z8e}s_t4^tXT/zVOכ{.˷;0M(y*+ؗY_.[9 7瀢Kk|,B1?Oen5-ǗtG;_CtŒ ~3N jFuCpW7nOL(`8ҨTw[~$͸ ́e2i D l 92n:c?ˇ{Ԣʻjs`zS_aC[EܕZfYv)  # XgZeoӐ.zL bߙY-'ҺW6,4Eћ2m E"+`WVZ!Q>om2EU1vF@w.PRH ,yS$H6{v ȱw~Fpɯ?M 3"2rψgx?sCܞtTAl>OQXUzdBe}[Z&i>f1d@sJD6$< ޻a6`xL.U(mpWj`C]{\-܌#^4 *dzyk؇L`q_\d\#0`tD2ZͣW\)Lcۭbmڦ%%Qk.ɰT0 p7FH`y=[kjhRqz&ҭZe1^I>#. I G҃0r&%'2'a(6r: \-HʍH`ߛ?oGA(lFb<~ B tw$2TGc<:DQbѻ#@M|? 
WY평;rØGdn/xЃFZCΕnrta`v2@&\΁ehw{+r+ېQcmy"i*X\Kj  Լ 75缑=zo:˞dzv/=epm +i,e~V9P여␀$: avdw|tcSbʖ,xlR4esgz^n>i'dYڮNI|>f] ט͌G"mo7EpqTv%v$,0+rDM_.'OBڻgl*c)#%c. 7J{mg V;W {e᷒({4Nw=BYz=#lBSX#g[ͭw 꼰PYղe$F-iy粠dj.iҥ:|r 8Js?i$y2k.*:6!;Smq3>WCEL5 MvN]/> %@G̔ha^PnKJ( q5X ʦOb؝\tҩ Ra)T.wL^{E;N\.ӛv)a&l|.&4"2 f{.'dZ 2{=yQU K)s\Tl".M< ^|+?L9j5&U>7L. { lwA]!5=:g7UE@T}]^(V=6x[s<ۜ¨rWr'3]Yw[ˉ%$WwU''"0W C%z 7xx \,$FTāe'ejC]HpE<տ+0'rA"lZ0vͱ0or nsZ'h5+ VUmӸ\k*` S7@) Y' 0ڋWLa"ܓ壺iYj`N+zՐ0;e͸MK<檈YY~ 2g}}V $#ե\鐂:aاБovxBi[nC*VUWѓ^J>K'UEZHEʬ:]|U2CadNꆶIfxjO2Wt3l@{&ūf:o뉌4dq-Z _ ҁ+Hr3V)A@q5]ۣ'p-=0sUBd䄲s4%,LI}(xswbɷkT%wL!*m6nvu)O,}SgKD&W{pS壪8&S@Pً+^ޯ-;yJ^+S/mr {jzLk؊s6 ÈoG/2}&FfdQē] S/ ;87Wt27Be{6,kmU: Y˦LoA+"Kp:{k.h<;بq,ܿ{Xb,TR(րۭTq@hG}~sG^_ ]~0uÿƃ13 B(O_ɖ.٭a99~NDL3yt$#&F㉁*X<߷?ڰOF]NdYmv]ގ jy<ƄAbހtIٺ;$'3?rЎ+Z4_$N߂g).z meƷc;Fd./ >VˣċԻ*5 F_+d>3q YP씾 b{n4=.~6yYTP(B- QfC2sah=g?bt}`B=r,CA.!"-GÌmSg@+g]Xr =׷!yLЬKp? ozK\(uD{:a)_?k=)|:F ^G]h=4z*FbQ%K*U>e\ɵrl(ބ(Ñ3¢=p~AtߌK!UeWOxkl3a q`wXU}cp|Iipawև! t85%1:u-; 9{jn` e#W}SH&+?TK"\cj-~yP0wc%h+6E3ziH%diP8⎧ y؂R~&8w` qVg.Yo*Gr$PJ*1y?,`aoywoeUwv?e l[$yɠwRl+2pSv-"xDY= E|֕d&5>b;pPg;P*m"s" IJ haw?ẃV8(=Џ<;Ւk3YD~J`T7ޡA'!*%0~:uZ;T-uU_4[rj*8(6@W>(+!zI[ŏht|ʎDX$[$6ޭZ6$ϊe;(5p<R+ [| LĜB??zLr2eǸ˔IQh,9:Pl|io{L׸9LgLӠ׸1˔n؃|M,9bl)&wo)/Eݭ'Jr`o9!)> ۶Lhk&}mQ. 9nGTAʪس[/bi+%Ar$8+/i%G`ѮǕVm>Da>Qc`UKm{'\>? YUr t~Vj!EQĵ/{cxq)̔vڻ8 IxW~[4_QNɦ0xZF g$g8=Cj4SU-[4gX? p&͑gŸqvC)vE{z9E` ?52.D&)!ƣ- D ,Lt?6U٦?[ZVؚ;~Cm`Q<,:g(9E㰇^}8wαΒw;ED%ٯh ̑\d"(=+&vG u4*`w c0ހg^ld/EKbW!6SE $cʻY6T R<9i#_fVE11.fC]$[_ƌ- 9.8207<Wq9vdbokIe#'iN7<4[\Mb`q\Pmk &$M ̅o~w/.P-;u,aUVg:nC,$ /%\X[3ϣ'ͿT2>W\9DU~$߄y\ IИm?hb{y.$ǽ`98Dc`$tSl`0;ʾg0V5ȾG[t>y:[h?7ԾoOkV! E.>\<$a|݉Cb;.K&7-|Ձ'O|"o|4sQ\uUO6_Eg}dhaR`951O=/}lH =[d Ni9UG-N}wkЄ2g3G9࠼ ln0G f:N雹*-&*84N(pkv0_KJPF#h,>/j?=+M(_5,|m^B!FUl=U;s6kDs+&Kfy#5c+"&ܘwZGc>Ņߺ_~? 
[binary PDF data omitted]

xapers-0.5.2/test/docs/2.bib:
@article{Good_Bad_Up_Down_Left_Right_et_al._2012,
 title={Multicolor cavity sadness},
 volume={29},
 url={http://dx.doi.org/10.9999/FOO.1},
 DOI={10.9999/FOO.1},
 number={10},
 journal={Journal of the Color Feelings},
 publisher={Optical Society of America},
 author={Good, Bob and Bad, Sam and Up, Steve and Down, Joseph and Left, Aidan and Right, Kate and et al.},
 year={2012},
 month={Sep},
 pages={2092}}

xapers-0.5.2/test/docs/3.bib:
@article{ fake:1234,
    author = "Reed, Lou and Björk",
    title = "When the liver meats the pavement",
    year = "1980",
    journal = "fake"
}

xapers-0.5.2/test/docs/3.pdf:
[binary PDF data omitted]
xapers-0.5.2/test/docs/4.bib:
@article{30929234,
 title={The Circle and the Square: Forbidden Love},
 url={http://dx.doi.org/10.9999/FOO.2},
 DOI={10.9999/FOO.2},
 journal={Shaply Letters},
 author={Me and You and We Know, Everyone},
 year={1869}}

@article{30929,
 title={Circle are Squares},
 url={http://dx.doi.org/10.9999/FOO.3},
 DOI={10.9999/FOO.3},
 journal={Sharp Letters},
 author={Me and You},
 year={1869}}

xapers-0.5.2/test/docs/4.pdf:
[binary PDF data omitted]
xapers-0.5.2/test/docs/5.pdf:
[binary PDF data omitted]
BloО04~:ޜP-fog=`.A ./]UdJ,}mt9Kji!1xrYXb)vOl{yR WDg!p-lQ] %Ӿ-O'OG&iK-3e1B?)H jj>*%Ȏ)|틍Wܘ1Nf;7ƨ_ЫX$b Y[lJ^ $ k}$ԻK)[riR v3~;u<$>*];m+|~ !3pK1ך[2ݼ/jHu T7;(HckF m e[1,Xh~}8LEF_;N3s7i+Z4Զ?&R{3k^$9+YFoE+o﬛wԎ0ùp*1(&g6 zbz5 vԒ@B;2՜Ex/J 2^$X[ endstream endobj 11 0 obj << /Producer (pdfTeX-1.40.13) /Creator (TeX) /CreationDate (D:20130413212535-07'00') /ModDate (D:20130413212535-07'00') /Trapped /False /PTEX.Fullbanner (This is pdfTeX, Version 3.1415926-2.4-1.40.13 (TeX Live 2012/Debian) kpathsea version 6.1.0) >> endobj 5 0 obj << /Type /ObjStm /N 7 /First 40 /Length 553 /Filter /FlateDecode >> stream xڥSKo@+ت2²+Yl,ԨI:Q- @Wɿ{`53`4B"C A ,+o:5Ya%4->Ņ@JI4A|D$!E*PoDy@om},QhR:i&-(qsC8[$guc*p-wbؿ0}Ut_vI~̒/qop Pw1JA'>w{)0گ*JR& !V.{y|Euv> 5~'s̛. X-`fciERBE_Z#U3z\ѝRZu` cf({2;)VmzXqL2aSx#ty0 endstream endobj 12 0 obj << /Type /XRef /Index [0 13] /Size 13 /W [1 2 1] /Root 10 0 R /Info 11 0 R /ID [<0D826DDE839BE08F0848B1953C6480A7> <0D826DDE839BE08F0848B1953C6480A7>] /Length 48 /Filter /FlateDecode >> stream x "Sg7-⿥њ<^  endstream endobj startxref 14390 %%EOF xapers-0.5.2/test/docs/all.bib000066400000000000000000000022601214264575400161670ustar00rootroot00000000000000@article{ arxiv:1235, author = "Dole, Bob and Cruise, Tim", title = "Creation of the γ-verses", year = "2012", eprint = "1235", file = {:__DOC_DIR__/1.pdf:pdf} } @article{Good_Bad_Up_Down_Left_Right_et_al._2012, title={Multicolor cavity sadness}, volume={29}, url={http://dx.doi.org/10.9999/FOO.1}, DOI={10.9999/FOO.1}, number={10}, journal={Journal of the Color Feelings}, publisher={Optical Society of America}, author={Good, Bob and Bad, Sam and Up, Steve and Down, Joseph and Left, Aidan and Right, Kate and et al.}, year={2012}, month={Sep}, pages={2092}, file={:__DOC_DIR__/2 file.pdf:pdf}} @article{ fake:1234, author = "Reed, Lou and Björk", title = "When the liver meats the pavement", year = "1980", journal = "fake", file = {:__DOC_DIR__/5.pdf:} } @article{30929234, title={The Circle and the Square: Forbidden Love}, 
url={http://dx.doi.org/10.9999/FOO.2}, DOI={10.9999/FOO.2}, file={:}, journal={Shaply Letters}, author={Me and You and We Know, Everyone}, year={1869}} @article{30929, title={Circle are Squares}, url={http://dx.doi.org/10.9999/FOO.3}, DOI={10.9999/FOO.3}, journal={Sharp Letters}, author={Me and You}, year={1869}} xapers-0.5.2/test/import000077500000000000000000000051271214264575400152560ustar00rootroot00000000000000#!/usr/bin/env bash test_description='bibtex database importing.' . ./test-lib.sh ################################################################ test_expect_code 1 'fail import without bibtex' \ 'xapers import' sed "s|__DOC_DIR__|$DOC_DIR|g" <"$DOC_DIR"/all.bib >all.bib # the following two tests provides entries so we can test that import # updates existing entries test_begin_subtest 'add initial documents' xapers add --tags=foo --source="$DOC_DIR"/2.bib xapers add --tags=bar --source="$DOC_DIR"/3.bib xapers search '*' >OUTPUT cat <EXPECTED id:2 [] {fake:1234} (bar) "When the liver meats the pavement" id:1 [doi:10.9999/FOO.1] {Good_Bad_Up_Down_Left_Right_et_al._2012} (foo) "Multicolor cavity sadness" EOF test_expect_equal_file OUTPUT EXPECTED test_begin_subtest 'import full bibtex with files' xapers import --tags=new all.bib xapers search '*' >OUTPUT cat <EXPECTED id:5 [arxiv:1235] {arxiv:1235} (new) "Creation of the γ-verses" id:4 [doi:10.9999/FOO.2] {30929234} (new) "The Circle and the Square: Forbidden Love" id:3 [doi:10.9999/FOO.3] {30929} (new) "Circle are Squares" id:2 [] {fake:1234} (bar new) "When the liver meats the pavement" id:1 [doi:10.9999/FOO.1] {Good_Bad_Up_Down_Left_Right_et_al._2012} (foo new) "Multicolor cavity sadness" EOF test_expect_equal_file OUTPUT EXPECTED test_begin_subtest 'search id:' xapers search id:5 >OUTPUT cat <EXPECTED id:5 [arxiv:1235] {arxiv:1235} (new) "Creation of the γ-verses" EOF test_expect_equal_file OUTPUT EXPECTED test_begin_subtest 'search bib:' xapers search key:30929234 >OUTPUT cat <EXPECTED id:4 
[doi:10.9999/FOO.2] {30929234} (new) "The Circle and the Square: Forbidden Love" EOF test_expect_equal_file OUTPUT EXPECTED test_begin_subtest 'search text' xapers search --output=summary lorem >OUTPUT cat <EXPECTED id:2 [] {fake:1234} (bar new) "When the liver meats the pavement" id:5 [arxiv:1235] {arxiv:1235} (new) "Creation of the γ-verses" EOF test_expect_equal_file OUTPUT EXPECTED test_begin_subtest 're-import produces identical results' xapers import --tags=new all.bib xapers search '*' >OUTPUT cat <EXPECTED id:5 [arxiv:1235] {arxiv:1235} (new) "Creation of the γ-verses" id:4 [doi:10.9999/FOO.2] {30929234} (new) "The Circle and the Square: Forbidden Love" id:3 [doi:10.9999/FOO.3] {30929} (new) "Circle are Squares" id:2 [] {fake:1234} (bar new) "When the liver meats the pavement" id:1 [doi:10.9999/FOO.1] {Good_Bad_Up_Down_Left_Right_et_al._2012} (foo new) "Multicolor cavity sadness" EOF test_expect_equal_file OUTPUT EXPECTED ################################################################ test_done xapers-0.5.2/test/sources000077500000000000000000000020661214264575400154260ustar00rootroot00000000000000#!/usr/bin/env bash test_description='sources.' . ./test-lib.sh ################################################################ # FIXME: add test for source2bib # FIXME: add test for scandoc test_begin_subtest 'source2bib doi' xapers source2bib 'doi:10.1364/JOSAA.29.002092' >OUTPUT cat <EXPECTED @article{ Izumi_Arai_Barr_Betzwieser_Brooks_Dahl_Doravari_Driggers_Korth_Miao_et_al._2012, author = "Izumi, Kiwamu and Arai, Koji and Barr, Bryan and Betzwieser, Joseph and Brooks, Aidan and Dahl, Katrin and Doravari, Suresh and Driggers, Jennifer C. and Korth, W. 
Zach and Miao, Haixing and et al.", publisher = "Optical Society of America", doi = "10.1364/JOSAA.29.002092", title = "Multicolor cavity metrology", url = "http://dx.doi.org/10.1364/JOSAA.29.002092", journal = "Journal of the Optical Society of America A", number = "10", month = "Sep", volume = "29", year = "2012", pages = "2092" } EOF test_expect_equal_file OUTPUT EXPECTED ################################################################ test_done xapers-0.5.2/test/test-aggregate-results000077500000000000000000000027561214264575400203530ustar00rootroot00000000000000#!/usr/bin/env bash fixed=0 success=0 failed=0 broken=0 total=0 for file do while read type value do case $type in '') continue ;; fixed) fixed=$(($fixed + $value)) ;; success) success=$(($success + $value)) ;; failed) failed=$(($failed + $value)) ;; broken) broken=$(($broken + $value)) ;; total) total=$(($total + $value)) ;; esac done <"$file" done pluralize () { case $2 in 1) case $1 in test) echo test ;; failure) echo failure ;; esac ;; *) case $1 in test) echo tests ;; failure) echo failures ;; esac ;; esac } echo "Xapers test suite complete." if [ "$fixed" = "0" ] && [ "$failed" = "0" ]; then tests=$(pluralize "test" $total) printf "All $total $tests " if [ "$broken" = "0" ]; then echo "passed." else failures=$(pluralize "failure" $broken) echo "behaved as expected ($broken expected $failures)." fi; else echo "$success/$total tests passed." if [ "$broken" != "0" ]; then tests=$(pluralize "test" $broken) echo "$broken broken $tests failed as expected." fi if [ "$fixed" != "0" ]; then tests=$(pluralize "test" $fixed) echo "$fixed broken $tests now fixed." fi if [ "$failed" != "0" ]; then tests=$(pluralize "test" $failed) echo "$failed $tests failed." fi fi skipped=$(($total - $fixed - $success - $failed - $broken)) if [ "$skipped" != "0" ]; then tests=$(pluralize "test" $skipped) echo "$skipped $tests skipped." 
fi xapers-0.5.2/test/test-lib.sh000066400000000000000000000444251214264575400161010ustar00rootroot00000000000000# # Copyright (c) 2005 Junio C Hamano # # This program is free software: you can redistribute it and/or modify # it under the terms of the GNU General Public License as published by # the Free Software Foundation, either version 2 of the License, or # (at your option) any later version. # # This program is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU General Public License for more details. # # You should have received a copy of the GNU General Public License # along with this program. If not, see http://www.gnu.org/licenses/ . if [ ${BASH_VERSINFO[0]} -lt 4 ]; then echo "Error: The notmuch test suite requires a bash version >= 4.0" echo "due to use of associative arrays within the test suite." echo "Please try again with a newer bash (or help us fix the" echo "test suite to be more portable). Thanks." exit 1 fi # if --tee was passed, write the output not only to the terminal, but # additionally to the file test-results/$BASENAME.out, too. case "$GIT_TEST_TEE_STARTED, $* " in done,*) # do not redirect again ;; *' --tee '*|*' --va'*) mkdir -p test-results BASE=test-results/$(basename "$0" .sh) (GIT_TEST_TEE_STARTED=done ${SHELL-sh} "$0" "$@" 2>&1; echo $? > $BASE.exit) | tee $BASE.out test "$(cat $BASE.exit)" = 0 exit ;; esac # Keep the original TERM for say_color and test_emacs ORIGINAL_TERM=$TERM # For repeatability, reset the environment to known value. 
LANG=C LC_ALL=C PAGER=cat TZ=UTC TERM=dumb export LANG LC_ALL PAGER TERM TZ GIT_TEST_CMP=${GIT_TEST_CMP:-diff -u} TEST_EMACS=${TEST_EMACS:-${EMACS:-emacs}} # Protect ourselves from common misconfiguration to export # CDPATH into the environment unset CDPATH unset GREP_OPTIONS # Convenience # # A regexp to match 5 and 40 hexdigits _x05='[0-9a-f][0-9a-f][0-9a-f][0-9a-f][0-9a-f]' _x40="$_x05$_x05$_x05$_x05$_x05$_x05$_x05$_x05" _x04='[0-9a-f][0-9a-f][0-9a-f][0-9a-f]' _x32="$_x04$_x04$_x04$_x04$_x04$_x04$_x04$_x04" # Each test should start with something like this, after copyright notices: # # test_description='Description of this test... # This test checks if command xyzzy does the right thing... # ' # . ./test-lib.sh [ "x$ORIGINAL_TERM" != "xdumb" ] && ( TERM=$ORIGINAL_TERM && export TERM && [ -t 1 ] && tput bold >/dev/null 2>&1 && tput setaf 1 >/dev/null 2>&1 && tput sgr0 >/dev/null 2>&1 ) && color=t while test "$#" -ne 0 do case "$1" in -d|--d|--de|--deb|--debu|--debug) debug=t; shift ;; -i|--i|--im|--imm|--imme|--immed|--immedi|--immedia|--immediat|--immediate) immediate=t; shift ;; -l|--l|--lo|--lon|--long|--long-|--long-t|--long-te|--long-tes|--long-test|--long-tests) GIT_TEST_LONG=t; export GIT_TEST_LONG; shift ;; -h|--h|--he|--hel|--help) help=t; shift ;; -v|--v|--ve|--ver|--verb|--verbo|--verbos|--verbose) verbose=t; shift ;; -q|--q|--qu|--qui|--quie|--quiet) quiet=t; shift ;; --with-dashes) with_dashes=t; shift ;; --no-color) color=; shift ;; --no-python) # noop now... 
shift ;; --va|--val|--valg|--valgr|--valgri|--valgrin|--valgrind) valgrind=t; verbose=t; shift ;; --tee) shift ;; # was handled already --root=*) root=$(expr "z$1" : 'z[^=]*=\(.*\)') shift ;; *) echo "error: unknown test option '$1'" >&2; exit 1 ;; esac done if test -n "$debug"; then print_subtest () { printf " %-4s" "[$((test_count - 1))]" } else print_subtest () { true } fi if test -n "$color"; then say_color () { ( TERM=$ORIGINAL_TERM export TERM case "$1" in error) tput bold; tput setaf 1;; # bold red skip) tput bold; tput setaf 2;; # bold green pass) tput setaf 2;; # green info) tput setaf 3;; # brown *) test -n "$quiet" && return;; esac shift printf " " printf "$@" tput sgr0 print_subtest ) } else say_color() { test -z "$1" && test -n "$quiet" && return shift printf " " printf "$@" print_subtest } fi error () { say_color error "error: $*\n" GIT_EXIT_OK=t exit 1 } say () { say_color info "$*" } test "${test_description}" != "" || error "Test script did not set test_description." if test "$help" = "t" then echo "Tests ${test_description}" exit 0 fi echo $(basename "$0"): "Testing ${test_description}" exec 5>&1 test_failure=0 test_count=0 test_fixed=0 test_broken=0 test_success=0 die () { code=$? rm -rf "$TEST_TMPDIR" if test -n "$GIT_EXIT_OK" then exit $code else echo >&5 "FATAL: Unexpected exit with code $code" exit 1 fi } GIT_EXIT_OK= # Note: TEST_TMPDIR *NOT* exported! 
TEST_TMPDIR=$(mktemp -d "${TMPDIR:-/tmp}/test-$$.XXXXXX") trap 'die' EXIT test_decode_color () { sed -e 's/.\[1m//g' \ -e 's/.\[31m//g' \ -e 's/.\[32m//g' \ -e 's/.\[33m//g' \ -e 's/.\[34m//g' \ -e 's/.\[35m//g' \ -e 's/.\[36m//g' \ -e 's/.\[m//g' } q_to_nul () { perl -pe 'y/Q/\000/' } q_to_cr () { tr Q '\015' } append_cr () { sed -e 's/$/Q/' | tr Q '\015' } remove_cr () { tr '\015' Q | sed -e 's/Q$//' } test_begin_subtest () { if [ -n "$inside_subtest" ]; then exec 1>&6 2>&7 # Restore stdout and stderr error "bug in test script: Missing test_expect_equal in ${BASH_SOURCE[1]}:${BASH_LINENO[0]}" fi test_subtest_name="$1" test_reset_state_ # Remember stdout and stderr file descriptors and redirect test # output to the previously prepared file descriptors 3 and 4 (see # below) if test "$verbose" != "t"; then exec 4>test.output 3>&4; fi exec 6>&1 7>&2 >&3 2>&4 inside_subtest=t } # Pass test if two arguments match # # Note: Unlike all other test_expect_* functions, this function does # not accept a test name. Instead, the caller should call # test_begin_subtest before calling this function in order to set the # name. test_expect_equal () { exec 1>&6 2>&7 # Restore stdout and stderr inside_subtest= test "$#" = 3 && { prereq=$1; shift; } || prereq= test "$#" = 2 || error "bug in the test script: not 2 or 3 parameters to test_expect_equal" output="$1" expected="$2" if ! test_skip "$test_subtest_name" then if [ "$output" = "$expected" ]; then test_ok_ "$test_subtest_name" else testname=$this_test.$test_count echo "$expected" > $testname.expected echo "$output" > $testname.output test_failure_ "$test_subtest_name" "$(diff -u $testname.expected $testname.output)" fi fi } # Like test_expect_equal, but takes two filenames. test_expect_equal_file () { exec 1>&6 2>&7 # Restore stdout and stderr inside_subtest= test "$#" = 3 && { prereq=$1; shift; } || prereq= test "$#" = 2 || error "bug in the test script: not 2 or 3 parameters to test_expect_equal" output="$1" expected="$2" if ! 
test_skip "$test_subtest_name" then if diff -q "$expected" "$output" >/dev/null ; then test_ok_ "$test_subtest_name" else testname=$this_test.$test_count cp "$output" $testname.output cp "$expected" $testname.expected test_failure_ "$test_subtest_name" "$(diff -u $testname.expected $testname.output)" fi fi } # Like test_expect_equal, but arguments are JSON expressions to be # canonicalized before diff'ing. If an argument cannot be parsed, it # is used unchanged so that there's something to diff against. test_expect_equal_json () { output=$(echo "$1" | python -mjson.tool || echo "$1") expected=$(echo "$2" | python -mjson.tool || echo "$2") shift 2 test_expect_equal "$output" "$expected" "$@" } # Use test_set_prereq to tell that a particular prerequisite is available. # The prerequisite can later be checked for in two ways: # # - Explicitly using test_have_prereq. # # - Implicitly by specifying the prerequisite tag in the calls to # test_expect_{success,failure,code}. # # The single parameter is the prerequisite tag (a simple word, in all # capital letters by convention). test_set_prereq () { satisfied="$satisfied$1 " } satisfied=" " test_have_prereq () { case $satisfied in *" $1 "*) : yes, have it ;; *) ! : nope ;; esac } # declare prerequisite for the given external binary test_declare_external_prereq () { binary="$1" test "$#" = 2 && name=$2 || name="$binary(1)" hash $binary 2>/dev/null || eval " test_missing_external_prereq_${binary}_=t $binary () { echo -n \"\$test_subtest_missing_external_prereqs_ \" | grep -qe \" $name \" || test_subtest_missing_external_prereqs_=\"\$test_subtest_missing_external_prereqs_ $name\" false }" } # Explicitly require external prerequisite. Useful when binary is # called indirectly (e.g. from emacs). # Returns success if dependency is available, failure otherwise. 
test_require_external_prereq () {
	binary="$1"
	if [ "$(eval echo -n \$test_missing_external_prereq_${binary}_)" = t ]; then
		# dependency is missing, call the replacement function to note it
		eval "$binary"
	else
		true
	fi
}

# You are not expected to call test_ok_ and test_failure_ directly, use
# the test_expect_* functions instead.

test_ok_ () {
	if test "$test_subtest_known_broken_" = "t"; then
		test_known_broken_ok_ "$@"
		return
	fi
	test_success=$(($test_success + 1))
	say_color pass "%-6s" "PASS"
	echo " $@"
}

test_failure_ () {
	if test "$test_subtest_known_broken_" = "t"; then
		test_known_broken_failure_ "$@"
		return
	fi
	test_failure=$(($test_failure + 1))
	test_failure_message_ "FAIL" "$@"
	test "$immediate" = "" || { GIT_EXIT_OK=t; exit 1; }
	return 1
}

test_failure_message_ () {
	say_color error "%-6s" "$1"
	echo " $2"
	shift 2
	echo "$@" | sed -e 's/^/	/'
	if test "$verbose" != "t"; then cat test.output; fi
}

test_known_broken_ok_ () {
	test_reset_state_
	test_fixed=$(($test_fixed+1))
	say_color pass "%-6s" "FIXED"
	echo " $@"
}

test_known_broken_failure_ () {
	test_reset_state_
	test_broken=$(($test_broken+1))
	test_failure_message_ "BROKEN" "$@"
	return 1
}

test_debug () {
	test "$debug" = "" || eval "$1"
}

test_run_ () {
	test_cleanup=:
	if test "$verbose" != "t"; then exec 4>test.output 3>&4; fi
	eval >&3 2>&4 "$1"
	eval_ret=$?
	eval >&3 2>&4 "$test_cleanup"
	return 0
}

test_skip () {
	test_count=$(($test_count+1))
	to_skip=
	for skp in $XAPERS_SKIP_TESTS
	do
		case $this_test.$test_count in
		$skp)
			to_skip=t
		esac
	done
	if test -z "$to_skip" && test -n "$prereq" &&
	   ! test_have_prereq "$prereq"
	then
		to_skip=t
	fi
	case "$to_skip" in
	t)
		test_report_skip_ "$@"
		;;
	*)
		test_check_missing_external_prereqs_ "$@"
		;;
	esac
}

test_check_missing_external_prereqs_ () {
	if test -n "$test_subtest_missing_external_prereqs_"; then
		say_color skip >&1 "missing prerequisites:"
		echo "$test_subtest_missing_external_prereqs_" >&1
		test_report_skip_ "$@"
	else
		false
	fi
}

test_report_skip_ () {
	test_reset_state_
	say_color skip >&3 "skipping test:"
	echo " $@" >&3
	say_color skip "%-6s" "SKIP"
	echo " $1"
}

test_subtest_known_broken () {
	test_subtest_known_broken_=t
}

test_expect_success () {
	test "$#" = 3 && { prereq=$1; shift; } || prereq=
	test "$#" = 2 ||
	error "bug in the test script: not 2 or 3 parameters to test-expect-success"
	test_reset_state_
	if ! test_skip "$@"
	then
		test_run_ "$2"
		run_ret="$?"
		# test_run_ may update missing external prerequisites
		test_check_missing_external_prereqs_ "$@" ||
		if [ "$run_ret" = 0 -a "$eval_ret" = 0 ]
		then
			test_ok_ "$1"
		else
			test_failure_ "$@"
		fi
	fi
}

test_expect_code () {
	test "$#" = 4 && { prereq=$1; shift; } || prereq=
	test "$#" = 3 ||
	error "bug in the test script: not 3 or 4 parameters to test-expect-code"
	test_reset_state_
	if ! test_skip "$@"
	then
		test_run_ "$3"
		run_ret="$?"
		# test_run_ may update missing external prerequisites
		test_check_missing_external_prereqs_ "$@" ||
		if [ "$run_ret" = 0 -a "$eval_ret" = "$1" ]
		then
			test_ok_ "$2"
		else
			test_failure_ "$@"
		fi
	fi
}

# test_external runs external test scripts that provide continuous
# test output about their progress, and succeeds/fails on
# zero/non-zero exit code.  It outputs the test output on stdout even
# in non-verbose mode, and announces the external script with "* run
# <n>: ..." before running it.  When providing relative paths, keep in
# mind that all scripts run in "trash directory".
# Usage: test_external description command arguments...
# Example: test_external 'Perl API' perl ../path/to/test.pl
test_external () {
	test "$#" = 4 && { prereq=$1; shift; } || prereq=
	test "$#" = 3 ||
	error >&5 "bug in the test script: not 3 or 4 parameters to test_external"
	descr="$1"
	shift
	test_reset_state_
	if ! test_skip "$descr" "$@"
	then
		# Announce the script to reduce confusion about the
		# test output that follows.
		say_color "" " run $test_count: $descr ($*)"
		# Run command; redirect its stderr to &4 as in
		# test_run_, but keep its stdout on our stdout even in
		# non-verbose mode.
		"$@" 2>&4
		if [ "$?" = 0 ]
		then
			test_ok_ "$descr"
		else
			test_failure_ "$descr" "$@"
		fi
	fi
}

# Like test_external, but in addition tests that the command generated
# no output on stderr.
test_external_without_stderr () {
	# The temporary file has no (and must have no) security
	# implications.
	tmp="$TMPDIR"; if [ -z "$tmp" ]; then tmp=/tmp; fi
	stderr="$tmp/git-external-stderr.$$.tmp"
	test_external "$@" 4> "$stderr"
	[ -f "$stderr" ] || error "Internal error: $stderr disappeared."
	descr="no stderr: $1"
	shift
	if [ ! -s "$stderr" ]; then
		rm "$stderr"
		test_ok_ "$descr"
	else
		if [ "$verbose" = t ]; then
			output=`echo; echo Stderr is:; cat "$stderr"`
		else
			output=
		fi
		# rm first in case test_failure exits.
		rm "$stderr"
		test_failure_ "$descr" "$@" "$output"
	fi
}

# This is not among top-level (test_expect_success)
# but is a prefix that can be used in the test script, like:
#
#	test_expect_success 'complain and die' '
#	    do something &&
#	    do something else &&
#	    test_must_fail git checkout ../outerspace
#	'
#
# Writing this as "! git checkout ../outerspace" is wrong, because
# the failure could be due to a segv.  We want a controlled failure.

test_must_fail () {
	"$@"
	test $? -gt 0 -a $? -le 129 -o $? -gt 192
}

# test_cmp is a helper function to compare actual and expected output.
# You can use it like:
#
#	test_expect_success 'foo works' '
#	    echo expected >expected &&
#	    foo >actual &&
#	    test_cmp expected actual
#	'
#
# This could be written as either "cmp" or "diff -u", but:
# - cmp's output is not nearly as easy to read as diff -u
# - not all diff versions understand "-u"

test_cmp() {
	$GIT_TEST_CMP "$@"
}

# This function can be used to schedule some commands to be run
# unconditionally at the end of the test to restore sanity:
#
#	test_expect_success 'test core.capslock' '
#		git config core.capslock true &&
#		test_when_finished "git config --unset core.capslock" &&
#		hello world
#	'
#
# That would be roughly equivalent to
#
#	test_expect_success 'test core.capslock' '
#		git config core.capslock true &&
#		hello world
#		git config --unset core.capslock
#	'
#
# except that the greeting and config --unset must both succeed for
# the test to pass.
test_when_finished () {
	test_cleanup="{ $*
		} && (exit \"\$eval_ret\"); eval_ret=\$?; $test_cleanup"
}

test_done () {
	GIT_EXIT_OK=t
	test_results_dir="$TEST_DIRECTORY/test-results"
	mkdir -p "$test_results_dir"
	test_results_path="$test_results_dir/${0%.sh}-$$"

	echo "total $test_count" >> $test_results_path
	echo "success $test_success" >> $test_results_path
	echo "fixed $test_fixed" >> $test_results_path
	echo "broken $test_broken" >> $test_results_path
	echo "failed $test_failure" >> $test_results_path
	echo "" >> $test_results_path

	echo

	[ -n "$EMACS_SERVER" ] && test_emacs '(kill-emacs)'

	if [ "$test_failure" = "0" ]; then
		if [ "$test_broken" = "0" ]; then
			rm -rf "$remove_tmp"
		fi
		exit 0
	else
		exit 1
	fi
}

test_python() {
	export LD_LIBRARY_PATH=$TEST_DIRECTORY/../lib
	export PYTHONPATH=$TEST_DIRECTORY/../bindings/python

	# Some distros (e.g. Arch Linux) ship Python 2.* as /usr/bin/python2,
	# most others as /usr/bin/python. So first try python2, and fallback to
	# python if python2 doesn't exist.
	cmd=python2
	[[ "$test_missing_external_prereq_python2_" = t ]] && cmd=python

	(echo "import sys; _orig_stdout=sys.stdout; sys.stdout=open('OUTPUT', 'w')"; cat) \
		| $cmd -
}

test_reset_state_ () {
	test -z "$test_init_done_" && test_init_

	test_subtest_known_broken_=
	test_subtest_missing_external_prereqs_=
}

# called once before the first subtest
test_init_ () {
	test_init_done_=t

	# skip all tests if there were external prerequisites missing during init
	test_check_missing_external_prereqs_ "all tests in $this_test" && test_done
}


# Test the binaries we have just built.  The tests are kept in
# test/ subdirectory and are run in 'trash directory' subdirectory.
TEST_DIRECTORY=$(pwd)

export PATH

# Test repository
test="tmp.$(basename "$0" .sh)"

test -n "$root" && test="$root/$test"

case "$test" in
/*) TMP_DIRECTORY="$test" ;;
 *) TMP_DIRECTORY="$TEST_DIRECTORY/$test" ;;
esac
test ! -z "$debug" || remove_tmp=$TMP_DIRECTORY

rm -fr "$test" || {
	GIT_EXIT_OK=t
	echo >&5 "FATAL: Cannot prepare test area"
	exit 1
}

mkdir -p "${test}"

# load local test library
. ./test-local.sh

# Use -P to resolve symlinks in our working directory so that the cwd
# in subprocesses like git equals our $PWD (for pathname comparisons).
cd -P "$test" || error "Cannot setup test environment"

if test "$verbose" = "t"
then
	exec 4>&2 3>&1
else
	exec 4>test.output 3>&4
fi

this_test=${0##*/}
for skp in $XAPERS_SKIP_TESTS
do
	to_skip=
	for skp in $XAPERS_SKIP_TESTS
	do
		case "$this_test" in
		$skp)
			to_skip=t
		esac
	done
	case "$to_skip" in
	t)
		say_color skip >&3 "skipping test $this_test altogether"
		say_color skip "skip all tests in $this_test"
		test_done
	esac
done

# Provide an implementation of the 'yes' utility
yes () {
	if test $# = 0
	then
		y=y
	else
		y="$*"
	fi

	while echo "$y"
	do
		:
	done
}

# Fix some commands on Windows
case $(uname -s) in
*MINGW*)
	# Windows has its own (incompatible) sort and find
	sort () {
		/usr/bin/sort "$@"
	}
	find () {
		/usr/bin/find "$@"
	}
	sum () {
		md5sum "$@"
	}
	# git sees Windows-style pwd
	pwd () {
		builtin pwd -W
	}
	# no POSIX permissions
	# backslashes in pathspec are converted to '/'
	# exec does not inherit the PID
	;;
*)
	test_set_prereq POSIXPERM
	test_set_prereq BSLASHPSPEC
	test_set_prereq EXECKEEPSPID
	;;
esac

test -z "$NO_PERL" && test_set_prereq PERL
test -z "$NO_PYTHON" && test_set_prereq PYTHON

# test whether the filesystem supports symbolic links
ln -s x y 2>/dev/null && test -h y 2>/dev/null && test_set_prereq SYMLINKS
rm -f y

xapers-0.5.2/test/test-local.sh

# declare prerequisites for external binaries used in tests
test_declare_external_prereq python
test_declare_external_prereq python2

export PATH="$TEST_DIRECTORY"/../bin:$PATH
export PYTHONPATH="$TEST_DIRECTORY"/../lib:$PYTHONPATH

export DOC_DIR="$TEST_DIRECTORY/docs"
export XAPERS_ROOT="$TMP_DIRECTORY/docs"

xapers-0.5.2/test/test-verbose

#!/usr/bin/env bash

test_description='the verbosity options of the test framework itself.'

. ./test-lib.sh

test_expect_success 'print something in test_expect_success and pass' '
	echo "hello stdout" &&
	echo "hello stderr" >&2 &&
	true
'

test_expect_success 'print something in test_expect_success and fail' '
	echo "hello stdout" &&
	echo "hello stderr" >&2 &&
	false
'

test_begin_subtest 'print something between test_begin_subtest and test_expect_equal and pass'
echo "hello stdout"
echo "hello stderr" >&2
test_expect_equal "a" "a"

test_begin_subtest 'print something test_begin_subtest and test_expect_equal and fail'
echo "hello stdout"
echo "hello stderr" >&2
test_expect_equal "a" "b"

test_done

xapers-0.5.2/test/test.expected-output/

xapers-0.5.2/test/test.expected-output/test-verbose-no

test-verbose: Testing the verbosity options of the test framework itself.
 PASS   print something in test_expect_success and pass
 FAIL   print something in test_expect_success and fail
	echo "hello stdout" &&
	echo "hello stderr" >&2 &&
	false
hello stdout
hello stderr
 PASS   print something between test_begin_subtest and test_expect_equal and pass
 FAIL   print something test_begin_subtest and test_expect_equal and fail
	--- test-verbose.4.expected	2010-11-14 21:41:12.738189710 +0000
	+++ test-verbose.4.output	2010-11-14 21:41:12.738189710 +0000
	@@ -1 +1 @@
	-b
	+a
hello stdout
hello stderr

xapers-0.5.2/test/test.expected-output/test-verbose-yes

test-verbose: Testing the verbosity options of the test framework itself.
hello stdout
hello stderr
 PASS   print something in test_expect_success and pass
hello stdout
hello stderr
 FAIL   print something in test_expect_success and fail
	echo "hello stdout" &&
	echo "hello stderr" >&2 &&
	false
hello stdout
hello stderr
hello stdout
hello stderr
 PASS   print something between test_begin_subtest and test_expect_equal and pass
hello stdout
hello stderr
 FAIL   print something test_begin_subtest and test_expect_equal and fail
	--- test-verbose.4.expected	2010-11-14 21:41:06.650023289 +0000
	+++ test-verbose.4.output	2010-11-14 21:41:06.650023289 +0000
	@@ -1 +1 @@
	-b
	+a

xapers-0.5.2/test/xapers-test

#!/usr/bin/env bash
# Run tests
#
# Copyright (c) 2005 Junio C Hamano
#
# Adapted from a Makefile to a shell script by Carl Worth (2010)

if [ ${BASH_VERSINFO[0]} -lt 4 ]; then
    echo "Error: The Xapers test suite requires a bash version >= 4.0"
    echo "due to use of associative arrays within the test suite."
    echo "Please try again with a newer bash (or help us fix the"
    echo "test suite to be more portable). Thanks."
    exit 1
fi

cd $(dirname "$0")

TESTS="
  basic
  all
  import
"
TESTS_NET="
  sources
"

if [ "$XAPERS_TEST_NET" ] ; then
    TESTS="$TESTS $TESTS_NET "
fi

# setup
TESTS=${XAPERS_TESTS:=$TESTS}

# Clean up any results from a previous run
rm -rf test-results docs/.xapers

# test for timeout utility
if command -v timeout >/dev/null; then
    TEST_TIMEOUT_CMD="timeout 2m "
    echo "INFO: using 2 minute timeout for tests"
else
    TEST_TIMEOUT_CMD=""
fi

trap 'e=$?; kill $!; exit $e' HUP INT TERM

# Run the tests
for test in $TESTS; do
    $TEST_TIMEOUT_CMD ./$test "$@" &
    wait $!
done

trap - HUP INT TERM

# Report results
./test-aggregate-results test-results/*

# Clean up
rm -rf test-result
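
The prerequisite machinery in test-lib.sh (`test_set_prereq` / `test_have_prereq`) relies on a classic POSIX-sh idiom: tags live in one space-padded string, and membership is checked with a `case` pattern so only whole words match. The following standalone sketch demonstrates that idiom in isolation; the names `add_tag`/`have_tag` are illustrative and not part of the test library.

```shell
#!/bin/sh
# Space-delimited "set" idiom, as used by test_set_prereq/test_have_prereq.
# The set is a single string with a space before and after every tag, so
# the case pattern *" $1 "* matches whole words only: "PER" must not
# match an entry "PERL".  (add_tag/have_tag are illustrative names.)
satisfied=" "

add_tag () {
	satisfied="$satisfied$1 "
}

have_tag () {
	case $satisfied in
	*" $1 "*) return 0 ;;
	*) return 1 ;;
	esac
}

add_tag PERL
add_tag PYTHON
have_tag PERL && echo "PERL available"
have_tag PER  || echo "PER not found (substrings do not match)"
```

Running the sketch prints "PERL available" followed by "PER not found (substrings do not match)"; the trade-off versus a bash associative array is that this works in any POSIX shell, which is why git-derived test harnesses favored it.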