slowmovideo-0.5+git20180116/0000755000000000000000000000000013417566411014035 5ustar rootroot
slowmovideo-0.5+git20180116/.gitignore0000664000000000000000000000033413151342440016014 0ustar rootroot
*~
*.txt.user
install
/*.png
/slowmoVideo*.bz2
flowScripts/*.avi
flowScripts/*.pyc
clips
*/build
slowmoVideo/docs/html
slowmoVideo/docs/latex
img/*.sVflow
qtcreator*
material/*.blend1
material/*.blend2
.DS_Store
slowmovideo-0.5+git20180116/slowmoVideo.spec0000664000000000000000000000507513151342440017216 0ustar rootroot
Name: slowmoVideo
Version: 0.3.1
Release: 1%{?dist}
Summary: Creates slow-motion videos from your footage

Group: Applications/Multimedia
License: GPLv3
URL: http://slowmovideo.granjow.net/
Source0: http://slowmovideo.granjow.net/builds/%{name}-sources-v0.3+2d2b352.tar.bz2
Patch0: %{name}-%{version}-rpm.patch
BuildRoot: %{_tmppath}/%{name}-%{version}-%{release}-root-%(%{__id_u} -n)

BuildRequires: cmake
BuildRequires: ffmpeg-devel
BuildRequires: qt-devel
BuildRequires: gcc-c++
BuildRequires: glew-devel
BuildRequires: glut-devel
BuildRequires: SDL-devel
BuildRequires: libpng-devel
BuildRequires: libjpeg-devel
BuildRequires: opencv-devel
Requires: ffmpeg
Requires: qt
Requires: glew
Requires: glut
Requires: SDL
Requires: libpng
Requires: libjpeg
Requires: opencv

%description
slowmoVideo is an open source program that creates slow-motion videos from your footage.

%package devel
Summary: Development libraries for %{name}
Requires: %{name}%{_isa} = %{version}-%{release}

%description devel
slowmoVideo is an open source program that creates slow-motion videos from your footage. This package contains development files.

%prep
%setup -q -c "%{name}-%{version}"
%patch0 -p1

%build
mkdir -p %{_target_platform}-1
pushd %{_target_platform}-1
%{cmake} -D CMAKE_INSTALL_PREFIX:STRING=%{_prefix} \
 -D CMAKE_BUILD_TYPE:STRING=Release \
 -D ENABLE_TESTS:BOOL=false ../slowmoVideo
popd
make %{?_smp_mflags} -C %{_target_platform}-1

mkdir -p %{_target_platform}-2
pushd %{_target_platform}-2
%{cmake} -D CMAKE_INSTALL_PREFIX:STRING=%{_prefix} \
 -D CMAKE_BUILD_TYPE:STRING=Release \
 -D BUILD_INCLUDE_DIR:STRING=../slowmoVideo/lib \
 -D BUILD_LIB_DIR:STRING=../%{_target_platform}-1/lib ../V3D
popd
make %{?_smp_mflags} -C %{_target_platform}-2

%install
rm -rf $RPM_BUILD_ROOT
make install/fast DESTDIR=$RPM_BUILD_ROOT -C %{_target_platform}-1
make install/fast DESTDIR=$RPM_BUILD_ROOT -C %{_target_platform}-2

%clean
rm -rf $RPM_BUILD_ROOT

%files
%defattr(-,root,root,-)
%doc README.md todo.org
%{_bindir}/slowmoFlowEdit
%{_bindir}/slowmoInfo
%{_bindir}/slowmoInterpolate
%{_bindir}/slowmoRenderer
%{_bindir}/slowmoUI
%{_bindir}/slowmoVideoInfo
%{_bindir}/slowmoVisualizeFlow
%{_bindir}/slowmoFlowBuilder
%{_libdir}/libV3D.so

%files devel
%defattr(-,root,root,-)
%{_includedir}/%{name}/flowField_sV.h
%{_includedir}/%{name}/flowRW_sV.h
%{_includedir}/%{name}/flowTools_sV.h
%{_libdir}/%{name}/libsVflow.a

%changelog
* Tue Feb 5 2013 Steven Boswell
- Initial .spec file
slowmovideo-0.5+git20180116/fixup_osx_lib.sh0000664000000000000000000000273613151342440017242 0ustar rootroot
#!/bin/bash
# Script to fix up additional libraries on OS X
prog=$1.app/Contents/MacOS/$1
lib=$1.app/Contents/Frameworks/lib

function fixup_dylib {
    if test -h $1 ; then
        echo "Skipping $1 because it is a symlink"
        return 0
    fi
    dy_check=$(otool -D $1 | tail -n 1 | grep "is not an object file")
    if [[ $dy_check != "" ]] ; then
        echo "Skipping $1 because it is not a library"
        return 0
    fi
    # get dylib name
    thislib=$(otool -D $1 | tail -1)
    thisLibraryFixed=$(echo "$thislib" | sed "s/^lib/@executable_path\/..\/Frameworks\/lib/g")

    # fixing ref lib
    echo "fixing referencing libs"
    sharedLib=$(otool -L $1 | grep -v executable | grep -v "/usr/lib" | grep -v "/System/Library" | awk '{print $1}')
    for lib in $sharedLib
    do
        newlibrary=$(echo $lib | sed "s/^lib/@executable_path\/..\/Frameworks\/lib/g")
        install_name_tool -change $lib $newlibrary $1
    done

    # fixing id
    echo "making execute relative : $thislib $thisLibraryFixed"
    install_name_tool -id $thisLibraryFixed $1
    return 1
}

function fixup_exe {
    if test -h $1 ; then
        echo "Skipping $1 because it is a symlink"
        return 0
    fi
    sharedLib=$(otool -L $1 | grep -v executable | grep -v "/usr/lib" | grep -v "/System/Library" | awk '{print $1}')
    for lib in $sharedLib
    do
        newlibrary=$(echo $lib | sed "s/^lib/@executable_path\/..\/Frameworks\/lib/g")
        install_name_tool -change $lib $newlibrary $1
    done
    return 1
}

for lib in $lib/*
do
    fixup_dylib "$lib"
done
fixup_exe "$prog"
slowmovideo-0.5+git20180116/fixup_qt_osx.sh0000664000000000000000000000133513151342440017112 0ustar rootroot
#!/bin/bash
# fixup loader for OSX
install_name_tool -change /Users/val/Documents/Sources/qt4/lib/QtScript.framework/Versions/4/QtScript @executable_path/../Frameworks/QtScript.framework/Versions/4/QtScript slowmoInfo
install_name_tool -change /Users/val/Documents/Sources/qt4/lib/QtCore.framework/Versions/4/QtCore @executable_path/../Frameworks/QtCore.framework/Versions/4/QtCore slowmoInfo
install_name_tool -change /Users/val/Documents/Sources/qt4/lib/QtTest.framework/Versions/4/QtTest @executable_path/../Frameworks/QtTest.framework/Versions/4/QtTest slowmoInfo
install_name_tool -change /Users/val/Documents/Sources/qt4/lib/QtXml.framework/Versions/4/QtXml @executable_path/../Frameworks/QtXml.framework/Versions/4/QtXml slowmoInfo
slowmovideo-0.5+git20180116/LICENSE.md0000664000000000000000000010451313151342440015434 0ustar rootroot
GNU GENERAL PUBLIC LICENSE
Version 3, 29 June 2007

Copyright (C) 2007 Free Software Foundation, Inc. Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed.

Preamble

The GNU General Public License is a free, copyleft license for software and other kinds of works.

The licenses for most software and other practical works are designed to take away your freedom to share and change the works. By contrast, the GNU General Public License is intended to guarantee your freedom to share and change all versions of a program--to make sure it remains free software for all its users. We, the Free Software Foundation, use the GNU General Public License for most of our software; it applies also to any other work released this way by its authors. You can apply it to your programs, too.

When we speak of free software, we are referring to freedom, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for them if you wish), that you receive source code or can get it if you want it, that you can change the software or use pieces of it in new free programs, and that you know you can do these things.

To protect your rights, we need to prevent others from denying you these rights or asking you to surrender the rights. Therefore, you have certain responsibilities if you distribute copies of the software, or if you modify it: responsibilities to respect the freedom of others.
For example, if you distribute copies of such a program, whether gratis or for a fee, you must pass on to the recipients the same freedoms that you received. You must make sure that they, too, receive or can get the source code. And you must show them these terms so they know their rights. Developers that use the GNU GPL protect your rights with two steps: (1) assert copyright on the software, and (2) offer you this License giving you legal permission to copy, distribute and/or modify it. For the developers' and authors' protection, the GPL clearly explains that there is no warranty for this free software. For both users' and authors' sake, the GPL requires that modified versions be marked as changed, so that their problems will not be attributed erroneously to authors of previous versions. Some devices are designed to deny users access to install or run modified versions of the software inside them, although the manufacturer can do so. This is fundamentally incompatible with the aim of protecting users' freedom to change the software. The systematic pattern of such abuse occurs in the area of products for individuals to use, which is precisely where it is most unacceptable. Therefore, we have designed this version of the GPL to prohibit the practice for those products. If such problems arise substantially in other domains, we stand ready to extend this provision to those domains in future versions of the GPL, as needed to protect the freedom of users. Finally, every program is threatened constantly by software patents. States should not allow patents to restrict development and use of software on general-purpose computers, but in those that do, we wish to avoid the special danger that patents applied to a free program could make it effectively proprietary. To prevent this, the GPL assures that patents cannot be used to render the program non-free. The precise terms and conditions for copying, distribution and modification follow. TERMS AND CONDITIONS 0. Definitions. "This License" refers to version 3 of the GNU General Public License. "Copyright" also means copyright-like laws that apply to other kinds of works, such as semiconductor masks. "The Program" refers to any copyrightable work licensed under this License. Each licensee is addressed as "you". "Licensees" and "recipients" may be individuals or organizations. To "modify" a work means to copy from or adapt all or part of the work in a fashion requiring copyright permission, other than the making of an exact copy. The resulting work is called a "modified version" of the earlier work or a work "based on" the earlier work. A "covered work" means either the unmodified Program or a work based on the Program. To "propagate" a work means to do anything with it that, without permission, would make you directly or secondarily liable for infringement under applicable copyright law, except executing it on a computer or modifying a private copy. Propagation includes copying, distribution (with or without modification), making available to the public, and in some countries other activities as well. To "convey" a work means any kind of propagation that enables other parties to make or receive copies. Mere interaction with a user through a computer network, with no transfer of a copy, is not conveying. 
An interactive user interface displays "Appropriate Legal Notices" to the extent that it includes a convenient and prominently visible feature that (1) displays an appropriate copyright notice, and (2) tells the user that there is no warranty for the work (except to the extent that warranties are provided), that licensees may convey the work under this License, and how to view a copy of this License. If the interface presents a list of user commands or options, such as a menu, a prominent item in the list meets this criterion. 1. Source Code. The "source code" for a work means the preferred form of the work for making modifications to it. "Object code" means any non-source form of a work. A "Standard Interface" means an interface that either is an official standard defined by a recognized standards body, or, in the case of interfaces specified for a particular programming language, one that is widely used among developers working in that language. The "System Libraries" of an executable work include anything, other than the work as a whole, that (a) is included in the normal form of packaging a Major Component, but which is not part of that Major Component, and (b) serves only to enable use of the work with that Major Component, or to implement a Standard Interface for which an implementation is available to the public in source code form. A "Major Component", in this context, means a major essential component (kernel, window system, and so on) of the specific operating system (if any) on which the executable work runs, or a compiler used to produce the work, or an object code interpreter used to run it. The "Corresponding Source" for a work in object code form means all the source code needed to generate, install, and (for an executable work) run the object code and to modify the work, including scripts to control those activities. However, it does not include the work's System Libraries, or general-purpose tools or generally available free programs which are used unmodified in performing those activities but which are not part of the work. For example, Corresponding Source includes interface definition files associated with source files for the work, and the source code for shared libraries and dynamically linked subprograms that the work is specifically designed to require, such as by intimate data communication or control flow between those subprograms and other parts of the work. The Corresponding Source need not include anything that users can regenerate automatically from other parts of the Corresponding Source. The Corresponding Source for a work in source code form is that same work. 2. Basic Permissions. All rights granted under this License are granted for the term of copyright on the Program, and are irrevocable provided the stated conditions are met. This License explicitly affirms your unlimited permission to run the unmodified Program. The output from running a covered work is covered by this License only if the output, given its content, constitutes a covered work. This License acknowledges your rights of fair use or other equivalent, as provided by copyright law. You may make, run and propagate covered works that you do not convey, without conditions so long as your license otherwise remains in force. 
You may convey covered works to others for the sole purpose of having them make modifications exclusively for you, or provide you with facilities for running those works, provided that you comply with the terms of this License in conveying all material for which you do not control copyright. Those thus making or running the covered works for you must do so exclusively on your behalf, under your direction and control, on terms that prohibit them from making any copies of your copyrighted material outside their relationship with you. Conveying under any other circumstances is permitted solely under the conditions stated below. Sublicensing is not allowed; section 10 makes it unnecessary. 3. Protecting Users' Legal Rights From Anti-Circumvention Law. No covered work shall be deemed part of an effective technological measure under any applicable law fulfilling obligations under article 11 of the WIPO copyright treaty adopted on 20 December 1996, or similar laws prohibiting or restricting circumvention of such measures. When you convey a covered work, you waive any legal power to forbid circumvention of technological measures to the extent such circumvention is effected by exercising rights under this License with respect to the covered work, and you disclaim any intention to limit operation or modification of the work as a means of enforcing, against the work's users, your or third parties' legal rights to forbid circumvention of technological measures. 4. Conveying Verbatim Copies. You may convey verbatim copies of the Program's source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice; keep intact all notices stating that this License and any non-permissive terms added in accord with section 7 apply to the code; keep intact all notices of the absence of any warranty; and give all recipients a copy of this License along with the Program. You may charge any price or no price for each copy that you convey, and you may offer support or warranty protection for a fee. 5. Conveying Modified Source Versions. You may convey a work based on the Program, or the modifications to produce it from the Program, in the form of source code under the terms of section 4, provided that you also meet all of these conditions: a) The work must carry prominent notices stating that you modified it, and giving a relevant date. b) The work must carry prominent notices stating that it is released under this License and any conditions added under section 7. This requirement modifies the requirement in section 4 to "keep intact all notices". c) You must license the entire work, as a whole, under this License to anyone who comes into possession of a copy. This License will therefore apply, along with any applicable section 7 additional terms, to the whole of the work, and all its parts, regardless of how they are packaged. This License gives no permission to license the work in any other way, but it does not invalidate such permission if you have separately received it. d) If the work has interactive user interfaces, each must display Appropriate Legal Notices; however, if the Program has interactive interfaces that do not display Appropriate Legal Notices, your work need not make them do so. 
A compilation of a covered work with other separate and independent works, which are not by their nature extensions of the covered work, and which are not combined with it such as to form a larger program, in or on a volume of a storage or distribution medium, is called an "aggregate" if the compilation and its resulting copyright are not used to limit the access or legal rights of the compilation's users beyond what the individual works permit. Inclusion of a covered work in an aggregate does not cause this License to apply to the other parts of the aggregate. 6. Conveying Non-Source Forms. You may convey a covered work in object code form under the terms of sections 4 and 5, provided that you also convey the machine-readable Corresponding Source under the terms of this License, in one of these ways: a) Convey the object code in, or embodied in, a physical product (including a physical distribution medium), accompanied by the Corresponding Source fixed on a durable physical medium customarily used for software interchange. b) Convey the object code in, or embodied in, a physical product (including a physical distribution medium), accompanied by a written offer, valid for at least three years and valid for as long as you offer spare parts or customer support for that product model, to give anyone who possesses the object code either (1) a copy of the Corresponding Source for all the software in the product that is covered by this License, on a durable physical medium customarily used for software interchange, for a price no more than your reasonable cost of physically performing this conveying of source, or (2) access to copy the Corresponding Source from a network server at no charge. c) Convey individual copies of the object code with a copy of the written offer to provide the Corresponding Source. This alternative is allowed only occasionally and noncommercially, and only if you received the object code with such an offer, in accord with subsection 6b. d) Convey the object code by offering access from a designated place (gratis or for a charge), and offer equivalent access to the Corresponding Source in the same way through the same place at no further charge. You need not require recipients to copy the Corresponding Source along with the object code. If the place to copy the object code is a network server, the Corresponding Source may be on a different server (operated by you or a third party) that supports equivalent copying facilities, provided you maintain clear directions next to the object code saying where to find the Corresponding Source. Regardless of what server hosts the Corresponding Source, you remain obligated to ensure that it is available for as long as needed to satisfy these requirements. e) Convey the object code using peer-to-peer transmission, provided you inform other peers where the object code and Corresponding Source of the work are being offered to the general public at no charge under subsection 6d. A separable portion of the object code, whose source code is excluded from the Corresponding Source as a System Library, need not be included in conveying the object code work. A "User Product" is either (1) a "consumer product", which means any tangible personal property which is normally used for personal, family, or household purposes, or (2) anything designed or sold for incorporation into a dwelling. In determining whether a product is a consumer product, doubtful cases shall be resolved in favor of coverage. 
For a particular product received by a particular user, "normally used" refers to a typical or common use of that class of product, regardless of the status of the particular user or of the way in which the particular user actually uses, or expects or is expected to use, the product. A product is a consumer product regardless of whether the product has substantial commercial, industrial or non-consumer uses, unless such uses represent the only significant mode of use of the product. "Installation Information" for a User Product means any methods, procedures, authorization keys, or other information required to install and execute modified versions of a covered work in that User Product from a modified version of its Corresponding Source. The information must suffice to ensure that the continued functioning of the modified object code is in no case prevented or interfered with solely because modification has been made. If you convey an object code work under this section in, or with, or specifically for use in, a User Product, and the conveying occurs as part of a transaction in which the right of possession and use of the User Product is transferred to the recipient in perpetuity or for a fixed term (regardless of how the transaction is characterized), the Corresponding Source conveyed under this section must be accompanied by the Installation Information. But this requirement does not apply if neither you nor any third party retains the ability to install modified object code on the User Product (for example, the work has been installed in ROM). The requirement to provide Installation Information does not include a requirement to continue to provide support service, warranty, or updates for a work that has been modified or installed by the recipient, or for the User Product in which it has been modified or installed. Access to a network may be denied when the modification itself materially and adversely affects the operation of the network or violates the rules and protocols for communication across the network. Corresponding Source conveyed, and Installation Information provided, in accord with this section must be in a format that is publicly documented (and with an implementation available to the public in source code form), and must require no special password or key for unpacking, reading or copying. 7. Additional Terms. "Additional permissions" are terms that supplement the terms of this License by making exceptions from one or more of its conditions. Additional permissions that are applicable to the entire Program shall be treated as though they were included in this License, to the extent that they are valid under applicable law. If additional permissions apply only to part of the Program, that part may be used separately under those permissions, but the entire Program remains governed by this License without regard to the additional permissions. When you convey a copy of a covered work, you may at your option remove any additional permissions from that copy, or from any part of it. (Additional permissions may be written to require their own removal in certain cases when you modify the work.) You may place additional permissions on material, added by you to a covered work, for which you have or can give appropriate copyright permission. 
Notwithstanding any other provision of this License, for material you add to a covered work, you may (if authorized by the copyright holders of that material) supplement the terms of this License with terms: a) Disclaiming warranty or limiting liability differently from the terms of sections 15 and 16 of this License; or b) Requiring preservation of specified reasonable legal notices or author attributions in that material or in the Appropriate Legal Notices displayed by works containing it; or c) Prohibiting misrepresentation of the origin of that material, or requiring that modified versions of such material be marked in reasonable ways as different from the original version; or d) Limiting the use for publicity purposes of names of licensors or authors of the material; or e) Declining to grant rights under trademark law for use of some trade names, trademarks, or service marks; or f) Requiring indemnification of licensors and authors of that material by anyone who conveys the material (or modified versions of it) with contractual assumptions of liability to the recipient, for any liability that these contractual assumptions directly impose on those licensors and authors. All other non-permissive additional terms are considered "further restrictions" within the meaning of section 10. If the Program as you received it, or any part of it, contains a notice stating that it is governed by this License along with a term that is a further restriction, you may remove that term. If a license document contains a further restriction but permits relicensing or conveying under this License, you may add to a covered work material governed by the terms of that license document, provided that the further restriction does not survive such relicensing or conveying. If you add terms to a covered work in accord with this section, you must place, in the relevant source files, a statement of the additional terms that apply to those files, or a notice indicating where to find the applicable terms. Additional terms, permissive or non-permissive, may be stated in the form of a separately written license, or stated as exceptions; the above requirements apply either way. 8. Termination. You may not propagate or modify a covered work except as expressly provided under this License. Any attempt otherwise to propagate or modify it is void, and will automatically terminate your rights under this License (including any patent licenses granted under the third paragraph of section 11). However, if you cease all violation of this License, then your license from a particular copyright holder is reinstated (a) provisionally, unless and until the copyright holder explicitly and finally terminates your license, and (b) permanently, if the copyright holder fails to notify you of the violation by some reasonable means prior to 60 days after the cessation. Moreover, your license from a particular copyright holder is reinstated permanently if the copyright holder notifies you of the violation by some reasonable means, this is the first time you have received notice of violation of this License (for any work) from that copyright holder, and you cure the violation prior to 30 days after your receipt of the notice. Termination of your rights under this section does not terminate the licenses of parties who have received copies or rights from you under this License. If your rights have been terminated and not permanently reinstated, you do not qualify to receive new licenses for the same material under section 10. 9. 
Acceptance Not Required for Having Copies. You are not required to accept this License in order to receive or run a copy of the Program. Ancillary propagation of a covered work occurring solely as a consequence of using peer-to-peer transmission to receive a copy likewise does not require acceptance. However, nothing other than this License grants you permission to propagate or modify any covered work. These actions infringe copyright if you do not accept this License. Therefore, by modifying or propagating a covered work, you indicate your acceptance of this License to do so. 10. Automatic Licensing of Downstream Recipients. Each time you convey a covered work, the recipient automatically receives a license from the original licensors, to run, modify and propagate that work, subject to this License. You are not responsible for enforcing compliance by third parties with this License. An "entity transaction" is a transaction transferring control of an organization, or substantially all assets of one, or subdividing an organization, or merging organizations. If propagation of a covered work results from an entity transaction, each party to that transaction who receives a copy of the work also receives whatever licenses to the work the party's predecessor in interest had or could give under the previous paragraph, plus a right to possession of the Corresponding Source of the work from the predecessor in interest, if the predecessor has it or can get it with reasonable efforts. You may not impose any further restrictions on the exercise of the rights granted or affirmed under this License. For example, you may not impose a license fee, royalty, or other charge for exercise of rights granted under this License, and you may not initiate litigation (including a cross-claim or counterclaim in a lawsuit) alleging that any patent claim is infringed by making, using, selling, offering for sale, or importing the Program or any portion of it. 11. Patents. A "contributor" is a copyright holder who authorizes use under this License of the Program or a work on which the Program is based. The work thus licensed is called the contributor's "contributor version". A contributor's "essential patent claims" are all patent claims owned or controlled by the contributor, whether already acquired or hereafter acquired, that would be infringed by some manner, permitted by this License, of making, using, or selling its contributor version, but do not include claims that would be infringed only as a consequence of further modification of the contributor version. For purposes of this definition, "control" includes the right to grant patent sublicenses in a manner consistent with the requirements of this License. Each contributor grants you a non-exclusive, worldwide, royalty-free patent license under the contributor's essential patent claims, to make, use, sell, offer for sale, import and otherwise run, modify and propagate the contents of its contributor version. In the following three paragraphs, a "patent license" is any express agreement or commitment, however denominated, not to enforce a patent (such as an express permission to practice a patent or covenant not to sue for patent infringement). To "grant" such a patent license to a party means to make such an agreement or commitment not to enforce a patent against the party. 
If you convey a covered work, knowingly relying on a patent license, and the Corresponding Source of the work is not available for anyone to copy, free of charge and under the terms of this License, through a publicly available network server or other readily accessible means, then you must either (1) cause the Corresponding Source to be so available, or (2) arrange to deprive yourself of the benefit of the patent license for this particular work, or (3) arrange, in a manner consistent with the requirements of this License, to extend the patent license to downstream recipients. "Knowingly relying" means you have actual knowledge that, but for the patent license, your conveying the covered work in a country, or your recipient's use of the covered work in a country, would infringe one or more identifiable patents in that country that you have reason to believe are valid. If, pursuant to or in connection with a single transaction or arrangement, you convey, or propagate by procuring conveyance of, a covered work, and grant a patent license to some of the parties receiving the covered work authorizing them to use, propagate, modify or convey a specific copy of the covered work, then the patent license you grant is automatically extended to all recipients of the covered work and works based on it. A patent license is "discriminatory" if it does not include within the scope of its coverage, prohibits the exercise of, or is conditioned on the non-exercise of one or more of the rights that are specifically granted under this License. You may not convey a covered work if you are a party to an arrangement with a third party that is in the business of distributing software, under which you make payment to the third party based on the extent of your activity of conveying the work, and under which the third party grants, to any of the parties who would receive the covered work from you, a discriminatory patent license (a) in connection with copies of the covered work conveyed by you (or copies made from those copies), or (b) primarily for and in connection with specific products or compilations that contain the covered work, unless you entered into that arrangement, or that patent license was granted, prior to 28 March 2007. Nothing in this License shall be construed as excluding or limiting any implied license or other defenses to infringement that may otherwise be available to you under applicable patent law. 12. No Surrender of Others' Freedom. If conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. If you cannot convey a covered work so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not convey it at all. For example, if you agree to terms that obligate you to collect a royalty for further conveying from those to whom you convey the Program, the only way you could satisfy both those terms and this License would be to refrain entirely from conveying the Program. 13. Use with the GNU Affero General Public License. Notwithstanding any other provision of this License, you have permission to link or combine any covered work with a work licensed under version 3 of the GNU Affero General Public License into a single combined work, and to convey the resulting work. 
The terms of this License will continue to apply to the part which is the covered work, but the special requirements of the GNU Affero General Public License, section 13, concerning interaction through a network will apply to the combination as such. 14. Revised Versions of this License. The Free Software Foundation may publish revised and/or new versions of the GNU General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns. Each version is given a distinguishing version number. If the Program specifies that a certain numbered version of the GNU General Public License "or any later version" applies to it, you have the option of following the terms and conditions either of that numbered version or of any later version published by the Free Software Foundation. If the Program does not specify a version number of the GNU General Public License, you may choose any version ever published by the Free Software Foundation. If the Program specifies that a proxy can decide which future versions of the GNU General Public License can be used, that proxy's public statement of acceptance of a version permanently authorizes you to choose that version for the Program. Later license versions may give you additional or different permissions. However, no additional obligations are imposed on any author or copyright holder as a result of your choosing to follow a later version. 15. Disclaimer of Warranty. THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION. 16. Limitation of Liability. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. 17. Interpretation of Sections 15 and 16. If the disclaimer of warranty and limitation of liability provided above cannot be given local legal effect according to their terms, reviewing courts shall apply local law that most closely approximates an absolute waiver of all civil liability in connection with the Program, unless a warranty or assumption of liability accompanies a copy of the Program in return for a fee. END OF TERMS AND CONDITIONS How to Apply These Terms to Your New Programs If you develop a new program, and you want it to be of the greatest possible use to the public, the best way to achieve this is to make it free software which everyone can redistribute and change under these terms. To do so, attach the following notices to the program. 
It is safest to attach them to the start of each source file to most effectively state the exclusion of warranty; and each file should have at least the "copyright" line and a pointer to where the full notice is found. Copyright (C) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . Also add information on how to contact you by electronic and paper mail. If the program does terminal interaction, make it output a short notice like this when it starts in an interactive mode: Copyright (C) This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'. This is free software, and you are welcome to redistribute it under certain conditions; type `show c' for details. The hypothetical commands `show w' and `show c' should show the appropriate parts of the General Public License. Of course, your program's commands might be different; for a GUI interface, you would use an "about box". You should also get your employer (if you work as a programmer) or school, if any, to sign a "copyright disclaimer" for the program, if necessary. For more information on this, and how to apply and follow the GNU GPL, see . The GNU General Public License does not permit incorporating your program into proprietary programs. If your program is a subroutine library, you may consider it more useful to permit linking proprietary applications with the library. If this is what you want to do, use the GNU Lesser General Public License instead of this License. But first, please read . slowmovideo-0.5+git20180116/version.sh0000775000000000000000000000040213151342440016044 0ustar rootroot#!/bin/bash cd $(dirname $0) now=$(date +%F) revision=$(git describe --always) version=$(echo "${now}-${revision}") version=$(echo $(git describe --always HEAD --abbrev=0)+$(git rev-list HEAD -n 1 |cut -c 1-7)) #echo ${now} #echo ${revision} echo ${version} slowmovideo-0.5+git20180116/CMakeLists.txt0000664000000000000000000000105013151342440016560 0ustar rootrootmessage("==========================================================") message("This CMake file is only here to give you this information:") message("* To build slowmoVideo use the CMakeLists.txt file in the slowmoVideo subdirectory (inside the main directory).") message("* To build flowBuilder (runs on nVidia GPUs) run cmake in the V3D subdirectory. 
(A CPU version is included in slowmoVideo and flowBuilder is not necessary anymore.)") message("==========================================================") message(FATAL_ERROR "Aborting therefore.") slowmovideo-0.5+git20180116/src/0000775000000000000000000000000013151342440014613 5ustar rootrootslowmovideo-0.5+git20180116/src/version.h.in0000664000000000000000000000056313151342440017062 0ustar rootroot/* * slowmovideo version information */ #ifndef _VER_INFO_H #define _VER_INFO_H 1 #define SLOWMOVIDEO_VERSION_MAJOR @PROJECT_VERSION_MAJOR@ #define SLOWMOVIDEO_VERSION_MINOR @PROJECT_VERSION_MINOR@ #define SLOWMOVIDEO_VERSION_PATCH @PROJECT_VERSION_PATCH@ #define SLOWMOVIDEO_VERSION_SHA1 "@PROJECT_VERSION_SHA1@" #define SLOWMOVIDEO_VERSION_FULL "@VERSION@" #endif slowmovideo-0.5+git20180116/src/CMakeLists.txt0000664000000000000000000003451413151342440017362 0ustar rootrootcmake_minimum_required(VERSION 2.6) project(slowmoVideo) if(CMAKE_SOURCE_DIR STREQUAL CMAKE_BINARY_DIR) message(FATAL_ERROR "In-source builds are not allowed.") endif() set (CMAKE_BUILD_TYPE Release) #set(CMAKE_BUILD_TYPE Debug) # set(CMAKE_MODULE_PATH ${slowmoVideo_SOURCE_DIR}/cmake ) # Make a version file containing the current version from git. # include(GetGitRevisionDescription) git_describe(VERSION --dirty=-dev) if (VERSION) #parse the version information into pieces. # v0.4.0-123-gdddf621 string(REGEX REPLACE "^v([0-9]+)\\..*" "\\1" PROJECT_VERSION_MAJOR "${VERSION}") string(REGEX REPLACE "^v[0-9]+\\.([0-9]+).*" "\\1" PROJECT_VERSION_MINOR "${VERSION}") string(REGEX REPLACE "^v[0-9]+\\.[0-9]+\\.([0-9]+)-.*" "\\1" PROJECT_VERSION_PATCH "${VERSION}") string(REGEX REPLACE "^v[0-9]+\\.[0-9]+-([0-9]+)-.*" "\\1" PROJECT_VERSION_PATCH "${VERSION}") string(REGEX REPLACE "^v[0-9]+\\.[0-9]+\\.[0-9]+-(.*)" "\\1" PROJECT_VERSION_SHA1 "${VERSION}") else() set(PROJECT_VERSION_MAJOR "0") set(PROJECT_VERSION_MINOR "5") set(PROJECT_VERSION_PATCH "0") endif() if(NOT PROJECT_VERSION_PATCH) # git describe bug ? set(PROJECT_VERSION_PATCH "0") endif() set(PROJECT_VERSION "${PROJECT_VERSION_MAJOR}.${PROJECT_VERSION_MINOR}.${PROJECT_VERSION_PATCH}") configure_file(version.h.in version.h) ### Compiler options ### if (APPLE) # To compile with clang: #set(CMAKE_CXX_COMPILER "clang++") #set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wall --verbose") set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wall ") #set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wall -O2 -mtune=corei7") # Set additional project information set(COMPANY "granjow") set(COPYRIGHT "Copyright (c) 2011 Simon A. Eugster (Granjow). All rights reserved.") set(IDENTIFIER "net.granjow.slomoui") else() set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -Wall -g") endif() if(CMAKE_TOOLCHAIN_FILE) set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -DMXE") set(CMAKE_C_FLAGS "${CMAKE_CXX_FLAGS} -DMXE") endif(CMAKE_TOOLCHAIN_FILE) ### CMake Configuration ### option (ENABLE_TESTS "Build the unit tests" FALSE) set(ADDITIONAL_LIBS "") if(MSYS) message(STATUS "MSYS system detected.") include("${PROJECT_SOURCE_DIR}/cmake/MingwCrossEnv.cmake") endif(MSYS) ### Find packages ### # Check if environment variable QTDIR is set. # needed for Qt5 # Extra security for windows environment as well. 
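# Illustrative usage (not part of the original build files): if Qt lives in a
# non-standard location, the QTDIR environment variable checked below can point
# CMake at it before configuring. The path is a placeholder/assumption -- adjust
# it to the actual Qt installation:
#   export QTDIR=/opt/Qt/5.9.1/gcc_64
#   cmake /path/to/slowmovideo/src
# Passing -DCMAKE_PREFIX_PATH=/opt/Qt/5.9.1/gcc_64 on the cmake command line
# achieves the same effect without setting QTDIR.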
if (DEFINED ENV{QTDIR}) set(CMAKE_PREFIX_PATH $ENV{QTDIR} ${CMAKE_PREFIX_PATH}) endif () if (APPLE) set(DEST "slowmoUI.app/Contents/Tools/bin") else() set(DEST "bin") endif() include(cmake/macros.cmake) # search for Qt4 if (NOT FORCE_QT4) find_package(Qt5Core QUIET) if (Qt5Core_FOUND) message(STATUS "Using qt5") set(USE_QT TRUE) # go on with other packages find_package(Qt5 COMPONENTS Core Widgets Gui Xml Script) if (Qt5_POSITION_INDEPENDENT_CODE) set(CMAKE_POSITION_INDEPENDENT_CODE ON) endif(Qt5_POSITION_INDEPENDENT_CODE) # set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} # ${Qt5Core_EXECUTABLE_COMPILE_FLAGS}") include_directories(${Qt5Core_INCLUDES}) include_directories(${Qt5Widgets_INCLUDES}) include_directories(${Qt5Gui_INCLUDES}) include_directories(${Qt5Xml_INCLUDES}) include_directories(${Qt5Script_INCLUDES}) macro(qt_use_modules) qt5_use_modules(${ARGN}) endmacro() macro(qt_wrap_ui) qt5_wrap_ui(${ARGN}) endmacro() macro(qt_wrap_cpp) qt5_wrap_cpp(${ARGN}) endmacro() macro(qt_add_resources) qt5_add_resources(${ARGN}) endmacro() #find_package(Qt5LinguistTools REQUIRED) macro(qt_add_translation) qt5_add_translation(${ARGN}) endmacro() macro(install_qt_executable) install_qt5_executable(${ARGN}) endmacro() endif(Qt5Core_FOUND) endif(NOT FORCE_QT4) #old qt4 ... if(NOT Qt5Core_FOUND) # should replace : #SET(QT_USE_QTXML TRUE) #SET(QT_USE_QTSCRIPT TRUE) #find_package(Qt4) #include(${QT_USE_FILE}) #include_directories(${QT_INCLUDES}) message(STATUS "Could not find Qt5, searching for Qt4 instead...") set(NEEDED_QT4_COMPONENTS "QtCore" "QtXml" "QtGui" "QtScript") find_package(Qt4 REQUIRED COMPONENTS ${NEEDED_QT4_COMPONENTS}) macro(qt_use_modules) endmacro() macro(qt_wrap_ui) qt4_wrap_ui(${ARGN}) endmacro() macro(qt_wrap_cpp) qt4_wrap_cpp(${ARGN}) endmacro() macro(qt_add_resources) qt4_add_resources(${ARGN}) endmacro() macro(qt_add_translation) qt4_add_translation(${ARGN}) endmacro() macro(install_qt_executable) install_qt4_executable(${ARGN}) endmacro() include(${QT_USE_FILE}) add_definitions(${QT_DEFINITIONS}) set(USE_QT TRUE) endif() message("Qt libraries found at : ${Qt5Gui_LIBRARIES} / ${QT_LIBRARIES}" ) set (USE_QTKIT OFF CACHE BOOL "Build with the QTKit encoder") set (USE_FFMPEG ON CACHE BOOL "Build with the FFMPEG encoder") set (USE_DBUS OFF CACHE BOOL "Build with the DBUS notification support") if(NOT MSYS) find_package(FFMPEG) else(NOT MSYS) # Handled by MingwCrossEnv.cmake to avoid errors like: # libavformat.a(avisynth.o):avisynth.c:(.text+0x6b): undefined reference to `AVIStreamRelease@4' endif(NOT MSYS) # not here anymore #include_directories(${FFMPEG_INCLUDE_DIR}) #include_directories("/usr/include/ffmpeg/") #link_directories(${FFMPEG_LIBRARY_DIR}) if (APPLE AND USE_QTKIT) find_package(QTKIT) message(STATUS "QTKIT find at ${QTKIT_LIBRARY} ") set(ADDITIONAL_LIBS "-framework Cocoa -framework QTKit -framework QuartzCore -framework AppKit -framework OpenCL") endif() # Find OpenCV, you may need to set OpenCV_DIR variable # to the absolute path to the directory containing OpenCVConfig.cmake file # via the command line or GUI find_package(OpenCV REQUIRED) # If the package has been found, several variables will # be set, you can find the full list with descriptions # in the OpenCVConfig.cmake file. 
# Print some message showing some of them message(STATUS "OpenCV library status:") message(STATUS " version: ${OpenCV_VERSION}") message(STATUS " libraries: ${OpenCV_LIBS}") message(STATUS " include path: ${OpenCV_INCLUDE_DIRS}") if (${OpenCV_VERSION_MAJOR} EQUAL 3) set(HAS_OCV_VERSION_3 ON) else() set(HAS_OCV_VERSION_3 OFF) endif() include_directories(${OPENCV_INCLUDE_DIRS}) # for config.h include_directories(${CMAKE_CURRENT_BINARY_DIR}) ### Set up libraries ### if(MSYS) set(EXTERNAL_LIBS ${FFMPEG_LIBRARIES} ${QT_LIBRARIES} ${OpenCV_LIBS_OPT} ${OpenCV_EXTRA_LIBS_OPT} ${ADDITIONAL_LIBS}) else(MSYS) set(EXTERNAL_LIBS ${QT_LIBRARIES} ${OpenCV_LIBS} ${ADDITIONAL_LIBS} ${FFMPEG_LIBRARIES}) endif(MSYS) ### Information output set(BUILD_SLOWMO "NO") #if(QT_LIBRARIES AND FFMPEG_FOUND) if(USE_QT AND FFMPEG_FOUND) set(BUILD_SLOWMO "YES") #endif(QT_LIBRARIES AND FFMPEG_FOUND) endif() if(NOT FFMPEG_SWSCALE_FOUND) if(CMAKE_TOOLCHAIN_FILE) else(CMAKE_TOOLCHAIN_FILE) set(BUILD_SLOWMO "NO") endif(CMAKE_TOOLCHAIN_FILE) endif(NOT FFMPEG_SWSCALE_FOUND) ## Include projects to build ## include_directories(slowmoVideo/tr) add_subdirectory(slowmoVideo/lib) add_subdirectory(slowmoVideo/libgui) add_subdirectory(slowmoVideo/project) add_subdirectory(slowmoVideo/slowmoCLI) add_subdirectory(slowmoVideo/slowmoUI) add_subdirectory(slowmoVideo/slowmoFlowEdit) add_subdirectory(slowmoVideo/slowmoInfo) add_subdirectory(slowmoVideo/slowmoRenderer) add_subdirectory(slowmoVideo/visualizeFlow) if(ENABLE_TESTS) SET(QT_USE_QTTEST TRUE) ## add_subdirectory(slowmoVideo/test) add_subdirectory(slowmoVideo/unittests) endif(ENABLE_TESTS) ##### SV END ##### ##### V3D START ##### if(WIN32) set(GLUT_ROOT_PATH ${PROJECT_SOURCE_DIR}/libs/) endif(WIN32) find_package(OpenGL) find_package(GLEW) find_package(GLUT) find_package(JPEG) find_package(PNG) find_package(ZLIB) #find_package(X11) # Windows: Try to find libraries that could not be found manually in the libs/ directory. 
if(WIN32) if(NOT ZLIB_FOUND) FIND_PATH(ZLIB_INCLUDE_DIR zlib.h PATHS ${PROJECT_SOURCE_DIR}/libs/include ) find_library(ZLIB_LIBRARY NAMES zlib PATHS ${PROJECT_SOURCE_DIR}/libs/lib) if(ZLIB_INCLUDE_DIR AND ZLIB_LIBRARY) set(ZLIB_FOUND TRUE) endif(ZLIB_INCLUDE_DIR AND ZLIB_LIBRARY) message(STATUS "Manual search for zlib: ${ZLIB_LIBRARY} in ${ZLIB_INCLUDE_DIR}") endif(NOT ZLIB_FOUND) if(NOT PNG_FOUND) find_path(PNG_INCLUDE_DIR png.h PATHS ${PROJECT_SOURCE_DIR}/libs/include) find_library(PNG_LIBRARIES libpng PATHS ${PROJECT_SOURCE_DIR}/libs/lib) if(PNG_INCLUDE_DIR AND PNG_LIBRARIES) set(PNG_FOUND TRUE) endif(PNG_INCLUDE_DIR AND PNG_LIBRARIES) message(STATUS "Manual search for png: ${PNG_LIBRARIES} in ${PNG_INCLUDE_DIR}") endif(NOT PNG_FOUND) if(NOT GLUT_FOUND) find_path(GLUT_LIBRARY_DIR NAMES GL/glut.h PATHS ${PROJECT_SOURCE_DIR}/libs/include) find_library(GLUT_LIBRARIES glut32 PATHS ${PROJECT_SOURCE_DIR}/libs/lib) if(GLUT_LIBRARY_DIR AND GLUT_LIBRARIES) set(GLUT_FOUND TRUE) endif(GLUT_LIBRARY_DIR AND GLUT_LIBRARIES) message(STATUS "Manual search for GLUT: ${GLUT_LIBRARIES} in ${GLUT_LIBRARY_DIR}") endif(NOT GLUT_FOUND) if(NOT JPEG_FOUND) FIND_PATH(JPEG_INCLUDE_DIR jpeglib.h PATHS ${PROJECT_SOURCE_DIR}/libs/include) SET(JPEG_NAMES ${JPEG_NAMES} jpeg) FIND_LIBRARY(JPEG_LIBRARY NAMES ${JPEG_NAMES} PATHS ${PROJECT_SOURCE_DIR}/libs/lib) if(JPEG_INCLUDE_DIR AND JPEG_LIBRARY) set(JPEG_FOUND TRUE) endif(JPEG_INCLUDE_DIR AND JPEG_LIBRARY) message(STATUS "Manual search for JPEG: ${JPEG_LIBRARY} in ${JPEG_INCLUDE_DIR}") endif(NOT JPEG_FOUND) endif(WIN32) set(BUILD_FLOW_BUILDER "NO") if(OPENGL_FOUND AND GLUT_FOUND AND GLEW_FOUND AND JPEG_FOUND AND PNG_FOUND) set(BUILD_FLOW_BUILDER "YES") endif(OPENGL_FOUND AND GLUT_FOUND AND GLEW_FOUND AND JPEG_FOUND AND PNG_FOUND) set(INCLUDE_SOURCE "YES") if(DISABLE_INCLUDE_SOURCE) set(INCLUDE_SOURCE "NO") add_definitions(-DDISABLE_INCLUDE_SOURCE) endif(DISABLE_INCLUDE_SOURCE) if (BUILD_FLOW_BUILDER) include_directories(${OPENGL_INCLUDE_DIR}) include_directories(${GLUT_INCLUDE_DIR}) include_directories(${GLEW_INCLUDE_DIR}) include_directories(${JPEG_INCLUDE_DIR}) include_directories(${ZLIB_INCLUDE_DIR}) include_directories(${PNG_INCLUDE_DIR}) set (V3D_DIR ${CMAKE_CURRENT_SOURCE_DIR}/V3D) set (V3D_INCLUDE_DIRS ${V3D_DIR}/.) 
include (V3D/Config/v3d_macros.cmake) include_directories(${V3D_INCLUDE_DIRS} ${EXTRA_INC_DIRS}) add_definitions(-DDISABLE_REDEFINITIONS) #-------------------------------------------------- enable_feature (V3DLIB_ENABLE_LIBJPEG) enable_feature (V3DLIB_ENABLE_LIBPNG) enable_feature_libraries (V3DLIB_ENABLE_LIBJPEG ${JPEG_LIBRARIES}) enable_feature_libraries (V3DLIB_ENABLE_LIBPNG ${PNG_LIBRARIES}) enable_feature (V3DLIB_ENABLE_GPGPU) enable_feature_libraries (V3DLIB_ENABLE_GPGPU ${OPENGL_LIBRARIES}) enable_feature_libraries (V3DLIB_ENABLE_GPGPU ${GLEW_LIBRARIES}) enable_feature_libraries (V3DLIB_ENABLE_GPGPU ${GLUT_glut_LIBRARY}) #-------------------------------------------------- include_directories(V3D/Config) set (GL_SRC V3D/GL/glsl_shaders.cpp V3D/GL/v3d_gpubase.cpp V3D/GL/v3d_gpuflow.cpp V3D/GL/v3d_gpucolorflow.cpp V3D/GL/v3d_gpupyramid.cpp ) set (ALL_SRC ${GL_SRC} V3D/Config/config.h V3D/Base/v3d_image.cpp V3D/Base/v3d_imageprocessing.h V3D/Base/v3d_exception.h V3D/Base/v3d_timer.h V3D/Base/v3d_serialization.h V3D/Base/v3d_utilities.h V3D/Math/v3d_linear.h V3D/Math/v3d_linearbase.h ) add_library(V3D STATIC ${ALL_SRC}) target_link_libraries(V3D ${GLEW_LIBRARIES} X11) #install(TARGETS V3D DESTINATION lib) add_subdirectory(V3D/Apps) endif (BUILD_FLOW_BUILDER) ##### V3D END ##### message("") message("======================V3D============================") message("* (info) Installation prefix: ${CMAKE_INSTALL_PREFIX}.") message(" (Can be adjusted with -DCMAKE_INSTALL_PREFIX=your_path. Default: ${SV_INST_DIR}.)") message("* (info) Shaders will be included in the binary: ${INCLUDE_SOURCE}") if(INCLUDE_SOURCE) message(" (Can be disabled with the cmake flag -DDISABLE_INCLUDE_SOURCE)") endif(INCLUDE_SOURCE) if(NOT OPENGL_FOUND) message("* OpenGL could not be found.") else(NOT OPENGL_FOUND) message("* (ok) OpenGL found in ${OPENGL_INCLUDE_DIR}: ${OPENGL_LIBRARIES}") endif(NOT OPENGL_FOUND) if(NOT GLUT_FOUND) message("* GLUT could not be found.") else(NOT GLUT_FOUND) message("* (ok) GLUT found in ${GLUT_INCLUDE_DIR}: ${GLUT_LIBRARIES}") endif(NOT GLUT_FOUND) if(NOT GLEW_FOUND) message("* GLEW could not be found.") else(NOT GLEW_FOUND) message("* (ok) GLEW found at ${GLEW_INCLUDE_DIR}") endif(NOT GLEW_FOUND) if(NOT JPEG_FOUND) message("* JPEG libraries could not be found.") else(NOT JPEG_FOUND) message("* (ok) JPEG libraries found at ${JPEG_INCLUDE_DIR}: ${JPEG_LIBRARIES}") endif(NOT JPEG_FOUND) if(NOT PNG_FOUND) message("* PNG libraries could not be found.") else(NOT PNG_FOUND) message("* (ok) PNG libraries found at ${PNG_INCLUDE_DIR}") endif(NOT PNG_FOUND) message("* V3D will be built: ---${BUILD_FLOW_BUILDER}---") message("==================slowmoVideo========================") message("* (info) slowmoVideo installation goes to ${CMAKE_INSTALL_PREFIX}.") message(" (Can be adjusted with -DCMAKE_INSTALL_PREFIX=your_path. 
Default is ${SV_INST_DIR}.)") #if(NOT QT_LIBRARIES) if (NOT USE_QT) message("QT5 nor Qt4 libraries could not be found.") #endif(NOT QT_LIBRARIES) endif(NOT USE_QT) if(NOT FFMPEG_FOUND) message("x ffmpeg libraries could not be found.") else(NOT FFMPEG_FOUND) message("* (ok) ffmpeg found at ${FFMPEG_LIBRARY_DIR}") endif(NOT FFMPEG_FOUND) if(NOT FFMPEG_SWSCALE_FOUND) message("x libswscale could not be found.") endif(NOT FFMPEG_SWSCALE_FOUND) if(NOT OpenCV_VERSION) message("x OpenCV could not be found.") else(NOT OpenCV_VERSION) message("* (ok) OpenCV ${OpenCV_VERSION} found at ${OpenCV_INCLUDE_DIRS}.") endif(NOT OpenCV_VERSION) message("* slowmoVideo will be built: ---${BUILD_SLOWMO}---") message("=======================END===========================") message("") if(NOT BUILD_SLOWMO) message(WARNING "Cannot build slowmoVideo, please install the missing packages first.") endif(NOT BUILD_SLOWMO) if(NOT BUILD_FLOW_BUILDER) message(WARNING "Cannot build V3D.") endif(NOT BUILD_FLOW_BUILDER) configure_file(config.h.in config.h) slowmovideo-0.5+git20180116/src/cmake/0000775000000000000000000000000013151342440015673 5ustar rootrootslowmovideo-0.5+git20180116/src/cmake/FindQTKit.cmake0000664000000000000000000000113713151342440020474 0ustar rootroot# Locate Apple QTKit (next-generation QuickTime) # This module defines # QTKIT_LIBRARY # QTKIT_FOUND, if false, do not try to link to gdal # QTKIT_INCLUDE_DIR, where to find the headers # # $QTKIT_DIR is an environment variable that would # correspond to the ./configure --prefix=$QTKIT_DIR # # Created by Eric Wing. # QTKit on OS X looks different than QTKit for Windows, # so I am going to case the two. IF(APPLE) FIND_PATH(QTKIT_INCLUDE_DIR QTKit/QTKit.h) FIND_LIBRARY(QTKIT_LIBRARY QTKit) ENDIF() SET(QTKIT_FOUND "NO") IF(QTKIT_LIBRARY AND QTKIT_INCLUDE_DIR) SET(QTKIT_FOUND "YES") ENDIF() slowmovideo-0.5+git20180116/src/cmake/MingwCrossEnv.cmake0000664000000000000000000000337213151342440021446 0ustar rootroot add_definitions(-DWINDOWS) set(WINDOWS "yes") # Make QJson build statically (flag QJson patched by cross-env) # Thanks to Mark Brand for the hint add_definitions(-DQJSON_STATIC) # Find the FFMPEG libraries with PkgConfig # Thanks to Martin Müllenhaupt for the code message(STATUS "Checking ffmpeg with pkg_check_modules") find_package(PkgConfig REQUIRED) pkg_check_modules(FFMPEG_avdevice libavdevice) SET(FFMPEG_avdevice_LIBRARY ${FFMPEG_avdevice_STATIC_LIBRARIES}) pkg_check_modules(FFMPEG_avfilter libavfilter) SET(FFMPEG_avfilter_LIBRARY ${FFMPEG_avfilter_STATIC_LIBRARIES}) pkg_check_modules(FFMPEG_avformat libavformat) SET(FFMPEG_avformat_LIBRARY ${FFMPEG_avformat_STATIC_LIBRARIES}) pkg_check_modules(FFMPEG_avcodec libavcodec) SET(FFMPEG_avcodec_LIBRARY ${FFMPEG_avcodec_STATIC_LIBRARIES}) pkg_check_modules(FFMPEG_postproc libpostproc) SET(FFMPEG_postproc_LIBRARY ${FFMPEG_postproc_STATIC_LIBRARIES}) pkg_check_modules(FFMPEG_swscale libswscale) SET(FFMPEG_swscale_LIBRARY ${FFMPEG_swscale_STATIC_LIBRARIES}) pkg_check_modules(FFMPEG_avutil libavutil) SET(FFMPEG_avutil_LIBRARY ${FFMPEG_avutil_STATIC_LIBRARIES}) SET(FFMPEG_INCLUDE_DIRS ${FFMPEG_avdevice_INCLUDE_DIRS} ${FFMPEG_avfilter_INCLUDE_DIRS} ${FFMPEG_avformat_INCLUDE_DIRS} ${FFMPEG_avcodec_INCLUDE_DIRS} ${FFMPEG_postproc_INCLUDE_DIRS} ${FFMPEG_swscale_INCLUDE_DIRS} ${FFMPEG_avutil_INCLUDE_DIRS}) SET(FFMPEG_LIBRARIES ${FFMPEG_avdevice_STATIC_LIBRARIES} ${FFMPEG_avfilter_STATIC_LIBRARIES} ${FFMPEG_avformat_STATIC_LIBRARIES} ${FFMPEG_avcodec_STATIC_LIBRARIES} ${FFMPEG_postproc_STATIC_LIBRARIES} 
${FFMPEG_swscale_STATIC_LIBRARIES} ${FFMPEG_avutil_STATIC_LIBRARIES}) SET(FFMPEG_INCLUDE_DIR ${FFMPEG_avcodec_INCLUDE_DIRS}) SET(FFMPEG_FOUND ${FFMPEG_avcodec_FOUND}) set(ADDITIONAL_LIBS lzma lcms ) slowmovideo-0.5+git20180116/src/cmake/FindGLEW.cmake0000664000000000000000000000344013151342440020235 0ustar rootroot# Source: http://openlibraries.org/browser/trunk/FindGLEW.cmake (LGPL) # - Try to find GLEW # Once done this will define # # GLEW_FOUND - system has GLEW # GLEW_INCLUDE_DIR - the GLEW include directory # GLEW_LIBRARIES_DIR - where the libraries are # GLEW_LIBRARIES - Link these to use GLEW # IF (GLEW_INCLUDE_DIR) # Already in cache, be silent SET(GLEW_FIND_QUIETLY TRUE) ENDIF (GLEW_INCLUDE_DIR) if( WIN32 ) if( MSVC80 ) set( COMPILER_PATH "C:/Program\ Files/Microsoft\ Visual\ Studio\ 8/VC" ) endif( MSVC80 ) if( MSVC71 ) set( COMPILER_PATH "C:/Program\ Files/Microsoft\ Visual\ Studio\ .NET\ 2003/Vc7" ) endif( MSVC71 ) FIND_PATH( GLEW_INCLUDE_DIR gl/glew.h gl/wglew.h PATHS c:/glew/include ${COMPILER_PATH}/PlatformSDK/Include ${PROJECT_SOURCE_DIR}/libs/include) SET( GLEW_NAMES glew32 ) FIND_LIBRARY( GLEW_LIBRARIES NAMES ${GLEW_NAMES} PATHS c:/glew/lib ${COMPILER_PATH}/PlatformSDK/Lib ${PROJECT_SOURCE_DIR}/libs/lib ) message(STATUS "GLEW library: ${GLEW_LIBRARIES} at ${GLEW_INCLUDE_DIR} (also searched at ${PROJECT_SOURCE_DIR}/libs/include)") else( WIN32 ) FIND_PATH( GLEW_INCLUDE_DIR glew.h wglew.h PATHS /usr/local/include /usr/include PATH_SUFFIXES gl/ GL/ ) SET( GLEW_NAMES glew GLEW ) FIND_LIBRARY( GLEW_LIBRARIES NAMES ${GLEW_NAMES} PATHS /usr/lib /usr/local/lib ) endif( WIN32 ) GET_FILENAME_COMPONENT( GLEW_LIBRARIES_DIR ${GLEW_LIBRARIES} PATH ) IF (GLEW_INCLUDE_DIR AND GLEW_LIBRARIES) SET(GLEW_FOUND TRUE) SET( GLEW_LIBRARIES_DIR ${GLEW_LIBRARIES} ) ELSE (GLEW_INCLUDE_DIR AND GLEW_LIBRARIES) SET( GLEW_FOUND FALSE ) SET( GLEW_LIBRARIES_DIR ) ENDIF (GLEW_INCLUDE_DIR AND GLEW_LIBRARIES) slowmovideo-0.5+git20180116/src/cmake/GetGitRevisionDescription.cmake0000664000000000000000000001001513151342440024000 0ustar rootroot# - Returns a version string from Git # # These functions force a re-configure on each git commit so that you can # trust the values of the variables in your build system. # # get_git_head_revision( [ ...]) # # Returns the refspec and sha hash of the current head revision # # git_describe( [ ...]) # # Returns the results of git describe on the source tree, and adjusting # the output so that it tests false if an error occurs. # # git_get_exact_tag( [ ...]) # # Returns the results of git describe --exact-match on the source tree, # and adjusting the output so that it tests false if there was no exact # matching tag. # # Requires CMake 2.6 or newer (uses the 'function' command) # # Original Author: # 2009-2010 Ryan Pavlik # http://academic.cleardefinition.com # Iowa State University HCI Graduate Program/VRAC # # Copyright Iowa State University 2009-2010. # Distributed under the Boost Software License, Version 1.0. 
# (See accompanying file LICENSE_1_0.txt or copy at # http://www.boost.org/LICENSE_1_0.txt) if(__get_git_revision_description) return() endif() set(__get_git_revision_description YES) # We must run the following at "include" time, not at function call time, # to find the path to this module rather than the path to a calling list file get_filename_component(_gitdescmoddir ${CMAKE_CURRENT_LIST_FILE} PATH) function(get_git_head_revision _refspecvar _hashvar) set(GIT_PARENT_DIR "${CMAKE_CURRENT_SOURCE_DIR}") set(GIT_DIR "${GIT_PARENT_DIR}/.git") while(NOT EXISTS "${GIT_DIR}") # .git dir not found, search parent directories set(GIT_PREVIOUS_PARENT "${GIT_PARENT_DIR}") get_filename_component(GIT_PARENT_DIR ${GIT_PARENT_DIR} PATH) if(GIT_PARENT_DIR STREQUAL GIT_PREVIOUS_PARENT) # We have reached the root directory, we are not in git set(${_refspecvar} "GITDIR-NOTFOUND" PARENT_SCOPE) set(${_hashvar} "GITDIR-NOTFOUND" PARENT_SCOPE) return() endif() set(GIT_DIR "${GIT_PARENT_DIR}/.git") endwhile() # check if this is a submodule if(NOT IS_DIRECTORY ${GIT_DIR}) file(READ ${GIT_DIR} submodule) string(REGEX REPLACE "gitdir: (.*)\n$" "\\1" GIT_DIR_RELATIVE ${submodule}) get_filename_component(SUBMODULE_DIR ${GIT_DIR} PATH) get_filename_component(GIT_DIR ${SUBMODULE_DIR}/${GIT_DIR_RELATIVE} ABSOLUTE) endif() set(GIT_DATA "${CMAKE_CURRENT_BINARY_DIR}/CMakeFiles/git-data") if(NOT EXISTS "${GIT_DATA}") file(MAKE_DIRECTORY "${GIT_DATA}") endif() if(NOT EXISTS "${GIT_DIR}/HEAD") return() endif() set(HEAD_FILE "${GIT_DATA}/HEAD") configure_file("${GIT_DIR}/HEAD" "${HEAD_FILE}" COPYONLY) configure_file("${_gitdescmoddir}/GetGitRevisionDescription.cmake.in" "${GIT_DATA}/grabRef.cmake" @ONLY) include("${GIT_DATA}/grabRef.cmake") set(${_refspecvar} "${HEAD_REF}" PARENT_SCOPE) set(${_hashvar} "${HEAD_HASH}" PARENT_SCOPE) endfunction() function(git_describe _var) if(NOT GIT_FOUND) find_package(Git QUIET) endif() get_git_head_revision(refspec hash) if(NOT GIT_FOUND) set(${_var} "GIT-NOTFOUND" PARENT_SCOPE) return() endif() if(NOT hash) set(${_var} "HEAD-HASH-NOTFOUND" PARENT_SCOPE) return() endif() # TODO sanitize #if((${ARGN}" MATCHES "&&") OR # (ARGN MATCHES "||") OR # (ARGN MATCHES "\\;")) # message("Please report the following error to the project!") # message(FATAL_ERROR "Looks like someone's doing something nefarious with git_describe! Passed arguments ${ARGN}") #endif() #message(STATUS "Arguments to execute_process: ${ARGN}") execute_process(COMMAND "${GIT_EXECUTABLE}" describe ${ARGN} WORKING_DIRECTORY "${CMAKE_SOURCE_DIR}" RESULT_VARIABLE res OUTPUT_VARIABLE out ERROR_QUIET OUTPUT_STRIP_TRAILING_WHITESPACE) if(NOT res EQUAL 0) set(out "${out}-${res}-NOTFOUND") endif() set(${_var} "${out}" PARENT_SCOPE) endfunction() function(git_get_exact_tag _var) git_describe(out --exact-match ${ARGN}) set(${_var} "${out}" PARENT_SCOPE) endfunction() slowmovideo-0.5+git20180116/src/cmake/GetGitRevisionDescription.cmake.in0000664000000000000000000000226213151342440024412 0ustar rootroot# # Internal file for GetGitRevisionDescription.cmake # # Requires CMake 2.6 or newer (uses the 'function' command) # # Original Author: # 2009-2010 Ryan Pavlik # http://academic.cleardefinition.com # Iowa State University HCI Graduate Program/VRAC # # Copyright Iowa State University 2009-2010. # Distributed under the Boost Software License, Version 1.0. 
# (See accompanying file LICENSE_1_0.txt or copy at # http://www.boost.org/LICENSE_1_0.txt) set(HEAD_HASH) file(READ "@HEAD_FILE@" HEAD_CONTENTS LIMIT 1024) string(STRIP "${HEAD_CONTENTS}" HEAD_CONTENTS) if(HEAD_CONTENTS MATCHES "ref") # named branch string(REPLACE "ref: " "" HEAD_REF "${HEAD_CONTENTS}") if(EXISTS "@GIT_DIR@/${HEAD_REF}") configure_file("@GIT_DIR@/${HEAD_REF}" "@GIT_DATA@/head-ref" COPYONLY) elseif(EXISTS "@GIT_DIR@/logs/${HEAD_REF}") configure_file("@GIT_DIR@/logs/${HEAD_REF}" "@GIT_DATA@/head-ref" COPYONLY) set(HEAD_HASH "${HEAD_REF}") endif() else() # detached HEAD configure_file("@GIT_DIR@/HEAD" "@GIT_DATA@/head-ref" COPYONLY) endif() if(NOT HEAD_HASH) file(READ "@GIT_DATA@/head-ref" HEAD_HASH LIMIT 1024) string(STRIP "${HEAD_HASH}" HEAD_HASH) endif() slowmovideo-0.5+git20180116/src/cmake/DeployQt5.cmake0000664000000000000000000002711613151342440020532 0ustar rootroot#.rst: # DeployQt5 # --------- # # Functions to help assemble a standalone Qt5 executable. # # A collection of CMake utility functions useful for deploying Qt5 # executables. # # The following functions are provided by this module: # # :: # # write_qt5_conf # resolve_qt5_paths # fixup_qt5_executable # install_qt5_plugin_path # install_qt5_plugin # install_qt5_executable # # Requires CMake 2.8.9 or greater because Qt 5 does. # Also depends on BundleUtilities.cmake. # # :: # # WRITE_QT5_CONF( ) # # Writes a qt.conf file with the into . # # :: # # RESOLVE_QT5_PATHS( []) # # Loop through list and if any don't exist resolve them # relative to the (if supplied) or the # CMAKE_INSTALL_PREFIX. # # :: # # FIXUP_QT5_EXECUTABLE( [ ]) # # Copies Qt plugins, writes a Qt configuration file (if needed) and # fixes up a Qt5 executable using BundleUtilities so it is standalone # and can be drag-and-drop copied to another machine as long as all of # the system libraries are compatible. # # should point to the executable to be fixed-up. # # should contain a list of the names or paths of any Qt # plugins to be installed. # # will be passed to BundleUtilities and should be a list of any # already installed plugins, libraries or executables to also be # fixed-up. # # will be passed to BundleUtilities and should contain and # directories to be searched to find library dependencies. # # allows an custom plugins directory to be used. # # will force a qt.conf file to be written even if not # needed. # # :: # # INSTALL_QT5_PLUGIN_PATH(plugin executable copy installed_plugin_path_var ) # # Install (or copy) a resolved to the default plugins directory # (or ) relative to and store the result in # . # # If is set to TRUE then the plugins will be copied rather than # installed. This is to allow this module to be used at CMake time # rather than install time. # # If is set then anything installed will use this COMPONENT. # # :: # # INSTALL_QT5_PLUGIN(plugin executable copy installed_plugin_path_var ) # # Install (or copy) an unresolved to the default plugins # directory (or ) relative to and store the # result in . See documentation of # INSTALL_QT5_PLUGIN_PATH. # # :: # # INSTALL_QT5_EXECUTABLE( [ ]) # # Installs Qt plugins, writes a Qt configuration file (if needed) and # fixes up a Qt5 executable using BundleUtilities so it is standalone # and can be drag-and-drop copied to another machine as long as all of # the system libraries are compatible. The executable will be fixed-up # at install time. is the COMPONENT used for bundle fixup # and plugin installation. See documentation of FIXUP_QT5_BUNDLE. 
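# --- Usage sketch (editorial addition, not part of the upstream module docs) ---
# A minimal, hedged example of how INSTALL_QT5_EXECUTABLE might be called from a
# project's CMakeLists.txt after the usual install(TARGETS ...) rule. The path
# "bin/slowmoUI", the bundle name "slowmoUI.app", the variable SV_EXE and the
# "qgif" plugin name are illustrative assumptions, not values required by this
# module; the executable argument is interpreted relative to CMAKE_INSTALL_PREFIX.
#
#   if(APPLE)
#     set(SV_EXE "slowmoUI.app")      # app bundle, relative to CMAKE_INSTALL_PREFIX
#   else()
#     set(SV_EXE "bin/slowmoUI")      # plain binary, relative to CMAKE_INSTALL_PREFIX
#   endif()
#   # Copies the requested Qt plugins, writes qt.conf if needed and fixes up the
#   # installed executable with BundleUtilities at install time.
#   install_qt5_executable("${SV_EXE}" "qgif")
# -------------------------------------------------------------------------------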
#============================================================================= # Copyright 2011 Mike McQuaid # # Distributed under the OSI-approved BSD License (the "License"); # see accompanying file Copyright.txt for details. # # This software is distributed WITHOUT ANY WARRANTY; without even the # implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. # See the License for more information. #============================================================================= # (To distribute this file outside of CMake, substitute the full # License text for the above reference.) # The functions defined in this file depend on the fixup_bundle function # (and others) found in BundleUtilities.cmake include(BundleUtilities) set(DeployQt5_cmake_dir "${CMAKE_CURRENT_LIST_DIR}") set(DeployQt5_apple_plugins_dir "PlugIns") function(write_qt5_conf qt_conf_dir qt_conf_contents) set(qt_conf_path "${qt_conf_dir}/qt.conf") message(STATUS "Writing ${qt_conf_path}") file(WRITE "${qt_conf_path}" "${qt_conf_contents}") endfunction() function(resolve_qt5_paths paths_var) set(executable_path ${ARGV1}) set(paths_resolved) foreach(path ${${paths_var}}) if(EXISTS "${path}") list(APPEND paths_resolved "${path}") else() if(${executable_path}) list(APPEND paths_resolved "${executable_path}/${path}") else() list(APPEND paths_resolved "\$ENV{DESTDIR}\${CMAKE_INSTALL_PREFIX}/${path}") endif() endif() endforeach() set(${paths_var} ${paths_resolved} PARENT_SCOPE) endfunction() function(fixup_qt5_executable executable) set(qtplugins ${ARGV1}) set(libs ${ARGV2}) set(dirs ${ARGV3}) set(plugins_dir ${ARGV4}) set(request_qt_conf ${ARGV5}) message(STATUS "fixup_qt5_executable") message(STATUS " executable='${executable}'") message(STATUS " qtplugins='${qtplugins}'") message(STATUS " libs='${libs}'") message(STATUS " dirs='${dirs}'") message(STATUS " plugins_dir='${plugins_dir}'") message(STATUS " request_qt_conf='${request_qt_conf}'") if(QT_LIBRARY_DIR) list(APPEND dirs "${QT_LIBRARY_DIR}") endif() if(QT_BINARY_DIR) list(APPEND dirs "${QT_BINARY_DIR}") endif() if(APPLE) set(qt_conf_dir "${executable}/Contents/Resources") set(executable_path "${executable}") set(write_qt_conf TRUE) if(NOT plugins_dir) set(plugins_dir "${DeployQt5_apple_plugins_dir}") endif() else() get_filename_component(executable_path "${executable}" PATH) if(NOT executable_path) set(executable_path ".") endif() set(qt_conf_dir "${executable_path}") set(write_qt_conf ${request_qt_conf}) endif() foreach(plugin ${qtplugins}) set(installed_plugin_path "") install_qt5_plugin("${plugin}" "${executable}" 1 installed_plugin_path) list(APPEND libs ${installed_plugin_path}) endforeach() foreach(lib ${libs}) if(NOT EXISTS "${lib}") message(FATAL_ERROR "Library does not exist: ${lib}") endif() endforeach() resolve_qt5_paths(libs "${executable_path}") if(write_qt_conf) set(qt_conf_contents "[Paths]\nPlugins = ${plugins_dir}") write_qt5_conf("${qt_conf_dir}" "${qt_conf_contents}") endif() fixup_bundle("${executable}" "${libs}" "${dirs}") endfunction() function(install_qt5_plugin_path plugin executable copy installed_plugin_path_var) set(plugins_dir ${ARGV4}) set(component ${ARGV5}) set(configurations ${ARGV6}) if(EXISTS "${plugin}") if(APPLE) if(NOT plugins_dir) set(plugins_dir "${DeployQt5_apple_plugins_dir}") endif() set(plugins_path "${executable}/Contents/${plugins_dir}") else() get_filename_component(plugins_path "${executable}" PATH) if(NOT plugins_path) set(plugins_path ".") endif() if(plugins_dir) set(plugins_path "${plugins_path}/${plugins_dir}") 
endif() endif() set(plugin_group "") get_filename_component(plugin_path "${plugin}" PATH) get_filename_component(plugin_parent_path "${plugin_path}" PATH) get_filename_component(plugin_parent_dir_name "${plugin_parent_path}" NAME) get_filename_component(plugin_name "${plugin}" NAME) string(TOLOWER "${plugin_parent_dir_name}" plugin_parent_dir_name) if("${plugin_parent_dir_name}" STREQUAL "plugins") get_filename_component(plugin_group "${plugin_path}" NAME) set(${plugin_group_var} "${plugin_group}") endif() set(plugins_path "${plugins_path}/${plugin_group}") if(${copy}) file(MAKE_DIRECTORY "${plugins_path}") file(COPY "${plugin}" DESTINATION "${plugins_path}") else() if(configurations AND (CMAKE_CONFIGURATION_TYPES OR CMAKE_BUILD_TYPE)) set(configurations CONFIGURATIONS ${configurations}) else() unset(configurations) endif() install(FILES "${plugin}" DESTINATION "${plugins_path}" ${configurations} ${component}) endif() set(${installed_plugin_path_var} "${plugins_path}/${plugin_name}" PARENT_SCOPE) endif() endfunction() function(install_qt5_plugin plugin executable copy installed_plugin_path_var) set(plugins_dir ${ARGV4}) set(component ${ARGV5}) if(EXISTS "${plugin}") install_qt5_plugin_path("${plugin}" "${executable}" "${copy}" "${installed_plugin_path_var}" "${plugins_dir}" "${component}") else() string(TOUPPER "QT_${plugin}_PLUGIN" plugin_var) set(plugin_release_var "${plugin_var}_RELEASE") set(plugin_debug_var "${plugin_var}_DEBUG") set(plugin_release "${${plugin_release_var}}") set(plugin_debug "${${plugin_debug_var}}") if(DEFINED "${plugin_release_var}" AND DEFINED "${plugin_debug_var}" AND NOT EXISTS "${plugin_release}" AND NOT EXISTS "${plugin_debug}") message(WARNING "Qt plugin \"${plugin}\" not recognized or found.") endif() if(NOT EXISTS "${${plugin_debug_var}}") set(plugin_debug "${plugin_release}") endif() if(CMAKE_CONFIGURATION_TYPES OR CMAKE_BUILD_TYPE) install_qt5_plugin_path("${plugin_release}" "${executable}" "${copy}" "${installed_plugin_path_var}_release" "${plugins_dir}" "${component}" "Release|RelWithDebInfo|MinSizeRel") install_qt5_plugin_path("${plugin_debug}" "${executable}" "${copy}" "${installed_plugin_path_var}_debug" "${plugins_dir}" "${component}" "Debug") if(CMAKE_BUILD_TYPE MATCHES "^Debug$") set(${installed_plugin_path_var} ${${installed_plugin_path_var}_debug}) else() set(${installed_plugin_path_var} ${${installed_plugin_path_var}_release}) endif() else() install_qt5_plugin_path("${plugin_release}" "${executable}" "${copy}" "${installed_plugin_path_var}" "${plugins_dir}" "${component}") endif() endif() set(${installed_plugin_path_var} ${${installed_plugin_path_var}} PARENT_SCOPE) endfunction() function(install_qt5_executable executable) set(qtplugins ${ARGV1}) set(libs ${ARGV2}) set(dirs ${ARGV3}) set(plugins_dir ${ARGV4}) set(request_qt_conf ${ARGV5}) set(component ${ARGV6}) if(QT_LIBRARY_DIR) list(APPEND dirs "${QT_LIBRARY_DIR}") endif() if(QT_BINARY_DIR) list(APPEND dirs "${QT_BINARY_DIR}") endif() if(component) set(component COMPONENT ${component}) else() unset(component) endif() get_filename_component(executable_absolute "${executable}" ABSOLUTE) if(EXISTS "${QT_QTCORE_LIBRARY_RELEASE}") gp_file_type("${executable_absolute}" "${QT_QTCORE_LIBRARY_RELEASE}" qtcore_type) elseif(EXISTS "${QT_QTCORE_LIBRARY_DEBUG}") gp_file_type("${executable_absolute}" "${QT_QTCORE_LIBRARY_DEBUG}" qtcore_type) endif() if(qtcore_type STREQUAL "system") set(qt_plugins_dir "") endif() if(QT_IS_STATIC) message(WARNING "Qt built statically: not installing plugins.") else() 
if(APPLE) get_property(loc TARGET Qt5::QCocoaIntegrationPlugin PROPERTY LOCATION_RELEASE) install_qt5_plugin("${loc}" "${executable}" 0 installed_plugin_paths "PlugIns" "${component}") list(APPEND libs ${installed_plugin_paths}) endif() foreach(plugin ${qtplugins}) set(installed_plugin_paths "") install_qt5_plugin("${plugin}" "${executable}" 0 installed_plugin_paths "${plugins_dir}" "${component}") list(APPEND libs ${installed_plugin_paths}) endforeach() endif() resolve_qt5_paths(libs "") install(CODE "include(\"${DeployQt5_cmake_dir}/DeployQt5.cmake\") set(BU_CHMOD_BUNDLE_ITEMS TRUE) FIXUP_QT5_EXECUTABLE(\"\$ENV{DESTDIR}\${CMAKE_INSTALL_PREFIX}/${executable}\" \"\" \"${libs}\" \"${dirs}\" \"${plugins_dir}\" \"${request_qt_conf}\")" ${component} ) endfunction() slowmovideo-0.5+git20180116/src/cmake/FindFFMPEG.cmake0000664000000000000000000001230513151342440020443 0ustar rootroot# From: http://www.openlibraries.org/browser/trunk/FindFFMPEG.cmake # - Try to find FFMPEG # Once done this will define # # FFMPEG_FOUND - system has FFMPEG # FFMPEG_INCLUDE_DIR - the include directories # FFMPEG_LIBRARY_DIR - the directory containing the libraries # FFMPEG_LIBRARIES - link these to use FFMPEG # FFMPEG_SWSCALE_FOUND - FFMPEG also has SWSCALE # #SET( FFMPEG_HEADERS avformat.h avcodec.h avutil.h avdevice.h ) #SET( FFMPEG_PATH_SUFFIXES libavformat libavcodec libavutil libavdevice ) SET( FFMPEG_HEADERS avformat.h avcodec.h avutil.h ) SET( FFMPEG_PATH_SUFFIXES libavformat libavcodec libavutil ) SET( FFMPEG_SWS_HEADERS swscale.h ) SET( FFMPEG_SWS_PATH_SUFFIXES libswscale ) SET( FFMPEG_LIBRARY_DIR $ENV{FFMPEGDIR}/lib ) if( WIN32 ) #SET( FFMPEG_LIBRARIES avformat.lib avcodec.lib avutil.lib avdevice.lib ) SET( FFMPEG_LIBRARIES avformat.lib avcodec.lib avutil.lib ) SET( FFMPEG_SWS_LIBRARIES swscale.lib ) SET( FFMPEG_LIBRARY_DIR $ENV{FFMPEGDIR}\\lib ) SET( FFMPEG_INCLUDE_PATHS $ENV{FFMPEGDIR}\\include ) # check to see if we can find swscale SET( TMP_ TMP-NOTFOUND ) FIND_PATH( TMP_ ${FFMPEG_SWS_LIBRARIES} PATHS ${FFMPEG_LIBRARY_DIR} ) IF ( TMP_ ) SET( SWSCALE_FOUND TRUE ) ENDIF( TMP_ ) else( WIN32 ) #SET( FFMPEG_LIBRARIES avformat avcodec avutil avdevice ) SET( FFMPEG_LIBRARIES avformat avcodec avutil ) SET( FFMPEG_SWS_LIBRARIES swscale ) INCLUDE(FindPkgConfig) if ( PKG_CONFIG_FOUND ) pkg_check_modules( AVFORMAT libavformat ) pkg_check_modules( AVCODEC libavcodec ) pkg_check_modules( AVUTIL libavutil ) #pkg_check_modules( AVDEVICE libavdevice ) pkg_check_modules( SWSCALE libswscale ) endif ( PKG_CONFIG_FOUND ) SET( FFMPEG_LIBRARY_DIR ${AVFORMAT_LIBRARY_DIRS} ${AVCODEC_LIBRARY_DIRS} ${AVUTIL_LIBRARY_DIRS} #${AVDEVICE_LIBRARY_DIRS} ) SET( FFMPEG_INCLUDE_PATHS ${AVFORMAT_INCLUDE_DIRS} ${AVCODEC_INCLUDE_DIRS} ${AVUTIL_INCLUDE_DIRS} #${AVDEVICE_INCLUDE_DIRS} ) # check to see if we can find swscale FIND_LIBRARY(LIB_SWSCALE_ swscale ${FFMPEG_LIBRARY_DIR} ) IF ( LIB_SWSCALE_ ) SET( SWSCALE_FOUND TRUE ) ENDIF( LIB_SWSCALE_ ) endif( WIN32 ) # add in swscale if found IF ( SWSCALE_FOUND ) SET( FFMPEG_LIBRARY_DIR ${FFMPEG_LIBRARY_DIR} ${SWSCALE_LIBRARY_DIRS} ) SET( FFMPEG_INCLUDE_PATHS ${FFMPEG_INCLUDE_PATHS} ${SWSCALE_INCLUDE_DIRS} ) SET( FFMPEG_HEADERS ${FFMPEG_HEADERS} ${FFMPEG_SWS_HEADERS} ) SET( FFMPEG_PATH_SUFFIXES ${FFMPEG_PATH_SUFFIXES} ${FFMPEG_SWS_PATH_SUFFIXES} ) SET( FFMPEG_LIBRARIES ${FFMPEG_LIBRARIES} ${FFMPEG_SWS_LIBRARIES} ) ENDIF ( SWSCALE_FOUND ) # find includes SET( INC_SUCCESS 0 ) SET( TMP_ TMP-NOTFOUND ) SET( FFMPEG_INCLUDE_DIR ${FFMPEG_INCLUDE_PATHS} ) FOREACH( INC_ ${FFMPEG_HEADERS} ) message(STATUS 
"checking: ${INC_}" ) FIND_PATH( TMP_ ${INC_} PATHS ${FFMPEG_INCLUDE_PATHS} PATH_SUFFIXES ${FFMPEG_PATH_SUFFIXES} ) IF ( TMP_ ) message(STATUS " ${TMP_}" ) MATH( EXPR INC_SUCCESS ${INC_SUCCESS}+1 ) SET( FFMPEG_INCLUDE_DIR ${FFMPEG_INCLUDE_DIR} ${TMP_} ) ENDIF ( TMP_ ) SET( TMP_ TMP-NOTFOUND ) ENDFOREACH( INC_ ) list(LENGTH FFMPEG_INCLUDE_DIR LENGTH_INCLUDES) list(LENGTH FFMPEG_LIBRARY_DIR LENGTH_LIBRARIES) # clear out duplicates if(LENGTH_INCLUDES GREATER 0) LIST( REMOVE_DUPLICATES FFMPEG_INCLUDE_DIR ) endif(LENGTH_INCLUDES GREATER 0) if(LENGTH_LIBRARIES GREATER 0) LIST( REMOVE_DUPLICATES FFMPEG_LIBRARY_DIR ) endif(LENGTH_LIBRARIES GREATER 0) # find the full paths of the libraries SET( TMP_ TMP-NOTFOUND ) IF ( NOT WIN32 ) FOREACH( LIB_ ${FFMPEG_LIBRARIES} ) FIND_LIBRARY( TMP_ NAMES ${LIB_} PATHS ${FFMPEG_LIBRARY_DIR} ) IF ( TMP_ ) SET( FFMPEG_LIBRARIES_FULL ${FFMPEG_LIBRARIES_FULL} ${TMP_} ) ENDIF ( TMP_ ) SET( TMP_ TMP-NOTFOUND ) ENDFOREACH( LIB_ ) SET ( FFMPEG_LIBRARIES ${FFMPEG_LIBRARIES_FULL} ) ENDIF( NOT WIN32 ) LIST( LENGTH FFMPEG_HEADERS LIST_SIZE_ ) SET( FFMPEG_FOUND FALSE ) SET( FFMPEG_SWSCALE_FOUND FALSE ) IF ( ${INC_SUCCESS} EQUAL ${LIST_SIZE_} ) SET( FFMPEG_FOUND TRUE ) SET( FFMPEG_SWSCALE_FOUND ${SWSCALE_FOUND} ) ENDIF ( ${INC_SUCCESS} EQUAL ${LIST_SIZE_} ) string(REPLACE "/usr/include/" "" FFMPEG_INCLUDE_DIR "${FFMPEG_INCLUDE_DIR}") # add libx264 ... if (APPLE) FIND_LIBRARY(LIB_X264 NAMES x264 libx264.a PATHS /usr/local/lib $ENV{LIBX264DIR}/lib DOC "x264 library" ) #SET(FFMPEG_LIBRARIES ${FFMPEG_LIBRARIES} "/Users/val/Documents/Sources/slowlib/lib/libx264.a" ) SET(FFMPEG_LIBRARIES ${FFMPEG_LIBRARIES} ${LIB_X264} ) message(STATUS "x264 found in: ${LIB_X264}" ) endif() # On OS X we ffmpeg libraries depend on VideoDecodeAcceleration and CoreVideo frameworks IF (APPLE) SET(FFMPEG_LIBRARIES ${FFMPEG_LIBRARIES} "-framework CoreFoundation -framework QuartzCore -framework VideoDecodeAcceleration -liconv -lbz2 -lz") ENDIF() slowmovideo-0.5+git20180116/src/cmake/macros.cmake0000664000000000000000000000125513151342440020164 0ustar rootroot macro(copy_files FILELIST) foreach (FILENAME ${FILELIST}) set(FILE_SRC "${CMAKE_CURRENT_SOURCE_DIR}/${FILENAME}") set(FILE_DST "${CMAKE_CURRENT_BINARY_DIR}/${FILENAME}") message(STATUS "Copying " ${FILENAME} " to " ${FILE_DST}) configure_file(${FILE_SRC} ${FILE_DST} COPYONLY) endforeach(FILENAME) endmacro(copy_files) macro(copy_files_basedir FILELIST) foreach (FILENAME ${FILELIST}) set(FILE_SRC "${CMAKE_CURRENT_SOURCE_DIR}/${FILENAME}") set(FILE_DST "${CMAKE_BINARY_DIR}/${FILENAME}") message(STATUS "Copying ${FILE_SRC} to ${FILE_DST}") configure_file(${FILE_SRC} ${FILE_DST} COPYONLY) endforeach(FILENAME) endmacro(copy_files_basedir) slowmovideo-0.5+git20180116/src/V3D/0000775000000000000000000000000013151342440015207 5ustar rootrootslowmovideo-0.5+git20180116/src/V3D/COPYING.TXT0000664000000000000000000001672713151342440016735 0ustar rootroot GNU LESSER GENERAL PUBLIC LICENSE Version 3, 29 June 2007 Copyright (C) 2007 Free Software Foundation, Inc. Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed. This version of the GNU Lesser General Public License incorporates the terms and conditions of version 3 of the GNU General Public License, supplemented by the additional permissions listed below. 0. Additional Definitions. As used herein, "this License" refers to version 3 of the GNU Lesser General Public License, and the "GNU GPL" refers to version 3 of the GNU General Public License. 
"The Library" refers to a covered work governed by this License, other than an Application or a Combined Work as defined below. An "Application" is any work that makes use of an interface provided by the Library, but which is not otherwise based on the Library. Defining a subclass of a class defined by the Library is deemed a mode of using an interface provided by the Library. A "Combined Work" is a work produced by combining or linking an Application with the Library. The particular version of the Library with which the Combined Work was made is also called the "Linked Version". The "Minimal Corresponding Source" for a Combined Work means the Corresponding Source for the Combined Work, excluding any source code for portions of the Combined Work that, considered in isolation, are based on the Application, and not on the Linked Version. The "Corresponding Application Code" for a Combined Work means the object code and/or source code for the Application, including any data and utility programs needed for reproducing the Combined Work from the Application, but excluding the System Libraries of the Combined Work. 1. Exception to Section 3 of the GNU GPL. You may convey a covered work under sections 3 and 4 of this License without being bound by section 3 of the GNU GPL. 2. Conveying Modified Versions. If you modify a copy of the Library, and, in your modifications, a facility refers to a function or data to be supplied by an Application that uses the facility (other than as an argument passed when the facility is invoked), then you may convey a copy of the modified version: a) under this License, provided that you make a good faith effort to ensure that, in the event an Application does not supply the function or data, the facility still operates, and performs whatever part of its purpose remains meaningful, or b) under the GNU GPL, with none of the additional permissions of this License applicable to that copy. 3. Object Code Incorporating Material from Library Header Files. The object code form of an Application may incorporate material from a header file that is part of the Library. You may convey such object code under terms of your choice, provided that, if the incorporated material is not limited to numerical parameters, data structure layouts and accessors, or small macros, inline functions and templates (ten or fewer lines in length), you do both of the following: a) Give prominent notice with each copy of the object code that the Library is used in it and that the Library and its use are covered by this License. b) Accompany the object code with a copy of the GNU GPL and this license document. 4. Combined Works. You may convey a Combined Work under terms of your choice that, taken together, effectively do not restrict modification of the portions of the Library contained in the Combined Work and reverse engineering for debugging such modifications, if you also do each of the following: a) Give prominent notice with each copy of the Combined Work that the Library is used in it and that the Library and its use are covered by this License. b) Accompany the Combined Work with a copy of the GNU GPL and this license document. c) For a Combined Work that displays copyright notices during execution, include the copyright notice for the Library among these notices, as well as a reference directing the user to the copies of the GNU GPL and this license document. 
d) Do one of the following: 0) Convey the Minimal Corresponding Source under the terms of this License, and the Corresponding Application Code in a form suitable for, and under terms that permit, the user to recombine or relink the Application with a modified version of the Linked Version to produce a modified Combined Work, in the manner specified by section 6 of the GNU GPL for conveying Corresponding Source. 1) Use a suitable shared library mechanism for linking with the Library. A suitable mechanism is one that (a) uses at run time a copy of the Library already present on the user's computer system, and (b) will operate properly with a modified version of the Library that is interface-compatible with the Linked Version. e) Provide Installation Information, but only if you would otherwise be required to provide such information under section 6 of the GNU GPL, and only to the extent that such information is necessary to install and execute a modified version of the Combined Work produced by recombining or relinking the Application with a modified version of the Linked Version. (If you use option 4d0, the Installation Information must accompany the Minimal Corresponding Source and Corresponding Application Code. If you use option 4d1, you must provide the Installation Information in the manner specified by section 6 of the GNU GPL for conveying Corresponding Source.) 5. Combined Libraries. You may place library facilities that are a work based on the Library side by side in a single library together with other library facilities that are not Applications and are not covered by this License, and convey such a combined library under terms of your choice, if you do both of the following: a) Accompany the combined library with a copy of the same work based on the Library, uncombined with any other library facilities, conveyed under the terms of this License. b) Give prominent notice with the combined library that part of it is a work based on the Library, and explaining where to find the accompanying uncombined form of the same work. 6. Revised Versions of the GNU Lesser General Public License. The Free Software Foundation may publish revised and/or new versions of the GNU Lesser General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns. Each version is given a distinguishing version number. If the Library as you received it specifies that a certain numbered version of the GNU Lesser General Public License "or any later version" applies to it, you have the option of following the terms and conditions either of that published version or of any later version published by the Free Software Foundation. If the Library as you received it does not specify a version number of the GNU Lesser General Public License, you may choose any version of the GNU Lesser General Public License ever published by the Free Software Foundation. If the Library as you received it specifies that a proxy can decide whether future versions of the GNU Lesser General Public License shall apply, that proxy's public statement of acceptance of any version is permanent authorization for you to choose that version for the Library. 
slowmovideo-0.5+git20180116/src/V3D/Math/0000775000000000000000000000000013151342440016100 5ustar rootrootslowmovideo-0.5+git20180116/src/V3D/Math/v3d_linear.h0000664000000000000000000004235113151342440020304 0ustar rootroot// -*- C++ -*- #ifndef V3D_LINEAR_H #define V3D_LINEAR_H #include "Math/v3d_linearbase.h" #include namespace V3D { template struct InlineVector : public InlineVectorBase { InlineVector() { } template explicit InlineVector(InlineVector const& a) { for(int i=0; i_vec[i] = (Elem2)a[i]; } template InlineVector slice( int first ) const { InlineVector vec; copyVectorSlice(*this,first,size,vec,0); return vec; } }; // end struct InlineVector template struct InlineVector : public InlineVectorBase { InlineVector() { } InlineVector(Elem a, Elem b) { InlineVectorBase::_vec[0] = a; InlineVectorBase::_vec[1] = b; } template explicit InlineVector(InlineVector const& a) { for(int i=0; i<2; i++) this->_vec[i] = (Elem2)a[i]; } template InlineVector slice( int first ) const { InlineVector vec; copyVectorSlice(*this,first,size,vec,0); return vec; } }; // end struct InlineVector template struct InlineVector : public InlineVectorBase { InlineVector() { } InlineVector(Elem a, Elem b, Elem c) { InlineVectorBase::_vec[0] = a; InlineVectorBase::_vec[1] = b; InlineVectorBase::_vec[2] = c; } InlineVector(InlineVector const &vec, Elem c) { InlineVectorBase::_vec[0] = vec[0]; InlineVectorBase::_vec[1] = vec[1]; InlineVectorBase::_vec[2] = c; } template explicit InlineVector(InlineVector const& a) { for(int i=0; i<3; i++) this->_vec[i] = (Elem2)a[i]; } template InlineVector slice( int first ) const { InlineVector vec; copyVectorSlice(*this,first,size,vec,0); return vec; } }; // end struct InlineVector template struct InlineVector : public InlineVectorBase { InlineVector() { } InlineVector(Elem a, Elem b, Elem c, Elem d) { InlineVectorBase::_vec[0] = a; InlineVectorBase::_vec[1] = b; InlineVectorBase::_vec[2] = c; InlineVectorBase::_vec[3] = d; } InlineVector( InlineVector const& v, Elem d ) { InlineVectorBase::_vec[0] = v[0]; InlineVectorBase::_vec[1] = v[1]; InlineVectorBase::_vec[2] = v[2]; InlineVectorBase::_vec[3] = d; } template explicit InlineVector(InlineVector const& a) { for(int i=0; i<4; i++) this->_vec[i] = (Elem2)a[i]; } template InlineVector slice( int first ) const { InlineVector vec; copyVectorSlice(*this,first,size,vec,0); return vec; } }; // end struct InlineVector template struct Vector : public VectorBase { Vector() : VectorBase() { } Vector(unsigned int size) : VectorBase(size) { } Vector(unsigned int size, Elem * values) : VectorBase(size, values) { } Vector(Vector const& a) : VectorBase(a) { } Vector& operator=(Vector const& a) { (VectorBase::operator=)(a); return *this; } Vector& operator+=(Vector const& rhs) { addVectorsIP(rhs, *this); return *this; } Vector& operator*=(Elem scale) { scaleVectorIP(scale, *this); return *this; } Vector operator+(Vector const& rhs) const { Vector res(this->size()); addVectors(*this, rhs, res); return res; } Vector operator-(Vector const& rhs) const { Vector res(this->size()); subtractVectors(*this, rhs, res); return res; } Elem operator*(Vector const& rhs) const { return innerProduct(*this, rhs); } }; // end struct Vector template struct InlineMatrix : public InlineMatrixBase { template InlineMatrix slice( int row, int col ) const { InlineMatrix mat; copyMatrixSlice(*this,row,col,numRows,numCols,mat,0,0); return mat; } InlineVector row( int row ) const { InlineVector vec; this->getRowSlice(row,0,Cols,vec); return vec; } InlineVector col( int col 
) const { InlineVector vec; this->getColumnSlice(0,Rows,col,vec); return vec; } InlineMatrix transposed() const { return transposedMatrix(*this); } }; // end struct InlineMatrix template struct Matrix : public MatrixBase { Matrix() : MatrixBase() { } Matrix(unsigned int rows, unsigned int cols) : MatrixBase(rows, cols) { } Matrix(unsigned int rows, unsigned int cols, Elem value) : MatrixBase(rows, cols) { fillMatrix(*this, value); } Matrix(unsigned int rows, unsigned int cols, Elem * values) : MatrixBase(rows, cols, values) { } Matrix(Matrix const& a) : MatrixBase(a) { } Matrix& operator=(Matrix const& a) { (MatrixBase::operator=)(a); return *this; } Matrix& operator+=(Matrix const& rhs) { addMatricesIP(rhs, *this); return *this; } Matrix& operator*=(Elem scale) { scaleMatrixIP(scale, *this); return *this; } Matrix operator+(Matrix const& rhs) const { Matrix res(this->num_rows(), this->num_cols()); addMatrices(*this, rhs, res); return res; } Matrix operator-(Matrix const& rhs) const { Matrix res(this->num_rows(), this->num_cols()); subtractMatrices(*this, rhs, res); return res; } }; // end struct Matrix //---------------------------------------------------------------------- typedef InlineVector Vector2f; typedef InlineVector Vector2d; typedef InlineVector Vector3f; typedef InlineVector Vector3d; typedef InlineVector Vector4f; typedef InlineVector Vector4d; typedef InlineVector Vector3b; // For color specifications e.g. typedef InlineMatrix Matrix2x2f; typedef InlineMatrix Matrix2x2d; typedef InlineMatrix Matrix3x3f; typedef InlineMatrix Matrix3x3d; typedef InlineMatrix Matrix4x4f; typedef InlineMatrix Matrix4x4d; typedef InlineMatrix Matrix2x3f; typedef InlineMatrix Matrix2x3d; typedef InlineMatrix Matrix3x4f; typedef InlineMatrix Matrix3x4d; template struct VectorArray { VectorArray(unsigned count, unsigned size) : _count(count), _size(size), _values(0), _vectors(0) { unsigned const nTotal = _count * _size; if (count > 0) _vectors = new Vector[count]; if (nTotal > 0) _values = new Elem[nTotal]; for (unsigned i = 0; i < _count; ++i) new (&_vectors[i]) Vector(_size, _values + i*_size); } VectorArray(unsigned count, unsigned size, Elem initVal) : _count(count), _size(size), _values(0), _vectors(0) { unsigned const nTotal = _count * _size; if (count > 0) _vectors = new Vector[count]; if (nTotal > 0) _values = new Elem[nTotal]; for (unsigned i = 0; i < _count; ++i) new (&_vectors[i]) Vector(_size, _values + i*_size); std::fill(_values, _values + nTotal, initVal); } ~VectorArray() { delete [] _values; delete [] _vectors; } unsigned count() const { return _count; } unsigned size() const { return _size; } //! Get the submatrix at position ix Vector const& operator[](unsigned ix) const { return _vectors[ix]; } //! Get the submatrix at position ix Vector& operator[](unsigned ix) { return _vectors[ix]; } protected: unsigned _count, _size; Elem * _values; Vector * _vectors; private: VectorArray(VectorArray const&); void operator=(VectorArray const&); }; template struct MatrixArray { MatrixArray(unsigned count, unsigned nRows, unsigned nCols) : _count(count), _rows(nRows), _columns(nCols), _values(0), _matrices(0) { unsigned const nTotal = _count * _rows * _columns; if (count > 0) _matrices = new Matrix[count]; if (nTotal > 0) _values = new double[nTotal]; for (unsigned i = 0; i < _count; ++i) new (&_matrices[i]) Matrix(_rows, _columns, _values + i*(_rows*_columns)); } ~MatrixArray() { delete [] _matrices; delete [] _values; } //! 
Get the submatrix at position ix Matrix const& operator[](unsigned ix) const { return _matrices[ix]; } //! Get the submatrix at position ix Matrix& operator[](unsigned ix) { return _matrices[ix]; } unsigned count() const { return _count; } unsigned num_rows() const { return _rows; } unsigned num_cols() const { return _columns; } protected: unsigned _count, _rows, _columns; double * _values; Matrix * _matrices; private: MatrixArray(MatrixArray const&); void operator=(MatrixArray const&); }; //---------------------------------------------------------------------- template inline InlineVector homogenizeVector(InlineVector const& v) { InlineVector res; copyVectorSlice(v, 0, Size, res, 0); res[Size] = 1; return res; } template inline InlineVector operator-(InlineVector const& v) { InlineVector res; scaleVector(-1, v, res); return res; } template inline InlineVector operator+(InlineVector const& v, InlineVector const& w) { InlineVector res; addVectors(v, w, res); return res; } template inline InlineVector operator-(InlineVector const& v, InlineVector const& w) { InlineVector res; subtractVectors(v, w, res); return res; } template inline InlineMatrix operator-(InlineMatrix const& A, InlineMatrix const& B) { InlineMatrix res; subtractMatrices(A, B, res); return res; } template inline InlineMatrix operator+(InlineMatrix const& A, InlineMatrix const& B) { InlineMatrix res; addMatrices(A, B, res); return res; } template inline InlineVector operator*(Elem scale, InlineVector const& v) { InlineVector res; scaleVector(scale, v, res); return res; } template inline InlineMatrix operator*(Elem scale, InlineMatrix const& A) { InlineMatrix res; scaleMatrix(A, scale, res); return res; } template inline InlineVector operator*(InlineMatrix const& A, InlineVector const& v) { InlineVector res; multiply_A_v(A, v, res); return res; } template inline InlineVector operator*(InlineVector const& v, InlineMatrix const& A) { InlineVector res; multiply_At_v(A, v, res); return res; } template inline InlineMatrix operator*(InlineMatrix const& A, InlineMatrix const& B) { InlineMatrix res; multiply_A_B(A, B, res); return res; } template inline InlineMatrix transposedMatrix(InlineMatrix const& A) { InlineMatrix At; makeTransposedMatrix(A, At); return At; } template inline InlineMatrix invertedMatrix(InlineMatrix const& A) { Elem a = A[0][0], b = A[0][1], c = A[0][2]; Elem d = A[1][0], e = A[1][1], f = A[1][2]; Elem g = A[2][0], h = A[2][1], i = A[2][2]; Elem const det = a*e*i + b*f*g + c*d*h - c*e*g - b*d*i - a*f*h; InlineMatrix res; res[0][0] = e*i-f*h; res[0][1] = c*h-b*i; res[0][2] = b*f-c*e; res[1][0] = f*g-d*i; res[1][1] = a*i-c*g; res[1][2] = c*d-a*f; res[2][0] = d*h-e*g; res[2][1] = b*g-a*h; res[2][2] = a*e-b*d; scaleMatrixIP(1.0/det, res); return res; } template inline InlineMatrix outerProductMatrix(InlineVector const& u, InlineVector const& v) { InlineMatrix mat; makeOuterProductMatrix(u,v,mat); return mat; } template inline InlineMatrix crossProductMatrix(InlineVector const& v) { InlineMatrix res; makeCrossProductMatrix(v, res); return res; } template inline InlineVector crossProduct(InlineVector const& u, InlineVector const& v) { InlineVector res; makeCrossProductVector(u,v,res); return res; } template inline InlineVector makeVector2(Elem a, Elem b) { InlineVector res; res[0] = a; res[1] = b; return res; } template inline InlineVector makeVector3(Elem a, Elem b, Elem c) { InlineVector res; res[0] = a; res[1] = b; res[2] = c; return res; } //====================================================================== 
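//----------------------------------------------------------------------
// Usage sketch (editorial addition, not part of the original header): a small,
// illustrative example of the fixed-size vector/matrix API declared above. Only
// types and functions from v3d_linear.h / v3d_linearbase.h are used (Vector3d,
// Matrix3x3d, makeVector3, crossProduct, makeIdentityMatrix, invertedMatrix,
// operator*); the variable names themselves are arbitrary.
//
//   V3D::Vector3d u = V3D::makeVector3(1.0, 0.0, 0.0);
//   V3D::Vector3d v = V3D::makeVector3(0.0, 1.0, 0.0);
//   V3D::Vector3d w = V3D::crossProduct(u, v);          // == (0, 0, 1)
//
//   V3D::Matrix3x3d R;
//   V3D::makeIdentityMatrix(R);                         // from v3d_linearbase.h
//   V3D::Vector3d x = R * w;                            // matrix-vector product
//   V3D::Matrix3x3d Rinv = V3D::invertedMatrix(R);      // closed-form 3x3 inverse
//----------------------------------------------------------------------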
template inline Vector operator*(Matrix const& A, Vector const& v) { Vector res(A.num_rows()); multiply_A_v(A, v, res); return res; } //====================================================================== template inline void displayVector(Vec const& v) { using namespace std; cout << "[ "; for (int r = 0; r < v.size(); ++r) cout << v[r] << " "; cout << "]" << endl; } template inline void displayMatrix(Mat const& A) { using namespace std; cout << "[ "; for (int r = 0; r < A.num_rows(); ++r) { for (int c = 0; c < A.num_cols(); ++c) cout << A[r][c] << " "; if (r < A.num_rows()-1) cout << endl; else cout << "]" << endl; } } } // end namespace V3D #endif slowmovideo-0.5+git20180116/src/V3D/Math/v3d_linearbase.h0000664000000000000000000011271713151342440021143 0ustar rootroot// -*- C++ -*- #ifndef V3D_LINEAR_BASE_H #define V3D_LINEAR_BASE_H #include #include #include #include #include "Base/v3d_serialization.h" namespace V3D { using namespace std; //! Unboxed vector type template struct InlineVectorBase { typedef Elem value_type; typedef Elem element_type; typedef Elem const * const_iterator; typedef Elem * iterator; static unsigned int size() { return Size; } Elem& operator[](unsigned int i) { return _vec[i]; } Elem operator[](unsigned int i) const { return _vec[i]; } Elem& operator()(unsigned int i) { return _vec[i-1]; } Elem operator()(unsigned int i) const { return _vec[i-1]; } const_iterator begin() const { return _vec; } iterator begin() { return _vec; } const_iterator end() const { return _vec + Size; } iterator end() { return _vec + Size; } void newsize(unsigned int sz) const { assert(sz == Size); } template void serialize(Archive& ar) { SerializationScope scope(ar); int sz = Size; ar & sz; if (ar.isLoading()) this->newsize(sz); for (int i = 0; i < sz; ++i) ar & _vec[i]; } V3D_DEFINE_LOAD_SAVE(InlineVectorBase); protected: Elem _vec[Size]; }; //V3D_DEFINE_TEMPLATE_IOSTREAM_OPS(InlineVectorBase); //! Boxed (heap allocated) vector. 
template struct VectorBase { typedef Elem value_type; typedef Elem element_type; typedef Elem const * const_iterator; typedef Elem * iterator; VectorBase() : _size(0), _ownsVec(true), _vec(0) { } VectorBase(unsigned int size) : _size(size), _ownsVec(true), _vec(0) { if (size > 0) _vec = new Elem[size]; } VectorBase(unsigned int size, Elem * values) : _size(size), _ownsVec(false), _vec(values) { } VectorBase(VectorBase const& a) : _size(0), _ownsVec(true), _vec(0) { _size = a._size; if (_size == 0) return; _vec = new Elem[_size]; std::copy(a._vec, a._vec + _size, _vec); } ~VectorBase() { if (_ownsVec && _vec != 0) delete [] _vec; } VectorBase& operator=(VectorBase const& a) { if (this == &a) return *this; this->newsize(a._size); std::copy(a._vec, a._vec + _size, _vec); return *this; } unsigned int size() const { return _size; } VectorBase& newsize(unsigned int sz) { if (sz == _size) return *this; assert(_ownsVec); __destroy(); _size = sz; if (_size > 0) _vec = new Elem[_size]; return *this; } Elem& operator[](unsigned int i) { return _vec[i]; } Elem operator[](unsigned int i) const { return _vec[i]; } Elem& operator()(unsigned int i) { return _vec[i-1]; } Elem operator()(unsigned int i) const { return _vec[i-1]; } const_iterator begin() const { return _vec; } iterator begin() { return _vec; } const_iterator end() const { return _vec + _size; } iterator end() { return _vec + _size; } template void serialize(Archive& ar) { SerializationScope scope(ar); int sz = _size; ar & sz; if (ar.isLoading()) this->newsize(sz); for (int i = 0; i < sz; ++i) ar & _vec[i]; } V3D_DEFINE_LOAD_SAVE(VectorBase); protected: void __destroy() { assert(_ownsVec); if (_vec != 0) delete [] _vec; _size = 0; _vec = 0; } unsigned int _size; bool _ownsVec; Elem * _vec; }; //V3D_DEFINE_TEMPLATE_IOSTREAM_OPS(VectorBase); template struct InlineMatrixBase { typedef Elem value_type; typedef Elem element_type; typedef Elem * iterator; typedef Elem const * const_iterator; static unsigned int num_rows() { return Rows; } static unsigned int num_cols() { return Cols; } Elem * operator[](unsigned int row) { return _m[row]; } Elem const * operator[](unsigned int row) const { return _m[row]; } Elem& operator()(unsigned int row, unsigned int col) { return _m[row-1][col-1]; } Elem operator()(unsigned int row, unsigned int col) const { return _m[row-1][col-1]; } template void getRowSlice(unsigned int row, unsigned int first, unsigned int len, Vec& dst) const { for (unsigned int c = 0; c < len; ++c) dst[c] = _m[row][c+first]; } template void getColumnSlice(unsigned int first, unsigned int len, unsigned int col, Vec& dst) const { for (unsigned int r = 0; r < len; ++r) dst[r] = _m[r+first][col]; } template void setRowSlice(unsigned int row, unsigned int first, unsigned int len, const Vec& src) { for (unsigned int c = 0; c < len; ++c) _m[row][c+first] = src[c]; } template void setColumnSlice(unsigned int first, unsigned int len, unsigned int col, const Vec& src) { for (unsigned int r = 0; r < len; ++r) _m[r+first][col] = src[r]; } void newsize(unsigned int rows, unsigned int cols) const { assert(rows == Rows && cols == Cols); } const_iterator begin() const { return &_m[0][0]; } iterator begin() { return &_m[0][0]; } const_iterator end() const { return &_m[0][0] + Rows*Cols; } iterator end() { return &_m[0][0] + Rows*Cols; } template void serialize(Archive& ar) { SerializationScope scope(ar); int n = Rows, m = Cols; ar & n & m; if (ar.isLoading()) this->newsize(n, m); for (int r = 0; r < n; ++r) for (int c = 0; c < m; ++c) ar & _m[r][c]; } 
V3D_DEFINE_LOAD_SAVE(InlineMatrixBase); protected: Elem _m[Rows][Cols]; }; //V3D_DEFINE_TEMPLATE_IOSTREAM_OPS(InlineMatrixBase); template struct MatrixBase { typedef Elem value_type; typedef Elem element_type; typedef Elem const * const_iterator; typedef Elem * iterator; MatrixBase() : _rows(0), _cols(0), _ownsData(true), _m(0) { } MatrixBase(unsigned int rows, unsigned int cols) : _rows(rows), _cols(cols), _ownsData(true), _m(0) { if (_rows * _cols == 0) return; _m = new Elem[rows*cols]; } MatrixBase(unsigned int rows, unsigned int cols, Elem * values) : _rows(rows), _cols(cols), _ownsData(false), _m(values) { } MatrixBase(MatrixBase const& a) : _ownsData(true), _m(0) { _rows = a._rows; _cols = a._cols; if (_rows * _cols == 0) return; _m = new Elem[_rows*_cols]; std::copy(a._m, a._m+_rows*_cols, _m); } ~MatrixBase() { if (_ownsData && _m != 0) delete [] _m; } MatrixBase& operator=(MatrixBase const& a) { if (this == &a) return *this; this->newsize(a.num_rows(), a.num_cols()); std::copy(a._m, a._m+_rows*_cols, _m); return *this; } void newsize(unsigned int rows, unsigned int cols) { if (rows == _rows && cols == _cols) return; assert(_ownsData); __destroy(); _rows = rows; _cols = cols; if (_rows * _cols == 0) return; _m = new Elem[rows*cols]; } unsigned int num_rows() const { return _rows; } unsigned int num_cols() const { return _cols; } Elem * operator[](unsigned int row) { return _m + row*_cols; } Elem const * operator[](unsigned int row) const { return _m + row*_cols; } Elem& operator()(unsigned int row, unsigned int col) { return _m[(row-1)*_cols + col-1]; } Elem operator()(unsigned int row, unsigned int col) const { return _m[(row-1)*_cols + col-1]; } const_iterator begin() const { return _m; } iterator begin() { return _m; } const_iterator end() const { return _m + _rows*_cols; } iterator end() { return _m + _rows*_cols; } template void getRowSlice(unsigned int row, unsigned int first, unsigned int last, Vec& dst) const { Elem const * v = (*this)[row]; for (unsigned int c = first; c < last; ++c) dst[c-first] = v[c]; } template void getColumnSlice(unsigned int first, unsigned int len, unsigned int col, Vec& dst) const { for (unsigned int r = 0; r < len; ++r) dst[r] = (*this)[r+first][col]; } template void setRowSlice(unsigned int row, unsigned int first, unsigned int len, const Vec& src) { Elem * v = (*this)[row]; for (unsigned int c = 0; c < len; ++c) v[c+first] = src[c]; } template void setColumnSlice(unsigned int first, unsigned int len, unsigned int col, const Vec& src) { for (unsigned int r = 0; r < len; ++r) (*this)[r+first][col] = src[r]; } template void serialize(Archive& ar) { SerializationScope scope(ar); int n = _rows, m = _cols; ar & n & m; if (ar.isLoading()) this->newsize(n, m); for (int i = 0; i < n*m; ++i) ar & _m[i]; } V3D_DEFINE_LOAD_SAVE(MatrixBase); protected: void __destroy() { assert(_ownsData); if (_m != 0) delete [] _m; _m = 0; _rows = _cols = 0; } unsigned int _rows, _cols; bool _ownsData; Elem * _m; }; //V3D_DEFINE_TEMPLATE_IOSTREAM_OPS(MatrixBase); //---------------------------------------------------------------------- template struct CCS_Matrix { typedef T value_type; typedef T element_type; CCS_Matrix() : _rows(0), _cols(0) { } CCS_Matrix(int const rows, int const cols, vector > const& nonZeros) : _rows(rows), _cols(cols) { this->initialize(nonZeros); } CCS_Matrix(int const rows, int const cols, vector > const& nonZeros, vector const& values) : _rows(rows), _cols(cols) { assert(nonZeros.size() == values.size()); this->initialize(nonZeros); for (size_t i = 
0; i < values.size(); ++i) _values[_destIdxs[i]] = values[i]; } CCS_Matrix(CCS_Matrix const& b) : _rows(b._rows), _cols(b._cols), _colStarts(b._colStarts), _rowIdxs(b._rowIdxs), _destIdxs(b._destIdxs), _values(b._values) { } CCS_Matrix& operator=(CCS_Matrix const& b) { if (this == &b) return *this; _rows = b._rows; _cols = b._cols; _colStarts = b._colStarts; _rowIdxs = b._rowIdxs; _destIdxs = b._destIdxs; _values = b._values; return *this; } void create(int const rows, int const cols, vector > const& nonZeros) { _rows = rows; _cols = cols; this->initialize(nonZeros); } unsigned int num_rows() const { return _rows; } unsigned int num_cols() const { return _cols; } int getNonzeroCount() const { return _values.size(); } T const * getValues() const { return &_values[0]; } T * getValues() { return &_values[0]; } int const * getDestIndices() const { return &_destIdxs[0]; } int const * getColumnStarts() const { return &_colStarts[0]; } int const * getRowIndices() const { return &_rowIdxs[0]; } void getRowRange(unsigned int col, unsigned int& firstRow, unsigned int& lastRow) const { firstRow = _rowIdxs[_colStarts[col]]; lastRow = _rowIdxs[_colStarts[col+1]-1]+1; } template void getColumnSlice(unsigned int first, unsigned int len, unsigned int col, Vec& dst) const { unsigned int const last = first + len; for (int r = 0; r < len; ++r) dst[r] = 0; // Fill vector with zeros int const colStart = _colStarts[col]; int const colEnd = _colStarts[col+1]; int i = colStart; int r; // Skip rows less than the given start row while (i < colEnd && (r = _rowIdxs[i]) < first) ++i; // Copy elements until the final row while (i < colEnd && (r = _rowIdxs[i]) < last) { dst[r-first] = _values[i]; ++i; } } // end getColumnSlice() int getColumnNonzeroCount(unsigned int col) const { int const colStart = _colStarts[col]; int const colEnd = _colStarts[col+1]; return colEnd - colStart; } template void getSparseColumn(unsigned int col, VecA& rows, VecB& values) const { int const colStart = _colStarts[col]; int const colEnd = _colStarts[col+1]; int const nnz = colEnd - colStart; for (int i = 0; i < nnz; ++i) { rows[i] = _rowIdxs[colStart + i]; values[i] = _values[colStart + i]; } } protected: struct NonzeroInfo { int row, col, serial; // Sort wrt the column first bool operator<(NonzeroInfo const& rhs) const { if (col < rhs.col) return true; if (col > rhs.col) return false; return row < rhs.row; } }; void initialize(std::vector > const& nonZeros) { using namespace std; int const nnz = nonZeros.size(); _colStarts.resize(_cols + 1); _rowIdxs.resize(nnz); vector nz(nnz); for (int k = 0; k < nnz; ++k) { nz[k].row = nonZeros[k].first; nz[k].col = nonZeros[k].second; nz[k].serial = k; } // Sort in column major order std::sort(nz.begin(), nz.end()); for (size_t k = 0; k < nnz; ++k) _rowIdxs[k] = nz[k].row; int curCol = -1; for (int k = 0; k < nnz; ++k) { NonzeroInfo const& el = nz[k]; if (el.col != curCol) { // Update empty cols between for (int c = curCol+1; c < el.col; ++c) _colStarts[c] = k; curCol = el.col; _colStarts[curCol] = k; } // end if } // end for (k) // Update remaining columns for (int c = curCol+1; c <= _cols; ++c) _colStarts[c] = nnz; _destIdxs.resize(nnz); for (int k = 0; k < nnz; ++k) _destIdxs[nz[k].serial] = k; _values.resize(nnz); } // end initialize() int _rows, _cols; std::vector _colStarts; std::vector _rowIdxs; std::vector _destIdxs; std::vector _values; }; // end struct CCS_Matrix //---------------------------------------------------------------------- template inline void fillVector(Elem val, Vec& v) { // We 
do not use std::fill since we rely only on size() and operator[] member functions. for (unsigned int i = 0; i < v.size(); ++i) v[i] = val; } template inline void makeZeroVector(Vec& v) { fillVector(0, v); } template inline void copyVector(VecA const& src, VecB& dst) { assert(src.size() == dst.size()); // We do not use std::fill since we rely only on size() and operator[] member functions. for (unsigned int i = 0; i < src.size(); ++i) dst[i] = src[i]; } template inline void copyVectorSlice(VecA const& src, unsigned int srcStart, unsigned int srcLen, VecB& dst, unsigned int dstStart) { unsigned int const end = std::min(srcStart + srcLen, src.size()); unsigned int const sz = dst.size(); unsigned int i0, i1; for (i0 = srcStart, i1 = dstStart; i0 < end && i1 < sz; ++i0, ++i1) dst[i1] = src[i0]; } template inline typename Vec::value_type norm_L1(Vec const& v) { typename Vec::value_type res(0); for (unsigned int i = 0; i < v.size(); ++i) res += std::abs(v[i]); return res; } template inline typename Vec::value_type norm_Linf(Vec const& v) { typename Vec::value_type res(0); for (unsigned int i = 0; i < v.size(); ++i) res = std::max(res, std::abs(v[i])); return res; } template inline typename Vec::value_type norm_L2(Vec const& v) { typename Vec::value_type res(0); for (unsigned int i = 0; i < v.size(); ++i) res += v[i]*v[i]; return sqrt((double)res); } template inline typename Vec::value_type sqrNorm_L2(Vec const& v) { typename Vec::value_type res(0); for (unsigned int i = 0; i < v.size(); ++i) res += v[i]*v[i]; return res; } template inline typename VecA::value_type distance_L2(VecA const& a, VecB const& b) { assert(a.size() == b.size()); typename VecA::value_type res(0); for (unsigned int i = 0; i < a.size(); ++i) res += (a[i]-b[i])*(a[i]-b[i]); return sqrt(res); } template inline typename VecA::value_type sqrDistance_L2(VecA const& a, VecB const& b) { assert(a.size() == b.size()); typename VecA::value_type res(0); for (unsigned int i = 0; i < a.size(); ++i) res += (a[i]-b[i])*(a[i]-b[i]); return res; } template inline typename VecA::value_type distance_Linf(VecA const& a, VecB const& b) { typedef typename VecA::value_type T; assert(a.size() == b.size()); T res(0); for (unsigned int i = 0; i < a.size(); ++i) res = std::max(res, T(fabs(a[i] - b[i]))); return res; } template inline void normalizeVector(Vec& v) { typename Vec::value_type norm(norm_L2(v)); for (unsigned int i = 0; i < v.size(); ++i) v[i] /= norm; } template inline typename VecA::value_type innerProduct(VecA const& a, VecB const& b) { assert(a.size() == b.size()); typename VecA::value_type res(0); for (unsigned int i = 0; i < a.size(); ++i) res += a[i] * b[i]; return res; } template inline void scaleVector(Elem s, VecA const& v, VecB& dst) { for (unsigned int i = 0; i < v.size(); ++i) dst[i] = s * v[i]; } template inline void scaleVectorIP(Elem s, Vec& v) { typedef typename Vec::value_type Elem2; for (unsigned int i = 0; i < v.size(); ++i) v[i] = (Elem2)(v[i] * s); } template inline void makeCrossProductVector(VecA const& v, VecB const& w, VecC& dst) { assert(v.size() == 3); assert(w.size() == 3); assert(dst.size() == 3); dst[0] = v[1]*w[2] - v[2]*w[1]; dst[1] = v[2]*w[0] - v[0]*w[2]; dst[2] = v[0]*w[1] - v[1]*w[0]; } template inline void addVectors(VecA const& v, VecB const& w, VecC& dst) { assert(v.size() == w.size()); assert(v.size() == dst.size()); for (unsigned int i = 0; i < v.size(); ++i) dst[i] = v[i] + w[i]; } template inline void addVectorsIP(VecA const& v, VecB& dst) { assert(v.size() == dst.size()); for (unsigned int i = 0; 
i < v.size(); ++i) dst[i] += v[i]; } template inline void subtractVectors(VecA const& v, VecB const& w, VecC& dst) { assert(v.size() == w.size()); assert(v.size() == dst.size()); for (unsigned int i = 0; i < v.size(); ++i) dst[i] = v[i] - w[i]; } template inline void makeInterpolatedVector(Elem a, VecA const& v, Elem b, VecB const& w, VecC& dst) { assert(v.size() == w.size()); assert(v.size() == dst.size()); for (unsigned int i = 0; i < v.size(); ++i) dst[i] = a * v[i] + b * w[i]; } template inline typename VecA::value_type unsignedAngleBetweenVectors(VecA const& v, VecB const& w) { assert(v.size() == w.size()); typename VecA::value_type dot = innerProduct(v, w) / norm_L2(v) / norm_L2(w); if (dot > 1.0) return 0; if (dot < -1.0) return M_PI; return acos(dot); } template inline void copyMatrix(MatA const& src, MatB& dst) { unsigned int const rows = src.num_rows(); unsigned int const cols = src.num_cols(); assert(dst.num_rows() == rows); assert(dst.num_cols() == cols); for (unsigned int c = 0; c < cols; ++c) for (unsigned int r = 0; r < rows; ++r) dst[r][c] = src[r][c]; } template inline void copyMatrixSlice(MatA const& src, unsigned int rowStart, unsigned int colStart, unsigned int rowLen, unsigned int colLen, MatB& dst, unsigned int dstRow, unsigned int dstCol) { unsigned int const rows = dst.num_rows(); unsigned int const cols = dst.num_cols(); unsigned int const rowEnd = std::min(rowStart + rowLen, src.num_rows()); unsigned int const colEnd = std::min(colStart + colLen, src.num_cols()); unsigned int c0, c1, r0, r1; for (c0 = colStart, c1 = dstCol; c0 < colEnd && c1 < cols; ++c0, ++c1) for (r0 = rowStart, r1 = dstRow; r0 < rowEnd && r1 < rows; ++r0, ++r1) dst[r1][c1] = src[r0][c0]; } template inline void makeTransposedMatrix(MatA const& src, MatB& dst) { unsigned int const rows = src.num_rows(); unsigned int const cols = src.num_cols(); assert(dst.num_cols() == rows); assert(dst.num_rows() == cols); for (unsigned int c = 0; c < cols; ++c) for (unsigned int r = 0; r < rows; ++r) dst[c][r] = src[r][c]; } template inline void fillMatrix(Mat& m, typename Mat::value_type val) { unsigned int const rows = m.num_rows(); unsigned int const cols = m.num_cols(); for (unsigned int c = 0; c < cols; ++c) for (unsigned int r = 0; r < rows; ++r) m[r][c] = val; } template inline void makeZeroMatrix(Mat& m) { fillMatrix(m, 0); } template inline void makeIdentityMatrix(Mat& m) { makeZeroMatrix(m); unsigned int const rows = m.num_rows(); unsigned int const cols = m.num_cols(); unsigned int n = std::min(rows, cols); for (unsigned int i = 0; i < n; ++i) m[i][i] = 1; } template inline void makeCrossProductMatrix(Vec const& v, Mat& m) { assert(v.size() == 3); assert(m.num_rows() == 3); assert(m.num_cols() == 3); m[0][0] = 0; m[0][1] = -v[2]; m[0][2] = v[1]; m[1][0] = v[2]; m[1][1] = 0; m[1][2] = -v[0]; m[2][0] = -v[1]; m[2][1] = v[0]; m[2][2] = 0; } template inline void makeOuterProductMatrix(Vec const& v, Mat& m) { assert(m.num_cols() == m.num_rows()); assert(v.size() == m.num_cols()); unsigned const sz = v.size(); for (unsigned r = 0; r < sz; ++r) for (unsigned c = 0; c < sz; ++c) m[r][c] = v[r]*v[c]; } template inline void makeOuterProductMatrix(VecA const& u, VecB const& v, Mat& m) { assert(m.num_cols() == m.num_rows()); assert(u.size() == m.num_cols()); assert(v.size() == m.num_cols()); unsigned const sz = u.size(); for (unsigned r = 0; r < sz; ++r) for (unsigned c = 0; c < sz; ++c) m[r][c] = u[r]*v[c]; } template void addMatrices(MatA const& a, MatB const& b, MatC& dst) { assert(a.num_cols() == 
b.num_cols()); assert(a.num_rows() == b.num_rows()); assert(dst.num_cols() == a.num_cols()); assert(dst.num_rows() == a.num_rows()); unsigned int const rows = a.num_rows(); unsigned int const cols = a.num_cols(); for (unsigned r = 0; r < rows; ++r) for (unsigned c = 0; c < cols; ++c) dst[r][c] = a[r][c] + b[r][c]; } template void addMatricesIP(MatA const& a, MatB& dst) { assert(dst.num_cols() == a.num_cols()); assert(dst.num_rows() == a.num_rows()); unsigned int const rows = a.num_rows(); unsigned int const cols = a.num_cols(); for (unsigned r = 0; r < rows; ++r) for (unsigned c = 0; c < cols; ++c) dst[r][c] += a[r][c]; } template void subtractMatrices(MatA const& a, MatB const& b, MatC& dst) { assert(a.num_cols() == b.num_cols()); assert(a.num_rows() == b.num_rows()); assert(dst.num_cols() == a.num_cols()); assert(dst.num_rows() == a.num_rows()); unsigned int const rows = a.num_rows(); unsigned int const cols = a.num_cols(); for (unsigned r = 0; r < rows; ++r) for (unsigned c = 0; c < cols; ++c) dst[r][c] = a[r][c] - b[r][c]; } template inline void scaleMatrix(MatA const& m, Elem scale, MatB& dst) { unsigned int const rows = m.num_rows(); unsigned int const cols = m.num_cols(); for (unsigned int c = 0; c < cols; ++c) for (unsigned int r = 0; r < rows; ++r) dst[r][c] = m[r][c] * scale; } template inline void scaleMatrixIP(Elem scale, Mat& m) { unsigned int const rows = m.num_rows(); unsigned int const cols = m.num_cols(); for (unsigned int c = 0; c < cols; ++c) for (unsigned int r = 0; r < rows; ++r) m[r][c] *= scale; } template inline void multiply_A_v(Mat const& m, VecA const& in, VecB& dst) { unsigned int const rows = m.num_rows(); unsigned int const cols = m.num_cols(); assert(in.size() == cols); assert(dst.size() == rows); makeZeroVector(dst); for (unsigned int r = 0; r < rows; ++r) for (unsigned int c = 0; c < cols; ++c) dst[r] += m[r][c] * in[c]; } template inline void multiply_A_v_projective(Mat const& m, VecA const& in, VecB& dst) { unsigned int const rows = m.num_rows(); unsigned int const cols = m.num_cols(); assert(in.size() == cols-1); assert(dst.size() == rows-1); typename VecB::value_type w = m[rows-1][cols-1]; unsigned int r, i; for (i = 0; i < cols-1; ++i) w += m[rows-1][i] * in[i]; for (r = 0; r < rows-1; ++r) dst[r] = m[r][cols-1]; for (r = 0; r < rows-1; ++r) for (unsigned int c = 0; c < cols-1; ++c) dst[r] += m[r][c] * in[c]; for (i = 0; i < rows-1; ++i) dst[i] /= w; } template inline void multiply_A_v_affine(Mat const& m, VecA const& in, VecB& dst) { unsigned int const rows = m.num_rows(); unsigned int const cols = m.num_cols(); assert(in.size() == cols-1); assert(dst.size() == rows); unsigned int r; for (r = 0; r < rows; ++r) dst[r] = m[r][cols-1]; for (r = 0; r < rows; ++r) for (unsigned int c = 0; c < cols-1; ++c) dst[r] += m[r][c] * in[c]; } template inline void multiply_At_v(Mat const& m, VecA const& in, VecB& dst) { unsigned int const rows = m.num_rows(); unsigned int const cols = m.num_cols(); assert(in.size() == rows); assert(dst.size() == cols); makeZeroVector(dst); for (unsigned int c = 0; c < cols; ++c) for (unsigned int r = 0; r < rows; ++r) dst[c] += m[r][c] * in[r]; } template inline void multiply_At_v_Sparse(SparseMat const& a, VecA const& in, VecB& dst) { assert(in.size() == a.num_rows()); assert(dst.size() == a.num_cols()); typedef typename VecB::value_type Elem; std::vector rows(a.num_rows()); std::vector vals(a.num_rows()); makeZeroVector(dst); for (unsigned int c = 0; c < a.num_cols(); ++c) { int const nnz = a.getColumnNonzeroCount(c); 
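// Column c's nonzero entries are fetched next and their dot product with `in` is accumulated, so dst[c] ends up holding (A^T * in)[c].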
a.getSparseColumn(c, rows, vals); Elem accum = 0; for (int i = 0; i < nnz; ++i) { int const r = rows[i]; accum += vals[i] * in[r]; } dst[c] = accum; } } // end multiply_At_v_Sparse() template inline void multiply_At_A(MatA const& a, MatB& dst) { assert(dst.num_rows() == a.num_cols()); assert(dst.num_cols() == a.num_cols()); typedef typename MatB::value_type Elem; int const M = a.num_rows(); int const N = a.num_cols(); Elem accum; for (int r = 0; r < N; ++r) for (int c = 0; c < N; ++c) { accum = 0; for (int k = 0; k < M; ++k) accum += a[k][r] * a[k][c]; dst[r][c] = accum; } } template inline void multiply_At_A_Sparse(SparseMatA const& a, MatB& dst) { assert(dst.num_rows() == a.num_cols()); assert(dst.num_cols() == a.num_cols()); typedef typename MatB::value_type Elem; makeZeroMatrix(dst); std::vector rows1(a.num_rows()), rows2(a.num_rows()); std::vector vals1(a.num_rows()), vals2(a.num_rows()); for (unsigned int r = 0; r < dst.num_rows(); ++r) { int const nnz1 = a.getColumnNonzeroCount(r); a.getSparseColumn(r, rows1, vals1); for (unsigned int c = 0; c <= r; ++c) { int const nnz2 = a.getColumnNonzeroCount(c); a.getSparseColumn(c, rows2, vals2); Elem accum = 0; int i1 = 0, i2 = 0; while (i1 < nnz1 && i2 < nnz2) { if (rows1[i1] > rows2[i2]) ++i2; else if (rows1[i1] < rows2[i2]) ++i1; else { accum += vals1[i1] * vals2[i2]; ++i1; ++i2; } } // end while dst[r][c] = accum; dst[c][r] = accum; } // end for (c) } // end for (r) } // multiply_At_A_Sparse() template inline void multiply_A_B(MatA const& a, MatB const& b, MatC& dst) { assert(a.num_cols() == b.num_rows()); assert(dst.num_rows() == a.num_rows()); assert(dst.num_cols() == b.num_cols()); typedef typename MatC::value_type Elem; Elem accum; for (unsigned int r = 0; r < a.num_rows(); ++r) for (unsigned int c = 0; c < b.num_cols(); ++c) { accum = 0; for (unsigned int k = 0; k < a.num_cols(); ++k) accum += a[r][k] * b[k][c]; dst[r][c] = accum; } } template inline void multiply_At_B(MatA const& a, MatB const& b, MatC& dst) { assert(a.num_rows() == b.num_rows()); assert(dst.num_rows() == a.num_cols()); assert(dst.num_cols() == b.num_cols()); typedef typename MatC::value_type Elem; Elem accum; for (unsigned int r = 0; r < a.num_cols(); ++r) for (unsigned int c = 0; c < b.num_cols(); ++c) { accum = 0; for (unsigned int k = 0; k < a.num_rows(); ++k) accum += a[k][r] * b[k][c]; dst[r][c] = accum; } } template inline void multiply_A_Bt(MatA const& a, MatB const& b, MatC& dst) { assert(a.num_cols() == b.num_cols()); assert(dst.num_rows() == a.num_rows()); assert(dst.num_cols() == b.num_rows()); typedef typename MatC::value_type Elem; Elem accum; for (unsigned int r = 0; r < a.num_rows(); ++r) for (unsigned int c = 0; c < b.num_rows(); ++c) { accum = 0; for (unsigned int k = 0; k < a.num_cols(); ++k) accum += a[r][k] * b[c][k]; dst[r][c] = accum; } } template inline void transposeMatrixIP(Mat& a) { assert(a.num_rows() == a.num_cols()); for (unsigned int r = 0; r < a.num_rows(); ++r) for (unsigned int c = 0; c < r; ++c) std::swap(a[r][c], a[c][r]); } template inline typename Mat::value_type matrixDeterminant3x3(Mat const& A) { assert(A.num_rows() == 3); assert(A.num_cols() == 3); return (A[0][0]*A[1][1]*A[2][2] + A[0][1]*A[1][2]*A[2][0] + A[0][2]*A[1][0]*A[2][1] -A[0][2]*A[1][1]*A[2][0] - A[0][1]*A[1][0]*A[2][2] - A[0][0]*A[1][2]*A[2][1]); } template inline double matrixNormFrobenius(Mat const& a) { double accum(0.0); for (unsigned int r = 0; r < a.num_rows(); ++r) for (unsigned int c = 0; c < a.num_cols(); ++c) accum += a[r][c]*a[r][c]; return 
sqrt(accum); } //********************************************************************** //! Convert a matrix to upper triangular form, aka Gauss elimination. template inline void convertToRowEchelonMatrix(Mat& A) { typedef typename Mat::value_type Field; int const n = A.num_rows(); int const m = A.num_cols(); int i, j, k; int lead = 0; // Pivot column // Pass 1: Generate a upper right triangular matrix for (i = 0; i < n && lead < m; ++i, ++lead) { int pivot_row = i; Field pivot_elem(0); while (lead < m) { // Search for the largest pivot element in column lead for (k = i; k < n; ++k) { Field a = std::abs(A[k][lead]); if (a > pivot_elem) { pivot_elem = a; pivot_row = k; } } // end for (k) if (pivot_elem == Field(0)) ++lead; else break; } if (lead >= m) break; if (i != pivot_row) { // Exchange row i and pivot_row for (j = 0; j < m; ++j) { Field tmp = A[i][j]; A[i][j] = A[pivot_row][j]; A[pivot_row][j] = tmp; } } Field pivot = A[i][lead]; Field rcpPivot = Field(1)/pivot; A[i][lead] = Field(1); for (j = lead+1; j < m; ++j) A[i][j] = A[i][j] * rcpPivot; for (k = i+1; k < n; ++k) { Field q = A[k][lead]; A[k][lead] = Field(0); for (j = lead+1; j < m; ++j) A[k][j] = A[k][j] - q*A[i][j]; } } // end for (i) } // end convertToRowEchelonMatrix() // Convert a row echelon matrix to a reduced one, i.e. Gauss-Jordan elimination. template inline void convertToReducedRowEchelonMatrix(Mat& A) { typedef typename Mat::value_type Field; // Pass 2: Remove additional elements above the diagonal int const n = A.num_rows(); int const m = A.num_cols(); for (int i = n-1; i >= 0; --i) { int lead = i; while (lead < m && A[i][lead] == Field(0)) ++lead; if (lead >= m) continue; for (int k = 0; k < i; ++k) { Field q = A[k][lead]; for (int j = lead; j < m; ++j) A[k][j] = A[k][j] - q*A[i][j]; } // end for (k) } // end for (i) } // end convertToReducedRowEchelonMatrix() } // end namespace V3D #endif slowmovideo-0.5+git20180116/src/V3D/README.TXT0000664000000000000000000000562213151342440016552 0ustar rootrootDescription This is a GPU implementation of feature point tracking with and without simultaneous gain estimation (i.e. changes in the overall image brightness are detected and handled). For a technical description see C. Zach, D. Gallup, and J.-M. Frahm, "Fast Gain-Adaptive KLT Tracking on the GPU,". CVPR Workshop on Computer Vision on GPU's (CVGPU), 2008, available at http://cs.unc.edu/~cmzach/publications.html. Additionally, this package now includes the TV-L1 optical flow implementation as described in C.Zach, T. Pock, and H. Bischof, "A Duality Based Approach for Realtime TV-L1 Optical Flow", Pattern Recognition (Proc. DAGM), vol. 4792 of Lecture Notes in Computer Science, 2007 (also available at http://cs.unc.edu/~cmzach/publications.html). Using the simple API for tracking is demonstrated in Apps/GL/klt_for_video.cpp, which successively reads frames from videos (using the OpenCV library) and displays the obtained feature tracks. Note that you have to set the V3D_SHADER_DIR environment variable to point to the Shader directory, e.g. export V3D_SHADER_DIR=/home/user/src/GPU-KLT+FLOW-1.0/GL/Shaders with a sh/bash environment. A simple application for TV-L1 optical flow is provided in Apps/GL/tvl1_flow.cpp. Requirements NVidias Cg toolkit (version 2 or later, http://developer.nvidia.com/object/cg_toolkit.html) and the OpenGL extension wrangler library (glew.sourceforge.net) are required to build the library. 
Currently, the GPU tracker is limited to NVidia hardware (Geforce6 series or newer, the hardware must support the fp40 Cg profile). If you want to run the simple demo applications, the OpenCV computer vision library (http://sourceforge.net/projects/opencvlibrary/) with sufficient support for video codecs (e.g. through ffmpeg or xinelib) is required. The build system uses cmake (www.cmake.org). The optical flow code can use libpng or libjpeg for loading images. If none of these is available (modify the main CMakeLists.txt), then binary PNM images (magic code either P5 or P6) are still supported. GLUT or freeglut is used for GL window handling. The library was developed under Linux, but should compile equally well on other operating systems. -Christopher Zach (chzach@inf.ethz.ch) /* Copyright (c) 2008-2010 UNC-Chapel Hill & ETH Zurich This file is part of GPU-KLT+FLOW. GPU-KLT+FLOW is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. GPU-KLT+FLOW is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details. You should have received a copy of the GNU Lesser General Public License along with GPU-KLT+FLOW. If not, see <http://www.gnu.org/licenses/>. */ slowmovideo-0.5+git20180116/src/V3D/Config/0000775000000000000000000000000013151342440016414 5ustar rootrootslowmovideo-0.5+git20180116/src/V3D/Config/config.h0000664000000000000000000000065513151342440020040 0ustar rootroot#ifndef CONFIG_H #define CONFIG_H /** This header file is for directly including the shader files in the binary, which makes it easier to ship an executable. The #defines are only for QtCreator which does not support the CMake add_definitions yet. -- Simon A.
Eugster */ #ifndef DISABLE_REDEFINITIONS #define V3DLIB_ENABLE_LIBJPEG #define V3DLIB_ENABLE_LIBPNG #define V3DLIB_ENABLE_GPGPU #endif #endif // CONFIG_H slowmovideo-0.5+git20180116/src/V3D/Config/v3d_macros.cmake0000664000000000000000000000222413151342440021456 0ustar rootroot# -*- CMake -*- macro (enable_feature feature) set (${feature} 1) add_definitions(-D${feature}) endmacro (enable_feature) macro (enable_conditional_feature feature dep_feature) if (${dep_feature}) set (${feature} 1) add_definitions(-D${feature}) endif (${dep_feature}) endmacro (enable_conditional_feature) macro (enable_feature_inc_path feature) if (${feature}) set (EXTRA_INC_DIRS ${EXTRA_INC_DIRS} ${ARGN}) endif (${feature}) endmacro (enable_feature_inc_path) macro (enable_feature_lib_path feature) if (${feature}) set (EXTRA_LIB_DIRS ${EXTRA_LIB_DIRS} ${ARGN}) endif (${feature}) endmacro (enable_feature_lib_path) macro (enable_feature_libraries feature) if (${feature}) set (EXTRA_LIBRARIES ${EXTRA_LIBRARIES} ${ARGN}) endif (${feature}) endmacro (enable_feature_libraries) macro (add_v3d_executable target) #message(STATUS "ARGN variable contains: ${ARGN}") add_executable(${target} ${ARGN}) add_dependencies(${target} V3d2) endmacro (add_v3d_executable) macro (add_simple_v3d_executable target) add_executable(${target} ${target}.cpp) add_dependencies(${target} V3d2) endmacro (add_simple_v3d_executable) slowmovideo-0.5+git20180116/src/V3D/Apps/0000775000000000000000000000000013151342440016112 5ustar rootrootslowmovideo-0.5+git20180116/src/V3D/Apps/CMakeLists.txt0000664000000000000000000000011413151342440020646 0ustar rootrootif (V3DLIB_ENABLE_GPGPU) add_subdirectory(GL) endif (V3DLIB_ENABLE_GPGPU) slowmovideo-0.5+git20180116/src/V3D/Apps/GL/0000775000000000000000000000000013151342440016414 5ustar rootrootslowmovideo-0.5+git20180116/src/V3D/Apps/GL/CMakeLists.txt0000664000000000000000000000075113151342440021157 0ustar rootrootinclude_directories(${V3D_INCLUDE_DIRS} ${EXTRA_INC_DIRS} ${slowmoVideo_SOURCE_DIR}/slowmoVideo/lib) link_directories(${V3D_DIR} ${EXTRA_LIB_DIRS}) link_libraries (V3D ${EXTRA_LIBRARIES}) if(V3DLIB_ENABLE_GPGPU) add_v3d_executable( slowmoFlowBuilder flowBuilder.cpp ${slowmoVideo_SOURCE_DIR}/slowmoVideo/lib/flowField_sV.cpp ${slowmoVideo_SOURCE_DIR}/slowmoVideo/lib/flowRW_sV.cpp ) endif (V3DLIB_ENABLE_GPGPU) install(TARGETS slowmoFlowBuilder DESTINATION ${DEST}) slowmovideo-0.5+git20180116/src/V3D/Apps/GL/flowBuilder.cpp0000664000000000000000000002524313151342440021404 0ustar rootroot/// \todo Check that (width|height)/2^nLevels >= 1 #include "Base/v3d_image.h" #include "Base/v3d_imageprocessing.h" #include "Base/v3d_timer.h" #include "GL/v3d_gpucolorflow.h" #include #include #ifdef __APPLE__ #include #include #include #include #elif defined(_WIN32) #include #else #define USE_RAW_X11 #include #include #include #include #endif #include "flowRW_sV.h" #include "flowField_sV.h" using namespace V3D; using namespace V3D_GPU; #define VERSION "2.0" //#define USE_LAB_COLORSPACE 1 int const nLevels = 6; // Must be large enough for big images. Number of pyramid levels. 
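// Usage sketch: main() below expects two input frames, an output flow file, and optionally a
// lambda value and an inner-iteration count. The file names and numeric values in this example
// are hypothetical, chosen only to illustrate the argument order:
//
//   slowmoFlowBuilder frame0001.png frame0002.png forward.sVflow 10.0 100
//
// Invoking the binary with only --identify prints its version string instead of computing flow.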
int const startLevel = 0; int nIterations = 200; int const nOuterIterations = 4; float const lambdaScale = 1.0; Image leftImage, rightImage; Image leftImageLab, rightImageLab; const char *outputFile; float lambda = 1.0f; float const tau = 0.249f; float const theta = 0.1f; typedef TVL1_ColorFlowEstimator_QR TVL1_FlowEstimator; TVL1_FlowEstimator::Config flowCfg(tau, theta); TVL1_FlowEstimator * flowEstimator; #if !defined(USE_LAB_COLORSPACE) PyramidWithDerivativesCreator leftPyrR(false), rightPyrR(false); PyramidWithDerivativesCreator leftPyrG(false), rightPyrG(false); PyramidWithDerivativesCreator leftPyrB(false), rightPyrB(false); #else char const * pyrTexSpec = "r=32f noRTT"; PyramidWithDerivativesCreator leftPyrR(false, pyrTexSpec), rightPyrR(false, pyrTexSpec); PyramidWithDerivativesCreator leftPyrG(false, pyrTexSpec), rightPyrG(false, pyrTexSpec); PyramidWithDerivativesCreator leftPyrB(false, pyrTexSpec), rightPyrB(false, pyrTexSpec); #endif #ifdef USE_LAB_COLORSPACE inline void convertRGBImageToCIELab(Image const& src, Image& dst) { int const w = src.width(); int const h = src.height(); dst.resize(w, h, 3); Vector3f rgb, xyz, lab; for (int y = 0; y < h; ++y) for (int x = 0; x < w; ++x) { rgb[0] = src(x, y, 0) / 255.0f; rgb[1] = src(x, y, 1) / 255.0f; rgb[2] = src(x, y, 2) / 255.0f; rgb = convertRGBPixelTo_sRGB(rgb); xyz = convert_sRGBPixelToXYZ(rgb); lab = convertXYZPixelToCIELab(xyz); scaleVectorIP(0.01f, lab); dst(x, y, 0) = lab[0]; dst(x, y, 1) = lab[1]; dst(x, y, 2) = lab[2]; } } // end convertRGBImageToCIELab() #endif void drawscene() { int const w = leftImage.width(); int const h = leftImage.height(); { ScopedTimer st("glew/cg init"); glewInit(); } { ScopedTimer st("initialization flow"); flowEstimator = new TVL1_FlowEstimator(nLevels); flowEstimator->configurePrecision(false, false, false); flowEstimator->allocate(w, h); flowEstimator->setLambda(lambda); flowEstimator->configure(flowCfg); flowEstimator->setInnerIterations(nIterations); flowEstimator->setOuterIterations(nOuterIterations); flowEstimator->setStartLevel(startLevel); } { ScopedTimer st("allocating pyramids"); leftPyrR.allocate(w, h, nLevels); rightPyrR.allocate(w, h, nLevels); leftPyrG.allocate(w, h, nLevels); rightPyrG.allocate(w, h, nLevels); leftPyrB.allocate(w, h, nLevels); rightPyrB.allocate(w, h, nLevels); } unsigned int leftPyrTexIDs[3]; unsigned int rightPyrTexIDs[3]; { ScopedTimer st("building pyramids"); #if !defined(USE_LAB_COLORSPACE) if (leftImage.numChannels() == 3) { leftPyrR.buildPyramidForGrayscaleImage(leftImage.begin(0)); leftPyrG.buildPyramidForGrayscaleImage(leftImage.begin(1)); leftPyrB.buildPyramidForGrayscaleImage(leftImage.begin(2)); } else { leftPyrR.buildPyramidForGrayscaleImage(leftImage.begin(0)); leftPyrG.buildPyramidForGrayscaleImage(leftImage.begin(0)); leftPyrB.buildPyramidForGrayscaleImage(leftImage.begin(0)); } if (rightImage.numChannels() == 3) { rightPyrR.buildPyramidForGrayscaleImage(rightImage.begin(0)); rightPyrG.buildPyramidForGrayscaleImage(rightImage.begin(1)); rightPyrB.buildPyramidForGrayscaleImage(rightImage.begin(2)); } else { rightPyrR.buildPyramidForGrayscaleImage(rightImage.begin(0)); rightPyrG.buildPyramidForGrayscaleImage(rightImage.begin(0)); rightPyrB.buildPyramidForGrayscaleImage(rightImage.begin(0)); } #else leftPyrR.buildPyramidForGrayscaleImage(leftImageLab.begin(0)); leftPyrG.buildPyramidForGrayscaleImage(leftImageLab.begin(1)); leftPyrB.buildPyramidForGrayscaleImage(leftImageLab.begin(2)); rightPyrR.buildPyramidForGrayscaleImage(rightImageLab.begin(0)); 
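// The remaining Lab channels of the right frame are pushed into their own pyramids below, mirroring the per-channel RGB branch above.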
rightPyrG.buildPyramidForGrayscaleImage(rightImageLab.begin(1)); rightPyrB.buildPyramidForGrayscaleImage(rightImageLab.begin(2)); #endif leftPyrTexIDs[0] = leftPyrR.textureID(); leftPyrTexIDs[1] = leftPyrG.textureID(); leftPyrTexIDs[2] = leftPyrB.textureID(); rightPyrTexIDs[0] = rightPyrR.textureID(); rightPyrTexIDs[1] = rightPyrG.textureID(); rightPyrTexIDs[2] = rightPyrB.textureID(); } { ScopedTimer st("flowEstimator"); flowEstimator->run(leftPyrTexIDs, rightPyrTexIDs); } // Save the generated flow field float *data = new float[2*leftImage.width()*leftImage.height()]; { ScopedTimer st("readPixels"); RTT_Buffer *buf = flowEstimator->getFlowBuffer(); buf->makeCurrent(); glReadBuffer(GL_COLOR_ATTACHMENT0); glReadPixels(0,0,leftImage.width(), leftImage.height(), GL_RG, GL_FLOAT, data); } { ScopedTimer st("saving"); FlowField_sV field(leftImage.width(), leftImage.height(), data, FlowField_sV::GLFormat_RG); FlowRW_sV::save(outputFile, &field); } exit(0); } int main( int argc, char** argv) { if ((argc-1) == 1) { if (strcmp(argv[1], "--identify") == 0) { std::cout << "slowmoFlowBuilder v" << VERSION << std::endl; return 0; } } if ((argc-1) < 3) { std::cout << "Usage: " << argv[0] << " " "[ [] ]" << std::endl; return -1; } { ScopedTimer st("loading files"); loadImageFile(argv[1], leftImage); loadImageFile(argv[2], rightImage); } outputFile = argv[3]; if ((argc-1) >= 4) { lambda = atof(argv[4]); if ((argc-1) >= 5) { nIterations = atoi(argv[5]); } } if (leftImage.numChannels() != 3 || rightImage.numChannels() != 3) { std::cout << "leftImage.numChannels() = " << leftImage.numChannels() << std::endl; std::cout << "rightImage.numChannels() = " << rightImage.numChannels() << std::endl; } { ScopedTimer st("initialization GL"); #ifdef USE_RAW_X11 Display *dpy = XOpenDisplay(0); if (!dpy) { std::cerr << "ERROR: could not open display\n"; exit(1); } GLint att[] = { GLX_RGBA, GLX_DEPTH_SIZE, 24, GLX_DOUBLEBUFFER, None, 0 }; XVisualInfo *vi = glXChooseVisual(dpy,0,att); if (!vi) { std::cerr << "ERROR: could not choose a GLX visual\n"; exit(1); } Colormap cmap = XCreateColormap(dpy, DefaultRootWindow(dpy), vi->visual, AllocNone); XSetWindowAttributes wattr; wattr.colormap = cmap; Window win = XCreateWindow(dpy, DefaultRootWindow(dpy), 0, 0, 1, 1, 0, vi->depth, InputOutput, vi->visual,CWColormap, &wattr); if (win == None) { std::cerr << "ERROR: could not create window\n"; exit(1); } GLXContext ctx = glXCreateContext(dpy, vi, NULL, GL_TRUE); if (!ctx) { std::cerr << "ERROR: could not create GL context\n"; exit(1); } if (!glXMakeCurrent(dpy,win,ctx)) { std::cerr << "ERROR: could not make context current\n"; exit(1); } #else glutInitWindowPosition(0, 0); glutInitWindowSize(100, 100); glutInit(&argc, argv); #endif } #if !defined(USE_LAB_COLORSPACE) if (leftImage.numChannels() < 3 || rightImage.numChannels() < 3) cerr << "Warning: grayscale images provided." << std::endl; #else if (leftImage.numChannels() < 3 || rightImage.numChannels() < 3) { cerr << "Error: grayscale images provided." 
<< std::endl; return -2; } convertRGBImageToCIELab(leftImage, leftImageLab); convertRGBImageToCIELab(rightImage, rightImageLab); float maxL = -1e30f, minL = 1e30f; float maxA = -1e30f, minA = 1e30f; float maxB = -1e30f, minB = 1e30f; maxL = std::max(maxL, *max_element(leftImageLab.begin(0), leftImageLab.end(0))); minL = std::min(minL, *min_element(leftImageLab.begin(0), leftImageLab.end(0))); maxL = std::max(maxL, *max_element(rightImageLab.begin(0), rightImageLab.end(0))); minL = std::min(minL, *min_element(rightImageLab.begin(0), rightImageLab.end(0))); std::cout << "minL = " << minL << " maxL = " << maxL << std::endl; maxA = std::max(maxA, *max_element(leftImageLab.begin(1), leftImageLab.end(1))); minA = std::min(minA, *min_element(leftImageLab.begin(1), leftImageLab.end(1))); maxA = std::max(maxA, *max_element(rightImageLab.begin(1), rightImageLab.end(1))); minA = std::min(minA, *min_element(rightImageLab.begin(1), rightImageLab.end(1))); std::cout << "minA = " << minA << " maxA = " << maxA << std::endl; maxB = std::max(maxB, *max_element(leftImageLab.begin(2), leftImageLab.end(2))); minB = std::min(minB, *min_element(leftImageLab.begin(2), leftImageLab.end(2))); maxB = std::max(maxB, *max_element(rightImageLab.begin(2), rightImageLab.end(2))); minB = std::min(minB, *min_element(rightImageLab.begin(2), rightImageLab.end(2))); std::cout << "minB = " << minB << " maxB = " << maxB << std::endl; #endif #ifdef USE_RAW_X11 drawscene(); #else //glutInitDisplayMode(GLUT_3_2_CORE_PROFILE | GLUT_RGB | GLUT_DEPTH | GLUT_DOUBLE); glutInitDisplayMode( GLUT_RGB | GLUT_DEPTH | GLUT_DOUBLE); // glutInitContextVersion(3,2); /* or later versions, core was introduced only with 3.2 */ // glutInitContextProfile(GLUT_CORE_PROFILE); if (!glutCreateWindow("GPU TV-L1 Optic Flow")) { cerr << "Error, couldn't open window" << std::endl; return -1; } #ifdef __APPLE__ CGLContextObj ctx = CGLGetCurrentContext(); char *vendor = (char*)glGetString(GL_VENDOR); char *renderer = (char*)glGetString(GL_RENDERER); char *version = (char*)glGetString(GL_VERSION); printf("vendor: %s\n",vendor); fprintf(stderr,"%s\n%s\n", renderer, // e.g. Intel HD Graphics 3000 OpenGL Engine version // e.g. 3.2 INTEL-8.0.61 ); // CGLSetVirtualScreen(ctx, 1); // second GPU ? 
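// The disabled calls around this point would switch the CGL virtual screen to a different GPU and re-query GL_VENDOR to report which device became active.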
// vendor = (char*)glGetString(GL_VENDOR); // printf("vendor: %s\n",vendor); #endif glutDisplayFunc(drawscene); glutMainLoop(); #endif return 0; } slowmovideo-0.5+git20180116/src/V3D/GL/0000775000000000000000000000000013151342440015511 5ustar rootrootslowmovideo-0.5+git20180116/src/V3D/GL/glsl_shaders.cpp0000664000000000000000000005105313151342440020673 0ustar rootroot#include "glsl_shaders.h" namespace GLSL_Shaders { const std::string tvl1_flow_new_update_p = "#version 330\n" "\n" "uniform sampler2D uv_src;\n" "uniform sampler2D p_uv_src;\n" "\n" "in vec4 gl_TexCoord[4];\n" "\n" "out vec4 my_FragColor;\n" "\n" "uniform float timestep;\n" "uniform float rcpLambda_p;\n" "\n" "const float eps_dual = 0.0f;\n" "\n" "vec2 tv(vec4 uv_grad)\n" "{\n" " return vec2(length(uv_grad.xz), length(uv_grad.yw));\n" "}\n" "\n" "void main(void)\n" "{\n" " vec2 st0 = gl_TexCoord[0].st;\n" " vec4 stEW = gl_TexCoord[1];\n" " vec4 stSN = gl_TexCoord[2];\n" "\n" " vec2 uv = texture2D(uv_src, st0).xy;\n" " vec4 uv_ES = vec4(0.0f);\n" " uv_ES.xy = texture2D(uv_src, stEW.xy).xy;\n" " uv_ES.zw = texture2D(uv_src, stSN.xy).xy;\n" "\n" " vec4 p_uv = texture2D(p_uv_src, st0);\n" "\n" " // The right clamping mode should handle the boundary conditions.\n" " vec4 uv_grad = uv_ES - uv.xyxy;\n" "\n" " p_uv -= timestep * (uv_grad + eps_dual * p_uv);\n" "\n" "#if 0\n" " const float denom = max(1.0f, rcpLambda_p * length(p_uv));\n" " p_uv /= denom;\n" "#else\n" " vec2 denom = max(vec2(1.0f), rcpLambda_p * tv(p_uv));\n" " p_uv /= denom.xyxy;\n" "#endif\n" " my_FragColor = p_uv;\n" "}\n" "\n"; const std::string flow_warp_image = "#version 330\n" "\n" "uniform sampler2D uv_src;\n" "uniform sampler2D I0_tex;\n" "uniform sampler2D I1_tex;\n" "\n" "in vec4 gl_TexCoord[4];\n" "\n" "out vec4 my_FragColor;\n" "\n" "float signum(float value)\n" "{\n" " return (value < 0) ? -1 : 1;\n" "}\n" "\n" "void main()\n" "{\n" " vec2 st0 = gl_TexCoord[0].st;\n" " vec3 st3 = gl_TexCoord[3].stp;\n" "\n" " const float eps = 0.001f;\n" "\n" " vec2 uv = texture2D(uv_src, st0).xy;\n" " vec3 I0 = texture2D(I0_tex, st0).xyz;\n" " vec3 I1 = texture2D(I1_tex, st0 + st3.xy*uv).xyz;\n" " I1.yz *= st3.z;\n" "\n" " // Central differences and use gradients from both images\n" " const vec2 ds0 = vec2(1, 0); // I0_tex has nearest texture filtering\n" " const vec2 dt0 = vec2(0, 1);\n" " const vec2 ds = vec2(0.5, 0);\n" " const vec2 dt = vec2(0, 0.5);\n" " vec2 I0grad;\n" " vec2 I1grad;\n" " I0grad.x = 0.5f * (texture2D(I0_tex, st0 + st3.xy*ds0).x - texture2D(I0_tex, st0 - st3.xy*ds0).x);\n" " I0grad.y = 0.5f * (texture2D(I0_tex, st0 + st3.xy*dt0).x - texture2D(I0_tex, st0 - st3.xy*dt0).x);\n" " I1grad.x = texture2D(I1_tex, st0 + st3.xy*(uv+ds)).x - texture2D(I1_tex, st0 + st3.xy*(uv-ds)).x;\n" " I1grad.y = texture2D(I1_tex, st0 + st3.xy*(uv+dt)).x - texture2D(I1_tex, st0 + st3.xy*(uv-dt)).x;\n" " I1.yz = 0.5f * (I0grad + I1grad);\n" "\n" " // Avoid zero gradients\n" " I1.y = (abs(I1.y) < eps) ? (signum(I1.y) * eps) : I1.y;\n" " I1.z = (abs(I1.z) < eps) ? 
(signum(I1.z) * eps) : I1.z;\n" "\n" " vec4 color_out;\n" " color_out.x = I1.x - dot(I1.yz, uv) - I0.x;\n" " color_out.yz = I1.yz;\n" " color_out.w = abs(I1.x-I0.x);\n" "\n" " my_FragColor = color_out;\n" "}\n"; const std::string pyramid_with_derivative_pass1v = "#version 330\n" "\n" "uniform sampler2D src_tex;\n" "\n" "uniform int presmoothing;\n" "\n" "in vec4 gl_TexCoord[4];\n" "\n" "out vec4 my_FragColor;\n" "\n" "void main()\n" "{\n" " vec4 st0 = gl_TexCoord[0];\n" " vec4 st1 = gl_TexCoord[1];\n" " vec4 st2 = gl_TexCoord[2];\n" " vec4 st3 = gl_TexCoord[3];\n" " if (presmoothing == 3) {\n" " // This is the (odd) binomial kernel [0 1 6 15 20 15 6 1 0]\n" " const vec4 f1 = vec4(0, 1, 6, 15) / 64.0f;\n" " const vec4 f2 = vec4(20, 15, 6, 1) / 64.0f;\n" "\n" " const vec4 df = vec4(1, 6, 14, 14) / 128.0f;\n" "\n" " vec2 ds = st0.zw - st0.xy;\n" "\n" " vec4 g1, g2;\n" " float g3;\n" " g1.x = texture2D(src_tex, st0.xy - ds).x;\n" " g1.y = texture2D(src_tex, st0.xy).x;\n" " g1.z = texture2D(src_tex, st0.zw).x;\n" " g1.w = texture2D(src_tex, st1.xy).x;\n" " g2.x = texture2D(src_tex, st1.zw).x;\n" " g2.y = texture2D(src_tex, st2.xy).x;\n" " g2.z = texture2D(src_tex, st2.zw).x;\n" " g2.w = texture2D(src_tex, st3.xy).x;\n" " g3 = texture2D(src_tex, st3.zw).x;\n" "\n" " g1 *= 255.0f;\n" " g2 *= 255.0f;\n" " g3 *= 255.0f;\n" "\n" " float v = dot(f1, g1) + dot(f2, g2);\n" " float dv = -dot(df, g1) + dot(df, vec4(g2.yzw, g3));\n" " my_FragColor.x = v;\n" " my_FragColor.y = dv;\n" " } else if (presmoothing == 2) {\n" " // This is the (odd) binomial kernel [0 1 4 6 4 1 0]\n" " const vec4 f1 = vec4(0, 1, 4, 6) / 16.0f;\n" " const vec3 f2 = vec3(4, 1, 0) / 16.0f;\n" "\n" " const vec4 df1 = vec4(-1, -4, -5, 0) / 32.0f;\n" " const vec3 df2 = vec3(5, 4, 1) / 32.0f;\n" "\n" " vec4 g1;\n" " vec3 g2;\n" "\n" " g1.x = texture2D(src_tex, st0.xy).x;\n" " g1.y = texture2D(src_tex, st0.zw).x;\n" " g1.z = texture2D(src_tex, st1.xy).x;\n" " g1.w = texture2D(src_tex, st1.zw).x;\n" " g2.x = texture2D(src_tex, st2.xy).x;\n" " g2.y = texture2D(src_tex, st2.zw).x;\n" " g2.z = texture2D(src_tex, st3.xy).x;\n" "\n" " g1 *= 255.0f;\n" " g2 *= 255.0f;\n" "\n" " float v = dot(f1, g1) + dot(f2, g2);\n" " float dv = dot(df1, g1) + dot(df2, g2);\n" " my_FragColor.x = v;\n" " my_FragColor.y = dv;\n" " } else if (presmoothing == 1) {\n" " // This is the (odd) binomial kernel [0 1 2 1 0]\n" " const vec4 f1 = vec4(0, 1, 2, 1) / 4.0f;\n" "\n" " const vec4 df1 = vec4(-1, -2, 0, 2) / 8.0f;\n" " const float df2 = 1.0 / 8.0f;\n" "\n" " vec4 g1;\n" " float g2;\n" "\n" " g1.x = texture2D(src_tex, st0.zw).x;\n" " g1.y = texture2D(src_tex, st1.xy).x;\n" " g1.z = texture2D(src_tex, st1.zw).x;\n" " g1.w = texture2D(src_tex, st2.xy).x;\n" " g2 = texture2D(src_tex, st2.zw).x;\n" "\n" " g1 *= 255.0f;\n" " g2 *= 255.0f;\n" "\n" " float v = dot(f1, g1);\n" " float dv = dot(df1, g1) + df2*g2;\n" " my_FragColor.x = v;\n" " my_FragColor.y = dv;\n" " } else if (presmoothing == 4) {\n" " // This is the (odd) kernel [0 1 6 1 0]\n" " const vec4 f1 = vec4(0, 1, 6, 1) / 8.0f;\n" "\n" " const vec4 df1 = vec4(-1, -6, 0, 6) / 16.0f;\n" " const float df2 = 1.0 / 16.0f;\n" "\n" " vec4 g1;\n" " float g2;\n" "\n" " g1.x = texture2D(src_tex, st0.zw).x;\n" " g1.y = texture2D(src_tex, st1.xy).x;\n" " g1.z = texture2D(src_tex, st1.zw).x;\n" " g1.w = texture2D(src_tex, st2.xy).x;\n" " g2 = texture2D(src_tex, st2.zw).x;\n" "\n" " g1 *= 255.0f;\n" " g2 *= 255.0f;\n" "\n" " float v = dot(f1, g1);\n" " float dv = dot(df1, g1) + df2*g2;\n" " my_FragColor.x = v;\n" " 
my_FragColor.y = dv;\n" " } else {\n" "\n" " const vec4 f1 = vec4(0, 0, 1, 0);\n" "\n" " const vec4 df1 = vec4(0, -1, 0, 1) / 2.0f;\n" "\n" " vec4 g1;\n" "\n" " g1.x = 0;\n" " g1.y = texture2D(src_tex, st1.xy).x;\n" " g1.z = texture2D(src_tex, st1.zw).x;\n" " g1.w = texture2D(src_tex, st2.xy).x;\n" "\n" " g1 *= 255.0f;\n" "\n" " float v = dot(f1, g1);\n" " float dv = dot(df1, g1);\n" " my_FragColor.x = v;\n" " my_FragColor.y = dv;\n" " }\n" "}\n"; const std::string pyramid_with_derivative_pass1h = "#version 330\n" "\n" "uniform sampler2D src_tex;\n" "\n" "uniform int presmoothing;\n" "\n" "in vec4 gl_TexCoord[4];\n" "\n" "out vec4 my_FragColor;\n" "\n" "void main()\n" "{\n" " vec4 st0 = gl_TexCoord[0];\n" " vec4 st1 = gl_TexCoord[1];\n" " vec4 st2 = gl_TexCoord[2];\n" " vec4 st3 = gl_TexCoord[3];\n" "\n" " vec3 color;\n" " if (presmoothing == 3) {\n" " // This is the (odd) binomial kernel [0 1 6 15 20 15 6 1 0]\n" " const vec4 f1 = vec4(0, 1, 6, 15) / 64.0f;\n" " const vec4 f2 = vec4(20, 15, 6, 1) / 64.0f;\n" "\n" " const vec4 df = vec4(1, 6, 14, 14) / 128.0f;\n" "\n" " vec2 ds = st0.zw - st0.xy;\n" "\n" " vec4 rawVal1, rawVal2, rawVal3, rawVal4, rawVal5, rawVal6, rawVal7, rawVal8, rawVal9;\n" "\n" " rawVal1 = texture2D(src_tex, st0.xy - ds);\n" " rawVal2 = texture2D(src_tex, st0.xy);\n" " rawVal3 = texture2D(src_tex, st0.zw);\n" " rawVal4 = texture2D(src_tex, st1.xy);\n" " rawVal5 = texture2D(src_tex, st1.zw);\n" " rawVal6 = texture2D(src_tex, st2.xy);\n" " rawVal7 = texture2D(src_tex, st2.zw);\n" " rawVal8 = texture2D(src_tex, st3.xy);\n" " rawVal9 = texture2D(src_tex, st3.zw);\n" "\n" " vec4 g1, g2, dg1, dg2;\n" " float g3, dg3;\n" " g1.x = rawVal1.x;\n" " dg1.x = rawVal1.y;\n" " g1.y = rawVal2.x;\n" " dg1.y = rawVal2.y;\n" " g1.z = rawVal3.x;\n" " dg1.z = rawVal3.y;\n" " g1.w = rawVal4.x;\n" " dg1.w = rawVal4.y;\n" " g2.x = rawVal5.x;\n" " dg2.x = rawVal5.y;\n" " g2.y = rawVal6.x;\n" " dg2.y = rawVal6.y;\n" " g2.z = rawVal7.x;\n" " dg2.z = rawVal7.y;\n" " g2.w = rawVal8.x;\n" " dg2.w = rawVal8.y;\n" " g3 = rawVal9.x;\n" " dg3 = rawVal9.y;\n" "\n" " float v = dot(f1, g1) + dot(f2, g2);\n" " float dv_dx = -dot(df, g1) + dot(df, vec4(g2.yzw, g3));\n" " float dv_dy = dot(f1, dg1) + dot(f2, dg2);\n" "\n" " color.x = v;\n" " color.y = dv_dx;\n" " color.z = dv_dy;\n" " } else if (presmoothing == 2) {\n" " // This is the (odd) binomial kernel [0 1 4 6 4 1 0]\n" " const vec4 f1 = vec4(0, 1, 4, 6) / 16.0f;\n" " const vec3 f2 = vec3(4, 1, 0) / 16.0f;\n" "\n" " const vec4 df1 = vec4(-1, -4, -5, 0) / 32.0f;\n" " const vec3 df2 = vec3(5, 4, 1) / 32.0f;\n" "\n" " vec4 rawVal1, rawVal2, rawVal3, rawVal4, rawVal5, rawVal6, rawVal7;\n" "\n" " rawVal1 = texture2D(src_tex, st0.xy);\n" " rawVal2 = texture2D(src_tex, st0.zw);\n" " rawVal3 = texture2D(src_tex, st1.xy);\n" " rawVal4 = texture2D(src_tex, st1.zw);\n" " rawVal5 = texture2D(src_tex, st2.xy);\n" " rawVal6 = texture2D(src_tex, st2.zw);\n" " rawVal7 = texture2D(src_tex, st3.xy);\n" "\n" " vec4 g1, dg1;\n" " vec3 g2, dg2;\n" "\n" " g1.x = rawVal1.x;\n" " dg1.x = rawVal1.y;\n" " g1.y = rawVal2.x;\n" " dg1.y = rawVal2.y;\n" " g1.z = rawVal3.x;\n" " dg1.z = rawVal3.y;\n" " g1.w = rawVal4.x;\n" " dg1.w = rawVal4.y;\n" " g2.x = rawVal5.x;\n" " dg2.x = rawVal5.y;\n" " g2.y = rawVal6.x;\n" " dg2.y = rawVal6.y;\n" " g2.z = rawVal7.x;\n" " dg2.z = rawVal7.y;\n" "\n" " float v = dot(f1, g1) + dot(f2, g2);\n" " float dv_dx = dot(df1, g1) + dot(df2, g2);\n" " float dv_dy = dot(f1, dg1) + dot(f2, dg2);\n" "\n" " color.x = v;\n" " color.y = dv_dx;\n" " color.z = 
dv_dy;\n" " } else if (presmoothing == 1) {\n" " // This is the (odd) binomial kernel [0 1 2 1 0]\n" " const vec4 f1 = vec4(0, 1, 2, 1) / 4.0f;\n" " // Note: f2 = 0.\n" "\n" " const vec4 df1 = vec4(-1, -2, 0, 2) / 8.0f;\n" " const float df2 = 1.0 / 8.0f;\n" "\n" " vec4 rawVal1, rawVal2, rawVal3, rawVal4, rawVal5;\n" "\n" " rawVal1 = texture2D(src_tex, st0.zw);\n" " rawVal2 = texture2D(src_tex, st1.xy);\n" " rawVal3 = texture2D(src_tex, st1.zw);\n" " rawVal4 = texture2D(src_tex, st2.xy);\n" " rawVal5 = texture2D(src_tex, st2.zw);\n" "\n" " vec4 g1, dg1;\n" " float g2, dg2;\n" "\n" " g1.x = rawVal1.x;\n" " dg1.x = rawVal1.y;\n" " g1.y = rawVal2.x;\n" " dg1.y = rawVal2.y;\n" " g1.z = rawVal3.x;\n" " dg1.z = rawVal3.y;\n" " g1.w = rawVal4.x;\n" " dg1.w = rawVal4.y;\n" " g2 = rawVal5.x;\n" " dg2 = rawVal5.y;\n" "\n" " float v = dot(f1, g1);\n" " float dv_dx = dot(df1, g1) + df2*g2;\n" " float dv_dy = dot(f1, dg1);\n" "\n" " color.x = v;\n" " color.y = dv_dx;\n" " color.z = dv_dy;\n" " } else if (presmoothing == 4) {\n" " // This is the (odd) kernel [0 1 6 1 0]\n" " const vec4 f1 = vec4(0, 1, 6, 1) / 8.0f;\n" " // Note: f2 = 0.\n" "\n" " const vec4 df1 = vec4(-1, -6, 0, 6) / 16.0f;\n" " const float df2 = 1.0 / 16.0f;\n" "\n" " vec4 rawVal1, rawVal2, rawVal3, rawVal4, rawVal5;\n" "\n" " rawVal1 = texture2D(src_tex, st0.zw);\n" " rawVal2 = texture2D(src_tex, st1.xy);\n" " rawVal3 = texture2D(src_tex, st1.zw);\n" " rawVal4 = texture2D(src_tex, st2.xy);\n" " rawVal5 = texture2D(src_tex, st2.zw);\n" "\n" " vec4 g1, dg1;\n" " float g2, dg2;\n" "\n" " g1.x = rawVal1.x;\n" " dg1.x = rawVal1.y;\n" " g1.y = rawVal2.x;\n" " dg1.y = rawVal2.y;\n" " g1.z = rawVal3.x;\n" " dg1.z = rawVal3.y;\n" " g1.w = rawVal4.x;\n" " dg1.w = rawVal4.y;\n" " g2 = rawVal5.x;\n" " dg2 = rawVal5.y;\n" "\n" " float v = dot(f1, g1);\n" " float dv_dx = dot(df1, g1) + df2*g2;\n" " float dv_dy = dot(f1, dg1);\n" "\n" " color.x = v;\n" " color.y = dv_dx;\n" " color.z = dv_dy;\n" " } else {\n" " // This is the (odd and degenerate) binomial kernel [0 0 1 0 0]\n" " const vec4 f1 = vec4(0, 0, 1, 0);\n" " // Note: f2 = 0.\n" "\n" " const vec4 df1 = vec4(0, -1, 0, 1) / 2.0f;\n" "\n" " vec4 rawVal1, rawVal2, rawVal3, rawVal4, rawVal5;\n" "\n" " rawVal1 = texture2D(src_tex, st0.zw);\n" " rawVal2 = texture2D(src_tex, st1.xy);\n" " rawVal3 = texture2D(src_tex, st1.zw);\n" " rawVal4 = texture2D(src_tex, st2.xy);\n" " rawVal5 = texture2D(src_tex, st2.zw);\n" "\n" " vec4 g1;\n" " vec4 dg1;\n" "\n" " g1.x = rawVal1.x;\n" " dg1.x = rawVal1.y;\n" " g1.y = rawVal2.x;\n" " dg1.y = rawVal2.y;\n" " g1.z = rawVal3.x;\n" " dg1.z = rawVal3.y;\n" " g1.w = rawVal4.x;\n" " dg1.w = rawVal4.y;\n" "\n" " float v = dot(f1, g1);\n" " float dv_dx = dot(df1, g1);\n" " float dv_dy = dot(f1, dg1);\n" "\n" " color.x = v;\n" " color.y = dv_dx;\n" " color.z = dv_dy;\n" " }\n" "\n" " my_FragColor.rgb = color;\n" "}\n" "\n"; const std::string pyramid_with_derivative_pass2 = "#version 330\n" "\n" "uniform sampler2D src_tex;\n" "\n" "in vec4 gl_TexCoord[4];\n" "\n" "out vec4 my_FragColor;\n" "\n" "void main()\n" "{\n" " vec4 st0 = gl_TexCoord[0];\n" " vec4 st1 = gl_TexCoord[1];\n" "\n" " vec3 val1 = texture2D(src_tex, st0.xy).xyz;\n" " vec3 val2 = texture2D(src_tex, st0.zw).xyz;\n" " vec3 val3 = texture2D(src_tex, st1.xy).xyz;\n" " vec3 val4 = texture2D(src_tex, st1.zw).xyz;\n" "\n" " // This is the (even) binomial kernel [1 3 3 1]\n" " my_FragColor.rgb = (val1 + 3*val2 + 3*val3 + val4) / 8.0f;\n" " //my_FragColor.rgb = (0*val1 + val2 + 2*val3 + val4) / 4.0f;\n" " 
//my_FragColor.rgb = (val2 + val3) / 2.0f;\n" "}\n"; const std::string tvl1_color_flow_QR_update_uv = "#version 330\n" "\n" "uniform sampler2D uv_src;\n" "uniform sampler2D p_uv_src;\n" "uniform sampler2D warped_R_tex;\n" "uniform sampler2D warped_G_tex;\n" "uniform sampler2D warped_B_tex;\n" "\n" "in vec4 gl_TexCoord[4];\n" "\n" "out vec4 my_FragColor;" "\n" "uniform float lambda_theta;\n" "uniform float theta;\n" "\n" "vec3 thresholdingStep(vec3 a2, vec3 b, float lambda_theta)\n" "{\n" " vec3 lam_a2 = lambda_theta * a2;\n" " vec3 result;\n" " result.x = (b.x + lam_a2.x < 0) ? lambda_theta : ((b.x - lam_a2.x > 0) ? -lambda_theta : (-b.x/a2.x));\n" " result.y = (b.y + lam_a2.y < 0) ? lambda_theta : ((b.y - lam_a2.y > 0) ? -lambda_theta : (-b.y/a2.y));\n" " result.z = (b.z + lam_a2.z < 0) ? lambda_theta : ((b.z - lam_a2.z > 0) ? -lambda_theta : (-b.z/a2.z));\n" " return result;\n" "}\n" "\n" "void main(void)\n" "{\n" " vec2 st0 = gl_TexCoord[0].st;\n" " vec4 stEW = gl_TexCoord[1];\n" " vec4 stSN = gl_TexCoord[2];\n" "\n" " vec3 warped_R = texture2D(warped_R_tex, st0).xyz;\n" " vec3 warped_G = texture2D(warped_G_tex, st0).xyz;\n" " vec3 warped_B = texture2D(warped_B_tex, st0).xyz;\n" "\n" " // Normalize here to allow lower precision for the warped buffer\n" " warped_R /= 255;\n" " warped_G /= 255;\n" " warped_B /= 255;\n" "\n" " vec2 stW = stEW.zw;\n" " vec2 stN = stSN.zw;\n" "\n" " bool isLeftBorder = (stW.x < 0);\n" " bool isRightBorder = (stEW.x > 1);\n" " bool isTopBorder = (stN.y < 0);\n" " bool isBottomBorder = (stSN.y > 1);\n" "\n" " vec2 uv = texture2D(uv_src, st0).xy;\n" "\n" " vec4 p_uv = texture2D(p_uv_src, st0);\n" " vec2 p1_W_uv = texture2D(p_uv_src, stW).xy;\n" " vec2 p2_N_uv = texture2D(p_uv_src, stN).zw;\n" "\n" " p1_W_uv = isLeftBorder ? vec2(0.0f) : p1_W_uv;\n" " p2_N_uv = isTopBorder ? vec2(0.0f) : p2_N_uv;\n" " p_uv.xy = isRightBorder ? vec2(0.0f) : p_uv.xy;\n" " p_uv.zw = isBottomBorder ? 
vec2(0.0f) : p_uv.zw;\n" "\n" " vec2 div_p = p_uv.xy - p1_W_uv + p_uv.zw - p2_N_uv;\n" "\n" " // new u and v\n" " vec3 b = vec3(0);\n" " b.x = dot(warped_R, vec3(1, uv));\n" " b.y = dot(warped_G, vec3(1, uv));\n" " b.z = dot(warped_B, vec3(1, uv));\n" "\n" " vec3 r2 = vec3(0);\n" " r2.x = dot(warped_R.yz, warped_R.yz);\n" " r2.y = dot(warped_G.yz, warped_G.yz);\n" " r2.z = dot(warped_B.yz, warped_B.yz);\n" "\n" " vec3 step = thresholdingStep(r2, b, lambda_theta);\n" "\n" " vec2 UV = vec2(0.0f);\n" " UV.x += dot(step, vec3(warped_R.y, warped_G.y, warped_B.y));\n" " UV.y += dot(step, vec3(warped_R.z, warped_G.z, warped_B.z));\n" " UV /= 3.0;\n" " UV += uv;\n" "\n" " vec2 uv_out = UV + theta * div_p;\n" "\n" " my_FragColor = vec4(uv_out.x, uv_out.y, 0.0f, 0.0f);\n" "}\n" "\n"; } slowmovideo-0.5+git20180116/src/V3D/GL/v3d_gpubase.h0000664000000000000000000002751413151342440020075 0ustar rootroot// -*- C++ -*- #include "config.h" #ifndef V3D_GPUBASE_H #define V3D_GPUBASE_H # if defined(V3DLIB_ENABLE_GPGPU) #include #include #include #include #include #define checkGLErrorsHere0() { V3D_GPU::checkGLErrors(__FILE__, __LINE__); } #define checkGLErrorsHere1(NAME) { V3D_GPU::checkGLErrors(__FILE__, __LINE__, NAME); } #define raiseGLErrorHere1(MSG) { V3D_GPU::raiseGLError(__FILE__, __LINE__, MSG); } #define raiseGLErrorHere2(MSG, NAME) { V3D_GPU::raiseGLError(__FILE__, __LINE__, MSG, NAME); } namespace V3D_GPU { typedef unsigned char uchar; struct TextureSpecification { TextureSpecification() : nChannels(3), nBitsPerChannel(8), isFloatTexture(false), isDepthTexture(false), isRTT(true), enableTextureRG(false) { } TextureSpecification(char const * specString); unsigned int getGLInternalFormat() const; uchar nChannels, nBitsPerChannel; bool isFloatTexture, isDepthTexture; bool isRTT, enableTextureRG; }; // end struct TextureSpecification struct ImageTexture2D { ImageTexture2D(char const * texName = "") : _texName(texName), _textureID(0), _textureTarget(0), _width(0), _height(0) { } virtual ~ImageTexture2D() { } bool allocateID(); void deallocateID(); void reserve(int width, int height, TextureSpecification const& texSpec); void overwriteWith(uchar const * pixels, int nChannels); void overwriteWith(uchar const * redPixels, uchar const * greenPixels, uchar const * bluePixels); void overwriteWith(uchar const * redPixels, uchar const * greenPixels, uchar const * bluePixels, uchar const * alphaPixels); void overwriteWith(float const * pixels, int nChannels); void overwriteWith(float const * redPixels, float const * greenPixels, float const * bluePixels); void overwriteWith(float const * redPixels, float const * greenPixels, float const * bluePixels, float const * alphaPixels); void clear(); void bind(); void bind(unsigned texUnit); void enable(); void enable(unsigned texUnit); void disable(); void disable(unsigned texUnit); unsigned width() const { return _width; } unsigned height() const { return _height; } unsigned textureID() const { return _textureID; } unsigned textureTarget() const { return _textureTarget; } protected: std::string _texName; unsigned _textureID; unsigned _textureTarget; unsigned _width, _height; }; // end struct ImageTexture2D struct FrameBufferObject { FrameBufferObject(char const * fboName = "") : _fboName(fboName), _attachedDepthTexture(0) { std::fill(_attachedColorTextures, _attachedColorTextures+16, (ImageTexture2D *)0); } virtual ~FrameBufferObject() { } bool allocate(); void deallocate(); void makeCurrent(); void activate(bool setViewport = true); bool isCurrent(); void 
attachTexture2D(ImageTexture2D& texture, GLenum attachment = GL_COLOR_ATTACHMENT0_EXT, int mipLevel = 0); void attachTextures2D(int numTextures, ImageTexture2D * textures, GLenum * attachment = 0, int * mipLevel = 0); void detach(GLenum attachment); void detachAll(); # ifndef NDEBUG_GL bool checkValidity(std::ostream& os = std::cerr); # else bool checkValidity(std::ostream& os = std::cerr) { return true; } # endif unsigned width() const { for (int i = 0; i < 16; ++i) if (_attachedColorTextures[i]) return _attachedColorTextures[i]->width(); return 0; } unsigned height() const { for (int i = 0; i < 16; ++i) if (_attachedColorTextures[i]) return _attachedColorTextures[i]->height(); return 0; } unsigned frameBufferID() const { return _fboID; } ImageTexture2D& getColorTexture(int i) const { return *_attachedColorTextures[i]; } static int getMaxColorAttachments(); static void disableFBORendering(); protected: bool checkBinding(char const * what); std::string _fboName; GLuint _fboID; ImageTexture2D * _attachedColorTextures[16]; ImageTexture2D * _attachedDepthTexture; }; // end struct FrameBufferObject struct RTT_Buffer { RTT_Buffer(char const * texSpec = "rgb=8", char const * rttName = "") : _texSpec(texSpec), _tex(rttName), _fbo(rttName) { } virtual ~RTT_Buffer() { } bool allocate(int const w, int const h) { _tex.allocateID(); _tex.reserve(w, h, TextureSpecification(_texSpec.c_str())); _fbo.allocate(); _fbo.makeCurrent(); _fbo.attachTexture2D(_tex); _fbo.checkValidity(); return true; } bool allocate(char const * texSpec, int const w, int const h) { _texSpec = texSpec; return allocate(w,h); } bool reallocate(int w, int h) { _tex.reserve(w, h, TextureSpecification(_texSpec.c_str())); return true; } void deallocate() { _fbo.deallocate(); _tex.deallocateID(); } unsigned width() const { return _tex.width(); } unsigned height() const { return _tex.height(); } unsigned textureID() const { return _tex.textureID(); } unsigned textureTarget() const { return _tex.textureTarget(); } void bindTexture() { _tex.bind(); } void bindTexture(unsigned texUnit) { _tex.bind(texUnit); } void enableTexture() { _tex.enable(); } void enableTexture(unsigned texUnit) { _tex.enable(texUnit); } void disableTexture() { _tex.disable(); } void disableTexture(unsigned texUnit) { _tex.disable(texUnit); } void makeCurrent() { _fbo.makeCurrent(); } void activate(bool setViewport = true) { _fbo.activate(setViewport); } bool isCurrent() { return _fbo.isCurrent(); } ImageTexture2D& getTexture() { return _tex; } FrameBufferObject& getFBO() { return _fbo; } protected: std::string _texSpec; ImageTexture2D _tex; FrameBufferObject _fbo; }; // end struct RTT_Buffer //---------------------------------------------------------------------- struct ProgramBase { ProgramBase(char const * shaderName) : _shaderName(shaderName) { } virtual ~ProgramBase() { } virtual void setProgram(char const * source) = 0; virtual void setProgram(std::string const& source) { this->setProgram(source.c_str()); } virtual void compile(char const * * compilerArgs = 0, char const *entry = 0) = 0; virtual void compile(std::vector const& compilerArgs, char const *entry = 0) = 0; virtual void enable() = 0; virtual void disable() = 0; virtual void parameter(char const * param, float x) = 0; virtual void parameter(char const * param, float x, float y) = 0; virtual void parameter(char const * param, float x, float y, float z) = 0; virtual void parameter(char const * param, float x, float y, float z, float w) = 0; virtual void parameter(char const * param, int len, float const 
* array) = 0; virtual void matrixParameterR(char const * param, int rows, int cols, double const * values) = 0; virtual void matrixParameterC(char const * param, int rows, int cols, double const * values) = 0; virtual unsigned getTexUnit(char const * param) = 0; std::string const& shaderName() const { return _shaderName; } protected: std::string _shaderName; }; struct GLSL_FragmentProgram : public ProgramBase { GLSL_FragmentProgram(char const * shaderName) : ProgramBase(shaderName), _source(), _program(0), _inUse(false), _texUnitMap() { } virtual ~GLSL_FragmentProgram() { } virtual void setProgram(char const * source); virtual void compile(char const * * compilerArgs = 0, char const *entry = 0); virtual void compile(std::vector const& compilerArgs, char const *entry = 0); virtual void bindFragDataLocation(const std::string& var); virtual void enable(); virtual void disable(); virtual void bindTexture(const std::string& name, unsigned int unit); virtual void parameter(char const * param, int x); virtual void parameter(char const * param, float x); virtual void parameter(char const * param, float x, float y); virtual void parameter(char const * param, float x, float y, float z); virtual void parameter(char const * param, float x, float y, float z, float w); virtual void parameter(char const * param, int len, float const * array); virtual void matrixParameterR(char const * param, int rows, int cols, double const * values); virtual void matrixParameterC(char const * param, int rows, int cols, double const * values); virtual unsigned getTexUnit(char const * param); protected: std::string _source; GLhandleARB _program; bool _inUse; std::map _texUnitMap; }; //---------------------------------------------------------------------- # ifndef NDEBUG_GL void checkGLErrors(char const * location, std::ostream& os = std::cerr); void checkGLErrors(char const * file, int line, std::ostream& os = std::cerr); void checkGLErrors(char const * file, int line, std::string const& name, std::ostream& os = std::cerr); bool checkFrameBufferStatus(char const * file, int line, std::string const& name, std::ostream& os = std::cerr); # else inline void checkGLErrors(char const * location, std::ostream& os = std::cerr) { } inline void checkGLErrors(char const * file, int line, std::ostream& os = std::cerr) { } inline void checkGLErrors(char const * file, int line, std::string const& name, std::ostream& os = std::cerr) { } inline bool checkFrameBufferStatus(char const * file, int line, std::string const& name, std::ostream& os = std::cerr) { return true; } # endif void raiseGLError(char const * file, int line, char const * msg, std::ostream& os = std::cerr); void raiseGLError(char const * file, int line, char const * msg, std::string const& name, std::ostream& os = std::cerr); //---------------------------------------------------------------------- enum GPUTextureSamplingPattern { GPU_SAMPLE_NONE = 0, GPU_SAMPLE_NEIGHBORS = 1, //!< Sample the 4 direct neighbors (E, W, N, S) GPU_SAMPLE_REVERSE_NEIGHBORS = 2, //!< Sample the 4 direct neighbors (W, E, S, N) GPU_SAMPLE_DIAG_NEIGBORS = 3, //!< Sample the diagonal neighbors (NE, SW, NW, SE) GPU_SAMPLE_2X2_BLOCK = 4, //!< Sample the 2x2 block (here, E, S, SE) }; void setupNormalizedProjection(bool flipY = false); void renderNormalizedQuad(); void renderNormalizedQuad(GPUTextureSamplingPattern pattern, float ds, float dt); void enableTrivialTexture2DShader(); void disableTrivialTexture2DShader(); } // end namespace V3D_GPU # endif // defined(V3DLIB_ENABLE_GPGPU) #endif // 
defined(V3D_GPUBASE_H) slowmovideo-0.5+git20180116/src/V3D/GL/v3d_gpucolorflow.h0000664000000000000000000000644213151342440021166 0ustar rootroot// -*- C++ -*- #include "config.h" #ifndef V3D_GPU_COLOR_FLOW_H #define V3D_GPU_COLOR_FLOW_H #include "GL/v3d_gpubase.h" #include "GL/v3d_gpuflow.h" namespace V3D_GPU { struct TVL1_ColorFlowEstimatorBase { TVL1_ColorFlowEstimatorBase(int nLevels) : _warpedBufferHighPrecision(true), _uvBufferHighPrecision(true), _pBufferHighPrecision(false), // fp16 is usually enough for p _nOuterIterations(1), _nInnerIterations(50), _startLevel(0), _nLevels(nLevels), _width(-1), _height(-1) { } void setLambda(float lambda) { _lambda = lambda; } void setOuterIterations(int nIters) { _nOuterIterations = nIters; } void setInnerIterations(int nIters) { _nInnerIterations = nIters; } void setStartLevel(int startLevel) { _startLevel = startLevel; } // Must be called before allocate() to have an effect. void configurePrecision(bool warpedBufferHighPrecision, bool uvBufferHighPrecision, bool pBufferHighPrecision) { _warpedBufferHighPrecision = warpedBufferHighPrecision; _uvBufferHighPrecision = uvBufferHighPrecision; _pBufferHighPrecision = pBufferHighPrecision; } void allocate(int w, int h); void deallocate(); RTT_Buffer * getWarpedBuffer(int channel, int level) { return _warpedBufferPyramids[channel][level]; } protected: bool _warpedBufferHighPrecision, _uvBufferHighPrecision, _pBufferHighPrecision; std::vector _warpedBufferPyramids[3]; int _nOuterIterations, _nInnerIterations; float _lambda; int _startLevel, _nLevels; int _width, _height; }; // end struct TVL1_ColorFlowEstimatorBase //---------------------------------------------------------------------- // Quadratic relaxation approach struct TVL1_ColorFlowEstimator_QR : public TVL1_ColorFlowEstimatorBase { public: struct Config { Config(float tau = 0.249f, float theta = 0.1f) : _tau(tau), _theta(theta) { } float _tau, _theta; }; TVL1_ColorFlowEstimator_QR(int nLevels) : TVL1_ColorFlowEstimatorBase(nLevels) { _shader_uv = 0; _shader_p = 0; } ~TVL1_ColorFlowEstimator_QR() { } void configure(Config const& cfg) { _cfg = cfg; } void allocate(int w, int h); void deallocate(); void run(unsigned int I0_TexIDs[3], unsigned int I1_TexIDs[3]); unsigned int getFlowFieldTextureID() { return _uBuffer2Pyramid[_startLevel]->textureID(); } RTT_Buffer *getFlowBuffer() { return _uBuffer2Pyramid[_startLevel]; } protected: Config _cfg; GLSL_FragmentProgram *_shader_uv; GLSL_FragmentProgram *_shader_p; std::vector _uBuffer1Pyramid, _uBuffer2Pyramid; std::vector _pBuffer1Pyramid, _pBuffer2Pyramid; }; // end struct TVL1_ColorFlowEstimator_QR } // end namespace V3D_GPU #endif // defined(V3D_GPU_FLOW_H) slowmovideo-0.5+git20180116/src/V3D/GL/v3d_gpuflow.h0000664000000000000000000000411713151342440020124 0ustar rootroot// -*- C++ -*- #include "config.h" #ifndef V3D_GPU_FLOW_H #define V3D_GPU_FLOW_H #include "GL/v3d_gpubase.h" #include "GL/v3d_gpupyramid.h" namespace V3D_GPU { struct TVL1_FlowEstimatorBase { TVL1_FlowEstimatorBase(int nLevels) : _warpedBufferHighPrecision(true), _uvBufferHighPrecision(true), _pBufferHighPrecision(false), // fp16 is usually enough for p _nOuterIterations(1), _nInnerIterations(50), _startLevel(0), _nLevels(nLevels), _width(-1), _height(-1) { } void setLambda(float lambda) { _lambda = lambda; } void setOuterIterations(int nIters) { _nOuterIterations = nIters; } void setInnerIterations(int nIters) { _nInnerIterations = nIters; } void setStartLevel(int startLevel) { _startLevel = startLevel; } // Must be called 
before allocate() to have an effect. void configurePrecision(bool warpedBufferHighPrecision, bool uvBufferHighPrecision, bool pBufferHighPrecision) { _warpedBufferHighPrecision = warpedBufferHighPrecision; _uvBufferHighPrecision = uvBufferHighPrecision; _pBufferHighPrecision = pBufferHighPrecision; } void allocate(int w, int h); void deallocate(); RTT_Buffer * getWarpedBuffer(int level) { return _warpedBufferPyramid[level]; } protected: bool _warpedBufferHighPrecision, _uvBufferHighPrecision, _pBufferHighPrecision; std::vector _warpedBufferPyramid; int _nOuterIterations, _nInnerIterations; float _lambda; int _startLevel, _nLevels; int _width, _height; }; // end struct TVL1_FlowEstimatorBase //---------------------------------------------------------------------- void warpImageWithFlowField(unsigned int uv_tex, unsigned int I0_tex, unsigned int I1_tex, int level, RTT_Buffer& dest); } // end namespace V3D_GPU #endif // defined(V3D_GPU_FLOW_H) slowmovideo-0.5+git20180116/src/V3D/GL/v3d_gpuflow.cpp0000664000000000000000000002115313151342440020456 0ustar rootroot#include "config.h" #include "Base/v3d_utilities.h" #include "v3d_gpuflow.h" #include "glsl_shaders.h" #include #include #include using namespace std; using namespace V3D_GPU; namespace { void upsampleDisparities(unsigned uvSrcTex, unsigned pSrcTex, float pScale, RTT_Buffer& ubuffer, RTT_Buffer& pbuffer) { static GLSL_FragmentProgram * upsampleShader = 0; if (upsampleShader == 0) { upsampleShader = new GLSL_FragmentProgram("v3d_gpuflow::upsampleDisparities::upsampleShader"); char const * source = "void main(uniform sampler2D src_tex : TEXTURE0, \n" " float2 st0 : TEXCOORD0, \n" " float4 st3 : TEXCOORD3, \n" " out float4 res_out : COLOR0) \n" "{ \n" " res_out = st3 * tex2D(src_tex, st0); \n" "} \n"; upsampleShader->setProgram(source); upsampleShader->compile(); checkGLErrorsHere0(); } // end if setupNormalizedProjection(); ubuffer.activate(); glActiveTexture(GL_TEXTURE0); glBindTexture(GL_TEXTURE_2D, uvSrcTex); glEnable(GL_TEXTURE_2D); upsampleShader->enable(); // Provide uniform paramter via texcoord to avoid recompilation of shaders glMultiTexCoord4f(GL_TEXTURE3_ARB, 2, 2, 1, 1); //glMultiTexCoord4f(GL_TEXTURE3_ARB, 0, 0, 0, 0); renderNormalizedQuad(); //upsampleShader->disable(); glDisable(GL_TEXTURE_2D); checkGLErrorsHere0(); pbuffer.activate(); glActiveTexture(GL_TEXTURE0); glBindTexture(GL_TEXTURE_2D, pSrcTex); glEnable(GL_TEXTURE_2D); //upsampleShader->enable(); // Provide uniform paramter via texcoord to avoid recompilation of shaders glMultiTexCoord4f(GL_TEXTURE3_ARB, pScale, pScale, pScale, pScale); renderNormalizedQuad(); upsampleShader->disable(); glDisable(GL_TEXTURE_2D); checkGLErrorsHere0(); } // upsampleDisparities() void upsampleBuffer(unsigned srcTex, float scale, FrameBufferObject& dstFbo) { static GLSL_FragmentProgram * upsampleShader = 0; if (upsampleShader == 0) { upsampleShader = new GLSL_FragmentProgram("v3d_gpuflow::upsampleBuffer::upsampleShader"); char const * source = "void main(uniform sampler2D src_tex : TEXTURE0, \n" " float2 st0 : TEXCOORD0, \n" " float4 st3 : TEXCOORD3, \n" " out float4 res_out : COLOR0) \n" "{ \n" " res_out = st3 * tex2D(src_tex, st0); \n" "} \n"; upsampleShader->setProgram(source); upsampleShader->compile(); checkGLErrorsHere0(); } // end if setupNormalizedProjection(); dstFbo.activate(); glActiveTexture(GL_TEXTURE0); glBindTexture(GL_TEXTURE_2D, srcTex); glEnable(GL_TEXTURE_2D); upsampleShader->enable(); // Provide uniform paramter via texcoord to avoid recompilation of shaders 
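// (Editorial note: the scale is passed through texture-coordinate set 3 so the same
//  compiled shader can be reused for any scale; the fragment program reads it back as
//  TEXCOORD3.  A hypothetical plain-uniform alternative would be
//  upsampleShader->parameter("scale", scale); with a matching 'uniform float scale'
//  declared in the shader source.)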
glMultiTexCoord4f(GL_TEXTURE3_ARB, scale, scale, scale, scale); renderNormalizedQuad(); upsampleShader->disable(); glDisable(GL_TEXTURE_2D); checkGLErrorsHere0(); } // upsampleBuffer() void upsampleBuffers(unsigned src1Tex, unsigned src2Tex, float scale1, float scale2, FrameBufferObject& dstFbo) { static GLSL_FragmentProgram * upsampleShader = 0; if (upsampleShader == 0) { upsampleShader = new GLSL_FragmentProgram("v3d_gpuflow::upsampleBuffer::upsampleShader"); char const * source = "void main(uniform sampler2D src1_tex : TEXTURE0, \n" " uniform sampler2D src2_tex : TEXTURE1, \n" " float2 st0 : TEXCOORD0, \n" " float4 st3 : TEXCOORD3, \n" " float4 st4 : TEXCOORD4, \n" " out float4 res1_out : COLOR0, \n" " out float4 res2_out : COLOR1) \n" "{ \n" " res1_out = st3 * tex2D(src1_tex, st0); \n" " res2_out = st4 * tex2D(src2_tex, st0); \n" "} \n"; upsampleShader->setProgram(source); upsampleShader->compile(); checkGLErrorsHere0(); } // end if setupNormalizedProjection(); dstFbo.activate(); GLenum const targetBuffers[2] = { GL_COLOR_ATTACHMENT0_EXT, GL_COLOR_ATTACHMENT1_EXT }; glDrawBuffersARB(2, targetBuffers); glActiveTexture(GL_TEXTURE0); glBindTexture(GL_TEXTURE_2D, src1Tex); glEnable(GL_TEXTURE_2D); glActiveTexture(GL_TEXTURE1); glBindTexture(GL_TEXTURE_2D, src2Tex); glEnable(GL_TEXTURE_2D); upsampleShader->enable(); // Provide uniform parameter via texcoord to avoid recompilation of shaders // Texcoords 0-2 are assigned by renderNormalizedQuad(). glMultiTexCoord4f(GL_TEXTURE3_ARB, scale1, scale1, scale1, scale1); glMultiTexCoord4f(GL_TEXTURE4_ARB, scale2, scale2, scale2, scale2); renderNormalizedQuad(); upsampleShader->disable(); glActiveTexture(GL_TEXTURE0); glDisable(GL_TEXTURE_2D); glActiveTexture(GL_TEXTURE1); glDisable(GL_TEXTURE_2D); glDrawBuffer(GL_COLOR_ATTACHMENT0_EXT); checkGLErrorsHere0(); } // upsampleBuffers() } // end namespace //---------------------------------------------------------------------- namespace V3D_GPU { void TVL1_FlowEstimatorBase::allocate(int W, int H) { _width = W; _height = H; char const * texSpec = _warpedBufferHighPrecision ?
"rgba=32f tex2D" : "rgba=16f tex2D"; _warpedBufferPyramid.resize(_nLevels); for (int level = 0; level < _nLevels; ++level) { int const w = _width / (1 << level); int const h = _height / (1 << level); _warpedBufferPyramid[level] = new RTT_Buffer(texSpec, "_warpedBufferPyramid[]"); _warpedBufferPyramid[level]->allocate(w, h); } } // end TVL1_FlowEstimatorBase::allocate() void TVL1_FlowEstimatorBase::deallocate() { for (int level = 0; level < _nLevels; ++level) _warpedBufferPyramid[level]->deallocate(); } //---------------------------------------------------------------------- void warpImageWithFlowField(unsigned int uv_tex, unsigned int I0_tex, unsigned int I1_tex, int level, RTT_Buffer& dest) { static GLSL_FragmentProgram * shader = 0; if (shader == 0) { shader = new GLSL_FragmentProgram("v3d_gpuflow::warpImageWithFlowField::shader"); shader->setProgram(GLSL_Shaders::flow_warp_image.c_str()); shader->compile(); checkGLErrorsHere0(); } int const w = dest.width(); int const h = dest.height(); dest.activate(); setupNormalizedProjection(); glActiveTexture(GL_TEXTURE0); glBindTexture(GL_TEXTURE_2D, uv_tex); glEnable(GL_TEXTURE_2D); glActiveTexture(GL_TEXTURE1); glBindTexture(GL_TEXTURE_2D, I0_tex); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_BASE_LEVEL, level); //glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST_MIPMAP_NEAREST); glEnable(GL_TEXTURE_2D); glActiveTexture(GL_TEXTURE2); glBindTexture(GL_TEXTURE_2D, I1_tex); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_BASE_LEVEL, level); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR); //glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_NEAREST); glEnable(GL_TEXTURE_2D); // Provide uniform paramter via texcoord to avoid recompilation of shaders glMultiTexCoord3f(GL_TEXTURE3_ARB, 1.0f/w, 1.0f/h, 1 << level); shader->enable(); shader->bindTexture("uv_src", 0); shader->bindTexture("I0_tex", 1); shader->bindTexture("I1_tex", 2); renderNormalizedQuad(); shader->disable(); glActiveTexture(GL_TEXTURE0); glDisable(GL_TEXTURE_2D); glActiveTexture(GL_TEXTURE1); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_BASE_LEVEL, 0); //glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST); glDisable(GL_TEXTURE_2D); glActiveTexture(GL_TEXTURE2); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_BASE_LEVEL, 0); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST); glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST); glDisable(GL_TEXTURE_2D); } // end warpImageWithFlowField() } // end namespace V3D_GPU slowmovideo-0.5+git20180116/src/V3D/GL/glsl_shaders.h0000664000000000000000000000066313151342440020341 0ustar rootroot#ifndef GLSL_SHADERS_H #define GLSL_SHADERS_H #include namespace GLSL_Shaders { extern const std::string flow_warp_image; extern const std::string pyramid_with_derivative_pass1h; extern const std::string pyramid_with_derivative_pass1v; extern const std::string pyramid_with_derivative_pass2; extern const std::string tvl1_color_flow_QR_update_uv; extern const std::string tvl1_flow_new_update_p; } #endif slowmovideo-0.5+git20180116/src/V3D/GL/v3d_gpupyramid.h0000664000000000000000000000331613151342440020622 0ustar rootroot// -*- C++ -*- #include "config.h" #ifndef V3D_GPU_PYRAMID_H #define V3D_GPU_PYRAMID_H #include "v3d_gpubase.h" namespace V3D_GPU { struct PyramidWithDerivativesCreator { PyramidWithDerivativesCreator(bool useFP32 = false, char const * srcTexSpec = "r=8 noRTT") : _useFP32(useFP32), _width(0), _height(0), 
_nLevels(0), _srcTexSpec(srcTexSpec), _preSmoothingFilter(0), _pass1HorizShader(0), _pass1VertShader(0), _pass2Shader(0), shaders_initialized(false) { } ~PyramidWithDerivativesCreator() { } int numberOfLevels() const { return _nLevels; } void allocate(int w, int h, int nLevels, int preSmoothingFilter = 0); void deallocate(); void buildPyramidForGrayscaleImage(uchar const * image); void buildPyramidForGrayscaleImage(float const * image); void buildPyramidForGrayscaleTexture(unsigned int srcTexID); void activateTarget(int level); unsigned int textureID() const { return _pyrTexID; } unsigned int sourceTextureID() const { return _srcTex.textureID(); } void initializeShaders(int presmoohing); protected: bool const _useFP32; int _width, _height, _nLevels; unsigned int _pyrFbIDs, _tmpFbIDs, _tmp2FbID; unsigned int _pyrTexID, _tmpTexID, _tmpTex2ID; ImageTexture2D _srcTex; char const * _srcTexSpec; int _preSmoothingFilter; GLSL_FragmentProgram *_pass1HorizShader; GLSL_FragmentProgram *_pass1VertShader; GLSL_FragmentProgram *_pass2Shader; bool shaders_initialized; }; // end struct PyramidWithDerivativesCreator } // end namespace V3D_GPU #endif slowmovideo-0.5+git20180116/src/V3D/GL/v3d_gpupyramid.cpp0000664000000000000000000002335313151342440021160 0ustar rootroot#include "config.h" #include "v3d_gpupyramid.h" #include "Base/v3d_timer.h" #include "glsl_shaders.h" #include #include #include #include using namespace std; using namespace V3D_GPU; using namespace V3D; namespace { inline void renderQuad4Tap(float dS, float dT) { glBegin(GL_TRIANGLES); glMultiTexCoord4f(GL_TEXTURE0_ARB, 0-1*dS, 0-1*dT, 0-0*dS, 0-0*dT); glMultiTexCoord4f(GL_TEXTURE1_ARB, 0+1*dS, 0+1*dT, 0+2*dS, 0+2*dT); glVertex2f(0, 0); glMultiTexCoord4f(GL_TEXTURE0_ARB, 2-1*dS, 0-1*dT, 2-0*dS, 0-0*dT); glMultiTexCoord4f(GL_TEXTURE1_ARB, 2+1*dS, 0+1*dT, 2+2*dS, 0+2*dT); glVertex2f(2, 0); glMultiTexCoord4f(GL_TEXTURE0_ARB, 0-1*dS, 2-1*dT, 0-0*dS, 2-0*dT); glMultiTexCoord4f(GL_TEXTURE1_ARB, 0+1*dS, 2+1*dT, 0+2*dS, 2+2*dT); glVertex2f(0, 2); glEnd(); } // end renderQuad() inline void renderQuad8Tap(float dS, float dT) { glBegin(GL_TRIANGLES); glMultiTexCoord4f(GL_TEXTURE0_ARB, 0-3*dS, 0-3*dT, 0-2*dS, 0-2*dT); glMultiTexCoord4f(GL_TEXTURE1_ARB, 0-1*dS, 0-1*dT, 0-0*dS, 0-0*dT); glMultiTexCoord4f(GL_TEXTURE2_ARB, 0+1*dS, 0+1*dT, 0+2*dS, 0+2*dT); glMultiTexCoord4f(GL_TEXTURE3_ARB, 0+3*dS, 0+3*dT, 0+4*dS, 0+4*dT); glVertex2f(0, 0); glMultiTexCoord4f(GL_TEXTURE0_ARB, 2-3*dS, 0-3*dT, 2-2*dS, 0-2*dT); glMultiTexCoord4f(GL_TEXTURE1_ARB, 2-1*dS, 0-1*dT, 2-0*dS, 0-0*dT); glMultiTexCoord4f(GL_TEXTURE2_ARB, 2+1*dS, 0+1*dT, 2+2*dS, 0+2*dT); glMultiTexCoord4f(GL_TEXTURE3_ARB, 2+3*dS, 0+3*dT, 2+4*dS, 0+4*dT); glVertex2f(2, 0); glMultiTexCoord4f(GL_TEXTURE0_ARB, 0-3*dS, 2-3*dT, 0-2*dS, 2-2*dT); glMultiTexCoord4f(GL_TEXTURE1_ARB, 0-1*dS, 2-1*dT, 0-0*dS, 2-0*dT); glMultiTexCoord4f(GL_TEXTURE2_ARB, 0+1*dS, 2+1*dT, 0+2*dS, 2+2*dT); glMultiTexCoord4f(GL_TEXTURE3_ARB, 0+3*dS, 2+3*dT, 0+4*dS, 2+4*dT); glVertex2f(0, 2); glEnd(); } // end renderQuad() } // end namespace <> //---------------------------------------------------------------------- namespace V3D_GPU { void PyramidWithDerivativesCreator::allocate(int w, int h, int nLevels, int preSmoothingFilter) { ScopedTimer st(__PRETTY_FUNCTION__); _width = w; _height = h; _nLevels = nLevels; _preSmoothingFilter = preSmoothingFilter; GLenum const textureTarget = GL_TEXTURE_2D; GLenum const floatFormat = _useFP32 ? 
GL_RGBA32F_ARB : GL_RGBA16F_ARB; glGenFramebuffersEXT(1, &_pyrFbIDs); glGenFramebuffersEXT(1, &_tmpFbIDs); glGenFramebuffersEXT(1, &_tmp2FbID); checkGLErrorsHere0(); _srcTex.allocateID(); _srcTex.reserve(w, h, TextureSpecification(_srcTexSpec)); glGenTextures(1, &_pyrTexID); glGenTextures(1, &_tmpTexID); glGenTextures(1, &_tmpTex2ID); //cout << "pyrTexID" << endl; glBindTexture(textureTarget, _pyrTexID); glTexParameteri(textureTarget, GL_TEXTURE_MAG_FILTER, GL_NEAREST); glTexParameteri(textureTarget, GL_TEXTURE_MIN_FILTER, GL_NEAREST); //glTexParameteri(textureTarget, GL_TEXTURE_MIN_FILTER, GL_NEAREST_MIPMAP_NEAREST); glTexParameteri(textureTarget, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); glTexParameteri(textureTarget, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE); glTexParameteri(textureTarget, GL_TEXTURE_BASE_LEVEL, 0); glTexParameteri(textureTarget, GL_TEXTURE_MAX_LEVEL, nLevels-1); // This is the simplest way to create the full mipmap pyramid. for (int n = 0; n < nLevels; ++n) glTexImage2D(textureTarget, n, floatFormat, w >> n , h >> n , 0, GL_RGBA, GL_UNSIGNED_BYTE,0); checkGLErrorsHere0(); //cout << "tmpTexID" << endl; glBindTexture(textureTarget, _tmpTexID); glTexParameteri(textureTarget, GL_TEXTURE_MAG_FILTER, GL_NEAREST); glTexParameteri(textureTarget, GL_TEXTURE_MIN_FILTER, GL_NEAREST); //glTexParameteri(textureTarget, GL_TEXTURE_MIN_FILTER, GL_NEAREST_MIPMAP_NEAREST); glTexParameteri(textureTarget, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); glTexParameteri(textureTarget, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE); glTexParameteri(textureTarget, GL_TEXTURE_BASE_LEVEL, 0); glTexParameteri(textureTarget, GL_TEXTURE_MAX_LEVEL, nLevels-1); for (int n = 0; n < nLevels; ++n) glTexImage2D(textureTarget, n, floatFormat, w >> n, (h/2) >> n, 0, GL_RGBA, GL_UNSIGNED_BYTE,0); //cout << "tmpTex2ID" << endl; glBindTexture(textureTarget, _tmpTex2ID); glTexParameteri(textureTarget, GL_TEXTURE_MAG_FILTER, GL_NEAREST); glTexParameteri(textureTarget, GL_TEXTURE_MIN_FILTER, GL_NEAREST); glTexParameteri(textureTarget, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); glTexParameteri(textureTarget, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE); glTexImage2D(textureTarget, 0, floatFormat, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, 0); glBindTexture(textureTarget, 0); checkGLErrorsHere0(); initializeShaders(preSmoothingFilter); } // end PyramidWithDerivativesCreator::allocate() void PyramidWithDerivativesCreator::initializeShaders(int preSmoothingFilter) { ScopedTimer st(__PRETTY_FUNCTION__); if (shaders_initialized) return; shaders_initialized = true; if (_pass1HorizShader == 0) { _pass1HorizShader = new GLSL_FragmentProgram("PyramidWithDerivativesCreator::buildPyramidForGrayscaleImage_impl::pass1HorizShader"); _pass1HorizShader->setProgram(GLSL_Shaders::pyramid_with_derivative_pass1v.c_str()); _pass1HorizShader->compile(); checkGLErrorsHere0(); } // end if (_pass1HorizShader == 0) if (_pass1VertShader == 0) { _pass1VertShader = new GLSL_FragmentProgram("PyramidWithDerivativesCreator::buildPyramidForGrayscaleImage_impl::pass1VertShader"); _pass1VertShader->setProgram(GLSL_Shaders::pyramid_with_derivative_pass1h.c_str()); _pass1VertShader->compile(); checkGLErrorsHere0(); } // end if (_pass1VertShader == 0) if (_pass2Shader == 0) { _pass2Shader = new GLSL_FragmentProgram("PyramidWithDerivativesCreator::buildPyramidForGrayscaleImage_impl::pass2Shader"); _pass2Shader->setProgram(GLSL_Shaders::pyramid_with_derivative_pass2.c_str()); _pass2Shader->compile(); checkGLErrorsHere0(); } // end if (_pass2Shader == 0) } void 
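// (Editorial note: the per-level glTexImage2D calls in allocate() above size each
//  mipmap level as (w >> n) x (h >> n).  For example, allocate(1920, 1080, 4)
//  reserves levels of 1920x1080, 960x540, 480x270 and 240x135 texels.)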
PyramidWithDerivativesCreator::deallocate() { glDeleteFramebuffersEXT(1, &_pyrFbIDs); glDeleteFramebuffersEXT(1, &_tmpFbIDs); glDeleteFramebuffersEXT(1, &_tmp2FbID); glDeleteTextures(1, &_pyrTexID); glDeleteTextures(1, &_tmpTexID); glDeleteTextures(1, &_tmpTex2ID); _srcTex.deallocateID(); } void PyramidWithDerivativesCreator::buildPyramidForGrayscaleImage(uchar const * image) { _srcTex.overwriteWith(image, 1); this->buildPyramidForGrayscaleTexture(_srcTex.textureID()); } void PyramidWithDerivativesCreator::buildPyramidForGrayscaleImage(float const * image) { _srcTex.overwriteWith(image, 1); this->buildPyramidForGrayscaleTexture(_srcTex.textureID()); } void PyramidWithDerivativesCreator::buildPyramidForGrayscaleTexture(unsigned int srcTexID) { setupNormalizedProjection(); glViewport(0, 0, _width, _height); GLenum const textureTarget = GL_TEXTURE_2D; glEnable(GL_TEXTURE_2D); glActiveTexture(GL_TEXTURE0); glBindTexture(textureTarget, srcTexID); glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, _tmp2FbID); glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, textureTarget, _tmpTex2ID, 0); _pass1HorizShader->parameter("presmoothing", _preSmoothingFilter); _pass1HorizShader->enable(); _pass1HorizShader->bindTexture("src_tex", 0); renderQuad8Tap(0.0f, 1.0f/_height); _pass1HorizShader->disable(); glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, _pyrFbIDs); glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,GL_TEXTURE_2D,_pyrTexID,0); glBindTexture(textureTarget, _tmpTex2ID); _pass1VertShader->parameter("presmoothing", _preSmoothingFilter); _pass1VertShader->enable(); _pass1VertShader->bindTexture("src_tex", 0); renderQuad8Tap(1.0f/_width, 0.0f); _pass1VertShader->disable(); _pass2Shader->enable(); for (int level = 1; level < _nLevels; ++level) { // Source texture dimensions. 
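// (Editorial note on the loop below: each iteration builds pyramid level 'level'
//  from level-1 with a separable 4-tap filter.  The first renderQuad4Tap(0, 1/H)
//  pass filters and decimates vertically into _tmpTexID (W x H/2); the second
//  renderQuad4Tap(1/W, 0) pass filters and decimates horizontally into the
//  pyramid texture at W/2 x H/2.)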
int const W = (_width >> (level-1)); int const H = (_height >> (level-1)); glBindTexture(textureTarget, _pyrTexID); glTexParameteri(textureTarget, GL_TEXTURE_BASE_LEVEL, level-1); glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, _tmpFbIDs); glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,GL_TEXTURE_2D,_tmpTexID,level-1); glViewport(0, 0, W, H/2); renderQuad4Tap(0.0f, 1.0f/H); //glTexParameteri(textureTarget, GL_TEXTURE_BASE_LEVEL, 0); glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, _pyrFbIDs); glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,GL_TEXTURE_2D,_pyrTexID,level); glViewport(0, 0, W/2, H/2); glBindTexture(textureTarget, _tmpTexID); glTexParameteri(textureTarget, GL_TEXTURE_BASE_LEVEL, level-1); renderQuad4Tap(1.0f/W, 0.0f); //glTexParameteri(textureTarget, GL_TEXTURE_BASE_LEVEL, 0); } // end for (level) _pass2Shader->disable(); glDisable(GL_TEXTURE_2D); glBindTexture(textureTarget, _pyrTexID); glTexParameteri(textureTarget, GL_TEXTURE_BASE_LEVEL, 0); glBindTexture(textureTarget, _tmpTexID); glTexParameteri(textureTarget, GL_TEXTURE_BASE_LEVEL, 0); } // end PyramidWithDerivativesCreator::buildPyramidForGrayscaleTexturel() void PyramidWithDerivativesCreator::activateTarget(int level) { int const W = (_width >> level); int const H = (_height >> level); glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, _pyrFbIDs); glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,GL_TEXTURE_2D, _pyrTexID, level); glViewport(0, 0, W, H); } } // end namespace V3D_GPU slowmovideo-0.5+git20180116/src/V3D/GL/v3d_gpucolorflow.cpp0000664000000000000000000003003613151342440021515 0ustar rootroot#include "config.h" #include "Base/v3d_utilities.h" #include "v3d_gpucolorflow.h" #include "glsl_shaders.h" #include #include #include using namespace std; using namespace V3D_GPU; namespace { const std::string SOURCE( "#version 330\n" "\n" "uniform sampler2D src_tex;\n" "\n" "in vec4 gl_TexCoord[4];\n" "\n" "out vec4 my_FragColor;" "\n" "void main(void)\n" "{\n" " my_FragColor = gl_TexCoord[3] * texture2D(src_tex, gl_TexCoord[0].st);\n" "}\n"); void upsampleDisparities(unsigned uvSrcTex, unsigned pSrcTex, float pScale, RTT_Buffer& ubuffer, RTT_Buffer& pbuffer) { static GLSL_FragmentProgram * upsampleShader = 0; if (upsampleShader == 0) { upsampleShader = new GLSL_FragmentProgram("v3d_gpuflow::upsampleDisparities::upsampleShader"); char const * source = SOURCE.c_str(); upsampleShader->setProgram(source); upsampleShader->compile(); checkGLErrorsHere0(); } // end if setupNormalizedProjection(); ubuffer.activate(); glActiveTexture(GL_TEXTURE0); glBindTexture(GL_TEXTURE_2D, uvSrcTex); glEnable(GL_TEXTURE_2D); upsampleShader->enable(); // Provide uniform paramter via texcoord to avoid recompilation of shaders glMultiTexCoord4f(GL_TEXTURE3_ARB, 2, 2, 1, 1); //glMultiTexCoord4f(GL_TEXTURE3_ARB, 0, 0, 0, 0); renderNormalizedQuad(); //upsampleShader->disable(); glDisable(GL_TEXTURE_2D); checkGLErrorsHere0(); pbuffer.activate(); glActiveTexture(GL_TEXTURE0); glBindTexture(GL_TEXTURE_2D, pSrcTex); glEnable(GL_TEXTURE_2D); //upsampleShader->enable(); // Provide uniform paramter via texcoord to avoid recompilation of shaders glMultiTexCoord4f(GL_TEXTURE3_ARB, pScale, pScale, pScale, pScale); renderNormalizedQuad(); upsampleShader->disable(); glDisable(GL_TEXTURE_2D); checkGLErrorsHere0(); } // upsampleDisparities() void upsampleDisparities(unsigned uvSrcTex, unsigned pSrcTex, unsigned qSrcTex, float pScale, RTT_Buffer& ubuffer, RTT_Buffer& pbuffer, RTT_Buffer& qbuffer) { static 
GLSL_FragmentProgram * upsampleShader = 0; if (upsampleShader == 0) { upsampleShader = new GLSL_FragmentProgram("v3d_gpuflow::upsampleDisparities::upsampleShader"); char const * source = SOURCE.c_str(); upsampleShader->setProgram(source); upsampleShader->compile(); checkGLErrorsHere0(); } // end if setupNormalizedProjection(); ubuffer.activate(); glActiveTexture(GL_TEXTURE0); glBindTexture(GL_TEXTURE_2D, uvSrcTex); glEnable(GL_TEXTURE_2D); upsampleShader->enable(); // Provide uniform paramter via texcoord to avoid recompilation of shaders glMultiTexCoord4f(GL_TEXTURE3_ARB, 2, 2, 1, 1); //glMultiTexCoord4f(GL_TEXTURE3_ARB, 0, 0, 0, 0); renderNormalizedQuad(); //upsampleShader->disable(); glDisable(GL_TEXTURE_2D); checkGLErrorsHere0(); pbuffer.activate(); glActiveTexture(GL_TEXTURE0); glBindTexture(GL_TEXTURE_2D, pSrcTex); glEnable(GL_TEXTURE_2D); //upsampleShader->enable(); // Provide uniform paramter via texcoord to avoid recompilation of shaders glMultiTexCoord4f(GL_TEXTURE3_ARB, pScale, pScale, pScale, pScale); renderNormalizedQuad(); upsampleShader->disable(); glDisable(GL_TEXTURE_2D); checkGLErrorsHere0(); qbuffer.activate(); glActiveTexture(GL_TEXTURE0); glBindTexture(GL_TEXTURE_2D, qSrcTex); glEnable(GL_TEXTURE_2D); //upsampleShader->enable(); // Provide uniform paramter via texcoord to avoid recompilation of shaders glMultiTexCoord4f(GL_TEXTURE3_ARB, pScale, pScale, pScale, pScale); renderNormalizedQuad(); upsampleShader->disable(); glDisable(GL_TEXTURE_2D); checkGLErrorsHere0(); } // upsampleDisparities() } // end namespace //---------------------------------------------------------------------- namespace V3D_GPU { void TVL1_ColorFlowEstimatorBase::allocate(int W, int H) { _width = W; _height = H; char const * texSpec = _warpedBufferHighPrecision ? "rgba=32f tex2D" : "rgba=16f tex2D"; for (int ch = 0; ch < 3; ++ch) { _warpedBufferPyramids[ch].resize(_nLevels); for (int level = 0; level < _nLevels; ++level) { int const w = _width / (1 << level); int const h = _height / (1 << level); _warpedBufferPyramids[ch][level] = new RTT_Buffer(texSpec, "_warpedBufferPyramid[]"); _warpedBufferPyramids[ch][level]->allocate(w, h); } } // end for (ch) } // end TVL1_ColorFlowEstimatorBase::allocate() void TVL1_ColorFlowEstimatorBase::deallocate() { for (int ch = 0; ch < 3; ++ch) for (int level = 0; level < _nLevels; ++level) _warpedBufferPyramids[ch][level]->deallocate(); } //---------------------------------------------------------------------- void TVL1_ColorFlowEstimator_QR::allocate(int W, int H) { TVL1_ColorFlowEstimatorBase::allocate(W, H); _shader_uv = new GLSL_FragmentProgram("tvl1_color_flow_new_update_uv"); _shader_p = new GLSL_FragmentProgram("tvl1_flow_relaxed_update_p"); _shader_uv->setProgram(GLSL_Shaders::tvl1_color_flow_QR_update_uv.c_str()); _shader_p->setProgram(GLSL_Shaders::tvl1_flow_new_update_p.c_str()); _shader_uv->compile(); _shader_p->compile(); _uBuffer1Pyramid.resize(_nLevels); _uBuffer2Pyramid.resize(_nLevels); _pBuffer1Pyramid.resize(_nLevels); _pBuffer2Pyramid.resize(_nLevels); char const * uvTexSpec = _uvBufferHighPrecision ? "rg=32f tex2D enableTextureRG" : "rg=16f tex2D enableTextureRG"; char const * pTexSpec = _pBufferHighPrecision ? 
"rgba=32f tex2D" : "rgba=16f tex2D"; for (int level = 0; level < _nLevels; ++level) { int const w = _width / (1 << level); int const h = _height / (1 << level); _uBuffer1Pyramid[level] = new RTT_Buffer(uvTexSpec, "ubuffer1"); _uBuffer1Pyramid[level]->allocate(w, h); _uBuffer2Pyramid[level] = new RTT_Buffer(uvTexSpec, "ubuffer2"); _uBuffer2Pyramid[level]->allocate(w, h); _pBuffer1Pyramid[level] = new RTT_Buffer(pTexSpec, "pbuffer1"); _pBuffer1Pyramid[level]->allocate(w, h); _pBuffer2Pyramid[level] = new RTT_Buffer(pTexSpec, "pbuffer2"); _pBuffer2Pyramid[level]->allocate(w, h); } // end for (level) } // end TVL1_ColorFlowEstimator_QR::allocate() void TVL1_ColorFlowEstimator_QR::deallocate() { TVL1_ColorFlowEstimatorBase::deallocate(); for (int level = 0; level < _nLevels; ++level) { _uBuffer1Pyramid[level]->deallocate(); _uBuffer2Pyramid[level]->deallocate(); _pBuffer1Pyramid[level]->deallocate(); _pBuffer2Pyramid[level]->deallocate(); } } // end TVL1_ColorFlowEstimator_QR::deallocate() void TVL1_ColorFlowEstimator_QR::run(unsigned int I0_TexIDs[3], unsigned int I1_TexIDs[3]) { for (int level = _nLevels-1; level >= _startLevel; --level) { RTT_Buffer * ubuffer1 = _uBuffer1Pyramid[level]; RTT_Buffer * ubuffer2 = _uBuffer2Pyramid[level]; RTT_Buffer * pbuffer1 = _pBuffer1Pyramid[level]; RTT_Buffer * pbuffer2 = _pBuffer2Pyramid[level]; float const lambda = _lambda; if (level == _nLevels-1) { glClearColor(0, 0, 0, 0); ubuffer2->activate(); glClear(GL_COLOR_BUFFER_BIT); pbuffer2->activate(); glClear(GL_COLOR_BUFFER_BIT); } else { upsampleDisparities(_uBuffer2Pyramid[level+1]->textureID(), _pBuffer2Pyramid[level+1]->textureID(), 1.0f, *ubuffer2, *pbuffer2); } int const w = _width / (1 << level); int const h = _height / (1 << level); RTT_Buffer& warpedBuffer_R = *_warpedBufferPyramids[0][level]; RTT_Buffer& warpedBuffer_G = *_warpedBufferPyramids[1][level]; RTT_Buffer& warpedBuffer_B = *_warpedBufferPyramids[2][level]; float const ds = 1.0f / w; float const dt = 1.0f / h; _shader_uv->parameter("lambda_theta", 3.0f * _cfg._theta * lambda); _shader_uv->parameter("theta", _cfg._theta); //_shader_p->parameter("timestep_over_theta", _cfg._tau / _cfg._theta); _shader_p->parameter("timestep", -_cfg._tau / _cfg._theta); _shader_p->parameter("rcpLambda_p", 1.0f); for (int iter = 0; iter < _nOuterIterations; ++iter) { warpImageWithFlowField(ubuffer2->textureID(), I0_TexIDs[0], I1_TexIDs[0], level, warpedBuffer_R); warpImageWithFlowField(ubuffer2->textureID(), I0_TexIDs[1], I1_TexIDs[1], level, warpedBuffer_G); warpImageWithFlowField(ubuffer2->textureID(), I0_TexIDs[2], I1_TexIDs[2], level, warpedBuffer_B); checkGLErrorsHere0(); setupNormalizedProjection(); for (int k = 0; k < _nInnerIterations /* * sqrtf(resizeFactor) */; ++k) { pbuffer1->activate(); ubuffer2->enableTexture(GL_TEXTURE0_ARB); pbuffer2->enableTexture(GL_TEXTURE1_ARB); _shader_p->enable(); _shader_p->bindTexture("uv_src", 0); _shader_p->bindTexture("p_uv_src", 1); renderNormalizedQuad(GPU_SAMPLE_REVERSE_NEIGHBORS, ds, dt); _shader_p->disable(); //ubuffer2->disableTexture(GL_TEXTURE0_ARB); pbuffer2->disableTexture(GL_TEXTURE1_ARB); std::swap(pbuffer1, pbuffer2); ubuffer1->activate(); //ubuffer2->enableTexture(GL_TEXTURE0_ARB); pbuffer2->enableTexture(GL_TEXTURE1_ARB); warpedBuffer_R.enableTexture(GL_TEXTURE2_ARB); warpedBuffer_G.enableTexture(GL_TEXTURE3_ARB); warpedBuffer_B.enableTexture(GL_TEXTURE4_ARB); _shader_uv->enable(); _shader_uv->bindTexture("uv_src", 0); _shader_uv->bindTexture("p_uv_src", 1); _shader_uv->bindTexture("warped_R_tex", 2); 
_shader_uv->bindTexture("warped_G_tex", 3); _shader_uv->bindTexture("warped_B_tex", 4); renderNormalizedQuad(GPU_SAMPLE_REVERSE_NEIGHBORS, ds, dt); _shader_uv->disable(); ubuffer2->disableTexture(GL_TEXTURE0_ARB); pbuffer2->disableTexture(GL_TEXTURE1_ARB); warpedBuffer_R.disableTexture(GL_TEXTURE2_ARB); warpedBuffer_G.disableTexture(GL_TEXTURE3_ARB); warpedBuffer_B.disableTexture(GL_TEXTURE4_ARB); std::swap(ubuffer1, ubuffer2); } // end for (k) } // end for (iter) } // end for (level) } // end TVL1_ColorFlowEstimator_QR::run() } // end namespace V3D_GP slowmovideo-0.5+git20180116/src/V3D/GL/v3d_gpubase.cpp0000664000000000000000000011655013151342440020427 0ustar rootroot #include "config.h" #if defined(V3DLIB_ENABLE_GPGPU) #include "v3d_gpubase.h" #include "Base/v3d_exception.h" #include #include #include #include #include #include #ifdef WIN32 #include #endif namespace { unsigned int const extPixelFormat[] = { 0, GL_LUMINANCE, GL_LUMINANCE_ALPHA, GL_RGB, GL_RGBA }; inline int intFromString(std::string const& value, int defaultVal) { if (value.empty()) return defaultVal; return atoi(value.c_str()); } // end intFromString() template inline void interleavePixels(int const w, int const h, T const * red, T const * green, T const * blue, T * pixels) { for (int p = 0; p < w*h; ++p) { pixels[3*p+0] = red[p]; pixels[3*p+1] = green[p]; pixels[3*p+2] = blue[p]; } } // end interleavePixels() template inline void interleavePixels(int const w, int const h, T const * red, T const * green, T const * blue, T const * alpha, T * pixels) { for (int p = 0; p < w*h; ++p) { pixels[4*p+0] = red[p]; pixels[4*p+1] = green[p]; pixels[4*p+2] = blue[p]; pixels[4*p+3] = alpha[p]; } } // end interleavePixels() } // end namespace <> namespace V3D_GPU { using namespace std; TextureSpecification::TextureSpecification(char const * specString) : nChannels(3), nBitsPerChannel(8), isFloatTexture(false), isDepthTexture(false), isRTT(true), enableTextureRG(false) { istringstream is(specString); string token, key, value; while (!is.eof()) { is >> token; string::size_type pos = 0; if ((pos = token.find("=")) != token.npos) { key = token.substr(0, pos); value = token.substr(pos+1, token.length()-pos+1); } else { key = token; value = ""; } if (key == "r") { this->nChannels = 1; if (value.find("f") != value.npos) this->isFloatTexture = true; this->nBitsPerChannel = intFromString(value, this->isFloatTexture ? 32 : 8); } else if (key == "rg") { this->nChannels = 2; if (value.find("f") != value.npos) this->isFloatTexture = true; this->nBitsPerChannel = intFromString(value, this->isFloatTexture ? 32 : 8); } else if (key == "rgb") { this->nChannels = 3; if (value.find("f") != value.npos) this->isFloatTexture = true; this->nBitsPerChannel = intFromString(value, this->isFloatTexture ? 32 : 8); } else if (key == "rgba") { this->nChannels = 4; if (value.find("f") != value.npos) this->isFloatTexture = true; this->nBitsPerChannel = intFromString(value, this->isFloatTexture ? 32 : 8); } else if (key == "depth") { this->isDepthTexture = true; this->nBitsPerChannel = intFromString(value, 24); } else if (key == "RTT") { this->isRTT = true; } else if (key == "noRTT") { this->isRTT = false; } else if (key == "enableTextureRG") { #if defined(GL_ARB_texture_rg) this->enableTextureRG = true; #endif } else if (key == "tex2D") { // Ignore that keyword } else { cerr << "TextureSpecification::TextureSpecification(): " << "Warning Unknown keyword: '" << key << "'; ignored." 
<< endl; } } // end while if ((this->nChannels < 3) && this->isRTT && !this->enableTextureRG) { cerr << "TextureSpecification::TextureSpecification(): " << "Warning: luminance or luminance/alpha texture will not work as render target." << endl; } } // end TextureSpecification::TextureSpecification() unsigned int TextureSpecification::getGLInternalFormat() const { if (!this->isDepthTexture) { if (!this->enableTextureRG) { if (!this->isFloatTexture) { if (this->nBitsPerChannel == 8) { unsigned int const formats[5] = { 0, GL_LUMINANCE8, GL_LUMINANCE8_ALPHA8, GL_RGB8, GL_RGBA8 }; return formats[this->nChannels]; } else if (this->nBitsPerChannel == 16) { unsigned int const formats[5] = { 0, GL_LUMINANCE16, GL_LUMINANCE16_ALPHA16, GL_RGB16, GL_RGBA16 }; return formats[this->nChannels]; } else { raiseGLErrorHere1("Unsupported number of bits for int texture (8 or 16 bits)."); } } else { if (this->nBitsPerChannel == 32) { unsigned int const formats[5] = { 0, GL_LUMINANCE32F_ARB, GL_LUMINANCE_ALPHA32F_ARB, GL_RGB32F_ARB, GL_RGBA32F_ARB }; return formats[this->nChannels]; } else if (this->nBitsPerChannel == 16) { unsigned int const formats[5] = { 0, GL_LUMINANCE16F_ARB, GL_LUMINANCE_ALPHA16F_ARB, GL_RGB16F_ARB, GL_RGBA16F_ARB }; return formats[this->nChannels]; } else { raiseGLErrorHere1("Unsupported number of bits for float texture (16 or 32 bits)."); } } // end if (!useFloat) } else { #if defined(GL_ARB_texture_rg) // Note: this requires the ARB_texture_rg extension if (!this->isFloatTexture) { if (this->nBitsPerChannel == 8) { unsigned int const formats[5] = { 0, GL_R8, GL_RG8, GL_RGB8, GL_RGBA8 }; return formats[this->nChannels]; } else if (this->nBitsPerChannel == 16) { unsigned int const formats[5] = { 0, GL_R16, GL_RG16, GL_RGB16, GL_RGBA16 }; return formats[this->nChannels]; } else { raiseGLErrorHere1("Unsupported number of bits for int texture (8 or 16 bits)."); } } else { if (this->nBitsPerChannel == 32) { unsigned int const formats[5] = { 0, GL_R32F, GL_RG32F, GL_RGB32F_ARB, GL_RGBA32F_ARB }; return formats[this->nChannels]; } else if (this->nBitsPerChannel == 16) { unsigned int const formats[5] = { 0, GL_R16F, GL_RG16F, GL_RGB16F_ARB, GL_RGBA16F_ARB }; return formats[this->nChannels]; } else { raiseGLErrorHere1("Unsupported number of bits for float texture (16 or 32 bits)."); } } // end if (!useFloat) #else raiseGLErrorHere1("Attempting to use ARB_texture_rg, but support is not compiled in."); #endif } // end if ((!this->enableTextureRG) } else { switch (nBitsPerChannel) { case 16: return GL_DEPTH_COMPONENT16_ARB; case 24: return GL_DEPTH_COMPONENT24_ARB; case 32: return GL_DEPTH_COMPONENT32_ARB; default: raiseGLErrorHere1("Unsupported number of bits for depth texture (16, 24 or 32 bits)."); } } // end if (!isDepthTexture) return 0; } // end TextureSpecification::getGLInternalFormat() //---------------------------------------------------------------------- bool ImageTexture2D::allocateID() { _textureTarget = GL_TEXTURE_2D; glGenTextures(1, &_textureID); glBindTexture(_textureTarget, _textureID); // Default is clamp to edge and nearest filtering. 
glTexParameteri(_textureTarget, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); glTexParameteri(_textureTarget, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE); glTexParameteri(_textureTarget, GL_TEXTURE_MIN_FILTER, GL_NEAREST); glTexParameteri(_textureTarget, GL_TEXTURE_MAG_FILTER, GL_NEAREST); checkGLErrorsHere1(_texName); return true; } void ImageTexture2D::deallocateID() { if (_textureID == 0) return; this->clear(); glDeleteTextures(1, &_textureID); _textureID = 0; checkGLErrorsHere1(_texName); } void ImageTexture2D::reserve(int width, int height, TextureSpecification const& texSpec) { _width = width; _height = height; unsigned internalFormat = texSpec.getGLInternalFormat(); unsigned format = texSpec.isDepthTexture ? GL_DEPTH_COMPONENT : GL_RGBA; unsigned type = texSpec.isFloatTexture ? GL_FLOAT : GL_UNSIGNED_BYTE; glBindTexture(_textureTarget, _textureID); glTexImage2D(_textureTarget, 0, internalFormat, _width, _height, 0, format, type, 0); checkGLErrorsHere1(_texName); } void ImageTexture2D::overwriteWith(uchar const * pixels, int nChannels) { unsigned const format = extPixelFormat[nChannels]; unsigned const type = GL_UNSIGNED_BYTE; glBindTexture(_textureTarget, _textureID); glTexSubImage2D(_textureTarget, 0, 0, 0, _width, _height, format, type, pixels); checkGLErrorsHere1(_texName); } void ImageTexture2D::overwriteWith(uchar const * redPixels, uchar const * greenPixels, uchar const * bluePixels) { unsigned const format = extPixelFormat[3]; unsigned const type = GL_UNSIGNED_BYTE; unsigned char * pixels = new unsigned char[3 * _width * _height]; interleavePixels(_width, _height, redPixels, greenPixels, bluePixels, pixels); glBindTexture(_textureTarget, _textureID); glTexSubImage2D(_textureTarget, 0, 0, 0, _width, _height, format, type, pixels); delete [] pixels; checkGLErrorsHere1(_texName); } void ImageTexture2D::overwriteWith(uchar const * redPixels, uchar const * greenPixels, uchar const * bluePixels, uchar const * alphaPixels) { unsigned const format = extPixelFormat[4]; unsigned const type = GL_UNSIGNED_BYTE; unsigned char * pixels = new unsigned char[4 * _width * _height]; interleavePixels(_width, _height, redPixels, greenPixels, bluePixels, alphaPixels, pixels); glBindTexture(_textureTarget, _textureID); glTexSubImage2D(_textureTarget, 0, 0, 0, _width, _height, format, type, pixels); delete [] pixels; checkGLErrorsHere1(_texName); } void ImageTexture2D::overwriteWith(float const * pixels, int nChannels) { unsigned const format = extPixelFormat[nChannels]; unsigned const type = GL_FLOAT; glBindTexture(_textureTarget, _textureID); glTexSubImage2D(_textureTarget, 0, 0, 0, _width, _height, format, type, pixels); checkGLErrorsHere1(_texName); } void ImageTexture2D::overwriteWith(float const * redPixels, float const * greenPixels, float const * bluePixels) { unsigned const format = extPixelFormat[3]; unsigned const type = GL_FLOAT; float * pixels = new float[3 * _width * _height]; interleavePixels(_width, _height, redPixels, greenPixels, bluePixels, pixels); glBindTexture(_textureTarget, _textureID); glTexSubImage2D(_textureTarget, 0, 0, 0, _width, _height, format, type, pixels); delete [] pixels; checkGLErrorsHere1(_texName); } void ImageTexture2D::overwriteWith(float const * redPixels, float const * greenPixels, float const * bluePixels, float const * alphaPixels) { unsigned const format = extPixelFormat[4]; unsigned const type = GL_FLOAT; float * pixels = new float[4 * _width * _height]; interleavePixels(_width, _height, redPixels, greenPixels, bluePixels, alphaPixels, pixels); 
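//
// (Editorial usage sketch, not part of the original V3D sources.)  Typical
// render-to-texture setup with the ImageTexture2D and FrameBufferObject classes
// implemented in this file; the constructor arguments (debug names) and the
// "rgba=16f tex2D" spec string are assumptions for illustration:
//
//   ImageTexture2D tex("colorTarget");
//   tex.allocateID();
//   tex.reserve(width, height, TextureSpecification("rgba=16f tex2D"));
//   FrameBufferObject fbo("fbo");
//   fbo.allocate();
//   fbo.makeCurrent();
//   fbo.attachTexture2D(tex, GL_COLOR_ATTACHMENT0_EXT, 0);
//   fbo.activate();                    // checks validity and sets the viewport
//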
glBindTexture(_textureTarget, _textureID); glTexSubImage2D(_textureTarget, 0, 0, 0, _width, _height, format, type, pixels); delete [] pixels; checkGLErrorsHere1(_texName); } void ImageTexture2D::clear() { glBindTexture(_textureTarget, _textureID); glTexImage2D(_textureTarget, 0, GL_RGBA, 0, 0, 0, GL_RGBA, GL_UNSIGNED_BYTE, 0); checkGLErrorsHere1(_texName); } void ImageTexture2D::bind() { glBindTexture(_textureTarget, _textureID); } void ImageTexture2D::bind(unsigned texUnit) { glActiveTexture(texUnit); this->bind(); } void ImageTexture2D::enable() { glBindTexture(_textureTarget, _textureID); glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE); glEnable(_textureTarget); } void ImageTexture2D::enable(unsigned texUnit) { glActiveTexture(texUnit); this->enable(); } void ImageTexture2D::disable() { glDisable(_textureTarget); } void ImageTexture2D::disable(unsigned texUnit) { glActiveTexture(texUnit); this->disable(); } //---------------------------------------------------------------------- bool FrameBufferObject::allocate() { // if (glGenFramebuffers != 0) { glGenFramebuffers(1, &_fboID); // fprintf(stderr,"new alloc\n"); //} //else { // glGenFramebuffersEXT(1, &_fboID); // fprintf(stderr,"old alloc\n"); //} checkGLErrorsHere1(_fboName); return true; } void FrameBufferObject::deallocate() { std::fill(_attachedColorTextures, _attachedColorTextures+16, (ImageTexture2D *)0); glDeleteFramebuffersEXT(1, &_fboID); checkGLErrorsHere1(_fboName); } void FrameBufferObject::makeCurrent() { glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, _fboID); checkGLErrorsHere1(_fboName); } void FrameBufferObject::activate(bool setViewport) { this->makeCurrent(); if (this->checkValidity()) { if (setViewport) glViewport(0, 0, this->width(), this->height()); } } bool FrameBufferObject::isCurrent() { GLint curFboID; glGetIntegerv(GL_FRAMEBUFFER_BINDING_EXT, &curFboID); return ((int)_fboID == curFboID); } void FrameBufferObject::attachTexture2D(ImageTexture2D& texture, GLenum attachment, int mipLevel) { if (this->checkBinding("FrameBufferObject::attachTexture2D()")) { if (attachment >= GL_COLOR_ATTACHMENT0_EXT && attachment <= GL_COLOR_ATTACHMENT15_EXT) { glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, attachment, texture.textureTarget(), texture.textureID(), mipLevel); _attachedColorTextures[attachment - GL_COLOR_ATTACHMENT0_EXT] = &texture; } else if (attachment == GL_DEPTH_ATTACHMENT_EXT) { glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, attachment, texture.textureTarget(), texture.textureID(), mipLevel); _attachedDepthTexture = &texture; } else raiseGLErrorHere2("Unknown/unsupported attachment specifier", _fboName.c_str()); } checkGLErrorsHere1(_fboName); } void FrameBufferObject::attachTextures2D(int numTextures, ImageTexture2D * textures, GLenum * attachment, int * mipLevel) { for (int i = 0; i < numTextures; ++i) this->attachTexture2D(textures[i], (attachment != 0) ? attachment[i] : (GL_COLOR_ATTACHMENT0_EXT + i), (mipLevel != 0) ? 
mipLevel[i] : 0); } void FrameBufferObject::detach(GLenum attachment) { if (this->checkBinding("FrameBufferObject::detach()")) { if (attachment >= GL_COLOR_ATTACHMENT0_EXT && attachment <= GL_COLOR_ATTACHMENT15_EXT) { glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, attachment, GL_TEXTURE_2D, 0, 0); _attachedColorTextures[attachment - GL_COLOR_ATTACHMENT0_EXT] = 0; } else if (attachment == GL_DEPTH_ATTACHMENT_EXT) { glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, attachment, GL_TEXTURE_2D, 0, 0); _attachedDepthTexture = 0; } else raiseGLErrorHere2("Unknown/unsupported attachment specifier", _fboName.c_str()); } checkGLErrorsHere1(_fboName); } void FrameBufferObject::detachAll() { int const numAttachments = this->getMaxColorAttachments(); for (int i = 0; i < numAttachments; ++i) this->detach(GL_COLOR_ATTACHMENT0_EXT + i); } # ifndef NDEBUG_GL bool FrameBufferObject::checkValidity(std::ostream& os) { if (!this->checkBinding("FrameBufferObject::checkValidity()")) return false; GLenum const status = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT); switch(status) { case GL_FRAMEBUFFER_COMPLETE_EXT: // Everything's OK return true; case GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT_EXT: raiseGLErrorHere2("Frame buffer is incomplete (GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT_EXT) ", _fboName.c_str()); return false; case GL_FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT_EXT: raiseGLErrorHere2("Frame buffer is incomplete (GL_FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT_EXT) ", _fboName.c_str()); return false; case GL_FRAMEBUFFER_INCOMPLETE_DIMENSIONS_EXT: raiseGLErrorHere2("Frame buffer is incomplete (GL_FRAMEBUFFER_INCOMPLETE_DIMENSIONS_EXT) ", _fboName.c_str()); return false; case GL_FRAMEBUFFER_INCOMPLETE_FORMATS_EXT: raiseGLErrorHere2("Frame buffer is incomplete (GL_FRAMEBUFFER_INCOMPLETE_FORMATS_EXT) ", _fboName.c_str()); return false; case GL_FRAMEBUFFER_INCOMPLETE_DRAW_BUFFER_EXT: raiseGLErrorHere2("Frame buffer is incomplete (GL_FRAMEBUFFER_INCOMPLETE_DRAW_BUFFER_EXT) ", _fboName.c_str()); return false; case GL_FRAMEBUFFER_INCOMPLETE_READ_BUFFER_EXT: raiseGLErrorHere2("Frame buffer is incomplete (GL_FRAMEBUFFER_INCOMPLETE_READ_BUFFER_EXT) ", _fboName.c_str()); return false; case GL_FRAMEBUFFER_UNSUPPORTED_EXT: raiseGLErrorHere2("Frame buffer is incomplete (GL_FRAMEBUFFER_UNSUPPORTED_EXT) ", _fboName.c_str()); return false; default: raiseGLErrorHere2("Frame buffer is incomplete (unknown error code) ", _fboName.c_str()); return false; } return false; } # endif int FrameBufferObject::getMaxColorAttachments() { GLint res = 0; glGetIntegerv(GL_MAX_COLOR_ATTACHMENTS_EXT, &res); return res; } void FrameBufferObject::disableFBORendering() { glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0); } bool FrameBufferObject::checkBinding(char const * what) { GLint curFboID; glGetIntegerv(GL_FRAMEBUFFER_BINDING_EXT, &curFboID); ostringstream oss; oss << "FBO operation (" << what << ") on unbound frame buffer attempted"; if (curFboID != (int)_fboID) { raiseGLErrorHere2(oss.str().c_str(), this->_fboName); return false; } return true; } //---------------------------------------------------------------------- # ifndef NDEBUG_GL void checkGLErrors(char const * location, ostream& os) { GLuint errnum; char const * errstr; bool hasError = false; std::cout << "error checking\n"; while ((errnum = glGetError())) { errstr = reinterpret_cast(gluErrorString(errnum)); if (errstr) os << errstr; else os << "Error " << errnum; os << " at " << location << endl; #ifdef WIN32 break; #endif } if(hasError) throwV3DErrorHere(""); } void checkGLErrors(char const * 
file, int line, ostream& os) { GLuint errnum; char const * errstr; bool hasError = false; std::cout << "error checking\n"; while ((errnum = glGetError())) { hasError = true; errstr = reinterpret_cast(gluErrorString(errnum)); if (errstr) os << errstr; else os << "Error " << errnum; os << " at " << file << ":" << line << endl; #ifdef WIN32 break; #endif } if(hasError) throwV3DErrorHere(""); } void checkGLErrors(char const * file, int line, string const& name, ostream& os) { GLuint errnum; char const * errstr; bool hasError = false; std::cout << "error checking\n"; while ((errnum = glGetError())) { errstr = reinterpret_cast(gluErrorString(errnum)); if (errstr) os << errstr; else os << "Error " << errnum; os << " with " << name << " at " << file << ":" << line << endl; #ifdef WIN32 break; #endif } if(hasError) throwV3DErrorHere(""); } bool checkFrameBufferStatus(char const * file, int line, std::string const& name, std::ostream& os) { GLenum const status = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT); char const * msg = 0; switch(status) { case GL_FRAMEBUFFER_COMPLETE_EXT: // Everything's OK return true; case GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT_EXT: msg = "Frame buffer is incomplete (GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT_EXT) "; break; case GL_FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT_EXT: msg = "Frame buffer is incomplete (GL_FRAMEBUFFER_INCOMPLETE_MISSING_ATTACHMENT_EXT) "; break; case GL_FRAMEBUFFER_INCOMPLETE_DIMENSIONS_EXT: msg = "Frame buffer is incomplete (GL_FRAMEBUFFER_INCOMPLETE_DIMENSIONS_EXT) "; break; case GL_FRAMEBUFFER_INCOMPLETE_FORMATS_EXT: msg = "Frame buffer is incomplete (GL_FRAMEBUFFER_INCOMPLETE_FORMATS_EXT) "; break; case GL_FRAMEBUFFER_INCOMPLETE_DRAW_BUFFER_EXT: msg = "Frame buffer is incomplete (GL_FRAMEBUFFER_INCOMPLETE_DRAW_BUFFER_EXT) "; break; case GL_FRAMEBUFFER_INCOMPLETE_READ_BUFFER_EXT: msg = "Frame buffer is incomplete (GL_FRAMEBUFFER_INCOMPLETE_READ_BUFFER_EXT) "; break; case GL_FRAMEBUFFER_UNSUPPORTED_EXT: msg = "Frame buffer is incomplete (GL_FRAMEBUFFER_UNSUPPORTED_EXT) "; break; default: msg = "Frame buffer is incomplete (unknown error code) "; break; } os << msg << " with " << name << " at " << file << ":" << line << endl; return false; } // end checkFrameBufferStatus() # endif void raiseGLError(char const * file, int line, char const * msg, std::ostream& os) { os << msg << " at " << file << ":" << line << endl; } void raiseGLError(char const * file, int line, char const * msg, std::string const& name, std::ostream& os) { os << msg << " with " << name << " at " << file << ":" << line << endl; } } // end namespace V3D_GPU //---------------------------------------------------------------------- // CG SHADER ROUTINES //---------------------------------------------------------------------- namespace V3D_GPU { void GLSL_FragmentProgram::setProgram(char const * source) { _source = source; // To enforce recompilation. if (_program) glDeleteObjectARB(_program); _program = 0; } void GLSL_FragmentProgram::compile(char const * * compilerArgs, char const *entry) { if (compilerArgs != 0) cerr << "GLSL_FragmentProgram::compile(): arguments to the compiler are not supported (and ignored)." << endl; if(entry != 0) cerr << "GLSL_FragmentProgram::compile(): named entry point is not supported (and ignored)." 
<< endl; checkGLErrorsHere1(_shaderName); if (_program == 0) { int len = _source.length() + 1; GLcharARB * str = new char[len]; strcpy(str, _source.c_str()); GLint success = GL_FALSE; GLint logLength; _program = glCreateProgramObjectARB(); unsigned fragmentProgram = glCreateShaderObjectARB(GL_FRAGMENT_SHADER_ARB); glShaderSourceARB(fragmentProgram, 1, (GLcharARB const * *)&str, &len); glCompileShaderARB(fragmentProgram); delete [] str; glGetObjectParameterivARB(fragmentProgram, GL_OBJECT_COMPILE_STATUS_ARB, &success); if (!success) { glGetObjectParameterivARB(fragmentProgram, GL_OBJECT_INFO_LOG_LENGTH_ARB, &logLength); GLcharARB * logStr = new GLcharARB[logLength]; glGetInfoLogARB(fragmentProgram, logLength, NULL, logStr); cout << logStr << endl; delete [] logStr; glDeleteObjectARB(_program); _program = 0; return; } // end if checkGLErrorsHere1(_shaderName); glAttachObjectARB(_program, fragmentProgram); glDeleteObjectARB(fragmentProgram); // To delete the shader if the program is deleted. glLinkProgramARB(_program); glGetObjectParameterivARB(_program, GL_OBJECT_LINK_STATUS_ARB, &success); if (!success) { glGetObjectParameterivARB(_program, GL_OBJECT_INFO_LOG_LENGTH_ARB, &logLength); GLcharARB * logStr = new GLcharARB[logLength]; glGetInfoLogARB(_program, logLength, NULL, logStr); cout << logStr << endl; delete [] logStr; glDeleteObjectARB(_program); _program = 0; return; } // end if checkGLErrorsHere1(_shaderName); glUseProgramObjectARB(_program); { // Build a mapping from texture parameters to texture units. _texUnitMap.clear(); int count, size; GLenum type; char paramName[1024]; int texUnit = 0; glGetObjectParameterivARB(_program, GL_OBJECT_ACTIVE_UNIFORMS_ARB, &count); for (int i = 0; i < count; ++i) { glGetActiveUniformARB(_program, i, 1024, NULL, &size, &type, paramName); switch (type) { case GL_SAMPLER_1D_ARB: case GL_SAMPLER_2D_ARB: case GL_SAMPLER_3D_ARB: case GL_SAMPLER_CUBE_ARB: case GL_SAMPLER_1D_SHADOW_ARB: case GL_SAMPLER_2D_SHADOW_ARB: case GL_SAMPLER_2D_RECT_ARB: case GL_SAMPLER_2D_RECT_SHADOW_ARB: { _texUnitMap.insert(make_pair(string(paramName), texUnit)); int location = glGetUniformLocationARB(_program, paramName); glUniform1iARB(location, texUnit); ++texUnit; break; } default: break; } // end switch() } // end for (i) } glUseProgramObjectARB(0); } // end if checkGLErrorsHere1(_shaderName); } // end GLSL_FragmentProgram::compile() void GLSL_FragmentProgram::compile(std::vector const& compilerArgs, char const *entry) { if (compilerArgs.size() != 0) cerr << "GLSL_FragmentProgram::compile(): arguments to the compiler are not supported (and ignored)." << endl; if(entry != 0) cerr << "GLSL_FragmentProgram::compile(): named entry point is not supported (and ignored)." << endl; this->compile(); } void GLSL_FragmentProgram::bindFragDataLocation(const std::string& var) { glBindFragDataLocation(_program, 0, var.c_str()); } void GLSL_FragmentProgram::enable() { // implicitly bind FragDataLocation. Maybe this can be done elsewhere once an apropriate interface is available. 
bindFragDataLocation("my_FragColor"); glUseProgramObjectARB(_program); _inUse = true; } void GLSL_FragmentProgram::disable() { _inUse = false; glUseProgramObjectARB(0); } void GLSL_FragmentProgram::bindTexture(const std::string& name, unsigned int unit) { glUniform1i(glGetUniformLocation(_program, name.c_str()), unit); } void GLSL_FragmentProgram::parameter(char const * param, int x) { glUseProgramObjectARB(_program); glUniform1iARB(glGetUniformLocationARB(_program, param), x); if (!_inUse) glUseProgramObjectARB(0); } void GLSL_FragmentProgram::parameter(char const * param, float x) { glUseProgramObjectARB(_program); glUniform1fARB(glGetUniformLocationARB(_program, param), x); if (!_inUse) glUseProgramObjectARB(0); } void GLSL_FragmentProgram::parameter(char const * param, float x, float y) { glUseProgramObjectARB(_program); glUniform2fARB(glGetUniformLocationARB(_program, param), x, y); if (!_inUse) glUseProgramObjectARB(0); } void GLSL_FragmentProgram::parameter(char const * param, float x, float y, float z) { glUseProgramObjectARB(_program); glUniform3fARB(glGetUniformLocationARB(_program, param), x, y, z); if (!_inUse) glUseProgramObjectARB(0); } void GLSL_FragmentProgram::parameter(char const * param, float x, float y, float z, float w) { glUseProgramObjectARB(_program); glUniform4fARB(glGetUniformLocationARB(_program, param), x, y, z, w); if (!_inUse) glUseProgramObjectARB(0); } void GLSL_FragmentProgram::parameter(char const * param, int len, float const * array) { glUseProgramObjectARB(_program); glUniform1fvARB(glGetUniformLocationARB(_program, param), len, array); if (!_inUse) glUseProgramObjectARB(0); } void GLSL_FragmentProgram::matrixParameterR(char const * param, int rows, int cols, double const * values) { float * fvalues = new float[rows*cols]; std::copy(values, values+rows*cols, fvalues); glUseProgramObjectARB(_program); if (rows == 2 && cols == 2) { glUniformMatrix2fvARB(glGetUniformLocationARB(_program, param), 1, GL_TRUE, fvalues); } else if (rows == 3 && cols == 3) { glUniformMatrix3fvARB(glGetUniformLocationARB(_program, param), 1, GL_TRUE, fvalues); } else if (rows == 4 && cols == 4) { glUniformMatrix4fvARB(glGetUniformLocationARB(_program, param), 1, GL_TRUE, fvalues); } else raiseGLErrorHere2("Matrix parameter should be 2x2, 3x3 or 4x4.", _shaderName.c_str()); delete [] fvalues; if (!_inUse) glUseProgramObjectARB(0); } void GLSL_FragmentProgram::matrixParameterC(char const * param, int rows, int cols, double const * values) { float * fvalues = new float[rows*cols]; std::copy(values, values+rows*cols, fvalues); glUseProgramObjectARB(_program); if (rows == 2 && cols == 2) { glUniformMatrix2fvARB(glGetUniformLocationARB(_program, param), 1, GL_FALSE, fvalues); } else if (rows == 3 && cols == 3) { glUniformMatrix3fvARB(glGetUniformLocationARB(_program, param), 1, GL_FALSE, fvalues); } else if (rows == 4 && cols == 4) { glUniformMatrix4fvARB(glGetUniformLocationARB(_program, param), 1, GL_FALSE, fvalues); } else raiseGLErrorHere2("Matrix parameter should be 2x2, 3x3 or 4x4.", _shaderName.c_str()); delete [] fvalues; if (!_inUse) glUseProgramObjectARB(0); } unsigned GLSL_FragmentProgram::getTexUnit(char const * param) { map::const_iterator p = _texUnitMap.find(string(param)); if (p != _texUnitMap.end()) return GL_TEXTURE0_ARB + (*p).second; else raiseGLErrorHere2("Parameter name denotes no texture sampler.", _shaderName.c_str()); return 0; } } // end namespace V3D_GPU //---------------------------------------------------------------------- namespace V3D_GPU { void 
setupNormalizedProjection(bool flipY) { glMatrixMode(GL_PROJECTION); glLoadIdentity(); if (!flipY) glOrtho(0, 1, 0, 1, -1, 1); else glOrtho(0, 1, 1, 0, -1, 1); glMatrixMode(GL_MODELVIEW); glLoadIdentity(); } void renderNormalizedQuad() { // It is usually recommended to draw a large (clipped) triangle // instread of a single quad, therefore avoiding the diagonal edge. glBegin(GL_TRIANGLES); glMultiTexCoord2f(GL_TEXTURE0_ARB, 0, 0); glMultiTexCoord4f(GL_TEXTURE1_ARB, 0, 0, 0, 0); glMultiTexCoord4f(GL_TEXTURE2_ARB, 0, 0, 0, 0); glVertex2f(0, 0); glMultiTexCoord2f(GL_TEXTURE0_ARB, 2, 0); glMultiTexCoord4f(GL_TEXTURE1_ARB, 2, 0, 2, 0); glMultiTexCoord4f(GL_TEXTURE2_ARB, 2, 0, 2, 0); glVertex2f(2, 0); glMultiTexCoord2f(GL_TEXTURE0_ARB, 0, 2); glMultiTexCoord4f(GL_TEXTURE1_ARB, 0, 2, 0, 2); glMultiTexCoord4f(GL_TEXTURE2_ARB, 0, 2, 0, 2); glVertex2f(0, 2); glEnd(); } void renderNormalizedQuad(GPUTextureSamplingPattern pattern, float ds, float dt) { switch (pattern) { case GPU_SAMPLE_NEIGHBORS: { glBegin(GL_TRIANGLES); glMultiTexCoord2f(GL_TEXTURE0_ARB, 0, 0); glMultiTexCoord4f(GL_TEXTURE1_ARB, 0-ds, 0, 0+ds, 0); glMultiTexCoord4f(GL_TEXTURE2_ARB, 0, 0-dt, 0, 0+dt); glVertex2f(0, 0); glMultiTexCoord2f(GL_TEXTURE0_ARB, 2, 0); glMultiTexCoord4f(GL_TEXTURE1_ARB, 2-ds, 0, 2+ds, 0); glMultiTexCoord4f(GL_TEXTURE2_ARB, 2, 0-dt, 2, 0+dt); glVertex2f(2, 0); glMultiTexCoord2f(GL_TEXTURE0_ARB, 0, 2); glMultiTexCoord4f(GL_TEXTURE1_ARB, 0-ds, 2, 0+ds, 2); glMultiTexCoord4f(GL_TEXTURE2_ARB, 0, 2-dt, 0, 2+dt); glVertex2f(0, 2); glEnd(); break; } case GPU_SAMPLE_REVERSE_NEIGHBORS: { glBegin(GL_TRIANGLES); glMultiTexCoord2f(GL_TEXTURE0_ARB, 0, 0); glMultiTexCoord4f(GL_TEXTURE1_ARB, 0+ds, 0, 0-ds, 0); glMultiTexCoord4f(GL_TEXTURE2_ARB, 0, 0+dt, 0, 0-dt); glVertex2f(0, 0); glMultiTexCoord2f(GL_TEXTURE0_ARB, 2, 0); glMultiTexCoord4f(GL_TEXTURE1_ARB, 2+ds, 0, 2-ds, 0); glMultiTexCoord4f(GL_TEXTURE2_ARB, 2, 0+dt, 2, 0-dt); glVertex2f(2, 0); glMultiTexCoord2f(GL_TEXTURE0_ARB, 0, 2); glMultiTexCoord4f(GL_TEXTURE1_ARB, 0+ds, 2, 0-ds, 2); glMultiTexCoord4f(GL_TEXTURE2_ARB, 0, 2+dt, 0, 2-dt); glVertex2f(0, 2); glEnd(); break; } case GPU_SAMPLE_DIAG_NEIGBORS: { glBegin(GL_TRIANGLES); glMultiTexCoord2f(GL_TEXTURE0_ARB, 0, 0); glMultiTexCoord4f(GL_TEXTURE1_ARB, 0-ds, 0-dt, 0+ds, 0+dt); glMultiTexCoord4f(GL_TEXTURE2_ARB, 0+ds, 0-dt, 0-ds, 0+dt); glVertex2f(0, 0); glMultiTexCoord2f(GL_TEXTURE0_ARB, 2, 0); glMultiTexCoord4f(GL_TEXTURE1_ARB, 2-ds, 0-dt, 2+ds, 0+dt); glMultiTexCoord4f(GL_TEXTURE2_ARB, 2+ds, 0-dt, 2-ds, 0+dt); glVertex2f(2, 0); glMultiTexCoord2f(GL_TEXTURE0_ARB, 0, 2); glMultiTexCoord4f(GL_TEXTURE1_ARB, 0-ds, 2-dt, 0+ds, 2+dt); glMultiTexCoord4f(GL_TEXTURE2_ARB, 0+ds, 2-dt, 0-ds, 2+dt); glVertex2f(0, 2); glEnd(); break; } case GPU_SAMPLE_2X2_BLOCK: { glBegin(GL_TRIANGLES); glMultiTexCoord2f(GL_TEXTURE0_ARB, 0, 0); glMultiTexCoord4f(GL_TEXTURE1_ARB, 0, 0, 0+ds, 0 ); glMultiTexCoord4f(GL_TEXTURE2_ARB, 0, 0+dt, 0+ds, 0+dt); glVertex2f(0, 0); glMultiTexCoord2f(GL_TEXTURE0_ARB, 2, 0); glMultiTexCoord4f(GL_TEXTURE1_ARB, 2, 0, 2+ds, 0 ); glMultiTexCoord4f(GL_TEXTURE2_ARB, 2, 0+dt, 2+ds, 0+dt); glVertex2f(2, 0); glMultiTexCoord2f(GL_TEXTURE0_ARB, 0, 2); glMultiTexCoord4f(GL_TEXTURE1_ARB, 0, 2, 0+ds, 2 ); glMultiTexCoord4f(GL_TEXTURE2_ARB, 0, 2+dt, 0+ds, 2+dt); glVertex2f(0, 2); glEnd(); break; } default: raiseGLErrorHere1("Unknown sampling pattern."); } // end switch (pattern) } // end renderQuadWithTexSampling() } // end namespace V3D_GPU namespace { V3D_GPU::GLSL_FragmentProgram * trivialTexture2DShader = 0; } namespace 
V3D_GPU { void enableTrivialTexture2DShader() { if (trivialTexture2DShader == 0) { trivialTexture2DShader = new GLSL_FragmentProgram("trivialTexture2DShader"); char const * source = "void main(uniform sampler2D texture, \n" " float2 st : TEXCOORD0, \n" " out float4 color : COLOR) \n" "{ \n" " color = tex2D(texture, st); \n" "} \n"; trivialTexture2DShader->setProgram(source); trivialTexture2DShader->compile(); checkGLErrorsHere0(); } trivialTexture2DShader->enable(); } void disableTrivialTexture2DShader() { if (trivialTexture2DShader) trivialTexture2DShader->disable(); } } // end namespace V3D_GPU #endif // defined(V3DLIB_ENABLE_GPGPU) slowmovideo-0.5+git20180116/src/V3D/Base/0000775000000000000000000000000013151342440016061 5ustar rootrootslowmovideo-0.5+git20180116/src/V3D/Base/v3d_image.h0000664000000000000000000002262113151342440020073 0ustar rootroot// -*- C++ -*- #include "config.h" #ifndef V3D_IMAGE_H #define V3D_IMAGE_H #include #include #include #include #include #include #include #include #include #ifdef V3DLIB_ENABLE_IMDEBUG # include #endif namespace V3D { template struct Image { typedef Elem value_type; typedef Elem const * const_iterator; typedef Elem * iterator; Image() : _width(0), _height(0), _nChannels(0), _planeSize(0), _data(0) { } // (Deep) Copy constructor. I wanted this for use with STL // in non-performance-critical code. // Otherwise, please degenerate the copy constructor by // declaring it private and not defining it. Image( const Image &im ) : _width(0), _height(0), _nChannels(0), _planeSize(0), _data(0) { copyFrom(im); } Image(int w, int h, int nChannels = 1) : _width(w), _height(h), _nChannels(nChannels), _data(0) { _planeSize = w*h; _data = new Elem[w*h*nChannels]; } Image(int w, int h, int nChannels, Elem const& value) : _width(w), _height(h), _nChannels(nChannels), _data(0) { _planeSize = w*h; _data = new Elem[w*h*nChannels]; std::fill(_data, _data+w*h*nChannels, value); } ~Image() { if (_data) delete [] _data; } Image& operator=(Image const& rhs) { this->copyFrom(rhs); return *this; } void resize(int w, int h, int nChannels = 1) { if(w==_width && h==_height && nChannels==_nChannels) return; if (_data) delete [] _data; _width = w; _height = h; _nChannels = nChannels; _planeSize = w*h; _data = new Elem[w*h*nChannels]; } void resize(int w, int h, int nChannels, Elem const& value) { this->resize(w, h, nChannels); std::fill(_data, _data+w*h*nChannels, value); } template void copyFrom( const Image &im ) { resize(im.width(),im.height(),im.numChannels()); for(int i = 0; i < (int)im.numChannels(); i++) std::copy(im.begin(i),im.end(i),begin(i)); } void fill( Elem val ) { std::fill(_data, _data+_width*_height*_nChannels, val); } unsigned int width() const { return _width; } unsigned int height() const { return _height; } unsigned int numChannels() const { return _nChannels; } bool contains( int x, int y ) const { return x>=0 && x<_width && y>=0 && y<_height; } Elem const& operator()(int x, int y, int channel = 0) const { return _data[channel*_planeSize + y*_width + x]; } Elem& operator()(int x, int y, int channel = 0) { return _data[channel*_planeSize + y*_width + x]; } const_iterator begin(int channel = 0) const { return _data + channel*_planeSize; } iterator begin(int channel = 0) { return _data + channel*_planeSize; } const_iterator end(int channel = 0) const { return _data + (channel+1)*_planeSize; } iterator end(int channel = 0) { return _data + (channel+1)*_planeSize; } Elem const& minElement(int chan = 0) const { return *min_element(this->begin(chan), 
this->end(chan)); } Elem& minElement(int chan = 0) { return *min_element(this->begin(chan), this->end(chan)); } Elem const& maxElement(int chan = 0) const { return *max_element(this->begin(chan), this->end(chan)); } Elem& maxElement(int chan = 0) { return *max_element(this->begin(chan), this->end(chan)); } protected: int _width, _height, _nChannels, _planeSize; Elem * _data; }; // end struct Image //---------------------------------------------------------------------- struct ImageFileStat { unsigned width, height; unsigned numChannels; unsigned bitDepth; }; // end struct ImageFileStat void statImageFile(char const * fileName, ImageFileStat& stat); void loadImageFile(char const * fileName, Image& image); void saveImageFile(Image const& image, char const * fileName); //! Reads PPM (P6) and PGM (P5) image files. void statPNMImageFile(char const * fileName, ImageFileStat& stat); void loadPNMImageFile(char const * fileName, Image& image); void savePNMImageFile(Image const& image, char const * fileName); #if defined(V3DLIB_ENABLE_LIBJPEG) void statJPGImageFile(char const * fileName, ImageFileStat& stat); void loadJPGImageFile(char const * fileName, Image& image); void saveJPGImageFile(Image const& image, char const * fileName, int quality = 85); #endif #if defined(V3DLIB_ENABLE_LIBPNG) void statPNGImageFile(char const * fileName, ImageFileStat& stat); void loadPNGImageFile(char const * fileName, Image& image); void savePNGImageFile(Image const& image, char const * fileName); #endif inline void statDataImageFile(const char *filename, ImageFileStat &stat) { std::ifstream in(filename,std::ios_base::binary|std::ios_base::in); verify(in.is_open(),"Failed to open image data file."); in.read((char*)&stat.width,sizeof(unsigned int)); in.read((char*)&stat.height,sizeof(unsigned int)); in.read((char*)&stat.numChannels,sizeof(unsigned int)); in.read((char*)&stat.bitDepth,sizeof(unsigned int)); } template void loadDataImageFile(const char *filename, Image &image) { // Stat file. ImageFileStat stat; unsigned int type; std::ifstream in(filename,std::ios_base::binary|std::ios_base::in); verify(in.is_open(),"Failed to open image data file."); in.read((char*)&stat.width,sizeof(unsigned int)); in.read((char*)&stat.height,sizeof(unsigned int)); in.read((char*)&stat.numChannels,sizeof(unsigned int)); in.read((char*)&stat.bitDepth,sizeof(unsigned int)); in.read((char*)&type,sizeof(unsigned int)); // TODO: Create a mechanism for getting a type constant. // Then type can be verified exactly. // Verify bit depth of type. verify(sizeof(Elem)*8==stat.bitDepth,"Image data incompatible with type."); // Read image. image.resize(stat.width,stat.height,stat.numChannels); in.read((char*)&image(0,0,0),sizeof(Elem)*stat.width*stat.height*stat.numChannels); } template void saveDataImageFile(const Image &image, const char *filename) { // Open file. std::ofstream out(filename,std::ios_base::out|std::ios_base::binary); verify(out.is_open(),"Failed to open iamge data file."); // Write image stat. ImageFileStat stat; stat.width = image.width(); stat.height = image.height(); stat.numChannels = image.numChannels(); stat.bitDepth = sizeof(Elem)*8; // TODO: Create a mechanism for getting a type constant. // Then type can be stored exactly. 
unsigned int type = 0xffffffff; out.write((char*)&stat.width,sizeof(unsigned int)); out.write((char*)&stat.height,sizeof(unsigned int)); out.write((char*)&stat.numChannels,sizeof(unsigned int)); out.write((char*)&stat.bitDepth,sizeof(unsigned int)); out.write((char*)&type,sizeof(unsigned int)); // Write image data. out.write((char*)&image(0,0,0),sizeof(Elem)*image.width()*image.height()*image.numChannels()); } template inline void saveImageChannel(Image const& im, int channel, Elem minVal, Elem maxVal, char const * name) { int const w = im.width(); int const h = im.height(); Elem const len = maxVal - minVal; Image byteIm(w, h, 1); for (int y = 0; y < h; ++y) for (int x = 0; x < w; ++x) { Elem c = std::max(Elem(0), std::min(Elem(255), Elem(255) * (im(x, y, channel) - minVal) / len)); byteIm(x, y) = int(c); } saveImageFile(byteIm, name); } // end saveImageChannel() template inline void saveImageChannel(Image const& im, int channel, char const * name) { int const w = im.width(); int const h = im.height(); if (w == 0 || h == 0) return; Elem minVal = im(0, 0, channel); Elem maxVal = minVal; for (int y = 0; y < h; ++y) for (int x = 0; x < w; ++x) { minVal = std::min(minVal, im(x, y, channel)); maxVal = std::max(maxVal, im(x, y, channel)); } saveImageChannel(im, channel, minVal, maxVal, name); } // end saveImageChannel() template inline void copyImageChannel(Image const& im, int channel, Image &out) { out.resize(im.width(),im.height(),1); for(int y=0; y #include #include #include #include #include #include #include "Base/v3d_exception.h" //! Adds load and save routines to the serializable struct. #define V3D_DEFINE_LOAD_SAVE(T) \ template \ void save(Ar& ar) const { T& self = const_cast(*this); self.serialize(ar); } \ template \ void load(Ar& ar) { this->serialize(ar); } //! Implements \c << and \c >> operators for streams for serializable structs. #define V3D_DEFINE_IOSTREAM_OPS(T) \ inline std::istream& operator>>(std::istream& is, T& v) \ { \ return V3D::loadFromIStream(is, v); \ } \ inline std::ostream& operator<<(std::ostream& os, T const& v) \ { \ return V3D::saveToOStream(os, v); \ } //! Implements \c << and \c >> operators for streams for templated serializable structs. #define V3D_DEFINE_TEMPLATE_IOSTREAM_OPS(T) \ template \ inline std::istream& operator>>(std::istream& is, T& v) \ { \ return V3D::loadFromIStream(is, v); \ } \ template \ inline std::ostream& operator<<(std::ostream& os, T const& v) \ { \ return V3D::saveToOStream(os, v); \ } namespace V3D { template struct OArchiveProtocol { Archive * archive() { return static_cast(this); } static bool isLoading() { return false; } static bool isSaving() { return true; } template Archive& operator<<(T const& val) { this->archive()->save(val); return *this->archive(); } template Archive& operator&(T const& val) { return (*this) << val; } void serializeBlob(void * address, size_t count) { this->archive()->serializeBlob(address, count); } //! Do not use whitespace in \param tag! 
void tag(char const * tag) { this->archive()->saveTag(tag); } void enterScope() { this->archive()->saveTag("{"); } void leaveScope() { this->archive()->saveTag("}"); } void endl() { this->archive()->endl(); } }; template struct IArchiveProtocol { Archive * archive() { return static_cast(this); } static bool isLoading() { return true; } static bool isSaving() { return false; } template Archive& operator>>(T& val) { this->archive()->load(val); return *this->archive(); } template Archive& operator&(T& val) { return (*this) >> val; } void serializeBlob(void * address, size_t count) { this->archive()->serializeBlob(address, count); } //! Do not use whitespace in \param tag! void tag(char const * tag) { std::string stag(tag); std::string s; this->archive()->loadTag(s); if (stag != s) throwV3DErrorHere(std::string("Tag mismatch <" + s + "> instead of <") + stag + std::string(">")); } void enterScope() { std::string s; this->archive()->loadTag(s); if (s != "{") throwV3DErrorHere("Bracket mismatch <{>"); } void leaveScope() { std::string s; this->archive()->loadTag(s); if (s != "}") throwV3DErrorHere("Bracket mismatch <}>"); } void endl() { } }; template struct SerializationScope { SerializationScope(Archive& ar) : _ar(ar) { ar.enterScope(); } SerializationScope(Archive& ar, char const * tag) : _ar(ar) { ar.tag(tag); ar.enterScope(); } ~SerializationScope() { _ar.leaveScope(); } protected: Archive& _ar; }; // end struct SerializationScope //---------------------------------------------------------------------- struct TextOStreamArchive : public OArchiveProtocol { private: typedef OArchiveProtocol Base; public: TextOStreamArchive(std::ostream& os) : Base(), _os(os), _indentation(0) { } template void save(T const& v) { v.T::save(*this); } void save(bool val) { this->put((int)val); } void save(unsigned int val) { this->put(val); } void save(int val) { this->put(val); } void save(long val) { this->put(val); } void save(unsigned long val) { this->put(val); } void save(short int val) { this->put(val); } void save(unsigned short int val) { this->put(val); } void save(float val) { this->put(val); } void save(double val) { this->put(val); } void save(char val) { this->put(int(val)); } void save(unsigned char val) { this->put((unsigned int)val); } void save(char const * str) { unsigned int len = std::strlen(str); this->save(len); _os.write(str, len); _os << " "; } void save(std::string const& str) { unsigned int len = str.length(); this->save(len); _os.write(str.c_str(), len); _os << " "; } void serializeBlob(void * address, size_t count) { char const * buffer = static_cast(address); _os.write(buffer, count); } void saveTag(char const * tag) { this->put(tag); } void enterScope() { _indentation += 2; Base::enterScope(); } void leaveScope() { _indentation -= 2; Base::leaveScope(); this->endl(); } void endl() { _os << std::endl; this->indent(); } protected: template void put(T const& v) { _os << v << " "; } void indent() { for (int i = 0; i < _indentation; ++i) _os << " "; } std::ostream& _os; int _indentation; }; struct TextIStreamArchive : public IArchiveProtocol { private: typedef IArchiveProtocol Base; public: TextIStreamArchive(std::istream& is) : Base(), _is(is) { } template void load(T& v) { v.T::load(*this); } void load(bool& val) { int v; get(v); val = (v != 0) ? 
true : false; } void load(unsigned int& val) { get(val); } void load(int& val) { get(val); } void load(long& val) { get(val); } void load(unsigned long& val) { get(val); } void load(short int& val) { get(val); } void load(unsigned short int& val) { get(val); } void load(float& val) { get(val); } void load(double& val) { get(val); } void load(char& val) { int v; get(v); val = v; } void load(unsigned char& val) { unsigned v; get(v); val = v; } void load(char * str) { unsigned int len; this->load(len); _is.ignore(); // Ignore the extra blank after len _is.read(str, len); str[len] = 0; } void load(std::string& str) { unsigned int len; this->load(len); _is.ignore(); // Ignore the extra blank after len std::vector buf(len+1); _is.read(&buf[0], len); buf[len] = 0; str = &buf[0]; } void serializeBlob(void * address, size_t count) { char * buffer = static_cast(address); _is.read(buffer, count); } void loadTag(std::string& tag) { _is >> std::ws >> tag; } void enterScope() { Base::enterScope(); } void leaveScope() { Base::leaveScope(); } protected: template void get(T& v) { _is >> v; } std::istream& _is; }; //---------------------------------------------------------------------- //! Output archive using a binary stream to write. /*! Integer values (short, int, long) are stored as 32 bit entities. * Hence binary serialization is not dependent on sizeof(int) etc. * Floating point types are directly written, but all relevant platforms use * IEEE fp format anyway. * Chars and bools are written as 8 bit entities. * Do not forget the ios::binary attribute for stream opening! */ struct BinaryOStreamArchive : public OArchiveProtocol { private: typedef OArchiveProtocol Base; public: BinaryOStreamArchive(std::ostream& os) : Base(),_os(os) { } template void save(T const& v) { v.save(*this); } void save(bool val) { this->put_byte(val); } void save(unsigned int val) { this->put_uint32(val); } void save(int val) { this->put_int32(val); } void save(long val) { this->put_int32(val); } void save(unsigned long val) { this->put_uint32(val); } void save(short int val) { this->put_int32(val); } void save(unsigned short int val) { this->put_uint32(val); } void save(float val) { this->put_blob(val); } void save(double val) { this->put_blob(val); } void save(char val) { this->put_byte(val); } void save(unsigned char val) { this->put_byte(val); } void save(char const * str) { //assert(static_cast(std::strlen(str)) < std::numeric_limits::max()); unsigned int len = static_cast(std::strlen(str)); save(len); _os.write(str, len); } void save(std::string const& str) { //assert(static_cast(str.length()) < std::numeric_limits::max()); unsigned int len = static_cast(str.length()); save(len); _os.write(str.c_str(), len); } void serializeBlob(void * address, size_t count) { char const * buffer = static_cast(address); _os.write(buffer, count); } void saveTag(char const *) { } void enterScope() { } void leaveScope() { } void endl() { } protected: template void put_byte(T v) { char val = v; _os.write(&val, 1); } void put_int32(signed long v) { unsigned char buf[4]; buf[0] = static_cast(v & 0xff); buf[1] = static_cast((v >> 8) & 0xff); buf[2] = static_cast((v >> 16) & 0xff); buf[3] = static_cast((v >> 24) & 0xff); _os.write((char *)buf, 4); } void put_uint32(unsigned long v) { unsigned char buf[4]; buf[0] = static_cast(v & 0xff); buf[1] = static_cast((v >> 8) & 0xff); buf[2] = static_cast((v >> 16) & 0xff); buf[3] = static_cast((v >> 24) & 0xff); _os.write((char *)buf, 4); } template void put_blob(T v) { _os.write((char *)&v, sizeof(v)); } 
std::ostream& _os; }; struct BinaryIStreamArchive : public IArchiveProtocol { private: typedef IArchiveProtocol Base; public: BinaryIStreamArchive(std::istream& is) : Base(), _is(is) { } template void load(T& v) { v.load(*this); } void load(bool& val) { get_uchar(val); } void load(unsigned int& val) { getUnsigned(val); } void load(int& val) { getSigned(val); } void load(long& val) { getSigned(val); } void load(unsigned long& val) { getUnsigned(val); } void load(short int& val) { getSigned(val); } void load(unsigned short int& val) { getUnsigned(val); } void load(float& val) { getGeneric(val); } void load(double& val) { getGeneric(val); } void load(char& val) { get_schar(val); } void load(unsigned char& val) { get_uchar(val); } void load(char * str) { unsigned int len; load(len); _is.read(str, len); str[len] = 0; } void load(std::string& str) { unsigned int len; this->load(len); std::vector buf(len+1); _is.read(&buf[0], len); buf[len] = 0; str = &buf[0]; } void serializeBlob(void * address, size_t count) { char * buffer = static_cast(address); _is.read(buffer, count); } void loadTag(std::string& /*tag*/) { } void tag(std::string const& /*tag*/) { } void enterScope() { } void leaveScope() { } void skip(unsigned int nBytes) { _is.ignore(nBytes); } protected: template void get_uchar(T& v) { unsigned char val; _is.read((char *)&val, 1); v = val; } template void get_schar(T& v) { char val; _is.read((char *)&val, 1); v = val; } template void getSigned(T& v) { signed long val; get_int32(val); v = val; } template void getUnsigned(T& v) { unsigned long val; get_uint32(val); v = val; } void get_uint32(unsigned long& v) { unsigned char buf[4]; _is.read((char *)buf, 4); v = buf[0] + (buf[1] << 8) + (buf[2] << 16) + (buf[3] << 24); } void get_int32(signed long& v) { unsigned char buf[4]; _is.read((char *)buf, 4); v = buf[0] + (buf[1] << 8) + (buf[2] << 16); // The following is somewhat magic, interpret the most significant byte as signed char. 
v += (signed char)(buf[3]) << 24; } template void getGeneric(T& v) { _is.read((char *)&v, sizeof(v)); } std::istream& _is; }; //---------------------------------------------------------------------- struct BinaryArchiveSizeAccumulator : public OArchiveProtocol { BinaryArchiveSizeAccumulator() : _byteSize(0) { } template void save(T const& v) { v.save(*this); } void save(bool) { _byteSize += 1; } void save(unsigned int) { _byteSize += 4; } void save(int) { _byteSize += 4; } void save(long) { _byteSize += 4; } void save(unsigned long) { _byteSize += 4; } void save(short int) { _byteSize += 4; } void save(unsigned short int) { _byteSize += 4; } void save(float) { _byteSize += 4; } void save(double) { _byteSize += 8; } void save(char) { _byteSize += 1; } void save(unsigned char) { _byteSize += 1; } void save(char const * str) { unsigned int len = std::strlen(str); this->save(len); _byteSize += len; } void save(std::string const& str) { unsigned int len = str.length(); this->save(len); _byteSize += len; } void serializeBlob(void *, size_t count) { _byteSize += count; } void tag(std::string const&) { } void enterScope() { } void leaveScope() { } void endl() { } unsigned int byteSize() const { return _byteSize; } protected: unsigned int _byteSize; }; // end struct BinaryArchiveSizeAccumulator //---------------------------------------------------------------------- // Serialize to a blob (contiguous memory) struct BlobOArchive : public OArchiveProtocol { private: typedef OArchiveProtocol Base; public: BlobOArchive(int sz = 0) : Base() { if (sz > 0) _blob.reserve(sz); } void clear() { _blob.clear(); } unsigned char const * getBlob() const { return &_blob[0]; } int blobSize() const { return _blob.size(); } template void save(T const& v) { v.save(*this); } void save(bool val) { this->put_byte(val); } void save(unsigned int val) { this->put_uint32(val); } void save(int val) { this->put_int32(val); } void save(long val) { this->put_int32(val); } void save(unsigned long val) { this->put_uint32(val); } void save(short int val) { this->put_int32(val); } void save(unsigned short int val) { this->put_uint32(val); } void save(float val) { this->put_blob(val); } void save(double val) { this->put_blob(val); } void save(char val) { this->put_byte(val); } void save(unsigned char val) { this->put_byte(val); } void save(char const * str) { unsigned int len = static_cast(std::strlen(str)); this->save(len); this->serializeBlob(str, len); } void save(std::string const& str) { unsigned int len = static_cast(str.length()); this->save(len); this->serializeBlob(str.c_str(), len); } void serializeBlob(void const * address, size_t count) { unsigned char const * buffer = static_cast(address); for (size_t i = 0; i < count; ++i) _blob.push_back(buffer[i]); } void saveTag(char const *) { } void enterScope() { } void leaveScope() { } void endl() { } protected: template void put_byte(T v) { unsigned char val = v; _blob.push_back(val); } void put_int32(signed long v) { unsigned char buf[4]; buf[0] = static_cast(v & 0xff); buf[1] = static_cast((v >> 8) & 0xff); buf[2] = static_cast((v >> 16) & 0xff); buf[3] = static_cast((v >> 24) & 0xff); _blob.push_back(buf[0]); _blob.push_back(buf[1]); _blob.push_back(buf[2]); _blob.push_back(buf[3]); } void put_uint32(unsigned long v) { unsigned char buf[4]; buf[0] = static_cast(v & 0xff); buf[1] = static_cast((v >> 8) & 0xff); buf[2] = static_cast((v >> 16) & 0xff); buf[3] = static_cast((v >> 24) & 0xff); _blob.push_back(buf[0]); _blob.push_back(buf[1]); _blob.push_back(buf[2]); 
_blob.push_back(buf[3]); } template void put_blob(T v) { this->serializeBlob(&v, sizeof(v)); } std::vector _blob; }; struct BlobIArchive : public IArchiveProtocol { private: typedef IArchiveProtocol Base; public: BlobIArchive(unsigned char const * blobStart) : Base(), _blobPtr(blobStart) { } template void load(T& v) { v.load(*this); } void load(bool& val) { get_uchar(val); } void load(unsigned int& val) { getUnsigned(val); } void load(int& val) { getSigned(val); } void load(long& val) { getSigned(val); } void load(unsigned long& val) { getUnsigned(val); } void load(short int& val) { getSigned(val); } void load(unsigned short int& val) { getUnsigned(val); } void load(float& val) { getGeneric(val); } void load(double& val) { getGeneric(val); } void load(char& val) { get_schar(val); } void load(unsigned char& val) { get_uchar(val); } void load(char * str) { unsigned int len; this->load(len); this->serializeBlob(str, len); str[len] = 0; } void load(std::string& str) { unsigned int len; this->load(len); std::vector buf(len+1); this->serializeBlob(&buf[0], len); buf[len] = 0; str = &buf[0]; } void serializeBlob(void * address, size_t count) { unsigned char * buffer = static_cast(address); for (size_t i = 0; i < count; ++i) buffer[i] = _blobPtr[i]; _blobPtr += count; } void loadTag(std::string& /*tag*/) { } void tag(std::string const& /*tag*/) { } void enterScope() { } void leaveScope() { } void skip(unsigned int nBytes) { _blobPtr += nBytes; } protected: template void get_uchar(T& v) { v = *_blobPtr++; } template void get_schar(T& v) { v = *_blobPtr++; } template void getSigned(T& v) { signed long val; get_int32(val); v = val; } template void getUnsigned(T& v) { unsigned long val; get_uint32(val); v = val; } void get_uint32(unsigned long& v) { unsigned char const * buf = _blobPtr; v = buf[0] + (buf[1] << 8) + (buf[2] << 16) + (buf[3] << 24); _blobPtr += 4; } void get_int32(signed long& v) { unsigned char const * buf = _blobPtr; v = buf[0] + (buf[1] << 8) + (buf[2] << 16); // The following is somewhat magic, interpret the most significant byte as signed char. v += (signed char)(buf[3]) << 24; _blobPtr += 4; } template void getGeneric(T& v) { this->serializeBlob(&v, sizeof(v)); } unsigned char const * _blobPtr; }; //---------------------------------------------------------------------- //! Serializes a vector of serializable items. 
template inline void serializeVector(std::vector& v, Archive& ar) { unsigned int sz = v.size(); ar & sz; if (ar.isLoading()) v.resize(sz); SerializationScope s(ar); for (unsigned i = 0; i < sz; ++i) ar & v[i]; } template inline void serializeVector(char const * tag, std::vector& v, Archive& ar) { ar.tag(tag); serializeVector(v, ar); } template inline void serializeSet(std::set& v, Archive& ar) { unsigned int sz = v.size(); ar & sz; SerializationScope s(ar); if (ar.isLoading()) { T elem; for (unsigned i = 0; i < sz; ++i) { ar & elem; v.insert(elem); } } else { T elem; for (typename std::set::iterator p = v.begin(); p != v.end(); ++p) { elem = *p; ar & elem; } } // end if } // end serializeSet() template inline void serializeMap(std::map& v, Archive& ar) { unsigned int sz = v.size(); ar & sz; SerializationScope s(ar); if (ar.isLoading()) { v.clear(); Key key; T elem; for (unsigned i = 0; i < sz; ++i) { ar & key & elem; v.insert(make_pair(key, elem)); } } else { Key key; for (typename std::map::iterator p = v.begin(); p != v.end(); ++p) { key = p->first; ar & key & p->second; } } // end if } // end serializeMap() template inline void serializeDataToFile(char const * archiveName, T const& data, bool writeBinary = true) { using namespace std; int const tagLength = 6; if (writeBinary) { char const * magicTag = "V3DBIN"; ofstream os(archiveName, ios::binary); os.write(magicTag, tagLength); BinaryOStreamArchive ar(os); ar & data; } else { char const * magicTag = "V3DTXT"; ofstream os(archiveName); os << magicTag << endl; TextOStreamArchive ar(os); ar & data; } } // end serializeDataToFile() template inline void serializeDataFromFile(char const * archiveName, T& data) { using namespace std; int const tagLength = 6; bool isBinary = true; { // Determine archive format from the first 8 chars char magicTag[tagLength]; ifstream is(archiveName, ios::binary); is.read(magicTag, tagLength); if (strncmp(magicTag, "V3DBIN", tagLength) == 0) isBinary = true; else if (strncmp(magicTag, "V3DTXT", tagLength) == 0) isBinary = false; else throwV3DErrorHere("Unknown archive magic tag"); } if (isBinary) { ifstream is(archiveName, ios::binary); is.ignore(tagLength); BinaryIStreamArchive ar(is); ar & data; } else { ifstream is(archiveName); is.ignore(tagLength); TextIStreamArchive ar(is); ar & data; } } // end serializeDataFromFile() template inline std::ostream& saveToOStream(std::ostream& os, T const& v) { using namespace std; TextOStreamArchive ar(os); ar << v; return os; } template inline std::istream& loadFromIStream(std::istream& is, T& v) { using namespace std; TextIStreamArchive ar(is); ar >> v; return is; } //---------------------------------------------------------------------- template struct SerializableVector : public std::vector { SerializableVector() : std::vector() { } SerializableVector(size_t sz) : std::vector(sz) { } template void serialize(Archive& ar) { serializeVector(*this, ar); } V3D_DEFINE_LOAD_SAVE(SerializableVector); }; // end struct SerializableVector V3D_DEFINE_TEMPLATE_IOSTREAM_OPS(SerializableVector); } // end namespace V3D #endif slowmovideo-0.5+git20180116/src/V3D/Base/v3d_timer.h0000664000000000000000000000764613151342440020143 0ustar rootroot// -*- C++ -*- #include "config.h" #ifndef V3D_TIMER_H #define V3D_TIMER_H #include #include #include #if defined(WIN32) # include #else # include # include #endif namespace V3D { using namespace std; class Timer { public: Timer(const char *name = "", int history_size = 0) : _total_time(0), _history(0), _history_index(0), _count(0) { _name[0] = 
'\0'; #ifdef _MSC_VER LARGE_INTEGER freq; QueryPerformanceFrequency(&freq); _freq = freq.QuadPart; if (name) strcpy_s(_name, sizeof(_name), name); else strcpy_s(_name, sizeof(_name), ""); #else _freq = 1000000; // gettimeofday() return microseconds. if (name) strncpy(_name, name, sizeof(_name)); else strncpy(_name, "", sizeof(_name)); #endif if (history_size > 0) _history = new unsigned long long[history_size]; _history_size = history_size; std::fill(_history,_history+_history_size,0); } ~Timer() { delete[] _history; } void start() { #ifdef WIN32 LARGE_INTEGER start_time; QueryPerformanceCounter(&start_time); _start_time = start_time.QuadPart; #else timeval tv; gettimeofday(&tv, 0); _start_time = tv.tv_sec*_freq + tv.tv_usec; #endif } void stop() { #ifdef WIN32 LARGE_INTEGER cur_time; QueryPerformanceCounter(&cur_time); unsigned long long elapsed = cur_time.QuadPart - _start_time; #else timeval tv; gettimeofday(&tv, 0); unsigned long long elapsed = tv.tv_sec*_freq + tv.tv_usec - _start_time; #endif _total_time += elapsed; if (_history) { if ((int)_count == _history_size) _total_time -= _history[_history_index]; else ++_count; _history[_history_index] = elapsed; ++_history_index; if(_history_index >= _history_size) _history_index = 0; } else ++_count; } // end stop() double getHertz() const { return _freq*(double)_count/_total_time; } double getTime() const { return (double)_total_time/_freq; } unsigned long getCount() const { return _count; } const char *getName() const { return _name; } void printHertz() const { printf("TIMING: %s: %.03f Hz\n",_name, this->getHertz()); } void printTime() const { printf("TIMING: %s: %.03f s\n",_name, this->getTime()); } void print() const { printf("TIMING: %s: %.03f Hz, %.03f s/exec, %ld execs, %.03f s\n", _name, getHertz(), getTime()/getCount(), getCount(), getTime()); } private: unsigned long long _freq; unsigned long long _total_time; unsigned long long * _history; int _history_size; int _history_index; unsigned long _count; unsigned long long _start_time; char _name[80]; }; // end struct Timer struct ScopedTimer { ScopedTimer(const char *name = "") : _timer(name,0) { _timer.start(); } ~ScopedTimer() { _timer.stop(); #ifndef NDEBUG _timer.print(); #endif } private: Timer _timer; }; } // end namespace V3D #endif slowmovideo-0.5+git20180116/src/V3D/Base/v3d_image.cpp0000664000000000000000000006516013151342440020433 0ustar rootroot#include "Base/v3d_image.h" #include "Base/v3d_exception.h" #include #include #include #include // for INT_MAX #if defined(V3DLIB_ENABLE_LIBJPEG) extern "C" { # include } #endif #if defined(V3DLIB_ENABLE_LIBPNG) # include #endif using namespace std; namespace { // Adapted from netpbm inline bool pnm_getc(FILE * const file, char& dst) { int ich; ich = getc(file); if (ich == EOF) return false; // premature EOF dst = (char) ich; // Skip comments if (dst == '#') { do { ich = getc(file); if (ich == EOF) return false; // premature EOF dst = (char) ich; } while (dst != '\n' && dst != '\r'); } return true; } // end pnm_getc() inline bool pnm_getuint(FILE * const file, unsigned int& res) { char ch; do { if (!pnm_getc(file, ch)) return false; } while (ch == ' ' || ch == '\t' || ch == '\n' || ch == '\r'); if (ch < '0' || ch > '9') return false; res = 0; do { unsigned int const digitVal = ch - '0'; if (res > INT_MAX/10 - digitVal) return false; res = res * 10 + digitVal; if (!pnm_getc(file, ch)) return false; } while (ch >= '0' && ch <= '9'); return true; } // end pnm_getuint() 
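// Illustrative helper (added sketch, not part of the original netpbm-derived
// code): shows how the readers above are meant to be combined when parsing a
// binary PNM header such as "P6\n# comment\n640 480\n255\n".  pnm_getc()
// transparently skips '#' comment lines and pnm_getuint() eats leading
// whitespace and accumulates decimal digits with an overflow guard, so after
// the two magic bytes the header reduces to three calls.  The function name
// readPNMHeaderFields is made up for illustration only.
inline bool readPNMHeaderFields(FILE * file, unsigned int& width,
                                unsigned int& height, unsigned int& maxVal)
{
   // Width, height and the maximum sample value follow the magic number in
   // this order; statPNMImageFile()/loadPNMImageFile() below perform the same
   // sequence and throw an exception on failure instead of returning false.
   return pnm_getuint(file, width)
       && pnm_getuint(file, height)
       && pnm_getuint(file, maxVal);
}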
//---------------------------------------------------------------------- #if defined(V3DLIB_ENABLE_LIBPNG) /* our method that reads from a FILE* and fills up the buffer that libpng wants when parsing a PNG file */ void user_read_callback(png_structp png_ptr, png_bytep data, png_uint_32 length) { unsigned readlen = fread(data, 1, length, (FILE *)png_get_io_ptr(png_ptr)); if (readlen != length) { /* FIXME: then what? png_error()? 20020821 mortene */ } } /* our method that write compressed png image data to a FILE* */ void user_write_callback(png_structp png_ptr, png_bytep data, png_uint_32 length) { unsigned writelen = fwrite(data, 1, length, (FILE *)png_get_io_ptr(png_ptr)); if (writelen != length) { /* FIXME: then what? png_error()? 20020821 mortene */ } } /* our method that flushes written compressed png image data */ void user_flush_callback(png_structp png_ptr) { int err = fflush((FILE *)png_get_io_ptr(png_ptr)); if (err != 0) { /* FIXME: then what? png_error()? 20020821 mortene */ } } #endif // defined(V3DLIB_ENABLE_LIBPNG) //---------------------------------------------------------------------- enum ImageFileType { V3D_IMAGE_FILE_TYPE_UNKNOWN = -1, V3D_IMAGE_FILE_TYPE_PNM = 0, V3D_IMAGE_FILE_TYPE_JPEG = 1, V3D_IMAGE_FILE_TYPE_PNG = 2, }; inline ImageFileType determineImageFileType(string const& fileName) { size_t extStart = fileName.find_last_of("./\\"); if (extStart == fileName.npos || fileName[extStart] != '.') return V3D_IMAGE_FILE_TYPE_UNKNOWN; string const extension = fileName.substr(extStart+1); if (extension == "jpg" || extension == "JPG" || extension == "jpeg" || extension == "JPEG") return V3D_IMAGE_FILE_TYPE_JPEG; if (extension == "png" || extension == "PNG") return V3D_IMAGE_FILE_TYPE_PNG; if (extension == "ppm" || extension == "PPM" || extension == "pgm" || extension == "PGM") return V3D_IMAGE_FILE_TYPE_PNM; return V3D_IMAGE_FILE_TYPE_UNKNOWN; } // end determineImageFileType() } // end namespace <> namespace V3D { void statImageFile(char const * fileName, ImageFileStat& stat) { ImageFileType fileType = determineImageFileType(std::string(fileName)); switch (fileType) { case V3D_IMAGE_FILE_TYPE_PNM: statPNMImageFile(fileName, stat); return; #if defined(V3DLIB_ENABLE_LIBJPEG) case V3D_IMAGE_FILE_TYPE_JPEG: statJPGImageFile(fileName, stat); return; #endif #if defined(V3DLIB_ENABLE_LIBPNG) case V3D_IMAGE_FILE_TYPE_PNG: statPNGImageFile(fileName, stat); return; #endif default: throw Exception(__FILE__, __LINE__, "Unkown or unsupported image file extension."); } // end switch() } // end statImageFile() void loadImageFile(char const * fileName, Image& image) { ImageFileType fileType = determineImageFileType(std::string(fileName)); switch (fileType) { case V3D_IMAGE_FILE_TYPE_PNM: loadPNMImageFile(fileName, image); return; #if defined(V3DLIB_ENABLE_LIBJPEG) case V3D_IMAGE_FILE_TYPE_JPEG: loadJPGImageFile(fileName, image); return; #endif #if defined(V3DLIB_ENABLE_LIBPNG) case V3D_IMAGE_FILE_TYPE_PNG: loadPNGImageFile(fileName, image); return; #endif default: throw Exception(__FILE__, __LINE__, "Unkown or unsupported image file extension."); } // end switch() } // end loadImageFile() void saveImageFile(Image const& image, char const * fileName) { cout << "Writing image to " << fileName << endl; ImageFileType fileType = determineImageFileType(std::string(fileName)); switch (fileType) { case V3D_IMAGE_FILE_TYPE_PNM: savePNMImageFile(image, fileName); return; #if defined(V3DLIB_ENABLE_LIBJPEG) case V3D_IMAGE_FILE_TYPE_JPEG: saveJPGImageFile(image, fileName); return; #endif #if 
defined(V3DLIB_ENABLE_LIBPNG) case V3D_IMAGE_FILE_TYPE_PNG: savePNGImageFile(image, fileName); return; #endif default: throw Exception(__FILE__, __LINE__, "Unkown or unsupported image file extension."); } // end switch() } // end saveImageFile() void statPNMImageFile(char const * fileName, ImageFileStat& stat) { stat.width = stat.height = stat.numChannels = stat.bitDepth = -1; FILE * file = fopen(fileName, "rb"); if (!file) throw Exception(__FILE__, __LINE__, "Cannot open PNM image file.");; int magic[2]; magic[0] = getc(file); magic[1] = getc(file); if (magic[0] != 'P' && (magic[1] != '5' || magic[1] != '6')) throw Exception(__FILE__, __LINE__, "Wrong or unsupported PNM magic number."); unsigned int width, height, maxVal; if (!pnm_getuint(file, width)) throw Exception(__FILE__, __LINE__, "Cannot read PNM header."); if (!pnm_getuint(file, height)) throw Exception(__FILE__, __LINE__, "Cannot read PNM header."); if (!pnm_getuint(file, maxVal)) throw Exception(__FILE__, __LINE__, "Cannot read PNM header."); stat.numChannels = (magic[1] == '5') ? 1 : 3; stat.width = width; stat.height = height; stat.bitDepth = (maxVal > 255) ? 16 : 8; fclose(file); } // end statPNMImageFile() void loadPNMImageFile(char const * fileName, Image& image) { FILE * file = fopen(fileName, "rb"); if (!file) throw Exception(__FILE__, __LINE__, "Cannot open PNM image file."); int magic[2]; magic[0] = getc(file); magic[1] = getc(file); if (magic[0] == 'P' && magic[1] == '6') { unsigned int width, height, maxVal; if (!pnm_getuint(file, width)) throw Exception(__FILE__, __LINE__, "Cannot read PNM header."); if (!pnm_getuint(file, height)) throw Exception(__FILE__, __LINE__, "Cannot read PNM header."); if (!pnm_getuint(file, maxVal)) throw Exception(__FILE__, __LINE__, "Cannot read PNM header."); if (maxVal > 255) // Does not fit in unsigned char throw Exception(__FILE__, __LINE__, "PNM image file has unsupported bit depth."); image.resize(width, height, 3); unsigned char * pixels = new unsigned char[width*height*3]; fread(pixels, width*height*3, 1, file); for (int y = 0; y < (int)height; ++y) { int const rowOfs = 3*width*y; for (int x = 0; x < (int)width; ++x) { image(x, y, 0) = pixels[rowOfs + 3*x + 0]; image(x, y, 1) = pixels[rowOfs + 3*x + 1]; image(x, y, 2) = pixels[rowOfs + 3*x + 2]; } } delete [] pixels; } else if (magic[0] == 'P' && magic[1] == '5') { unsigned int width, height, maxVal; if (!pnm_getuint(file, width)) throw Exception(__FILE__, __LINE__, "Cannot read PNM header."); if (!pnm_getuint(file, height)) throw Exception(__FILE__, __LINE__, "Cannot read PNM header."); if (!pnm_getuint(file, maxVal)) throw Exception(__FILE__, __LINE__, "Cannot read PNM header."); if (maxVal > 255) // Does not fit in unsigned char throw Exception(__FILE__, __LINE__, "PNM image file has unsupported bit depth."); image.resize(width, height, 1); fread(image.begin(), width*height, 1, file); } fclose(file); } // end loadPNMImageFile() void savePNMImageFile(Image const& image, char const * fileName) { int const width = image.width(); int const height = image.height(); int const nPlanes = image.numChannels(); if (nPlanes == 3) { unsigned char * pixels = new unsigned char[width*height*3]; for (int y = 0; y < height; ++y) { int const rowOfs = 3*width*y; for (int x = 0; x < width; ++x) { pixels[rowOfs + 3*x + 0] = image(x, y, 0); pixels[rowOfs + 3*x + 1] = image(x, y, 1); pixels[rowOfs + 3*x + 2] = image(x, y, 2); } } FILE * file = fopen(fileName, "wb"); if (!file) throw Exception(__FILE__, __LINE__, "Cannot open PNM image file for 
writing."); fprintf(file, "P6\n%i %i\n%i\n", width, height, 255); fwrite(pixels, 3*width*height, 1, file); fclose(file); delete [] pixels; } else if (nPlanes == 1) { FILE * file = fopen(fileName, "wb"); fprintf(file, "P5\n%i %i\n%i\n", width, height, 255); fwrite(image.begin(), width*height, 1, file); fclose(file); } } // end savePNMImageFile() #if defined(V3DLIB_ENABLE_LIBJPEG) void statJPGImageFile(char const * fileName, ImageFileStat& stat) { stat.width = stat.height = stat.numChannels = stat.bitDepth = -1; struct jpeg_decompress_struct cinfo; struct jpeg_error_mgr jerr; FILE * infile; /* source file */ if ((infile = fopen(fileName, "rb")) == NULL) throw Exception(__FILE__, __LINE__, "Cannot open JPG image file."); cinfo.err = jpeg_std_error(&jerr); /* Now we can initialize the JPEG decompression object. */ jpeg_create_decompress(&cinfo); jpeg_stdio_src(&cinfo, infile); jpeg_read_header(&cinfo, TRUE); stat.width = cinfo.image_width; stat.height = cinfo.image_height; stat.numChannels = cinfo.num_components; stat.bitDepth = 8; jpeg_destroy_decompress(&cinfo); fclose(infile); } // end statJPGImageFile() void loadJPGImageFile(char const * fileName, Image& image) { struct jpeg_decompress_struct cinfo; struct jpeg_error_mgr jerr; FILE * infile; /* source file */ JSAMPARRAY buffer; /* Output row buffer */ int row_stride; /* physical row width in output buffer */ if ((infile = fopen(fileName, "rb")) == NULL) throw Exception(__FILE__, __LINE__, "Cannot open JPG image file."); cinfo.err = jpeg_std_error(&jerr); /* Now we can initialize the JPEG decompression object. */ jpeg_create_decompress(&cinfo); jpeg_stdio_src(&cinfo, infile); jpeg_read_header(&cinfo, TRUE); jpeg_start_decompress(&cinfo); int const nPlanes = cinfo.num_components; int const w = cinfo.image_width; int const h = cinfo.image_height; image.resize(w, h, nPlanes); row_stride = w * nPlanes; /* Make a one-row-high sample array that will go away when done with image */ buffer = (*cinfo.mem->alloc_sarray)((j_common_ptr) &cinfo, JPOOL_IMAGE, row_stride, 1); while ((int)cinfo.output_scanline < h) { int const y = cinfo.output_scanline; /* jpeg_read_scanlines expects an array of pointers to scanlines. * Here the array is only one element long, but you could ask for * more than one scanline at a time if that's more convenient. */ jpeg_read_scanlines(&cinfo, buffer, 1); for (int x = 0; x < w; ++x) for (int plane = 0; plane < nPlanes; ++plane) image(x, y, plane) = buffer[0][nPlanes*x + plane]; } // end while jpeg_finish_decompress(&cinfo); jpeg_destroy_decompress(&cinfo); fclose(infile); } // end loadJPGImageFile() void saveJPGImageFile(Image const& image, char const * fileName, int quality) { struct jpeg_compress_struct cinfo; struct jpeg_error_mgr jerr; FILE * outfile; /* target file */ JSAMPROW row_pointer[1]; /* pointer to JSAMPLE row[s] */ int row_stride; /* physical row width in image buffer */ int const w = image.width(); int const h = image.height(); int const nPlanes = image.numChannels(); cinfo.err = jpeg_std_error(&jerr); /* Now we can initialize the JPEG compression object. 
*/ jpeg_create_compress(&cinfo); if ((outfile = fopen(fileName, "wb")) == NULL) throw Exception(__FILE__, __LINE__, "Cannot open JPG image file for writing."); jpeg_stdio_dest(&cinfo, outfile); cinfo.image_width = w; /* image width and height, in pixels */ cinfo.image_height = h; cinfo.input_components = nPlanes; /* # of color components per pixel */ if (nPlanes == 3) cinfo.in_color_space = JCS_RGB; /* colorspace of input image */ else cinfo.in_color_space = JCS_GRAYSCALE; /* grayscale of input image */ jpeg_set_defaults(&cinfo); jpeg_set_quality(&cinfo, quality, TRUE /* limit to baseline-JPEG values */); jpeg_start_compress(&cinfo, TRUE); row_stride = w * nPlanes; /* JSAMPLEs per row in image_buffer */ row_pointer[0] = new unsigned char[row_stride]; while (cinfo.next_scanline < cinfo.image_height) { /* jpeg_write_scanlines expects an array of pointers to scanlines. * Here the array is only one element long, but you could pass * more than one scanline at a time if that's more convenient. */ int const y = cinfo.next_scanline; for (int x = 0; x < w; ++x) for (int plane = 0; plane < nPlanes; ++plane) row_pointer[0][nPlanes*x + plane] = image(x, y, plane); jpeg_write_scanlines(&cinfo, row_pointer, 1); } delete [] row_pointer[0]; jpeg_finish_compress(&cinfo); fclose(outfile); jpeg_destroy_compress(&cinfo); } // end saveJPGImageFile() #endif // #defined(V3DLIB_ENABLE_LIBJPEG) #if defined(V3DLIB_ENABLE_LIBPNG) void statPNGImageFile(char const * fileName, ImageFileStat& stat) { stat.width = stat.height = stat.numChannels = stat.bitDepth = -1; FILE *fp = fopen(fileName, "rb"); if (!fp) throw Exception(__FILE__, __LINE__, "Cannot open PNG image file."); unsigned char header[8]; png_uint_32 width, height; int bit_depth, color_type, interlace_type; fread(header, 1, 8, fp); bool is_png = !png_sig_cmp(header, 0, 8); if (!is_png) throw Exception(__FILE__, __LINE__, "Cannot read PNG image header."); png_structp png_ptr = png_create_read_struct(PNG_LIBPNG_VER_STRING, 0, 0, 0); png_infop info_ptr = png_create_info_struct(png_ptr); /* we're not using png_init_io(), as we don't want to pass a FILE* into libpng, in case it's an MSWindows DLL with a different CRT (C run-time library) */ fseek(fp, 0, SEEK_SET); png_set_read_fn(png_ptr, (void *)fp, (png_rw_ptr)user_read_callback); /* The call to png_read_info() gives us all of the information from the * PNG file before the first IDAT (image data chunk). 
REQUIRED */ png_read_info(png_ptr, info_ptr); png_get_IHDR(png_ptr, info_ptr, &width, &height, &bit_depth, &color_type, &interlace_type, NULL, NULL); stat.width = width; stat.height = height; stat.bitDepth = bit_depth; switch (color_type) { case PNG_COLOR_TYPE_GRAY: stat.numChannels = 1; break; case PNG_COLOR_TYPE_GRAY_ALPHA: stat.numChannels = 2; break; case PNG_COLOR_TYPE_RGB: stat.numChannels = 3; break; case PNG_COLOR_TYPE_RGB_ALPHA: stat.numChannels = 4; break; default: throw Exception(__FILE__, __LINE__, "Unsupported number of channels in PNG image file."); } png_destroy_read_struct(&png_ptr, &info_ptr, (png_infopp)NULL); fclose(fp); } // end statPNGImageFile() void loadPNGImageFile(char const * fileName, Image& image) { FILE *fp = fopen(fileName, "rb"); if (!fp) throw Exception(__FILE__, __LINE__, "Cannot open PNG image file."); unsigned char header[8]; png_uint_32 width, height; int bit_depth, color_type, interlace_type; fread(header, 1, 8, fp); bool is_png = !png_sig_cmp(header, 0, 8); if (!is_png) throw Exception(__FILE__, __LINE__, "Cannot read PNG image header."); { png_structp png_ptr = png_create_read_struct(PNG_LIBPNG_VER_STRING, 0, 0, 0); png_infop info_ptr = png_create_info_struct(png_ptr); /* we're not using png_init_io(), as we don't want to pass a FILE* into libpng, in case it's an MSWindows DLL with a different CRT (C run-time library) */ fseek(fp, 0, SEEK_SET); png_set_read_fn(png_ptr, (void *)fp, (png_rw_ptr)user_read_callback); /* The call to png_read_info() gives us all of the information from the * PNG file before the first IDAT (image data chunk). REQUIRED */ png_read_info(png_ptr, info_ptr); png_get_IHDR(png_ptr, info_ptr, &width, &height, &bit_depth, &color_type, &interlace_type, NULL, NULL); /* tell libpng to strip 16 bit/color files down to 8 bits/color */ png_set_strip_16(png_ptr); /* expand paletted colors into true RGB triplets */ if (color_type == PNG_COLOR_TYPE_PALETTE) png_set_expand(png_ptr); /* expand grayscale images to the full 8 bits from 1, 2, or 4 bits/pixel */ if (color_type == PNG_COLOR_TYPE_GRAY && bit_depth < 8) png_set_expand(png_ptr); /* expand paletted or RGB images with transparency to full alpha channels * so the data will be available as RGBA quartets */ if (png_get_valid(png_ptr, info_ptr, PNG_INFO_tRNS)) png_set_expand(png_ptr); png_read_update_info(png_ptr, info_ptr); int nChannels = png_get_channels(png_ptr, info_ptr); image.resize(width, height, nChannels); int bytes_per_row = png_get_rowbytes(png_ptr, info_ptr); unsigned char * buffer = new unsigned char[bytes_per_row * height]; png_bytepp row_pointers = new png_bytep[height]; for (unsigned y = 0; y < height; y++) row_pointers[y] = buffer + y*bytes_per_row; png_read_image(png_ptr, row_pointers); png_read_end(png_ptr, info_ptr); delete [] row_pointers; png_destroy_read_struct(&png_ptr, &info_ptr, (png_infopp)NULL); fclose(fp); for (int chan = 0; chan < nChannels; ++chan) { unsigned char * p = image.begin(chan); for (unsigned y = 0; y < height; ++y) for (unsigned x = 0; x < width; ++x, ++p) *p = buffer[y*bytes_per_row + nChannels*x + chan]; } delete [] buffer; } // end scope } // end loadImageFilePNG() void savePNGImageFile(Image const& image, char const * fileName) { /* open the file */ FILE * fp = fopen(fileName, "wb"); if (!fp) throw Exception(__FILE__, __LINE__, "Cannot open PNG image file for writing."); /* Create and initialize the png_struct with the desired error handler * functions. 
If you want to use the default stderr and longjump method, * you can supply NULL for the last three parameters. We also check that * the library version is compatible with the one used at compile time, * in case we are using dynamically linked libraries. REQUIRED. */ png_structp png_ptr = png_create_write_struct(PNG_LIBPNG_VER_STRING, NULL, NULL, NULL); if (png_ptr == NULL) throw Exception(__FILE__, __LINE__, "Cannot create PNG structures for writing."); /* Allocate/initialize the image information data. REQUIRED */ png_infop info_ptr = png_create_info_struct(png_ptr); if (info_ptr == NULL) throw Exception(__FILE__, __LINE__, "Cannot create PNG structures for writing."); /* Set error handling. REQUIRED if you aren't supplying your own * error hadnling functions in the png_create_write_struct() call. */ // if (setjmp(png_ptr->jmpbuf)) { // /* If we get here, we had a problem reading the file */ // fclose(fp); // png_destroy_write_struct(&png_ptr, (png_infopp)info_ptr); // pngerror = ERR_PNGLIB_WRITE; // return 0; // } /* we're not using png_init_io(), as we don't want to pass a FILE* into libpng, in case it's an MSWindows DLL with a different CRT (C run-time library) */ png_set_write_fn(png_ptr, (void *)fp, (png_rw_ptr)user_write_callback, (png_flush_ptr)user_flush_callback); /* Set the image information here. Width and height are up to 2^31, * bit_depth is one of 1, 2, 4, 8, or 16, but valid values also depend on * the color_type selected. color_type is one of PNG_COLOR_TYPE_GRAY, * PNG_COLOR_TYPE_GRAY_ALPHA, PNG_COLOR_TYPE_PALETTE, PNG_COLOR_TYPE_RGB, * or PNG_COLOR_TYPE_RGB_ALPHA. interlace is either PNG_INTERLACE_NONE or * PNG_INTERLACE_ADAM7, and the compression_type and filter_type MUST * currently be PNG_COMPRESSION_TYPE_BASE and PNG_FILTER_TYPE_BASE. REQUIRED */ int colortype = PNG_COLOR_TYPE_RGB; unsigned const width = image.width(); unsigned const height = image.height(); unsigned const nChannels = image.numChannels(); switch (nChannels) { case 1: colortype = PNG_COLOR_TYPE_GRAY; break; case 3: colortype = PNG_COLOR_TYPE_RGB; break; case 4: colortype = PNG_COLOR_TYPE_RGB_ALPHA; break; default: png_destroy_write_struct(&png_ptr, (png_infopp)NULL); throw Exception(__FILE__, __LINE__, "Unsupported number of channels for writing a PNG image file."); } png_set_IHDR(png_ptr, info_ptr, width, height, 8, colortype, PNG_INTERLACE_NONE, PNG_COMPRESSION_TYPE_BASE, PNG_FILTER_TYPE_BASE); /* Optional gamma chunk is strongly suggested if you have any guess * as to the correct gamma of the image. */ /* png_set_gAMA(png_ptr, info_ptr, gamma); */ /* other optional chunks like cHRM, bKGD, tRNS, tIME, oFFs, pHYs, */ /* Write the file header information. REQUIRED */ png_write_info(png_ptr, info_ptr); /* Once we write out the header, the compression type on the text * chunks gets changed to PNG_TEXT_COMPRESSION_NONE_WR or * PNG_TEXT_COMPRESSION_zTXt_WR, so it doesn't get written out again * at the end. */ /* set up the transformations you want. Note that these are * all optional. Only call them if you want them. */ /* invert monocrome pixels */ /* png_set_invert(png_ptr); */ /* Shift the pixels up to a legal bit depth and fill in * as appropriate to correctly scale the image */ /* png_set_shift(png_ptr, &sig_bit);*/ /* pack pixels into bytes */ /* png_set_packing(png_ptr); */ /* swap location of alpha bytes from ARGB to RGBA */ /* png_set_swap_alpha(png_ptr); */ /* Get rid of filler (OR ALPHA) bytes, pack XRGB/RGBX/ARGB/RGBA into * RGB (4 channels -> 3 channels). The second parameter is not used. 
*/ /* png_set_filler(png_ptr, 0, PNG_FILLER_BEFORE); */ /* flip BGR pixels to RGB */ /* png_set_bgr(png_ptr); */ /* swap bytes of 16-bit files to most significant byte first */ /* png_set_swap(png_ptr); */ /* swap bits of 1, 2, 4 bit packed pixel formats */ /* png_set_packswap(png_ptr); */ /* The easiest way to write the image (you may have a different memory * layout, however, so choose what fits your needs best). You need to * use the first method if you aren't handling interlacing yourself. */ /* If you are only writing one row at a time, this works */ unsigned const bytesperrow = width * nChannels; if (nChannels != 1) { unsigned char * buffer = new unsigned char[bytesperrow]; for (unsigned y = 0; y < height; ++y) { for (unsigned chan = 0; chan < nChannels; ++chan) { unsigned char const * p = image.begin(chan) + y*width; for (unsigned x = 0; x < width; ++x, ++p) buffer[x*nChannels + chan] = *p; } png_write_row(png_ptr, (png_bytep)buffer); } // end for (y) delete [] buffer; } else { // Faster path for grayscale images for (unsigned y = 0; y < height; ++y) png_write_row(png_ptr, (png_bytep)(image.begin() + y*width)); } // end if /* You can write optional chunks like tEXt, zTXt, and tIME at the end * as well. */ /* It is REQUIRED to call this to finish writing the rest of the file */ png_write_end(png_ptr, info_ptr); /* if you allocated any text comments, free them here */ /* clean up after the write, and free any memory allocated */ png_destroy_write_struct(&png_ptr, &info_ptr); /* close the file */ fclose(fp); /* that's it */ } // end saveImageFilePNG() #endif // defined(V3DLIB_ENABLE_LIBPNG) } // end namespace V3D slowmovideo-0.5+git20180116/src/V3D/Base/v3d_exception.h0000664000000000000000000000325613151342440021012 0ustar rootroot// -*- C++ -*- #ifndef V3D_EXCEPTION_H #define V3D_EXCEPTION_H #include #include #include #include #include #define verify(condition, message) do \ { \ if (!(condition)) { \ std::cout << "VERIFY FAILED: " << (message) << "\n" \ << " " << __FILE__ << ", " << __LINE__ << "\n"; \ assert(false); \ exit(0); \ } \ } while(false); #define throwV3DErrorHere(reason) throw V3D::Exception(__FILE__, __LINE__, reason) namespace V3D { struct Exception : public std::exception { Exception(char const * reason) : _reason(reason) { } Exception(std::string const& reason) : _reason(reason) { } Exception(char const * file, int line, char const * reason) { std::ostringstream os; os << file << ":" << line << ": " << reason; _reason = os.str(); } Exception(char const * file, int line, std::string const& reason) { std::ostringstream os; os << file << ":" << line << ": " << reason; _reason = os.str(); } virtual ~Exception() throw() { } virtual const char * what() const throw() { return _reason.c_str(); } protected: std::string _reason; }; // end struct Exception } // end namespace V3D #endif slowmovideo-0.5+git20180116/src/V3D/Base/v3d_imageprocessing.h0000664000000000000000000007107013151342440022172 0ustar rootroot// -*- C++ -*- #include "config.h" #ifndef V3D_IMAGEPROCESSING_H #define V3D_IMAGEPROCESSING_H #include "v3d_image.h" #include "Math/v3d_linear.h" namespace V3D { template Elem2 bilinearSample( const Image &im, Elem2 x, Elem2 y, int c=0 ) { int x0 = floor(x); int y0 = floor(y); Elem2 xf = x-x0; Elem2 yf = y-y0; return (1-yf)*(1-xf)*im(x0 ,y0 ,c) + (1-yf)*( xf)*im(x0+1,y0 ,c) + ( yf)*(1-xf)*im(x0 ,y0+1,c) + ( yf)*( xf)*im(x0+1,y0+1,c); } template Elem2 bilinearSampleBorder( const Image &im, Elem2 x, Elem2 y, int c=0, Elem2 border = 0 ) { int x0 = floor(x); int y0 = 
floor(y); Elem2 xf = x-x0; Elem2 yf = y-y0; if(x0>=0 && x0+1=0 && y0+1 inline void convolveImage( const Image &im, const Image &kernel, Image &out ) { // NOTE: This code could be much more optimized. verify(im.width() >= kernel.width(),"image must be larger than kernel"); verify(im.height() >= kernel.height(),"image must be larger than kernel"); verify(im.numChannels() == kernel.numChannels(),"number of channels must be the same"); if (out.width() != im.width() || out.height() != im.height() || out.numChannels() != im.numChannels()) out.resize(im.width(), im.height(), im.numChannels()); int xx0, xx1, yy0, yy1, kx, ky, kx0, ky0; Elem2 sum, denom; for (int ch = 0; ch < im.numChannels(); ++ch) { for(int y = 0; y < im.height(); ++y) { yy0 = y - kernel.height()/2; yy1 = yy0 + kernel.height(); yy0 = std::max(yy0, 0); yy1 = std::min(yy1, int(im.height())); ky0 = yy0 - y + kernel.height()/2; for(int x = 0; x < im.width(); ++x) { xx0 = x - kernel.width()/2; xx1 = xx0 + kernel.width(); xx0 = std::max(xx0, 0); xx1 = std::min(xx1, int(im.width())); kx0 = xx0 - x + kernel.width()/2; sum = 0; denom = 0; for(int yy = yy0, ky = ky0; yy < yy1; ++yy, ++ky) { for(int xx = xx0, kx = kx0; xx < xx1; ++xx, ++kx) { sum += im(xx, yy, ch) * kernel(kx, ky, ch); denom += kernel(kx, ky, ch); } } out(x, y, ch) = (Elem3)(sum / denom); } // end for (x) } // end for (y) } // end for (ch) } // end convolveImage() inline int choose(int n, int k) { if (k > n) return 0; if (k > n/2) k = n-k; // faster double accum = 1; for (int i = 1; i <= k; i++) accum = accum * (n-k+i) / i; return (int)(accum + 0.5); // avoid rounding error } template inline void boxFilterImage( const Image &im, int boxWidth, int boxHeight, Image &out, Image& temp) { int x,y; int halfw = boxWidth/2; int halfh = boxHeight/2; double sum,weight; temp.resize(im.width(),im.height()); out.resize(im.width(),im.height(),im.numChannels()); for(int i=0; i inline void boxFilterImage( const Image &im, int boxWidth, int boxHeight, Image &out) { Image temp; boxFilterImage(im, boxWidth, boxHeight, out, temp); } // Note: boxWidth and boxHeight must be odd template inline void boxFilterImage_fast(Image const& im, int boxWidth, int boxHeight, Image& out, Image& temp) { int const w = im.width(); int const h = im.height(); int const nChannels = im.numChannels(); int const W2 = boxWidth/2; int const H2 = boxHeight/2; Elem const WH = boxWidth*boxHeight; temp.resize(w, h, 1); out.resize(w, h, nChannels); std::vector row(w+boxWidth); for (int ch = 0; ch < nChannels; ++ch) { // Horizontal pass for (int y = 0; y < h; ++y) { for (int x = 0; x < W2; ++x) row[x] = im(0, y, ch); for (int x = 0; x < w; ++x) row[x+W2] = im(x, y, ch); for (int x = 0; x < W2; ++x) row[w+x] = im(w-1, y, ch); Elem sum = 0; for (int x = 0; x < boxWidth; ++x) sum += row[x]; for (int x = 0; x < w-1; ++x) { temp(x, y) = sum; sum -= row[x]; sum += row[x+boxWidth]; } temp(w-1, y) = sum; } // end for (y) // Vertical pass for (int x = 0; x < w; ++x) row[x] = (H2+1)*temp(x, 0); for (int dy = 1; dy <= H2; ++dy) for (int x = 0; x < w; ++x) row[x] += temp(x, dy); for (int y = 0; y < h; ++y) { int const Y0 = std::max(0, y-H2); int const Y1 = std::min(h-1, y+H2+1); for (int x = 0; x < w; ++x) { out(x, y, ch) = row[x] / WH; row[x] -= temp(x, Y0); row[x] += temp(x, Y1); } // end for (x) } // end for (y) } // end for (ch) } // end boxFilterImage_fast() template inline void boxFilterImage_fast(Image const& im, int boxWidth, int boxHeight, Image &out) { Image temp; boxFilterImage_fast(im, boxWidth, boxHeight, out, temp); } 
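// Usage sketch (added illustration, not part of the original V3D headers; the
// template signature is reconstructed by hand, since the archived text lost
// its angle brackets): boxFilterImage_fast() computes a normalized box (mean)
// filter like boxFilterImage() above, but with running sums, so each pixel is
// touched a constant number of times per pass instead of boxWidth*boxHeight
// times.  Borders are handled by replicating edge pixels, and the result is
// always divided by the full window area.  The helper name demoBoxFilter5x5
// is made up for illustration only.
template <typename Elem>
inline void demoBoxFilter5x5(Image<Elem> const& src, Image<Elem>& dst)
{
   // Both window dimensions must be odd so the window centers on a pixel;
   // a 5x5 call averages each pixel over a 25-sample neighborhood.
   boxFilterImage_fast(src, 5, 5, dst);
}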
template inline void boxFilterImage_fast(Image const &im1, Image const& im2, int boxWidth, int boxHeight, BinaryFunc op, Image& out, Image& temp, Image& temp2) { int const w = im1.width(); int const h = im2.height(); temp2.resize(w, h, im1.numChannels()); for (int ch = 0; ch < im1.numChannels(); ++ch) { for (int y = 0; y < h; ++y) for (int x = 0; x < w; ++x) temp2(x, y, ch) = op(im1(x, y, ch), im2(x, y, ch)); } boxFilterImage_fast(temp2, boxWidth, boxHeight, out, temp); } // end boxFilterImage_fast() template inline void boxFilterImage_fast(Image const& im1, Image const& im2, int boxWidth, int boxHeight, BinaryFunc op, Image& out) { Image temp, temp2; boxFilterImage_fast(im1, im2, boxWidth, boxHeight, op, out, temp, temp2); } template inline void binomialFilterImage( const Image &im, int Nx, int Ny, Image &out, Image& temp ) { // TODO: Use factored binomial kernels for better speed. Image kernelX(Nx+1,1,im.numChannels()); Image kernelY(1,Ny+1,im.numChannels()); int x,y,i; for(i=0; i inline void binomialFilterImage( const Image &im, int Nx, int Ny, Image &out) { Image temp; binomialFilterImage(im, Nx, Ny, out, temp); } template void meanFilterImage( const Image &in, int w, int h, Image &out, Elem bias, Image& temp) { boxFilterImage(in,w,h,out,temp); for(int i=0; i void meanFilterImage( const Image &in, int w, int h, Image &out, Elem bias = 0) { Image temp; meanFilterImage(in, w, h, out, bias, temp); } template void rankFilterImage( const Image &in, int w, int h, Image &out ) { out.resize(in.width(),in.height(),in.numChannels()); w = (w/2)*2+1; h = (h/2)*2+1; for(int i=0; i(in.width()-1,x+w/2); int yy0 = max(0,y-h/2); int yy1 = min(in.height()-1,y+h/2); out(x,y,i) = 0; for(int xx=xx0; xx<=xx1; xx++) { for(int yy=yy0; yy<=yy1; yy++) { if(in(x,y,i) > in(xx,yy,i)) out(x,y,i)++; } } out(x,y,i) = double(out(x,y,i)) * w*h / ((xx1-xx0+1)*(yy1-yy0+1)); } } } } template void resampleImage( const Image &im, Image &out ) { // TODO: Apply low-pass filter if downsampling. Image im2(im.width(),im.height(),im.numChannels()); binomialFilterImage(im,0,0,im2); verify(im.numChannels()==out.numChannels(),"number of channels must be the same"); for(int y=0; y void floodFill( Image &im, int x0, int y0, Elem val ) { Elem initval = im(x0,y0); if(initval==val) return; int imw = im.width(); int imh = im.height(); std::stack > flood; flood.push(std::pair(x0,y0)); while(!flood.empty()) { int x = flood.top().first; int y = flood.top().second; flood.pop(); //if(x>=0 && x=0 && y(x+1,y)); if(x>0 && im(x-1,y)==initval) flood.push(std::pair(x-1,y)); if(y(x,y+1)); if(y>0 && im(x,y-1)==initval) flood.push(std::pair(x,y-1)); //} } } //========================================================================= template class ImagePyramid { public: ImagePyramid() {} void resize( int w, int h, int channels, int levels ) { _levels.resize(levels); for(int i=0; i>i,h>>i,channels); } void generate( const Image &im, int levels ) { // Copy base level. _levels.resize(levels); _levels[0].copyFrom(im); // Build pyramid. 
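        // Each level i halves the resolution of level i-1: every output pixel
        // is the plain average of the corresponding 2x2 block of the finer
        // level, computed per channel.  The level sizes use integer right
        // shifts (width>>i, height>>i), so odd rows and columns are dropped.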
for(int i=1; i<_levels.size(); i++) { _levels[i].resize(im.width()>>i,im.height()>>i,im.numChannels()); for(int y=0; y<_levels[i].height(); y++) { for(int x=0; x<_levels[i].width(); x++) { for(int c=0; c<_levels[i].numChannels(); c++) { _levels[i](x,y,c) = (Elem) (((double)_levels[i-1](2*x+0,2*y+0,c) + (double)_levels[i-1](2*x+1,2*y+0,c) + (double)_levels[i-1](2*x+0,2*y+1,c) + (double)_levels[i-1](2*x+1,2*y+1,c)) / 4); } // channels } // x } // y } // levels } const Image &level( int l ) const { return _levels[l]; } Image &level( int l ) { return _levels[l]; } int numLevels() const { return _levels.size(); } private: vector > _levels; }; template Elem warpMipmapTrilinear( const ImagePyramid &in, const Matrix3x3d &H, int x, int y, int c = 1 ) // TODO: Make this function able to return values for all channels in one call. { float area,level; float mx0,my0,mx1,my1; int xx0,yy0,xx1,yy1,ll; float u0,v0,u1,v1,w; // TODO: Replace this code with derivative-based scale selection. // Warp pixel corners and pixel center. Vector3d m00 = H*Vector3d(x+0.0,y+0.0,1); m00 = 1.0/m00[2] * m00; Vector3d m01 = H*Vector3d(x+1.0,y+0.0,1); m01 = 1.0/m01[2] * m01; Vector3d m10 = H*Vector3d(x+0.0,y+1.0,1); m10 = 1.0/m10[2] * m10; Vector3d m11 = H*Vector3d(x+1.0,y+1.0,1); m11 = 1.0/m11[2] * m11; Vector3d m = H*Vector3d(x+0.5,y+0.5,1); m = 1.0/m[2] * m; // Measure area of projected pixel (bounding box). area = (max(max(m00[0],m01[0]),max(m10[0],m11[0])) - min(min(m00[0],m01[0]),min(m10[0],m11[0])))* (max(max(m00[1],m01[1]),max(m10[1],m11[1])) - min(min(m00[1],m01[1]),min(m10[1],m11[1]))); // Compute pyramid level. 2^(2*level)=area -> level=log2(area)/2 level = 0.5f*log(area)/log(2.0f); level = max(min(level,(float)in.numLevels()-2.0f),0.0f); // Find pixels in pyramid. ll = (int)level; mx0 = m[0]/(1<=in.level(ll).width()-1 || xx1>=in.level(ll+1).width()-1 || yy0>=in.level(ll).height()-1 || yy1>=in.level(ll+1).height()-1 || ll<0 || ll>=in.numLevels()-1) { return (float)(rand()%256); } // Compute trilinear coefficients. u0 = mx0-xx0; v0 = my0-yy0; u1 = mx1-xx1; v1 = my1-yy1; w = level-ll; // Compute trilinear interpolation. 
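    // The return statement below blends eight taps: a bilinear lookup in
    // pyramid level ll (weights u0, v0) and a bilinear lookup in level ll+1
    // (weights u1, v1), mixed with the fractional level weight w, i.e.
    //   result = (1-w) * bilinear(level ll,   mx0, my0)
    //          +    w  * bilinear(level ll+1, mx1, my1).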
//return mx1; //return in.level(ll)(xx0,yy0); //return in.level(0)[y][x]; return (1-u0)*(1-v0)*(1-w)*in.level(ll )(xx0 ,yy0 ,c) + //[yy0 ][xx0 ] + ( u0)*(1-v0)*(1-w)*in.level(ll )(xx0+1,yy0 ,c) + //[yy0 ][xx0+1] + (1-u0)*( v0)*(1-w)*in.level(ll )(xx0 ,yy0+1,c) + //[yy0+1][xx0 ] + ( u0)*( v0)*(1-w)*in.level(ll )(xx0+1,yy0+1,c) + //[yy0+1][xx0+1] + (1-u1)*(1-v1)*( w)*in.level(ll+1)(xx1 ,yy1 ,c) + //[yy1 ][xx1 ] + ( u1)*(1-v1)*( w)*in.level(ll+1)(xx1+1,yy1 ,c) + //[yy1 ][xx1+1] + (1-u1)*( v1)*( w)*in.level(ll+1)(xx1 ,yy1+1,c) + //[yy1+1][xx1 ] + ( u1)*( v1)*( w)*in.level(ll+1)(xx1+1,yy1+1,c); //[yy1+1][xx1+1]; } template Elem warpBilinear( const ImagePyramid &in, const Matrix3x3d &H, int x, int y, int c = 1 ) { Vector3d m = H*Vector3d(x+0.5,y+0.5,1); /*Vector3d m; m[0] = H[0][0]*x + H[0][1]*y + H[0][2]; m[1] = H[1][0]*x + H[1][1]*y + H[1][2]; m[2] = H[2][0]*x + H[2][1]*y + H[2][2];*/ m = 1.0/m[2] * m; m[0] -= 0.5; m[1] -= 0.5; int xx0 = (int)(floor(m[0])); int yy0 = (int)(floor(m[1])); double u0 = m[0]-xx0; double v0 = m[1]-yy0; if(xx0<0 || yy0<0 || xx0>=in.level(0).width()-1 || yy0>=in.level(0).height()-1) { return (float)(rand()%256); } return (1-u0)*(1-v0)*in.level(0)(xx0 ,yy0 ,c) + //[yy0 ][xx0 ] + ( u0)*(1-v0)*in.level(0)(xx0+1,yy0 ,c) + //[yy0 ][xx0+1] + (1-u0)*( v0)*in.level(0)(xx0 ,yy0+1,c) + //[yy0+1][xx0 ] + ( u0)*( v0)*in.level(0)(xx0+1,yy0+1,c); //[yy0+1][xx0+1] + } template void warpImageBilinear( Image &out, const ImagePyramid &in, const Matrix3x3d &H ) { for(int y=0; y void warpImageMipmapTrilinear( Image &out, const ImagePyramid &in, const Matrix3x3d &H ) { for(int y=0; y void convertRGBToGrayscale( const Image &rgb, Image &gray, double rf=0.3, double gf=0.59, double bf=0.11 ) { // Ensure output size. if(rgb.width()!=gray.width() || rgb.height()!=gray.height()) gray.resize(rgb.width(),rgb.height(),1); // Convert. 
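    // Per pixel this computes the weighted sum gray = rf*R + gf*G + bf*B.
    // The default weights (0.3, 0.59, 0.11) approximate the Rec. 601 luma
    // coefficients, giving the usual perceptual grayscale conversion.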
int x,y; for(y=0; y void convertRGBToRGBInterleaved( const Image &rgb, Elem *rgbI, int sizeBytes ) { verify(sizeBytes>=rgb.width()*rgb.height()*3*sizeof(Elem),"Buffer too small."); int x,y; for(y=0; y void convertToUchar( const Image &in, Image &out, double scale = 0.0, double bias = 0 ) { if(scale==0.0) { // auto scale Elem maxval = *std::max_element(in.begin(0),in.end(in.numChannels()-1)); scale = 255.0/(double)maxval; } if(in.width()!=out.width() || in.height()!=out.height() || in.numChannels()!=out.numChannels()) { out.resize(in.width(),in.height(),in.numChannels()); } int x,y,c; for(y=0; y void convertIndexedImage( const Image &in, const std::vector > &map, Image &out ) { if(in.width()!=out.width() || in.height()!=out.height() || channels!=out.numChannels()) { out.resize(in.width(),in.height(),channels); } int x,y,c; for(y=0; y const& src, Image& dst) { int const w = src.width(); int const h = src.height(); dst.resize(w, h, 3); Vector3f rgb, yuv; for (int y = 0; y < h; ++y) for (int x = 0; x < w; ++x) { rgb[0] = src(x, y, 0) / 255.0f; rgb[1] = src(x, y, 1) / 255.0f; rgb[2] = src(x, y, 2) / 255.0f; yuv = convertRGBPixelToYUV(rgb); dst(x, y, 0) = yuv[0]; dst(x, y, 1) = yuv[1]; dst(x, y, 2) = yuv[2]; } } // end convertRGBImageToYUV() // RGB values are expected to be in [0, 1], S and L results are in [0, 1] inline Vector3f convertRGBPixelToHSL(Vector3f const& rgb) { Vector3f hsl; float const r = rgb[0]; float const g = rgb[1]; float const b = rgb[2]; float h = 0.0f, s = 0.0f; float const minimum = std::min(std::min(r, g), b); float const maximum = std::max(std::max(r, g), b); float const delta = maximum - minimum; float const l = (minimum+maximum)/2; if (delta > 0) { s = (l <= 0.5f) ? (delta / (2*l)) : (delta / (2.0f - 2*l)); } if (delta > 0) { if (r == maximum) h = (g - b) / delta; else if (g == maximum) h = 2 + (b - r) / delta; else if (b == maximum) h = 4 + (r - g) / delta; } h *= 60; if (h < 0) h += 360; h /= 255.0f; hsl[0] = h; hsl[1] = s; hsl[2] = l; return hsl; } inline void convertRGBImageToHSL(Image const& src, Image& dst) { int const w = src.width(); int const h = src.height(); dst.resize(w, h, 3); Vector3f rgb, hsl; for (int y = 0; y < h; ++y) for (int x = 0; x < w; ++x) { rgb[0] = src(x, y, 0) / 255.0f; rgb[1] = src(x, y, 1) / 255.0f; rgb[2] = src(x, y, 2) / 255.0f; hsl = convertRGBPixelToHSL(rgb); dst(x, y, 0) = hsl[0]; dst(x, y, 1) = hsl[1]; dst(x, y, 2) = hsl[2]; } } // end convertRGBImageToHSL() // RGB values are expected to be in [0, 1] inline Vector3f convertRGBPixelTo_sRGB(Vector3f const& rgb) { float const th = 0.04045; Vector3f srgb; srgb[0] = (rgb[0] < th) ? (rgb[0] / 12.92f) : powf((rgb[0] + 0.055f) / 1.055f, 2.4f); srgb[1] = (rgb[1] < th) ? (rgb[1] / 12.92f) : powf((rgb[1] + 0.055f) / 1.055f, 2.4f); srgb[2] = (rgb[2] < th) ? 
(rgb[2] / 12.92f) : powf((rgb[2] + 0.055f) / 1.055f, 2.4f); return srgb; } inline Vector3f convert_sRGBPixelToXYZ(Vector3f const& srgb) { float const R = srgb[0], G = srgb[1], B = srgb[2]; Vector3f xyz; xyz[0] = (float) (R * 0.412424 + G * 0.357579 + B * 0.180464); xyz[1] = (float) (R * 0.212656 + G * 0.715158 + B * 0.072186); xyz[2] = (float) (R * 0.019332 + G * 0.119193 + B * 0.950444); return xyz; } // end convertRGBPixelToXYZ() // RGB values are expected to be in [0, 1] inline Vector3f convertRGBPixelToXYZ(Vector3f const& rgb) { Vector3f srgb = convertRGBPixelTo_sRGB(rgb); return convert_sRGBPixelToXYZ(srgb); } // end convertRGBPixelToXYZ() inline Vector3f convertXYZPixelToCIELab(Vector3f const& xyz) { float const one_over_3 = 1.0f/3.0f; float const c2 = 16.0f/116.0f; float const X = (xyz[0] > 0.0088565) ? pow(xyz[0], one_over_3) : (7.787*xyz[0] + c2); float const Y = (xyz[1] > 0.0088565) ? pow(xyz[1], one_over_3) : (7.787*xyz[1] + c2); float const Z = (xyz[2] > 0.0088565) ? pow(xyz[2], one_over_3) : (7.787*xyz[2] + c2); Vector3f lab; lab[0] = 116.0f*Y - 16.0f; lab[1] = 500.0f * (X - Y); lab[2] = 200.0f * (Y - Z); return lab; } // end convertXYZPixelToCIELab() //--------------------------------------------------------------------- // Image display function. template inline void showImage( const Image &im, double scale = 0.0 ) { #ifdef V3DLIB_ENABLE_IMDEBUG Image out; convertToUchar(im,out,scale); if(im.numChannels()==3) { unsigned char *rgb = new unsigned char[im.width()*im.height()*im.numChannels()]; convertRGBToRGBInterleaved(out,rgb,im.width()*im.height()*im.numChannels()); imdebug("rgb w=%d h=%d %p",im.width(),im.height(),rgb); delete[] rgb; } else if(im.numChannels()==1) { imdebug("lum w=%d h=%d %p",im.width(),im.height(),out.begin()); } else { verify(false,"Num channels must be 1 or 3"); } #endif } inline void showFloatImage( const Image &im ) { #ifdef V3DLIB_ENABLE_IMDEBUG if(im.numChannels()==1) { imdebug("lum *auto b=32f w=%d h=%d %p",im.width(),im.height(),im.begin()); } else { verify(false,"Num channels must be 1"); } #endif } } // namespace V3D #endif slowmovideo-0.5+git20180116/src/V3D/Base/v3d_utilities.h0000664000000000000000000000762113151342440021027 0ustar rootroot// -*- C++ -*- #include "config.h" // General utility procedures that do not fit anywhere else. #ifndef V3D_UTILITIES_H #define V3D_UTILITIES_H #include "Base/v3d_image.h" #include "Math/v3d_linear.h" #include #include #include #ifdef _WIN32 # define M_PI 3.14159265358979323846 #endif namespace V3D { //---------------------------------------------------------------------- inline Image makeColorWheelImage() { // relative lengths of color transitions: // these are chosen based on perceptual similarity // (e.g. one can distinguish more shades between red and yellow // than between yellow and green) #if 0 int const RY = 15; int const YG = 6; int const GC = 4; int const CB = 11; int const BM = 13; int const MR = 6; #else // Make a ramp of 64 colors (instead of 55). 
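 // The six transition lengths below (17+7+5+13+15+7) sum to 64 wheel entries;
 // each segment linearly ramps one RGB channel so the wheel runs
 // red -> yellow -> green -> cyan -> blue -> magenta -> back to red.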
int const RY = 17; int const YG = 7; int const GC = 5; int const CB = 13; int const BM = 15; int const MR = 7; #endif int const w = RY + YG + GC + CB + BM + MR; Image I(w, 1, 3); int x = 0; for (int i = 0; i < RY; ++i, ++x) { I(x, 0, 0) = 255; I(x, 0, 1) = 255*i/RY; I(x, 0, 2) = 0; } for (int i = 0; i < YG; ++i, ++x) { I(x, 0, 0) = 255-255*i/YG; I(x, 0, 1) = 255; I(x, 0, 2) = 0; } for (int i = 0; i < GC; ++i, ++x) { I(x, 0, 0) = 0; I(x, 0, 1) = 255; I(x, 0, 2) = 255*i/GC; } for (int i = 0; i < CB; ++i, ++x) { I(x, 0, 0) = 0; I(x, 0, 1) = 255-255*i/CB; I(x, 0, 2) = 255; } for (int i = 0; i < BM; ++i, ++x) { I(x, 0, 0) = 255*i/BM; I(x, 0, 1) = 0; I(x, 0, 2) = 255; } for (int i = 0; i < MR; ++i, ++x) { I(x, 0, 0) = 255; I(x, 0, 1) = 0; I(x, 0, 2) = 255-255*i/MR; } return I; } // end makeColorWheelImage() inline Vector3b getVisualColorForFlowVector(float u, float v, bool useSqrtMap = false) { using namespace std; static Image const wheel = makeColorWheelImage(); int const w = wheel.width(); float r = sqrtf(u*u + v*v); if (useSqrtMap) r = sqrtf(r); float const phi = atan2f(-v, -u) / M_PI; float const fk = (phi + 1.0) / 2.0 * w; int const k0 = (int)fk; int const k1 = (k0 + 1) % w; float const f = fk - k0; Vector3b res; for (int b = 0; b < 3; ++b) { float const col0 = float(wheel(k0, 0, b)); float const col1 = float(wheel(k1, 0, b)); float col = (1-f)*col0 + f*col1; if (r <= 1) col = 255.0f - r * (255.0f - col); // increase saturation with radius else col *= .75f; // out of range res[b] = (int)col; } // end for (b) return res; } // end getVisualColorForFlowVector() inline Image getVisualImageForFlowField(Image const& u, Image const& v, float scale, bool useSqrtMap = false) { int const w = u.width(); int const h = u.height(); Image res(w, h, 3); for (int y = 0; y < h; ++y) for (int x = 0; x < w; ++x) { Vector3b const c = getVisualColorForFlowVector(scale * u(x, y), scale * v(x, y), useSqrtMap); res(x, y, 0) = c[0]; res(x, y, 1) = c[1]; res(x, y, 2) = c[2]; } return res; } // end getVisualImageForFlowField() template inline void flipImageUpsideDown(Image& I) { int const w = I.width(); int const h = I.height(); int const nChannels = I.numChannels(); for (int c = 0; c < nChannels; ++c) for (int y = 0; y < h/2; ++y) { int const y1 = h - 1 - y; for (int x = 0; x < w; ++x) std::swap(I(x, y, c), I(x, y1, c)); } } // end flipImageUpsideDown() } // end namespace V3D #endif slowmovideo-0.5+git20180116/src/slowmoVideo/0000775000000000000000000000000013151342440017122 5ustar rootrootslowmovideo-0.5+git20180116/src/slowmoVideo/project/0000775000000000000000000000000013151342440020570 5ustar rootrootslowmovideo-0.5+git20180116/src/slowmoVideo/project/segmentList_sV.cpp0000664000000000000000000000237313151342440024247 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #include "segmentList_sV.h" #include #include SegmentList_sV::SegmentList_sV() { } void SegmentList_sV::unselectAll() { for (int i = 0; i < m_list.size(); i++) { m_list[i].select(false); Q_ASSERT(!m_list.at(i).selected()); } } void SegmentList_sV::grow() { m_list.append(Segment_sV(m_list.size())); unselectAll(); qSort(m_list); for (int i = 0; i < m_list.size(); i++) { qDebug() << "Segment " << i << ": " << toString(m_list.at(i)); } } void SegmentList_sV::shrink() { m_list.removeLast(); for (int i = 0; i < m_list.size(); i++) { qDebug() << "Segment " << i << ": " << toString(m_list.at(i)); } } int SegmentList_sV::size() const { return m_list.size(); } const Segment_sV& SegmentList_sV::at(int i) const { return m_list.at(i); } Segment_sV& SegmentList_sV::operator [](int i) { return m_list[i]; } slowmovideo-0.5+git20180116/src/slowmoVideo/project/abstractFrameSource_sV.cpp0000664000000000000000000000122713151342440025705 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "abstractFrameSource_sV.h" AbstractFrameSource_sV::AbstractFrameSource_sV(const Project_sV *project) : m_project(project) { } AbstractFrameSource_sV::~AbstractFrameSource_sV() { } double AbstractFrameSource_sV::maxTime() const throw(Div0Exception) { return (framesCount()-1)/fps()->fps(); } slowmovideo-0.5+git20180116/src/slowmoVideo/project/nodeList_sV.cpp0000664000000000000000000004547213151342440023541 0ustar rootroot/* slowmoUI is a user interface for slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #include "nodeList_sV.h" #include "node_sV.h" #include "../lib/bezierTools_sV.h" #include #include //#define DEBUG_NL #ifdef DEBUG_NL #include #endif NodeList_sV::NodeList_sV(float minDist) : m_maxY(10), m_list(), m_minDist(minDist) { } void NodeList_sV::setMaxY(qreal time) { Q_ASSERT(time > 0); m_maxY = time; } qreal NodeList_sV::startTime(bool useMoved) const { if (m_list.length() > 0) { if (useMoved) { return m_list[0].x(); } else { return m_list[0].xUnmoved(); } } else { // qDebug() << "No start time available (no nodes)"; return 0; } } qreal NodeList_sV::endTime(bool useMoved) const { if (m_list.length() > 0) { if (useMoved) { return m_list[m_list.length()-1].x(); } else { return m_list[m_list.length()-1].xUnmoved(); } } else { // qDebug() << "No end time available (no nodes)"; return 0; } } bool NodeList_sV::isInsideCurve(qreal targetTime, bool useMoved) const { return m_list.size() >= 2 && startTime(useMoved) <= targetTime && targetTime <= endTime(useMoved); } qreal NodeList_sV::totalTime() const { return endTime()-startTime(); } qreal NodeList_sV::sourceTime(qreal targetTime) const { qreal srcTime = -1; int index = find(targetTime); if (index >= 0) { if (m_list.size() > index+1) { if (m_list.at(index).rightCurveType() == CurveType_Bezier && m_list.at(index+1).leftCurveType() == CurveType_Bezier) { srcTime = BezierTools_sV::interpolateAtX(targetTime, m_list.at(index).toQPointF(), m_list.at(index).toQPointF()+m_list.at(index).rightNodeHandle(), m_list.at(index+1).toQPointF()+m_list.at(index+1).leftNodeHandle(), m_list.at(index+1).toQPointF()).y(); } else { float ratio = (targetTime-m_list[index].x())/(m_list[index+1].x()-m_list[index].x()); srcTime = m_list[index].y() + ratio*( m_list[index+1].y()-m_list[index].y() ); } } else { //TODO: if (index >= m_list.size()) { qDebug() << "index " << index << " is > list size: " << m_list.size(); //Q_ASSERT(false); } else { srcTime = m_list[index].y(); } } } else { // this seem because no project loaded ? // TODO: how can we check ? qDebug() << "No node before " << targetTime; //Q_ASSERT(false); if (m_list.size() > 0) { srcTime = m_list[0].y(); } } return srcTime; } bool NodeList_sV::add(Node_sV node) { bool add = true; #ifdef DEBUG_NL qDebug() << "Before adding: \n" << *this; #endif node.setX(qMax(.0, node.x())); node.setY(qMax(.0, qMin(m_maxY, node.y()))); int pos = find(node.x()); if (pos >= 0 && m_list.size() > pos) { add = fabs(node.x()-m_list.at(pos).x()) > m_minDist; #ifdef DEBUG_NL qDebug() << "Left distance is " << fabs(node.x()-m_list.at(pos).x()); #endif if (add && m_list.size() > pos+1) { add = fabs(node.x()-m_list.at(pos+1).x()) > m_minDist; #ifdef DEBUG_NL qDebug() << "Right distance is " << fabs(node.x()-m_list.at(pos+1).x()); #endif } } #ifdef DEBUG_NL qDebug() << "Adding? 
" << add; #endif if (add) { m_list.append(node); qSort(m_list); if (m_list.size() > 1) { m_segments.grow(); } // Reset curve type of neighbours if this is a linear node int index = m_list.indexOf(node); if (index > 0 && node.leftCurveType() == CurveType_Linear) { m_list[index-1].setRightCurveType(CurveType_Linear); } if (index < m_list.size()-1 && node.rightCurveType() == CurveType_Linear) { m_list[index+1].setLeftCurveType(CurveType_Linear); } fixHandles(index-1); fixHandles(index); } #ifdef DEBUG_NL qDebug() << "After adding: \n" << *this; #endif validate(); return add; } uint NodeList_sV::deleteSelected() { uint counter = 0; for (int i = 0; i < m_list.size(); ) { if (m_list.at(i).selected()) { m_list.removeOne(m_list.at(i)); if (m_list.size() > 0) { m_segments.shrink(); } counter++; } else { i++; } } validate(); return counter; } void NodeList_sV::deleteNode(int index) { Q_ASSERT(index >= 0); Q_ASSERT(index < m_list.size()); if (m_list.size() > 0) { if (m_list.size() > 1) { m_segments.shrink(); } m_list.removeAt(index); } if (index > m_list.size() && (index-1) >= 0) { if (m_list.at(index-1).rightCurveType() != m_list.at(index).leftCurveType()) { m_list[index-1].setRightCurveType(CurveType_Linear); m_list[index].setLeftCurveType(CurveType_Linear); } } validate(); } void NodeList_sV::select(const Node_sV *node, bool newSelection) { if (newSelection) { unselectAll(); const_cast(node)->select(true); } else { const_cast(node)->select(!node->selected()); } } void NodeList_sV::unselectAll() { for (int i = 0; i < m_list.size(); i++) { m_list[i].select(false); } } bool NodeList_sV::validate() const { bool valid = true; qreal last = -m_minDist; for (int i = 0; i < m_list.size() && valid; i++) { valid = m_list.at(i).x() >= 0 && m_list.at(i).y() >= 0 && m_list.at(i).x() - last >= m_minDist && m_list.at(i).y() <= m_maxY; if (!valid) { qDebug() << "Invalid node position for node " << i << " (" << m_list.size() << " total); Distance is " << m_list.at(i).x() - last; qDebug() << "Positions: " << last << "/" << m_list.at(i).x(); Q_ASSERT(false); break; } last = m_list.at(i).x(); } if (valid) { for (int i = 1; i < m_list.size(); i++) { float space = (m_list.at(i).x() + m_list.at(i).leftNodeHandle().x()) - (m_list.at(i-1).x() + m_list.at(i-1).rightNodeHandle().x()); valid = space >= 0; if (!valid) { qDebug() << "Invalid handle position for nodes " << i-1 << " and " << i; qDebug() << "Positions: " << m_list.at(i-1) << " with handle " << toString(m_list.at(i-1).rightNodeHandle()) << ", " << m_list.at(i) << " with handle " << toString(m_list.at(i).leftNodeHandle()) << ", space: " << space; Q_ASSERT(false); break; } } } if (valid) { Q_ASSERT( (m_list.size() == 0 && m_segments.size() == 0) || (m_list.size() > 0 && m_segments.size() == m_list.size()-1) ); } return valid; } ////////// Moving void NodeList_sV::moveSelected(const Node_sV &time,bool snap) { qreal maxRMove = 100000; qreal maxLMove = -100000; qreal maxUMove = 100000; qreal maxDMove = -100000; const Node_sV *left = NULL; const Node_sV *right; for (int i = 0; i < m_list.size(); i++) { right = &m_list.at(i); /* Get the maximum allowed horizontal movement distance here such that there is no overlapping. For moving the selected nodes to the left, only unselected nodes which are directly followed by a selected node need to be taken into account. 
O----O / \ x -----x \ / x-----------O min( ^1^, ^-----2-----^ ) + minDist */ if (left != NULL) { if (left->selected() && !right->selected()) { // Move-right distance maxRMove = qMin(maxRMove, right->xUnmoved()+right->leftNodeHandle().x() - (left->xUnmoved()+left->rightNodeHandle().x()) - m_minDist); } else if (!left->selected() && right->selected()) { // Move-left distance maxLMove = qMax(maxLMove, left->xUnmoved()+left->rightNodeHandle().x() - (right->xUnmoved()+right->leftNodeHandle().x()) + m_minDist); } } if (right->selected()) { maxDMove = qMax(maxDMove, -right->yUnmoved()); maxUMove = qMin(maxUMove, m_maxY-right->yUnmoved()); } left = right; } if (m_list.size() > 0 && m_list.at(0).selected()) { // Do not allow to move nodes to x < 0 maxLMove = qMax(maxLMove, -m_list.at(0).xUnmoved()); } #ifdef DEBUG_NL qDebug() << "Max move: left " << maxLMove << ", right: " << maxRMove; #endif Node_sV newTime( qMax(maxLMove, qMin(maxRMove, time.x())), qMax(maxDMove, qMin(maxUMove, time.y())) ); for (int i = 0; i < m_list.size(); i++) { if (m_list.at(i).selected()) { m_list[i].move(newTime); } } } void NodeList_sV::shift(qreal after, qreal by) { int pos = nodeAfter(after); if (pos >= 0) { if (pos > 0) { // []----o o----[]--- <- nodes with handles // <---> <- maximum distance by = qMax(by, m_list.at(pos-1).xUnmoved()+m_list.at(pos-1).rightNodeHandle().x() - (m_list.at(pos).xUnmoved()+m_list.at(pos).leftNodeHandle().x()) + m_minDist ); } if (pos == 0) { by = qMax(by, -m_list.at(pos).xUnmoved()); } for (; pos < m_list.size(); pos++) { m_list[pos].move(Node_sV(by, 0)); } } if (!validate()) { qDebug() << "Invalid node configuration! (This should not happen.)"; } } void NodeList_sV::confirmMove() { for (int i = 0; i < m_list.size(); i++) { m_list[i].confirmMove(); } validate(); } void NodeList_sV::abortMove() { for (int i = 0; i < m_list.size(); i++) { if (m_list.at(i).selected()) { m_list[i].abortMove(); } } } void NodeList_sV::moveHandle(const NodeHandle_sV *handle, Node_sV relPos) { Node_sV otherNode; Node_sV *currentNode = const_cast(handle->parentNode()); int nodeIndex = indexOf(handle->parentNode()); Q_ASSERT(nodeIndex >= 0); Q_ASSERT(nodeIndex < m_list.size()); if (handle == ¤tNode->leftNodeHandle()) { // o------[] if (nodeIndex > 0) { // Ensure that it does not overlap with the left node's handle (injectivity) otherNode = m_list.at(nodeIndex-1); qDebug() << "Left node: " << otherNode; qDebug() << "Right node: " << currentNode; qDebug() << "Before overlapping check: " << relPos; relPos.setX(qMax(relPos.x(), -(currentNode->x() - otherNode.x() - otherNode.rightNodeHandle().x()))); qDebug() << "After overlapping check: " << relPos; qDebug() << "Space left: " << currentNode->x() + relPos.x() - (otherNode.x() + otherNode.rightNodeHandle().x()); } // Additionally the handle has to stay on the left of its node relPos.setX(qMin(relPos.x(), .0)); currentNode->setLeftNodeHandle(relPos.x(), relPos.y()); } else { // []-------o if (nodeIndex+1 < m_list.size()) { otherNode = m_list.at(nodeIndex+1); relPos.setX(qMin(relPos.x(), otherNode.x() - currentNode->x() + otherNode.leftNodeHandle().x())); } relPos.setX(qMax(relPos.x(), .0)); currentNode->setRightNodeHandle(relPos.x(), relPos.y()); } validate(); } ////////// Curve void NodeList_sV::setCurveType(qreal segmentTime, CurveType type) { int left, right; findBySegment(segmentTime, left, right); #ifdef DEBUG_NL qDebug() << "Setting curve type for nodes " << left << " and " << right; #endif if (left != -1) { m_list[left].setRightCurveType(type); } if (right != -1) { 
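        // Mirror the curve type onto the left handle of the right-hand node
        // so both ends of the segment agree.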
m_list[right].setLeftCurveType(type); } } void NodeList_sV::fixHandles(int leftIndex) { if (leftIndex >= 0 && (leftIndex+1) < m_list.size()) { qreal right = m_list.at(leftIndex+1).x() - m_list.at(leftIndex).x(); qreal leftHandle = m_list.at(leftIndex).rightNodeHandle().x(); qreal rightHandle = m_list.at(leftIndex+1).leftNodeHandle().x(); if (leftHandle < 0) { leftHandle = 0; } if (rightHandle > 0) { rightHandle = 0; } if (leftHandle > right+rightHandle && (leftHandle-rightHandle) > 0) { qreal factor = right / (leftHandle - rightHandle); qDebug() << "Factor: " << factor << ", left: " << leftHandle << ", right: " << rightHandle << ", distance: " << right; leftHandle *= factor; rightHandle *= factor; qDebug() << "After scaling: left: " << leftHandle << ", right: " << rightHandle; Q_ASSERT(leftHandle <= right+rightHandle); } m_list[leftIndex].setRightNodeHandle(leftHandle, m_list.at(leftIndex).rightNodeHandle().y()); m_list[leftIndex+1].setLeftNodeHandle(rightHandle, m_list.at(leftIndex+1).leftNodeHandle().y()); } } /** * on error return int indicating error type * hint : maybe add error method ? */ int NodeList_sV::setSpeed(qreal segmentTime, qreal speed) { int error = 0; int left, right; findBySegment(segmentTime, left, right); if (left >= 0 && right >= 0) { Node_sV *leftN = &m_list[left]; Node_sV *rightN = &m_list[right]; qreal y = leftN->y() + speed*(rightN->x()-leftN->x()); if (y > m_maxY || y < 0) { if (y > m_maxY) { qDebug() << speed << "x speed would shoot over maximum time. Correcting."; error = -1; y = m_maxY; } else { qDebug() << speed << "x speed goes below 0. Correcting."; error = -2; y = 0; } qreal xNew = leftN->x() + (y - leftN->y())/speed; rightN->setY(y); if (xNew - leftN->x() >= m_minDist) { add(Node_sV(xNew, y)); } else { qDebug() << "New node would be too close, not adding it."; error = -3; } } else { rightN->setY(y); } } else { qDebug() << "Outside segment."; error = -4; } validate(); return error; } ////////// Access int NodeList_sV::indexOf(const Node_sV *node) const { return m_list.indexOf(*node); } int NodeList_sV::find(qreal time) const { int pos; for ( pos = 0; m_list.size() > (pos+1) && m_list.at(pos+1).x() <= time; pos++ ) {} if (m_list.size() == 0 || (pos == 0 && time < m_list[0].x())) { #ifdef DEBUG_NL if (m_list.size() > 0) { std::cout.precision(30); std::cout << "find(): time: " << time << ", left boundary: " << m_list[0].x() << ", unmoved: " << m_list[0].xUnmoved() << ", diff: " << m_list[pos].x()-time << std::endl; } #endif pos = -1; } return pos; } int NodeList_sV::find(QPointF pos, qreal tdelta) const { for (int i = 0; i < m_list.size(); i++) { if (std::pow(m_list.at(i).xUnmoved() - pos.x(), 2) + std::pow(m_list.at(i).yUnmoved()-pos.y(), 2) < std::pow(tdelta, 2)) { return i; } } return -1; } void NodeList_sV::findBySegment(qreal tx, int &leftIndex_out, int &rightIndex_out) const { for (int i = 0; i < m_list.size(); i++) { leftIndex_out = i-1; rightIndex_out = i; if (m_list.at(i).xUnmoved() > tx) { break; } if (i == m_list.size()-1) { leftIndex_out = i; rightIndex_out = -1; } } } QList NodeList_sV::objectsNear(QPointF pos, qreal tmaxdist) const { qreal maxdist2 = std::pow(tmaxdist, 2); QList objects; qreal dist; for (int i = 0; i < m_list.size(); i++) { dist = dist2(m_list.at(i).toQPointF() - pos); if (dist <= maxdist2) { objects << PointerWithDistance(&m_list[i], dist, PointerWithDistance::Node); } if (m_list.at(i).leftCurveType() != CurveType_Linear) { dist = dist2(m_list.at(i).toQPointF() + m_list.at(i).leftNodeHandle() - pos); if (dist <= maxdist2) { 
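                // The left handle of node i lies within the search radius;
                // report it as a Handle hit.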
objects << PointerWithDistance(&m_list[i].leftNodeHandle(), dist, PointerWithDistance::Handle); } } if (m_list.at(i).rightCurveType() != CurveType_Linear) { dist = dist2(m_list.at(i).toQPointF() + m_list.at(i).rightNodeHandle() - pos); if (dist <= maxdist2) { objects << PointerWithDistance(&m_list[i].rightNodeHandle(), dist, PointerWithDistance::Handle); } } if (i > 0) { if (m_list.at(i-1).x() < pos.x() && m_list.at(i).x() > pos.x()) { objects << PointerWithDistance(&m_segments.at(i-1), std::pow(sourceTime(pos.x()) - pos.y(), 2), PointerWithDistance::Segment); } } } qSort(objects); return objects; } qreal NodeList_sV::dist2(QPointF point) const { return std::pow(point.x(), 2) + std::pow(point.y(), 2); } int NodeList_sV::nodeAfter(qreal time) const { int pos = 0; while (m_list.size() > pos) { if (m_list.at(pos).xUnmoved() >= time) { break; } pos++; } if (pos >= m_list.size()) { pos = -1; } Q_ASSERT(pos < 0 || m_list.at(pos).xUnmoved() >= time); return pos; } const Node_sV& NodeList_sV::at(int i) const { return m_list.at(i); } Node_sV& NodeList_sV::operator[](int i) { return m_list[i]; } int NodeList_sV::size() const { return m_list.size(); } SegmentList_sV* NodeList_sV::segments() { return &m_segments; } ////////// Debug QDebug operator<<(QDebug dbg, const NodeList_sV &list) { for (int i = 0; i < list.size(); i++) { dbg.nospace() << i << ": " << list.at(i) << " "; } return dbg.maybeSpace(); } slowmovideo-0.5+git20180116/src/slowmoVideo/project/projectPreferences_sV.cpp0000664000000000000000000000540413151342440025577 0ustar rootroot#include "projectPreferences_sV.h" #include #include #include "config.h" ProjectPreferences_sV::ProjectPreferences_sV() : m_tagAxis(TagAxis_Source), m_viewport_t0(0, 0), m_viewport_secRes(50, 50), m_canvas_xAxisFPS(24), m_renderSectionMode("full"), m_renderFrameSize(FrameSize_Orig), m_renderInterpolationType(InterpolationType_TwowayNew), m_motionblurType(MotionblurType_Convolving), m_renderFPS(24), m_imagesOutputDir(QDir::homePath()), m_imagesFilenamePattern("rendered-%1.jpg"), m_flowV3DLambda(20.0) { #if QT_VERSION < QT_VERSION_CHECK(5, 0, 0) #ifdef USE_QTKIT m_videoFilename = (QDesktopServices::storageLocation(QDesktopServices::MoviesLocation)+"/rendered.mov"); #else m_videoFilename = (QDesktopServices::storageLocation(QDesktopServices::MoviesLocation)+"/rendered.mp4"); #endif #else // deprecated in qt5 ? 
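// Qt 5 removed QDesktopServices::storageLocation(); the replacement is
// QStandardPaths::writableLocation(), used below.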
#ifdef USE_QTKIT m_videoFilename = (QStandardPaths::writableLocation(QStandardPaths::MoviesLocation)+"/rendered.mov"); #else m_videoFilename = (QStandardPaths::writableLocation(QStandardPaths::MoviesLocation)+"/rendered.mp4"); #endif #endif } TagAxis& ProjectPreferences_sV::lastSelectedTagAxis() { return m_tagAxis; } QPointF& ProjectPreferences_sV::viewport_secRes() { return m_viewport_secRes; } QPointF& ProjectPreferences_sV::viewport_t0() { return m_viewport_t0; } Fps_sV& ProjectPreferences_sV::canvas_xAxisFPS() { return m_canvas_xAxisFPS; } FrameSize& ProjectPreferences_sV::renderFrameSize() { return m_renderFrameSize; } InterpolationType& ProjectPreferences_sV::renderInterpolationType() { return m_renderInterpolationType; } MotionblurType& ProjectPreferences_sV::renderMotionblurType() { return m_motionblurType; } QString& ProjectPreferences_sV::renderSectionMode() { return m_renderSectionMode; } QString& ProjectPreferences_sV::renderStartTag() { return m_renderStartTag; } QString& ProjectPreferences_sV::renderEndTag() { return m_renderEndTag; } QString& ProjectPreferences_sV::renderStartTime() { return m_renderStartTime; } QString& ProjectPreferences_sV::renderEndTime() { return m_renderEndTime; } Fps_sV& ProjectPreferences_sV::renderFPS() { return m_renderFPS; } QString& ProjectPreferences_sV::renderTarget() { return m_renderTarget; } QString& ProjectPreferences_sV::imagesOutputDir() { return m_imagesOutputDir; } QString& ProjectPreferences_sV::imagesFilenamePattern() { return m_imagesFilenamePattern; } QString& ProjectPreferences_sV::videoFilename() { qDebug() << "filename is : " << m_videoFilename; return m_videoFilename; } QString& ProjectPreferences_sV::videoCodec() { return m_vcodec; } float& ProjectPreferences_sV::flowV3DLambda() { return m_flowV3DLambda; } bool& ProjectPreferences_sV::renderFormat() { return m_renderFormat;}; slowmovideo-0.5+git20180116/src/slowmoVideo/project/xmlProjectRW_sV.h0000664000000000000000000000373513151342440024021 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef XMLPROJECTRW_SV_H #define XMLPROJECTRW_SV_H #include "../lib/defs_sV.hpp" #include "tag_sV.h" #include class Node_sV; class Project_sV; class AbstractFrameSource_sV; // Remember to change the slowmoVideo version as well #define SLOWMOPROJECT_VERSION_MAJOR 2 #define SLOWMOPROJECT_VERSION_MINOR 7 /** \brief Reads and writes project files in XML format Additional stored parameters require a minor version change. Big changes in the schema, like elements moved to a different node, require a major version change and a function that loads old project files. Version changes (both major and minor) require a micro slowmoVideo version change. */ class XmlProjectRW_sV { public: /** \fn loadProject() Reads an XML project file. \param filename Project file to load \param warning Information message when trying to load project files with a wrong version number \return NULL if an error ocurred. */ /** \fn saveProject() Saves a project to an XML project file. 
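        \param project Project to save
        \param filename Target project file. Throws Error_sV if the file cannot
        be opened for writing; returns 0 on success.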
*/ static Project_sV* loadProject(QString filename, QString *warning = NULL) throw(FrameSourceError, Error_sV); static int saveProject(Project_sV *project, QString filename) throw(Error_sV); private: static const QDomElement nodeToDom(QDomDocument *doc, const Node_sV *node); static const QDomElement tagToDom(QDomDocument *doc, const Tag_sV &tag); static const QDomElement frameSource(QDomDocument *doc, const AbstractFrameSource_sV *frameSource); static void loadFrameSource(QXmlStreamReader *reader, Project_sV *project) throw(FrameSourceError); }; #endif // XMLPROJECTRW_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/motionBlur_sV.cpp0000664000000000000000000002342413151342440024103 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "motionBlur_sV.h" #include "project_sV.h" #include "abstractFrameSource_sV.h" #include "abstractFlowSource_sV.h" #include "interpolator_sV.h" #include "renderTask_sV.h" #include "../lib/flowField_sV.h" #include "../lib/shutter_sV.h" #define MAX_CONV_FRAMES 5 //#define DEBUG MotionBlur_sV::MotionBlur_sV(Project_sV *project) : m_project(project), m_slowmoSamples(16), m_maxSamples(64), m_slowmoMaxFrameDist(.5) { createDirectories(); } QImage MotionBlur_sV::blur(float startFrame, float endFrame, float replaySpeed, RenderPreferences_sV prefs) throw(RangeTooSmallError_sV) { if (prefs.motionblur == MotionblurType_Nearest) { return nearest(startFrame, prefs); } else if (prefs.motionblur == MotionblurType_Convolving) { return convolutionBlur(startFrame, endFrame, replaySpeed, prefs); } else { if (replaySpeed > 0.5) { return fastBlur(startFrame, endFrame, prefs); } else { return slowmoBlur(startFrame, endFrame, prefs); } } } QImage MotionBlur_sV::nearest(float startFrame, const RenderPreferences_sV &prefs) { return m_project->frameSource()->frameAt(startFrame, prefs.size); } QImage MotionBlur_sV::fastBlur(float startFrame, float endFrame, const RenderPreferences_sV &prefs) throw(RangeTooSmallError_sV) { float low, high; if (startFrame < endFrame) { low = startFrame; high = endFrame; } else { low = endFrame; high = startFrame; } low = qMax(low, float(0)); high = qMin(high, (float)m_project->frameSource()->framesCount()-1); float dist = 0.125; float lowRounded; float highRounded; for (; ; dist *= 2) { lowRounded = ceil(low/dist)*dist; highRounded = floor(high/dist)*dist; if ((highRounded-lowRounded)/dist <= m_maxSamples) { break; } } float pos = lowRounded; QStringList frameList; while (pos < high) { frameList << cachedFramePath(pos, prefs); pos += dist; } qDebug() << "Fast blurring " << frameList.size() << " frames from " << startFrame << " to " << endFrame << ", low: " << low << ", high: " << high << ", with a distance of " << dist; if (frameList.size() > 1) { return Shutter_sV::combine(frameList); } else { throw RangeTooSmallError_sV(QObject::tr("Range too small: Start frame is %1, end frame is %2. 
" "Using normal interpolation.").arg(startFrame).arg(endFrame)); } } /// \todo fixed distance as additional option QImage MotionBlur_sV::slowmoBlur(float startFrame, float endFrame, const RenderPreferences_sV &prefs) { float low, high; if (startFrame < endFrame) { low = startFrame; high = endFrame; } else { low = endFrame; high = startFrame; } low = qMax(low, float(0)); high = qMin(high, (float)m_project->frameSource()->framesCount()); QStringList frameList; float increment = (high-low)/(m_slowmoSamples-1); if (increment < .1) { qDebug() << "slowmoBlur(): Increasing distance from " << increment << " to .1"; increment = .1; } if (increment > m_slowmoMaxFrameDist) { qDebug() << "slowmoBlur(): Decreasing distance from " << increment << " to " << m_slowmoMaxFrameDist; increment = m_slowmoMaxFrameDist; } for (float pos = low; pos <= high; pos += increment) { frameList << cachedFramePath(pos, prefs, true); } return Shutter_sV::combine(frameList); } QImage MotionBlur_sV::convolutionBlur(float startFrame, float endFrame, float replaySpeed, const RenderPreferences_sV &prefs) { float low, high; if (startFrame < endFrame) { low = startFrame; high = endFrame; } else { low = endFrame; high = startFrame; } low = qMax(low, float(0)); high = qMin(high, (float)m_project->frameSource()->framesCount()-1); if (floor(low) == floor(high) && low > .01) { if (floor(low) < m_project->frameSource()->framesCount()-1) { qDebug() << "Small shutter." << startFrame << endFrame; FlowField_sV *field = m_project->requestFlow(floor(low), floor(low)+1, prefs.size); QImage blur = Shutter_sV::convolutionBlur(Interpolator_sV::interpolate(m_project, startFrame, prefs), field, high-low, low-floor(low)); delete field; return blur; } else { /// \todo Convolve last frame as well qDebug() << "No shutter, at the last frame."; return m_project->frameSource()->frameAt(floor(low), prefs.size); } } QList images; FlowField_sV *field; int start = floor(low); int end = std::min((int64_t)ceil(high), m_project->frameSource()->framesCount()-2); int inc = 1; qDebug() << "Large shutter." 
<< startFrame << endFrame << " -- replay speed is " << replaySpeed; qDebug() << "Additional parts: " << start << end; if (replaySpeed < 2) { // \todo Lower weight for those frames when combining if (low-start > .1) { qDebug() << "First part: " << start << low; field = m_project->requestFlow(start, start+1, prefs.size); images << Shutter_sV::convolutionBlur(Interpolator_sV::interpolate(m_project, startFrame, prefs), field, floor(low)+1 - low, low-floor(low)); delete field; start++; } if (end-high > .1) { qDebug() << "Last part: " << end-1 << high; field = m_project->requestFlow(end-1, end, prefs.size); images << Shutter_sV::convolutionBlur(m_project->frameSource()->frameAt(end-1, prefs.size), field, 1 + high-end); delete field; end--; } } else { // \todo Check increment value while ((end-start) / inc > MAX_CONV_FRAMES) { if (inc == 1) { inc = 2; } else { inc += 2; } } start = start - start%inc; end = end - end%inc; qDebug() << "Parts scaled to " << start << end << " with increment " << inc; } for (int f = start; f <= end; f += inc) { QString name = QString("%1/convolved-%2+%3.png").arg(cacheDir(prefs.size).absolutePath()).arg(f).arg(inc); if (replaySpeed < 2) { if (QFileInfo(name).exists()) { qDebug() << "Using convolved image from cache: " << name; images << QImage(name); continue; } } field = m_project->requestFlow(f, f+1, prefs.size); images << Shutter_sV::convolutionBlur(m_project->frameSource()->frameAt(f, prefs.size), field, inc); if (replaySpeed < 2) { qDebug() << "Caching convolved image: " << name; images.last().save(name); } delete field; } #ifdef DEBUG for (int i = 0; i < images.size(); i++) { images.at(i).save(QString("/tmp/mblur-%1.png").arg(i)); } #endif return Shutter_sV::combine(images); } QDir MotionBlur_sV::cacheDir(FrameSize size) const { switch (size) { case FrameSize_Small: return m_dirCacheSmall; case FrameSize_Orig: return m_dirCacheOrig; default: qDebug() << "Unknown frame size in MotionBlur_sV: " << toString(size); Q_ASSERT(false); return QDir(); } } QString MotionBlur_sV::cachedFramePath(float framePos, const RenderPreferences_sV &prefs, bool highPrecision) { int precision = 2; if (highPrecision) { precision = 2; } /// \todo check precision int width = 5+1 + precision; QString name = QString("%2/cached%1.png").arg(framePos, width, 'f', precision, '0'); if (prefs.size == FrameSize_Small) { name = name.arg(m_dirCacheSmall.absolutePath()); } else if (prefs.size == FrameSize_Orig) { name = name.arg(m_dirCacheOrig.absolutePath()); } else { qDebug() << "MotionBlur: Frame size " << toString(prefs.size) << " given, not supported!"; Q_ASSERT(false); } if (fabs(framePos-int(framePos)) < MOTIONBLUR_PRECISION_LIMIT) { name = m_project->frameSource()->framePath(uint(framePos), prefs.size); } else { if (!QFileInfo(name).exists()) { qDebug() << name << " does not exist yet. Interpolating and saving to cache."; QImage frm = Interpolator_sV::interpolate(m_project, framePos, prefs); frm.save(name); } } return name; } void MotionBlur_sV::slotUpdateProjectDir() { //TODO: check if really needed ? 
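    // createDirectories() below re-resolves the motion-blur cache folders
    // against the project's (possibly changed) directory.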
//m_dirCacheSmall.rmdir("."); //m_dirCacheOrig.rmdir("."); createDirectories(); } void MotionBlur_sV::createDirectories() { m_dirCacheSmall = m_project->getDirectory("cache/motionBlurSmall"); m_dirCacheOrig = m_project->getDirectory("cache/motionBlurOrig"); } void MotionBlur_sV::setSlowmoSamples(int slowmoSamples) { m_slowmoSamples = slowmoSamples; Q_ASSERT(m_slowmoSamples > 0); } void MotionBlur_sV::setMaxSamples(int maxSamples) { m_maxSamples = maxSamples; Q_ASSERT(m_maxSamples > 0); } slowmovideo-0.5+git20180116/src/slowmoVideo/project/new_videoRenderTarget.cpp0000664000000000000000000000323313151342440025563 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2014 Valery Brasseur This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ // Against the «UINT64_C not declared» message. // See: http://code.google.com/p/ffmpegsource/issues/detail?id=11 #ifdef __cplusplus #define __STDC_CONSTANT_MACROS #ifdef _STDINT_H #undef _STDINT_H #endif # include #endif #include "new_videoRenderTarget.h" #include "renderTask_sV.h" #include #include "../lib/video_enc.h" newVideoRenderTarget::newVideoRenderTarget(RenderTask_sV *parentRenderTask) : AbstractRenderTarget_sV(parentRenderTask) { } newVideoRenderTarget::~newVideoRenderTarget() { } void newVideoRenderTarget::setTargetFile(const QString &filename) { m_filename = filename; } void newVideoRenderTarget::setVcodec(const QString &codec) { m_vcodec = codec; } void newVideoRenderTarget::openRenderTarget() throw(Error_sV) { writer = CreateVideoWriter(m_filename.toStdString().c_str(), renderTask()->resolution().width(), renderTask()->resolution().height(), renderTask()->fps().fps(),1, m_vcodec.toStdString().c_str()); if (writer == 0) { throw Error_sV(QObject::tr("Video could not be prepared .\n")); } } void newVideoRenderTarget::slotConsumeFrame(const QImage &image, const int frameNumber) { WriteFrame(writer, image); } void newVideoRenderTarget::closeRenderTarget() throw(Error_sV) { ReleaseVideoWriter( &writer ); } slowmovideo-0.5+git20180116/src/slowmoVideo/project/imagesFrameSource_sV.cpp0000664000000000000000000001101313151342440025341 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #include "imagesFrameSource_sV.h" #include "project_sV.h" #include #include #include ImagesFrameSource_sV::ImagesFrameSource_sV(Project_sV *project, QStringList images) throw(FrameSourceError) : AbstractFrameSource_sV(project), m_fps(24, 1), m_initialized(false), m_stopInitialization(false), m_nextFrame(0) { QString msg = validateImages(images); if (msg.length() > 0) { throw FrameSourceError("Image frame source: " + msg); } m_imagesList.append(images); m_imagesList.sort(); QImage repImage(m_imagesList.at(0)); if (repImage.isNull()) { qDebug() << "Image is null: " << m_imagesList.at(0); qDebug() << "Supported image formats: " << QImageReader::supportedImageFormats(); throw FrameSourceError(QString("Cannot read image: %1").arg(m_imagesList.at(0))); } m_sizeSmall = repImage.size(); if (m_sizeSmall.isEmpty()) { throw FrameSourceError(QString("Image read from %1 is empty.").arg(m_imagesList.at(0))); } while (m_sizeSmall.width() > 600) { m_sizeSmall = m_sizeSmall/2; } createDirectories(); } QString ImagesFrameSource_sV::validateImages(const QStringList images) { if (images.size() == 0) { return tr("No images selected."); } QSize size = QImage(images.at(0)).size(); for (int i = 0; i < images.length(); i++) { if (QImage(images.at(i)).size() != size) { return tr("Image %1 is not of the same size (%2) as the first image (%3).") .arg(images.at(i)).arg(toString(QImage(images.at(i)).size())).arg(toString(size)); } } return QString(); } const QStringList ImagesFrameSource_sV::inputFiles() const { return m_imagesList; } void ImagesFrameSource_sV::slotUpdateProjectDir() { //TODO: really needed ? //m_dirImagesSmall.rmdir("."); createDirectories(); } void ImagesFrameSource_sV::createDirectories() { m_dirImagesSmall = m_project->getDirectory("frames/imagesSmall"); } void ImagesFrameSource_sV::initialize() { m_stopInitialization = false; QMetaObject::invokeMethod(this, "slotContinueInitialization", Qt::QueuedConnection); } bool ImagesFrameSource_sV::initialized() const { return m_initialized; } /// \todo What if image missing? 
void ImagesFrameSource_sV::slotContinueInitialization() { emit signalNextTask(tr("Creating preview images from the input images"), m_imagesList.size()); for (; m_nextFrame < m_imagesList.size(); m_nextFrame++) { QString outputFile(framePath(m_nextFrame, FrameSize_Small)); if (QFile(outputFile).exists()) { emit signalTaskItemDescription(tr("Resized image already exists for %1").arg(QFileInfo(m_imagesList.at(m_nextFrame)).fileName())); } else { emit signalTaskItemDescription(tr("Re-sizing image %1 to:\n%2") .arg(QFileInfo(m_imagesList.at(m_nextFrame)).fileName()) .arg(outputFile)); QImage small = QImage(m_imagesList.at(m_nextFrame)).scaled(m_sizeSmall, Qt::IgnoreAspectRatio, Qt::SmoothTransformation); small.save(outputFile); } emit signalTaskProgress(m_nextFrame); if (m_stopInitialization) { break; } } m_initialized = true; emit signalAllTasksFinished(); } void ImagesFrameSource_sV::slotAbortInitialization() { m_stopInitialization = true; } int64_t ImagesFrameSource_sV::framesCount() const { return m_imagesList.size(); } const Fps_sV* ImagesFrameSource_sV::fps() const { return &m_fps; } QImage ImagesFrameSource_sV::frameAt(const uint frame, const FrameSize frameSize) { if (int(frame) < m_imagesList.size()) { return QImage(framePath(frame, frameSize)); } else { return QImage(); } } const QString ImagesFrameSource_sV::framePath(const uint frame, const FrameSize frameSize) const { switch (frameSize) { case FrameSize_Orig: return QString(m_imagesList.at(frame)); case FrameSize_Small: default: return QString(m_dirImagesSmall.absoluteFilePath(QFileInfo(m_imagesList.at(frame)).completeBaseName() + ".png")); } } slowmovideo-0.5+git20180116/src/slowmoVideo/project/xmlProjectRW_sV.cpp0000664000000000000000000007606413151342440024361 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "xmlProjectRW_sV.h" #include "project_sV.h" #include "projectPreferences_sV.h" #include "shutterFunctionList_sV.h" #include "shutterFunction_sV.h" #include "nodeList_sV.h" #include "videoFrameSource_sV.h" #include "emptyFrameSource_sV.h" #include "imagesFrameSource_sV.h" #include "motionBlur_sV.h" #include "abstractFlowSource_sV.h" //TODO: #include "defs_sV.hpp" #include #include #if 0 #if QT_VERSION < QT_VERSION_CHECK(5, 0, 0) #include #endif // qt5 #endif int XmlProjectRW_sV::saveProject(Project_sV *project, QString filename) throw(Error_sV) { //TODO: use global define in defs ! 
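    /* Rough sketch of the document this function writes (element contents are
       summarised, attribute values come from the SLOWMOPROJECT_VERSION defines):

         <sVproject version="2" version_minor="7">
           <info>              application name and slowmoVideo version        </info>
           <preferences>       render section/size/fps, output target, view state </preferences>
           <resources>         projectDir and the <frameSource> description    </resources>
           <shutterFunctions>  one <function id="..."><code>...</code></function> per entry </shutterFunctions>
           <nodes>             one <node> per curve node                       </nodes>
           <tags>              one <tag> per tag                               </tags>
         </sVproject>
    */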
QDomDocument doc; QDomElement root = doc.createElement("sVproject"); root.setAttribute("version", SLOWMOPROJECT_VERSION_MAJOR); root.setAttribute("version_minor", SLOWMOPROJECT_VERSION_MINOR); doc.appendChild(root); // File info QDomElement info = doc.createElement("info"); root.appendChild(info); QDomElement appName = doc.createElement("appName"); appName.appendChild(doc.createTextNode("slowmoVideo")); QDomElement version = doc.createElement("version"); version.setAttribute("major", SLOWMOVIDEO_VERSION_MAJOR); version.setAttribute("minor", SLOWMOVIDEO_VERSION_MINOR); version.setAttribute("micro", SLOWMOVIDEO_VERSION_PATCH); info.appendChild(appName); info.appendChild(version); ProjectPreferences_sV *pr = project->preferences(); // Project Preferences QDomElement preferences = doc.createElement("preferences"); root.appendChild(preferences); QDomElement renderSectionMode = doc.createElement("renderSectionMode"); QDomElement renderStartTag = doc.createElement("renderStartTag"); QDomElement renderEndTag = doc.createElement("renderEndTag"); QDomElement renderStartTime = doc.createElement("renderStartTime"); QDomElement renderEndTime = doc.createElement("renderEndTime"); QDomElement renderFrameSize = doc.createElement("renderFrameSize"); QDomElement renderInterpolation = doc.createElement("renderInterpolationType"); QDomElement renderMotionblur = doc.createElement("renderMotionblurType"); QDomElement renderFPS = doc.createElement("renderFPS"); QDomElement renderSlowmoSamples = doc.createElement("renderSlowmoSamples"); QDomElement renderMaxSamples = doc.createElement("renderMaxSamples"); QDomElement renderTarget = doc.createElement("renderTarget"); QDomElement imagesOutputDir = doc.createElement("imagesOutputDir"); QDomElement imagesFilenamePattern = doc.createElement("imagesFilenamePattern"); QDomElement videoFilename = doc.createElement("videoFilename"); QDomElement videoCodec = doc.createElement("videoCodec"); QDomElement flowV3dLambda = doc.createElement("flowV3dLambda"); QDomElement prevTagAxis = doc.createElement("prevTagAxis"); QDomElement viewport_t0 = doc.createElement("viewport_t0"); QDomElement viewport_secRes = doc.createElement("viewport_secRes"); QDomElement canvas_xAxisFPS = doc.createElement("canvas_xAxisFPS"); preferences.appendChild(renderSectionMode); renderSectionMode.appendChild(renderStartTag); renderSectionMode.appendChild(renderEndTag); renderSectionMode.appendChild(renderStartTime); renderSectionMode.appendChild(renderEndTime); preferences.appendChild(renderFrameSize); preferences.appendChild(renderInterpolation); preferences.appendChild(renderMotionblur); preferences.appendChild(renderFPS); preferences.appendChild(renderSlowmoSamples); preferences.appendChild(renderMaxSamples); preferences.appendChild(renderTarget); preferences.appendChild(imagesOutputDir); preferences.appendChild(imagesFilenamePattern); preferences.appendChild(videoFilename); preferences.appendChild(videoCodec); preferences.appendChild(flowV3dLambda); preferences.appendChild(prevTagAxis); preferences.appendChild(viewport_t0); preferences.appendChild(viewport_secRes); preferences.appendChild(canvas_xAxisFPS); renderSectionMode.setAttribute("mode", pr->renderSectionMode()); renderStartTag.setAttribute("label", pr->renderStartTag()); renderEndTag.setAttribute("label", pr->renderEndTag()); renderStartTime.setAttribute("time", pr->renderStartTime()); renderEndTime.setAttribute("time", pr->renderEndTime()); renderFrameSize.setAttribute("size", pr->renderFrameSize()); renderInterpolation.setAttribute("type", 
pr->renderInterpolationType()); renderMotionblur.setAttribute("type", pr->renderMotionblurType()); renderFPS.setAttribute("fps", pr->renderFPS().toString()); renderSlowmoSamples.setAttribute("number", project->motionBlur()->slowmoSamples()); renderMaxSamples.setAttribute("number", project->motionBlur()->maxSamples()); renderTarget.setAttribute("target", pr->renderTarget()); imagesOutputDir.setAttribute("dir", pr->imagesOutputDir()); imagesFilenamePattern.setAttribute("pattern", pr->imagesFilenamePattern()); videoFilename.setAttribute("file", pr->videoFilename()); videoCodec.setAttribute("codec", pr->videoCodec()); flowV3dLambda.setAttribute("lambda", pr->flowV3DLambda()); prevTagAxis.setAttribute("axis", QVariant(pr->lastSelectedTagAxis()).toString()); viewport_t0.setAttribute("x", pr->viewport_t0().x()); viewport_t0.setAttribute("y", pr->viewport_t0().y()); viewport_secRes.setAttribute("x", QString().setNum(pr->viewport_secRes().x())); viewport_secRes.setAttribute("y", QString().setNum(pr->viewport_secRes().y())); canvas_xAxisFPS.setAttribute("fps", pr->canvas_xAxisFPS().toString()); // Project Resources QDomElement resources = doc.createElement("resources"); root.appendChild(resources); QDomElement projectDir = doc.createElement("projectDir"); projectDir.appendChild(doc.createTextNode(project->getDirectory(".", false).absolutePath())); resources.appendChild(projectDir); resources.appendChild(frameSource(&doc, project->frameSource())); // Shutter functions QDomElement shutterFunctions = doc.createElement("shutterFunctions"); root.appendChild(shutterFunctions); for (int i = 0; i < project->shutterFunctions()->size(); i++) { QDomElement func = doc.createElement("function"); func.setAttribute("id", project->shutterFunctions()->at(i)->id()); QDomElement code = doc.createElement("code"); func.appendChild(code); code.appendChild(doc.createTextNode(project->shutterFunctions()->at(i)->function())); shutterFunctions.appendChild(func); } // Nodes QDomElement nodes = doc.createElement("nodes"); root.appendChild(nodes); NodeList_sV *nodeList = project->nodes(); for (int i = 0; i < nodeList->size(); i++) { nodes.appendChild(nodeToDom(&doc, &nodeList->at(i))); } // Tags QDomElement tags = doc.createElement("tags"); root.appendChild(tags); for (int i = 0; i < project->tags()->size(); i++) { tags.appendChild(tagToDom(&doc, project->tags()->at(i))); } QFile outFile(filename); if (!outFile.open(QIODevice::WriteOnly)) { throw Error_sV(QObject::tr("Cannot write to %1; please check if you have write permissions.").arg(filename)); } QTextStream output(&outFile); doc.save(output, 4); output.flush(); outFile.close(); project->setProjectFilename(filename); return 0; } const QDomElement XmlProjectRW_sV::nodeToDom(QDomDocument *doc, const Node_sV *node) { QDomElement el = doc->createElement("node"); QDomElement x = doc->createElement("x"); QDomElement y = doc->createElement("y"); QDomElement selected = doc->createElement("selected"); QDomElement leftHandle = doc->createElement("leftHandle"); QDomElement rightHandle = doc->createElement("rightHandle"); QDomElement shutterFunc = doc->createElement("shutterFunction"); QDomElement leftCurveType = doc->createElement("type"); QDomElement rightCurveType = doc->createElement("type"); QDomElement leftX = doc->createElement("x"); QDomElement leftY = doc->createElement("y"); QDomElement rightX = doc->createElement("x"); QDomElement rightY = doc->createElement("y"); el.appendChild(x); el.appendChild(y); el.appendChild(selected); el.appendChild(leftHandle); 
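    // Handle positions are stored relative to their node; the optional
    // <shutterFunction> child a few lines below is only written when the node
    // actually references a shutter function.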
el.appendChild(rightHandle); if (node->shutterFunctionID().length() > 0) { el.appendChild(shutterFunc); } x.appendChild(doc->createTextNode(QString("%1").arg(node->xUnmoved()))); y.appendChild(doc->createTextNode(QString("%1").arg(node->yUnmoved()))); selected.appendChild(doc->createTextNode(QString("%1").arg(node->selected()))); shutterFunc.appendChild(doc->createTextNode(node->shutterFunctionID())); leftHandle.appendChild(leftX); leftHandle.appendChild(leftY); leftHandle.appendChild(leftCurveType); rightHandle.appendChild(rightX); rightHandle.appendChild(rightY); rightHandle.appendChild(rightCurveType); leftX.appendChild(doc->createTextNode(QString("%1").arg(node->leftNodeHandle().x()))); leftY.appendChild(doc->createTextNode(QString("%1").arg(node->leftNodeHandle().y()))); leftCurveType.appendChild(doc->createTextNode(QVariant((int)node->leftCurveType()).toString())); rightX.appendChild(doc->createTextNode(QString("%1").arg(node->rightNodeHandle().x()))); rightY.appendChild(doc->createTextNode(QString("%1").arg(node->rightNodeHandle().y()))); rightCurveType.appendChild(doc->createTextNode(QVariant((int)node->rightCurveType()).toString())); return el; } const QDomElement XmlProjectRW_sV::tagToDom(QDomDocument *doc, const Tag_sV &tag) { QDomElement el = doc->createElement("tag"); QDomElement t = doc->createElement("time"); QDomElement desc = doc->createElement("description"); QDomElement axis = doc->createElement("axis"); el.appendChild(t); el.appendChild(desc); el.appendChild(axis); t.appendChild(doc->createTextNode(QString("%1").arg(tag.time()))); desc.appendChild(doc->createTextNode(tag.description())); axis.appendChild(doc->createTextNode(QVariant(tag.axis()).toString())); return el; } const QDomElement XmlProjectRW_sV::frameSource(QDomDocument *doc, const AbstractFrameSource_sV *frameSource) { QDomElement source = doc->createElement("frameSource"); if (dynamic_cast(frameSource) != NULL) { qDebug() << "Frame source is a video."; const VideoFrameSource_sV *vfs = dynamic_cast(frameSource); source.setAttribute("type", "videoFile"); QDomElement file = doc->createElement("inputFile"); file.appendChild(doc->createTextNode(vfs->videoFile())); source.appendChild(file); } else if (dynamic_cast(frameSource) != NULL) { qDebug() << "Frame source are images."; const ImagesFrameSource_sV *ifs = dynamic_cast(frameSource); source.setAttribute("type", "images"); QDomElement files = doc->createElement("inputFiles"); QStringList images = ifs->inputFiles(); for (int i = 0; i < images.size(); i++) { QDomElement file = doc->createElement("file"); file.appendChild(doc->createTextNode(images.at(i))); files.appendChild(file); } source.appendChild(files); } else if (dynamic_cast(frameSource) != NULL) { qDebug() << "Frame source is empty."; source.setAttribute("type", "empty"); } else { qDebug() << "Unknown frame source type; cannot save!"; source.setAttribute("type", "unknown"); Q_ASSERT(false); } return source; } void XmlProjectRW_sV::loadFrameSource(QXmlStreamReader *reader, Project_sV *project) throw(FrameSourceError) { //qDebug() << "loadFrameSource"; QStringRef frameSourceType = reader->attributes().value("type"); if (frameSourceType.compare(QLatin1String("videoFile")) == 0) { while (reader->readNextStartElement()) { if (reader->name() == "inputFile") { VideoFrameSource_sV *frameSource = new VideoFrameSource_sV(project, reader->readElementText()); project->loadFrameSource(frameSource); } else { qDebug() << "Unknown element in video frame source section: " << reader->name(); reader->skipCurrentElement(); } 
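            // Illustrative note (not part of the original source): this branch
            // parses the structure written by frameSource() above, e.g. (the
            // path is hypothetical):
            //   <frameSource type="videoFile">
            //     <inputFile>/path/to/clip.avi</inputFile>
            //   </frameSource>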
} } else if (frameSourceType.compare(QLatin1String("images")) == 0) { while (reader->readNextStartElement()) { if (reader->name() == "inputFiles") { QStringList images; while (reader->readNextStartElement()) { if (reader->name() == "file") { images << reader->readElementText(); } else { reader->skipCurrentElement(); } } ImagesFrameSource_sV *frameSource = new ImagesFrameSource_sV(project, images); project->loadFrameSource(frameSource); } } } else if (frameSourceType.compare(QLatin1String("empty")) == 0) { EmptyFrameSource_sV *frameSource = new EmptyFrameSource_sV(project); project->loadFrameSource(frameSource); reader->skipCurrentElement(); } else { reader->skipCurrentElement(); qDebug() << "Unknown frame source: " << frameSourceType << "; Cannot load!"; throw FrameSourceError(QObject::trUtf8("Unknown frame source “%1”. Cannot load the project.").arg(frameSourceType.toString())); } //qDebug() << "loadFrameSource ended"; } Project_sV* XmlProjectRW_sV::loadProject(QString filename, QString *warning) throw(FrameSourceError, Error_sV) { QFile file(filename); if (!file.open(QIODevice::ReadOnly)) { qDebug() << "Cannot read file " << filename; throw Error_sV(QObject::tr("Cannot read from file %1. (Opening in read-only mode failed.)").arg(filename)); } else { QXmlStreamReader xml; xml.setDevice(&file); if (!xml.readNextStartElement()) { qDebug() << "Could not read " << filename; throw Error_sV(QObject::tr("Invalid project file: %1").arg(filename)); } else { if (xml.name() != "sVproject") { qDebug() << "Invalid project file (incorrect root element): " << filename; } else { int projVersionMajor = xml.attributes().value("version").toString().toInt(); int projVersionMinor = xml.attributes().value("version_minor").toString().toInt(); if (projVersionMajor > 0) { qDebug().nospace() << "Reading project file: version " << projVersionMajor << "." 
<< projVersionMinor; } else { qDebug() << "Reading project file of unknown version"; } int version_major = 0, version_minor = 0, version_micro = 0; Project_sV *project = new Project_sV(); project->setProjectFilename(filename); project->setProjectDir(QFileInfo(filename).absolutePath()); ProjectPreferences_sV *pr = project->preferences(); while (xml.readNextStartElement()) { if (xml.name() == "info") { while (xml.readNextStartElement()) { if (xml.name() == "appName") { qDebug() << "App name: " << xml.readElementText(); } else if (xml.name() == "version") { version_major = xml.attributes().value("major").toString().toInt(); version_minor = xml.attributes().value("minor").toString().toInt(); version_micro = xml.attributes().value("micro").toString().toInt(); xml.skipCurrentElement(); } else { xml.skipCurrentElement(); } } } else if (xml.name() == "resources") { while (xml.readNextStartElement()) { if (projVersionMajor < 2 && xml.name() == "inputFile") { QString inFilename = xml.readElementText(); qDebug() << "Input file: " << inFilename; project->loadFrameSource(new VideoFrameSource_sV(project, inFilename)); } else if (xml.name() == "frameSource") { loadFrameSource(&xml, project); } else if (xml.name() == "projectDir") { xml.skipCurrentElement(); } else { xml.skipCurrentElement(); } } } else if (xml.name() == "shutterFunctions") { ShutterFunction_sV func; while (xml.readNextStartElement()) { if (xml.name() == "function") { func.setID(xml.attributes().value("id").toString()); while (xml.readNextStartElement()) { if (xml.name() == "code") { func.updateFunction(xml.readElementText()); } else { xml.skipCurrentElement(); } } project->shutterFunctions()->addFunction(func, false); } else { xml.skipCurrentElement(); } } } else if (xml.name() == "nodes") { while (xml.readNextStartElement()) { if (xml.name() == "node") { Node_sV node; while (xml.readNextStartElement()) { if (xml.name() == "x") { node.setX(QVariant(xml.readElementText()).toFloat()); } else if (xml.name() == "y") { node.setY(QVariant(xml.readElementText()).toFloat()); } else if (xml.name() == "selected") { node.select(QVariant(xml.readElementText()).toBool()); } else if (xml.name() == "shutterFunction") { node.setShutterFunctionID(xml.readElementText()); } else if (xml.name() == "leftHandle") { while (xml.readNextStartElement()) { QString text = xml.readElementText(); if (xml.name() == "type") { node.setLeftCurveType((CurveType)text.toInt()); } else if (xml.name() == "x") { node.setLeftNodeHandle(text.toDouble(), node.leftNodeHandle().y()); } else if (xml.name() == "y") { node.setLeftNodeHandle(node.leftNodeHandle().x(), text.toDouble()); } else { xml.skipCurrentElement(); } } } else if (xml.name() == "rightHandle") { while (xml.readNextStartElement()) { QString text = xml.readElementText(); if (xml.name() == "type") { node.setRightCurveType((CurveType)text.toInt()); } else if (xml.name() == "x") { node.setRightNodeHandle(text.toDouble(), node.rightNodeHandle().y()); } else if (xml.name() == "y") { node.setRightNodeHandle(node.rightNodeHandle().x(), text.toDouble()); } else { xml.skipCurrentElement(); } } } else { xml.skipCurrentElement(); } } project->nodes()->add(node); } else { xml.skipCurrentElement(); } } } else if (xml.name() == "tags") { while (xml.readNextStartElement()) { if (xml.name() == "tag") { Tag_sV tag; while (xml.readNextStartElement()) { if (xml.name() == "time") { tag.setTime(QVariant(xml.readElementText()).toFloat()); } else if (xml.name() == "description") { tag.setDescription(xml.readElementText()); } else if 
(xml.name() == "axis") { tag.setAxis((TagAxis)QVariant(xml.readElementText()).toInt()); } else { xml.skipCurrentElement(); } } project->tags()->push_back(tag); } else { xml.skipCurrentElement(); } } } else if (xml.name() == "preferences") { while (xml.readNextStartElement()) { if (xml.name() == "renderSectionMode") { pr->renderSectionMode() = xml.attributes().value("mode").toString(); if (projVersionMajor == 2 && projVersionMinor < 4) { if (pr->renderSectionMode() == "time") { pr->renderSectionMode() = "expr"; } } while (xml.readNextStartElement()) { if (xml.name() == "renderStartTag") { pr->renderStartTag() = xml.attributes().value("label").toString(); xml.skipCurrentElement(); } else if (xml.name() == "renderEndTag") { pr->renderEndTag() = xml.attributes().value("label").toString(); xml.skipCurrentElement(); } else if (xml.name() == "renderStartTime") { pr->renderStartTime() = xml.attributes().value("time").toString(); xml.skipCurrentElement(); } else if (xml.name() == "renderEndTime") { pr->renderEndTime() = xml.attributes().value("time").toString(); xml.skipCurrentElement(); } else { xml.skipCurrentElement(); } } } else if (xml.name() == "renderFrameSize") { pr->renderFrameSize() = (FrameSize) xml.attributes().value("size").toString().toInt(); xml.skipCurrentElement(); } else if (xml.name() == "renderInterpolationType") { pr->renderInterpolationType() = (InterpolationType) xml.attributes().value("type").toString().toInt(); xml.skipCurrentElement(); } else if (xml.name() == "renderMotionblurType") { pr->renderMotionblurType() = (MotionblurType) xml.attributes().value("type").toString().toInt(); xml.skipCurrentElement(); } else if (xml.name() == "renderFPS") { if (projVersionMajor == 2 && projVersionMinor <= 2) { pr->renderFPS() = xml.attributes().value("fps").toString().toFloat(); } else { pr->renderFPS() = xml.attributes().value("fps").toString(); } xml.skipCurrentElement(); } else if (xml.name() == "renderSlowmoSamples") { project->motionBlur()->setSlowmoSamples(xml.attributes().value("number").toString().toInt()); xml.skipCurrentElement(); } else if (xml.name() == "renderMaxSamples") { project->motionBlur()->setMaxSamples(xml.attributes().value("number").toString().toInt()); xml.skipCurrentElement(); } else if (xml.name() == "renderTarget") { pr->renderTarget() = xml.attributes().value("target").toString(); xml.skipCurrentElement(); } else if (xml.name() == "imagesOutputDir") { pr->imagesOutputDir() = xml.attributes().value("dir").toString(); xml.skipCurrentElement(); } else if (xml.name() == "imagesFilenamePattern") { pr->imagesFilenamePattern() = xml.attributes().value("pattern").toString(); xml.skipCurrentElement(); } else if (xml.name() == "videoFilename") { QString filename = xml.attributes().value("file").toString(); if (!filename.isEmpty()) { pr->videoFilename() = filename; } xml.skipCurrentElement(); } else if (xml.name() == "videoCodec") { pr->videoCodec() = xml.attributes().value("codec").toString(); xml.skipCurrentElement(); } else if (xml.name() == "flowV3dLambda") { pr->flowV3DLambda() = xml.attributes().value("lambda").toString().toFloat(); xml.skipCurrentElement(); } else if (xml.name() == "prevTagAxis") { pr->lastSelectedTagAxis() = (TagAxis)xml.attributes().value("axis").toString().toInt(); xml.skipCurrentElement(); } else if (xml.name() == "viewport_t0") { pr->viewport_t0().rx() = xml.attributes().value("x").toString().toFloat(); pr->viewport_t0().ry() = xml.attributes().value("y").toString().toFloat(); xml.skipCurrentElement(); } else if (xml.name() == 
"viewport_secRes") { pr->viewport_secRes().rx() = xml.attributes().value("x").toString().toFloat(); pr->viewport_secRes().ry() = xml.attributes().value("y").toString().toFloat(); xml.skipCurrentElement(); } else if (xml.name() == "canvas_xAxisFPS") { pr->canvas_xAxisFPS() = xml.attributes().value("fps").toString(); xml.skipCurrentElement(); } else { xml.skipCurrentElement(); } } } else { qDebug() << "Unknown element: " << xml.name(); xml.skipCurrentElement(); } } xml.readNextStartElement(); if (xml.name().length() > 0) { qDebug() << "Did not read the whole project file! Stopped at: " << xml.name(); } Q_ASSERT(xml.name().length() == 0); file.close(); // Handle new project versions if (projVersionMajor > SLOWMOPROJECT_VERSION_MAJOR && projVersionMajor) { throw Error_sV(QString("This file has been created with slowmoVideo %1.%2.%3 which uses a newer " "project file version (%4.%5; supported: %6.%7). File cannot be loaded " "(or only partially). Please upgrade to a newer version of slowmoVideo.") .arg(version_major).arg(version_minor).arg(version_micro) .arg(projVersionMajor).arg(projVersionMinor) .arg(SLOWMOPROJECT_VERSION_MAJOR).arg(SLOWMOPROJECT_VERSION_MINOR)); } else if (projVersionMinor > SLOWMOPROJECT_VERSION_MINOR) { QString warningMsg = QString("This file has been created with a slightly newer version of slowmoVideo " "(version %1.%2.%3) which uses project file version %4.%5 (supported: %6.%7). " "When saving this project, some added properties will be lost.") .arg(version_major).arg(version_minor).arg(version_micro) .arg(projVersionMajor).arg(projVersionMinor) .arg(SLOWMOPROJECT_VERSION_MAJOR).arg(SLOWMOPROJECT_VERSION_MINOR); qDebug() << warningMsg; if (warning != NULL) { *warning = warningMsg; } } return project; } } file.close(); } return NULL; } slowmovideo-0.5+git20180116/src/slowmoVideo/project/work_flow.cpp0000664000000000000000000001003613151342440023305 0ustar rootroot/* * precalculate optical flow * 2014 Valery Brasseur */ #include "flowSourceOpenCV_sV.h" #include "project_sV.h" #include "abstractFrameSource_sV.h" #include "../lib/flowRW_sV.h" #include "../lib/flowField_sV.h" #ifndef OCV_VERSION_3 // OpenCV 2.x #include "opencv2/video/tracking.hpp" #include "opencv2/imgproc/imgproc.hpp" #include "opencv2/highgui/highgui.hpp" #else #error "need to portto OpenCV 3.x" #endif #include "work_flow.h" #include #include #include #include using namespace cv; WorkerFlow::WorkerFlow(QObject *parent) : QObject(parent) { _working =false; _abort = false; } void WorkerFlow::requestWork() { mutex.lock(); _working = true; _abort = false; qDebug()<<"OpticalFlow worker start in Thread "<currentThreadId(); mutex.unlock(); emit workFlowRequested(); } void WorkerFlow::abort() { mutex.lock(); if (_working) { _abort = true; qDebug()<<"OpticalFlow worker aborting in Thread "<currentThreadId(); } mutex.unlock(); } //TODO: should get an object Flow... 
void WorkerFlow::setFrameSize(FrameSize _frameSize) { frameSize = _frameSize; } void WorkerFlow::setProject(Project_sV *_project) { project = _project; } void WorkerFlow::setFlowSource(AbstractFlowSource_sV* _flowsource) { flowSource = _flowsource; } //TODO: call abstractflow source method /** * doWorkFlow * call optical flow on each frame */ void WorkerFlow::doWorkFlow() { qDebug() << "Starting OpticalFlow process in Thread " << thread()->currentThreadId(); int lastFrame; int frame; int next = 0; int nextFrame; /* out of loop work */ lastFrame = project->frameSource()->framesCount(); frame = 0; if (forward) { qDebug() << "forward flow"; next = 1; } else { qDebug() << "backward flow"; next = -1; } Mat prevgray, gray, flow; qDebug() << "Pre Building forward flow for Size: " << frameSize; // load first frame QString prevpath = project->frameSource()->framePath(frame, frameSize); prevgray = imread(prevpath.toStdString(), 0); /* real workhorse */ for (frame=0; framecurrentThreadId(); break; } nextFrame = frame + next; QString flowFileName(flowSource->flowPath(frame, nextFrame, frameSize)); qDebug() << "Building flow for left frame " << frame << " to right frame " << nextFrame << "; Size: " << frameSize; /// \todo Check if size is equal if (!QFile(flowFileName).exists()) { //QTime time; //time.start(); QString prevpath = project->frameSource()->framePath(frame, frameSize); QString path = project->frameSource()->framePath(nextFrame, frameSize); gray = imread(path.toStdString(), 0); // use previous flow info //if (frame!=0) // flags |= OPTFLOW_USE_INITIAL_FLOW; qDebug() << "dummy Optical flow built for " << flowFileName; std::swap(prevgray, gray); qDebug() << "Optical flow built for " << flowFileName; } else { qDebug().nospace() << "Re-using existing flow image for left frame " << frame << " to right frame " << nextFrame << ": " << flowFileName; } //qDebug() << "Optical flow built for " << flowFileName << " in " << time.elapsed() << " ms."; // Once we're done waiting, value is updated emit valueChanged(QString::number(frame)); } /* for */ // Set _working to false, meaning the process can't be aborted anymore. mutex.lock(); _working = false; mutex.unlock(); qDebug()<<"OpticalFlow process finished in Thread "<currentThreadId(); // the finished signal is sent emit finished(); } slowmovideo-0.5+git20180116/src/slowmoVideo/project/project_sV.cpp0000664000000000000000000003667213151342440023430 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #include "project_sV.h" #include "projectPreferences_sV.h" #include "videoFrameSource_sV.h" #include "emptyFrameSource_sV.h" #include "flowSourceV3D_sV.h" #include "flowSourceOpenCV_sV.h" #include "interpolator_sV.h" #include "motionBlur_sV.h" #include "nodeList_sV.h" #include "renderTask_sV.h" #include "shutterFunction_sV.h" #include "shutterFunctionList_sV.h" #include "../lib/shutter_sV.h" #include "../lib/interpolate_sV.h" #include "../lib/flowRW_sV.h" #include "../lib/flowField_sV.h" #if 1 #include "work_flow.h" #endif #include #include #include #include #include #include #if QT_VERSION >= QT_VERSION_CHECK(5, 0, 0) #include #endif //#define DEBUG_P #ifdef DEBUG_P #include #endif #define MIN_FRAME_DIST .001 Project_sV::Project_sV() { //TODO: scoping problem m_projDir = getDirectoryName(); init(); int tid; for(tid=0;tid<4;tid++) { worker[tid]=0; thread[tid]=0; } } Project_sV::Project_sV(QString projectDir) : m_projDir(projectDir) { init(); } QDir Project_sV::getDirectoryName() { QDir dirName; #if QT_VERSION >= QT_VERSION_CHECK(5, 0, 0) //QTemporaryDir tempDir("slowmovideo"); QTemporaryDir tempDir;; // use default if (tempDir.isValid()) dirName = QDir(tempDir.path()); else #endif dirName = QDir::temp(); return dirName; } void Project_sV::init() { m_preferences = new ProjectPreferences_sV(); m_frameSource = new EmptyFrameSource_sV(this); m_flowSource = 0; // leak ? new FlowSourceV3D_sV(this); m_motionBlur = new MotionBlur_sV(this); reloadFlowSource(); m_tags = new QList(); m_nodes = new NodeList_sV(); m_shutterFunctions = new ShutterFunctionList_sV(m_nodes); m_renderTask = NULL; m_v3dFailCounter = 0; /* better here ? */ int tid; for(tid=0;tid<4;tid++) { worker[tid]=0; thread[tid]=0; } } Project_sV::~Project_sV() { int tid; for(tid=0;tid<4;tid++) { if (worker[tid]!=0) { worker[tid]->abort(); thread[tid]->wait(); //qDebug()<<"Deleting thread and worker in Thread "<QObject::thread()->currentThreadId(); delete worker[tid]; delete thread[tid]; } } delete m_renderTask; qDebug() << "delete pref"; delete m_preferences; delete m_frameSource; delete m_flowSource; delete m_motionBlur; delete m_tags; delete m_nodes; delete m_shutterFunctions; } void Project_sV::reloadFlowSource() { if (m_flowSource != 0) delete m_flowSource; QSettings m_settings; QString method = m_settings.value("preferences/flowMethod", "OpenCV-CPU").toString(); if ("V3D" == method) { m_flowSource = new FlowSourceV3D_sV(this); } else { int algo = m_settings.value("preferences/preferredOpenCVAlgo", 0).toInt(); int dev_index = m_settings.value("preferences/oclDriver", -1).toInt(); if (method == "OpenCV-CPU") { dev_index = -1; } m_flowSource = new FlowSourceOpenCV_sV(this, algo, dev_index); } } void Project_sV::setupProjectDir() { //qDebug() << "Project directory: " << m_projDir.absolutePath(); m_frameSource->slotUpdateProjectDir(); m_flowSource->slotUpdateProjectDir(); m_motionBlur->slotUpdateProjectDir(); } void Project_sV::setProjectDir(QString projectDir) { m_projDir = projectDir; // Create directory if necessary qDebug() << "Project directory: " << m_projDir.absolutePath(); if (!m_projDir.exists()) { m_projDir.mkpath("."); } setupProjectDir(); } void Project_sV::setProjectFilename(QString filename) { m_projectFilename = filename; } QString Project_sV::projectFilename() const { return m_projectFilename; } void Project_sV::loadFrameSource(AbstractFrameSource_sV *frameSource) { if (m_frameSource != NULL) { delete m_frameSource; } if (frameSource == NULL) { m_frameSource = new EmptyFrameSource_sV(this); } else { m_frameSource = 
frameSource; } m_nodes->setMaxY(m_frameSource->maxTime()); } //TODO: remove this void Project_sV::replaceRenderTask(RenderTask_sV *task) { /* if (m_renderTask != NULL) { m_renderTask->slotStopRendering(); m_renderTask->deleteLater(); m_renderTask = NULL; } */ m_renderTask = task; } const QDir Project_sV::getDirectory(const QString &name, bool createIfNotExists) const { QDir dir(m_projDir.absolutePath() + "/" + name); if (createIfNotExists && !dir.exists()) { dir.mkpath("."); } return dir; } QImage Project_sV::render(qreal outTime, RenderPreferences_sV prefs) { if (outTime < m_nodes->startTime() || outTime > m_nodes->endTime()) { #ifdef DEBUG_P std::cout.precision(30); std::cout << m_nodes->startTime() << " -- " << outTime << " -- " << m_nodes->endTime() << std::endl; std::cout.flush(); #endif qDebug() << "Output time out of bounds"; Q_ASSERT(false); } float sourceTime = m_nodes->sourceTime(outTime); if (sourceTime < 0) { sourceTime = 0; } if (sourceTime > m_frameSource->maxTime()) { sourceTime = m_frameSource->maxTime(); } float sourceFrame = sourceTimeToFrame(sourceTime); int leftIndex = m_nodes->find(outTime); if (leftIndex < 0) { qDebug() << "left node is not here!"; Q_ASSERT(false); } if (leftIndex == m_nodes->size()-1) { // outTime is at the very end of the node. // Take next to last node to still have a right node. leftIndex--; } const Node_sV *leftNode = &(*m_nodes)[leftIndex]; const Node_sV *rightNode = &(*m_nodes)[leftIndex+1]; ShutterFunction_sV *shutterFunction = m_shutterFunctions->function(leftNode->shutterFunctionID()); if (shutterFunction != NULL) { float dy = 0; if (outTime+1/prefs.fps().fps() <= m_nodes->endTime()) { dy = m_nodes->sourceTime(outTime+1/prefs.fps().fps()) - sourceTime; } else { dy = sourceTime - m_nodes->sourceTime(outTime-1/prefs.fps().fps()); } float replaySpeed = fabs(dy)*prefs.fps().fps(); float shutter = shutterFunction->evaluate( (outTime-leftNode->x())/(rightNode->x()-leftNode->x()), // x on [0,1] outTime, // t prefs.fps().fps(), // FPS sourceFrame, // y dy // dy to next frame ); qDebug() << "Shutter value for output time " << outTime << " is " << shutter; if (shutter > 0) { try { return m_motionBlur->blur(sourceFrame, sourceFrame+shutter*prefs.fps().fps(), replaySpeed, prefs); } catch (RangeTooSmallError_sV &err) {} } } return Interpolator_sV::interpolate(this, sourceFrame, prefs); } FlowField_sV* Project_sV::requestFlow(int leftFrame, int rightFrame, const FrameSize frameSize) throw(FlowBuildingError) { Q_ASSERT(leftFrame < m_frameSource->framesCount()); Q_ASSERT(rightFrame < m_frameSource->framesCount()); if (dynamic_cast(m_frameSource) == NULL) { FlowSourceV3D_sV *v3d; if ((v3d = dynamic_cast(m_flowSource)) != NULL) { v3d->setLambda(m_preferences->flowV3DLambda()); try { return m_flowSource->buildFlow(leftFrame, rightFrame, frameSize); } catch (FlowBuildingError err) { m_v3dFailCounter++; qDebug() << "Failed creating optical flow, falling back to OpenCV ..."; qDebug() << "Failed attempts so far: " << m_v3dFailCounter; delete m_flowSource; QSettings m_settings; int algo = m_settings.value("preferences/preferredOpenCVAlgo", 0).toInt(); int dev_index = m_settings.value("preferences/oclDriver", -1).toInt(); QString method = m_settings.value("preferences/flowMethod", "OpenCV-CPU").toString(); if (method == "OpenCV-CPU") { dev_index = -1; } m_flowSource = new FlowSourceOpenCV_sV(this, algo, dev_index); return m_flowSource->buildFlow(leftFrame, rightFrame, frameSize); } } return m_flowSource->buildFlow(leftFrame, rightFrame, frameSize); } else { throw 
FlowBuildingError(tr("Empty frame source; Cannot build flow.")); } } inline qreal Project_sV::sourceTimeToFrame(qreal time) const { Q_ASSERT(time >= 0); return time * m_frameSource->fps()->fps(); } qreal Project_sV::snapToFrame(const qreal time, bool roundUp, const Fps_sV &fps, int *out_framesBeforeHere) { Q_ASSERT(time >= 0); int frameCount = 0; double snapTime = 0; double frameLength; frameLength = 1.0/fps.fps(); while (snapTime < time) { snapTime += frameLength; frameCount++; } // snapTime is now >= time if (!roundUp && snapTime != time) { snapTime -= frameLength; frameCount--; } if (out_framesBeforeHere != NULL) { *out_framesBeforeHere = frameCount; } return snapTime; } qreal Project_sV::snapToOutFrame(qreal time, bool roundUp, const Fps_sV &fps, int *out_framesBeforeHere) const { if (time > m_nodes->endTime()) { time = m_nodes->endTime(); } time -= m_nodes->startTime(); if (time < 0) { time = 0; } float snapped = snapToFrame(time, roundUp, fps, out_framesBeforeHere) + m_nodes->startTime(); return snapped; } qreal Project_sV::toOutTime(QString timeExpression, const Fps_sV &fps) const throw(Error_sV) { if (m_nodes->size() < 2) { throw Error_sV(tr("Not enough nodes available in the project.")); } // t:time l:label f:frame p:percent :start :end time bool ok = false; qreal time = 0; if (timeExpression.startsWith("t:")) { time = timeExpression.mid(2).toDouble(&ok); if (!ok) { throw Error_sV(tr("%1 is not a valid time. Format: t:123.45").arg(timeExpression)); } } else if (timeExpression.startsWith("l:")) { QString label = timeExpression.mid(2); bool inputAxisFound = false; for (int i = 0; i < m_tags->size(); i++) { if (m_tags->at(i).description() == label) { if (m_tags->at(i).axis() == TagAxis_Output) { time = m_tags->at(i).time(); ok = true; break; } else { inputAxisFound = true; } } } if (!ok) { if (inputAxisFound) { throw Error_sV(tr("%1 is an input label and not an output label and cannot be used for rendering.").arg(label)); } else { throw Error_sV(tr("No label found for %1").arg(timeExpression)); } } } else if (timeExpression.startsWith("f:")) { int frame = timeExpression.mid(2).toInt(&ok); if (ok) { time = frame / fps.fps(); } else { throw Error_sV(tr("%1 is not a valid frame number. Format: f:1234").arg(timeExpression)); } } else if (timeExpression.startsWith("p:")) { QString sPercent = timeExpression.mid(2).trimmed(); if (sPercent.endsWith("%")) { sPercent.chop(1); } float percent = sPercent.toFloat(&ok); if (ok) { if (percent >= 0 && percent <= 100) { time = m_nodes->startTime() + m_nodes->totalTime()*percent/100; } else { throw Error_sV(tr("%1 is not a valid percentage number; must be between 0 and 100.").arg(percent)); } } else { throw Error_sV(tr("%1 is not a valid percentage expression. Format: p:0% until p:100.0%").arg(timeExpression)); } } else if (timeExpression.startsWith(":")) { if (":start" == timeExpression) { time = m_nodes->startTime(); } else if (":end" == timeExpression) { time = m_nodes->endTime(); } else { throw Error_sV(tr("%1 is not a valid position. Valid: :start and :end").arg(timeExpression)); } } else { time = timeExpression.toDouble(&ok); if (!ok) { throw Error_sV(tr("Not a valid time format. 
Options: t:1.25 or 1.25 (time), f:1234 (frame), " "l:slowdown (label), p:42.42% (percentage), :start and :end (project start/end).")); } } time = qMax(time, m_nodes->startTime()); time = qMin(time, m_nodes->endTime()); return time; } QList Project_sV::objectsNear(QPointF pos, qreal tmaxdist) const { QList list = m_nodes->objectsNear(pos, tmaxdist); qreal dist; for (int i = 0; i < m_tags->size(); i++) { if (m_tags->at(i).axis() == TagAxis_Source) { dist = fabs(pos.y() - m_tags->at(i).time()); } else { dist = fabs(pos.x() - m_tags->at(i).time()); } if (dist <= tmaxdist) { list << NodeList_sV::PointerWithDistance(&m_tags->at(i), dist, NodeList_sV::PointerWithDistance::Tag); } } qSort(list); return list; } /** * start an optical flow on a thread * * @param threadid the thread on which to run our flow * @param frameSize size of frame for calcul (small/orig) * @param direction flow direction */ void Project_sV::startFlow(int threadid,const FrameSize frameSize,int direction) { thread[threadid] = new QThread(); worker[threadid] = new WorkerFlow(); // set on what to work ... worker[threadid]->setFrameSize(frameSize); worker[threadid]->setProject(this); worker[threadid]->setDirection(direction); worker[threadid]->setFlowSource(flowSource()); worker[threadid]->moveToThread(thread[threadid]); //connect(worker, SIGNAL(valueChanged(QString)), ui->label, SLOT(setText(QString))); connect(worker[threadid], SIGNAL(workFlowRequested()), thread[threadid], SLOT(start())); connect(thread[threadid], SIGNAL(started()), worker[threadid], SLOT(doWorkFlow())); connect(worker[threadid], SIGNAL(finished()), thread[threadid], SLOT(quit()), Qt::DirectConnection); // let's start thread[threadid]->wait(); // If the thread is not running, this will immediately return. worker[threadid]->requestWork(); } /** * prebuild the optical flow using threading */ void Project_sV::buildCacheFlowSource() { QSettings settings; bool precalc = settings.value("preferences/precalcFlow", true).toBool(); if (precalc && (m_flowSource != NULL)) { #if 0 qDebug() << "Creating cached FlowSources "; Q_ASSERT(m_flowSource != NULL); // TODO: test/check better place ? // we should do it for each size/each way // use threading here startFlow(0,FrameSize_Small,0); startFlow(1,FrameSize_Small,1); startFlow(2,FrameSize_Orig,0); startFlow(3,FrameSize_Orig,1); #else qDebug() << "cache flow source disable"; #endif } } slowmovideo-0.5+git20180116/src/slowmoVideo/project/flowSourceV3D_sV.cpp0000664000000000000000000001157113151342440024416 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #include "flowSourceV3D_sV.h" #include "project_sV.h" #include "abstractFrameSource_sV.h" #include "../lib/flowRW_sV.h" #include #include #include #include FlowSourceV3D_sV::FlowSourceV3D_sV(Project_sV *project, double lambda) : AbstractFlowSource_sV(project) { createDirectories(); m_lambda = lambda; } FlowField_sV* FlowSourceV3D_sV::buildFlow(uint leftFrame, uint rightFrame, FrameSize frameSize) throw(FlowBuildingError) { QString flowFileName(flowPath(leftFrame, rightFrame, frameSize)); /// \todo Check if size is equal if (!QFile(flowFileName).exists()) { QSettings settings; QString programLocation(settings.value("binaries/v3dFlowBuilder", "/usr/local/bin/slowmoFlowBuilder").toString()); if (!QFile(programLocation).exists()) { QString newLoc = correctFlowBinaryLocation(); if (newLoc.length() > 0) { programLocation = newLoc; } } if (!QFile(programLocation).exists()) { throw FlowBuildingError("Program\n" + programLocation + "\ndoes not exist (did you compile/make V3D?), cannot build flow!"); } QString program(programLocation); qDebug() << "Building flow for left frame " << leftFrame << " to right frame " << rightFrame << "; Size: " << frameSize; QStringList args; args << project()->frameSource()->framePath(leftFrame, frameSize) << project()->frameSource()->framePath(rightFrame, frameSize) << flowFileName << QVariant(m_lambda).toString() << "100"; qDebug() << "Arguments: " << args; QTime time; QProcess proc; time.start(); proc.start(program, args); proc.waitForFinished(-1); if (proc.exitCode() != 0) { qDebug() << "Failed: " << proc.readAllStandardError() << proc.readAllStandardOutput(); throw FlowBuildingError(QString("Flow builder exited with exit code %1; For details see debugging output").arg(proc.exitCode())); } else { qDebug() << "Optical flow built for " << flowFileName << " in " << time.elapsed() << " ms"; qDebug() << proc.readAllStandardError() << proc.readAllStandardOutput(); } } else { qDebug().nospace() << "Re-using existing flow image for left frame " << leftFrame << " to right frame " << rightFrame << ": " << flowFileName; } try { return FlowRW_sV::load(flowFileName.toStdString()); } catch (FlowRW_sV::FlowRWError &err) { throw FlowBuildingError(err.message.c_str()); } } QString FlowSourceV3D_sV::correctFlowBinaryLocation() { QSettings settings; QString programLocation(settings.value("binaries/v3dFlowBuilder", "/usr/local/bin/slowmoFlowBuilder").toString()); QStringList paths; paths << programLocation; paths << QDir::currentPath() + "/slowmoFlowBuilder"; paths << QCoreApplication::applicationDirPath() + "/slowmoFlowBuilder"; paths << "/usr/bin/slowmoFlowBuilder" << "/usr/local/bin/slowmoFlowBuilder"; for (int i = 0; i < paths.size(); i++) { if (validateFlowBinary(paths.at(i))) { settings.setValue("binaries/v3dFlowBuilder", paths.at(i)); return paths.at(i); } } return QString(); } bool FlowSourceV3D_sV::validateFlowBinary(const QString path) { bool valid = false; qDebug() << "Checking " << path << " ..."; if (QFile(path).exists() && QFileInfo(path).isExecutable()) { QProcess process; QStringList args; args << "--identify"; process.start(path, args); process.waitForFinished(2000); QString output(process.readAllStandardOutput()); if (output.startsWith("slowmoFlowBuilder")) { valid = true; qDebug() << path << " is valid."; } else { qDebug() << "Invalid output from flow executable: " << output; } process.terminate(); } return valid; } const QString FlowSourceV3D_sV::flowPath(const uint leftFrame, const uint rightFrame, const FrameSize frameSize) const { QDir dir; if (frameSize == 
FrameSize_Orig) { dir = m_dirFlowOrig; } else { dir = m_dirFlowSmall; } QString direction; if (leftFrame < rightFrame) { direction = "forward"; } else { direction = "backward"; } return dir.absoluteFilePath(QString("%1-lambda%4_%2-%3.sVflow").arg(direction).arg(leftFrame).arg(rightFrame).arg(m_lambda, 0, 'f', 2)); } slowmovideo-0.5+git20180116/src/slowmoVideo/project/nodeHandle_sV.h0000664000000000000000000000165413151342440023460 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef NODEHANDLE_SV_H #define NODEHANDLE_SV_H #include "canvasObject_sV.h" #include #include class Node_sV; class NodeHandle_sV : public QPointF, public CanvasObject_sV { public: NodeHandle_sV(); NodeHandle_sV(qreal x, qreal y); NodeHandle_sV(const QPointF &other); NodeHandle_sV(const NodeHandle_sV &other); ~NodeHandle_sV() {} void setParentNode(Node_sV *node); const Node_sV* parentNode() const; private: Node_sV *m_parentNode; }; QDebug operator<<(QDebug qd, const NodeHandle_sV& n); #endif // NODEHANDLE_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/renderPreferences_sV.h0000664000000000000000000000065213151342440025055 0ustar rootroot#ifndef RENDERPREFERENCES_SV_H #define RENDERPREFERENCES_SV_H #include "../lib/defs_sV.hpp" class RenderPreferences_sV { public: RenderPreferences_sV(); InterpolationType interpolation; MotionblurType motionblur; FrameSize size; void setFps(Fps_sV fps); Fps_sV fps() const; bool fpsSetByUser() const; private: Fps_sV m_fps; bool m_fpsSetByUser; }; #endif // RENDERPREFERENCES_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/videoRenderTarget_sV.h0000664000000000000000000000250313151342440025026 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef VIDEORENDERTARGET_SV_H #define VIDEORENDERTARGET_SV_H #include "abstractRenderTarget_sV.h" struct VideoOut_sV; class RenderTask_sV; /// Produces videos from frames. class VideoRenderTarget_sV : public AbstractRenderTarget_sV { public: /// Constructs a new video render target VideoRenderTarget_sV(RenderTask_sV *parentRenderTask); virtual ~VideoRenderTarget_sV(); /// openRenderTarget() will throw an error if the target file cannot be opened. void setTargetFile(const QString& filename); /// Set a custom video codec (see
<code>ffmpeg -codecs</code>
for a list of available codecs). void setVcodec(const QString& codec); void openRenderTarget() throw(Error_sV); void closeRenderTarget() throw(Error_sV); public slots: void slotConsumeFrame(const QImage &image, const int frameNumber); private: QString m_filename; QString m_vcodec; VideoOut_sV *m_videoOut; int m_width; int m_height; }; #endif // VIDEORENDERTARGET_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/exportVideoRenderTarget.h0000664000000000000000000000313513151342440025562 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2014 Valery Brasseur This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ /* * handle both ffmpeg and qtkit */ #ifndef EXPORTVIDEORENDERTARGET_SV_H #define EXPORTVIDEORENDERTARGET_SV_H #include "abstractRenderTarget_sV.h" #include class VideoWriter; class RenderTask_sV; /// Produces videos from frames. // store temporary frame on disk (png) then export with "ffmpeg" or quicktime class exportVideoRenderTarget : public AbstractRenderTarget_sV { public: /// Constructs a new video render target exportVideoRenderTarget(RenderTask_sV *parentRenderTask); virtual ~exportVideoRenderTarget(); void openRenderTarget() throw(Error_sV) {} ; /// openRenderTarget() will throw an error if the target file cannot be opened. void setTargetFile(const QString& filename); /// Set a custom video codec void setVcodec(const QString& codec); void closeRenderTarget() throw(Error_sV); void setQT(int use) { use_qt = use;}; public slots: void slotConsumeFrame(const QImage &image, const int frameNumber); private: QString m_filename; QString m_vcodec; VideoWriter* writer; int m_width; int m_height; int use_qt; // using QuickTime int first; // first frame to be written // png temp QDir m_targetDir; QString m_filenamePattern; }; #endif // EXPORTVIDEORENDERTARGET_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/abstractRenderTarget_sV.cpp0000664000000000000000000000105513151342440026057 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "abstractRenderTarget_sV.h" AbstractRenderTarget_sV::AbstractRenderTarget_sV(RenderTask_sV *renderTask) : m_renderTask(renderTask) { } AbstractRenderTarget_sV::~AbstractRenderTarget_sV() { } slowmovideo-0.5+git20180116/src/slowmoVideo/project/new_videoRenderTarget.h0000664000000000000000000000244513151342440025234 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2014 Valery Brasseur This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ /* * handle both ffmpeg and qtkit */ #ifndef NEW_VIDEORENDERTARGET_SV_H #define NEW_VIDEORENDERTARGET_SV_H #include "abstractRenderTarget_sV.h" class VideoWriter; class RenderTask_sV; /// Produces videos from frames. 
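///
/// Illustrative usage sketch (not part of the original header), based only on
/// the interface declared below; 'renderTask' is assumed to be an existing
/// RenderTask_sV*, and the file name and codec are hypothetical:
/// \code
///   newVideoRenderTarget target(renderTask);
///   target.setTargetFile("/tmp/out.avi");
///   target.setVcodec("libx264");
///   target.openRenderTarget();    // may throw Error_sV
///   // frames are then delivered via slotConsumeFrame(image, frameNumber)
///   target.closeRenderTarget();
/// \endcode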
class newVideoRenderTarget : public AbstractRenderTarget_sV { public: /// Constructs a new video render target newVideoRenderTarget(RenderTask_sV *parentRenderTask); virtual ~newVideoRenderTarget(); /// openRenderTarget() will throw an error if the target file cannot be opened. void setTargetFile(const QString& filename); /// Set a custom video codec void setVcodec(const QString& codec); void openRenderTarget() throw(Error_sV); void closeRenderTarget() throw(Error_sV); public slots: void slotConsumeFrame(const QImage &image, const int frameNumber); private: QString m_filename; QString m_vcodec; VideoWriter* writer; int m_width; int m_height; }; #endif // NEW_VIDEORENDERTARGET_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/interpolator_sV.cpp0000664000000000000000000001156113151342440024472 0ustar rootroot#include "interpolator_sV.h" #include "abstractFrameSource_sV.h" #include "../lib/flowField_sV.h" #include "../lib/interpolate_sV.h" #include #define MIN_FRAME_DIST .001 QImage Interpolator_sV::interpolate(Project_sV *pr, float frame, const RenderPreferences_sV &prefs) throw(FlowBuildingError, InterpolationError) { if (frame > pr->frameSource()->framesCount()) { throw InterpolationError(QObject::tr("Requested frame %1: Not within valid range. (%2 frames)") .arg(frame).arg(pr->frameSource()->framesCount())); } if (frame-floor(frame) > MIN_FRAME_DIST) { QImage left = pr->frameSource()->frameAt(floor(frame), prefs.size); QImage right = pr->frameSource()->frameAt(floor(frame)+1, prefs.size); QImage out(left.size(), QImage::Format_ARGB32); /// Position between two frames, on [0 1] const float pos = frame-floor(frame); switch (prefs.interpolation) { case InterpolationType_Twoway: { FlowField_sV *forwardFlow = pr->requestFlow(floor(frame), floor(frame)+1, prefs.size); FlowField_sV *backwardFlow = pr->requestFlow(floor(frame)+1, floor(frame), prefs.size); Q_ASSERT(forwardFlow != NULL); Q_ASSERT(backwardFlow != NULL); if (forwardFlow == NULL || backwardFlow == NULL) { qDebug() << "No flow received!"; Q_ASSERT(false); } Interpolate_sV::twowayFlow(left, right, forwardFlow, backwardFlow, pos, out); delete forwardFlow; delete backwardFlow; } break; case InterpolationType_TwowayNew: { FlowField_sV *forwardFlow = pr->requestFlow(floor(frame), floor(frame)+1, prefs.size); FlowField_sV *backwardFlow = pr->requestFlow(floor(frame)+1, floor(frame), prefs.size); Q_ASSERT(forwardFlow != NULL); Q_ASSERT(backwardFlow != NULL); if (forwardFlow == NULL || backwardFlow == NULL) { qDebug() << "No flow received!"; Q_ASSERT(false); } Interpolate_sV::newTwowayFlow(left, right, forwardFlow, backwardFlow, pos, out); delete forwardFlow; delete backwardFlow; } break; case InterpolationType_Forward: { FlowField_sV *forwardFlow = pr->requestFlow(floor(frame), floor(frame)+1, prefs.size); Q_ASSERT(forwardFlow != NULL); if (forwardFlow == NULL) { qDebug() << "No flow received!"; Q_ASSERT(false); } Interpolate_sV::forwardFlow(left, forwardFlow, pos, out); delete forwardFlow; } break; case InterpolationType_ForwardNew: { FlowField_sV *forwardFlow = pr->requestFlow(floor(frame), floor(frame)+1, prefs.size); Q_ASSERT(forwardFlow != NULL); if (forwardFlow == NULL) { qDebug() << "No flow received!"; Q_ASSERT(false); } Interpolate_sV::newForwardFlow(left, forwardFlow, pos, out); delete forwardFlow; } break; case InterpolationType_Bezier: { FlowField_sV *currNext = pr->requestFlow(floor(frame)+2, floor(frame)+1, prefs.size); // Allowed to be NULL FlowField_sV *currPrev = pr->requestFlow(floor(frame)+0, floor(frame)+1, 
prefs.size); Q_ASSERT(currPrev != NULL); Interpolate_sV::bezierFlow(left, right, currPrev, currNext, pos, out); delete currNext; delete currPrev; } break; case InterpolationType_None: { //qDebug() << "Simple interpolation type!"; Interpolate_sV::simpleinterpolate(left, right, pos, out); } break; case InterpolationType_Nearest: { Interpolate_sV::nearestinterpolate(left, right, pos, out); } break; default : { qDebug() << "Unsupported interpolation type!"; Q_ASSERT(false); } } return out; } else { qDebug() << "No interpolation necessary."; return pr->frameSource()->frameAt(floor(frame), prefs.size); } } slowmovideo-0.5+git20180116/src/slowmoVideo/project/abstractRenderTarget_sV.h0000664000000000000000000000237513151342440025532 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef ABSTRACTRENDERTARGET_SV_H #define ABSTRACTRENDERTARGET_SV_H #include #include "../lib/defs_sV.hpp" class RenderTask_sV; /** \brief Should represent a render target like video or an image sequence */ class AbstractRenderTarget_sV { public: AbstractRenderTarget_sV(RenderTask_sV *parentRenderTask); virtual ~AbstractRenderTarget_sV(); RenderTask_sV* renderTask() { return m_renderTask; } /// Prepares the render target (if necessary), like initializing video streams etc. virtual void openRenderTarget() throw(Error_sV) = 0; /// Finishes the render target (e.g. writes the trailer to a video file) virtual void closeRenderTarget() throw(Error_sV) = 0; public slots: /// Adds one frame to the output virtual void slotConsumeFrame(const QImage &image, const int frameNumber) = 0; private: RenderTask_sV *m_renderTask; }; #endif // ABSTRACTRENDERTARGET_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/segment_sV.h0000664000000000000000000000160513151342440023055 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #ifndef SEGMENT_SV_H #define SEGMENT_SV_H #include #include "canvasObject_sV.h" /** \brief Dummy object for a segment between two nodes */ class Segment_sV : public CanvasObject_sV { public: Segment_sV(int leftNodeIndex); ~Segment_sV() {} int leftNodeIndex() const; bool selected() const; void select(bool select = true); bool operator <(const Segment_sV &other) const; private: int m_leftNodeIndex; bool m_selected; }; QString toString(const Segment_sV& segment); #endif // SEGMENT_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/renderTask_sV.cpp0000664000000000000000000001704713151342440024057 0ustar rootroot/* * 2014 Valery Brasseur */ #include "renderTask_sV.h" #include "abstractRenderTarget_sV.h" #include "emptyFrameSource_sV.h" #include #include #include #include #include #include #include #include "project_sV.h" #include "nodeList_sV.h" #include "../lib/defs_sV.hpp" RenderTask_sV::RenderTask_sV(Project_sV *project) : //m_project(project), m_renderTarget(NULL), m_renderTimeElapsed(0), m_stopRendering(false), m_prevTime(-1) { //m_project->setupProjectDir(); m_project = project; m_timeStart = m_project->nodes()->startTime(); m_timeEnd = m_project->nodes()->endTime(); m_nextFrameTime = m_project->nodes()->startTime(); _working =false; } //TODO: RenderTask_sV::~RenderTask_sV() { if (m_renderTarget != NULL) { delete m_renderTarget; } } /** * let run the thread ! */ void RenderTask_sV::requestWork() { mutex.lock(); _working = true; m_stopRendering = false; qDebug()<<"rendering worker start in Thread "<currentThreadId(); mutex.unlock(); //emit workFlowRequested(); qDebug() << "workflow request"; } /** * stop the running thread */ void RenderTask_sV::slotStopRendering() { //qDebug()<<"abort rendering required in Thread "<currentThreadId(); mutex.lock(); if (_working) { m_stopRendering = true; qDebug()<<"rendering aborting in Thread "<currentThreadId(); } mutex.unlock(); } #pragma mark - #pragma mark progress dialog /** * setup the progress dialog */ void RenderTask_sV::setupProgress(QString desc, int taskSize) { //qDebug() << "setup progress call " << desc << " size " << taskSize; emit signalNewTask(desc,taskSize); } /** * update progress dialog from outside */ void RenderTask_sV::updateProgress(int value) { //qDebug() << "updateProgress call " << value; currentProgress = value; emit signalTaskProgress(value ); } void RenderTask_sV::stepProgress(int step) { //qDebug() << "stepProgress call " << step; currentProgress+=step; emit signalTaskProgress(currentProgress ); } void RenderTask_sV::updateMessage(QString desc) { emit signalItemDesc(desc); } #pragma mark - #pragma mark set/get task void RenderTask_sV::setRenderTarget(AbstractRenderTarget_sV *renderTarget) { Q_ASSERT(renderTarget != NULL); if (m_renderTarget != NULL && m_renderTarget != renderTarget) { delete m_renderTarget; } m_renderTarget = renderTarget; } void RenderTask_sV::setTimeRange(qreal start, qreal end) { Q_ASSERT(start <= end); Q_ASSERT(start >= m_project->nodes()->startTime()); Q_ASSERT(end <= m_project->nodes()->endTime()); m_timeStart = start; m_timeEnd = end; } void RenderTask_sV::setTimeRange(QString start, QString end) { Q_ASSERT(m_prefs.fpsSetByUser()); m_timeStart = m_project->toOutTime(start, m_prefs.fps()); m_timeEnd = m_project->toOutTime(end, m_prefs.fps()); Q_ASSERT(m_timeStart < m_timeEnd); } QSize RenderTask_sV::resolution() { return const_cast(m_project)->frameSource()->frameAt(0, m_prefs.size).size(); } /* * return a suitable dir for rendered frame */ QDir RenderTask_sV::getRenderDirectory() { //bug 
using : return m_project->getDirectory("cache/rendered"); QDir dir(m_project->getProjectDir().absolutePath() + "/" + "rendered"); if (!dir.exists()) { dir.mkpath("."); } return dir; } #pragma mark - #pragma mark rendering /** * this is the real workhorse. * maybe we should not call this directly, but instead from doWork ? */ void RenderTask_sV::slotContinueRendering() { qDebug()<<"Starting rendering process in Thread "<currentThreadId(); /* real workhorse, need to account for exporting */ setupProgress(trUtf8("Rendering Slow-Mo …"), 2* int(m_prefs.fps().fps() * (m_timeEnd-m_timeStart))); //TODO: initialize m_stopwatch.start(); m_nextFrameTime=m_timeStart; int framesBefore; qreal snapped = m_project->snapToOutFrame(m_nextFrameTime, false, m_prefs.fps(), &framesBefore); qDebug() << "Frame snapping in from " << m_nextFrameTime << " to " << snapped; m_nextFrameTime = snapped; Q_ASSERT(int((m_nextFrameTime - m_project->nodes()->startTime()) * m_prefs.fps().fps() + .5) == framesBefore); try { m_renderTarget->openRenderTarget(); } catch (Error_sV &err) { m_stopRendering = true; emit signalRenderingAborted(tr("Rendering aborted.") + " " + err.message()); return; } // render loop // TODO: add more threading here while(m_nextFrameTimecurrentThreadId(); m_renderTimeElapsed = m_stopwatch.elapsed(); emit signalRenderingStopped(QTime().addMSecs(m_renderTimeElapsed).toString("hh:mm:ss")); qDebug() << "Rendering stopped after " << QTime().addMSecs(m_renderTimeElapsed).toString("hh:mm:ss"); break; } // do the work int outputFrame = (m_nextFrameTime - m_project->nodes()->startTime()) * m_prefs.fps().fps() + .5; qreal srcTime = m_project->nodes()->sourceTime(m_nextFrameTime); qDebug() << "Rendering frame number " << outputFrame << " @" << m_nextFrameTime << " from source time " << srcTime; updateMessage(tr("Rendering frame %1 @ %2 s from input position: %3 s (frame %4)") .arg( QString::number(outputFrame),QString::number(m_nextFrameTime), QString::number(srcTime), QString::number(srcTime*m_project->frameSource()->fps()->fps()) ) ); try { QImage rendered = m_project->render(m_nextFrameTime, m_prefs); m_renderTarget->slotConsumeFrame(rendered, outputFrame); m_nextFrameTime = m_nextFrameTime + 1/m_prefs.fps().fps(); updateProgress( (m_nextFrameTime-m_timeStart) * m_prefs.fps().fps() ); } catch (FlowBuildingError &err) { m_stopRendering = true; emit signalRenderingAborted(err.message()); } catch (InterpolationError &err) { updateMessage(err.message()); } } /* while */ // Checks if the process should be aborted mutex.lock(); bool abort = m_stopRendering; mutex.unlock(); if (abort) { qDebug() << "Rendering : aborting"; updateMessage(tr("Rendering : aborting")); } else { //TODO: closing rendering project qDebug() << "Rendering : exporting"; updateMessage(tr("Rendering : exporting")); m_renderTarget->closeRenderTarget(); } m_renderTimeElapsed = m_stopwatch.elapsed(); qDebug() << "time : " << m_renderTimeElapsed; emit signalRenderingFinished(QTime(0,0).addMSecs(m_renderTimeElapsed).toString("hh:mm:ss")); qDebug() << "Rendering stopped after " << QTime(0,0).addMSecs(m_renderTimeElapsed).toString("hh:mm:ss"); qDebug()<<"Rendering process finished in Thread "<currentThreadId(); // Set _working to false, meaning the process can't be aborted anymore. mutex.lock(); _working = false; mutex.unlock(); } slowmovideo-0.5+git20180116/src/slowmoVideo/project/segment_sV.cpp0000664000000000000000000000167013151342440023412 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. 
Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "segment_sV.h" Segment_sV::Segment_sV(int leftNodeIndex) : m_leftNodeIndex(leftNodeIndex), m_selected(false) { } int Segment_sV::leftNodeIndex() const { return m_leftNodeIndex; } bool Segment_sV::selected() const { return m_selected; } void Segment_sV::select(bool select) { m_selected = select; } bool Segment_sV::operator <(const Segment_sV& other) const { return m_leftNodeIndex < other.m_leftNodeIndex; } QString toString(const Segment_sV &segment) { return QString("Left node: %1; selected: %2").arg(segment.leftNodeIndex()).arg(segment.selected()); } slowmovideo-0.5+git20180116/src/slowmoVideo/project/CMakeLists.txt0000664000000000000000000000231313151342440023327 0ustar rootrootset(SRCS_PROJ project_sV.cpp projectPreferences_sV.cpp renderPreferences_sV.cpp tag_sV.cpp node_sV.cpp nodeList_sV.cpp segment_sV.cpp segmentList_sV.cpp nodeHandle_sV.cpp renderTask_sV.cpp xmlProjectRW_sV.cpp abstractFrameSource_sV.cpp imagesFrameSource_sV.cpp videoFrameSource_sV.cpp emptyFrameSource_sV.cpp abstractRenderTarget_sV.cpp imagesRenderTarget_sV.cpp videoRenderTarget_sV.cpp abstractFlowSource_sV.cpp flowSourceOpenCV_sV.cpp flowSourceV3D_sV.cpp interpolator_sV.cpp shutterFunction_sV.cpp shutterFunctionList_sV.cpp motionBlur_sV.cpp canvasObject_sV.h exportVideoRenderTarget.cpp work_flow.cpp ) if (OLD_FFMPEG) set(SRCS_PROJ ${SRCS_PROJ} new_videoRenderTarget.cpp ) endif() set(SRCS_MOC project_sV.h renderTask_sV.h abstractFrameSource_sV.h imagesFrameSource_sV.h videoFrameSource_sV.h emptyFrameSource_sV.h work_flow.h ) qt_wrap_cpp(MOC_OUT ${SRCS_MOC}) include_directories(${FFMPEG_INCLUDE_PATHS}) add_library(sVproj STATIC ${SRCS_PROJ} ${MOC_OUT}) qt_use_modules(sVproj Core) qt_use_modules(sVproj Gui) qt_use_modules(sVproj Xml) qt_use_modules(sVproj Script) target_link_libraries(sVproj sV sVinfo sVflow sVencode ${EXTERNAL_LIBS}) slowmovideo-0.5+git20180116/src/slowmoVideo/project/imagesFrameSource_sV.h0000664000000000000000000000330313151342440025011 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef IMAGESFRAMESOURCE_SV_H #define IMAGESFRAMESOURCE_SV_H #include "abstractFrameSource_sV.h" #include #include #include class Project_sV; /** \todo Allow re-ordering of images \todo Check image resolution more efficiently for large number of images \todo Support OpenEXR or similar through ffmpeg. 16-bit images. 
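    \par Usage sketch
    Illustrative only (not from the original documentation); the file list is
    hypothetical and 'project' is assumed to be an existing Project_sV*:
    \code
    QStringList images;
    images << "/tmp/frame_0001.png" << "/tmp/frame_0002.png";
    ImagesFrameSource_sV *src = new ImagesFrameSource_sV(project, images);
    project->loadFrameSource(src);   // the project takes ownership
    \endcode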
*/ class ImagesFrameSource_sV : public AbstractFrameSource_sV { Q_OBJECT public: ImagesFrameSource_sV(Project_sV *project, QStringList images) throw(FrameSourceError); static QString validateImages(const QStringList images); void initialize(); bool initialized() const; int64_t framesCount() const; const Fps_sV* fps() const; QImage frameAt(const uint frame, const FrameSize frameSize = FrameSize_Orig); const QString framePath(const uint frame, const FrameSize frameSize) const; const QStringList inputFiles() const; void loadOrigFrames() { }; // TODO public slots: void slotAbortInitialization(); void slotUpdateProjectDir(); private: QStringList m_imagesList; QDir m_dirImagesSmall; QSize m_sizeSmall; Fps_sV m_fps; bool m_initialized; bool m_stopInitialization; int m_nextFrame; void createDirectories(); private slots: void slotContinueInitialization(); }; #endif // IMAGESFRAMESOURCE_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/project_sV.h0000664000000000000000000001260413151342440023062 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef PROJECT_SV_H #define PROJECT_SV_H #include "tag_sV.h" #include "nodeList_sV.h" #include "renderPreferences_sV.h" #include "../lib/defs_sV.hpp" #include "../lib/videoInfo_sV.h" #include #include #include #include #include class ProjectPreferences_sV; class Flow_sV; class AbstractFrameSource_sV; class AbstractFlowSource_sV; class MotionBlur_sV; class ShutterFunctionList_sV; class RenderTask_sV; class FlowField_sV; class QSignalMapper; class QProcess; class QRegExp; class QTimer; class WorkerFlow; /** \brief slowmoVideo project holding all important information. \todo Check if width%4 == 0 for V3D */ class Project_sV : public QObject { Q_OBJECT public: /** Creates an empty project. A video file can be loaded with loadFile(QString, QString). */ Project_sV(); /** Creates a new project. \param filename Input video file \param projectDir Project directory; All cached files will be put in there. */ Project_sV(QString projectDir); ~Project_sV(); ProjectPreferences_sV* preferences() { return m_preferences; } void setProjectDir(QString projectDir); void setupProjectDir(); // TODO: QDir getProjectDir() { return m_projDir; }; /** The project filename should be set when saving or loading the project. */ void setProjectFilename(QString filename); /** \return The filename this project was last saved as. */ QString projectFilename() const; /** \param frameSource will be managed (deleted) by the project. If \c NULL, an empty frame source will be used. */ void loadFrameSource(AbstractFrameSource_sV *frameSource); AbstractFrameSource_sV* frameSource() { return m_frameSource; } AbstractFlowSource_sV* flowSource() { return m_flowSource; } NodeList_sV *nodes() const { return m_nodes; } QList *tags() const { return m_tags; } ShutterFunctionList_sV* shutterFunctions() { return m_shutterFunctions; } MotionBlur_sV *motionBlur() { return m_motionBlur; } /** \see replaceRenderTask() */ RenderTask_sV *renderTask() { return m_renderTask; } void replaceRenderTask(RenderTask_sV *task); /** \fn snapToFrame() \brief Snaps in the given time on a grid given by the number of frames per second. 
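  A small worked example (hedged; assumes plain rounding on the 1/fps grid):
  \code
  // fps = 30, time = 3.21 s, roundUp = false:
  //   frame index  = floor(3.21 * 30) = 96   ->  *out_framesBeforeHere = 96
  //   snapped time = 96 / 30 = 3.2 s
  // With roundUp = true the same input snaps to 97/30 ≈ 3.233 s instead.
  \endcode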
This allows to, for example, render from 0 to 3.2 seconds and then from 3.2 to 5 seconds to images with the same effect as rendering all at once. Always starts from 0! \param time Time to snap in \param roundUp To chose between rounding up or down \param fps Frames per second to use. */ /** \fn toOutTime() \brief Converts a time expression like \c f:123 or \c :start to an output time \c float. Accepted input format: \li \c 24.3 or \c t:24.3 for 24.3 seconds \li \c f:123 for frame 123 \li \c p:25% for 25 % \li \c l:slowdown for the slowdown label (tag) \li \c :start and \c :end for project start/end */ static qreal snapToFrame(const qreal time, bool roundUp, const Fps_sV& fps, int* out_framesBeforeHere); qreal snapToOutFrame(qreal time, bool roundUp, const Fps_sV& fps, int* out_framesBeforeHere) const; qreal toOutTime(QString timeExpression, const Fps_sV& fps) const throw(Error_sV); const QDir getDirectory(const QString &name, bool createIfNotExists = true) const; QImage render(qreal outTime, RenderPreferences_sV prefs); FlowField_sV* requestFlow(int leftFrame, int rightFrame, const FrameSize frameSize) throw(FlowBuildingError); /** \brief Searches for objects near the given \c pos. This search includes tags. \see NodeList_sV::objectsNear() Used by this method. Does not include tags. */ QList objectsNear(QPointF pos, qreal tmaxdist) const; public: /// Reload the flow source in case the user changed the default (preferred) method. void reloadFlowSource(); // prebuilt the need optical flow files void buildCacheFlowSource(); private: QDir m_projDir; QString m_projectFilename; ProjectPreferences_sV *m_preferences; AbstractFrameSource_sV *m_frameSource; AbstractFlowSource_sV *m_flowSource; MotionBlur_sV *m_motionBlur; NodeList_sV *m_nodes; QList *m_tags; //TODO: remove this RenderTask_sV *m_renderTask; ShutterFunctionList_sV *m_shutterFunctions; qreal sourceTimeToFrame(qreal time) const; QDir getDirectoryName(); void init(); /** * @brief Thread object which will let us manipulate the running thread */ QThread *thread[4]; /** * @brief Object which contains methods that should be runned in another thread */ WorkerFlow *worker[4]; private: /// Count how many times V3D failed, after a certain limit we assume the user does not have an nVidia card /// and constantly switch to OpenCV int m_v3dFailCounter; // launch a worker thread for optical flow void startFlow(int threadid,const FrameSize frameSize,int direction); }; #endif // PROJECT_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/abstractFrameSource_sV.h0000664000000000000000000000742713151342440025362 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef ABSTRACTFRAMESOURCE_SV_H #define ABSTRACTFRAMESOURCE_SV_H #include "../lib/defs_sV.hpp" #include #include #include class Project_sV; class Div0Exception {}; /** \brief Represents a source for input frames, like a video or an image sequence */ class AbstractFrameSource_sV : public QObject { Q_OBJECT public: AbstractFrameSource_sV(const Project_sV *project); virtual ~AbstractFrameSource_sV(); const Project_sV* project() { return m_project; } /** \fn initialize() Initializes the frame source (like extracting frames from a video). 
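  A hedged sketch of a re-implementation in a hypothetical subclass (MySource_sV is
  not part of slowmoVideo; the actual frame extraction work is omitted):
  \code
  void MySource_sV::initialize()
  {
      emit signalNextTask(tr("Preparing frames"), int(framesCount()));
      for (int64_t i = 0; i < framesCount(); i++) {
          // ... extract or verify frame i here ...
          emit signalTaskProgress(int(i));
      }
      emit signalAllTasksFinished();
  }
  \endcode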
When re-implementing this method, the parent function must be called in order for initialized() to work. \see signalNextTask() and the other signals */ /** \fn initialized() \return \c true, if the frame source has been initialized. */ virtual void initialize() = 0; virtual bool initialized() const = 0; /** \fn framesCount() \return Number of frames in this frame source */ virtual int64_t framesCount() const = 0; virtual const Fps_sV* fps() const = 0; double maxTime() const throw(Div0Exception); /* * add a qcache here for perf loading */ QCache frameCache; // use it in frameAt /** \fn frameAt() \return The frame at the given position, as image. Fails if the frames have not been extracted yet. */ /** \fn framePath() \return The path to the frame at position \c frame */ virtual QImage frameAt(const uint frame, const FrameSize frameSize = FrameSize_Orig) = 0; virtual const QString framePath(const uint frame, const FrameSize frameSize = FrameSize_Orig) const = 0; virtual void loadOrigFrames() =0; signals: /** \fn void signalNextTask(const QString taskDescription, int taskSize) A new task has been started, like extracting frames from a video when loading a new frame source. \param taskSize Number of task items in this task (e.g. number of frames to extract from a video file) */ /** \fn void signalTaskProgress(int progress) The current task has made progress. \param progress Task item that has been completed. Should be smaller than taskSize given in signalNextTask(). */ /** \fn void signalTaskItemDescription(const QString desc) \param desc Description for the current task (like the file name of an extracted frame) */ /** \fn void signalAllTasksFinished() All due tasks have been completed. */ void signalNextTask(const QString taskDescription, int taskSize); void signalTaskProgress(int progress); void signalTaskItemDescription(const QString desc); void signalAllTasksFinished(); public slots: /** \fn slotAbortInitialization() Should abort the initialization of the frame source since this might take a long time (e.g. extracting large frames from a long video, or re-sizing lots of frames). */ /** \fn slotUpdateProjectDir() Informs the frame source that the project directory has changed. If the frame source created sub-directories in the old project directories, it can e.g. delete them and create them at the new place. */ virtual void slotAbortInitialization() = 0; virtual void slotUpdateProjectDir() = 0; protected: const Project_sV *m_project; }; #endif // ABSTRACTFRAMESOURCE_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/imagesRenderTarget_sV.h0000664000000000000000000000200313151342440025160 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #ifndef IMAGESRENDERTARGET_SV_H #define IMAGESRENDERTARGET_SV_H #include "abstractRenderTarget_sV.h" #include class RenderTask_sV; class ImagesRenderTarget_sV : public AbstractRenderTarget_sV { public: ImagesRenderTarget_sV(RenderTask_sV *parentRenderTask); virtual void openRenderTarget() throw(Error_sV) {} ; virtual void closeRenderTarget() throw(Error_sV) {} ; void setTargetDir(const QDir dir); bool setFilenamePattern(const QString pattern); public slots: void slotConsumeFrame(const QImage &image, const int frameNumber); private: QDir m_targetDir; QString m_filenamePattern; }; #endif // IMAGESRENDERTARGET_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/flowSourceOpenCV_sV.h0000664000000000000000000000441613151342440024621 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2012 Lucas Walter 2012 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef FLOWSOURCEOPENCV_SV_H #define FLOWSOURCEOPENCV_SV_H #include "abstractFlowSource_sV.h" #include #include "opencv2/core/version.hpp" #include "opencv2/video/tracking.hpp" #include "opencv2/imgproc/imgproc.hpp" #include "opencv2/highgui/highgui.hpp" #include "opencv2/opencv_modules.hpp" #if CV_MAJOR_VERSION == 2 #include "opencv2/core/gpumat.hpp" #ifdef HAVE_OPENCV_OCL #include "opencv2/ocl/ocl.hpp" #endif #else #include "opencv2/core/ocl.hpp" #endif class FlowSourceOpenCV_sV : public AbstractFlowSource_sV { public: FlowSourceOpenCV_sV(Project_sV *project, int algo, int ocl_dev_index); ~FlowSourceOpenCV_sV() {} virtual FlowField_sV* buildFlow(uint leftFrame, uint rightFrame, FrameSize frameSize) throw(FlowBuildingError); virtual const QString flowPath(const uint leftFrame, const uint rightFrame, const FrameSize frameSize = FrameSize_Orig) const; void setupOpticalFlow(const int levels,const int winsize,const double polySigma, const double pyrScale, const int polyN); void setupTVL1(const double tau, const double lambda, const int nscales, const int warps, const int iterations, const double epsilon); private: int ocl_device_index; int algo; // optical flow Farn int numLevels; int numIters; int winSize; double polySigma; double pyrScale; int polyN; int flags; // optical TVL1 double tau; double lambda; int warps; int nscales; int iterations; double epsilon; #if CV_MAJOR_VERSION == 3 void buildFlowOpenCV_3(cv::UMat& prevgray, cv::UMat& gray, std::string flowfilename); #else void buildFlowOpenCV_CPU(cv::Mat& prevgray, cv::Mat& gray, std::string flowfilename); #ifdef HAVE_OPENCV_OCL void buildFlowOpenCV_OCL(cv::Mat& prevgray, cv::Mat& gray, std::string flowfilename); void setupOclDevice(); #endif #endif void dumpAlgosParams(); }; #endif // FLOWSOURCEOPENCV_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/shutterFunction_sV.h0000664000000000000000000000504713151342440024623 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef SHUTTERFUNCTION_SV_H #define SHUTTERFUNCTION_SV_H #include #include #include class QScriptEngine; /** \brief Defines the shutter length over a node segment. 
The function is written in QtScript which is basically ECMAScript (known from JavaScript). Examples: \code // 180° shutter return 1/2*dy; \endcode \code // (replay speed)² return Math.pow(dy/dx, 2); \endcode \code // Variable declarations var dx = 1/fps; var speed = dy/dx; if (speed < 1) { speed = 0; } return speed; \endcode */ class ShutterFunction_sV { public: /// Creates a default shutter function (evaluates to 0) ShutterFunction_sV(); /// Creates a shutter function from the given QtScript code ShutterFunction_sV(const QString& function); /// Copy constructor ShutterFunction_sV(const ShutterFunction_sV& other); /// Destructor ~ShutterFunction_sV(); /// Default template header with some comments and the available variables static QString templateHeader; /// Example body, static QString templateBody; /// Default template footer static QString templateFooter; /// Does not do any checking/validating of the ID. void setID(const QString id); /// Compiles the given function code void updateFunction(const QString& function); /// Function's ID (empty by default) QString id() const; /// Code used by this function QString function() const; /** Evaluates the function with the given parameters. The returned value is the desired shutter duration in seconds. \param x x location between two nodes, scaled to
[0,1]
\param t Output time \param fps Output frames per second \param y Source frame at position x \param dy y change to the next frame \return Shutter duration in seconds */ float evaluate(const float x, const float t, const float fps, const float y, const float dy); private: QString m_id; QString m_function; QScriptEngine *m_scriptEngine; QScriptValue m_compiledFunction; void init(); void operator =(const ShutterFunction_sV &other); }; #endif // SHUTTERFUNCTION_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/videoRenderTarget_sV.cpp0000664000000000000000000000414113151342440025361 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ // Against the «UINT64_C not declared» message. // See: http://code.google.com/p/ffmpegsource/issues/detail?id=11 #ifdef __cplusplus #define __STDC_CONSTANT_MACROS #ifdef _STDINT_H #undef _STDINT_H #endif # include #endif #include "videoRenderTarget_sV.h" #include "renderTask_sV.h" #include extern "C" { #include "../lib/ffmpegEncode_sV.h" } VideoRenderTarget_sV::VideoRenderTarget_sV(RenderTask_sV *parentRenderTask) : AbstractRenderTarget_sV(parentRenderTask) { m_videoOut = (VideoOut_sV*)malloc(sizeof(VideoOut_sV)); } VideoRenderTarget_sV::~VideoRenderTarget_sV() { free(m_videoOut); } void VideoRenderTarget_sV::setTargetFile(const QString &filename) { m_filename = filename; } void VideoRenderTarget_sV::setVcodec(const QString &codec) { m_vcodec = codec; } void VideoRenderTarget_sV::openRenderTarget() throw(Error_sV) { char *vcodec = NULL; if (m_vcodec.length() > 0) { vcodec = (char*)malloc(m_vcodec.length()+1); strcpy(vcodec, m_vcodec.toStdString().c_str()); } int worked = prepare(m_videoOut, m_filename.toStdString().c_str(), vcodec, renderTask()->resolution().width(), renderTask()->resolution().height(), renderTask()->fps().fps() * renderTask()->resolution().width() * renderTask()->resolution().height(), renderTask()->fps().den, renderTask()->fps().num); if (worked != 0) { throw Error_sV(QObject::tr("Video could not be prepared (error code %1).\n%2").arg(worked).arg(m_videoOut->errorMessage)); } } void VideoRenderTarget_sV::slotConsumeFrame(const QImage &image, const int frameNumber) { eatARGB(m_videoOut, image.bits()); } void VideoRenderTarget_sV::closeRenderTarget() throw(Error_sV) { finish(m_videoOut); } slowmovideo-0.5+git20180116/src/slowmoVideo/project/videoFrameSource_sV.h0000664000000000000000000000622613151342440024661 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef VIDEOFRAMESOURCE_SV_H #define VIDEOFRAMESOURCE_SV_H #include "abstractFrameSource_sV.h" #include "../lib/defs_sV.hpp" #include "../lib/avconvInfo_sV.h" #include #include #include #include #include #include "../lib/videoInfo_sV.h" class QProcess; class Project_sV; /** \brief Uses frames from a video file \todo Use libav directly for frame extraction? 
(not the ffmpeg command) \todo Extract full frames only before rendering, only used ones */ class VideoFrameSource_sV : public AbstractFrameSource_sV { Q_OBJECT public: /** Builds a new video frame source from the given file. */ VideoFrameSource_sV(const Project_sV *project, const QString &filename) throw(FrameSourceError); ~VideoFrameSource_sV(); void initialize(); bool initialized() const; int64_t framesCount() const; void setFramesCount(int64_t framesCount); const Fps_sV* fps() const; QImage frameAt(const uint frame, const FrameSize frameSize = FrameSize_Orig); const QString framePath(const uint frame, const FrameSize frameSize) const; /** \return The absolute path of the input video file. */ const QString videoFile() const; void loadOrigFrames(); public slots: void slotAbortInitialization(); void slotUpdateProjectDir(); private: static QRegExp regexFrameNumber; private: QFile m_inFile; QDir m_dirFramesSmall; QDir m_dirFramesOrig; QSettings m_settings; AvconvInfo m_avconvInfo; VideoInfoSV *m_videoInfo; Fps_sV m_fps; QTimer *m_timer; QProcess *m_ffmpeg; QSemaphore m_ffmpegSemaphore; bool m_initialized; int64_t cur_frame; void createDirectories(); /** Extracts the frames from the video file into single images */ void extractFramesFor(const FrameSize frameSize, QProcess *process); /** Checks the availability of the frames and decides whether they need to be extracted with extractFrames() */ bool rebuildRequired(const FrameSize frameSize); void locateFFmpeg(); public: static bool testFfmpegExecutable(QString path); signals: /** Emitted when the task for extracting original-sized images has finished (or has been terminated) */ void signalExtractOrigFramesFinished(); /** Emitted when the task for extracting thumbnail-sized images has finished (or has been terminated) */ void signalExtractSmallFramesFinished(); private slots: void slotExtractOrigFrames(); void slotExtractSmallFrames(); void slotInitializationFinished(); /** Checks the progress of the ffmpeg threads by reading their stderr and emits signalTaskProgress() and signalTaskItemDescription() if necessary. */ void slotProgressUpdate(); }; #endif // VIDEOFRAMESOURCE_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/flowSourceOpenCV_sV.cpp0000664000000000000000000002641113151342440025153 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2012 Lucas Walter 2012 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #include "flowSourceOpenCV_sV.h" #include "project_sV.h" #include "abstractFrameSource_sV.h" #include "../lib/flowRW_sV.h" #include "../lib/flowField_sV.h" #include #include #include #include using namespace cv; FlowSourceOpenCV_sV::FlowSourceOpenCV_sV(Project_sV *project, int _algo, int _ocl_dev_idx) : AbstractFlowSource_sV(project) { ocl_device_index = _ocl_dev_idx; algo = _algo; createDirectories(); } /** * create a optical flow file * * @param flow optical flow to save * @param flowname file name for optical flow */ void drawOptFlowMap(const Mat& flow, std::string flowname ) { FlowField_sV flowField(flow.cols, flow.rows); //qDebug() << "flow is : " << flow.cols << " by " << flow.rows; for(int y = 0; y < flow.rows; y++) for(int x = 0; x < flow.cols; x++) { const Point2f& fxyo = flow.at(y, x); flowField.setX(x, y, fxyo.x); flowField.setY(x, y, fxyo.y); } FlowRW_sV::save(flowname, &flowField); } void drawOptFlowMapSeparateXandY(const Mat& flowx, const Mat& flowy, std::string flowname ) { FlowField_sV flowField(flowx.cols, flowy.rows); for(int y = 0; y < flowx.rows; y++) { for(int x = 0; x < flowx.cols; x++) { const float flowx_float = flowx.at(y, x); const float flowy_float = flowy.at(y, x); flowField.setX(x, y, flowx_float); flowField.setY(x, y, flowy_float); } } FlowRW_sV::save(flowname, &flowField); } /** * build path of flow file * * @param leftFrame left frame for flow * @param rightFrame right frame * @param frameSize resolution (small/orig) * * @return name of flow file */ const QString FlowSourceOpenCV_sV::flowPath(const uint leftFrame, const uint rightFrame, const FrameSize frameSize) const { QDir dir; if (frameSize == FrameSize_Orig) { dir = m_dirFlowOrig; } else { dir = m_dirFlowSmall; } QString direction; if (leftFrame < rightFrame) { direction = "forward"; } else { direction = "backward"; } return dir.absoluteFilePath(QString("ocv-%1-%2-%3.sVflow").arg(direction).arg(leftFrame).arg(rightFrame)); } /** * setup parameter value for flow algorithm * * @param levels number of pyramide level * @param winsize windows size * @param polySigma sigma * @param pyrScale pyramide scale * @param polyN <#polyN description#> */ void FlowSourceOpenCV_sV::setupOpticalFlow(const int levels, const int winsize, const double polySigma, const double pyrScale, const int polyN) { qDebug() << "setup Optical Flow "; this->pyrScale = pyrScale; this->polyN = polyN; this->polySigma = polySigma; this->flags = 0; this->numLevels = levels; this->winSize = winsize; //const int iterations = 8; // 10 this->numIters = 8; } void FlowSourceOpenCV_sV::setupTVL1(const double tau, const double lambda, const int nscales, const int warps, const int iterations, const double epsilon) { qDebug() << "setup Optical Flow TLV1"; this->tau = tau; this->lambda = lambda; this->nscales = nscales; this->warps = warps; this->iterations = iterations; this->epsilon = epsilon; } FlowField_sV* FlowSourceOpenCV_sV::buildFlow(uint leftFrame, uint rightFrame, FrameSize frameSize) throw(FlowBuildingError) { #if CV_MAJOR_VERSION == 2 #ifdef HAVE_OPENCV_OCL if (ocl_device_index >= 0) { setupOclDevice(); } #endif #endif QString flowFileName(flowPath(leftFrame, rightFrame, frameSize)); /// \todo Check if size is equal if (!QFile(flowFileName).exists()) { QTime time; time.start(); QString prevpath = project()->frameSource()->framePath(leftFrame, frameSize); QString path = project()->frameSource()->framePath(rightFrame, frameSize); qDebug() << "Building flow for left frame " << leftFrame << " to right frame " << rightFrame << "; Size: " << 
frameSize; // check if file have been generated ! //TODO: maybe better error handling ? if (!QFile(prevpath).exists()) throw FlowBuildingError(QString("Could not read image " + prevpath)); if (!QFile(path).exists()) throw FlowBuildingError(QString("Could not read image " + path)); cv::Mat prevgray, gray; prevgray = cv::imread(prevpath.toStdString(), CV_LOAD_IMAGE_ANYDEPTH); gray = cv::imread(path.toStdString(), CV_LOAD_IMAGE_ANYDEPTH); #if CV_MAJOR_VERSION == 3 cv::UMat uprevgray, ugray; prevgray.copyTo(uprevgray); gray.copyTo(ugray); #endif { if (!prevgray.empty()) { #if CV_MAJOR_VERSION == 3 buildFlowOpenCV_3(uprevgray, ugray, flowFileName.toStdString()); #else #ifdef HAVE_OPENCV_OCL if (ocl_device_index >= 0) { buildFlowOpenCV_OCL(prevgray, gray, flowFileName.toStdString()); } else { buildFlowOpenCV_CPU(prevgray, gray, flowFileName.toStdString()); } #else buildFlowOpenCV_CPU(prevgray, gray, flowFileName.toStdString()); #endif #endif } else { qDebug() << "imread: Could not read image " << prevpath; throw FlowBuildingError(QString("imread: Could not read image " + prevpath)); } } qDebug() << "Optical flow built for " << flowFileName << " in " << time.elapsed() << " ms."; } else { qDebug().nospace() << "Re-using existing flow image for left frame " << leftFrame << " to right frame " << rightFrame << ": " << flowFileName; } try { return FlowRW_sV::load(flowFileName.toStdString()); } catch (FlowRW_sV::FlowRWError &err) { throw FlowBuildingError(err.message.c_str()); } } void FlowSourceOpenCV_sV::dumpAlgosParams() { if (algo == 1) { // DualTVL1 qDebug() << "flow via TLV1 algo." << " lambda:" << lambda << " tau:" << tau << " nscales:" << nscales << "warps:" << warps << " iterations:" << iterations << "epsilon:" << epsilon; } else { // _FARN_ qDebug() << "flow via Farneback algo." << " pyrScale:" << pyrScale << " numLevels:" << numLevels << " winSize:" << winSize << " numIters:" << numIters << " polyN:" << polyN << " polySigma:" << polySigma << " flags:" << flags; } } #if CV_MAJOR_VERSION == 3 void FlowSourceOpenCV_sV::buildFlowOpenCV_3(cv::UMat& uprevgray, cv::UMat& ugray, std::string flowfilename) { dumpAlgosParams(); qDebug() << "Have OpenCL: " << cv::ocl::haveOpenCL() << " useOpenCL:" << cv::ocl::useOpenCL(); UMat uflow; if (algo == 1) { // DualTVL1 cv::Ptr tvl1 = cv::createOptFlow_DualTVL1(); tvl1->setLambda(lambda); tvl1->setTau(tau); tvl1->setScalesNumber(nscales); tvl1->setWarpingsNumber(warps); tvl1->setOuterIterations(iterations); tvl1->setEpsilon(epsilon); tvl1->calc( uprevgray, ugray, uflow ); } else { // _FARN_ calcOpticalFlowFarneback( uprevgray, ugray, uflow, pyrScale, //0.5, numLevels, //3, winSize, //15, numIters, //8, polyN, //5, polySigma, //1.2, flags //0 ); } Mat flow; uflow.copyTo(flow); qDebug() << "finished"; drawOptFlowMap(flow, flowfilename); } #else // start CV_MAJOR_VERSION != 3 void FlowSourceOpenCV_sV::buildFlowOpenCV_CPU(cv::Mat& prevgray, cv::Mat& gray, std::string flowfilename) { dumpAlgosParams(); cv::Mat_ flow; if (algo == 1) { // DualTVL1 cv::Ptr tvl1 = cv::createOptFlow_DualTVL1(); tvl1->set("lambda", lambda); tvl1->set("tau", tau); tvl1->set("nscales", nscales); tvl1->set("warps", warps); tvl1->set("iterations", iterations); tvl1->set("epsilon", epsilon); tvl1->calc(prevgray, gray, flow); } else { // _FARN_ // TODO: check to use prev flow as initial flow ? 
(flags) //gray, prevgray, // TBD this seems to match V3D output better but a sign flip could also do that calcOpticalFlowFarneback( prevgray, gray, flow, pyrScale, //0.5, numLevels, //3, winSize, //15, numIters, //8, polyN, //5, polySigma, //1.2, flags //0 ); } qDebug() << "finished"; drawOptFlowMap(flow, flowfilename); } #ifdef HAVE_OPENCV_OCL /** * OpenCV2 OCL algos have memleaks. */ void FlowSourceOpenCV_sV::buildFlowOpenCV_OCL(cv::Mat& prevgray, cv::Mat& gray, std::string flowfilename) { dumpAlgosParams(); using namespace cv::ocl; oclMat ocl_flowx, ocl_flowy; if (algo == 1) { OpticalFlowDual_TVL1_OCL tvl1_ocl_alg; tvl1_ocl_alg.tau = tau; tvl1_ocl_alg.lambda = lambda; tvl1_ocl_alg.nscales = nscales; tvl1_ocl_alg.warps = warps; tvl1_ocl_alg.epsilon = epsilon; tvl1_ocl_alg.iterations = iterations; tvl1_ocl_alg(oclMat(prevgray), oclMat(gray), ocl_flowx, ocl_flowy); tvl1_ocl_alg.collectGarbage(); } else { FarnebackOpticalFlow farneback_ocl_algo; farneback_ocl_algo.numLevels = numLevels; farneback_ocl_algo.pyrScale = pyrScale; farneback_ocl_algo.pyrScale = pyrScale; farneback_ocl_algo.winSize = winSize; farneback_ocl_algo.numIters = numIters; farneback_ocl_algo.polyN = polyN; farneback_ocl_algo.polySigma = polySigma; farneback_ocl_algo.flags = flags; farneback_ocl_algo(oclMat(prevgray), oclMat(gray), ocl_flowx, ocl_flowy); farneback_ocl_algo.releaseMemory(); } Mat flowx, flowy; ocl_flowx.download(flowx); ocl_flowy.download(flowy); drawOptFlowMapSeparateXandY(flowx, flowy, flowfilename); } void FlowSourceOpenCV_sV::setupOclDevice() { qDebug() << "using olc device index: " << ocl_device_index; using namespace cv::ocl; PlatformsInfo platform_infos; getOpenCLPlatforms(platform_infos); int index = 0; for (unsigned int i = 0; i < platform_infos.size(); i++) { const PlatformInfo *pi = platform_infos[i]; for (unsigned int j = 0; j < pi->devices.size(); j++) { if (index == ocl_device_index) { const DeviceInfo *dic = pi->devices[j]; DeviceInfo *di = (DeviceInfo *)dic; di->deviceName = "ocl_devicename_slowmovideo"; setDevice(di); break; } } } } #endif // end if HAVE_OPENCV_OCL #endif // above CV_MAJOR_VERSION == 2 slowmovideo-0.5+git20180116/src/slowmoVideo/project/tag_sV.cpp0000664000000000000000000000075113151342440022522 0ustar rootroot#include "tag_sV.h" Tag_sV::Tag_sV(TagAxis axis) : m_axis(axis) { } Tag_sV::Tag_sV(qreal time, QString description, TagAxis axis) : m_axis(axis), m_time(time), m_description(description) { } void Tag_sV::setAxis(TagAxis axis) { m_axis = axis; } void Tag_sV::setDescription(QString desc) { m_description = desc; } void Tag_sV::setTime(qreal time) { m_time = time; } bool Tag_sV::operator <(const Tag_sV &other) const { return m_time < other.time(); } slowmovideo-0.5+git20180116/src/slowmoVideo/project/shutterFunction_sV.cpp0000664000000000000000000000560413151342440025155 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #include "shutterFunction_sV.h" #include #include #include QString ShutterFunction_sV::templateHeader( "// x: on [0,1] \n" "// t: output time \n" "// fps: Frames per second (1/dt) \n" "// y: source time at position x \n" "// dy: Source time delta to next output frame \n" "(function(x, t, fps, y, dy) \n" "{"); QString ShutterFunction_sV::templateBody( " // Replace this with your function \n" " return Math.sin(x*Math.PI);" ); QString ShutterFunction_sV::templateFooter( "})"); ShutterFunction_sV::ShutterFunction_sV() { init(); updateFunction(templateBody); } ShutterFunction_sV::ShutterFunction_sV(const QString &function) { init(); updateFunction(function); } ShutterFunction_sV::ShutterFunction_sV(const ShutterFunction_sV &other) { init(); m_id = other.m_id; updateFunction(other.m_function); } void ShutterFunction_sV::init() { m_scriptEngine = new QScriptEngine(); qDebug() << "Script engine initialized for function " << this; } ShutterFunction_sV::~ShutterFunction_sV() { delete m_scriptEngine; } void ShutterFunction_sV::operator =(const ShutterFunction_sV &other) { qDebug() << "Shutter functions should not be copied!"; Q_ASSERT(false); if (this != &other) { m_id = other.m_id; m_function = other.m_function; m_compiledFunction = other.m_compiledFunction; } } void ShutterFunction_sV::setID(const QString id) { m_id = id; } QString ShutterFunction_sV::id() const { return m_id; } QString ShutterFunction_sV::function() const { return m_function; } void ShutterFunction_sV::updateFunction(const QString &function) { m_function = function; QString f = QString("%1%2%3").arg(templateHeader).arg(m_function).arg(templateFooter); // qDebug() << "===== Function is:\n" << f << "\n====="; m_compiledFunction = m_scriptEngine->evaluate(f); } float ShutterFunction_sV::evaluate(const float x, const float t, const float fps, const float y, const float dy) { QScriptValueList args; args << x << t << fps << y << dy; QString result = m_compiledFunction.call(QScriptValue(), args).toString(); float val; bool ok = false; if (result.length() > 0) { val = result.toFloat(&ok); } if (!ok) { val = 0; // qDebug() << "Error: Could not evaluate function " << m_id; } else { // qDebug() << "Evaluated at " << x0 << ": " << val; } return val; } slowmovideo-0.5+git20180116/src/slowmoVideo/project/abstractProgressDialog.h0000664000000000000000000000030713151342440025411 0ustar rootroot #ifndef PROGRESSDIALOG_H #define PROGRESSDIALOG_H class progressDialog { public: progressDialog() {}; virtual ~progressDialog() {} virtual void updateProgress() = 0; }; #endif slowmovideo-0.5+git20180116/src/slowmoVideo/project/emptyFrameSource_sV.cpp0000664000000000000000000000111713151342440025236 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "emptyFrameSource_sV.h" EmptyFrameSource_sV::EmptyFrameSource_sV(const Project_sV *project) : AbstractFrameSource_sV(project), m_fps(24,1) { } void EmptyFrameSource_sV::initialize() { emit signalAllTasksFinished(); } slowmovideo-0.5+git20180116/src/slowmoVideo/project/flowSourceV3D_sV.h0000664000000000000000000000202513151342440024055 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. 
Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef V3DFLOWSOURCE_SV_H #define V3DFLOWSOURCE_SV_H #include "abstractFlowSource_sV.h" #include class FlowSourceV3D_sV : public AbstractFlowSource_sV { public: /** Creates a new flow source using V3D optical flow */ FlowSourceV3D_sV(Project_sV *project, double lambda = 10); ~FlowSourceV3D_sV() {} FlowField_sV* buildFlow(uint leftFrame, uint rightFrame, FrameSize frameSize) throw(FlowBuildingError); const QString flowPath(const uint leftFrame, const uint rightFrame, const FrameSize frameSize) const; static bool validateFlowBinary(const QString path); static QString correctFlowBinaryLocation(); private: }; #endif // V3DFLOWSOURCE_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/videoFrameSource_sV.cpp0000664000000000000000000002261313151342440025212 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "videoFrameSource_sV.h" #include "project_sV.h" #include #include #include #include QRegExp VideoFrameSource_sV::regexFrameNumber("frame=\\s*(\\d+)"); //not sure about that here, means unlimited ! const int tmout = (-1); /// \todo Check QProcess::exitCode() to find out if ffmpeg worked or not VideoFrameSource_sV::VideoFrameSource_sV(const Project_sV *project, const QString &filename) throw(FrameSourceError) : AbstractFrameSource_sV(project), m_inFile(filename), m_fps(1,1), m_ffmpegSemaphore(1), m_initialized(false), cur_frame(0) { if (!QFileInfo(filename).exists()) { throw FrameSourceError(tr("Video file %1 does not exist!").arg(filename)); } m_videoInfo = new VideoInfoSV(); // use copy constructor *m_videoInfo = getInfo(filename.toStdString().c_str()); if (m_videoInfo->streamsCount <= 0) { qDebug() << "Video info is invalid: " << filename; throw FrameSourceError(tr("Video is invalid, no streams found in %1").arg(filename)); } m_fps = Fps_sV(m_videoInfo->frameRateNum, m_videoInfo->frameRateDen); createDirectories(); locateFFmpeg(); m_ffmpeg = new QProcess(this); m_timer = new QTimer(this); QObject::connect(m_timer, SIGNAL(timeout()), this, SLOT(slotProgressUpdate())); } VideoFrameSource_sV::~VideoFrameSource_sV() { delete m_ffmpeg; delete m_timer; delete m_videoInfo; } void VideoFrameSource_sV::slotUpdateProjectDir() { //TODO: is it really needed ? 
// Delete old directories if they are empty //m_dirFramesSmall.rmdir("."); //m_dirFramesOrig.rmdir("."); createDirectories(); } void VideoFrameSource_sV::createDirectories() { m_dirFramesSmall = project()->getDirectory("frames/small"); m_dirFramesOrig = project()->getDirectory("frames/orig"); } void VideoFrameSource_sV::initialize() { if (!initialized()) { // Start the frame extraction process slotExtractSmallFrames(); } } bool VideoFrameSource_sV::initialized() const { return m_initialized; } int64_t VideoFrameSource_sV::framesCount() const { return m_videoInfo->framesCount; } void VideoFrameSource_sV::setFramesCount(int64_t framesCount) { //qDebug() << "setting frameCount "<< framesCount; m_videoInfo->framesCount = framesCount; } const Fps_sV* VideoFrameSource_sV::fps() const { return &m_fps; } QImage VideoFrameSource_sV::frameAt(const uint frame, const FrameSize frameSize) { // TODO: QString path = framePath(frame, frameSize); //qDebug() << "frameAt "<< frame; #if 0 if(frameCache.contains(path)) return *(frameCache.object(path)); #endif QImage _frame = QImage(path); #if 0 frameCache.insert(path, &_frame); #endif return _frame; } const QString VideoFrameSource_sV::videoFile() const { return m_inFile.fileName(); } const QString VideoFrameSource_sV::framePath(const uint frame, const FrameSize frameSize) const { QString dir; switch (frameSize) { case FrameSize_Orig: dir = m_dirFramesOrig.absolutePath(); break; case FrameSize_Small: default: dir = m_dirFramesSmall.absolutePath(); break; } // ffmpeg numbering starts with 1, therefore add 1 to the frame number #if QT_VERSION < QT_VERSION_CHECK(5, 0, 0) return QString("%1/frame%2.png").arg(dir).arg(frame+1, 5, 10, QChar::fromAscii('0')); #else return QString("%1/frame%2.png").arg(dir).arg(frame+1, 5, 10, QChar::fromLatin1('0')); #endif } void VideoFrameSource_sV::extractFramesFor(const FrameSize frameSize, QProcess *process) { QStringList args; args << "-i" << m_inFile.fileName(); args << "-f" << "image2"; if (frameSize == FrameSize_Small) { int w = m_videoInfo->width; int h = m_videoInfo->height; while (w > 600) { w /= 2; h /= 2; } qDebug() << "Thumbnail frame size: " << w << "x" << h; args << "-s" << QString("%1x%2").arg(w).arg(h); args << m_dirFramesSmall.absoluteFilePath("frame%05d.png"); } else { args << m_dirFramesOrig.absoluteFilePath("frame%05d.png"); } qDebug() << "Extracting frames with " << m_settings.value("binaries/ffmpeg", "ffmpeg").toString() << args; // { // QStringList a2; // a2 << "-version"; // process->start(m_settings.value("binaries/ffmpeg").toString());//, "ffmpeg").toString(), a2); // qDebug() << process->readAllStandardOutput(); // qDebug() << process->readAllStandardError(); // process->terminate(); // } process->start(m_settings.value("binaries/ffmpeg", "ffmpeg").toString(), args); qDebug() << process->readAllStandardOutput(); qDebug() << process->readAllStandardError(); } bool VideoFrameSource_sV::rebuildRequired(const FrameSize frameSize) { bool needsRebuild = false; QImage frame = frameAt(0, frameSize); needsRebuild |= frame.isNull(); //qDebug() << "last frame to check " << m_videoInfo->framesCount-1; // rewind a little bit to account rounding error... 
frame = frameAt(m_videoInfo->framesCount-10, frameSize); needsRebuild |= frame.isNull(); return needsRebuild; } void VideoFrameSource_sV::locateFFmpeg() { if (m_avconvInfo.locate(m_settings.value("binaries/ffmpeg", "").toString())) { m_settings.setValue("binaries/ffmpeg", m_avconvInfo.executablePath()); m_settings.sync(); } else { throw FrameSourceError(tr("ffmpeg/avconv executable not found! Cannot load video." "\n(It is also possible that it took a little long to respond " "due to high workload, so you might want to try again.)" #ifdef WINDOWS "\nPlease download the static ffmpeg build from ffmpeg.zeranoe.com " "and extract ffmpeg.exe in the same directory as slowmoUI.exe." #endif )); } } void VideoFrameSource_sV::slotExtractSmallFrames() { emit signalNextTask(tr("Extracting thumbnail-sized frames from the video file"), m_videoInfo->framesCount); m_timer->start(100); if (rebuildRequired(FrameSize_Small)) { m_ffmpegSemaphore.acquire(); m_ffmpeg->waitForFinished(tmout); m_ffmpeg->terminate(); disconnect(m_ffmpeg, SIGNAL(finished(int)), this, 0); connect(m_ffmpeg, SIGNAL(finished(int)), this, SLOT(slotExtractOrigFrames())); extractFramesFor(FrameSize_Small, m_ffmpeg); m_ffmpegSemaphore.release(); } else { slotExtractOrigFrames(); } } void VideoFrameSource_sV::slotExtractOrigFrames() { emit signalNextTask(tr("Extracting original-sized frames from the video file"), m_videoInfo->framesCount); m_timer->start(100); if (rebuildRequired(FrameSize_Orig)) { m_ffmpegSemaphore.acquire(); m_ffmpeg->waitForFinished(tmout); m_ffmpeg->terminate(); disconnect(m_ffmpeg, SIGNAL(finished(int)), this, 0); connect(m_ffmpeg, SIGNAL(finished(int)), this, SLOT(slotInitializationFinished())); extractFramesFor(FrameSize_Orig, m_ffmpeg); m_ffmpegSemaphore.release(); } else { slotInitializationFinished(); } } void VideoFrameSource_sV::slotInitializationFinished() { m_timer->stop(); emit signalAllTasksFinished(); m_ffmpegSemaphore.acquire(); m_ffmpeg->waitForFinished(tmout); QRegExp regex(regexFrameNumber); QString s; s = QString(m_ffmpeg->readAllStandardError()); qDebug() << "slotExtractOrigFrames : " << s << "end"; if (regex.lastIndexIn(s) >= 0) { //fprintf(stderr,"last frame is : %d\n",regex.cap(1).toInt()); setFramesCount(regex.cap(1).toLong()); } else { //fprintf(stderr, "last frame not found !\n" ); qDebug() << "last frame not found"; } m_ffmpeg->terminate(); m_ffmpegSemaphore.release(); if (!rebuildRequired(FrameSize_Small) && !rebuildRequired(FrameSize_Orig)) { m_initialized = true; } } void VideoFrameSource_sV::slotAbortInitialization() { m_ffmpegSemaphore.acquire(); if (m_ffmpeg != NULL) { m_ffmpeg->terminate(); } m_ffmpegSemaphore.release(); } void VideoFrameSource_sV::slotProgressUpdate() { QRegExp regex(regexFrameNumber); QString s; m_ffmpegSemaphore.acquire(); s = QString(m_ffmpeg->readAllStandardError()); if (regex.lastIndexIn(s) >= 0) { emit signalTaskProgress(regex.cap(1).toInt()); emit signalTaskItemDescription(tr("Frame %1 of %2").arg(regex.cap(1)).arg(m_videoInfo->framesCount)); } m_ffmpegSemaphore.release(); } void VideoFrameSource_sV::loadOrigFrames() { m_ffmpegSemaphore.acquire(); m_ffmpeg->waitForFinished(tmout); m_ffmpeg->terminate(); QTime time; time.start(); extractFramesFor(FrameSize_Orig, m_ffmpeg); m_ffmpeg->waitForFinished(tmout); m_ffmpeg->terminate(); qDebug() << "ffmpeg in " << time.elapsed() << "ms"; m_ffmpegSemaphore.release(); } slowmovideo-0.5+git20180116/src/slowmoVideo/project/imagesRenderTarget_sV.cpp0000664000000000000000000000304113151342440025516 0ustar rootroot/* This file 
is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "imagesRenderTarget_sV.h" #include ImagesRenderTarget_sV::ImagesRenderTarget_sV(RenderTask_sV *parentRenderTask) : AbstractRenderTarget_sV(parentRenderTask) { m_targetDir = QDir::temp(); m_filenamePattern = "rendered-%1.jpg"; } void ImagesRenderTarget_sV::setTargetDir(const QDir dir) { m_targetDir = dir; } bool ImagesRenderTarget_sV::setFilenamePattern(const QString pattern) { if (pattern.contains("%1")) { m_filenamePattern = pattern; return true; } return false; } void ImagesRenderTarget_sV::slotConsumeFrame(const QImage &image, const int frameNumber) { if (!m_targetDir.exists()) { m_targetDir.mkpath("."); } #if QT_VERSION < QT_VERSION_CHECK(5, 0, 0) QString path = m_targetDir.absoluteFilePath(m_filenamePattern.arg(frameNumber+1, 5, 10, QChar::fromAscii('0'))); #else QString path = m_targetDir.absoluteFilePath(m_filenamePattern.arg(frameNumber+1, 5, 10, QChar::fromLatin1('0'))); #endif bool ok; ok = image.save(path); if (!ok) { qDebug() << " Writing image to " << path << " failed!"; } else { qDebug() << " Saved frame number " << frameNumber << " to " << path; } } slowmovideo-0.5+git20180116/src/slowmoVideo/project/segmentList_sV.h0000664000000000000000000000144313151342440023711 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef SEGMENTLIST_SV_H #define SEGMENTLIST_SV_H #include "segment_sV.h" #include /** \brief List of Segment_sV */ class SegmentList_sV { public: SegmentList_sV(); void unselectAll(); void select(int segment); void shrink(); void grow(); int size() const; const Segment_sV& at(int i) const; Segment_sV& operator [](int i); private: QList m_list; }; #endif // SEGMENTLIST_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/node_sV.h0000664000000000000000000000454613151342440022347 0ustar rootroot/* slowmoUI is a user interface for slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef NODE_SV_H #define NODE_SV_H #include "../lib/defs_sV.hpp" #include "nodeHandle_sV.h" #include "canvasObject_sV.h" #include #include /** \brief Node for defining the input/output curve */ class Node_sV : public CanvasObject_sV { public: Node_sV(); Node_sV(const qreal &x, const qreal &y); Node_sV(const QPointF &point); Node_sV(const Node_sV &other); ~Node_sV() {} bool operator<(const Node_sV &other) const; bool operator==(const Node_sV &other) const; Node_sV operator-(const Node_sV &other) const; Node_sV operator+(const Node_sV &other) const; void operator+=(const Node_sV &other); void operator-=(const Node_sV &other); /// This assignment operator explicitly needs to be defined since otherwise /// the handle's parent relationship gets lost (pointer to a temporary node is copied). 
void operator =(const Node_sV &other); qreal x() const; qreal y() const; qreal xUnmoved() const; qreal yUnmoved() const; qreal setX(qreal x); qreal setY(qreal y); void select(bool); bool selected() const; void move(const Node_sV &dist); void abortMove(); void confirmMove(); const NodeHandle_sV& leftNodeHandle() const; const NodeHandle_sV& rightNodeHandle() const; CurveType leftCurveType() const; CurveType rightCurveType() const; const QString shutterFunctionID() const; void setLeftNodeHandle(qreal x, qreal y); void setRightNodeHandle(qreal x, qreal y); void setLeftCurveType(CurveType type); void setRightCurveType(CurveType type); void setShutterFunctionID(QString id); QPointF toQPointF() const; private: qreal m_x; qreal m_y; qreal m_moveX; qreal m_moveY; bool m_selected; NodeHandle_sV m_leftHandle; NodeHandle_sV m_rightHandle; CurveType m_leftCurveType; CurveType m_rightCurveType; QString m_shutterFunctionID; void init(); }; QDebug operator<<(QDebug qd, const Node_sV& n); #endif // NODE_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/abstractFlowSource_sV.h0000664000000000000000000000324413151342440025230 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef ABSTRACTFLOWSOURCE_SV_H #define ABSTRACTFLOWSOURCE_SV_H #include "../lib/defs_sV.hpp" #include #include class Project_sV; class FlowField_sV; class AbstractFlowSource_sV { public: AbstractFlowSource_sV(Project_sV *project); virtual ~AbstractFlowSource_sV() {} /** \return The flow field from \c leftFrame to \c rightFrame */ virtual FlowField_sV* buildFlow(uint leftFrame, uint rightFrame, FrameSize frameSize) throw(FlowBuildingError) = 0; /** \return The path to the flow file for the given frames */ virtual const QString flowPath(const uint leftFrame, const uint rightFrame, const FrameSize frameSize = FrameSize_Orig) const = 0; virtual void setLambda(double lambda) { m_lambda = lambda;} ; void clearFlowCache(); void createDirectories(); void cleardirectory(QDir dir); public slots: /** \fn slotUpdateProjectDir() Informs the flow source that the project directory has changed. If the flow source created sub-directories in the old project directories, it can e.g. delete them and create them at the new place. */ virtual void slotUpdateProjectDir() ; protected: Project_sV* project(); double m_lambda; QDir m_dirFlowSmall; QDir m_dirFlowOrig; private: Project_sV *m_project; }; #endif // ABSTRACTFLOWSOURCE_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/work_flow.h0000664000000000000000000000377513151342440022766 0ustar rootroot/* precalculate optical flow * 2014 Valery Brasseur */ #ifndef WORKERFLOW_H #define WORKERFLOW_H #include "project_sV.h" #include #include class WorkerFlow : public QObject { Q_OBJECT public: explicit WorkerFlow(QObject *parent = 0); /** * Requests the process to start * * It is thread safe as it uses #mutex to protect access to #_working variable. */ void requestWork(); /** * Requests the process to abort * * It is thread safe as it uses #mutex to protect access to #_abort variable. 
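     *
     * A hedged usage sketch showing where abort() fits into the usual worker-thread
     * wiring (object names and connections are illustrative; the actual code in
     * Project_sV::startFlow() may differ):
     * \code
     * WorkerFlow *worker = new WorkerFlow;
     * QThread *thread = new QThread;
     * worker->setProject(project);
     * worker->setFlowSource(project->flowSource());
     * worker->setFrameSize(FrameSize_Small);
     * worker->setDirection(1);
     * worker->moveToThread(thread);
     * QObject::connect(worker, SIGNAL(workFlowRequested()), worker, SLOT(doWorkFlow()));
     * QObject::connect(worker, SIGNAL(finished()), thread, SLOT(quit()));
     * thread->start();
     * worker->requestWork();   // sets _working and emits workFlowRequested()
     * // ... later, e.g. when the project is closed:
     * worker->abort();         // doWorkFlow() is expected to stop early once it sees the flag
     * \endcode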
*/ void abort(); /* * set running size */ void setFrameSize(FrameSize _frameSize); void setProject(Project_sV *_project); void setFlowSource(AbstractFlowSource_sV* _flowsource); void setDirection(int _forward) { forward = _forward;}; private: /** * Process is aborted when true */ bool _abort; /** * true when Worker is doing work */ bool _working; /** * Protects access to #_abort */ QMutex mutex; /** * which source flow */ AbstractFlowSource_sV* flowSource; /** * which size do we create */ FrameSize frameSize; /* * which project is concern */ Project_sV *project; int forward; #if 0 const QString flowPath(const uint leftFrame, const uint rightFrame, const FrameSize frameSize) const; QDir m_dirFlowSmall; QDir m_dirFlowOrig; #endif signals: /** * This signal is emitted when the Worker request to Work * requestWork() */ void workFlowRequested(); /** * This signal is emitted when counted value is changed (every sec) */ void valueChanged(const QString &value); /** * This signal is emitted when process is finished (or being aborted) */ void finished(); public slots: /** * calculate optical flow * */ void doWorkFlow(); }; #endif // WORKER_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/shutterFunctionList_sV.cpp0000664000000000000000000000626113151342440026011 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "shutterFunctionList_sV.h" #include "shutterFunction_sV.h" #include "nodeList_sV.h" #include #include #include QRegExp ShutterFunctionList_sV::regexID("^[\\d\\w]+$"); ShutterFunctionList_sV::ShutterFunctionList_sV(NodeList_sV *nodes) : m_nodes(nodes) { } ShutterFunctionList_sV::~ShutterFunctionList_sV() { for (int i = 0; i < m_functions.size(); i++) { delete m_functions.at(i); } } int ShutterFunctionList_sV::size() const { return m_functions.size(); } const QString ShutterFunctionList_sV::nextID() const { int nr = 1; QString id; bool stop = false; while (!stop) { stop = true; id = QString("func%1").arg(nr); for (int i = 0; i < m_functions.size(); i++) { if (m_functions.at(i)->id() == id) { nr++; stop = false; continue; } } } return id; } bool ShutterFunctionList_sV::updateID(ShutterFunction_sV *function, const QString id) { if (id.length() == 0 || regexID.indexIn(id) != 0) { qDebug() << "Not a valid ID: " << id; return false; } for (int i = 0; i < m_functions.size(); i++) { if (function != m_functions.at(i) && m_functions.at(i)->id() == id) { qDebug() << "ID already exists!"; return false; } } function->setID(id); return true; } ShutterFunction_sV* ShutterFunctionList_sV::addFunction(const ShutterFunction_sV function, bool generateID) { ShutterFunction_sV *fun = new ShutterFunction_sV(function); if (generateID) { fun->setID(nextID()); } for (int i = 0; i < m_functions.size(); i++) { if (m_functions.at(i)->id() == fun->id()) { qDebug() << "Function ID is already here!"; delete fun; Q_ASSERT(false); return NULL; } } m_functions.append(fun); return fun; } bool ShutterFunctionList_sV::removeFunction(const QString id) { for (int i = 0; i < m_functions.size(); i++) { if (m_functions.at(i)->id() == id) { delete m_functions.at(i); m_functions.removeAt(i); for (int i = 0; i < m_nodes->size(); i++) { if (m_nodes->at(i).shutterFunctionID() == id) { (*m_nodes)[i].setShutterFunctionID(""); } } return true; 
} } return false; } const ShutterFunction_sV* ShutterFunctionList_sV::at(int index) const { Q_ASSERT(index < m_functions.size()); return m_functions.at(index); } ShutterFunction_sV* ShutterFunctionList_sV::function(const QString id) { for (int i = 0; i < m_functions.size(); i++) { if (m_functions.at(i)->id() == id) { return m_functions[i]; } } return NULL; } slowmovideo-0.5+git20180116/src/slowmoVideo/project/motionBlur_sV.h0000664000000000000000000000636513151342440023555 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef MOTIONBLUR_SV_H #define MOTIONBLUR_SV_H #include #include #include "renderPreferences_sV.h" class Project_sV; /// Thrown if the frame range is too small for motion blur to still make sense class RangeTooSmallError_sV : public Error_sV { public: RangeTooSmallError_sV(QString msg) : Error_sV(msg) {} }; /// If a frame is closer than this to a full frame, the full frame will be used instead. #define MOTIONBLUR_PRECISION_LIMIT .01 /** \brief Renders motion blur \todo Force fast blurring for a segment? \todo Use .jpg for cached frames? */ class MotionBlur_sV { public: MotionBlur_sV(Project_sV *project); /** Selects either fastBlur() or slowmoBlur(), depending on the replay speed. \param replaySpeed Must be >= 0 */ QImage blur(float startFrame, float endFrame, float replaySpeed, RenderPreferences_sV prefs) throw(RangeTooSmallError_sV); /** Blurs frames using cached frames on fixed, coarse-grained intervals. If the replay speed is high enough, it does not matter if frame 1.424242 or frame 1.5 is used together with other frames for rendering motion blur. That way calculation can be sped up a little bit. */ QImage fastBlur(float startFrame, float endFrame, const RenderPreferences_sV &prefs) throw(RangeTooSmallError_sV); /** Blurs frames that are re-played at very low speed, such that fastBlur() cannot be used. The blurred parts of the image still need to move slowly, rounding frames to interpolate to 0.5 would not work therefore. */ QImage slowmoBlur(float startFrame, float endFrame, const RenderPreferences_sV& prefs); QImage convolutionBlur(float startFrame, float endFrame, float replaySpeed, const RenderPreferences_sV& prefs); QImage nearest(float startFrame, const RenderPreferences_sV& prefs); /** \fn setSlowmoSamples(); Sets the minimum number of samples for motion blur. This is ignored by fastBlur() where the interpolation scale is fixed (i.e. at most 1/8 steps between two frames). However slowmoBlur() uses this exact value for interpolating. */ /** \fn setMaxSamples(); Sets the maximum number of samples that are used for rendering motion blur. 
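    A hedged configuration sketch (the numbers are illustrative, not the defaults,
    and \c prefs is assumed to be a RenderPreferences_sV set up elsewhere):
    \code
    MotionBlur_sV *blur = project->motionBlur();
    blur->setSlowmoSamples(16);   // sample count used by slowmoBlur()
    blur->setMaxSamples(64);      // upper limit for all blur modes
    // blur() picks fastBlur() or slowmoBlur() depending on the replay speed and
    // may throw RangeTooSmallError_sV if the frame range is too small:
    QImage img = blur->blur(10.0f, 10.4f, 0.4f, prefs);
    \endcode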
*/ void setSlowmoSamples(int slowmoSamples); void setMaxSamples(int maxSamples); void setSlowmoMaxFrameDistance(float distance); int slowmoSamples() const { return m_slowmoSamples; } int maxSamples() const { return m_maxSamples; } public slots: void slotUpdateProjectDir(); private: Project_sV *m_project; QDir m_dirCacheSmall; QDir m_dirCacheOrig; int m_slowmoSamples; int m_maxSamples; float m_slowmoMaxFrameDist; QString cachedFramePath(float framePos, const RenderPreferences_sV &prefs, bool highPrecision = false); void createDirectories(); QDir cacheDir(FrameSize size) const; }; #endif // MOTIONBLUR_SV_H
slowmovideo-0.5+git20180116/src/slowmoVideo/project/nodeHandle_sV.cpp0000664000000000000000000000206013151342440024005 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "nodeHandle_sV.h" NodeHandle_sV::NodeHandle_sV() : QPointF(0, 0), m_parentNode(NULL) { } NodeHandle_sV::NodeHandle_sV(const QPointF &other) : QPointF(other), m_parentNode(NULL) { } NodeHandle_sV::NodeHandle_sV(qreal x, qreal y) : QPointF(x, y), m_parentNode(NULL) { } NodeHandle_sV::NodeHandle_sV(const NodeHandle_sV &other) : QPointF(other), m_parentNode(NULL) { /* the parent node is assigned afterwards by the owning Node_sV via setParentNode() */ } void NodeHandle_sV::setParentNode(Node_sV *node) { m_parentNode = node; } const Node_sV* NodeHandle_sV::parentNode() const { return m_parentNode; } QDebug operator <<(QDebug qd, const NodeHandle_sV& h) { qd.nospace() << "(" << h.x() << "|" << h.y() << "):" << &h << "@" << h.parentNode(); return qd.maybeSpace(); } slowmovideo-0.5+git20180116/src/slowmoVideo/project/node_sV.cpp0000664000000000000000000001176013151342440022676 0ustar rootroot/* slowmoUI is a user interface for slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.
*/ #include "node_sV.h" #include //#define DEBUG_N Node_sV::Node_sV() : m_x(0), m_y(0) { init(); } Node_sV::Node_sV(const qreal &x, const qreal &y) : m_x(x), m_y(y) { init(); } Node_sV::Node_sV(const QPointF &point) : m_x(point.x()), m_y(point.y()) { init(); } Node_sV::Node_sV(const Node_sV &other) : m_x(other.x()), m_y(other.y()), m_leftHandle(other.m_leftHandle), m_rightHandle(other.m_rightHandle), m_shutterFunctionID(other.m_shutterFunctionID) { init(); m_leftCurveType = other.m_leftCurveType; m_rightCurveType = other.m_rightCurveType; Q_ASSERT(this == m_leftHandle.parentNode()); Q_ASSERT(this == m_rightHandle.parentNode()); } void Node_sV::init() { m_moveX = 0; m_moveY = 0; m_selected = false; m_leftHandle.setParentNode(this); m_rightHandle.setParentNode(this); m_leftCurveType = CurveType_Linear; m_rightCurveType = CurveType_Linear; Q_ASSERT(this == m_leftHandle.parentNode()); Q_ASSERT(this == m_rightHandle.parentNode()); } ////////// Basic commands qreal Node_sV::x() const { return m_x + m_moveX; } qreal Node_sV::y() const { return m_y + m_moveY; } qreal Node_sV::xUnmoved() const { return m_x; } qreal Node_sV::yUnmoved() const { return m_y; } qreal Node_sV::setX(qreal x) { qreal ret = m_x; m_x = x; return ret; } qreal Node_sV::setY(qreal y) { qreal ret = m_y; m_y = y; return ret; } void Node_sV::select(bool select) { m_selected = select; } bool Node_sV::selected() const { return m_selected; } ////////// Curve types, handles const NodeHandle_sV& Node_sV::leftNodeHandle() const { return m_leftHandle; } const NodeHandle_sV& Node_sV::rightNodeHandle() const { return m_rightHandle; } CurveType Node_sV::leftCurveType() const { return m_leftCurveType; } CurveType Node_sV::rightCurveType() const { return m_rightCurveType; } const QString Node_sV::shutterFunctionID() const { return m_shutterFunctionID; } void Node_sV::setLeftCurveType(CurveType type) { m_leftCurveType = type; } void Node_sV::setRightCurveType(CurveType type) { m_rightCurveType = type; } void Node_sV::setLeftNodeHandle(qreal x, qreal y) { Q_ASSERT(x <= 0); // Relative offset to current node; ensure that mapping is injective m_leftHandle.rx() = x; m_leftHandle.ry() = y; } void Node_sV::setRightNodeHandle(qreal x, qreal y) { Q_ASSERT(x >= 0); m_rightHandle.rx() = x; m_rightHandle.ry() = y; } void Node_sV::setShutterFunctionID(QString id) { m_shutterFunctionID = id; } ////////// Movement void Node_sV::move(const Node_sV &dist) { m_moveX = dist.x(); m_moveY = dist.y(); } void Node_sV::abortMove() { m_moveX = 0; m_moveY = 0; } void Node_sV::confirmMove() { m_x += m_moveX; m_y += m_moveY; m_moveX = 0; m_moveY = 0; } ////////// Operators bool Node_sV::operator <(const Node_sV& other) const { return m_x < other.x(); } bool Node_sV::operator ==(const Node_sV& other) const { return m_x == other.m_x && m_y == other.m_y && m_moveX == other.m_moveX && m_moveY == other.m_moveY; } Node_sV Node_sV::operator -(const Node_sV& other) const { return Node_sV(m_x - other.m_x, m_y - other.m_y); } Node_sV Node_sV::operator +(const Node_sV& other) const { return Node_sV(m_x + other.m_x, m_y + other.m_y); } void Node_sV::operator +=(const Node_sV& other) { m_x += other.m_x; m_y += other.m_y; } void Node_sV::operator -=(const Node_sV& other) { m_x -= other.m_x; m_y -= other.m_y; } void Node_sV::operator =(const Node_sV& other) { if (this != &other) { #ifdef DEBUG_N qDebug() << "Other: " << other; #endif m_x = other.m_x; m_y = other.m_y; m_leftHandle.setX(other.leftNodeHandle().x()); m_leftHandle.setY(other.leftNodeHandle().y()); 
m_rightHandle.setX(other.rightNodeHandle().x()); m_rightHandle.setY(other.rightNodeHandle().y()); m_leftCurveType = other.m_leftCurveType; m_rightCurveType = other.m_rightCurveType; m_shutterFunctionID = other.m_shutterFunctionID; #ifdef DEBUG_N qDebug() << "This: " << *this; #endif } Q_ASSERT(this == m_leftHandle.parentNode()); Q_ASSERT(this == m_rightHandle.parentNode()); } ////////// Conversion QPointF Node_sV::toQPointF() const { return QPointF(x(), y()); } QDebug operator<<(QDebug qd, const Node_sV& n) { qd.nospace() << "("; qd.nospace() << n.x() << "|" << n.y(); if (n.selected()) { qd.nospace() << "|s"; } qd.nospace() << ")@" << &n << " l: " << n.leftNodeHandle() << " " << toString(n.leftCurveType()) << ", r: " << n.rightNodeHandle() << " " << toString(n.rightCurveType()) << "\n"; return qd.maybeSpace(); } slowmovideo-0.5+git20180116/src/slowmoVideo/project/canvasObject_sV.h0000664000000000000000000000022713151342440024014 0ustar rootroot#ifndef CANVASOBJECT_SV_H #define CANVASOBJECT_SV_H class CanvasObject_sV { public: virtual ~CanvasObject_sV() {} }; #endif // CANVASOBJECT_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/renderTask_sV.h0000664000000000000000000000514313151342440023516 0ustar rootroot/* * 2014 Valery Brasseur */ #ifndef RENDERTASK_SV_H #define RENDERTASK_SV_H #include #include #include #include #include "renderPreferences_sV.h" class Project_sV; class AbstractRenderTarget_sV; #include #include class RenderTask_sV : public QObject { Q_OBJECT public: RenderTask_sV(Project_sV *project); ~RenderTask_sV(); /** * Requests the process to start * * It is thread safe as it uses #mutex to protect access to #_working variable. */ void requestWork(); /** * Requests the process to abort * * It is thread safe as it uses #mutex to protect access to #_abort variable. 
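A sketch of the intended call pattern (illustrative only; the actual wiring lives in the UI code, which is not shown here). The task is typically driven through its public interface from the GUI thread:
\code
RenderTask_sV *task = new RenderTask_sV(project);  // assumes an existing Project_sV*
task->requestWork();   // asks the task to start; emits workFlowRequested()
// ... later, e.g. from a Cancel button:
task->abort();         // safe to call from another thread, guarded by the mutex
\endcode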
*/ void abort(); void setRenderTarget(AbstractRenderTarget_sV *renderTarget); void setTimeRange(qreal start, qreal end); void setTimeRange(QString start, QString end); QDir getRenderDirectory(); /// Rendered frames per second Fps_sV fps() { return m_prefs.fps(); } /// Output frame resolution QSize resolution(); RenderPreferences_sV& renderPreferences() { return m_prefs; } void setupProgress(QString desc, int taskSize); void updateProgress(int value); void stepProgress(int step=1); void updateMessage(QString desc); private: /** * true when Worker is doing work */ bool _working; /** * Protects access to #_abort */ QMutex mutex; Project_sV *m_project; RenderPreferences_sV m_prefs; ///< \todo Set preferences AbstractRenderTarget_sV *m_renderTarget; qreal m_timeStart; qreal m_timeEnd; QElapsedTimer m_stopwatch; qint64 m_renderTimeElapsed; bool m_stopRendering; // Process is aborted when true qreal m_nextFrameTime; qreal m_prevTime; int currentProgress; signals: /** * This signal is emitted when the Worker request to Work * requestWork() */ void workFlowRequested(); /** * signal rendering progression */ void signalItemDesc(QString desc); void signalTaskProgress(int value); /** * This signal is emitted when process is finished (or being aborted) */ void signalRenderingContinued(); void signalRenderingStopped(QString renderTime); void signalRenderingFinished(QString renderTime); void signalRenderingAborted(QString reason); void signalNewTask(QString desc, int taskSize); public slots: void slotContinueRendering(); void slotStopRendering(); // was void doWorkFlow(); }; #endif // RENDERTASK_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/interpolator_sV.h0000664000000000000000000000053613151342440024137 0ustar rootroot#ifndef INTERPOLATOR_SV_H #define INTERPOLATOR_SV_H #include "renderPreferences_sV.h" #include "project_sV.h" class Interpolator_sV { public: static QImage interpolate(Project_sV *project, float frame, const RenderPreferences_sV& prefs) throw(FlowBuildingError, InterpolationError); }; #endif // INTERPOLATOR_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/exportVideoRenderTarget.cpp0000664000000000000000000000641213151342440026116 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2014 Valery Brasseur This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "exportVideoRenderTarget.h" #include #if QT_VERSION >= QT_VERSION_CHECK(5, 0, 0) #include #endif #include "renderTask_sV.h" #include #include "../lib/video_enc.h" exportVideoRenderTarget::exportVideoRenderTarget(RenderTask_sV *parentRenderTask) : AbstractRenderTarget_sV(parentRenderTask) { #if _NO_INSIDE_TMPDIR_ //TODO: should use projectdir to create render dir #if QT_VERSION >= QT_VERSION_CHECK(5, 0, 0) //QTemporaryDir tempDir("slowmovideo"); QTemporaryDir tempDir;; // use default if (tempDir.isValid()) m_targetDir = QDir(tempDir.path()); else #endif m_targetDir = QDir::temp(); #else m_targetDir = parentRenderTask->getRenderDirectory(); #endif qDebug() << " target dir " << m_targetDir; m_filenamePattern = "rendered-%1.png"; use_qt = 1; first = 0; } exportVideoRenderTarget::~exportVideoRenderTarget() { #ifdef _DO_NOT_KEEP_TEMP // QT bug ? 
qDebug() << "should remove dir : " << m_targetDir; #if QT_VERSION >= QT_VERSION_CHECK(5, 0, 0) m_targetDir.removeRecursively(); #else #warning removeRecursively not define in QT4 #endif #endif } void exportVideoRenderTarget::setTargetFile(const QString &filename) { m_filename = filename; } void exportVideoRenderTarget::setVcodec(const QString &codec) { m_vcodec = codec; } void exportVideoRenderTarget::slotConsumeFrame(const QImage &image, const int frameNumber) { if (!m_targetDir.exists()) { m_targetDir.mkpath("."); } #if QT_VERSION < QT_VERSION_CHECK(5, 0, 0) QString path = m_targetDir.absoluteFilePath(m_filenamePattern.arg(frameNumber+1, 5, 10, QChar::fromAscii('0'))); #else QString path = m_targetDir.absoluteFilePath(m_filenamePattern.arg(frameNumber+1, 5, 10, QChar::fromLatin1('0'))); #endif bool ok; ok = image.save(path); if (!ok) { qDebug() << " Writing image to " << path << " failed!"; } else { qDebug() << " Saved frame number " << frameNumber << " to " << path; } if (first == 0) first = frameNumber + 1; } void exportVideoRenderTarget::closeRenderTarget() throw(Error_sV) { VideoWriter* writer;; qDebug() << "exporting temporary frame to Video" << m_filename << " using codec " << m_vcodec << "starting at " << first; if (m_vcodec.isEmpty()) writer = CreateVideoWriter(m_filename.toStdString().c_str(), renderTask()->resolution().width(), renderTask()->resolution().height(), renderTask()->fps().fps(),use_qt,0); else writer = CreateVideoWriter(m_filename.toStdString().c_str(), renderTask()->resolution().width(), renderTask()->resolution().height(), renderTask()->fps().fps(),use_qt,m_vcodec.toStdString().c_str()); if (writer == 0) { throw Error_sV(QObject::tr("Video could not be prepared .\n")); } exportFrames(writer, m_targetDir.absoluteFilePath(m_filenamePattern.arg("%05d")).toStdString().c_str(),first,renderTask()); ReleaseVideoWriter( &writer ); } slowmovideo-0.5+git20180116/src/slowmoVideo/project/abstractFlowSource_sV.cpp0000664000000000000000000000314713151342440025565 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #include "abstractFlowSource_sV.h" #include #include #include "project_sV.h" AbstractFlowSource_sV::AbstractFlowSource_sV(Project_sV *project) : m_project(project) { } Project_sV* AbstractFlowSource_sV::project() { return m_project; } /* * clear all flow file in flow directories */ void AbstractFlowSource_sV::cleardirectory(QDir dir) { dir.setFilter( QDir::NoDotAndDotDot | QDir::Files ); foreach( QString dirItem, dir.entryList() ) dir.remove( dirItem ); dir.setFilter( QDir::NoDotAndDotDot | QDir::Dirs ); foreach( QString dirItem, dir.entryList() ) { QDir subDir( dir.absoluteFilePath( dirItem ) ); #if QT_VERSION >= QT_VERSION_CHECK(5, 0, 0) subDir.removeRecursively(); #else #warning removeRecursively not define in QT4 #endif } } void AbstractFlowSource_sV::clearFlowCache() { cleardirectory(m_dirFlowSmall); cleardirectory(m_dirFlowOrig); } void AbstractFlowSource_sV::slotUpdateProjectDir() { //TODO: check //m_dirFlowSmall.rmdir("."); //m_dirFlowOrig.rmdir("."); createDirectories(); } void AbstractFlowSource_sV::createDirectories() { m_dirFlowSmall = project()->getDirectory("cache/oFlowSmall"); m_dirFlowOrig = project()->getDirectory("cache/oFlowOrig"); } slowmovideo-0.5+git20180116/src/slowmoVideo/project/renderPreferences_sV.cpp0000664000000000000000000000077713151342440025420 0ustar rootroot#include "renderPreferences_sV.h" RenderPreferences_sV::RenderPreferences_sV() : interpolation(InterpolationType_TwowayNew), motionblur(MotionblurType_Convolving), size(FrameSize_Orig), m_fps(24), m_fpsSetByUser(false) { } Fps_sV RenderPreferences_sV::fps() const { return m_fps; } void RenderPreferences_sV::setFps(Fps_sV fps) { Q_ASSERT(fps.num > 0); m_fpsSetByUser = true; m_fps = fps; } bool RenderPreferences_sV::fpsSetByUser() const { return m_fpsSetByUser; } slowmovideo-0.5+git20180116/src/slowmoVideo/project/projectPreferences_sV.h0000664000000000000000000000303313151342440025240 0ustar rootroot#ifndef PROJECTPREFERENCES_SV_H #define PROJECTPREFERENCES_SV_H #include "../lib/defs_sV.hpp" class ProjectPreferences_sV { public: ProjectPreferences_sV(); /** \return Reference to the previously selected tag axis */ TagAxis& lastSelectedTagAxis(); QPointF& viewport_t0(); QPointF& viewport_secRes(); Fps_sV& canvas_xAxisFPS(); // Rendering QString& renderSectionMode(); QString& renderStartTag(); QString& renderEndTag(); QString& renderStartTime(); QString& renderEndTime(); FrameSize& renderFrameSize(); InterpolationType& renderInterpolationType(); MotionblurType& renderMotionblurType(); Fps_sV& renderFPS(); QString& renderTarget(); bool& renderFormat(); QString& imagesOutputDir(); QString& imagesFilenamePattern(); QString& videoFilename(); QString& videoCodec(); float& flowV3DLambda(); private: TagAxis m_tagAxis; QPointF m_viewport_t0; QPointF m_viewport_secRes; Fps_sV m_canvas_xAxisFPS; QString m_renderSectionMode; QString m_renderStartTag; QString m_renderEndTag; QString m_renderStartTime; QString m_renderEndTime; FrameSize m_renderFrameSize; InterpolationType m_renderInterpolationType; MotionblurType m_motionblurType; Fps_sV m_renderFPS; QString m_renderTarget; bool m_renderFormat; QString m_imagesOutputDir; QString m_imagesFilenamePattern; QString m_videoFilename; QString m_vcodec; float m_flowV3DLambda; }; #endif // PROJECTPREFERENCES_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/tag_sV.h0000664000000000000000000000146513151342440022172 0ustar rootroot#ifndef TAG_SV_H #define TAG_SV_H #include "../lib/defs_sV.hpp" #include "canvasObject_sV.h" #include /** \brief Tags are used for marking 
positions in time in the project. */ class Tag_sV : public CanvasObject_sV { public: Tag_sV(TagAxis axis = TagAxis_Source); Tag_sV(qreal time, QString description, TagAxis axis = TagAxis_Source); ~Tag_sV() {} TagAxis axis() const { return m_axis; } qreal time() const { return m_time; } const QString& description() const { return m_description; } void setAxis(TagAxis axis); void setTime(qreal time); void setDescription(QString desc); // If renaming allowed: Update preferences! bool operator <(const Tag_sV& other) const; private: TagAxis m_axis; qreal m_time; QString m_description; }; #endif // TAG_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/nodeList_sV.h0000664000000000000000000001344113151342440023175 0ustar rootroot/* slowmoUI is a user interface for slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef NODELIST_SV_H #define NODELIST_SV_H #include "node_sV.h" #include "segmentList_sV.h" #include "canvasObject_sV.h" #include "../lib/defs_sV.hpp" #include #include /** \brief Represents a curve defined by a Node_sV list. This object can be queried for the source time given an output time. The curve is (ensured to be) injective, i.e. \f$ t_1 \neq t_2 \rightarrow f(t_1) \neq f(t_2) \f$ with \f$ t_1,t_2 \in \f$ target time, \f$ f(t_1),f(t_2) \in \f$ source time. This means that there is alwas a non-ambiguous answer to the question: Which frame from the input video has to be displayed at output time \f$ t \f$? */ class NodeList_sV { public: NodeList_sV(float minDist = 1/30.0f); /// For sorting objects struct PointerWithDistance { /// Defines the order on object types. Nodes will come first in a sorted list. enum ObjectType { Node = 1, Handle = 2, Segment = 3, Tag = 4 }; /// Pointer to the object const CanvasObject_sV* ptr; /// Distance to the object from the search position (e.g. mouse position) qreal dist; /// The object type should only be used for sorting! ObjectType type; bool operator <(const PointerWithDistance &other) const { return type < other.type || (type == other.type && dist < other.dist); } PointerWithDistance(const CanvasObject_sV* ptr, qreal dist, ObjectType type) : ptr(ptr), dist(dist), type(type) { } }; void setMaxY(qreal time); ///< Sets the maximum y value that is allowed, usually the duration of the input. qreal sourceTime(qreal targetTime) const; ///< Calculates the source time in seconds for the given output time. qreal startTime(bool useMoved = false) const; ///< Time of the first node. useMoved uses the unconfirmed position of the node while it is moved. qreal endTime(bool useMoved = false) const; ///< Time of the rightmost node. See totalTime() for the curve length. qreal totalTime() const; ///< Length of the curve, ignores space (startTime())at the beginning. bool isInsideCurve(qreal targetTime, bool useMoved = false) const; ///< Returns true if startTime <= targetTime <= endTime /** Add a new node at the given position. @return true if the node has been added. The node is NOT added if it is too close to another node. */ bool add(Node_sV node); uint deleteSelected(); void deleteNode(int index); void select(const Node_sV *node, bool newSelection = true); void unselectAll(); void shift(qreal after, qreal by); /** Move the selected nodes by the given time vector. 
Only succeeds if the nodes are still within valid bounds. A move has to be either confirmed or aborted. */ void moveSelected(const Node_sV &time,bool snap = false); /** Confirm the move on all nodes. */ void confirmMove(); /** Abort the move on all nodes. Resets the temporary movement vector. */ void abortMove(); void moveHandle(const NodeHandle_sV *handle, Node_sV relPos); /// Sets the curve type for the segment at time \c segmentTime. void setCurveType(qreal segmentTime, CurveType type); void fixHandles(int leftIndex); int setSpeed(qreal segmentTime, qreal speed); /** \brief Returns the \c node's index in the node list \return -1 if the node could not be located */ int indexOf(const Node_sV *node) const; /** @return The position of the node whose target time (x()) is <= time, or -1 if there is no such node. */ int find(qreal time) const; /** @return The position of the first node in the list which is within a radius of \c tdelta around the point (tx|ty), or -1 if no such node exists. */ int find(QPointF pos, qreal tdelta) const; /** \brief Searches for a segment (or path) between two nodes at time \c tx. If a left or right node does not exist (when \c tx is outside of the curve), the return value for this index is -1. */ void findBySegment(qreal tx, int& leftIndex_out, int& rightIndex_out) const; /** \brief Searches for node objects (nodes, handles, and segments) around a position. \param pos Center of the search position. \param tmaxdist This is the search radius. Elements are included if their euclidian distance to \c pos is <= tmaxdist, except for segments where only the x position is taken into account. */ QList objectsNear(QPointF pos, qreal tmaxdist) const; /** @return The index of the node whose time is equal or greater than the given time, or -1 if there is no such node. */ int nodeAfter(qreal time) const; const Node_sV& at(int i) const; Node_sV& operator [](int i); int size() const; /** @return false if nodes are not in a valid position. Nodes must be ordered, and the minimum distance (on the y axis) must be at least m_minDist. */ bool validate() const; SegmentList_sV* segments(); private: qreal m_maxY; QList m_list; SegmentList_sV m_segments; const float m_minDist; qreal bezierSourceTime(qreal targetTime, QPointF p0, QPointF p1, QPointF p2, QPointF p3) const; inline qreal dist2(QPointF point) const; }; QDebug operator<<(QDebug qd, const NodeList_sV &list); #endif // NODELIST_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/shutterFunctionList_sV.h0000664000000000000000000000426613151342440025461 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef SHUTTERFUNCTIONLIST_SV_H #define SHUTTERFUNCTIONLIST_SV_H #include #include #include class NodeList_sV; class ShutterFunction_sV; /** \brief Maintains a list of ShutterFunction_sV and ensures unique IDs. When a function is removed, the nodes in the project's NodeList_sV are scanned for the function's ID and then reset to the default ID if they used the deleted function. 
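A minimal usage sketch (illustrative; it assumes ShutterFunction_sV is default-constructible, which is not shown in this header):
\code
ShutterFunctionList_sV functions(&nodeList);
ShutterFunction_sV *f = functions.addFunction(ShutterFunction_sV(), true); // gets a generated id: "func1", "func2", ...
functions.updateID(f, "myShutter");    // fails if the id is already taken or not alphanumeric
functions.removeFunction("myShutter"); // nodes referencing "myShutter" fall back to the default id
\endcode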
*/ class ShutterFunctionList_sV { public: /// Links the ShutterFunctionList_sV to the given NodeList_sV ShutterFunctionList_sV(NodeList_sV *nodes); /// Destructor ~ShutterFunctionList_sV(); /// For validating IDs (should be alphanumeric) static QRegExp regexID; /// Number of functions in this list int size() const; /// Returns the next unique function ID const QString nextID() const; /// Updates the function's ID. Returns \c false if \c id is already taken. bool updateID(ShutterFunction_sV *function, const QString id); /// Returns the function with the given \c id, or \c NULL if there is no function ShutterFunction_sV* function(const QString id); /// Returns the shutter function at the given \c index const ShutterFunction_sV* at(int index) const; /** Adds the given function. \param generateID If \c true, a new ID generated with nextID() will be used instead of the one set for the function. */ ShutterFunction_sV* addFunction(const ShutterFunction_sV function, bool generateID); /** Removes the function and resets all references from nodes to this function. \return \c false if a function with the given \c id could not be found. */ bool removeFunction(const QString id); private: QList m_functions; NodeList_sV *m_nodes; }; #endif // SHUTTERFUNCTIONLIST_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/project/emptyFrameSource_sV.h0000664000000000000000000000230213151342440024700 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef EMPTYFRAMESOURCE_H #define EMPTYFRAMESOURCE_H #include "abstractFrameSource_sV.h" class EmptyFrameSource_sV : public AbstractFrameSource_sV { Q_OBJECT public: EmptyFrameSource_sV(const Project_sV *project); ~EmptyFrameSource_sV() {} void initialize(); bool initialized() const { return true; } int64_t framesCount() const { return 1000; } const Fps_sV* fps() const { return &m_fps; } int frameRateNum() const { return 24; } int frameRateDen() const { return 1; } QImage frameAt(const uint, const FrameSize = FrameSize_Orig) { return QImage(); } const QString framePath(const uint, const FrameSize) const { return QString(); } void loadOrigFrames() {}; public slots: void slotAbortInitialization() {} void slotUpdateProjectDir() {} private: Fps_sV m_fps; }; #endif // EMPTYFRAMESOURCE_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoCLI/0000775000000000000000000000000013151342440020772 5ustar rootrootslowmovideo-0.5+git20180116/src/slowmoVideo/slowmoCLI/videoInfo.cpp0000664000000000000000000000060213151342440023416 0ustar rootroot #include #include #include "../lib/videoInfo_sV.h" void printUsage(const char progName[]) { printf("Displays information like frame rate of a video file. 
\nUsage: %s file\n", progName); } int main(int argc, char *argv[]) { if (argc < 2 || strcmp("-h", argv[1]) == 0) { printUsage(argv[0]); return -1; } getInfo(argv[1]); return 0; } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoCLI/CMakeLists.txt0000664000000000000000000000065613151342440023541 0ustar rootroot set(SOURCES_MAIN main.cpp ) #set(CMAKE_C_FLAGS "${CMAKE_C_FLAGS} -std=c99") set(SOURCES_VINFO videoInfo.cpp ) add_executable(slowmoInterpolate ${SOURCES_MAIN}) target_link_libraries(slowmoInterpolate sV sVflow ) add_executable(slowmoVideoInfo ${SOURCES_VINFO}) target_link_libraries(slowmoVideoInfo sVinfo sV ) install(TARGETS slowmoInterpolate DESTINATION ${DEST}) install(TARGETS slowmoVideoInfo DESTINATION ${DEST}) slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoCLI/main.cpp0000664000000000000000000001407613151342440022432 0ustar rootroot/* slowmoCLI is a command-line interface for slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "../lib/interpolate_sV.h" #include "../lib/flowField_sV.h" #include "../lib/flowRW_sV.h" #include #include #include #include #include #include #include const int RET_MISSING_PARAM = -1; const int RET_WRONG_PARAM = -2; const int RET_MISSING_FILE = -3; const int RET_SIZE_DIFFERS = -4; char *myName; enum FlowMode { FlowMode_Forward, FlowMode_Twoway, FlowMode_Undef }; void printUsage() { std::cout << "Usage: " << std::endl; std::cout << "\t" << myName << " twoway [ [numberOffset [fps]] ]" << std::endl; std::cout << "\t" << myName << " forward [ [numberOffset [fps]] ]" << std::endl; } char* nextArg(int argc, int &argi, char *argv[]) { argi++; if (argi < argc) { std::cout << "Arg: " << argv[argi] << std::endl; return argv[argi]; } else { std::cout << "Argument " << argi << " missing." << std::endl; printUsage(); exit(RET_MISSING_PARAM); } } const char* nextOptArg(int argc, int &argi, char *argv[], const char defaultParam[]) { argi++; if (argi < argc) { std::cout << "Optional argument: " << argv[argi] << std::endl; return argv[argi]; } else { std::cout << "Optional argument " << argi << " not given." << std::endl; return defaultParam; } } int main(int argc, char *argv[]) { myName = argv[0]; int argi = 0; char *arg; FlowMode mode = FlowMode_Undef; arg = nextArg(argc, argi, argv); if (strcmp("forward", arg) == 0) { mode = FlowMode_Forward; } else if (strcmp("twoway", arg) == 0) { mode = FlowMode_Twoway; } if (mode == FlowMode_Undef) { printUsage(); exit(RET_WRONG_PARAM); } QImage left, right, output; FlowField_sV *ffForward, *ffBackward; switch (mode) { case FlowMode_Twoway: std::cout << "Running two-way flow." << std::endl; left = QImage(nextArg(argc, argi, argv)); right = QImage(nextArg(argc, argi, argv)); ffForward = FlowRW_sV::load(nextArg(argc, argi, argv)); ffBackward = FlowRW_sV::load(nextArg(argc, argi, argv)); break; case FlowMode_Forward: std::cout << "Running forward flow." << std::endl; left = QImage(nextArg(argc, argi, argv)); ffForward = FlowRW_sV::load(nextArg(argc, argi, argv)); break; case FlowMode_Undef: Q_ASSERT(false); break; } QString pattern(nextOptArg(argc, argi, argv, "output%1.png")); if (!pattern.contains("%1")) { std::cout << "Error: Output pattern must contain a %1 for the image number. Example: output%1.png." 
<< std::endl; return RET_WRONG_PARAM; } bool ok; int numberOffset = QString(nextOptArg(argc, argi, argv, "0")).toInt(&ok); if (!ok) { std::cout << "Error converting argument to number." << std::endl; return RET_WRONG_PARAM; } const unsigned int fps = QString(nextOptArg(argc, argi, argv, "24")).toInt(&ok); if (!ok) { std::cout << "Error converting argument to number." << std::endl; return RET_WRONG_PARAM; } switch (mode) { case FlowMode_Twoway: if (ffBackward == NULL) { std::cout << "Backward flow is not valid." << std::endl; exit(RET_MISSING_FILE); } if (right.isNull()) { std::cout << "Right image does not exist." << std::endl; exit(RET_MISSING_FILE); } if (ffBackward->width() != left.width() || ffBackward->height() != left.height()) { qDebug() << "Invalid backward flow field size. Image is " << left.width() << ", flow is " << ffBackward->width() << "x" << ffBackward->height() << "."; exit(RET_SIZE_DIFFERS); } // Fall through case FlowMode_Forward: if (left.isNull()) { std::cout << "Left image does not exist." << std::endl; exit(RET_MISSING_FILE); } if (ffForward == NULL) { std::cout << "Forward flow is not valid." << std::endl; exit(RET_MISSING_FILE); } if (left.size() != right.size()) { qDebug() << "Left image size differs from right image size: " << left.size() << " vs. " << right.size() << "."; } if (ffForward->width() != left.width() || ffForward->height() != left.height()) { qDebug() << "Invalid forward flow field size. Image is " << left.width() << ", flow is " << ffForward->width() << "x" << ffForward->height() << "."; exit(RET_SIZE_DIFFERS); } break; case FlowMode_Undef: qDebug() << "Undefined flow mode selected."; break; } output = QImage(left.size(), QImage::Format_RGB32); //const int stepLog = ceil(log10(numberOffset + steps)); const int stepLog = 8; float pos; const QChar fillChar = QLatin1Char('0'); qDebug() << stepLog << ": max length"; QString filename; for (unsigned int step = 0; step < fps+1; step++) { pos = step/float(fps); if (mode == FlowMode_Twoway) { Interpolate_sV::twowayFlow(left, right, ffForward, ffBackward, pos, output); } else if (mode == FlowMode_Forward) { Interpolate_sV::forwardFlow(left, ffForward, pos, output); } filename = pattern.arg(QString::number(numberOffset + step), stepLog, fillChar); qDebug() << "Saving position " << pos << " to image " << filename; output.save(filename); } delete ffForward; if (mode == FlowMode_Twoway) { delete ffBackward; } } slowmovideo-0.5+git20180116/src/slowmoVideo/lib/0000775000000000000000000000000013151342440017670 5ustar rootrootslowmovideo-0.5+git20180116/src/slowmoVideo/lib/flowTools_sV.cpp0000664000000000000000000003024713151342440023042 0ustar rootroot#include "flowTools_sV.h" #include #include #include //#define DEBUG void FlowTools_sV::deleteRect(FlowField_sV &field, int top, int left, int bottom, int right) { for (int y = top; y <= bottom; y++) { for (int x = left; x <= right; x++) { field.rx(x,y) = FlowField_sV::nullValue; } } } void FlowTools_sV::fillRect(FlowField_sV &field, int top, int left, int bottom, int right, float vx, float vy) { for (int y = top; y <= bottom; y++) { for (int x = left; x <= right; x++) { field.setX(x,y, vx); field.setY(x,y, vy); } } } void FlowTools_sV::refill(FlowField_sV &field, const Kernel_sV &kernel, int top, int left, int bottom, int right) { assert(top <= bottom); assert(left <= right); assert(top >= 0); assert(left >= 0); assert(bottom < field.height()); assert(right < field.width()); int newTop = top; int newLeft = left; int newRight = right; int newBottom = bottom; if (top > 
0) { refillLine(field, kernel, top, left, right-left+1, true); newTop++; } if (left > 0) { refillLine(field, kernel, top, left, bottom-top+1, false); newLeft++; } if (right+1 < field.width()) { refillLine(field, kernel, top, right, bottom-top+1, false); newRight--; } if (bottom+1 < field.height()) { refillLine(field, kernel, bottom, left, right-left+1, true); newBottom--; } if (newRight-newLeft >= 0 && newBottom-newTop >= 0) { refill(field, kernel, newTop, newLeft, newBottom, newRight); } } void FlowTools_sV::refillLine(FlowField_sV &field, const Kernel_sV &kernel, int startTop, int startLeft, int length, bool horizontal) { int x = startLeft; int y = startTop; float valX = 0; float valY = 0; float weight = 0; while (true) { if (x >= field.width() || (horizontal && x >= startLeft+length) || y >= field.height() || (!horizontal && y >= startTop+length)) { break; } valX = 0; valY = 0; weight = 0; for (int dx = -kernel.rX(); dx <= kernel.rX(); dx++) { for (int dy = -kernel.rY(); dy <= kernel.rY(); dy++) { if (x+dx >= 0 && x+dx < field.width() && y+dy >= 0 && y+dy < field.height() && field.x(x+dx,y+dy) != FlowField_sV::nullValue) { valX += field.x(x+dx,y+dy) * kernel(dx, dy); valY += field.y(x+dx,y+dy) * kernel(dx, dy); weight += kernel(dx, dy); } } } if (weight > 0) { std::cout << "Before: " << field.rx(x,y); field.rx(x,y) = valX/weight; std::cout << ", Afterwards: " << field.rx(x,y) << std::endl; field.ry(x,y) = valY/weight; } if (horizontal) { x++; } else { y++; } } } void FlowTools_sV::refill(FlowField_sV &field, int top, int left, int bottom, int right) { assert(top <= bottom); assert(left <= right); assert(top >= 0); assert(left >= 0); assert(bottom < field.height()); assert(right < field.width()); // Top line if (top == bottom) { // Only a single line left. Can be inside an image or at a border. 
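// The hole is filled ring by ring: each pass rebuilds the outermost rows/columns of the
// rectangle from the valid pixels just outside it (refillLine), patches the corners
// (refillCorner), and then recurses on the rectangle shrunk by one pixel on every side
// that borders already-valid data; sides lying on the image border are deferred to later passes.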
if (top == 0) { refillLine(field, top, left, right-left+1, HorizontalFromBottom); } else if (top == field.height()-1) { refillLine(field, top, left, right-left+1, HorizontalFromTop); } else { refillLine(field, top, left, right-left+1, HorizontalFromBoth); } } else if (top > 0){ if (bottom > top) { // Fill from above refillLine(field, top, left, right-left+1, HorizontalFromTop); } else { // Only a line left; fill from both sides refillLine(field, top, left, right-left+1, HorizontalFromBoth); } } // Left line if (left == right) { if (left == 0) { refillLine(field, top, left, bottom-top+1, VerticalFromRight); } else if (left == field.width()-1) { refillLine(field, top, left, bottom-top+1, VerticalFromLeft); } else { refillLine(field, top, left, bottom-top+1, VerticalFromBoth); } } else if (left > 0) { if (right > left) { refillLine(field, top, left, bottom-top+1, VerticalFromLeft); } else { refillLine(field, top, left, bottom-top+1, VerticalFromBoth); } } // Right line if (right+1 < field.width() && left != right) { // left == right already handled if (left < right) { refillLine(field, top, right, bottom-top+1, VerticalFromRight); } else { refillLine(field, top, right, bottom-top+1, VerticalFromBoth); } } // Bottom line if (bottom+1 < field.height() && bottom != top) { // bottom == top already handled if (top < bottom) { refillLine(field, bottom, left, right-left+1, HorizontalFromBottom); } else { refillLine(field, bottom, left, right-left+1, HorizontalFromBoth); } } refillCorner(field, top, left, TopLeft); refillCorner(field, top, right, TopRight); refillCorner(field, bottom, left, BottomLeft); refillCorner(field, bottom, right, BottomRight); if (bottom-top-2 >= 0 && right-left-2 >= 0) { refill(field, top+(top > 0 ? 1 : 0), left+(left > 0 ? 1 : 0), bottom-(bottom < field.height()-1 ? 1 : 0), right-(right < field.width()-1 ? 
1 : 0)); // Now handle special cases where one line at the border was not filled (rect was only 2 pixels wide) } else if (bottom-top-1 == 0) { if (top == 0) { refill(field, top, left, top, right); } else if (bottom == field.height()-1) { refill(field, bottom, left, bottom, right); } } else if (right-left-1 == 0) { if (left == 0) { refill(field, top, left, bottom, left); } else if (right == field.width()-1) { refill(field, top, right, bottom, right); } } } void FlowTools_sV::difference(const FlowField_sV &left, const FlowField_sV &right, FlowField_sV &out) { float dx, dy; for (int y = 0; y < left.height(); y++) { for (int x = 0; x < left.width(); x++) { dx = left.x(x,y); dy = left.y(x,y); if (x+dx >= 0 && y+dy >= 0 && x+dx <= left.width()-1 && y+dy <= left.height()-1) { dx += right.x(x+dx, y+dy); dy += right.y(x+dx, y+dy); } out.setX(x,y, dx); out.setY(x,y, dy); } } } void FlowTools_sV::signedDifference(const FlowField_sV &left, const FlowField_sV &right, FlowField_sV &out) { float lx, ly; float rx, ry; for (int y = 0; y < left.height(); y++) { for (int x = 0; x < left.width(); x++) { lx = left.x(x,y); ly = left.y(x,y); if (x+lx >= 0 && y+ly >= 0 && x+lx <= left.width()-1 && y+ly <= left.height()-1) { rx = right.x(x+lx, y+ly); ry = right.y(x+lx, y+ly); if (fabs(lx)+fabs(ly) > fabs(rx)+fabs(ry)) { lx = fabs(lx+rx); ly = fabs(ly+ry); } else { lx = -fabs(lx+rx); ly = -fabs(ly+ry); } } else { lx = fabs(lx); rx = fabs(rx); } out.setX(x,y, lx); out.setY(x,y, ly); } } } void FlowTools_sV::refillLine(FlowField_sV &field, int startTop, int startLeft, int length, LineFillMode fillMode) { int x = startLeft; int y = startTop; float sumX; float sumY; short count = 0; if (fillMode == HorizontalFromTop || fillMode == HorizontalFromBottom || fillMode == HorizontalFromBoth) { x++; while (x < startLeft+length) { sumX = 0; sumY = 0; count = 0; if (fillMode == HorizontalFromTop || fillMode == HorizontalFromBoth) { sumX += field.x(x-1, y-1) + field.x(x, y-1) + field.x(x+1, y-1); sumY += field.y(x-1, y-1) + field.y(x, y-1) + field.y(x+1, y-1); count += 3; } if (fillMode == HorizontalFromBottom || fillMode == HorizontalFromBoth) { sumX += field.x(x-1, y+1) + field.x(x, y+1) + field.x(x+1, y+1); sumY += field.y(x-1, y+1) + field.y(x, y+1) + field.y(x+1, y+1); count += 3; } #ifdef DEBUG sumX *= 1.2; sumY *= 1.2; #endif field.rx(x,y) = sumX/count; field.ry(x,y) = sumY/count; x++; } } else { y++; while (y < startTop + length) { sumX = 0; sumY = 0; count = 0; if (fillMode == VerticalFromLeft || fillMode == VerticalFromBoth) { sumX += field.x(x-1, y-1) + field.x(x-1, y) + field.x(x-1, y+1); sumY += field.y(x-1, y-1) + field.y(x-1, y) + field.y(x-1, y+1); count += 3; } if (fillMode == VerticalFromRight || fillMode == VerticalFromBoth) { sumX += field.x(x+1, y-1) + field.x(x+1, y) + field.x(x+1, y+1); sumY += field.y(x+1, y-1) + field.y(x+1, y) + field.y(x+1, y+1); count += 3; } #ifdef DEBUG sumX *= 1.2; sumY *= 1.2; #endif field.rx(x,y) = sumX/count; field.ry(x,y) = sumY/count; y++; } } } void FlowTools_sV::refillCorner(FlowField_sV &field, int top, int left, CornerPosition pos) { int dx; int dy; if (pos == TopRight || pos == BottomRight) { dx = 1; } else { dx = -1; } if (pos == TopRight || pos == TopLeft) { dy = -1; } else { dy = 1; } float sumX = 0; float sumY = 0; int count = 0; if (left+dx > 0 && top+dy > 0 && left+dx < field.width() && top+dy < field.height()) { sumX += field.x(left+dx, top+dy); sumY += field.y(left+dx, top+dy); count++; } if (left+dx > 0 && top-dy > 0 && left+dx < field.width() && top-dy < 
field.height()) { sumX += field.x(left+dx, top-dy); sumY += field.y(left+dx, top-dy); count++; } if (left-dx > 0 && top+dy > 0 && left-dx < field.width() && top+dy < field.height()) { sumX += field.x(left-dx, top+dy); sumY += field.y(left-dx, top+dy); count++; } #ifdef DEBUG sumX = 100; sumY = 100; #endif field.rx(left,top) = sumX/count; field.ry(left,top) = sumY/count; } FlowField_sV* FlowTools_sV::median(const FlowField_sV *const fa, const FlowField_sV *const fb, const FlowField_sV *const fc) { assert(fa != NULL); assert(fb != NULL); assert(fc != NULL); assert(fa->width() == fb->width() && fa->width() == fc->width()); assert(fa->height() == fb->height() && fa->height() == fc->height()); int w = fa->width(); int h = fa->height(); FlowField_sV *ff = new FlowField_sV(w, h); for (int y = 0; y < h; y++) { for (int x = 0; x < w; x++) { float a = fa->x(x,y)*fa->x(x,y) + fa->y(x,y)*fa->y(x,y); float b = fb->x(x,y)*fb->x(x,y) + fb->y(x,y)*fb->y(x,y); float c = fc->x(x,y)*fc->x(x,y) + fc->y(x,y)*fc->y(x,y); // Determine the median // < a b c // a - ? ? // b ? - ? // c ? ? - // The median element has SUM == 1 for its row char detA = (a < b) + (a < c); char detB = (b < a) + (b < c); if (detA == 1) { ff->rx(x,y) = fa->x(x,y); ff->ry(x,y) = fa->y(x,y); } else if (detB == 1) { ff->rx(x,y) = fb->x(x,y); ff->ry(x,y) = fb->y(x,y); } else { ff->rx(x,y) = fc->x(x,y); ff->ry(x,y) = fc->y(x,y); } } } return ff; } slowmovideo-0.5+git20180116/src/slowmoVideo/lib/qtkit.mm0000664000000000000000000002722613151342440021370 0ustar rootroot/* * class to export a movie using QuickTime */ #include #include #include #include #include "qtkit.h" #include "video_enc.h" #include "../project/renderTask_sV.h" #pragma mark - #pragma mark cocoa bridge // tools for qt 4.8 // convert pixamp <-> nsimage static void drawImageReleaseData (void *info, const void *, size_t) { delete static_cast(info); } CGImageRef qt_mac_image_to_cgimage(const QImage&img) { QImage *image; if (img.depth() != 32) image = new QImage(img.convertToFormat(QImage::Format_ARGB32_Premultiplied)); else image = new QImage(img); uint cgflags = kCGImageAlphaNone; switch (image->format()) { case QImage::Format_ARGB32_Premultiplied: cgflags = kCGImageAlphaPremultipliedFirst; break; case QImage::Format_ARGB32: cgflags = kCGImageAlphaFirst; break; case QImage::Format_RGB32: cgflags = kCGImageAlphaNoneSkipFirst; default: break; } cgflags |= kCGBitmapByteOrder32Host; CGDataProviderRef dataProvider = CGDataProviderCreateWithData(image, static_cast(image)->bits(), image->byteCount(), drawImageReleaseData); CGColorSpaceRef colorspace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB); CGImageRef cgImage = CGImageCreate(image->width(), image->height(), 8, 32, image->bytesPerLine(), colorspace, cgflags, dataProvider, 0, false, kCGRenderingIntentDefault); CFRelease(dataProvider); CFRelease(colorspace); return cgImage; } NSImage *toNSImage(const QImage& InImage) { NSBitmapImageRep *bitmapRep = [[NSBitmapImageRep alloc] initWithCGImage: qt_mac_image_to_cgimage(InImage)]; NSImage *image = [[NSImage alloc] init]; [image addRepresentation:bitmapRep]; [bitmapRep release]; return image; } // end of tools #pragma mark - /* TODO : "-fps: Frames per second for final movie can be anywhere between 0.1 and 60.0.\n" "-height: If specified images are resized proportionally to height given.\n" "-codec: Codec to use to encode can be 'h264' 'photojpeg' 'raw' or 'mpv4'.\n" "-quality: Quality to encode with can be 'high' 'normal' 'low'.\n" "-quiet: Set to 'yes' to suppress output during 
encoding.\n" "-reverse: Set to 'yes' to reverse the order that images are displayed in the movie.\n" "DEFAULTS\n" "fps = 30\n" "height = original image size\n" "codec = h264\n" "quality = high\n\n" */ VideoQT::VideoQT(int width,int height,double fps,const char *vcodec,const char* vquality,const char *filename) { NSAutoreleasePool* localpool = [[NSAutoreleasePool alloc] init]; movieFPS = fps; mMovie = nil; mHeight = height; mWidth = width; codecSpec = nil; qualitySpec = nil; NSDictionary *codec = [NSDictionary dictionaryWithObjectsAndKeys: @"avc1", @"h264", @"mpv4", @"mpv4", @"jpeg", @"photojpeg", @"raw ", @"raw", nil]; NSDictionary *quality = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithLong:codecLowQuality], @"low", [NSNumber numberWithLong:codecNormalQuality], @"normal", [NSNumber numberWithLong:codecMaxQuality], @"high", nil]; if (codecSpec == nil) { codecSpec = @"h264"; } /* codecLosslessQuality = 0x00000400, codecMaxQuality = 0x000003FF, codecMinQuality = 0x00000000, codecLowQuality = 0x00000100, codecNormalQuality = 0x00000200, codecHighQuality = 0x00000300 */ if (qualitySpec == nil) { qualitySpec = @"high"; } imageAttributes = [[NSDictionary dictionaryWithObjectsAndKeys: [codec objectForKey:codecSpec], QTAddImageCodecType, [quality objectForKey:qualitySpec], QTAddImageCodecQuality, [NSNumber numberWithLong:100000], QTTrackTimeScaleAttribute, nil] retain]; long timeScale = 100000; long long timeValue = (long long) ceil((double) timeScale / fps); duration = QTMakeTime(timeValue, timeScale); NSFileManager *fileManager = [NSFileManager defaultManager]; destPath = [[NSURL fileURLWithPath:[[NSString stringWithUTF8String:filename] stringByExpandingTildeInPath]] path]; if (![destPath hasSuffix:@".mov"]) { fprintf(stderr, "Error: Output filename must be of type '.mov'\n"); //return 1; } if ([fileManager fileExistsAtPath:destPath]) { fprintf(stderr, "Error: Output file already exists.\n"); //return 1; } mMovie = [[QTMovie alloc] initToWritableFile:destPath error:NULL]; if (mMovie == nil) { fprintf(stderr, "%s","Error: Unable to initialize QT object.\n"); //return 1; } [mMovie setAttribute:[NSNumber numberWithBool:YES] forKey:QTMovieEditableAttribute]; [localpool drain]; } // // add one frame to the movie int VideoQT::writeFrame(const QImage& frame) { NSAutoreleasePool* localpool = [[NSAutoreleasePool alloc] init]; #if 1 NSImage* nsimage =toNSImage(frame); #else NSBitmapImageRep* imageRep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:&imagedata pixelsWide:width pixelsHigh:height bitsPerSample:8 samplesPerPixel:4 hasAlpha:YES isPlanar:NO colorSpaceName:NSDeviceRGBColorSpace bitmapFormat:NSAlphaFirstBitmapFormat bytesPerRow:argbimage->widthStep bitsPerPixel:32] ; NSImage* nsimage = [[NSImage alloc] init]; [nsimage addRepresentation:imageRep]; #endif // maybe should resize here ? [mMovie addImage:nsimage forDuration:duration withAttributes:imageAttributes]; if (![mMovie updateMovieFile]) { fprintf(stderr, "Didn't successfully update movie file. 
\n" ); return 1; } //[imageRep release]; [nsimage release]; [localpool drain]; return 0; } #pragma mark - int VideoQT::exportFrames(QString filepattern,int first,RenderTask_sV *progress) { NSAutoreleasePool* localpool = [[NSAutoreleasePool alloc] init]; NSString *inputPath; NSArray *imageFiles; NSError *err; NSImage *image; NSString *fullFilename; qDebug() << "exporting frame from : " << filepattern << " to " << destPath; NSLog(@"export to @%", destPath); NSFileManager *fileManager = [NSFileManager defaultManager]; NSString* inputdir = [[NSURL fileURLWithPath:[[NSString stringWithUTF8String:filepattern.toStdString().c_str()] stringByExpandingTildeInPath]] path]; inputPath = [inputdir stringByDeletingLastPathComponent]; imageFiles = [fileManager contentsOfDirectoryAtPath:inputPath error:&err]; imageFiles = [imageFiles sortedArrayUsingSelector:@selector(localizedStandardCompare:)]; for (NSString *file in imageFiles) { fullFilename = [inputPath stringByAppendingPathComponent:file]; if ([[fullFilename pathExtension] caseInsensitiveCompare:@"jpeg"] == NSOrderedSame || [[fullFilename pathExtension] caseInsensitiveCompare:@"png"] == NSOrderedSame || [[fullFilename pathExtension] caseInsensitiveCompare:@"jpg"] == NSOrderedSame) { NSAutoreleasePool *innerPool = [[NSAutoreleasePool alloc] init]; image = [[NSImage alloc] initWithContentsOfFile:fullFilename]; //NSLog(@"adding %@",fullFilename); [mMovie addImage:image forDuration:duration withAttributes:imageAttributes]; if (![mMovie updateMovieFile]) { fprintf(stderr, "Didn't successfully update movie file. \n" ); return 1; } // TODO: progress->stepProgress(); [image release]; [innerPool release]; } } [localpool drain]; return 0; } // // close the movie file VideoQT::~VideoQT() { NSAutoreleasePool* localpool = [[NSAutoreleasePool alloc] init]; [mMovie updateMovieFile]; [mMovie release]; [qualitySpec release]; [codecSpec release]; // TODO: [destPath release]; [localpool drain]; } #pragma mark - #pragma mark C/C++ bridge VideoWriter* CreateVideoWriter_QT ( const char* filename, int width, int height, double fps,const char* codec) { VideoQT* driver= new VideoQT(width,height,fps,0,0,filename); return driver; } #if 0 CGImageRef qt_mac_image_to_cgimage(const QImage &image) { int bitsPerColor = 8; int bitsPerPixel = 32; if (image.depth() == 1) { bitsPerColor = 1; bitsPerPixel = 1; } QCFType provider = CGDataProviderCreateWithData(0, image.bits(), image.bytesPerLine() * image.height(), 0); uint cgflags = kCGImageAlphaPremultipliedFirst; #ifdef kCGBitmapByteOrder32Host //only needed because CGImage.h added symbols in the minor version cgflags |= kCGBitmapByteOrder32Host; #endif CGImageRef cgImage = CGImageCreate(image.width(), image.height(), bitsPerColor, bitsPerPixel, image.bytesPerLine(), QCoreGraphicsPaintEngine::macGenericColorSpace(), cgflags, provider, 0, 0, kCGRenderingIntentDefault); return cgImage; } void * /*NSImage */qt_mac_create_nsimage(const QPixmap &pm) { QMacCocoaAutoReleasePool pool; if(QCFType image = pm.toMacCGImageRef()) { NSImage *newImage = 0; NSRect imageRect = NSMakeRect(0.0, 0.0, CGImageGetWidth(image), CGImageGetHeight(image)); newImage = [[NSImage alloc] initWithSize:imageRect.size]; [newImage lockFocus]; { CGContextRef imageContext = (CGContextRef)[[NSGraphicsContext currentContext] graphicsPort]; CGContextDrawImage(imageContext, *(CGRect*)&imageRect, image); } [newImage unlockFocus]; return newImage; } return 0; } QPixmap pixmap = icon.pixmap(size); CGImageRef cgImage = pixmap.toMacCGImageRef();//compile errors here image = 
[[NSImage alloc] initWithCGImage:cgImage size:NSZeroSize]; CFRelease(cgImage); Creates a CGImageRef equivalent to the QPixmap. Returns the CGImageRef handle. It is the caller's responsibility to release the CGImageRef data after use. Warning: This function is only available on Mac OS X. This function was introduced in Qt 4.2. #endif slowmovideo-0.5+git20180116/src/slowmoVideo/lib/video_enc.cpp0000664000000000000000000000150213151342440022325 0ustar rootroot#include "config.h" #include "video_enc.h" VideoWriter* CreateVideoWriter( const char* filename, int width, int height,double fps,int use_qt,const char* codec) { VideoWriter* driver; #ifdef USE_QTKIT if (use_qt) driver= CreateVideoWriter_QT(filename,width, height,fps,codec); else #endif #ifdef USE_FFMPEG driver= CreateVideoWriter_FFMPEG(filename,width, height,fps,codec); #endif return driver; } int WriteFrame( VideoWriter* writer, const QImage& frame) { return writer ? writer->writeFrame(frame) : 0; } int exportFrames(VideoWriter* writer,QString filepattern,int first,RenderTask_sV *progress) { return writer ? writer->exportFrames(filepattern,first,progress) : 0; } void ReleaseVideoWriter( VideoWriter** pwriter ) { if( pwriter && *pwriter ) { delete *pwriter; *pwriter = 0; } } slowmovideo-0.5+git20180116/src/slowmoVideo/lib/bezierTools_sV.h0000664000000000000000000000237613151342440023022 0ustar rootroot/* slowmoVideo creates slow-motion videos from normal-speed videos. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef BEZIERTOOLS_SV_H #define BEZIERTOOLS_SV_H #include "defs_sV.hpp" /** Contains common function for working with cubic bézier curves. */ class BezierTools_sV { public: /** \brief Interpolates the bézier curve at x value \c x. For an injective bézier curve (i.e. only one y value for each x value) this function calculates the y value at a given x , which may differ from the time \c t in interpolate(). */ static QPointF interpolateAtX(float x, QPointF p0, QPointF p1, QPointF p2, QPointF p3); /** \brief Interpolates the bézier curve at time \c t. This function interpolates the cubic bézier curve defined by end points \c p0 and \c p3 and handles \c p1 and \c p2 at time \f$ t \in [0,1] \f$. */ static QPointF interpolate(float t, QPointF p0, QPointF p1, QPointF p2, QPointF p3); }; #endif // BEZIERTOOLS_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/lib/intMatrix_sV.cpp0000664000000000000000000000275313151342440023032 0ustar rootroot/* slowmoVideo creates slow-motion videos from normal-speed videos. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #include "intMatrix_sV.h" #include IntMatrix_sV::IntMatrix_sV(int width, int height, int channels) : m_width(width), m_height(height), m_channels(channels) { m_data = new int[width*height*m_channels]; std::fill(m_data, m_data + width*height*channels, 0); } IntMatrix_sV::~IntMatrix_sV() { delete[] m_data; } int IntMatrix_sV::width() const { return m_width; } int IntMatrix_sV::height() const { return m_height; } int IntMatrix_sV::channels() const { return m_channels; } void IntMatrix_sV::operator +=(const unsigned char *bytes) { for (int i = 0; i < m_width*m_height*m_channels; i++) { m_data[i] += bytes[i]; } } void IntMatrix_sV::operator /=(int divisor) { for (int i = 0; i < m_width*m_height*m_channels; i++) { m_data[i] /= divisor; } } unsigned char* IntMatrix_sV::toBytesArray() const { unsigned char *arr = new unsigned char[m_width*m_height*m_channels]; for (int i = 0; i < m_width*m_height*m_channels; i++) { arr[i] = (unsigned char) m_data[i]; } return arr; } const int* IntMatrix_sV::data() const { return m_data; } slowmovideo-0.5+git20180116/src/slowmoVideo/lib/ffmpegEncode_sV.c0000664000000000000000000004021113151342440023064 0ustar rootroot/* This code is based on http://ffmpeg.org/doxygen/trunk/encoding_8c-source.html and http://ffmpeg.org/doxygen/trunk/muxing_8c-source.html and has been adjusted with a lot of help from Tjoppen at irc.freenode.org#ffmpeg. (Thanks!) Copyright (c) 2001 Fabrice Bellard 2011 Simon A. Eugster This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "ffmpegEncode_sV.h" #include void setErrorMessage(VideoOut_sV *video, const char *msg) { if (video->errorMessage != NULL) { free(video->errorMessage); } video->errorMessage = malloc(strlen(msg)+1); strcpy(video->errorMessage, msg); } void prepareDefault(VideoOut_sV *video) { prepare(video, "/tmp/ffmpegTest.avi", NULL, 352, 288, 400000, 1, 24); } int open_video(VideoOut_sV *video) { AVCodec *codec; AVCodecContext *cc; cc = video->streamV->codec; /* find the video encoder */ codec = avcodec_find_encoder(cc->codec_id); if (!codec) { char s[200]; sprintf(s, "Codec for ID %d could not be found.\n", cc->codec_id); fputs(s, stderr); setErrorMessage(video, s); return 3; } else { printf("Codec used: %s\n", codec->name); } /* open the codec */ #if LIBAVCODEC_VERSION_INT < AV_VERSION_INT(53,8,0) if (avcodec_open(cc, codec) < 0) { #else if (avcodec_open2(cc, codec, NULL) < 0) { #endif char s[200]; sprintf(s, "Could not open codec %s.\n", codec->long_name); fputs(s, stderr); return 3; } video->outbufV = NULL; if (!(video->fc->oformat->flags & AVFMT_RAWPICTURE)) { /* allocate output buffer */ /* XXX: API change will be done */ /* buffers passed into lav* can be allocated any way you prefer, as long as they're aligned enough for the architecture, and they're freed appropriately (such as using av_free for buffers allocated with av_malloc) */ // \todo av_get_picture_size? video->outbufSizeV = 200000; video->outbufV = av_malloc(video->outbufSizeV); } return 0; } int prepare(VideoOut_sV *video, const char *filename, const char *vcodec, const int width, const int height, const int bitrate, const unsigned int numerator, const unsigned int denominator) { #if LIBAVCODEC_VERSION_INT < AV_VERSION_INT(53,7,1) // Must be called before using the avcodec library. (Done automatically in more recent versions.) 
avcodec_init(); #endif video->frameNr = 0; video->errorMessage = NULL; video->filename = malloc(strlen(filename)+1); strcpy(video->filename, filename); /* initialize libavcodec, and register all codecs and formats */ av_register_all(); #if LIBAVFORMAT_VERSION_INT >= AV_VERSION_INT(53,19,0) avformat_network_init(); #endif /* allocate the output media context */ #if LIBAVFORMAT_VERSION_INT < AV_VERSION_INT(52,45,0) video->fc = avformat_alloc_context(); video->fc->oformat = guess_format(NULL, filename, NULL); strncpy(video->fc->filename, filename, sizeof(video->fc->filename)); #elif LIBAVFORMAT_VERSION_INT < AV_VERSION_INT(53,4,0) || defined(MOST_LIKELY_LIBAV) video->fc = avformat_alloc_context(); video->fc->oformat = av_guess_format(NULL, filename, NULL); strncpy(video->fc->filename, filename, sizeof(video->fc->filename)); #else // Actually introduced in 53.2.0 but not working in 53.3.0 packages avformat_alloc_output_context2(&video->fc, NULL, NULL, filename); if (!video->fc) { printf("Could not deduce output format from file extension: using MPEG.\n"); avformat_alloc_output_context2(&video->fc, NULL, "mpeg", filename); } #endif if (!video->fc) { const char *s = "Could allocate the output context, even MPEG is not available.\n"; fputs(s, stderr); setErrorMessage(video, s); return 2; } video->format = video->fc->oformat; printf("Using format %s.\n", video->format->name); /* Use the given vcodec if it is not NULL */ if (vcodec != NULL) { AVCodec *codec = avcodec_find_encoder_by_name(vcodec); if (codec == NULL) { char s[strlen(vcodec)+150]; sprintf(s, "No codec available for %s. Check the output of \nffmpeg -codecs\nto see a list of available codecs.\n", vcodec); fputs(s, stderr); setErrorMessage(video, s); return 2; } printf("Found codec: %s\n", codec->long_name); video->format->video_codec = codec->id; } /* add the audio and video streams using the default format codecs and initialize the codecs */ video->streamV = NULL; if (video->format->video_codec != CODEC_ID_NONE) { #if LIBAVFORMAT_VERSION_INT < AV_VERSION_INT(53,10,0) video->streamV = av_new_stream(video->fc, 0); #else video->streamV = avformat_new_stream(video->fc, 0); #endif if (!video->streamV) { const char *s = "Could not allocate the video stream.\n"; fputs(s, stderr); setErrorMessage(video, s); return 2; } AVCodecContext *cc = video->streamV->codec; cc->codec_id = video->format->video_codec; #if LIBAVCODEC_VERSION_INT < (52<<16 | 64<<8 | 0) cc->codec_type = CODEC_TYPE_VIDEO; #else cc->codec_type = AVMEDIA_TYPE_VIDEO; #endif cc->bit_rate = bitrate; /* resolution must be a multiple of two */ cc->width = width; cc->height = height; /* time base: this is the fundamental unit of time (in seconds) in terms of which frame timestamps are represented. for fixed-fps content, timebase should be 1/framerate and timestamp increments should be identically 1. */ cc->time_base = (AVRational){numerator, denominator}; cc->gop_size = 12; /* emit one intra frame every ten frames */ cc->pix_fmt = PIX_FMT_YUV420P; if (cc->codec_id == CODEC_ID_MPEG2VIDEO || cc->codec_id == CODEC_ID_MPEG4) { /* just for testing, we also add B frames */ cc->max_b_frames = 2; } if (cc->codec_id == CODEC_ID_MPEG1VIDEO){ /* Needed to avoid using macroblocks in which some coeffs overflow. This does not happen with normal video, it just happens here as the motion of the chroma plane does not match the luma plane. 
*/ cc->mb_decision=2; } // some formats want stream headers to be separate if(video->fc->oformat->flags & AVFMT_GLOBALHEADER) { cc->flags |= CODEC_FLAG_GLOBAL_HEADER; } video->rgbConversionContext = sws_getContext( cc->width, cc->height, PIX_FMT_BGRA, cc->width, cc->height, cc->pix_fmt, SWS_BICUBIC, NULL, NULL, NULL); // One line size for each plane. RGB consists of one plane only. // (YUV420p consists of 3, Y, Cb, and Cr video->rgbLinesize[0] = cc->width*4; video->rgbLinesize[1] = 0; video->rgbLinesize[2] = 0; video->rgbLinesize[3] = 0; if (video->rgbConversionContext == NULL) { char s[200]; sprintf(s, "Cannot initialize the RGB conversion context. Incorrect size (%dx%d)?\n", cc->width, cc->height); fputs(s, stderr); setErrorMessage(video, s); return 2; } printf("Settings: %dx%d, %d bits/s (tolerance: %d), %d/%d fps\n", cc->width, cc->height, cc->bit_rate, cc->bit_rate_tolerance, cc->time_base.den, cc->time_base.num); // printf("Stream settings: %d/%d fps\n", video->streamV->time_base.den, video->streamV->time_base.num); fflush(stdout); } else { const char *s = "No codec ID given.\n"; fputs(s, stderr); setErrorMessage(video, s); return 2; } #if LIBAVFORMAT_VERSION_INT <= AV_VERSION_INT(52,100,1) dump_format(video->fc, 0, filename, 1); #else av_dump_format(video->fc, 0, filename, 1); #endif /* now that all the parameters are set, we can open the audio and video codecs and allocate the necessary encode buffers */ if (video->streamV) { int ret = open_video(video); if (ret != 0) { return ret; } } else { const char *s = "Could not open video stream.\n"; fputs(s, stderr); setErrorMessage(video, s); return 2; } /* open the output file, if needed */ if (!(video->format->flags & AVFMT_NOFILE)) { #if LIBAVFORMAT_VERSION_INT <= AV_VERSION_INT(52,102,0) if (url_fopen(&video->fc->pb, filename, URL_WRONLY) < 0) { #else #if LIBAVFORMAT_VERSION_INT <= AV_VERSION_INT(53,0,0) if (avio_open(&video->fc->pb, filename, AVIO_WRONLY) < 0) { #else if (avio_open(&video->fc->pb, filename, AVIO_FLAG_WRITE) < 0) { #endif #endif // Check if non-ASCII characters are present in the file path char nonAscii = 0; int i; for (i = 0; i < strlen(video->filename); i++) { if ((unsigned short)video->filename[i] > 0x7f) { fprintf(stderr, "Contains non-ASCII character: %c (%d)\n", video->filename[i], (unsigned char)video->filename[i]); nonAscii = 1; } } // Build the error message char *msg; if (nonAscii == 1) { msg = "Could not open file (probably due to non-ASCII characters): "; } else { msg = "Could not open file: "; } char *msgAll = malloc(sizeof(char) * (strlen(filename) + strlen(msg))); strcpy(msgAll, msg); strcat(msgAll, filename); fputs(msgAll, stderr); setErrorMessage(video, msgAll); free(msgAll); return 5; } } /* write the stream header, if any */ #if LIBAVFORMAT_VERSION_INT <= AV_VERSION_INT(53,1,3) av_write_header(video->fc); #else avformat_write_header(video->fc, NULL); #endif /* alloc image and output buffer */ video->outbufSizeV = avpicture_get_size(video->streamV->codec->pix_fmt, width, height); video->outbufV = av_malloc(video->outbufSizeV); video->picture = avcodec_alloc_frame(); //TODO: replace by: av_frame_alloc(); ? 
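    /* A possible modern replacement for the deprecated allocation above/below, assuming a
     * newer FFmpeg where AVFrame handling lives in libavutil (only a sketch, not used here):
     *
     *   video->picture = av_frame_alloc();
     *   video->picture->format = video->streamV->codec->pix_fmt;
     *   video->picture->width  = video->streamV->codec->width;
     *   video->picture->height = video->streamV->codec->height;
     *   av_frame_get_buffer(video->picture, 32);  // allocates the image planes
     *
     * Error handling and a matching av_frame_free() in finish() would also be needed. */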
avpicture_alloc((AVPicture*)video->picture, video->streamV->codec->pix_fmt, video->streamV->codec->width, video->streamV->codec->height); if (!video->picture) { const char *s = "Could not allocate AVPicture.\n"; fputs(s, stderr); setErrorMessage(video, s); return 2; } return 0; } int eatARGB(VideoOut_sV *video, const unsigned char *data) { fflush(stdout); int ret = 0; AVCodecContext *cc = video->streamV->codec; #if LIBSWSCALE_VERSION_INT < AV_VERSION_INT(0,8,0) sws_scale(video->rgbConversionContext, (uint8_t**)&data, video->rgbLinesize, 0, cc->height, video->picture->data, video->picture->linesize ); #else sws_scale(video->rgbConversionContext, &data, video->rgbLinesize, 0, cc->height, video->picture->data, video->picture->linesize ); #endif if (video->fc->oformat->flags & AVFMT_RAWPICTURE) { /* raw video case. The API will change slightly in the near future for that */ AVPacket pkt; av_init_packet(&pkt); #if LIBAVCODEC_VERSION_INT < AV_VERSION_INT(52,30,2) pkt.flags |= PKT_FLAG_KEY; #else pkt.flags |= AV_PKT_FLAG_KEY; #endif pkt.stream_index = video->streamV->index; pkt.data = (uint8_t *)video->picture; pkt.size = sizeof(AVPicture); ret = av_interleaved_write_frame(video->fc, &pkt); } else { /* encode the image */ video->outSize = avcodec_encode_video(cc, video->outbufV, video->outbufSizeV, video->picture); //TODO: check usage of avcodec_encode_video2 ? /* if zero size, it means the image was buffered */ if (video->outSize > 0) { AVPacket pkt; av_init_packet(&pkt); if (cc->coded_frame->pts != AV_NOPTS_VALUE) { pkt.pts = av_rescale_q(cc->coded_frame->pts, cc->time_base, video->streamV->time_base); // printf("pkt.pts is %d.\n", pkt.pts); } if(cc->coded_frame->key_frame) { #if LIBAVCODEC_VERSION_INT < AV_VERSION_INT(52,30,2) pkt.flags |= PKT_FLAG_KEY; #else pkt.flags |= AV_PKT_FLAG_KEY; #endif } pkt.stream_index = video->streamV->index; pkt.data = video->outbufV; pkt.size = video->outSize; /* write the compressed frame in the media file */ ret = av_interleaved_write_frame(video->fc, &pkt); } else { ret = 0; } } if (ret != 0) { const char *s = "Error while writing video frame (interleaved_write).\n"; fputs(s, stderr); setErrorMessage(video, s); return ret; } printf("Added frame %d to %s.\n", video->frameNr, video->filename); video->frameNr++; return ret; } void eatSample(VideoOut_sV *video) { fflush(stdout); /* prepare a dummy image */ /* Y */ int x, y; for(y = 0; y < video->streamV->codec->height; y++) { for(x = 0; x < video->streamV->codec->width; x++) { video->picture->data[0][y * video->picture->linesize[0] + x] = x + y + video->frameNr * 3; } } /* Cb and Cr */ for(y = 0; y < video->streamV->codec->height/2; y++) { for(x = 0; x < video->streamV->codec->width/2; x++) { video->picture->data[1][y * video->picture->linesize[1] + x] = 128 + y + video->frameNr * 2; video->picture->data[2][y * video->picture->linesize[2] + x] = 64 + x + video->frameNr * 5; } } /* encode the image */ AVCodecContext *cc = video->streamV->codec; video->outSize = avcodec_encode_video(cc, video->outbufV, video->outbufSizeV, video->picture); /* if zero size, it means the image was buffered */ if (video->outSize > 0) { AVPacket pkt; av_init_packet(&pkt); if (cc->coded_frame->pts != AV_NOPTS_VALUE) { pkt.pts = av_rescale_q(cc->coded_frame->pts, cc->time_base, video->streamV->time_base); printf("pkt.pts is %lld.\n", pkt.pts); } if(cc->coded_frame->key_frame) { #if LIBAVCODEC_VERSION_INT < AV_VERSION_INT(52,30,2) pkt.flags |= PKT_FLAG_KEY; #else pkt.flags |= AV_PKT_FLAG_KEY; #endif } pkt.stream_index = 
video->streamV->index; pkt.data = video->outbufV; pkt.size = video->outSize; /* write the compressed frame in the media file */ av_interleaved_write_frame(video->fc, &pkt); } video->frameNr++; } void finish(VideoOut_sV *video) { /* write the trailer, if any. the trailer must be written * before you close the CodecContexts open when you wrote the * header; otherwise write_trailer may try to use memory that * was freed on av_codec_close() */ av_write_trailer(video->fc); /* close each codec */ if (video->streamV) { avcodec_close(video->streamV->codec); av_free(video->picture->data[0]); av_free(video->picture); av_free(video->outbufV); } /* free the streams */ int i; for(i = 0; i < video->fc->nb_streams; i++) { av_freep(&video->fc->streams[i]->codec); av_freep(&video->fc->streams[i]); } if (!(video->format->flags & AVFMT_NOFILE)) { /* close the output file */ #if LIBAVFORMAT_VERSION_INT <= AV_VERSION_INT(52,106,0) url_fclose(video->fc->pb); #else avio_close(video->fc->pb); #endif } /* free the stream */ av_free(video->fc); sws_freeContext(video->rgbConversionContext); printf("\nWrote to %s.\n", video->filename); #if LIBAVFORMAT_VERSION_INT >= AV_VERSION_INT(53,13,0) avformat_network_deinit(); #endif } slowmovideo-0.5+git20180116/src/slowmoVideo/lib/defs_sV.h0000664000000000000000000000107413151342440021434 0ustar rootroot#ifndef DEFS_SV_H #define DEFS_SV_H #include "macros_sV.h" #if defined(WINDOWS) && !defined(MXE) typedef __int64 int64_t; #else #include #endif /// Holds information about a video input file. typedef struct VideoInfoSV { /// Frame rate numerator int frameRateNum; /// Frame rate denominator int frameRateDen; /// Frame width int width; /// Frame height int height; /// Number of frames in total int64_t framesCount; /// Number of available video streams int streamsCount; } VideoInfoSV; #endif // DEFS_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/lib/shutter_sV.h0000664000000000000000000000222713151342440022212 0ustar rootroot/* slowmoVideo creates slow-motion videos from normal-speed videos. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef SHUTTER_SV_H #define SHUTTER_SV_H #include class FlowField_sV; /** \brief Simulates shutter (long exposure) with multiple images. */ class Shutter_sV { public: /// Combines the given images to a new image by addition and division. static QImage combine(const QStringList images); static QImage combine(const QList images); static QImage convolutionBlur(const QImage source, const FlowField_sV *flow, float length); static QImage convolutionBlur(const QImage interpolatedAtOffset, const FlowField_sV *flow, float length, float offset); private: struct ColorStack { ColorStack(); void add(QColor col); QColor col(); private: float r, g, b, a; int count; }; }; #endif // SHUTTER_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/lib/flowField_sV.h0000664000000000000000000000447213151342440022433 0ustar rootroot/* slowmoVideo creates slow-motion videos from normal-speed videos. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #ifndef FLOWFIELD_SV_H #define FLOWFIELD_SV_H /** \brief Represents a dense optical flow field. Values are internally stored in row order: \code [ x0 y0 x1 y1 x2 y2 ... xi yi xj yj xk yk ... ] \endcode \see FlowRW_sV for reading and writing flow fields. */ class FlowField_sV { public: /** OpenGL format */ enum GLFormat { GLFormat_RGB, GLFormat_RG }; static float nullValue; /** Constructor for uninitialized data */ FlowField_sV(int width, int height); /** Constructor for data read from OpenGL in the given \c format. */ FlowField_sV(int width, int height, float *data, GLFormat format); ~FlowField_sV(); /** \fn data() \return Pointer to the raw data. See the class description for the accurate format. */ /** \fn dataSize() \return Number of elements in the data array. */ /// Pointer to the raw data. See the class description for the accurate format. float* data(); /// Number of elements in the data array. int dataSize() const { return 2*m_width*m_height; } /// Width of the flow field int width() const { return m_width; } /// Height of the flow field int height() const { return m_height; } /// Flow in x direction at position (x|y) float x(int x, int y) const; /// Flow in y direction at position (x|y) float y(int x, int y) const; /// Reference to the value x(x, y) float& rx(int x, int y); /// Reference to the value y(x, y) float& ry(int x, int y); /// Sets the flow in x direction for the position (x|y) void setX(int x, int y, float value); /// Sets the flow in y direction for the position (x|y) void setY(int x, int y, float value); /// Equality test. Equal if all entries match. bool operator==(const FlowField_sV& other) const; private: int m_width; int m_height; float *m_data; }; #endif // FLOWFIELD_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/lib/interpolate_sV.cpp0000664000000000000000000003672313151342440023405 0ustar rootroot/* slowmoVideo creates slow-motion videos from normal-speed videos. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "interpolate_sV.h" #include "flowField_sV.h" #include "flowTools_sV.h" #include "sourceField_sV.h" #include "vector_sV.h" #include "bezierTools_sV.h" #ifdef WINDOWS #include #else #include #endif #include #include #include #define CLAMP1(x) ( ((x) > 1.0) ? 1.0 : (x) ) #define CLAMP(x,min,max) ( ((x) < (min)) ? (min) : ( ((x) > (max)) ? 
(max) : (x) ) ) #define INTERPOLATE //#define FIX_FLOW #define FIX_BORDERS //#define DEBUG_I enum ColorComponent { CC_Red, CC_Green, CC_Blue }; inline float interpR(const QColor cols[2][2], float x, float y) { return (1-x)*(1-y) * cols[0][0].redF() + x*(1-y) * cols[1][0].redF() + y*(1-x) * cols[0][1].redF() + x*y * cols[1][1].redF(); } inline float interpG(const QColor cols[2][2], float x, float y) { return (1-x)*(1-y) * cols[0][0].greenF() + x*(1-y) * cols[1][0].greenF() + y*(1-x) * cols[0][1].greenF() + x*y * cols[1][1].greenF(); } inline float interpB(const QColor cols[2][2], float x, float y) { return (1-x)*(1-y) * cols[0][0].blueF() + x*(1-y) * cols[1][0].blueF() + y*(1-x) * cols[0][1].blueF() + x*y * cols[1][1].blueF(); } QColor Interpolate_sV::interpolate(const QImage& in, float x, float y) { #ifdef DEBUG_I if (x >= in.width()-1 || y >= in.height()-1) { Q_ASSERT(false); } #endif QColor carr[2][2]; int floorX = floor(x); int floorY = floor(y); carr[0][0] = QColor(in.pixel(floorX, floorY)); carr[0][1] = QColor(in.pixel(floorX, floorY+1)); carr[1][0] = QColor(in.pixel(floorX+1, floorY)); carr[1][1] = QColor(in.pixel(floorX+1, floorY+1)); float dx = x - floorX; float dy = y - floorY; QColor out = QColor::fromRgbF( interpR(carr, dx, dy), interpG(carr, dx, dy), interpB(carr, dx, dy) ); return out; } /// validated. correct. QColor Interpolate_sV::blend(const QColor &left, const QColor &right, float pos) { Q_ASSERT(pos >= 0 && pos <= 1); float r = (1-pos)*left.redF() + pos*right.redF(); float g = (1-pos)*left.greenF() + pos*right.greenF(); float b = (1-pos)*left.blueF() + pos*right.blueF(); float a = (1-pos)*left.alphaF() + pos*right.alphaF(); r = CLAMP(r,0.0,1.0); g = CLAMP(g,0.0,1.0); b = CLAMP(b,0.0,1.0); a = CLAMP(a,0.0,1.0); return QColor::fromRgbF(r, g, b, a); } void Interpolate_sV::blend(ColorMatrix4x4 &c, const QColor &blendCol, float posX, float posY) { Q_ASSERT(posX >= 0 && posX <= 1); Q_ASSERT(posY >= 0 && posY <= 1); if (c.c00.alpha() == 0) { c.c00 = blendCol; } else { c.c00 = blend(c.c00, blendCol, std::sqrt((1-posX) * (1-posY))); } if (c.c10.alpha() == 0) { c.c10 = blendCol; } else { c.c10 = blend(c.c10, blendCol, std::sqrt( posX * (1-posY))); } if (c.c01.alpha() == 0) { c.c01 = blendCol; } else { c.c01 = blend(c.c01, blendCol, std::sqrt((1-posX) * posY)); } if (c.c11.alpha() == 0) { c.c11 = blendCol; } else {c.c11 = blend(c.c11, blendCol, std::sqrt(posX * posY)); } } void Interpolate_sV::twowayFlow(const QImage &left, const QImage &right, const FlowField_sV *flowForward, const FlowField_sV *flowBackward, float pos, QImage &output) { #ifdef INTERPOLATE const float Wmax = left.width()-1.0001; // A little less than the maximum pixel to avoid out of bounds when interpolating const float Hmax = left.height()-1.0001; float posX, posY; #endif QColor colOut, colLeft, colRight; float r,g,b; Interpolate_sV::Movement forward, backward; for (int y = 0; y < left.height(); y++) { for (int x = 0; x < left.width(); x++) { forward.moveX = flowForward->x(x, y); forward.moveY = flowForward->y(x, y); backward.moveX = flowBackward->x(x, y); backward.moveY = flowBackward->y(x, y); #ifdef INTERPOLATE posX = x - pos*forward.moveX; posY = y - pos*forward.moveY; posX = CLAMP(posX, 0, Wmax); posY = CLAMP(posY, 0, Hmax); colLeft = interpolate(left, posX, posY); posX = x - (1-pos)*backward.moveX; posY = y - (1-pos)*backward.moveY; posX = CLAMP(posX, 0, Wmax); posY = CLAMP(posY, 0, Hmax); colRight = interpolate(right, posX, posY); #else colLeft = QColor(left.pixel(x - pos*forward.moveX, y - 
pos*forward.moveY)); colRight = QColor(right.pixel(x - (1-pos)*backward.moveX , y - (1-pos)*backward.moveY)); #endif r = (1-pos)*colLeft.redF() + pos*colRight.redF(); g = (1-pos)*colLeft.greenF() + pos*colRight.greenF(); b = (1-pos)*colLeft.blueF() + pos*colRight.blueF(); colOut = QColor::fromRgbF( CLAMP1(r), CLAMP1(g), CLAMP1(b) ); output.setPixel(x,y, colOut.rgb()); } } } void Interpolate_sV::newTwowayFlow(const QImage &left, const QImage &right, const FlowField_sV *flowLeftRight, const FlowField_sV *flowRightLeft, float pos, QImage &output) { const int W = left.width(); const int H = left.height(); SourceField_sV leftSourcePixel(flowLeftRight, pos); leftSourcePixel.inpaint(); SourceField_sV rightSourcePixel(flowRightLeft, 1-pos); rightSourcePixel.inpaint(); float aspect = 1 - (.5 + std::cos(M_PI*pos)/2); #if defined(FIX_FLOW) FlowField_sV diffField(flowLeftRight->width(), flowLeftRight->height()); FlowTools_sV::difference(*flowLeftRight, *flowRightLeft, diffField); float diffSum; float tmpAspect; #endif #ifdef FIX_BORDERS bool leftOk; bool rightOk; #endif float fx, fy; QColor colLeft, colRight; for (int y = 0; y < H; y++) { for (int x = 0; x < W; x++) { #ifdef FIX_BORDERS fx = leftSourcePixel.at(x,y).fromX; fy = leftSourcePixel.at(x,y).fromY; if (fx >= 0 && fx < W-1 && fy >= 0 && fy < H-1) { colLeft = interpolate(left, fx, fy); leftOk = true; } else { fx = leftSourcePixel.at(x,y).fromX; fy = leftSourcePixel.at(x,y).fromY; fx = CLAMP(fx, 0, W-1.01); fy = CLAMP(fy, 0, H-1.01); colLeft = interpolate(left, fx, fy); leftOk = false; } fx = rightSourcePixel.at(x,y).fromX; fy = rightSourcePixel.at(x,y).fromY; if (fx >= 0 && fx < W-1 && fy >= 0 && fy < H-1) { colRight = interpolate(right, fx, fy); rightOk = true; } else { colRight = qRgb(0,255,0); rightOk = false; } if (leftOk && rightOk) { output.setPixel(x,y, blend(colLeft, colRight, aspect).rgba()); } else if (rightOk) { output.setPixel(x,y, colRight.rgba()); // output.setPixel(x,y, qRgb(255, 0, 0)); } else if (leftOk) { output.setPixel(x,y, colLeft.rgba()); // output.setPixel(x,y, qRgb(0, 255, 0)); } else { output.setPixel(x,y, colLeft.rgba()); } #else fx = leftSourcePixel.at(x,y).fromX; fy = leftSourcePixel.at(x,y).fromY; fx = CLAMP(fx, 0, W-1.01); fy = CLAMP(fy, 0, H-1.01); colLeft = interpolate(left, fx, fy); #ifdef FIX_FLOW diffSum = diffField.x(fx, fy)+diffField.y(fx, fy); if (diffSum > 5) { tmpAspect = 0; } else if (diffSum < -5) { tmpAspect = 1; } else { tmpAspect = aspect; } #endif fx = rightSourcePixel.at(x,y).fromX; fy = rightSourcePixel.at(x,y).fromY; fx = CLAMP(fx, 0, W-1.01); fy = CLAMP(fy, 0, H-1.01); colRight = interpolate(right, fx, fy); #ifdef FIX_FLOW diffSum = diffField.x(fx, fy)+diffField.y(fx, fy); if (diffSum < 5) { tmpAspect = 0; } else if (diffSum > -5) { tmpAspect = 1; } #endif #ifdef FIX_FLOW output.setPixel(x,y, blend(colLeft, colRight, tmpAspect).rgba()); #else output.setPixel(x,y, blend(colLeft, colRight, aspect).rgba()); #endif #endif } } } void Interpolate_sV::forwardFlow(const QImage &left, const FlowField_sV *flow, float pos, QImage &output) { qDebug() << "Interpolating flow at offset " << pos; #ifdef INTERPOLATE float posX, posY; const float Wmax = left.width()-1.0001; const float Hmax = left.height()-1.0001; #endif QColor colOut; Interpolate_sV::Movement forward; for (int y = 0; y < left.height(); y++) { for (int x = 0; x < left.width(); x++) { // Forward flow from the left to the right image tells for each pixel in the right image // from which location in the left image the pixel has come from. 
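            // Example: with pos = 0.5 and a flow vector of (4, 0) at (x, y), the pixel written
            // to (x, y) is sampled from (x - 2, y) in the left image (clamped to the frame).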
forward.moveX = flow->x(x, y); forward.moveY = flow->y(x, y); posX = x - pos*forward.moveX; posY = y - pos*forward.moveY; posX = CLAMP(posX, 0, Wmax); posY = CLAMP(posY, 0, Hmax); #ifdef INTERPOLATE colOut = interpolate(left, posX, posY); #else colOut = QColor(left.pixel(posX, posY)); #endif output.setPixel(x,y, colOut.rgb()); } } } void Interpolate_sV::newForwardFlow(const QImage &left, const FlowField_sV *flow, float pos, QImage &output) { const int W = left.width(); const int H = left.height(); // Calculate the source flow field SourceField_sV field(flow, pos); field.inpaint(); // Draw the pixels float fx, fy; for (int y = 0; y < H; y++) { for (int x = 0; x < W; x++) { // Since interpolate() uses the floor()+1 values, // set the maximum to a little less than size-1 // such that the pixel always lies inside. fx = field.at(x,y).fromX; fx = CLAMP(fx, 0, W-1.01); fy = field.at(x,y).fromY; fy = CLAMP(fy, 0, H-1.01); output.setPixel(x,y, interpolate(left, fx, fy).rgba()); } } } /** \todo fix bézier interpolation \code C prev / / / / / / A curr \ \ B next (can be NULL) \endcode */ void Interpolate_sV::bezierFlow(const QImage &prev, const QImage &right, const FlowField_sV *flowPrevCurr, const FlowField_sV *flowCurrNext, float pos, QImage &output) { const float Wmax = prev.width()-1.0001; const float Hmax = prev.height()-1.0001; Vector_sV a, b, c; Vector_sV Ta, Sa; float dist; QColor colOut; for (int y = 0; y < prev.height(); y++) { for (int x = 0; x < prev.width(); x++) { a = Vector_sV(x, y); // WHY minus? c = a + Vector_sV(flowPrevCurr->x(x, y), flowPrevCurr->y(x, y)); if (flowCurrNext != NULL) { b = a + Vector_sV(flowCurrNext->x(x,y), flowCurrNext->y(x,y)); } else { b = a; } dist = (b-a).length() + (c-a).length(); if (dist > 0) { Ta = b + ( (b-a).length() / dist ) * (c-b); Sa = (Ta - a).rotate90(); Sa = a + Sa; } else { Sa = a; } #ifdef DEBUG_I Sa = a; #endif QPointF position = BezierTools_sV::interpolate(pos, c.toQPointF(), c.toQPointF(), Sa.toQPointF(), a.toQPointF()); position.rx() = x - pos*flowPrevCurr->x(x,y); position.ry() = y - pos*flowPrevCurr->y(x,y); position.rx() = CLAMP(position.x(), 0, Wmax); position.ry() = CLAMP(position.y(), 0, Hmax); #ifdef DEBUG_I // if (x == 100 && y == 100 && false) { // qDebug() << "Interpolated from " << toString(c.toQPointF()) << ", " << toString(a.toQPointF()) << ", " // << toString(b.toQPointF()) << " at " << pos << ": " << toString(position); // } if (y % 4 == 0) { position.rx() = x; position.ry() = y; } #endif colOut = interpolate(prev, position.x(), position.y()); #ifdef DEBUG_I if (y % 4 == 1 && x % 2 == 0) { colOut = right.pixel(x, y); } #endif output.setPixel(x,y, colOut.rgb()); } } /* for (int y = 0; y < prev.height(); y++) { for (int x = 1; x < prev.width()-1; x++) { if (qAlpha(output.pixel(x,y)) == 0 && qAlpha(output.pixel(x-1,y)) > 0 && qAlpha(output.pixel(x+1,y)) > 0) { output.setPixel(x,y, qRgba( (qRed(output.pixel(x-1,y)) + qRed(output.pixel(x+1,y)))/2, (qGreen(output.pixel(x-1,y)) + qGreen(output.pixel(x+1,y)))/2, (qBlue(output.pixel(x-1,y)) + qBlue(output.pixel(x+1,y)))/2, (qAlpha(output.pixel(x-1,y)) + qAlpha(output.pixel(x+1,y)))/2 )); } } } for (int x = 0; x < prev.width(); x++) { for (int y = 1; y < prev.height()-1; y++) { if (qAlpha(output.pixel(x,y)) == 0 && qAlpha(output.pixel(x,y-1)) > 0 && qAlpha(output.pixel(x,y+1)) > 0) { output.setPixel(x,y, qRgba( (qRed(output.pixel(x,y-1)) + qRed(output.pixel(x,y+1)))/2, (qGreen(output.pixel(x,y-1)) + qGreen(output.pixel(x,y+1)))/2, (qBlue(output.pixel(x,y-1)) + 
qBlue(output.pixel(x,y+1)))/2, (qAlpha(output.pixel(x,y-1)) + qAlpha(output.pixel(x,y+1)))/2 )); } } } */ } /** * simple linear in time itnerpolation */ void Interpolate_sV::simpleinterpolate(const QImage &prev, const QImage &right, float pos, QImage &output) { QColor colOut; for (int y = 0; y < prev.height(); y++) { for (int x = 0; x < prev.width(); x++) { QRgb lt = prev.pixel(x,y); QRgb rt = right.pixel(x,y); int red = CLAMP((1-pos)*qRed(lt)+(pos)*qRed(rt),0,255); int green = CLAMP((1-pos)*qGreen(lt)+(pos)*qGreen(rt),0,255); int blue = CLAMP((1-pos)*qBlue(lt)+(pos)*qBlue(rt),0,255); QColor out = QColor::fromRgb(red,green,blue); output.setPixel(x,y, out.rgb()); } /* for x */ } /* for y */ } /** * simple nearest frame interoplation */ void Interpolate_sV::nearestinterpolate(const QImage &prev, const QImage &right, float pos, QImage &output) { if (pos<0.5) output = prev; else output = right; } slowmovideo-0.5+git20180116/src/slowmoVideo/lib/flowTools_sV.h0000664000000000000000000000331213151342440022500 0ustar rootroot#ifndef FLOWTOOLS_SV_H #define FLOWTOOLS_SV_H #include "flowField_sV.h" #include "kernel_sV.h" class FlowTools_sV { public: enum LineFillMode { HorizontalFromTop, HorizontalFromBottom, HorizontalFromBoth, VerticalFromLeft, VerticalFromRight, VerticalFromBoth }; enum CornerPosition { TopLeft, TopRight, BottomLeft, BottomRight }; static void difference(const FlowField_sV &left, const FlowField_sV &right, FlowField_sV &out); static void signedDifference(const FlowField_sV &left, const FlowField_sV &right, FlowField_sV &out); static void deleteRect(FlowField_sV &field, int top, int left, int bottom, int right); /** \brief Clears the content of the given rectangle and fills it with the surrounding pixels. The coordinates are inclusive, so filling (0,0,1,1) fills four pixels. The axis origin (0|0) is at top left. */ static void refill(FlowField_sV &field, int top, int left, int bottom, int right); static void refill(FlowField_sV &field, const Kernel_sV &kernel, int top, int left, int bottom, int right); static FlowField_sV* median(FlowField_sV const * const fa, FlowField_sV const * const fb, FlowField_sV const * const fc); static void fillRect(FlowField_sV &field, int top, int left, int bottom, int right, float vx, float vy); private: static void refillLine(FlowField_sV &field, int startTop, int startLeft, int length, LineFillMode fillMode); static void refillLine(FlowField_sV &field, const Kernel_sV &kernel, int startTop, int startLeft, int length, bool horizontal); static void refillCorner(FlowField_sV &field, int top, int left, CornerPosition pos); }; #endif // FLOWTOOLS_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/lib/ffmpegEncode_sV.h0000664000000000000000000000567513151342440023110 0ustar rootroot#ifndef FFMPEGENCODE_SV_H #define FFMPEGENCODE_SV_H /* Copyright (c) 2011 Simon A. Eugster This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "defs_sV.h" // Against the «UINT64_C not declared» message. 
// See: http://code.google.com/p/ffmpegsource/issues/detail?id=11 #ifdef __cplusplus #ifndef __STDC_CONSTANT_MACROS #define __STDC_CONSTANT_MACROS #ifdef _STDINT_H #undef _STDINT_H #endif # include #endif // __STDC_CONSTANT_MACROS #endif #include #include #if ((LIBAVFORMAT_VERSION_MAJOR == 53) && (LIBAVFORMAT_VERSION_MINOR >= 21) && (LIBAVFORMAT_VERSION_MICRO < 100)) #define MOST_LIKELY_LIBAV #endif /// This struct can eat frames and produces videos! /// Variables should not be changed from the outside. typedef struct VideoOut_sV { AVFrame *picture; ///< Temporary picture for the incoming frame AVFormatContext *fc; ///< Video's format context AVOutputFormat *format; ///< Just a shortcut to fc->format AVStream *streamV; ///< Video output stream /// Current frame number that is encoded int frameNr; /// Context for converting RGB frames to YUV420p struct SwsContext* rgbConversionContext; /// Required for converting RGB images int rgbLinesize[4]; /// Video filename char *filename; int outSize; int outbufSizeV; uint8_t *outbufV; /// Set if an error occurs (file does not exist, for example), for more accurate information. char *errorMessage; } VideoOut_sV; /// Prepares a default VideoOut_sV struct, mainly for testing purposes with eatSample(). void prepareDefault(VideoOut_sV *video); /** Prepares a VideoOut_sV struct. After preparation it is ready to eat RGB images. \param video VideoOut_sV struct to prepare. \param filename Target filename \param vcodec Video codec to use (see ffmpeg -codecs). May be \c NULL, in this case a default codec for the format will be chosen. \param width Video width \param height Video height \param bitrate Bit rate. width*height*fps seems to be a good choice for high quality. \param numerator A frame is shown for numerator/denominator s; i.e. the fps number is denominator/numerator. \param denominator See numerator. */ int prepare(VideoOut_sV *video, const char *filename, const char *vcodec, const int width, const int height, const int bitrate, const unsigned int numerator, const unsigned int denominator); /// Eats an RGB image and deposits it in the output frame. int eatARGB(VideoOut_sV *video, const unsigned char *data); /// Eats a sample image. For testing. void eatSample(VideoOut_sV *video); /// Finishes the produced video file. void finish(VideoOut_sV *video); #endif // FFMPEGENCODE_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/lib/ffmpeg_writer.h0000664000000000000000000000256213151342440022706 0ustar rootroot/* * class to export a movie using ffmpeg */ #ifndef _FFMPEG_WRITER #define _FFMPEG_WRITER #include #include #include #include #include "video_enc.h" // Against the «UINT64_C not declared» message. 
// See: http://code.google.com/p/ffmpegsource/issues/detail?id=11 #ifdef __cplusplus #ifndef __STDC_CONSTANT_MACROS #define __STDC_CONSTANT_MACROS #ifdef _STDINT_H #undef _STDINT_H #endif # include #endif // __STDC_CONSTANT_MACROS #endif extern "C" { // ffmpeg libs #include "../lib/ffmpegEncode_sV.h" } class VideoFFMPEG : public QObject, public VideoWriter { Q_OBJECT private: int mHeight; int mWidth; double movieFPS; char* m_filename; char* m_vcodec; VideoOut_sV *m_videoOut; QProcess *process; static QRegExp regexFrameNumber; RenderTask_sV *progress; int last; public: VideoFFMPEG(int width,int height,double fps,const char *vcodec,const char* vquality,const char *filename); ~VideoFFMPEG(); int writeFrame(const QImage& frame); int exportFrames(QString filepattern,int first, RenderTask_sV *progress); public slots: void processStarted(); void readOutput(); void encodingFinished(int); void ffmpegError(QProcess::ProcessError error); private slots: void process_state_changed(); }; #endif // _FFMPEG_WRITER slowmovideo-0.5+git20180116/src/slowmoVideo/lib/vector_sV.h0000664000000000000000000000457313151342440022024 0ustar rootroot/* slowmoVideo creates slow-motion videos from normal-speed videos. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef VECTOR_SV_H #define VECTOR_SV_H #include /** Represents a float vector. Supports adding and rotation by 90 degrees. */ class Vector_sV { public: /** \fn Vector_sV() \brief Creates a zero vector. */ /** \fn Vector_sV(float x, float y) \brief Creates a vector with the given coordinates. */ /** \fn Vector_sV(float fromX, float fromY, float toX, float toY) \brief Creates a vector from a \c from and a \c to point. */ Vector_sV(); Vector_sV(float x, float y); Vector_sV(float fromX, float fromY, float toX, float toY); /// Converts the vector to a point QPointF toQPointF() const; /// x component. See rx() for a reference to the x value. float x() const; /// y component. See ry() for a reference to the y value. float y() const; /// Reference to the x value, allows direct modification. float& rx(); /// Reference to the y value, allows direct modification. float& ry(); /// Calculates the euclidian length of the vector. float length() const; /// Rotates the vector by 90 degrees, clockwise if \c counterclock is set to \c false. Vector_sV& rotate90(bool counterclock = true); /// Multiplicates a vector with a constant factor. Vector_sV operator *(float factor); /// Adds two vectors. Vector_sV operator +(const Vector_sV &other); /// Subtracts two vectors. Vector_sV operator -(const Vector_sV &other); /// Multiplicates this vector with a factor. Vector_sV& operator *=(float factor); /// Adds a vector to this vector. Vector_sV& operator +=(const Vector_sV &other); /// Subtracts a vector from this vector. Vector_sV& operator -=(const Vector_sV &other); /// Equality test bool operator ==(const Vector_sV &other); /// Inequality test bool operator !=(const Vector_sV &other); private: float m_x; float m_y; }; Vector_sV operator *(const float &factor, const Vector_sV &other); #endif // VECTOR_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/lib/flowRW_sV.h0000664000000000000000000000466313151342440021742 0ustar rootroot/* slowmoVideo creates slow-motion videos from normal-speed videos. Copyright (C) 2011 Simon A. 
Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef FLOWRW_SV_H #define FLOWRW_SV_H //#include "defs_sV.hpp" #include class FlowField_sV; /** \brief Reads and writes Optical Flow fields. Binary format description: \code "flow_sV" 0x1(char) width(int) height(int) x0(float) y0(float) x1 y1 x2 y2 ... x[width*height-1] y[width*height-1] \endcode The number after \c flow_sV describes the file version, this allows to e.g. add compression in future. Width and height should match the image resolution. The following flow data describes the movement of each pixel in x and y direction. \see FlowField_sV */ class FlowRW_sV { public: /// Holds information about a flow file struct FlowInfo_sV { /// Flow field width int width; /// Flow field height int height; /// Flow field version for future changes in the file format char version; /// Magic number (first few bytes) std::string magic; /// Should be set to true if the flow is valid (valid magic number, valid version etc.) bool valid; /// Initially set to invalid. FlowInfo_sV() { magic = "flow_sV"; version = '1'; valid = false; } }; struct FlowRWError { std::string message; FlowRWError(std::string msg) : message(msg) {} }; /** \fn load(std::string) \return \c NULL, if the file could not be loaded, and the flow field otherwise. */ /** \fn save(std::string, FlowField_sV*); \see FlowField_sV::FlowField_sV(int, int, float*, FlowField_sV::GLFormat) */ /** \fn readInfo(std::string) \return Information about the flow file (like dimension); Does not read the whole file and is therefore faster than load(std::string). 
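    A typical round trip could look like this (sketch only; "forward.sVflow" is an example path,
    and error handling via FlowRWError / the NULL return of load() is omitted):
    \code
    FlowRW_sV::FlowInfo_sV info = FlowRW_sV::readInfo("forward.sVflow");
    if (info.valid) {
        FlowField_sV *field = FlowRW_sV::load("forward.sVflow");
        FlowRW_sV::save("copy.sVflow", field);
        delete field;
    }
    \endcode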
*/ static void save(std::string filename, FlowField_sV *flowField); static FlowField_sV* load(std::string filename) throw(FlowRWError); static FlowInfo_sV readInfo(std::string filename); private: static const std::string m_magicNumber; static const char m_version; }; #endif // FLOWRW_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/lib/CMakeLists.txt0000664000000000000000000000344413151342440022435 0ustar rootroot# Static libraries # http://www.itk.org/pipermail/insight-users/2007-November/024141.html # http://www.linux-magazin.de/Heft-Abo/Ausgaben/2007/02/Mal-ausspannen set(LIB_SRC_BASE defs_sV.cpp defs_sV.hpp vector_sV.cpp shutter_sV.cpp intMatrix_sV.cpp interpolate_sV.cpp bezierTools_sV.cpp sourceField_sV.cpp ) set(LIB_SRC_VIDEO defs_sV.h videoInfo_sV.cpp avconvInfo_sV.cpp ) set(LIB_SRC_ARGS trivialArgsReader_sV.cpp ) set(LIB_SRC_FLOW flowRW_sV.cpp flowField_sV.cpp flowTools_sV.cpp kernel_sV.cpp ) set(LIB_SRC_FLOWVIS flowVisualization_sV.cpp ) set(HEADERS flowRW_sV.h flowField_sV.h flowTools_sV.h ) set(LIB_SRC_ENCODE macros_sV.h video_enc.h ffmpeg_writer.h ffmpeg_writer.cpp video_enc.cpp ) set(SRCS_MOC ffmpeg_writer.h ) qt_wrap_cpp(MOC_OUT ${SRCS_MOC}) if (OLD_FFMPEG) set(LIB_SRC_ENCODE ${LIB_SRC_ENCODE} ffmpegEncode_sV.c ) endif() if (APPLE AND USE_QTKIT) set(LIB_SRC_ENCODE ${LIB_SRC_ENCODE} qtkit.h qtkit.mm ) endif() set(SRCS_MOC ffmpeg_writer.h ) qt_wrap_cpp(MOC_OUT ${SRCS_MOC}) message(STATUS "FFMPEG libraries are at ${FFMPEG_LIBRARIES}") include_directories(${FFMPEG_INCLUDE_PATHS}) add_library(sV STATIC ${LIB_SRC_BASE}) target_link_libraries(sV ${QT_LIBRARIES}) qt_use_modules(sV Core) qt_use_modules(sV Gui) add_library(sVinfo STATIC ${LIB_SRC_VIDEO}) qt_use_modules(sVinfo Core) qt_use_modules(sVinfo Gui) target_link_libraries(sVinfo ${FFMPEG_LIBRARIES}) add_library(sVencode STATIC ${LIB_SRC_ENCODE} ${MOC_OUT}) target_link_libraries(sVencode ${FFMPEG_LIBRARIES}) qt_use_modules(sVencode Core) qt_use_modules(sVencode Gui) add_library(sVflow STATIC ${LIB_SRC_FLOW}) add_library(sVvis STATIC ${LIB_SRC_FLOWVIS}) qt_use_modules(sVvis Core) qt_use_modules(sVvis Gui) target_link_libraries(sVvis sVflow ${QT_LIBRARIES}) slowmovideo-0.5+git20180116/src/slowmoVideo/lib/kernel_sV.cpp0000664000000000000000000000413213151342440022324 0ustar rootroot#include "kernel_sV.h" #include #include Kernel_sV::Kernel_sV(int radiusX, int radiusY) : m_radiusX(radiusX), m_radiusY(radiusY), m_nElements((2*m_radiusX+1)*(2*m_radiusY+1)) { m_data = new float[m_nElements]; } Kernel_sV::Kernel_sV(const Kernel_sV &other) : m_radiusX(other.m_radiusX), m_radiusY(other.m_radiusY), m_nElements((2*m_radiusX+1)*(2*m_radiusY+1)) { m_data = new float[m_nElements]; std::copy(other.m_data, other.m_data+m_nElements, m_data); } Kernel_sV::~Kernel_sV() { delete[] m_data; } int Kernel_sV::rX() const { return m_radiusX; } int Kernel_sV::rY() const { return m_radiusY; } void Kernel_sV::gauss() { float r; for (int dx = -m_radiusX; dx <= m_radiusX; dx++) { for (int dy = -m_radiusY; dy <= m_radiusY; dy++) { r = std::sqrt( std::pow(dx/float(m_radiusX),2) + std::pow(dy/float(m_radiusY),2) ); (*this)(dx, dy) = std::exp(-std::pow(r*2, 2)); } } } float& Kernel_sV::operator ()(int dx, int dy) const { int x = dx + m_radiusX; int y = dy + m_radiusY; int width = 2*m_radiusX+1; return m_data[y*width + x]; } Kernel_sV& Kernel_sV::operator =(const Kernel_sV &other) { m_radiusX = other.m_radiusX; m_radiusY = other.m_radiusY; m_nElements = (2*m_radiusX+1)*(2*m_radiusY+1); delete[] m_data; m_data = new float[m_nElements]; std::copy(other.m_data, 
other.m_data+m_nElements, m_data); return *this; } std::ostream& operator <<(std::ostream &cout, const Kernel_sV& kernel) { int prec = cout.precision(); cout.precision(4); cout.setf(std::ios::fixed,std::ios::floatfield); cout << "Kernel with radius " << kernel.rX() << "," << kernel.rY() << " at memory location " << &kernel << ":" << std::endl; for (int dy = -kernel.rY(); dy <= kernel.rY(); dy++) { for (int dx = -kernel.rX(); dx <= kernel.rX(); dx++) { cout << kernel(dx, dy) << " "; } cout << std::endl; } cout.precision(prec); return cout; } slowmovideo-0.5+git20180116/src/slowmoVideo/lib/videoInfo_sV.cpp0000664000000000000000000000733613151342440022777 0ustar rootroot/* slowmoVideo creates slow-motion videos from normal-speed videos. Copyright (C) 2014 Valery brasseur This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "videoInfo_sV.h" #include "defs_sV.hpp" #include #include #include /** * get information from video file */ VideoInfoSV getInfo(const char filename[]) { VideoInfoSV info; info.frameRateNum = 0; info.frameRateDen = 0; info.streamsCount = -1; info.framesCount = 0; qDebug() << "Reading info for file " << filename; //flush(stdout); double videorate; double duration; QString output; QProcess ffmpeg; QSettings settings; QString prog = settings.value("binaries/ffmpeg", "ffmpeg").toString(); QStringList args; args << "-i" << filename; args << "-f" << "null"; args << "/dev/null"; ffmpeg.start(prog, args); ffmpeg.waitForFinished(-1); QString videoInfo = ffmpeg.readAllStandardError(); ffmpeg.close(); // Example of output from 0.7 and up releases // Stream #0:0: Video: mpeg4 (Simple Profile) (DX50 / 0x30355844), yuv420p, 400x240 [SAR 1:1 DAR 5:3], 23 tbr, 23 tbn, 23 tbc // Duration: 00:00:11.13, start: 0.000000, bitrate: 6338 kb/s // Stream #0.0(eng): Video: mpeg4 (Main Profile), yuv420p, 1280x720 [PAR 1:1 DAR 16:9], 6309 kb/s, 30.69 fps, 90k tbn, 300 tbc /* prefer use of : avconv -i ~/Videos/MOV_0010.MP4 -f null /dev/nul frame= 336 fps= 0 q=0.0 Lsize= 0kB time=10.92 bitrate= 0.0kbits/s video:0kB audio:696kB global headers:0kB muxing overhead -100.000000% use frame= for fnum use time= divide by frame for fps */ qDebug() << "output : " << videoInfo; // find the source resolution //QRegExp rx("Stream.*Video:.*(([0-9]{2,5})x([0-9]{2,5}))"); QRegExp rx("Stream.*Video:.*([1-9][0-9]*)x([1-9][0-9]*).*"); //QRegExp rx("Stream.*Video:.*(\\d{2,})x(\\d{2,}).*"); //rx.setMinimal(true); if (-1 == rx.indexIn(videoInfo)) { qDebug() << "Could not find size."; return info; } info.width = rx.cap(1).toInt(); info.height = rx.cap(2).toInt(); // find the duration //rx.setPattern("Duration: (([0-9]+):([0-9]{2}):([0-9]{2}).([0-9]+))"); rx.setPattern("Duration: ([0-9]*):([0-9]*):([0-9]*\\.[0-9]*)"); if (-1 == rx.indexIn(videoInfo)) { qDebug() << "Could not find duration of stream."; return info; } int hours = rx.cap(1).toInt(); int minutes = rx.cap(2).toInt(); double seconds = rx.cap(3).toDouble(); duration = 3600*hours + 60*minutes + seconds; // container rate rx = QRegExp("([0-9\\.]+) fps"); rx.setMinimal(true); if (rx.indexIn(videoInfo) !=-1) { videorate = rx.cap(1).toDouble(); } else { rx = QRegExp("Video:.*, ([0-9]*\\.?[0-9]+) tbn"); rx.setMinimal(true); if (rx.indexIn(videoInfo) !=-1) { videorate = rx.cap(1).toDouble(); } else { videorate = 0; } } info.framesCount = duration * videorate; qDebug() << 
"calculated framesCount : " << info.framesCount << "with :" << duration << " and " << videorate; // TODO: correct time rx = QRegExp("frame=\\s*(\\d+).*time=(\\d+)"); rx.setMinimal(true); // beeter use of pos/offset ? rx.lastIndexIn (videoInfo); qDebug() << "frame" << rx.cap(1) << "time " << rx.cap(2); info.framesCount = rx.cap(1).toLong(); Fps_sV fps(videorate); info.frameRateNum = fps.num; info.frameRateDen = fps.den; info.streamsCount = 1; return info; } slowmovideo-0.5+git20180116/src/slowmoVideo/lib/bezierTools_sV.cpp0000664000000000000000000000304513151342440023347 0ustar rootroot/* slowmoVideo creates slow-motion videos from normal-speed videos. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "bezierTools_sV.h" #include #include QPointF BezierTools_sV::interpolateAtX(float x, QPointF p0, QPointF p1, QPointF p2, QPointF p3) { float delta = 1; float t = 0; int iterations = 10*(p3.x()-p0.x()); for (int i = 0; i < iterations; i++) { float plus = interpolate(t+delta, p0, p1, p2, p3).x(); float minus = interpolate(t-delta, p0, p1, p2, p3).x(); float norm = interpolate(t , p0, p1, p2, p3).x(); if ((t+delta) <= 1 && fabs(plus-x) < fabs(norm-x)) { t += delta; } else if ((t-delta) >= 0 && fabs(minus-x) < fabs(norm-x)) { t -= delta; } delta /= 2; } // std::cout << "Interpolating at t=" << t << " for x time " << 100*interpolate(t, p0, p1, p2, p3).x << ": " << interpolate(t, p0, p1, p2, p3).y << std::endl; return interpolate(t, p0, p1, p2, p3); } QPointF BezierTools_sV::interpolate(float t, QPointF p0, QPointF p1, QPointF p2, QPointF p3) { p0 = p0 * 1 * pow(t,0) * pow(1-t, 3); p1 = p1 * 3 * pow(t,1) * pow(1-t, 2); p2 = p2 * 3 * pow(t,2) * pow(1-t, 1); p3 = p3 * 1 * pow(t,3) * pow(1-t, 0); return p0+p1+p2+p3; } slowmovideo-0.5+git20180116/src/slowmoVideo/lib/flowField_sV.cpp0000664000000000000000000000542213151342440022762 0ustar rootroot/* slowmoVideo creates slow-motion videos from normal-speed videos. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #include "flowField_sV.h" #include "string.h" #include float FlowField_sV::nullValue = 65535; FlowField_sV::FlowField_sV(int width, int height) : m_width(width), m_height(height) { m_data = new float[2*m_width*m_height]; } FlowField_sV::FlowField_sV(int width, int height, float *data, FlowField_sV::GLFormat format) : m_width(width), m_height(height) { m_data = new float[2*m_width*m_height]; switch (format) { case GLFormat_RG: memcpy(m_data, data, width*height*2*sizeof(float)); break; case GLFormat_RGB: default: float *fieldData = m_data; int pos = 0; for (int i = 0; i < width*height; i++) { *(fieldData++) = data[pos++]; *(fieldData++) = data[pos++]; pos++; } } } FlowField_sV::~FlowField_sV() { delete[] m_data; } float FlowField_sV::x(int x, int y) const { return m_data[2*(y*m_width+x)+0]; } float FlowField_sV::y(int x, int y) const { return m_data[2*(y*m_width+x)+1]; } float& FlowField_sV::rx(int x, int y) { return m_data[2*(y*m_width+x)+0]; } float& FlowField_sV::ry(int x, int y) { return m_data[2*(y*m_width+x)+1]; } void FlowField_sV::setX(int x, int y, float value) { m_data[2*(y*m_width+x)+0] = value; } void FlowField_sV::setY(int x, int y, float value) { m_data[2*(y*m_width+x)+1] = value; } float* FlowField_sV::data() { return m_data; } bool FlowField_sV::operator ==(const FlowField_sV& other) const { if (m_width != other.m_width) { std::cout << "Width differs: " << m_width << " vs. " << other.m_width << "." << std::endl; return false; } if (m_height != other.m_height) { std::cout << "Height differs. " << m_height << " vs. " << other.m_height << "" << std::endl; return false; } for (int y = 0; y < m_height; y++) { for (int x = 0; x < m_width; x++) { if (this->x(x,y) != other.x(x,y)) { std::cout << "x Value differs at " << x << "," << y << ": " << this->x(x,y) << "/" << other.x(x,y) << std::endl; return false; } if (this->y(x,y) != other.y(x,y)) { std::cout << "y Value differs at " << x << "," << y << ": " << this->y(x,y) << "/" << other.y(x,y) << std::endl; return false; } } } return true; } slowmovideo-0.5+git20180116/src/slowmoVideo/lib/sourceField_sV.cpp0000664000000000000000000001005313151342440023307 0ustar rootroot/* slowmoVideo creates slow-motion videos from normal-speed videos. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "sourceField_sV.h" #include "flowField_sV.h" #include #include #define FIX_FLOW SourceField_sV::SourceField_sV(int width, int height) : m_width(width), m_height(height) { m_field = new Source[width*height]; } SourceField_sV::SourceField_sV(const SourceField_sV &other) : m_width(other.m_width), m_height(other.m_height) { m_field = new Source[m_width*m_height]; std::copy(other.m_field, other.m_field+m_width*m_height, m_field); } SourceField_sV::SourceField_sV(const FlowField_sV *flow, float pos) : m_width(flow->width()), m_height(flow->height()) { m_field = new Source[m_width*m_height]; for (int y = 0; y < m_height; y++) { for (int x = 0; x < m_width; x++) { float tx = x + pos * flow->x(x,y); float ty = y + pos * flow->y(x,y); // +.5: Round to nearest int ix = floor(tx+.5); int iy = floor(ty+.5); // The position the pixel moved to is a float, but to avoid very complex // interpolation (how to set a pixel at (55.3, 97.16) to red?), this information // is reverted (where did (55, 97) come from? 
-> (50.8, 101.23) which can be // interpolated easily from the source image) if (ix >= 0 && iy >= 0 && ix < m_width && iy < m_height) { at(ix, iy).set(x + (ix-tx), y + (iy-ty)); } } } } SourceField_sV::~SourceField_sV() { delete[] m_field; } void SourceField_sV::inpaint() { Source pos; SourceSum sum; int dist; bool xm, xp, ym, yp; SourceField_sV clone = *this; for (int y = 0; y < m_height; y++) { for (int x = 0; x < m_width; x++) { if (!clone.at(x,y).isSet) { pos = Source(x,y); sum.reset(); dist = 1; while (sum.count <= 2) { xm = (x-dist) >= 0; xp = (x+dist) < m_width; ym = (y-dist) >= 0; yp = (y+dist) < m_height; if (xm) { sum += clone.at(x-dist, y) - pos; } if (ym) { sum += clone.at(x, y-dist) - pos; } if (xp) { sum += clone.at(x+dist, y) - pos; } if (yp) { sum += clone.at(x, y+dist) - pos; } if (sum.count > 2) break; if (xm) { if (ym) { sum += clone.at(x-dist, y-dist) - pos; } if (yp) { sum += clone.at(x-dist, y+dist) - pos; } } if (xp) { if (ym) { sum += clone.at(x+dist, y-dist) - pos; } if (yp) { sum += clone.at(x+dist, y+dist) - pos; } } dist++; } at(x,y) = sum.norm() + pos; } } } } SourceField_sV& SourceField_sV::operator =(const SourceField_sV &other) { if (this != &other) { if (other.m_width != m_width || other.m_height != m_height) { m_width = other.m_width; m_height = other.m_height; delete m_field; m_field = new Source[m_width*m_height]; } std::copy(other.m_field, other.m_field+m_width*m_height, m_field); } return *this; } slowmovideo-0.5+git20180116/src/slowmoVideo/lib/vector_sV.cpp0000664000000000000000000000426213151342440022352 0ustar rootroot/* slowmoVideo creates slow-motion videos from normal-speed videos. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #include "vector_sV.h" #include Vector_sV::Vector_sV() { } Vector_sV::Vector_sV(float x, float y) : m_x(x), m_y(y) { } Vector_sV::Vector_sV(float fromX, float fromY, float toX, float toY) : m_x(toX - fromX), m_y(toY - fromY) { } QPointF Vector_sV::toQPointF() const { return QPointF(m_x, m_y); } float Vector_sV::x() const { return m_x; } float Vector_sV::y() const { return m_y; } float& Vector_sV::rx() { return m_x; } float& Vector_sV::ry() { return m_y; } float Vector_sV::length() const { return std::sqrt(std::pow(m_x, 2) + std::pow(m_y, 2)); } Vector_sV& Vector_sV::rotate90(bool counterclock) { float tmp = m_x; if (counterclock) { m_x = m_y; m_y = -tmp; } else { m_x = -m_y; m_y = tmp; } return *this; } Vector_sV Vector_sV::operator *(float factor) { return Vector_sV(factor * m_x, factor * m_y); } Vector_sV Vector_sV::operator +(const Vector_sV &other) { return Vector_sV(m_x + other.m_x, m_y + other.m_y); } Vector_sV Vector_sV::operator -(const Vector_sV &other) { return Vector_sV(m_x - other.m_x, m_y - other.m_y); } Vector_sV operator *(const float &factor, const Vector_sV &other) { return Vector_sV(factor * other.x(), factor * other.y()); } Vector_sV& Vector_sV::operator *=(float factor) { m_x *= factor; m_y *= factor; return *this; } Vector_sV& Vector_sV::operator +=(const Vector_sV &other) { m_x += other.m_x; m_y += other.m_y; return *this; } Vector_sV& Vector_sV::operator -=(const Vector_sV &other) { m_x -= other.m_x; m_y -= other.m_y; return *this; } bool Vector_sV::operator ==(const Vector_sV &other) { return m_x == other.m_x && m_y == other.m_y; } bool Vector_sV::operator !=(const Vector_sV &other) { return m_x != other.m_x || m_y != other.m_y; } slowmovideo-0.5+git20180116/src/slowmoVideo/lib/intMatrix_sV.h0000664000000000000000000000270013151342440022467 0ustar rootroot/* slowmoVideo creates slow-motion videos from normal-speed videos. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef INTMATRIX_SV_H #define INTMATRIX_SV_H /** \brief Simple matrix that can add image data to itself. This matrix is used for shutter simulation (i.e. motion blur). */ class IntMatrix_sV { public: /** \brief Creates a new image matrix. \c channels should match the number of channels of the images that will be added. */ IntMatrix_sV(int width, int height, int channels); ~IntMatrix_sV(); /// Matrix width int width() const; /// Matrix height int height() const; /// Number of colour channels int channels() const; /// Adds the input bytes (as row-wise image data, usually) to this matrix. void operator +=(const unsigned char *bytes); /// Scales the matrix, e.g. by the number of images added. void operator /=(int divisor); /// Converts the image to a byte array. Internal values are stored as int. unsigned char* toBytesArray() const; /// Image data. const int* data() const; private: int m_width; int m_height; int m_channels; int *m_data; }; #endif // INTMATRIX_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/lib/shutter_sV.cpp0000664000000000000000000001274013151342440022546 0ustar rootroot/* slowmoVideo creates slow-motion videos from normal-speed videos. Copyright (C) 2011 Simon A. 
Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "flowField_sV.h" #include "sourceField_sV.h" #include "interpolate_sV.h" #include "intMatrix_sV.h" #include "shutter_sV.h" #include #include #include #include #define CLAMP(x,min,max) ( ((x) < (min)) ? (min) : ( ((x) > (max)) ? (max) : (x) ) ) #define MIN_DIST 0.01 QImage Shutter_sV::combine(const QStringList images) { Q_ASSERT(images.size() > 0); QImage img(images.at(0)); IntMatrix_sV matrix(img.width(), img.height(), 4); for (int i = 0; i < images.size(); i++) { img = QImage(images.at(i)); matrix += img.bits(); } matrix /= images.size(); unsigned char *bytes = matrix.toBytesArray(); QImage result(matrix.width(), matrix.height(), QImage::Format_ARGB32); std::copy(bytes, bytes+matrix.width()*matrix.height()*matrix.channels()+1, result.bits()); delete[] bytes; return result; } QImage Shutter_sV::combine(const QList images) { Q_ASSERT(images.size() > 0); IntMatrix_sV matrix(images.at(0).width(), images.at(0).height(), 4); for (int i = 0; i < images.size(); i++) { matrix += images.at(i).bits(); } matrix /= images.size(); unsigned char *bytes = matrix.toBytesArray(); QImage result(matrix.width(), matrix.height(), QImage::Format_ARGB32); std::copy(bytes, bytes+matrix.width()*matrix.height()*matrix.channels()+1, result.bits()); delete[] bytes; return result; } QImage Shutter_sV::convolutionBlur(const QImage source, const FlowField_sV *flow, float length) { Q_ASSERT(source.width() == flow->width()); Q_ASSERT(source.height() == flow->height()); const float Wmax = source.width()-1.001; const float Hmax = source.height()-1.001; QImage blurred(source.size(), source.format()); ColorStack stack; float dx, dy; float xf, yf; int samples, inc; for (int y = 0; y < source.height(); y++) { for (int x = 0; x < source.width(); x++) { stack = ColorStack(); dx = length * flow->x(x,y); dy = length * flow->y(x,y); dx = CLAMP(x+dx, 0.0, Wmax)-x; dy = CLAMP(y+dy, 0.0, Hmax)-y; samples = ceil(std::sqrt(dx*dx + dy*dy)); if (samples < 1) { samples = 1; } inc = std::max(1, samples/20); // Lower inc value leads to a smoother result xf = CLAMP(x, 0.0, Wmax); yf = CLAMP(y, 0.0, Hmax); stack.add(source.pixel(x,y)); // Avoids interpolation error, and interpolation for (x,y) is not necessary anyway for (int i = 1; i <= samples; i += inc) { // \todo adjust increment stack.add(Interpolate_sV::interpolate(source, xf+float(i)/samples * dx, yf+float(i)/samples * dy)); } blurred.setPixel(x, y, stack.col().rgba()); } } return blurred; } QImage Shutter_sV::convolutionBlur(const QImage interpolatedAtOffset, const FlowField_sV *flow, float length, float offset) { Q_ASSERT(interpolatedAtOffset.width() == flow->width()); Q_ASSERT(interpolatedAtOffset.height() == flow->height()); // could be equal to 0, in case of integer ! 
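    // offset is the sub-frame position at which interpolatedAtOffset was rendered; further down,
    // the flow recovered from the source field is divided by offset to normalize it to one full
    // frame before it is scaled by the shutter length.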
//Q_ASSERT(offset > 0); Q_ASSERT(offset < 1); SourceField_sV source(flow, offset); source.inpaint(); const float Wmax = interpolatedAtOffset.width()-1.01; const float Hmax = interpolatedAtOffset.height()-1.01; QImage blurred(interpolatedAtOffset.size(), interpolatedAtOffset.format()); ColorStack stack; float dx, dy; float xf, yf; int samples, inc; for (int y = 0; y < interpolatedAtOffset.height(); y++) { for (int x = 0; x < interpolatedAtOffset.width(); x++) { stack = ColorStack(); dx = -(source.at(x,y).fromX - x); // Get the optical flow vector back from the source field dy = -(source.at(x,y).fromY - y); dx = dx/offset * length; // First normalize to one frame, then adjust the length dy = dy/offset * length; dx = CLAMP(x+dx, 0.0, Wmax)-x; dy = CLAMP(y+dy, 0.0, Hmax)-y; samples = ceil(std::sqrt(dx*dx + dy*dy)); if (samples < 1) { samples = 1; } inc = std::max(1, samples/20); xf = CLAMP(x, 0.0, Wmax); yf = CLAMP(y, 0.0, Hmax); stack.add(interpolatedAtOffset.pixel(x,y)); // Avoids interpolation error, and interpolation for (x,y) is not necessary anyway for (int i = 1; i <= samples; i += inc) { // \todo adjust increment stack.add(Interpolate_sV::interpolate(interpolatedAtOffset, xf+float(i)/samples * dx, yf+float(i)/samples * dy)); } blurred.setPixel(x, y, stack.col().rgba()); } } return blurred; } Shutter_sV::ColorStack::ColorStack() : r(0), g(0), b(0), a(0), count(0) {} void Shutter_sV::ColorStack::add(QColor col) { r += col.redF(); g += col.greenF(); b += col.blueF(); a += col.alphaF(); ++count; } QColor Shutter_sV::ColorStack::col() { return QColor::fromRgbF(r/count, g/count, b/count, a/count); } slowmovideo-0.5+git20180116/src/slowmoVideo/lib/macros_sV.h0000775000000000000000000000063213151342440022001 0ustar rootroot #if _WIN64 || __amd64__ #define BITS_64 #endif #if defined(__MINGW32__) && !defined(WINDOWS) #define WINDOWS (1) #endif #if defined _WIN32 || defined _WIN64 || defined WIN32 || defined _WIN32 || defined WINDOWS #define WINDOWS (1) #elif defined __linux__ #define LINUX #elif defined TARGET_OS_MAC || defined __APPLE__ #define OSX #else #error Operating system cannot be determined! #endif slowmovideo-0.5+git20180116/src/slowmoVideo/lib/qtkit.h0000664000000000000000000000117013151342440021174 0ustar rootroot/* * class to export a movie using QuickTime under OSX */ #include #import #include "video_enc.h" class VideoQT : public VideoWriter{ int mHeight; int mWidth; double movieFPS; NSString *codecSpec; NSString *qualitySpec; NSString *destPath; QTMovie* mMovie; NSDictionary *imageAttributes; QTTime duration; public: VideoQT(int width,int height,double fps,const char *vcodec,const char* vquality,const char *filename); ~VideoQT(); int writeFrame(const QImage& frame); int exportFrames(QString filepattern,int first,RenderTask_sV *progress); }; slowmovideo-0.5+git20180116/src/slowmoVideo/lib/defs_sV.hpp0000664000000000000000000001142513151342440021775 0ustar rootroot/* slowmoVideo creates slow-motion videos from normal-speed videos. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #ifndef DEFS_SV_HPP #define DEFS_SV_HPP #include "macros_sV.h" #if defined(WINDOWS) && !defined(MXE) typedef __int64 int64_t; #else #include #endif #include #include #include #include #include #include #include "version.h" /// Contains information about this slowmoVideo version namespace Version_sV { /// Major version number static int major = SLOWMOVIDEO_VERSION_MAJOR; /// Minor version number static int minor = SLOWMOVIDEO_VERSION_MINOR; /// Micro version number static int micro = SLOWMOVIDEO_VERSION_PATCH; /// Version number as string static QString version_short(QString("%1.%2.%3").arg(major).arg(minor).arg(micro)); static QString version(SLOWMOVIDEO_VERSION_FULL); /// Architecture static QString bits( #ifdef BITS_64 "64-bit" #else "32-bit" #endif ); /// Platform static QString platform( #if defined LINUX "Linux" #elif defined OSX "OSX" #elif defined WINDOWS "Windows" #endif ); } enum FlowDirection { FlowDirection_Forward, FlowDirection_Backward }; enum FrameSize { FrameSize_Orig = 1, FrameSize_Small = 2 }; enum CurveType { CurveType_Linear = 1, CurveType_Bezier = 2 }; enum TagAxis { TagAxis_Source = 1, TagAxis_Output = 2 }; enum InterpolationType { InterpolationType_Forward = 0, InterpolationType_ForwardNew = 1, InterpolationType_Twoway = 10, InterpolationType_TwowayNew = 11, InterpolationType_Bezier = 20 , InterpolationType_None = 30 , InterpolationType_Nearest = 40 }; enum MotionblurType { MotionblurType_Stacking = 0, MotionblurType_Convolving = 10, MotionblurType_Nearest = 20 }; /// Default colours used in slowmoVideo (e.g. in the user interface) namespace Colours_sV { static QColor colOk(158, 245, 94); ///< For checked text fields that are OK static QColor colBad(247, 122, 48); ///< For checked text fields that are invalid } /// For general errors. class Error_sV { public: /// Creates a new error object with the given information message. Error_sV(QString msg); /// Returns the information message. QString message() const; private: QString m_message; }; /// FPS representation, can guess numerator/denominator from a float value. struct Fps_sV { /// numerator int num; /// denominator int den; Fps_sV(int num, int den) throw(Error_sV); ///< den must be > 0. Fps_sV(float fps) throw(Error_sV); ///< Converts a float fps number to a fractional. 23.97 and 29.97 are detected. Fps_sV(QString fpsString) throw(Error_sV); ///< Accepts fps strings like 24000/1001 for 23.97 fps. QString toString() const; /// Frames per second as float. double fps() const { return double(num)/den; } }; /// For errors related to building optical flow. class FlowBuildingError : public Error_sV { public: /// Default constructor. FlowBuildingError(QString msg); }; /// For errors related to the frame source. class FrameSourceError : public Error_sV { public: /// Default constructor. 
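    /// \c msg is forwarded to Error_sV and additionally written to the debug log (see defs_sV.cpp).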
FrameSourceError(QString msg); }; class InterpolationError : public Error_sV { public: /// Default constructor InterpolationError(QString msg); }; QString toString(const QSize& size); QString toString(const FrameSize &size); QString toString(const FlowDirection &dir); QString toString(const CurveType &curveType); QString toString(const QPointF &p); QString toString(const TagAxis &axis); QString toString(const InterpolationType &interpolation); QString toString(const MotionblurType &interpolation); inline QDebug operator<<(QDebug qd, const FlowDirection &direction) { switch (direction) { case FlowDirection_Forward: qd << "Forward"; break; case FlowDirection_Backward: qd << "Backward"; break; default: qd << "Unknown direction"; Q_ASSERT(false); break; } return qd; } inline QDebug operator<<(QDebug qd, const FrameSize &frameSize) { switch(frameSize) { case FrameSize_Orig: qd << "Original frame size"; break; case FrameSize_Small: qd << "Small frame size"; break; default: qd << "Unknown frame size"; Q_ASSERT(false); break; } return qd; } #endif // DEFS_SV_HPP slowmovideo-0.5+git20180116/src/slowmoVideo/lib/flowVisualization_sV.cpp0000664000000000000000000000456413151342440024606 0ustar rootroot#include "flowVisualization_sV.h" #include #include #define CLAMP0255(x) ( (x) < 0 ? 0 : (x) > 255 ? 255 : (x) ) QImage FlowVisualization_sV::colourizeFlow(const FlowField_sV *flowField, ColourizingType type, float boost) { switch (type) { case WXY: return colourizeFlowWXY(flowField, boost); case HSV: return colourizeFlowHSV(flowField, boost); default: exit(33); } } QImage FlowVisualization_sV::colourizeFlowWXY(const FlowField_sV *flowField, float boost) { QImage img(flowField->width(), flowField->height(), QImage::Format_RGB32); int r,g,b; for (int y = 0; y < flowField->height(); y++) { for (int x = 0; x < flowField->width(); x++) { r = 127 + boost*flowField->x(x,y); g = 127 + boost*flowField->y(x,y); b = boost*std::sqrt(flowField->x(x,y)*flowField->x(x,y) + flowField->y(x,y)*flowField->y(x,y)); r = CLAMP0255(r); g = CLAMP0255(g); b = CLAMP0255(b); img.setPixel(x, y, qRgb(r,g,b)); } } return img; } QImage FlowVisualization_sV::colourizeFlowHSV(const FlowField_sV *flowField, float boost) { QImage img(flowField->width(), flowField->height(), QImage::Format_RGB32); int h, s, v; float dx, dy; float r, phi; for (int y = 0; y < flowField->height(); y++) { for (int x = 0; x < flowField->width(); x++) { dx = boost*flowField->x(x,y); dy = boost*flowField->y(x,y); r = std::sqrt(dx*dx + dy*dy); // // Variant a // if (r == 0) { // phi = 0; // } else if (y >= 0) { // phi = std::acos(dx/r); // } else { // phi = -std::acos(dx/r); // } // // Variant b // if (r+x == 0) { // phi = M_PI; // } else { // phi = 2*std::atan(dy/(r+dx)); // } // Variant easiest ... 
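            // Note: std::atan2 returns the polar angle of the boosted flow vector
            // directly and copes with dx == 0 or dy == 0, which the two hand-written
            // variants above had to special-case. The angle, shifted into [0, 2*pi),
            // becomes the hue (0..359); the vector length r becomes the saturation,
            // dimming the value once r exceeds 255, so flow direction maps to colour
            // and flow magnitude to colour intensity. With the argument order (dx, dy)
            // used below, a purely horizontal flow (dx > 0, dy = 0) ends up at roughly
            // hue 269.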
phi = std::atan2(dx, dy); phi += M_PI; h = 359*phi/(2*M_PI); s = r; v = 255; if (s > 255) { v = 255 - (r-255); s = 255; if (v < 0) { v = 0; } } img.setPixel(x, y, QColor::fromHsv(h, s, v).rgb()); } } return img; } slowmovideo-0.5+git20180116/src/slowmoVideo/lib/ffmpeg_writer.cpp0000664000000000000000000001155213151342440023240 0ustar rootroot/* * class to export a movie using ffmpeg */ #include #include #include #include #include #include #include "video_enc.h" #include "ffmpeg_writer.h" //#include "ffmpegEncode_sV.h" #include "defs_sV.hpp" // for reporting #include "../project/renderTask_sV.h" QRegExp VideoFFMPEG::regexFrameNumber("frame=\\s*(\\d+)"); VideoFFMPEG::VideoFFMPEG(int width,int height,double fps,const char *vcodec,const char* vquality,const char *filename) { m_videoOut = (VideoOut_sV*)malloc(sizeof(VideoOut_sV)); m_filename = strdup(filename); if (vcodec != 0) m_vcodec = strdup(vcodec); else m_vcodec = strdup("libx264"); // -b:v 5000k Fps_sV m_fps(fps); movieFPS = fps; mHeight = height; mWidth = width; #if 0 char *pcodec = NULL; if (m_vcodec.length() > 0) { pcodec = (char*)malloc(m_vcodec.length()+1); strcpy(pcodec, m_vcodec.toStdString().c_str()); } int worked = prepare(m_videoOut, m_filename, pcodec, width, height, fps * width * height, m_fps.den, m_fps.num); if (worked != 0) { //TODO: better here fprintf(stderr,"cannot create FFMPEG encoder\n"); } #endif process = 0; } VideoFFMPEG::~VideoFFMPEG() { //TODO: //m_dirFramesOrig.rmdir("."); if (process != 0) { if (process->state() == QProcess::Running) { abort(); process->waitForFinished(); } } free(m_vcodec); free(m_filename); free(m_videoOut); } #pragma mark - int VideoFFMPEG::writeFrame(const QImage& frame) { //TODO: check this #if 0 return eatARGB(m_videoOut, frame.bits()); #else return (-1); #endif } int VideoFFMPEG::exportFrames(QString filepattern,int first,RenderTask_sV *progress) { QSettings settings; qDebug() << "exporting frame from : " << filepattern << " to " << m_filename; QStringList args; args << "-r" << QString::number(movieFPS); args << "-f" << "image2"; if (first != 0) args << "-start_number" << QString::number(first); args << "-i" << filepattern; args << "-vcodec" << m_vcodec; args << "-s" << QString("%1x%2").arg(QString::number(mWidth), QString::number(mHeight)); args << m_filename; qDebug() << "Arguments: " << args; this->progress = progress; last = 0; process = new QProcess; //QObject::connect(process, SIGNAL(started()), this, SLOT(processStarted())); QObject::connect(process, SIGNAL(finished(int)), this, SLOT(encodingFinished(int))); QObject::connect(process,SIGNAL(readyReadStandardOutput()),this,SLOT(readOutput())); QObject::connect(process,SIGNAL(readyReadStandardError()),this,SLOT(readOutput())); QObject::connect(process, SIGNAL(error(QProcess::ProcessError)), this, SLOT(ffmpegError(QProcess::ProcessError))); QObject::connect(process, SIGNAL(stateChanged(QProcess::ProcessState)), this, SLOT(process_state_changed())); process->start(settings.value("binaries/ffmpeg", "ffmpeg").toString(), args); if (!process->waitForStarted()) { qDebug() << "can't start encoding !"; process->deleteLater(); process = 0; return 1; } // warn: default timeout at 30s ! process->waitForFinished(-1); // let time goes on ! 
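    // Note: waitForFinished(-1) blocks without the default 30 s timeout, so long
    // encodes are not cut short. While ffmpeg runs, readOutput() -- connected above to
    // readyReadStandardOutput()/readyReadStandardError() -- matches "frame=<n>" lines
    // with regexFrameNumber and passes the frame delta to progress->stepProgress(),
    // which is how the render task receives its progress updates.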
qDebug() << process->readAllStandardOutput(); qDebug() << process->readAllStandardError(); process->terminate(); qDebug() << "exit : " << process->exitStatus(); delete process; process = 0; return 0; } #pragma mark - #pragma mark C bridge void VideoFFMPEG::process_state_changed() { if (process->state() == QProcess::Starting) { qDebug() << "Process is starting up..."; } if (process->state() == QProcess::Running) { qDebug() << "Process is now running."; } if (process->state() == QProcess::NotRunning) { qDebug() << "Process is finished running."; } } void VideoFFMPEG::processStarted() { qDebug() << "process started"; } void VideoFFMPEG::readOutput() { QRegExp regex(regexFrameNumber); //qDebug() << "process read"; QString line = process->readAllStandardOutput(); //qDebug() << "got [" << line << "]"; line = process->readAllStandardError(); if (regex.lastIndexIn(line) >= 0) { //emit signalTaskProgress(;); //qDebug() << "prog update : " << regex.cap(1).toInt(); progress->stepProgress(regex.cap(1).toInt()-last); last = regex.cap(1).toInt(); } //qDebug() << "got " << line; //TODO: may check if we need to stop/cancel here // qprocess->kill() ? } void VideoFFMPEG::ffmpegError(QProcess::ProcessError error) { qDebug() << "ffmpeg finish with error : " << error; } void VideoFFMPEG::encodingFinished(int error) { if (error != 0) qDebug() << "process finish with error : " << error; } VideoWriter* CreateVideoWriter_FFMPEG( const char* filename, int width, int height, double fps, const char *codec) { VideoFFMPEG* driver= new VideoFFMPEG (width,height,fps,codec,0,filename); return driver; } slowmovideo-0.5+git20180116/src/slowmoVideo/lib/sourceField_sV.h0000664000000000000000000001003613151342440022755 0ustar rootroot/* slowmoVideo creates slow-motion videos from normal-speed videos. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef SOURCEFIELD_SV_H #define SOURCEFIELD_SV_H class FlowField_sV; /** \brief Counterpart to an Optical Flow field, says where a pixel came from. \todo Reversing vectors: Take smallest one? */ class SourceField_sV { public: /** \fn SourceField_sV(int width, int height) \brief Creates an empty source field. */ /** \fn SourceField_sV(const FlowField_sV *flow, float pos) \brief Converts the optical flow to «where did the pixel come from?» from the target image's perspective. The target image is at position \c pos. inpaint() is \em not called by this method. */ /** \fn inpaint() \brief Fills holes in the source field. When generating a source field from an optical flow field, for example: \code 0 0 1 1 0 0 | v 0 1 1 1 1 0 \endcode then the source field might look like this: \code 0 1 -1 0 \endcode since the two edges moved one pixel left or right, respectively, and no pixel «went to» the part in the middle. This function interpolates those from nearby members whose source location is known. */ SourceField_sV(const SourceField_sV &other); SourceField_sV(int width, int height); SourceField_sV(const FlowField_sV *flow, float pos); ~SourceField_sV(); /** Represents a source flow pixel. For a Source at (x|y), the pixel at (x|y) originated at (fromX|fromY). This value usually is a float and the colour at its location should therefore be interpolated to avoid aliasing artifacts. 
*/ struct Source { /// x coordinate of the origin float fromX; /// y coordinate of the origin float fromY; /// \c false if this item is still a «hole» (see inpaint()) bool isSet; /// Creates an empty Source (i.e. a hole) Source() : isSet(false) {} /// Creates an initialized Source Source(float fromX, float fromY) : fromX(fromX), fromY(fromY), isSet(true) {} /// Sets the source coordinates. Is not a hole anymore afterwards. void set(float x, float y) { fromX = x; fromY = y; isSet = true; } Source operator -(const Source &other) const { if (other.isSet && isSet) { return Source(fromX - other.fromX, fromY - other.fromY); } return Source(); } Source operator +(const Source &other) const { if (other.isSet && isSet) { return Source(fromX + other.fromX, fromY + other.fromY); } return Source(); } }; /// Source array. Only public for the inline function at(). Source *m_field; /// Returns the Source at the given coordinates, for calculating the colour at (x|y). inline Source& at(int x, int y) { return m_field[m_width*y + x]; } void inpaint(); SourceField_sV& operator =(const SourceField_sV &other); private: int m_width; int m_height; struct SourceSum { float x; float y; int count; SourceSum() { reset(); } Source norm() { if (count == 0) { return Source(); } else { return Source(x/count, y/count); } } void operator +=(const Source &other) { if (other.isSet) { count++; x += other.fromX; y += other.fromY; } } void reset() { x = 0; y = 0; count = 0; } }; }; #endif // SOURCEFIELD_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/lib/defs_sV.cpp0000664000000000000000000001036513151342440021772 0ustar rootroot#include "defs_sV.hpp" Error_sV::Error_sV(QString msg) : m_message(msg) { qDebug() << msg; } QString Error_sV::message() const { return m_message; } FlowBuildingError::FlowBuildingError(QString msg) : Error_sV(msg) { qDebug() << "Flow building error: " << msg; } FrameSourceError::FrameSourceError(QString msg) : Error_sV(msg) { qDebug() << "Frame source error: " << msg; } InterpolationError::InterpolationError(QString msg) : Error_sV(msg) { qDebug() << "Interpolation error : " << msg; } QString toString(const QSize &size) { return QString::fromUtf8("%1×%2").arg(size.width()).arg(size.height()); } QString toString(const FrameSize &size) { switch (size) { case FrameSize_Orig: return QObject::tr("Orig"); case FrameSize_Small: return QObject::tr("Small"); default: Q_ASSERT(false); return QObject::tr("Unknown size"); } } QString toString(const FlowDirection &dir) { switch (dir) { case FlowDirection_Forward: return QObject::tr("Forward"); case FlowDirection_Backward: return QObject::tr("Backward"); default: Q_ASSERT(false); return QObject::tr("Unknown direction"); } } QString toString(const CurveType &curveType) { switch (curveType) { case CurveType_Linear: return QObject::tr("Linear"); case CurveType_Bezier: return QObject::trUtf8("Bézier"); default: Q_ASSERT(false); return QObject::tr("Unknown curve type"); } } QString toString(const QPointF &p) { return QString("(%1|%2)").arg(p.x()).arg(p.y()); } QString toString(const TagAxis axis) { switch (axis) { case TagAxis_Source: return QObject::tr("Source axis"); case TagAxis_Output: return QObject::tr("Output axis"); default: Q_ASSERT(false); return QObject::tr("Unknown axis"); } } QString toString(const InterpolationType &interpolation) { switch (interpolation) { case InterpolationType_Forward: return QObject::tr("Forward interpolation (fast)"); case InterpolationType_ForwardNew: return QObject::tr("Forward interpolation (accurate)"); case 
InterpolationType_Twoway: return QObject::tr("Two-way interpolation (fast)"); case InterpolationType_TwowayNew: return QObject::tr("Two-way interpolation (accurate)"); case InterpolationType_Bezier: return QObject::trUtf8("Bézier interpolation"); case InterpolationType_None: return QObject::trUtf8("Linear interpolation"); case InterpolationType_Nearest: return QObject::trUtf8("Nearest Frame interpolation"); default: return QObject::tr("Unknown interpolation"); } } QString toString(const MotionblurType &type) { switch (type) { case MotionblurType_Stacking: return QObject::tr("Stacking"); case MotionblurType_Convolving: return QObject::tr("Convolution"); case MotionblurType_Nearest: return QObject::tr("Nearest (no blurring)"); default: Q_ASSERT(false); return QString("Unknown motion blur type"); } } Fps_sV::Fps_sV(int num, int den) throw(Error_sV) : num(num), den(den) { if (den <= 0) { throw Error_sV("FPS denominator must be >= 0."); } } Fps_sV::Fps_sV(QString fpsString) throw(Error_sV) { QRegExp e("(\\d+)\\/(\\d+)"); if (e.exactMatch(fpsString)) { num = e.cap(1).toInt(); den = e.cap(2).toInt(); if (den <= 0) { throw Error_sV("FPS denominator must be >= 0."); } } else { throw Error_sV("Cannot create fps value from " + fpsString); } } Fps_sV::Fps_sV(float fps) throw(Error_sV) { if (fps <= 0) { throw Error_sV(QString("FPS value must be larger than zero (is: %1)").arg(fps)); } // Check for 23.976 and similar numbers (24*1000/1001) if (fabs(1000*ceil(fps)-1001*fps) < 7) { num = 1000*ceil(fps); den = 1001; } else { num = 100000*fps; den = 100000; // Prettify for (int i = 10; i > 1; i--) { while (num % i == 0 && den % i == 0) { num /= i; den /= i; } } } } QString Fps_sV::toString() const { return QString("%1/%2").arg(num).arg(den); } slowmovideo-0.5+git20180116/src/slowmoVideo/lib/kernel_sV.h0000664000000000000000000000136013151342440021771 0ustar rootroot#ifndef KERNEL_SV_H #define KERNEL_SV_H #include "flowField_sV.h" #include /** Kernel for convolution This time following the Rule of Three: http://en.wikipedia.org/wiki/Rule_of_three_%28C%2B%2B_programming%29 */ class Kernel_sV { public: Kernel_sV(int radiusX, int radiusY); Kernel_sV(const Kernel_sV &other); ~Kernel_sV(); void gauss(); int rX() const; int rY() const; float& operator()(int dx, int dy) const; Kernel_sV& operator=(const Kernel_sV &other); private: int m_radiusX; int m_radiusY; int m_nElements; /// Data storage in row major order (line-wise) float *m_data; }; std::ostream& operator <<(std::ostream &cout, const Kernel_sV& kernel); #endif // KERNEL_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/lib/flowRW_sV.cpp0000664000000000000000000000670613151342440022275 0ustar rootroot/* slowmoVideo creates slow-motion videos from normal-speed videos. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #include "flowRW_sV.h" #include "flowField_sV.h" #include #include const std::string FlowRW_sV::m_magicNumber = "flow_sV"; const char FlowRW_sV::m_version = 1; void FlowRW_sV::save(std::string filename, FlowField_sV *flowField) { int width = flowField->width(); int height = flowField->height(); std::cout << "Writing flow file " << filename << ": " << width << "x" << height << ", version " << (int)m_version << ", magic number " << m_magicNumber << std::endl; float *data = flowField->data(); std::ofstream file(filename.c_str(), std::ios_base::out | std::ios_base::binary); file.write((char*) m_magicNumber.c_str(), m_magicNumber.length()*sizeof(char)); file.write((char*) &m_version, sizeof(char)); file.write((char*) &width, sizeof(int)); file.write((char*) &height, sizeof(int)); file.write((char*) data, sizeof(float)*flowField->dataSize()); file.close(); } FlowRW_sV::FlowInfo_sV FlowRW_sV::readInfo(std::string filename) { FlowInfo_sV info; std::ifstream file(filename.c_str(), std::ios_base::in | std::ios_base::binary); char *magic = new char[m_magicNumber.size()+1]; magic[m_magicNumber.size()] = 0; file.read(magic, sizeof(char)*m_magicNumber.size()); file.read(&info.version, sizeof(char)); info.magic = std::string(magic); delete[] magic; file.read((char*) &info.width, sizeof(int)); file.read((char*) &info.height, sizeof(int)); if (file.rdstate() == std::ios::goodbit) { if (info.magic.compare(m_magicNumber) == 0) { info.valid = true; } std::cout << "Magic number: " << info.magic << ", version: " << (int)info.version << ", size: " << info.width << "x" << info.height << std::endl; } else { std::cerr << "Failed to read width/height from " << filename << "." << std::endl; } file.close(); return info; } FlowField_sV* FlowRW_sV::load(std::string filename) throw(FlowRWError) { std::ifstream file(filename.c_str(), std::ios_base::in | std::ios_base::binary); char *magic = new char[m_magicNumber.size()+1]; magic[m_magicNumber.size()] = 0; char version; file.read(magic, sizeof(char)*m_magicNumber.size()); file.read((char*) &version, sizeof(char)); int width, height; file.read((char*) &width, sizeof(int)); file.read((char*) &height, sizeof(int)); if (file.rdstate() != std::ios::goodbit) { file.close(); throw FlowRWError("Failed to read width/height from file " + filename); } FlowField_sV *field = new FlowField_sV(width, height); file.read((char*) field->data(), sizeof(float)*field->dataSize()); if (file.rdstate() != std::ios::goodbit) { delete field; file.close(); throw FlowRWError("Failed to read data from file " + filename); } file.close(); std::cout << "Read flow file of size " << field->width() << "×" << field->height() << ". 
Magic number: " << magic << ", version: " << (int)version << std::endl; delete[] magic; return field; } slowmovideo-0.5+git20180116/src/slowmoVideo/lib/flowVisualization_sV.h0000664000000000000000000000100213151342440024233 0ustar rootroot#ifndef FLOWVISUALIZATION_SV_H #define FLOWVISUALIZATION_SV_H #include "flowField_sV.h" #include class FlowVisualization_sV { public: enum ColourizingType { WXY, HSV }; static QImage colourizeFlow(const FlowField_sV *flowField, ColourizingType type, float boost = 1.0); private: static QImage colourizeFlowWXY(const FlowField_sV *flowField, float boost = 1.0); static QImage colourizeFlowHSV(const FlowField_sV *flowField, float boost = 1.0); }; #endif // FLOWVISUALIZATION_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/lib/video_enc.h0000664000000000000000000000161313151342440021775 0ustar rootroot/* * global abstract file */ #ifndef _VID_W_H #define _VID_W_H #include class RenderTask_sV; class VideoWriter { public: virtual ~VideoWriter() {}; virtual int writeFrame(const QImage& frame) = 0; virtual int exportFrames(QString filepattern,int first,RenderTask_sV* progress) = 0; }; VideoWriter* CreateVideoWriter( const char* filename, int width,int height,double fps,int use_qt,const char* codec); void ReleaseVideoWriter( VideoWriter** pwriter ); int WriteFrame( VideoWriter* writer, const QImage& frame); int exportFrames(VideoWriter* pwriter,QString filepattern,int first,RenderTask_sV* progress); /* lib dependant ... */ VideoWriter* CreateVideoWriter_FFMPEG(const char* filename, int width,int height,double fps,const char* codec); VideoWriter* CreateVideoWriter_QT ( const char* filename, int width,int height,double fps,const char* codec); #endif /* _VID_W_H */ slowmovideo-0.5+git20180116/src/slowmoVideo/lib/avconvInfo_sV.h0000664000000000000000000000105413151342440022621 0ustar rootroot#ifndef AVCONVINFO_H #define AVCONVINFO_H #include "../lib/defs_sV.hpp" #include class AvconvInfo { public: enum Distribution { Dist_ffmpeg, Dist_avconv }; static bool testAvconvExecutable(QString path); AvconvInfo(); bool locate(QString executablePath = ""); QString executablePath() const; QString optionSameQuant() const; Distribution distribution() const; void printInfo() const; private: void identify(); QString m_executablePath; Distribution m_dist; }; #endif // AVCONVINFO_H slowmovideo-0.5+git20180116/src/slowmoVideo/lib/interpolate_sV.h0000664000000000000000000000644413151342440023047 0ustar rootroot/* slowmoVideo creates slow-motion videos from normal-speed videos. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include class QImage; class FlowField_sV; /** \short Provides interpolation methods between frames \todo Half resolution for optical flow */ class Interpolate_sV { public: /** \fn forwardFlow() Interpolates a frame using only the flow from the first to the second frame. This algorithm is simplified and only partly correct since it assumes the flow field to tell where a pixel came from and not where it went to, which usually leads to artifacts like on object boundaries or like objects that do not move as far as they should, but is much easier to interpolate. */ /** \fn newForwardFlow Like forwardFlow(), but uses the forward flow correctly. 
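    («Correctly» meaning that the stored vector is treated as describing where a pixel of
    the \em left frame moves \em to, so pixels are scattered forward into the new frame
    instead of being looked up backwards as forwardFlow() does.)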
This includes more work like filling holes if an object expanded (a pixel then becomes larger, or «multiplies», which cannot be expressed with usual optical flow (where did the pixel go to? cannot be answered since it went to multiple locations). The benefit is that this algorithm works more precisely. */ /** \fn twowayFlow() Interpolates a frame using optical flow from the first to the second frame, as well as from the second to the first frame. */ /** \fn newTwowayFlow() Like twowayFlow(), but uses forward and backward flow correctly. See also newForwardFlow(). */ /** \fn interpolate(const QImage &in, float x, float y) \brief Interpolates the colour at position (x|y). \c x should fulfil \f$ 0 \leq x < width-1 \f$, same with y, to avoid reading outside the image. Not tested inside the function for efficiency reasons. */ static void forwardFlow(const QImage& left, const FlowField_sV *flow, float pos, QImage& output); static void newForwardFlow(const QImage& left, const FlowField_sV *flow, float pos, QImage& output); static void twowayFlow(const QImage& left, const QImage& right, const FlowField_sV *flowForward, const FlowField_sV *flowBackward, float pos, QImage& output); static void newTwowayFlow(const QImage &left, const QImage &right, const FlowField_sV *flowLeftRight, const FlowField_sV *flowRightLeft, float pos, QImage &output); static void bezierFlow(const QImage& left, const QImage& right, const FlowField_sV *flowCurrPrev, const FlowField_sV *flowCurrNext, float pos, QImage &output); static QColor interpolate(const QImage& in, float x, float y); static void simpleinterpolate(const QImage& left, const QImage& right, float pos, QImage &output); static void nearestinterpolate(const QImage& left, const QImage& right, float pos, QImage &output); private: struct Movement { float moveX; float moveY; }; struct ColorMatrix4x4 { QColor c00, c10, c01, c11; }; static void blend(ColorMatrix4x4& colors, const QColor &blendCol, float posX, float posY); static QColor blend(const QColor& left, const QColor& right, float pos); }; slowmovideo-0.5+git20180116/src/slowmoVideo/lib/avconvInfo_sV.cpp0000664000000000000000000000416213151342440023157 0ustar rootroot#include "avconvInfo_sV.h" #include #include AvconvInfo::AvconvInfo() : m_executablePath(""), m_dist(Dist_avconv) {} bool AvconvInfo::testAvconvExecutable(QString path) { QProcess ffmpeg(NULL); QStringList args; args << "-version"; ffmpeg.start(path, args); ffmpeg.waitForFinished(5000); QByteArray output = ffmpeg.readAllStandardOutput(); return output.size() > 0; } bool AvconvInfo::locate(QString executablePath) { QStringList paths; if (executablePath.size() > 0) { paths << executablePath; } paths << "avconv" << "ffmpeg" #ifndef WINDOWS << "/usr/bin/avconv" << "/usr/bin/ffmpeg" << "/usr/local/bin/avconv" << "/usr/local/bin/ffmpeg" #endif ; bool found = false; foreach (QString path, paths) { if (testAvconvExecutable(path)) { m_executablePath = path; found = true; identify(); break; } } if (found) { printInfo(); } else { qDebug() << "Did not find avconv/ffmpeg. 
Searched at the following locations:"; foreach (QString path, paths) { qDebug() << "* " << QFileInfo(path).absoluteFilePath(); } } return found; } void AvconvInfo::identify() { QProcess ffmpeg(NULL); QStringList args; args << "-version"; ffmpeg.start(m_executablePath, args); ffmpeg.waitForFinished(5000); QByteArray output = ffmpeg.readAllStandardOutput(); if (output.indexOf("avconv") >= 0) { m_dist = Dist_avconv; } else { m_dist = Dist_ffmpeg; } } void AvconvInfo::printInfo() const { qDebug() << "Found avconv/ffmpeg executable at " << QFileInfo(m_executablePath).absoluteFilePath() << "\n\tDistribution: " << (m_dist == Dist_avconv ? "avconv" : "ffmpeg"); } QString AvconvInfo::executablePath() const { return m_executablePath; } QString AvconvInfo::optionSameQuant() const { if (m_dist == Dist_avconv) { return "-same_quant"; } else { return "-sameq"; } } slowmovideo-0.5+git20180116/src/slowmoVideo/lib/videoInfo_sV.h0000664000000000000000000000115013151342440022430 0ustar rootroot/* slowmoVideo creates slow-motion videos from normal-speed videos. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef VIDEOINFO_SV_H #define VIDEOINFO_SV_H #include "defs_sV.h" /** Reads video information (number of frames, fps, resolution, etc.) from a video file. */ VideoInfoSV getInfo(const char filename[]); #endif //VIDEOINFO_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoFlowEdit/0000775000000000000000000000000013151342440022100 5ustar rootrootslowmovideo-0.5+git20180116/src/slowmoVideo/slowmoFlowEdit/mainwindow.h0000664000000000000000000000300713151342440024425 0ustar rootroot/* slowmoFlowEdit is a user interface for editing slowmoVideo's Optical Flow files. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #ifndef MAINWINDOW_H #define MAINWINDOW_H class FlowEditCanvas; namespace Ui { class MainWindow; } #include #if QT_VERSION >= QT_VERSION_CHECK(5, 0, 0) #include #endif #include #include "../libgui/combinedShortcuts.h" class MainWindow : public QMainWindow { Q_OBJECT public: explicit MainWindow(QWidget *parent = 0); ~MainWindow(); const CombinedShortcuts& shortcuts() const; protected slots: void closeEvent(QCloseEvent *e); private: enum Shortcuts { BOOST1, BOOST2, BOOST3, OPEN, SAVE, PREV, NEXT, HELP, QUIT }; Ui::MainWindow *ui; CombinedShortcuts m_cs; FlowEditCanvas *m_canvas; QSettings m_settings; QString m_lastFlowFile; void updateTitle(); void loadFlow(QString filename); void amplify(float val); QString nextFilename(QString originalName, int shift) const; private slots: void slotOpenFlow(); void slotSaveFlow(); void slotNextFile(); void slotPrevFile(); void slotChangeFile(int shift); void slotShortcutUsed(int id); void slotShowShortcuts(); }; #endif // MAINWINDOW_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoFlowEdit/CMakeLists.txt0000664000000000000000000000511613151342440024643 0ustar rootroot include_directories(${slowmoVideo_SOURCE_DIR}) set(SRCS main.cpp mainwindow.cpp flowEditCanvas.cpp shortcutListDialog.cpp ) set(SRCS_UI mainwindow.ui flowEditCanvas.ui shortcutListDialog.ui ) set(SRCS_MOC mainwindow.h flowEditCanvas.h shortcutListDialog.h ) qt_wrap_ui(UI_H_OUT ${SRCS_UI}) qt_wrap_cpp(MOC_OUT ${SRCS_MOC}) if(APPLE) set(BUNDLE "slowmoFlowEdit") set(ICONS_DIR "${${PROJECT_NAME}_SOURCE_DIR}/slowmoVideo/slowmoUI/res") message( "OS X build" ) set(MACOSX_BUNDLE_INFO_STRING "${BUNDLE} ${PROJECT_VERSION}") set(MACOSX_BUNDLE_BUNDLE_VERSION "${BUNDLE} ${PROJECT_VERSION}") set(MACOSX_BUNDLE_LONG_VERSION_STRING "${BUNDLE} ${PROJECT_VERSION}") set(MACOSX_BUNDLE_SHORT_VERSION_STRING "${PROJECT_VERSION}") set(MACOSX_BUNDLE_COPYRIGHT "${PROJECT_COPYRIGHT_YEAR} ${PROJECT_VENDOR}") set(MACOSX_BUNDLE_ICON_FILE "slowmoUI.icns") set(MACOSX_BUNDLE_GUI_IDENTIFIER "${PROJECT_DOMAIN_SECOND}.${PROJECT_DOMAIN_FIRST}") set(MACOSX_BUNDLE_BUNDLE_NAME "${BUNDLE}") set(MACOSX_BUNDLE_RESOURCES "${CMAKE_CURRENT_BINARY_DIR}/${BUNDLE}.app/Contents/Resources") set(MACOSX_BUNDLE_ICON "${ICONS_DIR}/${MACOSX_BUNDLE_ICON_FILE}") SET_SOURCE_FILES_PROPERTIES( ${MACOSX_BUNDLE_ICON} PROPERTIES MACOSX_PACKAGE_LOCATION Resources) message(STATUS "Bundle will be : ${MACOSX_BUNDLE} => ${PROJECT_NAME} ") set( SRCS ${SRCS} ${MACOSX_BUNDLE_ICON} ) endif() include_directories(..) include_directories(${CMAKE_BINARY_DIR}/slowmoVideo/slowmoFlowEdit) include_directories(${CMAKE_BINARY_DIR}/slowmoVideo/libgui) add_executable(slowmoFlowEdit WIN32 MACOSX_BUNDLE ${SRCS} ${MOC_OUT} ${UI_H_OUT}) qt_use_modules(slowmoFlowEdit Widgets Gui Core ) target_link_libraries(slowmoFlowEdit sVgui sVflow sVvis) #install(TARGETS slowmoFlowEdit # BUNDLE DESTINATION . 
COMPONENT Runtime # RUNTIME DESTINATION ${DEST} COMPONENT Runtime) #install(TARGETS slowmoUI DESTINATION ".") #install(TARGETS slowmoFlowEdit DESTINATION ${DEST}) if (Qt5Core_FOUND) include(DeployQt5) # 2.8.7 or later else() include(DeployQt4) # 2.8.7 or later endif() if (APPLE) install(TARGETS slowmoFlowEdit DESTINATION ".") install_qt_executable(slowmoFlowEdit.app "" "" ) elseif(WIN32) install(TARGETS slowmoFlowEdit DESTINATION ".") install_qt_executable(slowmoFlowEdit.exe "" "" ) else() install(TARGETS slowmoFlowEdit DESTINATION ${DEST}) # install_qt_executable(slowmoFlowEdit "" "" ) endif() slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoFlowEdit/shortcutListDialog.h0000664000000000000000000000163313151342440026103 0ustar rootroot/* slowmoFlowEdit is a user interface for editing slowmoVideo's Optical Flow files. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef SHORTCUTLISTDIALOG_H #define SHORTCUTLISTDIALOG_H #include #include namespace Ui { class ShortcutListDialog; } class MainWindow; class ShortcutListDialog : public QDialog { Q_OBJECT public: explicit ShortcutListDialog(MainWindow *parent); ~ShortcutListDialog(); protected: virtual void keyReleaseEvent(QKeyEvent *); virtual void mouseReleaseEvent(QMouseEvent *); private: Ui::ShortcutListDialog *ui; QTime m_openedAt; }; #endif // SHORTCUTLISTDIALOG_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoFlowEdit/main.cpp0000664000000000000000000000143513151342440023533 0ustar rootroot/* slowmoFlowEdit is a user interface for editing slowmoVideo's Optical Flow files. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include #include "mainwindow.h" int main(int argc, char *argv[]) { QApplication a(argc, argv); // Set up preferences for the QSettings file QCoreApplication::setOrganizationName("Granjow"); QCoreApplication::setOrganizationDomain("granjow.net"); QCoreApplication::setApplicationName("slowmoFlowEdit"); MainWindow w; w.show(); return a.exec(); } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoFlowEdit/flowEditCanvas.ui0000664000000000000000000001267013151342440025356 0ustar rootroot FlowEditCanvas 0 0 1058 608 Form QLayout::SetMinimumSize Qt::Horizontal 40 20 TextLabel Qt::Horizontal QSizePolicy::Fixed 10 20 Values at mouse position 0 0 QFrame::StyledPanel QFrame::Raised 0 0 20 3 Qt::Vertical QSlider::TicksBelow 1 Qt::Vertical 20 40 0 0 tool 0 20 89 22 Eyedropper 0 60 89 22 picker 0 40 89 22 0 0 average ImageDisplay QFrame
libgui/imageDisplay.h
1
slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoFlowEdit/flowEditCanvas.h0000664000000000000000000000243613151342440025167 0ustar rootroot/* slowmoFlowEdit is a user interface for editing slowmoVideo's Optical Flow files. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef FLOWEDITCANVAS_H #define FLOWEDITCANVAS_H #include #include class FlowField_sV; namespace Ui { class FlowEditCanvas; } /// \todo Auto-fix feature (confirm to accept) class FlowEditCanvas : public QWidget { Q_OBJECT public: explicit FlowEditCanvas(QWidget *parent = 0); ~FlowEditCanvas(); void setAmplification(float val); float amplification() const; public slots: void slotLoadFlow(QString filename); void slotSaveFlow(QString filename = QString()); void newAmplification(int val); private: Ui::FlowEditCanvas *ui; FlowField_sV *m_flowField; QString m_flowFilename; float m_boost; float vx,vy; int tool; void repaintFlow(); private slots: void slotRectDrawn(QRectF imageRect); void slotExamineValues(float x, float y); void slotPickValues(float x, float y); }; #endif // FLOWEDITCANVAS_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoFlowEdit/mainwindow.cpp0000664000000000000000000001722013151342440024762 0ustar rootroot/* slowmoFlowEdit is a user interface for editing slowmoVideo's Optical Flow files. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #include "mainwindow.h" #include "ui_mainwindow.h" #include #include #include #if QT_VERSION >= QT_VERSION_CHECK(5, 0, 0) #include #endif #include #include #include #include #include "flowEditCanvas.h" #include "shortcutListDialog.h" #define MAX_SEARCH_SHIFT 500 MainWindow::MainWindow(QWidget *parent) : QMainWindow(parent), ui(new Ui::MainWindow), m_cs(this) { ui->setupUi(this); restoreGeometry(m_settings.value("geometry").toByteArray()); restoreState(m_settings.value("windowState").toByteArray()); m_canvas = new FlowEditCanvas(this); setCentralWidget(m_canvas); m_canvas->setAmplification(m_settings.value("view/amplify", 1.0).toFloat()); m_cs.addShortcut("o", OPEN, "Open flow file"); m_cs.addShortcut("s-s", SAVE, "Save"); m_cs.addShortcut("j", PREV, "Previous file"); m_cs.addShortcut("k", NEXT, "Next file"); //m_cs.addShortcut("b-1", BOOST1, "No amplification"); //m_cs.addShortcut("b-2", BOOST2, "Low amplification"); //m_cs.addShortcut("b-3", BOOST3, "High amplification (details best visible)"); m_cs.addShortcut("q-q", QUIT, "Quit"); m_cs.addShortcut("h-h", HELP, "Show shortcut dialog"); ui->actionQuit->setShortcut(QKeySequence("Ctrl+Q")); ui->actionOpen->setShortcut(QKeySequence("Ctrl+O")); ui->actionSave->setShortcut(QKeySequence("Ctrl+S")); ui->actionPrev->setShortcut(QKeySequence("Ctrl+Left")); ui->actionNext->setShortcut(QKeySequence("Ctrl+Right")); ui->actionShortcuts->setShortcut(QKeySequence("F1")); connect(ui->actionQuit, SIGNAL(triggered()), this, SLOT(close())); connect(ui->actionOpen, SIGNAL(triggered()), this, SLOT(slotOpenFlow())); connect(ui->actionSave, SIGNAL(triggered()), this, SLOT(slotSaveFlow())); connect(ui->actionNext, SIGNAL(triggered()), this, SLOT(slotNextFile())); connect(ui->actionPrev, SIGNAL(triggered()), this, SLOT(slotPrevFile())); connect(ui->actionShortcuts, SIGNAL(triggered()), this, SLOT(slotShowShortcuts())); connect(&m_cs, SIGNAL(signalShortcutUsed(int)), this, SLOT(slotShortcutUsed(int))); updateTitle(); if (m_settings.value("prevFlowFile", "").toString().length() != 0) { loadFlow(m_settings.value("prevFlowFile", "").toString()); } qDebug() << "Shortcut list: " << m_cs.shortcutList(); } MainWindow::~MainWindow() { delete ui; } const CombinedShortcuts& MainWindow::shortcuts() const { return m_cs; } void MainWindow::updateTitle() { QString file = m_lastFlowFile; if (file.length() == 0) { file = "no file loaded"; } setWindowTitle(QString("slowmo Flow Editor (%1)").arg(file)); } void MainWindow::closeEvent(QCloseEvent *e) { m_settings.setValue("geometry", saveGeometry()); m_settings.setValue("windowState", saveState()); if (m_lastFlowFile.length() > 0) { m_settings.setValue("prevFlowFile", m_lastFlowFile); } m_settings.setValue("view/amplify", m_canvas->amplification()); QMainWindow::closeEvent(e); } QString MainWindow::nextFilename(QString originalName, int shift) const { if (false) { QStringList parts; QRegExp e("(\\d+)"); int min = originalName.indexOf("_"); int pos = 0; int prevPos = 0; while ((pos = e.indexIn(originalName, pos)) != -1) { parts << originalName.mid(prevPos, pos-prevPos); if (pos > min) { parts << QVariant(e.cap(1).toInt()+shift).toString(); } else { parts << e.cap(1); } pos += e.matchedLength(); prevPos = pos; } parts << originalName.mid(prevPos); return parts.join(""); } else { QStringList filters; filters << "*.sVflow"; QDir dir(QFileInfo(originalName).absolutePath()); QStringList filenames = dir.entryList(filters, QDir::Files | QDir::Readable, QDir::Name); QString current = QFileInfo(originalName).fileName(); QString next; if 
(filenames.contains(current)) { int index = filenames.indexOf(current); if (filenames.size() > index+shift && index+shift >= 0) { next = QFileInfo(originalName).absolutePath() + "/" + filenames[index+shift]; } else { qDebug() << "No file in this direction"; } } else { qDebug() << filenames; } return next; } } void MainWindow::loadFlow(QString filename) { if (QFileInfo(filename).exists()) { m_canvas->slotLoadFlow(filename); m_lastFlowFile = filename; updateTitle(); } } void MainWindow::slotOpenFlow() { QFileDialog dialog(this, "Open flow file"); dialog.setAcceptMode(QFileDialog::AcceptOpen); dialog.setFileMode(QFileDialog::ExistingFile); dialog.setNameFilter("Flow files (*.sVflow)"); if (m_settings.value("directories/lastFlowDir", "").toString().length() > 0) { dialog.setDirectory(m_settings.value("directories/lastFlowDir", "").toString()); } if (dialog.exec() == QDialog::Accepted) { m_settings.setValue("directories/lastFlowDir", QFileInfo(dialog.selectedFiles().at(0)).absolutePath()); loadFlow(dialog.selectedFiles().at(0)); statusBar()->showMessage("Loaded " + m_lastFlowFile, 3000); } } void MainWindow::slotSaveFlow() { statusBar()->showMessage("Saving ...", 3000); m_canvas->slotSaveFlow(); statusBar()->showMessage("Saved " + m_lastFlowFile, 3000); } void MainWindow::slotNextFile() { slotChangeFile(+1); } void MainWindow::slotPrevFile() { slotChangeFile(-1); } void MainWindow::slotChangeFile(int shift) { for (int i = 1; i < MAX_SEARCH_SHIFT; i++) { QString name = nextFilename(m_lastFlowFile, i*shift); if (QFileInfo(name).exists()) { loadFlow(name); return; } } QMessageBox::warning(this, "File not found", QString("The flow file %1 does not exist.\n\n " "I even searched %2 steps for a file in this direction, " "and still did not find a file.") .arg(nextFilename(m_lastFlowFile, shift)).arg(MAX_SEARCH_SHIFT), QMessageBox::Ok); } void MainWindow::amplify(float val) { m_canvas->setAmplification(val); statusBar()->showMessage(QString("Setting visual amplification to %1").arg(val), 3000); } void MainWindow::slotShortcutUsed(int id) { /* if (id == BOOST1) { qDebug() << "Amplify 1"; amplify(1); } else if (id == BOOST2) { qDebug() << "Amplify 2"; amplify(3); } else if (id == BOOST3) { qDebug() << "Amplify 3"; amplify(9); } else */ if (id == PREV) { slotPrevFile(); } else if (id == NEXT) { slotNextFile(); } else if (id == OPEN) { slotOpenFlow(); } else if (id == SAVE) { slotSaveFlow(); } else if (id == HELP) { slotShowShortcuts(); } else if (id == QUIT) { close(); } else { qDebug() << "Shortcut with ID " << id << " has no action!"; Q_ASSERT(false); } } void MainWindow::slotShowShortcuts() { ShortcutListDialog dialog(this); dialog.exec(); } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoFlowEdit/shortcutListDialog.ui0000664000000000000000000000215213151342440026266 0ustar rootroot ShortcutListDialog 0 0 400 300 0 0 Flow Editor shortcuts TextLabel Qt::Vertical 20 40 slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoFlowEdit/flowEditCanvas.cpp0000664000000000000000000001113613151342440025517 0ustar rootroot/* slowmoFlowEdit is a user interface for editing slowmoVideo's Optical Flow files. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #include "flowEditCanvas.h" #include "ui_flowEditCanvas.h" #include "lib/flowRW_sV.h" #include "lib/flowTools_sV.h" #include "lib/flowVisualization_sV.h" #include #include #include #if QT_VERSION >= QT_VERSION_CHECK(5, 0, 0) #include #endif FlowEditCanvas::FlowEditCanvas(QWidget *parent) : QWidget(parent), ui(new Ui::FlowEditCanvas), m_flowField(NULL), m_boost(1.0) { ui->setupUi(this); ui->flow->trackMouse(true); connect(ui->flow, SIGNAL(signalRectDrawn(QRectF)), this, SLOT(slotRectDrawn(QRectF))); connect(ui->flow, SIGNAL(signalMouseMoved(float,float)), this, SLOT(slotExamineValues(float,float))); connect(ui->flow, SIGNAL(signalMousePressed(float,float)), this,SLOT(slotPickValues(float,float))); connect(ui->amplification, SIGNAL(valueChanged(int)),this, SLOT(newAmplification(int))); tool= 0; vx = 0.0; vy = 0.0; ui->average->setChecked(true); } FlowEditCanvas::~FlowEditCanvas() { delete ui; } float FlowEditCanvas::amplification() const { return m_boost; } void FlowEditCanvas::setAmplification(float val) { //qDebug() << "setAmplification: " << val; Q_ASSERT(val > 0); m_boost = val; repaintFlow(); } void FlowEditCanvas::newAmplification(int val) { //qDebug() << "newAmplification: " << val; Q_ASSERT(val > 0); m_boost = (float)val; repaintFlow(); } /// \todo Make flow visualization configurable void FlowEditCanvas::repaintFlow() { if (m_flowField != NULL) { ui->flow->loadImage(FlowVisualization_sV::colourizeFlow(m_flowField, FlowVisualization_sV::HSV, m_boost)); repaint(); } } void FlowEditCanvas::slotRectDrawn(QRectF imageRect) { qDebug() << "Rect drawn: " << imageRect; if (m_flowField != NULL) { //TODO: ugly code if (ui->average->isChecked() ) { // average qDebug() << "average"; Kernel_sV k(8, 8); k.gauss(); FlowTools_sV::deleteRect(*m_flowField, imageRect.top(), imageRect.left(), imageRect.bottom(), imageRect.right()); FlowTools_sV::refill(*m_flowField, k, imageRect.top(), imageRect.left(), imageRect.bottom(), imageRect.right()); } if (ui->picker->isChecked() ) { qDebug() << "paint" << vx << " , " << vy; FlowTools_sV::fillRect(*m_flowField, imageRect.top(), imageRect.left(), imageRect.bottom(), imageRect.right(), vx, vy); } repaintFlow(); } } void FlowEditCanvas::slotLoadFlow(QString filename) { if (m_flowField != NULL) { delete m_flowField; m_flowField = NULL; } m_flowField = FlowRW_sV::load(filename.toStdString()); m_flowFilename = filename; repaintFlow(); } void FlowEditCanvas::slotSaveFlow(QString filename) { if (m_flowField != NULL) { if (filename.length() == 0) { filename = m_flowFilename; } FlowRW_sV::save(filename.toStdString(), m_flowField); } else { qDebug() << "No flow file loaded, cannot save."; } } void FlowEditCanvas::slotExamineValues(float x, float y) { if (m_flowField != NULL) { if (x >= 0 && y >= 0 && x <= m_flowField->width()-1 && y <= m_flowField->height()-1) { float dx = m_flowField->x(x,y); float dy = m_flowField->y(x,y); ui->lblValues->setText(QString("dx/dy: (%1|%2)").arg(dx, 0, 'f', 2).arg(dy, 0, 'f', 2)); ui->lblPos->setText(QString("(%1|%2)").arg(x).arg(y)); } } } void FlowEditCanvas::slotPickValues(float x, float y) { if (ui->eyedropper->isChecked()) { qDebug() << "pick value"; if (m_flowField != NULL) { if (x >= 0 && y >= 0 && x <= m_flowField->width()-1 && y <= m_flowField->height()-1) { vx = m_flowField->x(x,y); vy = m_flowField->y(x,y); qDebug() << "will fill with : " << vx << " , " << vy; } } } }slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoFlowEdit/mainwindow.ui0000664000000000000000000000430113151342440024611 0ustar rootroot MainWindow 0 0 800 600 
slowmo Flow Editor 0 0 800 22 File Help Open Save Quit QAction::QuitRole Next file Previous file true Amplify colours Shortcuts slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoFlowEdit/shortcutListDialog.cpp0000664000000000000000000000216213151342440026434 0ustar rootroot/* slowmoFlowEdit is a user interface for editing slowmoVideo's Optical Flow files. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "shortcutListDialog.h" #include "ui_shortcutListDialog.h" #include "mainwindow.h" ShortcutListDialog::ShortcutListDialog(MainWindow *parent) : QDialog(parent), ui(new Ui::ShortcutListDialog), m_openedAt(QTime::currentTime()) { ui->setupUi(this); ui->shortcuts->setText(parent->shortcuts().shortcutList()); adjustSize(); } ShortcutListDialog::~ShortcutListDialog() { delete ui; } void ShortcutListDialog::keyReleaseEvent(QKeyEvent *) { // Ensure the dialog is not closed with the same key it was opened with if (m_openedAt.elapsed() > 1000) { close(); } } void ShortcutListDialog::mouseReleaseEvent(QMouseEvent *) { if (m_openedAt.elapsed() > 1000) { close(); } } slowmovideo-0.5+git20180116/src/slowmoVideo/libgui/0000775000000000000000000000000013151342440020375 5ustar rootrootslowmovideo-0.5+git20180116/src/slowmoVideo/libgui/combinedShortcuts.cpp0000664000000000000000000001013313151342440024576 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #include "combinedShortcuts.h" #include #define DEBUG #include CombinedShortcuts::CombinedShortcuts(QWidget *parent) : m_parent(parent), m_signalMapper(parent) { connect(&m_signalMapper, SIGNAL(mapped(QString)), this, SLOT(slotShortcutUsed(QString))); } CombinedShortcuts::~CombinedShortcuts() { for (int i = 0; i < m_qShortcuts.size(); i++) { delete m_qShortcuts.at(i); } } QString CombinedShortcuts::shortcutList() const { QStringList shortcuts; for (int i = 0; i < m_shortcuts.size(); i++) { shortcuts << m_shortcuts.at(i).shortcut + "\t" + m_shortcuts.at(i).desc; } return shortcuts.join("\n"); } void CombinedShortcuts::addShortcut(QString shortcut, int id, QString description) { for (int i = 0; i < m_shortcuts.size(); i++) { if (m_shortcuts.at(i).shortcut == shortcut) { qDebug() << "Shortcut " << shortcut << " is not unique, used for " << description << " and " << m_shortcuts.at(i).desc; Q_ASSERT(false); return; } } if (shortcut.length() == 1) { addShortcutKey(shortcut); m_shortcuts << ShortcutItem(id, shortcut, description); } else if (shortcut.length() == 3 && shortcut.at(1) == '-') { addShortcutKey(shortcut.at(0)); addShortcutKey(shortcut.at(2)); m_shortcuts << ShortcutItem(id, shortcut, description); } else { qDebug() << "Cannot add shortcut " << shortcut << ", format not supported."; Q_ASSERT(false); } } void CombinedShortcuts::addShortcutKey(QString key) { if (!m_uniqueKeys.contains(key)) { #ifdef DEBUG qDebug() << "Adding unique key " << key; #endif m_uniqueKeys << key; // Create a new shortcut for each unique key QShortcut *qshortcut = new QShortcut(QKeySequence(key), m_parent); m_qShortcuts << qshortcut; m_signalMapper.setMapping(qshortcut, key); // Connect shortcut to the signal mapper connect(qshortcut, SIGNAL(activated()), &m_signalMapper, SLOT(map())); } } void CombinedShortcuts::slotShortcutUsed(QString key) { TimedShortcut ts; ts.shortcut = key; ts.start = QTime::currentTime(); #ifdef DEBUG qDebug() << key << " pressed. Last shortcut: " << m_previousKey.start.elapsed() << " ms ago."; #endif // QString at = QString(" @ %1.%2::%3") // .arg(ts.start.minute()) // .arg(ts.start.second()) // .arg(ts.start.msec()); bool handled = false; // Use a timeout. Otherwise pressing a key may lead to unpredictable results // since it may depend on the key you pressed several minutes ago. if (m_previousKey.start.elapsed() < 600) { QString combinedShortcut = QString("%1-%2").arg(m_previousKey.shortcut).arg(key); for (int i = 0; i < m_shortcuts.size(); i++) { if (m_shortcuts.at(i).shortcut == combinedShortcut) { #ifdef DEBUG qDebug() << QString("Shortcut %1 (%2) has been triggered!") .arg(m_shortcuts.at(i).shortcut) .arg(m_shortcuts.at(i).desc); #endif emit signalShortcutUsed(m_shortcuts.at(i).id); handled = true; break; } } } if (!handled) { // The key pressed did not belong to a combined shortcut. // Check if there is a shortcut with a single key for it. for (int i = 0; i < m_shortcuts.size(); i++) { if (m_shortcuts.at(i).shortcut == key) { emit signalShortcutUsed(m_shortcuts.at(i).id); handled = true; break; } } } if (!handled) { m_previousKey = ts; } } slowmovideo-0.5+git20180116/src/slowmoVideo/libgui/imageDisplay.cpp0000664000000000000000000002022713151342440023514 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. 
Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "imageDisplay.h" #include #include #include #include #include #include #include #include #include #include ImageDisplay::ImageDisplay(QWidget *parent, Qt::WindowFlags f) : QFrame(parent, f), m_scale(1) { m_aScaling = new QAction(tr("Scale image to widget size"), this); m_aScaling->setCheckable(true); m_aScaling->setChecked(true); m_aExportImage = new QAction(tr("Export image"), this); connect(m_aScaling, SIGNAL(triggered()), this, SLOT(repaint())); connect(m_aExportImage, SIGNAL(triggered()), this, SLOT(slotExportImage())); setContentsMargins(5, 5, 5, 5); m_states.mouseInitialImagePos = QPointF(0.0,0.0); m_states.mousePrevPos = QPoint(0,0); m_states.manhattan = 0; } ImageDisplay::~ImageDisplay() { delete m_aScaling; } void ImageDisplay::trackMouse(bool track) { setMouseTracking(track); } void ImageDisplay::loadImage(const QImage img) { if (!img.isNull()) { m_image = img; } } const QImage& ImageDisplay::image() const { return m_image; } bool ImageDisplay::loadOverlay(const QImage img) { if (img.size() != m_image.size()) { return false; } m_overlay = img; return true; } void ImageDisplay::clearOverlay() { m_overlay = QImage(); } void ImageDisplay::contextMenuEvent(QContextMenuEvent *e) { QMenu menu; menu.addAction(m_aScaling); menu.addAction(m_aExportImage); m_aExportImage->setEnabled(!m_image.isNull()); menu.exec(e->globalPos()); } QPointF ImageDisplay::convertCanvasToPixel(QPoint p) const { float scale = m_scale; if (m_aScaling->isChecked()) { scale = m_scaledImageSize.width()/float(m_image.width()); } return (p - contentsRect().topLeft())/scale; } QPointF ImageDisplay::convertCanvasToImage(QPoint p) const { if (!m_aScaling->isChecked()) { return m_imageOffset + convertCanvasToPixel(p); } else { return convertCanvasToPixel(p); } } QPoint ImageDisplay::convertImageToPixel(QPointF p) const { float scale = m_scale; if (m_aScaling->isChecked()) { scale = m_scaledImageSize.width()/float(m_image.width()); } return (p*scale + QPointF(contentsRect().topLeft())).toPoint(); } QPoint ImageDisplay::convertImageToCanvas(QPointF p) const { if (!m_aScaling->isChecked()) { return convertImageToPixel(p - m_imageOffset); } else { return convertImageToPixel(p); } } void ImageDisplay::mousePressEvent(QMouseEvent *e) { m_states.mouseInitialImagePos = convertCanvasToImage(e->pos()); m_states.mousePrevPos = e->pos(); m_states.manhattan = 0; QPointF pos = convertCanvasToImage(e->pos()); emit signalMousePressed(pos.x(), pos.y()); } void ImageDisplay::mouseMoveEvent(QMouseEvent *e) { if (e->buttons().testFlag(Qt::LeftButton)) { m_states.manhattan += (e->pos()-m_states.mousePrevPos).manhattanLength(); } m_states.mousePrevPos = e->pos(); if (!m_aScaling->isChecked()) { #if QT_VERSION < 0x040700 if (e->buttons().testFlag(Qt::MidButton)) { #else if (e->buttons().testFlag(Qt::MiddleButton)) { #endif // Move the viewport QPointF offset = m_states.mouseInitialImagePos - convertCanvasToPixel(e->pos()); m_imageOffset = offset; repaint(); } } if (hasMouseTracking() && !m_image.isNull()) { int x = e->pos().x() - contentsRect().x(); int y = e->pos().y() - contentsRect().y(); if (x < 0 || y < 0 || x >= contentsRect().width() || y >= contentsRect().height()) { // qDebug() << "Not inside drawing boundaries."; return; } QPointF pos = convertCanvasToImage(e->pos()); 
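// Report the cursor position in image coordinates so connected widgets (e.g. a value readout) can update.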
emit signalMouseMoved(pos.x(), pos.y()); } repaint(); } void ImageDisplay::mouseReleaseEvent(QMouseEvent *e) { QPointF p0 = m_states.mouseInitialImagePos; QPointF releasePos = convertCanvasToImage(e->pos()); QPointF minPoint = min(p0, releasePos, true); QPointF maxPoint = max(p0, releasePos, true); qDebug() << p0 << releasePos << minPoint << maxPoint; QRectF mouseRect(minPoint, maxPoint); if (m_states.countsAsMove()) { emit signalRectDrawn(mouseRect); } } void ImageDisplay::wheelEvent(QWheelEvent *e) { if (!m_aScaling->isChecked()) { QPointF mouseOffset = convertCanvasToImage(e->pos()); if (e->delta() > 0) { if (m_scale < 20) { m_scale *= 1.4; } } else { if (m_scale > float(contentsRect().width())/(2*m_image.width())) { m_scale /= 1.4; } } m_imageOffset = mouseOffset - (e->pos()-contentsRect().topLeft())/m_scale; repaint(); } } void ImageDisplay::paintEvent(QPaintEvent *e) { QFrame::paintEvent(e); if (!m_image.isNull()) { QPainter p(this); QImage subImg; if (m_aScaling->isChecked()) { // Scale to frame size subImg = m_image.scaled(contentsRect().size(), Qt::KeepAspectRatio); m_scaledImageSize = subImg.size(); } else { // User-defined scaling subImg = m_image.copy(std::floor(m_imageOffset.x()), std::floor(m_imageOffset.y()), std::ceil(contentsRect().width()/m_scale+1), std::ceil(contentsRect().height()/m_scale+1)); subImg = subImg.scaled(m_scale*subImg.size(), Qt::KeepAspectRatio, Qt::SmoothTransformation); subImg = subImg.copy(m_scale*(m_imageOffset.x()-floor(m_imageOffset.x())), m_scale*(m_imageOffset.y()-floor(m_imageOffset.y())), contentsRect().width(), contentsRect().height()); } p.drawImage(contentsRect().topLeft(), subImg); if (m_states.countsAsMove() && !QApplication::mouseButtons().testFlag(Qt::NoButton)) { QRect r; QPoint origin = convertImageToCanvas(m_states.mouseInitialImagePos); r.setTopLeft(min(m_states.mousePrevPos, origin)); r.setWidth(abs(m_states.mousePrevPos.x()-origin.x())); r.setHeight(abs(m_states.mousePrevPos.y()-origin.y())); p.drawRect(r); } } } void ImageDisplay::slotExportImage() { Q_ASSERT(!m_image.isNull()); QSettings settings; QFileDialog dialog(this, tr("Export render preview to image")); dialog.setAcceptMode(QFileDialog::AcceptSave); dialog.setFileMode(QFileDialog::AnyFile); dialog.setDirectory(settings.value("directories/imageDisplay", QDir::homePath()).toString()); if (dialog.exec() == QDialog::Accepted) { m_image.save(dialog.selectedFiles().at(0)); settings.setValue("directories/imageDisplay", QFileInfo(dialog.selectedFiles().at(0)).absolutePath()); } } qreal ImageDisplay::clamp(qreal val, qreal min, qreal max) const { return (val < min) ? min : ( (val > max) ? 
max : val ); } QPointF ImageDisplay::max(QPointF p1, QPointF p2, bool limitToImage) const { QPointF p(qMax(p1.x(), p2.x()), qMax(p1.y(), p2.y())); if (limitToImage) { p.rx() = clamp(p.x(), 0, m_image.width()-1); p.ry() = clamp(p.y(), 0, m_image.height()-1); } return p; } QPointF ImageDisplay::min(QPointF p1, QPointF p2, bool limitToImage) const { QPointF p(qMin(p1.x(), p2.x()), qMin(p1.y(), p2.y())); if (limitToImage) { p.rx() = clamp(p.x(), 0, m_image.width()-1); p.ry() = clamp(p.y(), 0, m_image.height()-1); } return p; } QPoint ImageDisplay::min(QPoint p1, QPoint p2) const { return QPoint(qMin(p1.x(), p2.x()), qMin(p1.y(), p2.y())); } QPoint ImageDisplay::max(QPoint p1, QPoint p2) const { return QPoint(qMax(p1.x(), p2.x()), qMax(p1.y(), p2.y())); } slowmovideo-0.5+git20180116/src/slowmoVideo/libgui/imageDisplay.h0000664000000000000000000000525013151342440023160 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef IMAGEDISPLAY_H #define IMAGEDISPLAY_H #include #include /** \brief Simple image display. Images can be scaled to the frame size and exported to a file via the context menu. */ class ImageDisplay : public QFrame { Q_OBJECT public: explicit ImageDisplay(QWidget *parent = 0, Qt::WindowFlags f = 0); ~ImageDisplay(); void trackMouse(bool track); /// \return The image that is currently displayed const QImage& image() const; public slots: /// Loads the given image; does \em not call repaint()! void loadImage(const QImage img); /// Loads the overlay that will be painted over the image; does \em not call repaint() either. /// \return \c false if the image sizes do not match bool loadOverlay(const QImage img); void clearOverlay(); signals: void signalMouseMoved(float x, float y); void signalMousePressed(float x, float y); void signalRectDrawn(QRectF imageRect); protected slots: virtual void paintEvent(QPaintEvent *e); virtual void contextMenuEvent(QContextMenuEvent *e); virtual void mousePressEvent(QMouseEvent *e); virtual void mouseMoveEvent(QMouseEvent *e); virtual void mouseReleaseEvent(QMouseEvent *e); virtual void wheelEvent(QWheelEvent *e); private: QImage m_image; QImage m_overlay; QAction *m_aScaling; QAction *m_aExportImage; float m_scale; QPointF m_imageOffset; QSize m_scaledImageSize; struct { QPointF mouseInitialImagePos; QPoint mousePrevPos; int manhattan; bool countsAsMove() { return manhattan > 4; } } m_states; /// Convert canvas coordinates to image coordinates. /// The image may have an offset. 
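/// With user-defined zoom this amounts to imageOffset + (canvasPos - contentsRect().topLeft()) / scale; when the image is scaled to the widget size, the offset is ignored (see the implementation).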
QPointF convertCanvasToImage(QPoint p) const; /// Convert canvas coordinates to pixel coordinates (ignores the image offset) QPointF convertCanvasToPixel(QPoint p) const; QPoint convertImageToCanvas(QPointF p) const; QPoint convertImageToPixel(QPointF p) const; qreal clamp(qreal val, qreal min, qreal max) const; QPointF max(QPointF p1, QPointF p2, bool limitToImage) const; QPointF min(QPointF p1, QPointF p2, bool limitToImage) const; QPoint min(QPoint p1, QPoint p2) const; QPoint max(QPoint p1, QPoint p2) const; private slots: void slotExportImage(); }; #endif // IMAGEDISPLAY_H slowmovideo-0.5+git20180116/src/slowmoVideo/libgui/CMakeLists.txt0000664000000000000000000000076313151342440023143 0ustar rootrootset(SRCS imageDisplay.cpp combinedShortcuts.cpp ) set(SRCS_MOC imageDisplay.h combinedShortcuts.h ) qt_wrap_cpp(MOC_OUT ${SRCS_MOC}) include_directories(${CMAKE_BINARY_DIR}/libgui) add_library(sVgui STATIC ${SRCS} ${MOC_OUT}) qt_use_modules(sVgui Core) qt_use_modules(sVgui Gui) qt_use_modules(sVgui Widgets) target_link_libraries(sVgui ${EXTERNAL_LIBS}) # If the library is used in a different project, cmake requires: #include_directories(${slowmoVideo_SOURCE_DIR}/libgui) slowmovideo-0.5+git20180116/src/slowmoVideo/libgui/combinedShortcuts.h0000664000000000000000000000355413151342440024254 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef COMBINEDSHORTCUTS_H #define COMBINEDSHORTCUTS_H #include #include #include #include //#include class QShortcut; /** \brief Manager for combined shortcuts (known from GMail) Combined shortcuts are for example s-a or \c q; the former one is triggered when the user presses \c s and within a limited amount of time \c a. 
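A minimal usage sketch (the shortcut strings, ids and receiving slot below are
illustrative examples only, not taken from the actual slowmoVideo code):
\code
CombinedShortcuts shortcuts(mainWindow);
shortcuts.addShortcut("h", 1, "Show help");          // single-key shortcut
shortcuts.addShortcut("s-a", 2, "Select all nodes"); // combined shortcut: press s, then a
connect(&shortcuts, SIGNAL(signalShortcutUsed(int)),
        someHandler, SLOT(handleShortcut(int)));
\endcode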
*/ class CombinedShortcuts : public QObject { Q_OBJECT public: CombinedShortcuts(QWidget *parent); ~CombinedShortcuts(); /// Adds the given shortcut if it does not exist yet void addShortcut(QString shortcut, int id, QString description); /// Returns a list of the available shortcuts that have been added QString shortcutList() const; signals: /// Emitted when the shortcut has been used void signalShortcutUsed(int id); private: struct ShortcutItem { ShortcutItem(int id, QString shortcut, QString desc) : id(id), shortcut(shortcut), desc(desc) {} int id; QString shortcut; QString desc; }; struct TimedShortcut { QTime start; QString shortcut; }; QList m_shortcuts; QList m_uniqueKeys; QList m_qShortcuts; TimedShortcut m_previousKey; QWidget *m_parent; QSignalMapper m_signalMapper; void addShortcutKey(QString key); private slots: void slotShortcutUsed(QString key); }; #endif // COMBINEDSHORTCUTS_H slowmovideo-0.5+git20180116/src/slowmoVideo/tr/0000775000000000000000000000000013151342440017547 5ustar rootrootslowmovideo-0.5+git20180116/src/slowmoVideo/tr/slowmoVideo_it.ts0000664000000000000000000020733513151342440023134 0ustar rootroot AboutDialog About Informazioni About slowmoVideo Informazioni su SlowmoVideo <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd"> <html><head><meta name="qrichtext" content="1" /><style type="text/css"> p, li { white-space: pre-wrap; } </style></head><body style=" font-family:'Bitstream Vera Sans'; font-size:10pt; font-weight:400; font-style:normal;"> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" font-style:italic;">slowmoVideo</span> is developed by Simon A. Eugster (aka. Granjow, co-author of Kdenlive) and licensed under GPLv3.</p> <p style="-qt-paragraph-type:empty; margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><br /></p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" font-style:italic;">slowmoVideo</span> allows to change the speed of a video clip based upon a curve. If the speed becomes higher than 1×, an exposure (shutter) effect simulates motion blur. 
For lower speed, frames are interpolated with optical flow.</p> <p style="-qt-paragraph-type:empty; margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><br /></p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Thanks for contributing:</p> <ul style="margin-top: 0px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; -qt-list-indent: 1;"><li style=" margin-top:12px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Mirko Götze <span style=" font-style:italic;">&lt;mail@mgo80.de&gt;</span> for converting <span style=" font-weight:600;">Cg to GLSL</span> (Removing the nVidia dependency)</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Morten Sylvest Olsen <span style=" font-style:italic;">&lt;mso@kapowsoftware.com&gt;</span> for the <span style=" font-weight:600;">V3D speedup</span> and removing the unnecessary window</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Elias Vanderstuyft for displaying the <span style=" font-weight:600;">shutter function</span> on the canvas</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Christian Frisson <span style=" font-style:italic;">&lt;christian.frisson@umons.ac.be&gt;</span> for<span style=" font-weight:600;"> OpenCV on MXE</span> (allowed me to compile slowmoVideo for Windows)</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Per <span style=" font-style:italic;">&lt;per@stuffmatic.com&gt;</span> for the<span style=" font-weight:600;"> OpenCV</span> code (slowmoVideo can run on CPU only with it)</li></ul> <p style="-qt-paragraph-type:empty; margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><br /></p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Visit <a href="http://slowmoVideo.granjow.net"><span style=" text-decoration: underline; color:#0057ae;">slowmoVideo.granjow.net</span></a> for more information.</p></body></html> </style></head><body style=" font-family:'Bitstream Vera Sans'; font-size:10pt; font-weight:400; font-style:normal;"> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" font-style:italic;">slowmoVideo</span> è sviluppato da Simon A. Eugster (aka. Granjow, coautore di Kdenlive) sotto licenza GPLv3.</p> <p style="-qt-paragraph-type:empty; margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><br /></p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" font-style:italic;">slowmoVideo</span> permette di cambiare la velocità di una clip video in base ad una curva.Se la velocità diventa maggiore di 1x, un effetto di esposizione (shutter) simulerà una sfocatura di movimento. 
Per velocità inferiori, verranno interpolati dei frame nel flusso video.</p> <p style="-qt-paragraph-type:empty; margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><br /></p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Grazie per i contributi:</p> <ul style="margin-top: 0px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; -qt-list-indent: 1;"><li style=" margin-top:12px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Mirko Götze <span style=" font-style:italic;">&lt;mail@mgo80.de&gt;</span>per la conversione <span style=" font-weight:600;">Cg in GLSL</span> (Rimozione della dipendenza nVidia)</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Morten Sylvest Olsen <span style=" font-style:italic;">&lt;mso@kapowsoftware.com&gt;</span> per la <span style=" font-weight:600;">velocizzazione di V3D</span> e la rimozione delle finestre non necessarie</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Elias Vanderstuyft per la visualizzazione per la visualizzazione della <span style=" font-weight:600;">funzione shutter</span> nell'area di lavoro</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Christian Frisson <span style=" font-style:italic;">&lt;christian.frisson@umons.ac.be&gt;</span>per la conversione di<span style=" font-weight:600;"> OpenCV in MXE</span> (permettendomi di compilare slowmoVideo per Windows)</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Per <span style=" font-style:italic;">&lt;per@stuffmatic.com&gt;</span> per il codice<span style=" font-weight:600;"> OpenCV</span> (slowmoVideo può funzionare nella CPU solo con esso)</li></ul> <p style="-qt-paragraph-type:empty; margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><br /></p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Visit <a href="http://slowmoVideo.granjow.net"><span style=" text-decoration: underline; color:#0057ae;">slowmoVideo.granjow.net</span></a> per maggiori informazioni.</p></body></html> (c) 2012 Simon A. 
Eugster Version %1, %2 Canvas slowmoUI canvas &Delete node &Elimina_nodo &Snap in node &Inserisci nodo &Delete tag &Elimina etichetta &Rename tag &Rinomina etichetta Set tag &time Imposta etichetta &ora &Linear curve &Curva lineare &Bézier curve &Curva Bézier Set &custom speed Imposta &Velocità personalizzata Set/edit shutter &function Imposta/modifica &funzione shutter Set speed to %1× Imposta la velocità a %1× Reset left handle Resetta la maniglia sinistra Reset right handle Resetta la maniglia destra Segment replay &speed … Velocità di riproduzione del &segmento … Node %1 Nodo %1 Handle actions Azioni della maniglia Segment between node %1 and %2 Segmento tra il nodo %1 e il %2 Tag %1 Etichetta %1 New tag name Nuovo nome etichetta Tag: Etichetta: New tag time Nuova etichetta tempo Time: Tempo: Replay speed for current segment Velocità di riproduzione del segmento corrente Speed: Velocità: FlowEditCanvas Values at mouse position Valore della posizione del mouse FlowExaminer Close Chiudi FrameMonitor Input monitor Monitor di input ImageDisplay Scale image to widget size Ridimensiona l'immagine alla dimensione del widget Export image Esporta immagine Export render preview to image Esporta l'anteprima di rendering in immagine ImagesFrameSource_sV No images selected. Nessuna immagine selezionata. Image %1 is not of the same size (%2) as the first image (%3). Immagine %1 non è della stessa dimensione (%2) della prima immagine (%3). Creating preview images from the input images Creazione delle immagini di anteprima dalle immagini inserite Resized image already exists for %1 Esiste già l'immagine ridimensionata per %1 Re-sizing image %1 to: %2 Ridimensionamento immagine %1 a: %2 MainWindow slowmo Flow Editor Editor del flusso slowmo File File Help Aiuto Open Apri Save Salva Quit Esci Next file File successivo Previous file File precedente Amplify colours Amplifica i colori Shortcuts Scorciatoie slowmoVideo UI SlowmoVideo Interfaccia utente &File &File &Help &Aiuto &View &Vista &Project &Progetto &Render &Render Preferences Preferenze &Save &Salva Save &as … Salva &come ... &Open &Apri &Shortcuts &Scorciatoie &About &Informazioni &Quit &Esci Render &preview &Anteprima del render E&xamine flow E&samina il flusso Examine flow at input frame Esamina il flusso al frame di input &New … &Nuovo ... &Preferences &Preferenze Zoom &in Zoom &in Zoom &out Zoom &out Input monitor Monitor ingresso Curve monitor Monitor della curva Render preview Anteprima di render Show help overlay Mostra l'aiuto in sovrapposizione New project Nuovo progetto Open project Apri progetto Save as ... Salva come ... Abort move Annulla spostamento Unselect all Deseleziona tutto Delete selected nodes Elimina i nodi selezionati Selecting tool Strumento di selezione Move tool Strumento di spostamento Insert label (tag) Inserisci etichetta Load Project Carica progetto slowmoVideo projects (*.sVproj) progetti slowmoVideo (*.sVproj) Warning Attenzione Frame source error Errore nella sorgente dei frame Error Errore No filename given, won't save. (Perhaps an empty project?) Nessun nome al file, impossibile salvare. (Forse un progetto vuoto?) 
Saved project as: %1 Progetto salvato come:: %1 Error writing project file Errore nella scrittura del file di progetto Save project Salva progetto Navigation: [Shift] Scroll, Drag Navigatione: [Shift] Scorrimento, Trascinamento Move nodes: [Ctrl] Drag Sposta nodi: [ctrl] Trascinamento empty project Progetto vuoto Rendering progress Progressione del rendering Frame extraction progress Progresso di estrazione del frame NewProjectDialog New slowmo Project Nuovo Progetto slowmo Browse Naviga Abort Terminare Ok Ok Directory will be created La cartella verrà creata Video source Sorgente video Video information: Informazioni sul video: Input video Video di ingresso Image source Immagine sorgente Input images Inserimento immagini Images information: Informazione sulle immagini: Video file File video Images Immagini Project Directory Cartella del Progetto Project Filename Nome del File di Progetto .sVproj You should preferredly use an empty directory here. Each project needs its own project directory. Qui si dovrebbe preferire l'uso di una cartella vuota. Ogni progetto ha bisogno di una propria cartella di progetto. The project will be saved to this file right after confirming this dialog. Il progetto verrà salvato in questo file subito dopo la conferma di questa schermata. Select input video file Selezionare il file video d'ingresso Character %1 is not an ASCII character. This file path will likely not work with ffmpeg. Il carattere %1 non è un carattere ASCII. Questo percorso file non funzionerà con ffmpeg. Select input images Selezionare le immagini d'ingresso Select a project directory Selezionare una cartella di progetto Number of video streams: %1 Frames: %2 Size: %3×%4 Numero dei flussi video: %1\nFrame: %2\nDimensione: %3×%4\n Frame rate: %1/%2 Frame rate: %1/%2 No video stream detected. Nessun flusso video trovato. Image size: %1 Dimensione immagine: %1 PreferencesDialog slowmoUI preferences Preferenze interfaccia utente slowmo Binary locations Posizioni Binary Browse Naviga Flow method Metodo flusso GPU, V3D (nVidia card required) GPU, V3D (necessaria scheda nVidia) CPU, OpenCV CPU, OpenCV Cancel annulla Ok Ok flowBuilder binary location Posizione flowBuilder binary ProgressDialog Progress Avanzamento Current Task Attività Corrente Task Description Descrizione attività Abort Annulla Ok Ok (Finished) (Finito) Aborted Annullato Task finished in %1. Attività ultimata in %1. Task finished. Attività ultimata. (Finished) %1 (Terminato) %1 ProjectPreferencesDialog Dialog Finestra di dialogo FPS value to use for calculating the output frame (display only) Il valore FPS da usare per il calcolo del frame in uscita (solo visualizzazione) Project_sV Empty frame source; Cannot build flow. Sorgente del frame vuota; Impossibile creare il flusso. 
QObject Orig Origine Small Piccolo Unknown size Dimensione sconosciuta Forward Avanti Backward Indietro Unknown direction Direzione sconosciuta Linear Lineare Bézier Bézier Unknown curve type Tipologia della curva sconosciuta Source axis Assi della sorgente Output axis Assi dell'uscita Unknown axis Assi sconosciuti Forward interpolation (fast) Interpolazione in avanti (veloce) Forward interpolation (accurate) Interpolazione in avanti (accurata) Two-way interpolation (fast) Interpolazione in due direzioni (veloce) Two-way interpolation (accurate) Interpolazione in due direzioni (accurata) Bézier interpolation Interpolazione Bézier Unknown interpolation Interpolazione sconosciuta Stacking Sovrapposizione Convolution Convoluzione Nearest (no blurring) Più vicino (nessuna sfocatura) Requested frame %1: Not within valid range. (%2 frames) Frame richiesto %1: Non è all'interno di un intervallo valido. (%2 frames) Range too small: Start frame is %1, end frame is %2. Using normal interpolation. Intervallo troppo piccolo: Il frame iniziale è %1, il frame finale è %2. Usare una interpolazione normale. Video could not be prepared (error code %1). %2 Il video non può essere creato (Codice errore %1). %2 Cannot write to %1; please check if you have write permissions. Impossibile scrivere in %1; per favore controllare se si hanno i permessi in scrittura. Unknown frame source “%1”. Cannot load the project. Sorgente dei frame sconosciuta “%1”. Impossibile caricare il progetto. Cannot read from file %1. (Opening in read-only mode failed.) Impossibile leggere dal file %1. (Fallita l'apertura del file in modalità sola lettura.) Invalid project file: %1 File di progetto non valido: %1 %1 s %1 s %1 min %2 s %1 min %2 s Frame %1 Frame %1 %1 % %1 % RenderPreview Render preview Anteprima rendering This is an information message. Questo è un messaggio informativo. Cannot render preview, no frames loaded. Impossibile creare l'anteprima del rendering, nessun frame caricato. Cannot render preview at the curve position since no curve is available. Impossibile creare l'anteprima del rendering nella posizione della curva finchè non è disponibile una curva. Rendering preview at output time %1 s (might take some time) ... Anteprima del rendering al momento dell'uscita %1 s (potrebbe occorrere un po' di tempo) ... Preview is still being rendered. Anteprima è ancora in fase di rendering. Cannot render at output time %1 s; Not within the curve. Impossibile creare il rendering al tempo di uscita %1 s; Niente all'interno della curva. Preview rendering finished. Anteprima del rendering terminata. RenderTask_sV Rendering aborted. Rendering annullato. Rendering Slow-Mo … Rendering Slow-Mo … No rendering target given! Aborting rendering. Nessuna destinazione per il rendering! Rendering annullato. Empty frame source, cannot be rendered. Sorgente frame vuota, impossibile il rendering. Rendering frame %1 @ %2 s from input position: %3 s (frame %4) Frame di rendering %1 @ %2 s dalla posizione iniziale: %3 s (frame %4) RenderingDialog Rendering settings Impostazioni di rendering a a Full Project Progetto completo Tag section Etichetta della sezione Custom section Sezione personalizzata to a Frames per second: Frame al secondo: Size: Dimensione: Interpolation: Interpolazione: For two frames A and B, the two-way interpolations calculate both the flows A→B and B→A, which leads to smoother transitions between them. Forward interpolations only calculate A→B; Twice as fast, but usually less smooth. 
Da due frame A e B, le interpolazioni in due direzioni calcolano sia il flusso A→B sia B→A,che porta a delle transizioni più uniformi tra loro. Le interpolazioni in avanti calcolano il flusso A→B; Due volte più veloce ma di solito meno uniforme. Optical Flow Flusso Visivo Optical flow Flusso visivo buildFlow lambda Valore lambda del flusso video Use a higher value for high-quality footage and larger images. Utilizzare un valore maggiore per filmati di alta qualità e immagini più grandi. The lambda is only used with the GPU based Optical Flow algorithm. There is no general rule which value is best, so it is usually a good idea to render a short part with a low (5) and a high (50) lambda to see the differences, and then try to find the best value between. Il valore lambda è usato unicamente per GPU basate su algoritmi Optical Flow. Non esiste una regola generale sul valore migliore, quindi è una buona regola fare il rendering di una piccola parte con un valore di lambda basso (5) e uno con un valore alto (50), per evidenziarne le differenze e per poi trovare il miglior valore tra i due limiti. Motion Blur Sfocatura di Movimento Motion blur Sfocatura di movimento Motion blur will only be applied for segments on which it is enabled. La sfocatura di movimento verrà unicamente applicata per segmenti nei quali è stata abilitata. Stacking blur (Uses more disk space, but is faster for repeated rendering.) Sfocatura in sovrapposizione (Occupa maggior spazio sul disco, ma è più veloce nel caso di rendering ripetuti.) Maximum samples Campioni massimi Samples for slow motion Campioni per lo slow motion Convolution blur (Smoother than stacking, usually the better choice.) Sfocatura di convoluzione (Più uniforme della sfocatura in sovrapposizione, di solito la scelta migliore.) Nearest (no blurring) Più vicino (nessuna sfocatura) Output Uscita Target: Destinazione: Video Video Images Immagini The %1 in the filename pattern is mandatory and will be replaced by the frame number. Il %1 nel nome del percorso del file è obbligatorio e sarà sostituito con il numero di frame. Output directory Cartella di destinazione Browse Naviga Filename pattern Modello del nome del file rendered-%1.jpg Rendered-%1.jpg Videos will be encoded with ffmpeg. If additional arguments are left empty, defaults will be used. The video format is determined by ffmpeg according to the file suffix. I video saranno codificati con ffmpeg. Se gli argomenti aggiuntivi vengono lasciati vuoti, per impostazione predefinita verrà utilizzato. Il formato video verrà determinato da ffmpeg in base al suffisso del file. Output file File di output Optional arguments Valori opzionali Will *not* save the project! Il progetto *non* verrà salvato! &Save settings &Salva impostazioni &Abort &Annulla &Ok &Ok Original size Dimensione originale Small Piccola <Start> <Inizio> <End> <Fine> Start time must be < end time! Il tempo iniziale deve essere < del tempo finale! Rendering from %1 s to %2 s. Rendering da %1 s a %2 s. Output directory for rendered images Cartella di output per le immagini renderizzate Output video file File video di output ShortcutListDialog Flow Editor shortcuts Scorciatoie dell'Editor di Flusso ShutterFunctionDialog Shutter Functions Funzioni Shutter < < Segment %1 Segmento %1 > > + + - - shutterFunc1 shutterFunc1 Used: %1 times Usato: %1 volte // header (function foo(args...) 
{ return 0; // footer }) <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd"> <html><head><meta name="qrichtext" content="1" /><style type="text/css"> p, li { white-space: pre-wrap; } </style></head><body style=" font-family:'Bitstream Vera Sans'; font-size:10pt; font-weight:400; font-style:normal;"> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Math functions are available in the Math namespace:</p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" font-family:'Monospace';">Math.PI, Math.cos(...), Math.pow(base. exponent) etc.</span></p> </body></html> Close Chiudi Segment %1 (total number: %2) Segmento %1 (totale: %2) TagAddDialog Add tag Aggiungi etichetta Change the tag type with Arrows up/down. Cambia tipo di etichetta con le frecce alto/basso. Source Tag Etichetta sorgente Output Tag Etichetta di output Abort Annulla Ok Ok VideoFrameSource_sV Video file %1 does not exist! Il file video %1 non esiste! Video is invalid, no streams found in %1 Il video non è valido, nessun flusso trovato in %1 ffmpeg/avconv executable not found! Cannot load video. (It is also possible that it took a little long to respond due to high workload, so you might want to try again.) L'eseguibile ffmpeg/avconv non è stato trovato! Impossibile caricare il video (È anche possibile che occorra un po' più di tempo per la risposta a causa della grande quantità di lavoro, quindi si potrebbe riprovare di nuovo.) Extracting thumbnail-sized frames from the video file Estrazione dell'anteprima del frame di dimensioni ridotte dal file video Extracting original-sized frames from the video file Estrazione dell'anteprima del frame di dimensioni originali dal file video Frame %1 of %2 Frame %1 di %2 slowmovideo-0.5+git20180116/src/slowmoVideo/tr/slowmoVideo_fr.qm0000664000000000000000000001206613151342440023111 0ustar rootrootIZr\yj z, C&C ܽ~ q  [7Qz~ݗeI - D c_ h ĥ t>q J  . 
W  l =Zef .i Courbe de Bezier&Bézier curveCanvas Effacer le noeud &Delete nodeCanvasEffacer le tag &Delete tagCanvasCourbe Lineaire &Linear curveCanvasRenomer le tag &Rename tagCanvasNoeud %1Node %1CanvasVitesse:Speed:CanvasTime:CanvasDValeurs la position de la sourieValues at mouse positionFlowEditCanvas FermerClose FlowExaminer Exporter l'image Export image ImageDisplayCration des images de previsualisation partir des images d'entre-Creating preview images from the input imagesImagesFrameSource_sVL'image %1 n'est pas la mme taile (%2) que la premire image (%3).>Image %1 is not of the same size (%2) as the first image (%3).ImagesFrameSource_sV2Pas d'image selectionne.No images selected.ImagesFrameSource_sV&A propos&About MainWindow&Fichier&File MainWindow &Aide&Help MainWindow&Nouveau&New … MainWindow&Ouvrir&Open MainWindow&Prfrences &Preferences MainWindow&Projet&Project MainWindow&Quiter&Quit MainWindow &Rendu&Render MainWindow&Sauver&Save MainWindow&Raccourcits &Shortcuts MainWindow&Vue&View MainWindow(Annuler le mouvement Abort move MainWindow,Amplifier les couleursAmplify colours MainWindow"E&xaminer le flux E&xamine flow MainWindowFichierFile MainWindowAideHelp MainWindowNouveau projet New project MainWindowFichier suivant Next file MainWindow OuvrirOpen MainWindow Ouvrir un projet Open project MainWindowPrferences Preferences MainWindow"Fichier prcedent Previous file MainWindowQuitterQuit MainWindow:Generer une &prvisualisationRender &preview MainWindow SauverSave MainWindowSauver &comme Save &as … MainWindowSauver comme Save as ... MainWindowRacourcits Shortcuts MainWindow.sVprojNewProjectDialogAnnulerAbortNewProjectDialogSource d'image Image sourceNewProjectDialogVideo en entre Input videoNewProjectDialog(Rpertoire du projetProject DirectoryNewProjectDialogNom du projetProject FilenameNewProjectDialog(Information du MediaVideo information:NewProjectDialogSource du Media Video sourceNewProjectDialog8Localisation des executablesBinary locationsPreferencesDialogAnnulerCancelPreferencesDialogHMthode d'valuation du flux optique Flow methodPreferencesDialogPlocalisation de l'executable flowBuilderflowBuilder binary locationPreferencesDialog(Finit) (Finished) ProgressDialog(Fini) %1 (Finished) %1ProgressDialogAnnulerAbortProgressDialog AnnulAbortedProgressDialogTache courante Current TaskProgressDialogOkOkProgressDialog(Tache termin en %1.Task finished in %1.ProgressDialogTache termin.Task finished.ProgressDialogslowmovideo-0.5+git20180116/src/slowmoVideo/tr/slowmoVideo_fr.ts0000664000000000000000000017777413151342440023144 0ustar rootroot AboutDialog About A propos About slowmoVideo A propos de slowmoVideo <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd"> <html><head><meta name="qrichtext" content="1" /><style type="text/css"> p, li { white-space: pre-wrap; } </style></head><body style=" font-family:'Bitstream Vera Sans'; font-size:10pt; font-weight:400; font-style:normal;"> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" font-style:italic;">slowmoVideo</span> is developed by Simon A. Eugster (aka. 
Granjow, co-author of Kdenlive) and licensed under GPLv3.</p> <p style="-qt-paragraph-type:empty; margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><br /></p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" font-style:italic;">slowmoVideo</span> allows to change the speed of a video clip based upon a curve. If the speed becomes higher than 1×, an exposure (shutter) effect simulates motion blur. For lower speed, frames are interpolated with optical flow.</p> <p style="-qt-paragraph-type:empty; margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><br /></p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Thanks for contributing:</p> <ul style="margin-top: 0px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; -qt-list-indent: 1;"><li style=" margin-top:12px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Mirko Götze <span style=" font-style:italic;">&lt;mail@mgo80.de&gt;</span> for converting <span style=" font-weight:600;">Cg to GLSL</span> (Removing the nVidia dependency)</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Morten Sylvest Olsen <span style=" font-style:italic;">&lt;mso@kapowsoftware.com&gt;</span> for the <span style=" font-weight:600;">V3D speedup</span> and removing the unnecessary window</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Elias Vanderstuyft for displaying the <span style=" font-weight:600;">shutter function</span> on the canvas</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Christian Frisson <span style=" font-style:italic;">&lt;christian.frisson@umons.ac.be&gt;</span> for<span style=" font-weight:600;"> OpenCV on MXE</span> (allowed me to compile slowmoVideo for Windows)</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Per <span style=" font-style:italic;">&lt;per@stuffmatic.com&gt;</span> for the<span style=" font-weight:600;"> OpenCV</span> code (slowmoVideo can run on CPU only with it)</li></ul> <p style="-qt-paragraph-type:empty; margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><br /></p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Visit <a href="http://slowmoVideo.granjow.net"><span style=" text-decoration: underline; color:#0057ae;">slowmoVideo.granjow.net</span></a> for more information.</p></body></html> (c) 2012 Simon A. Eugster (c) 2012 Simon A. 
Eugster Version %1, %2 Version %1, %2 Canvas slowmoUI canvas Canevas slowmoUI &Delete node Effacer le &noeud &Snap in node &Aimanter aux noeuds &Delete tag Effacer le &tag &Rename tag &Renomer le tag Set tag &time &Horodater le tag &Linear curve Courbe &Lineaire &Bézier curve Courbe de &Bézier Set &custom speed Fixer une &vitesse personnalisé Set/edit shutter &function Fixer/Éditer la fonction de l'&obturateur Set speed to %1× Fixer la vitesse à %1x Reset left handle Réinitialiser la poignée gauche Reset right handle Réinitialiser la poignée droite Segment replay &speed … &Vitesse de relecture du segment ... Node %1 Noeud %1 Handle actions Actions des poignées Segment between node %1 and %2 Segment entre les noeuds %1 et %2 Tag %1 Tag %1 New tag name Nouveau nom du tag Tag: Tag: New tag time Nouveau temps du tag Time: Temps Replay speed for current segment Vitesse de relecture pour le segment courrant Speed: Vitesse: FlowEditCanvas Values at mouse position Valeurs à la position de la sourie FlowExaminer Close Fermer FrameMonitor Input monitor Moniteur d'entrée ImageDisplay Scale image to widget size Ajuster l'image à la taille du widget Export image Exporter l'image Export render preview to image Exporter une image de la prévisualisation ImagesFrameSource_sV No images selected. Pas d'image selectionnée. Image %1 is not of the same size (%2) as the first image (%3). L'image %1 n'as pas la même taille (%2) que la première image (%3). Creating preview images from the input images Création des images de previsualisation à partir des images d'entrée Resized image already exists for %1 Une image de cette taille existe déjà pour %1 Re-sizing image %1 to: %2 Changement de la taille de l'image %1 vers : %2 MainWindow slowmo Flow Editor slowmo Éditeur de flux File Fichier Help Aide Open Ouvrir Save Enregistrer Quit Quitter Next file Fichier suivant Previous file Fichier précedent Amplify colours Amplifier les couleurs Shortcuts Raccourcis slowmoVideo UI slowmoVideo UI &File &Fichier &Help &Aide &View &Affichage &Project &Projet &Render &Rendu Preferences Préférences &Save &Enregistrer Save &as … Enregistrer &sous... &Open &Ouvrir &Shortcuts &Raccourcis &About &À propos &Quit &Quitter Render &preview Générer une &prévisualisation E&xamine flow E&xaminer le flux Examine flow at input frame Examiner le flux à l'image d'entrée &New … &Nouveau ... &Preferences &Préférences Zoom &in Zoom &avant Zoom &out Zoom a&rrière Input monitor Moniteur d'entrée Curve monitor Moniteur de courbe Render preview Générer une prévisualisation Show help overlay Afficher l'aide New project Nouveau projet Open project Ouvrir un projet Save as ... Enregistrer sous... Abort move Annuler le mouvement Unselect all Tout désélectionner Delete selected nodes Supprimer les noeuds sélectionnés Selecting tool Outil sélection Move tool Outil déplacer Insert label (tag) Insérer une étiquette (tag) Load Project Charger un Projet slowmoVideo projects (*.sVproj) projects slowmoVideo (*.sVproj) Warning Avertissement Frame source error Érreur d'image source Error Erreur No filename given, won't save. (Perhaps an empty project?) Pas de nom de fichier donné, pas d'enregistrement. (Peut être un projet vide ?) 
Saved project as: %1 Projet enregistré sous : %1 Error writing project file Érreur lors de l'écriture du projet Save project Enregistrer le projet Navigation: [Shift] Scroll, Drag Navigation: [Shift] Défiler, Déplacer Move nodes: [Ctrl] Drag Déplacement des noeuds: [Ctrl] Déplacer empty project projet vide Rendering progress Rendu en cours Frame extraction progress Extraction d'image en cours NewProjectDialog New slowmo Project Nouveau projet slowmo Browse Parcourir Abort Annuler Ok Ok Directory will be created Le répertoire seras créé Video source Vidéo source Video information: Informations sur la vidéo : Input video Video en entrée Image source Images source Input images Images d'entrée Images information: Informations sur les images : Video file Fichier vidéo Images Images Project Directory Répertoire du projet Project Filename Nom du projet .sVproj You should preferredly use an empty directory here. Each project needs its own project directory. Chaque projet ayant besoin de son répertoire de projet, vous devriez en choisir un vide. The project will be saved to this file right after confirming this dialog. Le projet sera enregistré dans ce fichier à la confirmation de cette boite de dialogue. Select input video file Sélectionnez le fichier vidéo d'entrée Character %1 is not an ASCII character. This file path will likely not work with ffmpeg. Le charactère %1 n'est pas ASCII. Ce chemin de fichier ne vas malheuresement pas fonctionner avec ffmpeg. Select input images Sélectionnez les images d'entrée Select a project directory Sélectionnez un répertoire pour le projet Number of video streams: %1 Frames: %2 Size: %3×%4 Nombre de flux vidéo : %1 Images : %2 Taille : %3x%4 Frame rate: %1/%2 Fréquence d'image: %1/%2 No video stream detected. Aucun flux vidéo trouvé. Image size: %1 Taille de l'image : %1 PreferencesDialog slowmoUI preferences slommoUI préférences Binary locations Localisation des executables Browse Parcourir Flow method Méthode d'évaluation du flux optique GPU, V3D (requires flowBuilder and a video card) GPU, V3D (nécessite flowBuilder et une carte graphique) CPU, OpenCV CPU, OpenCV Cancel Annuler Ok Ok flowBuilder binary location localisation de l'executable flowBuilder ProgressDialog Progress Progression Current Task Tache courante Task Description Déscription de la tache Abort Annuler Ok Ok (Finished) (Terminé) Aborted Annulé Task finished in %1. Tache terminé en %1. Task finished. Tache terminé. (Finished) %1 (Fini) %1 ProjectPreferencesDialog Dialog Dialogue FPS value to use for calculating the output frame (display only) Valeur FPS à utiliser pour le calcul de l'image de sortie (affichage seulement) Project_sV Empty frame source; Cannot build flow. Image source vide: Impossible de construire le flux. QObject Orig Original Small Miniature Unknown size Taille inconnue Forward Arrière Backward Avant Unknown direction Direction inconnue Linear Linéaire Bézier Bézier Unknown curve type Type de courbe inconnue Source axis Axe Source Output axis Axe Sortie Unknown axis Axe Inconnue Forward interpolation (fast) Interpolation avant (rapide) Forward interpolation (accurate) Interpolation avant (précis) Two-way interpolation (fast) Interpolation avant et arrière (rapide) Two-way interpolation (accurate) Interpolation avant et arrière (précis) Bézier interpolation Interpolation Bézier Unknown interpolation Interpolation inconnue Stacking Empillement Convolution Nearest (no blurring) Le plus proche (pas de flou) Requested frame %1: Not within valid range. 
(%2 frames) L'image demandé %1: Est hors borne. (%2 images) Range too small: Start frame is %1, end frame is %2. Using normal interpolation. Borne trop petite: L'image de départ est %1, celle de fin est %2. En utilisant une interpolation normale. Video could not be prepared (error code %1). %2 La vidéo n'as pas pu être préparé (code d'erreur %1). %2 Cannot write to %1; please check if you have write permissions. Écriture impossible sur %1: merci de vérifier vos permissions d'écriture. Unknown frame source “%1”. Cannot load the project. Image source inconnue "%1". Le projet ne sera pas chargé. Cannot read from file %1. (Opening in read-only mode failed.) Impossible de lire le fichier %1. (l'ouverture en lecture seule a échoué) Invalid project file: %1 Fichier de projet invalide: %1 %1 s %1 s %1 min %2 s %1 min %2 s Frame %1 Image %1 %1 % %1 % RenderPreview Render preview Prévisualisation de rendu This is an information message. Ceci est un message d'information. Cannot render preview, no frames loaded. Impossible de faire une prévisualisation, aucune images chargées. Cannot render preview at the curve position since no curve is available. Impossible de faire un rendu a la position de courbe tant que la courbe n'est pas disponible. Rendering preview at output time %1 s (might take some time) ... Rendu de la prévisualisation au temps %1 (cela peu encore prendre du temps) ... Preview is still being rendered. La prévisualisation est toujours en cours. Cannot render at output time %1 s; Not within the curve. Impossible de faire le rendu au temps %1 ; Il n'est pas contenu dans la courbe. Preview rendering finished. Rendu de la prévisualisation terminé. RenderTask_sV Rendering aborted. Rendu annulé. Rendering Slow-Mo … Rendu Slow-Mo ... No rendering target given! Aborting rendering. Aucune cible de rendu donné! Annulation du rendu. Empty frame source, cannot be rendered. Image source vide, rendu impossible. Rendering frame %1 @ %2 s from input position: %3 s (frame %4) Rendu de l'image %1 @ %2 s depuis la position d'entrée : %3 s (image %4) RenderingDialog Rendering settings Paramètres de rendu a Full Project Projet entier Tag section Section par tags Custom section Section personnalisée to à Frames per second: Images par seconde : Size: Taille : Interpolation: Interpolation : For two frames A and B, the two-way interpolations calculate both the flows A→B and B→A, which leads to smoother transitions between them. Forward interpolations only calculate A→B; Twice as fast, but usually less smooth. Pour deux images A et B, l'interpolation "avant-arrière" calcule les deux flux A→B et B→A, ce qui genère des transitions plus fluide entre elles. L'interpolation "avant" ne calcule que A→B; deux fois plus rapide, mais générallement moins fluide. Optical Flow Flux Optique Optical flow Flux optique buildFlow lambda Use a higher value for high-quality footage and larger images. Une valeur haute produit des vidéos haute-qualité et des images plus grosses. The lambda is only used with the GPU based Optical Flow algorithm. There is no general rule which value is best, so it is usually a good idea to render a short part with a low (5) and a high (50) lambda to see the differences, and then try to find the best value between. Le lambda est uniquement utilisé pour l'algorithme de de Flux Optique basé sur le GPU. Il n'y a pas de régle général pour le choix de la meilleur valeur. 
C'est générallement une bonne idée d'effectuer des tests sur un extrait avec un faible (5) et un haut (50) lambda pour finallement trouver la meilleure valeur entre les deux. Motion Blur Motion blur Motion blur will only be applied for segments on which it is enabled. Motion blur ne seras appliqué uniquement pour les segments sur lequel il est activé. Stacking blur (Uses more disk space, but is faster for repeated rendering.) Stacking blur. (Utilise plus d'espace disque mais plus rapide pour des rendus répétés) Maximum samples Samples maximum Samples for slow motion Samples pour le slow motion Convolution blur (Smoother than stacking, usually the better choice.) Convolution blur (Plus fluide que stacking, générallement un meilleur choix) Nearest (no blurring) Output Sortie Target: Cible : Video Vidéo Images Images The %1 in the filename pattern is mandatory and will be replaced by the frame number. Le %1 dans le modèle du nom de fichier est obligatoire, il sera remplacé par le numéro de l'image. Output directory Dossier de sortie Browse Parcourir Filename pattern Modèle du nom de fichier rendered-%1.jpg rendu-%1.jpg Videos will be encoded with ffmpeg. If additional arguments are left empty, defaults will be used. The video format is determined by ffmpeg according to the file suffix. Les vidéos seront encodées avec ffmpeg. Si aucun paramètres additionels ne sont donnés, ceux par défaut seront utilisés. Le format de la vidéo est déterminé par ffmpeg suivant le suffixe du fichier. Output file Fichier de sortie Optional arguments Paramètre optionnels Will *not* save the project! Ceci n'enregistrera *pas* le projet! &Save settings &Enregistrer les paramètres &Abort &Annuler &Ok &Ok Original size Taille original Small Miniature <Start> <Début> <End> <Fin> Start time must be < end time! Le temps de début doit être < au temps de fin! Rendering from %1 s to %2 s. Rendu en cours de %1 s à %2 s. Output directory for rendered images Dossier de sortie pour le rendu des images Output video file Fichier vidéo de sortie ShortcutListDialog Flow Editor shortcuts Raccourcis de l'Éditeur de flux ShutterFunctionDialog Shutter Functions Fonctions de l'obturateur < < Segment %1 Segment %1 > > + + - - shutterFunc1 obturateurFunc1 Used: %1 times Utilisé : %1 fois // header (function foo(args...) { return 0; // footer }) <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd"> <html><head><meta name="qrichtext" content="1" /><style type="text/css"> p, li { white-space: pre-wrap; } </style></head><body style=" font-family:'Bitstream Vera Sans'; font-size:10pt; font-weight:400; font-style:normal;"> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Math functions are available in the Math namespace:</p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" font-family:'Monospace';">Math.PI, Math.cos(...), Math.pow(base. exponent) etc.</span></p> </body></html> Close Fermer Segment %1 (total number: %2) Segment %1 (sur un total de %2) TagAddDialog Add tag Ajouter un tag Change the tag type with Arrows up/down. Changez le type du tag avec les flèches haut/bas. Source Tag Tag Source Output Tag Tag Sortie Abort Annuler Ok Ok VideoFrameSource_sV Video file %1 does not exist! Le fichier vidéo %1 n'existe pas ! Video is invalid, no streams found in %1 La vidéo n'est pas valide, aucun flux trouvé dans %1 ffmpeg/avconv executable not found! 
Cannot load video. (It is also possible that it took a little long to respond due to high workload, so you might want to try again.) Please download the 32-bit static ffmpeg build from ffmpeg.zeranoe.com and extract ffmpeg.exe in the same directory as slowmoUI.exe. Le programme ffmpeg/avconv n'as pas été trouvé ! Impossible de charger la vidéo. (Il est également possible qu'il ai mis du temps à répondre du à une importante charge processus, vous pouvez avoir envie de retenter) Merci d'installer ffmpeg.exe, 32-bit static build depuis ffmpeg.zeranoe.com, dans le même dossier que slowmoUI.exe. Extracting thumbnail-sized frames from the video file Extraction d'images miniatures depuis le fichier vidéo Extracting original-sized frames from the video file Extraction d'images à la taille originale depuis le fichier vidéo Frame %1 of %2 Image %1 sur %2 slowmovideo-0.5+git20180116/src/slowmoVideo/tr/slowmoVideo_de.qm0000664000000000000000000000735613151342440023100 0ustar rootroot  a%[0f;h,GJ6~J6 AZZ8,W292 jj%a 0KI"P6s:-ܽ~awx&rzz7 X#< c_ e .  D   w if &z7UC E7f i MberAbout AboutDialog ber slowmoVideoAbout slowmoVideo AboutDialogSchliessenClose FlowExaminer*Verschieben abbrechen Abort move MainWindow"Farben verstrkenAmplify colours MainWindow DateiFile MainWindow HilfeHelp MainWindowNeues Projekt New project MainWindowNchste Datei Next file MainWindow ffnenOpen MainWindowProjekt ffnen Open project MainWindowVorherige Datei Previous file MainWindowBeendenQuit MainWindowSpeichernSave MainWindow&Speichern unter ... Save as ... MainWindow Hilfe einblendenShow help overlay MainWindowRckwrtsBackwardQObject BzierBézierQObjectVorwrtsForwardQObject LinearLinearQObjectOriginalOrigQObject KleinSmallQObject*Unbekannter KurventypUnknown curve typeQObject&Unbekannte RichtungUnknown directionQObject"Unbekannte Grsse Unknown sizeQObject*Rendere Slow-Motion &Rendering Slow-Mo … RenderTask_sV(Rendern abgebrochen.Rendering aborted. RenderTask_sV6Benutzerdefinierter BereichCustom sectionRenderingDialog&Frames pro Sekunde:Frames per second:RenderingDialog Gesamtes Projekt Full ProjectRenderingDialogInterpolation:Interpolation:RenderingDialog$Bewegungsunschrfe Motion BlurRenderingDialog&RendereinstellungenRendering settingsRenderingDialogGrsse:Size:RenderingDialogTag-Bereich Tag sectionRenderingDialogbistoRenderingDialog++ShutterFunctionDialog--ShutterFunctionDialog<<ShutterFunctionDialog>>ShutterFunctionDialogSchliessenCloseShutterFunctionDialogSegment %1 Segment %1ShutterFunctionDialog2Segment %1 (%2 insgesamt)Segment %1 (total number: %2)ShutterFunctionDialog*Verschluss-FunktionenShutter FunctionsShutterFunctionDialog"Verwendet: %1 MalUsed: %1 timesShutterFunctionDialogverschlussFunc1 shutterFunc1ShutterFunctionDialogTag hinzufgenAdd tag TagAddDialoghTag-Typ kann mit Pfeiltasten auf/ab gendert werden.(Change the tag type with Arrows up/down. 
TagAddDialogslowmovideo-0.5+git20180116/src/slowmoVideo/tr/slowmoVideo_de.ts0000664000000000000000000023176713151342440023116 0ustar rootroot AboutDialog About Über About slowmoVideo Über slowmoVideo <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd"> <html><head><meta name="qrichtext" content="1" /><style type="text/css"> p, li { white-space: pre-wrap; } </style></head><body style=" font-family:'Bitstream Vera Sans'; font-size:10pt; font-weight:400; font-style:normal;"> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" font-style:italic;">slowmoVideo</span> is developed by Simon A. Eugster (aka. Granjow, co-author of Kdenlive) and licensed under GPLv3.</p> <p style="-qt-paragraph-type:empty; margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><br /></p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" font-style:italic;">slowmoVideo</span> allows to change the speed of a video clip based upon a curve. If the speed becomes higher than 1×, an exposure (shutter) effect simulates motion blur. For lower speed, frames are interpolated with optical flow.</p> <p style="-qt-paragraph-type:empty; margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><br /></p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Thanks for contributing:</p> <ul style="margin-top: 0px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; -qt-list-indent: 1;"><li style=" margin-top:12px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Mirko Götze <span style=" font-style:italic;">&lt;mail@mgo80.de&gt;</span> for converting <span style=" font-weight:600;">Cg to GLSL</span> (Removing the nVidia dependency)</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Morten Sylvest Olsen <span style=" font-style:italic;">&lt;mso@kapowsoftware.com&gt;</span> for the <span style=" font-weight:600;">V3D speedup</span> and removing the unnecessary window</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Elias Vanderstuyft for displaying the <span style=" font-weight:600;">shutter function</span> on the canvas</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Christian Frisson <span style=" font-style:italic;">&lt;christian.frisson@umons.ac.be&gt;</span> for<span style=" font-weight:600;"> OpenCV on MXE</span> (allowed me to compile slowmoVideo for Windows)</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Per <span style=" font-style:italic;">&lt;per@stuffmatic.com&gt;</span> for the<span style=" font-weight:600;"> OpenCV</span> code (slowmoVideo can run on CPU only with it)</li></ul> <p style="-qt-paragraph-type:empty; margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><br /></p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Visit <a href="http://slowmoVideo.granjow.net"><span style=" 
text-decoration: underline; color:#0057ae;">slowmoVideo.granjow.net</span></a> for more information.</p></body></html> version number goes here. (c) 2012 Simon A. Eugster <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd"> <html><head><meta name="qrichtext" content="1" /><style type="text/css"> p, li { white-space: pre-wrap; } </style></head><body style=" font-family:'Bitstream Vera Sans'; font-size:10pt; font-weight:400; font-style:normal;"> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" font-style:italic;">slowmoVideo</span> is developed by Simon A. Eugster (aka. Granjow, co-author of Kdenlive) and licensed under GPLv3.</p> <p style="-qt-paragraph-type:empty; margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"></p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" font-style:italic;">slowmoVideo</span> allows to change the speed of a video clip based upon a curve. If the speed becomes higher than 1×, an exposure (shutter) effect simulates motion blur. For lower speed, frames are interpolated with optical flow.</p> <p style="-qt-paragraph-type:empty; margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"></p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Visit <a href="http://slowmoVideo.granjow.net"><span style=" text-decoration: underline; color:#0057ae;">slowmoVideo.granjow.net</span></a> for more information.</p></body></html> (c) 2011 Simon A. Eugster Version %1, %2 Canvas slowmoUI canvas &Delete node &Snap in node &Delete tag &Rename tag Set tag &time &Linear curve &Bézier curve Set &custom speed Set/edit shutter &function Set speed to %1× Reset left handle Reset right handle Segment replay &speed … Node %1 Handle actions Segment between node %1 and %2 Tag %1 New tag name Tag: New tag time Time: Replay speed for current segment Speed: FlowEditCanvas Form TextLabel Values at mouse position FlowExaminer Dialog Close Schliessen FrameMonitor Input monitor ImageDisplay Scale image to widget size Export image Export render preview to image ImagesFrameSource_sV No images selected. Image %1 is not of the same size (%2) as the first image (%3). Creating preview images from the input images Resized image already exists for %1 Re-sizing image %1 to: %2 MainWindow slowmo Flow Editor File Datei Help Hilfe Open Öffnen Save Speichern Quit Beenden Next file Nächste Datei Previous file Vorherige Datei Amplify colours Farben verstärken Shortcuts slowmoVideo UI &File &Help &View &Project &Render Preferences &Save Save &as … &Open &Shortcuts &About &Quit Render &preview E&xamine flow Examine flow at input frame &New … &Preferences Zoom &in Zoom &out Input monitor Curve monitor Render preview Show help overlay Hilfe einblenden New project Neues Projekt Open project Projekt öffnen Save as ... Speichern unter ... Abort move Verschieben abbrechen Unselect all Delete selected nodes Selecting tool Move tool Insert label (tag) Load Project slowmoVideo projects (*.sVproj) Warning Frame source error Error No filename given, won't save. (Perhaps an empty project?) 
Saved project as: %1 Error writing project file Save project Navigation: [Shift] Scroll, Drag Move nodes: [Ctrl] Drag empty project Rendering progress Frame extraction progress NewProjectDialog New slowmo Project Browse Abort Ok Directory will be created Video source Video information: Input video Image source Input images Images information: Video file Images Project Directory Project Filename .sVproj You should preferredly use an empty directory here. Each project needs its own project directory. The project will be saved to this file right after confirming this dialog. Select input video file Character %1 is not an ASCII character. This file path will likely not work with ffmpeg. Select input images Select a project directory Number of video streams: %1 Frames: %2 Size: %3×%4 Frame rate: %1/%2 No video stream detected. Image size: %1 PreferencesDialog slowmoUI preferences Binary locations Browse flowBuilder ffmpeg Flow method GPU, V3D (nVidia card required) CPU, OpenCV Cancel Ok flowBuilder binary location ProgressDialog Progress Current Task Task Description Abort Ok (Finished) Aborted Task finished in %1. Task finished. (Finished) %1 ProjectPreferencesDialog Dialog FPS value to use for calculating the output frame (display only) Project_sV Empty frame source; Cannot build flow. QObject Orig Original Small Klein Unknown size Unbekannte Grösse Forward Vorwärts Backward Rückwärts Unknown direction Unbekannte Richtung Linear Linear Bézier Bézier Unknown curve type Unbekannter Kurventyp Source axis Output axis Unknown axis Forward interpolation (fast) Forward interpolation (accurate) Two-way interpolation (fast) Two-way interpolation (accurate) Bézier interpolation Unknown interpolation Stacking Convolution Nearest (no blurring) Requested frame %1: Not within valid range. (%2 frames) Range too small: Start frame is %1, end frame is %2. Using normal interpolation. Video could not be prepared (error code %1). %2 Cannot write to %1; please check if you have write permissions. Unknown frame source “%1”. Cannot load the project. Cannot read from file %1. (Opening in read-only mode failed.) Invalid project file: %1 %1 s %1 min %2 s Frame %1 %1 % RenderPreview Form Render preview This is an information message. Cannot render preview, no frames loaded. Cannot render preview at the curve position since no curve is available. Rendering preview at output time %1 s (might take some time) ... Preview is still being rendered. Cannot render at output time %1 s; Not within the curve. Preview rendering finished. RenderTask_sV Rendering aborted. Rendern abgebrochen. Rendering Slow-Mo … Rendere Slow-Motion … No rendering target given! Aborting rendering. Empty frame source, cannot be rendered. Rendering frame %1 @ %2 s from input position: %3 s (frame %4) RenderingDialog Rendering settings Rendereinstellungen a Full Project Gesamtes Projekt Tag section Tag-Bereich Custom section Benutzerdefinierter Bereich to bis Frames per second: Frames pro Sekunde: 23.976 24 25 29.976 30 50 60 72 Size: Grösse: Interpolation: Interpolation: For two frames A and B, the two-way interpolations calculate both the flows A→B and B→A, which leads to smoother transitions between them. Forward interpolations only calculate A→B; Twice as fast, but usually less smooth. Optical Flow Optical flow buildFlow lambda Use a higher value for high-quality footage and larger images. The lambda is only used with the GPU based Optical Flow algorithm. 
There is no general rule which value is best, so it is usually a good idea to render a short part with a low (5) and a high (50) lambda to see the differences, and then try to find the best value between. Motion Blur Bewegungsunschärfe Motion blur Motion blur will only be applied for segments on which it is enabled. Stacking blur (Uses more disk space, but is faster for repeated rendering.) Maximum samples Samples for slow motion Convolution blur (Smoother than stacking, usually the better choice.) Nearest (no blurring) Output Target: Video Images The %1 in the filename pattern is mandatory and will be replaced by the frame number. Output directory Browse Filename pattern rendered-%1.jpg Videos will be encoded with ffmpeg. If additional arguments are left empty, defaults will be used. The video format is determined by ffmpeg according to the file suffix. Output file Optional arguments vcodec Will *not* save the project! &Save settings &Abort &Ok Stacking blur Uses more disk space, but is faster for repeated rendering. Convolution blur Smoother than stacking, usually the better choice. Original size Small <Start> <End> Start time must be < end time! Rendering from %1 s to %2 s. Output directory for rendered images Output video file ShortcutListDialog Flow Editor shortcuts TextLabel ShutterFunctionDialog Shutter Functions Verschluss-Funktionen < < Segment %1 Segment %1 > > + + - - shutterFunc1 verschlussFunc1 Used: %1 times Verwendet: %1 Mal // header (function foo(args...) { return 0; // footer }) <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd"> <html><head><meta name="qrichtext" content="1" /><style type="text/css"> p, li { white-space: pre-wrap; } </style></head><body style=" font-family:'Bitstream Vera Sans'; font-size:10pt; font-weight:400; font-style:normal;"> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Math functions are available in the Math namespace:</p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" font-family:'Monospace';">Math.PI, Math.cos(...), Math.pow(base. exponent) etc.</span></p> </body></html> Close Schliessen Segment %1 (total number: %2) Segment %1 (%2 insgesamt) TagAddDialog Add tag Tag hinzufügen Change the tag type with Arrows up/down. Tag-Typ kann mit Pfeiltasten auf/ab geändert werden. Source Tag Output Tag Abort Ok VideoFrameSource_sV Video file %1 does not exist! Video is invalid, no streams found in %1 ffmpeg/avconv executable not found! Cannot load video. (It is also possible that it took a little long to respond due to high workload, so you might want to try again.) Extracting thumbnail-sized frames from the video file Extracting original-sized frames from the video file Frame %1 of %2 slowmovideo-0.5+git20180116/src/slowmoVideo/tr/slowmoVideo_it.qm0000664000000000000000000012341013151342440023112 0ustar rootrootfa[N[V[Y.[+[t%\s]b%=|0>fAnhbBfC0*3]%*%8N*08|+f8+9+9+:b@~tGHGXkGG(J61J6Lb<=ZZ8,eoZ8,[0\to0WE[Ok?,fWR /N .1 f-f .+CT292Ah~>t^Sg0[XOEq.D]KSGTLCNkf1XidS~~T~~ AE2 Ks8}?`y9+ѕ+[VyH_rA BKO d 2Xj%zo|pn.#@R;aL"1">2Ԃ>!fwBu!K.3W_IUnInU+nu WZ)V/;S=8ʢ %-&}K}z >5v~.fUc0Kb66tCYLZr+lMwN<\yjH*z00N1Ig^`Ȑ6UFYdJhժN,LHnYP{R|_*Fs:(vˎ`Z3{g=C:!CEdNp6QC<ܽ~Dwra&rz{-D4O>_QC>I9qUWC 9VX]z7ScH6@52#z37*v'FiYaJQ3z~4^|Q\~jfN9,E DhTK5ݗe) -C )O ,e ^YW c_: iqn ei ìcgE ĻPY ˵nQ . 
뇂i ` QY ?G7q h5W ze *D ĥ)b R;3 D Kv >m Ga C2J 0` E9bf E9} K@t,+ Oo( j-S t>Y }] J9 7B 7n {@ ӰG B s Rr $>i3 0U 0Fsc AiO x<? / W) #;  ̫ - VG A w ~pB @4I[ XC? i` lV ~ PFa ZZ r4B)4&g7UC=9C#lNHEPN\c F -!IťMCi^H]Ze*2f@GquAy)i%</head><body style=" font-family:'Bitstream Vera Sans'; font-size:10pt; font-weight:400; font-style:normal;"> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" font-style:italic;">slowmoVideo</span> sviluppato da Simon A. Eugster (aka. Granjow, coautore di Kdenlive) sotto licenza GPLv3.</p> <p style="-qt-paragraph-type:empty; margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><br /></p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" font-style:italic;">slowmoVideo</span> permette di cambiare la velocit di una clip video in base ad una curva.Se la velocit diventa maggiore di 1x, un effetto di esposizione (shutter) simuler una sfocatura di movimento. Per velocit inferiori, verranno interpolati dei frame nel flusso video.</p> <p style="-qt-paragraph-type:empty; margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><br /></p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Grazie per i contributi:</p> <ul style="margin-top: 0px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; -qt-list-indent: 1;"><li style=" margin-top:12px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Mirko Gtze <span style=" font-style:italic;">&lt;mail@mgo80.de&gt;</span>per la conversione <span style=" font-weight:600;">Cg in GLSL</span> (Rimozione della dipendenza nVidia)</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Morten Sylvest Olsen <span style=" font-style:italic;">&lt;mso@kapowsoftware.com&gt;</span> per la <span style=" font-weight:600;">velocizzazione di V3D</span> e la rimozione delle finestre non necessarie</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Elias Vanderstuyft per la visualizzazione per la visualizzazione della <span style=" font-weight:600;">funzione shutter</span> nell'area di lavoro</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Christian Frisson <span style=" font-style:italic;">&lt;christian.frisson@umons.ac.be&gt;</span>per la conversione di<span style=" font-weight:600;"> OpenCV in MXE</span> (permettendomi di compilare slowmoVideo per Windows)</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Per <span style=" font-style:italic;">&lt;per@stuffmatic.com&gt;</span> per il codice<span style=" font-weight:600;"> OpenCV</span> (slowmoVideo pu funzionare nella CPU solo con esso)</li></ul> <p style="-qt-paragraph-type:empty; margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><br /></p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Visit <a href="http://slowmoVideo.granjow.net"><span style=" text-decoration: underline; 
color:#0057ae;">slowmoVideo.granjow.net</span></a> per maggiori informazioni.</p></body></html>

slowmoVideo is developed by Simon A. Eugster (aka. Granjow, co-author of Kdenlive) and licensed under GPLv3.


slowmoVideo allows to change the speed of a video clip based upon a curve. If the speed becomes higher than 1×, an exposure (shutter) effect simulates motion blur. For lower speed, frames are interpolated with optical flow.


Thanks for contributing:

  • Mirko Götze <mail@mgo80.de> for converting Cg to GLSL (Removing the nVidia dependency)
  • Morten Sylvest Olsen <mso@kapowsoftware.com> for the V3D speedup and removing the unnecessary window
  • Elias Vanderstuyft for displaying the shutter function on the canvas
  • Christian Frisson <christian.frisson@umons.ac.be> for OpenCV on MXE (allowed me to compile slowmoVideo for Windows)
  • Per <per@stuffmatic.com> for the OpenCV code (slowmoVideo can run on CPU only with it)


Visit slowmoVideo.granjow.net for more information.

 AboutDialogInformazioniAbout AboutDialog6Informazioni su SlowmoVideoAbout slowmoVideo AboutDialogVersion %1, %2 AboutDialog&Curva Bzier&Bézier curveCanvas&Elimina_nodo &Delete nodeCanvas$&Elimina etichetta &Delete tagCanvas&Curva lineare &Linear curveCanvas&&Rinomina etichetta &Rename tagCanvas&Inserisci nodo &Snap in nodeCanvas*Azioni della manigliaHandle actionsCanvas(Nuovo nome etichetta New tag nameCanvas*Nuova etichetta tempo New tag timeCanvasNodo %1Node %1Canvas\Velocit di riproduzione del segmento corrente Replay speed for current segmentCanvas8Resetta la maniglia sinistraReset left handleCanvas4Resetta la maniglia destraReset right handleCanvas>Segmento tra il nodo %1 e il %2Segment between node %1 and %2CanvasPVelocit di riproduzione del &segmento &Segment replay &speed …Canvas@Imposta &Velocit personalizzataSet &custom speedCanvas4Imposta la velocit a %1Set speed to %1×Canvas,Imposta etichetta &ora Set tag &timeCanvasDImposta/modifica &funzione shutterSet/edit shutter &functionCanvasVelocit:Speed:CanvasEtichetta %1Tag %1CanvasEtichetta:Tag:Canvas Tempo:Time:CanvasslowmoUI canvasCanvas@Valore della posizione del mouseValues at mouse positionFlowEditCanvas ChiudiClose FlowExaminer Monitor di input Input monitor FrameMonitor Esporta immagine Export image ImageDisplayXEsporta l'anteprima di rendering in immagineExport render preview to image ImageDisplaydRidimensiona l'immagine alla dimensione del widgetScale image to widget size ImageDisplayzCreazione delle immagini di anteprima dalle immagini inserite-Creating preview images from the input imagesImagesFrameSource_sVImmagine %1 non della stessa dimensione (%2) della prima immagine (%3).>Image %1 is not of the same size (%2) as the first image (%3).ImagesFrameSource_sV:Nessuna immagine selezionata.No images selected.ImagesFrameSource_sVFRidimensionamento immagine %1 a: %2Re-sizing image %1 to: %2ImagesFrameSource_sVVEsiste gi l'immagine ridimensionata per %1#Resized image already exists for %1ImagesFrameSource_sVBSposta nodi: [ctrl] Trascinamento Move nodes: [Ctrl] Drag MainWindow^Navigatione: [Shift] Scorrimento, Trascinamento! Navigation: [Shift] Scroll, Drag MainWindow&Informazioni&About MainWindow &File&File MainWindow &Aiuto&Help MainWindow&Nuovo ...&New … MainWindow &Apri&Open MainWindow&Preferenze &Preferences MainWindow&Progetto&Project MainWindow &Esci&Quit MainWindow&Render&Render MainWindow &Salva&Save MainWindow&Scorciatoie &Shortcuts MainWindow &Vista&View MainWindow&Annulla spostamento Abort move MainWindow$Amplifica i coloriAmplify colours MainWindow&Monitor della curva Curve monitor MainWindow4Elimina i nodi selezionatiDelete selected nodes MainWindow$E&samina il flusso E&xamine flow MainWindow ErroreError MainWindowVErrore nella scrittura del file di progettoError writing project file MainWindowFEsamina il flusso al frame di inputExamine flow at input frame MainWindowFileFile MainWindowBProgresso di estrazione del frameFrame extraction progress MainWindow>Errore nella sorgente dei frameFrame source error MainWindow AiutoHelp MainWindow Monitor ingresso Input monitor MainWindow&Inserisci etichettaInsert label (tag) MainWindowCarica progetto Load Project MainWindow0Strumento di spostamento Move tool MainWindowNuovo progetto New project MainWindowFile successivo Next file MainWindowNessun nome al file, impossibile salvare. (Forse un progetto vuoto?):No filename given, won't save. (Perhaps an empty project?) 
MainWindowApriOpen MainWindowApri progetto Open project MainWindowPreferenze Preferences MainWindowFile precedente Previous file MainWindowEsciQuit MainWindow*&Anteprima del renderRender &preview MainWindow&Anteprima di renderRender preview MainWindow4Progressione del renderingRendering progress MainWindow SalvaSave MainWindowSalva &come ... Save &as … MainWindowSalva come ... Save as ... MainWindowSalva progetto Save project MainWindow4Progetto salvato come:: %1Saved project as: %1 MainWindow,Strumento di selezioneSelecting tool MainWindowScorciatoie Shortcuts MainWindowBMostra l'aiuto in sovrapposizioneShow help overlay MainWindow"Deseleziona tutto Unselect all MainWindowAttenzioneWarning MainWindowZoom &inZoom &in MainWindowZoom &out Zoom &out MainWindowProgetto vuoto empty project MainWindow0Editor del flusso slowmoslowmo Flow Editor MainWindow<SlowmoVideo Interfaccia utenteslowmoVideo UI MainWindow>progetti slowmoVideo (*.sVproj)slowmoVideo projects (*.sVproj) MainWindow.sVprojNewProjectDialogTerminareAbortNewProjectDialog NavigaBrowseNewProjectDialogIl carattere %1 non un carattere ASCII. Questo percorso file non funzioner con ffmpeg.XCharacter %1 is not an ASCII character. This file path will likely not work with ffmpeg.NewProjectDialog0La cartella verr creataDirectory will be createdNewProjectDialog"Frame rate: %1/%2Frame rate: %1/%2NewProjectDialog.Dimensione immagine: %1Image size: %1NewProjectDialog"Immagine sorgente Image sourceNewProjectDialogImmaginiImagesNewProjectDialog8Informazione sulle immagini:Images information:NewProjectDialog(Inserimento immagini Input imagesNewProjectDialog"Video di ingresso Input videoNewProjectDialog*Nuovo Progetto slowmoNew slowmo ProjectNewProjectDialog8Nessun flusso video trovato.No video stream detected.NewProjectDialogxNumero dei flussi video: %1\nFrame: %2\nDimensione: %3%4\n4Number of video streams: %1 Frames: %2 Size: %3×%4 NewProjectDialogOkOkNewProjectDialog*Cartella del ProgettoProject DirectoryNewProjectDialog2Nome del File di ProgettoProject FilenameNewProjectDialogHSelezionare una cartella di progettoSelect a project directoryNewProjectDialogDSelezionare le immagini d'ingressoSelect input imagesNewProjectDialogHSelezionare il file video d'ingressoSelect input video fileNewProjectDialogIl progetto verr salvato in questo file subito dopo la conferma di questa schermata.JThe project will be saved to this file right after confirming this dialog.NewProjectDialogFile video Video fileNewProjectDialog.Informazioni sul video:Video information:NewProjectDialogSorgente video Video sourceNewProjectDialogQui si dovrebbe preferire l'uso di una cartella vuota. Ogni progetto ha bisogno di una propria cartella di progetto.aYou should preferredly use an empty directory here. 
Each project needs its own project directory.NewProjectDialog Posizioni BinaryBinary locationsPreferencesDialog NavigaBrowsePreferencesDialogCPU, OpenCV CPU, OpenCVPreferencesDialogannullaCancelPreferencesDialogMetodo flusso Flow methodPreferencesDialogFGPU, V3D (necessaria scheda nVidia)GPU, V3D (nVidia card required)PreferencesDialogOkOkPreferencesDialog8Posizione flowBuilder binaryflowBuilder binary locationPreferencesDialogHPreferenze interfaccia utente slowmoslowmoUI preferencesPreferencesDialog(Finito) (Finished) ProgressDialog(Terminato) %1 (Finished) %1ProgressDialogAnnullaAbortProgressDialogAnnullatoAbortedProgressDialog"Attivit Corrente Current TaskProgressDialogOkOkProgressDialogAvanzamentoProgressProgressDialog(Descrizione attivitTask DescriptionProgressDialog0Attivit ultimata in %1.Task finished in %1.ProgressDialog$Attivit ultimata.Task finished.ProgressDialog&Finestra di dialogoDialogProjectPreferencesDialogIl valore FPS da usare per il calcolo del frame in uscita (solo visualizzazione)@FPS value to use for calculating the output frame (display only)ProjectPreferencesDialognSorgente del frame vuota; Impossibile creare il flusso.&Empty frame source; Cannot build flow. Project_sVFrame %1 Frame %1QObject%1 %%1 %QObject%1 min %2 s %1 min %2 sQObject%1 s%1 sQObjectIndietroBackwardQObject BzierBézierQObject*Interpolazione BzierBézier interpolationQObjectImpossibile leggere dal file %1. (Fallita l'apertura del file in modalit sola lettura.)=Cannot read from file %1. (Opening in read-only mode failed.)QObjectImpossibile scrivere in %1; per favore controllare se si hanno i permessi in scrittura.?Cannot write to %1; please check if you have write permissions.QObjectConvoluzione ConvolutionQObject AvantiForwardQObjectFInterpolazione in avanti (accurata) Forward interpolation (accurate)QObjectBInterpolazione in avanti (veloce)Forward interpolation (fast)QObject>File di progetto non valido: %1Invalid project file: %1QObjectLineareLinearQObject<Pi vicino (nessuna sfocatura)Nearest (no blurring)QObjectOrigineOrigQObject Assi dell'uscita Output axisQObjectIntervallo troppo piccolo: Il frame iniziale %1, il frame finale %2. Usare una interpolazione normale.PRange too small: Start frame is %1, end frame is %2. Using normal interpolation.QObjectFrame richiesto %1: Non all'interno di un intervallo valido. (%2 frames)7Requested frame %1: Not within valid range. (%2 frames)QObjectPiccoloSmallQObject&Assi della sorgente Source axisQObjectSovrapposizioneStackingQObjectTInterpolazione in due direzioni (accurata) Two-way interpolation (accurate)QObjectPInterpolazione in due direzioni (veloce)Two-way interpolation (fast)QObject Assi sconosciuti Unknown axisQObjectBTipologia della curva sconosciutaUnknown curve typeQObject*Direzione sconosciutaUnknown directionQObjectSorgente dei frame sconosciuta %1 . Impossibile caricare il progetto.7Unknown frame source “%1”. Cannot load the project.QObject4Interpolazione sconosciutaUnknown interpolationQObject,Dimensione sconosciuta Unknown sizeQObjectjIl video non pu essere creato (Codice errore %1). %2/Video could not be prepared (error code %1). %2QObjectImpossibile creare il rendering al tempo di uscita %1 s; Niente all'interno della curva.8Cannot render at output time %1 s; Not within the curve. RenderPreviewImpossibile creare l'anteprima del rendering nella posizione della curva finch non disponibile una curva.HCannot render preview at the curve position since no curve is available. 
RenderPreviewImpossibile creare l'anteprima del rendering, nessun frame caricato.(Cannot render preview, no frames loaded. RenderPreviewPAnteprima ancora in fase di rendering. Preview is still being rendered. RenderPreviewDAnteprima del rendering terminata.Preview rendering finished. RenderPreview&Anteprima renderingRender preview RenderPreviewAnteprima del rendering al momento dell'uscita %1 s (potrebbe occorrere un po' di tempo) ...@Rendering preview at output time %1 s (might take some time) ... RenderPreviewDQuesto un messaggio informativo.This is an information message. RenderPreview^Sorgente frame vuota, impossibile il rendering.'Empty frame source, cannot be rendered. RenderTask_sVvNessuna destinazione per il rendering! Rendering annullato..No rendering target given! Aborting rendering. RenderTask_sV&Rendering Slow-Mo &Rendering Slow-Mo … RenderTask_sV(Rendering annullato.Rendering aborted. RenderTask_sVFrame di rendering %1 @ %2 s dalla posizione iniziale: %3 s (frame %4)?Rendering frame %1 @ %2 s from input position: %3 s (frame %4) RenderTask_sV&Annulla&AbortRenderingDialog&Ok&OkRenderingDialog&&Salva impostazioni&Save settingsRenderingDialog <Fine>RenderingDialog<Inizio>RenderingDialog NavigaBrowseRenderingDialogSfocatura di convoluzione (Pi uniforme della sfocatura in sovrapposizione, di solito la scelta migliore.)EConvolution blur (Smoother than stacking, usually the better choice.)RenderingDialog,Sezione personalizzataCustom sectionRenderingDialog2Modello del nome del fileFilename patternRenderingDialogDa due frame A e B, le interpolazioni in due direzioni calcolano sia il flusso A!B sia B!A,che porta a delle transizioni pi uniformi tra loro. Le interpolazioni in avanti calcolano il flusso A!B; Due volte pi veloce ma di solito meno uniforme.For two frames A and B, the two-way interpolations calculate both the flows A→B and B→A, which leads to smoother transitions between them. 
Forward interpolations only calculate A→B; Twice as fast, but usually less smooth.RenderingDialog"Frame al secondo:Frames per second:RenderingDialog"Progetto completo Full ProjectRenderingDialogImmaginiImagesRenderingDialogInterpolazione:Interpolation:RenderingDialog Campioni massimiMaximum samplesRenderingDialog,Sfocatura di Movimento Motion BlurRenderingDialog,Sfocatura di movimento Motion blurRenderingDialogLa sfocatura di movimento verr unicamente applicata per segmenti nei quali stata abilitata.EMotion blur will only be applied for segments on which it is enabled.RenderingDialog<Pi vicino (nessuna sfocatura)Nearest (no blurring)RenderingDialogFlusso Visivo Optical FlowRenderingDialogFlusso visivo Optical flowRenderingDialog Valori opzionaliOptional argumentsRenderingDialog(Dimensione originale Original sizeRenderingDialog UscitaOutputRenderingDialog0Cartella di destinazioneOutput directoryRenderingDialog^Cartella di output per le immagini renderizzate$Output directory for rendered imagesRenderingDialogFile di output Output fileRenderingDialog(File video di outputOutput video fileRenderingDialog2Rendering da %1 s a %2 s.Rendering from %1 s to %2 s.RenderingDialog2Impostazioni di renderingRendering settingsRenderingDialog6Campioni per lo slow motionSamples for slow motionRenderingDialogDimensione:Size:RenderingDialogPiccolaSmallRenderingDialogSfocatura in sovrapposizione (Occupa maggior spazio sul disco, ma pi veloce nel caso di rendering ripetuti.)KStacking blur (Uses more disk space, but is faster for repeated rendering.)RenderingDialogbIl tempo iniziale deve essere < del tempo finale!Start time must be < end time!RenderingDialog.Etichetta della sezione Tag sectionRenderingDialogDestinazione:Target:RenderingDialogIl %1 nel nome del percorso del file obbligatorio e sar sostituito con il numero di frame.UThe %1 in the filename pattern is mandatory and will be replaced by the frame number.RenderingDialogIl valore lambda usato unicamente per GPU basate su algoritmi Optical Flow. Non esiste una regola generale sul valore migliore, quindi una buona regola fare il rendering di una piccola parte con un valore di lambda basso (5) e uno con un valore alto (50), per evidenziarne le differenze e per poi trovare il miglior valore tra i due limiti.The lambda is only used with the GPU based Optical Flow algorithm. There is no general rule which value is best, so it is usually a good idea to render a short part with a low (5) and a high (50) lambda to see the differences, and then try to find the best value between.RenderingDialogUtilizzare un valore maggiore per filmati di alta qualit e immagini pi grandi.>Use a higher value for high-quality footage and larger images.RenderingDialog VideoVideoRenderingDialogI video saranno codificati con ffmpeg. Se gli argomenti aggiuntivi vengono lasciati vuoti, per impostazione predefinita verr utilizzato. Il formato video verr determinato da ffmpeg in base al suffisso del file.Videos will be encoded with ffmpeg. If additional arguments are left empty, defaults will be used. The video format is determined by ffmpeg according to the file suffix.RenderingDialog@Il progetto *non* verr salvato!Will *not* save the project!RenderingDialogaaRenderingDialog<Valore lambda del flusso videobuildFlow lambdaRenderingDialogRendered-%1.jpgrendered-%1.jpgRenderingDialogatoRenderingDialogBScorciatoie dell'Editor di FlussoFlow Editor shortcutsShortcutListDialog++ShutterFunctionDialog--ShutterFunctionDialog // footer })ShutterFunctionDialog"// header (function foo(args...) 
{ShutterFunctionDialog<<ShutterFunctionDialog

Math functions are available in the Math namespace:

Math.PI, Math.cos(...), Math.pow(base, exponent) etc.

ShutterFunctionDialog>>ShutterFunctionDialog ChiudiCloseShutterFunctionDialogSegmento %1 Segment %1ShutterFunctionDialog0Segmento %1 (totale: %2)Segment %1 (total number: %2)ShutterFunctionDialog Funzioni ShutterShutter FunctionsShutterFunctionDialogUsato: %1 volteUsed: %1 timesShutterFunctionDialog return 0;ShutterFunctionDialogshutterFunc1 shutterFunc1ShutterFunctionDialogAnnullaAbort TagAddDialog$Aggiungi etichettaAdd tag TagAddDialogdCambia tipo di etichetta con le frecce alto/basso.(Change the tag type with Arrows up/down. TagAddDialogOkOk TagAddDialog&Etichetta di output Output Tag TagAddDialog$Etichetta sorgente Source Tag TagAddDialogEstrazione dell'anteprima del frame di dimensioni originali dal file video4Extracting original-sized frames from the video fileVideoFrameSource_sVEstrazione dell'anteprima del frame di dimensioni ridotte dal file video5Extracting thumbnail-sized frames from the video fileVideoFrameSource_sVFrame %1 di %2Frame %1 of %2VideoFrameSource_sV8Il file video %1 non esiste!Video file %1 does not exist!VideoFrameSource_sVdIl video non valido, nessun flusso trovato in %1(Video is invalid, no streams found in %1VideoFrameSource_sVL'eseguibile ffmpeg/avconv non stato trovato! Impossibile caricare il video ( anche possibile che occorra un po' pi di tempo per la risposta a causa della grande quantit di lavoro, quindi si potrebbe riprovare di nuovo.)ffmpeg/avconv executable not found! Cannot load video. (It is also possible that it took a little long to respond due to high workload, so you might want to try again.)VideoFrameSource_sVslowmovideo-0.5+git20180116/src/slowmoVideo/slowmoRenderer/0000775000000000000000000000000013151342440022131 5ustar rootrootslowmovideo-0.5+git20180116/src/slowmoVideo/slowmoRenderer/slowmoRenderer_sV.cpp0000664000000000000000000002030713151342440026316 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #include "config.h" #include "slowmoRenderer_sV.h" #include "project/project_sV.h" #include "project/projectPreferences_sV.h" #include "project/xmlProjectRW_sV.h" #include "project/renderTask_sV.h" #include "project/imagesRenderTarget_sV.h" #include "project/videoFrameSource_sV.h" #include "project/emptyFrameSource_sV.h" #include "project/imagesFrameSource_sV.h" #ifdef USE_FFMPEG #if 0 #include "project/new_videoRenderTarget.h" #else #include "project/exportVideoRenderTarget.h" #endif #else #include "project/videoRenderTarget_sV.h" #endif #include "project/flowSourceV3D_sV.h" #include Error::Error(std::string message) : // m_nodes->setMaxY(m_frameSource->maxTime()); message(message) {} SlowmoRenderer_sV::SlowmoRenderer_sV() : m_project(NULL), m_taskSize(0), m_lastProgress(0), m_start(":start"), m_end(":end"), m_renderTargetSet(false) { } SlowmoRenderer_sV::~SlowmoRenderer_sV() { #if QT_VERSION >= QT_VERSION_CHECK(5, 0, 0) m_project->getProjectDir().removeRecursively(); #else #warning removeRecursively not define in QT4 #endif delete m_project; } void SlowmoRenderer_sV::save(QString filename) { XmlProjectRW_sV writer; writer.saveProject(m_project, filename); } void SlowmoRenderer_sV::load(QString filename) throw(Error) { if (m_project != NULL) { delete m_project; m_project = NULL; } QString warning; try { Project_sV *proj = XmlProjectRW_sV::loadProject(QString(filename), &warning); if (warning.length() > 0) { std::cout << warning.toStdString() << std::endl; } m_project = proj; RenderTask_sV *task = new RenderTask_sV(m_project); m_project->replaceRenderTask(task); task->renderPreferences().setFps(24); task->setTimeRange(m_start, m_end); connect(m_project->renderTask(), SIGNAL(signalNewTask(QString,int)), this, SLOT(slotTaskSize(QString,int))); connect(m_project->renderTask(), SIGNAL(signalTaskProgress(int)), this, SLOT(slotProgressInfo(int))); connect(m_project->renderTask(), SIGNAL(signalRenderingAborted(QString)), this, SLOT(slotFinished(QString))); connect(m_project->renderTask(), SIGNAL(signalRenderingFinished(QString)), this, SLOT(slotFinished(QString))); connect(m_project->renderTask(), SIGNAL(signalRenderingStopped(QString)), this, SLOT(slotFinished(QString))); } catch (Error_sV &err) { throw Error(err.message().toStdString()); } } void SlowmoRenderer_sV::create() throw(Error) { std::cout << "Standalone Rendering." << std::endl; if (m_project != NULL) { delete m_project; m_project = NULL; } try { m_project = new Project_sV(); RenderTask_sV *task = new RenderTask_sV(m_project); m_project->replaceRenderTask(task); task->renderPreferences().setFps(24); //task->setTimeRange(m_start, m_end); connect(m_project->renderTask(), SIGNAL(signalNewTask(QString,int)), this, SLOT(slotTaskSize(QString,int))); connect(m_project->renderTask(), SIGNAL(signalTaskProgress(int)), this, SLOT(slotProgressInfo(int))); connect(m_project->renderTask(), SIGNAL(signalRenderingAborted(QString)), this, SLOT(slotFinished(QString))); connect(m_project->renderTask(), SIGNAL(signalRenderingFinished(QString)), this, SLOT(slotFinished(QString))); connect(m_project->renderTask(), SIGNAL(signalRenderingStopped(QString)), this, SLOT(slotFinished(QString))); } catch (Error_sV &err) { throw Error(err.message().toStdString()); } } void SlowmoRenderer_sV::setSpeed(double slowfactor) { /* add a first (default) node */ Node_sV snode; snode.setX(0.0); snode.setY(0.0); m_project->nodes()->add(snode); Node_sV enode; // linear slope ? // maybe should calc ? // need to check for video loaded ? 
enode.setY(m_project->frameSource()->maxTime()); enode.setX((1/slowfactor)*m_project->frameSource()->maxTime()); m_project->nodes()->add(enode); //m_project->nodes()->setSpeed(0,slowfactor); m_project->renderTask()->setTimeRange(m_start, m_end); } void SlowmoRenderer_sV::setTimeRange(QString start, QString end) { m_start = start; m_end = end; m_project->renderTask()->setTimeRange(m_start, m_end); } void SlowmoRenderer_sV::setFps(double fps) { m_project->renderTask()->renderPreferences().setFps(fps); } void SlowmoRenderer_sV::setInputTarget(QString inFilename) { m_project->loadFrameSource(new VideoFrameSource_sV(m_project, inFilename)); connect(m_project->frameSource(), SIGNAL(signalNextTask(QString,int)), this, SLOT(slotNewFrameSourceTask(QString,int))); connect(m_project->frameSource(), SIGNAL(signalAllTasksFinished()), this, SLOT(slotFrameSourceTasksFinished())); //m_project->frameSource()->initialize(); m_project->frameSource()->loadOrigFrames(); // m_nodes->setMaxY(m_frameSource->maxTime()); // std::cerr << "max time : " << m_project->frameSource()->maxTime() << std::endl; } void SlowmoRenderer_sV::setVideoRenderTarget(QString filename, QString codec) { #ifdef USE_FFMPEG #if 0 #warning "using QTKit version" newVideoRenderTarget *vrt = new newVideoRenderTarget(m_project->renderTask()); #else #warning "using fork version" exportVideoRenderTarget *vrt = new exportVideoRenderTarget(m_project->renderTask()); #endif #else #warning "should not use this" VideoRenderTarget_sV *vrt = new VideoRenderTarget_sV(m_project->renderTask()); #endif vrt->setTargetFile(QString(filename)); vrt->setVcodec(QString(codec)); m_project->renderTask()->setRenderTarget(vrt); m_renderTargetSet = true; } void SlowmoRenderer_sV::setImagesRenderTarget(QString filenamePattern, QString directory) { ImagesRenderTarget_sV *irt = new ImagesRenderTarget_sV(m_project->renderTask()); irt->setFilenamePattern(QString(filenamePattern)); irt->setTargetDir(QString(directory)); m_project->renderTask()->setRenderTarget(irt); m_renderTargetSet = true; } void SlowmoRenderer_sV::setInterpolation(InterpolationType interpolation) { m_project->renderTask()->renderPreferences().interpolation = interpolation; } void SlowmoRenderer_sV::setMotionblur(MotionblurType motionblur) { m_project->renderTask()->renderPreferences().motionblur = motionblur; } void SlowmoRenderer_sV::setSize(bool original) { if (original) { m_project->renderTask()->renderPreferences().size = FrameSize_Orig; } else { m_project->renderTask()->renderPreferences().size = FrameSize_Small; } } void SlowmoRenderer_sV::setV3dLambda(float lambda) { m_project->preferences()->flowV3DLambda() = lambda; } void SlowmoRenderer_sV::start() { m_project->renderTask()->slotContinueRendering(); } void SlowmoRenderer_sV::abort() { m_project->renderTask()->slotStopRendering(); } void SlowmoRenderer_sV::slotProgressInfo(int progress) { m_lastProgress = progress; } void SlowmoRenderer_sV::slotTaskSize(QString desc, int size) { std::cout << desc.toStdString() << std::endl; m_taskSize = size; } void SlowmoRenderer_sV::slotFinished(QString time) { std::cout << std::endl << "Rendering finished. 
Time taken: " << time.toStdString() << std::endl; } void SlowmoRenderer_sV::printProgress() { std::cout << m_lastProgress << "/" << m_taskSize << std::endl; } bool SlowmoRenderer_sV::isComplete(QString &message) const { bool b = true; if (!m_renderTargetSet) { b = false; message.append("No render target set.\n"); } return b; } void SlowmoRenderer_sV::slotNewFrameSourceTask(const QString taskDescription, int taskSize) { std::cout << "slotNewFrameSourceTask"; } void SlowmoRenderer_sV::slotFrameSourceTasksFinished() { std::cout << "slotFrameSourceTasksFinished"; } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoRenderer/CMakeLists.txt0000664000000000000000000000060513151342440024672 0ustar rootroot include_directories(..) set(SRCS rendererMain.cpp slowmoRenderer_sV.cpp ) set(SRCS_MOC slowmoRenderer_sV.h ) qt_wrap_cpp(MOC_OUT ${SRCS_MOC}) add_executable(slowmoRenderer ${SRCS} ${MOC_OUT}) target_link_libraries(slowmoRenderer sVproj ${EXTERNAL_LIBS}) qt_use_modules(slowmoRenderer Script Widgets Concurrent Gui Core ) install(TARGETS slowmoRenderer DESTINATION ${DEST}) slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoRenderer/slowmoRenderer_sV.h0000664000000000000000000000404313151342440025762 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef SLOWMORENDERER_SV_H #define SLOWMORENDERER_SV_H #include "lib/defs_sV.hpp" #include #include #include class Project_sV; class Error { public: Error(std::string message); std::string message; }; /** \brief Just for rendering. */ class SlowmoRenderer_sV : public QObject { Q_OBJECT public: SlowmoRenderer_sV(); ~SlowmoRenderer_sV(); void load(QString filename) throw(Error); void save(QString filename); void create() throw(Error); void start(); void abort(); void setTimeRange(QString start, QString end); void setFps(double fps); void setVideoRenderTarget(QString filename, QString codec); void setImagesRenderTarget(QString filenamePattern, QString directory); void setInputTarget(QString inFilename); void setInterpolation(InterpolationType interpolation); void setMotionblur(MotionblurType motionblur); void setSize(bool original); void setV3dLambda(float lambda); void setSpeed(double slowfactor); void printProgress(); /// Checks if all necessary parameters (e.g. paths) are set /// \param message Will contain an error message if the function returned \c false bool isComplete(QString &message) const; private: Project_sV *m_project; int m_taskSize; int m_lastProgress; QString m_start; QString m_end; bool m_renderTargetSet; private slots: void slotProgressInfo(int progress); void slotTaskSize(QString desc, int size); void slotFinished(QString time); void slotNewFrameSourceTask(const QString taskDescription, int taskSize); void slotFrameSourceTasksFinished(); }; #endif // SLOWMORENDERER_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoRenderer/rendererMain.cpp0000664000000000000000000002046513151342440025257 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #include "slowmoRenderer_sV.h" #include #include #include #include #include QString myName; SlowmoRenderer_sV renderer; int terminateCounter = 0; int genproj = 0; //TODO: maybe in case of abort we should remove directories ? void terminate(int) { if (terminateCounter == 0) { std::cout << "Telling renderer to stop." << std::endl; } else if (terminateCounter == 1) { std::cout << "Really want to kill rendering? Send the SIGINT a third time." << std::endl; } else { exit(SIGINT); } terminateCounter++; renderer.abort(); } void printProgress(int) { renderer.printProgress(); } void printHelp() { std::cout << "slowmoRenderer for slowmoVideo " << Version_sV::version.toStdString() << std::endl << myName.toStdString() << " []" << std::endl << "\t-target [video [|auto] | images ] " << std::endl << "\t-input video | images ] " << std::endl << "\t-size [small|orig] " << std::endl << "\t-fps " << std::endl << "\t-start -end " << std::endl << "\t-interpolation [forward[2]|twoway[2]] " << std::endl << "\t -motionblur [stack|convolve] " << std::endl << "\t -slowfactor " << std::endl << "\t-v3dLambda " << std::endl; } void require(int nArgs, int index, int size) { if (size <= index + nArgs) { std::cout << "Not enough arguments delivered (" << nArgs << " required)." << std::endl; printHelp(); exit(-1); } } int main(int argc, char *argv[]) { QCoreApplication app(argc, argv); // Set up preferences for the QSettings file QCoreApplication::setOrganizationName("Granjow"); QCoreApplication::setOrganizationDomain("granjow.net"); QCoreApplication::setApplicationName("slowmoUI"); if (signal(SIGINT, terminate) == SIG_ERR) { std::cerr << "Could not set up SIGINT handler." << std::endl; } #ifndef WINDOWS if (signal(SIGUSR1, printProgress) == SIG_ERR) { std::cerr << "Could not set up SIGUSR1 handler." << std::endl; } #endif QStringList args = app.arguments(); myName = args.at(0); if (argc <= 1 || "--help" == args.at(1) || "-h" == args.at(1)) { printHelp(); return 0; } int next = 1; if ((args.at(1)).contains("svProj", Qt::CaseInsensitive) ) { renderer.load(args.at(1)); next = 2; } else renderer.create(); QString start = ":start"; QString end = ":end"; const int n = args.size(); while (next < n) { if ("-target" == args.at(next)) { require(3, next, n); next++; if ("video" == args.at(next)) { next++; QString filename = args.at(next++); QString codec = args.at(next++); if ("auto" == codec) { codec = ""; } renderer.setVideoRenderTarget(filename, codec); } else if ("images" == args.at(next)) { next++; QString filenamePattern = args.at(next++); QString dir = args.at(next++); renderer.setImagesRenderTarget(filenamePattern, dir); } else { std::cerr << "Not a valid target: " << args.at(next).toStdString() << std::endl; return -1; } } else if ("-input" == args.at(next)) { require(2, next, n); next++; if ("video" == args.at(next)) { next++; QString filename = args.at(next++); renderer.setInputTarget(filename); } else if ("images" == args.at(next)) { next++; QString filenamePattern = args.at(next++); QString dir = args.at(next++); //TODO: pattern ? 
//renderer.setImagesRenderTarget(filenamePattern, dir); } else { std::cerr << "Not a valid input: " << args.at(next).toStdString() << std::endl; return -1; } } else if ("-size" == args.at(next)) { require(1, next, n); next++; if ("small" == args.at(next)) { renderer.setSize(false); } else if ("orig" == args.at(next)) { renderer.setSize(true); } else { std::cerr << "Not a valid size: " << args.at(next).toStdString() << std::endl; return -1; } next++; } else if ("-fps" == args.at(next)) { require(1, next, n); next++; bool b; double fps = args.at(next).toDouble(&b); if (!b) { std::cerr << "Not a number: " << args.at(next).toStdString() << std::endl; return -1; } renderer.setFps(fps); next++; } else if ("-start" == args.at(next)) { require(1, next, n); next++; start = args.at(next++); } else if ("-end" == args.at(next)) { require(1, next, n); next++; end = args.at(next++); } else if ("-interpolation" == args.at(next)) { require(1, next, n); next++; if ("forward" == args.at(next)) { renderer.setInterpolation(InterpolationType_Forward); } else if ("forward2" == args.at(next)) { renderer.setInterpolation(InterpolationType_ForwardNew); } else if ("twoway" == args.at(next)) { renderer.setInterpolation(InterpolationType_Twoway); } else if ("twoway2" == args.at(next)) { renderer.setInterpolation(InterpolationType_TwowayNew); } else { std::cerr << "Not a valid interpolation type: " << args.at(next).toStdString() << std::endl; return -1; } next++; } else if ("-motionblur" == args.at(next)) { require(1, next, n); next++; if ("stack" == args.at(next)) { renderer.setMotionblur(MotionblurType_Stacking); } else if ("convolve" == args.at(next)) { renderer.setMotionblur(MotionblurType_Convolving); } else { std::cerr << "Not a valid motion blur type: " << args.at(next).toStdString() << std::endl; return -1; } next++; } else if ("-v3dLambda" == args.at(next)) { require(1, next, n); next++; bool b; float lambda = args.at(next).toFloat(&b); if (!b) { std::cerr << "Not a number: " << args.at(next).toStdString() << std::endl; return -1; } renderer.setV3dLambda(lambda); next++; } else if ("-slowfactor" == args.at(next)) { require(1, next, n); next++; bool b; double slowfactor = args.at(next).toDouble(&b); if (!b) { std::cerr << "Not a number: " << args.at(next).toStdString() << std::endl; return -1; } std::cerr << "will slow down to : " << slowfactor << std::endl; renderer.setSpeed(slowfactor); next++; } else { std::cout << "Argument not recognized: " << args.at(next).toStdString() << std::endl; printHelp(); return -1; } } renderer.setTimeRange(start, end); QString msg; if (!renderer.isComplete(msg)) { std::cout << msg.toStdString() << std::endl; std::cout << "Project will not be rendered." 
<< std::endl; return 42; } if (genproj) renderer.save("test.svProj"); else renderer.start(); } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/0000775000000000000000000000000013151342440020700 5ustar rootrootslowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/logbrowserdialog.h0000664000000000000000000000154513151342440024423 0ustar rootroot#ifndef LOGDIALOG_H #define LOGDIALOG_H #include #include class QTextBrowser; class QPushButton; class LogBrowserDialog : public QDialog { Q_OBJECT public: LogBrowserDialog(QWidget *parent = 0); ~LogBrowserDialog(); public slots: void outputMessage( QtMsgType type, const QString &msg ); protected slots: void save(); // TODO: /* static void registerQDebugMessageHandler(){ qInstallMessageHandler(myQDebugMessageHandler); Q_DebugStream::registerQDebugMessageHandler(); */ protected: virtual void keyPressEvent( QKeyEvent *e ); virtual void closeEvent( QCloseEvent *e ); //QTextBrowser *browser; /*static*/ QTextEdit *browser; // m_LogEdit; QPushButton *clearButton; QPushButton *saveButton; }; #endif // LOGDIALOG_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/resources.qrc0000664000000000000000000000136113151342440023422 0ustar rootroot res/iconAdd.png res/iconSel.png res/iconMov.png res/shutterFunction.png res/AppIcon.png ../tr/slowmoVideo_de.qm ../tr/slowmoVideo_it.qm ../tr/slowmoVideo_fr.qm slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/renderPreview.h0000664000000000000000000000243613151342440023677 0ustar rootroot/* slowmoUI is a user interface for slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef RENDERPREVIEW_H #define RENDERPREVIEW_H #include #include #include #include "../lib/defs_sV.hpp" namespace Ui { class RenderPreview; } class Project_sV; class QMainWindow; /** \brief Renders a preview frame from the project */ class RenderPreview : public QWidget { Q_OBJECT public: explicit RenderPreview(Project_sV *project, QWidget *parent = 0); ~RenderPreview(); /// Uses the given project (and its curve etc.) void load(Project_sV *project); public slots: /// Renders the output frame at the given time (asynchronous) void slotRenderAt(qreal time); private: Ui::RenderPreview *ui; Project_sV *m_project; QMainWindow *m_parentMainWindow; QFutureWatcher m_futureWatcher; QFuture m_future; void notify(const QString message); private slots: void slotUpdateImage(); }; #endif // RENDERPREVIEW_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/renderPreview.cpp0000664000000000000000000000610613151342440024230 0ustar rootroot/* slowmoUI is a user interface for slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
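   Overview: RenderPreview runs Project_sV::render for a single output time in a
   background QFuture watched by a QFutureWatcher, shows the result in the image
   display once slotUpdateImage() fires, and writes a copy to
   <QDir::tempPath()>/renderPreview.jpg.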
*/ #include "renderPreview.h" #include "ui_renderPreview.h" #include "project/emptyFrameSource_sV.h" #include "project/project_sV.h" #include "project/projectPreferences_sV.h" #include #include #include #include #if QT_VERSION >= QT_VERSION_CHECK(5, 0, 0) #include #endif RenderPreview::RenderPreview(Project_sV *project, QWidget *parent) : QWidget(parent), ui(new Ui::RenderPreview), m_project(project) { ui->setupUi(this); m_parentMainWindow = dynamic_cast(parentWidget()); ui->info->setVisible(m_parentMainWindow == NULL); ui->info->clear(); connect(&m_futureWatcher, SIGNAL(finished()), this, SLOT(slotUpdateImage())); } RenderPreview::~RenderPreview() { delete ui; } void RenderPreview::load(Project_sV *project) { m_project = project; } void RenderPreview::notify(const QString message) { if (m_parentMainWindow != NULL) { m_parentMainWindow->statusBar()->showMessage(message, 5000); } else { ui->info->setText(message); } } void RenderPreview::slotRenderAt(qreal time) { if (dynamic_cast(m_project->frameSource()) != NULL) { notify(tr("Cannot render preview, no frames loaded.")); return; } if (m_project->nodes()->size() < 2) { notify(tr("Cannot render preview at the curve position since no curve is available.")); return; } if (time >= m_project->nodes()->startTime() && time <= m_project->nodes()->endTime()) { notify(tr("Rendering preview at output time %1 s (might take some time) ...").arg(time)); if (m_future.isRunning()) { notify(tr("Preview is still being rendered.")); } else { RenderPreferences_sV prefs; prefs.fps() = m_project->preferences()->renderFPS(); prefs.interpolation = m_project->preferences()->renderInterpolationType(); prefs.size = FrameSize_Orig; m_future = QtConcurrent::run(m_project, &Project_sV::render, time, prefs); m_futureWatcher.setFuture(m_future); if (m_future.isFinished()) { qDebug() << "qFuture has already finished! Manually calling update."; slotUpdateImage(); } } } else { notify(tr("Cannot render at output time %1 s; Not within the curve.").arg(time)); } } void RenderPreview::slotUpdateImage() { qDebug() << "Updating preview image now. 
Saving as /tmp/renderPreview.jpg."; ///< \todo do not save anymore ui->imageDisplay->loadImage(m_future.result()); ui->imageDisplay->image().save(QDir::tempPath () + "/renderPreview.jpg"); repaint(); notify(tr("Preview rendering finished.")); } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/notificator.cpp0000664000000000000000000002353613151342440023736 0ustar rootroot#include "notificator.h" #include #include #include #include #include #include #include #include #include #include #ifdef USE_DBUS #include #include #endif #ifdef Q_OS_MAC #include #include "macnotificationhandler.h" #endif // https://wiki.ubuntu.com/NotificationDevelopmentGuidelines recommends at least 128 const int FREEDESKTOP_NOTIFICATION_ICON_SIZE = 128; Notificator::Notificator(const QString &programName, QSystemTrayIcon *trayicon, QWidget *parent): QObject(parent), parent(parent), programName(programName), mode(None), trayIcon(trayicon) #ifdef USE_DBUS ,interface(0) #endif { if(trayicon && trayicon->supportsMessages()) { mode = QSystemTray; } #ifdef USE_DBUS interface = new QDBusInterface("org.freedesktop.Notifications", "/org/freedesktop/Notifications", "org.freedesktop.Notifications"); if(interface->isValid()) { mode = Freedesktop; } #endif #ifdef Q_OS_MAC // check if users OS has support for NSUserNotification if( MacNotificationHandler::instance()->hasUserNotificationCenterSupport()) { mode = UserNotificationCenter; } else { // Check if Growl is installed (based on Qt's tray icon implementation) CFURLRef cfurl; OSStatus status = LSGetApplicationForInfo(kLSUnknownType, kLSUnknownCreator, CFSTR("growlTicket"), kLSRolesAll, 0, &cfurl); if (status != kLSApplicationNotFoundErr) { CFBundleRef bundle = CFBundleCreate(0, cfurl); if (CFStringCompare(CFBundleGetIdentifier(bundle), CFSTR("com.Growl.GrowlHelperApp"), kCFCompareCaseInsensitive | kCFCompareBackwards) == kCFCompareEqualTo) { if (CFStringHasSuffix(CFURLGetString(cfurl), CFSTR("/Growl.app/"))) mode = Growl13; else mode = Growl12; } CFRelease(cfurl); CFRelease(bundle); } } #endif } Notificator::~Notificator() { #ifdef USE_DBUS delete interface; #endif } #ifdef USE_DBUS // Loosely based on http://www.qtcentre.org/archive/index.php/t-25879.html class FreedesktopImage { public: FreedesktopImage() {} FreedesktopImage(const QImage &img); static int metaType(); // Image to variant that can be marshalled over DBus static QVariant toVariant(const QImage &img); private: int width, height, stride; bool hasAlpha; int channels; int bitsPerSample; QByteArray image; friend QDBusArgument &operator<<(QDBusArgument &a, const FreedesktopImage &i); friend const QDBusArgument &operator>>(const QDBusArgument &a, FreedesktopImage &i); }; Q_DECLARE_METATYPE(FreedesktopImage); // Image configuration settings const int CHANNELS = 4; const int BYTES_PER_PIXEL = 4; const int BITS_PER_SAMPLE = 8; FreedesktopImage::FreedesktopImage(const QImage &img): width(img.width()), height(img.height()), stride(img.width() * BYTES_PER_PIXEL), hasAlpha(true), channels(CHANNELS), bitsPerSample(BITS_PER_SAMPLE) { // Convert 00xAARRGGBB to RGBA bytewise (endian-independent) format QImage tmp = img.convertToFormat(QImage::Format_ARGB32); const uint32_t *data = reinterpret_cast(tmp.bits()); unsigned int num_pixels = width * height; image.resize(num_pixels * BYTES_PER_PIXEL); for(unsigned int ptr = 0; ptr < num_pixels; ++ptr) { image[ptr*BYTES_PER_PIXEL+0] = data[ptr] >> 16; // R image[ptr*BYTES_PER_PIXEL+1] = data[ptr] >> 8; // G image[ptr*BYTES_PER_PIXEL+2] = data[ptr]; // B image[ptr*BYTES_PER_PIXEL+3] = 
data[ptr] >> 24; // A } } QDBusArgument &operator<<(QDBusArgument &a, const FreedesktopImage &i) { a.beginStructure(); a << i.width << i.height << i.stride << i.hasAlpha << i.bitsPerSample << i.channels << i.image; a.endStructure(); return a; } const QDBusArgument &operator>>(const QDBusArgument &a, FreedesktopImage &i) { a.beginStructure(); a >> i.width >> i.height >> i.stride >> i.hasAlpha >> i.bitsPerSample >> i.channels >> i.image; a.endStructure(); return a; } int FreedesktopImage::metaType() { return qDBusRegisterMetaType(); } QVariant FreedesktopImage::toVariant(const QImage &img) { FreedesktopImage fimg(img); return QVariant(FreedesktopImage::metaType(), &fimg); } void Notificator::notifyDBus(Class cls, const QString &title, const QString &text, const QIcon &icon, int millisTimeout) { Q_UNUSED(cls); // Arguments for DBus call: QList args; // Program Name: args.append(programName); // Unique ID of this notification type: args.append(0U); // Application Icon, empty string args.append(QString()); // Summary args.append(title); // Body args.append(text); // Actions (none, actions are deprecated) QStringList actions; args.append(actions); // Hints QVariantMap hints; // If no icon specified, set icon based on class QIcon tmpicon; if(icon.isNull()) { QStyle::StandardPixmap sicon = QStyle::SP_MessageBoxQuestion; switch(cls) { case Information: sicon = QStyle::SP_MessageBoxInformation; break; case Warning: sicon = QStyle::SP_MessageBoxWarning; break; case Critical: sicon = QStyle::SP_MessageBoxCritical; break; default: break; } tmpicon = QApplication::style()->standardIcon(sicon); } else { tmpicon = icon; } hints["icon_data"] = FreedesktopImage::toVariant(tmpicon.pixmap(FREEDESKTOP_NOTIFICATION_ICON_SIZE).toImage()); args.append(hints); // Timeout (in msec) args.append(millisTimeout); // "Fire and forget" interface->callWithArgumentList(QDBus::NoBlock, "Notify", args); } #endif void Notificator::notifySystray(Class cls, const QString &title, const QString &text, const QIcon &icon, int millisTimeout) { Q_UNUSED(icon); QSystemTrayIcon::MessageIcon sicon = QSystemTrayIcon::NoIcon; switch(cls) // Set icon based on class { case Information: sicon = QSystemTrayIcon::Information; break; case Warning: sicon = QSystemTrayIcon::Warning; break; case Critical: sicon = QSystemTrayIcon::Critical; break; } trayIcon->showMessage(title, text, sicon, millisTimeout); } // Based on Qt's tray icon implementation #ifdef Q_OS_MAC void Notificator::notifyGrowl(Class cls, const QString &title, const QString &text, const QIcon &icon) { const QString script( "tell application \"%5\"\n" " set the allNotificationsList to {\"Notification\"}\n" // -- Make a list of all the notification types (all) " set the enabledNotificationsList to {\"Notification\"}\n" // -- Make a list of the notifications (enabled) " register as application \"%1\" all notifications allNotificationsList default notifications enabledNotificationsList\n" // -- Register our script with Growl " notify with name \"Notification\" title \"%2\" description \"%3\" application name \"%1\"%4\n" // -- Send a Notification "end tell" ); QString notificationApp(QApplication::applicationName()); if (notificationApp.isEmpty()) notificationApp = "Application"; QPixmap notificationIconPixmap; if (icon.isNull()) { // If no icon specified, set icon based on class QStyle::StandardPixmap sicon = QStyle::SP_MessageBoxQuestion; switch (cls) { case Information: sicon = QStyle::SP_MessageBoxInformation; break; case Warning: sicon = QStyle::SP_MessageBoxWarning; break; case 
Critical: sicon = QStyle::SP_MessageBoxCritical; break; } notificationIconPixmap = QApplication::style()->standardPixmap(sicon); } else { QSize size = icon.actualSize(QSize(48, 48)); notificationIconPixmap = icon.pixmap(size); } QString notificationIcon; QTemporaryFile notificationIconFile; if (!notificationIconPixmap.isNull() && notificationIconFile.open()) { QImageWriter writer(¬ificationIconFile, "PNG"); if (writer.write(notificationIconPixmap.toImage())) notificationIcon = QString(" image from location \"file://%1\"").arg(notificationIconFile.fileName()); } QString quotedTitle(title), quotedText(text); quotedTitle.replace("\\", "\\\\").replace("\"", "\\"); quotedText.replace("\\", "\\\\").replace("\"", "\\"); QString growlApp(this->mode == Notificator::Growl13 ? "Growl" : "GrowlHelperApp"); MacNotificationHandler::instance()->sendAppleScript(script.arg(notificationApp, quotedTitle, quotedText, notificationIcon, growlApp)); } void Notificator::notifyMacUserNotificationCenter(Class cls, const QString &title, const QString &text, const QIcon &icon) { // icon is not supported by the user notification center yet. OSX will use the app icon. MacNotificationHandler::instance()->showNotification(title, text); } #endif void Notificator::notify(Class cls, const QString &title, const QString &text, const QIcon &icon, int millisTimeout) { switch(mode) { #ifdef USE_DBUS case Freedesktop: notifyDBus(cls, title, text, icon, millisTimeout); break; #endif case QSystemTray: notifySystray(cls, title, text, icon, millisTimeout); break; #ifdef Q_OS_MAC case UserNotificationCenter: notifyMacUserNotificationCenter(cls, title, text, icon); break; case Growl12: case Growl13: notifyGrowl(cls, title, text, icon); break; #endif default: if(cls == Critical) { // Fall back to old fashioned pop-up dialog if critical and no other notification available QMessageBox::critical(parent, title, text, QMessageBox::Ok, QMessageBox::Ok); } break; } } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/canvas.ui0000664000000000000000000000350413151342440022514 0ustar rootroot Canvas 0 0 400 300 255 255 255 47 47 53 255 255 255 47 47 53 47 47 53 47 47 53 slowmoUI canvas slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/canvasTools.cpp0000664000000000000000000000405513151342440023704 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "canvasTools.h" #include "canvas.h" #include "project/project_sV.h" #include "project/projectPreferences_sV.h" #include //#define DEBUG #ifdef DEBUG #include #endif QString CanvasTools::outputTimeLabel(Canvas *canvas, Node_sV &time) { int decimals = 0; float maxRes = 1; while (canvas->m_secResX > maxRes) { decimals++; maxRes *= 10; } #ifdef DEBUG qDebug() << "resX: " << canvas->m_secResX << ", decimals: " << decimals << ", max res: " << maxRes; #endif QString timeText; if (time.x() < 60) { timeText = QString(QObject::tr("%1 s")).arg(time.x(), 0, 'f', decimals); } else { timeText = QString(QObject::tr("%1 min %2 s")).arg(int(time.x()/60)).arg(time.x()-60*int(time.x()/60), 0, 'f', decimals); } float frame = canvas->m_project->preferences()->canvas_xAxisFPS().fps()*time.x(); timeText += QString(QObject::tr("\nFrame %1")).arg(frame, 0, 'f', (decimals <= 1 ? 
0 : 1)); return timeText; } QString CanvasTools::outputSpeedLabel(Node_sV &time, Project_sV *project) { if (!project->nodes()->isInsideCurve(time.x(), true)) { return ""; } const qreal dx = 1.0/project->preferences()->canvas_xAxisFPS().fps(); qreal t1, t2; if (time.x()+dx <= project->nodes()->endTime()) { t1 = project->nodes()->sourceTime(time.x()); t2 = project->nodes()->sourceTime(time.x()+dx); } else { t1 = project->nodes()->sourceTime(time.x()-dx); t2 = project->nodes()->sourceTime(time.x()); } const qreal dy = t2-t1; qreal percent = 0; if (dy != 0) { percent = dy/dx; } return QString(QObject::tr("%1 %")).arg(percent, 0, 'f');//, 1); } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/0000775000000000000000000000000013151342440022654 5ustar rootrootslowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/tagAddDialog.h0000664000000000000000000000166113151342440025335 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef TAGADDDIALOG_H #define TAGADDDIALOG_H #include "project/tag_sV.h" #include "lib/defs_sV.hpp" #include namespace Ui { class TagAddDialog; } class TagAddDialog : public QDialog { Q_OBJECT public: explicit TagAddDialog(TagAxis defaultAxis, QWidget *parent = 0); ~TagAddDialog(); QString m_text; Tag_sV buildTag(QPointF time); protected: void keyPressEvent(QKeyEvent *); private: Ui::TagAddDialog *ui; TagAxis m_axis; private slots: void slotTextChanged(const QString& text); void slotUpdateAxis(); }; #endif // TAGADDDIALOG_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/newProjectDialog.cpp0000664000000000000000000002206413151342440026624 0ustar rootroot/* slowmoUI is a user interface for slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #include "newProjectDialog.h" #include "ui_newProjectDialog.h" #include "project/videoFrameSource_sV.h" #include "project/imagesFrameSource_sV.h" #include #include #include #include #include NewProjectDialog::NewProjectDialog(QWidget *parent) : QDialog(parent), ui(new Ui::NewProjectDialog) { ui->setupUi(this); m_buttonGroup = new QButtonGroup(this); m_buttonGroup->addButton(ui->radioVideo); m_buttonGroup->addButton(ui->radioImages); ui->radioVideo->setChecked(true); ui->projectDir->setText(m_settings.value("directories/lastProjectDir", QDir::current().absolutePath()).toString()); m_videoInfo.streamsCount = 0; connect(ui->browseInputVideo, SIGNAL(clicked()), this, SLOT(slotSelectVideoFile())); connect(ui->browseInputImages, SIGNAL(clicked()), this, SLOT(slotSelectImages())); connect(ui->browseProjectDir, SIGNAL(clicked()), this, SLOT(slotSelectProjectDir())); connect(ui->inputVideo, SIGNAL(textChanged(QString)), this, SLOT(slotUpdateVideoInfo())); connect(ui->projectDir, SIGNAL(textChanged(QString)), this, SLOT(slotUpdateButtonStates())); connect(ui->projectFilename, SIGNAL(textChanged(QString)), this, SLOT(slotUpdateButtonStates())); connect(ui->bAbort, SIGNAL(clicked()), this, SLOT(reject())); connect(ui->bOk, SIGNAL(clicked()), this, SLOT(accept())); connect(m_buttonGroup, SIGNAL(buttonClicked(int)), this, SLOT(slotUpdateFrameSourceType())); slotUpdateImagesInfo(); slotUpdateVideoInfo(); slotUpdateButtonStates(); slotUpdateFrameSourceType(); } NewProjectDialog::~NewProjectDialog() { delete ui; delete m_buttonGroup; } Project_sV* NewProjectDialog::buildProject() throw(FrameSourceError) { Project_sV *project = new Project_sV(ui->projectDir->text()); AbstractFrameSource_sV *frameSource = NULL; if (ui->radioVideo->isChecked()) { frameSource = new VideoFrameSource_sV(project, ui->inputVideo->text()); m_settings.setValue("directories/lastInputVideo", QFileInfo(ui->inputVideo->text()).absolutePath()); } else { frameSource = new ImagesFrameSource_sV(project, m_images); m_settings.setValue("directories/lastInputImage", QFileInfo(m_images.last()).absolutePath()); } project->loadFrameSource(frameSource); m_settings.setValue("directories/lastProjectDir", ui->projectDir->text()); m_settings.setValue("directories/lastProjectDir", ui->projectDir->text()); return project; } const QString NewProjectDialog::projectFilename() const { return QString(ui->projectDir->text() + "/" + ui->projectFilename->text() + ".sVproj"); } void NewProjectDialog::slotSelectVideoFile() { QFileDialog dialog(this, tr("Select input video file")); dialog.setAcceptMode(QFileDialog::AcceptOpen); dialog.setFileMode(QFileDialog::ExistingFile); if (ui->inputVideo->text().length() > 0) { dialog.setDirectory(ui->inputVideo->text()); } else { dialog.setDirectory(m_settings.value("directories/lastInputVideo", QDir::homePath()).toString()); } if (dialog.exec() == QDialog::Accepted) { ui->inputVideo->setText(dialog.selectedFiles().at(0)); ui->txtVideoInfo->clear(); slotUpdateVideoInfo(); if (m_videoInfo.streamsCount <= 0) { // No video stream found. Check if the path contains a non-ASCII character and warn if this is the case. unsigned char ascii; for (int i = 0; i < ui->inputVideo->text().length(); i++) { #if QT_VERSION < QT_VERSION_CHECK(5, 0, 0) ascii = ui->inputVideo->text().at(i).toAscii(); #else ascii = ui->inputVideo->text().at(i).toLatin1(); #endif if (ascii == 0 || ascii > 0x7f) { ui->txtVideoInfo->appendPlainText( tr("Character %1 is not an ASCII character. 
This file path will likely not work with ffmpeg.") .arg(ui->inputVideo->text().at(i))); break; } } } } } void NewProjectDialog::slotSelectImages() { QFileDialog dialog(this, tr("Select input images")); dialog.setAcceptMode(QFileDialog::AcceptOpen); dialog.setFileMode(QFileDialog::ExistingFiles); if (m_images.size() > 0) { dialog.setDirectory(QFileInfo(m_images.last()).absolutePath()); } else { dialog.setDirectory(m_settings.value("directories/lastInputImage", QDir::homePath()).toString()); } if (dialog.exec() == QDialog::Accepted) { m_images = dialog.selectedFiles(); ui->inputImages->clear(); for (int i = 0; i < m_images.size(); i++) { new QListWidgetItem(m_images.at(i), ui->inputImages); } slotUpdateImagesInfo(); } } void NewProjectDialog::slotSelectProjectDir() { QFileDialog dialog(this, tr("Select a project directory")); dialog.setAcceptMode(QFileDialog::AcceptOpen); dialog.setFileMode(QFileDialog::Directory); if (ui->projectDir->text().length() > 0) { dialog.setDirectory(ui->projectDir->text()); } else { dialog.setDirectory(m_settings.value("directories/lastProjectDir").toString()); } if (dialog.exec() == QDialog::Accepted) { ui->projectDir->setText(dialog.selectedFiles().at(0)); slotUpdateButtonStates(); } } void NewProjectDialog::slotUpdateVideoInfo() { QFile file(ui->inputVideo->text()); if (file.exists()) { m_videoInfo = getInfo(ui->inputVideo->text().toStdString().c_str()); QString text = trUtf8("Number of video streams: %1\nFrames: %2\nSize: %3×%4\n") .arg(m_videoInfo.streamsCount).arg(m_videoInfo.framesCount) .arg(m_videoInfo.width).arg(m_videoInfo.height); text.append(tr("Frame rate: %1/%2").arg(m_videoInfo.frameRateNum).arg(m_videoInfo.frameRateDen)); ui->txtVideoInfo->setPlainText(text); } else { m_videoInfo.streamsCount = 0; ui->txtVideoInfo->setPlainText(tr("No video stream detected.")); } slotUpdateButtonStates(); } void NewProjectDialog::slotUpdateImagesInfo() { m_imagesMsg = ImagesFrameSource_sV::validateImages(m_images); slotUpdateButtonStates(); } void NewProjectDialog::slotUpdateButtonStates() { bool ok = true; if (ui->projectDir->text().length() > 0) { QDir dir(ui->projectDir->text()); ui->cbDirectoryCreated->setChecked(!dir.exists()); ui->projectDir->setStyleSheet(QString("QLineEdit { background-color: %1; }").arg(Colours_sV::colOk.name())); m_projectDir = ui->projectDir->text(); } else { ui->projectDir->setStyleSheet(QString("QLineEdit { background-color: %1; }").arg(Colours_sV::colBad.name())); ok = false; } QFile projectFile(projectFilename()); if (ui->projectFilename->text().length() > 0 && !projectFile.exists()) { ui->projectFilename->setStyleSheet(QString("QLineEdit { background-color: %1; }").arg(Colours_sV::colOk.name())); } else { ui->projectFilename->setStyleSheet(QString("QLineEdit { background-color: %1; }").arg(Colours_sV::colBad.name())); ok = false; } if (ui->radioVideo->isChecked()) { // Validate the video file #if 0 // ubuntu can't handle mp4 correctly ? 
if (m_videoInfo.streamsCount > 0 && m_videoInfo.framesCount > 0) { #else if (m_videoInfo.streamsCount > 0) { #endif ui->inputVideo->setStyleSheet(QString("QLineEdit { background-color: %1; }").arg(Colours_sV::colOk.name())); m_inputFile = ui->inputVideo->text(); } else { ui->inputVideo->setStyleSheet(QString("QLineEdit { background-color: %1; }").arg(Colours_sV::colBad.name())); ok = false; } } else if (ui->radioImages->isChecked()) { // Validate the images if (m_imagesMsg.length() == 0) { ui->inputImages->setStyleSheet(QString("QListWidget { background-color: %1; }").arg(Colours_sV::colOk.name())); ui->txtImageInfo->setText(tr("Image size: %1").arg(toString(QImage(m_images.at(0)).size()))); } else { ui->inputImages->setStyleSheet(QString("QListWidget { background-color: %1; }").arg(Colours_sV::colBad.name())); ui->txtImageInfo->setText(m_imagesMsg); ok = false; } } ui->bOk->setEnabled(ok); } void NewProjectDialog::slotUpdateFrameSourceType() { ui->groupImages->setEnabled(ui->radioImages->isChecked()); ui->groupImages->setVisible(ui->radioImages->isChecked()); ui->groupVideo->setEnabled(ui->radioVideo->isChecked()); ui->groupVideo->setVisible(ui->radioVideo->isChecked()); slotUpdateButtonStates(); QSize prevSize = size(); adjustSize(); resize(prevSize.width(), size().height()); } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/flowExaminer.h0000664000000000000000000000150413151342440025465 0ustar rootroot#ifndef FLOWEXAMINER_H #define FLOWEXAMINER_H #include namespace Ui { class FlowExaminer; } class Project_sV; class FlowField_sV; class FlowExaminer : public QDialog { Q_OBJECT public: explicit FlowExaminer(Project_sV *project, QWidget *parent = 0); ~FlowExaminer(); void examine(int leftFrame); void loadFlow(); private: Ui::FlowExaminer *ui; Project_sV *m_project; FlowField_sV *m_flowLR; FlowField_sV *m_flowRL; // current frame int frame; // color flow amplification float m_boost; private slots: void slotMouseMoved(float x, float y); void updateFlow(); public slots: void newAmplification(int val); protected: void wheelEvent(QWheelEvent *); void keyPressEvent(QKeyEvent *event); signals: void frameChanged(); }; #endif // FLOWEXAMINER_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/preferencesDialog.h0000664000000000000000000000147613151342440026456 0ustar rootroot#ifndef PREFERENCESDIALOG_H #define PREFERENCESDIALOG_H #include #include #include #include "opencv2/opencv_modules.hpp" #ifdef HAVE_OPENCV_OCL #include "opencv2/ocl/ocl.hpp" #endif namespace Ui { class PreferencesDialog; } class PreferencesDialog : public QDialog { Q_OBJECT public: explicit PreferencesDialog(QWidget *parent = 0); ~PreferencesDialog(); protected slots: void accept(); private: Ui::PreferencesDialog *ui; QButtonGroup m_flowMethodGroup; QSettings m_settings; int isOCLsupported(); QList oclFillDevices(void); private slots: void slotValidateFlowBinary(); void slotUpdateFlowMethod(); void slotUpdateFfmpeg(); void slotBrowseFlow(); void slotBrowseFfmpeg(); }; #endif // PREFERENCESDIALOG_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/progressDialog.ui0000664000000000000000000000450113151342440026177 0ustar rootroot ProgressDialog 0 0 562 140 Progress 75 true Current Task 0 Task Description true Qt::Vertical 20 40 Qt::Horizontal 40 20 Abort Ok slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/shutterFunctionDialog.h0000664000000000000000000000271113151342440027352 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. 
Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef SHUTTERFUNCTIONDIALOG_H #define SHUTTERFUNCTIONDIALOG_H #include class Project_sV; class ShutterFunction_sV; namespace Ui { class ShutterFunctionDialog; } /** \brief Manages ShutterFunction_sV for the current curve */ class ShutterFunctionDialog : public QDialog { Q_OBJECT public: explicit ShutterFunctionDialog(Project_sV *project, QWidget *parent = 0); ~ShutterFunctionDialog(); void loadProject(Project_sV *project); void setSegment(int segment); public slots: void slotNodesUpdated(); protected slots: virtual void paintEvent(QPaintEvent *e); virtual void closeEvent(QCloseEvent *e); private: static QString emptyFunction; Ui::ShutterFunctionDialog *ui; Project_sV *m_project; ShutterFunction_sV *m_currentFunction; float m_dy; float m_t0; int m_segment; private slots: void slotUpdateNode(); void slotFunctionTextChanged(); void slotLoadSelectedFunction(); void slotAddFunction(); void slotRemoveFunction(); void slotNextSegment(); void slotPrevSegment(); }; #endif // SHUTTERFUNCTIONDIALOG_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/newProjectDialog.ui0000664000000000000000000002016213151342440026454 0ustar rootroot NewProjectDialog 0 0 698 697 New slowmo Project Browse Qt::Horizontal 40 20 Abort 0 0 Ok false Directory will be created true false Video source Browse Video information: 600 0 Qt::NoFocus true 75 true Input video Image source 75 true Input images Browse Images information: true Video file Images Qt::Horizontal 40 20 Qt::Vertical 20 40 75 true Project Directory 75 true Project Filename .sVproj You should preferredly use an empty directory here. Each project needs its own project directory. true The project will be saved to this file right after confirming this dialog. true projectDir browseProjectDir cbDirectoryCreated projectFilename radioVideo radioImages inputVideo browseInputVideo inputImages browseInputImages txtImageInfo bAbort bOk slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/renderingDialog.cpp0000664000000000000000000005501413151342440026462 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #include "config.h" #include "renderingDialog.h" #include "ui_renderingDialog.h" #include "lib/defs_sV.hpp" #include "project/motionBlur_sV.h" #include "project/project_sV.h" #include "project/projectPreferences_sV.h" #include "project/renderTask_sV.h" #include "project/imagesRenderTarget_sV.h" #include "project/abstractFlowSource_sV.h" #include "project/flowSourceOpenCV_sV.h" #include "project/flowSourceV3D_sV.h" #ifdef USE_FFMPEG #if 0 #include "project/new_videoRenderTarget.h" #else #include "project/exportVideoRenderTarget.h" #endif #else #include "project/videoRenderTarget_sV.h" #endif #include "project/emptyFrameSource_sV.h" #include #include #include // TODO: better #include #include RenderingDialog::RenderingDialog(Project_sV *project, QWidget *parent) : QDialog(parent), ui(new Ui::RenderingDialog), m_project(project) { ui->setupUi(this); // Render section m_sectionGroup = new QButtonGroup(this); m_sectionGroup->addButton(ui->radioFullProject); m_sectionGroup->addButton(ui->radioSection); m_sectionGroup->addButton(ui->radioTagSection); QString mode(m_project->preferences()->renderSectionMode()); if (mode == "full") { ui->radioFullProject->setChecked(true); } else if (mode == "expr") { ui->radioSection->setChecked(true); } else if (mode == "tags") { ui->radioTagSection->setChecked(true); } else { qDebug() << "Unknown render section mode: " << mode; Q_ASSERT(false); } // Optical flow ui->lambda->setValue(m_project->preferences()->flowV3DLambda()); QSettings settings; //TODO: better define in project ? ui->opticalFlowAlgo->clear(); QString flow_method = settings.value("preferences/flowMethod", "OpenCV-CPU").toString(); if (flow_method == "V3D") { ui->opticalFlowAlgo->addItem(tr("flowBuilder"), QVariant(1)); } else { ui->opticalFlowAlgo->addItem(tr("OpenCV - Farneback"), QVariant(2)); ui->opticalFlowAlgo->addItem(tr("OpenCV - Dual TVL1"), QVariant(3)); } connect(ui->opticalFlowAlgo, SIGNAL(activated(int)), ui->flowStackedWidget, SLOT(setCurrentIndex(int))); QWidget *flowbuilder_pane = ui->flowStackedWidget->widget(0); QWidget *farneback_pane = ui->flowStackedWidget->widget(1); QWidget *tvl1_pane = ui->flowStackedWidget->widget(2); if (flow_method == "V3D") { ui->opticalFlowAlgo->setCurrentIndex(0); ui->flowStackedWidget->setCurrentIndex(0); ui->flowStackedWidget->removeWidget(farneback_pane); ui->flowStackedWidget->removeWidget(tvl1_pane); } else { int algo = settings.value("preferences/preferredOpenCVAlgo", 0).toInt(); ui->opticalFlowAlgo->setCurrentIndex(algo); ui->flowStackedWidget->setCurrentIndex(algo+1); ui->flowStackedWidget->removeWidget(flowbuilder_pane); } connect(ui->clearflow, SIGNAL(clicked()), this, SLOT(slotClearFlowCache())); // Motion blur ui->maxSamples->setValue(m_project->motionBlur()->maxSamples()); ui->slowmoSamples->setValue(m_project->motionBlur()->slowmoSamples()); m_blurGroup = new QButtonGroup(this); m_blurGroup->addButton(ui->radioBlurConvolution); m_blurGroup->addButton(ui->radioBlurStacking); m_blurGroup->addButton(ui->radioBlurNearest); if (m_project->preferences()->renderMotionblurType() == MotionblurType_Convolving) { ui->radioBlurConvolution->setChecked(true); } else if (m_project->preferences()->renderMotionblurType() == MotionblurType_Stacking) { ui->radioBlurStacking->setChecked(true); } else { ui->radioBlurNearest->setChecked(true); } fillTagLists(); // Output target type m_targetGroup = new QButtonGroup(this); m_targetGroup->addButton(ui->radioImages); m_targetGroup->addButton(ui->radioVideo); if (m_project->preferences()->renderTarget() == 
"images") { ui->radioImages->setChecked(true); } else { ui->radioVideo->setChecked(true); } // Output target files ui->imagesOutputDir->setText(m_project->preferences()->imagesOutputDir()); ui->imagesFilenamePattern->setText(m_project->preferences()->imagesFilenamePattern()); ui->videoOutputFile->setText(m_project->preferences()->videoFilename()); ui->vcodec->setText(m_project->preferences()->videoCodec()); // FPS QString fps = QVariant(m_project->preferences()->renderFPS().fps()).toString(); if (ui->cbFps->findText(fps) < 0 && fps.toFloat() > 0) { ui->cbFps->addItem(fps); } ui->cbFps->setCurrentIndex(ui->cbFps->findText(fps)); // Output size ui->cbSize->addItem(tr("Original size"), QVariant(FrameSize_Orig)); ui->cbSize->addItem(tr("Small"), QVariant(FrameSize_Small)); ui->cbSize->setCurrentIndex(ui->cbSize->findData(QVariant(m_project->preferences()->renderFrameSize()))); // Interpolation type ui->cbInterpolation->addItem(toString(InterpolationType_Forward), QVariant(InterpolationType_Forward)); ui->cbInterpolation->addItem(toString(InterpolationType_ForwardNew), QVariant(InterpolationType_ForwardNew)); ui->cbInterpolation->addItem(toString(InterpolationType_Twoway), QVariant(InterpolationType_Twoway)); ui->cbInterpolation->addItem(toString(InterpolationType_TwowayNew), QVariant(InterpolationType_TwowayNew)); ui->cbInterpolation->addItem(toString(InterpolationType_Bezier), QVariant(InterpolationType_Bezier)); ui->cbInterpolation->addItem(toString(InterpolationType_None), QVariant(InterpolationType_None)); ui->cbInterpolation->addItem(toString(InterpolationType_Nearest), QVariant(InterpolationType_Nearest)); if (ui->cbInterpolation->findData(QVariant(m_project->preferences()->renderInterpolationType())) >= 0) { ui->cbInterpolation->setCurrentIndex(ui->cbInterpolation->findData(QVariant(m_project->preferences()->renderInterpolationType()))); } connect(m_targetGroup, SIGNAL(buttonClicked(int)), this, SLOT(slotUpdateRenderTarget())); connect(m_sectionGroup, SIGNAL(buttonClicked(int)), this, SLOT(slotSectionModeChanged())); connect(ui->timeStart, SIGNAL(textChanged(QString)), this, SLOT(slotValidate())); connect(ui->timeEnd, SIGNAL(textChanged(QString)), this, SLOT(slotValidate())); connect(ui->cbStartTag, SIGNAL(currentIndexChanged(int)), this, SLOT(slotTagIndexChanged())); connect(ui->cbEndTag, SIGNAL(currentIndexChanged(int)), this, SLOT(slotTagIndexChanged())); connect(ui->bAbort, SIGNAL(clicked()), this, SLOT(reject())); connect(ui->bOk, SIGNAL(clicked()), this, SLOT(accept())); connect(ui->bSave, SIGNAL(clicked()), this, SLOT(slotSaveSettings())); connect(ui->cbFps, SIGNAL(editTextChanged(QString)), this, SLOT(slotValidate())); connect(ui->imagesOutputDir, SIGNAL(textChanged(QString)), this, SLOT(slotValidate())); connect(ui->imagesFilenamePattern, SIGNAL(textChanged(QString)), this, SLOT(slotValidate())); connect(ui->videoOutputFile, SIGNAL(textChanged(QString)), this, SLOT(slotValidate())); connect(ui->bImagesBrowseDir, SIGNAL(clicked()), this, SLOT(slotBrowseImagesDir())); connect(ui->bBrowseVideoOutputFile, SIGNAL(clicked()), this, SLOT(slotBrowseVideoFile())); // Restore rendering start/end int index = ui->cbStartTag->findText(m_project->preferences()->renderStartTag()); if (index >= 0) { ui->cbStartTag->setCurrentIndex(index); } index = ui->cbEndTag->findText(m_project->preferences()->renderEndTag()); if (index >= 0) { ui->cbEndTag->setCurrentIndex(index); } if (m_project->preferences()->renderStartTime().length() > 0) { 
ui->timeStart->setText(m_project->preferences()->renderStartTime()); } if (m_project->preferences()->renderEndTime().length() > 0) { ui->timeEnd->setText(m_project->preferences()->renderEndTime()); } #if QT_VERSION >= 0x040700 ui->timeStart->setPlaceholderText(QVariant(m_project->nodes()->startTime()).toString()); ui->timeEnd->setPlaceholderText(QVariant(m_project->nodes()->endTime()).toString()); #endif #ifndef USE_QTKIT ui->use_qt->setChecked(false); ui->use_qt->setEnabled(false); #endif slotUpdateRenderTarget(); slotSectionModeChanged(); } RenderingDialog::~RenderingDialog() { delete m_targetGroup; delete ui; } RenderTask_sV* RenderingDialog::buildTask() { if (!slotValidate()) { return NULL; } slotSaveSettings(); ProjectPreferences_sV *prefs = m_project->preferences(); const QString imagesOutputDir = ui->imagesOutputDir->text(); const QString imagesFilenamePattern = ui->imagesFilenamePattern->text(); RenderTask_sV *task = new RenderTask_sV(m_project); task->renderPreferences().setFps(prefs->renderFPS()); task->renderPreferences().size = prefs->renderFrameSize(); task->renderPreferences().interpolation = prefs->renderInterpolationType(); task->renderPreferences().motionblur = prefs->renderMotionblurType(); if (ui->radioImages->isChecked()) { ImagesRenderTarget_sV *renderTarget = new ImagesRenderTarget_sV(task); renderTarget->setFilenamePattern(imagesFilenamePattern); renderTarget->setTargetDir(imagesOutputDir); task->setRenderTarget(renderTarget); } else if (ui->radioVideo->isChecked()) { #ifdef USE_FFMPEG #if 0 newVideoRenderTarget *renderTarget = new newVideoRenderTarget(task); #else exportVideoRenderTarget *renderTarget = new exportVideoRenderTarget(task); #endif const bool use_qt = ui->use_qt->isChecked(); if (!use_qt) { qDebug() << "using classical FFMPEG"; renderTarget->setQT(0); } #else #warning "should not use this" VideoRenderTarget_sV *renderTarget = new VideoRenderTarget_sV(task); #endif // check if file exist QFile filetest(ui->videoOutputFile->text()); if (filetest.exists()) { int r = QMessageBox::warning(this, tr("slowmoUI"), tr("The file already exist.\n" "Do you want to overwrite it ?"), QMessageBox::Yes | QMessageBox::No); if (r == QMessageBox::Yes) { filetest.remove(); } else { //TODO: maybe should delete task ? return 0; } } renderTarget->setTargetFile(ui->videoOutputFile->text()); renderTarget->setVcodec(ui->vcodec->text()); task->setRenderTarget(renderTarget); } else { qDebug() << "Render target is neither images nor video. 
Not implemented?"; Q_ASSERT(false); } if (ui->radioTagSection->isChecked()) { bool b; qreal start = ui->cbStartTag->itemData(ui->cbStartTag->currentIndex()).toFloat(&b); Q_ASSERT(b); qreal end = ui->cbEndTag->itemData(ui->cbEndTag->currentIndex()).toFloat(&b); Q_ASSERT(b); qDebug() << QString("Rendering tag section from %1 (%2) to %3 (%4)") .arg(ui->cbStartTag->currentText()) .arg(start).arg(ui->cbEndTag->currentText()).arg(end); Q_ASSERT(start <= end); task->setTimeRange(start, end); } else if (ui->radioSection->isChecked()) { qDebug() << QString("Rendering time section from %1 to %3") .arg(ui->cbStartTag->currentText()) .arg(ui->cbEndTag->currentText()); task->setTimeRange(ui->timeStart->text(), ui->timeEnd->text()); } QString mode; if (ui->radioFullProject->isChecked()) { mode = "full"; } else if (ui->radioSection->isChecked()) { mode = "time"; m_project->preferences()->renderStartTime() = ui->timeStart->text(); m_project->preferences()->renderEndTime() = ui->timeEnd->text(); } else if (ui->radioTagSection->isChecked()) { mode = "tags"; m_project->preferences()->renderStartTag() = ui->cbStartTag->currentText(); m_project->preferences()->renderEndTag() = ui->cbEndTag->currentText(); } else { qDebug() << "No section mode selected?"; Q_ASSERT(false); } // set optical flow parameters QSettings settings; QString flow_method = settings.value("preferences/flowMethod", "OpenCV-CPU").toString(); if (flow_method == "V3D") { AbstractFlowSource_sV *flow_algo = m_project->flowSource(); flow_algo->setLambda(prefs->flowV3DLambda()); } else if (flow_method == "OpenCV-CPU" || flow_method == "OpenCV-OCL") { int algo_index = ui->opticalFlowAlgo->currentIndex(); qDebug() << "algo index is " << algo_index; FlowSourceOpenCV_sV *flow_algo = (FlowSourceOpenCV_sV *)m_project->flowSource(); switch (algo_index) { case 0: flow_algo->setupOpticalFlow( ui->FarnLevels->value(), ui->FarnWin->value(), ui->FarnPoly->value(), ui->FarnPyr->value(), ui->FarnPolyN->value() ); break; case 1: flow_algo->setupTVL1( ui->TVLtau->value(), ui->TVLlambda->value(), ui->TVLnscales->value(), ui->TVLwarps->value(), ui->TVLiterations->value(), ui->TVLepsilon->value() ); break; default: qDebug() << "no algo defined"; } } else { throw Error_sV("Unsupported Flow method"); } return task; } void RenderingDialog::fillTagLists() { QList list; for (int i = 0; i < m_project->tags()->size(); i++) { if (m_project->tags()->at(i).axis() == TagAxis_Output && m_project->tags()->at(i).time() > m_project->nodes()->startTime() && m_project->tags()->at(i).time() < m_project->nodes()->endTime()) { list << m_project->tags()->at(i); } } qSort(list); ui->cbStartTag->addItem(tr(""), QVariant(m_project->nodes()->startTime())); for (int i = 0; i < list.size(); i++) { ui->cbStartTag->addItem(list.at(i).description(), QVariant(list.at(i).time())); ui->cbEndTag->addItem(list.at(i).description(), QVariant(list.at(i).time())); } ui->cbEndTag->addItem(tr(""), QVariant(m_project->nodes()->endTime())); } void RenderingDialog::slotSaveSettings() { qDebug() << "RenderingDialog::slotSaveSettings()"; const InterpolationType interpolation = (InterpolationType)ui->cbInterpolation->itemData(ui->cbInterpolation->currentIndex()).toInt(); const FrameSize size = (FrameSize)ui->cbSize->itemData(ui->cbSize->currentIndex()).toInt(); const QString imagesOutputDir = ui->imagesOutputDir->text(); const QString imagesFilenamePattern = ui->imagesFilenamePattern->text(); const float fps = ui->cbFps->currentText().toFloat(); const bool use_qt = ui->use_qt->isChecked(); 
m_project->motionBlur()->setMaxSamples(ui->maxSamples->value()); m_project->motionBlur()->setSlowmoSamples(ui->slowmoSamples->value()); m_project->preferences()->flowV3DLambda() = ui->lambda->value(); if (ui->radioBlurConvolution->isChecked()) { m_project->preferences()->renderMotionblurType() = MotionblurType_Convolving; } else if (ui->radioBlurStacking->isChecked()) { m_project->preferences()->renderMotionblurType() = MotionblurType_Stacking; } else { m_project->preferences()->renderMotionblurType() = MotionblurType_Nearest; } QString mode; if (ui->radioFullProject->isChecked()) { mode = "full"; } else if (ui->radioSection->isChecked()) { mode = "expr"; m_project->preferences()->renderStartTime() = ui->timeStart->text(); m_project->preferences()->renderEndTime() = ui->timeEnd->text(); } else if (ui->radioTagSection->isChecked()) { mode = "tags"; m_project->preferences()->renderStartTag() = ui->cbStartTag->currentText(); m_project->preferences()->renderEndTag() = ui->cbEndTag->currentText(); } else { qDebug() << "No section mode selected?"; Q_ASSERT(false); } m_project->preferences()->renderSectionMode() = mode; m_project->preferences()->imagesOutputDir() = imagesOutputDir; m_project->preferences()->imagesFilenamePattern() = imagesFilenamePattern; m_project->preferences()->videoFilename() = ui->videoOutputFile->text(); m_project->preferences()->videoCodec() = ui->vcodec->text(); m_project->preferences()->renderInterpolationType() = interpolation; m_project->preferences()->renderFrameSize() = size; m_project->preferences()->renderFPS() = fps; m_project->preferences()->renderTarget() = ui->radioImages->isChecked() ? "images" : "video"; m_project->preferences()->renderFormat() = use_qt; accept(); } bool RenderingDialog::slotValidate() { qDebug() << "RenderingDialog::slotValidate()"; bool ok = true; float fps = ui->cbFps->currentText().toFloat(&ok); ok &= fps > 0; if (ok) { ui->cbFps->setStyleSheet(QString("QComboBox { background-color: %1; }").arg(Colours_sV::colOk.name())); } else { ui->cbFps->setStyleSheet(QString("QComboBox { background-color: %1; }").arg(Colours_sV::colBad.name())); } if (ui->radioImages->isChecked()) { if (ui->imagesFilenamePattern->text().contains("%1")) { ui->imagesFilenamePattern->setStyleSheet(QString("QLineEdit { background-color: %1; }").arg(Colours_sV::colOk.name())); } else { ok = false; ui->imagesFilenamePattern->setStyleSheet(QString("QLineEdit { background-color: %1; }").arg(Colours_sV::colBad.name())); } if (ui->imagesOutputDir->text().length() > 0) { ui->imagesOutputDir->setStyleSheet(QString("QLineEdit { background-color: %1; }").arg(Colours_sV::colOk.name())); } else { ok = false; ui->imagesOutputDir->setStyleSheet(QString("QLineEdit { background-color: %1; }").arg(Colours_sV::colBad.name())); } } else if (ui->radioVideo->isChecked()) { if (ui->videoOutputFile->text().length() > 0) { ui->videoOutputFile->setStyleSheet(QString("QLineEdit { background-color: %1; }").arg(Colours_sV::colOk.name())); } else { ok = false; ui->videoOutputFile->setStyleSheet(QString("QLineEdit { background-color: %1; }").arg(Colours_sV::colBad.name())); } } else { Q_ASSERT(false); } if (ui->radioSection->isChecked()) { bool startOk = false; bool endOk = false; qreal timeStart = 0; qreal timeEnd = 0; QStringList messages; Fps_sV currentFps(ui->cbFps->currentText().toFloat()); try { timeStart = m_project->toOutTime(ui->timeStart->text(), currentFps); startOk = true; } catch (Error_sV &err) { messages << err.message(); } try { timeEnd = m_project->toOutTime(ui->timeEnd->text(), 
currentFps); endOk = true; } catch (Error_sV &err) { messages << err.message(); } if (timeEnd <= timeStart) { endOk = false; messages << tr("Start time must be < end time!"); } messages << tr("Rendering from %1 s to %2 s.").arg(timeStart).arg(timeEnd); ui->sectionMessage->setText(messages.join("\n")); ok &= startOk && endOk; if (!startOk) { ui->timeStart->setStyleSheet(QString("QLineEdit { background-color: %1; }").arg(Colours_sV::colBad.name())); } else { ui->timeStart->setStyleSheet(QString("QLineEdit { background-color: %1; }").arg(Colours_sV::colOk.name())); } if (!endOk) { ui->timeEnd->setStyleSheet(QString("QLineEdit { background-color: %1; }").arg(Colours_sV::colBad.name())); } else { ui->timeEnd->setStyleSheet(QString("QLineEdit { background-color: %1; }").arg(Colours_sV::colOk.name())); } } ok &= dynamic_cast(m_project->frameSource()) == NULL; ok &= m_project->nodes()->size() >= 2; ui->bOk->setEnabled(ok); return ok; } void RenderingDialog::slotUpdateRenderTarget() { ui->groupImages->setVisible(ui->radioImages->isChecked()); ui->groupVideo->setVisible(ui->radioVideo->isChecked()); slotValidate(); } void RenderingDialog::slotBrowseImagesDir() { QFileDialog dialog(this, tr("Output directory for rendered images")); dialog.setAcceptMode(QFileDialog::AcceptOpen); dialog.setFileMode(QFileDialog::Directory); dialog.setOption(QFileDialog::ShowDirsOnly, true); dialog.setDirectory(ui->imagesOutputDir->text()); if (dialog.exec() == QDialog::Accepted) { ui->imagesOutputDir->setText(dialog.selectedFiles().at(0)); } } void RenderingDialog::slotBrowseVideoFile() { QFileDialog dialog(this, tr("Output video file")); dialog.setAcceptMode(QFileDialog::AcceptSave); dialog.setFileMode(QFileDialog::AnyFile); dialog.setDirectory(QFileInfo(ui->videoOutputFile->text()).absolutePath()); if (dialog.exec() == QDialog::Accepted) { ui->videoOutputFile->setText(dialog.selectedFiles().at(0)); } } void RenderingDialog::slotSectionModeChanged() { ui->timeStart->setVisible(ui->radioSection->isChecked()); ui->timeEnd->setVisible(ui->radioSection->isChecked()); ui->sectionMessage->setVisible(ui->radioSection->isChecked()); ui->cbStartTag->setVisible(ui->radioTagSection->isChecked()); ui->cbEndTag->setVisible(ui->radioTagSection->isChecked()); ui->lblcTo->setVisible(ui->radioSection->isChecked() || ui->radioTagSection->isChecked()); slotValidate(); } void RenderingDialog::slotTagIndexChanged() { if (QObject::sender() == ui->cbStartTag) { qDebug() << "Start tag"; if (ui->cbEndTag->currentIndex() < ui->cbStartTag->currentIndex()) { ui->cbEndTag->setCurrentIndex(ui->cbStartTag->currentIndex()); } } else { qDebug() << "End tag"; if (ui->cbStartTag->currentIndex() > ui->cbEndTag->currentIndex()) { ui->cbStartTag->setCurrentIndex(ui->cbEndTag->currentIndex()); } } } #if 0 void MainWindow::comboBox_Activated() { std::cout << "Activated " << this->ui.comboBox->currentIndex() << std::endl; } #endif void RenderingDialog::slotClearFlowCache() { m_project->flowSource()->clearFlowCache(); } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/projectPreferencesDialog.h0000664000000000000000000000111313151342440027771 0ustar rootroot#ifndef PROJECTPREFERENCESDIALOG_H #define PROJECTPREFERENCESDIALOG_H #include #include "project/projectPreferences_sV.h" namespace Ui { class ProjectPreferencesDialog; } class ProjectPreferencesDialog : public QDialog { Q_OBJECT public: explicit ProjectPreferencesDialog(ProjectPreferences_sV *prefs, QWidget *parent = 0); ~ProjectPreferencesDialog(); protected: void accept(); private: 
Ui::ProjectPreferencesDialog *ui; ProjectPreferences_sV *m_projectPrefs; private slots: void slotCheckFPS(); }; #endif // PROJECTPREFERENCESDIALOG_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/shutterFunctionDialog.ui0000664000000000000000000002140313151342440027537 0ustar rootroot ShutterFunctionDialog 0 0 868 575 Shutter Functions < Segment %1 > Qt::Horizontal 40 20 150 0 + - false 200 0 Monospace 50 false shutterFunc1 Qt::Horizontal QSizePolicy::Minimum 10 20 Used: %1 times Qt::Horizontal 40 20 Qt::Horizontal Monospace 8 // header (function foo(args...) { Qt::AlignBottom|Qt::AlignLeading|Qt::AlignLeft true Qt::Vertical QSizePolicy::Minimum 20 130 Monospace true return 0; Monospace 8 // footer }) true <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd"> <html><head><meta name="qrichtext" content="1" /><style type="text/css"> p, li { white-space: pre-wrap; } </style></head><body style=" font-family:'Bitstream Vera Sans'; font-size:10pt; font-weight:400; font-style:normal;"> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Math functions are available in the Math namespace:</p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" font-family:'Monospace';">Math.PI, Math.cos(...), Math.pow(base. exponent) etc.</span></p> </body></html> Qt::AlignLeading|Qt::AlignLeft|Qt::AlignTop true 0 0 200 0 QFrame::StyledPanel QFrame::Plain 0 Qt::Horizontal 40 20 Close ShutterFunctionFrame QFrame
shutterFunctionFrame.h
1
slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/flowExaminer.ui0000664000000000000000000000716013151342440025657 0ustar rootroot FlowExaminer 0 0 918 603 Dialog 0 0 QFrame::StyledPanel QFrame::Raised QFrame::StyledPanel QFrame::Raised 0 0 QFrame::StyledPanel QFrame::Raised QFrame::StyledPanel QFrame::Raised Amplification : 40 10 Qt::Horizontal QSlider::TicksBelow 1 Qt::Horizontal 40 0 Close ImageDisplay QFrame
libgui/imageDisplay.h
1
slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/shutterFunctionFrame.cpp0000664000000000000000000000476613151342440027554 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "shutterFunctionFrame.h" #include "lib/defs_sV.hpp" #include "../canvas.h" #include #include #include ShutterFunctionFrame::ShutterFunctionFrame(QWidget *parent, Qt::WindowFlags f) : QFrame(parent, f), m_frameHeight(100) { } void ShutterFunctionFrame::updateValues(float y, float dy) { m_y = y; m_dy = dy; } void ShutterFunctionFrame::paintEvent(QPaintEvent *e) { qDebug() << "Repainting shutter curve"; QFrame::paintEvent(e); int x, y; QPainter p(this); p.fillRect(rect(), Canvas::backgroundCol); y = m_frameHeight; while (y < height()) { p.setPen(Canvas::gridCol); if ((y % m_frameHeight) % 10 == 0) { p.setPen(Canvas::fatGridCol); } p.drawLine(0, height()-1 - y, width()-1, height()-1 - y); y += m_frameHeight; } p.setPen(Canvas::lineCol); float t; for (x = 0; x < width(); x++) { t = float(x)/width(); y = height()-1 - m_frameHeight*m_function.evaluate(t, t, 24, m_y, m_dy); p.drawPoint(x, y); } } void ShutterFunctionFrame::wheelEvent(QWheelEvent *e) { if (e->delta() > 0) { int old = m_frameHeight; m_frameHeight *= 1.4; if (m_frameHeight == old) { m_frameHeight++; } } else { m_frameHeight /= 1.4; if (m_frameHeight < 1) { m_frameHeight = 1; } } repaint(); } void ShutterFunctionFrame::slotDisplayFunction(const QString &function) { m_function.updateFunction(function); float max = qMax(m_function.evaluate(0, 0, 1.0/24, m_y, m_dy), qMax( m_function.evaluate(.5, .5, 1.0/24, m_y, m_dy), m_function.evaluate(1, 1, 1.0/24, m_y, m_dy))); if (max > 50) { max = 50; } if (max > .1) { while (m_frameHeight*max > height()) { m_frameHeight /= 1.4; } while (m_frameHeight*max < height()/5) { m_frameHeight *= 1.4; } while (m_frameHeight > height()) { m_frameHeight /= 1.4; } } else { max = .5; } repaint(); } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/projectPreferencesDialog.cpp0000664000000000000000000000244413151342440030334 0ustar rootroot#include "projectPreferencesDialog.h" #include "ui_projectPreferencesDialog.h" #include "lib/defs_sV.hpp" ProjectPreferencesDialog::ProjectPreferencesDialog(ProjectPreferences_sV *prefs, QWidget *parent) : QDialog(parent), ui(new Ui::ProjectPreferencesDialog), m_projectPrefs(prefs) { ui->setupUi(this); ui->canvas_xAxisFPS->setText(m_projectPrefs->canvas_xAxisFPS().toString()); connect(ui->canvas_xAxisFPS, SIGNAL(textChanged(QString)), this, SLOT(slotCheckFPS())); } ProjectPreferencesDialog::~ProjectPreferencesDialog() { delete ui; } void ProjectPreferencesDialog::accept() { try { Fps_sV fps(ui->canvas_xAxisFPS->text()); m_projectPrefs->canvas_xAxisFPS().num = fps.num; m_projectPrefs->canvas_xAxisFPS().den = fps.den; } catch (Error_sV &err) {} QDialog::accept(); } void ProjectPreferencesDialog::slotCheckFPS() { try { Fps_sV fps(ui->canvas_xAxisFPS->text()); ui->canvas_xAxisFPS->setStyleSheet(QString("QLineEdit { background-color: %1; }").arg(Colours_sV::colOk.name())); fps.den = fps.num; // Just to hopefully avoid the fps being optimized out } catch (Error_sV &err) { ui->canvas_xAxisFPS->setStyleSheet(QString("QLineEdit { background-color: %1; }").arg(Colours_sV::colBad.name())); } } 
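/*
 * Minimal sketch (not part of the original sources) of the input-validation pattern
 * used by the dialog above: construct the value object inside a try/catch and give
 * visual feedback by re-colouring the widget. Fps_sV, Error_sV and Colours_sV are the
 * project's own types as used above; the function name validateFpsField and its
 * QLineEdit* parameter are hypothetical and exist only for illustration. Guarded with
 * #if 0 so it is never compiled.
 */
#if 0
static bool validateFpsField(QLineEdit *edit)
{
    try {
        // Fps_sV throws Error_sV when the text is not a valid frame rate
        Fps_sV fps(edit->text());
        Q_UNUSED(fps);
        edit->setStyleSheet(QString("QLineEdit { background-color: %1; }")
                                .arg(Colours_sV::colOk.name()));
        return true;
    } catch (Error_sV &) {
        edit->setStyleSheet(QString("QLineEdit { background-color: %1; }")
                                .arg(Colours_sV::colBad.name()));
        return false;
    }
}
#endif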
slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/renderingDialog.ui0000664000000000000000000013722213151342440026317 0ustar rootroot RenderingDialog 0 0 762 593 Rendering settings Qt::Vertical QSizePolicy::Fixed 20 10 0 a a Rendering settings Fu&ll Project true &Tag section Custom sect&ion Qt::Horizontal QSizePolicy::Fixed 13 20 0 0 100 0 16 to 0 0 100 0 16 Qt::Horizontal 40 20 Qt::Horizontal QSizePolicy::Fixed 30 20 true Frames per second: 0 0 100 0 100 0 true 0 23.976 24 25 29.976 30 50 60 72 Qt::Horizontal 40 20 Size: Qt::Horizontal QSizePolicy::Minimum 10 20 Interpolation: Qt::Horizontal 40 20 8 For two frames A and B, the two-way interpolations calculate both the flows A→B and B→A, which leads to smoother transitions between them. Forward interpolations only calculate A→B; Twice as fast, but usually less smooth. true Qt::Vertical 20 40 Optical Flow Optical flow Optical Flow Algorithm : Qt::Horizontal 40 20 0 0 0 0 20 472 36 buildFlow lambda 5.000000000000000 50.000000000000000 1.000000000000000 20.000000000000000 Qt::Horizontal QSizePolicy::Fixed 10 20 0 0 Use a higher value for high-quality footage and larger images. true 0 70 662 101 8 The lambda is only used with the GPU based Optical Flow algorithm. There is no general rule which value is best, so it is usually a good idea to render a short part with a low (5) and a high (50) lambda to see the differences, and then try to find the best value between. true 0 223 10 10 231 212 QLayout::SetMinAndMaxSize 2 5 <html><head/><body><p>number of pyramid layers including the initial image; `levels=1` means that no extra layers are created and only the original images are used</p></body></html> Levels: <html><head/><body><p>number of pyramid layers including the initial image; `levels=1` means that no extra layers are created and only the original images are used</p></body></html> 0 3.000000000000000 <html><head/><body><p>Averaging window size; larger values increase the algorithm robustness to image noise and give more chances for fast motion detection, but yield more blurred motion field.</p></body></html> WinSize: <html><head/><body><p>Averaging window size; larger values increase the algorithm robustness to image noise and give more chances for fast motion detection, but yield more blurred motion field.</p></body></html> 0 15.000000000000000 <html><head/><body><p>Standard deviation of the Gaussian that is used to smooth derivatives used as a basis for the polynomial expansion; for `poly_n=5`, you can set `poly_sigma=1.1`, for `poly_n=7`, a good value would be `poly_sigma=1.5`.</p></body></html> PolySigma: <html><head/><body><p>Standard deviation of the Gaussian that is used to smooth derivatives used as a basis for the polynomial expansion; for `poly_n=5`, you can set `poly_sigma=1.1`, for `poly_n=7`, a good value would be `poly_sigma=1.5`.</p></body></html> 0.010000000000000 1.200000000000000 <html><head/><body><p>Parameter, specifying the image scale (&lt;1) to build pyramids for each image; `pyr_scale=0.5` means a classical pyramid, where each next layer is twice smaller than the previous one.</p></body></html> Pyr_Scale: <html><head/><body><p>Size of the pixel neighborhood used to find polynomial expansion in each pixel; larger values mean that the image will be approximated with smoother surfaces, yielding more robust algorithm and more blurred motion field, typically `poly_n` = 5 or 7.</p></body></html> PolyN: <html><head/><body><p>Size of the pixel neighborhood used to find polynomial expansion in each pixel; larger 
values mean that the image will be approximated with smoother surfaces, yielding more robust algorithm and more blurred motion field, typically `poly_n` = 5 or 7.</p></body></html> 0 5.000000000000000 <html><head/><body><p>Parameter, specifying the image scale (&lt;1) to build pyramids for each image; `pyr_scale=0.5` means a classical pyramid, where each next layer is twice smaller than the previous one.</p></body></html> 0.010000000000000 0.500000000000000 0 229 10 10 261 218 QFormLayout::AllNonFixedFieldsGrow 2 4 <html><head/><body><p><span style=" font-family:'sans-serif'; font-size:16px; color:#000000; background-color:#ffffff;">Time step of the numerical scheme</span></p></body></html> Tau: <html><head/><body><p><span style=" font-family:'sans-serif'; font-size:16px; color:#000000; background-color:#ffffff;">Time step of the numerical scheme</span></p></body></html> 0.010000000000000 0.250000000000000 <html><head/><body><p><span style=" font-family:'sans-serif'; font-size:16px; color:#000000; background-color:#ffffff;">Weight parameter for the data term, attachment parameter. This is the most relevant parameter, which determines the smoothness of the output. The smaller this parameter is, the smoother the solutions we obtain. It depends on the range of motions of the images, so its value should be adapted to each image sequence</span> Lambda: <html><head/><body><p><span style=" font-family:'sans-serif'; font-size:16px; color:#000000; background-color:#ffffff;">Weight parameter for the data term, attachment parameter. This is the most relevant parameter, which determines the smoothness of the output. The smaller this parameter is, the smoother the solutions we obtain. It depends on the range of motions of the images, so its value should be adapted to each image sequence</span> 0.010000000000000 0.150000000000000 <html><head/><body><p><span style=" font-family:'sans-serif'; font-size:16px; color:#000000; background-color:#ffffff;">Number of warpings per scale. Represents the number of times that I1(x+u0) and grad( I1(x+u0) ) are computed per scale. This is a parameter that assures the stability of the method. It also affects the running time, so it is a compromise between speed and accuracy</span> Warps: <html><head/><body><p><span style=" font-family:'sans-serif'; font-size:16px; color:#000000; background-color:#ffffff;">Number of warpings per scale. Represents the number of times that I1(x+u0) and grad( I1(x+u0) ) are computed per scale. This is a parameter that assures the stability of the method. 
It also affects the running time, so it is a compromise between speed and accuracy</span> 0 10.000000000000000 <html><head/><body><p><span style=" font-family:'sans-serif'; font-size:16px; color:#000000; background-color:#ffffff;">Number of scales used to create the pyramid of images</span> Nscales: <html><head/><body><p><span style=" font-family:'sans-serif'; font-size:16px; color:#000000; background-color:#ffffff;">Number of scales used to create the pyramid of images</span> 0 5.000000000000000 <html><head/><body><p><span style=" font-family:'sans-serif'; font-size:16px; color:#000000; background-color:#ffffff;">Stopping criterion iterations number used in the numerical scheme</span> 0 400.000000000000000 100.000000000000000 <html><head/><body><p><span style=" font-family:'sans-serif'; font-size:16px; color:#000000; background-color:#ffffff;">Stopping criterion iterations number used in the numerical scheme</span> Iterations: <html><head/><body><p><span style=" font-family:'sans-serif'; font-size:16px; color:#000000; background-color:#ffffff;">Stopping criterion threshold used in the numerical scheme, which is a trade-off between precision and running time. A small value will yield more accurate solutions at the expense of a slower convergence</span> Epsilon: <html><head/><body><p><span style=" font-family:'sans-serif'; font-size:16px; color:#000000; background-color:#ffffff;">Stopping criterion threshold used in the numerical scheme, which is a trade-off between precision and running time. A small value will yield more accurate solutions at the expense of a slower convergence</span> 0.010000000000000 0.010000000000000 Clear Flow Cache Qt::Horizontal 40 20 Qt::Vertical 20 40 Motion Blur Motion blur Motion blur will only be applied for segments on which it is enabled. true Stacking &blur (Uses more disk space, but is faster for repeated rendering.) Maximum samples 1 64 Qt::Horizontal QSizePolicy::Fixed 10 20 Samples for slow motion 1 16 Qt::Horizontal 40 20 Convolution blur (Smoother &than stacking, usually the better choice.) Nearest (no b&lurring) Qt::Vertical 20 40 Output Target: true &Video &Images true Qt::Horizontal 40 20 Images The %1 in the filename pattern is mandatory and will be replaced by the frame number. Output directory Browse Filename pattern rendered-%1.jpg 0 0 Video Videos will be encoded with ffmpeg. If additional arguments are left empty, defaults will be used. The video format is determined by ffmpeg according to the file suffix. true Output file Browse Optional arguments Use Quicktime true vcodec Qt::Vertical 20 40 Qt::Horizontal 40 20 Will *not* save the project! 
&Save settings &Abort &Ok radioFullProject cbFps cbSize cbInterpolation radioVideo radioImages imagesOutputDir bImagesBrowseDir imagesFilenamePattern videoOutputFile bBrowseVideoOutputFile vcodec bAbort bOk slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/progressDialog.h0000664000000000000000000000140313151342440026007 0ustar rootroot#ifndef PROGRESSDIALOG_H #define PROGRESSDIALOG_H #include namespace Ui { class ProgressDialog; } /// \todo Close when finished (checkbox) class ProgressDialog : public QDialog { Q_OBJECT public: explicit ProgressDialog(QWidget *parent = 0); ~ProgressDialog(); public slots: void slotNextTask(const QString taskDescription, int taskSize); void slotTaskProgress(int progress); void slotTaskItemDescription(const QString desc); void slotAllTasksFinished(const QString& timePassed = ""); void slotAborted(const QString& message = ""); signals: void signalAbortTask(); private: Ui::ProgressDialog *ui; void setWorking(bool working); private slots: void slotAbortPressed(); }; #endif // PROGRESSDIALOG_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/aboutDialog.cpp0000664000000000000000000000213113151342440025607 0ustar rootroot/* slowmoUI is a user interface for slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "aboutDialog.h" #include "ui_aboutDialog.h" #include "lib/defs_sV.hpp" #include #include AboutDialog::AboutDialog(QWidget *parent) : QDialog(parent), ui(new Ui::AboutDialog), m_appIcon(":icons/slowmoIcon.png") { ui->setupUi(this); ui->lblVersion->setText(tr("Version %1, %2").arg(Version_sV::version).arg(Version_sV::bits)); } AboutDialog::~AboutDialog() { delete ui; } void AboutDialog::keyPressEvent(QKeyEvent *) { accept(); } void AboutDialog::paintEvent(QPaintEvent *e) { QDialog::paintEvent(e); QImage img = m_appIcon.scaled(ui->iconFrame->size(), Qt::KeepAspectRatio, Qt::SmoothTransformation); QPainter p(this); p.drawImage(ui->iconFrame->pos(), img); } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/shutterFunctionFrame.h0000664000000000000000000000176713151342440027217 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #ifndef SHUTTERFUNCTIONFRAME_H #define SHUTTERFUNCTIONFRAME_H #include "project/shutterFunction_sV.h" #include /** \brief Renders a ShutterFunction_sV */ class ShutterFunctionFrame : public QFrame { Q_OBJECT public: ShutterFunctionFrame(QWidget * parent = 0, Qt::WindowFlags f = 0); void updateValues(float y, float dy); public slots: void slotDisplayFunction(const QString &function); protected slots: virtual void paintEvent(QPaintEvent *e); virtual void wheelEvent(QWheelEvent *e); private: ShutterFunction_sV m_function; int m_frameHeight; float m_y; float m_dy; }; #endif // SHUTTERFUNCTIONFRAME_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/aboutDialog.ui0000664000000000000000000002010313151342440025441 0ustar rootroot AboutDialog 0 0 903 354 About Gentium Basic 19 75 true About slowmoVideo 0 0 <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd"> <html><head><meta name="qrichtext" content="1" /><style type="text/css"> p, li { white-space: pre-wrap; } </style></head><body style=" font-family:'Bitstream Vera Sans'; font-size:10pt; font-weight:400; font-style:normal;"> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" font-style:italic;">slowmoVideo</span> is developed by Simon A. Eugster (aka. Granjow, co-author of Kdenlive) and licensed under GPLv3.</p> <p style="-qt-paragraph-type:empty; margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><br /></p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><span style=" font-style:italic;">slowmoVideo</span> allows to change the speed of a video clip based upon a curve. If the speed becomes higher than 1×, an exposure (shutter) effect simulates motion blur. 
For lower speed, frames are interpolated with optical flow.</p> <p style="-qt-paragraph-type:empty; margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><br /></p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Thanks for contributing:</p> <ul style="margin-top: 0px; margin-bottom: 0px; margin-left: 0px; margin-right: 0px; -qt-list-indent: 1;"><li style=" margin-top:12px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Mirko Götze <span style=" font-style:italic;">&lt;mail@mgo80.de&gt;</span> for converting <span style=" font-weight:600;">Cg to GLSL</span> (Removing the nVidia dependency)</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Morten Sylvest Olsen <span style=" font-style:italic;">&lt;mso@kapowsoftware.com&gt;</span> for the <span style=" font-weight:600;">V3D speedup</span> and removing the unnecessary window</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Elias Vanderstuyft for displaying the <span style=" font-weight:600;">shutter function</span> on the canvas</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Christian Frisson <span style=" font-style:italic;">&lt;christian.frisson@umons.ac.be&gt;</span> for<span style=" font-weight:600;"> OpenCV on MXE</span> (allowed me to compile slowmoVideo for Windows)</li> <li style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Per <span style=" font-style:italic;">&lt;per@stuffmatic.com&gt;</span> for the<span style=" font-weight:600;"> OpenCV</span> code (slowmoVideo can run on CPU only with it)</li></ul> <p style="-qt-paragraph-type:empty; margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;"><br /></p> <p style=" margin-top:0px; margin-bottom:0px; margin-left:0px; margin-right:0px; -qt-block-indent:0; text-indent:0px;">Visit <a href="http://slowmoVideo.granjow.net"><span style=" text-decoration: underline; color:#0057ae;">slowmoVideo.granjow.net</span></a> for more information.</p></body></html> Qt::RichText true Qt::Horizontal 40 20 Qt::Vertical QSizePolicy::Fixed 20 20 Qt::Vertical QSizePolicy::Fixed 20 10 8 version number goes here. 8 (c) 2012 Simon A. Eugster Qt::AlignRight|Qt::AlignTrailing|Qt::AlignVCenter QFrame::NoFrame QFrame::Raised Qt::Vertical 20 40 slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/shutterFunctionDialog.cpp0000664000000000000000000001620613151342440027711 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #include "shutterFunctionDialog.h" #include "ui_shutterFunctionDialog.h" #include "project/shutterFunction_sV.h" #include "project/shutterFunctionList_sV.h" #include "project/project_sV.h" #include QString ShutterFunctionDialog::emptyFunction(""); /// \todo Icons ShutterFunctionDialog::ShutterFunctionDialog(Project_sV *project, QWidget *parent) : QDialog(parent), ui(new Ui::ShutterFunctionDialog), m_currentFunction(NULL) { ui->setupUi(this); ui->lblcHeader->setText(ShutterFunction_sV::templateHeader); ui->lblcFooter->setText(ShutterFunction_sV::templateFooter); connect(ui->bClose, SIGNAL(clicked()), this, SLOT(accept())); connect(ui->cbFunction, SIGNAL(currentIndexChanged(int)), this, SLOT(slotUpdateNode())); connect(ui->cbFunction, SIGNAL(currentIndexChanged(int)), this, SLOT(slotLoadSelectedFunction())); connect(ui->function, SIGNAL(textChanged()), this, SLOT(slotFunctionTextChanged())); connect(ui->bAdd, SIGNAL(clicked()), this, SLOT(slotAddFunction())); connect(ui->bRemove, SIGNAL(clicked()), this, SLOT(slotRemoveFunction())); connect(ui->bPrevSegment, SIGNAL(clicked()), this, SLOT(slotPrevSegment())); connect(ui->bNextSegment, SIGNAL(clicked()), this, SLOT(slotNextSegment())); loadProject(project); slotLoadSelectedFunction(); } ShutterFunctionDialog::~ShutterFunctionDialog() { delete ui; } void ShutterFunctionDialog::loadProject(Project_sV *project) { qDebug() << "loadProject();"; m_project = project; ui->cbFunction->blockSignals(true); ui->cbFunction->clear(); ui->cbFunction->addItem(emptyFunction); for (int i = 0; i < project->shutterFunctions()->size(); i++) { ui->cbFunction->addItem(m_project->shutterFunctions()->at(i)->id()); } ui->cbFunction->blockSignals(false); setSegment(0); } void ShutterFunctionDialog::paintEvent(QPaintEvent *e) { QDialog::paintEvent(e); QPainter p(this); QImage img(":images/shutterFunc.png"); p.drawImage(ui->verticalLayout_code->contentsRect().topRight() - QPoint(img.width()+10, -10), img); } void ShutterFunctionDialog::closeEvent(QCloseEvent *e) { m_project->nodes()->segments()->unselectAll(); parentWidget()->repaint(); QDialog::closeEvent(e); } void ShutterFunctionDialog::slotNodesUpdated() { qDebug() << "slotNodesUpdated();"; if (m_segment+1 >= m_project->nodes()->size() && m_project->nodes()->size() >= 2) { setSegment(m_project->nodes()->size()-2); } else { setSegment(m_segment); } } void ShutterFunctionDialog::setSegment(int segment) { qDebug() << "setSegment(" << segment << ");"; m_segment = segment; ui->lblSegmentNumber->setText(tr("Segment %1 (total number: %2)") .arg(m_segment).arg(m_project->nodes()->size()-1)); // Enable/disable buttons ui->bPrevSegment->setEnabled(m_segment > 0); ui->bNextSegment->setEnabled(m_segment+1 < m_project->nodes()->size()-1); // Update the curve parameters for this segment if (m_project->nodes()->size() >= 2) { const Node_sV *leftNode = &m_project->nodes()->at(m_segment); const Node_sV *rightNode = &m_project->nodes()->at(m_segment+1); ui->shutterCurve->updateValues( leftNode->y(), 1.0/24 * (rightNode->y()-leftNode->y()) / (rightNode->x()-leftNode->x()) // dy = dx * /\y / /\x ); Q_ASSERT(m_segment+1 < m_project->nodes()->size()); } else { qDebug() << "Less than 2 nodes!"; } // Select function in the dropdown if (m_project->nodes()->size() >= 2) { const Node_sV *node = &m_project->nodes()->at(m_segment); QString id = node->shutterFunctionID(); qDebug() << "Shutter function ID of node " << node << " is " << id; if (id.length() == 0) { ui->cbFunction->setCurrentIndex(ui->cbFunction->findText(emptyFunction)); } else { 
int pos = ui->cbFunction->findText(id); Q_ASSERT(pos >= 0); ui->cbFunction->setCurrentIndex(pos); } } m_project->nodes()->segments()->unselectAll(); (*m_project->nodes()->segments())[m_segment].select(); parentWidget()->repaint(); Q_ASSERT(m_segment >= 0); } void ShutterFunctionDialog::slotUpdateNode() { qDebug() << "slotUpdateNode();"; if (m_project->nodes()->size() >= 2) { Q_ASSERT(m_segment+1 < m_project->nodes()->size()); QString id = ui->cbFunction->currentText(); if (id == emptyFunction) { id = ""; } Node_sV *node = &(*m_project->nodes())[m_segment]; node->setShutterFunctionID(id); qDebug() << "Shutter function ID of node " << node << "set to " << id; } } void ShutterFunctionDialog::slotFunctionTextChanged() { qDebug() << "slotUpdateFunctionCode();"; if (m_currentFunction != NULL) { ui->function->setEnabled(true); m_currentFunction->updateFunction(ui->function->toPlainText()); ui->shutterCurve->slotDisplayFunction(ui->function->toPlainText()); } else { ui->function->setEnabled(false); ui->shutterCurve->slotDisplayFunction("return 0;"); } } void ShutterFunctionDialog::slotLoadSelectedFunction() { QString id = ui->cbFunction->currentText(); if (id == emptyFunction) { id = ""; m_currentFunction = NULL; ui->function->setPlainText("return 0;"); } else { m_currentFunction = m_project->shutterFunctions()->function(id); ui->function->setPlainText(m_currentFunction->function()); } int count = 0; for (int i = 0; i < m_project->nodes()->size(); i++) { if (m_project->nodes()->at(i).shutterFunctionID() == id) { count++; } } ui->lblUsage->setText(tr("Used: %1 times").arg(count)); } void ShutterFunctionDialog::slotAddFunction() { ShutterFunction_sV *fun = m_project->shutterFunctions()->addFunction(ShutterFunction_sV(), true); if (fun != NULL) { ui->cbFunction->addItem(fun->id()); ui->cbFunction->setCurrentIndex(ui->cbFunction->findText(fun->id())); } else { qDebug() << "Could not add new function."; Q_ASSERT(false); } } void ShutterFunctionDialog::slotRemoveFunction() { if (ui->cbFunction->currentText() != emptyFunction) { bool ok = m_project->shutterFunctions()->removeFunction(ui->cbFunction->currentText()); int index = ui->cbFunction->currentIndex(); ui->cbFunction->setCurrentIndex(ui->cbFunction->findText(emptyFunction)); ui->cbFunction->removeItem(index); Q_ASSERT(ok); } } void ShutterFunctionDialog::slotPrevSegment() { qDebug() << "slotPrevSegment();"; setSegment(m_segment-1); } void ShutterFunctionDialog::slotNextSegment() { qDebug() << "slotNextSegment();"; setSegment(m_segment+1); } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/aboutDialog.h0000664000000000000000000000141613151342440025261 0ustar rootroot/* slowmoUI is a user interface for slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #ifndef ABOUTDIALOG_H #define ABOUTDIALOG_H #include namespace Ui { class AboutDialog; } class AboutDialog : public QDialog { Q_OBJECT public: explicit AboutDialog(QWidget *parent = 0); ~AboutDialog(); protected: virtual void keyPressEvent(QKeyEvent *); virtual void paintEvent(QPaintEvent *e); private: Ui::AboutDialog *ui; QImage m_appIcon; }; #endif // ABOUTDIALOG_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/tagAddDialog.cpp0000664000000000000000000000432113151342440025664 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "tagAddDialog.h" #include "ui_tagAddDialog.h" #include TagAddDialog::TagAddDialog(TagAxis defaultAxis, QWidget *parent) : QDialog(parent), ui(new Ui::TagAddDialog), m_axis(defaultAxis) { ui->setupUi(this); ui->bOk->setEnabled(false); connect(ui->bAbort, SIGNAL(clicked()), this, SLOT(reject())); connect(ui->bOk, SIGNAL(clicked()), this, SLOT(accept())); connect(ui->tag, SIGNAL(textChanged(QString)), this, SLOT(slotTextChanged(QString))); connect(ui->tag, SIGNAL(returnPressed()), ui->bOk, SLOT(click())); slotUpdateAxis(); } TagAddDialog::~TagAddDialog() { delete ui; } Tag_sV TagAddDialog::buildTag(QPointF time) { if (m_axis == TagAxis_Source) { return Tag_sV(time.y(), ui->tag->text(), m_axis); } else { return Tag_sV(time.x(), ui->tag->text(), m_axis); } } void TagAddDialog::keyPressEvent(QKeyEvent *e) { if (e->key() == Qt::Key_Up) { m_axis = TagAxis_Source; slotUpdateAxis(); } else if (e->key() == Qt::Key_Down) { m_axis = TagAxis_Output; slotUpdateAxis(); } else { QDialog::keyPressEvent(e); } } void TagAddDialog::slotTextChanged(const QString &text) { if (text.length() == 0) { ui->bOk->setEnabled(false); } else { ui->bOk->setEnabled(true); m_text = text; } } void TagAddDialog::slotUpdateAxis() { QSizePolicy::Policy upperPolicy = QSizePolicy::Fixed; QSizePolicy::Policy lowerPolicy = QSizePolicy::Expanding; if (m_axis == TagAxis_Output) { upperPolicy = QSizePolicy::Expanding; lowerPolicy = QSizePolicy::Fixed; } ui->verticalUpperSpacer->changeSize(0, 0, QSizePolicy::Minimum, upperPolicy); ui->verticalLowerSpacer->changeSize(0, 0, QSizePolicy::Minimum, lowerPolicy); ui->verticalLayout_2->invalidate(); repaint(); updateGeometry(); } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/renderingDialog.h0000664000000000000000000000252613151342440026127 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #ifndef RENDERINGDIALOG_H #define RENDERINGDIALOG_H #include namespace Ui { class RenderingDialog; } class QButtonGroup; class RenderTask_sV; class Project_sV; /** \brief Dialog for rendering option */ class RenderingDialog : public QDialog { Q_OBJECT public: explicit RenderingDialog(Project_sV *project, QWidget *parent = 0); ~RenderingDialog(); /** \return \c NULL on invalid input, a render task for the given project otherwise */ RenderTask_sV* buildTask(); public slots: bool slotValidate(); private: Ui::RenderingDialog *ui; Project_sV *m_project; QButtonGroup *m_targetGroup; QButtonGroup *m_sectionGroup; QButtonGroup *m_blurGroup; void fillTagLists(); private slots: void slotBrowseImagesDir(); void slotBrowseVideoFile(); void slotUpdateRenderTarget(); void slotSectionModeChanged(); void slotTagIndexChanged(); void slotSaveSettings(); void slotClearFlowCache(); }; #endif // RENDERINGDIALOG_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/flowExaminer.cpp0000664000000000000000000001202213151342440026015 0ustar rootroot#include "flowExaminer.h" #include "ui_flowExaminer.h" #include "project/project_sV.h" #include "project/abstractFrameSource_sV.h" #include "lib/flowVisualization_sV.h" #include #include #include FlowExaminer::FlowExaminer(Project_sV *project, QWidget *parent) : QDialog(parent), ui(new Ui::FlowExaminer), m_project(project), m_flowLR(NULL), m_flowRL(NULL) { ui->setupUi(this); // ui->leftFrame->trackMouse(true); // ui->rightFrame->trackMouse(true); connect(ui->leftFrame, SIGNAL(signalMouseMoved(float,float)), this, SLOT(slotMouseMoved(float,float))); connect(ui->rightFrame, SIGNAL(signalMouseMoved(float,float)), this, SLOT(slotMouseMoved(float,float))); connect(ui->bClose, SIGNAL(clicked()), this, SLOT(close())); connect(ui->amplification, SIGNAL(valueChanged(int)),this, SLOT(newAmplification(int))); connect(this, SIGNAL(frameChanged()),this, SLOT(updateFlow())); } FlowExaminer::~FlowExaminer() { delete ui; if (m_flowLR != NULL) { delete m_flowLR; } if (m_flowRL != NULL) { delete m_flowRL; } } /// \todo Make flow visualization configurable void FlowExaminer::examine(int leftFrame) { frame = leftFrame; frame= 0; loadFlow();; } void FlowExaminer::loadFlow() { if (m_flowLR != NULL) { delete m_flowLR; m_flowLR = NULL; } if (m_flowRL != NULL) { delete m_flowRL; m_flowRL = NULL; } try { m_flowLR = m_project->requestFlow(frame, frame+1, FrameSize_Orig); m_flowRL = m_project->requestFlow(frame+1, frame, FrameSize_Orig); ui->leftFrame->loadImage(m_project->frameSource()->frameAt(frame, FrameSize_Orig)); ui->rightFrame->loadImage(m_project->frameSource()->frameAt(frame+1, FrameSize_Orig)); ui->leftFlow->loadImage(FlowVisualization_sV::colourizeFlow(m_flowLR, FlowVisualization_sV::HSV,m_boost)); ui->rightFlow->loadImage(FlowVisualization_sV::colourizeFlow(m_flowRL, FlowVisualization_sV::HSV,m_boost)); emit updateFlow(); } catch (FlowBuildingError &err) { } //repaint(); } void FlowExaminer::updateFlow() { ui->leftFrame->update(); ui->rightFrame->update(); ui->leftFlow->update(); ui->rightFlow->update(); } void FlowExaminer::newAmplification(int val) { //qDebug() << "newAmplification: " << val; Q_ASSERT(val > 0); m_boost = (float)val; // reload flow with new gain loadFlow(); } /// \todo Show vectors etc. 
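/// Draws the motion vector of the pixel under the cursor: a line from
/// (x, y) to (x + dx, y + dy) overlaid on the left frame and a small ellipse
/// at the displaced position on the right frame, where (dx, dy) is looked up
/// in the m_flowLR flow field. (Hovering over the right frame is not handled yet.)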
void FlowExaminer::slotMouseMoved(float x, float y) { if (QObject::sender() == ui->leftFrame) { qDebug() << "Should display something in the right frame now."; if (m_flowLR != NULL) { float moveX = m_flowLR->x(x,y); float moveY = m_flowLR->y(x,y); QImage leftOverlay(m_flowLR->width(), m_flowLR->height(), QImage::Format_ARGB32); QImage rightOverlay(leftOverlay.size(), QImage::Format_ARGB32); QPainter davinci; davinci.begin(&leftOverlay); davinci.drawLine(x, y, x+moveX, y+moveY); davinci.end(); qDebug() << "Line coordinates: " << x << y << moveX << moveY; bool ok; ok = ui->leftFrame->loadOverlay(leftOverlay); Q_ASSERT(ok); davinci.begin(&rightOverlay); davinci.drawEllipse(x+moveX, y+moveY, 2, 2); davinci.end(); ui->rightFrame->loadOverlay(rightOverlay); repaint(); } else { qDebug() << "Flow is 0!"; } } else if (QObject::sender() == ui->rightFrame) { qDebug() << "Should display something in the left frame now."; } else { qDebug() << "Unknown sender!"; Q_ASSERT(false); } } void FlowExaminer::keyPressEvent(QKeyEvent *event) { //qDebug() << "keypressed : " << event->key(); switch (event->key()) { case Qt::Key_Up: qDebug() << "key up"; //m_states.prevMousePos += QPoint(0,-1); break; case Qt::Key_Down: qDebug() << "key down"; //m_states.prevMousePos += QPoint(0,1); break; case Qt::Key_Right: qDebug() << "key right"; //m_states.prevMousePos += QPoint(1,0); frame++; loadFlow(); break; case Qt::Key_Left: qDebug() << "key left"; //m_states.prevMousePos += QPoint(-1,0); frame--; loadFlow(); break; } QWidget::keyPressEvent(event); //repaint(); } void FlowExaminer::wheelEvent(QWheelEvent *event) { int numDegrees = event->delta() / 8; int numSteps = numDegrees / 15; if (event->orientation() == Qt::Horizontal) { qDebug() << "wheel : horiz " << numSteps; } else { qDebug() << "wheel : vert " << numSteps; } qDebug() << "in wheel"; event->accept(); } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/progressDialog.cpp0000664000000000000000000000442313151342440026347 0ustar rootroot#include "progressDialog.h" #include "ui_progressDialog.h" #include #include #include "notificator.h" ProgressDialog::ProgressDialog(QWidget *parent) : QDialog(parent), ui(new Ui::ProgressDialog) { ui->setupUi(this); // alas this make window transparent ! 
//setWindowFlags(Qt::CustomizeWindowHint |Qt::WindowStaysOnTopHint); connect(ui->bAbort, SIGNAL(clicked()), this, SLOT(slotAbortPressed())); connect(ui->bOk, SIGNAL(clicked()), this, SLOT(accept())); ui->bOk->setVisible(false); ui->bOk->setEnabled(false); } ProgressDialog::~ProgressDialog() { delete ui; } void ProgressDialog::setWorking(bool working) { ui->bOk->setVisible(!working); ui->bOk->setEnabled(!working); ui->bAbort->setVisible(working); ui->bAbort->setEnabled(working); } void ProgressDialog::slotNextTask(const QString taskDescription, int taskSize) { ui->lblTaskDesc->setText(taskDescription); ui->progress->setMaximum(taskSize); ui->progress->setValue(0); if (windowTitle().startsWith(tr("(Finished) "))) { setWindowTitle(windowTitle().remove(0, tr("(Finished) ").length())); } setWorking(true); } void ProgressDialog::slotTaskProgress(int progress) { ui->progress->setValue(progress); } void ProgressDialog::slotTaskItemDescription(const QString desc) { ui->lblTaskItemDesc->setText(desc); repaint(); } void ProgressDialog::slotAbortPressed() { emit signalAbortTask(); } void ProgressDialog::slotAborted(const QString &message) { if (message.length() > 0) { // Show message QMessageBox box(QMessageBox::Warning, tr("Aborted"), message, QMessageBox::Ok); box.show(); } reject(); } void ProgressDialog::slotAllTasksFinished(const QString& timePassed) { ui->progress->setValue(ui->progress->maximum()); setWorking(false); QString notifmsg = tr("Task finished in %1.").arg(timePassed); if (timePassed.length() > 0) { slotTaskItemDescription(notifmsg); } else { slotTaskItemDescription(tr("Task finished.")); } setWindowTitle(tr("(Finished) %1").arg(windowTitle())); // display OS notification Notificator* notif; notif = new Notificator("simple"); notif->notify(Notificator::Information, windowTitle(), notifmsg); } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/newProjectDialog.h0000664000000000000000000000254213151342440026270 0ustar rootroot/* slowmoUI is a user interface for slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #ifndef NEWPROJECTDIALOG_H #define NEWPROJECTDIALOG_H #include #include #include #include "project/project_sV.h" #include "lib/videoInfo_sV.h" namespace Ui { class NewProjectDialog; } class QButtonGroup; class NewProjectDialog : public QDialog { Q_OBJECT public: explicit NewProjectDialog(QWidget *parent = 0); ~NewProjectDialog(); QString m_inputFile; QString m_projectDir; Project_sV* buildProject() throw(FrameSourceError); const QString projectFilename() const; private: Ui::NewProjectDialog *ui; QButtonGroup *m_buttonGroup; VideoInfoSV m_videoInfo; QStringList m_images; QString m_imagesMsg; QSettings m_settings; private slots: void slotSelectProjectDir(); void slotSelectVideoFile(); void slotSelectImages(); void slotUpdateVideoInfo(); void slotUpdateImagesInfo(); void slotUpdateButtonStates(); void slotUpdateFrameSourceType(); }; #endif // NEWPROJECTDIALOG_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/preferencesDialog.ui0000664000000000000000000001456213151342440026644 0ustar rootroot PreferencesDialog 0 0 489 410 slowmoUI preferences Binary locations Browse flowBuilder ffmpeg Browse Flow method &flowBuilder (external binary, GPU based, requires GLSL) CP&U, OpenCV true OpenC&V with OpenCL true 0 0 0 0 0 true Preferred OpenCV Algorithm: true Qt::Vertical 20 38 Qt::Horizontal 40 20 Cancel Ok buildFlow bBuildFlow bCancel bOk methodOCV toggled(bool) preferredOpenCVAlgoBox setEnabled(bool) 34 197 170 270 methodOCL toggled(bool) preferredOpenCVAlgoBox setEnabled(bool) 73 236 86 271 slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/projectPreferencesDialog.ui0000664000000000000000000000353113151342440030165 0ustar rootroot ProjectPreferencesDialog 0 0 588 62 Dialog FPS value to use for calculating the output frame (display only) Qt::Horizontal QDialogButtonBox::Cancel|QDialogButtonBox::Ok buttonBox accepted() ProjectPreferencesDialog accept() 248 254 157 274 buttonBox rejected() ProjectPreferencesDialog reject() 316 260 286 274 slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/preferencesDialog.cpp0000664000000000000000000001501713151342440027005 0ustar rootroot#include "preferencesDialog.h" #include "ui_preferencesDialog.h" #include "project/flowSourceV3D_sV.h" #include "lib/defs_sV.hpp" #include "lib/avconvInfo_sV.h" #include "../../project/videoFrameSource_sV.h" #include #include #include PreferencesDialog::PreferencesDialog(QWidget *parent) : QDialog(parent), ui(new Ui::PreferencesDialog) { ui->setupUi(this); ui->buildFlow->setText(m_settings.value("binaries/v3dFlowBuilder", "").toString()); ui->ffmpeg->setText(m_settings.value("binaries/ffmpeg", "ffmpeg").toString()); #if QT_VERSION < QT_VERSION_CHECK(5, 0, 0) #if QT_VERSION >= 0x040700 ui->buildFlow->setPlaceholderText(QApplication::translate("PreferencesDialog", "flowBuilder binary location", 0, QApplication::UnicodeUTF8)); #endif #else ui->buildFlow->setPlaceholderText(QApplication::translate("PreferencesDialog", "flowBuilder binary location", 0)); #endif // TODO: qcombox box instead ? 
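    // Exactly one flow method can be active at a time: V3D (external GPU/GLSL
    // flowBuilder binary), OpenCV on the CPU, or OpenCV with OpenCL. The choice
    // is written back to the "preferences/flowMethod" setting in accept().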
m_flowMethodGroup.addButton(ui->methodOCV,-1); m_flowMethodGroup.addButton(ui->methodV3D,-1); m_flowMethodGroup.addButton(ui->methodOCL,-1); m_flowMethodGroup.setExclusive(true); QString method = m_settings.value("preferences/flowMethod", "OpenCV-CPU").toString(); qDebug() << "method is: " << method; int opencv_olc_supported = isOCLsupported(); if (opencv_olc_supported) { ui->methodOCL->setEnabled(true); // add OpenCL devices QList ocldevices = oclFillDevices(); ui->ocl_device->addItems(ocldevices); } else { ui->methodOCL->setEnabled(false); } ui->preferredOpenCVAlgo->clear(); ui->preferredOpenCVAlgo->addItem(tr("Farneback"), QVariant(2)); ui->preferredOpenCVAlgo->addItem(tr("Dual TVL1"), QVariant(3)); int algo = m_settings.value("preferences/preferredOpenCVAlgo", 0).toInt(); ui->preferredOpenCVAlgo->setCurrentIndex(algo); if ("V3D" == method) { ui->methodV3D->setChecked(true); } else if ("OpenCV-OCL" == method) { ui->methodOCL->setChecked(true); // restore selected device for OpenCL int dev = m_settings.value("preferences/oclDriver", 0).toInt(); ui->ocl_device->setCurrentIndex(dev); } else { ui->methodOCV->setChecked(true); } #if 0 // TODO: remove // state of threading bool precalc = m_settings.value("preferences/precalcFlow", true).toBool(); if (precalc) ui->precalcFlow->setChecked(true); else ui->precalcFlow->setChecked(false); #endif connect(ui->bOk, SIGNAL(clicked()), this, SLOT(accept())); connect(ui->bCancel, SIGNAL(clicked()), this, SLOT(reject())); connect(ui->bBuildFlow, SIGNAL(clicked()), this, SLOT(slotBrowseFlow())); connect(ui->buildFlow, SIGNAL(textChanged(QString)), this, SLOT(slotValidateFlowBinary())); connect(&m_flowMethodGroup, SIGNAL(buttonClicked(int)), this, SLOT(slotUpdateFlowMethod())); connect(ui->bFFmpeg, SIGNAL(clicked()), this, SLOT(slotBrowseFfmpeg())); if (!FlowSourceV3D_sV::validateFlowBinary(ui->buildFlow->text())) { FlowSourceV3D_sV::correctFlowBinaryLocation(); ui->buildFlow->setText(m_settings.value("binaries/v3dFlowBuilder", "").toString()); } slotValidateFlowBinary(); } PreferencesDialog::~PreferencesDialog() { delete ui; } void PreferencesDialog::accept() { // V3D binary location if (FlowSourceV3D_sV::validateFlowBinary(ui->buildFlow->text())) { m_settings.setValue("binaries/v3dFlowBuilder", ui->buildFlow->text()); } // Flow method QString method("OpenCV-CPU"); if (ui->methodV3D->isChecked()) { method = "V3D"; } else if (ui->methodOCL->isChecked()) { method = "OpenCV-OCL"; int dev = ui->ocl_device->currentIndex(); qDebug() << "OpenCV-OCL driver choosen is: " << dev; m_settings.setValue("preferences/oclDriver", dev); } int algo = ui->preferredOpenCVAlgo->currentIndex(); m_settings.setValue("preferences/preferredOpenCVAlgo", algo); qDebug() << "saving method: " << method; m_settings.setValue("preferences/flowMethod", method); // ffmpeg location if (AvconvInfo::testAvconvExecutable(ui->ffmpeg->text())) { m_settings.setValue("binaries/ffmpeg", ui->ffmpeg->text()); } else { qDebug() << "Not a valid ffmpeg/avconv executable: " << ui->ffmpeg->text(); } // Store the values right now m_settings.sync(); QDialog::accept(); } void PreferencesDialog::slotUpdateFlowMethod() { } void PreferencesDialog::slotUpdateFfmpeg() { m_settings.setValue("binaries/ffmpeg", ui->ffmpeg->text()); } void PreferencesDialog::slotValidateFlowBinary() { if (FlowSourceV3D_sV::validateFlowBinary(ui->buildFlow->text())) { ui->buildFlow->setStyleSheet(QString("QLineEdit { background-color: %1; }").arg(Colours_sV::colOk.name())); ui->methodV3D->setEnabled(true); } else { 
ui->buildFlow->setStyleSheet(QString("QLineEdit { background-color: %1; }").arg(Colours_sV::colBad.name())); ui->methodV3D->setEnabled(false); //ui->methodOCV->setChecked(true); } } void PreferencesDialog::slotBrowseFlow() { QFileDialog dialog; dialog.setAcceptMode(QFileDialog::AcceptOpen); dialog.setFileMode(QFileDialog::ExistingFile); dialog.setDirectory(QFileInfo(ui->buildFlow->text()).absolutePath()); if (dialog.exec() == QDialog::Accepted) { ui->buildFlow->setText(dialog.selectedFiles().at(0)); slotValidateFlowBinary(); } } void PreferencesDialog::slotBrowseFfmpeg() { QFileDialog dialog; dialog.setAcceptMode(QFileDialog::AcceptOpen); dialog.setFileMode(QFileDialog::ExistingFile); dialog.setDirectory(QFileInfo(ui->ffmpeg->text()).absolutePath()); if (dialog.exec() == QDialog::Accepted) { ui->ffmpeg->setText(dialog.selectedFiles().at(0)); //slotValidateFffmpegBinary(); } } int PreferencesDialog::isOCLsupported() { #ifdef HAVE_OPENCV_OCL return true; #else return false; #endif } QList PreferencesDialog::oclFillDevices(void) { QList device_list; #ifdef HAVE_OPENCV_OCL using namespace cv::ocl; PlatformsInfo platform_infos; getOpenCLPlatforms(platform_infos); for (unsigned int i = 0; i < platform_infos.size(); i++) { const PlatformInfo *pi = platform_infos[i]; for (unsigned int j = 0; j < pi->devices.size(); j++) { const DeviceInfo *di = pi->devices[j]; QString device = QString::fromStdString(di->deviceName); device_list << device; } } #endif return device_list; } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/dialogues/tagAddDialog.ui0000664000000000000000000001030413151342440025515 0ustar rootroot TagAddDialog 0 0 691 145 Add tag Change the tag type with Arrows up/down. Qt::Vertical QSizePolicy::Minimum 20 0 Source Tag Qt::Vertical 20 40 Output Tag Qt::Vertical QSizePolicy::Minimum 20 0 Qt::Vertical QSizePolicy::Expanding 20 0 Qt::Vertical 20 0 Qt::Vertical 20 40 Qt::Horizontal 40 20 Abort Ok slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/res/0000775000000000000000000000000013151342440021471 5ustar rootrootslowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/res/Info.plist.in0000664000000000000000000000312013151342440024042 0ustar rootroot CFBundleDevelopmentRegion English CFBundleDocumentTypes CFBundleTypeExtensions sVproj CFBundleTypeIconFile ${MACOSX_BUNDLE_ICON_FILE} CFBundleTypeName slowmoVideo Project File CFBundleTypeRole Editor LSTypeIsPackage NSPersistentStoreTypeKey XML CFBundleExecutable ${PROGNAME} CFBundleGetInfoString ${MACOSX_BUNDLE_INFO_STRING} CFBundleIconFile ${MACOSX_BUNDLE_ICON_FILE} CFBundleIdentifier ${MACOSX_BUNDLE_GUI_IDENTIFIER} CFBundleInfoDictionaryVersion 6.0 CFBundleLongVersionString ${MACOSX_BUNDLE_LONG_VERSION_STRING} CFBundleName ${BUNDLE} CFBundlePackageType APPL CFBundleShortVersionString ${MACOSX_BUNDLE_SHORT_VERSION_STRING} CFBundleSignature ???? CFBundleVersion ${MACOSX_BUNDLE_BUNDLE_VERSION} NSHumanReadableCopyright ${MACOSX_BUNDLE_COPYRIGHT} slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/res/shutterFunction.png0000664000000000000000000001256713151342440025416 0ustar rootrootPNG  IHDRfvusBIT|d pHYs B(xtEXtSoftwarewww.inkscape.org<IDATxyx]U>I[$R,`Ej[+Re^ bKhU MRE+^PAe2(TfRf:9c@쓳ydk[M HQ_IC#(R!|IOHG6IlIK{c]D 6Tf{헀Kc%KخT",I}>Nҵʒ5F"—oI #`:I42wx\|ZdI?_hu#%Uq[6|{Im 0>D(:~_4 x m +*DQx$&ھ? ,^:Gm{-HXbfy?U:evxە0[R =[y(bOѮ@5`[vb'pJ ;7%z [oS̻X$ yI.q0ARO  耤$K6'$kUq!Òvtzr.?Oh5^X.s/GT|"qp4puWml?"XI(0| fsJD ,>? 
[binary PNG data omitted: res/shutterFunction.png]
slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/res/AppIcon.png [binary PNG data omitted]
ʨLM߈3|?=$GIȇxe]H#)}DTO$^D4#eRBDY&U~t#-SMņQ>e_&d9mr1JQ S!Lxoa}XKPF9|{($&ڢ6vZI$?9MH8}h|1T<0Sc۔s~"`8e+A1Ůڂ5жM hF42%FUU>f 4FVd`0]8:9np^c\b<n(o޼q5f~R ZOeO.  zϤ=|61y:Dű(=:0FӬr0L ȁ=; 77-]?tv#*>y~b.Ǡ?>0QS+"􁉏ub 'NQĈc8IkwFCIO}D/F>s!!jb; Pi-knSC4vr}NwHg'׷WHC#ͱYoPBYX-[pqy~mPUq#r2 #2:虹wX'!:dؾZ?0%> ׏?E)ӟw~YgL )3Rᵍ*PI#ahΡ)]4Rcqsbx @ǓBaV0YI|FD&S: Awѧ_eT_~hZ9OľҷHZ{$ UWCEDW+\~Ļb_&{o$9d >+"2#+[Gvo+۳"3++S]UYq i 4w+=(f4F#<ӧPnEȒ 5Tu'14x9#xk?1 C;zu+=sywP8 ]`@/ !8]p2O(!7tC tJ`Oݞv iF xN'!ϸn<o".厓%{Rig^c~vr4;509`>cGtgws3}NSۅ*W1ߛkXi7H9߆Ñ~LSפyrSm-3(Y#ZaXcDuBa-@p!c QVaUUCk0λZ in,r5-J0@j|3!5nz}6Ԭls\Kh3۳pazǛ1w$ߪ}:f2yFsǤt5ge LIT$ZdUqp. oObX( ico_5"!KSpΠ& y^ sE~XI8MfѪc #Q!K3DQW?)#hM8s~q~GU8s-6 da^غ EE$]LSZ~iYJq!YA-ٓX@*Y4v~yvꗾ`{=o#1Oa&;9hEcߋ!)P9m'`3{:|m*Iԟ-Oun G$?"eZDg T6kCt̰uդLOxAy13t{b xq'eU̱g҈&\^Noq1~ ./wpc{9]o`Vx&w9V<6;C3?dι==.w1o\ s9`="#;;@m{ 'dQO?G`XgDM p8p{w M?qs׈[5d$!yQ@kj8d$#UfEǨ *5UuZmg塏M0p2̶À:`ܘx'0c';AO(t۱70}w 46KXa0f9{&Ac@:&,xPF!=e5[Ͻ_C)a5:S}+0EU+=Ypsg|PkuZ`g.5wJP95J@yGйj)꺆` `R48ҨqeY^q#MShRH_~M uX.EbZW,JZqx8M^pcs)=iUwp=Yz%a|j@I˦x3U\RҘ,ݘ69)׺@ ۮpd0t@N{GYb:4ޗ59߳ғ_DxSNXW4nfaW{#4=.KcwOXsd? YTۧX w"Cnm@S'!2h؎>?FP`50%v* #27p`~@jJ\]v V  e?/kXd2y@+j649+oGfߙ| }W$0?8Fn^QsQzsBo !Hb0 'h_=T+}$EsRx{s-rԛXzBY6b3z`QҎ^Ztndo5Ti P͋L܃j`.1c,] ce 5%ʺUU!MSH!5JZk(RX-VHEUBi)$..Bj$I(&chY>o Avw&Jq`Z.0 |?cJiTQs>:nƒ]OA% Vh'-t8~@kЯn =孞^u87 t;ɺkg:Y>+q؃nz7ōMTcL;<`}x qyܬ@6.vsu]x<"1~ޔUE^4aA5^| 0-@H[3\8y"pQUHa qDDƿ?˲aMN>tw맹?r@*$XO ђE&\޵4n&+&ijm׀YdihT؃45D:ƺIq(?ħ4<k%ǝ3拝ӌ*.}Y𒧈EfgSʙ Y/208!kC>]n ]}~3|O[ݶ?/@]2%)nkugq,uU~{o hH!HMA۪ı<" ZH#c|Y,KYH7H4K#VU$AYVPBQ(EOTmȗ 䛇rZ0+.2b 5\w%<" i.ӄL)DdHmg&Ai (ž@TKp. xΥrZV}oXALNqaKf x)|3 3VuY|\quN[rL!9˛) >q+;>_ClwۡJcn>ӳԧ} ە{f*5PXE:Yw.iSMTtjd? )"}J)v;,Kӟ ׿`^_#"l6[DqjeB`{C&0u ʪ EU hBy|smXlz+R@GੱE'VFbnt͡*Uuΐo5[@ 3pۚ2THvZ菓 wγX]vIp|8  CoY!ImUnuhmr'aSNsGnspT伩,p'0f^G}]B2D$j2oV@^wm,K} 00\pjxq}E<̱I@g "LD`t}́<扸X:R[鄃غ494 T2 2&EN@g SFtй@᠊q9 8h]Xlpg;su>:>; RP߳D\6|m* .wen|6B|;?,0ɁW؎?`YP%[=+Y^"KeW1\={[hxLsu;|-ܵwN)<;LZ議:u8X]HqhQH M3\^\*K3O~§Ob 4R.Q$$ACL@=(B]m 3Ja{9Od7'L̓JMbpY&n3:YdgMj0&*h3*Iq蚃1E͌X䬩m}@d`%ַjSc]x716I6JL0E4jڵ,wrj6pzYm@\?s?w ,N|u=`n̝krbЩ&4S7N#l{MpIBilg7y] |Kz  o{ٌy(=mePdr BJ"+UYA+ux2|YW =oXaa\c"m^Bq`X6"(q81$oJꪂL4MP%)gA@?{\+}no`C@\PoȻ3iN`nUڴj5AU'?75i77Լ].ւ~w3fg <}??3f3߷-6qvrSp𞊛USPGklIMhM*'YӜ=$>7lj瓁> `>;r]FIKTON';G7(KԵ2v[*qssv i"[,pY+$Iw.KV(Q.AF~8BJ[la߸$B!7w#Q{8 yv\V39`Q M55C Sw @jj.HvBȀ ?X9o.;@n7swn6@GHw!R]=3dS௎MXUYٰ@p -uߟ;ނ~{HcKY^.vX:`>7G<磿ӝtLzY 5Ie%@x48AYКO5\? oZk\^\B^KGK6{+A{"d"EUWY,*TUnp58oga3Zr vo90c8d3IQ!XB^W.sDx-KcQw*"r+*4tK;%'DKٝ6S8;"1޻@k؉qߵSf@0 ! g1{z:=9$֪Y4s@:0UN9 d'sNgq,qgj=>)5*20/!EWzǽ,kNuQ EQg^~m\Qbef)$VKyiJa^,KT, LJD2 :=w)ߞr,)l!OZ-ww ޷XE9CEW ⺆!/d p&Iuk7sUw;&*fl yL!T`%p=ɉi3pAމwڈLk]ob!l.؉q6WsM }`?nh?-@N#t#-4s\9+MSZw;`>7'6]¦H3;90C7sZuΙ0RYSZkEX_~s)"l[|2*EIi+{l8.ADP2<ưX- kSz9ʢ$ǣ5VjT|@MDGGu&#;VnY7m S9ȡ7?b1e | *e`q˝a ýjʸfdүkL37\L)~"Aus[0\kY=)jx= Y^y-XY qM>HeT4>ky\Sg lsJ=K;1DK9( IDATz2m oY f'6F7=c3.* J#v h-LTvY_z>~9^|O>!TuEn%o߽CELv i}.8(jSME8/̻=ȥƞ}``DmP@{hTHAhsS͠syL94Æiw'G1='l–5scNkv,?{zo/$9 @`7 $ldx*^JtryFLsrL!7eP0#سKԳ={{tH&T;\+{0۝ \RU588.p^j(Ucj~w%2jFPJ  ,,4(Fňn E!L|](9s69Ք=s X0ST3H}nzI؟{c{%X`jHkXFq8+ع9oDvv#Ʊ PK}pX{x>&5#:>3_,>}B]םHW{^I+ނsSJG@ߑ7D9^ܼxnCd)x{{c#J% 2ZT%$mX "~ˀʔ^vz6f5øyY'pw@PW V8M1BkE2nru{"8S~gE&q`T<B|yԳ1dwX5xFky,\$9ANnw4g؛텫#A 7u>nRN&Mz=!;ygCrsr_Ɩ+s0 *s\G )0ZCС0גbBj0ƱHwwHqTu6ǫWP,C77"$i<B DaҀxR}m_˲۳Q\G4oy:WR z0!ܽؕ}qgpZhqx~8 t tk6Eߠ 0:RKn23%v Z]oJt*~7o`#LE$I%B%D^6EYh`[PJ!cU u3VlHg`^iJ3]uW411Ayo-!ǑbhAA|d;6>Nie^?y@%ou96Cb'˺JK') :s=U|1ߤ{S{5=spM; Bnx 0:EMMԷl{URC)ll)K0Ƒ׿!ME,~ÇO ib\=cX-WФ@ j.$_jv5X=\0r{y^{DeB1N_* yyY4/l뜝7|.ycw2׆+R݆ϬYTU`;fnb)Occ>kN\!+bpu{ p@0d &b ! 
0D7=.ͅ@$v8D eY`< g 7+oHJ*i\]_eU!X.89"i ѨU 01mr?g$i }H/ !L Z@0-dŁǍt88NZlaX"6=r]cͷr~М#ʬ5ojg"HOdz#qZT# xWG8|Oӝy|0Z<'Fn7LcӰȆsU&,rS,`SH a"F폞Mds1ǀݞSpAw;czbSwwvo[-cy(cn,Z( , Qn74=Vʪ4=@X֨Up`܃C"d&?mv}c}3VY=,ѳoMi@є~VyAb' 5,rU;g8cSmU; z˒KͶ]I[bG]3r(" =s'v*:>@ vq@qBnJRKEse}}=EV;3qK#;Ϻ&v{>p^۾X[I[ҳTV֢="yB%vs92R< ,s߄x,ؾkMww>M ANEj-%ۭeS\@eY/n^Ð2d"PqQ4PU5Ņq4v"Z*iv)e'!#IctܓN3 q`wcψ&_eX{|p2`YfcDo۾viz>{~!JҮ ERRl6`ŋ eU"b$I!~=,5E6{\_] 2] <8W#T`]J~dx&ۓi?+~8@]|>*'uԵF?aO&MG蹞xyWd/z]Q8^Ɯ-p^Cjs#$3|'/_nHt@) 09 3>ӚUp~'f~/;(TUMն,˰Z;T9nX`Z4iӇO5*UDHDQa.!$QLB 1`-丳%Y;BŽAYAYjN{gBm;3Ms-s?D ӦcݳP_rzqEt\&ͰHq<03 PG#(ӝgߴհSȌ3;;g߱ŷ`KMom ܻ.$ǧ[61pnT%O`CCC Lky($ #\I9=4xD3d da9UۻOX.W XjbU\O73VBw#.֪z?#|ʒhb4ѻ3]N3/bl_x'w@E>BՆӱ}`c𜡜/NY-nw~mN㧎g6~b56lUcܤg. !r,WKE=/ vCJ8;!#)2y`(T56q٦6[}`<iı,y)jh{*%ģ6P烶>  ¡{1x;xCo!|?}ū 8G,z,=Y&uT`2?$D?v#H#M)-y಼*,!I;;Ύ$ Ǔqtx{źYz4x5*=h ?@nI@h4@J".U3RQ,'(aVe_~y^( p ٮG]ǯ Z(B] A"FFj6 Bgp?n@ 'y l烼wf=fKƬhH"^*'[F"g0qes!+m+ঀXl1v#l}h51m4o!^: IQ6/PW5757\VX/H-c\q. LiZԥW/+#vv]c?,lw_fOfS0m Ađc|ZsB+zg[?pg{0vgQcS7,yĉNePm'VF|zTabnCQRBk4M! NM 4Iq8@kSUøӷ;HbXxxm{.d{@uqlD:]bݓԖ:ۯ=~oh9G&Uy?MsV],yqx:<9scHFPuix&n^q8ëW K|{kH"jB<C*Uq bTu.+#&QFH]H(Yc OgwX$Ȏh OaHs/΍}t)ف ӏv7sKus4ۗ@7]sx1Z]b߿lvZX,n{ 2DDUV#?qϧqĸ2֪V(Tzg}|NA;Fqy5^G' 2<|mr Fl!- }M;pșh]a3~6fg-5ǹM^И@//[Tֽ'0w13ic;BfVշunV6 (bu]wB5H (FfHWWרT5ϖ,r&=2EJ .[Umc?U̬F1^dz\=Df9r`p͔if$e4v^s5OSXe[p[_78M]sS[Bt୵~%xWWX?v 5޾}ˋK,+00a\G eY}vi?Ila 8׷}N:V[3g7|LwQ,=7NuL1 $9Us$;2S= IoB~>y=Qg 3AAyo UUsrͩ㍘k2=2P>~ l6X,PZ^' F 3U  .4̔Lyѯ9VYGT[{.PgNB't\ \OoL`~pyFy~'xNع}J?Oi,0^\=Uܷ]]_4j֓mӛ'eYvV]nHSHFR*X"C+q#2Te 45=VHG``\c"˺;9ƶsc牉c1@qL^y'6{$6y>}nsVKc3s,^<~snl!s-2pu$IzRs"} N 5V(+9>8Z2=x#.8]]t!8d$FYUxMOF>;)A~S#F4s6}]>] J} |\99DZb=|.'0&A6JxFO؛>؞k0 };[&[(Bos҉Q 1TBv5U59n?B)8 XbHZk(j[sw4*=%FLEv?yV-YM}F?{οӳgcMkb,w|0뽦ENE>>=7fiNqJ6)RO-JϩkD(ĵ. w81 eU"[f%noo߀s$q>ədUQ52DZ8`"IIYkXJh)͓ݿȏ{Ђcۧgl2sxJbP[}a{8gsGΞ>xa?@vfΕ9]/ ְcky+xZq{{ျ;{dNJJH`/.$ldi"p.$)s<+WT[{"йSyẊ mi 3ƞ[Sly[l7B,^,X.X-WF ^)G|WWWW[eYfbU9Q)ð/rQ*j0͡Iv (xDYc| $|Otu}ñoeB=G]5RۻݯF{G{cgYfiE,eyZ!?Ȳ q00(Q5VU͵=;QүOYK~xxg[0K&P؅?dp?ࡱoD}dã~C9G,6KC~̃ǴV( DQqWo˧ !X,9ʼD縹1?,fA8A, u zQVcwjh}Bg'*87a iwg| Q9;s}!w{E]sٿF~ww\7ꏏsiHD'nvі8>#kDiIRsュϽ_ms#=LF\kY?,ϿI)QVE7^ԴvK GU[{mmۙ,:.}O"{f!GU[_by-ryf\O@PJjn7'LE>W~C_n|qbi\>.&uO eU!|*g0ҮR ;Մ$2jsQAF]`V BLg)*o[ts?Y~ Ms Zy{ؾc倉 !j‹/PTEF}ݻwq275Ik(/"i{;Z!Rups(eC/\R9>|kmQ-`'aj~XX#mYt5B]kcmpp0q2R&wwwl7flb9c^HXAIcKb@kJW8w湦OP>{m=gOK{ʻjm,Ʒ/_N-Y:&{w=qF{=?J׾^quujHFL+p) bTL-z$12OmtLLLލԸ[Dtx"jIENDB`slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/res/Icons.svg0000664000000000000000000001647513151342440023302 0ustar rootroot image/svg+xml slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/res/iconAdd.png0000664000000000000000000000065213151342440023543 0ustar rootrootPNG  IHDR00WsBIT|d pHYs))"ߌtEXtSoftwarewww.inkscape.org<'IDATh10-H.xWI[K H2,o\ޯU$ȟB㓌1{GYiB1F)Zú.5<s1󥆿1RJڳR !hqZ_z,ry_(mY6 F۶omڣ埀hmYk4  X6 f,@h&WYc ޻jsVk+=a5RD^*CD~vn wkUw2IENDB`slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/res/Colours.svg0000664000000000000000000001003013151342440023632 0ustar rootroot image/svg+xml slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/res/iconMov.png0000664000000000000000000000103313151342440023606 0ustar rootrootPNG  IHDR00WsBIT|d pHYs))"ߌtEXtSoftwarewww.inkscape.org<IDATh1 tQ+T U*:>sb X)ə:+'BXk3k-RLH19GaͷۍeYZ(=d~zknSZc -/149>D^24:G^w_j#16^Z$Eѫĝ$ 2Ϭi!%$6|&-# ^Z*##3"  .&-7" [2)8L8" 6~54K>-$%m^i5Qh[UQF44^eh:>>534.WU]>6.(! PEN11249>)=dzknSZc -//2:>D^2267:H_y`j"nG 6\ZZ?;EѫĝCa2%1Ьi"Y_>35}&$lY;1- 0cZ+ 7zS;/(& ($/$QN;215*\O3%mH?7+55G`P9 1m`h2..l]if9Xl[dni  ,XX^>3*$QGQ11249>(=d~zknTZc -/149>D^24:G^vcj" 7_ZFѫĝ,2Ϭi! 
8|&]Z*$  .(  Z2,!E7~5/# !n\i83( a`i;5,& WT\>71*$ PDNs8mk;KG0 L.XU):$&il32 11011236787434x_H0+>21 23578*k)1K012359=GORAtD0123434456;HXkoYZcb80,&"$=*!(.38FhZp9 C$06Jo; E #29fCKw9" ()4RUk|y5!)4MObreY((1L,*^+;"-' 4V8h3$!0& 6n}Dz@&# '2% #>/'! $Mm-41.+0H=/'" ?~v1630-9O<1+'#5ny;855E4  -Of>DE>:852/+)&%$## -K_8=A?<852/+(%" '0JY04;@=963/,)&#  2IP!5 11011236787435x_H0+>21 23578*k)1K025469=HORAtD0123:<<656;HXkoYZcb80,&(Y$17Ko;BS 389fBJw90U613 -p+-4RUk|z5$WkK6! 34MObreY*ehPE;/  1L+*^+;"7qeOC:50" 3V8h3$!GxaMB:4/,$7n}ȳ@&# $Y{^MA94/+(#  -@hUG?:9;<8,$Mn-41.1PeSIFEB8( ?~w16457jg[TL@0  6oz;88EgŕqcWNH=$ ! %*)8cx)2C:;\;EyVXt;=>B=767:?DK?  .Ka8=A?<9740-*'%#"! '0K[04<@=963/,)&"   3JS5 11011236787434x_H0+>21 23578*k)1K012259=GORAtD0123456CE>:740-*'$  -J_8=@?<852.+($! '0JX04;@=963/,)&"  3IP!5l8mk;%17P$7BC;+ih323210/-+:[D:89743552,53ZB6331245575* ?{I'&8401223368:>?DHB;Bb(*?302346;EQW_^U6d 9011342442338DPdxxfGsQxz41220-+*0@0)*,134326raC(#!(0(! ,4DƩ9H_24%#!-1'!  34Sʉ0?)&$" !11'! 87g`4)'&#!&40'!  8B}C@*)'%#!,6/'!  ?*6Yj3*+(&$" !18.'"  (8;wC2,*(&#! %68-'"  (6Ua4+-+)'%#"!+:7-'"  6=x31/-*(&$##1>7-("   )6\DRY00.,*(&%'6@6-(#  6Hg+1/-+)''+=A5.(#  1;j|4431.,*)(0CA4.(#   $6X9?I0420-,+*7H@4.($!  6IQkj4531/-,->K?4.)%"   4?m_26420//2GO>5/+(&$! /8bg28532208QP>830,($  )6Wm59754;BI\TEA;62.*(&#    '7Po%(::885Nieda_[WROLJGECA8$&'#/7Kwo26?;997Vjfeca_][YWUSQOMB'OLMNF5Fom8=B<:98Xkgeca_][YXWVTRQE *,-*06Ejk:@C =;99A >;:988<51/-,--.-#   5@ca6:>><:876320-+)'%$#"!!   5?`\25:?=;97530.,)'%#!   5A]W-/7@><97530.,*(&#! %5@ZQ"$4A><:8631/-+)&$"  *3AWH 13210/-+;^D;897435523=; %ZB6331245575*?{I'&84001001224468:>?EHB;Bb(*?30137975446;FQX_^U6d 9014:><>:422338DPdxxfGsQyz41220-29TX81-134326?.  /Q)733$,66f̄7.8' $HqlYNE>96.  '69lv}4'" )UviXNE?951/#%7>raC(#!!1bxfWMD>:62--%+4DǪ9I_24(%#!$;nxdVKC>9610*+%65Sʉ0?)&$" 'GwwaUKC=952-.''&  >:h`4)'&#!",V~u`TJB<850/*+'$" !"3>D}B@*)'%#!&2dr_TIB<731,-)'&%! /R5:Yj3*+(&$" ")w~hZOF@;63//--..3774, 6Hh+,1/-+)+2IzgYNF@;7520136:95."1;j}44,31.,+/6YwfXMF@<96689<:4+ %6X9?I0420-,2;kudVMFA>>?@?;2'  6IRkk4531//7CscUMHGGFC<2$  4?m`264327@RsdZVTPJA6*  /8bh2766:@Fgynf_WND:3-& 38Xn 597;A_}vlc\WURMA,  %,35/:>Sq%(::9ADȿ<#"Aaegc_FOxq26?;:CJE)"fLKqo8=B<;CJéD*#EnqsolIJlm:@C,=;?I_К{;& '.5705>Ghh9>A><><;;>@=>=9766557:=@D4   5?b^14:?=;987630-+)(''&%%$"   5A_Y,.7@><97531.,*(&$"  &5A\S!#4A><:8631/-+)&$"  +3BZI 13210/-+:[D:89743552)21ZB6331245575* >{I'&840213358:>?DHB;Bb(*?301346;EQW_^U6d 901123432338DPdwxfGsQxz41220-)'!&(,034326raC#!  ,4DƩ9I_24%#!  24Sʉ0?&$"    77g`4'&#!  7B}C@)'%#!  *6Yj3+(&$"   7;wC2,*(&#!  '6Ua4 -+)'%#   6=x31/-*(&$"   )6\DRX00.,*(%#!  6Hg+1/-+)'$"  1;j|4431.,*(&"  $6X9?I0420-+)'#  6IQkj4531/,*(#  4?m_26420.+)$ /8bg27531/-*% (5Wm 597420-*&  #6P~o%(::8532/*'"  5Jwo26?;96452-)&! .Fnm8=B<:75540*%"   2Djj;@C=:8631.%&"    5Cff9>A>;9752.*,*(&#!  5?ca6:>><:86320-*(&$"  5>`\25:?=;9642/-+)'%#!   5A]W-/7@><97531.,*(&$! %5@ZP"$4A><:8631/-+)&$"  *3AWG 1h8mk 6bʺX-dTn `-q $#P13eO6JYbc]Q?)it32p7895344221224435:97 5545320/.$)) "56,1/123455421. ()_5265232352/BJ:=8--20225953*++H 94312212101010212110321 /,,^?:/13525321101100113425422;8655858774210+-,-! 224))610101020016512469<68;>>?=<99:3//04:3)7145010110101)43201384144755<8;;C?BHMULcdimrqoVC8)'"02;73020101001/22332121100/2301753784=@BFSLO``jo}ynM:,#,L029740101223343+2212/3307205748678>ANDJXV\nx||XB/ 12)40301122334.322110110210322112485011234310//02202858;=AJX_bn}rD*+2;85012 0-)% (IEH9&.03431/ 012157<6;7?FLMXisB$/21261221-(! 
+JFI8 *(-4631//01405966:DGTddi~]2 .1.+5.(!/KFJ7 # $/442..0251667@FMJ^l~L' -32KGK5 $!/851.121138:9?QJ[s~i3!.3216 5LGK3 "$483..00463C@=MV_H%%/XB4 7LHK2 -;70/253<@IWjqb/+4 :LHK0  -;90.11554>IRYiq=,3 =0/5127?GBWd{ŤM"H0216 =JGJ* %,?;0/23/=8?K_sƺb*,//5 ?JGJ& ' /<8-11558DM]ju0, ?HEH" ' !8D1-416:=JOpz̍9 +-,4  ( 8GF# GB*0214BGSdy̙=*&,5 C F@E6 3H2-14:08OVvˡB/945 !$&& * 6FD  G@*1229KOfxѬE.635  !%'*'" .  7G..32:COd}ΪD-,*8$')*%! *  $N6.154AV\yѬF +,/!%(**%!   C:.525=FZzҧ< *.1#&),+%"  #=H+0319JgoҞ7+23 %(+-*%"  4G.013>Qbx՗5-34. &),.)%"  -M-022;UZ{҉,?2@5! %"(*-.(%"  (L.142>N`r$5;+6!! $$)+..(%"  K0156?Se~^ -(1"!! &*,/-'%"   H/024J^n{ʻH +23/""!  (+.0,'%"   J-018DUaҢ7-:7#""! #),/1+'$"    'S+126F`uч*I:1##"!! (&*-00*'%"   ,N*23;F\re+2$##"! )(,.10*'$"   5F*28@M`y˲E .84$$##"! !)-/2/)'%"     A?+17?Tdΐ-b;,+%$$#""! $+.12.)'$"    N9-26Jane ,*3&%$$#"! ,',/13-)'$"     R/13:OewƱD0)&%%$##"! )-022,)'%"     #Q-21?Ng̃)?:3'&%%$#"!! !+/142,)'$"      .66Fiu&1G8>+*)(('&%%$#"!! !,14783-+)'$#   % BHC K16143++*)(''&%$##"!  $.36882-,)'%"!      $ /F.28?g~l$/E7,+**)('&&%$#"!! ! '047981-+)'%"!     $  J326@Yn;43-,,+*)(('&%%##"!!  !!*258:70.,)'%#      (H.48H_ya.7-,++*)('&&%$##"!! !!#-479;60.,)'%#!     G2/5DYp584--,+*))('&%%$#""!&""%058:;50.+)'%"!    'B.58Hb}P13.-,,+*)(('&%$$#""!""#(269;;40.+)'%#!    E253=Vox+699..-,+**)('&&%$##"# +48:=;30.+)'%#!       /?-3:Ihy=26/.--,+*))('&%%$#$%.6:<>:30.+)'%#"          H/35FYt^"15 0/.-,,+*)(''&%$ %&28;=?930.+)'%$!      :9.3>Nj.H6:0/..-,+**)('&&%%$$%$)59<>?830.,)'%$!!   D-28D`vB2710/.--,+*)(('&&%=&&,7;=@?731.,*'%#"     B325GYqb#1= 10//.-,++*)(''&#''/9631.,)(%$"    )@.3;U`u.I6210/.--,+**)(''&''(3;>@B=631.,*'&$"!   I-15I]s=3/ 2100/.-,++*)(('-((*6ACC:530.,)(%$"    %=-3ADFB9631.,)(&$#"  >3/4GXdF22 432210/.--,+*)**+,8@CEG@8631.,*(&%#!! +:/3;PbwY#06 4432100/.-,++*++/8630/,*(&%$" !  A.3;GUl}2+5 55432100/.-,--6BEHJG<8631.,*('%#"#   ;439CRfv>2 65433210/..-,-..;DGIKF;9631/,*)'%%$!  *;.2;L`sI"12 665432110/..-./1?FIKME;9641/-+('&%#!!""  ?/2:D^hwX&1< 765433210//.//5CHKMMC;9741/-+*('%$%$"  ?/1;IOm|b+1; 76654322100/0109FKMPNA<:7420.,+)(''%#"!  =226:Rmzs1%5876544321001122>JNPRM@=;8531/--,,+)(&%$#! 34239Odu{8 487665433231122345DMPRTL@=;9643210/-,*('&$#"  .704;JZr> 1998765543244559JPSUVKA?=:876531/.,+*('%#!  %;/3:C]k}C%!298766554355688>OTVXWJB@?><:86421/.-+)'%#"!!   !>047DUoxM%/>99776655669887CSWY[WGBB@><:864310.,*('&%%$#"!  #  ?134GRkrQ( 07:98776677ITRQNV\]]^ZPLJGECA?=;:8743210//.--,+*)('%#"   @15:BSgr|X)!1B :99877667789]hf0edccbaa`^\ZXVTRPNLJIHGFEDCCBA@@?>=;:89/ $ =369@Q]n|]+/;;:9887889<`fed-cba`_^]\[YWUTSRQPONMLKJJIHGFEDCB@@6    ! <664ETWpz{b.76;:99899@dffgf0edccba`_^]\[ZZXWVUSSRQQPONNMLKJIHGEE:   ?EDEFEDDA548?H_ov|~e1:2;::989:9Cghg1ffedcba``_^]\\[ZZYXWVUTSRRQPPONMLKJHH<   @HGHIIHGGC6 ?EUfrwzf1)3 <;:998899:9Iih3gffedcbba`_^]]\[[ZZYXXWWUUTSRRQPOOMLJK=   >FFGHHGFEC668=JSeqvwl3'3<;::9;9Mjhggfedcba``_^]]\[[ZZYXXWWVVUTSRQPOMKL=   AJIJKJIID656AJW`nszh20)4<<;:9:;9Rlhgffedbba`_^^]\[[ZZYYXXWWVVUUTSRQPNLM<    $)*+,-,+*'8:749IZglruk7'$4=<;::9:;9Jegf eedcba`_^^]\[ZYYXWWVUUTSRRQPNO;   /834=ESakstg5*'3=<<;::9::;AR[cb_]XGEKLMNMLKKJIIHFH3   -735?DQbhnuf4+(3>=<;::9:;:?Uda\WK108899:;<=>?>>=<:;'    -625=JRZimpd5*'3>=<;;::9::9?NUWSE.3865443455678776544! #  -613==<;::9878<=8145432100//./01210/.,    0611:?S`gjn_21,4?>=<;::99876542110/..-,+*+,++**% 02036ALWcgm\12)3 ?>=<;;:99887665543110/.--,++**))(''&%%&'&%% 32078APWggjX/11%?>==<;:998877665443210//.-,++*))(''&&%$"!"! 31058BLXcfgV/1.??>=<;::98776655432100/.-,+*))(''&%%$$#""!!  50027@LXaddP/05K@?>=<<;:9887665433210/.-,,+*)(('&&%$##"!   6002:FQZabgJ,,;L@?>>=<;:9887655432110/.-,++*)(''&%$$#"!! 
 7/16?CJZ]_eJ+$/>L@@?>=<;::987665432110/.-,,+*)(''&%$$#"!!  8025=@BV_]cD,'05MA@?>=<<;:987765432210/..-,+*))('&%%$#"!!  5/349GOS[^b?*&/=MA@?>==<;:988765433210//.-,+**)(''&%$#""!  '5/25:DLVZaZ8,*1NA@??>=<;:9987655432100/.-,,+*)(''&%$$#"!  ,20179CJSYYU2.,12OAA@?>=<;;:987765432210/..-,+*))('&%%$#"!!  12015;FNUUYS2-+1OBA@?>==<;:988765433210//.-,+**)(''&%$#""!   30044>?=<:9:3//04:3)7145010220101)44212494245756<8;;C?BHMULcdjmrqoVC8)'"02;73020106123468:;<==<;9753203301753784=@BFSLO`ajo~znM:,#,L02974010812469;>@ACB@?<:7625407205748678>ANDJXV\ny||XB/ 12)40=3001369<>ADEFEDB?<974320210322112485AFDbOEB>;8643310//02202858;=AJX_bn}rD*+2;85012 0-)%!!#(.3;8f<:4-'! &.04431/ 012157<6;7?FLMXisB$/2126M0221-(" $,5>NW`H%%/XB4!"+4BJXjqb/+4("+46576986?JRYiq=,3U"+3;CCt@A80'  !7GD87:55:@HBWd{ŤM"H0216=!)18@Bm=?6.'   )?JB761>9?K_sƺb*,//5W '.6=@e:<4-'! !&&+AG>354668DN]jv0,X%+28=[692-($!   %%1XJ%")DH50516:=JOpz̍9 +-,4"(.37GMML>74/,(#!7"!   "%.^#$,KD-0214BGSdy̙=*&,5Z $).22343430.*4<82+&#"!  $"D#&!8H2-14:08OVvˡB/945 $(+/1C/.-,+CY`ddUC7,$ #&,X"%H?*1229KOfxѫE.635  #%')*B++.JYagmf[UMC7* !&%.O@%'"8G..32:COd}ΪD-,*8\!##$%')*3Q]djmbYTPMIE=2% #&'!#'# $N6.154AV\yѬF +,/ M "%)*:Wahmnb[UQMGFAB;7* "$&%" D:.525=FZzҧ< *.1 L $),B[djpnb[VRNJFC@;>77-  =G+0319JgoҞ7+23 " $).J^gmrla\WSNKFE??<7:54."% 4G.013>Qbx՘5-34 . %*2Qbiotka\WSNKGE@A:<84553(" -M-022;UY{Ӊ,?2@56! !'+9Xdlrtia\WSPKIDC?;>6864/25'  (L.142>N`r$5;+6!! .#),@\gnttg`[WSOKHDD=?;8964042(,& K0156?Se~^ -(1"!! .%*.H`ipvsf`[WSNLGEB>A9;87345-)21# H/024J^n{ʻH ,23C""! !&,2Pdlrxpe`[VRNKGF@B=;<76562*23)%(# J-018DUbӣ7-:7#""! 0#(-7Wfntxnd_ZVQNJHDAB:>:8766-05/&'0.  'S+126F`vч*I:1##"!! 3$*.>]iqwyld_YVQNIHBC>=>89782/63)(01&%# -N*23;F\re*2$##"! 4!&,/Eblsywjc^YUQLJFCC<@:9:86056.(/3*&$&- 5F*28@M`y̲E .84G$$##"! #(-2Nenu{vic^YTQLJEE@??9;882472)/5-'&'/'$ A?+27?Tdϐ-b;,+%$$#""! $*/6Vhqw|shc^YTOMHFD>B;86475+.51( 1*#$$ O:/37Jane ,*3&%$$#"! S!&+/<]ksz}qhb]XTOMHGAA@:>98577/-64+))1.%&%#&   S235<87683-46-))20&'&$''#    +S254AOḧ́)?:3H'&%%$#"!!  $)/2Lgpx~{nga\XSPKICD@984671*16/)0.&''&($$#"#  #'&KkF&%N=7:>Vgq$/D5j*)(''&%$##"! !$(-26Rlv}qjd_[URLKGCD43 ++*)(''&%$##"! ,"$(-26Gjvynic^YTQLIHAC?:<993275-)-50)((',-'&($$###)%AaB&($8H139?g~l$/E7 ,+**)('&&%$#"!,#&*.47Ooyvnhc]YTQKJECD;>;96376/*.51))((/-&'+&&%"$&$##!!%)% %($L537@Yn;43;,,+*)(('&%%$#"!!"$'+069Ys|unhb]WUOMIEF>>?:85771*051))(*1-')%&&%%$#$!#&'&" ,I/48H`ya.7,-,++*)('&&%$##""#%)-27>bvtmga]XSOMHHC?B;:83+162*)(-2+'(''&))''&%%%$"  H3/5DXp584 --,+*))('&%%$#0$'*.48Ejzsmgb\XRPJIF@D>;;995-362+))03*'(('(+*''&&'&%&# (C.57Hb}P138.-,,+*)(('&%$$#$%(,069Mq}}rmfa\WTNLJCEA;=;970473+),33*(',-)'('')*('& F253=Vox+6996..-,+**)('&&%$$%&)-28;Wvzrlf`[WSNLGEEazyrke`[WQOJGH@ ;:6796-+151)4.1,())((*//+))((*-06"  H/35FYt^"15+0/.-,,+*)(''&%&'),16:Dk~xqke_[USMJJCAC<;97-.460**),21*())(*/1-)*,03664- :9.3>Nj.H6:V0/..-,+**)(''&&(*.28;Lrwpje_ZVRMLGCF><<::8/1770*)*04/)**)*/3/*)*+.26996202 D-28D`vB27 10/.--,+*))('G)+/4:=Uywpjd_YVPNJEHA=?<;814870++.44-)*)+042,**,048;;732101* C325GYqc#1= 10//.-,++*)(G*-16<:47:80+.362+**+055.*+-15:<<8433221-))@.3;U`u.I6F210/.--,+**)(()+.38=Cj~voic^XUPLKGAE@=;8:;80.2760++,0561++.37;=<843 1.)#I-15I]s< 3/ 2100/.-,++*)H*,04:>Jt}uohc]XTONIDGB==;;<801796/+-0573-,059<><8433431.)# >516BUl{V!0,V3210/..-,+**)*+.26>=8434431-(! 
&=-3A^|tmga]VTNKKF@DA>=97:<:3/37973039<>?<8534430,& B-17Ibn73 33210//.-,+E,.15:@Dj{smfa[XRNMHCGC?><;=<9348:9646;=>?=8534430*# >3/4GXdF226432210/.--,+,-/27>=979;:99:=?=9542.(  ,:/3;PbwY#06 4432100/.-,-.049?BRzrle`ZVSMJLFADC??><;<;==>?@A@=9542-& B-29Ibnm*365432210/.-"/26;AD]yrke^ZVQONGDHD@AA>==?BAA?:641+#  A.3;GTl~2)5L55432100/.--/038=CFiyqkd_YUSOJIKDAED??BFFDCBB@;75540)  ;439CRfw>265433210/.>/25:@EKvxqjd^YVQNMLEFHDBEJJGDCCA=8764/' +;.2;L`sI"12G665432110//0248=CHSxpjc_ZTSPMIKJGINNIFEDB>:885.% ?/2:D^ixX&1<76543321:247;@GI^xpjd^ZWSQNOONQRNIGGE@=;:6.% ?/1;IOm|c*1;766543458;@EKNlxqke`\XWUVRMKKHE@>=80(" >237:Rm{s1$58765547579<@EJPS|yslhc`^^_\ZXSPOMIFCA;3-)'%#    56349Odv|8 4876658678:=AFKPU\{uoljihe`^[WUSOLIF@9421/-+(%"   5;36=K[r> 199876989;84/)$   !%(+,.,+%6F8:@998778:<@DINQ_`a`Ŧ|wsplifc_ZURPPONMLKIHFD@=840*# $  %)-146788764/7K<;;JTlsR( 07:98;9;?CHNQ͹|xtpmlkjihgfedca_]ZVSH50(! $  %+16:=?ABAA?=8=P@@BGVgs}Y)!1B:9889;=AELRZ܀)t:5-%$ $+25:>BDFGGFDB==PEDDFT_n}^+/;;:99:AFLTSGC:0& % #,3=hDEIOZapv|i20(4E<<;;<>AEKSS¿EB9/&   !)18_lmprtrpmgz_EACM\inuxl7'$4=<;;<=@DIQP܀3BA8.% $ %-338<@CEFFECA=6KIC@EIVcmwxi5*'3=<;;<=?CGMR\y¥@>5,# $ !'-49=ABDEEDC@>8HE@>EGSdjpyh4,(3->=<;<=>AEJOUTlĺm^fhkosx{~b=;2*! $ !'+0469:;;:964.BA<=<=?CGKPVUla]_][\_cgjmoqsuwxz{|}~||P:6.&    $(+-/01/.+&>>98?JY\hpqb4+(4>=<>@CFJNSRR\ktmVVUTSRQP*QRUW[^acfghjklnopqrrqqoniB61*#   !#%&%#!::75=<=>@BEIKNQPNMMONLKJIH%IKNPSVY[]^_aabccbabW72,%   66358BMXejp]02)3?>=<<;<>?ACEGHIJJIGEDBA@?#@@BDGJMPRSTUUTSUE/,'!    54298APXijmZ/31?>==<;<=>?@ABA@>=;:9878:;=@CEGHGH6)&" 41159BLXehjW/1??>=<;;:;<= <<;98754322110123467)$! 50027ALYcgfQ/05@?>=<<;::9 87654310/..-,+*('&%"  7013:FQ[bejK,,; @?>>=<;::987766543210/-,,+*))(('&%$#"! 8/16?CK[^bhJ*#/>'@@?>=<;::9887665433210/.-,+*))(''&%%$$##"!  8025=ACVaafE,&05-A@?>=<<;:988765433210//.-,+*))('&&%$$#""!!  5/349HOT^bd?*&/=MA@?>==<;:988765433210//.-,+**)(''&%$##"!!  '5/25:EMX\d\8,*1NA@??>=<;:9987655432100/.-,,+*)(''&%$$#"!  -20179CKU[]W2/,12OAA@?>=<;;:987765432210/..-,+*))('&%%$#"!!  12115;GOWX]U2-+1OBA@?>==<;:988765433210//.-,+**)(''&%$#""!   3/044=CLVW\H.a-37895344221224435:97 5545320/.$)) "56,1/123455421. ()^5265232352/BJ:=8--20225953*++H 94312212101010212110321 /,,^?:/13525321101100113425422;8655858774210+-,-! 224))610101020016512468<68;>>?=<99:3//04:3)7145010110101)43201384144755<8;;C?BHMULcdimrqoVC8)'"02;73020101010010$/2301753784=@BFSLO``jo}ynM:,#,L0297401010010(1/3307205748678>ANDJXV\nx||XB/ 12)4030(110210322112485IRYiq=,3 " '=<..3017?GBWd{ŤM"H0216 !(>:/.13/=8?K_sƺb*,//5 +;7,11558DM]ju0,  6C0-416:=JOpz̍9 +-,4 GA*0214BGSdy̙=*&,5 2G1-14:08OVvˡB/945 G@*1229KOfxЫE.635 7G..32:COd}ΪD-,*8  $N6.154AV\yѬF +,/  C:.525=FZzҧ< *.1! =H+0319JgoҞ7+23"  4G.013>Qbx՘5-34"  -M-022;UZ{҉,?2@5#!  (L.142>N`r$5;+6#!!  K0156?Se~^ -(1 "!!  H/024J^n{ʻH +23""!  J-018DUaҢ7-:7 #""!  'S+126F`uч*I:1##"!!  ,N*23;F\re+2$##"!  5F*28@M`y˲F .84$$##"!  A?+17?Tdΐ-b;,%$$#""!  N8-26Jane ,*3&%$$#"!  Q.12:OewƱD0)&%%$##"!  "Q,11>Ng̃)?:3'&%%$#"!!  ZyA31-*))('&&%$#""!  6=-66Fiu&1G8.+*)(('&%%$#"!!   J15143++*)(''&%$##"!  /E.28?g~l$/E7,+**)('&&%$#"!!   J326@Yn;430,,+*)(('&%%##"!   (H.48H_ya.7-,++*)('&&%$##"!   G2/5CXo5841--,+*))('&%%$#"!!  'B.58Hb}P13.-,,+*)(('&%$$#"!  E253=Vox+699..-,+**)('&&%$#""!  /?-3:Ihy=263/.--,+*))('&%%$#"!   H/35FYt^"150/.-,,+*)(''&%$##"!  :9.3>Nj.H6:0/..-,+**)('&&%$#"!!  D-28D`vB2710/.--,+*)(('&%$$#"!  B325GYqb#1=10//.-,++*)(''&%$#""! )@.3;U`u.I6210/.--,+**(('&%$$#"! I-15I]s=3/2100/.-,++*)(''&%$##! =516BUk{V!0,3210/..-,+**)('&%%$#! %>-33/4GXdE22432210/.--,+*))('&%$" +:/3;PbwY#064432100/.-,,+*)('&&$! B.29Ibnm*365432210/.--,+*))('&$!   
A.3;GUl~}2+555432100/.-,,+*)(''$" ;439CRfv>265433210/..-,+**)('$"  *;.2;L`sH"12665432110/.--,+*)('$"  ?/2:D^hwX&1<765433210/..-,+*))'%"  ?/1;IOm|b+1;7665432100/.-,,+*)'$"! =126:Rmzs1%58765433210//.-,+**'%#!  24139Ndu{8 487665432110/.-,,+*(%#"  -704;JZq> 1998765433210//.-,+*(%$" " #:.2:C]k~C%!2987665432110/.-,,*(&$#! =-37DUoxM%/>998765433210//.,,*(&$$ ! >.13FRkrQ( 07:987665432200/.,+)'%$#  =.39ASfr{X)!1B :9987654332120.-+*('%$#" 9/38?P]n|]+/;;:9876654321310.,+)(&%#"  32 DSVozza.76C;:9887654332421/-,*)'&$"!  )25>G^nv|~d1823;::9876554335320.,+)(&%#"    $43=DUfrwzf1)3D<;:9877654346431/-,*)'&$#!   "55;IRequwk3'3D<;::9876553575420.-+*(&%#"    32?IV`nrzh2/)43<<;:98776546975310.,+)(&%#"   -408HYgkquk7'$4E=<;:9987655334331/.,*(&%$"   *602CPbhmuf4+(3 >=<;:988765:2332/+#!+*)('&$#"!   (404=<;;:987655431-,+)!&,+*)('&%$#"!  )4/2;HWZgmma4,(4H>==<;:9877654331.+)%&+,+*))('&%%$"!   -5119?S_gjm^21,4H?>=<;:99876554322100/-,++*)('&&%$#""!  /2036ALWcfm[11)36?>=<;;:987665432110/.--,+*)(''&%$##"!  22078APWfgiX/116?>==<;:987765433210//--,+*))('&%%$#"!  31058BLXcegU/1J??>=<;:998765543210//.-,++*)('&&%$#""!  40027@LXadcP/05K@?>=<<;:987765432110/.--,+*)(('&%$$#"!   6002:FQZ`agJ,,;L@?>>=<;:988765433210//.-,+**)('&&%$#"!!  8/16?CJZ]_dJ+$/>L@@?>=<;::987655432100/.-,,+*)(''&%$##"!  8025=@BV^\cD,'05MA@?>=<<;:987765432210/..-,+*))('&%%$#"!  5/349GOSZ]a?*&/=MA@?>==<;:988765433210//.-,+**)(''&%$#""!  &5/25:DLVY`Z8,*1NA@??>=<;:9987655432100/.-,,+*)(''&%$$#"!  ,20179BJSXYU2/,12OAA@?>=<;;:987765432210/..-,+*))('&%%$#"!!  12015;FNUUYS2-+1OBA@?>==<;:988765433210//.-,+**)(''&%$#""!   30044n_1 r3ޠU\G{4@B6y@E%T 0BFFFF;,rAp0u)T t!iGj*r!b 7p!J|+Hf  iN0ic08 jP ftypjp2 jp2 Ojp2hihdrcolr"cdefjp2cOQ2d#Creator: JasPer Version 1.900.1R \@@HHPHHPHHPHHPHHP]@@HHPHHPHHPHHPHHP]@@HHPHHPHHPHHPHHP]@@HHPHHPHHPHHPHHP ߂ z(Y"KXuhaIKWȗw$Dc:ykn"Q8?v^CFwqf߂( \o\TiR:[&ޖ 2̺]-6t#L~e[誐YQ|?߂\go4`&w"j<go&41&Ϊ0 ;tLepݟsoN#X߂ dž1.1J؎73ByT0TtQJ<{h nYַqd#gSֿ> YV.D~fc$A}Qnqhd萐8I[F)n%eR67kb(AIM/A B:BNx<H#huBϚ(5=+ԕ6GUANgN8XȺlb۽CeO2ʰ󦛏>z!hj3f>Ҩ @YV.D-+Ngu l}Ed8xݡ #%ez@bBM\y532z6v1Ͽ(=G\ôOkHÊO00"_.>f~qYu >$h4,ߧՂV6y ]lM&M/٤O$(|new\l(*6Jeş>*kИG+v7R+%V :6:T^d{-,= z]/uv~ȧ:IFΥAb xӀvr.)Y7[:f  33P&qL7vomDǃ"i*>_c`niv5L1x Q2tH'Jf] ?yA14Z. kCleT?)0\x=%巃g˯,0csgz"(N[0 A(N=SgL{޳g3n߱R9Yu'5_cևCnW /њ,a}1ZwExg|}h‹3G+粣b{܀|$4%:V3W-pҧڷY@/a( loGUYu|3t;񵐽z#D\:i(N8?]lWV|Lr< cZqr#M6Hg|~+Mk]G!D7CEMKZt3IBRnB`bK$7uEʏ*ʹϚI,ҧB"C"Z$$k͇XIhg˪EYfC5xPNXqT2Ͷsԯ 倆IX>c*4c\~zHIGD.|6lO!ҥ ܹT6xUzw퇂>}G+v7^]v̫)j<f(u Ov'/z! 
4y#ȉ#䗓)J\V3i>o)zC%w.2%Hy&DSEC&Q14M""sxtE̹|nImiCJSc߫.ZC0XsuĔg˔9E`vP0O]jqfV[?ŷ,IMd戄_ƃN-&/6o/O_tLBsCnM7}ȇhf0n} ~B*jzz@^@$פrIFHYGC6D@l&T| פ#&HL~2TQl痩S鐢3=tm*r%G7<1ش"8ȭCHadמ"";pOGNPl'nJM60.%qP"6Qt`җ{ھ:\Yq6۬~Y- mx4`ӦBNΤISa/=jOn`"kezWTqBT0O5ɷ ԥ%^Sp1՘i^9 nS9ćߕd|fhM_Gi6EgwS7e\㕓;zȤ]Mk ^BMgG:0*8V(,1es&ٵFԏHxk},ù\hxE5k7:H/%a5=xO |!XMhvl .^w mK)ȈH@3uFP/mj\o( tvZsLqkrTG`" o&{ 5h{1Ky_ pRDC &̀E% eNNxjDoFlvBlmxOl++ * #ͤr%$)1 k=Yj;IG>v ƘO^$׌wO;衉v 3<$&WdUrw7lcD1v,(C3|b&d[v!e:(UL}l#I* 5gi֛ahz.ydXYpFeL$@7T<9h_" 9AҚ44^d8:YE/GSoدmK qpx5ex]HGdd85Zth)\bEqSai "&c]9Ij8%;km{ BÙ[򆪒oSwƛ4s&>k_<)JAndQ ZeĚKj+8J [gf"Ivrv0 a#u4zY"h7|p^q JGej)cO(lg?|o>ɸRUg2lMlUࠄCDh1;5 +WVr|t:=8Sѕc޶_G.88F|* dȂ `V!FZޒD}X72C " VeyjK>|یCK@ \wkH<&dF~vDSM(]Eofv_zD ؚ3*g(lBedo_#',,lgW!v+!u/O; 6# W7&3X=ЯE_9򅞃|W!crJ/l?&u3h]߮+cyS5xx9.X-}(r<\DR)Fn;'XJw9MS||śL`u&ߦ]kuAÇ"S݉9&H8q@o*,ʻlLGH "o(+gW]pآـw)ۥT~`Y- mx4`,5OtW ^;X{;qjPz^"JQVJL9ԥ"|It> ּѲ̬,dbF3Bd_KXΜV._~.|V@Lɴ/]ITLv63:%HiR$:;,'f\UJ`>>mY2}X3~tV\މJ'P7RߍTmXx8 \W^VۉL|Ð?^(\@o(׉NHVY[{|!0L+OYFmdq,gH\ڙ^3sի9l߹<nï0Ckў]JmZJHG=rt ll:ю*JQ!F-IL4sQPj3bcJ-Ate9cI?3~eڼ!?ʺS,vCŘ?[?S6b>A7ơo*6H62V€ĵ2cq̴mwpP(B9"Vu  5:ۑ: &'4 !woZ~,~ ʗ|-[Kzz9ٮf0AMJPHAJak>q3)70ZoGZSp#5g L qΤAܿ~tZƄXJ oÓ݋ Zs)1jdY* &%u 10fB fSj(g ѮЈ#ORbZ*=_4㰒!dlWyHIQ{EDmzUT}R\+_\%)0 2õYpq ~6Y |$ j,IBf ~|lhc-^{Wot~0u;ր@qs&z47o~$e _AiDnsikm`?}̂q*ՆT\+/`ǻ LQu;XBzA /kחv| kwHn@1:v,R.Y g`Tܗ+'JDˢM/ PxxӂNfk6qYFĴGNb-OE8qK-C;Ni6XH=ʈခ%qcC F7Oh+6;rr4D>k!&0< Qw~w66}Yz d% VkXj4 1]cFܚs"_0IE%EA+ 75q8W 2#zI`Q/;WI0tg2}1 ]YonH:+n*CCoM~!d_BMXL_qmf-#r͞M5KA=! LEH9D$}Mbf9S(;wN>x̐BgCN\ӯ|Sx!o/Є $BE`T8Eu$K`F->S!*wY)ɏ., Hǟ1_ج8ȁ&* hڹIڪ(HlU&B{UA9aɬnA3֤ñ43NtVFv.if[PR,^6[ϔbdVnZ Qݾ6'7,jN;b_lτpepB<+a{2(kn5Ϥʉ|X~ w›Nҵ@\lZg`9#@sV; ?A򎮌ɧ&T!7֧ZDH ŗ,E@k硟 &EUCȸJ3iakx qr eIh=v|?>fXsW̫VImD] JdJ% fMUR)twYx??r%$sQ^hk˹vrA4u[3j6s'ʙA[ %hQ"#4Lϑ:B2NAa̽QL@sJJsdBvc<ּ;S)G~sB]MUyb-PTz!hEk,'+cu;n9L ADϑnkD*mYD}Q :Ybn=Qf@S/̊Dxf8,Mrs;ҒK t7,+ mxwD&:O=J7nUt+V9-Nbx g!B=68Pw|qRD qNl|3`jXT0_,7+Qe+2W}Vuw uDΖL>6:iC(~ RN!miS)S_>eH7aj CL:e0-5WS N{},vjzW7n?5U^knoN=9czmq!@hyfW=~6!銛iҾ܎|1U@u(-x4#L](ڙ͎}W)?v@+ESB@],z'=uں sE|G(JI.TT.6 s|ܴF\yd?f}JΨ—d̸/ݪ2sl>jݿ=23%dzaۚl]$s\1i.k[# YWbJ#DȞͭ!Y\UʪAL'B,}zZMks: )sEi{s1;N3\:caΚ X:\.],le<4 mfƞd5ZxIX:fx^+ \1Q4d!kxZ[Y&&];dؖ>hi|qZkԯD3J6H.ϤEZe1( ւ뱸vU'i$Ə*c /}~H\h 'O⩿mzA|djʼnc#=#>=N+.aN !.Dgz]W뚧BϘ Jg0?e$pbA98PFt؎;oѪ[xe%xy^NMJbmVhlB2ip+ت}`;"]BgW]tx@pIzmyJ7X%7}%nUW5sY)|5x?X7Sɪax޺U(.vU4G:F]+ڢ?wBYBfv4v0x˞_q(ÿhţ'`w+[|$Q9֗rК`VoPBۏb`;r)ƠqzbLzԎ, [[I]Y 1:^zx:] a޽חzDoD~?^D+NB54sP6Ew7wC 32(b|>wk`^s-]bA_×,{C)J98TYu I=Aw_YmZGLd3 ,IAsS\sc}nlL9UT#D;k0A߸@g͞+J8zS gKCA8  K(d"'#vY6 hjɷya]ZIb$b1꺡8֎Nڰ\r9IO>ȑߘ. ջCG`N(`bS]*&L'\BBhw};&3 ijis Yі9ЎhT=le6ZW2JLBgWtyM=lN JY=:.x`qՊ^o;q{d"Ο"桔gS*<:_\ )E F%nR]O!1mTba.o :+Wq{ҠQOw'hA:B. Yl!Qd.X7A8VF+sfXnvR+ze\pT'F+zg)D\E ͷ,,UrܗG7@L]/|ؔVG-;TEޕ$ [ ueA>ӽ՛ }ߔBB$`(r=DbuZ qTRsUh)bmrViI.CC2UK7ul=vDP$HHne\ng +71!B Jf͠k6QlU,\X$ŶMd:ibS4cF # vto]Yom1?-# ^Lȵjlʍ/*+c%GZFdfDپIW5hNkwJQn5svɥ=$v].EaAUb^ wgaa@d g@ Kq1LR"-h||b׉,4*'iF)$x/ߌtxgxkw+T>4O[]y RFUR{Vf.'zuR:W7 #c^2"IYjkM"r`i2~J+>HJߕmGJA80tuʼnb75qтlBC ^sXf=a$odXy{ȱ(+<'O+Сe_RFxRw-\nH<"Oj]Y<"qqs%fS& z?j%i4"ѦE/Ӊ93!eӷE&fKaڭazy.)qI7GJF!eU xv -LcUQ#M n"LKyʩ Q(+KD!:4eWO;C:ɝ3~-SL0n'3PIԥ\>Ԗj$\xb?eְxكZcK}?04R4hű!q0ل`Vjz76V~4.:FL^wp [Dl/-op-CpU*@ p"yްl8ix<ݟw)}wR0Gc6E#`nDR+51=l?D qNl|3`jXT0L\Cm:,ti*.KȾz([2#.eDi C)r !/ii̢058J.erg/ -x[{69u3tyDZ՜ J0(O2&Rհ!dżr#.e7,싴R$StT+O.{u:wi' B{ss}xHC x<:@mn7a96!

G^L/a1J #C})ܰMU>tASCH[| X$ #Ȑc):TC=* &:q9b=¶sÀv%,K4 †o4c;878BwrU2B F-$^W[~L8uiyflu0WdcbD:^C̓8]Lsol-AiRfKKoY  2%4Dkl"7 ׍~RDRЕi'irv-(J[9b*: !ONpEۮ]aUVmsq]iH- K\RV"# r~KI؁~= }5OP&Wcÿ?.B=$@)J* yv&G:':4(XjoEjED..$[)X(=\Yf^LFM_cg eo}66V r(ܺtPHQЏeHc{9vffT}G0u;ւ4âuˏ "^ 1n_lژkN$D_]@z#W `j({F_0 J~+*'Ck^Rn츏 I(9,v3~J|G׆5$:Eܫ䀖#kvVmZ%)Jc;M \*:&j1@KBxWpT`K5*ԧoѧ_OZrܕ(\ӗ6ǒ_b6~] Wˆ.[iFi_X<1Ne7>R'ԚU;xN*u[=Cׅi%V |#H:W"?doviXzQL#OnuwiwZ$1퟼D%cWS ;{5*a~i$`JQφhc]1&hMf|t".Omhn!C+ǒzVc tCH=xns-6h8ᦼ`JXTvJ 6Jd'ۼ0b+(ȊUτט{yD4< 搼r(}%l_Y Ֆp8$ qRt>7CSIA:pi$x;;y<8{})j:0h訿i{%R@<%RW_rY`LiQU>3 TN&E<dLnGwklLMq~wB%QaL͋{PY@oBڏ”Gc[uuC]YonH:+n*CU~ڻl|CDypNa*4pϡH)ݩ B F4^o\ھy&ӝR!RG.m% QEO8Ś"gĮ5*`|uh?e1ҘekLmA*J@i (IrL/9,Qk߅Jm<L@aSTۊ]!CS ɒ>ȕb;͜ս3ܱٙis1Bܕ'TyG~Io6DŽ(v\q_p$+"rgK$h pVsu^,8 ?Ƈ ؊DPT EPxg۸5NE)W15So2Ktԛ~yCۢ:a/ǯP ]#)kE8a=зBƣ;wHCGLkDTBtpWM+n'~bDc/u8Jf|WA>JJGr.T' n}tI ݸg-FhYc<.@'=1ߴBQv?&|fJlKݘٌ*-M9?(qyIHP"Wh\ǩ }cƃ`w陣# y+`gg֪rjiFOScYX.] 4[%Fdu geCwQ\Gz:XSebArr2suRPEw;8eה;\= XN^pIU wQK?OxݿĆ";7m+S9K.gW1}|ҧD qNl|3`jXT0_,7+Qe+2Xo>^%4^v}Z KgX"߽K_7hhz U {ɼF{.GoFp@Qքi)MNR x8K`1Z|\tj!@b='Z% vېRb)PzVep8*W )u8_$hEmӕqՈy (*"v M8p>s_yiK$oI14mPEV+K_7ئC81c,G2|1 yY#?9{L:6jo 5go^8ўK#[6,`:ʈѥf`ZހFu/ׂ9Q ̉G3UXζiC r N ʯy)k؄_r0Uu5k33< 8DZyny7[n7QY$җ̃)XJԇ21_Zx6߬tZV3F:q5 `G3ܼFϒOA),:".F7hC"a.G^<\lb^W<^]WG(^Esc(s3a˺k.0lp<.()ʤ g+ S4UKΛ 96ߣV>_zr 7"ҩE]wcPkJEhgJkzZBfiYQxȾ(ԁi淵1bt秏ҌHGD|١' ex{9o5\r"Y$RE7]g4(':P]ftXQIIM.<5 yH^jSEF-̛G ݮqdi>"K C>>#'s:8)4Kf)aHw8S8T˨- MP⺴V31j \ ?4=' +.?O~V N%D`3U<Ǥw){GHte ~%r ^SO8EL\ćPwW0%DqM@jSt ?`u!gI r[ kȥA>H |E?Xn;%Hv8 .F !m[3#bd;8c.Qޘ%3\ yvjߴrg\8=_En} /zXqiN˿3zދ򰢑}#s7]{stC+BR KPBpn$1Guy_TeT:屖!g@:3 fm$"cGΩwK \U;2~0?+*Vђ0'2`6E4<ګSt{U4ԵpݜV7f% YL^,֜עx>@z^vg-b&P(+EsPh=k܇59Lǥ[Ml^Rcp&qU dz.RJ`yQ<0K3z?dߚD4Lú~*@$I]ĄڵGWu!. Do JBhdSxuJyL\밅: ;ަ+'[ @ ,\nw@of$#ǗUލ- 1h0wudpg}KZC8󴳬Z#iH`{n'kn>Nup Oc^9J^$Q5OL񱬄|dCDZ찉YLCvfJ}YhC͞) Rc Zpɣ0ƦV~:$Fobaݲ;'KJ/}sa!arrҭ,\F2U mtXWW1]>;,yYN}n.r"~Am\ qXq*O@tob5 V 5;gy0vA󬱂}9f#*Wz), ^"ΗaivSsO)h=|Lؽʡn>䛟y:*̊VBs&-$`|ĦPXP)HFTC~30hsr\C$*I;e X" "XSM7ObSf+9"ͭrSu}n(V#!ǒ, }") Cͫowχ,B VT]ȵ]SS{iEU<_@}`"ZB% Ujk'vfWhWMlvDx5omd k:uE}irb䒨 Y5WƯï7 GLRH|?S&e|`Ŧ$76[I}P{gR\RVx;o4=7?WYEp"Uu[hS)@:NgE^nz@S_,C 0 gn!d"ˉm4VY Dd^0%<̬I_!? 8.(g~R*{vn `K˱!>L_FF*#$bSu2 0{@oB|67xӷu#4T-ź ؕVsCi6tiƒ𦅺8I{B.CI=]-e+dg9a7cM?\z )\l"'9oK"ٚ}E\ P7WU% S'2`5KyG4W/l"5jF&+i{4\o6S.?п[OͺW[:04KʹC\C.ڷ,2f1&KTNϥ3)''~}&L㌏GîQ9m5 <|$EBYif!i›=P(G'α]+Vm?UG(.?D%@HR]T$ /M|/I FQ7tէ\&D߯3ċe8ۘ'd 8ZOCz ǎYPZbɩd2N")kú+K`tZΚer<=&=2͆619H\Rfc R,}(H&;FvF0o c*(s Mq, l-P+8]3A\8bSV0qgp\)va&QgN<)v9=kyThq8tE*Q&jӗ9& ȫ#Sdp8Vp ,p cϯ%gOĞCDlAAĿu\y,6lD5֡dNօoZ]&/ݢVLRiA˗P&\tM6Mա6"u O϶*/`"29Y$u4veֲ4# !F M"mU$PDPTddQ.:*d/WI<+f8Y0? ^)?ӏuY Q ),Udi>dd OH 6J>٠Z:R ÆA.=S5ܪ3a{4 Uf?yJ8K}E"y`zsv=>?y4.]3,,fK[dpiDKSN >)k>*Tw|R&K==gL{Kψѡq=VEhoq}T'@FAn<@õ+Ò4Yt-NaJL~})IR j>QDie%r%<"b|p={P(&C 34+3'8ڝ*a@jboFݐ ~"gS}bzږGI(?kaIQt3'ZMTK.Q)]hGx)B=ڼ>ucDreXh}N 42MꠈtTZ&XH3,Cܮ/ɩK8Ew>}Ihp=WX2B+Q,CCw_ʞ)+,'A[㕤EIu'uqBO(}ɶ+*j$)3 RnA۝)hnŌ6X)U=>Lkp  Hc=FwQ$8&.tYķL.@]Ǘ&!Xcf 8/BjHu#qN/xͫh[@DN dBR(riZ3|_͘RBٶtcAC Zxqa3[D,L@GsFn@cPjͣb~>Xj!$Z|I+_~V;s`|=\BP>>u29/WjV͢٬n !U_k#+ɝ^q6r!u@ !z89C92Ob pdPXke8j8Yܖr Z)[4ĉS( 5^{G @HeUj{0Dێas|޿J/-L oRfPanWfTIDŸoXtv/ЬuũM8Db+T$n_; Ŀ|6Ps:\ۮo`j@oݠF>FRij~'R\XW9lz[;|ZZof7~[ dpB[]g, 1k"206搊k>;S݄7–˽0{Č}0~j؉oqq2}Kv+)rxzvho!cq#Ԋ% I-B:C6sGLd ;o-U];O.Gp1S11L_p]}m`}`3:kDŔA6Fuxkr4⿉S+ږ(@xB;_ߦ(1ys4Om1h \`BYa^. 
zP%-v r pPʘwԼc'O, 'xD9Hi9*  Iw9͚.ղ("c; ?I>Hc%ϮAO[PCV,&Vncx>5H*'\aZ~g..;(߆v"!V0K6wž,*Aȕ&unawYILerU4|5󩬦չ5oXPa Zt2a}ѦIh;.Bl@uq0/M;nwHzpS>"/sOzBYH=Y'Oc9n5@s-b`Wk2FjcyVʥ98$Z |js7F4s@u[ƨ1 OC&maܟeF;;4h6_f;y>sZ逃hB2S E턔/M?Bm (_ F/u?@4ӭ}}E:zD#AGTs7]X@+fUwU{, 5߈y5 `q#(q QҥV`̲E#H Rhv۸aCseZ9M44B'[4PK:est0Ek3դfƠ(c 4*\#rJqZsi{g+/ pG͓/qLl#;Q֩f6o-!׺ )nj@a^^u4ҥPK)'ўaF>lw@|aaa]"sh[r26٬5̡[29$,b\X35HP𬐑rYD[ׁnG ƭCЃFG5[hJ[ /{e U58zwmۯP[G }'D{nttW {Cqc pSdr]ۖ(uJlLW?w)2 ,AsiLW`Ɯ~NҘ; v7oc(WL>!.$ 4Pkʢ&P#6vNʬc<>h7hg'-dB;>ܽxwקQ`5E=Gkq= {ȁ3 {2᳞*+:[j'NO•$AraHO"У ԰vUZܡ=y8 N.p%77JM22I+r;ߵ.Ñr_|`q^ d礜H$skqN]Mͧ{3LjaFvuX:Ϳ (RoMޗNmed~ic#a8Ba1ָfc=xImG,kj E)0 #0ft-Hf#uIlb°蠹9v D}rFDܝS([F>V+@k|WttT"S!$`0Ž+G3ĈF3N [7ۨ?˴ xՑMqJ'.93kȌ d֢}\zPؗ*2~vH )!δ2j69)x<b'"=)g`+ǑĤ|lA3x^FhߐX6[AZ] ,SM*q>wu;aJ&IzRT;g܉ kn!tDjt*wmZ^Q_=֡%VK}?iĠ&j!SJȻcr8i]e3W>Kj$"Sͨ6K5ph̙]->&twzRjgvb$1}Ԉ`&6^eϕL8.bh8k!(9lofx#`]D  eӽlkR:ʭWDV!hd9&QV\.;e#^c˩3go%T >#@ؽOȎV}Lks=rM(RXt0؞w K$9JzG0qxr7iGv_q O~9am'zL!jzpz6[A8u]:KC͖''KVo]@@Z *+/oˡwͪ2Ds\bo?@")a},d1؆ON=V!Ei #]csBp\ -45ɒmN\?._1A겵b^tÐ,ǵ`nсyVєԆϿȂ%Lt[W#7XAؾhE[Aȅ|V=ƠCIF0XHAZCԣ*lP[cK )rʀRnN>,EJ 5srFPx'';G Nʷߖ Q!w6[sOp-LgYƮfyUQQоC4] '.`Y]P,6ȔHuU\/s{9w}qsNF)䵍0`M]Y3nj*R*ˌ`8 (S@%( v;e,HvQ }Pm&- ]j+q]P20\k݄*o4и%O؃}1!uek}M"^=7n !C2#(K" ޶3^N"λqa;p'8F ^$%nAD&aP#g+ (sceŤQ1gBAJI!wޯ>y7t#A`=+Ti?p1ȍ]#@gIMMC,E0+dTCUHd/AA(ئOEtegJ%:ꢁOkD4v "Z8b0*x_~.?φ؇!1F7%oX/߬^O q8ŲcPީ.p(eYae3`Iln!5D,]F'yq\Rhd"Gסվg'ziKHmދLdB~(Ι}z*t㲐cM#_{l.ct#p".a 1 晒me\P4b 3W&u ÂKY)/#|H\֛m&yOa+CP]bd[\o2;~AyDFNz)0xL4XWLi P%|d,PI@(nB]σ}Q^߃ϊiD!)I"ȱ6 *gô]ȞZ}x왡b r_%.*ʀ7=ƫ4 Wv,p6-%bѪ &8d^j,et&q,@)֮/ SD>3۰%Q`Z:ح'M3 cs4㢚M#,WU\q#5#dwPF, +#u`piə=VƨrVtB"I3}M%L#\M-vmo5*`'zJKG+V2w߱᪔Ɓl-l )9=RJ`yQ<0K3z?dߚD4Lú~*@$I]ĄڵGWu!. Do JBhb0J?p.FT(R65EW*eE$үUqSFy6y}/o^u#fqeb腼|B䍁XVt6J:uCU{u'1 _1H`::"h{zY vJ >B fb qwQޕohtbJ1'%1656,.y7dkWTuwӇ|fl3uDIj}}a"y@QMw#c?ު|c2jfNEŢMN.|jSX&y}ęW8 5D*<"v257s?̜I +VrR-u?Yt$Q O︒r=_Jg;l:R:p5"ю=Xο0s|YΔMU +T]Vj2Wc~7M~YmwȻnDA. ;I#čS8cٖn낓x-"qwA)eJ2=l=Yc?i I=eT,1NMLQx-"Z~7 QxCg_/䅴ln) aa2Xfnfuy}eDq& qaJIniȑbvDq.4Y*eZ~4yk'ec y/$w%wJL;+j;Xkjr^i5.uזZ.c6WR8Y-}I@ x4G7pzNSSV|}czW&.$Wa&a >&Ljq'} b?Xsm$s%kmdLkK|VqKr'\~$PdXvQhz-sPUb`6qU`hҰC?|\љ~-?_'mɳ!ƽ<즒AtD^224s\@6iH=rd%ohwۣ kOL$PDPTddQ.:*d/WR4~VA(F|kviU8rf M'MhZdMzTnMۺ2$ J8_K;ޮ|MIOi*jQ|J4$9ej Fgq{ rgܱIO;#=JXJf,}lkk%zRP@[1؎K1h1I@lY2PiGv]ZYNvrJw:E;F6-& }Xt·EoTa&"7boLLmb-Rv7|KM$!{_dz wsny JkK _@%_nI#GC oBjHu#qN/xͫh[@DN dBRnKrhCuWY-&2W~6[(ɾ~߶GM ?:n"^`$nœ/f_%s9>2 d\sݘ=N=,.GT P~ai6@xaكRnB=-sHluXX=;R<\av!G[ 6\N`0PtQߜ?p[` e/tx4/:M&8qO"$| #4ĸ. 
vumvdȞ"Mc׃ZoͳsK's' { MBCtm':Wtb3, )4!a!1Tû ޏq!;$vCGdݪ(JnZxqLKwF `I5&)Gj' E,C}x NW֚w7Vm{h"oF)X_q_Igca~e?2ҬbT(d*q Ll;Śe_3 儦Y[B6zY_z;+X~<#8f@nUܧ ryBHu@unIUl|T.kNn k3tB  x4V#V2d3 .k"\bRK^ mi.wpmKM&m@+Kvd '_5kx~6u)2>jñ$7!07͑aǺE_RNAwwO^|mZ9}X\{`ѮS/Z H(4kf4k]z,> W+6SOJf" n^ O LB,@58!Iu;<0(eZ "ktׄP887OZzݽ7M^4|:A"] U?KGY ɨo~yS- OԼyT|G_!kԦ$44qHœΈa bBiQQL*իm5}tMC[.+[ǜwv͚N՜q%?@*Nʷ , ϣ*fFdCsgJ@!T5ܞNyzJ8l1)}w>A cQz|@Qp"QYYȂ݁ԤFd=h`5EyTp^ SHHˆ_jQӭ]k|}a#6$8\++Vַ]dLxr BJ]Mšѧ='<$lẃݝf(@%M[svqt$4K@$;WØrg.ӱ6>'͋y"ʱJ|wFmm^bŖtE1,!lbB'dY]OmrS?9͚.ճ$;V 8S pQw8-{1S3+ K~Fq?xF] "B'1M E5xMIEC窧ݒlɸjގ :쩺Jƹhr/ik.:8њ#spw"AڍU# 0OX-E0[u[@;ݬL͗Y3H<jӃʢGpNԅi6}f>[_X;NX ֣8 Օj,>x?%2>);ɩ'jkEќ."̛A<#0Pah ?mbMl ~BCp?\1j^x u%dlr",uFGGԲs$D4qߤ@XWc 9m\`}#j;`1 [bCI8hmk?;ZoCti{iH^?ܾ|*P{,Ui\\0# ]0jْI1Eҿƛ=qRVώabo>Km3nX9ɒH> İZWA8*{|Te$;uLl>rF(漛zN{ M\J:-dBݽhӔP &x INQQ)ud$jewan"Ʉ|2eZSo^.Jm)aǿii͎߇M W)G9bqe,֩^p:p beWӫ!&*b=φܖI;{BAJ( y% KIG21f^ o:T b ^%#*L| GX.|[KkNtÏj8H!P"%ְ9L-c,CmT=˷6uY(]Wq?<i꺞겵~4D#f*gČi⩲𼓾ԔVf3!YBH>ki:tV&I P|s…K*#._i/azJT?V/u5J0ݺPV)zno_7ްk$3=_ѽ,˟P}oq*؜>yMS޸%0Z>_ߕǥVF8⽚PL+r%:yO(l ; W}{PȊ`ח{3.,ݓNJm}˚hy,VlywvIXNқmc8T%(C.0}`D;zOsy +eu7;A=~c&oޔsj?us*qm"|ԕֈK+y@ BQl#$r 'MDA$vHe)G\mߥ3پD~0;xSMV7r猞ޭsk| sΜi u𦜒}^vr𧰈  A&^у f@g;ondc)G{B+,Ԓc 4eO8\{Gdp\%SX9lhfQ}$DEV<zˢļ-ܵ .="^5vR,8h} )/~_3_X/,cJ ef\M],jpHHkt1wx" 2 @-S1qט$ KlP4;Ÿ>`զ`֝6ZdZSR OM?m SӉrHXdb<4y[K6D1񢰀;d$laL(Tp`X0ܧTsȁ@o:.1T278 lj+R.^hO/q麿&UG8 WGvC-&I:#e7 kwgCQLIAްAPGN_9m匽_E hM+x?1.u- <lWX0m=5%]P5R NEVbbWAꦲk"Co]*ʷ;Xg\Nx^e`3Z :0{icsO 7Ǝ#Z e`\;̄'Hΐ؂ @"Y0 czb C-Ծ+VTi^OCћ:`eZgMxIVW9PWũ \S3TP5Œ}%*a8[Uٖl-w#3e8UaO0 B3km %uV+J ^NPILjwR`6׮3{2ͨI% _#Zx>EQ[8b0$&}6aLHO`Nq3fۨ!(pLj-l/߽5{~3hNKdO_D>|vr&!Dk4NwYRHDTt39w*G}$vAn{)p# M= z\|wK̇ܨWcp^1QJJNh1Y_36e kzFj`SNbSKj7))a!NB^W.~Ğ^EQ7_?qjҽ>;k|I_0FA]Z]}e;֮B%T{h: 3{4st@pcD7'G/A5|49ow|(g}ʻ~yClf"v5@ '#k.O g_&"DƫV/P+*]qhg65mu@5<߆0r)\LE$?54\G-ng'k=G @/s+[bՕ8p@@ʿKu {j )7 g}bC<@di_kc;zhi:|;k~Hu62W⊗ :T# n(ƐPpiM?h^$b`uNT:U 4Lv^efbHO%s&dcoJY=.ziz߯rd", xԵ1bt秏ҌHGD|١'<9F${sv+rڪN:^a&e’l)AZ xi"gy_< "…T?-n_Q vwft*Jbwk* ,6٢m X?bgNs*6?1ɣoto0Gu89$,JE[@mG7.zA9Z{@]psAO:} {(NgHQ[m q4E~y?)G~#޶bc>o6;J 4>˷C@#oœ2U| ".LGUlz E//,MqUD6gsnO O.:mVM,<[I^S)dLpnP[ ͟ϟ2m9v=hi_, |e}qҢ6[<5lnԵ P?^yVE'B ͘糧FlIu` C+ u' yp‰崌VW*;9RJ`yQ<0K3z?dߚD4Lú~*@$I]ĄڵGWu!.kJZ؊${0eT#gȡwVld Z;*X3kӵ(0aJܼF>WaKN1'\ JQY?8,,X9;Ub$HY?QoTn'2*ezl)&M>T-bԷ&rt>kV/qp#c5lBrW"3^gOAy6~:$qC9ͳnu?,)p(1F^ BN(n0v[! :cJCpXEauqL]X!鍵a23ǭB 0+M9 %(aJ;/-?dps`Ƥm;-HQCb)2.o[y:$.QXiO`S Ul5򦞐%Y멙&awN~\Q?F(dgv6!,㸀~%Şe wiQt:vXԧ.m-%f`#Ҝkr+ HR N&DjQ.or&ǿ/rc̨^ntoNcT.i5b~XDn] PЈLY(&Ⱑξ cb`c%/"!YocHbY+<42gTC"SCcb}s;}i?ǹ.L3oh41Y֠T9?=a)y?UD}Q>-qǴڑnq$6F%&'tQ3VDp .M T';NhȒ1Eh]tv(& 'qzAJxBwr&]S4Dʚ0[|QX2']IٟVď()qCγweق@a]_zD ;-|vQ`J ;Qk6<)b66WZZI?yƜO  +لzu'r EtNw3;rKvJN9OGY \-g2?i+m?q{ /eX#陨y0ogF6. 4z<Ɗ-ل4t$PDPTddQ.:*d/WI<+P;*ixrHv quMVKmq!{yQIgtxq=Sўֻ~=3hF ܤmpKbʷ/ >Z&&st έ]nw.KW.v,J=ABX!$g.b RCu=o|ǰ]')vnd IgMq=(a?=[?Rt9rNp-,-IA|(e *DLL*%iV۞XԆr$.NAѴAvDnKWbt,f2r/ϝ\WE5^uB=t(cXփ.-HS䯍!YuZS P :xbI])kD=Rsr&K[Qf#Xj_4k} Ie) -(PvJA]2 GpPn=~(ELwqQ v_UEq\)WvWkarR,쏵;q}VeLec.eܛaOC9svmkraU|Pk_|UcK]?[^f*tlGC 7DhNI7<C _䰧D2o+B6] 2j'6}|$E~nPɶER;U}k}0)ozxf@)Iχg.xݤP}ҡs)21i3uQʴ G?ڥ8}8 9q"%7{D'ȁBv<&C@mn]jTS Jkl 7jo2r?:Ii9G,m{9 "A7f/% 59rL2 m;nf }9kX1@'~N/ ~т>IG;, t)+;-]2p]ǕQS]^\:4 {4_mL:IN0hۓ>0nn3:M*@Y݋RYnǏtdZIxl$ +lCW-xδ]~mlXg.4;pvF9͚.ղ("c; ?I>Hc'>L͸ P>v0\LdQ*2bckv~ \.{ч{dBӬ*>#1^o Kɼ+=EbB/{zM;! :Y%aJFU%<# v `RK%Gn i'M('E 5np1*X-Czg W_f!gAYZ,wu8+w*2Mto\/ `rqvFlof`DeT`1.<,M7dmA!t5yxMWv&!|\w L:pF˺/E~c'Ȼ)|T+1`u +՚hIYR`( XPAJuBK+2Sk;0ɸclR@de'1FoUgrqe=ӥK1?b%[Ŋ=!BVdH<+ubz c&?Gt 'm*I Aٰs6셢T/{:i:֐?0?養C#@5F&4Kazԝ.M0I*<+(ݕ&[^HgkCYu&_W8 ]U4*E%G-c]F9eҺfyF3 I5F>rj Ȣ-‘F(l ; Wğ0V]wDLj5Kbb5! 
7@XK)TSS7`4DS9UPF8 6m e:ɕV)N&ârdeybreXp  9rlu2z[zQ,7Fը%@/Åz˄9F,TLѱg:Qaa+ϏpxąWb˱ilT?X@Syvf V؂"3M8?hlre}?yUG*xa\wf`$-6)a;2 nȩhC.WZ?"Vob$y_Q;dk3C*)Z'r -| {+2?9vm٦pƅ0/n@5 {qpܨ%Q"!v4)&0x)x6<'x5ݲvn<*3p2Z8j53'`ԣT>Zg1ި6!d)jmP=Kxљٙz2`@c֦\\m.4;06>%|ՊSY!~±xprF`G٬CT͋~[wo"/ ;Igp'67+H4m·9ikh;-Of{8^[>s-A6ͳ]?ϩ #;fQ/Zq\3NtGfXz/H#G Uajtû`Z&:2'xPjlCh jLbJZ$@M&Fa#:p'B͑R݇q P!{BYh`"pA5HO O/fUx" .`6a-Kb#:¨ّSȳ!ydp&G҂h46P !.MBz"-D]1S#8QuIirz4VinqaYGZxe7Ъd IM{~pXP@z&zmO.Y+@ Uj~}P*+CW7s[jcc7j;)"0i?'f S`*FzC>X4rE5/٪$!5/((6\[+^J&tڵ0 43C]Hage"XS[Aè3)KA*nt wRlUp+)sEG!N,Y\lt1cI̻ApAM%9ث(ޘM^ RtDvgtB}bm+w4*cH`L-DnaumW_nc^E,i-*2z^\۟w pUto9S QVMdP]g1;+/:MAoR)ҁXj82*9xij|gV5Z!7ZqPW߅bM嶋)uD\N|͟Z@S}&6"BȄICY{d$Pmڕx[<Ї ࠻4$є{ Г߅hBy{!J*~/v]ׂ]& K+bt0ZOg튊Mmogf^]-ێ>B3~%]G#?ٟ.Q˯b1'"AC'G/l~ޑZT>ȍWeqR`<ޢ$sxf,pIxRQz_-933P&|4F&5ΓM(I(^5x셞z[1s:}.iNGٍ_n]s$!(4onrR8sn"mL:/ ܠَ#?lG:؉؎dsOFtg 4]Qy͘]&*'}CO<8bZC&$LB&L9ұ|24X IKaCV])5Dד׉Š+ٟF?xTTMpK^PE.c7Ίת0#?d^FDszBiA$oi%^7Db}"3@ٟ.Q͒uITn ͪq=ox!8\s{۽lׄakڹOd#[̧Ԣ5C1H/jlP3]ޮu8KJ&{ho9b.uY@DIS, {έClcώ3_4ۃv( N{hؓ_J $egj_m&ʤՑA;N7̕0b LF1c!Ң6Qqw6`PYVQ͚3}Aj)0nWz,5\r[y0K?~ 3E63tIl^0p]Q!4NU}`"'0ǔi乵B' ~2TQRX'it"0C;(x#XWKte IR|='d1Bw"Jr$gU0֝Co#\"|zI,ɽABp@G=4O_l@Wl/?G 7_X`!7[BB6p\ϲ`oUEa$0(}kR۾0 :d4O>]Gbx8ھEFD":2:`K7wP6At]o. پꩲ9-V(\ͽL5S Tf~m`=3QJE|SYzT9yi)z3'a:\F6,l(aY [b{ܶ.wOXQ2&~3>S ,q嘙D/je8wgJ;L̹ j*:֏ C/5~]>Omc{u;ڃ"ya_FAtPPh'ě=u藑ިdK9. 6R*M*=yKz֍۞eG]A:w5@2 -;]f$LMQeEYodå*M!b qj\s1xCzBܽ3ꔽWQ,EHխIzkP)0§SfMsG}4M7|]˝cv.WdR f@; 9[ iǐe>q2|i"Iai6oz2on ^I@I9VE$6;h1  \>9n[^Bc8hrR5Jg"'_tR,aMoUE(cQw ۱1\~4=*U=>AaVހs@6N5~6yu:mͭUۛ^`]>w@L‹3D%-I(}c䒥_SH҅jjyFYȹ1Z>r҂KaZ>kR=W!lը+w!➴[P&ƨ4x̯k"RÖs/E0)TFٔK̓WZ*b3y[?|P6h ,3x/ 4 "= V؛xk0:o-)̓k  Piju̮LKO2otr?;d2l'1rrbaK .8ᵷv5㾢'><)}$k2cR-ᛑ&`3mST)RT?Ec&{iWIAQP[ѣuwwTldu 7JwM'` K/bEI> }qWo&{它OY&pV[ZQL2h0Ռ6OѾ!hAuA3279‌Y8qK+_ҬNωl%0IѬ~NRq̊ 3#T;J!hGRk6q`%V?'Ϻ w, vݽyrgmѳnXjwR7Q!Еԋȶ{y'~y5bƏwhnF 9 CЇ5jd.DQKWB~F3D_F_'\p@ $;}[PwFz!}(X`~Z5m4buBV|TA#w&uQG^EߧodKDϢ6]g6OW?Y Hƅ$'X`qc|v#h{@[N{^;ۏEhN6@H rBH}lN`[靍jԌ ;1m֏[wIyDXZU1E1V)ؕg0zFo ZcVp(`GjF}g Bvx"tVO-Pr19=SZ:(L rوs[l a+MNݛҕ4{a#|*H$&UkB~-= 1u?^@ne}석3oE+j(]A}0 H6&;|3!Dm=;t2Az>p/Xͬp &T{$}DTJJaվp%L_FAtPPhT}rB#cZ@(S}k}UU"K8L{lRIPc4NC8w%!GR`p%ilL9>FuS| ,3ŶJ,rrrJ< .nW^SƮpTc\YUb ^qT#cGoy`]r36ZÞ6qbҞ Lvw(cЇ&"1ߙm`QfAn\3PmGwK[zs%M`+Rēv[6e%xIAjTqAE;\eTdYu{b]㳪+"#gbL bD x"r4-J:..PĄz[|4c0EhOS+ PXA2mY%[I XxM 8*bXI+uVdUz$99wȀDه^%pdi{< ?qYҊn9vIbfk 9ak>!=owű >/#WЖ?q0J:cᇚ@m:Ьm~fC0ClBK38Y9c6\Gp)e*_r> U})0-i;c^9nsq"rW@B$=v:[ \|b^_">;= vspTp[ wF!JԄ7ԛħ[~0CUM('ϒ_8숢k/Fi1cR5b)+ǤZp PQϣ3NaAW#OiY! 0Aqrz- s**o\pKW[[i&,0l03o}߀=9㞇B2S754ӂ4j}.~=lX4"G3/R4Mt}~S Wލޭ'`)/[i LA$]y-+ ^v.8eګ h>sIvQa3䝐T(<Ʌn^ `IIQy)A㐔CͶ}PVIog&3-U$K^~Pm.t_ e?a>eB:m ,׺J|šxfK1-ZN#sq ;Wuy]uQpMu5u{)iF!r`Ћ*D^zQ_~w~[]9-0Zk]h!*L7EX@xAOnlZj]qvLϧnF*}c".yD 1㿻6mR˒|c,0ymh-Zm"byC^~"A qЀa lNGy:T |+2b.V7r)Xg-j83I^5)Wӽei̞V. Pn՚; WW%{!01ڲ'uJm YJn7QfHtclp .d}R`vcjyS/ZfZi] >{]{-%;ZU+yݿ]q@oK19Euܡ7r")[Y7hמš7`;۝P?I73`R>-cԻ; Mu(S9Դa5h:ᱟPϖ4u^i+~D{!9Jn$sC&@dɦ}¼mҷSri3=#*$ B,Їʧy 89ϦynqIY"5ꖙ;N%\g W/qUT(r5  xZge.tC+M`"b! H6S]Rל>_'N ylmiT䗉d&q^_ڎLq⢼ *gM&lW/PAqI. cH+7KRgu0~Nk,#Ύ_pV!o6)8 V\ -[!3DEyDAglPnw?uMa^e* Me$l:%gBa>"M[3HвOSJ6q/R1g;8q \ uf 56tµ ԾTG-R5+ aT ?8eR*&gC hy&_sg$/iW0zbwiUnPm&DMA0.GHI t<]`X*#gYYÖBȐ6a:˷ "mxJqYs_? 
{aPCmƽ9׌F˜gs!FgWZ=s ֻY[.e\MݴEFΣJ-)b榍>=ZM)j _MN B9<.H!%抲<56xW"(NWAOzvvuF8=׎ڣp𰔊B|m}oBsʁ9SYy.,4LZE%K'hԽ[2::By0K)/}ʹ Wr/-%@ЖC;%bR KB©@+us>*ʻ C vկTWQ -IsY 8ҭO}#q|J s[HտsdpNEc1I \mXPO(]U[QBuh†futR9yLW+::eXЂ#SR+o8ık%-a= s~>pM'':Ɉp9p{jO"ǗMPw!ɀL.|S!e ޓRO\\ޡB ʚj&&p[):~J2/'zTd~wl݌*'@AO_/og[t˕LkcmW`#nKUU|R RxۯLA `+"s>Z T k{Wiz,bG K{}x۽=7)j3C?u0:vB{WzJ@pW0krTKSPx ̑^ňvoy;PFXIʔg9΂# /s1L͹kd-<'GoUvːGлm?M,SW붗&='zju[,' !Z΀Wtيcʅ%36}1g!^ٚHA ٷlĿu /ta| &$)WZє{(E]ψo9۳1`G\egT 8GIbvmJҀ2rn"PxcI#>^E$p8|?6o'5xsJ[hjZe3 uu B 5rzv$h @u$,4o :Wrc eI>NC59\geg^#L>]^vJe6aŭZ#\wS/ͣ3h Pp䫯J\;du6N\ M@n5.70yq8tLF;7RBk#Pl^6mȱ'7]^P@+jU,B㝬jU@ř׌NM+r׵_IPzA('B/iDMAl_=o;`}-[N 1f;>Z%Z5{O?X"==5IMh p}OB]4.Ob]V:ޫ]Y/}WĪ9E(.t{nU:;!E/e).$@= ec~6~X]R;ZdS+};68#@ӉP)?I& lRr4n6FioXƺ̀3@ٶ;\W({Z6CqiD>ԿƖ~m7,=ϯXuZ!e/sfu_qxZguuf{a/gahs?kzbO;ZOٛ#$u:e#_3%æ̢9)o0dQƸa_Tl S*n9Euq\0G4@%ׄ䟗Si=fN-+ ^zK8N}'3[r;rcO]! t 1^MeknIP+EiI5ӥULcKҦ˟dvB|͙`$KMڋ.|\-J~'4_(&U0s1fk\PDyGnfgjr0fG"6!cFu-`/1"d4O\U坴ujO c~]9lIr{qvq5ޖPEU,E:X}I(A?/&'t*( ΘUJ9<}@Y_UF22.\h?/Bs}i75^E©X6BU8 N& ycj6r"m4Eu'fP!@4*ΐ /m UXv'i0kJ%U]wpj~ɬ]EHO"2c(ߜXONE}죙ρVM X_R*k20j;eVj(Jt@Z k7ޫ]!zъ.W?ݲ XH@ GH|hŸrmsj s(xpzeCTڢuBWJT“owű >/#WЖ?q0MwX*}tZ43:0~Hb!uŔ,c19V500W+IX0=M)GwS%Akv||"$ƚ-ma@/0uy +^}H _># Nl+F{e(b>5lT.d/8=-vg\b7'msB40pqZrI&cdGt7,iLP[i`gHC*m*(Ml7ɕr*2rU'FXE@N9FQ`wk5sn3]X K | ,ĪڹضUՁG>@z,J,֖ ]؟2$r= %iqQoؽlv2,x7W"s7E;pr8wX_v䪄tVJєKaO:>axOΦ&8iYo0(c\j}څ`6QY=,nE*Iѻ.{?e?P6 uuqZX(gH2p[L'p/s炍ӗiQT`W$|GXm Ncy f44< %"$Xܵ}ά<"mnS\ io$9i>yT*z%tيIxђi88C`׍vҼ7R$Wg֢S/g}W*.&;30jܽS`5h(M뢔{L""4u+a6;R0ү9.OXSi3.|^9|?Uߏ2ZAOҫpUĴ|RM|\c@cL>G'ga'[1zcGoQ,O2\aV^M-B@^n(บ|ˤs(F† ^QK!qW޺k$Q<,/ :jn|5k23@xorgp~?j#_ȣ&vmuؤN \|0wÂo>fi̲5M }^Vv!Nའq&_(X!{3r~h N6 $Hb'P1]-Bjը3V.L5{`%W &v73;!@T'/$-αO˗7 #μ@v9:Kg5FcEnH-ұ1rAҺB{Y-TTj} iĖ<lo3bѫE) y+dZ!'t6rD9QW%Dkw8 ީ dM`XE+uj&JGR  :5X.ۻ=` q`Z#X.160ftfYuH~s4,F#Y?5c˓%PSԤh{h zmZ<:0J=ɘ髑bnmp:P$t@1qy,)N%Yvmh*Iճ_>w@6`g7݃ N:3V!XW+%^8jC? :IaZH1{>m||of/ti=qRjOc ![/dRms\=82iD=q1)pu*H,XE(d!Ŵx]5}bBx(ت]F:_`j&]`{$'~PXSQ cp{QyY 4Zrm%/LyXB>z~} CpbW?n? հ|Gm ~tjS7L)5Oy}oVq\Aq0$l8$: wL-s<(MHc?bS~Bϝl#P@ S2*~V+ kZ[W305s41 < X8c?%3iti-yC6:JVN W , ȵvE1VqU'E*DgxK$3tVQB=q)r#QQ像[aӁwGEH<(zR4T>gg+]K2=i!9*^g1޾7@G5B֡+6??v}hf_݊"@NN iv0Whimmu>]'IsšhrnzU@75-"g䪢 L.}QR_gZ Ќ,XJ -5+{k4^v5MGWͧq&bTD$A/)7/|BPWFGtg] l>Fe=y )dJv*^:JOwH6Ug.Xb3u7WaE sS[`s dJA.$͎9?Ӟ P2u&(0! 
B^VqrVǭf \1[E6Wb%HG2*SY F XoæAK:f=萬L+pbTWTDɷEulG;f]pϔZ}W 9L^-u$^6BIjG%aTnn4.F 󯓞ȈJe:vPڄq׸&t0(,Do8N9N]ػz nrNG0#osFKG.|[ _%R, !zA%|i+[m,- ;{Ѓܷ`BI{Պl>|)rh{i7IG jux&Pp?GPJ6/8CQbLk@4UXDE,˔#&hdNS~sSYu5mtaQq3l;+M|;n uGNDKdsO-{M0}ju+JR)8@־3dHi!ؗ-`+c^Fx#eTc {{YQ|I/(eUʼnV~mQdLȼ J#+3sZQzR(X{ X#w{g'V."}/&v:qtg-zoir)^G3|Drc5SLT|@V'lῌ5m~.yzb3g=P~9+ΤmpHϸa08%mgSʂKhY"h`XEgD~.ݴg![ cyaEf-L%b|Ԯ{NW&&SZM3`X(Ȥ\({|̰zS&XW` S:Z݉ Z8IqNqD:8kx?6dk,3 Jwzw6#'ئ v}m[>%jԄvdXgN^7% <؏*.dwaj'7;^HbAlN?ОcUЩ+Ho2AŌKyniZl7:nke*Yh 8DQ g8bʼn?,ӄtF\TƆNȬj]2ښߢk$7ǙN纅 <+J90-q?5'M—z(Lp5|ly22uجAA4(*=DHvrKttjTrDP Bz]ALĿx6T`q ]R5RP>ǁ32}۸}ee-h@b*1uF\\3 &)Q:ÅF\'m mdÿdnWqUQ8!ӋƧm1CKB]JlarVqGƅp"&\;R0#9'»Y‰dހ+|M bKHP0]4Ëjpq}ԀS`CSrߗMJ1)E@}wάCMs؂mxk т-t!{?b"_Uy&QɅ0D݈:N.b|v59.Nԡ+ n:i:M{d MKbi8y^qfj݈⿸]i g+l_%Qʫ/{w.?n ;^P+*#+GC Ej oTgp ^>/kC ]+<3w$<|e=# ] T-ypOUO%W뵱ܹw|"7%>},?=%CIsD!0455SeK1ْzҔaz#4Z&PX4]hPRxpPc%˄s2I0Fl@| uI//ahB9 ҤٖTf0TqbT72&ƙgPJIktU!L-k[PhXޖD|K\~#( >(K6| 8sX0Hi- 9/fzH2%E Wl%-rx#G |sz̡W l:6 o S{x{Es@p-}jkRщ%sAnA~ NRn fQo1c`F \|:dO䝕e_CSӗB(yz;1J&q}?O=;.0 &3 d Oh+B'Ř(?)g[&J_Hr7w|K{!:_ BćSvn ;ήTد ZS n7'_w|EGS.1Z\ƖMkrE7''3_50nl^̞5)(wH [PoĦ& ]AYߍ%n*<Oe.ra6\_6:BǺ񉍔Q< 7WSk/-'o{3EZρNt11 sFdL#*rDca|f?F&}QOauyq0QUr R .G/oq*>% m#o\C-pYT} MWeG!S?u'hm?W=l9X)9Lۋݞ{%Ix&.+3k}([-~Bj=p([8 +H'|b{E"j=Yfk?ĝ'FG;$DPIb<d9k5G3hOQECVSFDƺ|gNu擐q/v.D+^ U7Ƃql%x$3t B8@d\El"H&ٗ0Q0tҫ|p2|[t:inCk z_ lyTupӱ0.x!t)n Ol dcAcE2 Epp'z$v,X􆴎2g9J!%5+%+Ci2l  !I淿ǃ|y~XmFENL..])fm`a4{P8bkkp܈c԰ůQ+~69%,^ bvfvq@9Z -A6>yYwIh<.+s$:ClV ~X:4}Yh=mgp2[/ [b 퉄\[v5 SDBQHPOpk R Z JmpEډn?a&֗]I b}gD 9_KMAr+aG{%a >}^l0:>Wհse:nvLc,(_Xi"CCX7(J=4ߨ+{MnvJ` $og]WNP1&lF^q[UdÂ_%>I\eKOف[y}*DVR-&SGeg鉦\މ_pMøf:;0|b] %O@+w[cǔfOrn< g~C OZ峔~A/cOuϖ"q?oioeKh\WD PkʢmbM}MVK<7hDbݹ :o/ilgk92dΚKg:fM5 C#X\)\Q4>! :gY!c Y{97vsOdKcU4۬"J YÞCcv]Ke`QMׅj^ Tr~O) mZJd;+\8* I3iv* oͫTb#`VwT)R|+h&Əם<B3ZC(+="\D륋4_x'#Q0"@䊜lɏ]rR1!JrgM%|-sLlN\1::U TU6¿_7F Ίl )O'6G/ :]/DMʡɐˆ|{銡12an0?i儖Iq@A% $'σh/ٹг ADw"# 6;D҄ܵV%OIð!0ꧾ˗6UcN|?ܾ-tJ0t߫Wdmx1Z>h_d`P0MFLRt7ZN\\)oSt?}^χll~wP|C{oQ:0#adjWZ!)};RqO-]3ߋPQq"<2t+ܦWwLW9j??YR-?EEtt(=wrY_g˝֧.& ho( 3$}4WorPUW .7%wxi `'aLXmYyY6u-Q(3R|բ,RCNfμz2ƈF ϐ g@}&'ؽFG' BJS_/6\Y#EUL73!;lAAD\ yX؂!Dzp ;Aim7Ȭ WaW(8O f~D%݊rPt[w[-|s0)u.صXaWO\jG;#0s ktLx4 RD~mu,#Sx-1y(,?}r8 j:BuD_@vE|Qo+cMs-f' ZWg?DqL,nn+(j}xgy9Wה}VPْV{"qfdpG}SeΝŷ@WKjI@vrԪ0.LkX4]3(Ju2 <5k:j/Pژ+h;f*mKD6j0ex`~G mnK4.#^-\Z 6ۃn{ j>teesd9Dz?&ADk1q\nr?9wǴSL5x?WOZ̭/[j 0,vVc@&C(ڨiX})CyFTm(%Yed&ssC+8ݸtL~9H #[I WԖ{©N<>|qjAb;wj"@IAK*ӓoa{j &"W#[Cù 5$w5#;ϣp=ee>{"OK"2o.Yqp־un|a5 kYG+JgW1^3 ѯ$ ۴7S4B~x-Mj{߷ np/%eRЂ1,T1V!J4b1aL2~QVQ.l=-{0CH[Inv.V~<l~w}5#V eרh \Α( f]tWfo*V&4"?<N$\>pILKDZ[%^ݼ-,ei{gqȓ4fx]:F}?3nnS()B%"ZѸ᫉mS,(nEPL8PNѤY*h"SoD+T4f4ƹF%%UwCBĽCHhNޱL(`IVzWþB|D xq߹ܰJԸQvΞ6 !y"^R" $݄fqD`{kKX'q1tMEA?Jk:<e6ݞVi p"x"$_G'i/^_NƸ@]6t 'JDLS8WʔY'͖G&SNEqߍ:J.m A3$Bg+ cEԈs4GhN%GZݤ~$ h8~Z%b@>!ٽH7kO)֌21貦e[L<ǜЂ0Ցޡ]ǪYKZ‘K lzCU @,4kcEbӢK =Q=.;ħ)r%Y]SGq%fSB4q&yS{1!Ju^Ot^f],%er.Z&͒Ȅzדpm+nzl>n?֞|4V9༫b3((W9(f@Tԯ|^wȟdڦwP޲&1x+HE myH)1nb|EkH6-^OHNz̒2 F݈|c+N+" D*EN+*v(Zx^.7rhijirEoUV&C{v[[SZQo%/ja"uCQY~m?KnJ[uC.$Q_con* q]{VsG(pi 1BB'UuLk>5uDʠTY $D WspϾIRBgDDnw?,F8+2hO:$'9/KNYw%;]=!ހ7wLއCG} H0C -2$鮂TKc<&;>#%szŁziW2“ kmu+uL鮝NNv ~QF@Ga=Z]K47g d0%sFUwW6[46ZWR!|v,Ϟ% 5 %(D D# [&TV~p&\0IbZ@":2:t׺f؊s&n:Fp)3&0䤂2# WGGݒD8u—;5 [SRn7G+!#f^3Өadӆb<~A-,K**; "6 [ ?׎ D[)T{3 upc 9B*Hbf[ njK {X锊26܃O=Il);]竢#\CYr õ~,-e'Wj400.ڠJx~1 pO_$7އ8vG%$ UIؓ&|)qa3 ƮHF$ڋ aFܛJ)R+`aΰ^*ޱoS98T|Of+ @S;*LՀw:JjRfF'RvHﮂ:FlM,EɱCR'{X:/MkRkV8PQ6n!vwz+x?|݈E[٨Eϻ"?ʌz!K̎1fDΊ u Sy]} "BEZ x~wەYly7 tvey: 4S/7M%ʶboE\`B? 
[binary image data omitted]
slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/mainwindow.h0000664000000000000000000000645413151342440023236 0ustar rootroot/* slowmoUI is a user interface for slowmoVideo.
Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef MAINWINDOW_H #define MAINWINDOW_H #include "canvas.h" #include "frameMonitor.h" #include "renderPreview.h" #include "dialogues/flowExaminer.h" #include "../project/project_sV.h" #include "../libgui/combinedShortcuts.h" namespace Ui { class MainWindow; } #include #include #include #include #include #include class Canvas; class ProgressDialog; class QShortcut; class QSignalMapper; namespace Ui { class MainWindow; } /// \todo Call flow editor from here class MainWindow : public QMainWindow { Q_OBJECT public: explicit MainWindow(QString projectPath = QString(), QWidget *parent = 0); ~MainWindow(); void displayHelp(QPainter &davinci) const; protected slots: virtual void closeEvent(QCloseEvent *e); #if QT_VERSION <= QT_VERSION_CHECK(4, 2, 0) virtual bool eventFilter(QObject *obj, QEvent *e); #endif private: enum ShortcutCommands { Quit, Abort, Abort_Selection, Delete_Node, Tool_Select, Tool_Move, Tag, Help, New, Open, Save_Same, Save_As }; Ui::MainWindow *ui; QSettings m_settings; Project_sV *m_project; QString m_projectPath; Canvas *m_wCanvas; FrameMonitor *m_wInputMonitor; FrameMonitor *m_wCurveMonitor; QDockWidget *m_wInputMonitorDock; QDockWidget *m_wCurveMonitorDock; RenderPreview* m_wRenderPreview; QDockWidget *m_wRenderPreviewDock; QList m_widgetActions; ProgressDialog *m_progressDialog; ProgressDialog *m_renderProgressDialog; FlowExaminer *m_flowExaminer; CombinedShortcuts m_cs; QThread m_rendererThread; void createDockWindows(); void createActions(); void loadProject(QString path); void loadProject(Project_sV *project); void resetDialogs(); void updateWindowTitle(); bool okToContinue(); private slots: void slotShortcutUsed(int id); void slotShowRenderDialog(); void slotShowPreferencesDialog(); void slotShowProjectPreferencesDialog(); void slotShowFlowExaminerDialog(); void slotShowDebugWindow(bool set); void slotForwardInputPosition(qreal frame); void slotForwardCurveSrcPosition(qreal frame); void slotNewFrameSourceTask(const QString taskDescription, int taskSize); void slotFrameSourceTasksFinished(); void slotCloseFrameSourceProgress(); void slotRenderingAborted(QString message); void slotNewProject(); void slotSaveProject(QString filename = ""); void slotSaveProjectDialog(); void slotLoadProjectDialog(); void slotToggleHelp(); void slotShowAboutDialog(); void slotUpdateRenderPreview(); void slotShowFlowEditWindow(); signals: void deleteNodes(); void setMode(const Canvas::ToolMode mode); void abort(const Canvas::Abort abort); void addTag(); void signalRendererContinue(); }; #endif // MAINWINDOW_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/macnotificationhandler.h0000664000000000000000000000123313151342440025555 0ustar rootroot#ifndef MACNOTIFICATIONHANDLER_H #define MACNOTIFICATIONHANDLER_H #include /** Macintosh-specific notification handler (supports UserNotificationCenter and Growl). 
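    Usage sketch (hypothetical call site; only the declarations below are assumed):

        if (MacNotificationHandler::instance()->hasUserNotificationCenterSupport())
            MacNotificationHandler::instance()->showNotification("Rendering finished",
                                                                 "All frames have been written.");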
*/ class MacNotificationHandler : public QObject { Q_OBJECT public: /** shows a 10.8+ UserNotification in the UserNotificationCenter */ void showNotification(const QString &title, const QString &text); /** executes AppleScript */ void sendAppleScript(const QString &script); /** check if OS can handle UserNotifications */ bool hasUserNotificationCenterSupport(void); static MacNotificationHandler *instance(); }; #endif // MACNOTIFICATIONHANDLER_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/frameMonitor.cpp0000664000000000000000000000510413151342440024046 0ustar rootroot/* slowmoUI is a user interface for slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "frameMonitor.h" #include "ui_frameMonitor.h" #include #include #include #include FrameMonitor::FrameMonitor(QWidget *parent) : QWidget(parent), ui(new Ui::FrameMonitor), m_semaphore(1) { ui->setupUi(this); m_queue[0] = NULL; m_queue[1] = NULL; imgCache.clear(); setCacheLimit(10240); // cache size of 10Mb } FrameMonitor::~FrameMonitor() { delete ui; if (m_queue[0] != NULL) { delete m_queue[0]; } if (m_queue[1] != NULL) { delete m_queue[1]; } } /** * Sets the cache limit to n kilobytes. */ void FrameMonitor::setCacheLimit(int n) { cache_limit = n; imgCache.setMaxCost(1024 * cache_limit); } void FrameMonitor::slotLoadImage(const QString &filename) { m_semaphore.acquire(); if (m_queue[0] == NULL) { m_queue[0] = new QString(filename); } else { if (m_queue[1] != NULL) { delete m_queue[1]; m_queue[1] = NULL; } m_queue[1] = new QString(filename); } m_semaphore.release(); repaint(); } void FrameMonitor::closeEvent(QCloseEvent *event) { QWidget::closeEvent(event); } void FrameMonitor::paintEvent(QPaintEvent *) { QString image; m_semaphore.acquire(); if (m_queue[0] != NULL) { image = *m_queue[0]; delete m_queue[0]; m_queue[0] = NULL; } if (m_queue[1] != NULL) { m_queue[0] = m_queue[1]; m_queue[1] = NULL; } m_semaphore.release(); if (!image.isNull()) { // add some better cache mgmt QImage *_image; //qDebug() << "cost : " << imgCache.totalCost(); if(imgCache.contains(image)) { //return *(frameCache.object(path)); //qDebug() << "cache : " << image; _image = imgCache.object(image); } else { _image = new QImage(image); //qDebug() << "cache store : " << image << "cost : " << _image->byteCount(); //TODO: provide a method for that bool success = imgCache.insert(image, _image, _image->byteCount()); if ( !success) { qDebug() << "WARN: memory error"; _image = new QImage(image); } } if (_image != 0) ui->imageDisplay->loadImage(*_image); } } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/notificator.h0000664000000000000000000000503713151342440023377 0ustar rootroot#ifndef NOTIFICATOR_H #define NOTIFICATOR_H #include #include QT_BEGIN_NAMESPACE class QSystemTrayIcon; #ifdef USE_DBUS class QDBusInterface; #endif QT_END_NAMESPACE /** Cross-platform desktop notification client. */ class Notificator: public QObject { Q_OBJECT public: /** Create a new notificator. @note Ownership of trayIcon is not transferred to this object. 
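        Example (a sketch only; the tray icon, parent widget and message texts are
        placeholders chosen for illustration):

            Notificator notificator("slowmoUI", trayIcon, this);
            notificator.notify(Notificator::Information,
                               "Rendering finished",
                               "All frames have been written.");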
*/ Notificator(const QString &programName=QString(), QSystemTrayIcon *trayIcon=0, QWidget *parent=0); ~Notificator(); // Message class enum Class { Information, /**< Informational message */ Warning, /**< Notify user of potential problem */ Critical /**< An error occurred */ }; public slots: /** Show notification message. @param[in] cls general message class @param[in] title title shown with message @param[in] text message content @param[in] icon optional icon to show with message @param[in] millisTimeout notification timeout in milliseconds (defaults to 10 seconds) @note Platform implementations are free to ignore any of the provided fields except for \a text. */ void notify(Class cls, const QString &title, const QString &text, const QIcon &icon = QIcon(), int millisTimeout = 10000); private: QWidget *parent; enum Mode { None, /**< Ignore informational notifications, and show a modal pop-up dialog for Critical notifications. */ Freedesktop, /**< Use DBus org.freedesktop.Notifications */ QSystemTray, /**< Use QSystemTray::showMessage */ Growl12, /**< Use the Growl 1.2 notification system (Mac only) */ Growl13, /**< Use the Growl 1.3 notification system (Mac only) */ UserNotificationCenter /**< Use the 10.8+ User Notification Center (Mac only) */ }; QString programName; Mode mode; QSystemTrayIcon *trayIcon; #ifdef USE_DBUS QDBusInterface *interface; void notifyDBus(Class cls, const QString &title, const QString &text, const QIcon &icon, int millisTimeout); #endif void notifySystray(Class cls, const QString &title, const QString &text, const QIcon &icon, int millisTimeout); #ifdef Q_OS_MAC void notifyGrowl(Class cls, const QString &title, const QString &text, const QIcon &icon); void notifyMacUserNotificationCenter(Class cls, const QString &title, const QString &text, const QIcon &icon); #endif }; #endif // NOTIFICATOR_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/CMakeLists.txt0000664000000000000000000001330513151342440023442 0ustar rootroot # Building Qt+UI apps: # http://www.qtcentre.org/wiki/index.php?title=Compiling_Qt4_apps_with_CMake # http://www.cmake.org/pipermail/cmake/2008-September/023908.html include_directories(${slowmoVideo_SOURCE_DIR}) set(SRCS main.cpp mainwindow.cpp canvas.cpp canvasTools.cpp flowEditCanvas.cpp frameMonitor.cpp renderPreview.cpp dialogues/newProjectDialog.cpp dialogues/preferencesDialog.cpp dialogues/projectPreferencesDialog.cpp dialogues/progressDialog.cpp dialogues/renderingDialog.cpp dialogues/shutterFunctionDialog.cpp dialogues/shutterFunctionFrame.cpp dialogues/flowExaminer.cpp dialogues/tagAddDialog.cpp dialogues/aboutDialog.cpp notificator.cpp logbrowserdialog.cpp ) # notification if (APPLE) find_library(CORE_FOUNDATION NAMES CoreFoundation) find_library(FOUNDATION NAMES Foundation) find_library(APPLICATION_SERVICES NAMES ApplicationServices) find_library(CORE_SERVICES NAMES CoreServices) find_library(COCOA NAMES Cocoa) SET(OSX_EXTRA_LIBS ${FOUNDATION} ${CORE_FOUNDATION} ${APPLICATION_SERVICES}) message(STATUS "OSX Additional libraries: ${OSX_EXTRA_LIBS}") set(SRCS ${SRCS} macnotificationhandler.mm ) endif() set(SRCS_UI mainwindow.ui canvas.ui frameMonitor.ui renderPreview.ui flowEditCanvas.ui dialogues/newProjectDialog.ui dialogues/preferencesDialog.ui dialogues/projectPreferencesDialog.ui dialogues/progressDialog.ui dialogues/renderingDialog.ui dialogues/shutterFunctionDialog.ui dialogues/flowExaminer.ui dialogues/tagAddDialog.ui dialogues/aboutDialog.ui ) set(SRCS_MOC mainwindow.h canvas.h frameMonitor.h flowEditCanvas.h renderPreview.h 
dialogues/newProjectDialog.h dialogues/preferencesDialog.h dialogues/projectPreferencesDialog.h dialogues/progressDialog.h dialogues/renderingDialog.h dialogues/shutterFunctionDialog.h dialogues/shutterFunctionFrame.h dialogues/flowExaminer.h dialogues/tagAddDialog.h dialogues/aboutDialog.h notificator.h logbrowserdialog.h ) # notification if (APPLE) set(SRCS_MOC ${SRCS_MOC} macnotificationhandler.h ) endif() if(APPLE) set(BUNDLE "slowmoUI") set(ICONS_DIR "${${PROJECT_NAME}_SOURCE_DIR}/slowmoVideo/slowmoUI/res") message( "OS X build" ) set(MACOSX_BUNDLE_INFO_STRING "${BUNDLE} ${PROJECT_VERSION}") set(MACOSX_BUNDLE_BUNDLE_VERSION "${BUNDLE} ${PROJECT_VERSION}") set(MACOSX_BUNDLE_LONG_VERSION_STRING "${BUNDLE} ${PROJECT_VERSION}") set(MACOSX_BUNDLE_SHORT_VERSION_STRING "${PROJECT_VERSION}") set(MACOSX_BUNDLE_COPYRIGHT ${COPYRIGHT} ) set(MACOSX_BUNDLE_ICON_FILE "slowmoUI.icns") set(MACOSX_BUNDLE_GUI_IDENTIFIER ${IDENTIFIER} ) set(MACOSX_BUNDLE_BUNDLE_NAME "${BUNDLE}") set(MACOSX_BUNDLE_RESOURCES "${CMAKE_CURRENT_BINARY_DIR}/${BUNDLE}.app/Contents/Resources") set(MACOSX_BUNDLE_ICON "${ICONS_DIR}/${MACOSX_BUNDLE_ICON_FILE}") SET_SOURCE_FILES_PROPERTIES( ${MACOSX_BUNDLE_ICON} PROPERTIES MACOSX_PACKAGE_LOCATION Resources) message(STATUS "slowmoUI Bundle will be : ${MACOSX_BUNDLE} ") set( SRCS ${SRCS} ${MACOSX_BUNDLE_ICON} ) set(INFO_PLIST_FILENAME ${CMAKE_CURRENT_BINARY_DIR}/Info.plist) configure_file(${CMAKE_SOURCE_DIR}/slowmoVideo/slowmoUI/res/Info.plist.in ${INFO_PLIST_FILENAME}) set_target_properties(${PROGNAME} PROPERTIES MACOSX_BUNDLE_INFO_PLIST ${INFO_PLIST_FILENAME} ) #ADD_CUSTOM_COMMAND(TARGET slowmoUI # POST_BUILD # COMMAND cp Info.plist slowmoUI.app/Contents/Info.plist # COMMENT "Updating Info.plist" # WORKING_DIRECTORY ${CMAKE_CURRENT_BINARY_DIR} ) endif() # Without these includes the promoted widgets fail to compile # since the headers are not found. (For whatever reason.) include_directories(dialogues) include_directories(.) include_directories(..) # Embed images in the binary set(SRC_RES resources.qrc) qt_add_resources(RES_OUT ${SRC_RES}) # Generate header files from the .ui files qt_wrap_ui(UI_H_OUT ${SRCS_UI}) qt_wrap_cpp(MOC_OUT ${SRCS_MOC}) # Include the generated header files include_directories(${CMAKE_BINARY_DIR}/slowmoVideo/slowmoUI) link_directories(${FFMPEG_LIBRARY_DIR}) add_executable(slowmoUI WIN32 MACOSX_BUNDLE ${SRCS} ${MOC_OUT} ${UI_H_OUT} ${RES_OUT}) qt_use_modules(slowmoUI Script Widgets Concurrent Gui Core ) target_link_libraries(slowmoUI sVproj sVvis sVgui ${OSX_EXTRA_LIBS}) #install(TARGETS ${slowmoUI} # BUNDLE DESTINATION . 
COMPONENT Runtime # RUNTIME DESTINATION ${BIN_INSTALL_DIR} COMPONENT Runtime) if (UNIX AND NOT APPLE) # create desktop file for KDE/Gnome # desktop file section file( WRITE "${PROJECT_BINARY_DIR}/slowmoUI.desktop" "#!/usr/bin/env xdg-open [Desktop Entry] Type=Application Comment=Slow Motion Video Exec=${CMAKE_INSTALL_PREFIX}/bin/slowmoUI GenericName=slowmoVideo Icon=${CMAKE_INSTALL_PREFIX}/share/icons/AppIcon.png MimeType= Name=slowmoUI Terminal=false Categories=Qt;AudioVideo;Video;\n") #install ( FILES icons/slowmoUI.png DESTINATION share/icons ) install ( FILES res/AppIcon.png DESTINATION share/icons ) install ( FILES ${PROJECT_BINARY_DIR}/slowmoUI.desktop DESTINATION share/applications PERMISSIONS OWNER_READ OWNER_WRITE OWNER_EXECUTE GROUP_READ GROUP_EXECUTE WORLD_READ WORLD_EXECUTE ) endif() if (Qt5Core_FOUND) include(DeployQt5) # 2.8.7 or later else() include(DeployQt4) # 2.8.7 or later endif() #set( PLUGINS "qjpeg;qtiff" ) set( PLUGINS qjpeg ) if (APPLE) install(TARGETS slowmoUI DESTINATION ".") install_qt_executable(slowmoUI.app "${PLUGINS}" ) elseif(WIN32) install(TARGETS slowmoUI DESTINATION ".") install_qt_executable(slowmoUI.exe "${PLUGINS}" ) else() install(TARGETS slowmoUI DESTINATION ${DEST}) # install_qt_executable(slowmoUI "${PLUGINS}" ) endif() slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/logbrowserdialog.cpp0000664000000000000000000001065713151342440024762 0ustar rootroot/* * snippet from : https://wiki.qt.io/index.php?title=Browser_for_QDebug_output&oldid=15731 * add a logwindow to a project * */ #include "logbrowserdialog.h" #include #include #include #include #include #include #include #include #include #include #include LogBrowserDialog::LogBrowserDialog(QWidget *parent) : QDialog(parent) { // setAttribute(Qt::WA_DeleteOnClose); QVBoxLayout *layout = new QVBoxLayout; setLayout(layout); // browser = new QTextBrowser(this); browser = new QTextEdit(this); layout->addWidget(browser); QHBoxLayout *buttonLayout = new QHBoxLayout; buttonLayout->setContentsMargins(0, 0, 0, 0); layout->addLayout(buttonLayout); buttonLayout->addStretch(10); clearButton = new QPushButton(this); clearButton->setText("clear"); buttonLayout->addWidget(clearButton); connect(clearButton, SIGNAL (clicked()), browser, SLOT (clear())); saveButton = new QPushButton(this); saveButton->setText("save output"); buttonLayout->addWidget(saveButton); connect(saveButton, SIGNAL (clicked()), this, SLOT (save())); resize(600, 400); } LogBrowserDialog::~LogBrowserDialog() { } void LogBrowserDialog::outputMessage(QtMsgType type, const QString &msg) { // out << QTime::currentTime().toString("hh:mm:ss.zzz "); // fprintf(stderr, "Debug: %s (%s:%u, %s)\n", localMsg.constData(), context.file, context.line, context.function); QString temp; switch (type) { case QtDebugMsg: temp= (tr("DEBUG: %1").arg(msg)); break; case QtWarningMsg: temp = (tr("WARNING: %1").arg(msg)); break; case QtCriticalMsg: temp = (tr("CRITICAL: %1").arg(msg)); break; case QtFatalMsg: temp = (tr("FATAL: %1").arg(msg)); break; default: temp = (tr("UNK: %1").arg(msg)); break; } // indirect invoke for threading safety QMetaObject::invokeMethod(browser, "append", Qt::QueuedConnection, Q_ARG(QString, temp)); } void LogBrowserDialog::save() { QString saveFileName = QFileDialog::getSaveFileName( this, tr("Save Log Output"), tr("%1/logfile.txt").arg(QDir::homePath()), tr("Text Files ('''.txt);;All Files (*)") ); if(saveFileName.isEmpty()) return; QFile file(saveFileName); if(!file.open(QIODevice::WriteOnly)) { QMessageBox::warning( this, tr("Error"), 
QString(tr("File '%1'
cannot be opened for writing.

" "The log output could not be saved!
")) .arg(saveFileName)); return; } QTextStream stream(&file); stream << browser->toPlainText(); file.close(); } void LogBrowserDialog::closeEvent(QCloseEvent *e) { QMessageBox::StandardButton answer = QMessageBox::question( this, tr("Close Log Browser?"), tr("Do you really want to close the log browser?"), QMessageBox::Yes | QMessageBox::No ); if (answer == QMessageBox::Yes) e->accept(); else e->ignore(); } void LogBrowserDialog::keyPressEvent(QKeyEvent *e) { // ignore all keyboard events // protects against accidentally closing of the dialog // without asking the user e->ignore(); } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/canvas.h0000664000000000000000000001561713151342440022336 0ustar rootroot/* slowmoUI is a user interface for slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef CANVAS_H #define CANVAS_H #include "../project/node_sV.h" #include "../project/nodeList_sV.h" #include "../project/project_sV.h" #include "../project/tag_sV.h" #include #include #include #define NODE_RADIUS 6 #define SELECT_RADIUS 12 #define HANDLE_RADIUS 4 #define MOVE_THRESHOLD 3 #define SCROLL_FACTOR 3 #define ZOOM_FACTOR 1.414 class QColor; class QPoint; class CanvasTools; class ShutterFunctionDialog; namespace Ui { class Canvas; } /** This class is for building helper objects for the signals&slots mechanism for passing pointers to objects which are not QObjects. */ class TransferObject : public QObject { Q_OBJECT public: CanvasObject_sV* objectPointer; enum Reason { ACTION_DELETE, ACTION_RENAME, ACTION_SETTIME, ACTION_SNAPIN } reason; TransferObject() : objectPointer(NULL), reason(ACTION_SNAPIN) {} TransferObject(CanvasObject_sV* objectPointer, Reason reason) : objectPointer(objectPointer), reason(reason) {} }; class Project_sV; /** \brief Canvas for drawing motion curves. 
\todo Frame lines on high zoom \todo Custom speed factor to next node */ class Canvas : public QWidget { Q_OBJECT friend class CanvasTools; public: explicit Canvas(Project_sV *project, QWidget *parent = 0); ~Canvas(); enum ToolMode { ToolMode_Select, ToolMode_Move }; enum Abort { Abort_General, Abort_Selection }; static QColor lineCol; static QColor selectedLineCol; static QColor nodeCol; static QColor gridCol; static QColor fatGridCol; static QColor minGridCol; static QColor selectedCol; static QColor hoverCol; static QColor srcTagCol; static QColor outTagCol; static QColor handleLineCol; static QColor backgroundCol; static QColor shutterRegionCol; static QColor shutterRegionBoundCol; void load(Project_sV *project); void showHelp(bool show); void toggleHelp(); const QPointF prevMouseTime() const; const float prevMouseInFrame() const; public slots: void slotAbort(Canvas::Abort abort); void slotAddTag(); void slotDeleteNodes(); void slotSetToolMode(Canvas::ToolMode mode); signals: void signalMouseInputTimeChanged(qreal frame); void signalMouseCurveSrcTimeChanged(qreal frame); void nodesChanged(); protected: void paintEvent(QPaintEvent *); void mouseMoveEvent(QMouseEvent *); void mousePressEvent(QMouseEvent *); void mouseReleaseEvent(QMouseEvent *); void wheelEvent(QWheelEvent *); void keyPressEvent(QKeyEvent *event); void leaveEvent(QEvent *); void contextMenuEvent(QContextMenuEvent *); private: Ui::Canvas *ui; QSettings m_settings; Project_sV *m_project; ShutterFunctionDialog *m_shutterFunctionDialog; bool m_mouseWithinWidget; int m_distLeft; int m_distBottom; int m_distRight; int m_distTop; Node_sV m_t0; ///< Viewport, bottom left Node_sV m_tmax; ///< Upper bounds for the viewport (so the user does not get lost by zooming too far up) float m_secResX; ///< How many pixels wide is one output second? float m_secResY; ///< How many pixels wide is one input second? bool m_showHelp; NodeList_sV *m_nodes; QList *m_tags; ToolMode m_mode; /** Saves states about mouse events. The prev... variables are updated when the mouse moves, the initial... variables only on mouse clicks. */ struct { bool nodesMoved; bool selectAttempted; bool moveAborted; QPoint prevMousePos; QPoint initialMousePos; QPointF contextmenuMouseTime; Node_sV initial_t0; Qt::KeyboardModifiers prevModifiers; Qt::KeyboardModifiers initialModifiers; Qt::MouseButtons initialButtons; const CanvasObject_sV *initialContextObject; void reset() { moveAborted = false; nodesMoved = false; selectAttempted = false; initialContextObject = NULL; travelledDistance = 0; } void travel(int length) { travelledDistance += length; } bool countsAsMove() { return travelledDistance >= MOVE_THRESHOLD; } private: int travelledDistance; } m_states; /* The transfer objects to each action defines the action to take when the slot is called, and additionally stores a pointer to the object it was called on (the object is known in the context menu event). 
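       For example (wiring done in the Canvas constructor in canvas.cpp): m_toDeleteTag gets
       its reason set to ACTION_DELETE and is registered with m_hackMapper, so that
       slotRunAction() receives both the operation and the CanvasObject_sV* it applies to
       through a single QSignalMapper connection.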
*/ QAction *m_aDeleteNode; TransferObject m_toDeleteNode; QAction *m_aSnapInNode; TransferObject m_toSnapInNode; QAction *m_aDeleteTag; TransferObject m_toDeleteTag; QAction *m_aRenameTag; TransferObject m_toRenameTag; QAction *m_aSetTagTime; TransferObject m_toSetTagTime; QSignalMapper *m_hackMapper; QSignalMapper *m_curveTypeMapper; QSignalMapper *m_handleMapper; QSignalMapper *m_speedsMapper; QAction *m_aLinear; QAction *m_aBezier; QAction *m_aResetLeftHandle; QAction *m_aResetRightHandle; QAction *m_aCustomSpeed; QAction *m_aShutterFunction; std::vector m_aSpeeds; Node_sV convertCanvasToTime(const QPoint &p, bool snap = false) const; QPoint convertTimeToCanvas(const Node_sV &p) const; QPoint convertTimeToCanvas(const QPointF &p) const; QPointF convertDistanceToTime(const QPoint &p) const; QPoint convertTimeToDistance(const QPointF &time) const; /** \return The distance in px converted to time */ float delta(int px) const; bool insideCanvas(const QPoint& pos); bool selectAt(const QPoint& pos, bool addToSelection = false); void drawModes(QPainter &davinci, int top, int right); const CanvasObject_sV* objectAt(QPoint pos, Qt::KeyboardModifiers modifiers) const; void setCurveSpeed(double speed); private slots: void slotRunAction(QObject *o); void slotChangeCurveType(int curveType); void slotResetHandle(const QString &position); void slotSetSpeed(); void slotSetSpeed(QString s); void slotSetShutterFunction(); void slotZoomIn(); void slotZoomOut(); private: void zoom(bool in, QPoint pos); QRect leftDrawingRect(int y, const int height = 12, const int min = -1, const int max = -1) const; QRect bottomDrawingRect(int x, const int width = 160, const int min = -1, const int max = -1, bool rightJustified = true) const; }; QDebug operator<<(QDebug qd, const Canvas::ToolMode &mode); QDebug operator<<(QDebug qd, const Canvas::Abort &abort); QString toString(TransferObject::Reason reason); #endif // CANVAS_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/main.cpp0000664000000000000000000000726613151342440022343 0ustar rootroot/* slowmoUI is a user interface for slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include #include #include #include #include "opencv2/core/version.hpp" #include "mainwindow.h" #include "logbrowserdialog.h" //QPointer logBrowser; QPointer logBrowser; #if QT_VERSION >= 0x050000 void myMessageOutput(QtMsgType type, const QMessageLogContext &context, const QString &msg) { #else void myMessageOutput(QtMsgType type, const char *str) { QString msg = QString::fromUtf8(str); #endif if(logBrowser) logBrowser->outputMessage( type, msg ); } int main(int argc, char *argv[]) { #ifdef Q_OS_MACX // fix for osx UI and Qt4 if ( QSysInfo::MacintoshVersion > QSysInfo::MV_10_8 ) { // fix Mac OS X 10.9 (mavericks) font issue // https://bugreports.qt-project.org/browse/QTBUG-32789 QFont::insertSubstitution(".Lucida Grande UI", "Lucida Grande"); } #endif QApplication a(argc, argv); //QDebug()<<"starting app" // Set up preferences for the QSettings file QCoreApplication::setOrganizationName("Granjow"); QCoreApplication::setOrganizationDomain("granjow.net"); QCoreApplication::setApplicationName("slowmoUI"); // Setup debug output system. 
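    // qInstallMessageHandler (Qt 5) / qInstallMsgHandler (Qt 4) below registers
    // myMessageOutput(), which forwards every qDebug()/qWarning() message to
    // logBrowser->outputMessage(), i.e. into the log browser window.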
logBrowser = new LogBrowserDialog; #if QT_VERSION >= 0x050000 qInstallMessageHandler(myMessageOutput); #else qInstallMsgHandler(myMessageOutput); #endif // startup... QString projectPath; qDebug() << "threading info : " << QThread::idealThreadCount(); qDebug() << a.arguments(); //TODO: place this in About ... qDebug() << "OpenCV version: " << CV_MAJOR_VERSION << "." << CV_MINOR_VERSION << "." << CV_SUBMINOR_VERSION; const int N = a.arguments().size(); for (int n = 1; n < N; n++) { QString arg = a.arguments().at(n); if (arg.startsWith("--")) { bool langUpdated = true; // Changes the file loaded from the resource container // to force a different language if (arg == "--fr") { QLocale::setDefault(QLocale::French); } else if (arg == "--de") { QLocale::setDefault(QLocale::German); } else if (arg == "--en") { QLocale::setDefault(QLocale::English); } else if (arg == "--it") { QLocale::setDefault(QLocale::Italian); } else { langUpdated = false; } if (langUpdated) { qDebug() << "Changed locale to " << QLocale::languageToString(QLocale().language()); } else { qDebug() << "Not handled: " << arg; } } else { QFileInfo info(arg); if (info.exists() && info.isReadable() && info.isFile()) { projectPath = info.absoluteFilePath(); qDebug() << "Loading project: " << projectPath; } else { qDebug() << projectPath << " does not exist."; } } } // Load the translation file from the resource container and use it QTranslator translator; translator.load(":translations"); a.installTranslator(&translator); MainWindow w(projectPath); w.show(); //use menu here : logBrowser->show(); int result = a.exec(); qDebug() << "application exec return result =" << result; delete logBrowser; return result; } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/flowEditCanvas.ui0000664000000000000000000000735713151342440024164 0ustar rootroot FlowEditCanvas 0 0 1058 608 Form QLayout::SetMinimumSize Qt::Horizontal 40 20 TextLabel Qt::Horizontal QSizePolicy::Fixed 10 20 Values at mouse position 0 0 QFrame::StyledPanel QFrame::Raised 0 0 20 3 Qt::Vertical QSlider::TicksBelow 1 Qt::Vertical 20 40 ImageDisplay QFrame

libgui/imageDisplay.h
1 slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/canvas.cpp0000664000000000000000000013633713151342440022674 0ustar rootroot/* slowmoUI is a user interface for slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "canvas.h" #include "canvasTools.h" #include "ui_canvas.h" #include "mainwindow.h" #include "tagAddDialog.h" #include "shutterFunctionDialog.h" #include "project/shutterFunction_sV.h" #include "project/shutterFunctionList_sV.h" #include "lib/bezierTools_sV.h" #include "project/projectPreferences_sV.h" #include "project/abstractFrameSource_sV.h" #include #include #include #include #include #include #include #include #include #include #include #include #include #include //#define VALIDATE_BEZIER #ifdef VALIDATE_BEZIER #include "lib/bezierTools_sV.h" #endif //#define DEBUG_C #ifdef DEBUG_C #include #endif QColor Canvas::selectedCol ( 0, 175, 255, 100); QColor Canvas::hoverCol (255, 175, 0, 200); QColor Canvas::lineCol (255, 255, 255); QColor Canvas::selectedLineCol(255, 175, 0, 200); QColor Canvas::nodeCol (240, 240, 240); QColor Canvas::gridCol (255, 255, 255, 30); QColor Canvas::fatGridCol (255, 255, 255, 60); QColor Canvas::minGridCol (200, 200, 255, 150); QColor Canvas::handleLineCol(255, 255, 255, 128); QColor Canvas::srcTagCol ( 30, 245, 0, 150); QColor Canvas::outTagCol ( 30, 245, 0, 150); QColor Canvas::backgroundCol( 34, 34, 34); QColor Canvas::shutterRegionCol(175, 25, 75, 100); QColor Canvas::shutterRegionBoundCol(240, 0, 60, 150); /// \todo move with MMB /// \todo replay curve Canvas::Canvas(Project_sV *project, QWidget *parent) : QWidget(parent), ui(new Ui::Canvas), m_project(project), m_mouseWithinWidget(false), m_distLeft(90), m_distBottom(50), m_distRight(20), m_distTop(32), m_t0(0,0), m_tmax(10,10), m_secResX(100), m_secResY(100), m_showHelp(false), m_nodes(project->nodes()), m_tags(project->tags()), m_mode(ToolMode_Select) { ui->setupUi(this); m_shutterFunctionDialog = NULL; // Enable mouse tracking (not only when a mouse button is pressed) this->setMouseTracking(true); setContextMenuPolicy(Qt::DefaultContextMenu); setFocusPolicy(Qt::StrongFocus); m_states.prevMousePos = QPoint(0,0); m_states.contextmenuMouseTime = QPointF(0,0); m_states.initialContextObject = NULL; Q_ASSERT(m_secResX > 0); Q_ASSERT(m_secResY > 0); m_aDeleteNode = new QAction(tr("&Delete node"), this); m_aSnapInNode = new QAction(tr("&Snap in node"), this); m_aDeleteTag = new QAction(tr("&Delete tag"), this); m_aRenameTag = new QAction(tr("&Rename tag"), this); m_aSetTagTime = new QAction(tr("Set tag &time"), this); m_hackMapper = new QSignalMapper(this); m_hackMapper->setMapping(m_aRenameTag, &m_toRenameTag); m_toRenameTag.reason = TransferObject::ACTION_RENAME; m_hackMapper->setMapping(m_aDeleteTag, &m_toDeleteTag); m_toDeleteTag.reason = TransferObject::ACTION_DELETE; m_hackMapper->setMapping(m_aDeleteNode, &m_toDeleteNode); m_toDeleteNode.reason = TransferObject::ACTION_DELETE; m_hackMapper->setMapping(m_aSnapInNode, &m_toSnapInNode); m_toSnapInNode.reason = TransferObject::ACTION_SNAPIN; m_hackMapper->setMapping(m_aSetTagTime, &m_toSetTagTime); m_toSetTagTime.reason = TransferObject::ACTION_SETTIME; m_curveTypeMapper = new QSignalMapper(this); m_aLinear = new QAction(tr("&Linear curve"), this); m_aBezier = new QAction(trUtf8("&Bézier 
curve"), this); m_curveTypeMapper->setMapping(m_aLinear, CurveType_Linear); m_curveTypeMapper->setMapping(m_aBezier, CurveType_Bezier); m_aCustomSpeed = new QAction(tr("Set &custom speed"), this); m_aShutterFunction = new QAction(tr("Set/edit shutter &function"), this); m_speedsMapper = new QSignalMapper(this); double arr[] = {1, .5, 0, -.5, -1}; #define N_SPEEDS 5 for (int i = 0; i < N_SPEEDS; i++) { m_aSpeeds.push_back(new QAction(trUtf8("Set speed to %1×").arg(arr[i], 0, 'f', 1), this)); m_speedsMapper->setMapping(m_aSpeeds.back(), QString("%1").arg(arr[i],0,'f',1)); } m_handleMapper = new QSignalMapper(this); m_aResetLeftHandle = new QAction(tr("Reset left handle"), this); m_aResetRightHandle = new QAction(tr("Reset right handle"), this); m_handleMapper->setMapping(m_aResetLeftHandle, "left"); m_handleMapper->setMapping(m_aResetRightHandle, "right"); connect(m_aSnapInNode, SIGNAL(triggered()), m_hackMapper, SLOT(map())); connect(m_aDeleteNode, SIGNAL(triggered()), m_hackMapper, SLOT(map())); connect(m_aDeleteTag, SIGNAL(triggered()), m_hackMapper, SLOT(map())); connect(m_aRenameTag, SIGNAL(triggered()), m_hackMapper, SLOT(map())); connect(m_aSetTagTime, SIGNAL(triggered()), m_hackMapper, SLOT(map())); connect(m_hackMapper, SIGNAL(mapped(QObject*)), this, SLOT(slotRunAction(QObject*))); connect(m_aLinear, SIGNAL(triggered()), m_curveTypeMapper, SLOT(map())); connect(m_aBezier, SIGNAL(triggered()), m_curveTypeMapper, SLOT(map())); connect(m_curveTypeMapper, SIGNAL(mapped(int)), this, SLOT(slotChangeCurveType(int))); connect(m_aResetLeftHandle, SIGNAL(triggered()), m_handleMapper, SLOT(map())); connect(m_aResetRightHandle, SIGNAL(triggered()), m_handleMapper, SLOT(map())); connect(m_handleMapper, SIGNAL(mapped(QString)), this, SLOT(slotResetHandle(QString))); connect(m_aCustomSpeed, SIGNAL(triggered()), this, SLOT(slotSetSpeed())); connect(m_speedsMapper, SIGNAL(mapped(QString)), this, SLOT(slotSetSpeed(QString))); connect(m_aShutterFunction, SIGNAL(triggered()), this, SLOT(slotSetShutterFunction())); for (std::vector::iterator it = m_aSpeeds.begin(); it != m_aSpeeds.end(); ++it) { connect(*it, SIGNAL(triggered()), m_speedsMapper, SLOT(map())); } } Canvas::~Canvas() { delete ui; if (m_shutterFunctionDialog != NULL) { delete m_shutterFunctionDialog; } while (!m_aSpeeds.empty()) { delete m_aSpeeds.back(); m_aSpeeds.pop_back(); } delete m_speedsMapper; delete m_hackMapper; delete m_aSnapInNode; delete m_aDeleteNode; delete m_aDeleteTag; delete m_aRenameTag; } void Canvas::load(Project_sV *project) { if (m_shutterFunctionDialog != NULL) { m_shutterFunctionDialog->close(); delete m_shutterFunctionDialog; m_shutterFunctionDialog = NULL; } m_project = project; m_t0 = m_project->preferences()->viewport_t0(); m_secResX = m_project->preferences()->viewport_secRes().x(); m_secResY = m_project->preferences()->viewport_secRes().y(); Q_ASSERT(m_secResX > 0); Q_ASSERT(m_secResY > 0); qDebug() << "Canvas: Project loaded from " << project; m_nodes = project->nodes(); m_tags = project->tags(); qDebug() << "Frame source: " << project->frameSource(); m_tmax.setY(project->frameSource()->maxTime()); qDebug() << "tMaxY set to " << m_tmax.y(); repaint(); } void Canvas::showHelp(bool show) { m_showHelp = show; repaint(); m_settings.setValue("ui/displayHelp", show); m_settings.sync(); } void Canvas::toggleHelp() { showHelp(!m_showHelp); } const QPointF Canvas::prevMouseTime() const { return convertCanvasToTime(m_states.prevMousePos).toQPointF(); } const float Canvas::prevMouseInFrame() const { return 
convertCanvasToTime(m_states.prevMousePos).toQPointF().y() * m_project->frameSource()->fps()->fps(); } bool Canvas::selectAt(const QPoint &pos, bool addToSelection) { bool selected = false; int ti = m_nodes->find(convertCanvasToTime(pos).x()); qDebug() << "Nearest node index: " << ti; if (ti != -1 && m_nodes->size() > ti) { QPoint p = convertTimeToCanvas(m_nodes->at(ti)); qDebug() << "Mouse pos: " << pos << ", node pos: " << p; if ( abs(p.x() - pos.x()) <= NODE_RADIUS+SELECT_RADIUS+4 && abs(p.y() - pos.y()) <= NODE_RADIUS+SELECT_RADIUS+4 ) { qDebug() << "Selected: " << pos.x() << "/" << pos.y(); if (!m_nodes->at(ti).selected() && !addToSelection) { m_nodes->unselectAll(); } if (addToSelection) { (*m_nodes)[ti].select(!m_nodes->at(ti).selected()); } else { (*m_nodes)[ti].select(true); } selected = true; } } return selected; } bool Canvas::insideCanvas(const QPoint &pos) { return pos.x() >= m_distLeft && pos.y() >= m_distTop && pos.x() < width()-m_distRight && pos.y() < height()-m_distBottom; } QRect Canvas::leftDrawingRect(int y, const int height, const int min, const int max) const { if (y < min) { y = min; } if (max > 0 && y > max-height) { y = max-height; } return QRect(8, y-6, m_distLeft-2*8, 50); } QRect Canvas::bottomDrawingRect(int x, const int width, const int min, const int max, bool rightJustified) const { if (rightJustified) { if (max > 0 && x > max) { x = max; } if (min > 0 && x< min+width) { x = min+width; } return QRect(x-width, height()-1 - (m_distBottom-8), width, m_distBottom-2*8); } else { if (max > 0 && x > max-width) { x = max-width; } if (min > 0 && x < min) { x = min; } return QRect(x, height()-1 - (m_distBottom-8), width, m_distBottom-2*8); } } void Canvas::paintEvent(QPaintEvent *) { QPainter davinci(this); davinci.setRenderHint(QPainter::Antialiasing, false); davinci.fillRect(0, 0, width(), height(), backgroundCol); QList nearObjects = m_nodes->objectsNear( convertCanvasToTime(m_states.prevMousePos).toQPointF(), delta(SELECT_RADIUS)); if (m_states.prevModifiers.testFlag(Qt::ShiftModifier)) { while (nearObjects.size() > 0 && nearObjects.at(0).type == NodeList_sV::PointerWithDistance::Node) { nearObjects.removeFirst(); } } bool drawLine; // x grid for (int tx = ceil(m_t0.x()); true; tx++) { QPoint pos = convertTimeToCanvas(Node_sV(tx, m_t0.y())); if (insideCanvas(pos)) { drawLine = m_secResX >= 7.5; if (tx%60 == 0) { davinci.setPen(minGridCol); drawLine = true; } else if (tx%10 == 0) { davinci.setPen(fatGridCol); drawLine = m_secResX >= .75; } else { davinci.setPen(gridCol); } if (drawLine) { davinci.drawLine(pos.x(), pos.y(), pos.x(), m_distTop); } } else { break; } } // y grid for (int ty = ceil(m_t0.y()); true; ty++) { QPoint pos = convertTimeToCanvas(Node_sV(m_t0.x(), ty)); if (insideCanvas(pos)) { drawLine = m_secResY >= 7.5; if (ty%60 == 0) { davinci.setPen(minGridCol); drawLine = true; } else if (ty%10 == 0) { davinci.setPen(fatGridCol); drawLine = m_secResX >= .75; } else { davinci.setPen(gridCol); } if (drawLine) { davinci.drawLine(pos.x(), pos.y(), width()-1 - m_distRight, pos.y()); } } else { break; } } { QPoint pos = convertTimeToCanvas(Node_sV(m_t0.x(), m_tmax.y())); if (insideCanvas(pos)) { davinci.setPen(QPen(QBrush(lineCol), 2)); davinci.drawLine(pos.x(), pos.y(), width()-1-m_distRight, pos.y()); } } drawModes(davinci, 8, width()-1 - m_distRight); // Frames/seconds davinci.setPen(lineCol); if (m_mouseWithinWidget && insideCanvas(m_states.prevMousePos)) { QString timeText, speedText; Node_sV time = convertCanvasToTime(m_states.prevMousePos); const int 
mX = m_states.prevMousePos.x(); const int mY = m_states.prevMousePos.y(); davinci.drawLine(mX, m_distTop, mX, height()-1 - m_distBottom); timeText = CanvasTools::outputTimeLabel(this, time); speedText = CanvasTools::outputSpeedLabel(time, m_project); // Ensure that the text does not go over the right border davinci.drawText(bottomDrawingRect(mX-20, 160, m_distLeft, -180+width()-m_distRight-50), Qt::AlignRight, timeText); davinci.drawText(bottomDrawingRect(mX+20, 160, m_distLeft+180, width()-m_distRight-50, false), Qt::AlignLeft, speedText); davinci.drawLine(m_distLeft, mY, mX, mY); if (time.y() < 60) { timeText = QString("f %1\n%2 s") .arg(time.y()*m_project->frameSource()->fps()->fps(), 2, 'f', 2) .arg(time.y()); } else { timeText = QString("f %1\n%2 min\n+%3 s") .arg(time.y()*m_project->frameSource()->fps()->fps(), 2, 'f', 2) .arg(int(time.y()/60)) .arg(time.y()-60*int(time.y()/60), 0, 'f', 2); } davinci.drawText(leftDrawingRect(mY, 48, m_distTop+24, height()-m_distBottom), Qt::AlignRight, timeText); } { Node_sV node; QString timeText ; // yMax node = convertCanvasToTime(QPoint(m_distLeft, m_distTop)); timeText = QString("f %1").arg(node.y(), 0, 'f', 1); davinci.drawText(leftDrawingRect(m_distTop), Qt::AlignRight, timeText); // yMin node = convertCanvasToTime(QPoint(m_distLeft, height()-1 - m_distBottom)); timeText = QString("f %1").arg(node.y(), 0, 'f', 1); davinci.drawText(leftDrawingRect(height()-m_distBottom-8), Qt::AlignRight, timeText); // xMin node = convertCanvasToTime(QPoint(m_distLeft, height()-1 - m_distBottom)); timeText = QString("f %1").arg(node.x(), 0, 'f', 1); davinci.drawText(bottomDrawingRect(m_distLeft+8), Qt::AlignRight, timeText); // xMax node = convertCanvasToTime(QPoint(width()-1 - m_distRight, height()-1 - m_distBottom)); timeText = QString("f %1").arg(node.x(), 0, 'f', 1); davinci.drawText(bottomDrawingRect(width()-1-m_distRight), Qt::AlignRight, timeText); } int bottom = height()-1 - m_distBottom; davinci.drawLine(m_distLeft, bottom, width()-1 - m_distRight, bottom); davinci.drawLine(m_distLeft, bottom, m_distLeft, m_distTop); // Shutter Lengths (for motion blur) davinci.setRenderHint(QPainter::Antialiasing, false); const Node_sV *leftNode = NULL; const Node_sV *rightNode = NULL; const float outFps = m_project->preferences()->canvas_xAxisFPS().fps(); const float sourceFps = m_project->frameSource()->fps()->fps(); for (int i = 0; i < m_nodes->size(); i++) { rightNode = &m_nodes->at(i); QPoint p = convertTimeToCanvas(*rightNode); if (leftNode != NULL) { ShutterFunction_sV *shutterFunction = m_project->shutterFunctions()->function(leftNode->shutterFunctionID()); if (shutterFunction != NULL) { QPoint pp = convertTimeToCanvas(*leftNode); for (int x = pp.x(); x < p.x(); x++) { qreal progressOnCurve = ((qreal)x - pp.x()) / (p.x() - pp.x()); QPointF time; if (leftNode->rightCurveType() == CurveType_Bezier && rightNode->leftCurveType() == CurveType_Bezier) { time = BezierTools_sV::interpolateAtX(convertCanvasToTime(QPoint(x, 0)).x(), leftNode->toQPointF(), leftNode->toQPointF()+leftNode->rightNodeHandle(), rightNode->toQPointF()+rightNode->leftNodeHandle(), rightNode->toQPointF()); } else { time = leftNode->toQPointF() + (rightNode->toQPointF() - leftNode->toQPointF()) * progressOnCurve; } qreal outTime = time.x(); qreal sourceTime = time.y(); qreal sourceFrame = sourceTime * sourceFps; float dy; if (outTime + 1/outFps <= m_nodes->endTime()) { dy = m_nodes->sourceTime(outTime + 1/outFps) - sourceTime; } else { dy = sourceTime - m_nodes->sourceTime(outTime - 1/outFps); } 
float shutter = shutterFunction->evaluate( progressOnCurve, // x on [0,1] outTime, // t outFps, // FPS sourceFrame, // y dy // dy to next frame ); QPoint sourceShutterTimeStart = QPoint(x, convertTimeToCanvas(time).y()); QPoint sourceShutterTimeEnd = QPoint(x, convertTimeToCanvas(time + QPointF(0, shutter * outFps/sourceFps)).y()); if (shutter > 0) { davinci.setPen(shutterRegionCol); davinci.drawLine(sourceShutterTimeStart, sourceShutterTimeEnd); } davinci.setPen(shutterRegionBoundCol); davinci.drawPoint(x, sourceShutterTimeEnd.y() - 1); } } } leftNode = &m_nodes->at(i); } // Tags davinci.setRenderHint(QPainter::Antialiasing, false); for (int i = 0; i < m_tags->size(); i++) { Tag_sV tag = m_tags->at(i); if (tag.axis() == TagAxis_Source) { QPoint p = convertTimeToCanvas(Node_sV(m_t0.x(), tag.time())); if (insideCanvas(p)) { davinci.setPen(srcTagCol); davinci.drawLine(m_distLeft, p.y(), width()-m_distRight, p.y()); davinci.drawText(m_distLeft+10, p.y()-1, tag.description()); } } else { QPoint p = convertTimeToCanvas(Node_sV(tag.time(), m_t0.y())); if (insideCanvas(p)) { davinci.setPen(outTagCol); davinci.drawLine(p.x(), height()-1 - m_distBottom, p.x(), m_distTop); davinci.drawText(p.x()+2, m_distTop, tag.description()); } } } // Nodes davinci.setPen(lineCol); davinci.setRenderHint(QPainter::Antialiasing, true); const Node_sV *prev = NULL; const Node_sV *curr = NULL; for (int i = 0; i < m_nodes->size(); i++) { curr = &m_nodes->at(i); QPoint p = convertTimeToCanvas(*curr); if (curr->selected()) { davinci.setPen(QPen(QBrush(selectedCol), 2.0)); davinci.fillRect(p.x()-NODE_RADIUS, p.y()-NODE_RADIUS, 2*NODE_RADIUS+1, 2*NODE_RADIUS+1, selectedCol); } davinci.setPen(nodeCol); if (nearObjects.size() > 0 && curr == nearObjects.at(0).ptr) { davinci.setPen(hoverCol); } davinci.drawRect(p.x()-NODE_RADIUS, p.y()-NODE_RADIUS, 2*NODE_RADIUS+1, 2*NODE_RADIUS+1); if (prev != NULL) { if (m_project->nodes()->segments()->at(i-1).selected()) { davinci.setPen(selectedLineCol); } else { davinci.setPen(lineCol); } if (prev->rightCurveType() == CurveType_Bezier && curr->leftCurveType() == CurveType_Bezier) { QPainterPath path; path.moveTo(convertTimeToCanvas(*prev)); path.cubicTo( convertTimeToCanvas(prev->toQPointF() + prev->rightNodeHandle()), convertTimeToCanvas(curr->toQPointF() + curr->leftNodeHandle()), convertTimeToCanvas(*curr)); davinci.drawPath(path); #ifdef VALIDATE_BEZIER for (int x = convertTimeToCanvas(*prev).x(); x < p.x(); x++) { QPointF py = BezierTools_sV::interpolateAtX(convertCanvasToTime(QPoint(x, 0)).x(), prev->toQPointF(), prev->toQPointF()+prev->rightNodeHandle(), curr->toQPointF()+curr->leftNodeHandle(), curr->toQPointF()); qreal y = convertTimeToCanvas(py).y(); // qDebug() << convertCanvasToTime(QPoint(x, 0)).x() << ": " << x << y; davinci.drawPoint(x, y); } #endif } else { davinci.drawLine(convertTimeToCanvas(*prev), p); } } // Handles if (i > 0 && curr->leftCurveType() != CurveType_Linear && prev->rightCurveType() != CurveType_Linear) { davinci.setPen(handleLineCol); if (nearObjects.size() > 0 && &curr->leftNodeHandle() == nearObjects.at(0).ptr) { davinci.setPen(hoverCol); } QPoint h = convertTimeToCanvas(curr->toQPointF() + curr->leftNodeHandle()); davinci.drawLine(convertTimeToCanvas(*curr), h); davinci.drawEllipse(QPoint(h.x(), h.y()), HANDLE_RADIUS, HANDLE_RADIUS); davinci.setPen(handleLineCol); if (nearObjects.size() > 0 && &prev->rightNodeHandle() == nearObjects.at(0).ptr) { davinci.setPen(hoverCol); } h = convertTimeToCanvas(prev->toQPointF() + prev->rightNodeHandle()); 
davinci.drawLine(convertTimeToCanvas(*prev), h); davinci.drawEllipse(QPoint(h.x(), h.y()), HANDLE_RADIUS, HANDLE_RADIUS); } prev = &m_nodes->at(i); } if (m_showHelp) { MainWindow *mw; if ((mw = dynamic_cast(parentWidget())) != NULL) { mw->displayHelp(davinci); } else { qDebug() << "Cannot show help; wrong parent widget?"; Q_ASSERT(false); } } } void Canvas::drawModes(QPainter &davinci, int t, int r) { qreal opacity = davinci.opacity(); int w = 16; int d = 8; int dR = 0; dR += w; davinci.setOpacity(.5 + ((m_mode == ToolMode_Select) ? .5 : 0)); davinci.drawImage(r - dR, t, QImage(":icons/iconSel.png").scaled(16, 16)); dR += d+w; davinci.setOpacity(.5 + ((m_mode == ToolMode_Move) ? .5 : 0)); davinci.drawImage(r - dR, t, QImage(":icons/iconMov.png").scaled(16, 16)); davinci.setOpacity(opacity); } void Canvas::mousePressEvent(QMouseEvent *e) { m_states.reset(); m_states.prevMousePos = e->pos(); m_states.initialMousePos = e->pos(); m_states.prevModifiers = e->modifiers(); m_states.initialModifiers = e->modifiers(); m_states.initialButtons = e->buttons(); m_states.initialContextObject = objectAt(e->pos(), e->modifiers()); m_states.initial_t0 = m_t0; if (m_states.initialContextObject != NULL) { qDebug() << "Mouse pressed. Context: " << typeid(*m_states.initialContextObject).name(); } } void Canvas::mouseMoveEvent(QMouseEvent *e) { m_mouseWithinWidget = true; m_states.travel((m_states.prevMousePos - e->pos()).manhattanLength()); m_states.prevMousePos = e->pos(); m_states.prevModifiers = e->modifiers(); if (e->buttons().testFlag(Qt::LeftButton)) { Node_sV diff = convertCanvasToTime(e->pos()) - convertCanvasToTime(m_states.initialMousePos); #ifdef DEBUG_C qDebug() << m_states.initialMousePos << "to" << e->pos() << "; Diff: " << diff; #endif if (m_mode == ToolMode_Select) { if (dynamic_cast(m_states.initialContextObject) != NULL) { const NodeHandle_sV *handle = dynamic_cast(m_states.initialContextObject); int index = m_nodes->indexOf(handle->parentNode()); if (index < 0) { qDebug () << "FAIL!"; } #ifdef DEBUG_C qDebug() << "Moving handle" << handle << " of node " << handle->parentNode() << QString(" (%1)").arg(index); qDebug() << "Parent node x: " << handle->parentNode()->x(); qDebug() << "Handle x: " << handle->x(); #endif if (index >= 0) { m_nodes->moveHandle( handle, convertCanvasToTime(e->pos())-m_nodes->at(index) ); } else { for (int i = 0; i < m_nodes->size(); i++) { qDebug() << "Node " << i << " is at " << &m_nodes->at(i); } } } else if (dynamic_cast(m_states.initialContextObject) != NULL) { const Node_sV *node = (const Node_sV*) m_states.initialContextObject; if (!m_states.nodesMoved) { qDebug() << "Moving node " << node; } if (!m_states.moveAborted) { if (m_states.countsAsMove()) { if (!node->selected()) { if (!m_states.selectAttempted) { m_states.selectAttempted = true; m_nodes->select(node, !e->modifiers().testFlag(Qt::ControlModifier)); } } if (e->modifiers().testFlag(Qt::ControlModifier)) { if (qAbs(diff.x()) < qAbs(diff.y())) { diff.setX(0); } else { diff.setY(0); } } //qDebug() << "move selected"; bool snap = (e->modifiers() & Qt::ShiftModifier); //TODO qDebug() << "is snap ? " << snap; m_nodes->moveSelected(diff,snap); } } m_states.nodesMoved = true; } else { // Cannot move this object, so move the canvas instead. 
// if (m_states.initialContextObject != NULL) { // qDebug() << "Trying to move " << typeid(*m_states.initialContextObject).name() << ": Not supported yet!"; // } m_t0 = m_states.initial_t0 - diff; if (m_t0.y() < 0) { m_t0.setY(0); } if (m_t0.x() < 0) { m_t0.setX(0); } if (m_t0.y() > m_tmax.y()) { m_t0.setY(m_tmax.y()); } } } else if (m_mode == ToolMode_Move) { if (!m_states.moveAborted) { m_nodes->shift(convertCanvasToTime(m_states.initialMousePos).x(), diff.x()); } m_states.nodesMoved = true; } } // Emit the source time at the mouse position emit signalMouseInputTimeChanged( convertCanvasToTime(m_states.prevMousePos).y() * m_project->frameSource()->fps()->fps() ); // Emit the source time at the intersection of the out time and the curve qreal timeOut = convertCanvasToTime(m_states.prevMousePos).x(); if (m_nodes->size() > 1 && m_nodes->startTime() <= timeOut && timeOut <= m_nodes->endTime()) { #ifdef DEBUG_C std::cout.precision(32); std::cout << "start: " << m_nodes->startTime() << ", out: " << timeOut << ", end: " << m_nodes->endTime() << std::endl; #endif if (m_nodes->find(timeOut) >= 0) { emit signalMouseCurveSrcTimeChanged( m_nodes->sourceTime(timeOut) * m_project->frameSource()->fps()->fps()); } } repaint(); } void Canvas::mouseReleaseEvent(QMouseEvent *event) { if (m_states.initialButtons.testFlag(Qt::LeftButton)) { if (!m_states.moveAborted) { switch (m_mode) { case ToolMode_Select: if (m_states.countsAsMove()) { m_nodes->confirmMove(); qDebug() << "Move confirmed."; emit nodesChanged(); } else { if (m_states.initialMousePos.x() >= m_distLeft && m_states.initialMousePos.y() < this->height()-m_distBottom && !m_states.selectAttempted) { // Try to select a node below the mouse. If there is none, add a point. if (m_states.initialContextObject == NULL || dynamic_cast(m_states.initialContextObject) == NULL) { if (m_mode == ToolMode_Select) { // check snap to grid //qDebug()<< event->modifiers(); bool snap = (event->modifiers() & Qt::ShiftModifier); //qDebug() << "snap to frame ? " << snap; Node_sV p = convertCanvasToTime(m_states.initialMousePos,snap); //qDebug() << "adding node"; m_nodes->add(p); emit nodesChanged(); } else { qDebug() << "Not adding node. 
Mode is " << m_mode; } } else if (dynamic_cast(m_states.initialContextObject) != NULL) { m_nodes->select((const Node_sV*) m_states.initialContextObject, !m_states.initialModifiers.testFlag(Qt::ControlModifier)); } repaint(); } else { qDebug() << "Not inside bounds."; } } break; case ToolMode_Move: m_nodes->confirmMove(); qDebug() << "Move confirmed."; emit nodesChanged(); break; } } #if QT_VERSION < 0x040700 } else if (m_states.initialButtons.testFlag(Qt::RightButton) || m_states.initialButtons.testFlag(Qt::MidButton)) { #else } else if (m_states.initialButtons.testFlag(Qt::RightButton) || m_states.initialButtons.testFlag(Qt::MiddleButton)) { #endif QList nearObjects = m_project->objectsNear(convertCanvasToTime(m_states.initialMousePos).toQPointF(), delta(10)); #if _NEARBY_DBG qDebug() << "Nearby objects:"; for (int i = 0; i < nearObjects.size(); i++) { qDebug() << typeid(*(nearObjects.at(i).ptr)).name() << " at distance " << nearObjects.at(i).dist; } #endif } } void Canvas::contextMenuEvent(QContextMenuEvent *e) { qDebug() << "Context menu requested"; m_states.contextmenuMouseTime = convertCanvasToTime(e->pos()).toQPointF(); QMenu menu; QMenu speedMenu(trUtf8("Segment replay &speed …"), &menu); const CanvasObject_sV *obj = objectAt(e->pos(), m_states.prevModifiers); if (dynamic_cast(obj)) { Node_sV *node = (Node_sV *) obj; m_toDeleteNode.objectPointer = node; m_toSnapInNode.objectPointer = node; int nodeIndex = m_nodes->indexOf(node); menu.addAction(QString(tr("Node %1")).arg(nodeIndex))->setEnabled(false); menu.addAction(m_aDeleteNode); // menu.addAction(m_aSnapInNode); // \todo Activate Snap in menu.addSeparator()->setText(tr("Handle actions")); menu.addAction(m_aResetLeftHandle); menu.addAction(m_aResetRightHandle); } else if (dynamic_cast(obj) != NULL) { const Segment_sV* segment = (const Segment_sV*) obj; int leftNode = segment->leftNodeIndex(); menu.addAction(QString(tr("Segment between node %1 and %2")).arg(leftNode).arg(leftNode+1))->setEnabled(false); menu.addAction(m_aLinear); menu.addAction(m_aBezier); menu.addAction(m_aShutterFunction); speedMenu.addAction(m_aCustomSpeed); std::vector::iterator it = m_aSpeeds.begin(); while (it != m_aSpeeds.end()) { speedMenu.addAction(*it); it++; } menu.addMenu(&speedMenu); } else if (dynamic_cast(obj) != NULL) { Tag_sV* tag = (Tag_sV*) obj; m_toDeleteTag.objectPointer = tag; m_toRenameTag.objectPointer = tag; m_toSetTagTime.objectPointer = tag; menu.addAction(QString(tr("Tag %1")).arg(tag->description())); menu.addAction(m_aDeleteTag); menu.addAction(m_aRenameTag); menu.addAction(m_aSetTagTime); } else { if (obj != NULL) { qDebug() << "No context menu available for object of type " << typeid(*obj).name(); } return; } menu.exec(e->globalPos()); } void Canvas::leaveEvent(QEvent *) { m_mouseWithinWidget = false; repaint(); } void Canvas::keyPressEvent(QKeyEvent *event) { if (dynamic_cast(m_states.initialContextObject) != NULL) { const Node_sV *node = (const Node_sV*) m_states.initialContextObject; //qDebug() << "node : " << node->x() << "," << node->y(); qDebug() << "canvas node : " << convertTimeToCanvas(*node); //qDebug() << "mouse " << m_states.prevMousePos << " vs " << m_states.initialMousePos; if (!m_states.nodesMoved) { qDebug() << "should be Moving node " << node; Node_sV diff; switch (event->key()) { case Qt::Key_Up: //qDebug() << "key up"; diff = convertCanvasToTime(QPoint(0,-1))-convertCanvasToTime(QPoint(0,0)); break; case Qt::Key_Down: //qDebug() << "key down"; diff = 
convertCanvasToTime(QPoint(0,1))-convertCanvasToTime(QPoint(0,0)); break; case Qt::Key_Right: //qDebug() << "key right"; diff = convertCanvasToTime(QPoint(1,0))-convertCanvasToTime(QPoint(0,0)); break; case Qt::Key_Left: //qDebug() << "key left"; diff = convertCanvasToTime(QPoint(-1,0))-convertCanvasToTime(QPoint(0,0)); break; } //qDebug() << "moving of " << diff; m_nodes->moveSelected(diff); //TODO: update other windows ? //TODO: confirm move ? // from mouserelease /*if (m_states.countsAsMove()) */{ m_nodes->confirmMove(); //qDebug() << "key Move confirmed."; emit nodesChanged(); repaint(); } #if 1 // from mouse move ? // Emit the source time at the mouse position emit signalMouseInputTimeChanged(node->y() * m_project->frameSource()->fps()->fps() ); //TODO: get right time ! // Emit the source time at the intersection of the out time and the curve qreal timeOut = node->x(); if (m_nodes->size() > 1 && m_nodes->startTime() <= timeOut && timeOut <= m_nodes->endTime()) { #ifdef DEBUG_C std::cout.precision(32); std::cout << "start: " << m_nodes->startTime() << ", out: " << timeOut << ", end: " << m_nodes->endTime() << std::endl; #endif if (m_nodes->find(timeOut) >= 0) { emit signalMouseCurveSrcTimeChanged( timeOut/*m_nodes->sourceTime(timeOut)*/ * m_project->frameSource()->fps()->fps()); } } #endif // mouse ? } //event->ignore(); } QWidget::keyPressEvent(event); } void Canvas::wheelEvent(QWheelEvent *e) { // Mouse wheel movement in degrees int deg = e->delta()/8; if (e->modifiers().testFlag(Qt::ControlModifier)) { zoom(deg > 0, e->pos()); } else if (e->modifiers().testFlag(Qt::ShiftModifier)) { // Horizontal scrolling m_t0 -= Node_sV(SCROLL_FACTOR*convertDistanceToTime(QPoint(deg, 0)).x(),0); if (m_t0.x() < 0) { m_t0.setX(0); } } else { //Vertical scrolling m_t0 += Node_sV(0, SCROLL_FACTOR*convertDistanceToTime(QPoint(deg, 0)).x()); if (m_t0.y() < 0) { m_t0.setY(0); } if (m_t0.y() > m_tmax.y()) { m_t0.setY(m_tmax.y()); } } m_project->preferences()->viewport_t0() = m_t0.toQPointF(); m_project->preferences()->viewport_secRes().rx() = m_secResX; m_project->preferences()->viewport_secRes().ry() = m_secResY; Q_ASSERT(m_secResX > 0); Q_ASSERT(m_secResY > 0); Q_ASSERT(m_t0.x() >= 0); Q_ASSERT(m_t0.y() >= 0); repaint(); } void Canvas::zoom(bool in, QPoint pos) { Node_sV n0 = convertCanvasToTime(pos); // Update the line resolution if (in) { m_secResX *= ZOOM_FACTOR; } else { m_secResX /= ZOOM_FACTOR; } if (m_secResX < .05) { m_secResX = .05; } // Y resolution is the same as X resolution (at least at the moment) m_secResY = m_secResX; // qDebug() << "Resolution: " << m_secResX; // Adjust t0 such that the mouse points to the same time as before Node_sV nDiff = convertCanvasToTime(pos) - convertCanvasToTime(QPoint(m_distLeft, height()-1-m_distBottom)); m_t0 = n0 - nDiff; if (m_t0.x() < 0) { m_t0.setX(0); } if (m_t0.y() < 0) { m_t0.setY(0); } Q_ASSERT(m_secResX > 0); Q_ASSERT(m_secResY > 0); Q_ASSERT(m_t0.x() >= 0); Q_ASSERT(m_t0.y() >= 0); repaint(); } void Canvas::slotZoomIn() { zoom(true, QCursor::pos()); } void Canvas::slotZoomOut() { zoom(false, QCursor::pos()); } const CanvasObject_sV* Canvas::objectAt(QPoint pos, Qt::KeyboardModifiers modifiers) const { QList nearObjects = m_project->objectsNear(convertCanvasToTime(pos).toQPointF(), convertDistanceToTime(QPoint(SELECT_RADIUS,0)).x()); if (modifiers.testFlag(Qt::ShiftModifier)) { // Ignore nodes with Shift pressed while (nearObjects.size() > 0 && dynamic_cast(nearObjects.at(0).ptr) != NULL) { nearObjects.removeFirst(); } } if (nearObjects.size() > 0) 
{ return nearObjects.at(0).ptr; } else { return NULL; } } ////////// Conversion Time <--> Screen pixels Node_sV Canvas::convertCanvasToTime(const QPoint &p, bool snap) const { Q_ASSERT(m_secResX > 0); Q_ASSERT(m_secResY > 0); QPointF tDelta = convertDistanceToTime(QPoint( p.x()-m_distLeft, height()-1 - m_distBottom - p.y() )); QPointF tFinal = tDelta + m_t0.toQPointF(); //qDebug() << "convert: " << tFinal; if (snap) { tFinal.setX(int(tFinal.x())); } return Node_sV(tFinal.x(), tFinal.y()); } QPoint Canvas::convertTimeToCanvas(const Node_sV &p) const { return convertTimeToCanvas(p.toQPointF()); } QPoint Canvas::convertTimeToCanvas(const QPointF &p) const { QPoint tDelta = convertTimeToDistance(QPointF( p.x()-m_t0.x(), p.y()-m_t0.y() )); QPoint out( tDelta.x() + m_distLeft, height()-1 - m_distBottom - tDelta.y() ); return out; } QPointF Canvas::convertDistanceToTime(const QPoint &p) const { QPointF out( float(p.x()) / m_secResX, float(p.y()) / m_secResY ); return out; } QPoint Canvas::convertTimeToDistance(const QPointF &time) const { QPoint out( time.x()*m_secResX, time.y()*m_secResY ); return out; } float Canvas::delta(int px) const { return convertDistanceToTime(QPoint(px, 0)).x(); } ////////// Slots void Canvas::slotAbort(Canvas::Abort abort) { qDebug() << "Signal: " << abort; switch (abort) { case Abort_General: m_states.moveAborted = true; m_nodes->abortMove(); repaint(); break; case Abort_Selection: m_nodes->unselectAll(); repaint(); break; } } void Canvas::slotAddTag() { if (m_mouseWithinWidget) { TagAddDialog dialog(m_project->preferences()->lastSelectedTagAxis(), this); if (dialog.exec() == QDialog::Accepted) { Tag_sV tag = dialog.buildTag(convertCanvasToTime(m_states.prevMousePos).toQPointF()); m_project->preferences()->lastSelectedTagAxis() = tag.axis(); m_tags->push_back(tag); qDebug() << "Tag added. Number is now: " << m_tags->size(); repaint(); } else { qDebug() << "Tag dialog not accepted."; } } else { qDebug() << "Mouse outside widget."; } } void Canvas::slotDeleteNodes() { qDebug() << "Will delete"; uint nDel = m_nodes->deleteSelected(); qDebug() << nDel << " deleted."; if (nDel > 0) { repaint(); emit nodesChanged(); } } void Canvas::slotSetToolMode(ToolMode mode) { m_mode = mode; qDebug() << "Mode set to: " << mode; repaint(); } void Canvas::slotRunAction(QObject *o) { TransferObject *to = (TransferObject*) o; qDebug() << "Desired action: " << toString(to->reason); if (dynamic_cast(to->objectPointer) != NULL) { /// Tag actions /// Tag_sV* tag = (Tag_sV*) to->objectPointer; qDebug() << " ... on Tag " << tag->description(); switch (to->reason) { case TransferObject::ACTION_DELETE: for (int i = 0; i < m_tags->size(); ++i) { if (&m_tags->at(i) == tag) { qDebug() << "Tag found, removing: " << m_tags->at(i).description(); m_tags->removeAt(i); break; } } break; case TransferObject::ACTION_RENAME: { bool ok; QString newName = QInputDialog::getText(this, tr("New tag name"), tr("Tag:"), QLineEdit::Normal, tag->description(), &ok, 0, Qt::ImhNone ); if (ok) { tag->setDescription(newName); } break; } case TransferObject::ACTION_SETTIME: { bool ok; double d = QInputDialog::getDouble(this, tr("New tag time"), tr("Time:"), tag->time(), 0, 424242, 5, &ok); if (ok) { tag->setTime(d); } break; } default: qDebug() << "Unknown action on Tag: " << toString(to->reason); Q_ASSERT(false); break; } } else if (dynamic_cast(to->objectPointer)) { /// Node actions /// Node_sV const* node = (Node_sV const*) to->objectPointer; qDebug() << " ... 
on node " << *node; switch (to->reason) { case TransferObject::ACTION_DELETE: m_nodes->deleteNode(m_nodes->indexOf(node)); break; default: qDebug() << "Unknown action on Node: " << toString(to->reason); Q_ASSERT(false); break; } } repaint(); } void Canvas::slotChangeCurveType(int curveType) { qDebug() << "Changing curve type to " << toString((CurveType)curveType) << " at " << convertCanvasToTime(m_states.prevMousePos).x(); m_nodes->setCurveType(convertCanvasToTime(m_states.prevMousePos).x(), (CurveType) curveType); emit nodesChanged(); } void Canvas::slotResetHandle(const QString &position) { if (dynamic_cast(m_states.initialContextObject) != NULL) { Node_sV *node = const_cast(dynamic_cast(m_states.initialContextObject)); if (position == "left") { node->setLeftNodeHandle(0, 0); } else { node->setRightNodeHandle(0, 0); } emit nodesChanged(); } else { qDebug() << "Object at mouse position is " << m_states.initialContextObject << ", cannot reset the handle."; } } /** * */ void Canvas::setCurveSpeed(double speed) { QString message; qDebug() << "Setting curve to " << speed << "x speed."; int errcode = m_nodes->setSpeed(convertCanvasToTime(m_states.prevMousePos).x(), speed); //TODO: should find a better way ... (try/catch ?) switch (errcode) { case -1: message = tr("%1 x speed would shoot over maximum time. Correcting.").arg(speed); QMessageBox(QMessageBox::Warning, tr("Warning"), message, QMessageBox::Ok).exec(); break; case -2: message = tr("%1 x speed goes below 0. Correcting.").arg(speed); QMessageBox(QMessageBox::Warning, tr("Warning"), message, QMessageBox::Ok).exec(); break; case -3: message = tr("New node would be too close, not adding it."); QMessageBox(QMessageBox::Warning, tr("Warning"), message, QMessageBox::Ok).exec(); break; case -4 : message = tr("Outside segment."); QMessageBox(QMessageBox::Warning, tr("Warning"), message, QMessageBox::Ok).exec(); break; } emit nodesChanged(); repaint(); } void Canvas::slotSetSpeed() { bool ok = true; double d = m_settings.value("canvas/replaySpeed", 1.0).toDouble(); qDebug() << "Getting: " << d; d = QInputDialog::getDouble(this, tr("Replay speed for current segment"), tr("Speed:"), d, -1000, 1000, 3, &ok); if (ok) { setCurveSpeed(d); m_settings.setValue("canvas/replaySpeed", d); qDebug() << "Setting: " << d; } } void Canvas::slotSetSpeed(QString s) { bool ok = true; double d = s.toDouble(&ok); if (ok) { setCurveSpeed(d); } else { qDebug() << "Not ok: " << s; } } void Canvas::slotSetShutterFunction() { int left = m_nodes->find(m_states.contextmenuMouseTime.x()); if (left == m_nodes->size()-1) { left = m_nodes->size()-2; } if (m_shutterFunctionDialog == NULL) { m_shutterFunctionDialog = new ShutterFunctionDialog(m_project, this); connect(this, SIGNAL(nodesChanged()), m_shutterFunctionDialog, SLOT(slotNodesUpdated())); } m_shutterFunctionDialog->setSegment(left); if (!m_shutterFunctionDialog->isVisible()) { m_shutterFunctionDialog->show(); } } QDebug operator <<(QDebug qd, const Canvas::ToolMode &mode) { switch(mode) { case Canvas::ToolMode_Select: qd << "Select tool"; break; case Canvas::ToolMode_Move: qd << "Move tool"; break; } return qd.maybeSpace(); } QDebug operator <<(QDebug qd, const Canvas::Abort &abort) { switch(abort) { case Canvas::Abort_General: qd << "Abort General"; break; case Canvas::Abort_Selection: qd << "Abort Selection"; break; } return qd.maybeSpace(); } QString toString(TransferObject::Reason reason) { switch (reason) { case TransferObject::ACTION_DELETE : return "Delete"; case TransferObject::ACTION_SNAPIN : return "Snap 
in"; case TransferObject::ACTION_RENAME : return "Rename"; case TransferObject::ACTION_SETTIME : return "Set time"; default : Q_ASSERT(false); return "Unknown action"; } } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/canvasTools.h0000664000000000000000000000150413151342440023345 0ustar rootroot/* This file is part of slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef CANVASTOOLS_H #define CANVASTOOLS_H #include class Canvas; class Node_sV; class Project_sV; class CanvasTools { public: /// Assembles the output time label at cursor position, taking into account fps and axis resolution. static QString outputTimeLabel(Canvas *canvas, Node_sV &time); /// Calculates the replay speed at mouse position in percent static QString outputSpeedLabel(Node_sV &time, Project_sV *project); }; #endif // CANVASTOOLS_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/flowEditCanvas.h0000664000000000000000000000232313151342440023762 0ustar rootroot/* slowmoFlowEdit is a user interface for editing slowmoVideo's Optical Flow files. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef FLOWEDITCANVAS_H #define FLOWEDITCANVAS_H #include #include class FlowField_sV; namespace Ui { class FlowEditCanvas; } /// \todo Auto-fix feature (confirm to accept) class FlowEditCanvas : public QWidget { Q_OBJECT public: explicit FlowEditCanvas(QWidget *parent = 0); ~FlowEditCanvas(); void setAmplification(float val); float amplification() const; public slots: void slotLoadFlow(QString filename); void slotSaveFlow(QString filename = QString()); void newAmplification(int val); private: Ui::FlowEditCanvas *ui; FlowField_sV *m_flowField; QString m_flowFilename; float m_boost; void repaintFlow(); private slots: void slotRectDrawn(QRectF imageRect); void slotExamineValues(float x, float y); }; #endif // FLOWEDITCANVAS_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/mainwindow.cpp0000664000000000000000000006221613151342440023567 0ustar rootroot/* slowmoUI is a user interface for slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. 
*/ #include #include #include #include #include #include #include #include #include #include #include #include #include #include #include #include "mainwindow.h" #include "ui_mainwindow.h" #include "newProjectDialog.h" #include "progressDialog.h" #include "renderingDialog.h" #include "projectPreferencesDialog.h" #include "preferencesDialog.h" #include "aboutDialog.h" #include "lib/defs_sV.hpp" #include "project/renderTask_sV.h" #include "project/xmlProjectRW_sV.h" #include "project/abstractFrameSource_sV.h" #include "project/projectPreferences_sV.h" // for editing optical flow #include "flowEditCanvas.h" #include "logbrowserdialog.h" extern QPointer logBrowser; MainWindow::MainWindow(QString projectPath, QWidget *parent) : QMainWindow(parent), ui(new Ui::MainWindow), m_progressDialog(NULL), m_renderProgressDialog(NULL), m_flowExaminer(NULL), m_cs(this) { ui->setupUi(this); m_project = new Project_sV(); m_wCanvas = new Canvas(m_project, this); setCentralWidget(m_wCanvas); createActions(); createDockWindows(); updateWindowTitle(); setWindowIcon(QIcon(":icons/slowmoIcon.png")); QSettings settings; bool show = settings.value("ui/displayHelp", false).toBool(); m_wCanvas->showHelp(show); settings.sync(); restoreGeometry(settings.value("mainwindow/geometry").toByteArray()); restoreState(settings.value("mainwindow/windowState").toByteArray()); if (!projectPath.isEmpty()) { loadProject(projectPath); } if (!settings.contains("binaries/ffmpeg")) { qDebug() << "need to find ffmpeg"; QMessageBox::information( this, "valid FFMPEG not found", "Please choose a working ffmpeg\n" , QMessageBox::Ok, 0 ); PreferencesDialog dialog; dialog.exec(); } } MainWindow::~MainWindow() { delete ui; delete m_wInputMonitor; delete m_wCurveMonitor; delete m_wInputMonitorDock; delete m_wCurveMonitorDock; delete m_wRenderPreview; delete m_wRenderPreviewDock; if (m_project != NULL) { delete m_project; } // shoudl check task ? 
m_rendererThread.quit(); m_rendererThread.wait(); if (m_progressDialog != NULL) { delete m_progressDialog; } if (m_renderProgressDialog != NULL) { delete m_renderProgressDialog; } if (m_flowExaminer != NULL) { delete m_flowExaminer; } for (int i = 0; i < m_widgetActions.size(); i++) { delete m_widgetActions[i]; } } void MainWindow::createActions() { ui->actionNew->setShortcut(QKeySequence(QKeySequence::New)); ui->actionOpen->setShortcut(QKeySequence(QKeySequence::Open)); ui->actionSave->setShortcut(QKeySequence(QKeySequence::Save)); ui->actionSave_as->setShortcut(QKeySequence("Shift+Ctrl+S")); ui->actionShortcuts->setShortcut(QKeySequence("Ctrl+H")); ui->actionRender->setShortcut(QKeySequence("Ctrl+R")); ui->actionRenderPreview->setShortcut(QKeySequence("Shift+Ctrl+R")); ui->actionExamineFlow->setShortcut(QKeySequence("Shift+Ctrl+X")); ui->actionPreferences->setShortcut(QKeySequence("Ctrl+,")); ui->actionAbout->setShortcut(QKeySequence("F1")); ui->actionQuit->setShortcut(QKeySequence(QKeySequence::Quit)); ui->actionZoomIn->setShortcut(QKeySequence(QKeySequence::ZoomIn)); ui->actionZoomOut->setShortcut(QKeySequence(QKeySequence::ZoomOut)); m_cs.addShortcut("h", Help, tr("Show help overlay")); m_cs.addShortcut("q-q", Quit, tr("Quit")); m_cs.addShortcut("n", New, tr("New project")); m_cs.addShortcut("o", Open, tr("Open project")); m_cs.addShortcut("s-s", Save_Same, tr("Save")); m_cs.addShortcut("s-a", Save_As, tr("Save as ...")); m_cs.addShortcut("a", Abort, tr("Abort move")); m_cs.addShortcut("a-s", Abort_Selection, tr("Unselect all")); m_cs.addShortcut("d-n", Delete_Node, tr("Delete selected nodes")); m_cs.addShortcut("t-s", Tool_Select, tr("Selecting tool")); m_cs.addShortcut("t-m", Tool_Move, tr("Move tool")); m_cs.addShortcut("t-t", Tag, tr("Insert label (tag)")); connect(&m_cs, SIGNAL(signalShortcutUsed(int)), this, SLOT(slotShortcutUsed(int))); connect(this, SIGNAL(deleteNodes()), m_wCanvas, SLOT(slotDeleteNodes())); connect(this, SIGNAL(setMode(Canvas::ToolMode)), m_wCanvas, SLOT(slotSetToolMode(Canvas::ToolMode))); connect(this, SIGNAL(abort(Canvas::Abort)), m_wCanvas, SLOT(slotAbort(Canvas::Abort))); connect(this, SIGNAL(addTag()), m_wCanvas, SLOT(slotAddTag())); connect(ui->actionZoomIn, SIGNAL(triggered()), m_wCanvas, SLOT(slotZoomIn())); connect(ui->actionZoomOut, SIGNAL(triggered()), m_wCanvas, SLOT(slotZoomOut())); connect(m_wCanvas, SIGNAL(signalMouseInputTimeChanged(qreal)), this, SLOT(slotForwardInputPosition(qreal))); connect(m_wCanvas, SIGNAL(signalMouseCurveSrcTimeChanged(qreal)), this, SLOT(slotForwardCurveSrcPosition(qreal))); connect(ui->actionNew, SIGNAL(triggered()), this, SLOT(slotNewProject())); connect(ui->actionOpen, SIGNAL(triggered()), this, SLOT(slotLoadProjectDialog())); connect(ui->actionSave, SIGNAL(triggered()), this, SLOT(slotSaveProject())); connect(ui->actionSave_as, SIGNAL(triggered()), this, SLOT(slotSaveProjectDialog())); connect(ui->actionRender, SIGNAL(triggered()), this, SLOT(slotShowRenderDialog())); connect(ui->actionRenderPreview, SIGNAL(triggered()), this, SLOT(slotUpdateRenderPreview())); connect(ui->actionExamineFlow, SIGNAL(triggered()), this, SLOT(slotShowFlowExaminerDialog())); connect(ui->actionPreferences, SIGNAL(triggered()), this, SLOT(slotShowPreferencesDialog())); connect(ui->actionShortcuts, SIGNAL(triggered()), this, SLOT(slotToggleHelp())); connect(ui->actionAbout, SIGNAL(triggered()), this, SLOT(slotShowAboutDialog())); connect(ui->actionQuit, SIGNAL(triggered()), this, SLOT(close())); connect(ui->actionProjectPreferences, 
SIGNAL(triggered()), this, SLOT(slotShowProjectPreferencesDialog())); connect(ui->actionEdit_Flow, SIGNAL(triggered()), this, SLOT(slotShowFlowEditWindow())); connect(ui->actionDebug_Window, SIGNAL(toggled(bool)), this, SLOT(slotShowDebugWindow(bool))); } void MainWindow::createDockWindows() { m_wInputMonitor = new FrameMonitor(this); m_wInputMonitorDock = new QDockWidget(tr("Input monitor"), this); m_wInputMonitorDock->setWidget(m_wInputMonitor); m_wInputMonitorDock->setObjectName("inputMonitor"); addDockWidget(Qt::TopDockWidgetArea, m_wInputMonitorDock); m_wCurveMonitor = new FrameMonitor(this); m_wCurveMonitorDock = new QDockWidget(tr("Curve monitor"), this); m_wCurveMonitorDock->setWidget(m_wCurveMonitor); m_wCurveMonitorDock->setObjectName("curveMonitor"); addDockWidget(Qt::TopDockWidgetArea, m_wCurveMonitorDock); m_wRenderPreview = new RenderPreview(m_project, this); m_wRenderPreviewDock = new QDockWidget(tr("Render preview"), this); m_wRenderPreviewDock->setWidget(m_wRenderPreview); m_wRenderPreviewDock->setObjectName("renderPreview"); addDockWidget(Qt::TopDockWidgetArea, m_wRenderPreviewDock); //TODO: replace by : // ui->menuView->addAction(dock->toggleViewAction()); // http://ariya.ofilabs.com/2007/04/custom-toggle-action-for-qdockwidget.html // Fill the view menu that allows (de)activating widgets QObjectList windowChildren = children(); QDockWidget *w; for (int i = 0; i < windowChildren.size(); i++) { if ((w = dynamic_cast(windowChildren.at(i))) != NULL) { qDebug() << "Adding " << w->windowTitle() << " to the menu's widget list"; QAction *a = new QAction("&" + w->objectName(), this); a->setCheckable(true); connect(a, SIGNAL(toggled(bool)), w, SLOT(setVisible(bool))); // This does not work since it is also emitted e.g. when the window is minimized // (with «Show Desktop» on KDE4), therefore an event filter is required. (below.) // Thanks ArGGu^^ for the tip! connect(w, SIGNAL(visibilityChanged(bool)), a, SLOT(setChecked(bool))); //a->setChecked(true); #if QT_VERSION <= QT_VERSION_CHECK(4, 2, 0) // To uncheck the menu entry when the widget is closed via the (x) w->installEventFilter(this); #endif ui->menuView->addAction(a); m_widgetActions << a; } } } bool MainWindow::okToContinue() { if (isWindowModified()) { int r = QMessageBox::warning(this, tr("slowmoUI"), tr("The document has been modified.\n" "Do you want to save your changes?"), QMessageBox::Yes | QMessageBox::No | QMessageBox::Cancel); if (r == QMessageBox::Yes) { slotSaveProjectDialog(); return true; } else if (r == QMessageBox::Cancel) { return false; } } return true; } void MainWindow::closeEvent(QCloseEvent *e) { if (okToContinue()) { qDebug() << "closing"; m_settings.setValue("mainwindow/geometry", saveGeometry()); m_settings.setValue("mainwindow/windowState", saveState()); logBrowser->close(); QMainWindow::closeEvent(e); } } #if QT_VERSION <= QT_VERSION_CHECK(4, 2, 0) /** * this is only for pre 4.2 code ! 
* http://ariya.ofilabs.com/2007/04/custom-toggle-action-for-qdockwidget.html */ bool MainWindow::eventFilter(QObject *obj, QEvent *e) { QObjectList windowChildren = children(); QDockWidget *w; if (e->type() == QEvent::Close && windowChildren.contains(obj)) { if ((w = dynamic_cast<QDockWidget*>(obj)) != NULL) { QList<QAction*> actions = findChildren<QAction*>(); for (int i = 0; i < actions.size(); i++) { if (actions.at(i)->text() == w->objectName()) { actions.at(i)->setChecked(false); return true; } } } } return QObject::eventFilter(obj, e); //return false; } #endif ////////// Shortcuts void MainWindow::slotShortcutUsed(int id) { if (id == Quit) { qApp->quit(); } else if (id == Abort_Selection) { emit abort(Canvas::Abort_Selection); } else if (id == Delete_Node) { emit deleteNodes(); } else if (id == Tool_Select) { emit setMode(Canvas::ToolMode_Select); } else if (id == Tool_Move) { emit setMode(Canvas::ToolMode_Move); } else if (id == Tag) { emit addTag(); } else if (id == Save_Same) { slotSaveProject(); } else if (id == Save_As) { slotSaveProjectDialog(); } else if (id == Abort) { emit abort(Canvas::Abort_General); } else if (id == Help) { slotToggleHelp(); } else if (id == New) { slotNewProject(); } else if (id == Open) { slotLoadProjectDialog(); } } ////////// Project R/W void MainWindow::slotNewProject() { NewProjectDialog npd(this); if (npd.exec() == QDialog::Accepted) { try { Project_sV *project = npd.buildProject(); // Save project XmlProjectRW_sV writer; //qDebug() << "Saving project as " << npd.filename; // check if the directory exists ... QFileInfo projfile(npd.projectFilename()); QDir dir(projfile.absoluteDir()); if (!dir.exists()) { dir.mkpath("."); } try { writer.saveProject(project, npd.projectFilename()); statusBar()->showMessage(QString(tr("Saved project as: %1")).arg(npd.projectFilename())); setWindowModified(false); } catch (Error_sV &err) { QMessageBox(QMessageBox::Warning, tr("Error writing project file"), err.message()).exec(); } m_projectPath = npd.projectFilename(); project->preferences()->viewport_secRes() = QPointF(400, 400)/project->frameSource()->framesCount()*project->frameSource()->fps()->fps(); /* add a first (default) node */ Node_sV snode; snode.setX(0.0); snode.setY(0.0); project->nodes()->add(snode); loadProject(project); m_wCanvas->showHelp(true); setWindowModified(true); } catch (FrameSourceError &err) { QMessageBox(QMessageBox::Warning, "Frame source error", err.message()).exec(); } } } void MainWindow::loadProject(Project_sV *project) { Q_ASSERT(project != NULL); resetDialogs(); Project_sV *projTemp = NULL; if (m_project != NULL) { projTemp = m_project; } m_project = project; m_wCanvas->load(m_project); m_wRenderPreview->load(m_project); updateWindowTitle(); if (projTemp != NULL) { // Do not delete the old project object earlier to avoid segfaults // (may still be used in the ShutterFunction dialog e.g.)
delete projTemp; } connect(m_project->frameSource(), SIGNAL(signalNextTask(QString,int)), this, SLOT(slotNewFrameSourceTask(QString,int))); connect(m_project->frameSource(), SIGNAL(signalAllTasksFinished()), this, SLOT(slotFrameSourceTasksFinished())); m_project->frameSource()->initialize(); } void MainWindow::slotLoadProjectDialog() { if (okToContinue()) { QString dir = m_settings.value("directories/lastOpenedProject", QDir::current().absolutePath()).toString(); QString file = QFileDialog::getOpenFileName(this, tr("Load Project"), dir, tr("slowmoVideo projects (*.sVproj)")); if (!file.isEmpty()) { qDebug() << file; loadProject(QFileInfo(file).absoluteFilePath()); } } } void MainWindow::loadProject(QString path) { m_settings.setValue("directories/lastOpenedProject", path); XmlProjectRW_sV reader; try { QString warning; Project_sV *project = reader.loadProject(path, &warning); if (warning.length() > 0) { QMessageBox(QMessageBox::Warning, tr("Warning"), warning).exec(); } m_projectPath = path; loadProject(project); } catch (FrameSourceError &err) { QMessageBox(QMessageBox::Warning, tr("Frame source error"), err.message()).exec(); } catch (Error_sV &err) { QMessageBox(QMessageBox::Warning, tr("Error"), err.message()).exec(); } } void MainWindow::slotSaveProject(QString filename) { if (filename.length() == 0) { filename = m_project->projectFilename(); } if (filename.length() == 0) { qDebug() << "No filename given, won't save. (Perhaps an empty project?)"; statusBar()->showMessage(tr("No filename given, won't save. (Perhaps an empty project?)"), 5000); } else { qDebug() << "Saving project as " << filename; try { XmlProjectRW_sV writer; writer.saveProject(m_project, filename); statusBar()->showMessage(QString(tr("Saved project as: %1")).arg(filename)); setWindowModified(false); } catch (Error_sV &err) { QMessageBox(QMessageBox::Warning, tr("Error writing project file"), err.message()).exec(); } } } void MainWindow::slotSaveProjectDialog() { QFileDialog dialog(this, tr("Save project")); dialog.setAcceptMode(QFileDialog::AcceptSave); dialog.setDefaultSuffix("sVproj"); dialog.setNameFilter(tr("slowmoVideo projects (*.sVproj)")); dialog.setFileMode(QFileDialog::AnyFile); dialog.setDirectory(QFileInfo(m_project->projectFilename()).absolutePath()); if (dialog.exec() == QDialog::Accepted) { slotSaveProject(dialog.selectedFiles().at(0)); } } ////////// UI interaction void MainWindow::slotToggleHelp() { m_wCanvas->toggleHelp(); } void MainWindow::displayHelp(QPainter &davinci) const { QString helpText = m_cs.shortcutList() + tr("\nNavigation: [Shift] Scroll, Drag") + tr("\nMove nodes: [Ctrl] Drag"); QRect content; const QPoint topLeft(10, 10); const QPoint padding(10, 10); // Check how big the text's bounding box will be davinci.drawText(QRect(0,0,0,0), Qt::AlignLeft | Qt::TextExpandTabs, helpText, &content); // Draw the background content.adjust(topLeft.x(), topLeft.y(), topLeft.x()+2*padding.x(), topLeft.y()+2*padding.y()); davinci.fillRect(content, QColor(0,0,40, 200)); // Really draw the text now content.translate(padding); davinci.drawText(content, Qt::AlignLeft, helpText, &content); } void MainWindow::slotForwardInputPosition(qreal frame) { if (0 <= frame && frame < m_project->frameSource()->framesCount()) { m_wInputMonitor->slotLoadImage(m_project->frameSource()->framePath(qFloor(frame), FrameSize_Small)); } } void MainWindow::slotForwardCurveSrcPosition(qreal frame) { if (0 <= frame && frame < m_project->frameSource()->framesCount()) { 
m_wCurveMonitor->slotLoadImage(m_project->frameSource()->framePath(qFloor(frame), FrameSize_Small)); } } void MainWindow::slotUpdateRenderPreview() { m_wRenderPreview->slotRenderAt(m_project->snapToOutFrame( m_wCanvas->prevMouseTime().x(), false, m_project->preferences()->renderFPS(), NULL) ); } void MainWindow::updateWindowTitle() { QString project(tr("empty project")); if (m_projectPath.length() > 0) { project = m_projectPath; } setWindowTitle(QString("slowmo UI (%1) [*]").arg(project)); } ////////// Dialogues void MainWindow::resetDialogs() { if (m_progressDialog != NULL) { m_progressDialog->close(); delete m_progressDialog; m_progressDialog = NULL; } if (m_renderProgressDialog != NULL) { m_renderProgressDialog->close(); delete m_renderProgressDialog; m_renderProgressDialog = NULL; } if (m_flowExaminer != NULL) { m_flowExaminer->close(); delete m_flowExaminer; m_flowExaminer = NULL; } } void MainWindow::slotShowAboutDialog() { AboutDialog dialog(this); dialog.exec(); } void MainWindow::slotShowPreferencesDialog() { PreferencesDialog dialog; dialog.exec(); // Use the new flow method (if it has changed) m_project->reloadFlowSource(); } void MainWindow::slotShowProjectPreferencesDialog() { ProjectPreferencesDialog ppd(m_project->preferences(), this); ppd.exec(); } void MainWindow::slotShowFlowExaminerDialog() { if (m_flowExaminer == NULL) { m_flowExaminer = new FlowExaminer(m_project, this); } int frame = floor(m_wCanvas->prevMouseInFrame()); if (frame+1 >= m_project->frameSource()->framesCount()) { frame = m_project->frameSource()->framesCount()-2; } if (frame < 0) { frame = 0; } m_flowExaminer->show(); m_flowExaminer->examine(frame); } void MainWindow::slotShowRenderDialog() { if (m_project->renderTask() != NULL) { disconnect(SIGNAL(signalRendererContinue()), m_project->renderTask()); } RenderingDialog renderingDialog(m_project, this); if (renderingDialog.exec() == QDialog::Accepted) { RenderTask_sV *task = renderingDialog.buildTask(); if (task != 0) { task->moveToThread(&m_rendererThread); if (m_project->renderTask() != NULL) { disconnect(SIGNAL(signalRendererContinue()), m_project->renderTask()); } //m_project->replaceRenderTask(task); if (m_renderProgressDialog == NULL) { m_renderProgressDialog = new ProgressDialog(this); m_renderProgressDialog->setWindowTitle(tr("Rendering progress")); } else { m_renderProgressDialog->disconnect(); } connect(task, SIGNAL(signalNewTask(QString,int)), m_renderProgressDialog, SLOT(slotNextTask(QString,int))); connect(task, SIGNAL(signalItemDesc(QString)), m_renderProgressDialog, SLOT(slotTaskItemDescription(QString))); connect(task, SIGNAL(signalTaskProgress(int)), m_renderProgressDialog, SLOT(slotTaskProgress(int))); connect(task, SIGNAL(signalRenderingFinished(QString)), m_renderProgressDialog, SLOT(slotAllTasksFinished(QString))); connect(task, SIGNAL(signalRenderingAborted(QString)), this, SLOT(slotRenderingAborted(QString))); connect(task, SIGNAL(signalRenderingAborted(QString)), m_renderProgressDialog, SLOT(close())); connect(task, SIGNAL(signalRenderingStopped(QString)), m_renderProgressDialog, SLOT(slotAborted(QString))); connect(m_renderProgressDialog, SIGNAL(signalAbortTask()), task, SLOT(slotStopRendering())); //connect(this, SIGNAL(signalRendererContinue()), task, SLOT(slotContinueRendering()), Qt::UniqueConnection); connect(task, SIGNAL(workFlowRequested()), &m_rendererThread, SLOT(start())); connect(&m_rendererThread, SIGNAL(started()), task, SLOT(slotContinueRendering())); connect(task, SIGNAL(signalRenderingFinished(QString)), 
&m_rendererThread, SLOT(quit())); // done another way ?! connect(task, SIGNAL(signalRenderingFinished(QString)), task, SLOT(deleteLater())); //connect(&m_rendererThread, &QThread::finished, task, &QObject::deleteLater); // let's start m_rendererThread.wait(); // If the thread is not running, this will immediately return. m_renderProgressDialog->show(); //emit signalRendererContinue(); //m_rendererThread.exec (); m_rendererThread.start(); task->requestWork(); } } else { QMessageBox(QMessageBox::Warning, tr("Aborted"), tr("Aborted by user"), QMessageBox::Ok).exec(); } } void MainWindow::slotRenderingAborted(QString message) { QMessageBox(QMessageBox::Warning, tr("Error"), message, QMessageBox::Ok).exec(); } void MainWindow::slotNewFrameSourceTask(const QString taskDescription, int taskSize) { if (m_progressDialog == NULL) { m_progressDialog = new ProgressDialog(this); m_progressDialog->setWindowTitle(tr("Frame extraction progress")); connect(m_project->frameSource(), SIGNAL(signalNextTask(QString,int)), m_progressDialog, SLOT(slotNextTask(QString,int))); connect(m_project->frameSource(), SIGNAL(signalTaskProgress(int)), m_progressDialog, SLOT(slotTaskProgress(int))); connect(m_project->frameSource(), SIGNAL(signalTaskItemDescription(QString)), m_progressDialog, SLOT(slotTaskItemDescription(QString))); connect(m_project->frameSource(), SIGNAL(signalAllTasksFinished()), m_progressDialog, SLOT(slotAllTasksFinished())); connect(m_progressDialog, SIGNAL(signalAbortTask()), m_project->frameSource(), SLOT(slotAbortInitialization())); } m_progressDialog->show(); m_progressDialog->slotNextTask(taskDescription, taskSize); } void MainWindow::slotFrameSourceTasksFinished() { QTimer::singleShot(200, this, SLOT(slotCloseFrameSourceProgress())); } void MainWindow::slotCloseFrameSourceProgress() { if (m_progressDialog != NULL) { m_progressDialog->close(); } //is right place ? should we check ? 
m_project->buildCacheFlowSource(); } #if 0 /** * load a flow file in edit window * * @param filename <#filename description#> */ void MainWindow::loadFlow(QString filename) { if (QFileInfo(filename).exists()) { m_canvas->slotLoadFlow(filename); m_lastFlowFile = filename; updateTitle(); } } #endif /** * display the optical flow editor */ void MainWindow::slotShowFlowEditWindow() { //TODO: show window qDebug() << "slotShowFlowEditWindow: No Yet Implemented"; #if 1 FlowEditCanvas* m_canvas; m_canvas = new FlowEditCanvas(0); m_canvas->show(); #endif #if 0 // dock it QDockWidget *m_wflowMonitorDock = new QDockWidget(tr("Flow monitor"), this); m_wflowMonitorDock->setWidget(m_canvas); m_wflowMonitorDock->setObjectName("flowMonitor"); addDockWidget(Qt::TopDockWidgetArea, m_wflowMonitorDock); #endif } /** * display/hide the debug window */ void MainWindow::slotShowDebugWindow(bool set) { qDebug() << "slotShowDebugWindow " << set; if (set) logBrowser->show(); else logBrowser->hide(); #if 0 // dock it QDockWidget *m_wflowMonitorDock = new QDockWidget(tr("Flow monitor"), this); m_wflowMonitorDock->setWidget(m_canvas); m_wflowMonitorDock->setObjectName("flowMonitor"); addDockWidget(Qt::TopDockWidgetArea, m_wflowMonitorDock); #endif } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/macnotificationhandler.mm0000664000000000000000000000460313151342440025743 0ustar rootroot#include "macnotificationhandler.h" #undef slots #include void MacNotificationHandler::showNotification(const QString &title, const QString &text) { // check if users OS has support for NSUserNotification if(this->hasUserNotificationCenterSupport()) { // okay, seems like 10.8+ QByteArray utf8 = title.toUtf8(); char* cString = (char *)utf8.constData(); NSString *titleMac = [[NSString alloc] initWithUTF8String:cString]; utf8 = text.toUtf8(); cString = (char *)utf8.constData(); NSString *textMac = [[NSString alloc] initWithUTF8String:cString]; // do everything weak linked (because we will keep <10.8 compatibility) Class userNotification = [[NSClassFromString(@"NSUserNotification") alloc] init]; if (userNotification) { [userNotification performSelector:@selector(setTitle:) withObject:titleMac]; [userNotification performSelector:@selector(setInformativeText:) withObject:textMac]; } Class notificationCenterInstance = [NSClassFromString(@"NSUserNotificationCenter") performSelector:@selector(defaultUserNotificationCenter)]; if (notificationCenterInstance) [notificationCenterInstance performSelector:@selector(deliverNotification:) withObject:userNotification]; [titleMac release]; [textMac release]; [userNotification release]; } } // sendAppleScript just take a QString and executes it as apple script void MacNotificationHandler::sendAppleScript(const QString &script) { QByteArray utf8 = script.toUtf8(); char* cString = (char *)utf8.constData(); NSString *scriptApple = [[NSString alloc] initWithUTF8String:cString]; NSAppleScript *as = [[NSAppleScript alloc] initWithSource:scriptApple]; NSDictionary *err = nil; [as executeAndReturnError:&err]; [as release]; [scriptApple release]; } bool MacNotificationHandler::hasUserNotificationCenterSupport(void) { Class possibleClass = NSClassFromString(@"NSUserNotificationCenter"); // check if users OS has support for NSUserNotification if(possibleClass!=nil) { return true; } return false; } MacNotificationHandler *MacNotificationHandler::instance() { static MacNotificationHandler *s_instance = NULL; if (!s_instance) s_instance = new MacNotificationHandler(); return s_instance; } 
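mainwindow.cpp above builds the View menu by creating a checkable QAction for every QDockWidget by hand and keeping it in sync through visibilityChanged() plus, for Qt <= 4.2, an extra event filter; the TODO comment in MainWindow::createDockWindows() already points to QDockWidget::toggleViewAction() as the simpler replacement. The stand-alone sketch below is not part of the archived sources (window contents and object names are made up for illustration); it only shows what that replacement looks like: toggleViewAction() returns a checkable action that Qt keeps in sync with the dock's visibility, so no manual bookkeeping or event filter is needed.

#include <QApplication>
#include <QDockWidget>
#include <QLabel>
#include <QMainWindow>
#include <QMenu>
#include <QMenuBar>

int main(int argc, char **argv)
{
    QApplication app(argc, argv);

    QMainWindow window;
    QDockWidget *dock = new QDockWidget(QObject::tr("Input monitor"), &window);
    dock->setObjectName("inputMonitor");   // object names are required for saveState()/restoreState()
    dock->setWidget(new QLabel(QObject::tr("monitor contents"), dock));
    window.addDockWidget(Qt::TopDockWidgetArea, dock);

    // One call replaces the hand-made QAction wiring: the returned action is
    // checkable, shows/hides the dock, and is unchecked when the dock is closed.
    QMenu *viewMenu = window.menuBar()->addMenu(QObject::tr("&View"));
    viewMenu->addAction(dock->toggleViewAction());

    window.show();
    return app.exec();
}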
slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/renderPreview.ui0000664000000000000000000000330413151342440024060 0ustar rootroot RenderPreview 0 0 458 277 Render preview 0 0 QFrame::NoFrame QFrame::Raised Qt::Horizontal 40 0 This is an information message. ImageDisplay QFrame
libgui/imageDisplay.h
1
slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/frameMonitor.h0000664000000000000000000000224413151342440023515 0ustar rootroot/* slowmoUI is a user interface for slowmoVideo. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #ifndef INPUTMONITOR_H #define INPUTMONITOR_H #include #include #include #include namespace Ui { class FrameMonitor; } /** \brief Used for displaying input frames at the mouse position. */ class FrameMonitor : public QWidget { Q_OBJECT public: explicit FrameMonitor(QWidget *parent = 0); ~FrameMonitor(); void setCacheLimit(int n); int cacheLimit() { return cache_limit; }; protected: virtual void paintEvent(QPaintEvent *event); virtual void closeEvent(QCloseEvent *event) ; public slots: void slotLoadImage(const QString &filename); private: Ui::FrameMonitor *ui; QSemaphore m_semaphore; QString *m_queue[2]; // for cache mgmt int cache_limit; QCache imgCache; }; #endif // INPUTMONITOR_H slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/frameMonitor.ui0000664000000000000000000000203413151342440023700 0ustar rootroot FrameMonitor 0 0 400 300 0 240 Input monitor QFrame::NoFrame QFrame::Raised ImageDisplay QFrame
libgui/imageDisplay.h
1
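frameMonitor.h above declares a small QCache-backed image cache (imgCache, cache_limit, setCacheLimit) for the input frames shown in the monitor. The matching frameMonitor.cpp is not part of this excerpt, so the following is only a minimal sketch of how such a QCache-based frame loader can work; the FrameCache class, its method names and the cost model (one cost unit per frame) are assumptions for illustration, not slowmoVideo code.

// frameCacheSketch.cpp -- illustrative sketch, not the actual FrameMonitor implementation.
#include <QCache>
#include <QImage>
#include <QString>
#include <QDebug>

class FrameCache
{
public:
    explicit FrameCache(int maxFrames = 16) { m_cache.setMaxCost(maxFrames); }

    // Returns the frame for the given file, loading and caching it on a miss.
    QImage frame(const QString &filename)
    {
        if (QImage *cached = m_cache.object(filename)) {
            return *cached;                           // cache hit
        }
        QImage img(filename);                         // disk load on a miss
        if (img.isNull()) {
            qDebug() << "Could not load" << filename;
            return QImage();
        }
        // QCache takes ownership of the heap copy; QImage is implicitly shared, so this is cheap.
        m_cache.insert(filename, new QImage(img), 1);
        return img;
    }

private:
    QCache<QString, QImage> m_cache;                  // keyed by file name, one cost unit per frame
};

int main()
{
    FrameCache cache(2);                              // keep at most two frames
    // Hypothetical file names; a missing file simply yields a null QImage.
    QImage a = cache.frame("frame0001.png");
    QImage b = cache.frame("frame0001.png");          // served from the cache if the first load succeeded
    Q_UNUSED(a);
    Q_UNUSED(b);
    return 0;
}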
slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/flowEditCanvas.cpp0000664000000000000000000000630013151342440024314 0ustar rootroot/* slowmoFlowEdit is a user interface for editing slowmoVideo's Optical Flow files. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "flowEditCanvas.h" #include "ui_flowEditCanvas.h" #include "lib/flowRW_sV.h" #include "lib/flowTools_sV.h" #include "lib/flowVisualization_sV.h" #include FlowEditCanvas::FlowEditCanvas(QWidget *parent) : QWidget(parent), ui(new Ui::FlowEditCanvas), m_flowField(NULL), m_boost(1.0) { ui->setupUi(this); ui->flow->trackMouse(true); connect(ui->flow, SIGNAL(signalRectDrawn(QRectF)), this, SLOT(slotRectDrawn(QRectF))); connect(ui->flow, SIGNAL(signalMouseMoved(float,float)), this, SLOT(slotExamineValues(float,float))); connect(ui->amplification, SIGNAL(valueChanged(int)),this, SLOT(newAmplification(int))); } FlowEditCanvas::~FlowEditCanvas() { delete ui; } float FlowEditCanvas::amplification() const { return m_boost; } void FlowEditCanvas::setAmplification(float val) { //qDebug() << "setAmplification: " << val; Q_ASSERT(val > 0); m_boost = val; repaintFlow(); } void FlowEditCanvas::newAmplification(int val) { //qDebug() << "newAmplification: " << val; Q_ASSERT(val > 0); m_boost = (float)val; repaintFlow(); } /// \todo Make flow visualization configurable void FlowEditCanvas::repaintFlow() { if (m_flowField != NULL) { ui->flow->loadImage(FlowVisualization_sV::colourizeFlow(m_flowField, FlowVisualization_sV::HSV, m_boost)); repaint(); } } void FlowEditCanvas::slotRectDrawn(QRectF imageRect) { qDebug() << "Rect drawn: " << imageRect; if (m_flowField != NULL) { Kernel_sV k(8, 8); k.gauss(); FlowTools_sV::deleteRect(*m_flowField, imageRect.top(), imageRect.left(), imageRect.bottom(), imageRect.right()); FlowTools_sV::refill(*m_flowField, k, imageRect.top(), imageRect.left(), imageRect.bottom(), imageRect.right()); repaintFlow(); } } void FlowEditCanvas::slotLoadFlow(QString filename) { if (m_flowField != NULL) { delete m_flowField; m_flowField = NULL; } m_flowField = FlowRW_sV::load(filename.toStdString()); m_flowFilename = filename; repaintFlow(); } void FlowEditCanvas::slotSaveFlow(QString filename) { if (m_flowField != NULL) { if (filename.length() == 0) { filename = m_flowFilename; } FlowRW_sV::save(filename.toStdString(), m_flowField); } else { qDebug() << "No flow file loaded, cannot save."; } } void FlowEditCanvas::slotExamineValues(float x, float y) { if (m_flowField != NULL) { if (x >= 0 && y >= 0 && x <= m_flowField->width()-1 && y <= m_flowField->height()-1) { float dx = m_flowField->x(x,y); float dy = m_flowField->y(x,y); ui->lblValues->setText(QString("dx/dy: (%1|%2)").arg(dx, 0, 'f', 2).arg(dy, 0, 'f', 2)); ui->lblPos->setText(QString("(%1|%2)").arg(x).arg(y)); } } } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoUI/mainwindow.ui0000664000000000000000000001163013151342440023414 0ustar rootroot MainWindow 0 0 1000 700 slowmoVideo UI 0 0 1000 25 &File &Help &View &Project TopToolBarArea false &Render Preferences QAction::PreferencesRole &Save Save &as … &Open &Shortcuts &About QAction::AboutRole &Quit QAction::QuitRole Render &preview E&xamine flow Examine flow at input frame &New … &Preferences QAction::TextHeuristicRole Zoom &in Zoom &out true Edit Optical Flow Edit Optical 
Flow true Debug Window slowmovideo-0.5+git20180116/src/slowmoVideo/visualizeFlow/0000775000000000000000000000000013151342440021765 5ustar rootrootslowmovideo-0.5+git20180116/src/slowmoVideo/visualizeFlow/CMakeLists.txt0000664000000000000000000000032413151342440024524 0ustar rootroot include_directories(..) set(SRCS visualizeFlow.cpp ) add_executable(slowmoVisualizeFlow ${SRCS}) target_link_libraries(slowmoVisualizeFlow sVvis ) install(TARGETS slowmoVisualizeFlow DESTINATION ${DEST}) slowmovideo-0.5+git20180116/src/slowmoVideo/visualizeFlow/visualizeFlow.cpp0000664000000000000000000000654313151342440025344 0ustar rootroot/* slowmoVideo creates slow-motion videos from normal-speed videos. Copyright (C) 2011 Simon A. Eugster (Granjow) This program is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version. */ #include "lib/flowRW_sV.h" #include "lib/flowVisualization_sV.h" #include "lib/flowTools_sV.h" #include #include #include char *myName; void printUsage() { std::cout << "Usage: " << std::endl; std::cout << "\t" << myName << " " << std::endl; std::cout << "\t" << myName << " diff " << std::endl; std::cout << "\t" << myName << " ref (writes an HSV reference image)" << std::endl; } QImage reference() { const int width = 1024, height = 1024; float cX = width/2.0; float cY = height/2.0; FlowField_sV field(width, height); for (int x = 0; x < width; x++) { for (int y = 0; y < height; y++) { field.setX(x,y, x-cX); field.setY(x,y, y-cY); } } return FlowVisualization_sV::colourizeFlow(&field, FlowVisualization_sV::HSV, 1.0); } void colourizeFlow(int argc, char *argv[]) { if (argc <= 2) { printUsage(); exit(-1); } std::string inputFile = argv[1]; QString outputFile(argv[2]); FlowField_sV *flowField; try { flowField = FlowRW_sV::load(inputFile); } catch (FlowRW_sV::FlowRWError &err) { std::cout << err.message << std::endl; exit(-2); } std::cout << "Flow file loaded. Width: " << flowField->width() << ", height: " << flowField->height() << std::endl; /// \todo make visualization type configurable QImage img = FlowVisualization_sV::colourizeFlow(flowField, FlowVisualization_sV::WXY); std::cout << "Saving " << outputFile.toStdString() << " ..." 
<< std::endl; img.save(outputFile); delete flowField; } void diffFlow(int argc, char *argv[]) { if (argc <= 4) { printUsage(); exit(-1); } FlowField_sV *leftFlow = FlowRW_sV::load(argv[2]); FlowField_sV *rightFlow = FlowRW_sV::load(argv[3]); FlowField_sV flowDifference(leftFlow->width(), leftFlow->height()); if (strcmp("diffSigned", argv[1]) == 0) { FlowTools_sV::signedDifference(*leftFlow, *rightFlow, flowDifference); } else { FlowTools_sV::difference(*leftFlow, *rightFlow, flowDifference); } int d; QImage img(flowDifference.width(), flowDifference.height(), QImage::Format_ARGB32); for (int y = 0; y < img.height(); y++) { for (int x = 0; x < img.width(); x++) { d = 128 + flowDifference.x(x,y)+flowDifference.y(x,y); if (d > 255) { d = 255; } if (d < 0) { d = 0; } img.setPixel(x, y, qRgb(d,d,d)); } } img.save(argv[4]); delete leftFlow; delete rightFlow; } int main(int argc, char *argv[]) { myName = argv[0]; if (argc <= 1) { printUsage(); exit(-1); } if (strcmp("diff", argv[1]) == 0 || strcmp("diffSigned", argv[1]) == 0) { diffFlow(argc, argv); } else if (strcmp("ref", argv[1]) == 0) { reference().save("reference.png"); } else { colourizeFlow(argc, argv); } } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoInfo/0000775000000000000000000000000013151342440021256 5ustar rootrootslowmovideo-0.5+git20180116/src/slowmoVideo/slowmoInfo/slowmoInfo.cpp0000664000000000000000000000123413151342440024116 0ustar rootroot #include "../lib/defs_sV.hpp" #include int main(int argc, char *argv[]) { if (argc-1 > 0) { if (strcmp("platform", argv[1]) == 0) { std::cout << Version_sV::platform.toStdString(); } else if (strcmp("version", argv[1]) == 0) { std::cout << Version_sV::version.toStdString(); } else if (strcmp("bits", argv[1]) == 0) { std::cout << Version_sV::bits.toStdString(); } else { std::cout << "Argument not recognized; see " << argv[0] << " (without arguments) for a list.\n"; } } else { std::cout << "Possible arguments: version, bits, platform\n"; } } slowmovideo-0.5+git20180116/src/slowmoVideo/slowmoInfo/CMakeLists.txt0000664000000000000000000000056513151342440024024 0ustar rootroot message(STATUS "slowmoInfo Bundle will be : ${MACOSX_BUNDLE} => ${PROJECT_NAME} ") add_executable(slowmoInfo slowmoInfo.cpp) target_link_libraries(slowmoInfo sV ${EXTERNAL_LIBS}) install(TARGETS ${slowmoUI} BUNDLE DESTINATION . COMPONENT Runtime RUNTIME DESTINATION ${BIN_INSTALL_DIR} COMPONENT Runtime) #install(TARGETS slowmoInfo DESTINATION ${DEST}) slowmovideo-0.5+git20180116/src/slowmoVideo/docs/0000775000000000000000000000000013151342440020052 5ustar rootrootslowmovideo-0.5+git20180116/src/slowmoVideo/docs/src/0000775000000000000000000000000013151342440020641 5ustar rootrootslowmovideo-0.5+git20180116/src/slowmoVideo/docs/src/tabs.css0000664000000000000000000000065613151342440022313 0ustar rootroot.tablist, .navpath ul { clear: left; } .tablist li, .navpath ul li { border-top: 1px solid #999; border-bottom: 1px dotted #999; float: left; list-style-type: none; } .tablist li a, .navpath ul li a { display: block; padding: .5em; } .tablist li a:hover, .navpath ul li a:hover { background-color: #999; } .tabs2 .tablist li, .navpath ul li { border-top: none; } #searchli { border: none; }slowmovideo-0.5+git20180116/src/slowmoVideo/docs/src/description.dox0000664000000000000000000000144313151342440023702 0ustar rootroot/** \mainpage slowmoVideo slowmoVideo is a library for generating slow-motion videos by frame interpolation with optical flow. 
It contains: \li Libraries for reading and writing optical flow (such that it does not need to be re-calculated every time) \li Classes for project management, including reading and writing project files including defined curves, and rendering \li Algorithms for interpolating frames with optical flow (like forward and twoway) \section slowmoUI slowmoUI is a user interface for slowmoVideo for drawing the frame mapping curve (\em speed curve), creating new projects, and rendering. \section visualizeFlow visualizeFlow renders flow files that have been saved with the slowmoVideo libraries to image files, making it easier for human beings to «read» the flow data. */ slowmovideo-0.5+git20180116/src/slowmoVideo/docs/src/doxygen.css0000664000000000000000000001710213151342440023031 0ustar rootroot/* The standard CSS for doxygen modified by RossCairns.com and granjow.net */ #top, .header, div.contents, .footer { clear: left; float: none; max-width: 800px; position: relative; margin: 0 auto; } body, table, div, p, dl { font-family: Lucida Grande, Verdana, Geneva, Arial, sans-serif; font-size: 12px; } p, div, dl { color:#3E3E3E; } .header { padding-top: 2em; } .headertitle { padding-top: .5em; } .footer { padding-top: 2em; } /* @group Heading Levels */ #projectname { font-size: 2em; font-weight: bold; } div.headertitle .title { font-weight: bold; font-size: 2em; } h1 { margin-bottom: 10px; font-size: 30px; padding: 0px 0px 20px 0px ; border-bottom:1px dotted #E0E0E0; } h2 { padding-top: 30px; font-size: 17px; color:#42657B; font-family: Lucida Grande, Verdana, Geneva, Arial, sans-serif; } h3 { font-size: 17px; font-family: Lucida Grande, Verdana, Geneva, Arial, sans-serif; } dt { font-weight: bold; } div.multicol { -moz-column-gap: 1em; -webkit-column-gap: 1em; -moz-column-count: 3; -webkit-column-count: 3; } p.startli, p.startdd, p.starttd { margin-top: 2px; } p.endli { margin-bottom: 0px; } p.enddd { margin-bottom: 4px; } p.endtd { margin-bottom: 2px; } /* @end */ caption { font-weight: bold; } span.legend { font-size: 70%; text-align: center; } h3.version { font-size: 90%; padding-bottom:10px; border-bottom:1px dotted #E0E0E0; } div.qindex, div.navtab{ background-color: #e8eef2; border: 1px solid #84b0c7; text-align: center; margin: 2px; padding: 2px; } div.qindex, div.navpath { width: 100%; line-height: 140%; } div.navtab { margin-right: 15px; } /* @group Link Styling */ a { color: #153788; font-weight: normal; text-decoration: none; } .contents a:visited { color: #1b77c5; } a:hover { text-decoration: underline; } a.qindex { font-weight: bold; } a.qindexHL { font-weight: bold; background-color: #6666cc; color: #ffffff; border: 1px double #9295C2; } .contents a.qindexHL:visited { color: #ffffff; } a.el { font-weight: bold; } a.elRef { } a.code { color: #3030f0; } a.codeRef { color: #3030f0; } /* @end */ dl.el { margin-left: -1cm; } .fragment { font-family: monospace, fixed; font-size: 105%; } pre.fragment { border: 1px solid #CCCCCC; background-color: #f5f5f5; padding: 4px 6px; margin: 4px 8px 4px 2px; overflow: auto; word-wrap: break-word; font-size: 9pt; line-height: 125%; } div.ah { background-color: black; font-weight: bold; color: #ffffff; margin-bottom: 3px; margin-top: 3px } div.groupHeader { margin-left: 16px; margin-top: 12px; margin-bottom: 6px; font-weight: bold; } div.groupText { margin-left: 16px; font-style: italic; } body { background: white; color: black; margin-right: 20px; margin-left: 20px; } td.indexkey { background-color: #F1F5F9; font-weight: bold; border: 1px solid #CCCCCC; 
margin: 2px 0px 2px 0; padding: 2px 10px; } td.indexvalue { background-color: #F1F5F9; border: 1px solid #CCCCCC; padding: 2px 10px; margin: 2px 0px; } tr.memlist { background-color: #f0f0f0; } p.formulaDsp { text-align: center; } img.formulaDsp { } img.formulaInl { vertical-align: middle; } div.center { text-align: center; margin-top: 0px; margin-bottom: 0px; padding: 0px; } div.center img { border: 0px; } img.footer { border: 0px; vertical-align: middle; } /* @group Code Colorization */ span.keyword { color: #008000 } span.keywordtype { color: #604020 } span.keywordflow { color: #e08000 } span.comment { color: #800000 } span.preprocessor { color: #806020 } span.stringliteral { color: #002080 } span.charliteral { color: #008080 } span.vhdldigit { color: #ff00ff } span.vhdlchar { color: #000000 } span.vhdlkeyword { color: #700070 } span.vhdllogic { color: #ff0000 } /* @end */ .search { color: #003399; font-weight: bold; } form.search { margin-bottom: 0px; margin-top: 0px; } input.search { font-size: 75%; color: #000080; font-weight: normal; background-color: #F1F5F9; } td.tiny { font-size: 75%; } .dirtab { padding: 4px; border-collapse: collapse; border: 1px solid #84b0c7; } th.dirtab { background: #F1F5F9; font-weight: bold; } hr { height: 0; border: none; border-top: 1px solid #666; } /* @group Member Descriptions */ .mdescLeft, .mdescRight, .memItemLeft, .memItemRight, .memTemplItemLeft, .memTemplItemRight, .memTemplParams { background-color: #FAFAFA; border: none; margin: 4px; padding: 1px 0 0 8px; } .mdescLeft, .mdescRight { padding: 0px 8px 4px 8px; color: #555; } .memItemLeft, .memItemRight, .memTemplParams { border-top: 1px solid #ccc; background-color: #F9F9F9; } .memItemLeft, .memTemplItemLeft { white-space: nowrap; } .memTemplParams { color: #606060; white-space: nowrap; } /* @end */ /* @group Member Details */ /* Styles for detailed member documentation */ .memtemplate { font-size: 80%; color: #606060; font-weight: normal; margin-left: 3px; } .memnav { background-color: #F1F5F9; border: 1px solid #84b0c7; text-align: center; margin: 2px; margin-right: 15px; padding: 2px; } .memitem { padding: 0; margin-bottom: 30px; } .memname { white-space: nowrap; font-weight: bold; color:#42657B; padding:3px 5px; } .memproto, .memdoc { border: 1px dotted #E0E0E0; } .memproto { padding: 0; background-color: #F9F9F9; font-weight: bold; -webkit-box-shadow: 5px 5px 5px rgba(0, 0, 0, 0.15); -moz-box-shadow: rgba(0, 0, 0, 0.15) 5px 5px 5px; } .memdoc { padding: 2px 20px 20px; background-color: #FFFFFF; border-top-width: 0; -webkit-box-shadow: 5px 5px 5px rgba(0, 0, 0, 0.15); -moz-box-shadow: rgba(0, 0, 0, 0.15) 5px 5px 5px; } .paramkey { text-align: right; } .paramtype { white-space: nowrap; } .paramname { color: #885656; white-space: nowrap; } .paramname em { font-style: normal; } /* @end */ /* @group Directory (tree) */ /* for the tree view */ .ftvtree { font-family: sans-serif; margin: 0.5em; } /* these are for tree view when used as main index */ .directory { font-size: 9pt; font-weight: bold; } .directory h3 { margin: 0px; margin-top: 1em; font-size: 11pt; } /* The following two styles can be used to replace the root node title with an image of your choice. Simply uncomment the next two styles, specify the name of your image and be sure to set 'height' to the proper pixel height of your image. 
*/ /* .directory h3.swap { height: 61px; background-repeat: no-repeat; background-image: url("yourimage.gif"); } .directory h3.swap span { display: none; } */ .directory > h3 { margin-top: 0; } .directory p { margin: 0px; white-space: nowrap; } .directory div { display: none; margin: 0px; } .directory img { vertical-align: -30%; } /* these are for tree view when not used as main index */ .directory-alt { font-size: 100%; font-weight: bold; } .directory-alt h3 { margin: 0px; margin-top: 1em; font-size: 11pt; } .directory-alt > h3 { margin-top: 0; } .directory-alt p { margin: 0px; white-space: nowrap; } .directory-alt div { display: none; margin: 0px; } .directory-alt img { vertical-align: -30%; } /* @end */ address { font-style: normal; color: #333; } table.doxtable { border-collapse:collapse; } table.doxtable td, table.doxtable th { border: 1px solid #153788; padding: 3px 7px 2px; } table.doxtable th { background-color: #254798; color: #FFFFFF; font-size: 110%; padding-bottom: 4px; padding-top: 5px; text-align:left; } hr { border-top: none; } .contents { padding-top: 30px; } h1 { margin-top:0; } .contents .dynsection { margin-top:10px; } slowmovideo-0.5+git20180116/src/slowmoVideo/test/0000775000000000000000000000000013151342440020101 5ustar rootrootslowmovideo-0.5+git20180116/src/slowmoVideo/test/ffmpegTest.c0000664000000000000000000000215513151342440022354 0ustar rootroot// LibAV docs: http://libav.org/doxygen/master/avformat_8h.html // Tutorial: http://dranger.com/ffmpeg/tutorial01.html #include #include int main(int argc, char *argv[]) { av_register_all(); AVFormatContext *pFormatContext; int ret; if (argc < 2) { printf("Usage: %s file\n", argv[0]); return -1; } if ((ret = av_open_input_file(&pFormatContext, argv[1], NULL, 0, NULL)) != 0) { printf("Could not open file %s.\n", argv[1]); return ret; } if (av_find_stream_info(pFormatContext) < 0) { printf("No stream information found.\n"); return -2; } av_dump_format(pFormatContext, 0, argv[1], 0); AVCodecContext *pCodecContext; int videoStream = -1; for (int i = 0; i < pFormatContext->nb_streams; i++) { if (pFormatContext->streams[i]->codec->codec_type == CODEC_TYPE_VIDEO) { videoStream = i; pCodecContext = pFormatContext->streams[i]->codec; AVRational fps = pFormatContext->streams[i]->r_frame_rate; printf("Frame rate: %d/%d = %f\n", fps.num, fps.den, (float)fps.num / fps.den); } } return 0; } slowmovideo-0.5+git20180116/src/slowmoVideo/test/CMakeLists.txt0000664000000000000000000000111413151342440022636 0ustar rootrootset(SRCS test.cpp ) include_directories(${FFMPEG_INCLUDE_PATHS}) add_executable(Test ${SRCS}) target_link_libraries(Test sVproj ${EXTERNAL_LIBS}) #add_executable(encodeTest encodeTest.c) #target_link_libraries(encodeTest sVencode ${EXTERNAL_LIBS}) #add_executable(encodeFramesTest ffmpegTestEncodeFrames.cpp) #message(STATUS "Additional libraries: ${ADDITIONAL_LIBS}") #target_link_libraries(encodeFramesTest sVencode ${EXTERNAL_LIBS}) add_executable(AvconvInfo testAvconvInfo.cpp) target_link_libraries(AvconvInfo sVinfo ${EXTERNAL_LIBS}) install(TARGETS Test DESTINATION bin) slowmovideo-0.5+git20180116/src/slowmoVideo/test/encodeTest.c0000664000000000000000000000030713151342440022342 0ustar rootroot #include "../lib/ffmpegEncode_sV.h" int main() { int i; VideoOut_sV video; prepareDefault(&video); for (i = 0; i < 50; i++) { eatSample(&video); } finish(&video); } slowmovideo-0.5+git20180116/src/slowmoVideo/test/Makefile0000664000000000000000000002361513151342440021550 0ustar rootroot# CMAKE generated file: DO NOT EDIT! 
# Generated by "Unix Makefiles" Generator, CMake Version 2.8 # Default target executed when no arguments are given to make. default_target: all .PHONY : default_target #============================================================================= # Special targets provided by cmake. # Disable implicit rules so canonical targets will work. .SUFFIXES: # Remove some rules from gmake that .SUFFIXES does not remove. SUFFIXES = .SUFFIXES: .hpux_make_needs_suffix_list # Suppress display of executed commands. $(VERBOSE).SILENT: # A target that is always out of date. cmake_force: .PHONY : cmake_force #============================================================================= # Set environment variables for the build. # The shell in which to execute make rules. SHELL = /bin/sh # The CMake executable. CMAKE_COMMAND = /usr/bin/cmake # The command to remove a file. RM = /usr/bin/cmake -E remove -f # The top-level source directory on which CMake was run. CMAKE_SOURCE_DIR = /home/mso/src/slowmoVideo/slowmoVideo # The top-level build directory on which CMake was run. CMAKE_BINARY_DIR = /home/mso/src/slowmoVideo/slowmoVideo #============================================================================= # Targets provided globally by CMake. # Special rule for the target edit_cache edit_cache: @$(CMAKE_COMMAND) -E cmake_echo_color --switch=$(COLOR) --cyan "Running interactive CMake command-line interface..." /usr/bin/cmake -i . .PHONY : edit_cache # Special rule for the target edit_cache edit_cache/fast: edit_cache .PHONY : edit_cache/fast # Special rule for the target install install: preinstall @$(CMAKE_COMMAND) -E cmake_echo_color --switch=$(COLOR) --cyan "Install the project..." /usr/bin/cmake -P cmake_install.cmake .PHONY : install # Special rule for the target install install/fast: preinstall/fast @$(CMAKE_COMMAND) -E cmake_echo_color --switch=$(COLOR) --cyan "Install the project..." /usr/bin/cmake -P cmake_install.cmake .PHONY : install/fast # Special rule for the target install/local install/local: preinstall @$(CMAKE_COMMAND) -E cmake_echo_color --switch=$(COLOR) --cyan "Installing only the local directory..." /usr/bin/cmake -DCMAKE_INSTALL_LOCAL_ONLY=1 -P cmake_install.cmake .PHONY : install/local # Special rule for the target install/local install/local/fast: install/local .PHONY : install/local/fast # Special rule for the target install/strip install/strip: preinstall @$(CMAKE_COMMAND) -E cmake_echo_color --switch=$(COLOR) --cyan "Installing the project stripped..." /usr/bin/cmake -DCMAKE_INSTALL_DO_STRIP=1 -P cmake_install.cmake .PHONY : install/strip # Special rule for the target install/strip install/strip/fast: install/strip .PHONY : install/strip/fast # Special rule for the target list_install_components list_install_components: @$(CMAKE_COMMAND) -E cmake_echo_color --switch=$(COLOR) --cyan "Available install components are: \"Unspecified\"" .PHONY : list_install_components # Special rule for the target list_install_components list_install_components/fast: list_install_components .PHONY : list_install_components/fast # Special rule for the target rebuild_cache rebuild_cache: @$(CMAKE_COMMAND) -E cmake_echo_color --switch=$(COLOR) --cyan "Running CMake to regenerate build system..." 
/usr/bin/cmake -H$(CMAKE_SOURCE_DIR) -B$(CMAKE_BINARY_DIR) .PHONY : rebuild_cache # Special rule for the target rebuild_cache rebuild_cache/fast: rebuild_cache .PHONY : rebuild_cache/fast # The main all target all: cmake_check_build_system cd /home/mso/src/slowmoVideo/slowmoVideo && $(CMAKE_COMMAND) -E cmake_progress_start /home/mso/src/slowmoVideo/slowmoVideo/CMakeFiles /home/mso/src/slowmoVideo/slowmoVideo/test/CMakeFiles/progress.marks cd /home/mso/src/slowmoVideo/slowmoVideo && $(MAKE) -f CMakeFiles/Makefile2 test/all $(CMAKE_COMMAND) -E cmake_progress_start /home/mso/src/slowmoVideo/slowmoVideo/CMakeFiles 0 .PHONY : all # The main clean target clean: cd /home/mso/src/slowmoVideo/slowmoVideo && $(MAKE) -f CMakeFiles/Makefile2 test/clean .PHONY : clean # The main clean target clean/fast: clean .PHONY : clean/fast # Prepare targets for installation. preinstall: all cd /home/mso/src/slowmoVideo/slowmoVideo && $(MAKE) -f CMakeFiles/Makefile2 test/preinstall .PHONY : preinstall # Prepare targets for installation. preinstall/fast: cd /home/mso/src/slowmoVideo/slowmoVideo && $(MAKE) -f CMakeFiles/Makefile2 test/preinstall .PHONY : preinstall/fast # clear depends depend: cd /home/mso/src/slowmoVideo/slowmoVideo && $(CMAKE_COMMAND) -H$(CMAKE_SOURCE_DIR) -B$(CMAKE_BINARY_DIR) --check-build-system CMakeFiles/Makefile.cmake 1 .PHONY : depend # Convenience name for target. test/CMakeFiles/Test.dir/rule: cd /home/mso/src/slowmoVideo/slowmoVideo && $(MAKE) -f CMakeFiles/Makefile2 test/CMakeFiles/Test.dir/rule .PHONY : test/CMakeFiles/Test.dir/rule # Convenience name for target. Test: test/CMakeFiles/Test.dir/rule .PHONY : Test # fast build rule for target. Test/fast: cd /home/mso/src/slowmoVideo/slowmoVideo && $(MAKE) -f test/CMakeFiles/Test.dir/build.make test/CMakeFiles/Test.dir/build .PHONY : Test/fast # Convenience name for target. test/CMakeFiles/encodeFramesTest.dir/rule: cd /home/mso/src/slowmoVideo/slowmoVideo && $(MAKE) -f CMakeFiles/Makefile2 test/CMakeFiles/encodeFramesTest.dir/rule .PHONY : test/CMakeFiles/encodeFramesTest.dir/rule # Convenience name for target. encodeFramesTest: test/CMakeFiles/encodeFramesTest.dir/rule .PHONY : encodeFramesTest # fast build rule for target. encodeFramesTest/fast: cd /home/mso/src/slowmoVideo/slowmoVideo && $(MAKE) -f test/CMakeFiles/encodeFramesTest.dir/build.make test/CMakeFiles/encodeFramesTest.dir/build .PHONY : encodeFramesTest/fast # Convenience name for target. test/CMakeFiles/encodeTest.dir/rule: cd /home/mso/src/slowmoVideo/slowmoVideo && $(MAKE) -f CMakeFiles/Makefile2 test/CMakeFiles/encodeTest.dir/rule .PHONY : test/CMakeFiles/encodeTest.dir/rule # Convenience name for target. encodeTest: test/CMakeFiles/encodeTest.dir/rule .PHONY : encodeTest # fast build rule for target. 
encodeTest/fast: cd /home/mso/src/slowmoVideo/slowmoVideo && $(MAKE) -f test/CMakeFiles/encodeTest.dir/build.make test/CMakeFiles/encodeTest.dir/build .PHONY : encodeTest/fast encodeTest.o: encodeTest.c.o .PHONY : encodeTest.o # target to build an object file encodeTest.c.o: cd /home/mso/src/slowmoVideo/slowmoVideo && $(MAKE) -f test/CMakeFiles/encodeTest.dir/build.make test/CMakeFiles/encodeTest.dir/encodeTest.c.o .PHONY : encodeTest.c.o encodeTest.i: encodeTest.c.i .PHONY : encodeTest.i # target to preprocess a source file encodeTest.c.i: cd /home/mso/src/slowmoVideo/slowmoVideo && $(MAKE) -f test/CMakeFiles/encodeTest.dir/build.make test/CMakeFiles/encodeTest.dir/encodeTest.c.i .PHONY : encodeTest.c.i encodeTest.s: encodeTest.c.s .PHONY : encodeTest.s # target to generate assembly for a file encodeTest.c.s: cd /home/mso/src/slowmoVideo/slowmoVideo && $(MAKE) -f test/CMakeFiles/encodeTest.dir/build.make test/CMakeFiles/encodeTest.dir/encodeTest.c.s .PHONY : encodeTest.c.s ffmpegTestEncodeFrames.o: ffmpegTestEncodeFrames.cpp.o .PHONY : ffmpegTestEncodeFrames.o # target to build an object file ffmpegTestEncodeFrames.cpp.o: cd /home/mso/src/slowmoVideo/slowmoVideo && $(MAKE) -f test/CMakeFiles/encodeFramesTest.dir/build.make test/CMakeFiles/encodeFramesTest.dir/ffmpegTestEncodeFrames.cpp.o .PHONY : ffmpegTestEncodeFrames.cpp.o ffmpegTestEncodeFrames.i: ffmpegTestEncodeFrames.cpp.i .PHONY : ffmpegTestEncodeFrames.i # target to preprocess a source file ffmpegTestEncodeFrames.cpp.i: cd /home/mso/src/slowmoVideo/slowmoVideo && $(MAKE) -f test/CMakeFiles/encodeFramesTest.dir/build.make test/CMakeFiles/encodeFramesTest.dir/ffmpegTestEncodeFrames.cpp.i .PHONY : ffmpegTestEncodeFrames.cpp.i ffmpegTestEncodeFrames.s: ffmpegTestEncodeFrames.cpp.s .PHONY : ffmpegTestEncodeFrames.s # target to generate assembly for a file ffmpegTestEncodeFrames.cpp.s: cd /home/mso/src/slowmoVideo/slowmoVideo && $(MAKE) -f test/CMakeFiles/encodeFramesTest.dir/build.make test/CMakeFiles/encodeFramesTest.dir/ffmpegTestEncodeFrames.cpp.s .PHONY : ffmpegTestEncodeFrames.cpp.s test.o: test.cpp.o .PHONY : test.o # target to build an object file test.cpp.o: cd /home/mso/src/slowmoVideo/slowmoVideo && $(MAKE) -f test/CMakeFiles/Test.dir/build.make test/CMakeFiles/Test.dir/test.cpp.o .PHONY : test.cpp.o test.i: test.cpp.i .PHONY : test.i # target to preprocess a source file test.cpp.i: cd /home/mso/src/slowmoVideo/slowmoVideo && $(MAKE) -f test/CMakeFiles/Test.dir/build.make test/CMakeFiles/Test.dir/test.cpp.i .PHONY : test.cpp.i test.s: test.cpp.s .PHONY : test.s # target to generate assembly for a file test.cpp.s: cd /home/mso/src/slowmoVideo/slowmoVideo && $(MAKE) -f test/CMakeFiles/Test.dir/build.make test/CMakeFiles/Test.dir/test.cpp.s .PHONY : test.cpp.s # Help Target help: @echo "The following are some of the valid targets for this Makefile:" @echo "... all (the default if no target is provided)" @echo "... clean" @echo "... depend" @echo "... Test" @echo "... edit_cache" @echo "... encodeFramesTest" @echo "... encodeTest" @echo "... install" @echo "... install/local" @echo "... install/strip" @echo "... list_install_components" @echo "... rebuild_cache" @echo "... encodeTest.o" @echo "... encodeTest.i" @echo "... encodeTest.s" @echo "... ffmpegTestEncodeFrames.o" @echo "... ffmpegTestEncodeFrames.i" @echo "... ffmpegTestEncodeFrames.s" @echo "... test.o" @echo "... test.i" @echo "... 
test.s" .PHONY : help #============================================================================= # Special targets to cleanup operation of make. # Special rule to run CMake to check the build system integrity. # No rule that depends on this can have commands that come from listfiles # because they might be regenerated. cmake_check_build_system: cd /home/mso/src/slowmoVideo/slowmoVideo && $(CMAKE_COMMAND) -H$(CMAKE_SOURCE_DIR) -B$(CMAKE_BINARY_DIR) --check-build-system CMakeFiles/Makefile.cmake 0 .PHONY : cmake_check_build_system slowmovideo-0.5+git20180116/src/slowmoVideo/test/test.cpp0000664000000000000000000000605713151342440021574 0ustar rootroot#include #include "../project/project_sV.h" #include "../project/xmlProjectRW_sV.h" #include "../project/videoFrameSource_sV.h" #include "../lib/bezierTools_sV.h" #include #include #include #include void testSave() { Project_sV proj("/tmp"); proj.loadFrameSource(new VideoFrameSource_sV(&proj, "/data/Videos/2010-09-14-DSC_5111.AVI")); XmlProjectRW_sV writer; writer.saveProject(&proj, "/tmp/test.sVproj"); } void testBezier() { QPointF p0(0,1); QPointF p1(2,3); QPointF p2(0,0); QPointF p3(3,1); QImage img(300, 200, QImage::Format_ARGB32); img.fill(qRgb(255,255,255)); for (int i = 0; i < 300; i++) { QPointF p = BezierTools_sV::interpolate(float(i)/300, p0, p1, p2, p3); img.setPixel(qRound(100*p.x()), qRound(100*p.y()), qRgb(200, 200, 40)); } img.save("/tmp/bezier.png"); img.fill(qRgb(255,255,255)); for (int i = 0; i < 300; i++) { QPointF p = BezierTools_sV::interpolateAtX(float(i)/100, p0, p1, p2, p3); img.setPixel(qRound(100*p.x()), qRound(100*p.y()), qRgb(200, 200, 40)); qDebug() << "Painting at " << toString(p*100); if (qRound(100*p.x()) != i) { qDebug() << "this index is off!" << int(100*p.x()) << " != i: " << i << ", qRound: " << qRound(100*p.x()); } } img.save("/tmp/bezier2.png"); img.fill(qRgb(255,255,255)); QPainter davinci(&img); davinci.setRenderHint(QPainter::Antialiasing, true); int n = 30; QPointF prev = p0*100; for (int i = 0; i < n; i++) { QPointF cur = BezierTools_sV::interpolateAtX(3*float(i)/n, p0, p1, p2, p3)*100; davinci.drawLine(QPointF(prev.x(), prev.y()), QPointF(cur.x(), cur.y())); prev = cur; } davinci.drawLine(QPointF(prev.x(), prev.y()), QPointF(100*p3.x(), 100*p3.y())); img.save("/tmp/bezier3.png"); } void testFloatArg() { qDebug() << QString("%2").arg(24.249, 8, 'f', 2, '0'); } void testQtScript() { int argc = 0; QCoreApplication app(argc, NULL); QScriptEngine engine; QScriptValue val = engine.evaluate("1+2"); QScriptValue fx = engine.evaluate("(function(x, dy) { return Math.PI; })"); QScriptValueList args; args << .5 << 0; qDebug() << fx.call(QScriptValue(), args).toString(); } void testRegex() { QStringList parts; QRegExp e("(\\d+)"); QString str("forward-lambda20.0_1234_2345.sVflow"); int min = str.indexOf("_"); int pos = 0; int prevPos = 0; while ((pos = e.indexIn(str, pos)) != -1) { qDebug() << str.mid(prevPos, pos-prevPos); parts << str.mid(prevPos, pos-prevPos); if (pos > min) { parts << QVariant(e.cap(1).toInt()+1).toString(); } else { parts << e.cap(1); } qDebug() << e.cap(1) << " at " << pos; pos += e.matchedLength(); prevPos = pos; } qDebug() << str.mid(prevPos); parts << str.mid(prevPos); qDebug() << "Next: " << parts.join(""); } int main() { testRegex(); } slowmovideo-0.5+git20180116/src/slowmoVideo/test/testAvconvInfo.cpp0000664000000000000000000000013213151342440023551 0ustar rootroot#include "../lib/avconvInfo_sV.h" int main() { AvconvInfo info; info.locate(); } 
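testBezier() in test.cpp above drives BezierTools_sV::interpolate() and interpolateAtX() with four control points, but the library's own implementation is not included in this excerpt. As a reference, the sketch below evaluates the standard cubic Bezier polynomial B(t) = (1-t)^3 p0 + 3(1-t)^2 t p1 + 3(1-t) t^2 p2 + t^3 p3, which is what such an interpolate() routine typically computes; the cubicBezier function name and the sample loop are illustrative only and make no claim about how BezierTools_sV is actually written.

// bezierSketch.cpp -- stand-alone sketch of plain cubic Bezier evaluation.
#include <QPointF>
#include <QDebug>

QPointF cubicBezier(float t, const QPointF &p0, const QPointF &p1,
                    const QPointF &p2, const QPointF &p3)
{
    const float u = 1.0f - t;
    // B(t) = u^3 p0 + 3 u^2 t p1 + 3 u t^2 p2 + t^3 p3
    return u*u*u * p0 + 3*u*u*t * p1 + 3*u*t*t * p2 + t*t*t * p3;
}

int main()
{
    // Same control points as testBezier() above.
    QPointF p0(0, 1), p1(2, 3), p2(0, 0), p3(3, 1);
    for (int i = 0; i <= 4; ++i) {
        const float t = i / 4.0f;
        qDebug() << "t =" << t << "->" << cubicBezier(t, p0, p1, p2, p3);
    }
    return 0;
}

interpolateAtX(), as exercised in testBezier(), apparently returns the curve point whose x coordinate matches the requested x (the test compares qRound(100*p.x()) against the loop index), which additionally requires solving for the parameter t at that x before evaluating the curve.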
slowmovideo-0.5+git20180116/src/slowmoVideo/test/ffmpegTestEncodeFrames.cpp0000664000000000000000000000212013151342440025160 0ustar rootroot // Against the «UINT64_C not declared» message. // See: http://code.google.com/p/ffmpegsource/issues/detail?id=11 #ifdef __cplusplus #define __STDC_CONSTANT_MACROS #ifdef _STDINT_H #undef _STDINT_H #endif # include #endif extern "C" { #include "../lib/ffmpegEncode_sV.h" } #include #include #include #include int main() { int width = 640, height = 320; QImage img(width, height, QImage::Format_ARGB32); QPainter davinci(&img); davinci.setPen(QColor(40, 80, 255, 20)); VideoOut_sV video; int ret = prepare(&video, "/tmp/ffmpegEncodedFrames.mov", "ffv1", width, height, 1000000, 1, 25); if (ret != 0) { qDebug() << "Preparing the video failed: " << video.errorMessage << "(" << ret << ")"; return ret; } eatSample(&video); for (int i = 0; i < 100; i++) { img.fill(QColor(100, 100, 100).rgb()); davinci.drawLine(0, 2*i, width-1, 2*i); davinci.fillRect(0, 0, 100, 2*i, QColor(255, 0, 0)); eatARGB(&video, img.bits()); } finish(&video); } slowmovideo-0.5+git20180116/src/slowmoVideo/Makefile0000664000000000000000000003144113151342440020565 0ustar rootroot# CMAKE generated file: DO NOT EDIT! # Generated by "Unix Makefiles" Generator, CMake Version 2.8 # Default target executed when no arguments are given to make. default_target: all .PHONY : default_target #============================================================================= # Special targets provided by cmake. # Disable implicit rules so canonical targets will work. .SUFFIXES: # Remove some rules from gmake that .SUFFIXES does not remove. SUFFIXES = .SUFFIXES: .hpux_make_needs_suffix_list # Suppress display of executed commands. $(VERBOSE).SILENT: # A target that is always out of date. cmake_force: .PHONY : cmake_force #============================================================================= # Set environment variables for the build. # The shell in which to execute make rules. SHELL = /bin/sh # The CMake executable. CMAKE_COMMAND = /usr/bin/cmake # The command to remove a file. RM = /usr/bin/cmake -E remove -f # The top-level source directory on which CMake was run. CMAKE_SOURCE_DIR = /home/mso/src/slowmoVideo/slowmoVideo # The top-level build directory on which CMake was run. CMAKE_BINARY_DIR = /home/mso/src/slowmoVideo/slowmoVideo #============================================================================= # Targets provided globally by CMake. # Special rule for the target edit_cache edit_cache: @$(CMAKE_COMMAND) -E cmake_echo_color --switch=$(COLOR) --cyan "Running interactive CMake command-line interface..." /usr/bin/cmake -i . .PHONY : edit_cache # Special rule for the target edit_cache edit_cache/fast: edit_cache .PHONY : edit_cache/fast # Special rule for the target install install: preinstall @$(CMAKE_COMMAND) -E cmake_echo_color --switch=$(COLOR) --cyan "Install the project..." /usr/bin/cmake -P cmake_install.cmake .PHONY : install # Special rule for the target install install/fast: preinstall/fast @$(CMAKE_COMMAND) -E cmake_echo_color --switch=$(COLOR) --cyan "Install the project..." /usr/bin/cmake -P cmake_install.cmake .PHONY : install/fast # Special rule for the target install/local install/local: preinstall @$(CMAKE_COMMAND) -E cmake_echo_color --switch=$(COLOR) --cyan "Installing only the local directory..." 
/usr/bin/cmake -DCMAKE_INSTALL_LOCAL_ONLY=1 -P cmake_install.cmake .PHONY : install/local # Special rule for the target install/local install/local/fast: install/local .PHONY : install/local/fast # Special rule for the target install/strip install/strip: preinstall @$(CMAKE_COMMAND) -E cmake_echo_color --switch=$(COLOR) --cyan "Installing the project stripped..." /usr/bin/cmake -DCMAKE_INSTALL_DO_STRIP=1 -P cmake_install.cmake .PHONY : install/strip # Special rule for the target install/strip install/strip/fast: install/strip .PHONY : install/strip/fast # Special rule for the target list_install_components list_install_components: @$(CMAKE_COMMAND) -E cmake_echo_color --switch=$(COLOR) --cyan "Available install components are: \"Unspecified\"" .PHONY : list_install_components # Special rule for the target list_install_components list_install_components/fast: list_install_components .PHONY : list_install_components/fast # Special rule for the target rebuild_cache rebuild_cache: @$(CMAKE_COMMAND) -E cmake_echo_color --switch=$(COLOR) --cyan "Running CMake to regenerate build system..." /usr/bin/cmake -H$(CMAKE_SOURCE_DIR) -B$(CMAKE_BINARY_DIR) .PHONY : rebuild_cache # Special rule for the target rebuild_cache rebuild_cache/fast: rebuild_cache .PHONY : rebuild_cache/fast # The main all target all: cmake_check_build_system $(CMAKE_COMMAND) -E cmake_progress_start /home/mso/src/slowmoVideo/slowmoVideo/CMakeFiles /home/mso/src/slowmoVideo/slowmoVideo/CMakeFiles/progress.marks $(MAKE) -f CMakeFiles/Makefile2 all $(CMAKE_COMMAND) -E cmake_progress_start /home/mso/src/slowmoVideo/slowmoVideo/CMakeFiles 0 .PHONY : all # The main clean target clean: $(MAKE) -f CMakeFiles/Makefile2 clean .PHONY : clean # The main clean target clean/fast: clean .PHONY : clean/fast # Prepare targets for installation. preinstall: all $(MAKE) -f CMakeFiles/Makefile2 preinstall .PHONY : preinstall # Prepare targets for installation. preinstall/fast: $(MAKE) -f CMakeFiles/Makefile2 preinstall .PHONY : preinstall/fast # clear depends depend: $(CMAKE_COMMAND) -H$(CMAKE_SOURCE_DIR) -B$(CMAKE_BINARY_DIR) --check-build-system CMakeFiles/Makefile.cmake 1 .PHONY : depend #============================================================================= # Target rules for targets named sV # Build rule for target. sV: cmake_check_build_system $(MAKE) -f CMakeFiles/Makefile2 sV .PHONY : sV # fast build rule for target. sV/fast: $(MAKE) -f lib/CMakeFiles/sV.dir/build.make lib/CMakeFiles/sV.dir/build .PHONY : sV/fast #============================================================================= # Target rules for targets named sVencode # Build rule for target. sVencode: cmake_check_build_system $(MAKE) -f CMakeFiles/Makefile2 sVencode .PHONY : sVencode # fast build rule for target. sVencode/fast: $(MAKE) -f lib/CMakeFiles/sVencode.dir/build.make lib/CMakeFiles/sVencode.dir/build .PHONY : sVencode/fast #============================================================================= # Target rules for targets named sVflow # Build rule for target. sVflow: cmake_check_build_system $(MAKE) -f CMakeFiles/Makefile2 sVflow .PHONY : sVflow # fast build rule for target. sVflow/fast: $(MAKE) -f lib/CMakeFiles/sVflow.dir/build.make lib/CMakeFiles/sVflow.dir/build .PHONY : sVflow/fast #============================================================================= # Target rules for targets named sVinfo # Build rule for target. sVinfo: cmake_check_build_system $(MAKE) -f CMakeFiles/Makefile2 sVinfo .PHONY : sVinfo # fast build rule for target. 
sVinfo/fast: $(MAKE) -f lib/CMakeFiles/sVinfo.dir/build.make lib/CMakeFiles/sVinfo.dir/build .PHONY : sVinfo/fast #============================================================================= # Target rules for targets named sVvis # Build rule for target. sVvis: cmake_check_build_system $(MAKE) -f CMakeFiles/Makefile2 sVvis .PHONY : sVvis # fast build rule for target. sVvis/fast: $(MAKE) -f lib/CMakeFiles/sVvis.dir/build.make lib/CMakeFiles/sVvis.dir/build .PHONY : sVvis/fast #============================================================================= # Target rules for targets named sVgui # Build rule for target. sVgui: cmake_check_build_system $(MAKE) -f CMakeFiles/Makefile2 sVgui .PHONY : sVgui # fast build rule for target. sVgui/fast: $(MAKE) -f libgui/CMakeFiles/sVgui.dir/build.make libgui/CMakeFiles/sVgui.dir/build .PHONY : sVgui/fast #============================================================================= # Target rules for targets named sVproj # Build rule for target. sVproj: cmake_check_build_system $(MAKE) -f CMakeFiles/Makefile2 sVproj .PHONY : sVproj # fast build rule for target. sVproj/fast: $(MAKE) -f project/CMakeFiles/sVproj.dir/build.make project/CMakeFiles/sVproj.dir/build .PHONY : sVproj/fast #============================================================================= # Target rules for targets named slowmoInterpolate # Build rule for target. slowmoInterpolate: cmake_check_build_system $(MAKE) -f CMakeFiles/Makefile2 slowmoInterpolate .PHONY : slowmoInterpolate # fast build rule for target. slowmoInterpolate/fast: $(MAKE) -f slowmoCLI/CMakeFiles/slowmoInterpolate.dir/build.make slowmoCLI/CMakeFiles/slowmoInterpolate.dir/build .PHONY : slowmoInterpolate/fast #============================================================================= # Target rules for targets named videoInfo # Build rule for target. videoInfo: cmake_check_build_system $(MAKE) -f CMakeFiles/Makefile2 videoInfo .PHONY : videoInfo # fast build rule for target. videoInfo/fast: $(MAKE) -f slowmoCLI/CMakeFiles/videoInfo.dir/build.make slowmoCLI/CMakeFiles/videoInfo.dir/build .PHONY : videoInfo/fast #============================================================================= # Target rules for targets named slowmoUI # Build rule for target. slowmoUI: cmake_check_build_system $(MAKE) -f CMakeFiles/Makefile2 slowmoUI .PHONY : slowmoUI # fast build rule for target. slowmoUI/fast: $(MAKE) -f slowmoUI/CMakeFiles/slowmoUI.dir/build.make slowmoUI/CMakeFiles/slowmoUI.dir/build .PHONY : slowmoUI/fast #============================================================================= # Target rules for targets named slowmoFlowEdit # Build rule for target. slowmoFlowEdit: cmake_check_build_system $(MAKE) -f CMakeFiles/Makefile2 slowmoFlowEdit .PHONY : slowmoFlowEdit # fast build rule for target. slowmoFlowEdit/fast: $(MAKE) -f slowmoFlowEdit/CMakeFiles/slowmoFlowEdit.dir/build.make slowmoFlowEdit/CMakeFiles/slowmoFlowEdit.dir/build .PHONY : slowmoFlowEdit/fast #============================================================================= # Target rules for targets named slowmoInfo # Build rule for target. slowmoInfo: cmake_check_build_system $(MAKE) -f CMakeFiles/Makefile2 slowmoInfo .PHONY : slowmoInfo # fast build rule for target. 
slowmoInfo/fast: $(MAKE) -f slowmoInfo/CMakeFiles/slowmoInfo.dir/build.make slowmoInfo/CMakeFiles/slowmoInfo.dir/build .PHONY : slowmoInfo/fast #============================================================================= # Target rules for targets named slowmoRenderer # Build rule for target. slowmoRenderer: cmake_check_build_system $(MAKE) -f CMakeFiles/Makefile2 slowmoRenderer .PHONY : slowmoRenderer # fast build rule for target. slowmoRenderer/fast: $(MAKE) -f slowmoRenderer/CMakeFiles/slowmoRenderer.dir/build.make slowmoRenderer/CMakeFiles/slowmoRenderer.dir/build .PHONY : slowmoRenderer/fast #============================================================================= # Target rules for targets named visualizeFlow # Build rule for target. visualizeFlow: cmake_check_build_system $(MAKE) -f CMakeFiles/Makefile2 visualizeFlow .PHONY : visualizeFlow # fast build rule for target. visualizeFlow/fast: $(MAKE) -f visualizeFlow/CMakeFiles/visualizeFlow.dir/build.make visualizeFlow/CMakeFiles/visualizeFlow.dir/build .PHONY : visualizeFlow/fast #============================================================================= # Target rules for targets named Test # Build rule for target. Test: cmake_check_build_system $(MAKE) -f CMakeFiles/Makefile2 Test .PHONY : Test # fast build rule for target. Test/fast: $(MAKE) -f test/CMakeFiles/Test.dir/build.make test/CMakeFiles/Test.dir/build .PHONY : Test/fast #============================================================================= # Target rules for targets named encodeFramesTest # Build rule for target. encodeFramesTest: cmake_check_build_system $(MAKE) -f CMakeFiles/Makefile2 encodeFramesTest .PHONY : encodeFramesTest # fast build rule for target. encodeFramesTest/fast: $(MAKE) -f test/CMakeFiles/encodeFramesTest.dir/build.make test/CMakeFiles/encodeFramesTest.dir/build .PHONY : encodeFramesTest/fast #============================================================================= # Target rules for targets named encodeTest # Build rule for target. encodeTest: cmake_check_build_system $(MAKE) -f CMakeFiles/Makefile2 encodeTest .PHONY : encodeTest # fast build rule for target. encodeTest/fast: $(MAKE) -f test/CMakeFiles/encodeTest.dir/build.make test/CMakeFiles/encodeTest.dir/build .PHONY : encodeTest/fast #============================================================================= # Target rules for targets named UnitTests # Build rule for target. UnitTests: cmake_check_build_system $(MAKE) -f CMakeFiles/Makefile2 UnitTests .PHONY : UnitTests # fast build rule for target. UnitTests/fast: $(MAKE) -f unittests/CMakeFiles/UnitTests.dir/build.make unittests/CMakeFiles/UnitTests.dir/build .PHONY : UnitTests/fast # Help Target help: @echo "The following are some of the valid targets for this Makefile:" @echo "... all (the default if no target is provided)" @echo "... clean" @echo "... depend" @echo "... edit_cache" @echo "... install" @echo "... install/local" @echo "... install/strip" @echo "... list_install_components" @echo "... rebuild_cache" @echo "... sV" @echo "... sVencode" @echo "... sVflow" @echo "... sVinfo" @echo "... sVvis" @echo "... sVgui" @echo "... sVproj" @echo "... slowmoInterpolate" @echo "... videoInfo" @echo "... slowmoUI" @echo "... slowmoFlowEdit" @echo "... slowmoInfo" @echo "... slowmoRenderer" @echo "... visualizeFlow" @echo "... Test" @echo "... encodeFramesTest" @echo "... encodeTest" @echo "... 
UnitTests" .PHONY : help #============================================================================= # Special targets to cleanup operation of make. # Special rule to run CMake to check the build system integrity. # No rule that depends on this can have commands that come from listfiles # because they might be regenerated. cmake_check_build_system: $(CMAKE_COMMAND) -H$(CMAKE_SOURCE_DIR) -B$(CMAKE_BINARY_DIR) --check-build-system CMakeFiles/Makefile.cmake 0 .PHONY : cmake_check_build_system slowmovideo-0.5+git20180116/src/slowmoVideo/unittests/0000775000000000000000000000000013151342440021164 5ustar rootrootslowmovideo-0.5+git20180116/src/slowmoVideo/unittests/testAll.cpp0000664000000000000000000000146613151342440023307 0ustar rootroot#include "testFlowField_sV.h" #include "testFlowRW_sV.h" #include "testIntMatrix_sV.h" #include "testXmlProjectRW_sV.h" #include "testVector_sV.h" #include "testDefs_sV.h" #include "testShutterFunction_sV.h" #include "testProject_sV.h" #include "testNodeList_sV.h" #include int main() { TestFlowRW_sV flowRW; QTest::qExec(&flowRW); TestFlowField_sV flowField; QTest::qExec(&flowField); TestIntMatrix_sV intMatrix; QTest::qExec(&intMatrix); TestXmlProjectRW_sV xmlRW; QTest::qExec(&xmlRW); TestVector_sV vector; QTest::qExec(&vector); TestDefs_sV defs; QTest::qExec(&defs); TestShutterFunction_sV shutter; QTest::qExec(&shutter); TestProject_sV project; QTest::qExec(&project); TestNodeList_sV nodes; QTest::qExec(&nodes); } slowmovideo-0.5+git20180116/src/slowmoVideo/unittests/testProject_sV.h0000664000000000000000000000077513151342440024324 0ustar rootroot#ifndef TESTPROJECT_SV_H #define TESTPROJECT_SV_H #include #include class Fps_sV; class Project_sV; class TestProject_sV : public QObject { Q_OBJECT private slots: void slotTestSnapInFrames(); void slotTestTimeExpressions(); void slotTestPercentageExpressions(); void slotTestLabelExpressions(); void slotTsetPositionExpressions(); void init(); void cleanup(); private: Project_sV *m_project; Fps_sV *m_fps; }; #endif // TESTPROJECT_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/unittests/testFlowRW_sV.cpp0000664000000000000000000000267013151342440024425 0ustar rootroot#include "testFlowRW_sV.h" #include "../lib/flowRW_sV.h" #include "../lib/flowField_sV.h" #include #include void TestFlowRW_sV::testWriteAndRead() { const std::string filename("/tmp/unittestFlowField_sV.sVflow"); const int width = 3; const int height = 2; FlowField_sV *field = new FlowField_sV(width, height); for (int y = 0; y < height; y++) { for (int x = 0; x < width; x++) { field->setX(x, y, x/float(y+1)); field->setY(x, y, x/float(y+2)); } } FlowRW_sV::save(filename, field); FlowField_sV *loadedField = FlowRW_sV::load(filename); QVERIFY(*field == *loadedField); delete field; delete loadedField; } void TestFlowRW_sV::testWriteAndReadFail() { const std::string filename("/tmp/unittestFlowField_sV.sVflow"); const int width = 3; const int height = 2; FlowField_sV *field = new FlowField_sV(width, height); for (int y = 0; y < height; y++) { for (int x = 0; x < width; x++) { field->setX(x, y, x/float(y+1)); field->setY(x, y, x/float(y+2)); } } FlowRW_sV::save(filename, field); field->setX(width-1, height-1, -3.1415927); FlowField_sV *loadedField = FlowRW_sV::load(filename); qDebug() << "Equal? 
" << (field->x(width-1, height-1) == loadedField->x(width-1, height-1)); QVERIFY( !( *field == *loadedField ) ); delete field; delete loadedField; } slowmovideo-0.5+git20180116/src/slowmoVideo/unittests/testShutterFunction_sV.h0000664000000000000000000000062413151342440026053 0ustar rootroot#ifndef TESTSHUTTERFUNCTION_SV_H #define TESTSHUTTERFUNCTION_SV_H #include #include class TestShutterFunction_sV : public QObject { Q_OBJECT private slots: void initTestCase(); void cleanupTestCase(); void testZeroFunction(); void testFunctions(); void testWithVariables(); private: QCoreApplication *app; }; #endif // TESTSHUTTERFUNCTION_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/unittests/testVector_sV.h0000664000000000000000000000043113151342440024145 0ustar rootroot#ifndef TESTVECTOR_SV_H #define TESTVECTOR_SV_H #include #include class TestVector_sV : public QObject { Q_OBJECT private slots: void testLength(); void testAdd(); void testScale(); void testEqual(); }; #endif // TESTVECTOR_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/unittests/testIntMatrix_sV.h0000664000000000000000000000062113151342440024623 0ustar rootroot#ifndef TESTINTMATRIX_SV_H #define TESTINTMATRIX_SV_H #include #include class IntMatrix_sV; class TestIntMatrix_sV : public QObject { Q_OBJECT private: void testAdd(int w, int h, int c); static void dumpMatrix(IntMatrix_sV *mat); private slots: void testInitMatrix(); void testAddMatrix(); void testAddMatrix2C(); }; #endif // TESTINTMATRIX_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/unittests/testIntMatrix_sV.cpp0000664000000000000000000000374213151342440025165 0ustar rootroot#include "testIntMatrix_sV.h" #include "../lib/intMatrix_sV.h" #include #include void TestIntMatrix_sV::testAdd(int w, int h, int c) { int len = w*h*c; unsigned char *orig = new unsigned char[len]; for (int i = 0; i < len; i++) { orig[i] = i; } unsigned char *data; IntMatrix_sV mat(w, h, c); mat += orig; // dumpMatrix(&mat); data = mat.toBytesArray(); for (int i = 0; i < mat.width()*mat.height()*mat.channels(); i++) { QVERIFY(data[i] == orig[i]); } delete[] data; mat += orig; // dumpMatrix(&mat); data = mat.toBytesArray(); for (int i = 0; i < mat.width()*mat.height()*mat.channels(); i++) { QVERIFY(data[i] == 2*orig[i]); } delete[] data; mat /= 2; // dumpMatrix(&mat); data = mat.toBytesArray(); for (int i = 0; i < mat.width()*mat.height()*mat.channels(); i++) { QVERIFY(data[i] == orig[i]); } delete[] data; delete[] orig; } void TestIntMatrix_sV::testAddMatrix() { testAdd(2,2,1); } void TestIntMatrix_sV::testAddMatrix2C() { testAdd(2,2,2); } void TestIntMatrix_sV::testInitMatrix() { IntMatrix_sV mat(2, 2, 1); unsigned char *data = mat.toBytesArray(); for (int i = 0; i < mat.width()*mat.height()*mat.channels(); i++) { QVERIFY(data[i] == 0); } delete[] data; } void TestIntMatrix_sV::dumpMatrix(IntMatrix_sV *mat) { std::cout << "Matrix dump: " << std::endl; for (int y = 0; y < mat->height(); y++) { for (int x = 0; x < mat->width(); x++) { if (mat->channels() > 1) { std::cout << "[ "; for (int i = 0; i < mat->channels(); i++) { std::cout << std::setw(4) << mat->data()[mat->channels()*(y*mat->width()+x) + i]; } std::cout << " ] "; } else { std::cout << std::setw(4) << mat->data()[y*mat->width()+x]; } } std::cout << std::endl; } } slowmovideo-0.5+git20180116/src/slowmoVideo/unittests/CMakeLists.txt0000664000000000000000000000132713151342440023727 0ustar rootroot set(SRCS testFlowRW_sV.cpp testFlowField_sV.cpp testVector_sV.cpp testIntMatrix_sV.cpp testXmlProjectRW_sV.cpp testDefs_sV.cpp 
testShutterFunction_sV.cpp testProject_sV.cpp testNodeList_sV.cpp testAll.cpp ) set(SRCS_MOC testFlowRW_sV.h testFlowField_sV.h testVector_sV.h testIntMatrix_sV.h testDefs_sV.h testShutterFunction_sV.h testXmlProjectRW_sV.h testNodeList_sV.h testProject_sV.h ) qt_wrap_cpp(MOC_OUT ${SRCS_MOC}) include_directories(${FFMPEG_INCLUDE_PATHS}) add_executable(UnitTests ${SRCS} ${MOC_OUT}) qt_use_modules(UnitTests Concurrent Core Test) target_link_libraries(UnitTests sVproj ${EXTERNAL_LIBS}) add_test(UnitTests UnitTests) slowmovideo-0.5+git20180116/src/slowmoVideo/unittests/testFlowRW_sV.h0000664000000000000000000000037713151342440024074 0ustar rootroot#ifndef TESTFLOWRW_SV_H #define TESTFLOWRW_SV_H #include #include class TestFlowRW_sV : public QObject { Q_OBJECT private slots: void testWriteAndRead(); void testWriteAndReadFail(); }; #endif // TESTFLOWRW_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/unittests/testProject_sV.cpp0000664000000000000000000000443113151342440024650 0ustar rootroot#include "testProject_sV.h" #include "../project/project_sV.h" #include void TestProject_sV::slotTestSnapInFrames() { int framesBefore; float pos; Fps_sV fps(10, 1); pos = Project_sV::snapToFrame(0, false, fps, &framesBefore); QVERIFY(pos == 0); QVERIFY(framesBefore == 0); pos = Project_sV::snapToFrame(0, true, fps, &framesBefore); QVERIFY(pos == 0); QVERIFY(framesBefore == 0); pos = Project_sV::snapToFrame(0.49, false, fps, &framesBefore); QVERIFY(pos == (float).4); QVERIFY(framesBefore == 4); pos = Project_sV::snapToFrame(0.49, true, fps, &framesBefore); QVERIFY(pos == (float).5); QVERIFY(framesBefore == 5); } void TestProject_sV::init() { m_project = new Project_sV(); m_project->nodes()->add(Node_sV(1,42)); m_project->nodes()->add(Node_sV(5,21)); m_project->tags()->append(Tag_sV(4.5, "sourceLabel", TagAxis_Source)); m_project->tags()->append(Tag_sV(2.5, "outputLabel", TagAxis_Output)); m_fps = new Fps_sV(10, 1); } void TestProject_sV::cleanup() { delete m_project; delete m_fps; } void TestProject_sV::slotTestTimeExpressions() { Project_sV project; project.nodes()->add(Node_sV(1, 42)); Fps_sV fps(10, 1); try { project.toOutTime(0, fps); QVERIFY(false); } catch (Error_sV &err) {} project.nodes()->add(Node_sV(5, 21)); QVERIFY(qreal(1) == project.toOutTime("1", fps)); QVERIFY(qreal(1) == project.toOutTime("-1", fps)); QVERIFY(qreal(5) == project.toOutTime("55", fps)); QVERIFY(qreal(4.5) == project.toOutTime("4.5", fps)); } void TestProject_sV::slotTestPercentageExpressions() { QVERIFY(qreal(3) == m_project->toOutTime("p:50%", *m_fps)); try { m_project->toOutTime("p:101%", *m_fps); QVERIFY(false); } catch (Error_sV &err) {} } void TestProject_sV::slotTestLabelExpressions() { QVERIFY(qreal(2.5) == m_project->toOutTime("l:outputLabel", *m_fps)); try { m_project->toOutTime("l:sourceLabel", *m_fps); QVERIFY(false); } catch (Error_sV &err) {} } void TestProject_sV::slotTsetPositionExpressions() { QVERIFY(m_project->nodes()->startTime() == m_project->toOutTime(":start", *m_fps)); QVERIFY(m_project->nodes()->endTime() == m_project->toOutTime(":end", *m_fps)); } slowmovideo-0.5+git20180116/src/slowmoVideo/unittests/testXmlProjectRW_sV.cpp0000664000000000000000000000274313151342440025606 0ustar rootroot#include "testXmlProjectRW_sV.h" #include "../project/xmlProjectRW_sV.h" #include "../project/nodeList_sV.h" void TestXmlProjectRW_sV::initTestCase() { int i = 0; app = new QCoreApplication(i, NULL); } void TestXmlProjectRW_sV::cleanupTestCase() { delete app; } void TestXmlProjectRW_sV::init() { m_project = new 
Project_sV(QDir::tempPath()); m_project->nodes()->add(Node_sV(0,0)); m_project->nodes()->add(Node_sV(1,1)); m_project->nodes()->setCurveType(.5, CurveType_Bezier); (*m_project->nodes())[0].setRightNodeHandle(.5, 2); (*m_project->nodes())[1].setLeftNodeHandle(-.4, -2); } void TestXmlProjectRW_sV::slotTestHandles() { XmlProjectRW_sV rw; const QString filename = QString("%1/xprTest.sVproj").arg(QDir::tempPath()); rw.saveProject(m_project, filename); Project_sV *loaded = rw.loadProject(filename); qDebug() << "Node handles: " << toString(loaded->nodes()->at(0).rightNodeHandle()) << "/" << toString(m_project->nodes()->at(0).rightNodeHandle()); qDebug() << "Node handles: " << toString(loaded->nodes()->at(1).leftNodeHandle()) << "/" << toString(m_project->nodes()->at(1).leftNodeHandle()); QVERIFY(loaded->nodes()->at(0).rightNodeHandle() == m_project->nodes()->at(0).rightNodeHandle()); QVERIFY(loaded->nodes()->at(1).leftNodeHandle() == m_project->nodes()->at(1).leftNodeHandle()); delete loaded; } void TestXmlProjectRW_sV::cleanup() { delete m_project; } slowmovideo-0.5+git20180116/src/slowmoVideo/unittests/testNodeList_sV.h0000664000000000000000000000034113151342440024424 0ustar rootroot#ifndef TESTNODELIST_SV_H #define TESTNODELIST_SV_H #include #include class TestNodeList_sV : public QObject { Q_OBJECT private slots: void testCurveType(); }; #endif // TESTNODELIST_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/unittests/testDefs_sV.h0000664000000000000000000000035013151342440023564 0ustar rootroot#ifndef TESTDEFS_SV_H #define TESTDEFS_SV_H #include #include class TestDefs_sV : public QObject { Q_OBJECT private slots: void testFpsInt(); void testFpsFloat(); }; #endif // TESTDEFS_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/unittests/testDefs_sV.cpp0000664000000000000000000000066213151342440024125 0ustar rootroot#include "testDefs_sV.h" #include "../lib/defs_sV.hpp" void TestDefs_sV::testFpsInt() { Fps_sV fps(24, 1); QVERIFY(fps.fps() == 24); } void TestDefs_sV::testFpsFloat() { Fps_sV fps(24.0); QVERIFY(fps.num == 24); QVERIFY(fps.den == 1); fps = Fps_sV(23.97); QVERIFY(fps.num == 24000); QVERIFY(fps.den == 1001); fps = Fps_sV(29.97); QVERIFY(fps.num == 30000); QVERIFY(fps.den == 1001); } slowmovideo-0.5+git20180116/src/slowmoVideo/unittests/testNodeList_sV.cpp0000664000000000000000000000142013151342440024756 0ustar rootroot#include "testNodeList_sV.h" #include "../project/nodeList_sV.h" void TestNodeList_sV::testCurveType() { NodeList_sV nodes; nodes.add(Node_sV(0, 1)); nodes.add(Node_sV(2, 1)); nodes.add(Node_sV(4, 1)); nodes.setCurveType(1, CurveType_Bezier); QVERIFY(nodes.at(0).rightCurveType() == CurveType_Bezier); QVERIFY(nodes.at(1).rightCurveType() == CurveType_Linear); nodes.setCurveType(3, CurveType_Bezier); QVERIFY(nodes.at(0).rightCurveType() == CurveType_Bezier); QVERIFY(nodes.at(1).rightCurveType() == CurveType_Bezier); nodes.add(Node_sV(1,1)); QVERIFY(nodes.at(0).rightCurveType() == CurveType_Linear); QVERIFY(nodes.at(1).rightCurveType() == CurveType_Linear); QVERIFY(nodes.at(2).rightCurveType() == CurveType_Bezier); } slowmovideo-0.5+git20180116/src/slowmoVideo/unittests/testShutterFunction_sV.cpp0000664000000000000000000000341113151342440026403 0ustar rootroot#include "testShutterFunction_sV.h" #include "../project/shutterFunction_sV.h" #include void TestShutterFunction_sV::initTestCase() { int i = 0; app = new QCoreApplication(i, NULL); } void TestShutterFunction_sV::cleanupTestCase() { delete app; } void TestShutterFunction_sV::testZeroFunction() { ShutterFunction_sV 
f("return 0;"); for (float t = 0; t <= 1; t += .1) { QVERIFY(f.evaluate(t, 0, 0, 0, 0) == 0); } } void TestShutterFunction_sV::testFunctions() { ShutterFunction_sV f; f.updateFunction("return 1;"); for (float t = 0; t <= 1; t += .1) { QVERIFY(f.evaluate(t, 0, 0, 0, 0) == 1); } f.updateFunction("return Math.pow(x, 2)+t"); for (float t = 0; t <= 1; t += .1) { float cpp = std::pow(t,2)+t; float qsc = f.evaluate(t, t, 0, 0, 0); // qDebug() << "QScript says " << qsc << " (Qt: " << cpp << ")"; QVERIFY(fabs(cpp - qsc) < .0001); } f.updateFunction("return fps+y*dy"); for (float t = 0; t <= 1; t += .1) { float cpp = 24+t; float qsc = f.evaluate(0, 0, 24, t, 1); // qDebug() << "QScript says " << qsc << " (Qt: " << cpp << ")"; QVERIFY(fabs(cpp - qsc) < .0001); } } void TestShutterFunction_sV::testWithVariables() { ShutterFunction_sV f; f.updateFunction("var dx = 1/fps; \n" "var speed = dy/dx;\n" "if (speed < 1) { speed = 0; }\n" "return speed;"); float dx = 1.0/24; for (float dy = 0; dy <= 1; dy += .1) { float speed = dy/dx; if (speed < 1) { speed = 0; } float qsc = f.evaluate(0, 0, 24, 0, dy); // qDebug() << "QScript says " << qsc << " (Qt: " << speed << ")"; QVERIFY(fabs(speed-qsc) < .0001); } } slowmovideo-0.5+git20180116/src/slowmoVideo/unittests/testLog.cpp0000664000000000000000000000101213151342440023303 0ustar rootroot#include #include #include #include "logbrowser.h" QPointer logBrowser; void myMessageOutput(QtMsgType type, const char *msg) { if(logBrowser) logBrowser->outputMessage( type, msg ); } int main(int argc, char *argv[]) { QApplication a(argc, argv); logBrowser = new LogBrowser; qInstallMsgHandler(myMessageOutput); qDebug() << "test for debug"; int result = a.exec(); qDebug() << "application exec return result =" << result; delete logBrowser; return result; } slowmovideo-0.5+git20180116/src/slowmoVideo/unittests/testFlowField_sV.h0000664000000000000000000000060313151342440024557 0ustar rootroot#ifndef TESTFLOWFIELD_SV_H #define TESTFLOWFIELD_SV_H #include #include class FlowField_sV; class TestFlowField_sV : public QObject { Q_OBJECT private slots: void slotTestConstructorOpenGL(); void slotTestGaussKernel(); void slotTestMedian(); private: void initFlowField(FlowField_sV *field, int *values); }; #endif // TESTFLOWFIELD_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/unittests/testFlowField_sV.cpp0000664000000000000000000000461513151342440025121 0ustar rootroot#include "testFlowField_sV.h" #include "../lib/flowField_sV.h" #include "../lib/flowTools_sV.h" #include void TestFlowField_sV::slotTestConstructorOpenGL() { const int width = 2; const int height = 2; FlowField_sV field(width, height); for (int i = 0; i < 2*width*height; i++) { field.data()[i] = i; } float *glData = new float[3*width*height]; float *pData = glData; for (int i = 0; i < width*height; i++) { *(pData++) = field.data()[2*i+0]; *(pData++) = field.data()[2*i+1]; pData++; } FlowField_sV *glField = new FlowField_sV(width, height, glData, FlowField_sV::GLFormat_RGB); QVERIFY(field == *glField); delete[] glData; delete glField; } void TestFlowField_sV::slotTestGaussKernel() { Kernel_sV kernel(2,2); kernel.gauss(); std::cout << kernel; QVERIFY(fabs(kernel(0,0)-1) < .001); QVERIFY(fabs(kernel(-1,-1)-.135) < .001); QVERIFY(kernel(-1,-1) == kernel(1,1)); kernel = Kernel_sV(3,1); kernel.gauss(); std::cout << kernel; QVERIFY(fabs(kernel(0,0)-1) < .001); QVERIFY(fabs(kernel(-3,-1)) < .001); } void TestFlowField_sV::slotTestMedian() { int *values = new int[4]; values[0] = 0; values[1] = 0; values[2] = 0; values[3] = 2; 
FlowField_sV f1(2,2); initFlowField(&f1, values); values[0] = 0; values[1] = 1; values[2] = 0; values[3] = 1; FlowField_sV f2(2,2); initFlowField(&f2, values); values[0] = 0; values[1] = 2; values[2] = 1; values[3] = 0; FlowField_sV f3(2,2); initFlowField(&f3, values); FlowField_sV *outField = FlowTools_sV::median(&f1, &f2, &f3); values[0] = 0; values[1] = 1; values[2] = 0; values[3] = 1; QVERIFY(outField->x(0,0) == values[0]); QVERIFY(outField->x(1,0) == values[1]); QVERIFY(outField->x(0,1) == values[2]); QVERIFY(outField->x(1,1) == values[3]); QVERIFY(outField->y(0,0) == values[0]); QVERIFY(outField->y(1,0) == values[1]); QVERIFY(outField->y(0,1) == values[2]); QVERIFY(outField->y(1,1) == values[3]); delete values; delete outField; } void TestFlowField_sV::initFlowField(FlowField_sV *field, int *values) { int c = 0; for (int y = 0; y < field->height(); y++) { for (int x = 0; x < field->width(); x++) { field->rx(x,y) = values[c]; field->ry(x,y) = values[c]; c++; } } } slowmovideo-0.5+git20180116/src/slowmoVideo/unittests/testXmlProjectRW_sV.h0000664000000000000000000000066513151342440025254 0ustar rootroot#ifndef TESTXMLPROJECTRW_SV_H #define TESTXMLPROJECTRW_SV_H #include "../project/project_sV.h" #include #include class TestXmlProjectRW_sV : public QObject { Q_OBJECT private slots: void initTestCase(); void cleanupTestCase(); void slotTestHandles(); void init(); void cleanup(); private: QCoreApplication *app; Project_sV *m_project; }; #endif // TESTXMLPROJECTRW_SV_H slowmovideo-0.5+git20180116/src/slowmoVideo/unittests/testVector_sV.cpp0000664000000000000000000000161213151342440024502 0ustar rootroot#include "testVector_sV.h" #include "../lib/vector_sV.h" void TestVector_sV::testLength() { Vector_sV vec(0, 0, 3, 4); QVERIFY(5 == vec.length()); } void TestVector_sV::testAdd() { Vector_sV vec(0,1,2,3); vec = vec + Vector_sV(0, 1, 2, 3); QVERIFY(vec == Vector_sV(4,4)); vec += Vector_sV(-8, -8.5); QVERIFY(vec == Vector_sV(-4, -4.5)); vec -= Vector_sV(-8, -8.5); QVERIFY(vec == Vector_sV(4,4)); } void TestVector_sV::testEqual() { Vector_sV vec(0,1,2,3); QVERIFY(vec == Vector_sV(2, 2)); QVERIFY(!(vec == Vector_sV(2, 2.1))); QVERIFY(vec != Vector_sV(2, 2.1)); QVERIFY(!(vec != Vector_sV(2, 2))); } void TestVector_sV::testScale() { Vector_sV vec(1, -1); vec *= 42; QVERIFY(vec == Vector_sV(42, -42)); vec = vec * .5; QVERIFY(vec == Vector_sV(21, -21)); vec = .5 * vec; QVERIFY(vec == Vector_sV(10.5, -10.5)); } slowmovideo-0.5+git20180116/src/slowmoVideo/Doxyfile0000664000000000000000000021526413151342440020642 0ustar rootroot# Doxyfile 1.7.4 # This file describes the settings to be used by the documentation system # doxygen (www.doxygen.org) for a project. # # All text after a hash (#) is considered a comment and will be ignored. # The format is: # TAG = value [value, ...] # For lists items can also be appended using: # TAG += value [value, ...] # Values that contain spaces should be placed between quotes (" "). #--------------------------------------------------------------------------- # Project related configuration options #--------------------------------------------------------------------------- # This tag specifies the encoding used for all characters in the config file # that follow. The default is UTF-8 which is also the encoding used for all # text before the first occurrence of this tag. Doxygen uses libiconv (or the # iconv built into libc) for the transcoding. See # http://www.gnu.org/software/libiconv for the list of possible encodings. 
DOXYFILE_ENCODING = UTF-8 # The PROJECT_NAME tag is a single word (or a sequence of words surrounded # by quotes) that should identify the project. PROJECT_NAME = slowmoVideo # The PROJECT_NUMBER tag can be used to enter a project or revision number. # This could be handy for archiving the generated documentation or # if some version control system is used. PROJECT_NUMBER = # Using the PROJECT_BRIEF tag one can provide an optional one line description # for a project that appears at the top of each page and should give viewer # a quick idea about the purpose of the project. Keep the description short. PROJECT_BRIEF = # With the PROJECT_LOGO tag one can specify an logo or icon that is # included in the documentation. The maximum height of the logo should not # exceed 55 pixels and the maximum width should not exceed 200 pixels. # Doxygen will copy the logo to the output directory. PROJECT_LOGO = # The OUTPUT_DIRECTORY tag is used to specify the (relative or absolute) # base path where the generated documentation will be put. # If a relative path is entered, it will be relative to the location # where doxygen was started. If left blank the current directory will be used. OUTPUT_DIRECTORY = docs # If the CREATE_SUBDIRS tag is set to YES, then doxygen will create # 4096 sub-directories (in 2 levels) under the output directory of each output # format and will distribute the generated files over these directories. # Enabling this option can be useful when feeding doxygen a huge amount of # source files, where putting all generated files in the same directory would # otherwise cause performance problems for the file system. CREATE_SUBDIRS = NO # The OUTPUT_LANGUAGE tag is used to specify the language in which all # documentation generated by doxygen is written. Doxygen will use this # information to generate all constant output in the proper language. # The default language is English, other supported languages are: # Afrikaans, Arabic, Brazilian, Catalan, Chinese, Chinese-Traditional, # Croatian, Czech, Danish, Dutch, Esperanto, Farsi, Finnish, French, German, # Greek, Hungarian, Italian, Japanese, Japanese-en (Japanese with English # messages), Korean, Korean-en, Lithuanian, Norwegian, Macedonian, Persian, # Polish, Portuguese, Romanian, Russian, Serbian, Serbian-Cyrillic, Slovak, # Slovene, Spanish, Swedish, Ukrainian, and Vietnamese. OUTPUT_LANGUAGE = English # If the BRIEF_MEMBER_DESC tag is set to YES (the default) Doxygen will # include brief member descriptions after the members that are listed in # the file and class documentation (similar to JavaDoc). # Set to NO to disable this. BRIEF_MEMBER_DESC = YES # If the REPEAT_BRIEF tag is set to YES (the default) Doxygen will prepend # the brief description of a member or function before the detailed description. # Note: if both HIDE_UNDOC_MEMBERS and BRIEF_MEMBER_DESC are set to NO, the # brief descriptions will be completely suppressed. REPEAT_BRIEF = YES # This tag implements a quasi-intelligent brief description abbreviator # that is used to form the text in various listings. Each string # in this list, if found as the leading text of the brief description, will be # stripped from the text and the result after processing the whole list, is # used as the annotated text. Otherwise, the brief description is used as-is. 
# If left blank, the following values are used ("$name" is automatically # replaced with the name of the entity): "The $name class" "The $name widget" # "The $name file" "is" "provides" "specifies" "contains" # "represents" "a" "an" "the" ABBREVIATE_BRIEF = # If the ALWAYS_DETAILED_SEC and REPEAT_BRIEF tags are both set to YES then # Doxygen will generate a detailed section even if there is only a brief # description. ALWAYS_DETAILED_SEC = NO # If the INLINE_INHERITED_MEMB tag is set to YES, doxygen will show all # inherited members of a class in the documentation of that class as if those # members were ordinary class members. Constructors, destructors and assignment # operators of the base classes will not be shown. INLINE_INHERITED_MEMB = NO # If the FULL_PATH_NAMES tag is set to YES then Doxygen will prepend the full # path before files name in the file list and in the header files. If set # to NO the shortest path that makes the file name unique will be used. FULL_PATH_NAMES = YES # If the FULL_PATH_NAMES tag is set to YES then the STRIP_FROM_PATH tag # can be used to strip a user-defined part of the path. Stripping is # only done if one of the specified strings matches the left-hand part of # the path. The tag can be used to show relative paths in the file list. # If left blank the directory from which doxygen is run is used as the # path to strip. STRIP_FROM_PATH = # The STRIP_FROM_INC_PATH tag can be used to strip a user-defined part of # the path mentioned in the documentation of a class, which tells # the reader which header file to include in order to use a class. # If left blank only the name of the header file containing the class # definition is used. Otherwise one should specify the include paths that # are normally passed to the compiler using the -I flag. STRIP_FROM_INC_PATH = # If the SHORT_NAMES tag is set to YES, doxygen will generate much shorter # (but less readable) file names. This can be useful if your file system # doesn't support long names like on DOS, Mac, or CD-ROM. SHORT_NAMES = NO # If the JAVADOC_AUTOBRIEF tag is set to YES then Doxygen # will interpret the first line (until the first dot) of a JavaDoc-style # comment as the brief description. If set to NO, the JavaDoc # comments will behave just like regular Qt-style comments # (thus requiring an explicit @brief command for a brief description.) JAVADOC_AUTOBRIEF = NO # If the QT_AUTOBRIEF tag is set to YES then Doxygen will # interpret the first line (until the first dot) of a Qt-style # comment as the brief description. If set to NO, the comments # will behave just like regular Qt-style comments (thus requiring # an explicit \brief command for a brief description.) QT_AUTOBRIEF = NO # The MULTILINE_CPP_IS_BRIEF tag can be set to YES to make Doxygen # treat a multi-line C++ special comment block (i.e. a block of //! or /// # comments) as a brief description. This used to be the default behaviour. # The new default is to treat a multi-line C++ comment block as a detailed # description. Set this tag to YES if you prefer the old behaviour instead. MULTILINE_CPP_IS_BRIEF = NO # If the INHERIT_DOCS tag is set to YES (the default) then an undocumented # member inherits the documentation from any documented member that it # re-implements. INHERIT_DOCS = YES # If the SEPARATE_MEMBER_PAGES tag is set to YES, then doxygen will produce # a new page for each member. If set to NO, the documentation of a member will # be part of the file/class/namespace that contains it. 
SEPARATE_MEMBER_PAGES = NO # The TAB_SIZE tag can be used to set the number of spaces in a tab. # Doxygen uses this value to replace tabs by spaces in code fragments. TAB_SIZE = 8 # This tag can be used to specify a number of aliases that acts # as commands in the documentation. An alias has the form "name=value". # For example adding "sideeffect=\par Side Effects:\n" will allow you to # put the command \sideeffect (or @sideeffect) in the documentation, which # will result in a user-defined paragraph with heading "Side Effects:". # You can put \n's in the value part of an alias to insert newlines. ALIASES = # Set the OPTIMIZE_OUTPUT_FOR_C tag to YES if your project consists of C # sources only. Doxygen will then generate output that is more tailored for C. # For instance, some of the names that are used will be different. The list # of all members will be omitted, etc. OPTIMIZE_OUTPUT_FOR_C = NO # Set the OPTIMIZE_OUTPUT_JAVA tag to YES if your project consists of Java # sources only. Doxygen will then generate output that is more tailored for # Java. For instance, namespaces will be presented as packages, qualified # scopes will look different, etc. OPTIMIZE_OUTPUT_JAVA = NO # Set the OPTIMIZE_FOR_FORTRAN tag to YES if your project consists of Fortran # sources only. Doxygen will then generate output that is more tailored for # Fortran. OPTIMIZE_FOR_FORTRAN = NO # Set the OPTIMIZE_OUTPUT_VHDL tag to YES if your project consists of VHDL # sources. Doxygen will then generate output that is tailored for # VHDL. OPTIMIZE_OUTPUT_VHDL = NO # Doxygen selects the parser to use depending on the extension of the files it # parses. With this tag you can assign which parser to use for a given extension. # Doxygen has a built-in mapping, but you can override or extend it using this # tag. The format is ext=language, where ext is a file extension, and language # is one of the parsers supported by doxygen: IDL, Java, Javascript, CSharp, C, # C++, D, PHP, Objective-C, Python, Fortran, VHDL, C, C++. For instance to make # doxygen treat .inc files as Fortran files (default is PHP), and .f files as C # (default is Fortran), use: inc=Fortran f=C. Note that for custom extensions # you also need to set FILE_PATTERNS otherwise the files are not read by doxygen. EXTENSION_MAPPING = # If you use STL classes (i.e. std::string, std::vector, etc.) but do not want # to include (a tag file for) the STL sources as input, then you should # set this tag to YES in order to let doxygen match functions declarations and # definitions whose arguments contain STL classes (e.g. func(std::string); v.s. # func(std::string) {}). This also makes the inheritance and collaboration # diagrams that involve STL classes more complete and accurate. BUILTIN_STL_SUPPORT = NO # If you use Microsoft's C++/CLI language, you should set this option to YES to # enable parsing support. CPP_CLI_SUPPORT = NO # Set the SIP_SUPPORT tag to YES if your project consists of sip sources only. # Doxygen will parse them like normal C++ but will assume all classes use public # instead of private inheritance when no explicit protection keyword is present. SIP_SUPPORT = NO # For Microsoft's IDL there are propget and propput attributes to indicate getter # and setter methods for a property. Setting this option to YES (the default) # will make doxygen replace the get and set methods by a property in the # documentation. This will only work if the methods are indeed getting or # setting a simple type. 
If this is not the case, or you want to show the # methods anyway, you should set this option to NO. IDL_PROPERTY_SUPPORT = YES # If member grouping is used in the documentation and the DISTRIBUTE_GROUP_DOC # tag is set to YES, then doxygen will reuse the documentation of the first # member in the group (if any) for the other members of the group. By default # all members of a group must be documented explicitly. DISTRIBUTE_GROUP_DOC = NO # Set the SUBGROUPING tag to YES (the default) to allow class member groups of # the same type (for instance a group of public functions) to be put as a # subgroup of that type (e.g. under the Public Functions section). Set it to # NO to prevent subgrouping. Alternatively, this can be done per class using # the \nosubgrouping command. SUBGROUPING = YES # When the INLINE_GROUPED_CLASSES tag is set to YES, classes, structs and # unions are shown inside the group in which they are included (e.g. using # @ingroup) instead of on a separate page (for HTML and Man pages) or # section (for LaTeX and RTF). INLINE_GROUPED_CLASSES = NO # When TYPEDEF_HIDES_STRUCT is enabled, a typedef of a struct, union, or enum # is documented as struct, union, or enum with the name of the typedef. So # typedef struct TypeS {} TypeT, will appear in the documentation as a struct # with name TypeT. When disabled the typedef will appear as a member of a file, # namespace, or class. And the struct will be named TypeS. This can typically # be useful for C code in case the coding convention dictates that all compound # types are typedef'ed and only the typedef is referenced, never the tag name. TYPEDEF_HIDES_STRUCT = NO # The SYMBOL_CACHE_SIZE determines the size of the internal cache use to # determine which symbols to keep in memory and which to flush to disk. # When the cache is full, less often used symbols will be written to disk. # For small to medium size projects (<1000 input files) the default value is # probably good enough. For larger projects a too small cache size can cause # doxygen to be busy swapping symbols to and from disk most of the time # causing a significant performance penalty. # If the system has enough physical memory increasing the cache will improve the # performance by keeping more symbols in memory. Note that the value works on # a logarithmic scale so increasing the size by one will roughly double the # memory usage. The cache size is given by this formula: # 2^(16+SYMBOL_CACHE_SIZE). The valid range is 0..9, the default is 0, # corresponding to a cache size of 2^16 = 65536 symbols SYMBOL_CACHE_SIZE = 0 #--------------------------------------------------------------------------- # Build related configuration options #--------------------------------------------------------------------------- # If the EXTRACT_ALL tag is set to YES doxygen will assume all entities in # documentation are documented, even if no documentation was available. # Private class members and static file members will be hidden unless # the EXTRACT_PRIVATE and EXTRACT_STATIC tags are set to YES EXTRACT_ALL = NO # If the EXTRACT_PRIVATE tag is set to YES all private members of a class # will be included in the documentation. EXTRACT_PRIVATE = NO # If the EXTRACT_STATIC tag is set to YES all static members of a file # will be included in the documentation. EXTRACT_STATIC = YES # If the EXTRACT_LOCAL_CLASSES tag is set to YES classes (and structs) # defined locally in source files will be included in the documentation. # If set to NO only classes defined in header files are included. 
EXTRACT_LOCAL_CLASSES = YES # This flag is only useful for Objective-C code. When set to YES local # methods, which are defined in the implementation section but not in # the interface are included in the documentation. # If set to NO (the default) only methods in the interface are included. EXTRACT_LOCAL_METHODS = NO # If this flag is set to YES, the members of anonymous namespaces will be # extracted and appear in the documentation as a namespace called # 'anonymous_namespace{file}', where file will be replaced with the base # name of the file that contains the anonymous namespace. By default # anonymous namespaces are hidden. EXTRACT_ANON_NSPACES = NO # If the HIDE_UNDOC_MEMBERS tag is set to YES, Doxygen will hide all # undocumented members of documented classes, files or namespaces. # If set to NO (the default) these members will be included in the # various overviews, but no documentation section is generated. # This option has no effect if EXTRACT_ALL is enabled. HIDE_UNDOC_MEMBERS = NO # If the HIDE_UNDOC_CLASSES tag is set to YES, Doxygen will hide all # undocumented classes that are normally visible in the class hierarchy. # If set to NO (the default) these classes will be included in the various # overviews. This option has no effect if EXTRACT_ALL is enabled. HIDE_UNDOC_CLASSES = NO # If the HIDE_FRIEND_COMPOUNDS tag is set to YES, Doxygen will hide all # friend (class|struct|union) declarations. # If set to NO (the default) these declarations will be included in the # documentation. HIDE_FRIEND_COMPOUNDS = NO # If the HIDE_IN_BODY_DOCS tag is set to YES, Doxygen will hide any # documentation blocks found inside the body of a function. # If set to NO (the default) these blocks will be appended to the # function's detailed documentation block. HIDE_IN_BODY_DOCS = NO # The INTERNAL_DOCS tag determines if documentation # that is typed after a \internal command is included. If the tag is set # to NO (the default) then the documentation will be excluded. # Set it to YES to include the internal documentation. INTERNAL_DOCS = NO # If the CASE_SENSE_NAMES tag is set to NO then Doxygen will only generate # file names in lower-case letters. If set to YES upper-case letters are also # allowed. This is useful if you have classes or files whose names only differ # in case and if your file system supports case sensitive file names. Windows # and Mac users are advised to set this option to NO. CASE_SENSE_NAMES = YES # If the HIDE_SCOPE_NAMES tag is set to NO (the default) then Doxygen # will show members with their full class and namespace scopes in the # documentation. If set to YES the scope will be hidden. HIDE_SCOPE_NAMES = NO # If the SHOW_INCLUDE_FILES tag is set to YES (the default) then Doxygen # will put a list of the files that are included by a file in the documentation # of that file. SHOW_INCLUDE_FILES = YES # If the FORCE_LOCAL_INCLUDES tag is set to YES then Doxygen # will list include files with double quotes in the documentation # rather than with sharp brackets. FORCE_LOCAL_INCLUDES = NO # If the INLINE_INFO tag is set to YES (the default) then a tag [inline] # is inserted in the documentation for inline members. INLINE_INFO = YES # If the SORT_MEMBER_DOCS tag is set to YES (the default) then doxygen # will sort the (detailed) documentation of file and class members # alphabetically by member name. If set to NO the members will appear in # declaration order. 
SORT_MEMBER_DOCS = YES # If the SORT_BRIEF_DOCS tag is set to YES then doxygen will sort the # brief documentation of file, namespace and class members alphabetically # by member name. If set to NO (the default) the members will appear in # declaration order. SORT_BRIEF_DOCS = NO # If the SORT_MEMBERS_CTORS_1ST tag is set to YES then doxygen # will sort the (brief and detailed) documentation of class members so that # constructors and destructors are listed first. If set to NO (the default) # the constructors will appear in the respective orders defined by # SORT_MEMBER_DOCS and SORT_BRIEF_DOCS. # This tag will be ignored for brief docs if SORT_BRIEF_DOCS is set to NO # and ignored for detailed docs if SORT_MEMBER_DOCS is set to NO. SORT_MEMBERS_CTORS_1ST = NO # If the SORT_GROUP_NAMES tag is set to YES then doxygen will sort the # hierarchy of group names into alphabetical order. If set to NO (the default) # the group names will appear in their defined order. SORT_GROUP_NAMES = NO # If the SORT_BY_SCOPE_NAME tag is set to YES, the class list will be # sorted by fully-qualified names, including namespaces. If set to # NO (the default), the class list will be sorted only by class name, # not including the namespace part. # Note: This option is not very useful if HIDE_SCOPE_NAMES is set to YES. # Note: This option applies only to the class list, not to the # alphabetical list. SORT_BY_SCOPE_NAME = NO # If the STRICT_PROTO_MATCHING option is enabled and doxygen fails to # do proper type resolution of all parameters of a function it will reject a # match between the prototype and the implementation of a member function even # if there is only one candidate or it is obvious which candidate to choose # by doing a simple string match. By disabling STRICT_PROTO_MATCHING doxygen # will still accept a match between prototype and implementation in such cases. STRICT_PROTO_MATCHING = NO # The GENERATE_TODOLIST tag can be used to enable (YES) or # disable (NO) the todo list. This list is created by putting \todo # commands in the documentation. GENERATE_TODOLIST = YES # The GENERATE_TESTLIST tag can be used to enable (YES) or # disable (NO) the test list. This list is created by putting \test # commands in the documentation. GENERATE_TESTLIST = YES # The GENERATE_BUGLIST tag can be used to enable (YES) or # disable (NO) the bug list. This list is created by putting \bug # commands in the documentation. GENERATE_BUGLIST = YES # The GENERATE_DEPRECATEDLIST tag can be used to enable (YES) or # disable (NO) the deprecated list. This list is created by putting # \deprecated commands in the documentation. GENERATE_DEPRECATEDLIST= YES # The ENABLED_SECTIONS tag can be used to enable conditional # documentation sections, marked by \if sectionname ... \endif. ENABLED_SECTIONS = # The MAX_INITIALIZER_LINES tag determines the maximum number of lines # the initial value of a variable or macro consists of for it to appear in # the documentation. If the initializer consists of more lines than specified # here it will be hidden. Use a value of 0 to hide initializers completely. # The appearance of the initializer of individual variables and macros in the # documentation can be controlled using \showinitializer or \hideinitializer # command in the documentation regardless of this setting. MAX_INITIALIZER_LINES = 30 # Set the SHOW_USED_FILES tag to NO to disable the list of files generated # at the bottom of the documentation of classes and structs. 
If set to YES the # list will mention the files that were used to generate the documentation. SHOW_USED_FILES = YES # If the sources in your project are distributed over multiple directories # then setting the SHOW_DIRECTORIES tag to YES will show the directory hierarchy # in the documentation. The default is NO. SHOW_DIRECTORIES = YES # Set the SHOW_FILES tag to NO to disable the generation of the Files page. # This will remove the Files entry from the Quick Index and from the # Folder Tree View (if specified). The default is YES. SHOW_FILES = YES # Set the SHOW_NAMESPACES tag to NO to disable the generation of the # Namespaces page. # This will remove the Namespaces entry from the Quick Index # and from the Folder Tree View (if specified). The default is YES. SHOW_NAMESPACES = YES # The FILE_VERSION_FILTER tag can be used to specify a program or script that # doxygen should invoke to get the current version for each file (typically from # the version control system). Doxygen will invoke the program by executing (via # popen()) the command , where is the value of # the FILE_VERSION_FILTER tag, and is the name of an input file # provided by doxygen. Whatever the program writes to standard output # is used as the file version. See the manual for examples. FILE_VERSION_FILTER = # The LAYOUT_FILE tag can be used to specify a layout file which will be parsed # by doxygen. The layout file controls the global structure of the generated # output files in an output format independent way. The create the layout file # that represents doxygen's defaults, run doxygen with the -l option. # You can optionally specify a file name after the option, if omitted # DoxygenLayout.xml will be used as the name of the layout file. LAYOUT_FILE = #--------------------------------------------------------------------------- # configuration options related to warning and progress messages #--------------------------------------------------------------------------- # The QUIET tag can be used to turn on/off the messages that are generated # by doxygen. Possible values are YES and NO. If left blank NO is used. QUIET = NO # The WARNINGS tag can be used to turn on/off the warning messages that are # generated by doxygen. Possible values are YES and NO. If left blank # NO is used. WARNINGS = YES # If WARN_IF_UNDOCUMENTED is set to YES, then doxygen will generate warnings # for undocumented members. If EXTRACT_ALL is set to YES then this flag will # automatically be disabled. WARN_IF_UNDOCUMENTED = YES # If WARN_IF_DOC_ERROR is set to YES, doxygen will generate warnings for # potential errors in the documentation, such as not documenting some # parameters in a documented function, or documenting parameters that # don't exist or using markup commands wrongly. WARN_IF_DOC_ERROR = YES # The WARN_NO_PARAMDOC option can be enabled to get warnings for # functions that are documented, but have no documentation for their parameters # or return value. If set to NO (the default) doxygen will only warn about # wrong or incomplete parameter documentation, but not about the absence of # documentation. WARN_NO_PARAMDOC = NO # The WARN_FORMAT tag determines the format of the warning messages that # doxygen can produce. The string should contain the $file, $line, and $text # tags, which will be replaced by the file and line number from which the # warning originated and the warning text. 
Optionally the format may contain # $version, which will be replaced by the version of the file (if it could # be obtained via FILE_VERSION_FILTER) WARN_FORMAT = "$file:$line: $text" # The WARN_LOGFILE tag can be used to specify a file to which warning # and error messages should be written. If left blank the output is written # to stderr. WARN_LOGFILE = #--------------------------------------------------------------------------- # configuration options related to the input files #--------------------------------------------------------------------------- # The INPUT tag can be used to specify the files and/or directories that contain # documented source files. You may enter file names like "myfile.cpp" or # directories like "/usr/src/myproject". Separate the files or directories # with spaces. INPUT = slowmoCLI slowmoUI lib project visualizeFlow docs/src # This tag can be used to specify the character encoding of the source files # that doxygen parses. Internally doxygen uses the UTF-8 encoding, which is # also the default input encoding. Doxygen uses libiconv (or the iconv built # into libc) for the transcoding. See http://www.gnu.org/software/libiconv for # the list of possible encodings. INPUT_ENCODING = UTF-8 # If the value of the INPUT tag contains directories, you can use the # FILE_PATTERNS tag to specify one or more wildcard pattern (like *.cpp # and *.h) to filter out the source-files in the directories. If left # blank the following patterns are tested: # *.c *.cc *.cxx *.cpp *.c++ *.d *.java *.ii *.ixx *.ipp *.i++ *.inl *.h *.hh # *.hxx *.hpp *.h++ *.idl *.odl *.cs *.php *.php3 *.inc *.m *.mm *.dox *.py # *.f90 *.f *.for *.vhd *.vhdl FILE_PATTERNS = # The RECURSIVE tag can be used to turn specify whether or not subdirectories # should be searched for input files as well. Possible values are YES and NO. # If left blank NO is used. RECURSIVE = YES # The EXCLUDE tag can be used to specify files and/or directories that should # excluded from the INPUT source files. This way you can easily exclude a # subdirectory from a directory tree whose root is specified with the INPUT tag. EXCLUDE = # The EXCLUDE_SYMLINKS tag can be used select whether or not files or # directories that are symbolic links (a Unix file system feature) are excluded # from the input. EXCLUDE_SYMLINKS = NO # If the value of the INPUT tag contains directories, you can use the # EXCLUDE_PATTERNS tag to specify one or more wildcard patterns to exclude # certain files from those directories. Note that the wildcards are matched # against the file with absolute path, so to exclude all test directories # for example use the pattern */test/* EXCLUDE_PATTERNS = # The EXCLUDE_SYMBOLS tag can be used to specify one or more symbol names # (namespaces, classes, functions, etc.) that should be excluded from the # output. The symbol name can be a fully qualified name, a word, or if the # wildcard * is used, a substring. Examples: ANamespace, AClass, # AClass::ANamespace, ANamespace::*Test EXCLUDE_SYMBOLS = # The EXAMPLE_PATH tag can be used to specify one or more files or # directories that contain example code fragments that are included (see # the \include command). EXAMPLE_PATH = # If the value of the EXAMPLE_PATH tag contains directories, you can use the # EXAMPLE_PATTERNS tag to specify one or more wildcard pattern (like *.cpp # and *.h) to filter out the source-files in the directories. If left # blank all files are included. 
EXAMPLE_PATTERNS = # If the EXAMPLE_RECURSIVE tag is set to YES then subdirectories will be # searched for input files to be used with the \include or \dontinclude # commands irrespective of the value of the RECURSIVE tag. # Possible values are YES and NO. If left blank NO is used. EXAMPLE_RECURSIVE = NO # The IMAGE_PATH tag can be used to specify one or more files or # directories that contain image that are included in the documentation (see # the \image command). IMAGE_PATH = # The INPUT_FILTER tag can be used to specify a program that doxygen should # invoke to filter for each input file. Doxygen will invoke the filter program # by executing (via popen()) the command , where # is the value of the INPUT_FILTER tag, and is the name of an # input file. Doxygen will then use the output that the filter program writes # to standard output. # If FILTER_PATTERNS is specified, this tag will be # ignored. INPUT_FILTER = # The FILTER_PATTERNS tag can be used to specify filters on a per file pattern # basis. # Doxygen will compare the file name with each pattern and apply the # filter if there is a match. # The filters are a list of the form: # pattern=filter (like *.cpp=my_cpp_filter). See INPUT_FILTER for further # info on how filters are used. If FILTER_PATTERNS is empty or if # non of the patterns match the file name, INPUT_FILTER is applied. FILTER_PATTERNS = # If the FILTER_SOURCE_FILES tag is set to YES, the input filter (if set using # INPUT_FILTER) will be used to filter the input files when producing source # files to browse (i.e. when SOURCE_BROWSER is set to YES). FILTER_SOURCE_FILES = NO # The FILTER_SOURCE_PATTERNS tag can be used to specify source filters per file # pattern. A pattern will override the setting for FILTER_PATTERN (if any) # and it is also possible to disable source filtering for a specific pattern # using *.ext= (so without naming a filter). This option only has effect when # FILTER_SOURCE_FILES is enabled. FILTER_SOURCE_PATTERNS = #--------------------------------------------------------------------------- # configuration options related to source browsing #--------------------------------------------------------------------------- # If the SOURCE_BROWSER tag is set to YES then a list of source files will # be generated. Documented entities will be cross-referenced with these sources. # Note: To get rid of all source code in the generated output, make sure also # VERBATIM_HEADERS is set to NO. SOURCE_BROWSER = YES # Setting the INLINE_SOURCES tag to YES will include the body # of functions and classes directly in the documentation. INLINE_SOURCES = NO # Setting the STRIP_CODE_COMMENTS tag to YES (the default) will instruct # doxygen to hide any special comment blocks from generated source code # fragments. Normal C and C++ comments will always remain visible. STRIP_CODE_COMMENTS = YES # If the REFERENCED_BY_RELATION tag is set to YES # then for each documented function all documented # functions referencing it will be listed. REFERENCED_BY_RELATION = NO # If the REFERENCES_RELATION tag is set to YES # then for each documented function all documented entities # called/used by that function will be listed. REFERENCES_RELATION = NO # If the REFERENCES_LINK_SOURCE tag is set to YES (the default) # and SOURCE_BROWSER tag is set to YES, then the hyperlinks from # functions in REFERENCES_RELATION and REFERENCED_BY_RELATION lists will # link to the source code. # Otherwise they will link to the documentation. 
REFERENCES_LINK_SOURCE = YES # If the USE_HTAGS tag is set to YES then the references to source code # will point to the HTML generated by the htags(1) tool instead of doxygen # built-in source browser. The htags tool is part of GNU's global source # tagging system (see http://www.gnu.org/software/global/global.html). You # will need version 4.8.6 or higher. USE_HTAGS = NO # If the VERBATIM_HEADERS tag is set to YES (the default) then Doxygen # will generate a verbatim copy of the header file for each class for # which an include is specified. Set to NO to disable this. VERBATIM_HEADERS = YES #--------------------------------------------------------------------------- # configuration options related to the alphabetical class index #--------------------------------------------------------------------------- # If the ALPHABETICAL_INDEX tag is set to YES, an alphabetical index # of all compounds will be generated. Enable this if the project # contains a lot of classes, structs, unions or interfaces. ALPHABETICAL_INDEX = YES # If the alphabetical index is enabled (see ALPHABETICAL_INDEX) then # the COLS_IN_ALPHA_INDEX tag can be used to specify the number of columns # in which this list will be split (can be a number in the range [1..20]) COLS_IN_ALPHA_INDEX = 5 # In case all classes in a project start with a common prefix, all # classes will be put under the same header in the alphabetical index. # The IGNORE_PREFIX tag can be used to specify one or more prefixes that # should be ignored while generating the index headers. IGNORE_PREFIX = #--------------------------------------------------------------------------- # configuration options related to the HTML output #--------------------------------------------------------------------------- # If the GENERATE_HTML tag is set to YES (the default) Doxygen will # generate HTML output. GENERATE_HTML = YES # The HTML_OUTPUT tag is used to specify where the HTML docs will be put. # If a relative path is entered the value of OUTPUT_DIRECTORY will be # put in front of it. If left blank `html' will be used as the default path. HTML_OUTPUT = html # The HTML_FILE_EXTENSION tag can be used to specify the file extension for # each generated HTML page (for example: .htm,.php,.asp). If it is left blank # doxygen will generate files with .html extension. HTML_FILE_EXTENSION = .html # The HTML_HEADER tag can be used to specify a personal HTML header for # each generated HTML page. If it is left blank doxygen will generate a # standard header. Note that when using a custom header you are responsible # for the proper inclusion of any scripts and style sheets that doxygen # needs, which is dependent on the configuration options used. # It is adviced to generate a default header using "doxygen -w html # header.html footer.html stylesheet.css YourConfigFile" and then modify # that header. Note that the header is subject to change so you typically # have to redo this when upgrading to a newer version of doxygen or when changing the value of configuration settings such as GENERATE_TREEVIEW! HTML_HEADER = # The HTML_FOOTER tag can be used to specify a personal HTML footer for # each generated HTML page. If it is left blank doxygen will generate a # standard footer. HTML_FOOTER = # The HTML_STYLESHEET tag can be used to specify a user-defined cascading # style sheet that is used by each HTML page. It can be used to # fine-tune the look of the HTML output. If the tag is left blank doxygen # will generate a default style sheet. 
Note that doxygen will try to copy # the style sheet file to the HTML output directory, so don't put your own # stylesheet in the HTML output directory as well, or it will be erased! HTML_STYLESHEET = docs/src/doxygen.css # The HTML_EXTRA_FILES tag can be used to specify one or more extra images or # other source files which should be copied to the HTML output directory. Note # that these files will be copied to the base HTML output directory. Use the # $relpath$ marker in the HTML_HEADER and/or HTML_FOOTER files to load these # files. In the HTML_STYLESHEET file, use the file name only. Also note that # the files will be copied as-is; there are no commands or markers available. HTML_EXTRA_FILES = # The HTML_COLORSTYLE_HUE tag controls the color of the HTML output. # Doxygen will adjust the colors in the stylesheet and background images # according to this color. Hue is specified as an angle on a colorwheel, # see http://en.wikipedia.org/wiki/Hue for more information. # For instance the value 0 represents red, 60 is yellow, 120 is green, # 180 is cyan, 240 is blue, 300 purple, and 360 is red again. # The allowed range is 0 to 359. HTML_COLORSTYLE_HUE = 220 # The HTML_COLORSTYLE_SAT tag controls the purity (or saturation) of # the colors in the HTML output. For a value of 0 the output will use # grayscales only. A value of 255 will produce the most vivid colors. HTML_COLORSTYLE_SAT = 100 # The HTML_COLORSTYLE_GAMMA tag controls the gamma correction applied to # the luminance component of the colors in the HTML output. Values below # 100 gradually make the output lighter, whereas values above 100 make # the output darker. The value divided by 100 is the actual gamma applied, # so 80 represents a gamma of 0.8, The value 220 represents a gamma of 2.2, # and 100 does not change the gamma. HTML_COLORSTYLE_GAMMA = 80 # If the HTML_TIMESTAMP tag is set to YES then the footer of each generated HTML # page will contain the date and time when the page was generated. Setting # this to NO can help when comparing the output of multiple runs. HTML_TIMESTAMP = YES # If the HTML_ALIGN_MEMBERS tag is set to YES, the members of classes, # files or namespaces will be aligned in HTML using tables. If set to # NO a bullet list will be used. HTML_ALIGN_MEMBERS = YES # If the HTML_DYNAMIC_SECTIONS tag is set to YES then the generated HTML # documentation will contain sections that can be hidden and shown after the # page has loaded. For this to work a browser that supports # JavaScript and DHTML is required (for instance Mozilla 1.0+, Firefox # Netscape 6.0+, Internet explorer 5.0+, Konqueror, or Safari). HTML_DYNAMIC_SECTIONS = NO # If the GENERATE_DOCSET tag is set to YES, additional index files # will be generated that can be used as input for Apple's Xcode 3 # integrated development environment, introduced with OSX 10.5 (Leopard). # To create a documentation set, doxygen will generate a Makefile in the # HTML output directory. Running make will produce the docset in that # directory and running "make install" will install the docset in # ~/Library/Developer/Shared/Documentation/DocSets so that Xcode will find # it at startup. # See http://developer.apple.com/tools/creatingdocsetswithdoxygen.html # for more information. GENERATE_DOCSET = NO # When GENERATE_DOCSET tag is set to YES, this tag determines the name of the # feed. A documentation feed provides an umbrella under which multiple # documentation sets from a single provider (such as a company or product suite) # can be grouped. 
DOCSET_FEEDNAME = "Doxygen generated docs" # When GENERATE_DOCSET tag is set to YES, this tag specifies a string that # should uniquely identify the documentation set bundle. This should be a # reverse domain-name style string, e.g. com.mycompany.MyDocSet. Doxygen # will append .docset to the name. DOCSET_BUNDLE_ID = org.doxygen.Project # When GENERATE_PUBLISHER_ID tag specifies a string that should uniquely identify # the documentation publisher. This should be a reverse domain-name style # string, e.g. com.mycompany.MyDocSet.documentation. DOCSET_PUBLISHER_ID = org.doxygen.Publisher # The GENERATE_PUBLISHER_NAME tag identifies the documentation publisher. DOCSET_PUBLISHER_NAME = Publisher # If the GENERATE_HTMLHELP tag is set to YES, additional index files # will be generated that can be used as input for tools like the # Microsoft HTML help workshop to generate a compiled HTML help file (.chm) # of the generated HTML documentation. GENERATE_HTMLHELP = NO # If the GENERATE_HTMLHELP tag is set to YES, the CHM_FILE tag can # be used to specify the file name of the resulting .chm file. You # can add a path in front of the file if the result should not be # written to the html output directory. CHM_FILE = # If the GENERATE_HTMLHELP tag is set to YES, the HHC_LOCATION tag can # be used to specify the location (absolute path including file name) of # the HTML help compiler (hhc.exe). If non-empty doxygen will try to run # the HTML help compiler on the generated index.hhp. HHC_LOCATION = # If the GENERATE_HTMLHELP tag is set to YES, the GENERATE_CHI flag # controls if a separate .chi index file is generated (YES) or that # it should be included in the master .chm file (NO). GENERATE_CHI = NO # If the GENERATE_HTMLHELP tag is set to YES, the CHM_INDEX_ENCODING # is used to encode HtmlHelp index (hhk), content (hhc) and project file # content. CHM_INDEX_ENCODING = # If the GENERATE_HTMLHELP tag is set to YES, the BINARY_TOC flag # controls whether a binary table of contents is generated (YES) or a # normal table of contents (NO) in the .chm file. BINARY_TOC = NO # The TOC_EXPAND flag can be set to YES to add extra items for group members # to the contents of the HTML help documentation and to the tree view. TOC_EXPAND = NO # If the GENERATE_QHP tag is set to YES and both QHP_NAMESPACE and # QHP_VIRTUAL_FOLDER are set, an additional index file will be generated # that can be used as input for Qt's qhelpgenerator to generate a # Qt Compressed Help (.qch) of the generated HTML documentation. GENERATE_QHP = NO # If the QHG_LOCATION tag is specified, the QCH_FILE tag can # be used to specify the file name of the resulting .qch file. # The path specified is relative to the HTML output folder. QCH_FILE = # The QHP_NAMESPACE tag specifies the namespace to use when generating # Qt Help Project output. For more information please see # http://doc.trolltech.com/qthelpproject.html#namespace QHP_NAMESPACE = org.doxygen.Project # The QHP_VIRTUAL_FOLDER tag specifies the namespace to use when generating # Qt Help Project output. For more information please see # http://doc.trolltech.com/qthelpproject.html#virtual-folders QHP_VIRTUAL_FOLDER = doc # If QHP_CUST_FILTER_NAME is set, it specifies the name of a custom filter to # add. For more information please see # http://doc.trolltech.com/qthelpproject.html#custom-filters QHP_CUST_FILTER_NAME = # The QHP_CUST_FILT_ATTRS tag specifies the list of the attributes of the # custom filter to add. For more information please see #
# Qt Help Project / Custom Filters. QHP_CUST_FILTER_ATTRS = # The QHP_SECT_FILTER_ATTRS tag specifies the list of the attributes this # project's # filter section matches. # # Qt Help Project / Filter Attributes. QHP_SECT_FILTER_ATTRS = # If the GENERATE_QHP tag is set to YES, the QHG_LOCATION tag can # be used to specify the location of Qt's qhelpgenerator. # If non-empty doxygen will try to run qhelpgenerator on the generated # .qhp file. QHG_LOCATION = # If the GENERATE_ECLIPSEHELP tag is set to YES, additional index files # will be generated, which together with the HTML files, form an Eclipse help # plugin. To install this plugin and make it available under the help contents # menu in Eclipse, the contents of the directory containing the HTML and XML # files needs to be copied into the plugins directory of eclipse. The name of # the directory within the plugins directory should be the same as # the ECLIPSE_DOC_ID value. After copying Eclipse needs to be restarted before # the help appears. GENERATE_ECLIPSEHELP = NO # A unique identifier for the eclipse help plugin. When installing the plugin # the directory name containing the HTML and XML files should also have # this name. ECLIPSE_DOC_ID = org.doxygen.Project # The DISABLE_INDEX tag can be used to turn on/off the condensed index at # top of each HTML page. The value NO (the default) enables the index and # the value YES disables it. DISABLE_INDEX = NO # The ENUM_VALUES_PER_LINE tag can be used to set the number of enum values # (range [0,1..20]) that doxygen will group on one line in the generated HTML # documentation. Note that a value of 0 will completely suppress the enum # values from appearing in the overview section. ENUM_VALUES_PER_LINE = 4 # The GENERATE_TREEVIEW tag is used to specify whether a tree-like index # structure should be generated to display hierarchical information. # If the tag value is set to YES, a side panel will be generated # containing a tree-like index structure (just like the one that # is generated for HTML Help). For this to work a browser that supports # JavaScript, DHTML, CSS and frames is required (i.e. any modern browser). # Windows users are probably better off using the HTML help feature. GENERATE_TREEVIEW = NO # By enabling USE_INLINE_TREES, doxygen will generate the Groups, Directories, # and Class Hierarchy pages using a tree view instead of an ordered list. USE_INLINE_TREES = NO # If the treeview is enabled (see GENERATE_TREEVIEW) then this tag can be # used to set the initial width (in pixels) of the frame in which the tree # is shown. TREEVIEW_WIDTH = 250 # When the EXT_LINKS_IN_WINDOW option is set to YES doxygen will open # links to external symbols imported via tag files in a separate window. EXT_LINKS_IN_WINDOW = NO # Use this tag to change the font size of Latex formulas included # as images in the HTML documentation. The default is 10. Note that # when you change the font size after a successful doxygen run you need # to manually remove any form_*.png images from the HTML output directory # to force them to be regenerated. FORMULA_FONTSIZE = 10 # Use the FORMULA_TRANPARENT tag to determine whether or not the images # generated for formulas are transparent PNGs. Transparent PNGs are # not supported properly for IE 6.0, but are supported on all modern browsers. # Note that when changing this option you need to delete any form_*.png files # in the HTML output before the changes have effect. 
FORMULA_TRANSPARENT = YES # Enable the USE_MATHJAX option to render LaTeX formulas using MathJax # (see http://www.mathjax.org) which uses client side Javascript for the # rendering instead of using prerendered bitmaps. Use this if you do not # have LaTeX installed or if you want to formulas look prettier in the HTML # output. When enabled you also need to install MathJax separately and # configure the path to it using the MATHJAX_RELPATH option. USE_MATHJAX = NO # When MathJax is enabled you need to specify the location relative to the # HTML output directory using the MATHJAX_RELPATH option. The destination # directory should contain the MathJax.js script. For instance, if the mathjax # directory is located at the same level as the HTML output directory, then # MATHJAX_RELPATH should be ../mathjax. The default value points to the # mathjax.org site, so you can quickly see the result without installing # MathJax, but it is strongly recommended to install a local copy of MathJax # before deployment. MATHJAX_RELPATH = http://www.mathjax.org/mathjax # When the SEARCHENGINE tag is enabled doxygen will generate a search box # for the HTML output. The underlying search engine uses javascript # and DHTML and should work on any modern browser. Note that when using # HTML help (GENERATE_HTMLHELP), Qt help (GENERATE_QHP), or docsets # (GENERATE_DOCSET) there is already a search function so this one should # typically be disabled. For large projects the javascript based search engine # can be slow, then enabling SERVER_BASED_SEARCH may provide a better solution. SEARCHENGINE = YES # When the SERVER_BASED_SEARCH tag is enabled the search engine will be # implemented using a PHP enabled web server instead of at the web client # using Javascript. Doxygen will generate the search PHP script and index # file to put on the web server. The advantage of the server # based approach is that it scales better to large projects and allows # full text search. The disadvantages are that it is more difficult to setup # and does not have live searching capabilities. SERVER_BASED_SEARCH = NO #--------------------------------------------------------------------------- # configuration options related to the LaTeX output #--------------------------------------------------------------------------- # If the GENERATE_LATEX tag is set to YES (the default) Doxygen will # generate Latex output. GENERATE_LATEX = YES # The LATEX_OUTPUT tag is used to specify where the LaTeX docs will be put. # If a relative path is entered the value of OUTPUT_DIRECTORY will be # put in front of it. If left blank `latex' will be used as the default path. LATEX_OUTPUT = latex # The LATEX_CMD_NAME tag can be used to specify the LaTeX command name to be # invoked. If left blank `latex' will be used as the default command name. # Note that when enabling USE_PDFLATEX this option is only used for # generating bitmaps for formulas in the HTML output, but not in the # Makefile that is written to the output directory. LATEX_CMD_NAME = latex # The MAKEINDEX_CMD_NAME tag can be used to specify the command name to # generate index for LaTeX. If left blank `makeindex' will be used as the # default command name. MAKEINDEX_CMD_NAME = makeindex # If the COMPACT_LATEX tag is set to YES Doxygen generates more compact # LaTeX documents. This may be useful for small projects and may help to # save some trees in general. COMPACT_LATEX = NO # The PAPER_TYPE tag can be used to set the paper type that is used # by the printer. 
Possible values are: a4, letter, legal and # executive. If left blank a4wide will be used. PAPER_TYPE = a4 # The EXTRA_PACKAGES tag can be to specify one or more names of LaTeX # packages that should be included in the LaTeX output. EXTRA_PACKAGES = # The LATEX_HEADER tag can be used to specify a personal LaTeX header for # the generated latex document. The header should contain everything until # the first chapter. If it is left blank doxygen will generate a # standard header. Notice: only use this tag if you know what you are doing! LATEX_HEADER = # The LATEX_FOOTER tag can be used to specify a personal LaTeX footer for # the generated latex document. The footer should contain everything after # the last chapter. If it is left blank doxygen will generate a # standard footer. Notice: only use this tag if you know what you are doing! LATEX_FOOTER = # If the PDF_HYPERLINKS tag is set to YES, the LaTeX that is generated # is prepared for conversion to pdf (using ps2pdf). The pdf file will # contain links (just like the HTML output) instead of page references # This makes the output suitable for online browsing using a pdf viewer. PDF_HYPERLINKS = YES # If the USE_PDFLATEX tag is set to YES, pdflatex will be used instead of # plain latex in the generated Makefile. Set this option to YES to get a # higher quality PDF documentation. USE_PDFLATEX = YES # If the LATEX_BATCHMODE tag is set to YES, doxygen will add the \\batchmode. # command to the generated LaTeX files. This will instruct LaTeX to keep # running if errors occur, instead of asking the user for help. # This option is also used when generating formulas in HTML. LATEX_BATCHMODE = NO # If LATEX_HIDE_INDICES is set to YES then doxygen will not # include the index chapters (such as File Index, Compound Index, etc.) # in the output. LATEX_HIDE_INDICES = NO # If LATEX_SOURCE_CODE is set to YES then doxygen will include # source code with syntax highlighting in the LaTeX output. # Note that which sources are shown also depends on other settings # such as SOURCE_BROWSER. LATEX_SOURCE_CODE = NO #--------------------------------------------------------------------------- # configuration options related to the RTF output #--------------------------------------------------------------------------- # If the GENERATE_RTF tag is set to YES Doxygen will generate RTF output # The RTF output is optimized for Word 97 and may not look very pretty with # other RTF readers or editors. GENERATE_RTF = NO # The RTF_OUTPUT tag is used to specify where the RTF docs will be put. # If a relative path is entered the value of OUTPUT_DIRECTORY will be # put in front of it. If left blank `rtf' will be used as the default path. RTF_OUTPUT = rtf # If the COMPACT_RTF tag is set to YES Doxygen generates more compact # RTF documents. This may be useful for small projects and may help to # save some trees in general. COMPACT_RTF = NO # If the RTF_HYPERLINKS tag is set to YES, the RTF that is generated # will contain hyperlink fields. The RTF file will # contain links (just like the HTML output) instead of page references. # This makes the output suitable for online browsing using WORD or other # programs which support those fields. # Note: wordpad (write) and others do not support links. RTF_HYPERLINKS = NO # Load stylesheet definitions from file. Syntax is similar to doxygen's # config file, i.e. a series of assignments. You only have to provide # replacements, missing definitions are set to their default value. 
RTF_STYLESHEET_FILE = # Set optional variables used in the generation of an rtf document. # Syntax is similar to doxygen's config file. RTF_EXTENSIONS_FILE = #--------------------------------------------------------------------------- # configuration options related to the man page output #--------------------------------------------------------------------------- # If the GENERATE_MAN tag is set to YES (the default) Doxygen will # generate man pages GENERATE_MAN = NO # The MAN_OUTPUT tag is used to specify where the man pages will be put. # If a relative path is entered the value of OUTPUT_DIRECTORY will be # put in front of it. If left blank `man' will be used as the default path. MAN_OUTPUT = man # The MAN_EXTENSION tag determines the extension that is added to # the generated man pages (default is the subroutine's section .3) MAN_EXTENSION = .3 # If the MAN_LINKS tag is set to YES and Doxygen generates man output, # then it will generate one additional man file for each entity # documented in the real man page(s). These additional files # only source the real man page, but without them the man command # would be unable to find the correct page. The default is NO. MAN_LINKS = NO #--------------------------------------------------------------------------- # configuration options related to the XML output #--------------------------------------------------------------------------- # If the GENERATE_XML tag is set to YES Doxygen will # generate an XML file that captures the structure of # the code including all documentation. GENERATE_XML = NO # The XML_OUTPUT tag is used to specify where the XML pages will be put. # If a relative path is entered the value of OUTPUT_DIRECTORY will be # put in front of it. If left blank `xml' will be used as the default path. XML_OUTPUT = xml # The XML_SCHEMA tag can be used to specify an XML schema, # which can be used by a validating XML parser to check the # syntax of the XML files. XML_SCHEMA = # The XML_DTD tag can be used to specify an XML DTD, # which can be used by a validating XML parser to check the # syntax of the XML files. XML_DTD = # If the XML_PROGRAMLISTING tag is set to YES Doxygen will # dump the program listings (including syntax highlighting # and cross-referencing information) to the XML output. Note that # enabling this will significantly increase the size of the XML output. XML_PROGRAMLISTING = YES #--------------------------------------------------------------------------- # configuration options for the AutoGen Definitions output #--------------------------------------------------------------------------- # If the GENERATE_AUTOGEN_DEF tag is set to YES Doxygen will # generate an AutoGen Definitions (see autogen.sf.net) file # that captures the structure of the code including all # documentation. Note that this feature is still experimental # and incomplete at the moment. GENERATE_AUTOGEN_DEF = NO #--------------------------------------------------------------------------- # configuration options related to the Perl module output #--------------------------------------------------------------------------- # If the GENERATE_PERLMOD tag is set to YES Doxygen will # generate a Perl module file that captures the structure of # the code including all documentation. Note that this # feature is still experimental and incomplete at the # moment. 
GENERATE_PERLMOD = NO # If the PERLMOD_LATEX tag is set to YES Doxygen will generate # the necessary Makefile rules, Perl scripts and LaTeX code to be able # to generate PDF and DVI output from the Perl module output. PERLMOD_LATEX = NO # If the PERLMOD_PRETTY tag is set to YES the Perl module output will be # nicely formatted so it can be parsed by a human reader. # This is useful # if you want to understand what is going on. # On the other hand, if this # tag is set to NO the size of the Perl module output will be much smaller # and Perl will parse it just the same. PERLMOD_PRETTY = YES # The names of the make variables in the generated doxyrules.make file # are prefixed with the string contained in PERLMOD_MAKEVAR_PREFIX. # This is useful so different doxyrules.make files included by the same # Makefile don't overwrite each other's variables. PERLMOD_MAKEVAR_PREFIX = #--------------------------------------------------------------------------- # Configuration options related to the preprocessor #--------------------------------------------------------------------------- # If the ENABLE_PREPROCESSING tag is set to YES (the default) Doxygen will # evaluate all C-preprocessor directives found in the sources and include # files. ENABLE_PREPROCESSING = YES # If the MACRO_EXPANSION tag is set to YES Doxygen will expand all macro # names in the source code. If set to NO (the default) only conditional # compilation will be performed. Macro expansion can be done in a controlled # way by setting EXPAND_ONLY_PREDEF to YES. MACRO_EXPANSION = NO # If the EXPAND_ONLY_PREDEF and MACRO_EXPANSION tags are both set to YES # then the macro expansion is limited to the macros specified with the # PREDEFINED and EXPAND_AS_DEFINED tags. EXPAND_ONLY_PREDEF = NO # If the SEARCH_INCLUDES tag is set to YES (the default) the includes files # pointed to by INCLUDE_PATH will be searched when a #include is found. SEARCH_INCLUDES = YES # The INCLUDE_PATH tag can be used to specify one or more directories that # contain include files that are not input files but should be processed by # the preprocessor. INCLUDE_PATH = # You can use the INCLUDE_FILE_PATTERNS tag to specify one or more wildcard # patterns (like *.h and *.hpp) to filter out the header-files in the # directories. If left blank, the patterns specified with FILE_PATTERNS will # be used. INCLUDE_FILE_PATTERNS = # The PREDEFINED tag can be used to specify one or more macro names that # are defined before the preprocessor is started (similar to the -D option of # gcc). The argument of the tag is a list of macros of the form: name # or name=definition (no spaces). If the definition and the = are # omitted =1 is assumed. To prevent a macro definition from being # undefined via #undef or recursively expanded use the := operator # instead of the = operator. PREDEFINED = # If the MACRO_EXPANSION and EXPAND_ONLY_PREDEF tags are set to YES then # this tag can be used to specify a list of macro names that should be expanded. # The macro definition that is found in the sources will be used. # Use the PREDEFINED tag if you want to use a different macro definition that # overrules the definition found in the source code. EXPAND_AS_DEFINED = # If the SKIP_FUNCTION_MACROS tag is set to YES (the default) then # doxygen's preprocessor will remove all references to function-like macros # that are alone on a line, have an all uppercase name, and do not end with a # semicolon, because these will confuse the parser if not removed. 
SKIP_FUNCTION_MACROS = YES #--------------------------------------------------------------------------- # Configuration::additions related to external references #--------------------------------------------------------------------------- # The TAGFILES option can be used to specify one or more tagfiles. # Optionally an initial location of the external documentation # can be added for each tagfile. The format of a tag file without # this location is as follows: # # TAGFILES = file1 file2 ... # Adding location for the tag files is done as follows: # # TAGFILES = file1=loc1 "file2 = loc2" ... # where "loc1" and "loc2" can be relative or absolute paths or # URLs. If a location is present for each tag, the installdox tool # does not have to be run to correct the links. # Note that each tag file must have a unique name # (where the name does NOT include the path) # If a tag file is not located in the directory in which doxygen # is run, you must also specify the path to the tagfile here. TAGFILES = # When a file name is specified after GENERATE_TAGFILE, doxygen will create # a tag file that is based on the input files it reads. GENERATE_TAGFILE = # If the ALLEXTERNALS tag is set to YES all external classes will be listed # in the class index. If set to NO only the inherited external classes # will be listed. ALLEXTERNALS = NO # If the EXTERNAL_GROUPS tag is set to YES all external groups will be listed # in the modules index. If set to NO, only the current project's groups will # be listed. EXTERNAL_GROUPS = YES # The PERL_PATH should be the absolute path and name of the perl script # interpreter (i.e. the result of `which perl'). PERL_PATH = /usr/bin/perl #--------------------------------------------------------------------------- # Configuration options related to the dot tool #--------------------------------------------------------------------------- # If the CLASS_DIAGRAMS tag is set to YES (the default) Doxygen will # generate a inheritance diagram (in HTML, RTF and LaTeX) for classes with base # or super classes. Setting the tag to NO turns the diagrams off. Note that # this option also works with HAVE_DOT disabled, but it is recommended to # install and use dot, since it yields more powerful graphs. CLASS_DIAGRAMS = YES # You can define message sequence charts within doxygen comments using the \msc # command. Doxygen will then run the mscgen tool (see # http://www.mcternan.me.uk/mscgen/) to produce the chart and insert it in the # documentation. The MSCGEN_PATH tag allows you to specify the directory where # the mscgen tool resides. If left empty the tool is assumed to be found in the # default search path. MSCGEN_PATH = # If set to YES, the inheritance and collaboration graphs will hide # inheritance and usage relations if the target is undocumented # or is not a class. HIDE_UNDOC_RELATIONS = YES # If you set the HAVE_DOT tag to YES then doxygen will assume the dot tool is # available from the path. This tool is part of Graphviz, a graph visualization # toolkit from AT&T and Lucent Bell Labs. The other options in this section # have no effect if this option is set to NO (the default) HAVE_DOT = NO # The DOT_NUM_THREADS specifies the number of dot invocations doxygen is # allowed to run in parallel. When set to 0 (the default) doxygen will # base this on the number of processors available in the system. You can set it # explicitly to a value larger than 0 to get control over the balance # between CPU load and processing speed. 
DOT_NUM_THREADS = 0 # By default doxygen will write a font called Helvetica to the output # directory and reference it in all dot files that doxygen generates. # When you want a differently looking font you can specify the font name # using DOT_FONTNAME. You need to make sure dot is able to find the font, # which can be done by putting it in a standard location or by setting the # DOTFONTPATH environment variable or by setting DOT_FONTPATH to the directory # containing the font. DOT_FONTNAME = Helvetica # The DOT_FONTSIZE tag can be used to set the size of the font of dot graphs. # The default size is 10pt. DOT_FONTSIZE = 10 # By default doxygen will tell dot to use the output directory to look for the # FreeSans.ttf font (which doxygen will put there itself). If you specify a # different font using DOT_FONTNAME you can set the path where dot # can find it using this tag. DOT_FONTPATH = # If the CLASS_GRAPH and HAVE_DOT tags are set to YES then doxygen # will generate a graph for each documented class showing the direct and # indirect inheritance relations. Setting this tag to YES will force the # the CLASS_DIAGRAMS tag to NO. CLASS_GRAPH = YES # If the COLLABORATION_GRAPH and HAVE_DOT tags are set to YES then doxygen # will generate a graph for each documented class showing the direct and # indirect implementation dependencies (inheritance, containment, and # class references variables) of the class with other documented classes. COLLABORATION_GRAPH = YES # If the GROUP_GRAPHS and HAVE_DOT tags are set to YES then doxygen # will generate a graph for groups, showing the direct groups dependencies GROUP_GRAPHS = YES # If the UML_LOOK tag is set to YES doxygen will generate inheritance and # collaboration diagrams in a style similar to the OMG's Unified Modeling # Language. UML_LOOK = NO # If set to YES, the inheritance and collaboration graphs will show the # relations between templates and their instances. TEMPLATE_RELATIONS = NO # If the ENABLE_PREPROCESSING, SEARCH_INCLUDES, INCLUDE_GRAPH, and HAVE_DOT # tags are set to YES then doxygen will generate a graph for each documented # file showing the direct and indirect include dependencies of the file with # other documented files. INCLUDE_GRAPH = YES # If the ENABLE_PREPROCESSING, SEARCH_INCLUDES, INCLUDED_BY_GRAPH, and # HAVE_DOT tags are set to YES then doxygen will generate a graph for each # documented header file showing the documented files that directly or # indirectly include this file. INCLUDED_BY_GRAPH = YES # If the CALL_GRAPH and HAVE_DOT options are set to YES then # doxygen will generate a call dependency graph for every global function # or class method. Note that enabling this option will significantly increase # the time of a run. So in most cases it will be better to enable call graphs # for selected functions only using the \callgraph command. CALL_GRAPH = NO # If the CALLER_GRAPH and HAVE_DOT tags are set to YES then # doxygen will generate a caller dependency graph for every global function # or class method. Note that enabling this option will significantly increase # the time of a run. So in most cases it will be better to enable caller # graphs for selected functions only using the \callergraph command. CALLER_GRAPH = NO # If the GRAPHICAL_HIERARCHY and HAVE_DOT tags are set to YES then doxygen # will generate a graphical hierarchy of all classes instead of a textual one. 
GRAPHICAL_HIERARCHY = YES # If the DIRECTORY_GRAPH, SHOW_DIRECTORIES and HAVE_DOT tags are set to YES # then doxygen will show the dependencies a directory has on other directories # in a graphical way. The dependency relations are determined by the #include # relations between the files in the directories. DIRECTORY_GRAPH = YES # The DOT_IMAGE_FORMAT tag can be used to set the image format of the images # generated by dot. Possible values are svg, png, jpg, or gif. # If left blank png will be used. DOT_IMAGE_FORMAT = png # The tag DOT_PATH can be used to specify the path where the dot tool can be # found. If left blank, it is assumed the dot tool can be found in the path. DOT_PATH = # The DOTFILE_DIRS tag can be used to specify one or more directories that # contain dot files that are included in the documentation (see the # \dotfile command). DOTFILE_DIRS = # The MSCFILE_DIRS tag can be used to specify one or more directories that # contain msc files that are included in the documentation (see the # \mscfile command). MSCFILE_DIRS = # The DOT_GRAPH_MAX_NODES tag can be used to set the maximum number of # nodes that will be shown in the graph. If the number of nodes in a graph # becomes larger than this value, doxygen will truncate the graph, which is # visualized by representing a node as a red box. Note that doxygen if the # number of direct children of the root node in a graph is already larger than # DOT_GRAPH_MAX_NODES then the graph will not be shown at all. Also note # that the size of a graph can be further restricted by MAX_DOT_GRAPH_DEPTH. DOT_GRAPH_MAX_NODES = 50 # The MAX_DOT_GRAPH_DEPTH tag can be used to set the maximum depth of the # graphs generated by dot. A depth value of 3 means that only nodes reachable # from the root by following a path via at most 3 edges will be shown. Nodes # that lay further from the root node will be omitted. Note that setting this # option to 1 or 2 may greatly reduce the computation time needed for large # code bases. Also note that the size of a graph can be further restricted by # DOT_GRAPH_MAX_NODES. Using a depth of 0 means no depth restriction. MAX_DOT_GRAPH_DEPTH = 0 # Set the DOT_TRANSPARENT tag to YES to generate images with a transparent # background. This is disabled by default, because dot on Windows does not # seem to support this out of the box. Warning: Depending on the platform used, # enabling this option may lead to badly anti-aliased labels on the edges of # a graph (i.e. they become hard to read). DOT_TRANSPARENT = NO # Set the DOT_MULTI_TARGETS tag to YES allow dot to generate multiple output # files in one run (i.e. multiple -o and -T options on the command line). This # makes dot run faster, but since only newer versions of dot (>1.8.10) # support this, this feature is disabled by default. DOT_MULTI_TARGETS = YES # If the GENERATE_LEGEND tag is set to YES (the default) Doxygen will # generate a legend page explaining the meaning of the various boxes and # arrows in the dot generated graphs. GENERATE_LEGEND = YES # If the DOT_CLEANUP tag is set to YES (the default) Doxygen will # remove the intermediate dot files that are used to generate # the various graphs. 
DOT_CLEANUP = YES
slowmovideo-0.5+git20180116/src/config.h.in0000664000000000000000000000033513151342440016637 0ustar rootroot
/*
 * compile time/platform defines
 */
#ifndef CONFIG_H
#define CONFIG_H

#cmakedefine USE_QTKIT
#cmakedefine USE_DBUS
#cmakedefine USE_FFMPEG

// OpenCV version in use
#cmakedefine HAS_OCV_VERSION_3

#endif // CONFIG_H
slowmovideo-0.5+git20180116/material/0000775000000000000000000000000013151342440015622 5ustar rootroot
slowmovideo-0.5+git20180116/material/shutterFunction.svg (SVG figure; only its text labels survive extraction: "dy", "1/fps", "t", "y", "0", "1", "x", "[s]")
slowmovideo-0.5+git20180116/material/Move.svg (SVG figure; only its text labels survive extraction: "10", "20", "30", "40")
slowmovideo-0.5+git20180116/material/slowmoIcon.blend (binary Blender project file, BLENDER-v261 format; raw data omitted)
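The #cmakedefine lines in src/config.h.in are placeholders that CMake's configure_file() resolves at build time: each one becomes a plain #define when the corresponding option is enabled and is commented out otherwise, producing the config.h that the sources include. The fragment below is only a minimal, hypothetical sketch of how such a generated header is typically consumed; the macro names are taken from config.h.in above, but the translation unit itself and its function are illustrative and not part of the slowmoVideo sources.

/* Hypothetical usage sketch (not part of the archive): the feature macros
 * are the ones declared via #cmakedefine in src/config.h.in; CMake turns
 * each into a real #define (or comments it out) when it generates config.h. */
#include <stdio.h>
#include "config.h"   /* assumed to be generated by CMake from src/config.h.in */

static void print_build_features(void)
{
#ifdef USE_FFMPEG
    puts("video I/O: FFmpeg support compiled in");
#else
    puts("video I/O: FFmpeg support disabled");
#endif

#ifdef HAS_OCV_VERSION_3
    puts("optical flow: built against the OpenCV 3 API");
#endif

#ifdef USE_QTKIT
    puts("capture: QTKit backend available (OS X)");
#endif
}

int main(void)
{
    print_build_features();
    return 0;
}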
`""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH$$%@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH$$%@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH$$%@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH$$% @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH@j$Fz$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH$$% @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHMM```Ih6EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH446 @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH`````N$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEH?h$?h$?h$`````M?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$BV6EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHR%```````````````````````````````````````````S$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHR%```````````````````````````````````````````S$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$?h$ @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEH```````````````````````````````````````````S$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH`````````````` @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ 
`""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEH```````````````````````````````````````````S$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH`````````````` @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEH```````````````````````````````````````````S$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH`````````````` @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEH```````````````````````````````````````````S$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH`````````````` @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEH```````````````````````````````````````````S$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH##% @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEH```````````````````````````````````````````VEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH##%@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHS$S$S$S$S$`````WS$S$S$S$S$S$S$S$S$S$S$S$S$S$S$S$S$S$S$S$S$S$S$S$S$S$S$S$S$S$S$S$Ll6EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH##%@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHY````WEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH##%@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%````[EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHLl6````]EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH`````N$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH`````N$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%````[EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH##%@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ 
`""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%````[EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH##%@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHLl6````]Ih6EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH##%@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH`````N$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHY````N$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%````[EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447 @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%````[EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH##% @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH`````Ih6EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH`````N$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHY````P$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH##% @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%````[EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ 
`""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHLl6````[EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH`````Ih6EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH##% @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH`````N$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%````TEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%````[EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH##% @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHLl6````]EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH`````N$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHY````N$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%````WEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%````[EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ 
`""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH````]EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447 @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH`````N$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHY````N$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%````WEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%E^6EEHEEHEEHEEHEEH000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHLl6````[EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHKk7```Ll6EEHEEHEEH447 @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH````]Ih6EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%```S$EEHEEHEEH@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH`````N$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHLl6```Lm6EEHEEH447 @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%````P$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHLl6S$Ll6EEHEEHEEH@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%````[EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH##% @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHLl6````[EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH`````Ih6EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH 
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHY````N$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%````P$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%````[EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH````[EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH`````Ih6EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHY````N$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%````TEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHLl6````[EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH````]EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447 !@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH`````N$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ 
`""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%````N$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%````WEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH##%000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHLl6````[EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH````]EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHY````N$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%````N$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%````WEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH````[EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHBV6EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH##%000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH````]EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHY``Ll6EEHEEHEEHEEHEEHEEHEEH447000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHS$S$Ll6EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH```S$EEHEEHEEHEEHEEHEEH##%@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHY``Ll6EEHEEHEEHEEH447@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ 
`""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH````He6EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHS$Lm6EEHEEHEEHEEH@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH````L$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%```L$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447 !@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%```S$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%```S$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH##%000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%```S$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447##%000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%```S$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447 !@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `""$""$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%```S$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH##%000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%```S$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH##%##% !@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `##%446EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%```S$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEHEEH##%##%000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `##%##%EEHEEHEEHEEHEEHEEHEEHEEHEEHEEHR%```S$EEHEEHEEHEEHEEHEEHEEHEEHEEHEEH447##%##%000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `##%##%##%*J```F##%##%##%##%@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ 
`000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ ` !@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ ` !@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ ` !@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ ` !@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ `000A@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ ` !! !!@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ ` !! !!@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ ` !! !!@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ ` ` `@@@000A!!!!!! !! 
[binary data omitted: embedded Blender scene file material/slowmoIcon.blend (slowmoVideo icon source); its contents are not representable as text]
otsizespotblendhaintatt1att2*curfalloffshadspotsizebiassoftcompressthreshpad5[3]bufsizesampbuffersfiltertypebufflagbuftyperay_sampray_sampyray_sampzray_samp_typearea_shapearea_sizearea_sizeyarea_sizezadapt_threshray_samp_methodtexactshadhalostepsun_effect_typeskyblendtypehorizon_brightnessspreadsun_brightnesssun_sizebackscattered_lightsun_intensityatm_turbidityatm_inscattering_factoratm_extinction_factoratm_distance_factorskyblendfacsky_exposuresky_colorspacepad4[6]*mtex[18]pr_texturepad6[4]densityemissionscatteringreflectionemission_col[3]transmission_col[3]reflection_col[3]density_scaledepth_cutoffasymmetrystepsize_typeshadeflagshade_typeprecache_resolutionstepsizems_diffms_intensityms_spreadalpha_blendface_orientationmaterial_typespecrspecgspecbmirrmirgmirbambrambbambgambemitangspectraray_mirroralpharefspeczoffsaddtranslucencyvolgamefresnel_mirfresnel_mir_ifresnel_trafresnel_tra_ifiltertx_limittx_falloffray_depthray_depth_traharseed1seed2gloss_mirgloss_trasamp_gloss_mirsamp_gloss_traadapt_thresh_miradapt_thresh_traaniso_gloss_mirdist_mirfadeto_mirshade_flagmode_lflarecstarclinecringchasizeflaresizesubsizeflarebooststrand_stastrand_endstrand_easestrand_surfnorstrand_minstrand_widthfadestrand_uvname[32]sbiaslbiasshad_alphaseptexrgbselpr_typepr_backpr_lampml_flagdiff_shaderspec_shaderroughnessrefracparam[4]rmsdarkness*ramp_col*ramp_specrampin_colrampin_specrampblend_colrampblend_specramp_showpad3rampfac_colrampfac_spec*groupfrictionfhreflectfhdistxyfrictdynamodesss_radius[3]sss_col[3]sss_errorsss_scalesss_iorsss_colfacsss_texfacsss_frontsss_backsss_flagsss_presetmapto_texturedshadowonly_flagindexgpumaterialname[256]*bbi1j1k1i2j2k2selcol1selcol2zquat[4]expxexpyexpzradrad2s*mat*imatelemsdisp*editelems**matflag2totcolwiresizerendersizethresh*lastelemvec[3][3]alfaweighth1h2f1f2f3hidevec[4]mat_nrpntsupntsvresoluresolvorderuordervflaguflagv*knotsu*knotsvtilt_interpradius_interpcharidxkernwhnurbs*keyindexshapenrnurb*editnurb*bevobj*taperobj*textoncurve*path*keybevdrawflagtwist_modetwist_smoothsmallcaps_scalepathlenbevresolwidthext1ext2resolu_renresolv_renactnu*lastselspacemodespacinglinedistshearfsizewordspaceulposulheightxofyoflinewidth*str*selboxes*editfontfamily[24]*vfont*vfontb*vfonti*vfontbisepcharctimetotboxactbox*tbselstartselend*strinfocurinfo*mface*mtface*tface*mvert*medge*dvert*mcol*msticky*texcomesh*mselect*edit_meshvdataedatafdatatotedgetotfacetotselectact_facesmoothreshsubdivsubdivrsubsurftypeeditflag*mr*tpageuv[4][2]col[4]transptileunwrapv1v2v3v4edcodecreasebweightdef_nr*dwtotweightco[3]no[3]uv[2]co[2]fis[256]totdisp(*disps)()v[4]midpad[2]v[2]*faces*colfaces*edges*vertslevelslevel_countcurrentnewlvledgelvlpinlvlrenderlvluse_col*edge_flags*edge_creasesstackindex*errormodifier*texture*map_objectuvlayer_name[32]uvlayer_tmptexmappingsubdivTyperenderLevels*emCache*mCachedefaxispad[6]lengthrandomizeseed*ob_arm*start_cap*end_cap*curve_ob*offset_oboffset[3]scale[3]merge_distfit_typeoffset_typecountaxistolerance*mirror_obsplit_anglevalueresval_flagslim_flagse_flagsbevel_angledefgrp_name[32]*domain*flow*colltimepad10strengthdirectionmidlevel*projectors[10]*imagenum_projectorsaspectxaspectyscalexscaleypercentfaceCountfacrepeat*objectcenterstartxstartyheightnarrowspeeddampfallofftimeoffslifetimedeformflagmulti*prevCossubtarget[32]parentinv[4][4]cent[3]*indexartotindexforce*clothObject*sim_parms*coll_parms*point_cacheptcaches*x*xnew*xold*current_xnew*current_x*current_v*mfacesnumvertsnumfacestime_xtime_xnew*bvhtree*v*dmcfraoperationvertextotinfluencegridsize*bindinfluences*bindoffsets*bindcagecos
totcagevert*dyngrid*dyninfluences*dynverts*pad2dyngridsizedyncellmin[3]dyncellwidthbindmat[4][4]*bindweights*bindcos(*bindfunc)()*psystotdmverttotdmedgetotdmfacepositionrandom_position*facepavgroupprotectlvlsculptlvltotlvlsimple*fss*target*auxTargetvgroup_name[32]keepDistshrinkTypeshrinkOptsprojAxissubsurfLevels*originfactorlimit[2]originOptsoffset_facoffset_fac_vgcrease_innercrease_outercrease_rimmat_ofsmat_ofs_rim*ob_axisstepsrender_stepsiterscrew_ofsangle*ocean*oceancacheresolutionspatial_sizewind_velocitysmallest_wavewave_alignmentwave_directionwave_scalechop_amountfoam_coveragebakestartbakeendcachepath[240]foamlayername[32]cachedgeometry_moderefreshrepeat_xrepeat_yfoam_fade*object_from*object_tofalloff_radiusedit_flagsdefault_weight*cmap_curveadd_thresholdrem_thresholdmask_constantmask_defgrp_name[32]mask_tex_use_channel*mask_texture*mask_tex_map_objmask_tex_mappingmask_tex_uvlayer_name[32]pad_i1defgrp_name_a[32]defgrp_name_b[32]default_weight_adefault_weight_bmix_modemix_setpad_c1[6]proximity_modeproximity_flags*proximity_ob_targetmin_distmax_distpad_s1*canvas*brush*lattpntswopntsuopntsvopntswtypeutypevtypewfufvfwdudvdw*def*latticedatalatmat[4][4]*editlattvec[8][3]*sculptpartypepar1par2par3parsubstr[32]*track*proxy*proxy_group*proxy_from*action*poselib*pose*gpdavs*mpathconstraintChannelseffectdefbasemodifiersrestore_mode*matbitsactcoldloc[3]orig[3]dsize[3]dscale[3]drot[3]dquat[4]rotAxis[3]drotAxis[3]rotAngledrotAngleobmat[4][4]constinv[4][4]imat_ren[4][4]laypad6colbitstransflagprotectflagtrackflagupflagnlaflagipoflagscaflagscavisflagpad5dupondupoffdupstadupendsfmassdampinginertiaformfactorrdampingmarginmax_velmin_velm_contactProcessingThresholdobstacleRadrotmodeboundtypecollision_boundtyperestrictflagdtempty_drawtypeempty_drawsizedupfacescapropsensorscontrollersactuatorsbbsize[3]actdefgameflaggameflag2*bsoftsoftflaganisotropicFriction[3]constraintsnlastripshooksparticlesystem*soft*dup_groupbody_typeshapeflag*fluidsimSettings*derivedDeform*derivedFinallastDataMaskcustomdata_maskstateinit_stategpulamppc_ids*duplilistima_ofs[2]curindexactiveoriglayomat[4][4]orco[3]no_drawanimateddeflectforcefieldshapetex_modekinkkink_axiszdirf_strengthf_dampf_flowf_sizef_powermaxdistmindistf_power_rmaxradminradpdef_damppdef_rdamppdef_permpdef_frictpdef_rfrictpdef_sticknessabsorptionpdef_sbdamppdef_sbiftpdef_sboftclump_facclump_powkink_freqkink_shapekink_ampfree_endtex_nabla*rngf_noiseweight[13]global_gravityrt[3]totdataframetotpointdata_types*data[8]*cur[8]extradatastepsimframestartframeendframeeditframelast_exactcompressionname[64]prev_name[64]info[64]path[240]*cached_framesmem_cache*edit(*free_edit)()linStiffangStiffvolumeviterationspiterationsditerationsciterationskSRHR_CLkSKHR_CLkSSHR_CLkSR_SPLT_CLkSK_SPLT_CLkSS_SPLT_CLkVCFkDPkDGkLFkPRkVCkDFkMTkCHRkKHRkSHRkAHRcollisionflagsnumclusteriterationsweldingtotspring*bpoint*bspringmsg_lockmsg_valuenodemassnamedVG_Mass[32]gravmediafrictrklimitphysics_speedgoalspringgoalfrictmingoalmaxgoaldefgoalvertgroupnamedVG_Softgoal[32]fuzzynessinspringinfrictnamedVG_Spring_K[32]efraintervallocalsolverflags**keystotpointkeysecondspringcolballballdampballstiffsbc_modeaeroedgeminloopsmaxloopschokesolver_IDplasticspringpreload*scratchshearstiffinpush*pointcache*effector_weightslcom[3]lrot[3][3]lscale[3][3]pad4[4]vel[3]*fmdshow_advancedoptionsresolutionxyzpreviewresxyzrealsizeguiDisplayModerenderDisplayModeviscosityValueviscosityModeviscosityExponentgrav[3]animStartanimEndbakeStartbakeEndgstarmaxRefineiniVelxiniVelyiniVelz*orgMesh*meshBBsurfdataPath[240]bbStart[3]bbSize[3]typ
eFlagsdomainNovecgenvolumeInitTypepartSlipValuegenerateTracersgenerateParticlessurfaceSmoothingsurfaceSubdivsparticleInfSizeparticleInfAlphafarFieldSize*meshVelocitiescpsTimeStartcpsTimeEndcpsQualityattractforceStrengthattractforceRadiusvelocityforceStrengthvelocityforceRadiuslastgoodframemistypehorrhorghorbzenrzengzenbfastcolexposureexprangelinfaclogfacgravityactivityBoxRadiusskytypeocclusionResphysicsEngineticratemaxlogicstepphysubstepmaxphystepmisimiststamistdistmisthistarrstargstarbstarkstarsizestarmindiststardiststarcolnoisedofstadofenddofmindofmaxaodistaodistfacaoenergyaobiasaomodeaosampaomixaocolorao_adapt_threshao_adapt_speed_facao_approx_errorao_approx_correctionao_indirect_energyao_env_energyao_pad2ao_indirect_bouncesao_padao_samp_methodao_gather_methodao_approx_passes*aosphere*aotablesselcolsxsy*lpFormat*lpParmscbFormatcbParmsfccTypefccHandlerdwKeyFrameEverydwQualitydwBytesPerSeconddwFlagsdwInterleaveEveryavicodecname[128]*cdParms*padcdSizeqtcodecname[128]codecTypecodecSpatialQualitycodeccodecFlagscolorDepthcodecTemporalQualityminSpatialQualityminTemporalQualitykeyFrameRatebitRateaudiocodecTypeaudioSampleRateaudioBitDepthaudioChannelsaudioCodecFlagsaudioBitRateaudio_codecvideo_bitrateaudio_bitrateaudio_mixrateaudio_channelsaudio_padaudio_volumegop_sizerc_min_raterc_max_raterc_buffer_sizemux_packet_sizemux_ratemixratemainspeed_of_sounddoppler_factordistance_model*mat_override*light_overridelay_zmasklayflagpassflagpass_xorimtypeplanesqualitycompressexr_codeccineon_flagcineon_whitecineon_blackcineon_gammajp2_flagim_format*avicodecdata*qtcodecdataqtcodecsettingsffcodecdatasubframepsfrapefraimagesframaptothreadsframelenblurfacedgeRedgeGedgeBfullscreenxplayyplayfreqplayattribframe_stepstereomodedimensionspresetmaximsizexschyschxpartsypartssubimtypedisplaymodescemoderaytrace_optionsraytrace_structureocrespad4alphamodeosafrs_secedgeintsafetyborderdisprectlayersactlaymblur_samplesxaspyaspfrs_sec_basegausscolor_mgt_flagpostgammaposthuepostsatdither_intensitybake_osabake_filterbake_modebake_flagbake_normal_spacebake_quad_splitbake_maxdistbake_biasdistbake_padpic[240]stampstamp_font_idstamp_udata[160]fg_stamp[4]bg_stamp[4]seq_prev_typeseq_rend_typeseq_flagpad5[5]simplify_flagsimplify_subsurfsimplify_shadowsamplessimplify_particlessimplify_aossscineonwhitecineonblackcineongammajp2_presetjp2_depthrpad3domeresdomemodedomeangledometiltdomeresbuf*dometextengine[32]particle_percsubsurf_maxshadbufsample_maxao_errortiltresbuf*warptextcol[3]cellsizecellheightagentmaxslopeagentmaxclimbagentheightagentradiusedgemaxlenedgemaxerrorregionminsizeregionmergesizevertsperpolydetailsampledistdetailsamplemaxerrorframingrt1rt2domestereoflageyeseparationrecastDatamatmodeobstacleSimulationlevelHeight*camera*paint_cursorpaint_cursor_col[4]paintseam_bleednormal_anglescreen_grab_size[2]*paintcursorinverttotrekeytotaddkeybrushtypebrush[7]emitterdistselectmodeedittypedraw_stepfade_framesname[36]mat[3][3]radial_symm[3]last_xlast_ylast_angledraw_anchoredanchored_sizeanchored_location[3]anchored_initial_mouse[2]draw_pressurepressure_valuespecial_rotation*vpaint_prev*wpaint_prev*vpaint*wpaintvgroup_weightcornertypeeditbutflagjointrilimitdegrturnextr_offsdoublimitnormalsizeautomergesegmentsringsverticesunwrapperuvcalc_radiusuvcalc_cubesizeuvcalc_marginuvcalc_mapdiruvcalc_mapalignuvcalc_flaguv_flaguv_selectmodeuv_padgpencil_flagsautoik_chainlenimapaintparticleproportional_sizeselect_threshclean_threshautokey_modeautokey_flagretopo_moderetopo_paint_toolline_divellipse_divretopo_hotspotmultires_subdiv_typeskgen_resolutionskgen
_threshold_internalskgen_threshold_externalskgen_length_ratioskgen_length_limitskgen_angle_limitskgen_correlation_limitskgen_symmetry_limitskgen_retarget_angle_weightskgen_retarget_length_weightskgen_retarget_distance_weightskgen_optionsskgen_postproskgen_postpro_passesskgen_subdivisions[3]skgen_multi_level*skgen_templatebone_sketchingbone_sketching_convertskgen_subdivision_numberskgen_retarget_optionsskgen_retarget_rollskgen_side_string[8]skgen_num_string[8]edge_modeedge_mode_live_unwrapsnap_modesnap_flagsnap_targetproportionalprop_modeproportional_objectspad[5]auto_normalizemultipaintsculpt_paint_settingssculpt_paint_unified_sizesculpt_paint_unified_unprojected_radiussculpt_paint_unified_alphatotobjtotlamptotobjseltotcurvetotmeshtotarmaturescale_lengthsystemsystem_rotationgravity[3]quick_cache_step*world*setbase*basact*obeditcursor[3]twcent[3]twmin[3]twmax[3]layactlay_updatedcustomdata_mask_modal*ed*toolsettings*statsaudiotransform_spaces*sound_scene*sound_scene_handle*sound_scrub_handle*speaker_handles*fps_info*theDagdagisvaliddagflagsactive_keyingsetkeyingsetsgmunitphysics_settings*clipcuserblendviewwinmat[4][4]viewmat[4][4]viewinv[4][4]persmat[4][4]persinv[4][4]viewmatob[4][4]persmatob[4][4]twmat[4][4]viewquat[4]zfaccamdxcamdypixsizecamzoomtwdrawflagis_persprflagviewlockperspclip[6][4]clip_local[6][4]*clipbb*localvd*ri*render_engine*depths*sms*smooth_timerlviewquat[4]lpersplviewgridviewtwangle[3]rot_anglerot_axis[3]pad2[4]regionbasespacetypeblockscaleblockhandler[8]bundle_sizebundle_drawtypelay_used*ob_centrebgpicbase*bgpicob_centre_bone[32]drawtypeob_centre_cursorscenelockaroundgridnearfarmodeselectgridlinesgridsubdivgridflagtwtypetwmodetwflagpad2[2]afterdraw_transpafterdraw_xrayafterdraw_xraytranspzbufxraypad3[2]*properties_storageverthormaskmin[2]max[2]minzoommaxzoomscrollscroll_uikeeptotkeepzoomkeepofsalignwinxwinyoldwinxoldwiny*tab_offsettab_numtab_currpt_maskv2d*adsghostCurvesautosnapcursorValmainbmainbomainbuserre_alignpreviewtexture_contextpathflagdataicon*pinid*texuserrender_sizechanshownzebrazoomtitle[32]dir[240]file[80]renamefile[80]renameedit[80]filter_glob[64]active_filesel_firstsel_lastsortdisplayf_fpfp_str[8]scroll_offset*params*files*folders_prev*folders_next*op*smoothscroll_timer*layoutrecentnrbookmarknrsystemnrtree*treestoresearch_string[32]search_tseoutlinevisstoreflagsearch_flags*cumapscopessample_line_histcursor[2]centxcentycurtilelockpindt_uvstickydt_uvstretch*texttopviewlinesmenunrlheightcwidthlinenrs_totleftshowlinenrstabnumbershowsyntaxline_hlightoverwritelive_editpix_per_linetxtscrolltxtbarwordwrapdopluginsfindstr[256]replacestr[256]margin_column*drawcache*py_draw*py_event*py_button*py_browsercallback*py_globaldictlastspacescriptname[256]scriptarg[256]*script*but_refs*arraycachescache_display*idaspectpadfmxmy*edittreetreetypetexfromshaderfromlinkdraglen_alloccursorscrollbackhistoryprompt[256]language[32]sel_startsel_endfilter[64]xlockofylockofuserpath_lengthloc[2]scalestabmat[4][4]unistabmat[4][4]filename[256]blf_iduifont_idr_to_lpointskerningitalicboldshadowshadxshadyshadowalphashadowcolorpaneltitlegrouplabelwidgetlabelwidgetpanelzoomminlabelcharsminwidgetcharscolumnspacetemplatespaceboxspacebuttonspacexbuttonspaceypanelspacepanelouteroutline[4]inner[4]inner_sel[4]item[4]text[4]text_sel[4]shadedshadetopshadedownalpha_checkinner_anim[4]inner_anim_sel[4]inner_key[4]inner_key_sel[4]inner_driven[4]inner_driven_sel[4]header[4]show_headerwcol_regularwcol_toolwcol_textwcol_radiowcol_optionwcol_togglewcol_numwcol_numsliderwcol_menuwcol_pulldownwcol_menu_backwcol_menu
_itemwcol_boxwcol_scrollwcol_progresswcol_list_itemwcol_statepaneliconfile[80]icon_alphaback[4]title[4]text_hi[4]header_title[4]header_text[4]header_text_hi[4]button[4]button_title[4]button_text[4]button_text_hi[4]list[4]list_title[4]list_text[4]list_text_hi[4]panel[4]panel_title[4]panel_text[4]panel_text_hi[4]shade1[4]shade2[4]hilite[4]grid[4]wire[4]select[4]lamp[4]speaker[4]active[4]group[4]group_active[4]transform[4]vertex[4]vertex_select[4]edge[4]edge_select[4]edge_seam[4]edge_sharp[4]edge_facesel[4]edge_crease[4]face[4]face_select[4]face_dot[4]extra_edge_len[4]extra_face_angle[4]extra_face_area[4]pad3[4]normal[4]vertex_normal[4]bone_solid[4]bone_pose[4]strip[4]strip_select[4]cframe[4]nurb_uline[4]nurb_vline[4]act_spline[4]nurb_sel_uline[4]nurb_sel_vline[4]lastsel_point[4]handle_free[4]handle_auto[4]handle_vect[4]handle_align[4]handle_auto_clamped[4]handle_sel_free[4]handle_sel_auto[4]handle_sel_vect[4]handle_sel_align[4]handle_sel_auto_clamped[4]ds_channel[4]ds_subchannel[4]console_output[4]console_input[4]console_info[4]console_error[4]console_cursor[4]vertex_sizeoutline_widthfacedot_sizenoodle_curvingsyntaxl[4]syntaxn[4]syntaxb[4]syntaxv[4]syntaxc[4]movie[4]image[4]scene[4]audio[4]effect[4]plugin[4]transition[4]meta[4]editmesh_active[4]handle_vertex[4]handle_vertex_select[4]handle_vertex_sizemarker_outline[4]marker[4]act_marker[4]sel_marker[4]dis_marker[4]lock_marker[4]bundle_solid[4]path_before[4]path_after[4]camera_path[4]hpad[7]preview_back[4]solid[4]tuitbutstv3dtfiletipotinfotacttnlatseqtimatexttoopsttimetnodetlogictuserpreftconsoletcliptarm[20]active_theme_areamodule[64]spec[4]dupflagsavetimetempdir[160]fontdir[160]renderdir[240]textudir[160]plugtexdir[160]plugseqdir[160]pythondir[160]sounddir[160]image_editor[240]anim_player[240]anim_player_presetv2d_min_gridsizetimecode_styleversionsdbl_click_timegameflagswheellinescrolluiflaglanguageuserprefviewzoommixbufsizeaudiodeviceaudiorateaudioformataudiochannelsdpiencodingtransoptsmenuthreshold1menuthreshold2themesuifontsuistyleskeymapsuser_keymapsaddonskeyconfigstr[64]undostepsundomemorygp_manhattendistgp_euclideandistgp_erasergp_settingstb_leftmousetb_rightmouselight[3]tw_hotspottw_flagtw_handlesizetw_sizetextimeouttexcollectratewmdrawmethoddragthresholdmemcachelimitprefetchframesframeserverportpad_rot_angleobcenter_diarvisizervibrightrecent_filessmooth_viewtxglreslimitcurssizecolor_picker_typeipo_newkeyhandles_newscrcastfpsscrcastwaitwidget_unitanisotropic_filterndof_sensitivityndof_flagversemaster[160]verseuser[160]glalphacliptext_renderpad9[3]coba_weightsculpt_paint_overlay_col[3]tweak_thresholdauthor[80]vertbaseedgebaseareabase*newsceneredraws_flagfulltempwiniddo_drawdo_refreshdo_draw_gesturedo_draw_paintcursordo_draw_dragswapmainwinsubwinactive*animtimer*contexthandler[8]*newvvec*v1*v2*typepanelname[64]tabname[64]drawname[64]ofsxofsysizexsizeylabelofsruntime_flagcontrolsnapsortorder*paneltab*activedatalist_scrolllist_sizelist_last_lenlist_grip_sizelist_search[64]*v3*v4*fullbutspacetypeheadertypespacedatahandlersactionzoneswinrctdrawrctswinidregiontypealignmentdo_draw_overlayuiblockspanels*headerstr*regiondatasubvstr[4]subversionpadsminversionminsubversionwinpos*curscreen*curscenefileflagsglobalfrevisionfilename[240]name[80]orig_widthorig_heightbottomrightxofsyofslift[3]gamma[3]gain[3]dir[160]tcbuild_size_flagsbuild_tc_flagsdonestartstillendstill*stripdata*crop*transform*color_balance*instance_private_data**current_private_data*tmpstartofsendofsmachinestartdispenddispsatmulhandsizeanim_preseekstreamindex*strip*scene_cameraeffect_
faderspeed_fader*seq1*seq2*seq3seqbase*sound*scene_soundpitchpanscenenrmulticam_sourcestrobe*effectdataanim_startofsanim_endofsblend_modeblend_opacity*oldbasep*parseq*seqbasepmetastack*act_seqact_imagedir[256]act_sounddir[256]over_ofsover_cfraover_flagover_borderedgeWidthforwardwipetypefMinifClampfBoostdDistdQualitybNoCompScalexIniScaleyInixIniyInirotIniinterpolationuniform_scale*frameMapglobalSpeedlastValidFramebuttypeuserjitstatotpartnormfacobfacrandfactexfacrandlifeforce[3]vectsizemaxlendefvec[3]mult[4]life[4]child[4]mat[4]texmapcurmultstaticstepomattimetexspeedtexflag2negvertgroup_vvgroupname[32]vgroupname_v[32]*keysminfacnrusedusedelem*poinresetdistlastval*makeyqualqual2targetName[32]toggleName[32]value[32]maxvalue[32]delaydurationmaterialName[32]damptimerpropname[32]matname[32]axisflagposechannel[32]constraint[32]*fromObjectsubject[32]body[32]otypepulsefreqtotlinks**linksleveltapjoyindexaxis_singleaxisfbuttonhathatfprecisionstr[128]*mynewinputstotslinks**slinksvalostate_mask*actframeProp[32]blendinpriorityend_resetstrideaxisstridelengthlayer_weightmin_gainmax_gainreference_distancemax_distancerolloff_factorcone_inner_anglecone_outer_anglecone_outer_gainsndnrsound3Dpad6[1]*melinVelocity[3]angVelocity[3]localflagdyn_operationforceloc[3]forcerot[3]pad1[3]linearvelocity[3]angularvelocity[3]*referenceminmaxrotdampminloc[3]maxloc[3]minrot[3]maxrot[3]matprop[32]butstabutenddistributionint_arg_1int_arg_2float_arg_1float_arg_2toPropName[32]*toObjectbodyTypefilename[64]loadaniname[64]int_argfloat_arg*subtargetfacingaxisvelocityaccelerationturnspeedupdateTime*navmeshgo*newpackedfileattenuationdistance*cache*waveform*playback_handle*lamprengobjectdupli_ofs[3]*propchildbaserollhead[3]tail[3]bone_mat[3][3]arm_head[3]arm_tail[3]arm_mat[4][4]arm_rollxwidthzwidthease1ease2rad_headrad_tailpad[1]bonebasechainbase*edbo*act_bone*act_edbone*sketchgevertdeformerlayer_usedlayer_protectedghostepghostsizeghosttypepathsizeghostsfghostefpathsfpathefpathbcpathac*pointsstart_frameend_frameghost_sfghost_efghost_bcghost_acghost_typeghost_stepghost_flagpath_typepath_steppath_viewflagpath_bakeflagpath_sfpath_efpath_bcpath_acikflagagrp_indexconstflagselectflagpad0[6]*bone*childiktreesiktree*custom*custom_txeul[3]chan_mat[4][4]pose_mat[4][4]pose_head[3]pose_tail[3]limitmin[3]limitmax[3]stiffness[3]ikstretchikrotweightiklinweight*tempchanbase*chanhashproxy_layerstride_offset[3]cyclic_offset[3]agroupsactive_groupiksolver*ikdata*ikparamproxy_act_bone[32]numiternumstepminstepmaxstepsolverfeedbackmaxveldampmaxdampepschannelscustomColcscurvesgroupsactive_markeridroot*source*filter_grpsearchstr[64]filterflagrenameIndexadstimeslide*grpname[30]ownspacetarspaceenforceheadtaillin_errorrot_error*tarmatrix[4][4]spacerotOrdertarnumtargetsiterationsrootbonemax_rootbone*poletarpolesubtarget[32]poleangleorientweightgrabtarget[3]numpointschainlenxzScaleModereserved1reserved2minmaxflagstuckcache[3]lockflagfollowflagvolmodeplaneorglengthbulgepivXpivYpivZaxXaxYaxZminLimit[6]maxLimit[6]extraFzinvmat[4][4]fromtomap[3]expofrom_min[3]from_max[3]to_min[3]to_max[3]rotAxiszminzmaxpad[9]track[24]channel[32]no_rot_axisstride_axiscurmodactstartactendactoffsstridelenblendoutstridechannel[32]offs_bone[32]hasinputhasoutputdatatypesockettypeis_copyexternal*new_sock*storagelimitlocxlocy*default_valuestack_indexstack_typeown_indexto_index*groupsock*linkns*rectxsizeysize*new_nodelastyoutputsminiwidthupdatelabel[32]custom1custom2custom3custom4need_execexec*threaddatatotrbutrprvr*block*typeinfo*fromnode*tonode*fromsock*tosocknodeslinksinitcur_indexnodetype*ex
ecdata(*progress)()(*stats_draw)()(*test_break)()*tbh*prh*sdhvalue[3]value[4]cyclicmoviesamplesmaxspeedminspeedcurvedpercentxpercentybokehgammaimage_in_widthimage_in_heightcenter_xcenter_yspinwrapsigma_colorsigma_spacehuet1t2t3fstrengthfalphakey[4]algorithmchannelx1x2y1y2fac_x1fac_x2fac_y1fac_y2colname[32]bktypepad_c1gamcono_zbuffstopmaxblurbthreshrotationpad_f1*dict*nodecolmodmixthresholdfadeangle_ofsmcjitprojfitslope[3]power[3]lift_lgg[3]gamma_inv[3]limchanunspilllimscaleuspillruspillguspillbtex_mappingcolor_mappingsun_direction[3]turbiditycolor_spacegradient_typecoloringmusgrave_typewave_typeshortymintablemaxtableext_in[2]ext_out[2]*curve*table*premultablepresetchanged_timestampcurrcliprcm[4]black[3]white[3]bwmul[3]sample[3]x_resolutiondata_r[256]data_g[256]data_b[256]data_luma[256]sample_fullsample_linesaccuracywavefrm_modewavefrm_alphawavefrm_yfacwavefrm_heightvecscope_alphavecscope_heightminmax[3][2]hist*waveform_1*waveform_2*waveform_3*vecscopewaveform_totoffset[2]clonemtex*icon_imbuficon_filepath[240]normal_weightob_modejittersmooth_stroke_radiussmooth_stroke_factorratergb[3]sculpt_planeplane_offsetsculpt_toolvertexpaint_toolimagepaint_toolpad3[5]autosmooth_factorcrease_pinch_factorplane_trimtexture_sample_biastexture_overlay_alphaunprojected_radiusadd_col[3]sub_col[3]active_rndactive_cloneactive_mask*layerstotlayermaxlayertotsize*pool*externalrot[4]ave[3]*groundwander[3]rest_lengthparticle_index[2]delete_flagnumparentpa[4]w[4]fuv[4]foffsetprev_state*hair*boiddietimenum_dmcachehair_indexalivespring_kplasticity_constantyield_ratioplasticity_balanceyield_balanceviscosity_omegaviscosity_betastiffness_kstiffness_knearrest_densitybuoyancyspring_frames*boids*fluiddistrphystypeavemodereacteventdrawdraw_asdraw_sizechildtyperen_assubframesdraw_colren_stephair_stepkeys_stepadapt_angleadapt_pixrotfromintegratorbb_alignbb_uv_splitbb_animbb_split_offsetbb_tiltbb_rand_tiltbb_offset[2]bb_size[2]bb_vel_headbb_vel_tailcolor_vec_maxsimplify_refsizesimplify_ratesimplify_transitionsimplify_viewporttimetweakcourant_targetjitfaceff_hairgrid_randps_offset[1]grid_reseffector_amounttime_flagtime_pad[3]partfactanfactanphasereactfacob_vel[3]avefacphasefacrandrotfacrandphasefacrandsizeacc[3]dragfacbrownfacrandlengthchild_nbrren_child_nbrparentschildsizechildrandsizechildradchildflatclumppowkink_flatkink_amp_clumprough1rough1_sizerough2rough2_sizerough2_thresrough_endrough_end_shapeclengthclength_thresparting_facparting_minparting_maxbranch_thresdraw_line[2]path_startpath_endtrail_countkeyed_loopsdupliweights*eff_group*dup_ob*bb_ob*pd2*part*particles**pathcache**childcachepathcachebufschildcachebufs*clmd*hair_in_dm*hair_out_dm*target_ob*latticetree_framebvhtree_framechild_seedtotunexisttotchildtotcachedtotchildcachetarget_psystotkeyedbakespacebb_uvname[3][32]vgroup[12]vg_negrt3*renderdata*effectors*fluid_springstot_fluidspringsalloc_fluidsprings*tree*pdd*franddt_frac_padCdisCvistructuralbendingmax_bendmax_structmax_shearavg_spring_lentimescaleeff_force_scaleeff_wind_scalesim_time_oldvelocity_smoothcollider_frictionstepsPerFrameprerollmaxspringlensolver_typevgroup_bendvgroup_massvgroup_structshapekey_restpresetsreset*collision_listepsilonself_frictionselfepsilonrepel_forcedistance_repelself_loop_countloop_countpressurethicknessstrokesframenum*actframegstepinfo[128]sbuffer_sizesbuffer_sflag*sbufferlistprintlevelstorelevel*reporttimer*windrawable*winactivewindowsinitializedfile_savedop_undo_depthoperatorsqueuereportsjobspaintcursorsdragskeyconfigs*defaultconf*addonconf*userconftimers*autosavetimer*ghostwingrabcurs
or*screen*newscreenscreenname[32]posxposywindowstatemonitorlastcursormodalcursoraddmousemove*eventstate*curswin*tweakdrawmethoddrawfail*drawdatamodalhandlerssubwindowsgestureidname[64]propvalueshiftctrlaltoskeykeymodifiermaptype*ptr*remove_item*add_itemitemsdiff_itemsspaceidregionidkmi_id(*poll)()*modal_itemsbasename[64]actkeymap*customdata*py_instance*reportsmacro*opm*edatainfluence*coefficientsarraysizepoly_orderamplitudephase_multiplierphase_offsetvalue_offsetmidvalbefore_modeafter_modebefore_cyclesafter_cyclesrectphasemodificationstep_size*rna_pathpchan_name[32]transChanidtypetargets[8]num_targetsvariablesexpression[256]*expr_compvec[2]*fptarray_indexcolor_modecolor[3]from[128]to[128]mappingsstrips*remapfcurvesstrip_timeblendmodeextendmode*speaker_handlegroup[64]groupmodekeyingflagpathstypeinfo[64]active_path*tmpactnla_tracks*actstripdriversoverridesact_blendmodeact_extendmodeact_influenceruleoptionsfear_factorsignal_idlook_aheadoloc[3]queue_sizewanderflee_distancehealthstate_idrulesconditionsactionsruleset_typerule_fuzzinesslast_state_idlanding_smoothnessbankingaggressionair_min_speedair_max_speedair_max_accair_max_aveair_personal_spaceland_jump_speedland_max_speedland_max_accland_max_aveland_personal_spaceland_stick_forcestates*smd*fluid_group*coll_group*wt*tex_wt*tex_shadow*shadowp0[3]p1[3]dxomegatempAmbbetares[3]amplifymaxresviewsettingsnoisediss_percentdiss_speedres_wt[3]dx_wtv3dnumcache_compcache_high_comp*point_cache[2]ptcaches[2]border_collisionstime_scalevorticityvelocity[2]vel_multivgrp_heat_scale[2]vgroup_flowvgroup_densityvgroup_heat*points_old*velmat_old[4][4]volume_maxvolume_mindistance_maxdistance_referencecone_angle_outercone_angle_innercone_volume_outerrender_flagbuild_size_flagbuild_tc_flagbuild_flaglastsize[2]tracking*tracking_contextproxytrack_preview_height*track_previewtrack_pos[2]track_disabled*markerslide_scale[2]error*intrinsicssensor_widthpixel_aspectfocalunitsprincipal[2]k3pos[2]pat_min[2]pat_max[2]search_min[2]search_max[2]markersnrlast_marker*markersbundle_pos[3]pat_flagsearch_flagframes_limitpattern_matchtrackerpyramid_levelsminimum_correlationdefault_trackerdefault_pyramid_levelsdefault_minimum_correlationdefault_pattern_sizedefault_search_sizedefault_frames_limitdefault_margindefault_pattern_matchkeyframe1keyframe2refine_camera_intrinsicsclean_framesclean_actionclean_errortot_trackact_trackmaxscale*rot_tracklocinfscaleinfrotinf*scaleibuflast_cameracamnr*camerasmessage[256]settingscameratracksreconstructionstabilization*act_track*brush_groupcurrent_frameformatdisp_typeimage_fileformateffect_uipreview_idinit_color_typepad_simage_resolutionsubstepsinit_color[4]*init_textureinit_layername[40]dry_speeddepth_clampdisp_factorspread_speedcolor_spread_speedshrink_speeddrip_veldrip_accwave_dampingwave_speedwave_timescalewave_springpad_image_output_path[240]output_name[40]output_name2[40]*pmdsurfacesactive_surerror[64]collisionwetnessparticle_radiusparticle_smoothpaint_distance*paint_ramp*vel_rampproximity_falloffray_dirwave_factorwave_clampmax_velocitysmudge_strengthTYPEcharucharshortushortintlongulongfloatdoublevoidLinkLinkDataListBasevec2svec2frctirctfIDPropertyDataIDPropertyIDLibraryFileDataPreviewImageIpoDriverObjectIpoCurveBPointBezTripleIpoKeyBlockKeyAnimDataTextLineTextMarkerTextPackedFileCameraImageUserSceneImageGPUTextureanimRenderResultMTexTexPluginTexCBDataColorBandEnvMapImBufPointDensityCurveMappingVoxelDataOceanTexbNodeTreeTexMappingColorMappingLampVolumeSettingsGameSettingsMaterialGroupVFontVFontDataMetaElemBoundBoxMetaBallNurbCharInfoTextBoxEditNurbG
HashCurvePathSelBoxEditFontMeshMFaceMTFaceTFaceMVertMEdgeMDeformVertMColMStickyMSelectEditMeshCustomDataMultiresMDeformWeightMTexPolyMLoopUVMLoopColMFloatPropertyMIntPropertyMStringPropertyOrigSpaceFaceMDispsMultiresColMultiresColFaceMultiresFaceMultiresEdgeMultiresLevelMRecastModifierDataMappingInfoModifierDataSubsurfModifierDataLatticeModifierDataCurveModifierDataBuildModifierDataMaskModifierDataArrayModifierDataMirrorModifierDataEdgeSplitModifierDataBevelModifierDataBMeshModifierDataSmokeModifierDataSmokeDomainSettingsSmokeFlowSettingsSmokeCollSettingsDisplaceModifierDataUVProjectModifierDataDecimateModifierDataSmoothModifierDataCastModifierDataWaveModifierDataArmatureModifierDataHookModifierDataSoftbodyModifierDataClothModifierDataClothClothSimSettingsClothCollSettingsPointCacheCollisionModifierDataBVHTreeSurfaceModifierDataDerivedMeshBVHTreeFromMeshBooleanModifierDataMDefInfluenceMDefCellMeshDeformModifierDataParticleSystemModifierDataParticleSystemParticleInstanceModifierDataExplodeModifierDataMultiresModifierDataFluidsimModifierDataFluidsimSettingsShrinkwrapModifierDataSimpleDeformModifierDataShapeKeyModifierDataSolidifyModifierDataScrewModifierDataOceanModifierDataOceanOceanCacheWarpModifierDataWeightVGEditModifierDataWeightVGMixModifierDataWeightVGProximityModifierDataDynamicPaintModifierDataDynamicPaintCanvasSettingsDynamicPaintBrushSettingsEditLattLatticebDeformGroupSculptSessionbActionbPosebGPdatabAnimVizSettingsbMotionPathBulletSoftBodyPartDeflectSoftBodyObHookDupliObjectRNGEffectorWeightsPTCacheExtraPTCacheMemPTCacheEditSBVertexBodyPointBodySpringSBScratchFluidVertexVelocityWorldBaseAviCodecDataQuicktimeCodecDataQuicktimeCodecSettingsFFMpegCodecDataAudioDataSceneRenderLayerImageFormatDataRenderDataRenderProfileGameDomeGameFramingRecastDataGameDataTimeMarkerPaintBrushImagePaintSettingsParticleBrushDataParticleEditSettingsTransformOrientationSculptVPaintToolSettingsbStatsUnitSettingsPhysicsSettingsEditingSceneStatsDagForestMovieClipBGpicMovieClipUserRegionView3DRenderInfoRenderEngineViewDepthsSmoothViewStorewmTimerView3DSpaceLinkView2DSpaceInfoSpaceIpobDopeSheetSpaceButsSpaceSeqFileSelectParamsSpaceFileFileListwmOperatorFileLayoutSpaceOopsTreeStoreTreeStoreElemSpaceImageScopesHistogramSpaceNlaSpaceTextScriptSpaceScriptSpaceTimeCacheSpaceTimeSpaceNodeSpaceLogicConsoleLineSpaceConsoleSpaceUserPrefSpaceClipMovieClipScopesuiFontuiFontStyleuiStyleuiWidgetColorsuiWidgetStateColorsuiPanelColorsThemeUIThemeSpaceThemeWireColorbThemebAddonSolidLightUserDefbScreenScrVertScrEdgePanelPanelTypeuiLayoutScrAreaSpaceTypeARegionARegionTypeFileGlobalStripElemStripCropStripTransformStripColorBalanceStripProxyStripPluginSeqSequencebSoundMetaStackWipeVarsGlowVarsTransformVarsSolidColorVarsSpeedControlVarsEffectBuildEffPartEffParticleWaveEffbPropertybNearSensorbMouseSensorbTouchSensorbKeyboardSensorbPropertySensorbActuatorSensorbDelaySensorbCollisionSensorbRadarSensorbRandomSensorbRaySensorbArmatureSensorbMessageSensorbSensorbControllerbJoystickSensorbExpressionContbPythonContbActuatorbAddObjectActuatorbActionActuatorSound3DbSoundActuatorbEditObjectActuatorbSceneActuatorbPropertyActuatorbObjectActuatorbIpoActuatorbCameraActuatorbConstraintActuatorbGroupActuatorbRandomActuatorbMessageActuatorbGameActuatorbVisibilityActuatorbTwoDFilterActuatorbParentActuatorbStateActuatorbArmatureActuatorbSteeringActuatorGroupObjectBonebArmaturebMotionPathVertbPoseChannelbIKParambItascbActionGroupSpaceActionbActionChannelbConstraintChannelbConstraintbConstraintTargetbPythonConstraintbKinematicConstraintbSplineIKConstrai
ntbTrackToConstraintbRotateLikeConstraintbLocateLikeConstraintbSizeLikeConstraintbSameVolumeConstraintbTransLikeConstraintbMinMaxConstraintbActionConstraintbLockTrackConstraintbDampTrackConstraintbFollowPathConstraintbStretchToConstraintbRigidBodyJointConstraintbClampToConstraintbChildOfConstraintbTransformConstraintbPivotConstraintbLocLimitConstraintbRotLimitConstraintbSizeLimitConstraintbDistLimitConstraintbShrinkwrapConstraintbFollowTrackConstraintbCameraSolverConstraintbActionModifierbActionStripbNodeStackbNodeSocketbNodeLinkbNodePreviewbNodeuiBlockbNodeTypebNodeTreeExecbNodeSocketValueIntbNodeSocketValueFloatbNodeSocketValueBooleanbNodeSocketValueVectorbNodeSocketValueRGBANodeImageAnimNodeBlurDataNodeDBlurDataNodeBilateralBlurDataNodeHueSatNodeImageFileNodeChromaNodeTwoXYsNodeTwoFloatsNodeGeometryNodeVertexColNodeDefocusNodeScriptDictNodeGlareNodeTonemapNodeLensDistNodeColorBalanceNodeColorspillNodeTexBaseNodeTexSkyNodeTexImageNodeTexEnvironmentNodeTexGradientNodeTexNoiseNodeTexVoronoiNodeTexMusgraveNodeTexWaveNodeTexMagicNodeShaderAttributeTexNodeOutputCurveMapPointCurveMapBrushCloneCustomDataLayerCustomDataExternalHairKeyParticleKeyBoidParticleBoidDataParticleSpringChildParticleParticleTargetParticleDupliWeightParticleDataSPHFluidSettingsParticleSettingsBoidSettingsParticleCacheKeyKDTreeParticleDrawDataLinkNodebGPDspointbGPDstrokebGPDframebGPDlayerReportListwmWindowManagerwmWindowwmKeyConfigwmEventwmSubWindowwmGesturewmKeyMapItemPointerRNAwmKeyMapDiffItemwmKeyMapwmOperatorTypeFModifierFMod_GeneratorFMod_FunctionGeneratorFCM_EnvelopeDataFMod_EnvelopeFMod_CyclesFMod_PythonFMod_LimitsFMod_NoiseFMod_SteppedDriverTargetDriverVarChannelDriverFPointFCurveAnimMapPairAnimMapperNlaStripNlaTrackKS_PathKeyingSetAnimOverrideIdAdtTemplateBoidRuleBoidRuleGoalAvoidBoidRuleAvoidCollisionBoidRuleFollowLeaderBoidRuleAverageSpeedBoidRuleFightBoidStateFLUID_3DWTURBULENCESpeakerMovieClipProxyMovieClipCacheMovieTrackingMovieTrackingTrackMovieTrackingMarkerMovieReconstructedCameraMovieTrackingCameraMovieTrackingSettingsMovieTrackingStabilizationMovieTrackingReconstructionMovieTrackingStatsDynamicPaintSurfacePaintSurfaceDataTLEN `HH(p$8p`(0((xxh@(P8X0hXhP 0@ ( @ @Phx``XXp8XxP0x`0phX`Pphhp8h0xxH  (@H X@8`0X`8 Phx@8@p`h(!x@0H(h 8  P8 (X( X,    H@@00Hh(H,(lHH`h<PPh HXPpT `88pX0(((xHX8XPx8000(HH008hp`88(H08( ,@  `@ 8H88@( <h (((x x8(h(hp P8P@hH@H088STRC                  !"#$%&'()*+,-./01+,23456  789:; <=+>?#@:,ABC DEFG HIJK: LMNOP  QRS! !!TUVW XYZ"[X\ ] ^ _`a bcde fg#hi $HjklmnopqrstuMvwx%&yz{|}~#C'! ()**#+A,x6-   #.@=/"=.0'1lm2      /34 5#,@H !"#$C%&'()*+,-./0123456789:;<=>?~{|}@AW%B6CM'-D/0E2F4G5HIx7 JK#LMN*8 /O PQRS9JHTUVWXYZ[\]^_`3aClmbcdefghijklmnopqrstuvWwxyz{|}~M+I6C:;W<~H:;TwI//+6CM=C    > ?#@@@A BHA ! " #M<$%&#JK'()@* +,-4./01234,-03#CCC536789:;<=>?./@ABDC5#CEDEF FGGHZHDHA I "FJKLMMINO<$ PJKQRSTU#&VWXY89Z[\ ]]@^_`abcdefghiJjKkl>m>n>o>pqrstEuvwDxDyL&HAMO<$MzN{O|P}Q~RSTLUVWWW3JKQ&XO TM5QY-RYPSZ'T[\#TU N'T]^_`aZbcbd5ef ffdce3#PX PWWgh hhT#&yih,jhX  khlhmhUnhETo hXphqhXr h#Xsh#thuvwx h,y h'X#zh{h|hh}h,#WW~hC h hh&y      hPPPPPPMhPPh#-| h!T#"#$3%&'()* +,-./ 0h12346h*656h789h:;<=Xh>  h?@ABCDEF hGAHITJ#h h|KLMNOPQ hRSTUVWhXYZ[\]^_`abcdefghijkhl#h,mn3aohpq3rstuvw,xyz{|h}~uvw,xyz{|huvw,xyz{h#HZH67C#MORExA#H$MA    "  T<$&JKrj         F=H   $L  * L  (    !"#$%&'()*+,,-.=/012  345 6 7 89:;<=>4 ?6@ABCD EF G4HIJKLMNOPQRSTUVWXYZ[\]^_`abc64def#ghijklmnopqrstuvwxy}z{|}~O 2LLM3#NHUwCTM+I6C      #   C !W"#$%&'()X*+,-. 
/0123#JC <4=567#89 :;<=>?@ABCxjDEFGH}zIJKLMNOPQRSTUVWXYZ[jh\]^_`;:a<bcTdeWfghijklmn opqrstuvwxyz{|}~C" WCTWC"WCWCTUVWXZTW3@ X#W h9 6&yX##"R S9      !"#$%&'()*+W,-./012343567896&.H:&; <=>?@ABCDEI6CFGHI b J K L M N OPQRS TUVWX'%BXYfghZ[#)\]^_`abcd[efghijkSlmn[opAqrstuvwxyz{|}~3 d[n[ rC%n?    "=  x  T s# N  fg[#   C   '%B3fg#  # "X        X  XW     NWfg6C6 S !  #"Q# # $ %&'() #* fg+,-XT[./0W#123456# 5789:;<=>?@@ABCDEFGHIJKLM# NOPQRSTUVW XYZ[\]Z# ^_# `abcdefghijklmno p qrs# otuRv^wxyz{|}~ #                   ##4[     $       !"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGH/IJKL M N O &y&PQWRSTUVWXYZ[\#] ^_` abcm#defghijkl#mnopq rstuvwbcxyz1{|#V#d }  ~ bjkUZSd   ~ b&# #!)<"""6! #      $0$$  U}"M&y)#D$$$ % J &&& $   $#'W()*#+X,,,6---6}.-..tVOA# % "   / =000W   B 111 2[  #3WC4< [#5     6#  7#8  #9  TC:W; <  TW =  >  ???! " # $ WC @% *& ' #A ( ) * + , - . B/ C"T@@@@0 1 $ ! 2 C D% ?3  4 5 E#*F6 V7 8 9 : ; #< = G> ? @ A B C D E H F WCJ%GG gH I *LI J K L M JW#&yK# *L ! N O SP Q R S M V7 WCgN*T U WCO TV #W X Y Z [ P VS=\ ] Q^ _ ` a b  Rc d  e WC SVf g TUZh i "V*WX  -?j Y (k [l m n o ?p D DDD0 ! q  *% # #r MJs > ? t X# u v w ZZZ* x = y z [[[{ [$ | } ~       [-       \H [  #            ]]                  ^(^^{   [ ^$^  ^ J #           G # Wr     _ `  .          aaa  @   b  # =   #b   T ccca M SdddM eee    #M  f ff    g"{   h      -  T[i     j   X#k  l  m  nJo p   | WC q  |UVT U #6 r  s #t |K u     v           #WCw %x # y          z  {  ! " %|  ! " %}  ! " %~ [dT#?[CE# X$ #X# %  & *T' ( M6 UV) * + , 08 - . /  4T U 0 1 2 3 4 5 6  7 8 W9 : ; < =  u > ? @ A B C D E #%F & G <  1 H $ 7 9 : WI J K L M N O P Q R S T U V W X Y Z [ #6H \ ] ^ _ J ` 1 H a b c d e f g T U T U Sh T U i {} j k #jkl m n Ao p q jr s t u  v w t Wx Uy #z { U#|   D}z } ~  b                l     0    <UW h       |s      #          7 8 <  < #< #< #<< #< #< #<#@  4       3 =                T               #' #" 3 + 1   Z h _               |   Z W    #  -#   8 6         6*T * 6    O     h               H  6 ! w" # $ % & ' ( ) * + , - . / 0 "1 2 3 4 5 6 7 8 9 : ; < = > ? @ A VB C D E F G H I J K L M N O P Q R S T hU V W X Y Z [ \ ] ^ _ ` a *()b c d e f g h i j k l m n o p q r s t u += v =w x y MFz 4{ |  F G} ~    $                        %u q  8 a      r   sop   X         S       X  =       {   Y  o     #            T #    jk  #   C     ~          #              d     S   @ }z8 -    T    T U 013 T U     "{  #h #  |     ! @" # : $ % & : ' #a ; /( 3:)  * + , - ? .  / 6 0 1 @ 2 UV) * 08 - 3 4 W 5 C /  @ 6 ! 7 # ) 8  9 @: 8 ;  )  < 0 = > ? 
slowmovideo-0.5+git20180116/.travis.yml0000664000000000000000000000115413151342440016136 0ustar rootroot# # Copyright (c) 2016 # travis CI config for slowmovideo check/test # for Ubuntu 14.04 Trusty sudo: required dist: trusty cache: apt # Enable C++ support language: cpp # Compiler selection compiler: - gcc env: - OpenCV=2 - OpenCV=3 before_install: - export DISPLAY=:99.0 - sh -e /etc/init.d/xvfb start # install necessary Qt files install: - sudo apt-get install -y qttools5-dev-tools libopencv-dev qtbase5-dev qtscript5-dev qtdeclarative5-dev before_script: - mkdir build - cd build - cmake ../src -DENABLE_TESTS=TRUE # Build steps script: - make - ./slowmoVideo/unittests/UnitTests
slowmovideo-0.5+git20180116/flowScripts/0000775000000000000000000000000013151342440016343 5ustar rootroot
slowmovideo-0.5+git20180116/flowScripts/buildFlow.py0000775000000000000000000000653613151342440020661 0ustar rootroot#!/usr/bin/env python3 import argparse import os.path import signal from naming import * parser = argparse.ArgumentParser(description="Note that this is a rather old (read: deprecated) sample script. \ It simply builds flow files for all files in the input directory.") parser.add_argument("-i", "--input", dest="inDir", required=True, help="Input Directory", metavar="DIR") parser.add_argument("-o", "--output", dest="outDir", required=True, help="Output Directory", metavar="DIR") parser.add_argument("--flow", dest="flowExecutable", required=True, help="Executable for optical flow") parser.add_argument("--forward-only", action="store_true", dest="forwardOnly", help="Calculate forward flow only") parser.add_argument("--backward-only", action="store_true", dest="backwardOnly", help="Calculate backward flow only") parser.add_argument("--force-rebuild", action="store_true", dest="forceRebuild", help="Force rebuild of existing flow images") parser.add_argument("--offset", dest="offset", type=int, default=0, help="Frame offset") parser.add_argument("--lambda", dest="lambdaValue", type=float, default=10, help="V3D lambda value") args = parser.parse_args() def handler(signum, frame) : print("Signal %s received at frame %s. Terminating." % (signum, frame)) exit(signum) signal.signal(signal.SIGINT, handler) if not os.path.exists(args.inDir) : print("Input directory does not exist.") exit(-2) if not os.path.isdir(args.inDir) : print("Input directory is not a directory.") exit(-2) args.outDir = os.path.abspath(args.outDir) print("Output files go to %s." % args.outDir) files = os.listdir(args.inDir) files.sort() if not os.path.exists(args.outDir) : print("Output directory does not exist.
Creating it.") os.makedirs(args.outDir) prev = None for s in files : if frameID(s) != None and int(frameID(s)) >= args.offset : if prev != None : leftFile = args.inDir + os.sep + prev rightFile = args.inDir + os.sep + s if not args.backwardOnly : outFile = args.outDir + os.sep + nameForwardFlow(prev, s, args.lambdaValue) if not os.path.exists(outFile) or args.forceRebuild : cmd = "%s %s %s %s 10 100" % (args.flowExecutable, leftFile, rightFile, outFile) ret = os.system(cmd) print("%s: Returned %s" % (outFile, ret)) if ret == 2 : print("SIGINT received, terminating.") exit(2) elif ret == 65024 : print("Environment variable not set, terminating") exit(65024) if not args.forwardOnly : outFile = args.outDir + os.sep + nameBackwardFlow(prev, s, args.lambdaValue) if (not os.path.exists(outFile)) or args.forceRebuild : cmd = "%s %s %s %s 10 100" % (args.flowExecutable, rightFile, leftFile, outFile) ret = os.system(cmd) print("%s: Returned %s" % (outFile, ret)) if ret == 2 : print("SIGINT received, terminating.") exit(2) elif ret == 65024 : print("Environment variable not set, terminating") exit(65024) prev = s slowmovideo-0.5+git20180116/flowScripts/buildSlowmo.py0000775000000000000000000000465013151342440021225 0ustar rootroot#!/usr/bin/env python3 import os import signal import argparse from naming import * parser = argparse.ArgumentParser(description="Deprecated. Used to render output frames. Executable not available anymore.") parser.add_argument("-i", "--input", dest="inDir", required=True, help="Input Directory containing the video's frames", metavar="DIR") parser.add_argument("-f", "--flow", dest="flowDir", required=True, help="Input Directory containing the optical flow frames", metavar="DIR") parser.add_argument("-o", "--output", dest="outDir", required=True, help="Output Directory", metavar="DIR") parser.add_argument("--slowmo", dest="slowmoExecutable", required=True, help="Executable for slowmoVideo") parser.add_argument("--oneway", dest="oneway", action="store_true", help="Use forward flow only") parser.add_argument("--offset", dest="offset", type=int, default=0, help="Frame offset") parser.add_argument("--lambda", dest="lambdaValue", required=True, type=float, default=10, help="V3D lambda value") parser.add_argument("--framerate", dest="framerate", type=float, default=30, help="Output frame rate") args = parser.parse_args() def handler(signum, frame) : print("Received signal %s at frame %s. Terminating." 
% (signum, frame)) exit(signum) signal.signal(signal.SIGINT, handler) if not os.path.exists(args.outDir) : os.makedirs(args.outDir) frames = os.listdir(args.inDir) frames.sort() counter = 0 prev = None for frame in frames : if frameID(frame) != None and int(frameID(frame)) >= args.offset : if prev != None : counter = int(frameID(frame))*args.framerate leftFrame = args.inDir + os.sep + prev rightFrame = args.inDir + os.sep + frame forwardFlow = args.flowDir + os.sep + nameForwardFlow(prev, frame, args.lambdaValue) backwardFlow = args.flowDir + os.sep + nameBackwardFlow(prev, frame, args.lambdaValue) if args.oneway : cmd = "%s forward %s %s %s/f%%1.png %s %s" % (args.slowmoExecutable, leftFrame, forwardFlow, args.outDir, counter, args.framerate) else : cmd = "%s twoway %s %s %s %s %s/f%%1.png %s %s" % (args.slowmoExecutable, leftFrame, rightFrame, forwardFlow, backwardFlow, args.outDir, counter, args.framerate) print("Command: %s" % cmd) ret = os.system(cmd) if ret != 0 : exit(ret) prev = frame
slowmovideo-0.5+git20180116/flowScripts/naming.py0000775000000000000000000000104713151342440020173 0ustar rootroot#!/usr/bin/env python3 import re pattern = re.compile('\D+(?P<name>\d+)\D+') def frameID(name) : o = pattern.match(name) if o == None : print("No number found in %s." % name) return None else : return o.group('name') def nameForwardFlow(left, right, lambdaValue) : return "forward-lambda{:.2f}_{}-{}.sVflow".format(lambdaValue, frameID(left), frameID(right)) def nameBackwardFlow(left, right, lambdaValue) : return "backward-lambda{:.2f}_{}-{}.sVflow".format(lambdaValue, frameID(right), frameID(left))
slowmovideo-0.5+git20180116/README.osx.md0000664000000000000000000000312213151342440016111 0ustar rootroot### Building for MacOS Here I will describe how to build slowmoVideo for OSX from scratch. You will need of course *Xcode* and the *command line tools*, together with *cmake*, and some dependencies: * glew (glew-1.10.0) * ffmpeg (ffmpeg-2.2) * jpeg (jpeg-9a) * libpng (libpng-1.6.10) * zlib (zlib-1.2.8) * yasm (for ffmpeg) (yasm-1.2.0) * opencv (opencv-2.4.8) * qt4 (qt 4.8.5) * x264 for ffmpeg
1- You need to specify where cmake can find some libraries: ```export QTDIR=/Users/val/Documents/Sources/qt4 export FFMPEGDIR=/Users/val/Documents/Sources/ffmpeg ```
2- Run cmake: ``` cmake ../slowmoVideo/src -DCMAKE_INSTALL_PREFIX=/Users/val/Applications/slowmoVideo -DQTDIR=/Users/val/Documents/Sources/qt4 -DQT_MAKE_EXECUTABLE=/Users/val/Documents/Sources/qt4/bin/qmake -DOpenCV_DIR=/Users/val/Documents/Sources/opencv/share/OpenCV -DGLEW_INCLUDE_DIR=/Users/val/Documents/Sources/slowlib/include -DGLEW_LIBRAIRIES=/Users/val/Documents/Sources/slowlib/lib/libGLEW.a -DJPEG_INCLUDE_DIR=/Users/val/Documents/Sources/slowlib/include -DJPEG_LIBRARY=/Users/val/Documents/Sources/slowlib/lib/libjpeg.a -DFFMPEG_LIBRARY_DIR=/Users/val/Documents/Sources/ffmpeg/lib -DFFMPEG_INCLUDE_PATHS="/Users/val/Documents/Sources/ffmpeg/include" ``` Check whether cmake finds all the needed parts. In my case some libraries were not found, so I had to specify them in CMakeCache.txt directly … they were: the glew libraries and libswscale!
* If all is OK you can run `make ; make install`. You will get some warnings during compilation… You should now have a working GUI application bundle for MacOS in your install target directory.
slowmovideo-0.5+git20180116/README.md0000664000000000000000000000501413151342440015303 0ustar rootrootslowmoVideo =========== Hello!
This is a short introduction for you if you want to: - compile - develop - translate slowmoVideo. For everything else please go to the [web page](http://slowmoVideo.granjow.net) or the [Google+ group](https://plus.google.com/communities/116570263544012246711).
Building --------
### Building for Linux See [our wiki](https://github.com/slowmoVideo/slowmoVideo/wiki/Download) for build instructions. Or see the (outdated) http://slowmovideo.granjow.net/download.php
### Building for Windows Compiling slowmoVideo for Windows using MXE on Linux: 1. Get mxe _not_ from http://mxe.cc/ BUT, as long as OpenCV is not in the official branch, from https://github.com/Granjow/mxe/tree/opencv (Changes by Christian Frisson) 2. Build opencv, qt, ffmpeg and copy the fixed CMake file (avoids library names like liblibjasper) with: `$ cp replaceOnTime/OpenCVConfig.cmake usr/i686-pc-mingw32/` 3. Run cmake for slowmoVideo, but now give a toolchain file: `cmake .. -DCMAKE_TOOLCHAIN_FILE=/PATH_TO_MXE/usr/i686-pc-mingw32/share/cmake/mxe-conf.cmake` 4. Compile!
### Building for MacOS Take a look at README.osx.md for more detailed instructions.
#### Notes In addition to slowmoVideo, ffmpeg.exe (32-bit build, static) is required. Download it from http://ffmpeg.zeranoe.com/builds/ and put it into the same directory as slowmoUI.exe.
Translating -----------
For this you should be in the slowmoVideo subdirectory which contains the tr/ directory. The tools (`linguist`, `lupdate`, `lrelease`) are available in the `qt4-dev-tools` package for Debian-based systems.
### Adding your language To add your language xx (like fr, it), run the following command to generate the respective .ts file: lupdate . -ts tr/slowmoVideo_xx.ts After this you can start translating. To make slowmoVideo actually use the translation, add this entry to `slowmoUI/resources.qrc` (see the sketch at the end of this file): `<file>../tr/slowmoVideo_xx.qm</file>`
### Translation First, run `lupdate` to get the newest strings to translate from the code. (Otherwise you might be translating something that does not even exist anymore.) Then the .ts file can be translated, preferably with Qt’s Linguist, or with any other translation tool you like. Finally, to see your translation “in action”, release the .ts file (this creates a .qm file). lupdate src/ -ts tr/slowmoVideo_xx.ts linguist tr/slowmoVideo_xx.ts lrelease tr/slowmoVideo_xx.ts Now you can push your `.ts` file to git.
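For context, here is a minimal, hypothetical sketch of how that entry could sit inside `slowmoUI/resources.qrc`. The surrounding structure (the `RCC`/`qresource` wrapper and the pre-existing entries) is assumed for illustration only and is not copied from the project's actual file; the only line the instructions above ask you to add is the `<file>` entry for your language xx.

```xml
<!-- Hypothetical sketch of slowmoUI/resources.qrc; structure assumed, not taken from the repository -->
<RCC>
    <qresource prefix="/">
        <!-- ... existing icons and other resources stay untouched ... -->
        <file>../tr/slowmoVideo_de.qm</file>  <!-- example of an already existing translation (assumed) -->
        <file>../tr/slowmoVideo_xx.qm</file>  <!-- the entry you add for your new language xx -->
    </qresource>
</RCC>
```

Note that the referenced `.qm` file must exist (run `lrelease` as described above) before the resources are compiled, otherwise the resource compiler will complain about the missing file.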
slowmovideo-0.5+git20180116/img/0000775000000000000000000000000013151342440014600 5ustar rootroot
slowmovideo-0.5+git20180116/img/move0.png0000664000000000000000000001002013151342440016325 0ustar rootroot
slowmovideo-0.5+git20180116/img/move1.png0000664000000000000000000001010413151342440016331 0ustar rootroot
slowmovideo-0.5+git20180116/img/1rect0.png0000664000000000000000000000216213151342440016405 0ustar rootroot
slowmovideo-0.5+git20180116/img/1rect1.png0000664000000000000000000000216313151342440016407 0ustar rootroot
slowmovideo-0.5+git20180116/todo.org0000664000000000000000000000134313151342440015503 0ustar rootroot#+STARTUP: showall #+STARTUP: nohideblocks * UI [1/3] - [X] Always display some units (user feedback when scrolling) - [ ] Customize grid: Replay speed (e.g. 1x, 0.5x, etc.) - [ ] Help texts - [ ] Lock mode for replay speed - [ ] Windows: Note that flowBuilder is not available - [ ] Remove Bézier interpolation * Flow Editor [/] - [ ] Add pipet tool for inpainting - [ ] Alphabetic loading: Change file naming (%03d instead of %d e.g.) * File loading [0/2] - [ ] Skip defect image files - [ ] Maybe: Run a checker thread (image size, validity) in the background - [ ] Project loading speed for big files * General [0/1] - [ ] Ensure slowmoRenderer terminates * Windows [0/1] - [ ] Rendering: Check if directory exists (/tmp/...)
؛{y9:wo7k.Gg74׿?{4NlZG<{o2=~y/ԟO6/O=kW;G>Vcqxmw|ݙ{ ,u'lw`ܟ{V]E,>Xn]Z煋ud!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d,>Xi۵̽kgsOتOy"/{:L9/\w3rÂ`MOƽgsOya64>ӹVij޸u/4Ƿ_]se_B !X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@?g[K7/.IENDB`slowmovideo-0.5+git20180116/img/1rect1.png0000664000000000000000000000216313151342440016407 0ustar rootrootPNG  IHDR,RUsBIT|d pHYs B(xtEXtSoftwarewww.inkscape.org<IDATx1U3$DebBk$ p !.FK tFDьzCH&bbr> ӼsV4 , C 2 , C 2 , C 2 , C 2 , C 2 , C 2 , C 2 , C 2 , C 2 , C 2 , C 2 , C 2 , C 2 , C 2 , C 2 , C 2 , C 2p>y2nߺ6ή,iכgg۹',V4o^{+׳{S`qݬ18r8>yt=vn1}v7Ƈ7O;`]?{Nyak)/:/`'X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@b^[ `_y1e_1~>N`}zgGs)647?/oc \y'X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`d!X@`$[K6IENDB`slowmovideo-0.5+git20180116/todo.org0000664000000000000000000000134313151342440015503 0ustar rootroot#+STARTUP: showall #+STARTUP: nohideblocks * UI [1/3] - [X] Always display some units (user feedback when scrolling) - [ ] Customize grid: Replay speed (e.g. 1x, 0.5x, etc.) - [ ] Help texts - [ ] Lock mode for replay speed - [ ] Windows: Note that flowBuilder is not available - [ ] Remove Bézier interpolation * Flow Editor [/] - [ ] Add pipet tool for inpainting - [ ] Alphabetic loading: Change file naming (%03d instead of %d e.g.) * File loading [0/2] - [ ] Skip defect image files - [ ] Maybe: Run a checker thread (image size, validity) in the background - [ ] Project loading speed for big files * General [0/1] - [ ] Ensure slowmoRenderer terminates * Windows [0/1] - [ ] Rendering: Check if directory exists (/tmp/...)