psimedia-master/COPYING
-----------------------

                  GNU LESSER GENERAL PUBLIC LICENSE
                       Version 2.1, February 1999

 Copyright (C) 1991, 1999 Free Software Foundation, Inc.
 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
 Everyone is permitted to copy and distribute verbatim copies
 of this license document, but changing it is not allowed.

[This is the first released version of the Lesser GPL.  It also counts
 as the successor of the GNU Library Public License, version 2, hence
 the version number 2.1.]

                            Preamble

  The licenses for most software are designed to take away your freedom to share and change it. By contrast, the GNU General Public Licenses are intended to guarantee your freedom to share and change free software--to make sure the software is free for all its users.

  This license, the Lesser General Public License, applies to some specially designated software packages--typically libraries--of the Free Software Foundation and other authors who decide to use it. You can use it too, but we suggest you first think carefully about whether this license or the ordinary General Public License is the better strategy to use in any particular case, based on the explanations below.

  When we speak of free software, we are referring to freedom of use, not price. Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for this service if you wish); that you receive source code or can get it if you want it; that you can change the software and use pieces of it in new free programs; and that you are informed that you can do these things.
To protect your rights, we need to make restrictions that forbid distributors to deny you these rights or to ask you to surrender these rights. These restrictions translate to certain responsibilities for you if you distribute copies of the library or if you modify it. For example, if you distribute copies of the library, whether gratis or for a fee, you must give the recipients all the rights that we gave you. You must make sure that they, too, receive or can get the source code. If you link other code with the library, you must provide complete object files to the recipients, so that they can relink them with the library after making changes to the library and recompiling it. And you must show them these terms so they know their rights. We protect your rights with a two-step method: (1) we copyright the library, and (2) we offer you this license, which gives you legal permission to copy, distribute and/or modify the library. To protect each distributor, we want to make it very clear that there is no warranty for the free library. Also, if the library is modified by someone else and passed on, the recipients should know that what they have is not the original version, so that the original author's reputation will not be affected by problems that might be introduced by others. Finally, software patents pose a constant threat to the existence of any free program. We wish to make sure that a company cannot effectively restrict the users of a free program by obtaining a restrictive license from a patent holder. Therefore, we insist that any patent license obtained for a version of the library must be consistent with the full freedom of use specified in this license. Most GNU software, including some libraries, is covered by the ordinary GNU General Public License. This license, the GNU Lesser General Public License, applies to certain designated libraries, and is quite different from the ordinary General Public License. 
We use this license for certain libraries in order to permit linking those libraries into non-free programs. When a program is linked with a library, whether statically or using a shared library, the combination of the two is legally speaking a combined work, a derivative of the original library. The ordinary General Public License therefore permits such linking only if the entire combination fits its criteria of freedom. The Lesser General Public License permits more lax criteria for linking other code with the library. We call this license the "Lesser" General Public License because it does Less to protect the user's freedom than the ordinary General Public License. It also provides other free software developers Less of an advantage over competing non-free programs. These disadvantages are the reason we use the ordinary General Public License for many libraries. However, the Lesser license provides advantages in certain special circumstances. For example, on rare occasions, there may be a special need to encourage the widest possible use of a certain library, so that it becomes a de-facto standard. To achieve this, non-free programs must be allowed to use the library. A more frequent case is that a free library does the same job as widely used non-free libraries. In this case, there is little to gain by limiting the free library to free software only, so we use the Lesser General Public License. In other cases, permission to use a particular library in non-free programs enables a greater number of people to use a large body of free software. For example, permission to use the GNU C Library in non-free programs enables many more people to use the whole GNU operating system, as well as its variant, the GNU/Linux operating system. 
Although the Lesser General Public License is Less protective of the users' freedom, it does ensure that the user of a program that is linked with the Library has the freedom and the wherewithal to run that program using a modified version of the Library. The precise terms and conditions for copying, distribution and modification follow. Pay close attention to the difference between a "work based on the library" and a "work that uses the library". The former contains code derived from the library, whereas the latter must be combined with the library in order to run. GNU LESSER GENERAL PUBLIC LICENSE TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION 0. This License Agreement applies to any software library or other program which contains a notice placed by the copyright holder or other authorized party saying it may be distributed under the terms of this Lesser General Public License (also called "this License"). Each licensee is addressed as "you". A "library" means a collection of software functions and/or data prepared so as to be conveniently linked with application programs (which use some of those functions and data) to form executables. The "Library", below, refers to any such software library or work which has been distributed under these terms. A "work based on the Library" means either the Library or any derivative work under copyright law: that is to say, a work containing the Library or a portion of it, either verbatim or with modifications and/or translated straightforwardly into another language. (Hereinafter, translation is included without limitation in the term "modification".) "Source code" for a work means the preferred form of the work for making modifications to it. For a library, complete source code means all the source code for all modules it contains, plus any associated interface definition files, plus the scripts used to control compilation and installation of the library. 
Activities other than copying, distribution and modification are not covered by this License; they are outside its scope. The act of running a program using the Library is not restricted, and output from such a program is covered only if its contents constitute a work based on the Library (independent of the use of the Library in a tool for writing it). Whether that is true depends on what the Library does and what the program that uses the Library does. 1. You may copy and distribute verbatim copies of the Library's complete source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice and disclaimer of warranty; keep intact all the notices that refer to this License and to the absence of any warranty; and distribute a copy of this License along with the Library. You may charge a fee for the physical act of transferring a copy, and you may at your option offer warranty protection in exchange for a fee. 2. You may modify your copy or copies of the Library or any portion of it, thus forming a work based on the Library, and copy and distribute such modifications or work under the terms of Section 1 above, provided that you also meet all of these conditions: a) The modified work must itself be a software library. b) You must cause the files modified to carry prominent notices stating that you changed the files and the date of any change. c) You must cause the whole of the work to be licensed at no charge to all third parties under the terms of this License. d) If a facility in the modified Library refers to a function or a table of data to be supplied by an application program that uses the facility, other than as an argument passed when the facility is invoked, then you must make a good faith effort to ensure that, in the event an application does not supply such function or table, the facility still operates, and performs whatever part of its purpose remains meaningful. 
(For example, a function in a library to compute square roots has a purpose that is entirely well-defined independent of the application. Therefore, Subsection 2d requires that any application-supplied function or table used by this function must be optional: if the application does not supply it, the square root function must still compute square roots.) These requirements apply to the modified work as a whole. If identifiable sections of that work are not derived from the Library, and can be reasonably considered independent and separate works in themselves, then this License, and its terms, do not apply to those sections when you distribute them as separate works. But when you distribute the same sections as part of a whole which is a work based on the Library, the distribution of the whole must be on the terms of this License, whose permissions for other licensees extend to the entire whole, and thus to each and every part regardless of who wrote it. Thus, it is not the intent of this section to claim rights or contest your rights to work written entirely by you; rather, the intent is to exercise the right to control the distribution of derivative or collective works based on the Library. In addition, mere aggregation of another work not based on the Library with the Library (or with a work based on the Library) on a volume of a storage or distribution medium does not bring the other work under the scope of this License. 3. You may opt to apply the terms of the ordinary GNU General Public License instead of this License to a given copy of the Library. To do this, you must alter all the notices that refer to this License, so that they refer to the ordinary GNU General Public License, version 2, instead of to this License. (If a newer version than version 2 of the ordinary GNU General Public License has appeared, then you can specify that version instead if you wish.) Do not make any other change in these notices. 
Once this change is made in a given copy, it is irreversible for that copy, so the ordinary GNU General Public License applies to all subsequent copies and derivative works made from that copy. This option is useful when you wish to copy part of the code of the Library into a program that is not a library. 4. You may copy and distribute the Library (or a portion or derivative of it, under Section 2) in object code or executable form under the terms of Sections 1 and 2 above provided that you accompany it with the complete corresponding machine-readable source code, which must be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange. If distribution of object code is made by offering access to copy from a designated place, then offering equivalent access to copy the source code from the same place satisfies the requirement to distribute the source code, even though third parties are not compelled to copy the source along with the object code. 5. A program that contains no derivative of any portion of the Library, but is designed to work with the Library by being compiled or linked with it, is called a "work that uses the Library". Such a work, in isolation, is not a derivative work of the Library, and therefore falls outside the scope of this License. However, linking a "work that uses the Library" with the Library creates an executable that is a derivative of the Library (because it contains portions of the Library), rather than a "work that uses the library". The executable is therefore covered by this License. Section 6 states terms for distribution of such executables. When a "work that uses the Library" uses material from a header file that is part of the Library, the object code for the work may be a derivative work of the Library even though the source code is not. Whether this is true is especially significant if the work can be linked without the Library, or if the work is itself a library. 
The threshold for this to be true is not precisely defined by law. If such an object file uses only numerical parameters, data structure layouts and accessors, and small macros and small inline functions (ten lines or less in length), then the use of the object file is unrestricted, regardless of whether it is legally a derivative work. (Executables containing this object code plus portions of the Library will still fall under Section 6.) Otherwise, if the work is a derivative of the Library, you may distribute the object code for the work under the terms of Section 6. Any executables containing that work also fall under Section 6, whether or not they are linked directly with the Library itself. 6. As an exception to the Sections above, you may also combine or link a "work that uses the Library" with the Library to produce a work containing portions of the Library, and distribute that work under terms of your choice, provided that the terms permit modification of the work for the customer's own use and reverse engineering for debugging such modifications. You must give prominent notice with each copy of the work that the Library is used in it and that the Library and its use are covered by this License. You must supply a copy of this License. If the work during execution displays copyright notices, you must include the copyright notice for the Library among them, as well as a reference directing the user to the copy of this License. Also, you must do one of these things: a) Accompany the work with the complete corresponding machine-readable source code for the Library including whatever changes were used in the work (which must be distributed under Sections 1 and 2 above); and, if the work is an executable linked with the Library, with the complete machine-readable "work that uses the Library", as object code and/or source code, so that the user can modify the Library and then relink to produce a modified executable containing the modified Library. 
(It is understood that the user who changes the contents of definitions files in the Library will not necessarily be able to recompile the application to use the modified definitions.) b) Use a suitable shared library mechanism for linking with the Library. A suitable mechanism is one that (1) uses at run time a copy of the library already present on the user's computer system, rather than copying library functions into the executable, and (2) will operate properly with a modified version of the library, if the user installs one, as long as the modified version is interface-compatible with the version that the work was made with. c) Accompany the work with a written offer, valid for at least three years, to give the same user the materials specified in Subsection 6a, above, for a charge no more than the cost of performing this distribution. d) If distribution of the work is made by offering access to copy from a designated place, offer equivalent access to copy the above specified materials from the same place. e) Verify that the user has already received a copy of these materials or that you have already sent this user a copy. For an executable, the required form of the "work that uses the Library" must include any data and utility programs needed for reproducing the executable from it. However, as a special exception, the materials to be distributed need not include anything that is normally distributed (in either source or binary form) with the major components (compiler, kernel, and so on) of the operating system on which the executable runs, unless that component itself accompanies the executable. It may happen that this requirement contradicts the license restrictions of other proprietary libraries that do not normally accompany the operating system. Such a contradiction means you cannot use both them and the Library together in an executable that you distribute. 7. 
You may place library facilities that are a work based on the Library side-by-side in a single library together with other library facilities not covered by this License, and distribute such a combined library, provided that the separate distribution of the work based on the Library and of the other library facilities is otherwise permitted, and provided that you do these two things: a) Accompany the combined library with a copy of the same work based on the Library, uncombined with any other library facilities. This must be distributed under the terms of the Sections above. b) Give prominent notice with the combined library of the fact that part of it is a work based on the Library, and explaining where to find the accompanying uncombined form of the same work. 8. You may not copy, modify, sublicense, link with, or distribute the Library except as expressly provided under this License. Any attempt otherwise to copy, modify, sublicense, link with, or distribute the Library is void, and will automatically terminate your rights under this License. However, parties who have received copies, or rights, from you under this License will not have their licenses terminated so long as such parties remain in full compliance. 9. You are not required to accept this License, since you have not signed it. However, nothing else grants you permission to modify or distribute the Library or its derivative works. These actions are prohibited by law if you do not accept this License. Therefore, by modifying or distributing the Library (or any work based on the Library), you indicate your acceptance of this License to do so, and all its terms and conditions for copying, distributing or modifying the Library or works based on it. 10. Each time you redistribute the Library (or any work based on the Library), the recipient automatically receives a license from the original licensor to copy, distribute, link with or modify the Library subject to these terms and conditions. 
You may not impose any further restrictions on the recipients' exercise of the rights granted herein. You are not responsible for enforcing compliance by third parties with this License. 11. If, as a consequence of a court judgment or allegation of patent infringement or for any other reason (not limited to patent issues), conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. If you cannot distribute so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not distribute the Library at all. For example, if a patent license would not permit royalty-free redistribution of the Library by all those who receive copies directly or indirectly through you, then the only way you could satisfy both it and this License would be to refrain entirely from distribution of the Library. If any portion of this section is held invalid or unenforceable under any particular circumstance, the balance of the section is intended to apply, and the section as a whole is intended to apply in other circumstances. It is not the purpose of this section to induce you to infringe any patents or other property right claims or to contest validity of any such claims; this section has the sole purpose of protecting the integrity of the free software distribution system which is implemented by public license practices. Many people have made generous contributions to the wide range of software distributed through that system in reliance on consistent application of that system; it is up to the author/donor to decide if he or she is willing to distribute software through any other system and a licensee cannot impose that choice. This section is intended to make thoroughly clear what is believed to be a consequence of the rest of this License. 12. 
If the distribution and/or use of the Library is restricted in certain countries either by patents or by copyrighted interfaces, the original copyright holder who places the Library under this License may add an explicit geographical distribution limitation excluding those countries, so that distribution is permitted only in or among countries not thus excluded. In such case, this License incorporates the limitation as if written in the body of this License. 13. The Free Software Foundation may publish revised and/or new versions of the Lesser General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns. Each version is given a distinguishing version number. If the Library specifies a version number of this License which applies to it and "any later version", you have the option of following the terms and conditions either of that version or of any later version published by the Free Software Foundation. If the Library does not specify a license version number, you may choose any version ever published by the Free Software Foundation. 14. If you wish to incorporate parts of the Library into other free programs whose distribution conditions are incompatible with these, write to the author to ask for permission. For software which is copyrighted by the Free Software Foundation, write to the Free Software Foundation; we sometimes make exceptions for this. Our decision will be guided by the two goals of preserving the free status of all derivatives of our free software and of promoting the sharing and reuse of software generally. NO WARRANTY 15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW. 
EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE LIBRARY "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE LIBRARY IS WITH YOU. SHOULD THE LIBRARY PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION. 16. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR REDISTRIBUTE THE LIBRARY AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE LIBRARY (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE LIBRARY TO OPERATE WITH ANY OTHER SOFTWARE), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. END OF TERMS AND CONDITIONS How to Apply These Terms to Your New Libraries If you develop a new library, and you want it to be of the greatest possible use to the public, we recommend making it free software that everyone can redistribute and change. You can do so by permitting redistribution under these terms (or, alternatively, under the terms of the ordinary General Public License). To apply these terms, attach the following notices to the library. It is safest to attach them to the start of each source file to most effectively convey the exclusion of warranty; and each file should have at least the "copyright" line and a pointer to where the full notice is found. 
    <one line to give the library's name and a brief idea of what it does.>
    Copyright (C) <year>  <name of author>

    This library is free software; you can redistribute it and/or
    modify it under the terms of the GNU Lesser General Public
    License as published by the Free Software Foundation; either
    version 2.1 of the License, or (at your option) any later version.

    This library is distributed in the hope that it will be useful,
    but WITHOUT ANY WARRANTY; without even the implied warranty of
    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
    Lesser General Public License for more details.

    You should have received a copy of the GNU Lesser General Public
    License along with this library; if not, write to the Free Software
    Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
    02110-1301 USA

Also add information on how to contact you by electronic and paper mail.

You should also get your employer (if you work as a programmer) or your school, if any, to sign a "copyright disclaimer" for the library, if necessary. Here is a sample; alter the names:

  Yoyodyne, Inc., hereby disclaims all copyright interest in the
  library `Frob' (a library for tweaking knobs) written by
  James Random Hacker.

  <signature of Ty Coon>, 1 April 1990
  Ty Coon, President of Vice

That's all there is to it!

psimedia-master/README
----------------------

PsiMedia 1.0.3
--------------
Date: June 10th, 2009
Website: http://delta.affinix.com/psimedia/
Mailing List: Delta Project

License
-------
This library is licensed under the GNU Lesser General Public License.
See the COPYING file for more information.

Description
-----------
PsiMedia is a thick abstraction layer for providing audio and video RTP
services to Psi-like IM clients.  The implementation is based on
GStreamer.

Contents:
  psimedia/     API and plugin shim
  gstprovider/  provider plugin based on GStreamer
  demo/         demonstration GUI program

To build the plugin and demo program, run:

  ./configure
  make

There is no "make install".  The compiled plugin can be found under the
gstprovider directory.
An application that uses PsiMedia should have instructions on what to do
with the plugin.

Building from SVN
------------------
First, install the 'qconf' program, at least version 1.4.  You can
download the source here:

  http://delta.affinix.com/qconf/

Then, go into the PsiMedia source tree and type 'qconf'.  You should now
have a configure program to execute, just like from a normal source
package.

psimedia-master/TODO
--------------------

api:
  multi-user jingle stuff

backend:
  pausing/holding support
  playing file from bytearray
  support recording
  use pulsesrc/sink and AEC mode, no need for speexdsp on linux
  consider adding windows/mac AEC support into device elements
  switch to using official gst appsrc/sink
  use farsight

gstreamer:
  0.11: fix udpsink (#534243)
  0.11: merge directsound (#584980)
  0.11: ensure osxaudiosrc/sink stuff is merged (#602121)
  0.11: ensure osxvideosrc stuff is merged
  theoraenc screws up when looped (#562163, unresolved)
  cleanup speexdsp/speexechoprobe and merge to gst
  get all elements we depend on from gst-plugins-bad to graduate to -good
  figure out windows video capture situation. is ksvideosrc the full answer?
  ksvideosrc device enumeration
  v4l2src that supports only image/jpeg fails if independently set to
    READY before changing to PAUSED. see devices.cpp

psimedia-master/demo/config.ui
------------------------------

[The Qt Designer XML markup of this file was stripped during extraction;
only widget text and object names survive.  The "Config" dialog
(338x384) contains a "Devices" group with audio output, audio input,
and video input combo boxes; "Send live stream" / "Send file" radio
buttons with a file line edit, browse button, and "Loop file playback"
checkbox; a "Modes" group with audio and video combo boxes; and an
OK/Cancel button box.  Object names: cb_audioOutDevice, rb_sendLive,
rb_sendFile, cb_audioInDevice, cb_videoInDevice, le_file, tb_file,
ck_loop, cb_audioMode, cb_videoMode, buttonBox.]
psimedia-master/demo/demo.pro
-----------------------------

CONFIG -= app_bundle
QT += network
CONFIG += debug

include(../psimedia/psimedia.pri)
INCLUDEPATH += ../psimedia

#DEFINES += GSTPROVIDER_STATIC
#DEFINES += QT_STATICPLUGIN
#include(../gstprovider/gstprovider.pri)

SOURCES += main.cpp
FORMS += mainwin.ui config.ui

psimedia-master/demo/main.cpp
-----------------------------

/*
 * Copyright (C) 2008  Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301 USA
 *
 */

// NOTE: the angle-bracket header names in the original #include lines
// were lost during extraction.  The code below needs at least the
// following Qt headers (reconstructed from usage, not from the
// original file):
#include <QString>
#include <QStringList>
#include <QList>

#include "psimedia.h"
#include "ui_mainwin.h"
#include "ui_config.h"

#define BASE_PORT_MIN 1
#define BASE_PORT_MAX 65534

static QString urlishEncode(const QString &in)
{
    QString out;
    for(int n = 0; n < in.length(); ++n)
    {
        if(in[n] == '%' || in[n] == ',' || in[n] == ';' || in[n] == ':' || in[n] == '\n')
        {
            unsigned char c = (unsigned char)in[n].toLatin1();
            out += QString().sprintf("%%%02x", c);
        }
        else
            out += in[n];
    }
    return out;
}

static QString urlishDecode(const QString &in)
{
    QString out;
    for(int n = 0; n < in.length(); ++n)
    {
        if(in[n] == '%')
        {
            if(n + 2 >= in.length())
                return QString();
            QString hex = in.mid(n + 1, 2);
            bool ok;
            int x = hex.toInt(&ok, 16);
            if(!ok)
                return QString();
            unsigned char c = (unsigned char)x;
            out += c;
            n += 2;
        }
        else
            out += in[n];
    }
    return out;
}

static QString payloadInfoToString(const PsiMedia::PayloadInfo &info)
{
    QStringList list;
    list += QString::number(info.id());
    list += info.name();
    list += QString::number(info.clockrate());
    list += QString::number(info.channels());
    list += QString::number(info.ptime());
    list += QString::number(info.maxptime());
    foreach(const PsiMedia::PayloadInfo::Parameter &p, info.parameters())
        list += p.name + '=' + p.value;
    for(int n = 0; n < list.count(); ++n)
        list[n] = urlishEncode(list[n]);
    return list.join(",");
}

static PsiMedia::PayloadInfo stringToPayloadInfo(const QString &in)
{
    QStringList list = in.split(',');
    if(list.count() < 6)
        return PsiMedia::PayloadInfo();
    for(int n = 0; n < list.count(); ++n)
    {
        QString str = urlishDecode(list[n]);
        if(str.isEmpty())
            return PsiMedia::PayloadInfo();
        list[n] = str;
    }
    PsiMedia::PayloadInfo out;
    bool ok;
    int x;
    x = list[0].toInt(&ok);
    if(!ok)
        return PsiMedia::PayloadInfo();
    out.setId(x);
    out.setName(list[1]);
    x = list[2].toInt(&ok);
    if(!ok)
        return PsiMedia::PayloadInfo();
    out.setClockrate(x);
    x = list[3].toInt(&ok);
    if(!ok)
        return PsiMedia::PayloadInfo();
    out.setChannels(x);
    x = list[4].toInt(&ok);
    if(!ok)
        return PsiMedia::PayloadInfo();
    out.setPtime(x);
    x = list[5].toInt(&ok);
    if(!ok)
        return PsiMedia::PayloadInfo();
    out.setMaxptime(x);
    // template parameter reconstructed from usage below; the original
    // "<...>" argument was stripped during extraction
    QList<PsiMedia::PayloadInfo::Parameter> plist;
    for(int n = 6; n < list.count(); ++n)
    {
        x = list[n].indexOf('=');
        if(x == -1)
            return PsiMedia::PayloadInfo();
        PsiMedia::PayloadInfo::Parameter p;
        p.name = list[n].mid(0, x);
        p.value = list[n].mid(x + 1);
        plist += p;
    }
    out.setParameters(plist);
    return out;
}

static QString payloadInfoToCodecString(const PsiMedia::PayloadInfo *audio, const PsiMedia::PayloadInfo *video)
{
    QStringList list;
    if(audio)
        list += QString("A:") + payloadInfoToString(*audio);
    if(video)
        list += QString("V:") + payloadInfoToString(*video);
    return list.join(";");
}

static bool codecStringToPayloadInfo(const QString &in, PsiMedia::PayloadInfo *audio, PsiMedia::PayloadInfo *video)
{
    QStringList list = in.split(';');
    foreach(const QString &s, list)
    {
        int x = s.indexOf(':');
        if(x == -1)
            return false;
        QString var = s.mid(0, x);
        QString val = s.mid(x + 1);
        if(val.isEmpty())
            return false;
        PsiMedia::PayloadInfo info = stringToPayloadInfo(val);
        if(info.isNull())
            return false;
        if(var == "A" && audio)
            *audio = info;
        else if(var == "V" && video)
            *video = info;
    }
    return true;
}

Q_DECLARE_METATYPE(PsiMedia::AudioParams);
Q_DECLARE_METATYPE(PsiMedia::VideoParams);

class Configuration
{
public:
    bool liveInput;
    QString audioOutDeviceId, audioInDeviceId, videoInDeviceId;
    QString file;
    bool loopFile;
    PsiMedia::AudioParams audioParams;
    PsiMedia::VideoParams videoParams;

    Configuration() :
        liveInput(false),
        loopFile(false)
    {
    }
};

class PsiMediaFeaturesSnapshot
{
public:
    enum Mode
    {
        Input  = 0x01,
        Output = 0x02,
        All    = 0x03
    };

    int mode;
    // template parameters reconstructed from usage below; the original
    // "<...>" arguments were stripped during extraction
    QList<PsiMedia::Device> audioOutputDevices;
    QList<PsiMedia::Device> audioInputDevices;
    QList<PsiMedia::Device> videoInputDevices;
    QList<PsiMedia::AudioParams> supportedAudioModes;
    QList<PsiMedia::VideoParams> supportedVideoModes;

    PsiMediaFeaturesSnapshot(Mode m)
    {
        mode = m;

        PsiMedia::Features f;
        int flags = 0;
        flags |= PsiMedia::Features::AudioModes;
        flags |= PsiMedia::Features::VideoModes;
        if(mode & Input)
        {
            flags |= PsiMedia::Features::AudioIn;
            flags |= PsiMedia::Features::VideoIn;
        }
        if(mode & Output)
            flags |= PsiMedia::Features::AudioOut;
        f.lookup(flags);
        f.waitForFinished();

        audioOutputDevices = f.audioOutputDevices();
        audioInputDevices = f.audioInputDevices();
        videoInputDevices = f.videoInputDevices();
        supportedAudioModes = f.supportedAudioModes();
        supportedVideoModes = f.supportedVideoModes();
    }
};

// get default settings
static Configuration getDefaultConfiguration()
{
    Configuration config;
    config.liveInput = true;
    config.loopFile = true;

    PsiMediaFeaturesSnapshot snap(PsiMediaFeaturesSnapshot::All);

    QList<PsiMedia::Device> devs;

    devs = snap.audioOutputDevices;
    if(!devs.isEmpty())
        config.audioOutDeviceId = devs.first().id();

    devs = snap.audioInputDevices;
    if(!devs.isEmpty())
        config.audioInDeviceId = devs.first().id();

    devs = snap.videoInputDevices;
    if(!devs.isEmpty())
        config.videoInDeviceId = devs.first().id();

    config.audioParams = snap.supportedAudioModes.first();
    config.videoParams = snap.supportedVideoModes.first();

    return config;
}

// adjust any invalid settings to nearby valid ones
static Configuration adjustConfiguration(const Configuration &in, const PsiMediaFeaturesSnapshot &snap)
{
    Configuration out = in;
    bool found;

    if((snap.mode & PsiMediaFeaturesSnapshot::Output) && !out.audioOutDeviceId.isEmpty())
    {
        found = false;
        foreach(const PsiMedia::Device &dev, snap.audioOutputDevices)
        {
            if(out.audioOutDeviceId == dev.id())
            {
                found = true;
                break;
            }
        }
        if(!found)
        {
            if(!snap.audioOutputDevices.isEmpty())
                out.audioOutDeviceId = snap.audioOutputDevices.first().id();
            else
                out.audioOutDeviceId.clear();
        }
    }

    if((snap.mode & PsiMediaFeaturesSnapshot::Input) && !out.audioInDeviceId.isEmpty())
    {
        found = false;
        foreach(const PsiMedia::Device &dev,
snap.audioInputDevices) { if(out.audioInDeviceId == dev.id()) { found = true; break; } } if(!found) { if(!snap.audioInputDevices.isEmpty()) out.audioInDeviceId = snap.audioInputDevices.first().id(); else out.audioInDeviceId.clear(); } } if((snap.mode & PsiMediaFeaturesSnapshot::Input) && !out.videoInDeviceId.isEmpty()) { found = false; foreach(const PsiMedia::Device &dev, snap.videoInputDevices) { if(out.videoInDeviceId == dev.id()) { found = true; break; } } if(!found) { if(!snap.videoInputDevices.isEmpty()) out.videoInDeviceId = snap.videoInputDevices.first().id(); else out.videoInDeviceId.clear(); } } found = false; foreach(const PsiMedia::AudioParams ¶ms, snap.supportedAudioModes) { if(out.audioParams == params) { found = true; break; } } if(!found) out.audioParams = snap.supportedAudioModes.first(); found = false; foreach(const PsiMedia::VideoParams ¶ms, snap.supportedVideoModes) { if(out.videoParams == params) { found = true; break; } } if(!found) out.videoParams = snap.supportedVideoModes.first(); return out; } class ConfigDlg : public QDialog { Q_OBJECT public: Ui::Config ui; Configuration config; ConfigDlg(const Configuration &_config, QWidget *parent = 0) : QDialog(parent), config(_config) { ui.setupUi(this); setWindowTitle(tr("Configure Audio/Video")); ui.lb_audioInDevice->setEnabled(false); ui.cb_audioInDevice->setEnabled(false); ui.lb_videoInDevice->setEnabled(false); ui.cb_videoInDevice->setEnabled(false); ui.lb_file->setEnabled(false); ui.le_file->setEnabled(false); ui.tb_file->setEnabled(false); ui.ck_loop->setEnabled(false); connect(ui.rb_sendLive, SIGNAL(toggled(bool)), SLOT(live_toggled(bool))); connect(ui.rb_sendFile, SIGNAL(toggled(bool)), SLOT(file_toggled(bool))); connect(ui.tb_file, SIGNAL(clicked()), SLOT(file_choose())); PsiMediaFeaturesSnapshot snap(PsiMediaFeaturesSnapshot::All); ui.cb_audioOutDevice->addItem("", QString()); foreach(const PsiMedia::Device &dev, snap.audioOutputDevices) ui.cb_audioOutDevice->addItem(dev.name(), dev.id()); 
ui.cb_audioInDevice->addItem("", QString()); foreach(const PsiMedia::Device &dev, snap.audioInputDevices) ui.cb_audioInDevice->addItem(dev.name(), dev.id()); ui.cb_videoInDevice->addItem("", QString()); foreach(const PsiMedia::Device &dev, snap.videoInputDevices) ui.cb_videoInDevice->addItem(dev.name(), dev.id()); foreach(const PsiMedia::AudioParams ¶ms, snap.supportedAudioModes) { QString codec = params.codec(); if(codec == "vorbis" || codec == "speex") codec[0] = codec[0].toUpper(); else codec = codec.toUpper(); QString hz = QString::number(params.sampleRate() / 1000); QString chanstr; if(params.channels() == 1) chanstr = "Mono"; else if(params.channels() == 2) chanstr = "Stereo"; else chanstr = QString("Channels: %1").arg(params.channels()); QString str = QString("%1, %2KHz, %3-bit, %4").arg(codec).arg(hz).arg(params.sampleSize()).arg(chanstr); ui.cb_audioMode->addItem(str, qVariantFromValue(params)); } foreach(const PsiMedia::VideoParams ¶ms, snap.supportedVideoModes) { QString codec = params.codec(); if(codec == "theora") codec[0] = codec[0].toUpper(); else codec = codec.toUpper(); QString sizestr = QString("%1x%2").arg(params.size().width()).arg(params.size().height()); QString str = QString("%1, %2 @ %3fps").arg(codec).arg(sizestr).arg(params.fps()); ui.cb_videoMode->addItem(str, qVariantFromValue(params)); } // the following lookups are guaranteed, since the config is // adjusted to all valid values as necessary config = adjustConfiguration(config, snap); ui.cb_audioOutDevice->setCurrentIndex(ui.cb_audioOutDevice->findData(config.audioOutDeviceId)); ui.cb_audioInDevice->setCurrentIndex(ui.cb_audioInDevice->findData(config.audioInDeviceId)); ui.cb_videoInDevice->setCurrentIndex(ui.cb_videoInDevice->findData(config.videoInDeviceId)); ui.cb_audioMode->setCurrentIndex(findAudioParamsData(ui.cb_audioMode, config.audioParams)); ui.cb_videoMode->setCurrentIndex(findVideoParamsData(ui.cb_videoMode, config.videoParams)); if(config.liveInput) 
ui.rb_sendLive->setChecked(true); else ui.rb_sendFile->setChecked(true); ui.le_file->setText(config.file); ui.ck_loop->setChecked(config.loopFile); } // apparently custom QVariants can't be compared, so we have to // make our own find functions for the comboboxes int findAudioParamsData(QComboBox *cb, const PsiMedia::AudioParams ¶ms) { for(int n = 0; n < cb->count(); ++n) { if(qVariantValue(cb->itemData(n)) == params) return n; } return -1; } int findVideoParamsData(QComboBox *cb, const PsiMedia::VideoParams ¶ms) { for(int n = 0; n < cb->count(); ++n) { if(qVariantValue(cb->itemData(n)) == params) return n; } return -1; } protected: virtual void accept() { config.audioOutDeviceId = ui.cb_audioOutDevice->itemData(ui.cb_audioOutDevice->currentIndex()).toString(); config.audioInDeviceId = ui.cb_audioInDevice->itemData(ui.cb_audioInDevice->currentIndex()).toString(); config.audioParams = qVariantValue(ui.cb_audioMode->itemData(ui.cb_audioMode->currentIndex())); config.videoInDeviceId = ui.cb_videoInDevice->itemData(ui.cb_videoInDevice->currentIndex()).toString(); config.videoParams = qVariantValue(ui.cb_videoMode->itemData(ui.cb_videoMode->currentIndex())); config.liveInput = ui.rb_sendLive->isChecked(); config.file = ui.le_file->text(); config.loopFile = ui.ck_loop->isChecked(); QDialog::accept(); } private slots: void live_toggled(bool on) { ui.lb_audioInDevice->setEnabled(on); ui.cb_audioInDevice->setEnabled(on); ui.lb_videoInDevice->setEnabled(on); ui.cb_videoInDevice->setEnabled(on); } void file_toggled(bool on) { ui.lb_file->setEnabled(on); ui.le_file->setEnabled(on); ui.tb_file->setEnabled(on); ui.ck_loop->setEnabled(on); } void file_choose() { QString fileName = QFileDialog::getOpenFileName(this, tr("Open File"), QCoreApplication::applicationDirPath(), tr("Ogg Audio/Video (*.oga *.ogv *.ogg)")); if(!fileName.isEmpty()) ui.le_file->setText(fileName); } }; // handles two udp sockets class RtpSocketGroup : public QObject { Q_OBJECT public: QUdpSocket socket[2]; 
RtpSocketGroup(QObject *parent = 0) : QObject(parent) { connect(&socket[0], SIGNAL(readyRead()), SLOT(sock_readyRead())); connect(&socket[1], SIGNAL(readyRead()), SLOT(sock_readyRead())); connect(&socket[0], SIGNAL(bytesWritten(qint64)), SLOT(sock_bytesWritten(qint64))); connect(&socket[1], SIGNAL(bytesWritten(qint64)), SLOT(sock_bytesWritten(qint64))); } bool bind(int basePort) { if(!socket[0].bind(basePort)) return false; if(!socket[1].bind(basePort + 1)) return false; return true; } signals: void readyRead(int offset); void datagramWritten(int offset); private slots: void sock_readyRead() { QUdpSocket *udp = (QUdpSocket *)sender(); if(udp == &socket[0]) emit readyRead(0); else emit readyRead(1); } void sock_bytesWritten(qint64 bytes) { Q_UNUSED(bytes); QUdpSocket *udp = (QUdpSocket *)sender(); if(udp == &socket[0]) emit datagramWritten(0); else emit datagramWritten(1); } }; // bind a channel to a socket group. // takes ownership of socket group. class RtpBinding : public QObject { Q_OBJECT public: enum Mode { Send, Receive }; Mode mode; PsiMedia::RtpChannel *channel; RtpSocketGroup *socketGroup; QHostAddress sendAddress; int sendBasePort; RtpBinding(Mode _mode, PsiMedia::RtpChannel *_channel, RtpSocketGroup *_socketGroup, QObject *parent = 0) : QObject(parent), mode(_mode), channel(_channel), socketGroup(_socketGroup), sendBasePort(-1) { socketGroup->setParent(this); connect(socketGroup, SIGNAL(readyRead(int)), SLOT(net_ready(int))); connect(socketGroup, SIGNAL(datagramWritten(int)), SLOT(net_written(int))); connect(channel, SIGNAL(readyRead()), SLOT(app_ready())); connect(channel, SIGNAL(packetsWritten(int)), SLOT(app_written(int))); } private slots: void net_ready(int offset) { // here we handle packets received from the network, that // we need to give to psimedia while(socketGroup->socket[offset].hasPendingDatagrams()) { int size = (int)socketGroup->socket[offset].pendingDatagramSize(); QByteArray rawValue(size, offset); QHostAddress fromAddr; quint16 
fromPort; if(socketGroup->socket[offset].readDatagram(rawValue.data(), size, &fromAddr, &fromPort) == -1) continue; // if we are sending RTP, we should not be receiving // anything on offset 0 if(mode == Send && offset == 0) continue; PsiMedia::RtpPacket packet(rawValue, offset); channel->write(packet); } } void net_written(int offset) { Q_UNUSED(offset); // do nothing } void app_ready() { // here we handle packets that psimedia wants to send out, // that we need to give to the network while(channel->packetsAvailable() > 0) { PsiMedia::RtpPacket packet = channel->read(); int offset = packet.portOffset(); if(offset < 0 || offset > 1) continue; // if we are receiving RTP, we should not be sending // anything on offset 0 if(mode == Receive && offset == 0) continue; if(sendAddress.isNull() || sendBasePort < BASE_PORT_MIN || sendBasePort > BASE_PORT_MAX) continue; socketGroup->socket[offset].writeDatagram(packet.rawValue(), sendAddress, sendBasePort + offset); } } void app_written(int count) { Q_UNUSED(count); // do nothing } }; class MainWin : public QMainWindow { Q_OBJECT public: Ui::MainWin ui; QAction *action_AboutProvider; QString creditName; PsiMedia::RtpSession producer; PsiMedia::RtpSession receiver; Configuration config; bool transmitAudio, transmitVideo, transmitting; bool receiveAudio, receiveVideo; RtpBinding *sendAudioRtp, *sendVideoRtp; RtpBinding *receiveAudioRtp, *receiveVideoRtp; bool recording; QFile *recordFile; MainWin() : action_AboutProvider(0), producer(this), receiver(this), sendAudioRtp(0), sendVideoRtp(0), receiveAudioRtp(0), receiveVideoRtp(0), recording(false), recordFile(0) { ui.setupUi(this); setWindowTitle(tr("PsiMedia Demo")); creditName = PsiMedia::creditName(); if(!creditName.isEmpty()) { action_AboutProvider = new QAction(this); action_AboutProvider->setText(tr("About %1").arg(creditName)); ui.menu_Help->addAction(action_AboutProvider); connect(action_AboutProvider, SIGNAL(triggered()), SLOT(doAboutProvider())); } config = 
getDefaultConfiguration(); ui.pb_transmit->setEnabled(false); ui.pb_stopSend->setEnabled(false); ui.pb_stopReceive->setEnabled(false); ui.pb_record->setEnabled(false); ui.le_sendConfig->setReadOnly(true); ui.lb_sendConfig->setEnabled(false); ui.le_sendConfig->setEnabled(false); ui.sl_mic->setMinimum(0); ui.sl_mic->setMaximum(100); ui.sl_spk->setMinimum(0); ui.sl_spk->setMaximum(100); ui.sl_mic->setValue(100); ui.sl_spk->setValue(100); ui.le_remoteAddress->setText("127.0.0.1"); ui.le_remoteAudioPort->setText("60000"); ui.le_remoteVideoPort->setText("60002"); ui.le_localAudioPort->setText("60000"); ui.le_localVideoPort->setText("60002"); ui.le_remoteAddress->selectAll(); ui.le_remoteAddress->setFocus(); connect(ui.action_Quit, SIGNAL(triggered()), SLOT(close())); connect(ui.action_Configure, SIGNAL(triggered()), SLOT(doConfigure())); connect(ui.action_About, SIGNAL(triggered()), SLOT(doAbout())); connect(ui.pb_startSend, SIGNAL(clicked()), SLOT(start_send())); connect(ui.pb_transmit, SIGNAL(clicked()), SLOT(transmit())); connect(ui.pb_stopSend, SIGNAL(clicked()), SLOT(stop_send())); connect(ui.pb_startReceive, SIGNAL(clicked()), SLOT(start_receive())); connect(ui.pb_stopReceive, SIGNAL(clicked()), SLOT(stop_receive())); connect(ui.pb_record, SIGNAL(clicked()), SLOT(record_toggle())); connect(ui.sl_mic, SIGNAL(valueChanged(int)), SLOT(change_volume_mic(int))); connect(ui.sl_spk, SIGNAL(valueChanged(int)), SLOT(change_volume_spk(int))); connect(&producer, SIGNAL(started()), SLOT(producer_started())); connect(&producer, SIGNAL(stopped()), SLOT(producer_stopped())); connect(&producer, SIGNAL(finished()), SLOT(producer_finished())); connect(&producer, SIGNAL(error()), SLOT(producer_error())); connect(&receiver, SIGNAL(started()), SLOT(receiver_started())); connect(&receiver, SIGNAL(stoppedRecording()), SLOT(receiver_stoppedRecording())); connect(&receiver, SIGNAL(stopped()), SLOT(receiver_stopped())); connect(&receiver, SIGNAL(error()), SLOT(receiver_error())); // set 
initial volume levels change_volume_mic(ui.sl_mic->value()); change_volume_spk(ui.sl_spk->value()); // associate video widgets producer.setVideoPreviewWidget(ui.vw_self); receiver.setVideoOutputWidget(ui.vw_remote); // hack: make the top/bottom layouts have matching height int lineEditHeight = ui.le_receiveConfig->sizeHint().height(); QWidget *spacer = new QWidget(this); spacer->setMinimumHeight(lineEditHeight); spacer->setSizePolicy(QSizePolicy::Expanding, QSizePolicy::Fixed); ui.gridLayout2->addWidget(spacer, 3, 1); // hack: give the video widgets a 4:3 ratio int gridSpacing = ui.gridLayout1->verticalSpacing(); if(gridSpacing == -1) gridSpacing = 9; // not sure how else to get this int pushButtonHeight = ui.pb_startSend->sizeHint().height(); int heightEstimate = lineEditHeight * 4 + pushButtonHeight + gridSpacing * 4; heightEstimate += 10; // pad just to be safe int goodWidth = (heightEstimate * 4) / 3; ui.vw_remote->setMinimumSize(goodWidth, heightEstimate); ui.vw_self->setMinimumSize(goodWidth, heightEstimate); // hack: remove empty File menu on mac #ifdef Q_WS_MAC ui.menu_File->menuAction()->setVisible(false); #endif } ~MainWin() { producer.reset(); receiver.reset(); cleanup_send_rtp(); cleanup_receive_rtp(); cleanup_record(); } void setSendFieldsEnabled(bool b) { ui.lb_remoteAddress->setEnabled(b); ui.le_remoteAddress->setEnabled(b); ui.lb_remoteAudioPort->setEnabled(b); ui.le_remoteAudioPort->setEnabled(b); ui.lb_remoteVideoPort->setEnabled(b); ui.le_remoteVideoPort->setEnabled(b); } void setSendConfig(const QString &s) { if(!s.isEmpty()) { ui.lb_sendConfig->setEnabled(true); ui.le_sendConfig->setEnabled(true); ui.le_sendConfig->setText(s); ui.le_sendConfig->setCursorPosition(0); } else { ui.lb_sendConfig->setEnabled(false); ui.le_sendConfig->setEnabled(false); ui.le_sendConfig->clear(); } } void setReceiveFieldsEnabled(bool b) { ui.lb_localAudioPort->setEnabled(b); ui.le_localAudioPort->setEnabled(b); ui.lb_localVideoPort->setEnabled(b); 
ui.le_localVideoPort->setEnabled(b); ui.lb_receiveConfig->setEnabled(b); ui.le_receiveConfig->setEnabled(b); } static QString rtpSessionErrorToString(PsiMedia::RtpSession::Error e) { QString str; switch(e) { case PsiMedia::RtpSession::ErrorSystem: str = tr("System error"); break; case PsiMedia::RtpSession::ErrorCodec: str = tr("Codec error"); break; default: // generic str = tr("Generic error"); break; } return str; } void cleanup_send_rtp() { delete sendAudioRtp; sendAudioRtp = 0; delete sendVideoRtp; sendVideoRtp = 0; } void cleanup_receive_rtp() { delete receiveAudioRtp; receiveAudioRtp = 0; delete receiveVideoRtp; receiveVideoRtp = 0; } void cleanup_record() { if(recording) { delete recordFile; recordFile = 0; recording = false; } } private slots: void doConfigure() { ConfigDlg w(config, this); w.exec(); config = w.config; } void doAbout() { QMessageBox::about(this, tr("About PsiMedia Demo"), tr( "PsiMedia Demo v1.0\n" "A simple test application for the PsiMedia system.\n" "\n" "Copyright (C) 2008 Barracuda Networks, Inc." )); } void doAboutProvider() { QMessageBox::about(this, tr("About %1").arg(creditName), PsiMedia::creditText() ); } void start_send() { config = adjustConfiguration(config, PsiMediaFeaturesSnapshot(PsiMediaFeaturesSnapshot::Input)); transmitAudio = false; transmitVideo = false; if(config.liveInput) { if(config.audioInDeviceId.isEmpty() && config.videoInDeviceId.isEmpty()) { QMessageBox::information(this, tr("Error"), tr( "Cannot send live without at least one audio " "input or video input device selected." 
)); return; } if(!config.audioInDeviceId.isEmpty()) { producer.setAudioInputDevice(config.audioInDeviceId); transmitAudio = true; } else producer.setAudioInputDevice(QString()); if(!config.videoInDeviceId.isEmpty()) { producer.setVideoInputDevice(config.videoInDeviceId); transmitVideo = true; } else producer.setVideoInputDevice(QString()); } else // non-live (file) input { producer.setFileInput(config.file); producer.setFileLoopEnabled(config.loopFile); // we just assume the file has both audio and video. // if it doesn't, no big deal, it'll still work. // update: after producer is started, we can correct // these variables. transmitAudio = true; transmitVideo = true; } QList audioParamsList; if(transmitAudio) audioParamsList += config.audioParams; producer.setLocalAudioPreferences(audioParamsList); QList videoParamsList; if(transmitVideo) videoParamsList += config.videoParams; producer.setLocalVideoPreferences(videoParamsList); ui.pb_startSend->setEnabled(false); ui.pb_stopSend->setEnabled(true); transmitting = false; producer.start(); } void transmit() { QHostAddress addr; if(!addr.setAddress(ui.le_remoteAddress->text())) { QMessageBox::critical(this, tr("Error"), tr( "Invalid send IP address." )); return; } int audioPort = -1; if(transmitAudio) { bool ok; audioPort = ui.le_remoteAudioPort->text().toInt(&ok); if(!ok || audioPort < BASE_PORT_MIN || audioPort > BASE_PORT_MAX) { QMessageBox::critical(this, tr("Error"), tr( "Invalid send audio port." )); return; } } int videoPort = -1; if(transmitVideo) { bool ok; videoPort = ui.le_remoteVideoPort->text().toInt(&ok); if(!ok || videoPort < BASE_PORT_MIN || videoPort > BASE_PORT_MAX) { QMessageBox::critical(this, tr("Error"), tr( "Invalid send video port." 
)); return; } } RtpSocketGroup *audioSocketGroup = new RtpSocketGroup; sendAudioRtp = new RtpBinding(RtpBinding::Send, producer.audioRtpChannel(), audioSocketGroup, this); sendAudioRtp->sendAddress = addr; sendAudioRtp->sendBasePort = audioPort; RtpSocketGroup *videoSocketGroup = new RtpSocketGroup; sendVideoRtp = new RtpBinding(RtpBinding::Send, producer.videoRtpChannel(), videoSocketGroup, this); sendVideoRtp->sendAddress = addr; sendVideoRtp->sendBasePort = videoPort; setSendFieldsEnabled(false); ui.pb_transmit->setEnabled(false); if(transmitAudio) producer.transmitAudio(); if(transmitVideo) producer.transmitVideo(); transmitting = true; } void stop_send() { ui.pb_stopSend->setEnabled(false); if(!transmitting) ui.pb_transmit->setEnabled(false); producer.stop(); } void start_receive() { config = adjustConfiguration(config, PsiMediaFeaturesSnapshot(PsiMediaFeaturesSnapshot::Output)); QString receiveConfig = ui.le_receiveConfig->text(); PsiMedia::PayloadInfo audio; PsiMedia::PayloadInfo video; if(receiveConfig.isEmpty() || !codecStringToPayloadInfo(receiveConfig, &audio, &video)) { QMessageBox::critical(this, tr("Error"), tr( "Invalid codec config." )); return; } receiveAudio = !audio.isNull(); receiveVideo = !video.isNull(); int audioPort = -1; if(receiveAudio) { bool ok; audioPort = ui.le_localAudioPort->text().toInt(&ok); if(!ok || audioPort < BASE_PORT_MIN || audioPort > BASE_PORT_MAX) { QMessageBox::critical(this, tr("Error"), tr( "Invalid receive audio port." )); return; } } int videoPort = -1; if(receiveVideo) { bool ok; videoPort = ui.le_localVideoPort->text().toInt(&ok); if(!ok || videoPort < BASE_PORT_MIN || videoPort > BASE_PORT_MAX) { QMessageBox::critical(this, tr("Error"), tr( "Invalid receive video port." 
)); return; } } if(receiveAudio && !config.audioOutDeviceId.isEmpty()) { receiver.setAudioOutputDevice(config.audioOutDeviceId); QList audioParamsList; audioParamsList += config.audioParams; receiver.setLocalAudioPreferences(audioParamsList); QList payloadInfoList; payloadInfoList += audio; receiver.setRemoteAudioPreferences(payloadInfoList); } if(receiveVideo) { QList videoParamsList; videoParamsList += config.videoParams; receiver.setLocalVideoPreferences(videoParamsList); QList payloadInfoList; payloadInfoList += video; receiver.setRemoteVideoPreferences(payloadInfoList); } RtpSocketGroup *audioSocketGroup = new RtpSocketGroup(this); RtpSocketGroup *videoSocketGroup = new RtpSocketGroup(this); if(!audioSocketGroup->bind(audioPort)) { delete audioSocketGroup; audioSocketGroup = 0; delete videoSocketGroup; videoSocketGroup = 0; QMessageBox::critical(this, tr("Error"), tr( "Unable to bind to receive audio ports." )); return; } if(!videoSocketGroup->bind(videoPort)) { delete audioSocketGroup; audioSocketGroup = 0; delete videoSocketGroup; videoSocketGroup = 0; QMessageBox::critical(this, tr("Error"), tr( "Unable to bind to receive video ports." 
)); return; } receiveAudioRtp = new RtpBinding(RtpBinding::Receive, receiver.audioRtpChannel(), audioSocketGroup, this); receiveVideoRtp = new RtpBinding(RtpBinding::Receive, receiver.videoRtpChannel(), videoSocketGroup, this); setReceiveFieldsEnabled(false); ui.pb_startReceive->setEnabled(false); ui.pb_stopReceive->setEnabled(true); receiver.start(); } void stop_receive() { ui.pb_stopReceive->setEnabled(false); receiver.stop(); } void change_volume_mic(int value) { producer.setInputVolume(value); } void change_volume_spk(int value) { receiver.setOutputVolume(value); } void producer_started() { PsiMedia::PayloadInfo audio, *pAudio; PsiMedia::PayloadInfo video, *pVideo; pAudio = 0; pVideo = 0; if(transmitAudio) { // confirm transmitting of audio is actually possible, // in the case that a file is used as input if(producer.canTransmitAudio()) { audio = producer.localAudioPayloadInfo().first(); pAudio = &audio; } else transmitAudio = false; } if(transmitVideo) { // same for video if(producer.canTransmitVideo()) { video = producer.localVideoPayloadInfo().first(); pVideo = &video; } else transmitVideo = false; } QString str = payloadInfoToCodecString(pAudio, pVideo); setSendConfig(str); ui.pb_transmit->setEnabled(true); } void producer_stopped() { cleanup_send_rtp(); setSendFieldsEnabled(true); setSendConfig(QString()); ui.pb_startSend->setEnabled(true); } void producer_finished() { cleanup_send_rtp(); setSendFieldsEnabled(true); setSendConfig(QString()); ui.pb_startSend->setEnabled(true); ui.pb_transmit->setEnabled(false); ui.pb_stopSend->setEnabled(false); } void producer_error() { cleanup_send_rtp(); setSendFieldsEnabled(true); setSendConfig(QString()); ui.pb_startSend->setEnabled(true); ui.pb_transmit->setEnabled(false); ui.pb_stopSend->setEnabled(false); QMessageBox::critical(this, tr("Error"), tr( "An error occurred while trying to send:\n%1." 
).arg(rtpSessionErrorToString(producer.errorCode()) )); } void receiver_started() { ui.pb_record->setEnabled(true); } void receiver_stoppedRecording() { cleanup_record(); } void receiver_stopped() { cleanup_receive_rtp(); cleanup_record(); setReceiveFieldsEnabled(true); ui.pb_startReceive->setEnabled(true); ui.pb_record->setEnabled(false); } void receiver_error() { cleanup_receive_rtp(); cleanup_record(); setReceiveFieldsEnabled(true); ui.pb_startReceive->setEnabled(true); ui.pb_stopReceive->setEnabled(false); ui.pb_record->setEnabled(false); QMessageBox::critical(this, tr("Error"), tr( "An error occurred while trying to receive:\n%1." ).arg(rtpSessionErrorToString(receiver.errorCode()) )); } void record_toggle() { if(!recording) { QString fileName = QFileDialog::getSaveFileName(this, tr("Save File"), QDir::homePath(), tr("Ogg Audio/Video (*.oga *.ogv)")); if(fileName.isEmpty()) return; recordFile = new QFile(fileName, this); if(!recordFile->open(QIODevice::WriteOnly | QIODevice::Truncate)) { delete recordFile; QMessageBox::critical(this, tr("Error"), tr( "Unable to create file for recording." 
)); return; } receiver.setRecordingQIODevice(recordFile); recording = true; } else receiver.stopRecording(); } }; #ifdef GSTPROVIDER_STATIC Q_IMPORT_PLUGIN(gstprovider) #endif #ifndef GSTPROVIDER_STATIC static QString findPlugin(const QString &relpath, const QString &basename) { QDir dir(QCoreApplication::applicationDirPath()); if(!dir.cd(relpath)) return QString(); foreach(const QString &fileName, dir.entryList()) { if(fileName.contains(basename)) { QString filePath = dir.filePath(fileName); if(QLibrary::isLibrary(filePath)) return filePath; } } return QString(); } #endif int main(int argc, char **argv) { QApplication qapp(argc, argv); #ifndef GSTPROVIDER_STATIC QString pluginFile; QString resourcePath; pluginFile = qgetenv("PSI_MEDIA_PLUGIN"); if(pluginFile.isEmpty()) { #if defined(Q_OS_WIN) pluginFile = findPlugin(".", "gstprovider"); if(!pluginFile.isEmpty()) resourcePath = QCoreApplication::applicationDirPath() + "/gstreamer-0.10"; #elif defined(Q_OS_MAC) pluginFile = findPlugin("../plugins", "gstprovider"); if(!pluginFile.isEmpty()) resourcePath = QCoreApplication::applicationDirPath() + "/../Frameworks/gstreamer-0.10"; #endif if(pluginFile.isEmpty()) pluginFile = findPlugin("../gstprovider", "gstprovider"); } PsiMedia::loadPlugin(pluginFile, resourcePath); #endif if(!PsiMedia::isSupported()) { QMessageBox::critical(0, MainWin::tr("PsiMedia Demo"), MainWin::tr( "Error: Could not load PsiMedia subsystem." 
));
		return 1;
	}

	MainWin mainWin;

	// give mainWin a chance to fix its layout before showing
	QTimer::singleShot(0, &mainWin, SLOT(show()));

	qapp.exec();
	return 0;
}

#include "main.moc"

psimedia-master/demo/mainwin.ui: [Qt Designer .ui XML; the markup was stripped during extraction. Recoverable content: a 464x447 "MainWin" form with send controls (Send IP address, audio/video base ports, codec config, and &Start / &Transmit / St&op buttons), receive controls (audio/video base ports, codec config, and St&art / Sto&p / &Record buttons), local and remote video areas with Mic/Spk sliders, menu actions &About, &Quit (Ctrl+Q) and Co&nfigure Audio/Video... (Ctrl+N) under &File and &Help menus, a custom-widget declaration mapping PsiMedia::VideoWidget to QWidget with header psimedia.h, and the tab order: le_remoteAddress, le_remoteAudioPort, le_remoteVideoPort, le_sendConfig, pb_startSend, pb_transmit, pb_stopSend, le_localAudioPort, le_localVideoPort, le_receiveConfig, pb_startReceive, pb_stopReceive, pb_record, sl_mic, sl_spk.]
psimedia-master/gstprovider/alsalatency/Makefile:

all: alsalatency

alsalatency: alsalatency.c
	gcc alsalatency.c -o alsalatency -lasound

psimedia-master/gstprovider/alsalatency/alsalatency.c:

/* original #include targets were lost in extraction; reconstructed from
   usage: snd_pcm_* / snd_strerror need <alsa/asoundlib.h>, plus stdio
   and string routines */
#include <stdio.h>
#include <string.h>
#include <alsa/asoundlib.h>

// sets up a playback or capture handle to 44100/S16_LE/1
// on success, sets *psize to period_size and returns non-zero
static int setup_handle(snd_pcm_t *handle, unsigned int rate, snd_pcm_uframes_t *psize)
{
	int err;
	snd_pcm_hw_params_t *hw_params;

	if((err = snd_pcm_hw_params_malloc(&hw_params)) < 0)
	{
		fprintf(stderr, "cannot allocate hardware parameter structure (%s)\n", snd_strerror(err));
		return 0;
	}

	if((err = snd_pcm_hw_params_any(handle, hw_params)) < 0)
	{
		fprintf(stderr, "cannot initialize hardware parameter structure (%s)\n", snd_strerror(err));
		return 0;
	}

	if((err = snd_pcm_hw_params_set_access(handle, hw_params, SND_PCM_ACCESS_RW_INTERLEAVED)) < 0)
	{
		fprintf(stderr, "cannot set access type (%s)\n", snd_strerror(err));
		return 0;
	}

	if((err = snd_pcm_hw_params_set_format(handle, hw_params, SND_PCM_FORMAT_S16_LE)) < 0)
	{
		fprintf(stderr, "cannot set sample format (%s)\n", snd_strerror(err));
		return 0;
	}

	if((err = snd_pcm_hw_params_set_rate_near(handle, hw_params, &rate, 0)) < 0)
	{
		fprintf(stderr, "cannot set sample rate (%s)\n", snd_strerror(err));
		return 0;
	}

	if((err = snd_pcm_hw_params_set_channels(handle, hw_params, 1)) < 0)
	{
		fprintf(stderr, "cannot set channel count (%s)\n", snd_strerror(err));
		return 0;
	}

	if((err = snd_pcm_hw_params(handle, hw_params)) < 0)
	{
		fprintf(stderr, "cannot set parameters (%s)\n", snd_strerror(err));
		return 0;
	}
snd_pcm_hw_params_get_period_size(hw_params, psize, 0); snd_pcm_hw_params_free(hw_params); return 1; } static void usage() { printf("usage: alsalatency rec (capture_device)\n"); printf(" alsalatency loop (play_device) (capture_device)\n"); printf("\n"); printf("note: if capture_device or play_device are omitted, 'default' is assumed.\n\n"); } int main(int argc, char **argv) { char *playback_device; char *capture_device; snd_pcm_t *playback_handle; snd_pcm_t *capture_handle; snd_pcm_uframes_t playback_psize; snd_pcm_uframes_t capture_psize; int mode; unsigned int rate = 44100; short *pbuf, *cbuf; int count; int err; int at, at_play; FILE *fin, *fout; if(argc < 2) { usage(); return 1; } mode = -1; if(!strcmp(argv[1], "rec")) mode = 0; else if(!strcmp(argv[1], "loop")) mode = 1; else { usage(); return 1; } playback_device = "default"; capture_device = playback_device; if(mode == 0) { if(argc >= 3) capture_device = argv[2]; } else // 1 { if(argc >= 3) { playback_device = argv[2]; if(argc >= 4) capture_device = argv[3]; } } if(mode == 0) { if((err = snd_pcm_open(&capture_handle, capture_device, SND_PCM_STREAM_CAPTURE, 0)) < 0) { fprintf(stderr, "cannot open audio device %s (%s)\n", capture_device, snd_strerror(err)); return 1; } if(!setup_handle(capture_handle, rate, &capture_psize)) return 1; fout = fopen("play.raw", "wb"); if(!fout) { fprintf(stderr, "Error opening play.raw for writing.\n"); return 1; } cbuf = (short *)malloc(capture_psize * 2); printf("Recording 5-second audio clip to play.raw\n"); if((err = snd_pcm_prepare(capture_handle)) < 0) { fprintf(stderr, "cannot prepare audio interface for use (%s)\n", snd_strerror(err)); return 1; } for(at = 0; at < (int)rate * 5;) { if(at + capture_psize < rate * 5) count = capture_psize; else count = (rate * 5) - at; if((err = snd_pcm_readi(capture_handle, cbuf, count)) != count) { fprintf(stderr, "read from audio interface failed (%s)\n", snd_strerror(err)); return 1; } fwrite(cbuf, count * 2, 1, fout); at += count; } 
snd_pcm_close(capture_handle); free(cbuf); fclose(fout); } else // 1 { if((err = snd_pcm_open(&playback_handle, playback_device, SND_PCM_STREAM_PLAYBACK, 0)) < 0) { fprintf(stderr, "cannot open audio device %s (%s)\n", playback_device, snd_strerror(err)); return 1; } if((err = snd_pcm_open(&capture_handle, capture_device, SND_PCM_STREAM_CAPTURE, 0)) < 0) { fprintf(stderr, "cannot open audio device %s (%s)\n", capture_device, snd_strerror(err)); return 1; } if(!setup_handle(playback_handle, rate, &playback_psize)) return 1; if(!setup_handle(capture_handle, rate, &capture_psize)) return 1; fin = fopen("play.raw", "rb"); if(!fin) { fprintf(stderr, "Error opening play.raw for reading.\n"); return 1; } fout = fopen("loop.raw", "wb"); if(!fout) { fprintf(stderr, "Error opening loop.raw for writing.\n"); return 1; } pbuf = (short *)malloc(playback_psize * 2); cbuf = (short *)malloc(capture_psize * 2); printf("Playing play.raw while recording simultaneously to loop.raw\n"); if((err = snd_pcm_prepare(playback_handle)) < 0) { fprintf(stderr, "cannot prepare audio interface for use (%s)\n", snd_strerror(err)); return 1; } at_play = 0; at = -1; while(at < at_play) { if(fin) { if(!feof(fin)) { count = fread(pbuf, 2, playback_psize, fin); if(count <= 0) { snd_pcm_close(playback_handle); free(pbuf); fclose(fin); fin = 0; continue; } if((err = snd_pcm_writei(playback_handle, pbuf, count)) != count) { fprintf(stderr, "write to audio interface failed (%s)\n", snd_strerror(err)); snd_pcm_prepare(playback_handle); } at_play += count; // make sure play buffer has enough // data to survive while capturing if(at_play < (at == -1 ? 
0 : at) + (4 * (int)capture_psize)) continue; } else { snd_pcm_close(playback_handle); free(pbuf); fclose(fin); fin = 0; } } if(at == -1) { if((err = snd_pcm_prepare(capture_handle)) < 0) { fprintf(stderr, "cannot prepare audio interface for use (%s)\n", snd_strerror(err)); return 1; } at = 0; } if(at + (int)capture_psize < at_play) count = capture_psize; else count = at_play - at; if((err = snd_pcm_readi(capture_handle, cbuf, count)) != count) { fprintf(stderr, "read from audio interface failed (%s)\n", snd_strerror(err)); return 1; } fwrite(cbuf, count * 2, 1, fout); at += count; } snd_pcm_close(capture_handle); free(cbuf); fclose(fout); } printf("done\n"); return 0; } psimedia-master/gstprovider/bins.cpp000066400000000000000000000301331220046403000201400ustar00rootroot00000000000000/* * Copyright (C) 2009 Barracuda Networks, Inc. * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Lesser General Public * License as published by the Free Software Foundation; either * version 2.1 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Lesser General Public License for more details. 
* * You should have received a copy of the GNU Lesser General Public * License along with this library; if not, write to the Free Software * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA * 02110-1301 USA * */ #include "bins.h" #include #include #include #include // default latency is 200ms #define DEFAULT_RTP_LATENCY 200 namespace PsiMedia { static int get_rtp_latency() { QString val = QString::fromLatin1(qgetenv("PSI_RTP_LATENCY")); if(!val.isEmpty()) return val.toInt(); else return DEFAULT_RTP_LATENCY; } static GstElement *audio_codec_to_enc_element(const QString &name) { QString ename; if(name == "speex") ename = "speexenc"; else if(name == "vorbis") ename = "vorbisenc"; else if(name == "pcmu") ename = "mulawenc"; else return 0; return gst_element_factory_make(ename.toLatin1().data(), NULL); } static GstElement *audio_codec_to_dec_element(const QString &name) { QString ename; if(name == "speex") ename = "speexdec"; else if(name == "vorbis") ename = "vorbisdec"; else if(name == "pcmu") ename = "mulawdec"; else return 0; return gst_element_factory_make(ename.toLatin1().data(), NULL); } static GstElement *audio_codec_to_rtppay_element(const QString &name) { QString ename; if(name == "speex") ename = "rtpspeexpay"; else if(name == "vorbis") ename = "rtpvorbispay"; else if(name == "pcmu") ename = "rtppcmupay"; else return 0; return gst_element_factory_make(ename.toLatin1().data(), NULL); } static GstElement *audio_codec_to_rtpdepay_element(const QString &name) { QString ename; if(name == "speex") ename = "rtpspeexdepay"; else if(name == "vorbis") ename = "rtpvorbisdepay"; else if(name == "pcmu") ename = "rtppcmudepay"; else return 0; return gst_element_factory_make(ename.toLatin1().data(), NULL); } static GstElement *video_codec_to_enc_element(const QString &name) { QString ename; if(name == "theora") ename = "theoraenc"; else if(name == "h263p") ename = "ffenc_h263p"; else return 0; return gst_element_factory_make(ename.toLatin1().data(), NULL); } 
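The mapping helpers above (and the video variants that follow) all share the same shape: codec name in, GStreamer element name out, with 0 for unknown codecs. The same lookup can be written table-driven; this C sketch reuses the audio element names from the code above, while the table layout and function names are ours, not the original's:

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Table-driven version of the codec -> GStreamer element-name mapping
 * used by the helpers above. Element names are taken from the original
 * code; the table itself is an illustrative restructuring. */
static const struct {
    const char *codec;
    const char *enc;
    const char *dec;
} audio_map[] = {
    { "speex",  "speexenc",  "speexdec"  },
    { "vorbis", "vorbisenc", "vorbisdec" },
    { "pcmu",   "mulawenc",  "mulawdec"  },
};

/* Returns NULL for unknown codecs, mirroring the original's "return 0" path. */
const char *audio_enc_element_name(const char *codec)
{
    for (size_t i = 0; i < sizeof(audio_map) / sizeof(audio_map[0]); ++i) {
        if (strcmp(audio_map[i].codec, codec) == 0)
            return audio_map[i].enc;
    }
    return NULL;
}

const char *audio_dec_element_name(const char *codec)
{
    for (size_t i = 0; i < sizeof(audio_map) / sizeof(audio_map[0]); ++i) {
        if (strcmp(audio_map[i].codec, codec) == 0)
            return audio_map[i].dec;
    }
    return NULL;
}
```

A table keeps the four per-direction lookups (enc/dec/pay/depay) from drifting out of sync when a codec is added.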
static GstElement *video_codec_to_dec_element(const QString &name) { QString ename; if(name == "theora") ename = "theoradec"; else if(name == "h263p") ename = "ffdec_h263"; else return 0; return gst_element_factory_make(ename.toLatin1().data(), NULL); } static GstElement *video_codec_to_rtppay_element(const QString &name) { QString ename; if(name == "theora") ename = "rtptheorapay"; else if(name == "h263p") ename = "rtph263ppay"; else return 0; return gst_element_factory_make(ename.toLatin1().data(), NULL); } static GstElement *video_codec_to_rtpdepay_element(const QString &name) { QString ename; if(name == "theora") ename = "rtptheoradepay"; else if(name == "h263p") ename = "rtph263pdepay"; else return 0; return gst_element_factory_make(ename.toLatin1().data(), NULL); } static bool audio_codec_get_send_elements(const QString &name, GstElement **enc, GstElement **rtppay) { GstElement *eenc = audio_codec_to_enc_element(name); if(!eenc) return false; GstElement *epay = audio_codec_to_rtppay_element(name); if(!epay) { g_object_unref(G_OBJECT(eenc)); return false; } *enc = eenc; *rtppay = epay; return true; } static bool audio_codec_get_recv_elements(const QString &name, GstElement **dec, GstElement **rtpdepay) { GstElement *edec = audio_codec_to_dec_element(name); if(!edec) return false; GstElement *edepay = audio_codec_to_rtpdepay_element(name); if(!edepay) { g_object_unref(G_OBJECT(edec)); return false; } *dec = edec; *rtpdepay = edepay; return true; } static bool video_codec_get_send_elements(const QString &name, GstElement **enc, GstElement **rtppay) { GstElement *eenc = video_codec_to_enc_element(name); if(!eenc) return false; GstElement *epay = video_codec_to_rtppay_element(name); if(!epay) { g_object_unref(G_OBJECT(eenc)); return false; } *enc = eenc; *rtppay = epay; return true; } static bool video_codec_get_recv_elements(const QString &name, GstElement **dec, GstElement **rtpdepay) { GstElement *edec = video_codec_to_dec_element(name); if(!edec) return false; GstElement *edepay =
video_codec_to_rtpdepay_element(name); if(!edepay) { g_object_unref(G_OBJECT(edec)); return false; } *dec = edec; *rtpdepay = edepay; return true; } GstElement *bins_videoprep_create(const QSize &size, int fps, bool is_live) { GstElement *bin = gst_bin_new("videoprepbin"); GstElement *videorate = 0; GstElement *ratefilter = 0; if(fps != -1) { // use videomaxrate for live sources if(is_live) videorate = gst_element_factory_make("videomaxrate", NULL); else videorate = gst_element_factory_make("videorate", NULL); ratefilter = gst_element_factory_make("capsfilter", NULL); GstCaps *caps = gst_caps_new_empty(); GstStructure *cs = gst_structure_new("video/x-raw-yuv", "framerate", GST_TYPE_FRACTION, fps, 1, NULL); gst_caps_append_structure(caps, cs); cs = gst_structure_new("video/x-raw-rgb", "framerate", GST_TYPE_FRACTION, fps, 1, NULL); gst_caps_append_structure(caps, cs); g_object_set(G_OBJECT(ratefilter), "caps", caps, NULL); gst_caps_unref(caps); } GstElement *videoscale = 0; GstElement *scalefilter = 0; if(size.isValid()) { videoscale = gst_element_factory_make("videoscale", NULL); scalefilter = gst_element_factory_make("capsfilter", NULL); GstCaps *caps = gst_caps_new_empty(); GstStructure *cs = gst_structure_new("video/x-raw-yuv", "width", G_TYPE_INT, size.width(), "height", G_TYPE_INT, size.height(), NULL); gst_caps_append_structure(caps, cs); cs = gst_structure_new("video/x-raw-rgb", "width", G_TYPE_INT, size.width(), "height", G_TYPE_INT, size.height(), NULL); gst_caps_append_structure(caps, cs); g_object_set(G_OBJECT(scalefilter), "caps", caps, NULL); gst_caps_unref(caps); } if(!videorate && !videoscale) { // not altering anything?
return no-op return gst_element_factory_make("identity", NULL); } GstElement *start, *end; if(videorate && videoscale) { start = videorate; end = scalefilter; } else if(videorate && !videoscale) { start = videorate; end = ratefilter; } else // !videorate && videoscale { start = videoscale; end = scalefilter; } if(videorate) { gst_bin_add(GST_BIN(bin), videorate); gst_bin_add(GST_BIN(bin), ratefilter); gst_element_link(videorate, ratefilter); } if(videoscale) { gst_bin_add(GST_BIN(bin), videoscale); gst_bin_add(GST_BIN(bin), scalefilter); gst_element_link(videoscale, scalefilter); } if(videorate && videoscale) gst_element_link(ratefilter, videoscale); GstPad *pad; pad = gst_element_get_static_pad(start, "sink"); gst_element_add_pad(bin, gst_ghost_pad_new("sink", pad)); gst_object_unref(GST_OBJECT(pad)); pad = gst_element_get_static_pad(end, "src"); gst_element_add_pad(bin, gst_ghost_pad_new("src", pad)); gst_object_unref(GST_OBJECT(pad)); return bin; } GstElement *bins_audioenc_create(const QString &codec, int id, int rate, int size, int channels) { GstElement *bin = gst_bin_new("audioencbin"); GstElement *audioenc = 0; GstElement *audiortppay = 0; if(!audio_codec_get_send_elements(codec, &audioenc, &audiortppay)) return 0; if(id != -1) g_object_set(G_OBJECT(audiortppay), "pt", id, NULL); GstElement *audioconvert = gst_element_factory_make("audioconvert", NULL); GstElement *audioresample = gst_element_factory_make("audioresample", NULL); GstStructure *cs; GstCaps *caps = gst_caps_new_empty(); if(codec == "vorbis") { cs = gst_structure_new("audio/x-raw-float", "rate", G_TYPE_INT, rate, "width", G_TYPE_INT, size, "channels", G_TYPE_INT, channels, NULL); gst_caps_append_structure(caps, cs); } else { cs = gst_structure_new("audio/x-raw-int", "rate", G_TYPE_INT, rate, "width", G_TYPE_INT, size, "channels", G_TYPE_INT, channels, NULL); gst_caps_append_structure(caps, cs); printf("rate=%d,width=%d,channels=%d\n", rate, size, channels); } GstElement *capsfilter = 
gst_element_factory_make("capsfilter", NULL); g_object_set(G_OBJECT(capsfilter), "caps", caps, NULL); gst_caps_unref(caps); gst_bin_add(GST_BIN(bin), audioconvert); gst_bin_add(GST_BIN(bin), audioresample); gst_bin_add(GST_BIN(bin), capsfilter); gst_bin_add(GST_BIN(bin), audioenc); gst_bin_add(GST_BIN(bin), audiortppay); gst_element_link_many(audioconvert, audioresample, capsfilter, audioenc, audiortppay, NULL); GstPad *pad; pad = gst_element_get_static_pad(audioconvert, "sink"); gst_element_add_pad(bin, gst_ghost_pad_new("sink", pad)); gst_object_unref(GST_OBJECT(pad)); pad = gst_element_get_static_pad(audiortppay, "src"); gst_element_add_pad(bin, gst_ghost_pad_new("src", pad)); gst_object_unref(GST_OBJECT(pad)); return bin; } GstElement *bins_videoenc_create(const QString &codec, int id, int maxkbps) { GstElement *bin = gst_bin_new("videoencbin"); GstElement *videoenc = 0; GstElement *videortppay = 0; if(!video_codec_get_send_elements(codec, &videoenc, &videortppay)) return 0; if(id != -1) g_object_set(G_OBJECT(videortppay), "pt", id, NULL); if(codec == "theora") g_object_set(G_OBJECT(videoenc), "bitrate", maxkbps, NULL); GstElement *videoconvert = gst_element_factory_make("ffmpegcolorspace", NULL); gst_bin_add(GST_BIN(bin), videoconvert); gst_bin_add(GST_BIN(bin), videoenc); gst_bin_add(GST_BIN(bin), videortppay); gst_element_link_many(videoconvert, videoenc, videortppay, NULL); GstPad *pad; pad = gst_element_get_static_pad(videoconvert, "sink"); gst_element_add_pad(bin, gst_ghost_pad_new("sink", pad)); gst_object_unref(GST_OBJECT(pad)); pad = gst_element_get_static_pad(videortppay, "src"); gst_element_add_pad(bin, gst_ghost_pad_new("src", pad)); gst_object_unref(GST_OBJECT(pad)); return bin; } GstElement *bins_audiodec_create(const QString &codec) { GstElement *bin = gst_bin_new("audiodecbin"); GstElement *audiodec = 0; GstElement *audiortpdepay = 0; if(!audio_codec_get_recv_elements(codec, &audiodec, &audiortpdepay)) return 0; GstElement *audiortpjitterbuffer 
= gst_element_factory_make("gstrtpjitterbuffer", NULL); gst_bin_add(GST_BIN(bin), audiortpjitterbuffer); gst_bin_add(GST_BIN(bin), audiortpdepay); gst_bin_add(GST_BIN(bin), audiodec); gst_element_link_many(audiortpjitterbuffer, audiortpdepay, audiodec, NULL); g_object_set(G_OBJECT(audiortpjitterbuffer), "latency", (unsigned int)get_rtp_latency(), NULL); GstPad *pad; pad = gst_element_get_static_pad(audiortpjitterbuffer, "sink"); gst_element_add_pad(bin, gst_ghost_pad_new("sink", pad)); gst_object_unref(GST_OBJECT(pad)); pad = gst_element_get_static_pad(audiodec, "src"); gst_element_add_pad(bin, gst_ghost_pad_new("src", pad)); gst_object_unref(GST_OBJECT(pad)); return bin; } GstElement *bins_videodec_create(const QString &codec) { GstElement *bin = gst_bin_new("videodecbin"); GstElement *videodec = 0; GstElement *videortpdepay = 0; if(!video_codec_get_recv_elements(codec, &videodec, &videortpdepay)) return 0; GstElement *videortpjitterbuffer = gst_element_factory_make("gstrtpjitterbuffer", NULL); gst_bin_add(GST_BIN(bin), videortpjitterbuffer); gst_bin_add(GST_BIN(bin), videortpdepay); gst_bin_add(GST_BIN(bin), videodec); gst_element_link_many(videortpjitterbuffer, videortpdepay, videodec, NULL); g_object_set(G_OBJECT(videortpjitterbuffer), "latency", (unsigned int)get_rtp_latency(), NULL); GstPad *pad; pad = gst_element_get_static_pad(videortpjitterbuffer, "sink"); gst_element_add_pad(bin, gst_ghost_pad_new("sink", pad)); gst_object_unref(GST_OBJECT(pad)); pad = gst_element_get_static_pad(videodec, "src"); gst_element_add_pad(bin, gst_ghost_pad_new("src", pad)); gst_object_unref(GST_OBJECT(pad)); return bin; } } psimedia-master/gstprovider/bins.h000066400000000000000000000024201220046403000176030ustar00rootroot00000000000000/* * Copyright (C) 2009 Barracuda Networks, Inc. 
* * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Lesser General Public * License as published by the Free Software Foundation; either * version 2.1 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Lesser General Public License for more details. * * You should have received a copy of the GNU Lesser General Public * License along with this library; if not, write to the Free Software * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA * 02110-1301 USA * */ #ifndef PSI_BINS_H #define PSI_BINS_H #include class QString; class QSize; namespace PsiMedia { GstElement *bins_videoprep_create(const QSize &size, int fps, bool is_live); GstElement *bins_audioenc_create(const QString &codec, int id, int rate, int size, int channels); GstElement *bins_videoenc_create(const QString &codec, int id, int maxkbps); GstElement *bins_audiodec_create(const QString &codec); GstElement *bins_videodec_create(const QString &codec); } #endif psimedia-master/gstprovider/deviceenum/000077500000000000000000000000001220046403000206255ustar00rootroot00000000000000psimedia-master/gstprovider/deviceenum/README000066400000000000000000000005161220046403000215070ustar00rootroot00000000000000DeviceEnum looks up devices using platform-specific APIs and approaches. It does not depend on GStreamer, but the results can be used with GStreamer. GStreamer does actually support device detection on its own, but results are spotty. If the day comes where detection through GStreamer works 100% then we can deprecate DeviceEnum. 
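The deviceenum.h header that follows notes that enumeration almost always returns duplicate devices (ALSA hw vs plughw, V4L vs V4L2, OSS vs ALSA) and leaves filtering to the caller. A minimal first-wins de-duplication keyed on (driver, id) might look like the C sketch below; `dev_item` is a hypothetical pared-down stand-in for the real DeviceEnum::Item, which also carries type, direction, and name fields:

```c
#include <assert.h>
#include <string.h>

/* Simplified stand-in for DeviceEnum::Item (illustration only). */
struct dev_item {
    const char *driver; /* e.g. "oss", "alsa" */
    const char *id;     /* e.g. "/dev/dsp", "plughw:0,0" */
};

/* Keep only the first occurrence of each (driver, id) pair, compacting
 * the array in place. Returns the new count. */
int dedup_devices(struct dev_item *items, int count)
{
    int out = 0;
    for (int i = 0; i < count; ++i) {
        int seen = 0;
        for (int j = 0; j < out; ++j) {
            if (strcmp(items[j].driver, items[i].driver) == 0 &&
                strcmp(items[j].id, items[i].id) == 0) {
                seen = 1;
                break;
            }
        }
        if (!seen)
            items[out++] = items[i];
    }
    return out;
}
```

First-wins order matters here because the enumerators put preferred entries (e.g. the "default" device) at the front of the list.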
psimedia-master/gstprovider/deviceenum/deviceenum.h000066400000000000000000000035311220046403000231240ustar00rootroot00000000000000/* * Copyright (C) 2006-2009 Justin Karneges * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Lesser General Public * License as published by the Free Software Foundation; either * version 2.1 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Lesser General Public License for more details. * * You should have received a copy of the GNU Lesser General Public * License along with this library; if not, write to the Free Software * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA * 02110-1301 USA * */ #ifndef DEVICEENUM_H #define DEVICEENUM_H #include #include #include namespace DeviceEnum { class Item { public: enum Type { Audio, Video }; enum Direction { Input, Output }; Type type; // Audio or Video Direction dir; // Input (mic) or Output (speaker) QString name; // friendly name QString driver; // e.g. "oss", "alsa" QString id; // e.g. "/dev/dsp", "hw:0,0" QSize explicitCaptureSize; // work around buggy cameras/drivers }; // Note: // There will almost always be duplicate devices returned by // the following functions. It is up to the user of this interface // to filter the results as necessary. 
Possible duplications: // - ALSA devices showing up twice (hw and plughw) // - Video4Linux devices showing up twice (V4L and V4L2) // - Both OSS and ALSA entries showing up for the same device QList audioOutputItems(const QString &driver = QString()); QList audioInputItems(const QString &driver = QString()); QList videoInputItems(const QString &driver = QString()); } #endif psimedia-master/gstprovider/deviceenum/deviceenum.pri000066400000000000000000000003221220046403000234620ustar00rootroot00000000000000HEADERS += $$PWD/deviceenum.h windows: { SOURCES += $$PWD/deviceenum_win.cpp } unix:!mac: { SOURCES += $$PWD/deviceenum_unix.cpp } mac: { SOURCES += $$PWD/deviceenum_mac.cpp LIBS += -framework CoreAudio } psimedia-master/gstprovider/deviceenum/deviceenum_mac.cpp000066400000000000000000000074441220046403000243060ustar00rootroot00000000000000/* * Copyright (C) 2006-2009 Justin Karneges * Copyright (C) 2006-2009 Remko Troncon * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Lesser General Public * License as published by the Free Software Foundation; either * version 2.1 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Lesser General Public License for more details. 
* * You should have received a copy of the GNU Lesser General Public * License along with this library; if not, write to the Free Software * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA * 02110-1301 USA * */ #include "deviceenum.h" #include namespace DeviceEnum { #define DIR_INPUT 1 #define DIR_OUTPUT 2 static int find_by_id(const QList &list, int id) { for(int n = 0; n < list.count(); ++n) { if(list[n].id.toInt() == id) return n; } return -1; } static QList get_audio_items(int type) { QList out; int nb_devices = 0; UInt32 size = 0; if(AudioHardwareGetPropertyInfo(kAudioHardwarePropertyDevices, &size, NULL) != 0) return out; nb_devices = size / sizeof(AudioDeviceID); // Get the devices AudioDeviceID devices[nb_devices]; AudioHardwareGetProperty(kAudioHardwarePropertyDevices, &size, devices); for(int i = 0; i < nb_devices; i++) { // Get the device name char name[1024]; size = sizeof(name); if(AudioDeviceGetProperty(devices[i], 0, 0, kAudioDevicePropertyDeviceName, &size, name) != 0) continue; QString qname = QString::fromLatin1(name); // Query the input streams if(AudioDeviceGetPropertyInfo(devices[i], 0, 1, kAudioDevicePropertyStreams, &size, NULL) != 0) continue; bool input = (size > 0); // Query the output streams if(AudioDeviceGetPropertyInfo(devices[i], 0, 0, kAudioDevicePropertyStreams, &size, NULL) != 0) continue; bool output = (size > 0); int dev_int = devices[i]; if(type & DIR_INPUT && input) { Item i; i.type = Item::Audio; i.dir = Item::Input; i.name = qname; i.driver = "osxaudio"; i.id = QString::number(dev_int); out += i; } if(type & DIR_OUTPUT && output) { Item i; i.type = Item::Audio; i.dir = Item::Output; i.name = qname; i.driver = "osxaudio"; i.id = QString::number(dev_int); out += i; } } // do default output first, then input, so that if both are found, input // will end up at the top. not that it really matters. 
// Get the default output device if(type & DIR_OUTPUT) { size = sizeof(AudioDeviceID); AudioDeviceID default_output = kAudioDeviceUnknown; if(AudioHardwareGetProperty(kAudioHardwarePropertyDefaultOutputDevice, &size, &default_output) == 0) { int at = find_by_id(out, default_output); if(at != -1) out.move(at, 0); } } // Get the default input device if(type & DIR_INPUT) { size = sizeof(AudioDeviceID); AudioDeviceID default_input = kAudioDeviceUnknown; if(AudioHardwareGetProperty(kAudioHardwarePropertyDefaultInputDevice, &size, &default_input) == 0) { int at = find_by_id(out, default_input); if(at != -1) out.move(at, 0); } } return out; } QList audioOutputItems(const QString &driver) { Q_UNUSED(driver); return get_audio_items(DIR_OUTPUT); } QList audioInputItems(const QString &driver) { Q_UNUSED(driver); return get_audio_items(DIR_INPUT); } QList videoInputItems(const QString &driver) { Q_UNUSED(driver); QList out; // hardcode a default input device Item i; i.type = Item::Video; i.dir = Item::Input; i.name = "Default"; i.driver = "osxvideo"; i.id = QString(); // unspecified out += i; return out; } } psimedia-master/gstprovider/deviceenum/deviceenum_unix.cpp000066400000000000000000000276631220046403000245360ustar00rootroot00000000000000/* * Copyright (C) 2006-2009 Justin Karneges * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Lesser General Public * License as published by the Free Software Foundation; either * version 2.1 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Lesser General Public License for more details. 
* * You should have received a copy of the GNU Lesser General Public * License along with this library; if not, write to the Free Software * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA * 02110-1301 USA * */ #include "deviceenum.h" #include #include #include #include #include #include #include #include #include #ifdef Q_OS_LINUX # include # include # include # include #endif namespace DeviceEnum { #define DIR_INPUT 1 #define DIR_OUTPUT 2 // taken from netinterface_unix (changed the split to KeepEmptyParts) static QStringList read_proc_as_lines(const char *procfile) { QStringList out; FILE *f = fopen(procfile, "r"); if(!f) return out; QByteArray buf; while(!feof(f)) { // max read on a proc is 4K QByteArray block(4096, 0); int ret = fread(block.data(), 1, block.size(), f); if(ret <= 0) break; block.resize(ret); buf += block; } fclose(f); QString str = QString::fromLocal8Bit(buf); out = str.split('\n', QString::KeepEmptyParts); return out; } // check scheme from portaudio static bool check_oss(const QString &dev, bool input) { int fd = open(QFile::encodeName(dev).data(), (input ? 
O_RDONLY : O_WRONLY) | O_NONBLOCK); if(fd == -1) { if(errno == EBUSY || errno == EAGAIN) return false; // device is busy else return false; // can't access } close(fd); return true; } static QList get_oss_items(int type) { QList out; // sndstat detection scheme from pulseaudio QStringList stat; stat = read_proc_as_lines("/dev/sndstat"); if(stat.isEmpty()) { stat = read_proc_as_lines("/proc/sndstat"); if(stat.isEmpty()) { stat = read_proc_as_lines("/proc/asound/oss/sndstat"); if(stat.isEmpty()) return out; } } // sndstat processing scheme from pulseaudio int at; at = stat.indexOf("Audio devices:"); if(at == -1) { at = stat.indexOf("Installed devices:"); if(at == -1) return out; } ++at; for(; at < stat.count() && !stat[at].isEmpty(); ++at) { QString line = stat[at]; int x = line.indexOf(": "); if(x == -1) continue; QString devnum = line.mid(0, x); QString devname = line.mid(x + 2); // apparently FreeBSD ids start with pcm in front bool bsd = false; if(devnum.left(3) == "pcm") { bsd = true; devnum = devnum.mid(3); } bool ok; int num = devnum.toInt(&ok); if(!ok) continue; x = devname.indexOf(" (DUPLEX)"); if(x != -1) devname = devname.mid(0, x); QStringList possible; // apparently FreeBSD has ".0" appended to the devices if(bsd) possible += QString("/dev/dsp%1.0").arg(num); else possible += QString("/dev/dsp%1").arg(num); // if we're looking for the 0 item, this might be "dsp" // without a number on it if(num == 0 && !bsd) possible += "/dev/dsp"; QString dev; foreach(dev, possible) { if(QFile::exists(dev)) break; } if(type & DIR_INPUT && check_oss(dev, true)) { Item i; i.type = Item::Audio; i.dir = Item::Input; i.name = devname; i.driver = "oss"; i.id = dev; out += i; } if(type & DIR_OUTPUT && check_oss(dev, false)) { Item i; i.type = Item::Audio; i.dir = Item::Output; i.name = devname; i.driver = "oss"; i.id = dev; out += i; } } return out; } // /proc/asound/devices // 16: [0- 0]: digital audio playback // 24: [0- 0]: digital audio capture // 0: [0- 0]: ctl // 33: : 
timer // 56: [1- 0]: digital audio capture // 32: [1- 0]: ctl // // /proc/asound/pcm // 00-00: ALC260 Analog : ALC260 Analog : playback 1 : capture 1 // 01-00: USB Audio : USB Audio : capture 1 class AlsaItem { public: int card, dev; bool input; QString name; }; static QList get_alsa_items(int type) { #ifdef Q_OS_LINUX QList out; QList items; QStringList devices_lines = read_proc_as_lines("/proc/asound/devices"); foreach(QString line, devices_lines) { // get the fields we care about QString devbracket, devtype; int x = line.indexOf(": "); if(x == -1) continue; QString sub = line.mid(x + 2); x = sub.indexOf(": "); if(x == -1) continue; devbracket = sub.mid(0, x); devtype = sub.mid(x + 2); // skip all but playback and capture bool input; if(devtype == "digital audio playback") input = false; else if(devtype == "digital audio capture") input = true; else continue; // skip what isn't asked for if(!(type & DIR_INPUT) && input) continue; if(!(type & DIR_OUTPUT) && !input) continue; // hack off brackets if(devbracket[0] != '[' || devbracket[devbracket.length()-1] != ']') continue; devbracket = devbracket.mid(1, devbracket.length() - 2); QString cardstr, devstr; x = devbracket.indexOf('-'); if(x == -1) continue; cardstr = devbracket.mid(0, x); devstr = devbracket.mid(x + 1); AlsaItem ai; bool ok; ai.card = cardstr.toInt(&ok); if(!ok) continue; ai.dev = devstr.toInt(&ok); if(!ok) continue; ai.input = input; ai.name.sprintf("ALSA Card %d, Device %d", ai.card, ai.dev); items += ai; } // try to get the friendly names QStringList pcm_lines = read_proc_as_lines("/proc/asound/pcm"); foreach(QString line, pcm_lines) { QString devnumbers, devname; int x = line.indexOf(": "); if(x == -1) continue; devnumbers = line.mid(0, x); devname = line.mid(x + 2); x = devname.indexOf(" :"); if(x != -1) devname = devname.mid(0, x); else devname = devname.trimmed(); QString cardstr, devstr; x = devnumbers.indexOf('-'); if(x == -1) continue; cardstr = devnumbers.mid(0, x); devstr = 
devnumbers.mid(x + 1); bool ok; int cardnum = cardstr.toInt(&ok); if(!ok) continue; int devnum = devstr.toInt(&ok); if(!ok) continue; for(int n = 0; n < items.count(); ++n) { AlsaItem &ai = items[n]; if(ai.card == cardnum && ai.dev == devnum) ai.name = devname; } } // make a "default" item if(!items.isEmpty()) { Item i; i.type = Item::Audio; if(type == DIR_INPUT) i.dir = Item::Input; else // DIR_OUTPUT i.dir = Item::Output; i.name = "Default"; i.driver = "alsa"; i.id = "default"; out += i; } for(int n = 0; n < items.count(); ++n) { AlsaItem &ai = items[n]; // make an item for both hw and plughw Item i; i.type = Item::Audio; if(ai.input) i.dir = Item::Input; else i.dir = Item::Output; i.name = ai.name; i.driver = "alsa"; i.id = QString().sprintf("plughw:%d,%d", ai.card, ai.dev); out += i; // internet discussion seems to indicate that plughw is the // same as hw except that it will convert audio parameters // if necessary. the decision to use hw vs plughw is a // development choice, NOT a user choice. it is generally // recommended for apps to use plughw unless they have a // good reason. // // so, for now we'll only offer plughw and not hw //i.name = ai.name + " (Direct)"; //i.id = QString().sprintf("hw:%d,%d", ai.card, ai.dev); //out += i; } return out; #else // return empty list if non-linux Q_UNUSED(type); return QList(); #endif } #ifdef Q_OS_LINUX static QStringList scan_for_videodevs(const QString &dirpath) { QStringList out; DIR *dir = opendir(QFile::encodeName(dirpath)); if(!dir) return out; while(1) { struct dirent *e; e = readdir(dir); if(!e) break; QString fname = QFile::decodeName(e->d_name); if(fname == "." 
|| fname == "..") continue; QFileInfo fi(dirpath + '/' + fname); if(fi.isSymLink()) continue; if(fi.isDir()) { out += scan_for_videodevs(fi.filePath()); } else { struct stat buf; if(lstat(QFile::encodeName(fi.filePath()).data(), &buf) == -1) continue; if(!S_ISCHR(buf.st_mode)) continue; int maj = ((unsigned short)buf.st_rdev) >> 8; int min = ((unsigned short)buf.st_rdev) & 0xff; if(maj == 81 && (min >= 0 && min <= 63)) out += fi.filePath(); } } closedir(dir); return out; } #endif class V4LName { public: QString name; QString dev; QString friendlyName; }; static QList<V4LName> get_v4l_names(const QString &path, bool sys) { QList<V4LName> out; QDir dir(path); if(!dir.exists()) return out; QStringList entries = dir.entryList(); foreach(QString fname, entries) { QFileInfo fi(dir.filePath(fname)); if(sys) { // sys names are dirs if(!fi.isDir()) continue; // sys names should begin with "video" if(fname.left(5) != "video") continue; V4LName v; v.name = fname; v.dev = QString("/dev/%1").arg(fname); QString modelPath = fi.filePath() + "/model"; QStringList lines = read_proc_as_lines(QFile::encodeName(modelPath).data()); if(!lines.isEmpty()) v.friendlyName = lines.first(); out += v; } else { // proc names are not dirs if(fi.isDir()) continue; // proc names need to be split into name/number int at; for(at = fname.length() - 1; at >= 0; --at) { if(!fname[at].isDigit()) break; } ++at; QString numstr = fname.mid(at); QString base = fname.mid(0, at); bool ok; int num = numstr.toInt(&ok); if(!ok) continue; // name should be "video" or "capture" if(base != "video" && base != "capture") continue; // but apparently the device is always called "video" QString dev = QString("/dev/video%1").arg(num); V4LName v; v.name = fname; v.dev = dev; out += v; } } return out; } static QList<Item> get_v4l2_items() { #ifdef Q_OS_LINUX QList<Item> out; QList<V4LName> list = get_v4l_names("/sys/class/video4linux", true); if(list.isEmpty()) list = get_v4l_names("/proc/video/dev", false); // if we can't find anything, then do a raw scan for
// possibilities
	if(list.isEmpty())
	{
		QStringList possible = scan_for_videodevs("/dev");
		foreach(QString str, possible)
		{
			V4LName v;
			v.dev = str;
			list += v;
		}
	}

	for(int n = 0; n < list.count(); ++n)
	{
		V4LName &v = list[n];

		// if we already have a friendly name then we'll skip the confirm
		//   in order to save resources.  the only real drawback here that
		//   I can think of is if the device isn't a capture type.  but
		//   what does it mean to have a V4L device that isn't capture??
		if(v.friendlyName.isEmpty())
		{
			int fd = open(QFile::encodeName(v.dev).data(), O_RDONLY | O_NONBLOCK);
			if(fd == -1)
				continue;

			// get video capabilities and close
			struct v4l2_capability caps;
			memset(&caps, 0, sizeof(caps));
			int ret = ioctl(fd, VIDIOC_QUERYCAP, &caps);
			close(fd);
			if(ret == -1)
				continue;
			if(!(caps.capabilities & V4L2_CAP_VIDEO_CAPTURE))
				continue;
			v.friendlyName = (const char *)caps.card;
		}

		Item i;
		i.type = Item::Video;
		i.dir = Item::Input;
		i.name = v.friendlyName;
		i.driver = "v4l2";
		i.id = v.dev;
		out += i;
	}

	return out;
#else
	// return empty list if non-linux
	return QList<Item>();
#endif
}

QList<Item> audioOutputItems(const QString &driver)
{
	QList<Item> out;
	if(driver.isEmpty() || driver == "oss")
		out += get_oss_items(DIR_OUTPUT);
	if(driver.isEmpty() || driver == "alsa")
		out += get_alsa_items(DIR_OUTPUT);
	return out;
}

QList<Item> audioInputItems(const QString &driver)
{
	QList<Item> out;
	if(driver.isEmpty() || driver == "oss")
		out += get_oss_items(DIR_INPUT);
	if(driver.isEmpty() || driver == "alsa")
		out += get_alsa_items(DIR_INPUT);
	return out;
}

QList<Item> videoInputItems(const QString &driver)
{
	QList<Item> out;
	if(driver.isEmpty() || driver == "v4l2")
		out += get_v4l2_items();
	return out;
}

}

---- psimedia-master/gstprovider/deviceenum/deviceenum_win.cpp ----

/*
 * Copyright (C) 2008-2009  Justin Karneges
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301  USA
 *
 */

#include "deviceenum.h"

namespace DeviceEnum {

QList<Item> audioOutputItems(const QString &driver)
{
	Q_UNUSED(driver);
	QList<Item> out;

	// hardcode a default output device
	Item i;
	i.type = Item::Audio;
	i.dir = Item::Output;
	i.name = "Default";
	i.driver = "directsound";
	i.id = QString(); // unspecified
	out += i;

	return out;
}

QList<Item> audioInputItems(const QString &driver)
{
	Q_UNUSED(driver);
	QList<Item> out;

	// hardcode a default input device
	Item i;
	i.type = Item::Audio;
	i.dir = Item::Input;
	i.name = "Default";
	i.driver = "directsound";
	i.id = QString(); // unspecified
	out += i;

	return out;
}

QList<Item> videoInputItems(const QString &driver)
{
	Q_UNUSED(driver);
	QList<Item> out;

	// hardcode a default input device
	Item i;
	i.type = Item::Video;
	i.dir = Item::Input;
	i.name = "Default";
	i.driver = "winks";
	i.id = QString(); // unspecified
	out += i;

	return out;
}

}

---- psimedia-master/gstprovider/devices.cpp ----

/*
 * Copyright (C) 2008  Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301  USA
 *
 */

#include "devices.h"

#include <QSize>
#include <QStringList>
#include <gst/gst.h>
#include <gst/interfaces/propertyprobe.h>
#include "deviceenum/deviceenum.h"

namespace PsiMedia {

class GstDeviceProbeValue
{
public:
	QString id;
	QString name;
};

static QList<GstDeviceProbeValue> device_probe(GstElement *e)
{
	GObjectClass *klass = G_OBJECT_GET_CLASS(e);
	if(!g_object_class_find_property(klass, "device") || !GST_IS_PROPERTY_PROBE(e))
		return QList<GstDeviceProbeValue>();

	GstPropertyProbe *probe = GST_PROPERTY_PROBE(e);
	if(!probe)
		return QList<GstDeviceProbeValue>();

	const GParamSpec *pspec = gst_property_probe_get_property(probe, "device");
	if(!pspec)
		return QList<GstDeviceProbeValue>();

	QList<GstDeviceProbeValue> out;

	GValueArray *list = gst_property_probe_probe_and_get_values(probe, pspec);
	if(list)
	{
		for(int n = 0; n < (int)list->n_values; ++n)
		{
			GValue *i = g_value_array_get_nth(list, n);

			// FIXME: "device" isn't always a string
			gchar *name;
			g_object_set(G_OBJECT(e), "device", g_value_get_string(i), NULL);
			g_object_get(G_OBJECT(e), "device-name", &name, NULL);

			GstDeviceProbeValue dev;
			dev.id = QString::fromUtf8(g_value_get_string(i));
			dev.name = QString::fromUtf8(name);
			g_free(name);
			out += dev;
		}

		g_value_array_free(list);
	}

	return out;
}

static bool element_should_use_probe(const QString &element_name)
{
	// we can enumerate devices in two ways.  one is via gst property
	//   probing and the other is through our own DeviceEnum code.
	//   since gst property probing is "the future", we'll take a
	//   probe-by-default approach, and only use DeviceEnum for specific
	//   elements

	// these should use DeviceEnum
	if(element_name == "alsasrc" ||
		element_name == "alsasink" ||
		element_name == "osssrc" ||
		element_name == "osssink" ||
		element_name == "v4lsrc" ||
		element_name == "v4l2src" ||
		element_name == "osxaudiosrc" ||
		element_name == "osxaudiosink" ||
		element_name == "ksvideosrc")
	{
		return false;
	}
	// all else probe
	else
		return true;
}

static QList<DeviceEnum::Item> device_enum(const QString &driver, PDevice::Type type)
{
	if(type == PDevice::AudioOut)
		return DeviceEnum::audioOutputItems(driver);
	else if(type == PDevice::AudioIn)
		return DeviceEnum::audioInputItems(driver);
	else // PDevice::VideoIn
		return DeviceEnum::videoInputItems(driver);
}

static QString id_part_escape(const QString &in)
{
	QString out;
	for(int n = 0; n < in.length(); ++n)
	{
		if(in[n] == '\\')
			out += "\\\\";
		else if(in[n] == ',')
			out += "\\c";
		else
			out += in[n];
	}
	return out;
}

static QString id_part_unescape(const QString &in)
{
	QString out;
	for(int n = 0; n < in.length(); ++n)
	{
		if(in[n] == '\\')
		{
			if(n + 1 >= in.length())
				return QString();

			++n;
			if(in[n] == '\\')
				out += '\\';
			else if(in[n] == 'c')
				out += ',';
			else
				return QString();
		}
		else
			out += in[n];
	}
	return out;
}

static QString resolution_to_string(const QSize &size)
{
	return QString::number(size.width()) + 'x' + QString::number(size.height());
}

static QSize string_to_resolution(const QString &in)
{
	int at = in.indexOf('x');
	if(at == -1)
		return QSize();
	QString ws = in.mid(0, at);
	QString hs = in.mid(at + 1);
	bool ok;
	int w = ws.toInt(&ok);
	if(!ok)
		return QSize();
	int h = hs.toInt(&ok);
	if(!ok)
		return QSize();
	return QSize(w, h);
}

static QString encode_id(const QStringList &in)
{
	QStringList list = in;
	for(int n = 0; n < list.count(); ++n)
		list[n] = id_part_escape(list[n]);
	return list.join(",");
}

static QStringList decode_id(const QString &in)
{
	QStringList list = in.split(',');
	for(int n = 0; n < list.count(); ++n)
		list[n] = id_part_unescape(list[n]);
	return list;
}

static QString element_name_for_driver(const QString &driver, PDevice::Type type)
{
	QString element_name;
	if(driver == "alsa")
	{
		if(type == PDevice::AudioOut)
			element_name = "alsasink";
		else if(type == PDevice::AudioIn)
			element_name = "alsasrc";
	}
	else if(driver == "oss")
	{
		if(type == PDevice::AudioOut)
			element_name = "osssink";
		else if(type == PDevice::AudioIn)
			element_name = "osssrc";
	}
	else if(driver == "osxaudio")
	{
		if(type == PDevice::AudioOut)
			element_name = "osxaudiosink";
		else if(type == PDevice::AudioIn)
			element_name = "osxaudiosrc";
	}
	else if(driver == "osxvideo")
	{
		if(type == PDevice::VideoIn)
			element_name = "osxvideosrc";
	}
	else if(driver == "v4l")
	{
		if(type == PDevice::VideoIn)
			element_name = "v4lsrc";
	}
	else if(driver == "v4l2")
	{
		if(type == PDevice::VideoIn)
			element_name = "v4l2src";
	}
	else if(driver == "directsound")
	{
		if(type == PDevice::AudioOut)
			element_name = "directsoundsink";
		else if(type == PDevice::AudioIn)
			element_name = "directsoundsrc";
	}
	else if(driver == "winks")
	{
		if(type == PDevice::VideoIn)
			element_name = "ksvideosrc";
	}
	return element_name;
}

// check to see that the necessary sources/sinks are available
static QStringList check_supported_drivers(const QStringList &drivers, PDevice::Type type)
{
	QStringList out;
	foreach(const QString &driver, drivers)
	{
		QString element_name = element_name_for_driver(driver, type);
		if(element_name.isEmpty())
			continue;

		GstElement *e = gst_element_factory_make(element_name.toLatin1().data(), NULL);
		if(e)
		{
			out += driver;
			g_object_unref(G_OBJECT(e));
		}
	}
	return out;
}

static GstElement *make_element_with_device(const QString &element_name, const QString &device_id)
{
	GstElement *e = gst_element_factory_make(element_name.toLatin1().data(), NULL);
	if(!e)
		return 0;

	if(!device_id.isEmpty())
	{
		// FIXME: is there a better way to determine if "device" is a string or int?
		if(element_name == "osxaudiosrc" || element_name == "osxaudiosink")
			g_object_set(G_OBJECT(e), "device", device_id.toInt(), NULL);
		else
			g_object_set(G_OBJECT(e), "device", device_id.toLatin1().data(), NULL);
	}
	else
	{
		// FIXME: remove this when ksvideosrc supports enumeration
		if(element_name == "ksvideosrc")
		{
			QByteArray val = qgetenv("PSI_KSVIDEOSRC_INDEX");
			if(!val.isEmpty())
				g_object_set(G_OBJECT(e), "device-index", val.toInt(), NULL);
		}
	}

	return e;
}

static bool test_video(const QString &element_name, const QString &device_id)
{
	GstElement *e = make_element_with_device(element_name, device_id);
	if(!e)
		return false;

	gst_element_set_state(e, GST_STATE_PAUSED);
	int ret = gst_element_get_state(e, NULL, NULL, GST_CLOCK_TIME_NONE);

	// 'ret' has our answer, so we can free up the element now
	gst_element_set_state(e, GST_STATE_NULL);
	gst_element_get_state(e, NULL, NULL, GST_CLOCK_TIME_NONE);
	g_object_unref(G_OBJECT(e));

	if(ret != GST_STATE_CHANGE_SUCCESS && ret != GST_STATE_CHANGE_NO_PREROLL)
		return false;

	return true;
}

// for elements that we can't enumerate devices for, we need a way to ensure
//   that at least the default device works
// FIXME: why do we have both this function and test_video() ?
static bool test_element(const QString &element_name)
{
	GstElement *e = gst_element_factory_make(element_name.toLatin1().data(), NULL);
	if(!e)
		return false;

	gst_element_set_state(e, GST_STATE_READY);
	int ret = gst_element_get_state(e, NULL, NULL, GST_CLOCK_TIME_NONE);

	gst_element_set_state(e, GST_STATE_NULL);
	gst_element_get_state(e, NULL, NULL, GST_CLOCK_TIME_NONE);
	g_object_unref(G_OBJECT(e));

	if(ret != GST_STATE_CHANGE_SUCCESS)
		return false;

	return true;
}

static QList<GstDevice> devices_for_drivers(const QStringList &drivers, PDevice::Type type)
{
	QList<GstDevice> out;
	QStringList supportedDrivers = check_supported_drivers(drivers, type);
	foreach(const QString &driver, supportedDrivers)
	{
		QString element_name = element_name_for_driver(driver, type);

		if(element_should_use_probe(element_name))
		{
			GstElement *e = gst_element_factory_make(element_name.toLatin1().data(), NULL);
			QList<GstDeviceProbeValue> list = device_probe(e);
			g_object_unref(G_OBJECT(e));

			bool first = true;
			foreach(const GstDeviceProbeValue &i, list)
			{
				GstDevice dev;
				dev.name = i.name;
#if defined(Q_OS_UNIX) && !defined(Q_OS_MAC)
				dev.name += QString(" (%1)").arg(driver);
#endif
				dev.isDefault = first;
				QStringList parts;
				parts += driver;
				parts += i.id;
				dev.id = encode_id(parts);
				out += dev;
				first = false;
			}
		}
		else
		{
			QList<DeviceEnum::Item> list = device_enum(driver, type);

			bool first = true;
			foreach(const DeviceEnum::Item &i, list)
			{
				if(type == PDevice::VideoIn && (element_name == "v4lsrc" || element_name == "v4l2src"))
				{
					if(!test_video(element_name, i.id))
						continue;
				}
				else if(element_name == "directsoundsrc" ||
					element_name == "directsoundsink" ||
					element_name == "ksvideosrc" ||
					element_name == "osxvideosrc")
				{
					if(!test_element(element_name))
						continue;
				}

				GstDevice dev;
				dev.name = i.name;
#if defined(Q_OS_UNIX) && !defined(Q_OS_MAC)
				dev.name += QString(" (%1)").arg(i.driver);
#endif
				dev.isDefault = first;
				QStringList parts;
				parts += i.driver;
				parts += i.id;
				if(i.explicitCaptureSize.isValid())
					parts += resolution_to_string(i.explicitCaptureSize);
				dev.id = encode_id(parts);
				out += dev;
				first = false;
			}
		}
	}
	return out;
}

QList<GstDevice> devices_list(PDevice::Type type)
{
	QStringList drivers;
	if(type == PDevice::AudioOut)
	{
		drivers
#if defined(Q_OS_MAC)
			<< "osxaudio"
#elif defined(Q_OS_LINUX)
			<< "alsa"
#else
			<< "oss"
#endif
			<< "directsound";
	}
	else if(type == PDevice::AudioIn)
	{
		drivers
#if defined(Q_OS_MAC)
			<< "osxaudio"
#elif defined(Q_OS_LINUX)
			<< "alsa"
#else
			<< "oss"
#endif
			<< "directsound";
	}
	else // PDevice::VideoIn
	{
		drivers << "v4l" << "v4l2" << "osxvideo" << "winks";
	}

	return devices_for_drivers(drivers, type);
}

GstElement *devices_makeElement(const QString &id, PDevice::Type type, QSize *captureSize)
{
	QStringList parts = decode_id(id);
	if(parts.count() < 2)
		return 0;

	QString driver = parts[0];
	QString device_id = parts[1];
	QString element_name = element_name_for_driver(driver, type);
	if(element_name.isEmpty())
		return 0;

	GstElement *e = make_element_with_device(element_name, device_id);
	if(!e)
		return 0;

	// FIXME: we don't set v4l2src to the READY state because it may break
	//   the element in jpeg mode.  this is really a bug in gstreamer or
	//   lower that should be fixed...
	if(element_name != "v4l2src")
	{
		gst_element_set_state(e, GST_STATE_READY);
		int ret = gst_element_get_state(e, NULL, NULL, GST_CLOCK_TIME_NONE);
		if(ret != GST_STATE_CHANGE_SUCCESS)
		{
			g_object_unref(G_OBJECT(e));
			return 0;
		}
	}

	if(parts.count() >= 3 && captureSize)
		*captureSize = string_to_resolution(parts[2]);

	return e;
}

}

---- psimedia-master/gstprovider/devices.h ----

/*
 * Copyright (C) 2008  Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301  USA
 *
 */

#ifndef DEVICES_H
#define DEVICES_H

#include <QString>
#include <QList>
#include <gst/gst.h>
#include "psimediaprovider.h"

class QSize;

namespace PsiMedia {

class GstDevice
{
public:
	QString name;
	bool isDefault;
	QString id;
};

QList<GstDevice> devices_list(PDevice::Type type);
GstElement *devices_makeElement(const QString &id, PDevice::Type type, QSize *captureSize = 0);

}

#endif

---- psimedia-master/gstprovider/gstconf.pri ----

# FIXME: building elements in shared mode causes them to drag in the entire
#   dependencies of psimedia

include(../conf.pri)

windows {
	INCLUDEPATH += \
		c:/glib/include/glib-2.0 \
		c:/glib/lib/glib-2.0/include \
		c:/gstforwin/dxsdk/include \
		c:/gstforwin/winsdk/include \
		c:/gstforwin/include \
		c:/gstforwin/include/liboil-0.3 \
		c:/gstforwin/include/libxml2 \
		c:/gstforwin/gstreamer/include/gstreamer-0.10 \
		c:/gstforwin/gst-plugins-base/include/gstreamer-0.10

	LIBS += \
		-Lc:/glib/bin \
		-Lc:/gstforwin/bin \
		-Lc:/gstforwin/gstreamer/bin \
		-Lc:/gstforwin/gst-plugins-base/bin \
		-lgstreamer-0.10-0 \
		-lgthread-2.0-0 \
		-lglib-2.0-0 \
		-lgobject-2.0-0 \
		-lgstvideo-0.10-0 \
		-lgstbase-0.10-0 \
		-lgstinterfaces-0.10-0

	# qmake mingw seems to have broken prl support, so force these
	win32-g++|contains($$list($$[QT_VERSION]), 4.0.*|4.1.*|4.2.*|4.3.*) {
		LIBS *= \
			-Lc:/gstforwin/winsdk/lib \
			-loil-0.3-0 \
			-lgstaudio-0.10-0 \
			-lgstrtp-0.10-0 \
			-lgstnetbuffer-0.10-0 \
			-lspeexdsp-1 \
			-lsetupapi \
			-lksuser \
			-lamstrmid \
			-ldsound \
			-ldxerr9 \
			-lole32
	}
}

unix {
	LIBS += \
		-lgstvideo-0.10 \
		-lgstinterfaces-0.10
}

---- psimedia-master/gstprovider/gstcustomelements/apprtpsink.c ----

/*
 * Copyright (C) 2008  Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301  USA
 *
 */

#include "gstcustomelements.h"
#include "gstboilerplatefixed.h"

GST_BOILERPLATE(GstAppRtpSink, gst_apprtpsink, GstBaseSink, GST_TYPE_BASE_SINK);

static GstFlowReturn gst_apprtpsink_render(GstBaseSink *sink, GstBuffer *buffer);

static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE("sink",
	GST_PAD_SINK,
	GST_PAD_ALWAYS,
	GST_STATIC_CAPS_ANY
	);

void gst_apprtpsink_base_init(gpointer gclass)
{
	static GstElementDetails element_details = GST_ELEMENT_DETAILS(
		"Application RTP Sink",
		"Generic/Sink",
		"Send RTP packets to the application",
		"Justin Karneges"
	);
	GstElementClass *element_class = GST_ELEMENT_CLASS(gclass);
	gst_element_class_add_pad_template(element_class, gst_static_pad_template_get(&sink_template));
	gst_element_class_set_details(element_class, &element_details);
}

// class init
void gst_apprtpsink_class_init(GstAppRtpSinkClass *klass)
{
	GstBaseSinkClass *basesink_class;
	basesink_class = (GstBaseSinkClass *)klass;
	basesink_class->render = gst_apprtpsink_render;
}

// instance init
void gst_apprtpsink_init(GstAppRtpSink *sink, GstAppRtpSinkClass *gclass)
{
	(void)gclass;

	sink->appdata = 0;
	sink->packet_ready = 0;
}

GstFlowReturn gst_apprtpsink_render(GstBaseSink *sink, GstBuffer *buffer)
{
	GstAppRtpSink *self = (GstAppRtpSink *)sink;

	// the assumption here is that every buffer is a complete rtp
	//   packet, ready for sending
	if(self->packet_ready)
		self->packet_ready(GST_BUFFER_DATA(buffer), GST_BUFFER_SIZE(buffer), self->appdata);

	return GST_FLOW_OK;
}

---- psimedia-master/gstprovider/gstcustomelements/apprtpsrc.c ----

/*
 * Copyright (C) 2008  Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301  USA
 *
 */

#include "gstcustomelements.h"
#include "gstboilerplatefixed.h"
#include <string.h>

#define APPRTPSRC_MAX_BUF_COUNT 32

GST_BOILERPLATE(GstAppRtpSrc, gst_apprtpsrc, GstPushSrc, GST_TYPE_PUSH_SRC);

enum
{
	PROP_0,
	PROP_CAPS,
	PROP_LAST
};

static void gst_apprtpsrc_set_property(GObject *obj, guint prop_id, const GValue *value, GParamSpec *pspec);
static void gst_apprtpsrc_get_property(GObject *obj, guint prop_id, GValue *value, GParamSpec *pspec);
static void gst_apprtpsrc_finalize(GObject *obj);
static gboolean gst_apprtpsrc_unlock(GstBaseSrc *src);
static gboolean gst_apprtpsrc_unlock_stop(GstBaseSrc *src);
static GstCaps *gst_apprtpsrc_get_caps(GstBaseSrc *src);
static GstFlowReturn gst_apprtpsrc_create(GstPushSrc *src, GstBuffer **buf);

static GstStaticPadTemplate src_template = GST_STATIC_PAD_TEMPLATE("src",
	GST_PAD_SRC,
	GST_PAD_ALWAYS,
	GST_STATIC_CAPS_ANY
	);

void gst_apprtpsrc_base_init(gpointer gclass)
{
	static GstElementDetails element_details = GST_ELEMENT_DETAILS(
		"Application RTP Source",
		"Generic/Source",
		"Receive RTP packets from the application",
		"Justin Karneges"
	);
	GstElementClass *element_class = GST_ELEMENT_CLASS(gclass);
	gst_element_class_add_pad_template(element_class, gst_static_pad_template_get(&src_template));
	gst_element_class_set_details(element_class, &element_details);
}

// class init
void gst_apprtpsrc_class_init(GstAppRtpSrcClass *klass)
{
	GObjectClass *gobject_class;
	GstBaseSrcClass *basesrc_class;
	GstPushSrcClass *pushsrc_class;

	gobject_class = (GObjectClass *)klass;
	basesrc_class = (GstBaseSrcClass *)klass;
	pushsrc_class = (GstPushSrcClass *)klass;

	gobject_class->set_property = gst_apprtpsrc_set_property;
	gobject_class->get_property = gst_apprtpsrc_get_property;
	gobject_class->finalize = gst_apprtpsrc_finalize;

	g_object_class_install_property(gobject_class, PROP_CAPS,
		g_param_spec_boxed("caps", "Caps",
		"The caps of the source pad", GST_TYPE_CAPS,
		G_PARAM_READWRITE));

	basesrc_class->unlock = gst_apprtpsrc_unlock;
	basesrc_class->unlock_stop = gst_apprtpsrc_unlock_stop;
	basesrc_class->get_caps = gst_apprtpsrc_get_caps;
	pushsrc_class->create = gst_apprtpsrc_create;
}

// instance init
void gst_apprtpsrc_init(GstAppRtpSrc *src, GstAppRtpSrcClass *gclass)
{
	(void)gclass;

	src->buffers = g_queue_new();
	src->push_mutex = g_mutex_new();
	src->push_cond = g_cond_new();
	src->quit = FALSE; // not flushing
	src->caps = 0;

	// set up the base (adapted from udpsrc)

	// configure basesrc to be a live source
	gst_base_src_set_live(GST_BASE_SRC(src), TRUE);
	// make basesrc output a segment in time
	gst_base_src_set_format(GST_BASE_SRC(src), GST_FORMAT_TIME);
	// make basesrc set timestamps on outgoing buffers based on the
	//   running_time when they were captured
	gst_base_src_set_do_timestamp(GST_BASE_SRC(src), TRUE);
}

// destruct
static void my_foreach_func(gpointer data, gpointer user_data)
{
	GstBuffer *buf = (GstBuffer *)data;
	(void)user_data;

	gst_buffer_unref(buf);
}

void gst_apprtpsrc_finalize(GObject *obj)
{
	GstAppRtpSrc *src = (GstAppRtpSrc *)obj;

	g_queue_foreach(src->buffers, my_foreach_func, NULL);
	g_queue_free(src->buffers);
	g_mutex_free(src->push_mutex);
	g_cond_free(src->push_cond);
	if(src->caps)
		gst_caps_unref(src->caps);

	G_OBJECT_CLASS(parent_class)->finalize(obj);
}

gboolean gst_apprtpsrc_unlock(GstBaseSrc *bsrc)
{
	GstAppRtpSrc *src = (GstAppRtpSrc *)bsrc;

	g_mutex_lock(src->push_mutex);
	src->quit = TRUE; // flushing
	g_cond_signal(src->push_cond);
	g_mutex_unlock(src->push_mutex);

	return TRUE;
}

gboolean gst_apprtpsrc_unlock_stop(GstBaseSrc *bsrc)
{
	GstAppRtpSrc *src = (GstAppRtpSrc *)bsrc;

	g_mutex_lock(src->push_mutex);
	src->quit = FALSE; // not flushing
	g_mutex_unlock(src->push_mutex);

	return TRUE;
}

GstCaps *gst_apprtpsrc_get_caps(GstBaseSrc *bsrc)
{
	GstAppRtpSrc *src = (GstAppRtpSrc *)bsrc;

	if(src->caps)
		return gst_caps_ref(src->caps);
	else
		return gst_caps_new_any();
}

GstFlowReturn gst_apprtpsrc_create(GstPushSrc *bsrc, GstBuffer **buf)
{
	GstAppRtpSrc *src = (GstAppRtpSrc *)bsrc;

	// the assumption here is that every buffer is a complete rtp
	//   packet, ready for processing

	// i believe the app is supposed to block on this call waiting for
	//   data
	g_mutex_lock(src->push_mutex);

	while(g_queue_is_empty(src->buffers) && !src->quit)
		g_cond_wait(src->push_cond, src->push_mutex);

	// flushing?
	if(src->quit)
	{
		g_mutex_unlock(src->push_mutex);
		return GST_FLOW_WRONG_STATE;
	}

	*buf = (GstBuffer *)g_queue_pop_head(src->buffers);
	gst_buffer_set_caps(*buf, src->caps);

	g_mutex_unlock(src->push_mutex);
	return GST_FLOW_OK;
}

void gst_apprtpsrc_packet_push(GstAppRtpSrc *src, const unsigned char *buf, int size)
{
	GstBuffer *newbuf;

	g_mutex_lock(src->push_mutex);

	// if buffer is full, eat the oldest to make room (and release it,
	//   so the dropped packet isn't leaked)
	if(g_queue_get_length(src->buffers) >= APPRTPSRC_MAX_BUF_COUNT)
		gst_buffer_unref((GstBuffer *)g_queue_pop_head(src->buffers));

	// ignore zero-byte packets
	if(size < 1)
	{
		g_mutex_unlock(src->push_mutex);
		return;
	}

	newbuf = gst_buffer_new_and_alloc(size);
	memcpy(GST_BUFFER_DATA(newbuf), buf, size);

	g_queue_push_tail(src->buffers, newbuf);
	g_cond_signal(src->push_cond);

	g_mutex_unlock(src->push_mutex);
}

void gst_apprtpsrc_set_property(GObject *obj, guint prop_id, const GValue *value, GParamSpec *pspec)
{
	GstAppRtpSrc *src = (GstAppRtpSrc *)obj;
	(void)pspec;

	switch(prop_id)
	{
		case PROP_CAPS:
		{
			const GstCaps *new_caps_val = gst_value_get_caps(value);
			GstCaps *new_caps;
			GstCaps *old_caps;

			if(new_caps_val == NULL)
				new_caps = gst_caps_new_any();
			else
				new_caps = gst_caps_copy(new_caps_val);

			old_caps = src->caps;
			src->caps = new_caps;
			if(old_caps)
				gst_caps_unref(old_caps);

			gst_pad_set_caps(GST_BASE_SRC(src)->srcpad, new_caps);
			break;
		}
		default:
			break;
	}
}

void gst_apprtpsrc_get_property(GObject *obj, guint prop_id, GValue *value, GParamSpec *pspec)
{
	GstAppRtpSrc *src = (GstAppRtpSrc *)obj;

	switch(prop_id)
	{
		case PROP_CAPS:
			gst_value_set_caps(value, src->caps);
			break;
		default:
			G_OBJECT_WARN_INVALID_PROPERTY_ID(obj, prop_id, pspec);
			break;
	}
}

---- psimedia-master/gstprovider/gstcustomelements/appvideosink.c ----

/*
 * Copyright (C) 2008  Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301  USA
 *
 */

#include "gstcustomelements.h"
#include "gstboilerplatefixed.h"

GST_BOILERPLATE(GstAppVideoSink, gst_appvideosink, GstVideoSink, GST_TYPE_VIDEO_SINK);

static GstFlowReturn gst_appvideosink_render(GstBaseSink *sink, GstBuffer *buffer);

static GstStaticPadTemplate sink_template = GST_STATIC_PAD_TEMPLATE("sink",
	GST_PAD_SINK,
	GST_PAD_ALWAYS,
	GST_STATIC_CAPS(
#if G_BYTE_ORDER == G_LITTLE_ENDIAN
		GST_VIDEO_CAPS_BGRx
#else
		GST_VIDEO_CAPS_xRGB
#endif
	)
	);

void gst_appvideosink_base_init(gpointer gclass)
{
	static GstElementDetails element_details = GST_ELEMENT_DETAILS(
		"Application Video Sink",
		"Generic/Sink",
		"Send raw video frames to the application",
		"Justin Karneges"
	);
	GstElementClass *element_class = GST_ELEMENT_CLASS(gclass);
	gst_element_class_add_pad_template(element_class, gst_static_pad_template_get(&sink_template));
	gst_element_class_set_details(element_class, &element_details);
}

// class init
void gst_appvideosink_class_init(GstAppVideoSinkClass *klass)
{
	GstBaseSinkClass *basesink_class;
	basesink_class = (GstBaseSinkClass *)klass;
	basesink_class->render = gst_appvideosink_render;
}

// instance init
void gst_appvideosink_init(GstAppVideoSink *sink, GstAppVideoSinkClass *gclass)
{
	(void)gclass;

	sink->appdata = 0;
	sink->show_frame = 0;
}

GstFlowReturn gst_appvideosink_render(GstBaseSink *sink, GstBuffer *buffer)
{
	int size;
	int width, height;
	GstCaps *caps;
	GstStructure *structure;
	GstAppVideoSink *self = (GstAppVideoSink *)sink;

	caps = GST_BUFFER_CAPS(buffer);
	structure = gst_caps_get_structure(caps, 0);

	// get width and height
	if(!gst_structure_get_int(structure, "width", &width) ||
		!gst_structure_get_int(structure, "height", &height))
	{
		return GST_FLOW_ERROR;
	}

	// make sure buffer size matches width * height * 32 bit rgb
	size = GST_BUFFER_SIZE(buffer);
	if(width * height * 4 != size)
		return GST_FLOW_ERROR;

	if(self->show_frame)
		self->show_frame(width, height, GST_BUFFER_DATA(buffer), self->appdata);

	return GST_FLOW_OK;
}

---- psimedia-master/gstprovider/gstcustomelements/gstboilerplatefixed.h ----

/* GStreamer
 * Copyright (C) 1999,2000 Erik Walthinsen
 *                    2000 Wim Taymans
 *                    2002 Thomas Vander Stichele
 *
 * gstutils.h: Header for various utility functions
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 */

// here's a replacement for GST_BOILERPLATE_FULL that doesn't have warnings

#ifndef GSTBOILERPLATEFIXED_H
#define GSTBOILERPLATEFIXED_H

#include <gst/gst.h>

G_BEGIN_DECLS

#undef GST_BOILERPLATE_FULL

#define GST_BOILERPLATE_FULL(type, type_as_function, parent_type, parent_type_macro, additional_initializations) \
\
static void type_as_function ## _base_init (gpointer g_class); \
static void type_as_function ## _class_init (type ## Class *g_class); \
static void type_as_function ## _init (type *object, \
    type ## Class *g_class); \
static parent_type ## Class *parent_class = NULL; \
static void \
type_as_function ## _class_init_trampoline (gpointer g_class, \
    gpointer data) \
{ \
  (void)data; \
  parent_class = (parent_type ## Class *) \
      g_type_class_peek_parent (g_class); \
  type_as_function ## _class_init ((type ## Class *)g_class); \
} \
\
GType type_as_function ## _get_type (void); \
\
GType \
type_as_function ## _get_type (void) \
{ \
  static GType object_type = 0; \
  if (G_UNLIKELY (object_type == 0)) { \
    object_type = gst_type_register_static_full (parent_type_macro, #type, \
        sizeof (type ## Class), \
        type_as_function ## _base_init, \
        NULL, /* base_finalize */ \
        type_as_function ## _class_init_trampoline, \
        NULL, /* class_finalize */ \
        NULL, /* class_data */ \
        sizeof (type), \
        0, /* n_preallocs */ \
        (GInstanceInitFunc) type_as_function ## _init, \
        NULL, \
        (GTypeFlags) 0); \
    additional_initializations (object_type); \
  } \
  return object_type; \
}

G_END_DECLS

#endif

---- psimedia-master/gstprovider/gstcustomelements/gstcustomelements.c ----

/*
 * Copyright (C) 2008  Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301  USA
 *
 */

#include "gstcustomelements.h"

static gboolean register_elements(GstPlugin *plugin)
{
	if(!gst_element_register(plugin, "appvideosink",
		GST_RANK_NONE, GST_TYPE_APPVIDEOSINK))
	{
		return FALSE;
	}

	if(!gst_element_register(plugin, "apprtpsrc",
		GST_RANK_NONE, GST_TYPE_APPRTPSRC))
	{
		return FALSE;
	}

	if(!gst_element_register(plugin, "apprtpsink",
		GST_RANK_NONE, GST_TYPE_APPRTPSINK))
	{
		return FALSE;
	}

	return TRUE;
}

void gstcustomelements_register()
{
	gst_plugin_register_static(
		GST_VERSION_MAJOR,
		GST_VERSION_MINOR,
		"my-private-plugins",
		"Private elements of my application",
		register_elements,
		"1.0.0",
		"LGPL",
		"my-application",
		"my-application",
		"http://www.my-application.net/"
	);
}

---- psimedia-master/gstprovider/gstcustomelements/gstcustomelements.h ----

/*
 * Copyright (C) 2008  Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
* * This library is distributed in the hope that it will be useful, * but WITHANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Lesser General Public License for more details. * * You should have received a copy of the GNU Lesser General Public * License along with this library; if not, write to the Free Software * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA * 02110-1301 USA * */ #ifndef GSTCUSTOMELEMENTS_H #define GSTCUSTOMELEMENTS_H #include #include #include #include #include G_BEGIN_DECLS // We create three custom elements here // // appvideosink - grab raw decoded frames, ready for painting // apprtpsrc - allow the app to feed in RTP packets // apprtpsink - allow the app to collect RTP packets // set up the defines/typedefs #define GST_TYPE_APPVIDEOSINK \ (gst_appvideosink_get_type()) #define GST_APPVIDEOSINK(obj) \ (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_APPVIDEOSINK,GstAppVideoSink)) #define GST_APPVIDEOSINK_CLASS(klass) \ (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_APPVIDEOSINK,GstAppVideoSinkClass)) #define GST_IS_APPVIDEOSINK(obj) \ (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_APPVIDEOSINK)) #define GST_IS_APPVIDEOSINK_CLASS(klass) \ (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_APPVIDEOSINK)) typedef struct _GstAppVideoSink GstAppVideoSink; typedef struct _GstAppVideoSinkClass GstAppVideoSinkClass; #define GST_TYPE_APPRTPSRC \ (gst_apprtpsrc_get_type()) #define GST_APPRTPSRC(obj) \ (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_APPRTPSRC,GstAppRtpSrc)) #define GST_APPRTPSRC_CLASS(klass) \ (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_APPRTPSRC,GstAppRtpSrcClass)) #define GST_IS_APPRTPSRC(obj) \ (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_APPRTPSRC)) #define GST_IS_APPRTPSRC_CLASS(klass) \ (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_APPRTPSRC)) typedef struct _GstAppRtpSrc GstAppRtpSrc; typedef struct _GstAppRtpSrcClass GstAppRtpSrcClass; #define GST_TYPE_APPRTPSINK \ 
	(gst_apprtpsink_get_type())
#define GST_APPRTPSINK(obj) \
	(G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_APPRTPSINK,GstAppRtpSink))
#define GST_APPRTPSINK_CLASS(klass) \
	(G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_APPRTPSINK,GstAppRtpSinkClass))
#define GST_IS_APPRTPSINK(obj) \
	(G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_APPRTPSINK))
#define GST_IS_APPRTPSINK_CLASS(klass) \
	(G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_APPRTPSINK))

typedef struct _GstAppRtpSink GstAppRtpSink;
typedef struct _GstAppRtpSinkClass GstAppRtpSinkClass;

// done with defines/typedefs

// GstAppVideoSink
struct _GstAppVideoSink
{
	GstVideoSink parent;
	gpointer appdata;
	void (*show_frame)(int width, int height, const unsigned char *rgb32,
		gpointer appdata);
};

struct _GstAppVideoSinkClass
{
	GstVideoSinkClass parent_class;
};

GType gst_appvideosink_get_type(void);

// GstAppRtpSrc
struct _GstAppRtpSrc
{
	GstPushSrc parent;
	GQueue *buffers;
	GMutex *push_mutex;
	GCond *push_cond;
	gboolean quit;
	GstCaps *caps;
};

struct _GstAppRtpSrcClass
{
	GstPushSrcClass parent_class;
};

GType gst_apprtpsrc_get_type(void);
void gst_apprtpsrc_packet_push(GstAppRtpSrc *src, const unsigned char *buf,
	int size);

// GstAppRtpSink
struct _GstAppRtpSink
{
	GstBaseSink parent;
	gpointer appdata;
	void (*packet_ready)(const unsigned char *buf, int size,
		gpointer appdata);
};

struct _GstAppRtpSinkClass
{
	GstBaseSinkClass parent_class;
};

GType gst_apprtpsink_get_type(void);

void gstcustomelements_register();

G_END_DECLS

#endif

---- psimedia-master/gstprovider/gstcustomelements/gstcustomelements.pri ----

HEADERS += \
	$$PWD/gstcustomelements.h

SOURCES += \
	$$PWD/gstcustomelements.c \
	$$PWD/appvideosink.c \
	$$PWD/apprtpsrc.c \
	$$PWD/apprtpsink.c
---- psimedia-master/gstprovider/gstelements/README ----

Here are GStreamer elements that we maintain outside of GStreamer CVS.
We also keep copies of elements from gst-plugins-bad.

  directsound  - windows audio in/out
  winks        - windows video in
  osxaudio     - mac audio in/out
  osxvideo     - mac video in/out
  rtpmanager   - rtp subsystem
  videomaxrate - limit framerate from a camera

---- psimedia-master/gstprovider/gstelements/directsound.pri ----

HEADERS += \
	$$PWD/directsound/gstdirectsound.h \
	$$PWD/directsound/gstdirectsoundringbuffer.h \
	$$PWD/directsound/gstdirectsoundsink.h \
	$$PWD/directsound/gstdirectsoundsrc.h

SOURCES += \
	$$PWD/directsound/gstdirectsound.c \
	$$PWD/directsound/gstdirectsoundringbuffer.c \
	$$PWD/directsound/gstdirectsoundsink.c \
	$$PWD/directsound/gstdirectsoundsrc.c \
	$$PWD/directsound/dsguids.c

gstplugin:SOURCES += $$PWD/directsound/gstdirectsoundplugin.c
!gstplugin:SOURCES += $$PWD/static/directsound_static.c

LIBS *= \
	-lgstinterfaces-0.10 \
	-lgstaudio-0.10 \
	-ldsound \
	-ldxerr9 \
	-lole32

---- psimedia-master/gstprovider/gstelements/directsound/Makefile.am ----

plugin_LTLIBRARIES = libgstdirectsound.la

libgstdirectsound_la_SOURCES = gstdirectsound.c \
	gstdirectsoundringbuffer.c gstdirectsoundsink.c \
	gstdirectsoundsrc.c gstdirectsoundplugin.c dsguids.c
libgstdirectsound_la_CFLAGS = $(GST_CFLAGS) $(GST_BASE_CFLAGS) \
	$(GST_PLUGINS_BASE_CFLAGS) $(DIRECTSOUND_CFLAGS)
libgstdirectsound_la_LIBADD = $(DIRECTSOUND_LIBS) \
	$(GST_BASE_LIBS) $(GST_PLUGINS_BASE_LIBS) \
	-lgstaudio-$(GST_MAJORMINOR) \
	-lgstinterfaces-$(GST_MAJORMINOR)
libgstdirectsound_la_LDFLAGS = $(GST_PLUGIN_LDFLAGS) $(DIRECTSOUND_LDFLAGS)

noinst_HEADERS = gstdirectsound.h gstdirectsoundringbuffer.h \
	gstdirectsoundsink.h gstdirectsoundsrc.h

---- psimedia-master/gstprovider/gstelements/directsound/dsguids.c ----

/* GStreamer
 * Copyright (C) 2005 Sebastien Moutte
 * Copyright (C) 2007-2009 Pioneers of the Inevitable
 *
 * dsguids.c:
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 *
 * The development of this code was made possible due to the involvement
 * of Pioneers of the Inevitable, the creators of the Songbird Music player
 *
 */

// define the GUIDs we use here. according to KB130869, initguid.h needs to be
// included after objbase.h, so we'll do it as late as possible

// FIXME: the DEFINE_GUID macro from initguid.h throws warnings, so we use the
// macro from objbase.h instead?
//#include <initguid.h>
#define INITGUID
#include <objbase.h>

DEFINE_GUID(IID_IDirectSoundBuffer8, 0x6825a449, 0x7524, 0x4d82,
	0x92, 0x0f, 0x50, 0xe3, 0x6a, 0xb3, 0xab, 0x1e);
DEFINE_GUID(IID_IDirectSoundCaptureBuffer8, 0x990df4, 0xdbb, 0x4872,
	0x83, 0x3e, 0x6d, 0x30, 0x3e, 0x80, 0xae, 0xb6);

---- psimedia-master/gstprovider/gstelements/directsound/gstdirectsound.c ----

/* GStreamer
 * Copyright (C) 2005 Sebastien Moutte
 * Copyright (C) 2007-2009 Pioneers of the Inevitable
 *
 * gstdirectsound.c:
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 *
 * The development of this code was made possible due to the involvement
 * of Pioneers of the Inevitable, the creators of the Songbird Music player
 *
 */

#ifdef HAVE_CONFIG_H
#include "config.h"
#endif

// note: INITGUID clashes with amstrmid.lib from winks, if both plugins are
// statically built together.
// we'll get around this by only defining the
// directsound-specific GUIDs we actually use (see dsguids.c)
//#define INITGUID

#include "gstdirectsound.h"

#include <math.h>
#include <objbase.h>

GST_DEBUG_CATEGORY (directsound);
#define GST_CAT_DEFAULT directsound

static gchar *
guid_to_string (LPGUID in)
{
  WCHAR buffer[256];
  if (StringFromGUID2 (in, buffer, sizeof buffer / sizeof buffer[0]) == 0)
    return NULL;
  return g_utf16_to_utf8 ((const gunichar2 *) buffer, -1, NULL, NULL, NULL);
}

static LPGUID
string_to_guid (const gchar * str)
{
  HRESULT ret;
  gunichar2 * wstr;
  LPGUID out;

  wstr = g_utf8_to_utf16 (str, -1, NULL, NULL, NULL);
  if (!wstr)
    return NULL;

  out = g_malloc (sizeof (GUID));
  ret = CLSIDFromString ((LPOLESTR) wstr, out);
  g_free (wstr);
  if (ret != NOERROR) {
    g_free (out);
    return NULL;
  }

  return out;
}

static BOOL CALLBACK
cb_enum (LPGUID lpGUID, LPCWSTR lpszDesc, LPCWSTR lpszDrvName,
    LPVOID lpContext)
{
  GList ** list;
  gst_directsound_device * dev;

  list = (GList **) lpContext;
  dev = gst_directsound_device_alloc ();

  if (lpGUID == NULL) {
    /* default device */
    dev->id = g_strdup ("");
  } else {
    dev->id = guid_to_string (lpGUID);
    if (!dev->id) {
      gst_directsound_device_free (dev);
      return TRUE;
    }
  }

  dev->name = g_utf16_to_utf8 ((const gunichar2 *) lpszDesc, -1,
      NULL, NULL, NULL);
  if (!dev->name) {
    gst_directsound_device_free (dev);
    return TRUE;
  }

  *list = g_list_append (*list, dev);
  return TRUE;
}

gst_directsound_device *
gst_directsound_device_alloc ()
{
  gst_directsound_device * dev;

  dev = g_malloc (sizeof (gst_directsound_device));
  dev->id = NULL;
  dev->name = NULL;
  return dev;
}

void
gst_directsound_device_free (gst_directsound_device * dev)
{
  if (!dev)
    return;

  if (dev->id)
    g_free (dev->id);
  if (dev->name)
    g_free (dev->name);
  g_free (dev);
}

void
gst_directsound_device_free_func (gpointer data, gpointer user_data)
{
  gst_directsound_device_free ((gst_directsound_device *) data);
}

GList *
gst_directsound_playback_device_list ()
{
  GList * out = NULL;

  if (FAILED (DirectSoundEnumerateW
      ((LPDSENUMCALLBACK) cb_enum, &out))) {
    if (out)
      gst_directsound_device_list_free (out);
    return NULL;
  }

  return out;
}

GList *
gst_directsound_capture_device_list ()
{
  GList * out = NULL;

  if (FAILED (DirectSoundCaptureEnumerateW ((LPDSENUMCALLBACK) cb_enum,
      &out))) {
    if (out)
      gst_directsound_device_list_free (out);
    return NULL;
  }

  return out;
}

void
gst_directsound_device_list_free (GList * list)
{
  g_list_foreach (list, gst_directsound_device_free_func, NULL);
  g_list_free (list);
}

LPGUID
gst_directsound_get_device_guid (const gchar * id)
{
  return string_to_guid (id);
}

void
gst_directsound_set_volume (LPDIRECTSOUNDBUFFER8 pDSB8, gdouble volume)
{
  HRESULT hr;
  long dsVolume;

  /* DirectSound controls volume using units of 100th of a decibel,
   * ranging from -10000 to 0. We use a linear scale of 0 - 100
   * here, so remap. */
  if (volume == 0)
    dsVolume = -10000;
  else
    dsVolume = 100 * (long) (20 * log10 (volume));
  dsVolume = CLAMP (dsVolume, -10000, 0);

  GST_DEBUG ("Setting volume on secondary buffer to %d", (int) dsVolume);

  hr = IDirectSoundBuffer8_SetVolume (pDSB8, dsVolume);
  if (G_UNLIKELY (FAILED(hr))) {
    GST_WARNING ("Setting volume on secondary buffer failed.");
  }
}

---- psimedia-master/gstprovider/gstelements/directsound/gstdirectsound.h ----

/* GStreamer
 * Copyright (C) 2005 Sebastien Moutte
 * Copyright (C) 2007-2009 Pioneers of the Inevitable
 *
 * gstdirectsound.h:
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 *
 * The development of this code was made possible due to the involvement
 * of Pioneers of the Inevitable, the creators of the Songbird Music player
 *
 */

#ifndef __GST_DIRECTSOUND_H__
#define __GST_DIRECTSOUND_H__

#include <gst/gst.h>
#include <windows.h>
#include <mmsystem.h>

/* use directsound v8 */
#ifdef DIRECTSOUND_VERSION
#undef DIRECTSOUND_VERSION
#endif
#define DIRECTSOUND_VERSION 0x0800
#include <dsound.h>

GST_DEBUG_CATEGORY_EXTERN (directsound);

G_BEGIN_DECLS

typedef struct {
  gchar * id;
  gchar * name;
} gst_directsound_device;

gst_directsound_device * gst_directsound_device_alloc ();
void gst_directsound_device_free (gst_directsound_device * dev);
void gst_directsound_device_free_func (gpointer data, gpointer user_data);

GList * gst_directsound_playback_device_list ();
GList * gst_directsound_capture_device_list ();
void gst_directsound_device_list_free (GList * list);

/* if non-null, use g_free to free the guid */
LPGUID gst_directsound_get_device_guid (const gchar * id);

void gst_directsound_set_volume (LPDIRECTSOUNDBUFFER8 pDSB8, gdouble volume);

G_END_DECLS

#endif /* __GST_DIRECTSOUND_H__ */

---- psimedia-master/gstprovider/gstelements/directsound/gstdirectsoundplugin.c ----

/* GStreamer
 * Copyright (C) 2005 Sebastien Moutte
 * Copyright (C) 2007 Pioneers of the Inevitable
 *
 * gstdirectsoundplugin.c:
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 *
 * The development of this code was made possible due to the involvement
 * of Pioneers of the Inevitable, the creators of the Songbird Music player
 *
 */

#ifdef HAVE_CONFIG_H
#include "config.h"
#endif

#include "gstdirectsound.h"
#include "gstdirectsoundsink.h"
#include "gstdirectsoundsrc.h"

static gboolean
plugin_init (GstPlugin * plugin)
{
  if (!gst_element_register (plugin, "directsoundsink", GST_RANK_PRIMARY,
      GST_TYPE_DIRECTSOUND_SINK))
    return FALSE;

  if (!gst_element_register (plugin, "directsoundsrc", GST_RANK_PRIMARY,
      GST_TYPE_DIRECTSOUND_SRC))
    return FALSE;

  GST_DEBUG_CATEGORY_INIT (directsound, "directsound", 0,
      "DirectSound Elements");

  return TRUE;
}

GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
    GST_VERSION_MINOR,
    "directsound",
    "Direct Sound plugin library",
    plugin_init, VERSION, "LGPL", GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN)

---- psimedia-master/gstprovider/gstelements/directsound/gstdirectsoundringbuffer.c ----

/* GStreamer
 * Copyright (C) 2005 Sebastien Moutte
 * Copyright (C) 2007-2009 Pioneers of the Inevitable
 * Copyright (C) 2009 Barracuda Networks, Inc.
 *
 * gstdirectsoundringbuffer.c:
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
* * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. * * The development of this code was made possible due to the involvement * of Pioneers of the Inevitable, the creators of the Songbird Music player * */ #ifdef HAVE_CONFIG_H #include "config.h" #endif #include "gstdirectsoundringbuffer.h" #include "gstdirectsoundsink.h" #include "gstdirectsoundsrc.h" #define GST_CAT_DEFAULT directsound #define MAX_LOST_RETRIES 10 #define DIRECTSOUND_ERROR_DEVICE_RECONFIGURED 0x88780096 #define DIRECTSOUND_ERROR_DEVICE_NO_DRIVER 0x88780078 static void gst_directsound_ring_buffer_class_init ( GstDirectSoundRingBufferClass * klass); static void gst_directsound_ring_buffer_init ( GstDirectSoundRingBuffer * ringbuffer, GstDirectSoundRingBufferClass * g_class); static void gst_directsound_ring_buffer_dispose (GObject * object); static void gst_directsound_ring_buffer_finalize (GObject * object); static gboolean gst_directsound_ring_buffer_open_device (GstRingBuffer * buf); static gboolean gst_directsound_ring_buffer_close_device ( GstRingBuffer * buf); static gboolean gst_directsound_ring_buffer_acquire (GstRingBuffer * buf, GstRingBufferSpec * spec); static gboolean gst_directsound_ring_buffer_release (GstRingBuffer * buf); static gboolean gst_directsound_ring_buffer_start (GstRingBuffer * buf); static gboolean gst_directsound_ring_buffer_pause (GstRingBuffer * buf); static gboolean gst_directsound_ring_buffer_resume (GstRingBuffer * buf); static gboolean gst_directsound_ring_buffer_stop (GstRingBuffer * buf); static guint gst_directsound_ring_buffer_delay (GstRingBuffer 
* buf); static GstRingBufferClass * ring_parent_class = NULL; static DWORD WINAPI gst_directsound_write_proc (LPVOID lpParameter); static DWORD WINAPI gst_directsound_read_proc (LPVOID lpParameter); GST_BOILERPLATE (GstDirectSoundRingBuffer, gst_directsound_ring_buffer, GstRingBuffer, GST_TYPE_RING_BUFFER); static void gst_directsound_ring_buffer_base_init (gpointer g_class) { /* Nothing to do right now */ } static void gst_directsound_ring_buffer_class_init (GstDirectSoundRingBufferClass * klass) { GObjectClass * gobject_class; GstObjectClass * gstobject_class; GstRingBufferClass * gstringbuffer_class; gobject_class = (GObjectClass *) klass; gstobject_class = (GstObjectClass *) klass; gstringbuffer_class = (GstRingBufferClass *) klass; ring_parent_class = g_type_class_peek_parent (klass); gobject_class->dispose = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_dispose); gobject_class->finalize = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_finalize); gstringbuffer_class->open_device = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_open_device); gstringbuffer_class->close_device = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_close_device); gstringbuffer_class->acquire = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_acquire); gstringbuffer_class->release = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_release); gstringbuffer_class->start = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_start); gstringbuffer_class->pause = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_pause); gstringbuffer_class->resume = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_resume); gstringbuffer_class->stop = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_stop); gstringbuffer_class->delay = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_delay); GST_DEBUG ("directsound ring buffer class init"); } static void gst_directsound_ring_buffer_init (GstDirectSoundRingBuffer * ringbuffer, GstDirectSoundRingBufferClass * g_class) { ringbuffer->element = NULL; ringbuffer->pDS8 = NULL; 
ringbuffer->pDSB8 = NULL; ringbuffer->pDSC8 = NULL; ringbuffer->pDSCB8 = NULL; memset (&ringbuffer->wave_format, 0, sizeof (WAVEFORMATEX)); ringbuffer->buffer_size = 0; ringbuffer->buffer_circular_offset = 0; ringbuffer->min_buffer_size = 0; ringbuffer->min_sleep_time = 10; /* in milliseconds */ ringbuffer->bytes_per_sample = 0; ringbuffer->segoffset = 0; ringbuffer->segsize = 0; ringbuffer->hThread = NULL; ringbuffer->suspended = FALSE; ringbuffer->should_run = FALSE; ringbuffer->flushing = FALSE; ringbuffer->volume = 1.0; ringbuffer->dsound_lock = g_mutex_new (); } static void gst_directsound_ring_buffer_dispose (GObject * object) { G_OBJECT_CLASS (ring_parent_class)->dispose (object); } static void gst_directsound_ring_buffer_finalize (GObject * object) { GstDirectSoundRingBuffer * dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (object); g_mutex_free (dsoundbuffer->dsound_lock); dsoundbuffer->dsound_lock = NULL; G_OBJECT_CLASS (ring_parent_class)->finalize (object); } static gboolean gst_directsound_ring_buffer_open_device (GstRingBuffer * buf) { HRESULT hr; gchar * device_id; LPGUID lpGuid = NULL; GstDirectSoundRingBuffer * dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); GST_DEBUG ("Opening DirectSound Device"); if (dsoundbuffer->is_src) { device_id = ((GstDirectSoundSrc *)(dsoundbuffer->element))->device_id; if (device_id) lpGuid = gst_directsound_get_device_guid (device_id); if (FAILED (hr = DirectSoundCaptureCreate8 (lpGuid, &dsoundbuffer->pDSC8, NULL))) { GST_ELEMENT_ERROR (dsoundbuffer->element, RESOURCE, FAILED, ("%ls.", DXGetErrorDescription9W(hr)), ("Failed to create directsound device. 
(%X)", (unsigned int) hr)); dsoundbuffer->pDSC8 = NULL; if (lpGuid) g_free (lpGuid); return FALSE; } if (lpGuid) g_free (lpGuid); } else { device_id = ((GstDirectSoundSink *)(dsoundbuffer->element))->device_id; if (device_id) lpGuid = gst_directsound_get_device_guid (device_id); if (FAILED (hr = DirectSoundCreate8 (lpGuid, &dsoundbuffer->pDS8, NULL))) { GST_ELEMENT_ERROR (dsoundbuffer->element, RESOURCE, FAILED, ("%ls.", DXGetErrorDescription9W(hr)), ("Failed to create directsound device. (%X)", (unsigned int) hr)); dsoundbuffer->pDS8 = NULL; if (lpGuid) g_free (lpGuid); return FALSE; } if (lpGuid) g_free (lpGuid); if (FAILED (hr = IDirectSound8_SetCooperativeLevel (dsoundbuffer->pDS8, GetDesktopWindow (), DSSCL_PRIORITY))) { GST_WARNING ("gst_directsound_sink_open: IDirectSound8_SetCooperativeLevel, hr = %X", (unsigned int) hr); return FALSE; } } return TRUE; } static gboolean gst_directsound_ring_buffer_close_device (GstRingBuffer * buf) { GstDirectSoundRingBuffer * dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); GST_DEBUG ("Closing DirectSound Device"); if (dsoundbuffer->is_src) { if (dsoundbuffer->pDSC8) { IDirectSoundCapture_Release (dsoundbuffer->pDSC8); dsoundbuffer->pDSC8 = NULL; } } else { if (dsoundbuffer->pDS8) { IDirectSound8_Release (dsoundbuffer->pDS8); dsoundbuffer->pDS8 = NULL; } } return TRUE; } static gboolean gst_directsound_create_buffer (GstRingBuffer * buf) { GstDirectSoundRingBuffer * dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); HRESULT hr; DSBUFFERDESC descSecondary; LPDIRECTSOUNDBUFFER pDSB; DSCBUFFERDESC captureDescSecondary; LPDIRECTSOUNDCAPTUREBUFFER pDSCB; if (dsoundbuffer->is_src) { memset (&captureDescSecondary, 0, sizeof (DSCBUFFERDESC)); captureDescSecondary.dwSize = sizeof (DSCBUFFERDESC); captureDescSecondary.dwFlags = 0; captureDescSecondary.dwBufferBytes = dsoundbuffer->buffer_size; captureDescSecondary.lpwfxFormat = (WAVEFORMATEX *) &dsoundbuffer->wave_format; hr = IDirectSoundCapture_CreateCaptureBuffer 
(dsoundbuffer->pDSC8, &captureDescSecondary, &pDSCB, NULL); if (G_UNLIKELY (FAILED (hr))) { GST_WARNING ("gst_directsound_ring_buffer_acquire: IDirectSoundCapture_CreateCaptureBuffer, hr = %X", (unsigned int) hr); return FALSE; } hr = IDirectSoundCaptureBuffer_QueryInterface (pDSCB, &IID_IDirectSoundCaptureBuffer8, (LPVOID *) &dsoundbuffer->pDSCB8); if (G_UNLIKELY (FAILED (hr))) { IDirectSoundCaptureBuffer_Release (pDSCB); GST_WARNING ("gst_directsound_ring_buffer_acquire: IDirectSoundCaptureBuffer_QueryInterface, hr = %X", (unsigned int) hr); return FALSE; } IDirectSoundCaptureBuffer_Release (pDSCB); } else { memset (&descSecondary, 0, sizeof (DSBUFFERDESC)); descSecondary.dwSize = sizeof (DSBUFFERDESC); descSecondary.dwFlags = DSBCAPS_GETCURRENTPOSITION2 | DSBCAPS_GLOBALFOCUS | DSBCAPS_CTRLVOLUME; descSecondary.dwBufferBytes = dsoundbuffer->buffer_size; descSecondary.lpwfxFormat = (WAVEFORMATEX *) &dsoundbuffer->wave_format; hr = IDirectSound8_CreateSoundBuffer (dsoundbuffer->pDS8, &descSecondary, &pDSB, NULL); if (G_UNLIKELY (FAILED (hr))) { GST_WARNING ("gst_directsound_ring_buffer_acquire: IDirectSound8_CreateSoundBuffer, hr = %X", (unsigned int) hr); return FALSE; } hr = IDirectSoundBuffer_QueryInterface (pDSB, &IID_IDirectSoundBuffer8, (LPVOID *) &dsoundbuffer->pDSB8); if (G_UNLIKELY (FAILED (hr))) { IDirectSoundBuffer_Release (pDSB); GST_WARNING ("gst_directsound_ring_buffer_acquire: IDirectSoundBuffer_QueryInterface, hr = %X", (unsigned int) hr); return FALSE; } IDirectSoundBuffer_Release (pDSB); } return TRUE; } static gboolean gst_directsound_ring_buffer_acquire (GstRingBuffer * buf, GstRingBufferSpec * spec) { GstDirectSoundRingBuffer * dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); WAVEFORMATEX wfx; /* sanity check, if no DirectSound device, bail out */ if (!dsoundbuffer->pDS8 && !dsoundbuffer->pDSC8) { GST_WARNING ("gst_directsound_ring_buffer_acquire: DirectSound 8 device is null!"); return FALSE; } /* save number of bytes per sample */ 
dsoundbuffer->bytes_per_sample = spec->bytes_per_sample; /* fill the WAVEFORMATEX structure with spec params */ memset (&wfx, 0, sizeof (wfx)); wfx.cbSize = sizeof (wfx); wfx.wFormatTag = WAVE_FORMAT_PCM; wfx.nChannels = spec->channels; wfx.nSamplesPerSec = spec->rate; wfx.wBitsPerSample = (spec->bytes_per_sample * 8) / wfx.nChannels; wfx.nBlockAlign = spec->bytes_per_sample; wfx.nAvgBytesPerSec = wfx.nSamplesPerSec * wfx.nBlockAlign; /* enforce a minimum latency of 2x the sleep_time */ if (spec->latency_time < (dsoundbuffer->min_sleep_time * 2 * 1000)) spec->latency_time = dsoundbuffer->min_sleep_time * 2 * 1000; /* Create directsound buffer with size based on our configured * buffer_size (which is 200 ms by default) */ dsoundbuffer->buffer_size = gst_util_uint64_scale_int (wfx.nAvgBytesPerSec, spec->buffer_time, GST_MSECOND); spec->segsize = gst_util_uint64_scale_int (wfx.nAvgBytesPerSec, spec->latency_time, GST_MSECOND); /* Now round the ringbuffer segment size to a multiple of the bytes per sample - otherwise the ringbuffer subtly fails */ spec->segsize = (spec->segsize + (spec->bytes_per_sample - 1))/ spec->bytes_per_sample * spec->bytes_per_sample; /* And base the total number of segments on the configured buffer size */ spec->segtotal = dsoundbuffer->buffer_size / spec->segsize; dsoundbuffer->buffer_size = spec->segsize * spec->segtotal; dsoundbuffer->segsize = spec->segsize; dsoundbuffer->min_buffer_size = dsoundbuffer->buffer_size / 2; GST_INFO_OBJECT (dsoundbuffer, "GstRingBufferSpec->channels: %d, GstRingBufferSpec->rate: %d, GstRingBufferSpec->bytes_per_sample: %d\n" "WAVEFORMATEX.nSamplesPerSec: %ld, WAVEFORMATEX.wBitsPerSample: %d, WAVEFORMATEX.nBlockAlign: %d, WAVEFORMATEX.nAvgBytesPerSec: %ld\n" "Size of dsound cirucular buffer: %d, Size of segment: %d, Total segments: %d\n", spec->channels, spec->rate, spec->bytes_per_sample, wfx.nSamplesPerSec, wfx.wBitsPerSample, wfx.nBlockAlign, wfx.nAvgBytesPerSec, dsoundbuffer->buffer_size, spec->segsize, 
spec->segtotal); dsoundbuffer->wave_format = wfx; if (!gst_directsound_create_buffer (buf)) return FALSE; buf->data = gst_buffer_new_and_alloc (spec->segtotal * spec->segsize); memset (GST_BUFFER_DATA (buf->data), 0, GST_BUFFER_SIZE (buf->data)); return TRUE; } static gboolean gst_directsound_ring_buffer_release (GstRingBuffer * buf) { GstDirectSoundRingBuffer * dsoundbuffer; dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); /* first we have to ensure our ring buffer is stopped */ gst_directsound_ring_buffer_stop (buf); GST_DSOUND_LOCK (dsoundbuffer); /* release secondary DirectSound buffer */ if (dsoundbuffer->pDSB8) { IDirectSoundBuffer8_Release (dsoundbuffer->pDSB8); dsoundbuffer->pDSB8 = NULL; } if (dsoundbuffer->pDSCB8) { IDirectSoundCaptureBuffer8_Release (dsoundbuffer->pDSCB8); dsoundbuffer->pDSCB8 = NULL; } gst_buffer_unref (buf->data); buf->data = NULL; GST_DSOUND_UNLOCK (dsoundbuffer); return TRUE; } static gboolean gst_directsound_ring_buffer_start (GstRingBuffer * buf) { GstDirectSoundRingBuffer * dsoundbuffer; HANDLE hThread; dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); GST_DEBUG ("Starting RingBuffer"); GST_DSOUND_LOCK (dsoundbuffer); if (dsoundbuffer->is_src) { hThread = CreateThread (NULL, 256 * 1024 /* Stack size: 256k */, gst_directsound_read_proc, buf, CREATE_SUSPENDED, NULL); } else { hThread = CreateThread (NULL, 256 * 1024 /* Stack size: 256k */, gst_directsound_write_proc, buf, CREATE_SUSPENDED, NULL); } if (!hThread) { GST_DSOUND_UNLOCK (dsoundbuffer); GST_WARNING ("gst_directsound_ring_buffer_start: CreateThread"); return FALSE; } dsoundbuffer->hThread = hThread; dsoundbuffer->should_run = TRUE; if (!dsoundbuffer->is_src) gst_directsound_set_volume (dsoundbuffer->pDSB8, dsoundbuffer->volume); if (G_UNLIKELY (!SetThreadPriority(hThread, THREAD_PRIORITY_TIME_CRITICAL))) GST_WARNING ("gst_directsound_ring_buffer_start: Failed to set thread priority."); ResumeThread (hThread); GST_DSOUND_UNLOCK (dsoundbuffer); return TRUE; } static 
gboolean gst_directsound_ring_buffer_pause (GstRingBuffer * buf) { HRESULT hr = S_OK; GstDirectSoundRingBuffer * dsoundbuffer; dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); GST_DEBUG ("Pausing RingBuffer"); GST_DSOUND_LOCK (dsoundbuffer); if (dsoundbuffer->is_src) { if (dsoundbuffer->pDSCB8) { hr = IDirectSoundCaptureBuffer_Stop (dsoundbuffer->pDSCB8); } } else { if (dsoundbuffer->pDSB8) { hr = IDirectSoundBuffer8_Stop (dsoundbuffer->pDSB8); } } if (G_LIKELY (!dsoundbuffer->suspended)) { if (G_UNLIKELY(SuspendThread (dsoundbuffer->hThread) == -1)) GST_WARNING ("gst_directsound_ring_buffer_pause: SuspendThread failed."); else dsoundbuffer->suspended = TRUE; } GST_DSOUND_UNLOCK (dsoundbuffer); /* in the unlikely event that a device was reconfigured, we can consider * ourselves stopped even though the stop call failed */ if (G_UNLIKELY (FAILED(hr)) && G_UNLIKELY(hr != DIRECTSOUND_ERROR_DEVICE_RECONFIGURED) && G_UNLIKELY(hr != DIRECTSOUND_ERROR_DEVICE_NO_DRIVER)) { if (dsoundbuffer->is_src) GST_WARNING ("gst_directsound_ring_buffer_pause: IDirectSoundCaptureBuffer_Stop, hr = %X", (unsigned int) hr); else GST_WARNING ("gst_directsound_ring_buffer_pause: IDirectSoundBuffer8_Stop, hr = %X", (unsigned int) hr); return FALSE; } return TRUE; } static gboolean gst_directsound_ring_buffer_resume (GstRingBuffer * buf) { GstDirectSoundRingBuffer * dsoundbuffer; dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); GST_DEBUG ("Resuming RingBuffer"); GST_DSOUND_LOCK (dsoundbuffer); if (G_LIKELY (dsoundbuffer->suspended) && ResumeThread (dsoundbuffer->hThread) != -1) { dsoundbuffer->suspended = FALSE; } else { GST_DSOUND_UNLOCK (dsoundbuffer); GST_WARNING ("gst_directsound_ring_buffer_resume: ResumeThread failed."); return FALSE; } GST_DSOUND_UNLOCK (dsoundbuffer); return TRUE; } static gboolean gst_directsound_ring_buffer_stop (GstRingBuffer * buf) { HRESULT hr; DWORD ret; HANDLE hThread; GstDirectSoundRingBuffer * dsoundbuffer; dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); 
GST_DEBUG ("Stopping RingBuffer"); GST_DSOUND_LOCK (dsoundbuffer); dsoundbuffer->should_run = FALSE; if (dsoundbuffer->is_src) { if (dsoundbuffer->pDSCB8) { hr = IDirectSoundCaptureBuffer_Stop (dsoundbuffer->pDSCB8); if (G_UNLIKELY (FAILED(hr))) { GST_DSOUND_UNLOCK (dsoundbuffer); GST_WARNING ("gst_directsound_ring_buffer_stop: IDirectSoundCaptureBuffer_Stop, hr = %X", (unsigned int) hr); return FALSE; } } } else { if (dsoundbuffer->pDSB8) { hr = IDirectSoundBuffer8_Stop (dsoundbuffer->pDSB8); if (G_UNLIKELY (FAILED(hr))) { GST_DSOUND_UNLOCK (dsoundbuffer); GST_WARNING ("gst_directsound_ring_buffer_stop: IDirectSoundBuffer8_Stop, hr = %X", (unsigned int) hr); return FALSE; } } } hThread = dsoundbuffer->hThread; if (dsoundbuffer->suspended && ResumeThread (hThread) != -1) { dsoundbuffer->suspended = FALSE; } else { GST_DSOUND_UNLOCK (dsoundbuffer); GST_WARNING ("gst_directsound_ring_buffer_stop: ResumeThread failed."); return FALSE; } GST_DSOUND_UNLOCK (dsoundbuffer); /* wait without lock held */ ret = WaitForSingleObject (hThread, 5000); if (G_UNLIKELY (ret == WAIT_TIMEOUT)) { GST_WARNING ("gst_directsound_ring_buffer_stop: Failed to wait for thread shutdown. 
(%u)", (unsigned int) ret); return FALSE; } GST_DSOUND_LOCK (dsoundbuffer); CloseHandle (dsoundbuffer->hThread); dsoundbuffer->hThread = NULL; GST_DSOUND_UNLOCK (dsoundbuffer); return TRUE; } static guint gst_directsound_ring_buffer_delay (GstRingBuffer * buf) { GstDirectSoundRingBuffer * dsoundbuffer; HRESULT hr; DWORD dwCurrentPlayCursor; DWORD dwCurrentWriteCursor; DWORD dwCurrentCaptureCursor; DWORD dwCurrentReadCursor; DWORD dwBytesInQueue = 0; gint nNbSamplesInQueue = 0; dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); GST_DSOUND_LOCK (dsoundbuffer); if (dsoundbuffer->is_src) { if (G_LIKELY (dsoundbuffer->pDSCB8)) { /* evaluate the number of samples in queue in the circular buffer */ hr = IDirectSoundCaptureBuffer8_GetCurrentPosition ( dsoundbuffer->pDSCB8, &dwCurrentCaptureCursor, &dwCurrentReadCursor); if (G_LIKELY (SUCCEEDED (hr))) { if (dwCurrentCaptureCursor >= dsoundbuffer->buffer_circular_offset) dwBytesInQueue = dwCurrentCaptureCursor - dsoundbuffer->buffer_circular_offset; else dwBytesInQueue = dwCurrentCaptureCursor + (dsoundbuffer->buffer_size - dsoundbuffer->buffer_circular_offset); nNbSamplesInQueue = dwBytesInQueue / dsoundbuffer->bytes_per_sample; } else { GST_WARNING ("gst_directsound_ring_buffer_delay: IDirectSoundCaptureBuffer8_GetCurrentPosition, hr = %X", (unsigned int) hr); } } } else { if (G_LIKELY (dsoundbuffer->pDSB8)) { /* evaluate the number of samples in queue in the circular buffer */ hr = IDirectSoundBuffer8_GetCurrentPosition (dsoundbuffer->pDSB8, &dwCurrentPlayCursor, &dwCurrentWriteCursor); if (G_LIKELY (SUCCEEDED (hr))) { if (dwCurrentPlayCursor <= dsoundbuffer->buffer_circular_offset) dwBytesInQueue = dsoundbuffer->buffer_circular_offset - dwCurrentPlayCursor; else dwBytesInQueue = dsoundbuffer->buffer_circular_offset + (dsoundbuffer->buffer_size - dwCurrentPlayCursor); nNbSamplesInQueue = dwBytesInQueue / dsoundbuffer->bytes_per_sample; } else { GST_WARNING ("gst_directsound_ring_buffer_delay: 
IDirectSoundBuffer8_GetCurrentPosition, hr = %X", (unsigned int) hr); } } } GST_DSOUND_UNLOCK (dsoundbuffer); return nNbSamplesInQueue; } static DWORD WINAPI gst_directsound_write_proc (LPVOID lpParameter) { GstRingBuffer * buf; GstDirectSoundRingBuffer * dsoundbuffer; HRESULT hr; DWORD dwStatus; LPVOID pLockedBuffer1 = NULL, pLockedBuffer2 = NULL; DWORD dwSizeBuffer1 = 0, dwSizeBuffer2 = 0; DWORD dwCurrentPlayCursor = 0; gint64 freeBufferSize = 0; guint8 * readptr = NULL; gint readseg = 0; guint len = 0; gint retries = 0; gboolean flushing = FALSE; gboolean should_run = TRUE; gboolean error = FALSE; buf = (GstRingBuffer *) lpParameter; dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); do { GST_DSOUND_LOCK (dsoundbuffer); if (dsoundbuffer->flushing || !dsoundbuffer->pDSB8) { GST_DSOUND_UNLOCK (dsoundbuffer); goto complete; } GST_DSOUND_UNLOCK (dsoundbuffer); restore_buffer: /* get current buffer status */ GST_DSOUND_LOCK (dsoundbuffer); hr = IDirectSoundBuffer8_GetStatus (dsoundbuffer->pDSB8, &dwStatus); GST_DSOUND_UNLOCK (dsoundbuffer); if (dwStatus & DSBSTATUS_BUFFERLOST) { GST_DEBUG ("Buffer was lost, attempting to restore"); GST_DSOUND_LOCK (dsoundbuffer); hr = IDirectSoundBuffer8_Restore (dsoundbuffer->pDSB8); GST_DSOUND_UNLOCK (dsoundbuffer); /* restore may fail again, ensure we restore the * buffer before we continue */ if (FAILED(hr) && hr == DSERR_BUFFERLOST) { if (retries++ < MAX_LOST_RETRIES) { GST_DEBUG ("Unable to restore, trying again"); goto restore_buffer; } else { GST_ELEMENT_ERROR (dsoundbuffer->element, RESOURCE, FAILED, ("%ls.", DXGetErrorDescription9W(hr)), ("gst_directsound_write_proc: IDirectSoundBuffer8_Restore, hr = %X", (unsigned int) hr)); goto complete; } } } /* get current play cursor and write cursor positions */ GST_DSOUND_LOCK (dsoundbuffer); hr = IDirectSoundBuffer8_GetCurrentPosition (dsoundbuffer->pDSB8, &dwCurrentPlayCursor, NULL); GST_DSOUND_UNLOCK (dsoundbuffer); if (G_UNLIKELY (FAILED(hr))) { /* try and reopen the default 
directsound device */ if (hr == DIRECTSOUND_ERROR_DEVICE_RECONFIGURED) { /* we have to wait a while for the sound device removal to actually * be processed before attempting to reopen the device. Yes, this * sucks */ Sleep (2000); GST_DSOUND_LOCK (dsoundbuffer); IDirectSoundBuffer8_Release (dsoundbuffer->pDSB8); dsoundbuffer->pDSB8 = NULL; GST_DSOUND_UNLOCK (dsoundbuffer); if (gst_directsound_ring_buffer_close_device (buf) && gst_directsound_ring_buffer_open_device (buf) && gst_directsound_create_buffer (buf) ) { dsoundbuffer->buffer_circular_offset = 0; goto restore_buffer; } } /* only trigger an error if we're not already in an error state */ if (FAILED(hr) && !error) { GST_ELEMENT_ERROR (dsoundbuffer->element, RESOURCE, FAILED, ("%ls.", DXGetErrorDescription9W(hr)), ("gst_directsound_write_proc: IDirectSoundBuffer8_GetCurrentPosition, hr = %X", (unsigned int) hr)); error = TRUE; goto complete; } } GST_LOG ("Current Play Cursor: %u Current Write Offset: %d", (unsigned int) dwCurrentPlayCursor, dsoundbuffer->buffer_circular_offset); /* calculate the free size of the circular buffer */ GST_DSOUND_LOCK (dsoundbuffer); if (dwCurrentPlayCursor <= dsoundbuffer->buffer_circular_offset) freeBufferSize = dsoundbuffer->buffer_size - (dsoundbuffer->buffer_circular_offset - dwCurrentPlayCursor); else freeBufferSize = dwCurrentPlayCursor - dsoundbuffer->buffer_circular_offset; GST_DSOUND_UNLOCK (dsoundbuffer); if (!gst_ring_buffer_prepare_read (buf, &readseg, &readptr, &len)) goto complete; len -= dsoundbuffer->segoffset; GST_LOG ("Size of segment to write: %d Free buffer size: %lld", len, freeBufferSize); /* If we can't write this into directsound because we don't have enough * space, then start playback if we're currently paused. Then, sleep * for a little while to wait until space is available */ // ###: why >= and not > ? // ###: what happens if this condition is true on the first iteration? // playback totally busted? 
if (len >= freeBufferSize) { if (!(dwStatus & DSBSTATUS_PLAYING)) { GST_DSOUND_LOCK (dsoundbuffer); hr = IDirectSoundBuffer8_Play (dsoundbuffer->pDSB8, 0, 0, DSBPLAY_LOOPING); GST_DSOUND_UNLOCK (dsoundbuffer); if (FAILED(hr)) { GST_WARNING ("gst_directsound_write_proc: IDirectSoundBuffer8_Play, hr = %X", (unsigned int) hr); } } goto complete; } /* lock it */ GST_DSOUND_LOCK (dsoundbuffer); hr = IDirectSoundBuffer8_Lock (dsoundbuffer->pDSB8, dsoundbuffer->buffer_circular_offset, len, &pLockedBuffer1, &dwSizeBuffer1, &pLockedBuffer2, &dwSizeBuffer2, 0L); /* copy chunks */ if (SUCCEEDED (hr)) { // ###: is it possible for len > dwSizeBuffer1 + dwSizeBuffer2 ? if (len <= dwSizeBuffer1) { memcpy (pLockedBuffer1, (LPBYTE) readptr + dsoundbuffer->segoffset, len); } else { memcpy (pLockedBuffer1, (LPBYTE) readptr + dsoundbuffer->segoffset, dwSizeBuffer1); memcpy (pLockedBuffer2, (LPBYTE) readptr + dsoundbuffer->segoffset + dwSizeBuffer1, len - dwSizeBuffer1); } IDirectSoundBuffer8_Unlock (dsoundbuffer->pDSB8, pLockedBuffer1, dwSizeBuffer1, pLockedBuffer2, dwSizeBuffer2); } else { GST_WARNING ("gst_directsound_write_proc: IDirectSoundBuffer8_Lock, hr = %X", (unsigned int) hr); } /* update tracking data */ dsoundbuffer->segoffset += dwSizeBuffer1 + (len - dwSizeBuffer1); dsoundbuffer->buffer_circular_offset += dwSizeBuffer1 + (len - dwSizeBuffer1); dsoundbuffer->buffer_circular_offset %= dsoundbuffer->buffer_size; GST_DSOUND_UNLOCK (dsoundbuffer); freeBufferSize -= dwSizeBuffer1 + (len - dwSizeBuffer1); GST_LOG ("DirectSound Buffer1 Data Size: %u DirectSound Buffer2 Data Size: %u", (unsigned int) dwSizeBuffer1, (unsigned int) dwSizeBuffer2); GST_LOG ("Free buffer size: %lld", freeBufferSize); /* check if we read a whole segment */ GST_DSOUND_LOCK (dsoundbuffer); if (dsoundbuffer->segoffset == dsoundbuffer->segsize) { GST_DSOUND_UNLOCK (dsoundbuffer); /* advance to next segment */ gst_ring_buffer_clear (buf, readseg); gst_ring_buffer_advance (buf, 1); GST_DSOUND_LOCK 
(dsoundbuffer); dsoundbuffer->segoffset = 0; } GST_DSOUND_UNLOCK (dsoundbuffer); complete: GST_DSOUND_LOCK (dsoundbuffer); should_run = dsoundbuffer->should_run; flushing = dsoundbuffer->flushing; retries = 0; GST_DSOUND_UNLOCK (dsoundbuffer); /* it's extremely important to sleep without holding the lock! */ // ###: why >= and not > ? if (len >= freeBufferSize || flushing || error) Sleep (dsoundbuffer->min_sleep_time); } while(should_run); return 0; } static DWORD WINAPI gst_directsound_read_proc (LPVOID lpParameter) { GstRingBuffer * buf; GstDirectSoundRingBuffer * dsoundbuffer; HRESULT hr; DWORD dwStatus; LPVOID pLockedBuffer1 = NULL, pLockedBuffer2 = NULL; DWORD dwSizeBuffer1 = 0, dwSizeBuffer2 = 0; DWORD dwCurrentCaptureCursor = 0; DWORD dwCurrentReadCursor = 0; gint64 capturedBufferSize = 0; guint8 * writeptr = NULL; gint writeseg = 0; guint len = 0; gint retries = 0; gboolean flushing = FALSE; gboolean should_run = TRUE; gboolean error = FALSE; buf = (GstRingBuffer *) lpParameter; dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); do { GST_DSOUND_LOCK (dsoundbuffer); if (dsoundbuffer->flushing || !dsoundbuffer->pDSCB8) { GST_DSOUND_UNLOCK (dsoundbuffer); goto complete; } GST_DSOUND_UNLOCK (dsoundbuffer); restore_buffer: /* get current buffer status */ GST_DSOUND_LOCK (dsoundbuffer); hr = IDirectSoundCaptureBuffer8_GetStatus (dsoundbuffer->pDSCB8, &dwStatus); GST_DSOUND_UNLOCK (dsoundbuffer); // ###: the capture api doesn't seem to have _BUFFERLOST and _Restore, // so commenting this out until it can be investigated. 
#if 0 if (dwStatus & DSBSTATUS_BUFFERLOST) { GST_DEBUG ("Buffer was lost, attempting to restore"); GST_DSOUND_LOCK (dsoundbuffer); hr = IDirectSoundBuffer8_Restore (dsoundbuffer->pDSB8); GST_DSOUND_UNLOCK (dsoundbuffer); /* restore may fail again, ensure we restore the * buffer before we continue */ if (FAILED(hr) && hr == DSERR_BUFFERLOST) { if (retries++ < MAX_LOST_RETRIES) { GST_DEBUG ("Unable to restore, trying again"); goto restore_buffer; } else { GST_ELEMENT_ERROR (dsoundbuffer->element, RESOURCE, FAILED, ("%ls.", DXGetErrorDescription9W(hr)), ("gst_directsound_read_proc: IDirectSoundBuffer8_Restore, hr = %X", (unsigned int) hr)); goto complete; } } } #endif /* Start capturing if not already started */ if (!(dwStatus & DSCBSTATUS_CAPTURING)) { hr = IDirectSoundCaptureBuffer8_Start (dsoundbuffer->pDSCB8, DSCBSTART_LOOPING); if (FAILED(hr)) { GST_WARNING ("gst_directsound_read_proc: IDirectSoundCaptureBuffer8_Start, hr = %X", (unsigned int) hr); } } /* get current capture cursor and read cursor positions */ GST_DSOUND_LOCK (dsoundbuffer); hr = IDirectSoundCaptureBuffer8_GetCurrentPosition (dsoundbuffer->pDSCB8, &dwCurrentCaptureCursor, &dwCurrentReadCursor); GST_DSOUND_UNLOCK (dsoundbuffer); if (G_UNLIKELY (FAILED(hr))) { /* try and reopen the default directsound device */ if (hr == DIRECTSOUND_ERROR_DEVICE_RECONFIGURED) { /* we have to wait a while for the sound device removal to actually * be processed before attempting to reopen the device. 
Yes, this * sucks */ Sleep (2000); GST_DSOUND_LOCK (dsoundbuffer); IDirectSoundCaptureBuffer8_Release (dsoundbuffer->pDSCB8); dsoundbuffer->pDSCB8 = NULL; GST_DSOUND_UNLOCK (dsoundbuffer); if (gst_directsound_ring_buffer_close_device (buf) && gst_directsound_ring_buffer_open_device (buf) && gst_directsound_create_buffer (buf) ) { dsoundbuffer->buffer_circular_offset = 0; goto restore_buffer; } } /* only trigger an error if we're not already in an error state */ if (FAILED(hr) && !error) { GST_ELEMENT_ERROR (dsoundbuffer->element, RESOURCE, FAILED, ("%ls.", DXGetErrorDescription9W(hr)), ("gst_directsound_read_proc: IDirectSoundCaptureBuffer8_GetCurrentPosition, hr = %X", (unsigned int) hr)); error = TRUE; goto complete; } } GST_LOG ("Current Read Start: %u Current Read End: %d", dsoundbuffer->buffer_circular_offset, (unsigned int) dwCurrentReadCursor); /* calculate the captured amount in the circular buffer */ GST_DSOUND_LOCK (dsoundbuffer); if (dwCurrentReadCursor >= dsoundbuffer->buffer_circular_offset) capturedBufferSize = dwCurrentReadCursor - dsoundbuffer->buffer_circular_offset; else capturedBufferSize = dsoundbuffer->buffer_size - (dsoundbuffer->buffer_circular_offset - dwCurrentReadCursor); GST_DSOUND_UNLOCK (dsoundbuffer); if (!gst_ring_buffer_prepare_read (buf, &writeseg, &writeptr, &len)) goto complete; len -= dsoundbuffer->segoffset; if (len > capturedBufferSize) len = capturedBufferSize; GST_LOG ("Size of segment to read: %d Captured buffer size: %lld", len, capturedBufferSize); /* If we can't read from directsound because we don't have enough * captured data, then sleep for a little while to wait until data is * available */ // ###: why >= and not > ? // ###: what happens if this condition is true on the first iteration? // capture totally busted? 
//if (len >= capturedBufferSize) { // goto complete; //} if (len <= 0) goto complete; /* lock it */ GST_DSOUND_LOCK (dsoundbuffer); hr = IDirectSoundCaptureBuffer8_Lock (dsoundbuffer->pDSCB8, dsoundbuffer->buffer_circular_offset, len, &pLockedBuffer1, &dwSizeBuffer1, &pLockedBuffer2, &dwSizeBuffer2, 0L); /* copy chunks */ if (SUCCEEDED (hr)) { if (len <= dwSizeBuffer1) { memcpy ((LPBYTE) writeptr + dsoundbuffer->segoffset, pLockedBuffer1, len); } else { memcpy ((LPBYTE) writeptr + dsoundbuffer->segoffset, pLockedBuffer1, dwSizeBuffer1); memcpy ((LPBYTE) writeptr + dsoundbuffer->segoffset + dwSizeBuffer1, pLockedBuffer2, len - dwSizeBuffer1); } IDirectSoundCaptureBuffer8_Unlock (dsoundbuffer->pDSCB8, pLockedBuffer1, dwSizeBuffer1, pLockedBuffer2, dwSizeBuffer2); } else { GST_WARNING ("gst_directsound_read_proc: IDirectSoundCaptureBuffer8_Lock, hr = %X", (unsigned int) hr); } /* update tracking data */ dsoundbuffer->segoffset += len; dsoundbuffer->buffer_circular_offset += len; dsoundbuffer->buffer_circular_offset %= dsoundbuffer->buffer_size; GST_DSOUND_UNLOCK (dsoundbuffer); capturedBufferSize -= len; GST_LOG ("DirectSound Buffer1 Data Size: %u DirectSound Buffer2 Data Size: %u", (unsigned int) dwSizeBuffer1, (unsigned int) dwSizeBuffer2); GST_LOG ("Captured buffer size remaining: %lld", capturedBufferSize); /* check if we wrote a whole segment */ GST_DSOUND_LOCK (dsoundbuffer); if (dsoundbuffer->segoffset == dsoundbuffer->segsize) { GST_DSOUND_UNLOCK (dsoundbuffer); /* advance to next segment */ gst_ring_buffer_advance (buf, 1); GST_DSOUND_LOCK (dsoundbuffer); dsoundbuffer->segoffset = 0; } GST_DSOUND_UNLOCK (dsoundbuffer); complete: GST_DSOUND_LOCK (dsoundbuffer); should_run = dsoundbuffer->should_run; flushing = dsoundbuffer->flushing; retries = 0; GST_DSOUND_UNLOCK (dsoundbuffer); /* it's extremely important to sleep without holding the lock! */ if (len >= capturedBufferSize || flushing || error) Sleep (dsoundbuffer->min_sleep_time); } while(should_run); return 0; }

psimedia-master/gstprovider/gstelements/directsound/gstdirectsoundringbuffer.h

/* GStreamer * Copyright (C) 2005 Sebastien Moutte * Copyright (C) 2007-2009 Pioneers of the Inevitable * Copyright (C) 2009 Barracuda Networks, Inc. * * gstdirectsoundringbuffer.h: * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. 
* * The development of this code was made possible due to the involvement * of Pioneers of the Inevitable, the creators of the Songbird Music player * */ #ifndef __GST_DIRECTSOUNDRINGBUFFER_H__ #define __GST_DIRECTSOUNDRINGBUFFER_H__ #include <gst/gst.h> #include <gst/audio/gstringbuffer.h> #include "gstdirectsound.h" G_BEGIN_DECLS #define GST_DSOUND_LOCK(obj) (g_mutex_lock (obj->dsound_lock)) #define GST_DSOUND_UNLOCK(obj) (g_mutex_unlock (obj->dsound_lock)) #define GST_TYPE_DIRECTSOUND_RING_BUFFER \ (gst_directsound_ring_buffer_get_type()) #define GST_DIRECTSOUND_RING_BUFFER(obj) \ (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_DIRECTSOUND_RING_BUFFER,GstDirectSoundRingBuffer)) #define GST_DIRECTSOUND_RING_BUFFER_CLASS(klass) \ (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_DIRECTSOUND_RING_BUFFER,GstDirectSoundRingBufferClass)) #define GST_DIRECTSOUND_RING_BUFFER_GET_CLASS(obj) \ (G_TYPE_INSTANCE_GET_CLASS((obj),GST_TYPE_DIRECTSOUND_RING_BUFFER,GstDirectSoundRingBufferClass)) #define GST_IS_DIRECTSOUND_RING_BUFFER(obj) \ (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_DIRECTSOUND_RING_BUFFER)) #define GST_IS_DIRECTSOUND_RING_BUFFER_CLASS(klass) \ (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_DIRECTSOUND_RING_BUFFER)) typedef struct _GstDirectSoundRingBuffer GstDirectSoundRingBuffer; typedef struct _GstDirectSoundRingBufferClass GstDirectSoundRingBufferClass; struct _GstDirectSoundRingBuffer { GstRingBuffer object; /* FALSE for playback, TRUE for capture */ gboolean is_src; /* related element, either GstDirectSoundSink or GstDirectSoundSrc */ GstElement * element; /* lock used to protect writes and resets */ GMutex * dsound_lock; /* directsound buffer waveformat description */ WAVEFORMATEX wave_format; /* directsound object interface pointer */ LPDIRECTSOUND8 pDS8; /* directsound sound object interface pointer */ LPDIRECTSOUNDBUFFER8 pDSB8; /* directsound capture object interface pointer */ LPDIRECTSOUNDCAPTURE8 pDSC8; /* directsound capture sound object interface pointer */ LPDIRECTSOUNDCAPTUREBUFFER8 pDSCB8; /* 
directsound buffer size */ guint buffer_size; /* directsound buffer read/write offset */ guint buffer_circular_offset; /* minimum buffer size before playback start */ guint min_buffer_size; /* minimum sleep time for thread */ guint min_sleep_time; /* ringbuffer bytes per sample */ guint bytes_per_sample; /* ringbuffer segment size */ gint segsize; /* ring buffer offset */ guint segoffset; /* thread */ HANDLE hThread; /* thread suspended? */ gboolean suspended; /* should run thread */ gboolean should_run; /* are we currently flushing? */ gboolean flushing; /* current volume */ gdouble volume; }; struct _GstDirectSoundRingBufferClass { GstRingBufferClass parent_class; }; GType gst_directsound_ring_buffer_get_type (void); G_END_DECLS #endif /* __GST_DIRECTSOUNDRINGBUFFER_H__ */

psimedia-master/gstprovider/gstelements/directsound/gstdirectsoundsink.c

/* GStreamer * Copyright (C) 2005 Sebastien Moutte * Copyright (C) 2007-2009 Pioneers of the Inevitable * * gstdirectsoundsink.c: * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. 
* * The development of this code was made possible due to the involvement * of Pioneers of the Inevitable, the creators of the Songbird Music player * */ /** * SECTION:element-directsoundsink * * This element lets you output sound using the DirectSound API. * * Note that you should almost always use generic audio conversion elements * like audioconvert and audioresample in front of an audiosink to make sure * your pipeline works under all circumstances (those conversion elements will * act in passthrough-mode if no conversion is necessary). * * * Example pipelines * |[ * gst-launch -v audiotestsrc ! audioconvert ! volume volume=0.1 ! directsoundsink * ]| will output a sine wave (continuous beep sound) to your sound card (with * a very low volume as precaution). * |[ * gst-launch -v filesrc location=music.ogg ! decodebin ! audioconvert ! audioresample ! directsoundsink * ]| will play an Ogg/Vorbis audio file and output it. * */ #ifdef HAVE_CONFIG_H #include "config.h" #endif #include "gstdirectsoundsink.h" #include #define GST_CAT_DEFAULT directsound /* elementfactory information */ static const GstElementDetails gst_directsound_sink_details = GST_ELEMENT_DETAILS ("DirectSound8 Audio Sink", "Sink/Audio", "Output to a sound card via DirectSound8", "Ghislain 'Aus' Lacroix "); static GstStaticPadTemplate directsoundsink_sink_factory = GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, GST_STATIC_CAPS ("audio/x-raw-int, " "endianness = (int) LITTLE_ENDIAN, " "signed = (boolean) TRUE, " "width = (int) {8, 16}, " "depth = (int) {8, 16}, " "rate = (int) [ 1, MAX ], " "channels = (int) [ 1, 2 ]")); static void gst_directsound_sink_init_interfaces (GType type); static void gst_directsound_sink_base_init (gpointer g_class); static void gst_directsound_sink_class_init (GstDirectSoundSinkClass * klass); static void gst_directsound_sink_init (GstDirectSoundSink * dsoundsink, GstDirectSoundSinkClass * g_class); static void gst_directsound_sink_dispose (GObject * 
object); static gboolean gst_directsound_sink_event (GstBaseSink * bsink, GstEvent * event); static void gst_directsound_sink_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); static void gst_directsound_sink_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec); static GstRingBuffer * gst_directsound_sink_create_ringbuffer ( GstBaseAudioSink * sink); enum { ARG_0, ARG_VOLUME, ARG_DEVICE, ARG_DEVICE_NAME }; GST_BOILERPLATE_FULL (GstDirectSoundSink, gst_directsound_sink, GstBaseAudioSink, GST_TYPE_BASE_AUDIO_SINK, gst_directsound_sink_init_interfaces); static gboolean device_set_default (GstDirectSoundSink * sink) { GList * list; gst_directsound_device * dev; gboolean ret; /* obtain the device list */ list = gst_directsound_playback_device_list (sink); if (!list) return FALSE; ret = FALSE; /* the first item is the default */ if (g_list_length (list) >= 1) { dev = (gst_directsound_device *) list->data; /* take the strings, no need to copy */ sink->device_id = dev->id; sink->device_name = dev->name; dev->id = NULL; dev->name = NULL; /* null out the item */ gst_directsound_device_free (dev); list->data = NULL; ret = TRUE; } gst_directsound_device_list_free (list); return ret; } static gboolean device_get_name (GstDirectSoundSink * sink) { GList * l, * list; gst_directsound_device * dev; gboolean ret; /* if there is no device set, then attempt to set up with the default, * which will also grab the name in the process. 
*/ if (!sink->device_id) return device_set_default (sink); /* if we already have a name, free it */ if (sink->device_name) { g_free (sink->device_name); sink->device_name = NULL; } /* obtain the device list */ list = gst_directsound_playback_device_list (sink); if (!list) return FALSE; ret = FALSE; /* look up the id */ for (l = list; l != NULL; l = l->next) { dev = (gst_directsound_device *) l->data; if (g_str_equal (dev->id, sink->device_id)) { /* take the string, no need to copy */ sink->device_name = dev->name; dev->name = NULL; ret = TRUE; break; } } gst_directsound_device_list_free (list); return ret; } static void gst_directsound_sink_base_init (gpointer g_class) { GstElementClass * element_class = GST_ELEMENT_CLASS (g_class); gst_element_class_set_details (element_class, &gst_directsound_sink_details); gst_element_class_add_pad_template (element_class, gst_static_pad_template_get (&directsoundsink_sink_factory)); } static void gst_directsound_sink_class_init (GstDirectSoundSinkClass * klass) { GObjectClass * gobject_class; GstElementClass * gstelement_class; GstBaseSinkClass * gstbasesink_class; GstBaseAudioSinkClass * gstbaseaudiosink_class; gobject_class = (GObjectClass *) klass; gstelement_class = (GstElementClass *) klass; gstbasesink_class = (GstBaseSinkClass *) klass; gstbaseaudiosink_class = (GstBaseAudioSinkClass *) klass; parent_class = g_type_class_peek_parent (klass); gobject_class->dispose = gst_directsound_sink_dispose; gobject_class->set_property = GST_DEBUG_FUNCPTR (gst_directsound_sink_set_property); gobject_class->get_property = GST_DEBUG_FUNCPTR (gst_directsound_sink_get_property); g_object_class_install_property (gobject_class, ARG_VOLUME, g_param_spec_double ("volume", "Volume", "Volume of this stream", 0, 1.0, 1.0, G_PARAM_READWRITE)); g_object_class_install_property (gobject_class, ARG_DEVICE, g_param_spec_string ("device", "Device", "DirectSound playback device as a GUID string", NULL, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); 
g_object_class_install_property (gobject_class, ARG_DEVICE_NAME, g_param_spec_string ("device-name", "Device name", "Human-readable name of the audio device", NULL, G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); gstbasesink_class->event = GST_DEBUG_FUNCPTR (gst_directsound_sink_event); gstbaseaudiosink_class->create_ringbuffer = GST_DEBUG_FUNCPTR (gst_directsound_sink_create_ringbuffer); } static void gst_directsound_sink_init (GstDirectSoundSink * dsoundsink, GstDirectSoundSinkClass * g_class) { dsoundsink->dsoundbuffer = NULL; dsoundsink->volume = 1.0; dsoundsink->device_id = NULL; dsoundsink->device_name = NULL; } static void gst_directsound_sink_dispose (GObject * object) { GstDirectSoundSink * self = GST_DIRECTSOUND_SINK (object); GST_DEBUG_OBJECT (object, G_STRFUNC); if (self->device_id) { g_free (self->device_id); self->device_id = NULL; } if (self->device_name) { g_free (self->device_name); self->device_name = NULL; } G_OBJECT_CLASS (parent_class)->dispose (object); } static gboolean gst_directsound_sink_event (GstBaseSink * bsink, GstEvent * event) { HRESULT hr; DWORD dwStatus; DWORD dwSizeBuffer = 0; LPVOID pLockedBuffer = NULL; GstDirectSoundSink * dsoundsink; dsoundsink = GST_DIRECTSOUND_SINK (bsink); GST_BASE_SINK_CLASS (parent_class)->event (bsink, event); /* no buffer, no event to process */ if (!dsoundsink->dsoundbuffer) return TRUE; switch (GST_EVENT_TYPE (event)) { case GST_EVENT_FLUSH_START: GST_DSOUND_LOCK (dsoundsink->dsoundbuffer); dsoundsink->dsoundbuffer->flushing = TRUE; GST_DSOUND_UNLOCK (dsoundsink->dsoundbuffer); break; case GST_EVENT_FLUSH_STOP: GST_DSOUND_LOCK (dsoundsink->dsoundbuffer); dsoundsink->dsoundbuffer->flushing = FALSE; if (dsoundsink->dsoundbuffer->pDSB8) { hr = IDirectSoundBuffer8_GetStatus (dsoundsink->dsoundbuffer->pDSB8, &dwStatus); if (FAILED(hr)) { GST_DSOUND_UNLOCK (dsoundsink->dsoundbuffer); GST_WARNING("gst_directsound_sink_event: IDirectSoundBuffer8_GetStatus, hr = %X", (unsigned int) hr); return FALSE; } if 
(!(dwStatus & DSBSTATUS_PLAYING)) { /* reset position */ hr = IDirectSoundBuffer8_SetCurrentPosition (dsoundsink->dsoundbuffer->pDSB8, 0); dsoundsink->dsoundbuffer->buffer_circular_offset = 0; /* reset the buffer */ hr = IDirectSoundBuffer8_Lock (dsoundsink->dsoundbuffer->pDSB8, dsoundsink->dsoundbuffer->buffer_circular_offset, 0L, &pLockedBuffer, &dwSizeBuffer, NULL, NULL, DSBLOCK_ENTIREBUFFER); if (SUCCEEDED (hr)) { memset (pLockedBuffer, 0, dwSizeBuffer); hr = IDirectSoundBuffer8_Unlock (dsoundsink->dsoundbuffer->pDSB8, pLockedBuffer, dwSizeBuffer, NULL, 0); if (FAILED(hr)) { GST_DSOUND_UNLOCK (dsoundsink->dsoundbuffer); GST_WARNING("gst_directsound_sink_event: IDirectSoundBuffer8_Unlock, hr = %X", (unsigned int) hr); return FALSE; } } else { GST_DSOUND_UNLOCK (dsoundsink->dsoundbuffer); GST_WARNING ( "gst_directsound_sink_event: IDirectSoundBuffer8_Lock, hr = %X", (unsigned int) hr); return FALSE; } } } GST_DSOUND_UNLOCK (dsoundsink->dsoundbuffer); break; default: break; } return TRUE; } static void gst_directsound_sink_set_volume (GstDirectSoundSink * dsoundsink) { if (dsoundsink->dsoundbuffer) { GST_DSOUND_LOCK (dsoundsink->dsoundbuffer); dsoundsink->dsoundbuffer->volume = dsoundsink->volume; if (dsoundsink->dsoundbuffer->pDSB8) { gst_directsound_set_volume (dsoundsink->dsoundbuffer->pDSB8, dsoundsink->volume); } GST_DSOUND_UNLOCK (dsoundsink->dsoundbuffer); } } static void gst_directsound_sink_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) { GstDirectSoundSink * sink = GST_DIRECTSOUND_SINK (object); switch (prop_id) { case ARG_VOLUME: sink->volume = g_value_get_double (value); gst_directsound_sink_set_volume (sink); break; case ARG_DEVICE: if (sink->device_id) { g_free (sink->device_id); sink->device_id = NULL; } if (sink->device_name) { g_free (sink->device_name); sink->device_name = NULL; } sink->device_id = g_strdup (g_value_get_string (value)); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, 
pspec); break; } } static void gst_directsound_sink_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec) { GstDirectSoundSink * sink = GST_DIRECTSOUND_SINK (object); switch (prop_id) { case ARG_VOLUME: g_value_set_double (value, sink->volume); break; case ARG_DEVICE: if (!sink->device_id) device_set_default (sink); g_value_set_string (value, sink->device_id); break; case ARG_DEVICE_NAME: if (!sink->device_name) device_get_name (sink); g_value_set_string (value, sink->device_name); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } /* GstBaseAudioSink vmethod implementations */ static GstRingBuffer * gst_directsound_sink_create_ringbuffer (GstBaseAudioSink * sink) { GstDirectSoundSink * dsoundsink; GstDirectSoundRingBuffer * ringbuffer; dsoundsink = GST_DIRECTSOUND_SINK (sink); GST_DEBUG ("creating ringbuffer"); ringbuffer = g_object_new (GST_TYPE_DIRECTSOUND_RING_BUFFER, NULL); GST_DEBUG ("directsound sink 0x%p", dsoundsink); /* playback */ ringbuffer->is_src = FALSE; /* set the sink element on the ringbuffer for error messages */ ringbuffer->element = GST_ELEMENT (dsoundsink); /* set the ringbuffer on the sink */ dsoundsink->dsoundbuffer = ringbuffer; /* set initial volume on ringbuffer */ dsoundsink->dsoundbuffer->volume = dsoundsink->volume; return GST_RING_BUFFER (ringbuffer); } static const GList * probe_get_properties (GstPropertyProbe * probe) { GObjectClass * klass = G_OBJECT_GET_CLASS (probe); static GList * list = NULL; // ###: from gstalsadeviceprobe.c /* well, not perfect, but better than no locking at all. * In the worst case we leak a list node, so who cares? 
*/ GST_CLASS_LOCK (GST_OBJECT_CLASS (klass)); if (!list) { GParamSpec * pspec; pspec = g_object_class_find_property (klass, "device"); list = g_list_append (NULL, pspec); } GST_CLASS_UNLOCK (GST_OBJECT_CLASS (klass)); return list; } static void probe_probe_property (GstPropertyProbe * probe, guint prop_id, const GParamSpec * pspec) { /* we do nothing in here. the actual "probe" occurs in get_values(), * which is a common practice when not caching responses. */ if (!g_str_equal (pspec->name, "device")) { G_OBJECT_WARN_INVALID_PROPERTY_ID (probe, prop_id, pspec); } } static gboolean probe_needs_probe (GstPropertyProbe * probe, guint prop_id, const GParamSpec * pspec) { /* don't cache probed data */ return TRUE; } static GValueArray * probe_get_values (GstPropertyProbe * probe, guint prop_id, const GParamSpec * pspec) { //GstDirectSoundSink * sink; GValueArray * array; GValue value = { 0, }; GList * l, * list; gst_directsound_device * dev; if (!g_str_equal (pspec->name, "device")) { G_OBJECT_WARN_INVALID_PROPERTY_ID (probe, prop_id, pspec); return NULL; } //sink = GST_DIRECTSOUND_SINK (probe); list = gst_directsound_playback_device_list (); if (list == NULL) { GST_LOG_OBJECT (probe, "No devices found"); return NULL; } array = g_value_array_new (g_list_length (list)); g_value_init (&value, G_TYPE_STRING); for (l = list; l != NULL; l = l->next) { dev = (gst_directsound_device *) l->data; GST_LOG_OBJECT (probe, "Found device: id=[%s] name=[%s]", dev->id, dev->name); g_value_take_string (&value, dev->id); dev->id = NULL; gst_directsound_device_free (dev); l->data = NULL; g_value_array_append (array, &value); } g_value_unset (&value); g_list_free (list); return array; } static void gst_directsound_sink_property_probe_interface_init (GstPropertyProbeInterface * iface) { iface->get_properties = probe_get_properties; iface->probe_property = probe_probe_property; iface->needs_probe = probe_needs_probe; iface->get_values = probe_get_values; } static gboolean 
gst_directsound_sink_iface_supported (GstImplementsInterface * iface, GType iface_type) { // FIXME: shouldn't this be TRUE? (at least for the probe type?) return FALSE; } static void gst_directsound_sink_interface_init (GstImplementsInterfaceClass * klass) { /* default virtual functions */ klass->supported = gst_directsound_sink_iface_supported; } static void gst_directsound_sink_init_interfaces (GType type) { static const GInterfaceInfo implements_iface_info = { (GInterfaceInitFunc) gst_directsound_sink_interface_init, NULL, NULL, }; static const GInterfaceInfo probe_iface_info = { (GInterfaceInitFunc) gst_directsound_sink_property_probe_interface_init, NULL, NULL, }; g_type_add_interface_static (type, GST_TYPE_IMPLEMENTS_INTERFACE, &implements_iface_info); g_type_add_interface_static (type, GST_TYPE_PROPERTY_PROBE, &probe_iface_info); }

psimedia-master/gstprovider/gstelements/directsound/gstdirectsoundsink.h

/* GStreamer * Copyright (C) 2005 Sebastien Moutte * Copyright (C) 2007-2009 Pioneers of the Inevitable * * gstdirectsoundsink.h: * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. 
* * The development of this code was made possible due to the involvement * of Pioneers of the Inevitable, the creators of the Songbird Music player * */ #ifndef __GST_DIRECTSOUNDSINK_H__ #define __GST_DIRECTSOUNDSINK_H__ #include #include #include "gstdirectsound.h" #include "gstdirectsoundringbuffer.h" G_BEGIN_DECLS #define GST_TYPE_DIRECTSOUND_SINK \ (gst_directsound_sink_get_type()) #define GST_DIRECTSOUND_SINK(obj) \ (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_DIRECTSOUND_SINK,GstDirectSoundSink)) #define GST_DIRECTSOUND_SINK_CLASS(klass) \ (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_DIRECTSOUND_SINK,GstDirectSoundSinkClass)) #define GST_IS_DIRECTSOUND_SINK(obj) \ (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_DIRECTSOUND_SINK)) #define GST_IS_DIRECTSOUND_SINK_CLASS(klass) \ (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_DIRECTSOUND_SINK)) typedef struct _GstDirectSoundSink GstDirectSoundSink; typedef struct _GstDirectSoundSinkClass GstDirectSoundSinkClass; struct _GstDirectSoundSink { /* base audio sink */ GstBaseAudioSink sink; /* ringbuffer */ GstDirectSoundRingBuffer * dsoundbuffer; /* current volume */ gdouble volume; gchar * device_id; gchar * device_name; }; struct _GstDirectSoundSinkClass { GstBaseAudioSinkClass parent_class; }; GType gst_directsound_sink_get_type (void); G_END_DECLS #endif /* __GST_DIRECTSOUNDSINK_H__ */ psimedia-master/gstprovider/gstelements/directsound/gstdirectsoundsrc.c000066400000000000000000000365771220046403000273150ustar00rootroot00000000000000/* GStreamer * Copyright (C) 2005 Sebastien Moutte * Copyright (C) 2007-2009 Pioneers of the Inevitable * Copyright (C) 2009 Barracuda Networks, Inc. * * gstdirectsoundsrc.c: * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. 
* * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. * * The development of this code was made possible due to the involvement * of Pioneers of the Inevitable, the creators of the Songbird Music player * */ #ifdef HAVE_CONFIG_H #include "config.h" #endif #include "gstdirectsoundsrc.h" #include #define GST_CAT_DEFAULT directsound /* elementfactory information */ static const GstElementDetails gst_directsound_src_details = GST_ELEMENT_DETAILS ("DirectSound8 Audio Source", "Source/Audio", "Input from a sound card via DirectSound8", "Justin Karneges "); static GstStaticPadTemplate directsoundsrc_src_factory = GST_STATIC_PAD_TEMPLATE ("src", GST_PAD_SRC, GST_PAD_ALWAYS, GST_STATIC_CAPS ("audio/x-raw-int, " "endianness = (int) LITTLE_ENDIAN, " "signed = (boolean) TRUE, " "width = (int) {8, 16}, " "depth = (int) {8, 16}, " "rate = (int) [ 1, MAX ], " "channels = (int) 1")); static void gst_directsound_src_init_interfaces (GType type); static void gst_directsound_src_base_init (gpointer g_class); static void gst_directsound_src_class_init (GstDirectSoundSrcClass * klass); static void gst_directsound_src_init (GstDirectSoundSrc * dsoundsrc, GstDirectSoundSrcClass * g_class); static void gst_directsound_src_dispose (GObject * object); static gboolean gst_directsound_src_event (GstBaseSrc * bsrc, GstEvent * event); static void gst_directsound_src_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); static void gst_directsound_src_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec); static 
GstRingBuffer * gst_directsound_src_create_ringbuffer ( GstBaseAudioSrc * src); enum { ARG_0, //ARG_VOLUME, ARG_DEVICE, ARG_DEVICE_NAME }; GST_BOILERPLATE_FULL (GstDirectSoundSrc, gst_directsound_src, GstBaseAudioSrc, GST_TYPE_BASE_AUDIO_SRC, gst_directsound_src_init_interfaces); static gboolean device_set_default (GstDirectSoundSrc * src) { GList * list; gst_directsound_device * dev; gboolean ret; /* obtain the device list */ list = gst_directsound_capture_device_list (src); if (!list) return FALSE; ret = FALSE; /* the first item is the default */ if (g_list_length (list) >= 1) { dev = (gst_directsound_device *) list->data; /* take the strings, no need to copy */ src->device_id = dev->id; src->device_name = dev->name; dev->id = NULL; dev->name = NULL; /* null out the item */ gst_directsound_device_free (dev); list->data = NULL; ret = TRUE; } gst_directsound_device_list_free (list); return ret; } static gboolean device_get_name (GstDirectSoundSrc * src) { GList * l, * list; gst_directsound_device * dev; gboolean ret; /* if there is no device set, then attempt to set up with the default, * which will also grab the name in the process. 
*/ if (!src->device_id) return device_set_default (src); /* if we already have a name, free it */ if (src->device_name) { g_free (src->device_name); src->device_name = NULL; } /* obtain the device list */ list = gst_directsound_capture_device_list (src); if (!list) return FALSE; ret = FALSE; /* look up the id */ for (l = list; l != NULL; l = l->next) { dev = (gst_directsound_device *) l->data; if (g_str_equal (dev->id, src->device_id)) { /* take the string, no need to copy */ src->device_name = dev->name; dev->name = NULL; ret = TRUE; break; } } gst_directsound_device_list_free (list); return ret; } static void gst_directsound_src_base_init (gpointer g_class) { GstElementClass * element_class = GST_ELEMENT_CLASS (g_class); gst_element_class_set_details (element_class, &gst_directsound_src_details); gst_element_class_add_pad_template (element_class, gst_static_pad_template_get (&directsoundsrc_src_factory)); } static void gst_directsound_src_class_init (GstDirectSoundSrcClass * klass) { GObjectClass * gobject_class; GstElementClass * gstelement_class; GstBaseSrcClass * gstbasesrc_class; GstBaseAudioSrcClass * gstbaseaudiosrc_class; gobject_class = (GObjectClass *) klass; gstelement_class = (GstElementClass *) klass; gstbasesrc_class = (GstBaseSrcClass *) klass; gstbaseaudiosrc_class = (GstBaseAudioSrcClass *) klass; parent_class = g_type_class_peek_parent (klass); gobject_class->dispose = gst_directsound_src_dispose; gobject_class->set_property = GST_DEBUG_FUNCPTR (gst_directsound_src_set_property); gobject_class->get_property = GST_DEBUG_FUNCPTR (gst_directsound_src_get_property); /*g_object_class_install_property (gobject_class, ARG_VOLUME, g_param_spec_double ("volume", "Volume", "Volume of this stream", 0, 1.0, 1.0, G_PARAM_READWRITE));*/ g_object_class_install_property (gobject_class, ARG_DEVICE, g_param_spec_string ("device", "Device", "DirectSound capture device as a GUID string", NULL, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); 
g_object_class_install_property (gobject_class, ARG_DEVICE_NAME, g_param_spec_string ("device-name", "Device name", "Human-readable name of the audio device", NULL, G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); gstbasesrc_class->event = GST_DEBUG_FUNCPTR (gst_directsound_src_event); gstbaseaudiosrc_class->create_ringbuffer = GST_DEBUG_FUNCPTR (gst_directsound_src_create_ringbuffer); } static void gst_directsound_src_init (GstDirectSoundSrc * dsoundsrc, GstDirectSoundSrcClass * g_class) { dsoundsrc->dsoundbuffer = NULL; dsoundsrc->volume = 1.0; dsoundsrc->device_id = NULL; dsoundsrc->device_name = NULL; } static void gst_directsound_src_dispose (GObject * object) { GstDirectSoundSrc * self = GST_DIRECTSOUND_SRC (object); GST_DEBUG_OBJECT (object, G_STRFUNC); if (self->device_id) { g_free (self->device_id); self->device_id = NULL; } if (self->device_name) { g_free (self->device_name); self->device_name = NULL; } G_OBJECT_CLASS (parent_class)->dispose (object); } static gboolean gst_directsound_src_event (GstBaseSrc * bsrc, GstEvent * event) { HRESULT hr; DWORD dwStatus; //DWORD dwSizeBuffer = 0; //LPVOID pLockedBuffer = NULL; GstDirectSoundSrc * dsoundsrc; dsoundsrc = GST_DIRECTSOUND_SRC (bsrc); GST_BASE_SRC_CLASS (parent_class)->event (bsrc, event); /* no buffer, no event to process */ if (!dsoundsrc->dsoundbuffer) return TRUE; switch (GST_EVENT_TYPE (event)) { case GST_EVENT_FLUSH_START: GST_DSOUND_LOCK (dsoundsrc->dsoundbuffer); dsoundsrc->dsoundbuffer->flushing = TRUE; GST_DSOUND_UNLOCK (dsoundsrc->dsoundbuffer); break; case GST_EVENT_FLUSH_STOP: GST_DSOUND_LOCK (dsoundsrc->dsoundbuffer); dsoundsrc->dsoundbuffer->flushing = FALSE; if (dsoundsrc->dsoundbuffer->pDSCB8) { hr = IDirectSoundCaptureBuffer8_GetStatus (dsoundsrc->dsoundbuffer->pDSCB8, &dwStatus); if (FAILED(hr)) { GST_DSOUND_UNLOCK (dsoundsrc->dsoundbuffer); GST_WARNING("gst_directsound_src_event: IDirectSoundCaptureBuffer8_GetStatus, hr = %X", (unsigned int) hr); return FALSE; } if (!(dwStatus & 
DSCBSTATUS_CAPTURING)) { // ###: capture api doesn't support _SetCurrentPosition. commenting // out for now. #if 0 /* reset position */ hr = IDirectSoundBuffer8_SetCurrentPosition (dsoundsink->dsoundbuffer->pDSB8, 0); dsoundsink->dsoundbuffer->buffer_circular_offset = 0; /* reset the buffer */ hr = IDirectSoundBuffer8_Lock (dsoundsink->dsoundbuffer->pDSB8, dsoundsink->dsoundbuffer->buffer_circular_offset, 0L, &pLockedBuffer, &dwSizeBuffer, NULL, NULL, DSBLOCK_ENTIREBUFFER); if (SUCCEEDED (hr)) { memset (pLockedBuffer, 0, dwSizeBuffer); hr = IDirectSoundBuffer8_Unlock (dsoundsink->dsoundbuffer->pDSB8, pLockedBuffer, dwSizeBuffer, NULL, 0); if (FAILED(hr)) { GST_DSOUND_UNLOCK (dsoundsink->dsoundbuffer); GST_WARNING("gst_directsound_sink_event: IDirectSoundBuffer8_Unlock, hr = %X", (unsigned int) hr); return FALSE; } } else { GST_DSOUND_UNLOCK (dsoundsink->dsoundbuffer); GST_WARNING ( "gst_directsound_sink_event: IDirectSoundBuffer8_Lock, hr = %X", (unsigned int) hr); return FALSE; } #endif } } GST_DSOUND_UNLOCK (dsoundsrc->dsoundbuffer); break; default: break; } return TRUE; } /*static void gst_directsound_sink_set_volume (GstDirectSoundSink * dsoundsink) { if (dsoundsink->dsoundbuffer) { GST_DSOUND_LOCK (dsoundsink->dsoundbuffer); dsoundsink->dsoundbuffer->volume = dsoundsink->volume; if (dsoundsink->dsoundbuffer->pDSB8) { gst_directsound_set_volume (dsoundsink->dsoundbuffer->pDSB8, dsoundsink->volume); } GST_DSOUND_UNLOCK (dsoundsink->dsoundbuffer); } }*/ static void gst_directsound_src_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) { GstDirectSoundSrc * src = GST_DIRECTSOUND_SRC (object); switch (prop_id) { /*case ARG_VOLUME: sink->volume = g_value_get_double (value); gst_directsound_sink_set_volume (sink); break;*/ case ARG_DEVICE: if (src->device_id) { g_free (src->device_id); src->device_id = NULL; } if (src->device_name) { g_free (src->device_name); src->device_name = NULL; } src->device_id = g_strdup 
(g_value_get_string (value)); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } static void gst_directsound_src_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec) { GstDirectSoundSrc * src = GST_DIRECTSOUND_SRC (object); switch (prop_id) { /*case ARG_VOLUME: g_value_set_double (value, sink->volume); break;*/ case ARG_DEVICE: if (!src->device_id) device_set_default (src); g_value_set_string (value, src->device_id); break; case ARG_DEVICE_NAME: if (!src->device_name) device_get_name (src); g_value_set_string (value, src->device_name); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } /* GstBaseAudioSrc vmethod implementations */ static GstRingBuffer * gst_directsound_src_create_ringbuffer (GstBaseAudioSrc * src) { GstDirectSoundSrc * dsoundsrc; GstDirectSoundRingBuffer * ringbuffer; dsoundsrc = GST_DIRECTSOUND_SRC (src); GST_DEBUG ("creating ringbuffer"); ringbuffer = g_object_new (GST_TYPE_DIRECTSOUND_RING_BUFFER, NULL); GST_DEBUG ("directsound src 0x%p", dsoundsrc); /* capture */ ringbuffer->is_src = TRUE; /* set the src element on the ringbuffer for error messages */ ringbuffer->element = GST_ELEMENT (dsoundsrc); /* set the ringbuffer on the src */ dsoundsrc->dsoundbuffer = ringbuffer; /* set initial volume on ringbuffer */ dsoundsrc->dsoundbuffer->volume = dsoundsrc->volume; return GST_RING_BUFFER (ringbuffer); } static const GList * probe_get_properties (GstPropertyProbe * probe) { GObjectClass * klass = G_OBJECT_GET_CLASS (probe); static GList * list = NULL; // ###: from gstalsadeviceprobe.c /* well, not perfect, but better than no locking at all. * In the worst case we leak a list node, so who cares? 
*/ GST_CLASS_LOCK (GST_OBJECT_CLASS (klass)); if (!list) { GParamSpec * pspec; pspec = g_object_class_find_property (klass, "device"); list = g_list_append (NULL, pspec); } GST_CLASS_UNLOCK (GST_OBJECT_CLASS (klass)); return list; } static void probe_probe_property (GstPropertyProbe * probe, guint prop_id, const GParamSpec * pspec) { /* we do nothing in here. the actual "probe" occurs in get_values(), * which is a common practice when not caching responses. */ if (!g_str_equal (pspec->name, "device")) { G_OBJECT_WARN_INVALID_PROPERTY_ID (probe, prop_id, pspec); } } static gboolean probe_needs_probe (GstPropertyProbe * probe, guint prop_id, const GParamSpec * pspec) { /* don't cache probed data */ return TRUE; } static GValueArray * probe_get_values (GstPropertyProbe * probe, guint prop_id, const GParamSpec * pspec) { //GstDirectSoundSrc * src; GValueArray * array; GValue value = { 0, }; GList * l, * list; gst_directsound_device * dev; if (!g_str_equal (pspec->name, "device")) { G_OBJECT_WARN_INVALID_PROPERTY_ID (probe, prop_id, pspec); return NULL; } //src = GST_DIRECTSOUND_SRC (probe); list = gst_directsound_capture_device_list (); if (list == NULL) { GST_LOG_OBJECT (probe, "No devices found"); return NULL; } array = g_value_array_new (g_list_length (list)); g_value_init (&value, G_TYPE_STRING); for (l = list; l != NULL; l = l->next) { dev = (gst_directsound_device *) l->data; GST_LOG_OBJECT (probe, "Found device: id=[%s] name=[%s]", dev->id, dev->name); g_value_take_string (&value, dev->id); dev->id = NULL; gst_directsound_device_free (dev); l->data = NULL; g_value_array_append (array, &value); } g_value_unset (&value); g_list_free (list); return array; } static void gst_directsound_src_property_probe_interface_init (GstPropertyProbeInterface * iface) { iface->get_properties = probe_get_properties; iface->probe_property = probe_probe_property; iface->needs_probe = probe_needs_probe; iface->get_values = probe_get_values; } static gboolean 
gst_directsound_src_iface_supported (GstImplementsInterface * iface, GType iface_type) { // FIXME: shouldn't this be TRUE? (at least for the probe type?) return FALSE; } static void gst_directsound_src_interface_init (GstImplementsInterfaceClass * klass) { /* default virtual functions */ klass->supported = gst_directsound_src_iface_supported; } static void gst_directsound_src_init_interfaces (GType type) { static const GInterfaceInfo implements_iface_info = { (GInterfaceInitFunc) gst_directsound_src_interface_init, NULL, NULL, }; static const GInterfaceInfo probe_iface_info = { (GInterfaceInitFunc) gst_directsound_src_property_probe_interface_init, NULL, NULL, }; g_type_add_interface_static (type, GST_TYPE_IMPLEMENTS_INTERFACE, &implements_iface_info); g_type_add_interface_static (type, GST_TYPE_PROPERTY_PROBE, &probe_iface_info); } psimedia-master/gstprovider/gstelements/directsound/gstdirectsoundsrc.h000066400000000000000000000045341220046403000273060ustar00rootroot00000000000000/* GStreamer * Copyright (C) 2005 Sebastien Moutte * Copyright (C) 2007-2009 Pioneers of the Inevitable * Copyright (C) 2009 Barracuda Networks, Inc. * * gstdirectsoundsrc.h: * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. 
* * The development of this code was made possible due to the involvement * of Pioneers of the Inevitable, the creators of the Songbird Music player * */ #ifndef __GST_DIRECTSOUNDSRC_H__ #define __GST_DIRECTSOUNDSRC_H__ #include #include #include "gstdirectsound.h" #include "gstdirectsoundringbuffer.h" G_BEGIN_DECLS #define GST_TYPE_DIRECTSOUND_SRC \ (gst_directsound_src_get_type()) #define GST_DIRECTSOUND_SRC(obj) \ (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_DIRECTSOUND_SRC,GstDirectSoundSrc)) #define GST_DIRECTSOUND_SRC_CLASS(klass) \ (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_DIRECTSOUND_SRC,GstDirectSoundSrcClass)) #define GST_IS_DIRECTSOUND_SRC(obj) \ (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_DIRECTSOUND_SRC)) #define GST_IS_DIRECTSOUND_SRC_CLASS(klass) \ (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_DIRECTSOUND_SRC)) typedef struct _GstDirectSoundSrc GstDirectSoundSrc; typedef struct _GstDirectSoundSrcClass GstDirectSoundSrcClass; struct _GstDirectSoundSrc { /* base audio src */ GstBaseAudioSrc src; /* ringbuffer */ GstDirectSoundRingBuffer * dsoundbuffer; /* current volume */ gdouble volume; gchar * device_id; gchar * device_name; }; struct _GstDirectSoundSrcClass { GstBaseAudioSrcClass parent_class; }; GType gst_directsound_src_get_type (void); G_END_DECLS #endif /* __GST_DIRECTSOUNDSRC_H__ */ psimedia-master/gstprovider/gstelements/directsound_sinkonly/000077500000000000000000000000001220046403000253045ustar00rootroot00000000000000psimedia-master/gstprovider/gstelements/directsound_sinkonly/Makefile.am000066400000000000000000000011601220046403000273360ustar00rootroot00000000000000plugin_LTLIBRARIES = libgstdirectsoundsink.la libgstdirectsoundsink_la_SOURCES = gstdirectsound.c \ gstdirectsoundringbuffer.c gstdirectsoundsink.c gstdirectsoundplugin.c libgstdirectsoundsink_la_CFLAGS = $(GST_CFLAGS) $(GST_BASE_CFLAGS) \ $(GST_PLUGINS_BASE_CFLAGS) $(DIRECTSOUND_CFLAGS) libgstdirectsoundsink_la_LIBADD = $(DIRECTSOUND_LIBS) \ $(GST_BASE_LIBS) $(GST_PLUGINS_BASE_LIBS) 
\
	-lgstaudio-$(GST_MAJORMINOR) -lgstinterfaces-$(GST_MAJORMINOR)

libgstdirectsoundsink_la_LDFLAGS = $(GST_PLUGIN_LDFLAGS) $(DIRECTSOUND_LDFLAGS)

noinst_HEADERS = gstdirectsound.h gstdirectsoundringbuffer.h \
	gstdirectsoundsink.h

psimedia-master/gstprovider/gstelements/directsound_sinkonly/gstdirectsound.c

/* GStreamer
 * Copyright (C) 2005 Sebastien Moutte
 * Copyright (C) 2007-2009 Pioneers of the Inevitable
 *
 * gstdirectsound.c:
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 *
 * The development of this code was made possible due to the involvement
 * of Pioneers of the Inevitable, the creators of the Songbird Music player
 *
 */

#ifdef HAVE_CONFIG_H
#include "config.h"
#endif

#define INITGUID

#include "gstdirectsound.h"
#include <math.h>

GST_DEBUG_CATEGORY (directsound);
#define GST_CAT_DEFAULT directsound

void
gst_directsound_set_volume (LPDIRECTSOUNDBUFFER8 pDSB8, gdouble volume)
{
  HRESULT hr;
  long dsVolume;

  /* DirectSound controls volume using units of 100th of a decibel,
   * ranging from -10000 to 0. We use a linear scale of 0.0 to 1.0
   * here, so remap.
   */
  if (volume == 0)
    dsVolume = -10000;
  else
    dsVolume = 100 * (long) (20 * log10 (volume));

  dsVolume = CLAMP (dsVolume, -10000, 0);

  GST_DEBUG ("Setting volume on secondary buffer to %d", (int) dsVolume);

  hr = IDirectSoundBuffer8_SetVolume (pDSB8, dsVolume);
  if (G_UNLIKELY (FAILED(hr))) {
    GST_WARNING ("Setting volume on secondary buffer failed.");
  }
}

psimedia-master/gstprovider/gstelements/directsound_sinkonly/gstdirectsound.h

/* GStreamer
 * Copyright (C) 2005 Sebastien Moutte
 * Copyright (C) 2007-2009 Pioneers of the Inevitable
 *
 * gstdirectsound.h:
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
* * The development of this code was made possible due to the involvement * of Pioneers of the Inevitable, the creators of the Songbird Music player * */ #ifndef __GST_DIRECTSOUND_H__ #define __GST_DIRECTSOUND_H__ #include #include #include /* use directsound v8 */ #ifdef DIRECTSOUND_VERSION #undef DIRECTSOUND_VERSION #endif #define DIRECTSOUND_VERSION 0x0800 #include GST_DEBUG_CATEGORY_EXTERN (directsound); G_BEGIN_DECLS void gst_directsound_set_volume (LPDIRECTSOUNDBUFFER8 pDSB8, gdouble volume); G_END_DECLS #endif /* __GST_DIRECTSOUND_H__ */ psimedia-master/gstprovider/gstelements/directsound_sinkonly/gstdirectsoundplugin.c000066400000000000000000000031721220046403000317330ustar00rootroot00000000000000/* GStreamer * Copyright (C) 2005 Sebastien Moutte * Copyright (C) 2007 Pioneers of the Inevitable * * gstdirectsoundplugin.c: * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. 
* * * The development of this code was made possible due to the involvement * of Pioneers of the Inevitable, the creators of the Songbird Music player * */ #ifdef HAVE_CONFIG_H #include "config.h" #endif #include "gstdirectsound.h" #include "gstdirectsoundsink.h" static gboolean plugin_init (GstPlugin * plugin) { if (!gst_element_register (plugin, "directsoundsink", GST_RANK_PRIMARY, GST_TYPE_DIRECTSOUND_SINK)) return FALSE; GST_DEBUG_CATEGORY_INIT (directsound, "directsound", 0, "DirectSound Elements"); return TRUE; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, "directsound", "Direct Sound plugin library", plugin_init, VERSION, "LGPL", GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN) psimedia-master/gstprovider/gstelements/directsound_sinkonly/gstdirectsoundringbuffer.c000066400000000000000000000564521220046403000325770ustar00rootroot00000000000000/* GStreamer * Copyright (C) 2005 Sebastien Moutte * Copyright (C) 2007-2009 Pioneers of the Inevitable * * gstdirectsoundringbuffer.c: * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. 
* * The development of this code was made possible due to the involvement * of Pioneers of the Inevitable, the creators of the Songbird Music player * */ #ifdef HAVE_CONFIG_H #include "config.h" #endif #include "gstdirectsoundringbuffer.h" #include "gstdirectsoundsink.h" #define GST_CAT_DEFAULT directsound #define MAX_LOST_RETRIES 10 #define DIRECTSOUND_ERROR_DEVICE_RECONFIGURED 0x88780096 #define DIRECTSOUND_ERROR_DEVICE_NO_DRIVER 0x88780078 static void gst_directsound_ring_buffer_class_init ( GstDirectSoundRingBufferClass * klass); static void gst_directsound_ring_buffer_init ( GstDirectSoundRingBuffer * ringbuffer, GstDirectSoundRingBufferClass * g_class); static void gst_directsound_ring_buffer_dispose (GObject * object); static void gst_directsound_ring_buffer_finalize (GObject * object); static gboolean gst_directsound_ring_buffer_open_device (GstRingBuffer * buf); static gboolean gst_directsound_ring_buffer_close_device ( GstRingBuffer * buf); static gboolean gst_directsound_ring_buffer_acquire (GstRingBuffer * buf, GstRingBufferSpec * spec); static gboolean gst_directsound_ring_buffer_release (GstRingBuffer * buf); static gboolean gst_directsound_ring_buffer_start (GstRingBuffer * buf); static gboolean gst_directsound_ring_buffer_pause (GstRingBuffer * buf); static gboolean gst_directsound_ring_buffer_resume (GstRingBuffer * buf); static gboolean gst_directsound_ring_buffer_stop (GstRingBuffer * buf); static guint gst_directsound_ring_buffer_delay (GstRingBuffer * buf); static DWORD WINAPI gst_directsound_write_proc (LPVOID lpParameter); static GstRingBufferClass * ring_parent_class = NULL; static void gst_directsound_ring_buffer_class_init_trampoline ( gpointer g_class, gpointer data) { ring_parent_class = (GstRingBufferClass *) g_type_class_peek_parent (g_class); gst_directsound_ring_buffer_class_init ( (GstDirectSoundRingBufferClass *) g_class); } GType gst_directsound_ring_buffer_get_type (void); GType gst_directsound_ring_buffer_get_type (void) { 
static volatile gsize gonce_data; if (__gst_once_init_enter (&gonce_data)) { GType _type; _type = gst_type_register_static_full (GST_TYPE_RING_BUFFER, g_intern_static_string ("GstDirectSoundRingBuffer"), sizeof (GstDirectSoundRingBufferClass), NULL, NULL, gst_directsound_ring_buffer_class_init_trampoline, NULL, NULL, sizeof (GstDirectSoundRingBuffer), 0, (GInstanceInitFunc) gst_directsound_ring_buffer_init, NULL, (GTypeFlags) 0); __gst_once_init_leave (&gonce_data, (gsize) _type); } return (GType) gonce_data; } static void gst_directsound_ring_buffer_class_init (GstDirectSoundRingBufferClass * klass) { GObjectClass * gobject_class; GstObjectClass * gstobject_class; GstRingBufferClass * gstringbuffer_class; gobject_class = (GObjectClass *) klass; gstobject_class = (GstObjectClass *) klass; gstringbuffer_class = (GstRingBufferClass *) klass; gobject_class->dispose = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_dispose); gobject_class->finalize = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_finalize); gstringbuffer_class->open_device = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_open_device); gstringbuffer_class->close_device = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_close_device); gstringbuffer_class->acquire = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_acquire); gstringbuffer_class->release = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_release); gstringbuffer_class->start = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_start); gstringbuffer_class->pause = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_pause); gstringbuffer_class->resume = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_resume); gstringbuffer_class->stop = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_stop); gstringbuffer_class->delay = GST_DEBUG_FUNCPTR (gst_directsound_ring_buffer_delay); GST_DEBUG ("directsound ring buffer class init"); } static void gst_directsound_ring_buffer_init (GstDirectSoundRingBuffer * ringbuffer, GstDirectSoundRingBufferClass * g_class) { 
ringbuffer->dsoundsink = NULL; ringbuffer->pDS8 = NULL; ringbuffer->pDSB8 = NULL; memset (&ringbuffer->wave_format, 0, sizeof (WAVEFORMATEX)); ringbuffer->buffer_size = 0; ringbuffer->buffer_write_offset = 0; ringbuffer->min_buffer_size = 0; ringbuffer->min_sleep_time = 10; /* in milliseconds */ ringbuffer->bytes_per_sample = 0; ringbuffer->segoffset = 0; ringbuffer->segsize = 0; ringbuffer->hThread = NULL; ringbuffer->suspended = FALSE; ringbuffer->should_run = FALSE; ringbuffer->flushing = FALSE; ringbuffer->volume = 1.0; ringbuffer->dsound_lock = g_mutex_new (); } static void gst_directsound_ring_buffer_dispose (GObject * object) { G_OBJECT_CLASS (ring_parent_class)->dispose (object); } static void gst_directsound_ring_buffer_finalize (GObject * object) { GstDirectSoundRingBuffer * dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (object); g_mutex_free (dsoundbuffer->dsound_lock); dsoundbuffer->dsound_lock = NULL; G_OBJECT_CLASS (ring_parent_class)->finalize (object); } static gboolean gst_directsound_ring_buffer_open_device (GstRingBuffer * buf) { HRESULT hr; GstDirectSoundRingBuffer * dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); GST_DEBUG ("Opening DirectSound Device"); if (FAILED (hr = DirectSoundCreate8 (NULL, &dsoundbuffer->pDS8, NULL))) { GST_ELEMENT_ERROR (dsoundbuffer->dsoundsink, RESOURCE, FAILED, ("%ls.", DXGetErrorDescription9W(hr)), ("Failed to create directsound device. 
(%X)", (unsigned int) hr)); dsoundbuffer->pDS8 = NULL; return FALSE; } if (FAILED (hr = IDirectSound8_SetCooperativeLevel (dsoundbuffer->pDS8, GetDesktopWindow (), DSSCL_PRIORITY))) { GST_WARNING ("gst_directsound_sink_open: IDirectSound8_SetCooperativeLevel, hr = %X", (unsigned int) hr); return FALSE; } return TRUE; } static gboolean gst_directsound_ring_buffer_close_device (GstRingBuffer * buf) { GstDirectSoundRingBuffer * dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); GST_DEBUG ("Closing DirectSound Device"); if (dsoundbuffer->pDS8) { IDirectSound8_Release (dsoundbuffer->pDS8); dsoundbuffer->pDS8 = NULL; } return TRUE; } static gboolean gst_directsound_create_buffer (GstRingBuffer * buf) { GstDirectSoundRingBuffer * dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); HRESULT hr; DSBUFFERDESC descSecondary; LPDIRECTSOUNDBUFFER pDSB; memset (&descSecondary, 0, sizeof (DSBUFFERDESC)); descSecondary.dwSize = sizeof (DSBUFFERDESC); descSecondary.dwFlags = DSBCAPS_GETCURRENTPOSITION2 | DSBCAPS_GLOBALFOCUS | DSBCAPS_CTRLVOLUME; descSecondary.dwBufferBytes = dsoundbuffer->buffer_size; descSecondary.lpwfxFormat = (WAVEFORMATEX *) & dsoundbuffer->wave_format; hr = IDirectSound8_CreateSoundBuffer (dsoundbuffer->pDS8, &descSecondary, &pDSB, NULL); if (G_UNLIKELY (FAILED (hr))) { GST_WARNING ("gst_directsound_ring_buffer_acquire: IDirectSound8_CreateSoundBuffer, hr = %X", (unsigned int) hr); return FALSE; } hr = IDirectSoundBuffer_QueryInterface (pDSB, &IID_IDirectSoundBuffer8, (LPVOID *) &dsoundbuffer->pDSB8); if (G_UNLIKELY (FAILED (hr))) { IDirectSoundBuffer_Release (pDSB); GST_WARNING ("gst_directsound_ring_buffer_acquire: IDirectSoundBuffer_QueryInterface, hr = %X", (unsigned int) hr); return FALSE; } IDirectSoundBuffer_Release (pDSB); return TRUE; } static gboolean gst_directsound_ring_buffer_acquire (GstRingBuffer * buf, GstRingBufferSpec * spec) { GstDirectSoundRingBuffer * dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); WAVEFORMATEX wfx; /* sanity check, if no 
DirectSound device, bail out */ if (!dsoundbuffer->pDS8) { GST_WARNING ("gst_directsound_ring_buffer_acquire: DirectSound 8 device is null!"); return FALSE; } /* save number of bytes per sample */ dsoundbuffer->bytes_per_sample = spec->bytes_per_sample; /* fill the WAVEFORMATEX structure with spec params */ memset (&wfx, 0, sizeof (wfx)); wfx.cbSize = sizeof (wfx); wfx.wFormatTag = WAVE_FORMAT_PCM; wfx.nChannels = spec->channels; wfx.nSamplesPerSec = spec->rate; wfx.wBitsPerSample = (spec->bytes_per_sample * 8) / wfx.nChannels; wfx.nBlockAlign = spec->bytes_per_sample; wfx.nAvgBytesPerSec = wfx.nSamplesPerSec * wfx.nBlockAlign; /* Create directsound buffer with size based on our configured * buffer_size (which is 200 ms by default) */ dsoundbuffer->buffer_size = gst_util_uint64_scale_int (wfx.nAvgBytesPerSec, spec->buffer_time, GST_MSECOND); spec->segsize = gst_util_uint64_scale_int (wfx.nAvgBytesPerSec, spec->latency_time, GST_MSECOND); /* Now round the ringbuffer segment size to a multiple of the bytes per sample - otherwise the ringbuffer subtly fails */ spec->segsize = (spec->segsize + (spec->bytes_per_sample - 1)) / spec->bytes_per_sample * spec->bytes_per_sample; /* And base the total number of segments on the configured buffer size */ spec->segtotal = dsoundbuffer->buffer_size / spec->segsize; dsoundbuffer->buffer_size = spec->segsize * spec->segtotal; dsoundbuffer->segsize = spec->segsize; dsoundbuffer->min_buffer_size = dsoundbuffer->buffer_size / 2; GST_INFO_OBJECT (dsoundbuffer, "GstRingBufferSpec->channels: %d, GstRingBufferSpec->rate: %d, GstRingBufferSpec->bytes_per_sample: %d\n" "WAVEFORMATEX.nSamplesPerSec: %ld, WAVEFORMATEX.wBitsPerSample: %d, WAVEFORMATEX.nBlockAlign: %d, WAVEFORMATEX.nAvgBytesPerSec: %ld\n" "Size of dsound circular buffer: %d, Size of segment: %d, Total segments: %d\n", spec->channels, spec->rate, spec->bytes_per_sample, wfx.nSamplesPerSec, wfx.wBitsPerSample, wfx.nBlockAlign, wfx.nAvgBytesPerSec, dsoundbuffer->buffer_size,
spec->segsize, spec->segtotal); dsoundbuffer->wave_format = wfx; if (!gst_directsound_create_buffer (buf)) return FALSE; buf->data = gst_buffer_new_and_alloc (spec->segtotal * spec->segsize); memset (GST_BUFFER_DATA (buf->data), 0, GST_BUFFER_SIZE (buf->data)); return TRUE; } static gboolean gst_directsound_ring_buffer_release (GstRingBuffer * buf) { GstDirectSoundRingBuffer * dsoundbuffer; dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); /* first we have to ensure our ring buffer is stopped */ gst_directsound_ring_buffer_stop (buf); GST_DSOUND_LOCK (dsoundbuffer); /* release secondary DirectSound buffer */ if (dsoundbuffer->pDSB8) { IDirectSoundBuffer8_Release (dsoundbuffer->pDSB8); dsoundbuffer->pDSB8 = NULL; } gst_buffer_unref (buf->data); buf->data = NULL; GST_DSOUND_UNLOCK (dsoundbuffer); return TRUE; } static gboolean gst_directsound_ring_buffer_start (GstRingBuffer * buf) { GstDirectSoundRingBuffer * dsoundbuffer; HANDLE hThread; dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); GST_DEBUG ("Starting RingBuffer"); GST_DSOUND_LOCK (dsoundbuffer); hThread = CreateThread (NULL, 256 * 1024 /* Stack size: 256k */, gst_directsound_write_proc, buf, CREATE_SUSPENDED, NULL); if (!hThread) { GST_DSOUND_UNLOCK (dsoundbuffer); GST_WARNING ("gst_directsound_ring_buffer_start: CreateThread"); return FALSE; } dsoundbuffer->hThread = hThread; dsoundbuffer->should_run = TRUE; gst_directsound_set_volume (dsoundbuffer->pDSB8, dsoundbuffer->volume); if (G_UNLIKELY (!SetThreadPriority(hThread, THREAD_PRIORITY_TIME_CRITICAL))) GST_WARNING ("gst_directsound_ring_buffer_start: Failed to set thread priority."); ResumeThread (hThread); GST_DSOUND_UNLOCK (dsoundbuffer); return TRUE; } static gboolean gst_directsound_ring_buffer_pause (GstRingBuffer * buf) { HRESULT hr = S_OK; GstDirectSoundRingBuffer * dsoundbuffer; dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); GST_DEBUG ("Pausing RingBuffer"); GST_DSOUND_LOCK (dsoundbuffer); if (dsoundbuffer->pDSB8) { hr = 
IDirectSoundBuffer8_Stop (dsoundbuffer->pDSB8); } if (G_LIKELY (!dsoundbuffer->suspended)) { if (G_UNLIKELY(SuspendThread (dsoundbuffer->hThread) == -1)) GST_WARNING ("gst_directsound_ring_buffer_pause: SuspendThread failed."); else dsoundbuffer->suspended = TRUE; } GST_DSOUND_UNLOCK (dsoundbuffer); /* in the unlikely event that a device was reconfigured, we can consider * ourselves stopped even though the stop call failed */ if (G_UNLIKELY (FAILED(hr)) && G_UNLIKELY(hr != DIRECTSOUND_ERROR_DEVICE_RECONFIGURED) && G_UNLIKELY(hr != DIRECTSOUND_ERROR_DEVICE_NO_DRIVER)) { GST_WARNING ("gst_directsound_ring_buffer_pause: IDirectSoundBuffer8_Stop, hr = %X", (unsigned int) hr); return FALSE; } return TRUE; } static gboolean gst_directsound_ring_buffer_resume (GstRingBuffer * buf) { GstDirectSoundRingBuffer *dsoundbuffer; dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); GST_DEBUG ("Resuming RingBuffer"); GST_DSOUND_LOCK (dsoundbuffer); if (G_LIKELY (dsoundbuffer->suspended) && ResumeThread (dsoundbuffer->hThread) != -1) { dsoundbuffer->suspended = FALSE; } else { GST_DSOUND_UNLOCK (dsoundbuffer); GST_WARNING ("gst_directsound_ring_buffer_resume: ResumeThread failed."); return FALSE; } GST_DSOUND_UNLOCK (dsoundbuffer); return TRUE; } static gboolean gst_directsound_ring_buffer_stop (GstRingBuffer * buf) { HRESULT hr; DWORD ret; HANDLE hThread; GstDirectSoundRingBuffer * dsoundbuffer; dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); GST_DEBUG ("Stopping RingBuffer"); GST_DSOUND_LOCK (dsoundbuffer); dsoundbuffer->should_run = FALSE; if (dsoundbuffer->pDSB8) { hr = IDirectSoundBuffer8_Stop (dsoundbuffer->pDSB8); if (G_UNLIKELY (FAILED(hr))) { GST_DSOUND_UNLOCK (dsoundbuffer); GST_WARNING ("gst_directsound_ring_buffer_stop: IDirectSoundBuffer8_Stop, hr = %X", (unsigned int) hr); return FALSE; } } hThread = dsoundbuffer->hThread; if (dsoundbuffer->suspended && ResumeThread (hThread) != -1) { dsoundbuffer->suspended = FALSE; } else { GST_DSOUND_UNLOCK (dsoundbuffer); 
GST_WARNING ("gst_directsound_ring_buffer_stop: ResumeThread failed."); return FALSE; } GST_DSOUND_UNLOCK (dsoundbuffer); /* wait without lock held */ ret = WaitForSingleObject (hThread, 5000); if (G_UNLIKELY (ret == WAIT_TIMEOUT)) { GST_WARNING ("gst_directsound_ring_buffer_stop: Failed to wait for thread shutdown. (%u)", (unsigned int) ret); return FALSE; } GST_DSOUND_LOCK (dsoundbuffer); CloseHandle (dsoundbuffer->hThread); dsoundbuffer->hThread = NULL; GST_DSOUND_UNLOCK (dsoundbuffer); return TRUE; } static guint gst_directsound_ring_buffer_delay (GstRingBuffer * buf) { GstDirectSoundRingBuffer * dsoundbuffer; HRESULT hr; DWORD dwCurrentPlayCursor; DWORD dwCurrentWriteCursor; DWORD dwBytesInQueue = 0; gint nNbSamplesInQueue = 0; dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); if (G_LIKELY (dsoundbuffer->pDSB8)) { /* evaluate the number of samples in queue in the circular buffer */ hr = IDirectSoundBuffer8_GetCurrentPosition (dsoundbuffer->pDSB8, &dwCurrentPlayCursor, &dwCurrentWriteCursor); if (G_LIKELY (SUCCEEDED (hr))) { if (dwCurrentPlayCursor <= dsoundbuffer->buffer_write_offset) dwBytesInQueue = dsoundbuffer->buffer_write_offset - dwCurrentPlayCursor; else dwBytesInQueue = dsoundbuffer->buffer_write_offset + (dsoundbuffer->buffer_size - dwCurrentPlayCursor); nNbSamplesInQueue = dwBytesInQueue / dsoundbuffer->bytes_per_sample; } else { GST_WARNING ("gst_directsound_ring_buffer_delay: IDirectSoundBuffer8_GetCurrentPosition, hr = %X", (unsigned int) hr); } } return nNbSamplesInQueue; } static DWORD WINAPI gst_directsound_write_proc (LPVOID lpParameter) { GstRingBuffer * buf; GstDirectSoundRingBuffer * dsoundbuffer; HRESULT hr; DWORD dwStatus; LPVOID pLockedBuffer1 = NULL, pLockedBuffer2 = NULL; DWORD dwSizeBuffer1 = 0, dwSizeBuffer2 = 0; DWORD dwCurrentPlayCursor = 0; gint64 freeBufferSize = 0; guint8 * readptr = NULL; gint readseg = 0; guint len = 0; gint retries = 0; gboolean flushing = FALSE; gboolean should_run = TRUE; gboolean error = FALSE; buf = 
(GstRingBuffer *) lpParameter; dsoundbuffer = GST_DIRECTSOUND_RING_BUFFER (buf); do { GST_DSOUND_LOCK (dsoundbuffer); if (dsoundbuffer->flushing || !dsoundbuffer->pDSB8) { GST_DSOUND_UNLOCK (dsoundbuffer); goto complete; } GST_DSOUND_UNLOCK (dsoundbuffer); restore_buffer: /* get current buffer status */ GST_DSOUND_LOCK (dsoundbuffer); hr = IDirectSoundBuffer8_GetStatus (dsoundbuffer->pDSB8, &dwStatus); GST_DSOUND_UNLOCK (dsoundbuffer); if (dwStatus & DSBSTATUS_BUFFERLOST) { GST_DEBUG ("Buffer was lost, attempting to restore"); GST_DSOUND_LOCK (dsoundbuffer); hr = IDirectSoundBuffer8_Restore (dsoundbuffer->pDSB8); GST_DSOUND_UNLOCK (dsoundbuffer); /* restore may fail again, ensure we restore the * buffer before we continue */ if (FAILED(hr) && hr == DSERR_BUFFERLOST) { if (retries++ < MAX_LOST_RETRIES) { GST_DEBUG ("Unable to restore, trying again"); goto restore_buffer; } else { GST_ELEMENT_ERROR (dsoundbuffer->dsoundsink, RESOURCE, FAILED, ("%ls.", DXGetErrorDescription9W(hr)), ("gst_directsound_write_proc: IDirectSoundBuffer8_Restore, hr = %X", (unsigned int) hr)); goto complete; } } } /* get current play cursor and write cursor positions */ GST_DSOUND_LOCK (dsoundbuffer); hr = IDirectSoundBuffer8_GetCurrentPosition (dsoundbuffer->pDSB8, &dwCurrentPlayCursor, NULL); GST_DSOUND_UNLOCK (dsoundbuffer); if (G_UNLIKELY (FAILED(hr))) { /* try and reopen the default directsound device */ if (hr == DIRECTSOUND_ERROR_DEVICE_RECONFIGURED) { /* we have to wait a while for the sound device removal to actually * be processed before attempting to reopen the device. 
Yes, this sucks */ Sleep (2000); GST_DSOUND_LOCK (dsoundbuffer); IDirectSoundBuffer8_Release (dsoundbuffer->pDSB8); dsoundbuffer->pDSB8 = NULL; GST_DSOUND_UNLOCK (dsoundbuffer); if (gst_directsound_ring_buffer_close_device (buf) && gst_directsound_ring_buffer_open_device (buf) && gst_directsound_create_buffer (buf) ) { dsoundbuffer->buffer_write_offset = 0; goto restore_buffer; } } /* only trigger an error if we're not already in an error state */ if (FAILED(hr) && !error) { GST_ELEMENT_ERROR (dsoundbuffer->dsoundsink, RESOURCE, FAILED, ("%ls.", DXGetErrorDescription9W(hr)), ("gst_directsound_write_proc: IDirectSoundBuffer8_GetCurrentPosition, hr = %X", (unsigned int) hr)); error = TRUE; goto complete; } } GST_LOG ("Current Play Cursor: %u Current Write Offset: %d", (unsigned int) dwCurrentPlayCursor, dsoundbuffer->buffer_write_offset); /* calculate the free size of the circular buffer */ GST_DSOUND_LOCK (dsoundbuffer); if (dwCurrentPlayCursor <= dsoundbuffer->buffer_write_offset) freeBufferSize = dsoundbuffer->buffer_size - (dsoundbuffer->buffer_write_offset - dwCurrentPlayCursor); else freeBufferSize = dwCurrentPlayCursor - dsoundbuffer->buffer_write_offset; GST_DSOUND_UNLOCK (dsoundbuffer); if (!gst_ring_buffer_prepare_read (buf, &readseg, &readptr, &len)) goto complete; len -= dsoundbuffer->segoffset; GST_LOG ("Size of segment to write: %d Free buffer size: %lld", len, freeBufferSize); /* If we can't write this into directsound because we don't have enough * space, then start playback if we're currently paused. 
Then, sleep * for a little while to wait until space is available */ if (len >= freeBufferSize) { if (!(dwStatus & DSBSTATUS_PLAYING)) { GST_DSOUND_LOCK (dsoundbuffer); hr = IDirectSoundBuffer8_Play (dsoundbuffer->pDSB8, 0, 0, DSBPLAY_LOOPING); GST_DSOUND_UNLOCK (dsoundbuffer); if (FAILED(hr)) { GST_WARNING ("gst_directsound_write_proc: IDirectSoundBuffer8_Play, hr = %X", (unsigned int) hr); } } goto complete; } /* lock it */ GST_DSOUND_LOCK (dsoundbuffer); hr = IDirectSoundBuffer8_Lock (dsoundbuffer->pDSB8, dsoundbuffer->buffer_write_offset, len, &pLockedBuffer1, &dwSizeBuffer1, &pLockedBuffer2, &dwSizeBuffer2, 0L); /* copy chunks */ if (SUCCEEDED (hr)) { if (len <= dwSizeBuffer1) { memcpy (pLockedBuffer1, (LPBYTE) readptr + dsoundbuffer->segoffset, len); } else { memcpy (pLockedBuffer1, (LPBYTE) readptr + dsoundbuffer->segoffset, dwSizeBuffer1); memcpy (pLockedBuffer2, (LPBYTE) readptr + dsoundbuffer->segoffset + dwSizeBuffer1, len - dwSizeBuffer1); } IDirectSoundBuffer8_Unlock (dsoundbuffer->pDSB8, pLockedBuffer1, dwSizeBuffer1, pLockedBuffer2, dwSizeBuffer2); } else { GST_WARNING ("gst_directsound_write_proc: IDirectSoundBuffer8_Lock, hr = %X", (unsigned int) hr); } /* update tracking data */ dsoundbuffer->segoffset += dwSizeBuffer1 + (len - dwSizeBuffer1); dsoundbuffer->buffer_write_offset += dwSizeBuffer1 + (len - dwSizeBuffer1); dsoundbuffer->buffer_write_offset %= dsoundbuffer->buffer_size; GST_DSOUND_UNLOCK (dsoundbuffer); freeBufferSize -= dwSizeBuffer1 + (len - dwSizeBuffer1); GST_LOG ("DirectSound Buffer1 Data Size: %u DirectSound Buffer2 Data Size: %u", (unsigned int) dwSizeBuffer1, (unsigned int) dwSizeBuffer2); GST_LOG ("Free buffer size: %lld", freeBufferSize); /* check if we read a whole segment */ GST_DSOUND_LOCK (dsoundbuffer); if (dsoundbuffer->segoffset == dsoundbuffer->segsize) { GST_DSOUND_UNLOCK (dsoundbuffer); /* advance to next segment */ gst_ring_buffer_clear (buf, readseg); gst_ring_buffer_advance (buf, 1); GST_DSOUND_LOCK 
(dsoundbuffer); dsoundbuffer->segoffset = 0; } GST_DSOUND_UNLOCK (dsoundbuffer); complete: GST_DSOUND_LOCK (dsoundbuffer); should_run = dsoundbuffer->should_run; flushing = dsoundbuffer->flushing; retries = 0; GST_DSOUND_UNLOCK (dsoundbuffer); /* it's extremely important to sleep without holding the lock! */ if (len >= freeBufferSize || flushing || error) Sleep (dsoundbuffer->min_sleep_time); } while (should_run); return 0; }

psimedia-master/gstprovider/gstelements/directsound_sinkonly/gstdirectsoundringbuffer.h

/* GStreamer * Copyright (C) 2005 Sebastien Moutte * Copyright (C) 2007-2009 Pioneers of the Inevitable * * gstdirectsoundringbuffer.h: * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA.
* * The development of this code was made possible due to the involvement * of Pioneers of the Inevitable, the creators of the Songbird Music player * */ #ifndef __GST_DIRECTSOUNDRINGBUFFER_H__ #define __GST_DIRECTSOUNDRINGBUFFER_H__ #include <gst/gst.h> #include <gst/audio/gstringbuffer.h> #include "gstdirectsound.h" G_BEGIN_DECLS struct _GstDirectSoundSink; #define GST_DSOUND_LOCK(obj) (g_mutex_lock (obj->dsound_lock)) #define GST_DSOUND_UNLOCK(obj) (g_mutex_unlock (obj->dsound_lock)) #define GST_TYPE_DIRECTSOUND_RING_BUFFER \ (gst_directsound_ring_buffer_get_type()) #define GST_DIRECTSOUND_RING_BUFFER(obj) \ (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_DIRECTSOUND_RING_BUFFER,GstDirectSoundRingBuffer)) #define GST_DIRECTSOUND_RING_BUFFER_CLASS(klass) \ (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_DIRECTSOUND_RING_BUFFER,GstDirectSoundRingBufferClass)) #define GST_DIRECTSOUND_RING_BUFFER_GET_CLASS(obj) \ (G_TYPE_INSTANCE_GET_CLASS((obj),GST_TYPE_DIRECTSOUND_RING_BUFFER,GstDirectSoundRingBufferClass)) #define GST_IS_DIRECTSOUND_RING_BUFFER(obj) \ (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_DIRECTSOUND_RING_BUFFER)) #define GST_IS_DIRECTSOUND_RING_BUFFER_CLASS(klass) \ (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_DIRECTSOUND_RING_BUFFER)) typedef struct _GstDirectSoundRingBuffer GstDirectSoundRingBuffer; typedef struct _GstDirectSoundRingBufferClass GstDirectSoundRingBufferClass; struct _GstDirectSoundRingBuffer { GstRingBuffer object; /* sink element */ struct _GstDirectSoundSink * dsoundsink; /* lock used to protect writes and resets */ GMutex * dsound_lock; /* directsound buffer waveformat description */ WAVEFORMATEX wave_format; /* directsound object interface pointer */ LPDIRECTSOUND8 pDS8; /* directsound sound object interface pointer */ LPDIRECTSOUNDBUFFER8 pDSB8; /* directsound buffer size */ guint buffer_size; /* directsound buffer write offset */ guint buffer_write_offset; /* minimum buffer size before playback start */ guint min_buffer_size; /* minimum sleep time for thread */ guint min_sleep_time; /* 
ringbuffer bytes per sample */ guint bytes_per_sample; /* ringbuffer segment size */ gint segsize; /* ring buffer offset */ guint segoffset; /* thread */ HANDLE hThread; /* thread suspended? */ gboolean suspended; /* should run thread */ gboolean should_run; /* are we currently flushing? */ gboolean flushing; /* current volume */ gdouble volume; }; struct _GstDirectSoundRingBufferClass { GstRingBufferClass parent_class; }; GType gst_directsound_ring_buffer_get_type (void); G_END_DECLS #endif /* __GST_DIRECTSOUNDRINGBUFFER_H__ */

psimedia-master/gstprovider/gstelements/directsound_sinkonly/gstdirectsoundsink.c

/* GStreamer * Copyright (C) 2005 Sebastien Moutte * Copyright (C) 2007-2009 Pioneers of the Inevitable * * gstdirectsoundsink.c: * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. * * The development of this code was made possible due to the involvement * of Pioneers of the Inevitable, the creators of the Songbird Music player * */ /** * SECTION:element-directsoundsink * * This element lets you output sound using the DirectSound API. 
* * Note that you should almost always use generic audio conversion elements * like audioconvert and audioresample in front of an audiosink to make sure * your pipeline works under all circumstances (those conversion elements will * act in passthrough-mode if no conversion is necessary). * * * Example pipelines * |[ * gst-launch -v audiotestsrc ! audioconvert ! volume volume=0.1 ! directsoundsink * ]| will output a sine wave (continuous beep sound) to your sound card (with * a very low volume as precaution). * |[ * gst-launch -v filesrc location=music.ogg ! decodebin ! audioconvert ! audioresample ! directsoundsink * ]| will play an Ogg/Vorbis audio file and output it. * */ #ifdef HAVE_CONFIG_H #include "config.h" #endif #include "gstdirectsoundsink.h" #define GST_CAT_DEFAULT directsound /* elementfactory information */ static const GstElementDetails gst_directsound_sink_details = GST_ELEMENT_DETAILS ("DirectSound8 Audio Sink", "Sink/Audio", "Output to a sound card via DirectSound8", "Ghislain 'Aus' Lacroix "); static GstStaticPadTemplate directsoundsink_sink_factory = GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, GST_STATIC_CAPS ("audio/x-raw-int, " "endianness = (int) LITTLE_ENDIAN, " "signed = (boolean) TRUE, " "width = (int) {8, 16}, " "depth = (int) {8, 16}, " "rate = (int) [ 1, MAX ], " "channels = (int) [ 1, 2 ]")); static void gst_directsound_sink_base_init (gpointer g_class); static void gst_directsound_sink_class_init (GstDirectSoundSinkClass * klass); static void gst_directsound_sink_init (GstDirectSoundSink * dsoundsink, GstDirectSoundSinkClass * g_class); static gboolean gst_directsound_sink_event (GstBaseSink * bsink, GstEvent * event); static void gst_directsound_sink_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); static void gst_directsound_sink_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec); static GstRingBuffer * gst_directsound_sink_create_ringbuffer ( 
GstBaseAudioSink * sink); enum { ARG_0, ARG_VOLUME }; GST_BOILERPLATE (GstDirectSoundSink, gst_directsound_sink, GstBaseAudioSink, GST_TYPE_BASE_AUDIO_SINK); static void gst_directsound_sink_base_init (gpointer g_class) { GstElementClass * element_class = GST_ELEMENT_CLASS (g_class); gst_element_class_set_details (element_class, &gst_directsound_sink_details); gst_element_class_add_pad_template (element_class, gst_static_pad_template_get (&directsoundsink_sink_factory)); } static void gst_directsound_sink_class_init (GstDirectSoundSinkClass * klass) { GObjectClass * gobject_class; GstElementClass * gstelement_class; GstBaseSinkClass * gstbasesink_class; GstBaseAudioSinkClass * gstbaseaudiosink_class; gobject_class = (GObjectClass *) klass; gstelement_class = (GstElementClass *) klass; gstbasesink_class = (GstBaseSinkClass *) klass; gstbaseaudiosink_class = (GstBaseAudioSinkClass *) klass; parent_class = g_type_class_peek_parent (klass); gobject_class->set_property = GST_DEBUG_FUNCPTR (gst_directsound_sink_set_property); gobject_class->get_property = GST_DEBUG_FUNCPTR (gst_directsound_sink_get_property); g_object_class_install_property (gobject_class, ARG_VOLUME, g_param_spec_double ("volume", "Volume", "Volume of this stream", 0, 1.0, 1.0, G_PARAM_READWRITE)); gstbasesink_class->event = GST_DEBUG_FUNCPTR (gst_directsound_sink_event); gstbaseaudiosink_class->create_ringbuffer = GST_DEBUG_FUNCPTR (gst_directsound_sink_create_ringbuffer); } static void gst_directsound_sink_init (GstDirectSoundSink * dsoundsink, GstDirectSoundSinkClass * g_class) { dsoundsink->dsoundbuffer = NULL; dsoundsink->volume = 1.0; } static gboolean gst_directsound_sink_event (GstBaseSink * bsink, GstEvent * event) { HRESULT hr; DWORD dwStatus; DWORD dwSizeBuffer = 0; LPVOID pLockedBuffer = NULL; GstDirectSoundSink * dsoundsink; dsoundsink = GST_DIRECTSOUND_SINK (bsink); GST_BASE_SINK_CLASS (parent_class)->event (bsink, event); /* no buffer, no event to process */ if (!dsoundsink->dsoundbuffer) 
return TRUE; switch (GST_EVENT_TYPE (event)) { case GST_EVENT_FLUSH_START: GST_DSOUND_LOCK (dsoundsink->dsoundbuffer); dsoundsink->dsoundbuffer->flushing = TRUE; GST_DSOUND_UNLOCK (dsoundsink->dsoundbuffer); break; case GST_EVENT_FLUSH_STOP: GST_DSOUND_LOCK (dsoundsink->dsoundbuffer); dsoundsink->dsoundbuffer->flushing = FALSE; if (dsoundsink->dsoundbuffer->pDSB8) { hr = IDirectSoundBuffer8_GetStatus (dsoundsink->dsoundbuffer->pDSB8, &dwStatus); if (FAILED(hr)) { GST_DSOUND_UNLOCK (dsoundsink->dsoundbuffer); GST_WARNING("gst_directsound_sink_event: IDirectSoundBuffer8_GetStatus, hr = %X", (unsigned int) hr); return FALSE; } if (!(dwStatus & DSBSTATUS_PLAYING)) { /*reset position */ hr = IDirectSoundBuffer8_SetCurrentPosition (dsoundsink->dsoundbuffer->pDSB8, 0); dsoundsink->dsoundbuffer->buffer_write_offset = 0; /*reset the buffer */ hr = IDirectSoundBuffer8_Lock (dsoundsink->dsoundbuffer->pDSB8, dsoundsink->dsoundbuffer->buffer_write_offset, 0L, &pLockedBuffer, &dwSizeBuffer, NULL, NULL, DSBLOCK_ENTIREBUFFER); if (SUCCEEDED (hr)) { memset (pLockedBuffer, 0, dwSizeBuffer); hr = IDirectSoundBuffer8_Unlock (dsoundsink->dsoundbuffer->pDSB8, pLockedBuffer, dwSizeBuffer, NULL, 0); if (FAILED(hr)) { GST_DSOUND_UNLOCK (dsoundsink->dsoundbuffer); GST_WARNING("gst_directsound_sink_event: IDirectSoundBuffer8_Unlock, hr = %X", (unsigned int) hr); return FALSE; } } else { GST_DSOUND_UNLOCK (dsoundsink->dsoundbuffer); GST_WARNING ( "gst_directsound_sink_event: IDirectSoundBuffer8_Lock, hr = %X", (unsigned int) hr); return FALSE; } } } GST_DSOUND_UNLOCK (dsoundsink->dsoundbuffer); break; default: break; } return TRUE; } static void gst_directsound_sink_set_volume (GstDirectSoundSink * dsoundsink) { if (dsoundsink->dsoundbuffer) { GST_DSOUND_LOCK (dsoundsink->dsoundbuffer); dsoundsink->dsoundbuffer->volume = dsoundsink->volume; if (dsoundsink->dsoundbuffer->pDSB8) { gst_directsound_set_volume (dsoundsink->dsoundbuffer->pDSB8, dsoundsink->volume); } GST_DSOUND_UNLOCK 
(dsoundsink->dsoundbuffer); } } static void gst_directsound_sink_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) { GstDirectSoundSink * sink = GST_DIRECTSOUND_SINK (object); switch (prop_id) { case ARG_VOLUME: sink->volume = g_value_get_double (value); gst_directsound_sink_set_volume (sink); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } static void gst_directsound_sink_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec) { GstDirectSoundSink * sink = GST_DIRECTSOUND_SINK (object); switch (prop_id) { case ARG_VOLUME: g_value_set_double (value, sink->volume); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } /* GstBaseAudioSink vmethod implementations */ static GstRingBuffer * gst_directsound_sink_create_ringbuffer (GstBaseAudioSink * sink) { GstDirectSoundSink * dsoundsink; GstDirectSoundRingBuffer * ringbuffer; dsoundsink = GST_DIRECTSOUND_SINK (sink); GST_DEBUG ("creating ringbuffer"); ringbuffer = g_object_new (GST_TYPE_DIRECTSOUND_RING_BUFFER, NULL); GST_DEBUG ("directsound sink 0x%p", dsoundsink); /* set the sink element on the ringbuffer for error messages */ ringbuffer->dsoundsink = dsoundsink; /* set the ringbuffer on the sink */ dsoundsink->dsoundbuffer = ringbuffer; /* set initial volume on ringbuffer */ dsoundsink->dsoundbuffer->volume = dsoundsink->volume; return GST_RING_BUFFER (ringbuffer); }

psimedia-master/gstprovider/gstelements/directsound_sinkonly/gstdirectsoundsink.h

/* GStreamer * Copyright (C) 2005 Sebastien Moutte * Copyright (C) 2007-2009 Pioneers of the Inevitable * * gstdirectsoundsink.h: * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your 
option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. * * The development of this code was made possible due to the involvement * of Pioneers of the Inevitable, the creators of the Songbird Music player * */ #ifndef __GST_DIRECTSOUNDSINK_H__ #define __GST_DIRECTSOUNDSINK_H__ #include <gst/gst.h> #include <gst/audio/gstbaseaudiosink.h> #include "gstdirectsound.h" #include "gstdirectsoundringbuffer.h" G_BEGIN_DECLS #define GST_TYPE_DIRECTSOUND_SINK \ (gst_directsound_sink_get_type()) #define GST_DIRECTSOUND_SINK(obj) \ (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_DIRECTSOUND_SINK,GstDirectSoundSink)) #define GST_DIRECTSOUND_SINK_CLASS(klass) \ (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_DIRECTSOUND_SINK,GstDirectSoundSinkClass)) #define GST_IS_DIRECTSOUND_SINK(obj) \ (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_DIRECTSOUND_SINK)) #define GST_IS_DIRECTSOUND_SINK_CLASS(klass) \ (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_DIRECTSOUND_SINK)) typedef struct _GstDirectSoundSink GstDirectSoundSink; typedef struct _GstDirectSoundSinkClass GstDirectSoundSinkClass; struct _GstDirectSoundSink { /* base audio sink */ GstBaseAudioSink sink; /* ringbuffer */ GstDirectSoundRingBuffer * dsoundbuffer; /* current volume */ gdouble volume; }; struct _GstDirectSoundSinkClass { GstBaseAudioSinkClass parent_class; }; GType gst_directsound_sink_get_type (void); G_END_DECLS #endif /* __GST_DIRECTSOUNDSINK_H__ */

psimedia-master/gstprovider/gstelements/liveadder.pri

HEADERS += \ $$PWD/liveadder/liveadder.h gstplugin:SOURCES 
+= $$PWD/liveadder/liveadder.c !gstplugin:SOURCES += $$PWD/static/liveadder_static.c LIBS *= \ -lgstaudio-0.10

psimedia-master/gstprovider/gstelements/liveadder/Makefile.am

plugin_LTLIBRARIES = libgstliveadder.la libgstliveadder_la_SOURCES = liveadder.c libgstliveadder_la_CFLAGS = $(GST_CFLAGS) $(GST_PLUGINS_BASE_CFLAGS) $(ERROR_CFLAGS) libgstliveadder_la_LIBADD = $(GST_LIBS_LIBS) libgstliveadder_la_LDFLAGS = $(GST_PLUGIN_LDFLAGS) $(GST_BASE_LIBS) $(GST_PLUGINS_BASE_LIBS) -lgstaudio-0.10 noinst_HEADERS = liveadder.h

psimedia-master/gstprovider/gstelements/liveadder/liveadder.c

/* * Farsight Voice+Video library * * Copyright 2008 Collabora Ltd * Copyright 2008 Nokia Corporation * @author: Olivier Crete * * With parts copied from the adder plugin which is * Copyright (C) 1999,2000 Erik Walthinsen * 2001 Thomas * 2005,2006 Wim Taymans * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Lesser General Public * License as published by the Free Software Foundation; either * version 2.1 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Lesser General Public License for more details. 
* * You should have received a copy of the GNU Lesser General Public * License along with this library; if not, write to the Free Software * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA * */ #ifdef HAVE_CONFIG_H #include "config.h" #endif #include "liveadder.h" #include <gst/audio/audio.h> #include <string.h> #define DEFAULT_LATENCY_MS 60 GST_DEBUG_CATEGORY_STATIC (live_adder_debug); #define GST_CAT_DEFAULT (live_adder_debug) /* elementfactory information */ static const GstElementDetails gst_live_adder_details = GST_ELEMENT_DETAILS ( "Live Adder element", "Generic/Audio", "Mixes live/discontinuous audio streams", "Olivier Crete "); static GstStaticPadTemplate gst_live_adder_sink_template = GST_STATIC_PAD_TEMPLATE ("sink%d", GST_PAD_SINK, GST_PAD_REQUEST, GST_STATIC_CAPS (GST_AUDIO_INT_PAD_TEMPLATE_CAPS "; " GST_AUDIO_FLOAT_PAD_TEMPLATE_CAPS) ); static GstStaticPadTemplate gst_live_adder_src_template = GST_STATIC_PAD_TEMPLATE ("src", GST_PAD_SRC, GST_PAD_ALWAYS, GST_STATIC_CAPS (GST_AUDIO_INT_PAD_TEMPLATE_CAPS "; " GST_AUDIO_FLOAT_PAD_TEMPLATE_CAPS) ); /* Valve signals and args */ enum { /* FILL ME */ LAST_SIGNAL }; enum { PROP_0, PROP_LATENCY, }; typedef struct _GstLiveAdderPadPrivate { GstSegment segment; gboolean eos; GstClockTime expected_timestamp; } GstLiveAdderPadPrivate; GST_BOILERPLATE (GstLiveAdder, gst_live_adder, GstElement, GST_TYPE_ELEMENT); static void gst_live_adder_finalize (GObject * object); static void gst_live_adder_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); static void gst_live_adder_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec); static GstPad * gst_live_adder_request_new_pad (GstElement * element, GstPadTemplate * templ, const gchar * unused); static void gst_live_adder_release_pad (GstElement * element, GstPad * pad); static GstStateChangeReturn gst_live_adder_change_state (GstElement * element, GstStateChange transition); static gboolean 
gst_live_adder_setcaps (GstPad * pad, GstCaps * caps); static GstCaps * gst_live_adder_sink_getcaps (GstPad * pad); static gboolean gst_live_adder_src_activate_push (GstPad * pad, gboolean active); static gboolean gst_live_adder_src_event (GstPad * pad, GstEvent * event); static void gst_live_adder_loop (gpointer data); static gboolean gst_live_adder_query (GstPad * pad, GstQuery * query); static gboolean gst_live_adder_sink_event (GstPad * pad, GstEvent * event); static void reset_pad_private (GstPad *pad); /* clipping versions */ #define MAKE_FUNC(name,type,ttype,min,max) \ static void name (type *out, type *in, gint bytes) { \ gint i; \ for (i = 0; i < bytes / sizeof (type); i++) \ out[i] = CLAMP ((ttype)out[i] + (ttype)in[i], min, max); \ } /* non-clipping versions (for float) */ #define MAKE_FUNC_NC(name,type,ttype) \ static void name (type *out, type *in, gint bytes) { \ gint i; \ for (i = 0; i < bytes / sizeof (type); i++) \ out[i] = (ttype)out[i] + (ttype)in[i]; \ } /* *INDENT-OFF* */ MAKE_FUNC (add_int32, gint32, gint64, G_MININT32, G_MAXINT32) MAKE_FUNC (add_int16, gint16, gint32, G_MININT16, G_MAXINT16) MAKE_FUNC (add_int8, gint8, gint16, G_MININT8, G_MAXINT8) MAKE_FUNC (add_uint32, guint32, guint64, 0, G_MAXUINT32) MAKE_FUNC (add_uint16, guint16, guint32, 0, G_MAXUINT16) MAKE_FUNC (add_uint8, guint8, guint16, 0, G_MAXUINT8) MAKE_FUNC_NC (add_float64, gdouble, gdouble) MAKE_FUNC_NC (add_float32, gfloat, gfloat) /* *INDENT-ON* */ static void gst_live_adder_base_init (gpointer klass) { } static void gst_live_adder_class_init (GstLiveAdderClass * klass) { GObjectClass *gobject_class; GstElementClass *gstelement_class; gobject_class = (GObjectClass *) klass; gobject_class->finalize = gst_live_adder_finalize; gobject_class->set_property = gst_live_adder_set_property; gobject_class->get_property = gst_live_adder_get_property; gstelement_class = (GstElementClass *) klass; gst_element_class_add_pad_template (gstelement_class, gst_static_pad_template_get 
(&gst_live_adder_src_template)); gst_element_class_add_pad_template (gstelement_class, gst_static_pad_template_get (&gst_live_adder_sink_template)); gst_element_class_set_details (gstelement_class, &gst_live_adder_details); parent_class = g_type_class_peek_parent (klass); gstelement_class->request_new_pad = gst_live_adder_request_new_pad; gstelement_class->release_pad = gst_live_adder_release_pad; gstelement_class->change_state = gst_live_adder_change_state; g_object_class_install_property (gobject_class, PROP_LATENCY, g_param_spec_uint ("latency", "Buffer latency in ms", "Amount of data to buffer", 0, G_MAXUINT, DEFAULT_LATENCY_MS, G_PARAM_READWRITE)); GST_DEBUG_CATEGORY_INIT (live_adder_debug, "liveadder", 0, "Live Adder"); } static void gst_live_adder_init (GstLiveAdder * adder, GstLiveAdderClass *klass) { GstPadTemplate *template; template = gst_static_pad_template_get (&gst_live_adder_src_template); adder->srcpad = gst_pad_new_from_template (template, "src"); gst_object_unref (template); gst_pad_set_getcaps_function (adder->srcpad, GST_DEBUG_FUNCPTR (gst_pad_proxy_getcaps)); gst_pad_set_setcaps_function (adder->srcpad, GST_DEBUG_FUNCPTR (gst_live_adder_setcaps)); gst_pad_set_query_function (adder->srcpad, GST_DEBUG_FUNCPTR (gst_live_adder_query)); gst_pad_set_event_function (adder->srcpad, GST_DEBUG_FUNCPTR (gst_live_adder_src_event)); gst_pad_set_activatepush_function (adder->srcpad, GST_DEBUG_FUNCPTR (gst_live_adder_src_activate_push)); gst_element_add_pad (GST_ELEMENT (adder), adder->srcpad); adder->format = GST_LIVE_ADDER_FORMAT_UNSET; adder->padcount = 0; adder->func = NULL; adder->not_empty_cond = g_cond_new (); adder->next_timestamp = GST_CLOCK_TIME_NONE; adder->latency_ms = DEFAULT_LATENCY_MS; adder->buffers = g_queue_new (); } static void gst_live_adder_finalize (GObject * object) { GstLiveAdder *adder = GST_LIVE_ADDER (object); g_cond_free (adder->not_empty_cond); g_queue_foreach (adder->buffers, (GFunc) gst_mini_object_unref, NULL); while 
(g_queue_pop_head (adder->buffers)) {} g_queue_free (adder->buffers); g_list_free (adder->sinkpads); G_OBJECT_CLASS (parent_class)->finalize (object); } static void gst_live_adder_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) { GstLiveAdder *adder = GST_LIVE_ADDER (object); switch (prop_id) { case PROP_LATENCY: { guint64 new_latency, old_latency; new_latency = g_value_get_uint (value); GST_OBJECT_LOCK (adder); old_latency = adder->latency_ms; adder->latency_ms = new_latency; GST_OBJECT_UNLOCK (adder); /* post message if latency changed, this will inform the parent pipeline * that a latency reconfiguration is possible/needed. */ if (new_latency != old_latency) { GST_DEBUG_OBJECT (adder, "latency changed to: %" GST_TIME_FORMAT, GST_TIME_ARGS (new_latency)); gst_element_post_message (GST_ELEMENT_CAST (adder), gst_message_new_latency (GST_OBJECT_CAST (adder))); } break; } default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } static void gst_live_adder_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec) { GstLiveAdder *adder = GST_LIVE_ADDER (object); switch (prop_id) { case PROP_LATENCY: GST_OBJECT_LOCK (adder); g_value_set_uint (value, adder->latency_ms); GST_OBJECT_UNLOCK (adder); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } /* we can only accept caps that we and downstream can handle. */ static GstCaps * gst_live_adder_sink_getcaps (GstPad * pad) { GstLiveAdder *adder; GstCaps *result, *peercaps, *sinkcaps; adder = GST_LIVE_ADDER (GST_PAD_PARENT (pad)); /* get the downstream possible caps */ peercaps = gst_pad_peer_get_caps (adder->srcpad); /* get the allowed caps on this sinkpad, we use the fixed caps function so * that it does not call recursively in this function. 
*/ sinkcaps = gst_pad_get_fixed_caps_func (pad); if (peercaps) { /* if the peer has caps, intersect */ GST_DEBUG_OBJECT (adder, "intersecting peer and template caps"); result = gst_caps_intersect (peercaps, sinkcaps); gst_caps_unref (peercaps); gst_caps_unref (sinkcaps); } else { /* the peer has no caps (or there is no peer), just use the allowed caps * of this sinkpad. */ GST_DEBUG_OBJECT (adder, "no peer caps, using sinkcaps"); result = sinkcaps; } return result; } /* the first caps we receive on any of the sinkpads will define the caps for all * the other sinkpads because we can only mix streams with the same caps. * */ static gboolean gst_live_adder_setcaps (GstPad * pad, GstCaps * caps) { GstLiveAdder *adder; GList *pads; GstStructure *structure; const char *media_type; adder = GST_LIVE_ADDER (GST_PAD_PARENT (pad)); GST_LOG_OBJECT (adder, "setting caps on pad %p,%s to %" GST_PTR_FORMAT, pad, GST_PAD_NAME (pad), caps); /* FIXME, see if the other pads can accept the format. Also lock the * format on the other pads to this new format. */ GST_OBJECT_LOCK (adder); pads = GST_ELEMENT (adder)->pads; while (pads) { GstPad *otherpad = GST_PAD (pads->data); if (otherpad != pad) gst_caps_replace (&GST_PAD_CAPS (otherpad), caps); pads = g_list_next (pads); } /* parse caps now */ structure = gst_caps_get_structure (caps, 0); media_type = gst_structure_get_name (structure); if (strcmp (media_type, "audio/x-raw-int") == 0) { GST_DEBUG_OBJECT (adder, "parse_caps sets adder to format int"); adder->format = GST_LIVE_ADDER_FORMAT_INT; gst_structure_get_int (structure, "width", &adder->width); gst_structure_get_int (structure, "depth", &adder->depth); gst_structure_get_int (structure, "endianness", &adder->endianness); gst_structure_get_boolean (structure, "signed", &adder->is_signed); if (adder->endianness != G_BYTE_ORDER) goto not_supported; switch (adder->width) { case 8: adder->func = (adder->is_signed ? 
(GstLiveAdderFunction) add_int8 : (GstLiveAdderFunction) add_uint8); break; case 16: adder->func = (adder->is_signed ? (GstLiveAdderFunction) add_int16 : (GstLiveAdderFunction) add_uint16); break; case 32: adder->func = (adder->is_signed ? (GstLiveAdderFunction) add_int32 : (GstLiveAdderFunction) add_uint32); break; default: goto not_supported; } } else if (strcmp (media_type, "audio/x-raw-float") == 0) { GST_DEBUG_OBJECT (adder, "parse_caps sets adder to format float"); adder->format = GST_LIVE_ADDER_FORMAT_FLOAT; gst_structure_get_int (structure, "width", &adder->width); switch (adder->width) { case 32: adder->func = (GstLiveAdderFunction) add_float32; break; case 64: adder->func = (GstLiveAdderFunction) add_float64; break; default: goto not_supported; } } else { goto not_supported; } gst_structure_get_int (structure, "channels", &adder->channels); gst_structure_get_int (structure, "rate", &adder->rate); /* precalc bps */ adder->bps = (adder->width / 8) * adder->channels; GST_OBJECT_UNLOCK (adder); return TRUE; /* ERRORS */ not_supported: { GST_OBJECT_UNLOCK (adder); GST_DEBUG_OBJECT (adder, "unsupported format set as caps"); return FALSE; } } static void gst_live_adder_flush_start (GstLiveAdder * adder) { GST_DEBUG_OBJECT (adder, "Disabling pop on queue"); GST_OBJECT_LOCK (adder); /* mark ourselves as flushing */ adder->srcresult = GST_FLOW_WRONG_STATE; /* Empty the queue */ g_queue_foreach (adder->buffers, (GFunc) gst_mini_object_unref, NULL); while (g_queue_pop_head (adder->buffers)) {} /* unlock clock, we just unschedule, the entry will be released by the * locking streaming thread. 
*/ if (adder->clock_id) gst_clock_id_unschedule (adder->clock_id); g_cond_broadcast (adder->not_empty_cond); GST_OBJECT_UNLOCK (adder); } static gboolean gst_live_adder_src_activate_push (GstPad * pad, gboolean active) { gboolean result = TRUE; GstLiveAdder *adder = NULL; adder = GST_LIVE_ADDER (gst_pad_get_parent (pad)); if (active) { /* Mark as non flushing */ GST_OBJECT_LOCK (adder); adder->srcresult = GST_FLOW_OK; GST_OBJECT_UNLOCK (adder); /* start pushing out buffers */ GST_DEBUG_OBJECT (adder, "Starting task on srcpad"); gst_pad_start_task (adder->srcpad, (GstTaskFunction) gst_live_adder_loop, adder); } else { /* make sure all data processing stops ASAP */ gst_live_adder_flush_start (adder); /* NOTE this will hardlock if the state change is called from the src pad * task thread because we will _join() the thread. */ GST_DEBUG_OBJECT (adder, "Stopping task on srcpad"); result = gst_pad_stop_task (pad); } gst_object_unref (adder); return result; } static gboolean gst_live_adder_sink_event (GstPad * pad, GstEvent * event) { gboolean ret = TRUE; GstLiveAdder *adder = NULL; GstLiveAdderPadPrivate *padprivate = NULL; adder = GST_LIVE_ADDER (gst_pad_get_parent (pad)); padprivate = gst_pad_get_element_private (pad); if (!padprivate) return FALSE; GST_LOG_OBJECT (adder, "received %s", GST_EVENT_TYPE_NAME (event)); switch (GST_EVENT_TYPE (event)) { case GST_EVENT_NEWSEGMENT: { GstFormat format; gdouble rate, arate; gint64 start, stop, time; gboolean update; gst_event_parse_new_segment_full (event, &update, &rate, &arate, &format, &start, &stop, &time); gst_event_unref (event); /* we need time for now */ if (format != GST_FORMAT_TIME) goto newseg_wrong_format; GST_DEBUG_OBJECT (adder, "newsegment: update %d, rate %g, arate %g, start %" GST_TIME_FORMAT ", stop %" GST_TIME_FORMAT ", time %" GST_TIME_FORMAT, update, rate, arate, GST_TIME_ARGS (start), GST_TIME_ARGS (stop), GST_TIME_ARGS (time)); /* now configure the values, we need these to time the release of the * 
buffers on the srcpad. */
      GST_OBJECT_LOCK (adder);
      gst_segment_set_newsegment_full (&padprivate->segment, update, rate,
          arate, format, start, stop, time);
      GST_OBJECT_UNLOCK (adder);
      break;
    }
    case GST_EVENT_FLUSH_START:
      gst_live_adder_flush_start (adder);
      ret = gst_pad_push_event (adder->srcpad, event);
      break;
    case GST_EVENT_FLUSH_STOP:
      GST_OBJECT_LOCK (adder);
      adder->segment_pending = TRUE;
      adder->next_timestamp = GST_CLOCK_TIME_NONE;
      reset_pad_private (pad);
      GST_OBJECT_UNLOCK (adder);
      ret = gst_pad_push_event (adder->srcpad, event);
      ret = gst_live_adder_src_activate_push (adder->srcpad, TRUE);
      break;
    case GST_EVENT_EOS:
    {
      GST_OBJECT_LOCK (adder);
      ret = adder->srcresult == GST_FLOW_OK;
      if (ret && !padprivate->eos) {
        GST_DEBUG_OBJECT (adder, "queuing EOS");
        padprivate->eos = TRUE;
        g_cond_broadcast (adder->not_empty_cond);
      } else if (padprivate->eos) {
        GST_DEBUG_OBJECT (adder, "dropping EOS, we are already EOS");
      } else {
        GST_DEBUG_OBJECT (adder, "dropping EOS, reason %s",
            gst_flow_get_name (adder->srcresult));
      }
      GST_OBJECT_UNLOCK (adder);

      gst_event_unref (event);
      break;
    }
    default:
      ret = gst_pad_push_event (adder->srcpad, event);
      break;
  }

done:
  gst_object_unref (adder);
  return ret;

  /* ERRORS */
newseg_wrong_format:
  {
    GST_DEBUG_OBJECT (adder, "received non TIME newsegment");
    ret = FALSE;
    goto done;
  }
}

static gboolean
gst_live_adder_query_pos_dur (GstLiveAdder * adder, GstFormat informat,
    gboolean position, gint64 *outvalue)
{
  gint64 max = G_MININT64;
  gboolean res = TRUE;
  GstIterator *it;
  gboolean done = FALSE;

  it = gst_element_iterate_sink_pads (GST_ELEMENT_CAST (adder));
  while (!done) {
    GstIteratorResult ires;
    gpointer item;
    GstFormat format = informat;

    ires = gst_iterator_next (it, &item);
    switch (ires) {
      case GST_ITERATOR_DONE:
        done = TRUE;
        break;
      case GST_ITERATOR_OK:
      {
        GstPad *pad = GST_PAD_CAST (item);
        gint64 value;
        gboolean curres;

        /* ask sink peer for duration */
        if (position)
          curres = gst_pad_query_peer_position (pad, &format,
&value); else curres = gst_pad_query_peer_duration (pad, &format, &value); /* take max from all valid return values */ /* Only if the format is the one we requested, otherwise ignore it ? */ if (curres && format == informat) { res &= curres; /* valid unknown length, stop searching */ if (value == -1) { max = value; done = TRUE; } else if (value > max) { max = value; } } break; } case GST_ITERATOR_RESYNC: max = -1; res = TRUE; break; default: res = FALSE; done = TRUE; break; } } gst_iterator_free (it); if (res) *outvalue = max; return res; } /* FIXME: * * When we add a new stream (or remove a stream) the duration might * also become invalid again and we need to post a new DURATION * message to notify this fact to the parent. * For now we take the max of all the upstream elements so the simple * cases work at least somewhat. */ static gboolean gst_live_adder_query_duration (GstLiveAdder * adder, GstQuery * query) { GstFormat format; gint64 max; gboolean res; /* parse format */ gst_query_parse_duration (query, &format, NULL); res = gst_live_adder_query_pos_dur (adder, format, FALSE, &max); if (res) { /* and store the max */ gst_query_set_duration (query, format, max); } return res; } static gboolean gst_live_adder_query_position (GstLiveAdder * adder, GstQuery * query) { GstFormat format; gint64 max; gboolean res; /* parse format */ gst_query_parse_position (query, &format, NULL); res = gst_live_adder_query_pos_dur (adder, format, TRUE, &max); if (res) { /* and store the max */ gst_query_set_position (query, format, max); } return res; } static gboolean gst_live_adder_query (GstPad * pad, GstQuery * query) { GstLiveAdder *adder; gboolean res = FALSE; adder = GST_LIVE_ADDER (gst_pad_get_parent (pad)); switch (GST_QUERY_TYPE (query)) { case GST_QUERY_LATENCY: { /* We need to send the query upstream and add the returned latency to our * own */ GstClockTime min_latency = 0, max_latency = G_MAXUINT64; gpointer item; GstIterator *iter = NULL; gboolean done = FALSE; iter = 
gst_element_iterate_sink_pads (GST_ELEMENT (adder)); while (!done) { switch (gst_iterator_next (iter, &item)) { case GST_ITERATOR_OK: { GstPad *sinkpad = item; GstClockTime pad_min_latency, pad_max_latency; gboolean pad_us_live; if (gst_pad_peer_query (sinkpad, query)) { gst_query_parse_latency (query, &pad_us_live, &pad_min_latency, &pad_max_latency); res = TRUE; GST_DEBUG_OBJECT (adder, "Peer latency for pad %s: min %" GST_TIME_FORMAT " max %" GST_TIME_FORMAT, GST_PAD_NAME (sinkpad), GST_TIME_ARGS (pad_min_latency), GST_TIME_ARGS (pad_max_latency)); min_latency = MAX (pad_min_latency, min_latency); max_latency = MIN (pad_max_latency, max_latency); } gst_object_unref (item); } break; case GST_ITERATOR_RESYNC: min_latency = 0; max_latency = G_MAXUINT64; gst_iterator_resync (iter); break; case GST_ITERATOR_ERROR: GST_ERROR_OBJECT (adder, "Error looping sink pads"); done = TRUE; break; case GST_ITERATOR_DONE: done = TRUE; break; } } gst_iterator_free (iter); if (res) { GstClockTime my_latency = adder->latency_ms * GST_MSECOND; GST_OBJECT_LOCK (adder); adder->peer_latency = min_latency; min_latency += my_latency; GST_OBJECT_UNLOCK (adder); /* Make sure we don't risk an overflow */ if (max_latency < G_MAXUINT64 - my_latency) max_latency += my_latency; else max_latency = G_MAXUINT64; gst_query_set_latency (query, TRUE, min_latency, max_latency); GST_DEBUG_OBJECT (adder, "Calculated total latency : min %" GST_TIME_FORMAT " max %" GST_TIME_FORMAT, GST_TIME_ARGS (min_latency), GST_TIME_ARGS (max_latency)); } break; } case GST_QUERY_DURATION: res = gst_live_adder_query_duration (adder, query); break; case GST_QUERY_POSITION: res = gst_live_adder_query_position (adder, query); break; default: res = gst_pad_query_default (pad, query); break; } gst_object_unref (adder); return res; } static gboolean forward_event_func (GstPad * pad, GValue * ret, GstEvent * event) { gst_event_ref (event); GST_LOG_OBJECT (pad, "About to send event %s", GST_EVENT_TYPE_NAME (event)); if 
(!gst_pad_push_event (pad, event)) { g_value_set_boolean (ret, FALSE); GST_WARNING_OBJECT (pad, "Sending event %p (%s) failed.", event, GST_EVENT_TYPE_NAME (event)); } else { GST_LOG_OBJECT (pad, "Sent event %p (%s).", event, GST_EVENT_TYPE_NAME (event)); } /* unref the pad because of a FIXME in gst_iterator_unfold * it does a gst_iterator_next which refs the pad, but it never unrefs it */ gst_object_unref (pad); return TRUE; } /* forwards the event to all sinkpads, takes ownership of the * event * * Returns: TRUE if the event could be forwarded on all * sinkpads. */ static gboolean forward_event (GstLiveAdder * adder, GstEvent * event) { gboolean ret; GstIterator *it; GValue vret = { 0 }; GST_LOG_OBJECT (adder, "Forwarding event %p (%s)", event, GST_EVENT_TYPE_NAME (event)); ret = TRUE; g_value_init (&vret, G_TYPE_BOOLEAN); g_value_set_boolean (&vret, TRUE); it = gst_element_iterate_sink_pads (GST_ELEMENT_CAST (adder)); gst_iterator_fold (it, (GstIteratorFoldFunction) forward_event_func, &vret, event); gst_iterator_free (it); ret = g_value_get_boolean (&vret); return ret; } static gboolean gst_live_adder_src_event (GstPad * pad, GstEvent * event) { GstLiveAdder *adder; gboolean result; adder = GST_LIVE_ADDER (gst_pad_get_parent (pad)); switch (GST_EVENT_TYPE (event)) { case GST_EVENT_QOS: /* TODO : QoS might be tricky */ result = FALSE; break; case GST_EVENT_NAVIGATION: /* TODO : navigation is rather pointless. 
*/ result = FALSE; break; default: /* just forward the rest for now */ result = forward_event (adder, event); break; } gst_event_unref (event); gst_object_unref (adder); return result; } static guint gst_live_adder_length_from_duration (GstLiveAdder *adder, GstClockTime duration) { guint64 ret = (duration * adder->rate / GST_SECOND) * adder->bps; return (guint) ret; } static GstFlowReturn gst_live_live_adder_chain (GstPad *pad, GstBuffer *buffer) { GstLiveAdder *adder = GST_LIVE_ADDER (gst_pad_get_parent_element (pad)); GstLiveAdderPadPrivate *padprivate = NULL; GstFlowReturn ret = GST_FLOW_OK; GList *item = NULL; GstClockTime skip = 0; gint64 drift = 0; /* Positive if new buffer after old buffer */ GST_OBJECT_LOCK (adder); ret = adder->srcresult; GST_DEBUG ("Incoming buffer time:%"GST_TIME_FORMAT" duration:%"GST_TIME_FORMAT, GST_TIME_ARGS(GST_BUFFER_TIMESTAMP(buffer)), GST_TIME_ARGS(GST_BUFFER_DURATION(buffer))); if (ret != GST_FLOW_OK) { GST_DEBUG_OBJECT (adder, "Passing non-ok result from src: %s", gst_flow_get_name (ret)); gst_buffer_unref (buffer); goto out; } padprivate = gst_pad_get_element_private (pad); if (!padprivate) { ret = GST_FLOW_NOT_LINKED; gst_buffer_unref (buffer); goto out; } if (padprivate->eos) { GST_DEBUG_OBJECT (adder, "Received buffer after EOS"); ret = GST_FLOW_UNEXPECTED; gst_buffer_unref (buffer); goto out; } if (!GST_BUFFER_TIMESTAMP_IS_VALID(buffer)) goto invalid_timestamp; if (padprivate->segment.format == GST_FORMAT_UNDEFINED) { GST_WARNING_OBJECT (adder, "No new-segment received," " initializing segment with time 0..-1"); gst_segment_init (&padprivate->segment, GST_FORMAT_TIME); gst_segment_set_newsegment (&padprivate->segment, FALSE, 1.0, GST_FORMAT_TIME, 0, -1, 0); } if (padprivate->segment.format != GST_FORMAT_TIME) goto invalid_segment; buffer = gst_buffer_make_metadata_writable (buffer); drift = GST_BUFFER_TIMESTAMP (buffer) - padprivate->expected_timestamp; /* Just see if we receive invalid timestamp/durations */ if 
(GST_CLOCK_TIME_IS_VALID (padprivate->expected_timestamp) &&
      !GST_BUFFER_FLAG_IS_SET(buffer, GST_BUFFER_FLAG_DISCONT) &&
      (drift != 0)) {
    GST_LOG_OBJECT (adder,
        "Timestamp discontinuity without the DISCONT flag set"
        " (expected %" GST_TIME_FORMAT ", got %" GST_TIME_FORMAT
        " drift:%ldms)",
        GST_TIME_ARGS (padprivate->expected_timestamp),
        GST_TIME_ARGS (GST_BUFFER_TIMESTAMP (buffer)),
        (long int)(drift / GST_MSECOND));

    /* We accept drifts of up to 10ms */
    if (ABS(drift) < (10 * GST_MSECOND)) {
      GST_DEBUG ("Correcting minor drift");
      GST_BUFFER_TIMESTAMP (buffer) = padprivate->expected_timestamp;
    }
  }

  /* If there is no duration, let's set one */
  if (!GST_BUFFER_DURATION_IS_VALID (buffer)) {
    GST_BUFFER_DURATION (buffer) =
        gst_audio_duration_from_pad_buffer (pad, buffer);
    padprivate->expected_timestamp = GST_CLOCK_TIME_NONE;
  } else {
    padprivate->expected_timestamp = GST_BUFFER_TIMESTAMP (buffer) +
        GST_BUFFER_DURATION (buffer);
  }

  /*
   * Let's clip the buffer to the segment (so we don't have to worry about
   * clipping afterwards).
* This should also guarantee us that we'll have valid timestamps and * durations afterwards */ buffer = gst_audio_buffer_clip (buffer, &padprivate->segment, adder->rate, adder->bps); /* buffer can be NULL if it's completely outside of the segment */ if (!buffer) { GST_DEBUG ("Buffer completely outside of configured segment, dropping it"); goto out; } /* * Make sure all incoming buffers share the same timestamping */ GST_BUFFER_TIMESTAMP (buffer) = gst_segment_to_running_time ( &padprivate->segment, padprivate->segment.format, GST_BUFFER_TIMESTAMP (buffer)); if (GST_CLOCK_TIME_IS_VALID (adder->next_timestamp) && GST_BUFFER_TIMESTAMP (buffer) < adder->next_timestamp) { if (GST_BUFFER_TIMESTAMP (buffer) + GST_BUFFER_DURATION (buffer) < adder->next_timestamp) { GST_DEBUG_OBJECT (adder, "Buffer is late, dropping (ts: %" GST_TIME_FORMAT " duration: %" GST_TIME_FORMAT ")", GST_TIME_ARGS (GST_BUFFER_TIMESTAMP (buffer)), GST_TIME_ARGS (GST_BUFFER_DURATION (buffer))); gst_buffer_unref (buffer); goto out; } else { skip = adder->next_timestamp - GST_BUFFER_TIMESTAMP (buffer); GST_DEBUG_OBJECT (adder, "Buffer is partially late, skipping %" GST_TIME_FORMAT, GST_TIME_ARGS (skip)); } } /* If our new buffer's head is higher than the queue's head, lets wake up, * we may not have to wait for as long */ if (adder->clock_id && g_queue_peek_head (adder->buffers) != NULL && GST_BUFFER_TIMESTAMP (buffer) + skip < GST_BUFFER_TIMESTAMP (g_queue_peek_head (adder->buffers))) gst_clock_id_unschedule (adder->clock_id); for (item = g_queue_peek_head_link (adder->buffers); item; item = g_list_next (item)) { GstBuffer *oldbuffer = item->data; GstClockTime old_skip = 0; GstClockTime mix_duration = 0; GstClockTime mix_start = 0; GstClockTime mix_end = 0; /* We haven't reached our place yet */ if (GST_BUFFER_TIMESTAMP (buffer) + skip >= GST_BUFFER_TIMESTAMP (oldbuffer) + GST_BUFFER_DURATION (oldbuffer)) continue; /* We're past our place, lets insert ouselves here */ if (GST_BUFFER_TIMESTAMP (buffer) 
+ GST_BUFFER_DURATION (buffer) <= GST_BUFFER_TIMESTAMP (oldbuffer)) break; /* if we reach this spot, we have overlap, so we must mix */ /* First make a subbuffer with the non-overlapping part */ if (GST_BUFFER_TIMESTAMP (buffer) + skip < GST_BUFFER_TIMESTAMP (oldbuffer)) { GstBuffer *subbuffer = NULL; GstClockTime subbuffer_duration = GST_BUFFER_TIMESTAMP (oldbuffer) - (GST_BUFFER_TIMESTAMP (buffer) + skip); subbuffer = gst_buffer_create_sub (buffer, gst_live_adder_length_from_duration (adder, skip), gst_live_adder_length_from_duration (adder, subbuffer_duration)); GST_BUFFER_TIMESTAMP (subbuffer) = GST_BUFFER_TIMESTAMP (buffer) + skip; GST_BUFFER_DURATION (subbuffer) = subbuffer_duration; skip += subbuffer_duration; g_queue_insert_before (adder->buffers, item, subbuffer); } /* Now we are on the overlapping part */ oldbuffer = gst_buffer_make_writable (oldbuffer); item->data = oldbuffer; old_skip = GST_BUFFER_TIMESTAMP (buffer) + skip - GST_BUFFER_TIMESTAMP (oldbuffer); mix_start = GST_BUFFER_TIMESTAMP (oldbuffer) + old_skip; if (GST_BUFFER_TIMESTAMP (buffer) + GST_BUFFER_DURATION (buffer) < GST_BUFFER_TIMESTAMP (oldbuffer) + GST_BUFFER_DURATION (oldbuffer)) mix_end = GST_BUFFER_TIMESTAMP (buffer) + GST_BUFFER_DURATION (buffer); else mix_end = GST_BUFFER_TIMESTAMP (oldbuffer) + GST_BUFFER_DURATION (oldbuffer); mix_duration = mix_end - mix_start; adder->func (GST_BUFFER_DATA (oldbuffer) + gst_live_adder_length_from_duration (adder, old_skip), GST_BUFFER_DATA (buffer) + gst_live_adder_length_from_duration (adder, skip), gst_live_adder_length_from_duration (adder, mix_duration)); skip += mix_duration; } g_cond_broadcast (adder->not_empty_cond); if (skip == GST_BUFFER_DURATION (buffer)) { gst_buffer_unref (buffer); } else { if (skip) { GstClockTime subbuffer_duration = GST_BUFFER_DURATION (buffer) - skip; GstClockTime subbuffer_ts = GST_BUFFER_TIMESTAMP (buffer) + skip; buffer = gst_buffer_create_sub (buffer, gst_live_adder_length_from_duration (adder, skip), 
gst_live_adder_length_from_duration (adder, subbuffer_duration)); GST_BUFFER_TIMESTAMP (buffer) = subbuffer_ts; GST_BUFFER_DURATION (buffer) = subbuffer_duration; } if (item) g_queue_insert_before (adder->buffers, item, buffer); else g_queue_push_tail (adder->buffers, buffer); } out: GST_OBJECT_UNLOCK (adder); gst_object_unref (adder); return ret; invalid_timestamp: GST_OBJECT_UNLOCK (adder); gst_buffer_unref (buffer); GST_ELEMENT_ERROR (adder, STREAM, FAILED, ("Buffer without a valid timestamp received"), ("Invalid timestamp received on buffer")); return GST_FLOW_ERROR; invalid_segment: { const gchar *format = gst_format_get_name (padprivate->segment.format); GST_OBJECT_UNLOCK (adder); gst_buffer_unref (buffer); GST_ELEMENT_ERROR (adder, STREAM, FAILED, ("This element only supports TIME segments, received other type"), ("Received a segment of type %s, only support time segment", format)); return GST_FLOW_ERROR; } } /* * This only works because the GstObject lock is taken * * It checks if all sink pads are EOS */ static gboolean check_eos_locked (GstLiveAdder *adder) { GList *item; /* We can't be EOS if we have no sinkpads */ if (adder->sinkpads == NULL) return FALSE; for (item = adder->sinkpads; item; item = g_list_next (item)) { GstPad *pad = item->data; GstLiveAdderPadPrivate *padprivate = gst_pad_get_element_private (pad); if (padprivate && padprivate->eos != TRUE) return FALSE; } return TRUE; } static void gst_live_adder_loop (gpointer data) { GstLiveAdder *adder = GST_LIVE_ADDER (data); GstClockTime buffer_timestamp = 0; GstClockTime sync_time = 0; GstClock *clock = NULL; GstClockID id = NULL; GstClockReturn ret; GstBuffer *buffer = NULL; GstFlowReturn result; GstEvent *newseg_event = NULL; GST_OBJECT_LOCK (adder); again: for (;;) { if (adder->srcresult != GST_FLOW_OK) goto flushing; if (!g_queue_is_empty (adder->buffers)) break; if (check_eos_locked (adder)) goto eos; g_cond_wait (adder->not_empty_cond, GST_OBJECT_GET_LOCK(adder)); } buffer_timestamp = 
GST_BUFFER_TIMESTAMP (g_queue_peek_head (adder->buffers)); clock = GST_ELEMENT_CLOCK (adder); /* If we have no clock, then we can't do anything.. error */ if (!clock) { if (adder->playing) goto no_clock; else goto push_buffer; } GST_DEBUG_OBJECT (adder, "sync to timestamp %" GST_TIME_FORMAT, GST_TIME_ARGS (buffer_timestamp)); sync_time = buffer_timestamp + GST_ELEMENT_CAST (adder)->base_time; /* add latency, this includes our own latency and the peer latency. */ sync_time += adder->latency_ms * GST_MSECOND; sync_time += adder->peer_latency; /* create an entry for the clock */ id = adder->clock_id = gst_clock_new_single_shot_id (clock, sync_time); GST_OBJECT_UNLOCK (adder); ret = gst_clock_id_wait (id, NULL); GST_OBJECT_LOCK (adder); /* and free the entry */ gst_clock_id_unref (id); adder->clock_id = NULL; /* at this point, the clock could have been unlocked by a timeout, a new * head element was added to the queue or because we are shutting down. Check * for shutdown first. */ if (adder->srcresult != GST_FLOW_OK) goto flushing; if (ret == GST_CLOCK_UNSCHEDULED) { GST_DEBUG_OBJECT (adder, "Wait got unscheduled, will retry to push with new buffer"); goto again; } if (ret != GST_CLOCK_OK && ret != GST_CLOCK_EARLY) goto clock_error; push_buffer: buffer = g_queue_pop_head (adder->buffers); if (!buffer) goto again; /* * We make sure the timestamps are exactly contiguous * If its only small skew (due to rounding errors), we correct it * silently. 
Otherwise we put the discont flag */ if (GST_CLOCK_TIME_IS_VALID (adder->next_timestamp) && GST_BUFFER_TIMESTAMP (buffer) != adder->next_timestamp) { GstClockTimeDiff diff = GST_CLOCK_DIFF (GST_BUFFER_TIMESTAMP (buffer), adder->next_timestamp); if (diff < 0) diff = -diff; if (diff < GST_SECOND / adder->rate) { GST_BUFFER_TIMESTAMP (buffer) = adder->next_timestamp; GST_DEBUG_OBJECT (adder, "Correcting slight skew"); GST_BUFFER_FLAG_UNSET(buffer, GST_BUFFER_FLAG_DISCONT); } else { GST_BUFFER_FLAG_SET(buffer, GST_BUFFER_FLAG_DISCONT); GST_DEBUG_OBJECT (adder, "Expected buffer at %" GST_TIME_FORMAT ", but is at %" GST_TIME_FORMAT", setting discont", GST_TIME_ARGS (adder->next_timestamp), GST_TIME_ARGS (GST_BUFFER_TIMESTAMP (buffer))); } } else { GST_BUFFER_FLAG_UNSET(buffer, GST_BUFFER_FLAG_DISCONT); } GST_BUFFER_OFFSET(buffer) = GST_BUFFER_OFFSET_NONE; GST_BUFFER_OFFSET_END(buffer) = GST_BUFFER_OFFSET_NONE; if (GST_BUFFER_DURATION_IS_VALID (buffer)) adder->next_timestamp = GST_BUFFER_TIMESTAMP (buffer) + GST_BUFFER_DURATION (buffer); else adder->next_timestamp = GST_CLOCK_TIME_NONE; if (adder->segment_pending) { /* * We set the start at 0, because we re-timestamps to the running time */ newseg_event = gst_event_new_new_segment_full (FALSE, 1.0, 1.0, GST_FORMAT_TIME, 0, -1, 0); adder->segment_pending = FALSE; } GST_OBJECT_UNLOCK (adder); if (newseg_event) gst_pad_push_event (adder->srcpad, newseg_event); GST_DEBUG ("About to push buffer time:%"GST_TIME_FORMAT" duration:%"GST_TIME_FORMAT, GST_TIME_ARGS(GST_BUFFER_TIMESTAMP(buffer)), GST_TIME_ARGS(GST_BUFFER_DURATION(buffer))); result = gst_pad_push (adder->srcpad, buffer); if (result != GST_FLOW_OK) goto pause; return; flushing: { GST_DEBUG_OBJECT (adder, "we are flushing"); gst_pad_pause_task (adder->srcpad); GST_OBJECT_UNLOCK (adder); return; } clock_error: { gst_pad_pause_task (adder->srcpad); GST_OBJECT_UNLOCK (adder); GST_ELEMENT_ERROR (adder, STREAM, MUX, ("Error with the clock"), ("Error with the clock: %d", 
ret)); GST_ERROR_OBJECT (adder, "Error with the clock: %d", ret); return; } no_clock: { gst_pad_pause_task (adder->srcpad); GST_OBJECT_UNLOCK (adder); GST_ELEMENT_ERROR (adder, STREAM, MUX, ("No available clock"), ("No available clock")); GST_ERROR_OBJECT (adder, "No available clock"); return; } pause: { const gchar *reason = gst_flow_get_name (result); GST_DEBUG_OBJECT (adder, "pausing task, reason %s", reason); GST_OBJECT_LOCK (adder); /* store result */ adder->srcresult = result; /* we don't post errors or anything because upstream will do that for us * when we pass the return value upstream. */ gst_pad_pause_task (adder->srcpad); GST_OBJECT_UNLOCK (adder); return; } eos: { /* store result, we are flushing now */ GST_DEBUG_OBJECT (adder, "We are EOS, pushing EOS downstream"); adder->srcresult = GST_FLOW_UNEXPECTED; gst_pad_pause_task (adder->srcpad); GST_OBJECT_UNLOCK (adder); gst_pad_push_event (adder->srcpad, gst_event_new_eos ()); return; } } static GstPad * gst_live_adder_request_new_pad (GstElement * element, GstPadTemplate * templ, const gchar * unused) { gchar *name; GstLiveAdder *adder; GstPad *newpad; gint padcount; GstLiveAdderPadPrivate *padprivate = NULL; if (templ->direction != GST_PAD_SINK) goto not_sink; adder = GST_LIVE_ADDER (element); /* increment pad counter */ padcount = g_atomic_int_exchange_and_add (&adder->padcount, 1); name = g_strdup_printf ("sink%d", padcount); newpad = gst_pad_new_from_template (templ, name); GST_DEBUG_OBJECT (adder, "request new pad %s", name); g_free (name); gst_pad_set_getcaps_function (newpad, GST_DEBUG_FUNCPTR (gst_live_adder_sink_getcaps)); gst_pad_set_setcaps_function (newpad, GST_DEBUG_FUNCPTR (gst_live_adder_setcaps)); gst_pad_set_event_function (newpad, GST_DEBUG_FUNCPTR (gst_live_adder_sink_event)); padprivate = g_new0 (GstLiveAdderPadPrivate, 1); gst_segment_init (&padprivate->segment, GST_FORMAT_UNDEFINED); padprivate->eos = FALSE; padprivate->expected_timestamp = GST_CLOCK_TIME_NONE; 
gst_pad_set_element_private (newpad, padprivate); gst_pad_set_chain_function (newpad, gst_live_live_adder_chain); if (!gst_pad_set_active (newpad, TRUE)) goto could_not_activate; /* takes ownership of the pad */ if (!gst_element_add_pad (GST_ELEMENT (adder), newpad)) goto could_not_add; GST_OBJECT_LOCK (adder); adder->sinkpads = g_list_prepend (adder->sinkpads, newpad); GST_OBJECT_UNLOCK (adder); return newpad; /* errors */ not_sink: { g_warning ("gstadder: request new pad that is not a SINK pad\n"); return NULL; } could_not_add: { GST_DEBUG_OBJECT (adder, "could not add pad"); g_free (padprivate); gst_object_unref (newpad); return NULL; } could_not_activate: { GST_DEBUG_OBJECT (adder, "could not activate new pad"); g_free (padprivate); gst_object_unref (newpad); return NULL; } } static void gst_live_adder_release_pad (GstElement * element, GstPad * pad) { GstLiveAdder *adder; GstLiveAdderPadPrivate *padprivate; adder = GST_LIVE_ADDER (element); GST_DEBUG_OBJECT (adder, "release pad %s:%s", GST_DEBUG_PAD_NAME (pad)); GST_OBJECT_LOCK (element); padprivate = gst_pad_get_element_private (pad); gst_pad_set_element_private (pad, NULL); adder->sinkpads = g_list_remove_all (adder->sinkpads, pad); GST_OBJECT_UNLOCK (element); g_free (padprivate); gst_element_remove_pad (element, pad); } static void reset_pad_private (GstPad *pad) { GstLiveAdderPadPrivate *padprivate; padprivate = gst_pad_get_element_private (pad); if (!padprivate) return; gst_segment_init (&padprivate->segment, GST_FORMAT_UNDEFINED); padprivate->expected_timestamp = GST_CLOCK_TIME_NONE; padprivate->eos = FALSE; } static GstStateChangeReturn gst_live_adder_change_state (GstElement * element, GstStateChange transition) { GstLiveAdder *adder; GstStateChangeReturn ret; adder = GST_LIVE_ADDER (element); switch (transition) { case GST_STATE_CHANGE_READY_TO_PAUSED: GST_OBJECT_LOCK (adder); adder->segment_pending = TRUE; adder->peer_latency = 0; adder->next_timestamp = GST_CLOCK_TIME_NONE; g_list_foreach 
(adder->sinkpads, (GFunc) reset_pad_private, NULL); GST_OBJECT_UNLOCK (adder); break; case GST_STATE_CHANGE_PLAYING_TO_PAUSED: GST_OBJECT_LOCK (adder); adder->playing = FALSE; GST_OBJECT_UNLOCK (adder); break; default: break; } ret = GST_ELEMENT_CLASS (parent_class)->change_state (element, transition); switch (transition) { case GST_STATE_CHANGE_PAUSED_TO_PLAYING: GST_OBJECT_LOCK (adder); adder->playing = TRUE; GST_OBJECT_UNLOCK (adder); break; default: break; } return ret; } static gboolean plugin_init (GstPlugin * plugin) { if (!gst_element_register (plugin, "liveadder", GST_RANK_NONE, GST_TYPE_LIVE_ADDER)) { return FALSE; } return TRUE; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, "liveadder", "Adds multiple live discontinuous streams", plugin_init, VERSION, "LGPL", "Farsight", "http://farsight.sf.net") psimedia-master/gstprovider/gstelements/liveadder/liveadder.h000066400000000000000000000060771220046403000251140ustar00rootroot00000000000000/* * Farsight Voice+Video library * * Copyright 2008 Collabora Ltd * Copyright 2008 Nokia Corporation * @author: Olivier Crete * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. 
 *
 */

#ifndef __GST_LIVE_ADDER_H__
#define __GST_LIVE_ADDER_H__

#include <gst/gst.h>

G_BEGIN_DECLS

#define GST_TYPE_LIVE_ADDER            (gst_live_adder_get_type())
#define GST_LIVE_ADDER(obj)            (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_LIVE_ADDER,GstLiveAdder))
#define GST_IS_LIVE_ADDER(obj)         (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_LIVE_ADDER))
#define GST_LIVE_ADDER_CLASS(klass)    (G_TYPE_CHECK_CLASS_CAST((klass) ,GST_TYPE_LIVE_ADDER,GstLiveAdderClass))
#define GST_IS_LIVE_ADDER_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE((klass) ,GST_TYPE_LIVE_ADDER))
#define GST_LIVE_ADDER_GET_CLASS(obj)  (G_TYPE_INSTANCE_GET_CLASS((obj) ,GST_TYPE_LIVE_ADDER,GstLiveAdderClass))

typedef struct _GstLiveAdder GstLiveAdder;
typedef struct _GstLiveAdderClass GstLiveAdderClass;

typedef enum {
  GST_LIVE_ADDER_FORMAT_UNSET,
  GST_LIVE_ADDER_FORMAT_INT,
  GST_LIVE_ADDER_FORMAT_FLOAT
} GstLiveAdderFormat;

typedef void (*GstLiveAdderFunction) (gpointer out, gpointer in, guint size);

/**
 * GstLiveAdder:
 *
 * The adder object structure.
 */
struct _GstLiveAdder
{
  GstElement element;

  GstPad *srcpad;

  /* pad counter, used for creating unique request pads */
  gint padcount;
  GList *sinkpads;

  GstFlowReturn srcresult;
  GstClockID clock_id;

  /* the queue is ordered head to tail */
  GQueue *buffers;
  GCond *not_empty_cond;

  GstClockTime next_timestamp;

  /* the next are valid for both int and float */
  GstLiveAdderFormat format;
  gint rate;
  gint channels;
  gint width;
  gint endianness;

  /* the next are valid only for format == GST_LIVE_ADDER_FORMAT_INT */
  gint depth;
  gboolean is_signed;

  /* number of bytes per sample, actually width/8 * channels */
  gint bps;

  /* function to add samples */
  GstLiveAdderFunction func;

  GstClockTime latency_ms;
  GstClockTime peer_latency;

  gboolean segment_pending;
  gboolean playing;
};

struct _GstLiveAdderClass
{
  GstElementClass parent_class;
};

GType gst_live_adder_get_type (void);

G_END_DECLS

#endif /* __GST_LIVE_ADDER_H__ */
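The `func` field above holds a `GstLiveAdderFunction` that mixes one incoming buffer into the output buffer in place. The actual implementations are elsewhere in this element; the following is only an illustrative, self-contained sketch (names `add_int16_saturate`, `o`, `i` are ours, not the element's) of what such a mixing function does for signed 16-bit integer samples: add sample-by-sample and clamp to the 16-bit range rather than letting the sum wrap around.

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Illustrative sketch of a GstLiveAdderFunction-style mixer (not the
 * element's actual code): mix the samples of "in" into "out", saturating
 * at the 16-bit limits instead of wrapping on overflow. "size" is in
 * bytes, matching the guint size parameter of GstLiveAdderFunction. */
static void
add_int16_saturate (void *out, void *in, unsigned int size)
{
  int16_t *o = out;
  int16_t *i = in;
  size_t n = size / sizeof (int16_t);

  for (size_t k = 0; k < n; k++) {
    /* widen to 32 bits so the intermediate sum cannot overflow */
    int32_t sum = (int32_t) o[k] + (int32_t) i[k];
    if (sum > INT16_MAX)
      sum = INT16_MAX;
    else if (sum < INT16_MIN)
      sum = INT16_MIN;
    o[k] = (int16_t) sum;
  }
}
```

Saturation (rather than wraparound) is the usual choice for audio mixing because a clipped peak is far less audible than the full-scale discontinuity modular overflow would produce.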
psimedia-master/gstprovider/gstelements/osxaudio.pri

HEADERS += \
    $$PWD/osxaudio/gstosxaudiosink.h \
    $$PWD/osxaudio/gstosxaudioelement.h \
    $$PWD/osxaudio/gstosxringbuffer.h \
    $$PWD/osxaudio/gstosxaudiosrc.h

SOURCES += \
    $$PWD/osxaudio/gstosxringbuffer.c \
    $$PWD/osxaudio/gstosxaudioelement.c \
    $$PWD/osxaudio/gstosxaudiosink.c \
    $$PWD/osxaudio/gstosxaudiosrc.c

gstplugin:SOURCES += $$PWD/osxaudio/gstosxaudio.c
!gstplugin:SOURCES += $$PWD/static/osxaudio_static.c

LIBS *= \
    -lgstinterfaces-0.10 \
    -lgstaudio-0.10

LIBS += \
    -framework CoreAudio \
    -framework AudioUnit \
    -framework Carbon

psimedia-master/gstprovider/gstelements/osxaudio/Makefile.am

plugin_LTLIBRARIES = libgstosxaudio.la

libgstosxaudio_la_SOURCES = gstosxringbuffer.c \
                            gstosxaudioelement.c \
                            gstosxaudiosink.c \
                            gstosxaudiosrc.c \
                            gstosxaudio.c

libgstosxaudio_la_CFLAGS = $(GST_PLUGINS_BASE_CFLAGS) $(GST_CFLAGS) \
                           -Wno-deprecated-declarations

libgstosxaudio_la_LIBADD = \
    -lgstinterfaces-@GST_MAJORMINOR@ \
    -lgstaudio-@GST_MAJORMINOR@ \
    $(GST_PLUGINS_BASE_LIBS) \
    $(GST_BASE_LIBS) \
    $(GST_LIBS)

libgstosxaudio_la_LDFLAGS = $(GST_PLUGIN_LDFLAGS) -Wl,-framework -Wl,CoreAudio -Wl,-framework -Wl,AudioUnit -Wl,-framework -Wl,CoreServices

libgstosxaudio_la_LIBTOOLFLAGS = --tag=disable-static

noinst_HEADERS = gstosxaudiosink.h \
                 gstosxaudioelement.h \
                 gstosxringbuffer.h \
                 gstosxaudiosrc.h

psimedia-master/gstprovider/gstelements/osxaudio/gstosxaudio.c

/*
 * GStreamer
 * Copyright (C) 1999 Erik Walthinsen
 * Copyright (C) 2007,2008 Pioneers of the Inevitable
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU
Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. * * The development of this code was made possible due to the involvement of * Pioneers of the Inevitable, the creators of the Songbird Music player * */ #ifdef HAVE_CONFIG_H #include "config.h" #endif #include "gstosxaudioelement.h" #include "gstosxaudiosink.h" #include "gstosxaudiosrc.h" static gboolean plugin_init (GstPlugin * plugin) { if (!gst_element_register (plugin, "osxaudiosink", GST_RANK_PRIMARY, GST_TYPE_OSX_AUDIO_SINK)) { return FALSE; } if (!gst_element_register (plugin, "osxaudiosrc", GST_RANK_PRIMARY, GST_TYPE_OSX_AUDIO_SRC)) { return FALSE; } return TRUE; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, "osxaudio", "OSX (Mac OS X) audio support for GStreamer", plugin_init, VERSION, GST_LICENSE, GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN) psimedia-master/gstprovider/gstelements/osxaudio/gstosxaudioelement.c000066400000000000000000000063431220046403000267630ustar00rootroot00000000000000/* * GStreamer * Copyright (C) 2006 Zaheer Abbas Merali * Copyright (C) 2007 Pioneers of the Inevitable * * Permission is hereby granted, free of charge, to any person obtaining a * copy of this software and associated documentation files (the "Software"), * to deal in the Software without restriction, including without limitation * the rights to use, copy, modify, merge, publish, distribute, sublicense, * and/or sell copies of the Software, and to 
permit persons to whom the * Software is furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER * DEALINGS IN THE SOFTWARE. * * Alternatively, the contents of this file may be used under the * GNU Lesser General Public License Version 2.1 (the "LGPL"), in * which case the following provisions apply instead of the ones * mentioned above: * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. 
 *
 * The development of this code was made possible due to the involvement of
 * Pioneers of the Inevitable, the creators of the Songbird Music player
 *
 */

#include <gst/gst.h>

#include "gstosxaudioelement.h"

static void gst_osx_audio_element_class_init (GstOsxAudioElementInterface *
    klass);

GType
gst_osx_audio_element_get_type ()
{
  static GType gst_osxaudioelement_type = 0;

  if (!gst_osxaudioelement_type) {
    static const GTypeInfo gst_osxaudioelement_info = {
      sizeof (GstOsxAudioElementInterface),
      (GBaseInitFunc) gst_osx_audio_element_class_init,
      NULL,
      NULL,
      NULL,
      NULL,
      0,
      0,
      NULL,
      NULL
    };

    gst_osxaudioelement_type = g_type_register_static (G_TYPE_INTERFACE,
        "GstOsxAudioElement", &gst_osxaudioelement_info, 0);
  }
  return gst_osxaudioelement_type;
}

static void
gst_osx_audio_element_class_init (GstOsxAudioElementInterface * klass)
{
  static gboolean initialized = FALSE;

  if (!initialized) {
    initialized = TRUE;
  }

  /* default virtual functions */
  klass->io_proc = NULL;
}

psimedia-master/gstprovider/gstelements/osxaudio/gstosxaudioelement.h

/*
 * GStreamer
 * Copyright (C) 2006 Zaheer Abbas Merali
 * Copyright (C) 2007 Pioneers of the Inevitable
 *
 * Permission is hereby granted, free of charge, to any person obtaining a
 * copy of this software and associated documentation files (the "Software"),
 * to deal in the Software without restriction, including without limitation
 * the rights to use, copy, modify, merge, publish, distribute, sublicense,
 * and/or sell copies of the Software, and to permit persons to whom the
 * Software is furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
* * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER * DEALINGS IN THE SOFTWARE. * * Alternatively, the contents of this file may be used under the * GNU Lesser General Public License Version 2.1 (the "LGPL"), in * which case the following provisions apply instead of the ones * mentioned above: * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. 
 *
 * The development of this code was made possible due to the involvement of
 * Pioneers of the Inevitable, the creators of the Songbird Music player
 *
 */

#ifndef __GST_OSX_AUDIO_ELEMENT_H__
#define __GST_OSX_AUDIO_ELEMENT_H__

#include <gst/gst.h>
#include <CoreAudio/CoreAudio.h>
#include <AudioUnit/AudioUnit.h>

G_BEGIN_DECLS

#define GST_OSX_AUDIO_ELEMENT_TYPE \
  (gst_osx_audio_element_get_type())
#define GST_OSX_AUDIO_ELEMENT(obj) \
  (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_OSX_AUDIO_ELEMENT_TYPE,GstOsxAudioElementInterface))
#define GST_IS_OSX_AUDIO_ELEMENT(obj) \
  (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_OSX_AUDIO_ELEMENT_TYPE))
#define GST_OSX_AUDIO_ELEMENT_GET_INTERFACE(inst) \
  (G_TYPE_INSTANCE_GET_INTERFACE((inst),GST_OSX_AUDIO_ELEMENT_TYPE,GstOsxAudioElementInterface))

typedef struct _GstOsxAudioElementInterface GstOsxAudioElementInterface;

struct _GstOsxAudioElementInterface
{
  GTypeInterface parent;

  OSStatus (*io_proc) (void * userdata,
      AudioUnitRenderActionFlags * ioActionFlags,
      const AudioTimeStamp * inTimeStamp,
      UInt32 inBusNumber, UInt32 inNumberFrames, AudioBufferList * bufferList);
};

GType gst_osx_audio_element_get_type (void);

G_END_DECLS

#endif /* __GST_OSX_AUDIO_ELEMENT_H__ */

psimedia-master/gstprovider/gstelements/osxaudio/gstosxaudiosink.c

/*
 * GStreamer
 * Copyright (C) 2005,2006 Zaheer Abbas Merali
 * Copyright (C) 2007,2008 Pioneers of the Inevitable
 *
 * Permission is hereby granted, free of charge, to any person obtaining a
 * copy of this software and associated documentation files (the "Software"),
 * to deal in the Software without restriction, including without limitation
 * the rights to use, copy, modify, merge, publish, distribute, sublicense,
 * and/or sell copies of the Software, and to permit persons to whom the
 * Software is furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
* * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER * DEALINGS IN THE SOFTWARE. * * Alternatively, the contents of this file may be used under the * GNU Lesser General Public License Version 2.1 (the "LGPL"), in * which case the following provisions apply instead of the ones * mentioned above: * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. * * The development of this code was made possible due to the involvement of * Pioneers of the Inevitable, the creators of the Songbird Music player * */ /** * SECTION:element-osxaudiosink * * This element renders raw audio samples using the CoreAudio api. * * * Example pipelines * |[ * gst-launch filesrc location=sine.ogg ! oggdemux ! vorbisdec ! audioconvert ! audioresample ! osxaudiosink * ]| Play an Ogg/Vorbis file. 
* * * Last reviewed on 2006-03-01 (0.10.4) */ #ifdef HAVE_CONFIG_H # include #endif #include #include #include #include "gstosxaudiosink.h" #include "gstosxaudioelement.h" GST_DEBUG_CATEGORY_STATIC (osx_audiosink_debug); #define GST_CAT_DEFAULT osx_audiosink_debug static GstElementDetails gst_osx_audio_sink_details = GST_ELEMENT_DETAILS ("Audio Sink (OSX)", "Sink/Audio", "Output to a sound card in OS X", "Zaheer Abbas Merali "); /* Filter signals and args */ enum { /* FILL ME */ LAST_SIGNAL }; enum { ARG_0, ARG_DEVICE, ARG_VOLUME }; #define DEFAULT_VOLUME 1.0 static GstStaticPadTemplate sink_factory = GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, GST_STATIC_CAPS ("audio/x-raw-float, " "endianness = (int) {" G_STRINGIFY (G_BYTE_ORDER) " }, " "signed = (boolean) { TRUE }, " "width = (int) 32, " "depth = (int) 32, " "rate = (int) [1, MAX], " "channels = (int) [1, MAX]") ); static void gst_osx_audio_sink_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); static void gst_osx_audio_sink_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec); static GstRingBuffer *gst_osx_audio_sink_create_ringbuffer (GstBaseAudioSink * sink); static void gst_osx_audio_sink_osxelement_init (gpointer g_iface, gpointer iface_data); static void gst_osx_audio_sink_select_device (GstOsxAudioSink * osxsink); static void gst_osx_audio_sink_set_volume (GstOsxAudioSink * sink); static OSStatus gst_osx_audio_sink_io_proc (GstOsxRingBuffer * buf, AudioUnitRenderActionFlags * ioActionFlags, const AudioTimeStamp * inTimeStamp, UInt32 inBusNumber, UInt32 inNumberFrames, AudioBufferList * bufferList); static void gst_osx_audio_sink_do_init (GType type) { static const GInterfaceInfo osxelement_info = { gst_osx_audio_sink_osxelement_init, NULL, NULL }; GST_DEBUG_CATEGORY_INIT (osx_audiosink_debug, "osxaudiosink", 0, "OSX Audio Sink"); GST_DEBUG ("Adding static interface"); g_type_add_interface_static (type, 
GST_OSX_AUDIO_ELEMENT_TYPE, &osxelement_info); } GST_BOILERPLATE_FULL (GstOsxAudioSink, gst_osx_audio_sink, GstBaseAudioSink, GST_TYPE_BASE_AUDIO_SINK, gst_osx_audio_sink_do_init); static void gst_osx_audio_sink_base_init (gpointer g_class) { GstElementClass *element_class = GST_ELEMENT_CLASS (g_class); gst_element_class_add_pad_template (element_class, gst_static_pad_template_get (&sink_factory)); gst_element_class_set_details (element_class, &gst_osx_audio_sink_details); } static void gst_osx_audio_sink_class_init (GstOsxAudioSinkClass * klass) { GObjectClass *gobject_class; GstElementClass *gstelement_class; GstBaseSinkClass *gstbasesink_class; GstBaseAudioSinkClass *gstbaseaudiosink_class; gobject_class = (GObjectClass *) klass; gstelement_class = (GstElementClass *) klass; gstbasesink_class = (GstBaseSinkClass *) klass; gstbaseaudiosink_class = (GstBaseAudioSinkClass *) klass; parent_class = g_type_class_peek_parent (klass); gobject_class->set_property = GST_DEBUG_FUNCPTR (gst_osx_audio_sink_set_property); gobject_class->get_property = GST_DEBUG_FUNCPTR (gst_osx_audio_sink_get_property); g_object_class_install_property (gobject_class, ARG_DEVICE, g_param_spec_int ("device", "Device ID", "Device ID of output device", 0, G_MAXINT, 0, G_PARAM_READWRITE)); g_object_class_install_property (gobject_class, ARG_VOLUME, g_param_spec_double ("volume", "Volume", "Volume of this stream", 0, 1.0, 1.0, G_PARAM_READWRITE)); gstbaseaudiosink_class->create_ringbuffer = GST_DEBUG_FUNCPTR (gst_osx_audio_sink_create_ringbuffer); } static void gst_osx_audio_sink_init (GstOsxAudioSink * sink, GstOsxAudioSinkClass * gclass) { GST_DEBUG ("Initialising object"); sink->device_id = kAudioDeviceUnknown; sink->volume = DEFAULT_VOLUME; } static void gst_osx_audio_sink_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) { GstOsxAudioSink *sink = GST_OSX_AUDIO_SINK (object); switch (prop_id) { case ARG_DEVICE: sink->device_id = g_value_get_int (value); 
break; case ARG_VOLUME: sink->volume = g_value_get_double (value); gst_osx_audio_sink_set_volume (sink); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } static void gst_osx_audio_sink_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec) { GstOsxAudioSink *sink = GST_OSX_AUDIO_SINK (object); switch (prop_id) { case ARG_DEVICE: g_value_set_int (value, sink->device_id); break; case ARG_VOLUME: g_value_set_double (value, sink->volume); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } static GstRingBuffer * gst_osx_audio_sink_create_ringbuffer (GstBaseAudioSink * sink) { GstOsxAudioSink *osxsink; GstOsxRingBuffer *ringbuffer; osxsink = GST_OSX_AUDIO_SINK (sink); gst_osx_audio_sink_select_device (osxsink); GST_DEBUG ("Creating ringbuffer"); ringbuffer = g_object_new (GST_TYPE_OSX_RING_BUFFER, NULL); GST_DEBUG ("osx sink 0x%p element 0x%p ioproc 0x%p", osxsink, GST_OSX_AUDIO_ELEMENT_GET_INTERFACE (osxsink), (void *) gst_osx_audio_sink_io_proc); gst_osx_audio_sink_set_volume (osxsink); ringbuffer->element = GST_OSX_AUDIO_ELEMENT_GET_INTERFACE (osxsink); ringbuffer->device_id = osxsink->device_id; return GST_RING_BUFFER (ringbuffer); } /* HALOutput AudioUnit will request fairly arbitrarily-sized chunks of data, * not of a fixed size. 
So, we keep track of where in the current ringbuffer * segment we are, and only advance the segment once we've read the whole * thing */ static OSStatus gst_osx_audio_sink_io_proc (GstOsxRingBuffer * buf, AudioUnitRenderActionFlags * ioActionFlags, const AudioTimeStamp * inTimeStamp, UInt32 inBusNumber, UInt32 inNumberFrames, AudioBufferList * bufferList) { guint8 *readptr; gint readseg; gint len; gint remaining = bufferList->mBuffers[0].mDataByteSize; gint offset = 0; while (remaining) { if (!gst_ring_buffer_prepare_read (GST_RING_BUFFER (buf), &readseg, &readptr, &len)) return 0; len -= buf->segoffset; if (len > remaining) len = remaining; memcpy ((char *) bufferList->mBuffers[0].mData + offset, readptr + buf->segoffset, len); buf->segoffset += len; offset += len; remaining -= len; if ((gint) buf->segoffset == GST_RING_BUFFER (buf)->spec.segsize) { /* clear written samples */ gst_ring_buffer_clear (GST_RING_BUFFER (buf), readseg); /* we wrote one segment */ gst_ring_buffer_advance (GST_RING_BUFFER (buf), 1); buf->segoffset = 0; } } return 0; } static void gst_osx_audio_sink_osxelement_init (gpointer g_iface, gpointer iface_data) { GstOsxAudioElementInterface *iface = (GstOsxAudioElementInterface *) g_iface; iface->io_proc = (AURenderCallback) gst_osx_audio_sink_io_proc; } static void gst_osx_audio_sink_set_volume (GstOsxAudioSink * sink) { if (!sink->audiounit) return; AudioUnitSetParameter (sink->audiounit, kHALOutputParam_Volume, kAudioUnitScope_Global, 0, (float) sink->volume, 0); } static void gst_osx_audio_sink_select_device (GstOsxAudioSink * osxsink) { OSStatus status; UInt32 propertySize; if (osxsink->device_id == kAudioDeviceUnknown) { /* If no specific device has been selected by the user, then pick the * default device */ GST_DEBUG_OBJECT (osxsink, "Selecting device for OSXAudioSink"); propertySize = sizeof (osxsink->device_id); status = AudioHardwareGetProperty (kAudioHardwarePropertyDefaultOutputDevice, &propertySize, &osxsink->device_id); if 
(status) { GST_WARNING_OBJECT (osxsink, "AudioHardwareGetProperty returned %d", (int) status); } else { GST_DEBUG_OBJECT (osxsink, "AudioHardwareGetProperty returned 0"); } if (osxsink->device_id == kAudioDeviceUnknown) { GST_WARNING_OBJECT (osxsink, "AudioHardwareGetProperty: device_id is kAudioDeviceUnknown"); } GST_DEBUG_OBJECT (osxsink, "AudioHardwareGetProperty: device_id is %lu", (long) osxsink->device_id); } } psimedia-master/gstprovider/gstelements/osxaudio/gstosxaudiosink.h000066400000000000000000000062601220046403000263010ustar00rootroot00000000000000/* * GStreamer * Copyright (C) 2005-2006 Zaheer Abbas Merali * Copyright (C) 2007 Pioneers of the Inevitable * * Permission is hereby granted, free of charge, to any person obtaining a * copy of this software and associated documentation files (the "Software"), * to deal in the Software without restriction, including without limitation * the rights to use, copy, modify, merge, publish, distribute, sublicense, * and/or sell copies of the Software, and to permit persons to whom the * Software is furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER * DEALINGS IN THE SOFTWARE. 
* * Alternatively, the contents of this file may be used under the * GNU Lesser General Public License Version 2.1 (the "LGPL"), in * which case the following provisions apply instead of the ones * mentioned above: * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. * * The development of this code was made possible due to the involvement of * Pioneers of the Inevitable, the creators of the Songbird Music player * */ #ifndef __GST_OSXAUDIOSINK_H__ #define __GST_OSXAUDIOSINK_H__ #include #include #include "gstosxringbuffer.h" G_BEGIN_DECLS #define GST_TYPE_OSX_AUDIO_SINK \ (gst_osx_audio_sink_get_type()) #define GST_OSX_AUDIO_SINK(obj) \ (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_OSX_AUDIO_SINK,GstOsxAudioSink)) #define GST_OSX_AUDIO_SINK_CLASS(klass) \ (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_OSX_AUDIO_SINK,GstOsxAudioSinkClass)) typedef struct _GstOsxAudioSink GstOsxAudioSink; typedef struct _GstOsxAudioSinkClass GstOsxAudioSinkClass; struct _GstOsxAudioSink { GstBaseAudioSink sink; AudioDeviceID device_id; AudioUnit audiounit; double volume; }; struct _GstOsxAudioSinkClass { GstBaseAudioSinkClass parent_class; }; GType gst_osx_audio_sink_get_type (void); G_END_DECLS #endif /* __GST_OSXAUDIOSINK_H__ */ 
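As the comment above `gst_osx_audio_sink_io_proc` in gstosxaudiosink.c explains, the HALOutput AudioUnit requests arbitrarily sized chunks while the ring buffer deals in fixed-size segments, so the sink keeps a per-buffer `segoffset` recording how far into the current segment it has already read. The following is a self-contained sketch of that chunking scheme, independent of GStreamer and CoreAudio (the `ring_reader`/`ring_read` names and the 8-byte segment size are ours, chosen only for illustration):

```c
#include <assert.h>
#include <string.h>

#define SEGSIZE 8               /* illustrative fixed segment size */

/* Sketch of the state io_proc maintains: which segment is being read
 * and how many bytes of it have already been consumed. */
typedef struct {
  const char *segments;         /* flat array of nsegs * SEGSIZE bytes */
  int nsegs;
  int current;                  /* index of the segment being read */
  int segoffset;                /* bytes of the current segment consumed */
} ring_reader;

/* Fill "dst" with up to "want" bytes; returns the bytes actually copied.
 * Mirrors the io_proc loop: clamp each copy to what is left of the
 * current segment, and advance to the next segment only once the whole
 * segment has been consumed. */
static int
ring_read (ring_reader * r, char *dst, int want)
{
  int copied = 0;

  while (want > 0 && r->current < r->nsegs) {
    int len = SEGSIZE - r->segoffset;
    if (len > want)
      len = want;
    memcpy (dst + copied,
        r->segments + r->current * SEGSIZE + r->segoffset, len);
    r->segoffset += len;
    copied += len;
    want -= len;
    if (r->segoffset == SEGSIZE) {      /* segment fully consumed: advance */
      r->current++;
      r->segoffset = 0;
    }
  }
  return copied;
}
```

The same pattern, with the copy direction reversed, serves the capture side in gstosxaudiosrc.c, which spills rendered input frames into ring-buffer segments.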
psimedia-master/gstprovider/gstelements/osxaudio/gstosxaudiosrc.c000066400000000000000000000257761220046403000261340ustar00rootroot00000000000000/* * GStreamer * Copyright (C) 2005,2006 Zaheer Abbas Merali * Copyright (C) 2008 Pioneers of the Inevitable * * Permission is hereby granted, free of charge, to any person obtaining a * copy of this software and associated documentation files (the "Software"), * to deal in the Software without restriction, including without limitation * the rights to use, copy, modify, merge, publish, distribute, sublicense, * and/or sell copies of the Software, and to permit persons to whom the * Software is furnished to do so, subject to the following conditions: * * The above copyright notice and this permission notice shall be included in * all copies or substantial portions of the Software. * * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER * DEALINGS IN THE SOFTWARE. * * Alternatively, the contents of this file may be used under the * GNU Lesser General Public License Version 2.1 (the "LGPL"), in * which case the following provisions apply instead of the ones * mentioned above: * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. */ /** * SECTION:element-osxaudiosrc * * This element captures raw audio samples using the CoreAudio api. * * * Example launch line * |[ * gst-launch osxaudiosrc ! wavenc ! filesink location=audio.wav * ]| * */ #ifdef HAVE_CONFIG_H # include #endif #include #include #include #include "gstosxaudiosrc.h" #include "gstosxaudioelement.h" GST_DEBUG_CATEGORY_STATIC (osx_audiosrc_debug); #define GST_CAT_DEFAULT osx_audiosrc_debug static GstElementDetails gst_osx_audio_src_details = GST_ELEMENT_DETAILS ("Audio Source (OSX)", "Source/Audio", "Input from a sound card in OS X", "Zaheer Abbas Merali "); /* Filter signals and args */ enum { /* FILL ME */ LAST_SIGNAL }; enum { ARG_0, ARG_DEVICE }; static GstStaticPadTemplate src_factory = GST_STATIC_PAD_TEMPLATE ("src", GST_PAD_SRC, GST_PAD_ALWAYS, GST_STATIC_CAPS ("audio/x-raw-float, " "endianness = (int) {" G_STRINGIFY (G_BYTE_ORDER) " }, " "signed = (boolean) { TRUE }, " "width = (int) 32, " "depth = (int) 32, " "rate = (int) [1, MAX], " "channels = (int) [1, MAX]") ); static void gst_osx_audio_src_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); static void gst_osx_audio_src_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec); static GstCaps *gst_osx_audio_src_get_caps (GstBaseSrc * src); static GstRingBuffer *gst_osx_audio_src_create_ringbuffer (GstBaseAudioSrc * src); static void gst_osx_audio_src_osxelement_init (gpointer g_iface, gpointer iface_data); static OSStatus gst_osx_audio_src_io_proc (GstOsxRingBuffer * buf, AudioUnitRenderActionFlags * ioActionFlags, const AudioTimeStamp * inTimeStamp, UInt32 inBusNumber, UInt32 inNumberFrames, AudioBufferList * bufferList); 
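Further down in this file, `gst_osx_audio_src_get_caps` turns the probed `deviceChannels` into a channel range for the advertised caps: -1 means the device has not been probed yet (template caps are returned), 0 is treated as a single channel, and otherwise the caps span [1, deviceChannels]. As an aside, that clamping rule can be sketched on its own, without GStreamer (the `clamp_channels` helper name and return convention are ours, not the element's API):

```c
#include <assert.h>

/* Illustrative sketch of the channel-clamping rule used by
 * gst_osx_audio_src_get_caps. Returns 0 when the channel count is
 * still unknown (caller should fall back to template caps), else
 * writes the [*min, *max] range to advertise and returns 1. */
static int
clamp_channels (int deviceChannels, int *min, int *max)
{
  if (deviceChannels == -1)
    return 0;                   /* not probed yet: use template caps */

  *max = deviceChannels < 1 ? 1 : deviceChannels;
  *min = 1;                     /* MIN (1, *max) with *max >= 1 is always 1 */
  return 1;
}
```

When `*min == *max` the element sets a fixed integer "channels" field on the caps; otherwise it sets an integer range, matching the two branches at the end of `gst_osx_audio_src_get_caps`.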
static void gst_osx_audio_src_select_device (GstOsxAudioSrc * osxsrc); static void gst_osx_audio_src_do_init (GType type) { static const GInterfaceInfo osxelement_info = { gst_osx_audio_src_osxelement_init, NULL, NULL }; GST_DEBUG_CATEGORY_INIT (osx_audiosrc_debug, "osxaudiosrc", 0, "OSX Audio Src"); GST_DEBUG ("Adding static interface"); g_type_add_interface_static (type, GST_OSX_AUDIO_ELEMENT_TYPE, &osxelement_info); } GST_BOILERPLATE_FULL (GstOsxAudioSrc, gst_osx_audio_src, GstBaseAudioSrc, GST_TYPE_BASE_AUDIO_SRC, gst_osx_audio_src_do_init); static void gst_osx_audio_src_base_init (gpointer g_class) { GstElementClass *element_class = GST_ELEMENT_CLASS (g_class); gst_element_class_add_pad_template (element_class, gst_static_pad_template_get (&src_factory)); gst_element_class_set_details (element_class, &gst_osx_audio_src_details); } static void gst_osx_audio_src_class_init (GstOsxAudioSrcClass * klass) { GObjectClass *gobject_class; GstElementClass *gstelement_class; GstBaseSrcClass *gstbasesrc_class; GstBaseAudioSrcClass *gstbaseaudiosrc_class; gobject_class = (GObjectClass *) klass; gstelement_class = (GstElementClass *) klass; gstbasesrc_class = (GstBaseSrcClass *) klass; gstbaseaudiosrc_class = (GstBaseAudioSrcClass *) klass; parent_class = g_type_class_peek_parent (klass); gobject_class->set_property = GST_DEBUG_FUNCPTR (gst_osx_audio_src_set_property); gobject_class->get_property = GST_DEBUG_FUNCPTR (gst_osx_audio_src_get_property); gstbasesrc_class->get_caps = GST_DEBUG_FUNCPTR (gst_osx_audio_src_get_caps); g_object_class_install_property (gobject_class, ARG_DEVICE, g_param_spec_int ("device", "Device ID", "Device ID of input device", 0, G_MAXINT, 0, G_PARAM_READWRITE)); gstbaseaudiosrc_class->create_ringbuffer = GST_DEBUG_FUNCPTR (gst_osx_audio_src_create_ringbuffer); } static void gst_osx_audio_src_init (GstOsxAudioSrc * src, GstOsxAudioSrcClass * gclass) { gst_base_src_set_live (GST_BASE_SRC (src), TRUE); src->device_id = kAudioDeviceUnknown; 
  src->deviceChannels = -1;
}

static void
gst_osx_audio_src_set_property (GObject * object, guint prop_id,
    const GValue * value, GParamSpec * pspec)
{
  GstOsxAudioSrc *src = GST_OSX_AUDIO_SRC (object);

  switch (prop_id) {
    case ARG_DEVICE:
      src->device_id = g_value_get_int (value);
      break;
    default:
      G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
      break;
  }
}

static void
gst_osx_audio_src_get_property (GObject * object, guint prop_id,
    GValue * value, GParamSpec * pspec)
{
  GstOsxAudioSrc *src = GST_OSX_AUDIO_SRC (object);

  switch (prop_id) {
    case ARG_DEVICE:
      g_value_set_int (value, src->device_id);
      break;
    default:
      G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
      break;
  }
}

static GstCaps *
gst_osx_audio_src_get_caps (GstBaseSrc * src)
{
  GstElementClass *gstelement_class;
  GstOsxAudioSrc *osxsrc;
  GstPadTemplate *pad_template;
  GstCaps *caps;
  gint min, max;

  gstelement_class = GST_ELEMENT_GET_CLASS (src);
  osxsrc = GST_OSX_AUDIO_SRC (src);

  if (osxsrc->deviceChannels == -1) {
    /* -1 means we don't know the number of channels yet.  for now, return
     * template caps.
     */
    return NULL;
  }

  max = osxsrc->deviceChannels;
  if (max < 1)
    max = 1;                    /* 0 channels means 1 channel? */

  min = MIN (1, max);

  pad_template = gst_element_class_get_pad_template (gstelement_class, "src");
  g_return_val_if_fail (pad_template != NULL, NULL);

  caps = gst_caps_copy (gst_pad_template_get_caps (pad_template));

  if (min == max) {
    gst_caps_set_simple (caps, "channels", G_TYPE_INT, max, NULL);
  } else {
    gst_caps_set_simple (caps, "channels", GST_TYPE_INT_RANGE, min, max, NULL);
  }

  return caps;
}

static GstRingBuffer *
gst_osx_audio_src_create_ringbuffer (GstBaseAudioSrc * src)
{
  GstOsxAudioSrc *osxsrc;
  GstOsxRingBuffer *ringbuffer;

  osxsrc = GST_OSX_AUDIO_SRC (src);

  gst_osx_audio_src_select_device (osxsrc);

  GST_DEBUG ("Creating ringbuffer");
  ringbuffer = g_object_new (GST_TYPE_OSX_RING_BUFFER, NULL);
  GST_DEBUG ("osx src 0x%p element 0x%p ioproc 0x%p", osxsrc,
      GST_OSX_AUDIO_ELEMENT_GET_INTERFACE (osxsrc),
      (void *) gst_osx_audio_src_io_proc);

  ringbuffer->element = GST_OSX_AUDIO_ELEMENT_GET_INTERFACE (osxsrc);
  ringbuffer->is_src = TRUE;
  ringbuffer->device_id = osxsrc->device_id;

  return GST_RING_BUFFER (ringbuffer);
}

static OSStatus
gst_osx_audio_src_io_proc (GstOsxRingBuffer * buf,
    AudioUnitRenderActionFlags * ioActionFlags,
    const AudioTimeStamp * inTimeStamp,
    UInt32 inBusNumber, UInt32 inNumberFrames, AudioBufferList * bufferList)
{
  OSStatus status;
  guint8 *writeptr;
  gint writeseg;
  gint len;
  gint remaining;
  gint offset = 0;

  status = AudioUnitRender (buf->audiounit, ioActionFlags, inTimeStamp,
      inBusNumber, inNumberFrames, buf->recBufferList);
  if (status) {
    GST_WARNING_OBJECT (buf, "AudioUnitRender returned %d", (int) status);
    return status;
  }

  remaining = buf->recBufferList->mBuffers[0].mDataByteSize;

  while (remaining) {
    if (!gst_ring_buffer_prepare_read (GST_RING_BUFFER (buf),
            &writeseg, &writeptr, &len))
      return 0;

    len -= buf->segoffset;

    if (len > remaining)
      len = remaining;

    memcpy (writeptr + buf->segoffset,
        (char *) buf->recBufferList->mBuffers[0].mData + offset, len);

    buf->segoffset += len;
    offset += len;
    remaining -= len;

    if ((gint) buf->segoffset == GST_RING_BUFFER (buf)->spec.segsize) {
      /* we wrote one segment */
      gst_ring_buffer_advance (GST_RING_BUFFER (buf), 1);

      buf->segoffset = 0;
    }
  }

  return 0;
}

static void
gst_osx_audio_src_osxelement_init (gpointer g_iface, gpointer iface_data)
{
  GstOsxAudioElementInterface *iface = (GstOsxAudioElementInterface *) g_iface;

  iface->io_proc = (AURenderCallback) gst_osx_audio_src_io_proc;
}

static void
gst_osx_audio_src_select_device (GstOsxAudioSrc * osxsrc)
{
  OSStatus status;
  UInt32 propertySize;

  if (osxsrc->device_id == kAudioDeviceUnknown) {
    /* If no specific device has been selected by the user, then pick the
     * default device */
    GST_DEBUG_OBJECT (osxsrc, "Selecting device for OSXAudioSrc");
    propertySize = sizeof (osxsrc->device_id);
    status = AudioHardwareGetProperty (kAudioHardwarePropertyDefaultInputDevice,
        &propertySize, &osxsrc->device_id);

    if (status) {
      GST_WARNING_OBJECT (osxsrc, "AudioHardwareGetProperty returned %d",
          (int) status);
    } else {
      GST_DEBUG_OBJECT (osxsrc, "AudioHardwareGetProperty returned 0");
    }

    if (osxsrc->device_id == kAudioDeviceUnknown) {
      GST_WARNING_OBJECT (osxsrc,
          "AudioHardwareGetProperty: device_id is kAudioDeviceUnknown");
    }

    GST_DEBUG_OBJECT (osxsrc, "AudioHardwareGetProperty: device_id is %lu",
        (long) osxsrc->device_id);
  }
}

psimedia-master/gstprovider/gstelements/osxaudio/gstosxaudiosrc.h

/*
 * GStreamer
 * Copyright (C) 2005-2006 Zaheer Abbas Merali
 *
 * Permission is hereby granted, free of charge, to any person obtaining a
 * copy of this software and associated documentation files (the "Software"),
 * to deal in the Software without restriction, including without limitation
 * the rights to use, copy, modify, merge, publish, distribute, sublicense,
 * and/or sell copies of the Software, and to permit persons to whom the
 * Software is furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included
 * in all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
 * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
 * DEALINGS IN THE SOFTWARE.
 *
 * Alternatively, the contents of this file may be used under the
 * GNU Lesser General Public License Version 2.1 (the "LGPL"), in
 * which case the following provisions apply instead of the ones
 * mentioned above:
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 */

#ifndef __GST_OSXAUDIOSRC_H__
#define __GST_OSXAUDIOSRC_H__

#include <gst/gst.h>
#include <gst/audio/gstbaseaudiosrc.h>

#include "gstosxringbuffer.h"

G_BEGIN_DECLS

#define GST_TYPE_OSX_AUDIO_SRC \
  (gst_osx_audio_src_get_type())
#define GST_OSX_AUDIO_SRC(obj) \
  (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_OSX_AUDIO_SRC,GstOsxAudioSrc))
#define GST_OSX_AUDIO_SRC_CLASS(klass) \
  (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_OSX_AUDIO_SRC,GstOsxAudioSrcClass))

typedef struct _GstOsxAudioSrc GstOsxAudioSrc;
typedef struct _GstOsxAudioSrcClass GstOsxAudioSrcClass;

struct _GstOsxAudioSrc
{
  GstBaseAudioSrc src;

  AudioDeviceID device_id;

  /* actual number of channels reported by input device */
  int deviceChannels;
};

struct _GstOsxAudioSrcClass
{
  GstBaseAudioSrcClass parent_class;
};

GType gst_osx_audio_src_get_type (void);

G_END_DECLS

#endif /* __GST_OSXAUDIOSRC_H__ */

psimedia-master/gstprovider/gstelements/osxaudio/gstosxringbuffer.c

/*
 * GStreamer
 * Copyright (C) 2006 Zaheer Abbas Merali
 * Copyright (C) 2008 Pioneers of the Inevitable
 *
 * Permission is hereby granted, free of charge, to any person obtaining a
 * copy of this software and associated documentation files (the "Software"),
 * to deal in the Software without restriction, including without limitation
 * the rights to use, copy, modify, merge, publish, distribute, sublicense,
 * and/or sell copies of the Software, and to permit persons to whom the
 * Software is furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
 * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
 * DEALINGS IN THE SOFTWARE.
 *
 * Alternatively, the contents of this file may be used under the
 * GNU Lesser General Public License Version 2.1 (the "LGPL"), in
 * which case the following provisions apply instead of the ones
 * mentioned above:
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 */

#include <CoreAudio/CoreAudio.h>
#include <CoreAudio/AudioHardware.h>
#include <gst/gst.h>
#include <gst/audio/multichannel.h>

#include "gstosxringbuffer.h"
#include "gstosxaudiosink.h"
#include "gstosxaudiosrc.h"

GST_DEBUG_CATEGORY_STATIC (osx_audio_debug);
#define GST_CAT_DEFAULT osx_audio_debug

static void gst_osx_ring_buffer_dispose (GObject * object);
static void gst_osx_ring_buffer_finalize (GObject * object);
static gboolean gst_osx_ring_buffer_open_device (GstRingBuffer * buf);
static gboolean gst_osx_ring_buffer_close_device (GstRingBuffer * buf);
static gboolean gst_osx_ring_buffer_acquire (GstRingBuffer * buf,
    GstRingBufferSpec * spec);
static gboolean gst_osx_ring_buffer_release (GstRingBuffer * buf);
static gboolean gst_osx_ring_buffer_start (GstRingBuffer * buf);
static gboolean gst_osx_ring_buffer_pause (GstRingBuffer * buf);
static gboolean gst_osx_ring_buffer_stop (GstRingBuffer * buf);
static guint gst_osx_ring_buffer_delay (GstRingBuffer * buf);

static GstRingBufferClass *ring_parent_class = NULL;

static OSStatus gst_osx_ring_buffer_render_notify (GstOsxRingBuffer * osxbuf,
    AudioUnitRenderActionFlags * ioActionFlags,
    const AudioTimeStamp * inTimeStamp,
    unsigned int inBusNumber, unsigned int inNumberFrames,
    AudioBufferList * ioData);

static AudioBufferList *buffer_list_alloc (int channels, int size);
static void buffer_list_free (AudioBufferList * list);

static void
gst_osx_ring_buffer_do_init (GType type)
{
  GST_DEBUG_CATEGORY_INIT (osx_audio_debug, "osxaudio", 0,
      "OSX Audio Elements");
}

GST_BOILERPLATE_FULL (GstOsxRingBuffer, gst_osx_ring_buffer, GstRingBuffer,
    GST_TYPE_RING_BUFFER, gst_osx_ring_buffer_do_init);

static void
gst_osx_ring_buffer_base_init (gpointer g_class)
{
  /* Nothing to do right now */
}

static void
gst_osx_ring_buffer_class_init (GstOsxRingBufferClass * klass)
{
  GObjectClass *gobject_class;
  GstObjectClass *gstobject_class;
  GstRingBufferClass *gstringbuffer_class;

  gobject_class = (GObjectClass *) klass;
  gstobject_class = (GstObjectClass *) klass;
  gstringbuffer_class = (GstRingBufferClass *) klass;
ring_parent_class = g_type_class_peek_parent (klass); gobject_class->dispose = GST_DEBUG_FUNCPTR (gst_osx_ring_buffer_dispose); gobject_class->finalize = GST_DEBUG_FUNCPTR (gst_osx_ring_buffer_finalize); gstringbuffer_class->open_device = GST_DEBUG_FUNCPTR (gst_osx_ring_buffer_open_device); gstringbuffer_class->close_device = GST_DEBUG_FUNCPTR (gst_osx_ring_buffer_close_device); gstringbuffer_class->acquire = GST_DEBUG_FUNCPTR (gst_osx_ring_buffer_acquire); gstringbuffer_class->release = GST_DEBUG_FUNCPTR (gst_osx_ring_buffer_release); gstringbuffer_class->start = GST_DEBUG_FUNCPTR (gst_osx_ring_buffer_start); gstringbuffer_class->pause = GST_DEBUG_FUNCPTR (gst_osx_ring_buffer_pause); gstringbuffer_class->resume = GST_DEBUG_FUNCPTR (gst_osx_ring_buffer_start); gstringbuffer_class->stop = GST_DEBUG_FUNCPTR (gst_osx_ring_buffer_stop); gstringbuffer_class->delay = GST_DEBUG_FUNCPTR (gst_osx_ring_buffer_delay); GST_DEBUG ("osx ring buffer class init"); } static void gst_osx_ring_buffer_init (GstOsxRingBuffer * ringbuffer, GstOsxRingBufferClass * g_class) { /* Nothing to do right now */ } static void gst_osx_ring_buffer_dispose (GObject * object) { G_OBJECT_CLASS (ring_parent_class)->dispose (object); } static void gst_osx_ring_buffer_finalize (GObject * object) { G_OBJECT_CLASS (ring_parent_class)->finalize (object); } static AudioUnit gst_osx_ring_buffer_create_audio_unit (GstOsxRingBuffer * osxbuf, gboolean input, AudioDeviceID device_id) { ComponentDescription desc; Component comp; OSStatus status; AudioUnit unit; UInt32 enableIO; /* Create a HALOutput AudioUnit. 
* This is the lowest-level output API that is actually sensibly usable * (the lower level ones require that you do channel-remapping yourself, * and the CoreAudio channel mapping is sufficiently complex that doing * so would be very difficult) * * Note that for input we request an output unit even though we will do * input with it: http://developer.apple.com/technotes/tn2002/tn2091.html */ desc.componentType = kAudioUnitType_Output; desc.componentSubType = kAudioUnitSubType_HALOutput; desc.componentManufacturer = kAudioUnitManufacturer_Apple; desc.componentFlags = 0; desc.componentFlagsMask = 0; comp = FindNextComponent (NULL, &desc); if (comp == NULL) { GST_WARNING_OBJECT (osxbuf, "Couldn't find HALOutput component"); return NULL; } status = OpenAComponent (comp, &unit); if (status) { GST_WARNING_OBJECT (osxbuf, "Couldn't open HALOutput component"); return NULL; } if (input) { /* enable input */ enableIO = 1; status = AudioUnitSetProperty (unit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Input, 1, /* 1 = input element */ &enableIO, sizeof (enableIO)); if (status) { CloseComponent (unit); GST_WARNING_OBJECT (osxbuf, "Failed to enable input: %lx", (gulong) status); return NULL; } /* disable output */ enableIO = 0; status = AudioUnitSetProperty (unit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Output, 0, /* 0 = output element */ &enableIO, sizeof (enableIO)); if (status) { CloseComponent (unit); GST_WARNING_OBJECT (osxbuf, "Failed to disable output: %lx", (gulong) status); return NULL; } } /* Specify which device we're using. 
*/ GST_DEBUG_OBJECT (osxbuf, "Setting device to %d", (int) device_id); status = AudioUnitSetProperty (unit, kAudioOutputUnitProperty_CurrentDevice, kAudioUnitScope_Global, 0, /* N/A for global */ &device_id, sizeof (AudioDeviceID)); if (status) { CloseComponent (unit); GST_WARNING_OBJECT (osxbuf, "Failed to set device: %lx", (gulong) status); return NULL; } GST_DEBUG_OBJECT (osxbuf, "Create HALOutput AudioUnit: %p", unit); return unit; } static gboolean gst_osx_ring_buffer_open_device (GstRingBuffer * buf) { GstOsxRingBuffer *osxbuf; GstOsxAudioSink *sink; GstOsxAudioSrc *src; AudioStreamBasicDescription asbd_in; OSStatus status; UInt32 propertySize; osxbuf = GST_OSX_RING_BUFFER (buf); sink = NULL; src = NULL; osxbuf->audiounit = gst_osx_ring_buffer_create_audio_unit (osxbuf, osxbuf->is_src, osxbuf->device_id); if (osxbuf->is_src) { src = GST_OSX_AUDIO_SRC (GST_OBJECT_PARENT (buf)); propertySize = sizeof (asbd_in); status = AudioUnitGetProperty (osxbuf->audiounit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 1, &asbd_in, &propertySize); if (status) { CloseComponent (osxbuf->audiounit); osxbuf->audiounit = NULL; GST_WARNING_OBJECT (osxbuf, "Unable to obtain device properties: %lx", (gulong) status); return FALSE; } src->deviceChannels = asbd_in.mChannelsPerFrame; } else { sink = GST_OSX_AUDIO_SINK (GST_OBJECT_PARENT (buf)); /* needed for the sink's volume control */ sink->audiounit = osxbuf->audiounit; } return TRUE; } static gboolean gst_osx_ring_buffer_close_device (GstRingBuffer * buf) { GstOsxRingBuffer *osxbuf; osxbuf = GST_OSX_RING_BUFFER (buf); CloseComponent (osxbuf->audiounit); osxbuf->audiounit = NULL; return TRUE; } static AudioChannelLabel gst_audio_channel_position_to_coreaudio_channel_label (GstAudioChannelPosition position, int channel) { switch (position) { case GST_AUDIO_CHANNEL_POSITION_NONE: return kAudioChannelLabel_Discrete_0 | channel; case GST_AUDIO_CHANNEL_POSITION_FRONT_MONO: return kAudioChannelLabel_Mono; case 
GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT: return kAudioChannelLabel_Left; case GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT: return kAudioChannelLabel_Right; case GST_AUDIO_CHANNEL_POSITION_REAR_CENTER: return kAudioChannelLabel_CenterSurround; case GST_AUDIO_CHANNEL_POSITION_REAR_LEFT: return kAudioChannelLabel_LeftSurround; case GST_AUDIO_CHANNEL_POSITION_REAR_RIGHT: return kAudioChannelLabel_RightSurround; case GST_AUDIO_CHANNEL_POSITION_LFE: return kAudioChannelLabel_LFEScreen; case GST_AUDIO_CHANNEL_POSITION_FRONT_CENTER: return kAudioChannelLabel_Center; case GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT_OF_CENTER: return kAudioChannelLabel_Center; // ??? case GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT_OF_CENTER: return kAudioChannelLabel_Center; // ??? case GST_AUDIO_CHANNEL_POSITION_SIDE_LEFT: return kAudioChannelLabel_LeftSurroundDirect; case GST_AUDIO_CHANNEL_POSITION_SIDE_RIGHT: return kAudioChannelLabel_RightSurroundDirect; default: return kAudioChannelLabel_Unknown; } } static gboolean gst_osx_ring_buffer_acquire (GstRingBuffer * buf, GstRingBufferSpec * spec) { /* Configure the output stream and allocate ringbuffer memory */ GstOsxRingBuffer *osxbuf; AudioStreamBasicDescription format; AudioChannelLayout *layout = NULL; OSStatus status; UInt32 propertySize; int layoutSize; int element; int i; AudioUnitScope scope; gboolean ret = FALSE; GstStructure *structure; GstAudioChannelPosition *positions; UInt32 frameSize; osxbuf = GST_OSX_RING_BUFFER (buf); /* Fill out the audio description we're going to be using */ format.mFormatID = kAudioFormatLinearPCM; format.mSampleRate = (double) spec->rate; format.mChannelsPerFrame = spec->channels; format.mFormatFlags = kAudioFormatFlagsNativeFloatPacked; format.mBytesPerFrame = spec->channels * sizeof (float); format.mBitsPerChannel = sizeof (float) * 8; format.mBytesPerPacket = spec->channels * sizeof (float); format.mFramesPerPacket = 1; format.mReserved = 0; /* Describe channels */ layoutSize = sizeof (AudioChannelLayout) + 
spec->channels * sizeof (AudioChannelDescription); layout = g_malloc (layoutSize); structure = gst_caps_get_structure (spec->caps, 0); positions = gst_audio_get_channel_positions (structure); layout->mChannelLayoutTag = kAudioChannelLayoutTag_UseChannelDescriptions; layout->mChannelBitmap = 0; /* Not used */ layout->mNumberChannelDescriptions = spec->channels; for (i = 0; i < spec->channels; i++) { if (positions) { layout->mChannelDescriptions[i].mChannelLabel = gst_audio_channel_position_to_coreaudio_channel_label (positions[i], i); } else { /* Discrete channel numbers are ORed into this */ layout->mChannelDescriptions[i].mChannelLabel = kAudioChannelLabel_Discrete_0 | i; } /* Others unused */ layout->mChannelDescriptions[i].mChannelFlags = 0; layout->mChannelDescriptions[i].mCoordinates[0] = 0.f; layout->mChannelDescriptions[i].mCoordinates[1] = 0.f; layout->mChannelDescriptions[i].mCoordinates[2] = 0.f; } if (positions) { g_free (positions); positions = NULL; } GST_LOG_OBJECT (osxbuf, "Format: %x, %f, %u, %x, %d, %d, %d, %d, %d", (unsigned int) format.mFormatID, format.mSampleRate, (unsigned int) format.mChannelsPerFrame, (unsigned int) format.mFormatFlags, (unsigned int) format.mBytesPerFrame, (unsigned int) format.mBitsPerChannel, (unsigned int) format.mBytesPerPacket, (unsigned int) format.mFramesPerPacket, (unsigned int) format.mReserved); GST_DEBUG_OBJECT (osxbuf, "Setting format for AudioUnit"); scope = osxbuf->is_src ? kAudioUnitScope_Output : kAudioUnitScope_Input; element = osxbuf->is_src ? 
1 : 0; propertySize = sizeof (format); status = AudioUnitSetProperty (osxbuf->audiounit, kAudioUnitProperty_StreamFormat, scope, element, &format, propertySize); if (status) { GST_WARNING_OBJECT (osxbuf, "Failed to set audio description: %lx", (gulong) status); goto done; } status = AudioUnitSetProperty (osxbuf->audiounit, kAudioUnitProperty_AudioChannelLayout, scope, element, layout, layoutSize); if (status) { GST_WARNING_OBJECT (osxbuf, "Failed to set output channel layout: %lx", (gulong) status); goto done; } spec->segsize = (spec->latency_time * spec->rate / G_USEC_PER_SEC) * spec->bytes_per_sample; spec->segtotal = spec->buffer_time / spec->latency_time; /* create AudioBufferList needed for recording */ if (osxbuf->is_src) { propertySize = sizeof (frameSize); status = AudioUnitGetProperty (osxbuf->audiounit, kAudioDevicePropertyBufferFrameSize, kAudioUnitScope_Global, 0, /* N/A for global */ &frameSize, &propertySize); if (status) { GST_WARNING_OBJECT (osxbuf, "Failed to get frame size: %lx", (gulong) status); goto done; } osxbuf->recBufferList = buffer_list_alloc (format.mChannelsPerFrame, frameSize * format.mBytesPerFrame); } buf->data = gst_buffer_new_and_alloc (spec->segtotal * spec->segsize); memset (GST_BUFFER_DATA (buf->data), 0, GST_BUFFER_SIZE (buf->data)); osxbuf->segoffset = 0; status = AudioUnitInitialize (osxbuf->audiounit); if (status) { gst_buffer_unref (buf->data); buf->data = NULL; if (osxbuf->recBufferList) { buffer_list_free (osxbuf->recBufferList); osxbuf->recBufferList = NULL; } GST_WARNING_OBJECT (osxbuf, "Failed to initialise AudioUnit: %d", (int) status); goto done; } GST_DEBUG_OBJECT (osxbuf, "osx ring buffer acquired"); ret = TRUE; done: g_free (layout); return ret; } static gboolean gst_osx_ring_buffer_release (GstRingBuffer * buf) { GstOsxRingBuffer *osxbuf; osxbuf = GST_OSX_RING_BUFFER (buf); AudioUnitUninitialize (osxbuf->audiounit); gst_buffer_unref (buf->data); buf->data = NULL; if (osxbuf->recBufferList) { buffer_list_free 
(osxbuf->recBufferList); osxbuf->recBufferList = NULL; } return TRUE; } static void gst_osx_ring_buffer_remove_render_callback (GstOsxRingBuffer * osxbuf) { AURenderCallbackStruct input; OSStatus status; /* Deactivate the render callback by calling SetRenderCallback with a NULL * inputProc. */ input.inputProc = NULL; input.inputProcRefCon = NULL; status = AudioUnitSetProperty (osxbuf->audiounit, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Global, 0, /* N/A for global */ &input, sizeof (input)); if (status) { GST_WARNING_OBJECT (osxbuf, "Failed to remove render callback"); } /* Remove the RenderNotify too */ status = AudioUnitRemoveRenderNotify (osxbuf->audiounit, (AURenderCallback) gst_osx_ring_buffer_render_notify, osxbuf); if (status) { GST_WARNING_OBJECT (osxbuf, "Failed to remove render notify callback"); } /* We're deactivated.. */ osxbuf->io_proc_needs_deactivation = FALSE; osxbuf->io_proc_active = FALSE; } static OSStatus gst_osx_ring_buffer_render_notify (GstOsxRingBuffer * osxbuf, AudioUnitRenderActionFlags * ioActionFlags, const AudioTimeStamp * inTimeStamp, unsigned int inBusNumber, unsigned int inNumberFrames, AudioBufferList * ioData) { /* Before rendering a frame, we get the PreRender notification. * Here, we detach the RenderCallback if we've been paused. * * This is necessary (rather than just directly detaching it) to work * around some thread-safety issues in CoreAudio */ if ((*ioActionFlags) & kAudioUnitRenderAction_PreRender) { if (osxbuf->io_proc_needs_deactivation) { gst_osx_ring_buffer_remove_render_callback (osxbuf); } } return noErr; } static gboolean gst_osx_ring_buffer_start (GstRingBuffer * buf) { OSStatus status; GstOsxRingBuffer *osxbuf; AURenderCallbackStruct input; AudioUnitPropertyID callback_type; osxbuf = GST_OSX_RING_BUFFER (buf); GST_DEBUG ("osx ring buffer start ioproc: 0x%p device_id %lu", osxbuf->element->io_proc, (gulong) osxbuf->device_id); if (!osxbuf->io_proc_active) { callback_type = osxbuf->is_src ? 
kAudioOutputUnitProperty_SetInputCallback : kAudioUnitProperty_SetRenderCallback; input.inputProc = (AURenderCallback) osxbuf->element->io_proc; input.inputProcRefCon = osxbuf; status = AudioUnitSetProperty (osxbuf->audiounit, callback_type, kAudioUnitScope_Global, 0, /* N/A for global */ &input, sizeof (input)); if (status) { GST_WARNING ("AudioUnitSetProperty returned %d", (int) status); return FALSE; } // ### does it make sense to do this notify stuff for input mode? status = AudioUnitAddRenderNotify (osxbuf->audiounit, (AURenderCallback) gst_osx_ring_buffer_render_notify, osxbuf); if (status) { GST_WARNING ("AudioUnitAddRenderNotify returned %d", (int) status); return FALSE; } osxbuf->io_proc_active = TRUE; } osxbuf->io_proc_needs_deactivation = FALSE; status = AudioOutputUnitStart (osxbuf->audiounit); if (status) { GST_WARNING ("AudioOutputUnitStart returned %d", (int) status); return FALSE; } return TRUE; } // ### static gboolean gst_osx_ring_buffer_pause (GstRingBuffer * buf) { GstOsxRingBuffer *osxbuf = GST_OSX_RING_BUFFER (buf); GST_DEBUG ("osx ring buffer pause ioproc: 0x%p device_id %lu", osxbuf->element->io_proc, (gulong) osxbuf->device_id); if (osxbuf->io_proc_active) { /* CoreAudio isn't threadsafe enough to do this here; we must deactivate * the render callback elsewhere. See: * http://lists.apple.com/archives/Coreaudio-api/2006/Mar/msg00010.html */ osxbuf->io_proc_needs_deactivation = TRUE; } return TRUE; } // ### static gboolean gst_osx_ring_buffer_stop (GstRingBuffer * buf) { OSErr status; GstOsxRingBuffer *osxbuf; osxbuf = GST_OSX_RING_BUFFER (buf); GST_DEBUG ("osx ring buffer stop ioproc: 0x%p device_id %lu", osxbuf->element->io_proc, (gulong) osxbuf->device_id); status = AudioOutputUnitStop (osxbuf->audiounit); if (status) GST_WARNING ("AudioOutputUnitStop returned %d", (int) status); // ###: why is it okay to directly remove from here but not from pause() ? 
if (osxbuf->io_proc_active) { gst_osx_ring_buffer_remove_render_callback (osxbuf); } return TRUE; } static guint gst_osx_ring_buffer_delay (GstRingBuffer * buf) { double latency; UInt32 size = sizeof (double); GstOsxRingBuffer *osxbuf; OSStatus status; guint samples; osxbuf = GST_OSX_RING_BUFFER (buf); status = AudioUnitGetProperty (osxbuf->audiounit, kAudioUnitProperty_Latency, kAudioUnitScope_Global, 0, /* N/A for global */ &latency, &size); if (status) { GST_WARNING_OBJECT (buf, "Failed to get latency: %d", (int) status); return 0; } samples = latency * GST_RING_BUFFER (buf)->spec.rate; GST_DEBUG_OBJECT (buf, "Got latency: %f seconds -> %d samples", latency, samples); return samples; } static AudioBufferList * buffer_list_alloc (int channels, int size) { AudioBufferList *list; int total_size; int n; total_size = sizeof (AudioBufferList) + 1 * sizeof (AudioBuffer); list = (AudioBufferList *) g_malloc (total_size); list->mNumberBuffers = 1; for (n = 0; n < (int) list->mNumberBuffers; ++n) { list->mBuffers[n].mNumberChannels = channels; list->mBuffers[n].mDataByteSize = size; list->mBuffers[n].mData = g_malloc (size); } return list; } static void buffer_list_free (AudioBufferList * list) { int n; for (n = 0; n < (int) list->mNumberBuffers; ++n) { if (list->mBuffers[n].mData) g_free (list->mBuffers[n].mData); } g_free (list); } psimedia-master/gstprovider/gstelements/osxaudio/gstosxringbuffer.h000066400000000000000000000067771220046403000264610ustar00rootroot00000000000000/* * GStreamer * Copyright (C) 2006 Zaheer Abbas Merali * * Permission is hereby granted, free of charge, to any person obtaining a * copy of this software and associated documentation files (the "Software"), * to deal in the Software without restriction, including without limitation * the rights to use, copy, modify, merge, publish, distribute, sublicense, * and/or sell copies of the Software, and to permit persons to whom the * Software is furnished to do so, subject to the following conditions: 
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
 * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
 * DEALINGS IN THE SOFTWARE.
 *
 * Alternatively, the contents of this file may be used under the
 * GNU Lesser General Public License Version 2.1 (the "LGPL"), in
 * which case the following provisions apply instead of the ones
 * mentioned above:
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 */

#ifndef __GST_OSX_RING_BUFFER_H__
#define __GST_OSX_RING_BUFFER_H__

#include <gst/gst.h>
#include <gst/audio/gstringbuffer.h>
#include <AudioUnit/AudioUnit.h>

#include "gstosxaudioelement.h"

G_BEGIN_DECLS

#define GST_TYPE_OSX_RING_BUFFER \
  (gst_osx_ring_buffer_get_type())
#define GST_OSX_RING_BUFFER(obj) \
  (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_OSX_RING_BUFFER,GstOsxRingBuffer))
#define GST_OSX_RING_BUFFER_CLASS(klass) \
  (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_OSX_RING_BUFFER,GstOsxRingBufferClass))
#define GST_OSX_RING_BUFFER_GET_CLASS(obj) \
  (G_TYPE_INSTANCE_GET_CLASS((obj),GST_TYPE_OSX_RING_BUFFER,GstOsxRingBufferClass))
#define GST_IS_OSX_RING_BUFFER(obj) \
  (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_OSX_RING_BUFFER))
#define GST_IS_OSX_RING_BUFFER_CLASS(klass) \
  (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_OSX_RING_BUFFER))

typedef struct _GstOsxRingBuffer GstOsxRingBuffer;
typedef struct _GstOsxRingBufferClass GstOsxRingBufferClass;

struct _GstOsxRingBuffer
{
  GstRingBuffer object;

  gboolean is_src;
  AudioUnit audiounit;
  AudioDeviceID device_id;

  gboolean io_proc_active;
  gboolean io_proc_needs_deactivation;

  guint buffer_len;
  guint segoffset;

  AudioBufferList *recBufferList;

  GstOsxAudioElementInterface *element;
};

struct _GstOsxRingBufferClass
{
  GstRingBufferClass parent_class;
};

GType gst_osx_ring_buffer_get_type (void);

G_END_DECLS

#endif /* __GST_OSX_RING_BUFFER_H__ */

psimedia-master/gstprovider/gstelements/osxvideo.pri

HEADERS += \
    $$PWD/osxvideo/osxvideosink.h \
    $$PWD/osxvideo/cocoawindow.h \
    $$PWD/osxvideo/osxvideosrc.h

SOURCES += \
    $$PWD/osxvideo/osxvideosink.m \
    $$PWD/osxvideo/cocoawindow.m \
    $$PWD/osxvideo/osxvideosrc.c

gstplugin:SOURCES += $$PWD/osxvideo/osxvideoplugin.m
!gstplugin:SOURCES += $$PWD/static/osxvideo_static.m

LIBS *= \
    -lgstinterfaces-0.10 \
    -lgstvideo-0.10

LIBS += \
    -framework Cocoa \
    -framework QuickTime \
    -framework OpenGL
psimedia-master/gstprovider/gstelements/osxvideo/Makefile.am

# FIXME: clean up this crap
OBJC=gcc

plugin_LTLIBRARIES = libgstosxvideosink.la

libgstosxvideosink_la_SOURCES = osxvideoplugin.m \
	osxvideosink.m \
	cocoawindow.m \
	osxvideosrc.c

libgstosxvideosink_la_CFLAGS = $(GST_CFLAGS) $(GST_BASE_CFLAGS) \
	$(GST_PLUGINS_BASE_CFLAGS) -Wno-deprecated-declarations # Apple is hiring monkeys nowadays

libgstosxvideosink_la_LIBADD = \
	$(GST_BASE_LIBS) $(GST_PLUGINS_BASE_LIBS) -lgstvideo-$(GST_MAJORMINOR) \
	-lgstinterfaces-$(GST_MAJORMINOR)

libgstosxvideosink_la_LDFLAGS = $(GST_PLUGIN_LDFLAGS) \
	-Wl,-framework -Wl,Cocoa -Wl,-framework -Wl,QuickTime \
	-Wl,-framework -Wl,OpenGL

libgstosxvideosink_la_LIBTOOLFLAGS = --tag=disable-static

AM_OBJCFLAGS = $(CFLAGS) $(GST_CFLAGS) $(GST_PLUGINS_BASE_CFLAGS) $(GST_BASE_CFLAGS)

noinst_HEADERS = osxvideosink.h osxvideosrc.h cocoawindow.h

psimedia-master/gstprovider/gstelements/osxvideo/cocoawindow.h

/* GStreamer
 * Copyright (C) 2004 Zaheer Abbas Merali
 * Copyright (C) 2007 Pioneers of the Inevitable
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 *
 * The development of this code was made possible due to the involvement of
 * Pioneers of the Inevitable, the creators of the Songbird Music player
 *
 */

/* inspiration gained from looking at source of osx video out of xine and vlc
 * and is reflected in the code
 */

/* note: the original angle-bracket import targets were stripped from this
 * copy; the headers below are reconstructed from the types used in the file */
#import <Cocoa/Cocoa.h>
#import <glib.h>
#import <OpenGL/gl.h>

struct _GstOSXImage;

@interface GstGLView : NSOpenGLView
{
  int i_effect;
  unsigned int pi_texture;
  float f_x;
  float f_y;
  int initDone;
  char *data;
  int width, height;
  BOOL fullscreen;
  NSOpenGLContext *fullScreenContext;
  NSOpenGLContext *actualContext;
}

- (void) drawQuad;
- (void) drawRect: (NSRect) rect;
- (id) initWithFrame: (NSRect) frame;
- (void) initTextures;
- (void) reloadTexture;
- (void) cleanUp;
- (void) displayTexture;
- (char *) getTextureBuffer;
- (void) setFullScreen: (BOOL) flag;
- (void) reshape;
- (void) setVideoSize: (int) w: (int) h;
@end

@interface GstOSXVideoSinkWindow: NSWindow
{
  int width, height;
  GstGLView *gstview;
}

- (void) setContentSize: (NSSize) size;
- (GstGLView *) gstView;
- (id) initWithContentRect: (NSRect) contentRect
    styleMask: (unsigned int) styleMask
    backing: (NSBackingStoreType) bufferingType
    defer: (BOOL) flag
    screen: (NSScreen *) aScreen;
@end

psimedia-master/gstprovider/gstelements/osxvideo/cocoawindow.m

/* GStreamer
 * Copyright (C) 2004 Zaheer Abbas Merali
 * Copyright (C) 2007 Pioneers of the Inevitable
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 *
 * The development of this code was made possible due to the involvement of
 * Pioneers of the Inevitable, the creators of the Songbird Music player
 *
 */

/* inspiration gained from looking at source of osx video out of xine and vlc
 * and is reflected in the code
 */

/* note: the original angle-bracket include targets were stripped from this
 * copy; the headers below are reconstructed from the APIs used in the file */
#include <Cocoa/Cocoa.h>
#include <QuickTime/QuickTime.h>

#import "cocoawindow.h"
#import "osxvideosink.h"

#include <OpenGL/OpenGL.h>
#include <OpenGL/gl.h>
#include <OpenGL/glext.h>

/* Debugging category */
#include <gst/gstinfo.h>

@implementation GstOSXVideoSinkWindow

/* The object has to be released */
- (id) initWithContentRect: (NSRect) rect
    styleMask: (unsigned int) styleMask
    backing: (NSBackingStoreType) bufferingType
    defer: (BOOL) flag
    screen: (NSScreen *) aScreen
{
  self = [super initWithContentRect: rect
      styleMask: styleMask
      backing: bufferingType
      defer: flag
      screen: aScreen];

  GST_DEBUG ("Initializing GstOSXVideoSinkWindow");

  gstview = [[GstGLView alloc] initWithFrame:rect];
  if (gstview)
    [self setContentView:gstview];
  [self setTitle:@"GStreamer Video Output"];

  return self;
}

- (void) setContentSize: (NSSize) size
{
  width = size.width;
  height = size.height;

  [gstview setVideoSize: (int) width: (int) height];

  [super setContentSize:size];
}

- (GstGLView *) gstView
{
  return gstview;
}

- (void) awakeFromNib
{
  [self setAcceptsMouseMovedEvents:YES];
}

- (void) sendEvent: (NSEvent *) event
{
  BOOL taken = NO;

  GST_DEBUG ("event %p type:%d", event, [event type]);

  if ([event type] == NSKeyDown) {
  }
  /*taken = [gstview keyDown:event]; */

  if (!taken) {
    [super sendEvent:event];
  }
}

@end

//
// OpenGL implementation
//

@implementation GstGLView

- (id)
initWithFrame:(NSRect) frame { NSOpenGLPixelFormat *fmt; NSOpenGLPixelFormatAttribute attribs[] = { NSOpenGLPFAAccelerated, NSOpenGLPFANoRecovery, NSOpenGLPFADoubleBuffer, NSOpenGLPFAColorSize, 24, NSOpenGLPFAAlphaSize, 8, NSOpenGLPFADepthSize, 24, NSOpenGLPFAWindow, 0 }; fmt = [[NSOpenGLPixelFormat alloc] initWithAttributes:attribs]; if (!fmt) { GST_WARNING ("Cannot create NSOpenGLPixelFormat"); return nil; } self = [super initWithFrame: frame pixelFormat:fmt]; actualContext = [self openGLContext]; [actualContext makeCurrentContext]; [actualContext update]; /* Black background */ glClearColor (0.0, 0.0, 0.0, 0.0); pi_texture = 0; data = nil; width = frame.size.width; height = frame.size.height; GST_LOG ("Width: %d Height: %d", width, height); [self initTextures]; return self; } - (void) reshape { NSRect bounds; GST_LOG ("reshaping"); if (!initDone) { return; } [actualContext makeCurrentContext]; bounds = [self bounds]; glViewport (0, 0, (GLint) bounds.size.width, (GLint) bounds.size.height); } - (void) initTextures { [actualContext makeCurrentContext]; /* Free previous texture if any */ if (pi_texture) { glDeleteTextures (1, (GLuint *)&pi_texture); } if (data) { data = g_realloc (data, width * height * sizeof(short)); // short or 3byte? 
} else { data = g_malloc0(width * height * sizeof(short)); } /* Create textures */ glGenTextures (1, (GLuint *)&pi_texture); glEnable (GL_TEXTURE_RECTANGLE_EXT); glEnable (GL_UNPACK_CLIENT_STORAGE_APPLE); glPixelStorei (GL_UNPACK_ALIGNMENT, 1); glPixelStorei (GL_UNPACK_ROW_LENGTH, width); glBindTexture (GL_TEXTURE_RECTANGLE_EXT, pi_texture); /* Use VRAM texturing */ glTexParameteri (GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_STORAGE_HINT_APPLE, GL_STORAGE_CACHED_APPLE); /* Tell the driver not to make a copy of the texture but to use our buffer */ glPixelStorei (GL_UNPACK_CLIENT_STORAGE_APPLE, GL_TRUE); /* Linear interpolation */ glTexParameteri (GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_MIN_FILTER, GL_LINEAR); glTexParameteri (GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_MAG_FILTER, GL_LINEAR); /* I have no idea what this exactly does, but it seems to be necessary for scaling */ glTexParameteri (GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); glTexParameteri (GL_TEXTURE_RECTANGLE_EXT, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE); // glPixelStorei (GL_UNPACK_ROW_LENGTH, 0); WHY ?? 
glTexImage2D (GL_TEXTURE_RECTANGLE_EXT, 0, GL_RGBA, width, height, 0, GL_YCBCR_422_APPLE, GL_UNSIGNED_SHORT_8_8_APPLE, data); initDone = 1; } - (void) reloadTexture { if (!initDone) { return; } GST_LOG ("Reloading Texture"); [actualContext makeCurrentContext]; glBindTexture (GL_TEXTURE_RECTANGLE_EXT, pi_texture); glPixelStorei (GL_UNPACK_ROW_LENGTH, width); /* glTexSubImage2D is faster than glTexImage2D http://developer.apple.com/samplecode/Sample_Code/Graphics_3D/ TextureRange/MainOpenGLView.m.htm */ glTexSubImage2D (GL_TEXTURE_RECTANGLE_EXT, 0, 0, 0, width, height, GL_YCBCR_422_APPLE, GL_UNSIGNED_SHORT_8_8_APPLE, data); //FIXME } - (void) cleanUp { initDone = 0; } - (void) drawQuad { f_x = 1.0; f_y = 1.0; glBegin (GL_QUADS); /* Top left */ glTexCoord2f (0.0, 0.0); glVertex2f (-f_x, f_y); /* Bottom left */ glTexCoord2f (0.0, (float) height); glVertex2f (-f_x, -f_y); /* Bottom right */ glTexCoord2f ((float) width, (float) height); glVertex2f (f_x, -f_y); /* Top right */ glTexCoord2f ((float) width, 0.0); glVertex2f (f_x, f_y); glEnd (); } - (void) drawRect:(NSRect) rect { GLint params[] = { 1 }; [actualContext makeCurrentContext]; CGLSetParameter (CGLGetCurrentContext (), kCGLCPSwapInterval, params); /* Black background */ glClear (GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); if (!initDone) { [actualContext flushBuffer]; return; } /* Draw */ glBindTexture (GL_TEXTURE_RECTANGLE_EXT, pi_texture); // FIXME [self drawQuad]; /* Draw */ [actualContext flushBuffer]; } - (void) displayTexture { if ([self lockFocusIfCanDraw]) { [self drawRect:[self bounds]]; [self reloadTexture]; [self unlockFocus]; } } - (char *) getTextureBuffer { return data; } - (void) setFullScreen:(BOOL) flag { if (!fullscreen && flag) { // go to full screen /* Create the new pixel format */ NSOpenGLPixelFormat *fmt; NSOpenGLPixelFormatAttribute attribs[] = { NSOpenGLPFAAccelerated, NSOpenGLPFANoRecovery, NSOpenGLPFADoubleBuffer, NSOpenGLPFAColorSize, 24, NSOpenGLPFAAlphaSize, 8, NSOpenGLPFADepthSize, 
24, NSOpenGLPFAFullScreen, NSOpenGLPFAScreenMask, CGDisplayIDToOpenGLDisplayMask (kCGDirectMainDisplay), 0 }; fmt = [[NSOpenGLPixelFormat alloc] initWithAttributes:attribs]; if (!fmt) { GST_WARNING ("Cannot create NSOpenGLPixelFormat"); return; } /* Create the new OpenGL context */ fullScreenContext = [[NSOpenGLContext alloc] initWithFormat: fmt shareContext:nil]; if (!fullScreenContext) { GST_WARNING ("Failed to create new NSOpenGLContext"); return; } actualContext = fullScreenContext; /* Capture display, switch to fullscreen */ if (CGCaptureAllDisplays () != CGDisplayNoErr) { GST_WARNING ("CGCaptureAllDisplays() failed"); return; } [fullScreenContext setFullScreen]; [fullScreenContext makeCurrentContext]; fullscreen = YES; [self initTextures]; [self setNeedsDisplay:YES]; } else if (fullscreen && !flag) { // fullscreen now and needs to go back to normal initDone = NO; actualContext = [self openGLContext]; [NSOpenGLContext clearCurrentContext]; [fullScreenContext clearDrawable]; [fullScreenContext release]; fullScreenContext = nil; CGReleaseAllDisplays (); [self reshape]; [self initTextures]; [self setNeedsDisplay:YES]; fullscreen = NO; initDone = YES; } } - (void) setVideoSize: (int) w:(int) h { GST_LOG ("width:%d, height:%d", w, h); width = w; height = h; // if (data) g_free(data); // data = g_malloc0 (2 * w * h); [self initTextures]; } - (void) dealloc { GST_LOG ("dealloc called"); if (data) g_free(data); if (fullScreenContext) { [NSOpenGLContext clearCurrentContext]; [fullScreenContext clearDrawable]; [fullScreenContext release]; if (actualContext == fullScreenContext) actualContext = nil; fullScreenContext = nil; } [super dealloc]; } @end psimedia-master/gstprovider/gstelements/osxvideo/osxvideoplugin.m000066400000000000000000000036451220046403000261400ustar00rootroot00000000000000/* GStreamer * OSX video sink * Copyright (C) 2004-6 Zaheer Abbas Merali * Copyright (C) 2007 Pioneers of the Inevitable * * This library is free software; you can redistribute it 
and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 *
 * The development of this code was made possible due to the involvement of
 * Pioneers of the Inevitable, the creators of the Songbird Music player.
 *
 */

#ifdef HAVE_CONFIG_H
#include "config.h"
#endif

/* Object header */
#include "osxvideosink.h"
#include "osxvideosrc.h"

static gboolean
plugin_init (GstPlugin * plugin)
{
  if (!gst_element_register (plugin, "osxvideosink",
          GST_RANK_PRIMARY, GST_TYPE_OSX_VIDEO_SINK))
    return FALSE;

  GST_DEBUG_CATEGORY_INIT (gst_debug_osx_video_sink, "osxvideosink", 0,
      "osxvideosink element");

  if (!gst_element_register (plugin, "osxvideosrc",
          GST_RANK_PRIMARY, GST_TYPE_OSX_VIDEO_SRC))
    return FALSE;

  GST_DEBUG_CATEGORY_INIT (gst_debug_osx_video_src, "osxvideosrc", 0,
      "osxvideosrc element");

  return TRUE;
}

GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
    GST_VERSION_MINOR,
    "osxvideo",
    "OSX native video input/output plugin",
    plugin_init, VERSION, GST_LICENSE, GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN)

psimedia-master/gstprovider/gstelements/osxvideo/osxvideosink.h

/* GStreamer
 * Copyright (C) 2004-6 Zaheer Abbas Merali
 * Copyright (C) 2007 Pioneers of the Inevitable
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published
by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 *
 * The development of this code was made possible due to the involvement of
 * Pioneers of the Inevitable, the creators of the Songbird Music player
 *
 */

#ifndef __GST_OSX_VIDEO_SINK_H__
#define __GST_OSX_VIDEO_SINK_H__

/* note: the original angle-bracket include targets were stripped from this
 * copy; the headers below are a best guess from the types used in the file */
#include <gst/gst.h>
#include <gst/video/gstvideosink.h>
#include <Cocoa/Cocoa.h>
#include <QuickTime/QuickTime.h>
#include <OpenGL/OpenGL.h>
#import "cocoawindow.h"

GST_DEBUG_CATEGORY_EXTERN (gst_debug_osx_video_sink);

G_BEGIN_DECLS

#define GST_TYPE_OSX_VIDEO_SINK \
  (gst_osx_video_sink_get_type())
#define GST_OSX_VIDEO_SINK(obj) \
  (G_TYPE_CHECK_INSTANCE_CAST((obj), GST_TYPE_OSX_VIDEO_SINK, GstOSXVideoSink))
#define GST_OSX_VIDEO_SINK_CLASS(klass) \
  (G_TYPE_CHECK_CLASS_CAST((klass), GST_TYPE_OSX_VIDEO_SINK, GstOSXVideoSinkClass))
#define GST_IS_OSX_VIDEO_SINK(obj) \
  (G_TYPE_CHECK_INSTANCE_TYPE((obj), GST_TYPE_OSX_VIDEO_SINK))
#define GST_IS_OSX_VIDEO_SINK_CLASS(klass) \
  (G_TYPE_CHECK_CLASS_TYPE((klass), GST_TYPE_OSX_VIDEO_SINK))

typedef struct _GstOSXWindow GstOSXWindow;
typedef struct _GstOSXVideoSink GstOSXVideoSink;
typedef struct _GstOSXVideoSinkClass GstOSXVideoSinkClass;

#define GST_TYPE_OSXVIDEOBUFFER (gst_osxvideobuffer_get_type())

/* OSXWindow stuff */
struct _GstOSXWindow
{
  gint width, height;
  gboolean internal;
  GstOSXVideoSinkWindow *win;
  GstGLView *gstview;
  NSAutoreleasePool *pool;
};

struct _GstOSXVideoSink
{
  /* Our element stuff */
  GstVideoSink videosink;

  GstOSXWindow *osxwindow;

  GstTask *event_task;
  GStaticRecMutex event_task_lock;

  /* Unused */
  gint
pixel_width, pixel_height;

  GstClockTime time;

  gboolean embed;
  gboolean fullscreen;
  gboolean sw_scaling_failed;
};

struct _GstOSXVideoSinkClass
{
  GstVideoSinkClass parent_class;
};

GType gst_osx_video_sink_get_type (void);

#if MAC_OS_X_VERSION_MAX_ALLOWED >= MAC_OS_X_VERSION_10_4
@interface NSApplication(AppleMenu)
- (void) setAppleMenu: (NSMenu *) menu;
@end
#endif

@interface GstAppDelegate : NSObject
- (NSApplicationTerminateReply) applicationShouldTerminate: (NSApplication *) sender;
@end

G_END_DECLS

#endif /* __GST_OSX_VIDEO_SINK_H__ */

psimedia-master/gstprovider/gstelements/osxvideo/osxvideosink.m

/* GStreamer
 * OSX video sink
 * Copyright (C) 2004-6 Zaheer Abbas Merali
 * Copyright (C) 2007 Pioneers of the Inevitable
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 *
 * The development of this code was made possible due to the involvement of
 * Pioneers of the Inevitable, the creators of the Songbird Music player.
 *
 */

/**
 * SECTION:element-osxvideosink
 *
 * The OSXVideoSink renders video frames to a MacOSX window. The video output
 * can be directed to a window embedded in an existing NSApp. This can be done
 * by setting the "embed" property to #TRUE.
 * When the NSView to be embedded is
 * created an element #GstMessage with a name of 'have-ns-view' will be created
 * and posted on the bus. The pointer to the NSView to embed will be in the
 * 'nsview' field of that message. If no embedding is requested, the plugin will
 * create a standalone window.
 *
 * Example: a simple pipeline to test the sink:
 *
 *   gst-launch-0.10 -v videotestsrc ! osxvideosink
 */

#ifdef HAVE_CONFIG_H
#include "config.h"
#endif

/* Object header */
#include "osxvideosink.h"

#include <string.h>             /* reconstructed include; memcpy is used below */

#import "cocoawindow.h"

/* Debugging category */
GST_DEBUG_CATEGORY (gst_debug_osx_video_sink);
#define GST_CAT_DEFAULT gst_debug_osx_video_sink

/* ElementFactory information */
static const GstElementDetails gst_osx_video_sink_details =
GST_ELEMENT_DETAILS ("OSX Video sink",
    "Sink/Video",
    "OSX native videosink",
    "Zaheer Abbas Merali ");

/* Default template - initiated with class struct to allow gst-register to work
   without X running */
static GstStaticPadTemplate gst_osx_video_sink_sink_template_factory =
GST_STATIC_PAD_TEMPLATE ("sink",
    GST_PAD_SINK,
    GST_PAD_ALWAYS,
    GST_STATIC_CAPS ("video/x-raw-yuv, "
        "framerate = (fraction) [ 0, MAX ], "
        "width = (int) [ 1, MAX ], "
        "height = (int) [ 1, MAX ], "
#if G_BYTE_ORDER == G_BIG_ENDIAN
        "format = (fourcc) YUY2")
#else
        "format = (fourcc) UYVY")
#endif
    );

// much of the following cocoa NSApp code comes from libsdl and libcaca

@implementation NSApplication(Gst)

- (void) setRunning
{
  _running = 1;
}

@end

@implementation GstAppDelegate : NSObject

- (NSApplicationTerminateReply) applicationShouldTerminate: (NSApplication *) sender
{
  // destroy stuff here!
GST_DEBUG("Kill me please!"); return NSTerminateNow; } @end enum { ARG_0, ARG_EMBED, ARG_FULLSCREEN /* FILL ME */ }; static GstVideoSinkClass *parent_class = NULL; /* cocoa event loop - needed if not run in own app */ static void cocoa_event_loop (GstOSXVideoSink * vsink) { NSAutoreleasePool *pool; GST_DEBUG_OBJECT (vsink, "Entering event loop"); pool = [[NSAutoreleasePool alloc] init]; while ([NSApp isRunning]) { NSEvent *event = [NSApp nextEventMatchingMask:NSAnyEventMask untilDate:[NSDate distantPast] inMode:NSDefaultRunLoopMode dequeue:YES ]; if ( event == nil ) { g_usleep (2000); break; } else { switch ([event type]) { default: //XXX Feed me please [NSApp sendEvent:event]; break; } /* loop */ } } [pool release]; } static NSString * GetApplicationName(void) { NSDictionary *dict; NSString *appName = 0; /* Determine the application name */ dict = (NSDictionary *)CFBundleGetInfoDictionary(CFBundleGetMainBundle()); if (dict) appName = [dict objectForKey: @"CFBundleName"]; if (![appName length]) appName = [[NSProcessInfo processInfo] processName]; return appName; } static void CreateApplicationMenus(void) { NSString *appName; NSString *title; NSMenu *appleMenu; NSMenu *windowMenu; NSMenuItem *menuItem; /* Create the main menu bar */ [NSApp setMainMenu:[[NSMenu alloc] init]]; /* Create the application menu */ appName = GetApplicationName(); appleMenu = [[NSMenu alloc] initWithTitle:@""]; /* Add menu items */ title = [@"About " stringByAppendingString:appName]; [appleMenu addItemWithTitle:title action:@selector(orderFrontStandardAboutPanel:) keyEquivalent:@""]; [appleMenu addItem:[NSMenuItem separatorItem]]; title = [@"Hide " stringByAppendingString:appName]; [appleMenu addItemWithTitle:title action:@selector(hide:) keyEquivalent:@/*"h"*/""]; menuItem = (NSMenuItem *)[appleMenu addItemWithTitle:@"Hide Others" action:@selector(hideOtherApplications:) keyEquivalent:@/*"h"*/""]; [menuItem setKeyEquivalentModifierMask:(NSAlternateKeyMask|NSCommandKeyMask)]; [appleMenu 
addItemWithTitle:@"Show All" action:@selector(unhideAllApplications:) keyEquivalent:@""]; [appleMenu addItem:[NSMenuItem separatorItem]]; title = [@"Quit " stringByAppendingString:appName]; [appleMenu addItemWithTitle:title action:@selector(terminate:) keyEquivalent:@/*"q"*/""]; /* Put menu into the menubar */ menuItem = [[NSMenuItem alloc] initWithTitle:@"" action:nil keyEquivalent:@""]; [menuItem setSubmenu:appleMenu]; [[NSApp mainMenu] addItem:menuItem]; [menuItem release]; /* Tell the application object that this is now the application menu */ [NSApp setAppleMenu:appleMenu]; [appleMenu release]; /* Create the window menu */ windowMenu = [[NSMenu alloc] initWithTitle:@"Window"]; /* "Minimize" item */ menuItem = [[NSMenuItem alloc] initWithTitle:@"Minimize" action:@selector(performMiniaturize:) keyEquivalent:@/*"m"*/""]; [windowMenu addItem:menuItem]; [menuItem release]; /* Put menu into the menubar */ menuItem = [[NSMenuItem alloc] initWithTitle:@"Window" action:nil keyEquivalent:@""]; [menuItem setSubmenu:windowMenu]; [[NSApp mainMenu] addItem:menuItem]; [menuItem release]; /* Tell the application object that this is now the window menu */ [NSApp setWindowsMenu:windowMenu]; [windowMenu release]; } /* This function handles osx window creation */ static GstOSXWindow * gst_osx_video_sink_osxwindow_new (GstOSXVideoSink * osxvideosink, gint width, gint height) { NSRect rect; GstOSXWindow *osxwindow = NULL; g_return_val_if_fail (GST_IS_OSX_VIDEO_SINK (osxvideosink), NULL); GST_DEBUG_OBJECT (osxvideosink, "Creating new OSX window"); osxwindow = g_new0 (GstOSXWindow, 1); osxwindow->width = width; osxwindow->height = height; osxwindow->internal = TRUE; osxwindow->pool = [[NSAutoreleasePool alloc] init]; if (osxvideosink->embed == FALSE) { ProcessSerialNumber psn; unsigned int mask = NSTitledWindowMask | NSClosableWindowMask | NSResizableWindowMask | NSTexturedBackgroundWindowMask | NSMiniaturizableWindowMask; rect.origin.x = 100.0; rect.origin.y = 100.0; rect.size.width 
= (float) osxwindow->width; rect.size.height = (float) osxwindow->height; if (!GetCurrentProcess(&psn)) { TransformProcessType(&psn, kProcessTransformToForegroundApplication); SetFrontProcess(&psn); } [NSApplication sharedApplication]; osxwindow->win =[[GstOSXVideoSinkWindow alloc] initWithContentRect: rect styleMask: mask backing: NSBackingStoreBuffered defer: NO screen: nil]; GST_DEBUG("VideoSinkWindow created, %p", osxwindow->win); [osxwindow->win autorelease]; [NSApplication sharedApplication]; [osxwindow->win makeKeyAndOrderFront:NSApp]; osxwindow->gstview =[osxwindow->win gstView]; [osxwindow->gstview autorelease]; if (osxvideosink->fullscreen) [osxwindow->gstview setFullScreen:YES]; CreateApplicationMenus(); [NSApp finishLaunching]; [NSApp setDelegate:[[GstAppDelegate alloc] init]]; [NSApp setRunning]; g_static_rec_mutex_init (&osxvideosink->event_task_lock); osxvideosink->event_task = gst_task_create ((GstTaskFunction)cocoa_event_loop, osxvideosink); gst_task_set_lock (osxvideosink->event_task, &osxvideosink->event_task_lock); gst_task_start (osxvideosink->event_task); } else { GstStructure *s; GstMessage *msg; gchar * tmp; /* Needs to be embedded */ rect.origin.x = 0.0; rect.origin.y = 0.0; rect.size.width = (float) osxwindow->width; rect.size.height = (float) osxwindow->height; osxwindow->gstview =[[GstGLView alloc] initWithFrame:rect]; [osxwindow->gstview autorelease]; s = gst_structure_new ("have-ns-view", "nsview", G_TYPE_POINTER, osxwindow->gstview, nil); tmp = gst_structure_to_string (s); GST_DEBUG_OBJECT (osxvideosink, "Sending message %s (with view %p)", tmp, osxwindow->gstview); g_free (tmp); msg = gst_message_new_element (GST_OBJECT (osxvideosink), s); gst_element_post_message (GST_ELEMENT (osxvideosink), msg); GST_LOG_OBJECT (osxvideosink, "'have-ns-view' message sent"); } return osxwindow; } /* This function destroys a GstXWindow */ static void gst_osx_video_sink_osxwindow_destroy (GstOSXVideoSink * osxvideosink, GstOSXWindow * osxwindow) { 
g_return_if_fail (osxwindow != NULL); g_return_if_fail (GST_IS_OSX_VIDEO_SINK (osxvideosink)); [osxwindow->pool release]; if (osxvideosink->event_task) { gst_task_join (osxvideosink->event_task); gst_object_unref (osxvideosink->event_task); osxvideosink->event_task = NULL; g_static_rec_mutex_free (&osxvideosink->event_task_lock); } g_free (osxwindow); } /* This function resizes a GstXWindow */ static void gst_osx_video_sink_osxwindow_resize (GstOSXVideoSink * osxvideosink, GstOSXWindow * osxwindow, guint width, guint height) { NSAutoreleasePool *subPool = [[NSAutoreleasePool alloc] init]; g_return_if_fail (osxwindow != NULL); g_return_if_fail (GST_IS_OSX_VIDEO_SINK (osxvideosink)); osxwindow->width = width; osxwindow->height = height; GST_DEBUG_OBJECT (osxvideosink, "Resizing window to (%d,%d)", width, height); if (osxwindow->win) { /* Call relevant cocoa function to resize window */ NSSize size; size.width = width; size.height = height; NSLog(@"osxwindow->win = %@", osxwindow->win); GST_DEBUG_OBJECT (osxvideosink, "Calling setContentSize on %p", osxwindow->win); [osxwindow->win setContentSize:size]; } else { /* Directly resize the underlying view */ GST_DEBUG_OBJECT (osxvideosink, "Calling setVideoSize on %p", osxwindow->gstview); [osxwindow->gstview setVideoSize:width :height]; } [subPool release]; } static void gst_osx_video_sink_osxwindow_clear (GstOSXVideoSink * osxvideosink, GstOSXWindow * osxwindow) { g_return_if_fail (osxwindow != NULL); g_return_if_fail (GST_IS_OSX_VIDEO_SINK (osxvideosink)); } /* Element stuff */ static gboolean gst_osx_video_sink_setcaps (GstBaseSink * bsink, GstCaps * caps) { GstOSXVideoSink *osxvideosink; GstStructure *structure; gboolean res, result = FALSE; gint video_width, video_height; osxvideosink = GST_OSX_VIDEO_SINK (bsink); GST_DEBUG_OBJECT (osxvideosink, "caps: %" GST_PTR_FORMAT, caps); structure = gst_caps_get_structure (caps, 0); res = gst_structure_get_int (structure, "width", &video_width); res &= gst_structure_get_int 
(structure, "height", &video_height); if (!res) { goto beach; } GST_DEBUG_OBJECT (osxvideosink, "our format is: %dx%d video", video_width, video_height); GST_VIDEO_SINK_WIDTH (osxvideosink) = video_width; GST_VIDEO_SINK_HEIGHT (osxvideosink) = video_height; gst_osx_video_sink_osxwindow_resize (osxvideosink, osxvideosink->osxwindow, video_width, video_height); result = TRUE; beach: return result; } static GstStateChangeReturn gst_osx_video_sink_change_state (GstElement * element, GstStateChange transition) { GstOSXVideoSink *osxvideosink; osxvideosink = GST_OSX_VIDEO_SINK (element); GST_DEBUG_OBJECT (osxvideosink, "%s => %s", gst_element_state_get_name(GST_STATE_TRANSITION_CURRENT (transition)), gst_element_state_get_name(GST_STATE_TRANSITION_NEXT (transition))); switch (transition) { case GST_STATE_CHANGE_NULL_TO_READY: /* Creating our window and our image */ if (!osxvideosink->osxwindow) { GST_VIDEO_SINK_WIDTH (osxvideosink) = 320; GST_VIDEO_SINK_HEIGHT (osxvideosink) = 240; osxvideosink->osxwindow = gst_osx_video_sink_osxwindow_new (osxvideosink, GST_VIDEO_SINK_WIDTH (osxvideosink), GST_VIDEO_SINK_HEIGHT (osxvideosink)); gst_osx_video_sink_osxwindow_clear (osxvideosink, osxvideosink->osxwindow); } else { if (osxvideosink->osxwindow->internal) gst_osx_video_sink_osxwindow_resize (osxvideosink, osxvideosink->osxwindow, GST_VIDEO_SINK_WIDTH (osxvideosink), GST_VIDEO_SINK_HEIGHT (osxvideosink)); } break; case GST_STATE_CHANGE_READY_TO_PAUSED: GST_DEBUG ("ready to paused"); if (osxvideosink->osxwindow) gst_osx_video_sink_osxwindow_clear (osxvideosink, osxvideosink->osxwindow); osxvideosink->time = 0; break; case GST_STATE_CHANGE_PAUSED_TO_PLAYING: break; case GST_STATE_CHANGE_PLAYING_TO_PAUSED: break; case GST_STATE_CHANGE_PAUSED_TO_READY: osxvideosink->sw_scaling_failed = FALSE; GST_VIDEO_SINK_WIDTH (osxvideosink) = 0; GST_VIDEO_SINK_HEIGHT (osxvideosink) = 0; break; case GST_STATE_CHANGE_READY_TO_NULL: if (osxvideosink->osxwindow) { 
gst_osx_video_sink_osxwindow_destroy (osxvideosink, osxvideosink->osxwindow); osxvideosink->osxwindow = NULL; } break; } return (GST_ELEMENT_CLASS (parent_class))->change_state (element, transition); } static GstFlowReturn gst_osx_video_sink_show_frame (GstBaseSink * bsink, GstBuffer * buf) { GstOSXVideoSink *osxvideosink; char *viewdata; osxvideosink = GST_OSX_VIDEO_SINK (bsink); viewdata = [osxvideosink->osxwindow->gstview getTextureBuffer]; GST_DEBUG ("show_frame"); memcpy (viewdata, GST_BUFFER_DATA (buf), GST_BUFFER_SIZE (buf)); [osxvideosink->osxwindow->gstview displayTexture]; return GST_FLOW_OK; } /* Buffer management */ /* =========================================== */ /* */ /* Init & Class init */ /* */ /* =========================================== */ static void gst_osx_video_sink_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) { GstOSXVideoSink *osxvideosink; g_return_if_fail (GST_IS_OSX_VIDEO_SINK (object)); osxvideosink = GST_OSX_VIDEO_SINK (object); switch (prop_id) { case ARG_EMBED: osxvideosink->embed = g_value_get_boolean (value); break; case ARG_FULLSCREEN: osxvideosink->fullscreen = g_value_get_boolean (value); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } static void gst_osx_video_sink_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec) { GstOSXVideoSink *osxvideosink; g_return_if_fail (GST_IS_OSX_VIDEO_SINK (object)); osxvideosink = GST_OSX_VIDEO_SINK (object); switch (prop_id) { case ARG_EMBED: g_value_set_boolean (value, osxvideosink->embed); break; case ARG_FULLSCREEN: g_value_set_boolean (value, osxvideosink->fullscreen); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } static void gst_osx_video_sink_init (GstOSXVideoSink * osxvideosink) { osxvideosink->osxwindow = NULL; osxvideosink->pixel_width = osxvideosink->pixel_height = 1; osxvideosink->sw_scaling_failed = FALSE; osxvideosink->embed = 
FALSE; osxvideosink->fullscreen = FALSE; } static void gst_osx_video_sink_base_init (gpointer g_class) { GstElementClass *element_class = GST_ELEMENT_CLASS (g_class); gst_element_class_set_details (element_class, &gst_osx_video_sink_details); gst_element_class_add_pad_template (element_class, gst_static_pad_template_get (&gst_osx_video_sink_sink_template_factory)); } static void gst_osx_video_sink_class_init (GstOSXVideoSinkClass * klass) { GObjectClass *gobject_class; GstElementClass *gstelement_class; GstBaseSinkClass *gstbasesink_class; gobject_class = (GObjectClass *) klass; gstelement_class = (GstElementClass *) klass; gstbasesink_class = (GstBaseSinkClass *) klass; parent_class = g_type_class_ref (GST_TYPE_VIDEO_SINK); gobject_class->set_property = gst_osx_video_sink_set_property; gobject_class->get_property = gst_osx_video_sink_get_property; gstbasesink_class->set_caps = gst_osx_video_sink_setcaps; gstbasesink_class->preroll = gst_osx_video_sink_show_frame; gstbasesink_class->render = gst_osx_video_sink_show_frame; gstelement_class->change_state = gst_osx_video_sink_change_state; /** * GstOSXVideoSink:embed * * Set to #TRUE if you are embedding the video window in an application. * **/ g_object_class_install_property (gobject_class, ARG_EMBED, g_param_spec_boolean ("embed", "embed", "When enabled, it " "can be embedded", FALSE, G_PARAM_READWRITE)); /** * GstOSXVideoSink:fullscreen * * Set to #TRUE to have the video displayed in fullscreen. 
 **/
  g_object_class_install_property (gobject_class, ARG_FULLSCREEN,
      g_param_spec_boolean ("fullscreen", "fullscreen",
          "When enabled, the view is fullscreen", FALSE, G_PARAM_READWRITE));
}

/* ============================================================= */
/*                                                               */
/*                       Public Methods                          */
/*                                                               */
/* ============================================================= */

/* =========================================== */
/*                                             */
/*          Object typing & Creation           */
/*                                             */
/* =========================================== */

GType
gst_osx_video_sink_get_type (void)
{
  static GType osxvideosink_type = 0;

  if (!osxvideosink_type) {
    static const GTypeInfo osxvideosink_info = {
      sizeof (GstOSXVideoSinkClass),
      gst_osx_video_sink_base_init,
      NULL,
      (GClassInitFunc) gst_osx_video_sink_class_init,
      NULL,
      NULL,
      sizeof (GstOSXVideoSink),
      0,
      (GInstanceInitFunc) gst_osx_video_sink_init,
    };

    osxvideosink_type = g_type_register_static (GST_TYPE_VIDEO_SINK,
        "GstOSXVideoSink", &osxvideosink_info, 0);
  }

  return osxvideosink_type;
}

psimedia-master/gstprovider/gstelements/osxvideo/osxvideosrc.c

/*
 * GStreamer
 * Copyright 2007 Ole André Vadla Ravnås
 * Copyright 2007 Ali Sabil
 * Copyright 2008 Barracuda Networks
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 */

/**
 * SECTION:element-osxvideosrc
 *
 * osxvideosrc can be used to capture video from capture devices on OS X.
 *
 * Example launch line:
 *
 *   gst-launch osxvideosrc ! osxvideosink
 *
 * This pipeline shows the video captured from the default capture device.
 */

#ifdef HAVE_CONFIG_H
#  include <config.h>
#endif

#include <unistd.h> // for usleep

#include <gst/gst.h>
#include <gst/interfaces/propertyprobe.h>

#include "osxvideosrc.h"

/* for now, framerate is hard-coded */
#define FRAMERATE 30

// TODO: for completeness, write an _unlock function

/* QuickTime notes:

   EnterMovies
     initialize QT subsystem
     there is no deinit

   OpenDefaultComponent of type SeqGrabComponentType
     this gets a handle to a sequence grabber

   CloseComponent
     release the sequence grabber

   SGInitialize
     initialize the SG
     there is no deinit, simply close the component

   SGSetDataRef of seqGrabDontMakeMovie
     this is to disable file creation.  we only want frames

   SGNewChannel of VideoMediaType
     make a video capture channel

   QTNewGWorld
     specify format (e.g. k32ARGBPixelFormat)
     specify size

   LockPixels
     this makes it so the base address of the image doesn't "move".
     you can UnlockPixels also, if you care to.  CocoaSequenceGrabber
     locks (GetPortPixMap(gWorld)) for the entire session.  it also
     locks/unlocks the pixmaphandle
     [ PixMapHandle pixMapHandle = GetGWorldPixMap(gworld); ]
     during the moment where it extracts the frame from the gworld

   SGSetGWorld
     assign the gworld to the component
     pass GetMainDevice() as the last arg, which is just a formality?

   SGSetChannelBounds
     use this to set our desired capture size.  the camera might not
     actually capture at this size, but it will pick something close.

   SGSetChannelUsage of seqGrabRecord
     enable recording

   SGSetDataProc
     set callback handler

   SGPrepare
     prepares for recording.  this initializes the camera (the light
     should come on) so that when you call SGStartRecord you hit the
     ground running.  maybe we should call SGPrepare when READY->PAUSED
     happens?

   SGRelease
     unprepare the recording

   SGStartRecord
     kick off the recording

   SGStop
     stop recording

   SGGetChannelSampleDescription
     obtain the size the camera is actually capturing at

   DecompressSequenceBegin
     i'm pretty sure you have to use this to receive the raw frames.
     you can also use it to scale the image.  to scale, create a matrix
     from the source and desired sizes and pass the matrix to this
     function.
     *** deprecated: use DecompressSequenceBeginS instead

   CDSequenceEnd
     stop a decompress sequence

   DecompressSequenceFrameS
     use this to obtain a raw frame.  the result ends up in the gworld
     *** deprecated: use DecompressSequenceFrameWhen instead

   SGGetChannelDeviceList of sgDeviceListIncludeInputs
     obtain the list of devices for the video channel

   SGSetChannelDevice
     set the master device (DV, USB, etc) on the channel, by string name

   SGSetChannelDeviceInput
     set the sub device on the channel (iSight), by integer id

   device ids should be a concatenation of the above two values.
*/

GST_DEBUG_CATEGORY (gst_debug_osx_video_src);
#define GST_CAT_DEFAULT gst_debug_osx_video_src

/* Filter signals and args */
enum
{
  /* FILL ME */
  LAST_SIGNAL
};

enum
{
  ARG_0,
  ARG_DEVICE,
  ARG_DEVICE_NAME
};

static GstStaticPadTemplate src_template = GST_STATIC_PAD_TEMPLATE ("src",
    GST_PAD_SRC,
    GST_PAD_ALWAYS,
    GST_STATIC_CAPS ("video/x-raw-yuv, "
        "format = (fourcc) UYVY, "
        "width = (int) [ 1, MAX ], "
        "height = (int) [ 1, MAX ], "
        //"framerate = (fraction) 0/1")
        "framerate = (fraction) 30/1")
    );

static void gst_osx_video_src_init_interfaces (GType type);
static void gst_osx_video_src_type_add_device_property_probe_interface (
    GType type);

GST_BOILERPLATE_FULL (GstOSXVideoSrc, gst_osx_video_src, GstPushSrc,
    GST_TYPE_PUSH_SRC, gst_osx_video_src_init_interfaces);

static void gst_osx_video_src_dispose (GObject * object);
static void gst_osx_video_src_finalize (GstOSXVideoSrc * osx_video_src);
static void gst_osx_video_src_set_property (GObject * object, guint prop_id,
    const GValue * value, GParamSpec * pspec);
static void gst_osx_video_src_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec); static GstStateChangeReturn gst_osx_video_src_change_state ( GstElement * element, GstStateChange transition); static GstCaps * gst_osx_video_src_get_caps (GstBaseSrc * src); static gboolean gst_osx_video_src_set_caps (GstBaseSrc * src, GstCaps * caps); static gboolean gst_osx_video_src_start (GstBaseSrc * src); static gboolean gst_osx_video_src_stop (GstBaseSrc * src); static gboolean gst_osx_video_src_query (GstBaseSrc * bsrc, GstQuery * query); static GstFlowReturn gst_osx_video_src_create (GstPushSrc * src, GstBuffer ** buf); static void gst_osx_video_src_fixate (GstBaseSrc * bsrc, GstCaps * caps); static gboolean prepare_capture (GstOSXVideoSrc * self); /* \ = \\, : = \c */ static GString * escape_string (const GString * in) { GString * out; int n; out = g_string_sized_new (64); for (n = 0; n < (int) in->len; ++n) { if (in->str[n] == '\\') g_string_append (out, "\\\\"); else if (in->str[n] == ':') g_string_append (out, "\\:"); else g_string_append_c (out, in->str[n]); } return out; } /* \\ = \, \c = : */ static GString * unescape_string (const GString * in) { GString * out; int n; out = g_string_sized_new (64); for (n = 0; n < (int) in->len; ++n) { if (in->str[n] == '\\') { if (n + 1 < (int) in->len) { ++n; if (in->str[n] == '\\') g_string_append_c (out, '\\'); else if (in->str[n] == 'c') g_string_append_c (out, ':'); else { /* unknown code, we will eat the escape sequence */ } } else { /* string ends with backslash, we will eat it */ } } else g_string_append_c (out, in->str[n]); } return out; } static gchar * create_device_id (const gchar * sgname, int inputIndex) { GString * out; GString * name; GString * nameenc; gchar * ret; name = g_string_new (sgname); nameenc = escape_string (name); g_string_free (name, TRUE); if (inputIndex >= 0) { out = g_string_new (""); g_string_printf (out, "%s:%d", nameenc->str, inputIndex); } else { /* unspecified 
index */ out = g_string_new (nameenc->str); } g_string_free (nameenc, TRUE); ret = g_string_free (out, FALSE); return ret; } static gboolean parse_device_id (const gchar * id, gchar ** sgname, int * inputIndex) { gchar ** parts; int numparts; GString * p1; GString * out1; int out2; out2 = 0; parts = g_strsplit (id, ":", -1); numparts = 0; while (parts[numparts]) ++numparts; /* must be exactly 1 or 2 parts */ if (numparts < 1 || numparts > 2) { g_strfreev (parts); return FALSE; } p1 = g_string_new (parts[0]); out1 = unescape_string (p1); g_string_free (p1, TRUE); if (numparts >= 2) { errno = 0; out2 = strtol (parts[1], NULL, 10); if (out2 == 0 && (errno == ERANGE || errno == EINVAL)) { g_string_free (out1, TRUE); g_strfreev (parts); return FALSE; } } g_strfreev (parts); *sgname = g_string_free (out1, FALSE); *inputIndex = out2; return TRUE; } typedef struct { gchar * id; gchar * name; } video_device; static video_device * video_device_alloc () { video_device * dev; dev = g_malloc (sizeof (video_device)); dev->id = NULL; dev->name = NULL; return dev; } static void video_device_free (video_device * dev) { if (!dev) return; if (dev->id) g_free (dev->id); if (dev->name) g_free (dev->name); g_free (dev); } static void video_device_free_func (gpointer data, gpointer user_data) { video_device_free ((video_device *) data); } /* return a list of available devices. the default device (if any) will be * the first in the list. 
*/ static GList * device_list (GstOSXVideoSrc * src) { SeqGrabComponent component; SGChannel channel; SGDeviceList deviceList; SGDeviceName * deviceEntry; SGDeviceInputList inputList; SGDeviceInputName * inputEntry; ComponentResult err; int n, i; GList * list; video_device * dev, * default_dev; gchar sgname[256]; gchar friendly_name[256]; component = NULL; list = NULL; default_dev = NULL; if (src->video_chan) { /* if we already have a video channel allocated, use that */ GST_DEBUG_OBJECT (src, "reusing existing channel for device_list"); channel = src->video_chan; } else { /* otherwise, allocate a temporary one */ component = OpenDefaultComponent (SeqGrabComponentType, 0); if (!component) { err = paramErr; GST_ERROR_OBJECT (src, "OpenDefaultComponent failed. paramErr=%d", (int) err); goto end; } err = SGInitialize (component); if (err != noErr) { GST_ERROR_OBJECT (src, "SGInitialize returned %d", (int) err); goto end; } err = SGSetDataRef (component, 0, 0, seqGrabDontMakeMovie); if (err != noErr) { GST_ERROR_OBJECT (src, "SGSetDataRef returned %d", (int) err); goto end; } err = SGNewChannel (component, VideoMediaType, &channel); if (err != noErr) { GST_ERROR_OBJECT (src, "SGNewChannel returned %d", (int) err); goto end; } } err = SGGetChannelDeviceList (channel, sgDeviceListIncludeInputs, &deviceList); if (err != noErr) { GST_ERROR_OBJECT (src, "SGGetChannelDeviceList returned %d", (int) err); goto end; } for (n = 0; n < (*deviceList)->count; ++n) { deviceEntry = &(*deviceList)->entry[n]; if (deviceEntry->flags & sgDeviceNameFlagDeviceUnavailable) continue; p2cstrcpy (sgname, deviceEntry->name); inputList = deviceEntry->inputs; if (inputList && (*inputList)->count >= 1) { for (i = 0; i < (*inputList)->count; ++i) { inputEntry = &(*inputList)->entry[i]; p2cstrcpy (friendly_name, inputEntry->name); dev = video_device_alloc (); dev->id = create_device_id (sgname, i); if (!dev->id) { video_device_free (dev); i = -1; break; } dev->name = g_strdup (friendly_name); list = 
g_list_append (list, dev); /* if this is the default device, note it */ if (n == (*deviceList)->selectedIndex && i == (*inputList)->selectedIndex) { default_dev = dev; } } /* error */ if (i == -1) break; } else { /* ### can a device have no defined inputs? */ dev = video_device_alloc (); dev->id = create_device_id (sgname, -1); if (!dev->id) { video_device_free (dev); break; } dev->name = g_strdup (sgname); list = g_list_append (list, dev); /* if this is the default device, note it */ if (n == (*deviceList)->selectedIndex) { default_dev = dev; } } } /* move default device to the front */ if (default_dev) { list = g_list_remove (list, default_dev); list = g_list_prepend (list, default_dev); } end: if (!src->video_chan) { err = CloseComponent (component); if (err != noErr) GST_WARNING_OBJECT (src, "CloseComponent returned %d", (int) err); } return list; } static gboolean device_set_default (GstOSXVideoSrc * src) { GList * list; video_device * dev; gboolean ret; /* obtain the device list */ list = device_list (src); if (!list) return FALSE; ret = FALSE; /* the first item is the default */ if (g_list_length (list) >= 1) { dev = (video_device *) list->data; /* take the strings, no need to copy */ src->device_id = dev->id; src->device_name = dev->name; dev->id = NULL; dev->name = NULL; /* null out the item */ video_device_free (dev); list->data = NULL; ret = TRUE; } /* clean up */ g_list_foreach (list, video_device_free_func, NULL); g_list_free (list); return ret; } static gboolean device_get_name (GstOSXVideoSrc * src) { GList * l, * list; video_device * dev; gboolean ret; /* if there is no device set, then attempt to set up with the default, * which will also grab the name in the process. 
*/ if (!src->device_id) return device_set_default (src); /* if we already have a name, free it */ if (src->device_name) { g_free (src->device_name); src->device_name = NULL; } /* obtain the device list */ list = device_list (src); if (!list) return FALSE; ret = FALSE; /* look up the id */ for (l = list; l != NULL; l = l->next) { dev = (video_device *) l->data; if (g_str_equal (dev->id, src->device_id)) { /* take the string, no need to copy */ src->device_name = dev->name; dev->name = NULL; ret = TRUE; break; } } g_list_foreach (list, video_device_free_func, NULL); g_list_free (list); return ret; } static gboolean device_select (GstOSXVideoSrc * src) { Str63 pstr; ComponentResult err; gchar * sgname; int inputIndex; /* if there's no device id set, attempt to select default device */ if (!src->device_id && !device_set_default (src)) return FALSE; if (!parse_device_id (src->device_id, &sgname, &inputIndex)) { GST_ERROR_OBJECT (src, "unable to parse device id: [%s]", src->device_id); return FALSE; } c2pstrcpy (pstr, sgname); g_free (sgname); err = SGSetChannelDevice (src->video_chan, (StringPtr) &pstr); if (err != noErr) { GST_ERROR_OBJECT (src, "SGSetChannelDevice returned %d", (int) err); return FALSE; } err = SGSetChannelDeviceInput (src->video_chan, inputIndex); if (err != noErr) { GST_ERROR_OBJECT (src, "SGSetChannelDeviceInput returned %d", (int) err); return FALSE; } return TRUE; } static gboolean gst_osx_video_src_iface_supported (GstImplementsInterface * iface, GType iface_type) { return FALSE; } static void gst_osx_video_src_interface_init (GstImplementsInterfaceClass * klass) { /* default virtual functions */ klass->supported = gst_osx_video_src_iface_supported; } static void gst_osx_video_src_init_interfaces (GType type) { static const GInterfaceInfo implements_iface_info = { (GInterfaceInitFunc) gst_osx_video_src_interface_init, NULL, NULL, }; g_type_add_interface_static (type, GST_TYPE_IMPLEMENTS_INTERFACE, &implements_iface_info); 
gst_osx_video_src_type_add_device_property_probe_interface (type); } static void gst_osx_video_src_base_init (gpointer gclass) { static GstElementDetails element_details = { "Video Source (OSX)", "Source/Video", "Reads raw frames from a capture device on OS X", "Ole Andre Vadla Ravnaas , " "Ali Sabil " }; GstElementClass * element_class = GST_ELEMENT_CLASS (gclass); GST_DEBUG (G_STRFUNC); gst_element_class_add_pad_template (element_class, gst_static_pad_template_get (&src_template)); gst_element_class_set_details (element_class, &element_details); } static void gst_osx_video_src_class_init (GstOSXVideoSrcClass * klass) { GObjectClass * gobject_class; GstElementClass * element_class; GstBaseSrcClass * basesrc_class; GstPushSrcClass * pushsrc_class; OSErr err; GST_DEBUG (G_STRFUNC); gobject_class = G_OBJECT_CLASS (klass); element_class = GST_ELEMENT_CLASS (klass); basesrc_class = GST_BASE_SRC_CLASS (klass); pushsrc_class = GST_PUSH_SRC_CLASS (klass); gobject_class->dispose = gst_osx_video_src_dispose; gobject_class->finalize = (GObjectFinalizeFunc) gst_osx_video_src_finalize; gobject_class->set_property = GST_DEBUG_FUNCPTR (gst_osx_video_src_set_property); gobject_class->get_property = GST_DEBUG_FUNCPTR (gst_osx_video_src_get_property); element_class->change_state = gst_osx_video_src_change_state; basesrc_class->get_caps = gst_osx_video_src_get_caps; basesrc_class->set_caps = gst_osx_video_src_set_caps; basesrc_class->start = gst_osx_video_src_start; basesrc_class->stop = gst_osx_video_src_stop; basesrc_class->query = gst_osx_video_src_query; basesrc_class->fixate = gst_osx_video_src_fixate; pushsrc_class->create = gst_osx_video_src_create; g_object_class_install_property (gobject_class, ARG_DEVICE, g_param_spec_string ("device", "Device", "Sequence Grabber input device in format 'sgname:input#'", NULL, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, ARG_DEVICE_NAME, g_param_spec_string ("device-name", "Device name", 
"Human-readable name of the video device", NULL, G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); err = EnterMovies(); if (err == noErr) { klass->movies_enabled = TRUE; } else { klass->movies_enabled = FALSE; GST_ERROR ("EnterMovies returned %d", err); } } static void gst_osx_video_src_init (GstOSXVideoSrc * self, GstOSXVideoSrcClass * klass) { GST_DEBUG_OBJECT (self, G_STRFUNC); gst_base_src_set_format (GST_BASE_SRC (self), GST_FORMAT_TIME); gst_base_src_set_live (GST_BASE_SRC (self), TRUE); } static void gst_osx_video_src_dispose (GObject * object) { GstOSXVideoSrc * self = GST_OSX_VIDEO_SRC (object); GST_DEBUG_OBJECT (object, G_STRFUNC); if (self->device_id) { g_free (self->device_id); self->device_id = NULL; } if (self->device_name) { g_free (self->device_name); self->device_name = NULL; } if (self->buffer != NULL) { gst_buffer_unref (self->buffer); self->buffer = NULL; } G_OBJECT_CLASS (parent_class)->dispose (object); } static void gst_osx_video_src_finalize (GstOSXVideoSrc * self) { GST_DEBUG_OBJECT (self, G_STRFUNC); G_OBJECT_CLASS (parent_class)->finalize (G_OBJECT (self)); } static void gst_osx_video_src_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) { GstOSXVideoSrc * src = GST_OSX_VIDEO_SRC (object); switch (prop_id) { case ARG_DEVICE: if (src->device_id) { g_free (src->device_id); src->device_id = NULL; } if (src->device_name) { g_free (src->device_name); src->device_name = NULL; } src->device_id = g_strdup (g_value_get_string (value)); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } static void gst_osx_video_src_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec) { GstOSXVideoSrc * src = GST_OSX_VIDEO_SRC (object); switch (prop_id) { case ARG_DEVICE: if (!src->device_id) device_set_default (src); g_value_set_string (value, src->device_id); break; case ARG_DEVICE_NAME: if (!src->device_name) device_get_name (src); g_value_set_string (value, 
src->device_name); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } static GstCaps * gst_osx_video_src_get_caps (GstBaseSrc * src) { GstElementClass * gstelement_class; GstOSXVideoSrc * self; GstPadTemplate * pad_template; GstCaps * caps; GstStructure * structure; gint width, height; gstelement_class = GST_ELEMENT_GET_CLASS (src); self = GST_OSX_VIDEO_SRC (src); /* if we don't have the resolution set up, return template caps */ if (!self->world) return NULL; pad_template = gst_element_class_get_pad_template (gstelement_class, "src"); /* i don't think this can actually fail... */ if (!pad_template) return NULL; width = self->rect.right; height = self->rect.bottom; caps = gst_caps_copy (gst_pad_template_get_caps (pad_template)); structure = gst_caps_get_structure (caps, 0); gst_structure_set (structure, "width", G_TYPE_INT, width, NULL); gst_structure_set (structure, "height", G_TYPE_INT, height, NULL); return caps; } static gboolean gst_osx_video_src_set_caps (GstBaseSrc * src, GstCaps * caps) { GstOSXVideoSrc * self = GST_OSX_VIDEO_SRC (src); GstStructure * structure = gst_caps_get_structure (caps, 0); gint width, height, framerate_num, framerate_denom; float fps; ComponentResult err; GST_DEBUG_OBJECT (src, G_STRFUNC); if (!self->seq_grab) return FALSE; gst_structure_get_int (structure, "width", &width); gst_structure_get_int (structure, "height", &height); gst_structure_get_fraction (structure, "framerate", &framerate_num, &framerate_denom); fps = (float) framerate_num / framerate_denom; GST_DEBUG_OBJECT (src, "changing caps to %dx%d@%f", width, height, fps); if (self->world) { /* capture is already active. 
we currently don't allow dynamic changing * of caps, so make sure the caps match what we are already doing */ if (width == self->rect.right && height == self->rect.bottom && (int)fps == FRAMERATE) return TRUE; else return FALSE; } SetRect (&self->rect, 0, 0, width, height); err = QTNewGWorld (&self->world, k422YpCbCr8PixelFormat, &self->rect, 0, NULL, 0); if (err != noErr) { GST_ERROR_OBJECT (self, "QTNewGWorld returned %d", (int) err); goto fail; } if (!LockPixels (GetPortPixMap (self->world))) { GST_ERROR_OBJECT (self, "LockPixels failed"); goto fail; } err = SGSetGWorld (self->seq_grab, self->world, NULL); if (err != noErr) { GST_ERROR_OBJECT (self, "SGSetGWorld returned %d", (int) err); goto fail; } err = SGSetChannelBounds (self->video_chan, &self->rect); if (err != noErr) { GST_ERROR_OBJECT (self, "SGSetChannelBounds returned %d", (int) err); goto fail; } // ###: if we ever support choosing framerates, do something with this /*err = SGSetFrameRate (self->video_chan, FloatToFixed(fps)); if (err != noErr) { GST_ERROR_OBJECT (self, "SGSetFrameRate returned %d", (int) err); goto fail; }*/ return TRUE; fail: if (self->world) { SGSetGWorld (self->seq_grab, NULL, NULL); DisposeGWorld (self->world); self->world = NULL; } return FALSE; } static void gst_osx_video_src_fixate (GstBaseSrc * bsrc, GstCaps * caps) { GstStructure * structure; int i; /* this function is for choosing defaults as a last resort */ for (i = 0; i < (int) gst_caps_get_size (caps); ++i) { structure = gst_caps_get_structure (caps, i); gst_structure_fixate_field_nearest_int (structure, "width", 640); gst_structure_fixate_field_nearest_int (structure, "height", 480); // ###: if we ever support choosing framerates, do something with this //gst_structure_fixate_field_nearest_fraction (structure, "framerate", 15, 2); } } static gboolean gst_osx_video_src_start (GstBaseSrc * src) { GstOSXVideoSrc * self; GObjectClass * gobject_class; GstOSXVideoSrcClass * klass; ComponentResult err; self = 
GST_OSX_VIDEO_SRC (src); gobject_class = G_OBJECT_GET_CLASS (src); klass = GST_OSX_VIDEO_SRC_CLASS (gobject_class); GST_DEBUG_OBJECT (src, "entering"); if (!klass->movies_enabled) return FALSE; self->seq_num = 0; self->seq_grab = OpenDefaultComponent (SeqGrabComponentType, 0); if (self->seq_grab == NULL) { err = paramErr; GST_ERROR_OBJECT (self, "OpenDefaultComponent failed. paramErr=%d", (int) err); goto fail; } err = SGInitialize (self->seq_grab); if (err != noErr) { GST_ERROR_OBJECT (self, "SGInitialize returned %d", (int) err); goto fail; } err = SGSetDataRef (self->seq_grab, 0, 0, seqGrabDontMakeMovie); if (err != noErr) { GST_ERROR_OBJECT (self, "SGSetDataRef returned %d", (int) err); goto fail; } err = SGNewChannel (self->seq_grab, VideoMediaType, &self->video_chan); if (err != noErr) { GST_ERROR_OBJECT (self, "SGNewChannel returned %d", (int) err); goto fail; } if (!device_select (self)) goto fail; GST_DEBUG_OBJECT (self, "started"); return TRUE; fail: self->video_chan = NULL; if (self->seq_grab) { err = CloseComponent (self->seq_grab); if (err != noErr) GST_WARNING_OBJECT (self, "CloseComponent returned %d", (int) err); self->seq_grab = NULL; } return FALSE; } static gboolean gst_osx_video_src_stop (GstBaseSrc * src) { GstOSXVideoSrc * self; ComponentResult err; self = GST_OSX_VIDEO_SRC (src); GST_DEBUG_OBJECT (src, "stopping"); self->video_chan = NULL; err = CloseComponent (self->seq_grab); if (err != noErr) GST_WARNING_OBJECT (self, "CloseComponent returned %d", (int) err); self->seq_grab = NULL; DisposeGWorld (self->world); self->world = NULL; if (self->buffer != NULL) { gst_buffer_unref (self->buffer); self->buffer = NULL; } return TRUE; } static gboolean gst_osx_video_src_query (GstBaseSrc * bsrc, GstQuery * query) { GstOSXVideoSrc * self; gboolean res = FALSE; self = GST_OSX_VIDEO_SRC (bsrc); switch (GST_QUERY_TYPE (query)) { case GST_QUERY_LATENCY: { GstClockTime min_latency, max_latency; gint fps_n, fps_d; fps_n = FRAMERATE; fps_d = 1; /* min 
latency is the time to capture one frame */ min_latency = gst_util_uint64_scale_int (GST_SECOND, fps_d, fps_n); /* max latency is total duration of the frame buffer */ // FIXME: we don't know what this is, so we'll just say 2 frames max_latency = 2 * min_latency; GST_DEBUG_OBJECT (bsrc, "report latency min %" GST_TIME_FORMAT " max %" GST_TIME_FORMAT, GST_TIME_ARGS (min_latency), GST_TIME_ARGS (max_latency)); /* we are always live, the min latency is 1 frame and the max latency is * the complete buffer of frames. */ gst_query_set_latency (query, TRUE, min_latency, max_latency); res = TRUE; break; } default: res = GST_BASE_SRC_CLASS (parent_class)->query (bsrc, query); break; } return res; } static GstStateChangeReturn gst_osx_video_src_change_state (GstElement * element, GstStateChange transition) { GstStateChangeReturn result; GstOSXVideoSrc * self; ComponentResult err; result = GST_STATE_CHANGE_SUCCESS; self = GST_OSX_VIDEO_SRC (element); // ###: prepare_capture in READY->PAUSED? switch (transition) { case GST_STATE_CHANGE_PAUSED_TO_PLAYING: { ImageDescriptionHandle imageDesc; Rect sourceRect; MatrixRecord scaleMatrix; if (!prepare_capture(self)) return GST_STATE_CHANGE_FAILURE; // ###: should we start recording /after/ making the decompressionsequence? // CocoaSequenceGrabber does it beforehand, so we do too, but it feels // wrong. 
err = SGStartRecord (self->seq_grab); if (err != noErr) { /* since we prepare here, we should also unprepare */ SGRelease (self->seq_grab); GST_ERROR_OBJECT (self, "SGStartRecord returned %d", (int) err); return GST_STATE_CHANGE_FAILURE; } imageDesc = (ImageDescriptionHandle) NewHandle (0); err = SGGetChannelSampleDescription (self->video_chan, (Handle) imageDesc); if (err != noErr) { SGStop (self->seq_grab); SGRelease (self->seq_grab); DisposeHandle ((Handle) imageDesc); GST_ERROR_OBJECT (self, "SGGetChannelSampleDescription returned %d", (int) err); return GST_STATE_CHANGE_FAILURE; } GST_DEBUG_OBJECT (self, "actual capture resolution is %dx%d", (int) (**imageDesc).width, (int) (**imageDesc).height); SetRect (&sourceRect, 0, 0, (**imageDesc).width, (**imageDesc).height); RectMatrix(&scaleMatrix, &sourceRect, &self->rect); err = DecompressSequenceBegin (&self->dec_seq, imageDesc, self->world, NULL, NULL, &scaleMatrix, srcCopy, NULL, 0, codecNormalQuality, bestSpeedCodec); if (err != noErr) { SGStop (self->seq_grab); SGRelease (self->seq_grab); DisposeHandle ((Handle) imageDesc); GST_ERROR_OBJECT (self, "DecompressSequenceBegin returned %d", (int) err); return GST_STATE_CHANGE_FAILURE; } DisposeHandle ((Handle) imageDesc); break; } default: break; } result = GST_ELEMENT_CLASS (parent_class)->change_state (element, transition); if (result == GST_STATE_CHANGE_FAILURE) return result; switch (transition) { case GST_STATE_CHANGE_PAUSED_TO_READY: SGStop (self->seq_grab); err = CDSequenceEnd (self->dec_seq); if (err != noErr) GST_WARNING_OBJECT (self, "CDSequenceEnd returned %d", (int) err); self->dec_seq = 0; SGRelease (self->seq_grab); break; default: break; } return result; } static GstFlowReturn gst_osx_video_src_create (GstPushSrc * src, GstBuffer ** buf) { GstOSXVideoSrc * self = GST_OSX_VIDEO_SRC (src); ComponentResult err; GstCaps * caps; //GstClock * clock; // ###: we need to sleep between calls to SGIdle. 
  // originally, the sleeping
  // was done using gst_clock_id_wait(), but it turns out that approach
  // doesn't work well.  it has two issues:
  //   1) every so often, gst_clock_id_wait() will block for a much longer
  //      period of time than requested (upwards of a minute) causing video
  //      to freeze until it finally returns.  this seems to happen once
  //      every few minutes, which probably means something like 1 in every
  //      several hundred calls gst_clock_id_wait() does the wrong thing.
  //   2) even when the gst_clock approach is working properly, it uses
  //      quite a bit of cpu in comparison to a simple usleep().  on one
  //      test machine, using gst_clock_id_wait() caused osxvideosrc to use
  //      nearly 100% cpu, while using usleep() brought the usage to less
  //      than 10%.
  //
  // so, for now, we comment out the gst_clock stuff and use usleep.

  //clock = gst_system_clock_obtain ();

  do {
    err = SGIdle (self->seq_grab);
    if (err != noErr) {
      GST_ERROR_OBJECT (self, "SGIdle returned %d", (int) err);
      //gst_object_unref (clock);
      return GST_FLOW_UNEXPECTED;
    }

    if (self->buffer == NULL) {
      /*GstClockID clock_id;

      clock_id = gst_clock_new_single_shot_id (clock,
          (GstClockTime) (gst_clock_get_time (clock) +
          (GST_SECOND / ((float) FRAMERATE * 2))));
      gst_clock_id_wait (clock_id, NULL);
      gst_clock_id_unref (clock_id);*/

      usleep (1000000 / (FRAMERATE * 2));
    }
  } while (self->buffer == NULL);

  //gst_object_unref (clock);

  *buf = self->buffer;
  self->buffer = NULL;

  caps = gst_pad_get_caps (GST_BASE_SRC_PAD (src));
  gst_buffer_set_caps (*buf, caps);
  gst_caps_unref (caps);

  return GST_FLOW_OK;
}

static OSErr
data_proc (SGChannel c, Ptr p, long len, long * offset, long chRefCon,
    TimeValue time, short writeType, long refCon)
{
  GstOSXVideoSrc * self;
  gint fps_n, fps_d;
  GstClockTime duration, timestamp, latency;
  CodecFlags flags;
  ComponentResult err;
  PixMapHandle hPixMap;
  Rect portRect;
  int pix_rowBytes;
  void *pix_ptr;
  int pix_height;
  int pix_size;

  self = GST_OSX_VIDEO_SRC (refCon);

  if (self->buffer != NULL) {
    gst_buffer_unref (self->buffer);
    self->buffer = NULL;
  }

  err = DecompressSequenceFrameS (self->dec_seq, p, len, 0, &flags, NULL);
  if (err != noErr) {
    GST_ERROR_OBJECT (self, "DecompressSequenceFrameS returned %d", (int) err);
    return err;
  }

  hPixMap = GetGWorldPixMap (self->world);
  LockPixels (hPixMap);
  GetPortBounds (self->world, &portRect);
  pix_rowBytes = (int) GetPixRowBytes (hPixMap);
  pix_ptr = GetPixBaseAddr (hPixMap);
  pix_height = (portRect.bottom - portRect.top);
  pix_size = pix_rowBytes * pix_height;

  GST_DEBUG_OBJECT (self, "num=%5d, height=%d, rowBytes=%d, size=%d",
      self->seq_num, pix_height, pix_rowBytes, pix_size);

  fps_n = FRAMERATE;
  fps_d = 1;

  duration = gst_util_uint64_scale_int (GST_SECOND, fps_d, fps_n);
  latency = duration;

  timestamp = gst_clock_get_time (GST_ELEMENT_CAST (self)->clock);
  timestamp -= gst_element_get_base_time (GST_ELEMENT_CAST (self));
  if (timestamp > latency)
    timestamp -= latency;
  else
    timestamp = 0;

  self->buffer = gst_buffer_new_and_alloc (pix_size);
  GST_BUFFER_OFFSET (self->buffer) = self->seq_num;
  GST_BUFFER_TIMESTAMP (self->buffer) = timestamp;
  memcpy (GST_BUFFER_DATA (self->buffer), pix_ptr, pix_size);

  self->seq_num++;

  UnlockPixels (hPixMap);

  return noErr;
}

static gboolean
prepare_capture (GstOSXVideoSrc * self)
{
  ComponentResult err;

  err = SGSetChannelUsage (self->video_chan, seqGrabRecord);
  if (err != noErr) {
    GST_ERROR_OBJECT (self, "SGSetChannelUsage returned %d", (int) err);
    return FALSE;
  }

  err = SGSetDataProc (self->seq_grab, NewSGDataUPP (data_proc), (long) self);
  if (err != noErr) {
    GST_ERROR_OBJECT (self, "SGSetDataProc returned %d", (int) err);
    return FALSE;
  }

  err = SGPrepare (self->seq_grab, false, true);
  if (err != noErr) {
    GST_ERROR_OBJECT (self, "SGPrepare returned %d", (int) err);
    return FALSE;
  }

  return TRUE;
}

static const GList *
probe_get_properties (GstPropertyProbe * probe)
{
  GObjectClass * klass = G_OBJECT_GET_CLASS (probe);
  static GList * list = NULL;

  // ###: from gstalsadeviceprobe.c
  /* well, not perfect, but better than no
locking at all. * In the worst case we leak a list node, so who cares? */ GST_CLASS_LOCK (GST_OBJECT_CLASS (klass)); if (!list) { GParamSpec * pspec; pspec = g_object_class_find_property (klass, "device"); list = g_list_append (NULL, pspec); } GST_CLASS_UNLOCK (GST_OBJECT_CLASS (klass)); return list; } static void probe_probe_property (GstPropertyProbe * probe, guint prop_id, const GParamSpec * pspec) { /* we do nothing in here. the actual "probe" occurs in get_values(), * which is a common practice when not caching responses. */ if (!g_str_equal (pspec->name, "device")) { G_OBJECT_WARN_INVALID_PROPERTY_ID (probe, prop_id, pspec); } } static gboolean probe_needs_probe (GstPropertyProbe * probe, guint prop_id, const GParamSpec * pspec) { /* don't cache probed data */ return TRUE; } static GValueArray * probe_get_values (GstPropertyProbe * probe, guint prop_id, const GParamSpec * pspec) { GstOSXVideoSrc * src; GValueArray * array; GValue value = { 0, }; GList * l, * list; video_device * dev; if (!g_str_equal (pspec->name, "device")) { G_OBJECT_WARN_INVALID_PROPERTY_ID (probe, prop_id, pspec); return NULL; } src = GST_OSX_VIDEO_SRC (probe); list = device_list (src); if (list == NULL) { GST_LOG_OBJECT (probe, "No devices found"); return NULL; } array = g_value_array_new (g_list_length (list)); g_value_init (&value, G_TYPE_STRING); for (l = list; l != NULL; l = l->next) { dev = (video_device *) l->data; GST_LOG_OBJECT (probe, "Found device: %s", dev->id); g_value_take_string (&value, dev->id); dev->id = NULL; video_device_free (dev); l->data = NULL; g_value_array_append (array, &value); } g_value_unset (&value); g_list_free (list); return array; } static void gst_osx_video_src_property_probe_interface_init (GstPropertyProbeInterface * iface) { iface->get_properties = probe_get_properties; iface->probe_property = probe_probe_property; iface->needs_probe = probe_needs_probe; iface->get_values = probe_get_values; } void 
gst_osx_video_src_type_add_device_property_probe_interface (GType type) { static const GInterfaceInfo probe_iface_info = { (GInterfaceInitFunc) gst_osx_video_src_property_probe_interface_init, NULL, NULL, }; g_type_add_interface_static (type, GST_TYPE_PROPERTY_PROBE, &probe_iface_info); } psimedia-master/gstprovider/gstelements/osxvideo/osxvideosrc.h000066400000000000000000000042241220046403000254160ustar00rootroot00000000000000/* * GStreamer * Copyright 2007 Ole André Vadla RavnÃ¥s * Copyright 2007 Ali Sabil * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. 
 */

#ifndef __GST_OSX_VIDEO_SRC_H__
#define __GST_OSX_VIDEO_SRC_H__

#include <gst/gst.h>
#include <gst/base/gstpushsrc.h>
#include <QuickTime/QuickTime.h>

GST_DEBUG_CATEGORY_EXTERN (gst_debug_osx_video_src);

G_BEGIN_DECLS

/* #defines don't like whitespacey bits */
#define GST_TYPE_OSX_VIDEO_SRC \
  (gst_osx_video_src_get_type())
#define GST_OSX_VIDEO_SRC(obj) \
  (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_OSX_VIDEO_SRC,GstOSXVideoSrc))
#define GST_OSX_VIDEO_SRC_CLASS(klass) \
  (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_OSX_VIDEO_SRC,GstOSXVideoSrcClass))
#define GST_IS_OSX_VIDEO_SRC(obj) \
  (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_OSX_VIDEO_SRC))
#define GST_IS_OSX_VIDEO_SRC_CLASS(klass) \
  (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_OSX_VIDEO_SRC))

typedef struct _GstOSXVideoSrc GstOSXVideoSrc;
typedef struct _GstOSXVideoSrcClass GstOSXVideoSrcClass;

struct _GstOSXVideoSrc {
  GstPushSrc pushsrc;

  gchar * device_id;
  gchar * device_name;

  SeqGrabComponent seq_grab;
  SGChannel video_chan;

  GWorldPtr world;
  Rect rect;
  ImageSequence dec_seq;

  GstBuffer * buffer;
  guint seq_num;
};

struct _GstOSXVideoSrcClass {
  GstPushSrcClass parent_class;

  gboolean movies_enabled;
};

GType gst_osx_video_src_get_type (void);

G_END_DECLS

#endif /* __GST_OSX_VIDEO_SRC_H__ */
psimedia-master/gstprovider/gstelements/shared/
psimedia-master/gstprovider/gstelements/shared/config.h
#define VERSION "1.0.0"
#define GST_LICENSE "LGPL"
#define PACKAGE "N/A"
#define GST_PACKAGE_NAME "GStreamer Plugins (PsiMedia)"
#define GST_PACKAGE_ORIGIN "http://delta.affinix.com/psimedia/"
psimedia-master/gstprovider/gstelements/shared/directsound/
psimedia-master/gstprovider/gstelements/shared/directsound/directsound.pro
TEMPLATE = lib
CONFIG -= qt
CONFIG += plugin gstplugin
DESTDIR = $$PWD/../lib include(../shared.pri) include(../../directsound.pri) psimedia-master/gstprovider/gstelements/shared/liveadder/000077500000000000000000000000001220046403000242405ustar00rootroot00000000000000psimedia-master/gstprovider/gstelements/shared/liveadder/liveadder.pro000066400000000000000000000002031220046403000267140ustar00rootroot00000000000000TEMPLATE = lib CONFIG -= qt CONFIG += plugin gstplugin DESTDIR = $$PWD/../lib include(../shared.pri) include(../../liveadder.pri) psimedia-master/gstprovider/gstelements/shared/osxaudio/000077500000000000000000000000001220046403000241345ustar00rootroot00000000000000psimedia-master/gstprovider/gstelements/shared/osxaudio/osxaudio.pro000066400000000000000000000002021220046403000265030ustar00rootroot00000000000000TEMPLATE = lib CONFIG -= qt CONFIG += plugin gstplugin DESTDIR = $$PWD/../lib include(../shared.pri) include(../../osxaudio.pri) psimedia-master/gstprovider/gstelements/shared/osxvideo/000077500000000000000000000000001220046403000241415ustar00rootroot00000000000000psimedia-master/gstprovider/gstelements/shared/osxvideo/osxvideo.pro000066400000000000000000000002021220046403000265150ustar00rootroot00000000000000TEMPLATE = lib CONFIG -= qt CONFIG += plugin gstplugin DESTDIR = $$PWD/../lib include(../shared.pri) include(../../osxvideo.pri) psimedia-master/gstprovider/gstelements/shared/shared.pri000066400000000000000000000002241220046403000242610ustar00rootroot00000000000000*-g++:QMAKE_CFLAGS_WARN_ON = -Wall -Wdeclaration-after-statement #-Werror include(../../gstconf.pri) DEFINES += HAVE_CONFIG_H INCLUDEPATH += $$PWD psimedia-master/gstprovider/gstelements/shared/shared.pro000066400000000000000000000002051220046403000242660ustar00rootroot00000000000000TEMPLATE = subdirs SUBDIRS += videomaxrate liveadder speexdsp windows:SUBDIRS += directsound winks mac:SUBDIRS += osxaudio osxvideo 
psimedia-master/gstprovider/gstelements/shared/speexdsp/000077500000000000000000000000001220046403000241345ustar00rootroot00000000000000psimedia-master/gstprovider/gstelements/shared/speexdsp/speexdsp.pro000066400000000000000000000002021220046403000265030ustar00rootroot00000000000000TEMPLATE = lib CONFIG -= qt CONFIG += plugin gstplugin DESTDIR = $$PWD/../lib include(../shared.pri) include(../../speexdsp.pri) psimedia-master/gstprovider/gstelements/shared/videomaxrate/000077500000000000000000000000001220046403000247715ustar00rootroot00000000000000psimedia-master/gstprovider/gstelements/shared/videomaxrate/videomaxrate.pro000066400000000000000000000002061220046403000302010ustar00rootroot00000000000000TEMPLATE = lib CONFIG -= qt CONFIG += plugin gstplugin DESTDIR = $$PWD/../lib include(../shared.pri) include(../../videomaxrate.pri) psimedia-master/gstprovider/gstelements/shared/winks/000077500000000000000000000000001220046403000234345ustar00rootroot00000000000000psimedia-master/gstprovider/gstelements/shared/winks/winks.pro000066400000000000000000000001771220046403000253160ustar00rootroot00000000000000TEMPLATE = lib CONFIG -= qt CONFIG += plugin gstplugin DESTDIR = $$PWD/../lib include(../shared.pri) include(../../winks.pri) psimedia-master/gstprovider/gstelements/speexdsp.pri000066400000000000000000000004311220046403000234000ustar00rootroot00000000000000HEADERS += \ $$PWD/speexdsp/speexdsp.h \ $$PWD/speexdsp/speexechoprobe.h SOURCES += \ $$PWD/speexdsp/speexdsp.c \ $$PWD/speexdsp/speexechoprobe.c gstplugin:SOURCES += $$PWD/speexdsp/speexdspplugin.c !gstplugin:SOURCES += $$PWD/static/speexdsp_static.c LIBS *= \ -lspeexdsp psimedia-master/gstprovider/gstelements/speexdsp/000077500000000000000000000000001220046403000226665ustar00rootroot00000000000000psimedia-master/gstprovider/gstelements/speexdsp/Makefile.am000066400000000000000000000006111220046403000247200ustar00rootroot00000000000000plugin_LTLIBRARIES = libgstspeexdsp.la libgstspeexdsp_la_SOURCES = speexdsp.c 
speexechoprobe.c libgstspeexdsp_la_CFLAGS = $(GST_CFLAGS) $(GST_PLUGINS_BASE_CFLAGS) $(ERROR_CFLAGS) $(SPEEXDSP_CFLAGS) libgstspeexdsp_la_LIBADD = $(GST_LIBS) $(SPEEXDSP_LIBS) libgstspeexdsp_la_LDFLAGS = $(GST_PLUGIN_LDFLAGS) $(GST_BASE_LIBS) $(GST_PLUGINS_BASE_LIBS) noinst_HEADERS = speexdsp.h speexechoprobe.h psimedia-master/gstprovider/gstelements/speexdsp/speexdsp.c000066400000000000000000001360511220046403000246730ustar00rootroot00000000000000/* * Farsight Voice+Video library * * Copyright 2008 Collabora Ltd * Copyright 2008 Nokia Corporation * @author: Olivier Crete * Copyright 2009 Barracuda Networks, Inc * @author: Justin Karneges * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Lesser General Public * License as published by the Free Software Foundation; either * version 2.1 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Lesser General Public License for more details. 
* * You should have received a copy of the GNU Lesser General Public * License along with this library; if not, write to the Free Software * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA * */ #ifdef HAVE_CONFIG_H #include "config.h" #endif #include "speexdsp.h" #include #include typedef struct { FILE * fp; int offset; } FileLog; typedef struct { char * pfname; char * cfname; FileLog * playback; FileLog * capture; GstClockTime start; } PairLog; GStaticMutex pairlog_mutex; static PairLog * pairlog = NULL; #include #include extern GStaticMutex global_mutex; extern GstSpeexDSP * global_dsp; extern GstSpeexEchoProbe * global_probe; GST_DEBUG_CATEGORY (speex_dsp_debug); #define GST_CAT_DEFAULT (speex_dsp_debug) #define DEFAULT_LATENCY_TUNE (0) #define DEFAULT_AGC (FALSE) #define DEFAULT_AGC_INCREMENT (12) #define DEFAULT_AGC_DECREMENT (-40) #define DEFAULT_AGC_LEVEL (8000) #define DEFAULT_AGC_MAX_GAIN (30) #define DEFAULT_DENOISE (TRUE) #define DEFAULT_ECHO_SUPPRESS (-40) #define DEFAULT_ECHO_SUPPRESS_ACTIVE (-15) #define DEFAULT_NOISE_SUPPRESS (-15) static const GstElementDetails gst_speex_dsp_details = GST_ELEMENT_DETAILS ( "Voice processor", "Generic/Audio", "Preprepocesses voice with libspeexdsp", "Olivier Crete "); static GstStaticPadTemplate gst_speex_dsp_rec_sink_template = GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, GST_STATIC_CAPS ("audio/x-raw-int, " "rate = (int) [ 6000, 48000 ], " "channels = (int) [1, MAX], " "endianness = (int) BYTE_ORDER, " "signed = (boolean) TRUE, " "width = (int) 16, " "depth = (int) 16") ); static GstStaticPadTemplate gst_speex_dsp_rec_src_template = GST_STATIC_PAD_TEMPLATE ("src", GST_PAD_SRC, GST_PAD_ALWAYS, GST_STATIC_CAPS ("audio/x-raw-int, " "rate = (int) [ 6000, 48000 ], " "channels = (int) [1, MAX], " "endianness = (int) BYTE_ORDER, " "signed = (boolean) TRUE, " "width = (int) 16, " "depth = (int) 16") ); enum { /* FILL ME */ LAST_SIGNAL }; enum { PROP_0, PROP_PROBE, 
PROP_LATENCY_TUNE, PROP_AGC, PROP_AGC_INCREMENT, PROP_AGC_DECREMENT, PROP_AGC_LEVEL, PROP_AGC_MAX_GAIN, PROP_DENOISE, PROP_ECHO_SUPPRESS, PROP_ECHO_SUPPRESS_ACTIVE, PROP_NOISE_SUPPRESS }; GST_BOILERPLATE(GstSpeexDSP, gst_speex_dsp, GstElement, GST_TYPE_ELEMENT); static void gst_speex_dsp_finalize (GObject * object); static void gst_speex_dsp_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); static void gst_speex_dsp_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec); static GstStateChangeReturn gst_speex_dsp_change_state (GstElement * element, GstStateChange transition); static gboolean gst_speex_dsp_setcaps (GstPad * pad, GstCaps * caps); static GstCaps * gst_speex_dsp_getcaps (GstPad * pad); static GstFlowReturn gst_speex_dsp_rec_chain (GstPad * pad, GstBuffer * buffer); static gboolean gst_speex_dsp_rec_event (GstPad * pad, GstEvent * event); static const GstQueryType * gst_speex_dsp_query_type (GstPad * pad); static gboolean gst_speex_dsp_query (GstPad * pad, GstQuery * query); static void gst_speex_dsp_reset_locked (GstSpeexDSP * self); static void try_auto_attach (); static GstBuffer * try_echo_cancel (SpeexEchoState * echostate, const GstBuffer * recbuf, GstClockTime rec_adj, GstClockTime rec_base, GQueue * buffers, int rate, GstPad * srcpad, const GstCaps * outcaps, GstFlowReturn * res, GstSpeexDSP * obj); static FileLog * filelog_new (const char * fname) { FileLog * fl; FILE *fp; fp = fopen (fname, "wb"); if (!fp) return NULL; fl = (FileLog *)malloc (sizeof (FileLog)); fl->fp = fp; fl->offset = 0; return fl; } static void filelog_delete (FileLog * fl) { fclose (fl->fp); free (fl); } static void filelog_append (FileLog * fl, const unsigned char * buf, int offset, int size) { int n, pad, start, len; pad = 0; start = 0; if (offset < fl->offset) { pad = 0; start = fl->offset - offset; } else if (offset > fl->offset) { pad = offset - fl->offset; start = 0; } len = size - start; if (len <= 
0) return; for (n = 0; n < pad; ++n) fputc (0, fl->fp); if (fwrite (buf + start, len, 1, fl->fp) < 1) GST_DEBUG ("unable to write to log file"); //fflush (fl->fp); fl->offset += pad + len; } /*static void filelog_append_pad (FileLog * fl, int size) { int n; for (n = 0; n < size; ++n) fputc (0, fl->fp); }*/ static PairLog * pairlog_new (const char * pfname, const char * cfname) { PairLog * pl; pl = (PairLog *)malloc (sizeof (PairLog)); pl->pfname = strdup (pfname); pl->cfname = strdup (cfname); pl->playback = NULL; pl->capture = NULL; pl->start = GST_CLOCK_TIME_NONE; return pl; } static void pairlog_delete (PairLog * pl) { if (pl->playback) filelog_delete (pl->playback); if (pl->capture) filelog_delete (pl->capture); free (pl->pfname); free (pl->cfname); free (pl); } static void pairlog_append_playback (PairLog * pl, const unsigned char * buf, int offset, int size, GstClockTime time, int rate) { gint64 i; if (rate <= 0) { GST_DEBUG ("bad rate"); return; } if (!pl->playback) { pl->playback = filelog_new (pl->pfname); if (!pl->playback) { GST_DEBUG ("unable to create playback log '%s'", pl->pfname); return; } GST_DEBUG ("started playback log at %"GST_TIME_FORMAT, GST_TIME_ARGS (time)); if (pl->capture) pl->start = time; } if (pl->start == GST_CLOCK_TIME_NONE) return; i = (((gint64)time - (gint64)pl->start) * rate / GST_SECOND) * 2; offset = (int)i; GST_LOG ("start=%"GST_TIME_FORMAT", time=%"GST_TIME_FORMAT", offset=%d", GST_TIME_ARGS (pl->start), GST_TIME_ARGS (time), offset); if (offset < 0) return; filelog_append (pl->playback, buf, offset, size); } static void pairlog_append_capture (PairLog * pl, const unsigned char * buf, int offset, int size, GstClockTime time, int rate) { gint64 i; if (rate <= 0) { GST_DEBUG ("bad rate"); return; } if (!pl->capture) { pl->capture = filelog_new (pl->cfname); if (!pl->capture) { GST_DEBUG ("unable to create capture log '%s'", pl->cfname); return; } GST_DEBUG ("started capture log at %"GST_TIME_FORMAT, GST_TIME_ARGS (time)); if 
(pl->playback) pl->start = time; } if (pl->start == GST_CLOCK_TIME_NONE) return; i = (((gint64)time - (gint64)pl->start) * rate / GST_SECOND) * 2; offset = (int)i; GST_LOG ("start=%"GST_TIME_FORMAT", time=%"GST_TIME_FORMAT", offset=%d", GST_TIME_ARGS (pl->start), GST_TIME_ARGS (time), offset); if (offset < 0) return; filelog_append (pl->capture, buf, offset, size); } static gboolean have_env (const char *var) { const char *val = g_getenv (var); if (val && strcmp (val, "1") == 0) return TRUE; else return FALSE; } // within a hundredth of a millisecond static gboolean near_enough_to (GstClockTime a, GstClockTime b) { GstClockTime dist = GST_MSECOND / 100 / 2; if (b >= a - dist && b <= a + dist) return TRUE; else return FALSE; } static void adapter_push_at (GstAdapter * adapter, GstBuffer * buffer, int offset) { int size; int n, pad, start, len; int end; GstBuffer * newbuf; char * p; pad = 0; start = 0; size = GST_BUFFER_SIZE (buffer); end = gst_adapter_available (adapter); if (offset < end) { pad = 0; start = end - offset; } else if (offset > end) { pad = offset - end; start = 0; } len = size - start; if (len <= 0) return; newbuf = gst_buffer_new_and_alloc (pad + len); p = (char *)GST_BUFFER_DATA (newbuf); for (n = 0; n < pad; ++n) *(p++) = 0; memcpy (p, GST_BUFFER_DATA (buffer) + start, len); gst_adapter_push (adapter, newbuf); gst_buffer_unref (buffer); } // ----- static void gst_speex_dsp_base_init (gpointer klass) { GST_DEBUG_CATEGORY_INIT (speex_dsp_debug, "speexdsp", 0, "libspeexdsp wrapping elements"); } static void gst_speex_dsp_class_init (GstSpeexDSPClass * klass) { GObjectClass * gobject_class; GstElementClass * gstelement_class; gobject_class = (GObjectClass *) klass; gobject_class->finalize = gst_speex_dsp_finalize; gobject_class->set_property = gst_speex_dsp_set_property; gobject_class->get_property = gst_speex_dsp_get_property; gstelement_class = (GstElementClass *) klass; gst_element_class_add_pad_template (gstelement_class, 
gst_static_pad_template_get (&gst_speex_dsp_rec_src_template)); gst_element_class_add_pad_template (gstelement_class, gst_static_pad_template_get (&gst_speex_dsp_rec_sink_template)); gst_element_class_set_details (gstelement_class, &gst_speex_dsp_details); gstelement_class->change_state = gst_speex_dsp_change_state; parent_class = g_type_class_peek_parent (klass); g_object_class_install_property (gobject_class, PROP_PROBE, g_param_spec_object ("probe", "A probe that gathers the buffers to do echo cancellation on", "This is a link to the probe that gets buffers to cancel the echo" " against", GST_TYPE_SPEEX_ECHO_PROBE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_LATENCY_TUNE, g_param_spec_int ("latency-tune", "Add/remove latency", "Use this to tune the latency value, in milliseconds, in case it is" " detected incorrectly", G_MININT, G_MAXINT, DEFAULT_LATENCY_TUNE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_AGC, g_param_spec_boolean ("agc", "Automatic Gain Control state", "Enable or disable automatic Gain Control state", DEFAULT_AGC, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_AGC_INCREMENT, g_param_spec_int ("agc-increment", "Maximal gain increase in dB/second", "Maximal gain increase in dB/second", G_MININT, G_MAXINT, DEFAULT_AGC_INCREMENT, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_AGC_DECREMENT, g_param_spec_int ("agc-decrement", "Maximal gain increase in dB/second", "Maximal gain increase in dB/second", G_MININT, G_MAXINT, DEFAULT_AGC_DECREMENT, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_AGC_LEVEL, g_param_spec_float ("agc-level", "Automatic Gain Control level", "Automatic Gain Control level", -G_MAXFLOAT, G_MAXFLOAT, DEFAULT_AGC_LEVEL, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); 
g_object_class_install_property (gobject_class, PROP_AGC_MAX_GAIN, g_param_spec_int ("agc-max-gain", "Maximal gain in dB", "Maximal gain in dB", G_MININT, G_MAXINT, DEFAULT_AGC_MAX_GAIN, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_DENOISE, g_param_spec_boolean ("denoise", "Denoiser state", "Enable or disable denoiser state", DEFAULT_DENOISE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_ECHO_SUPPRESS, g_param_spec_int ("echo-suppress", "Maximum attenuation of the residual echo in dB", "Maximum attenuation of the residual echo in dB (negative number)", G_MININT, 0, DEFAULT_ECHO_SUPPRESS, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_ECHO_SUPPRESS_ACTIVE, g_param_spec_int ("echo-suppress-active", "Maximum attenuation of the residual echo in dB" " when near end is active", "Maximum attenuation of the residual echo in dB" " when near end is active (negative number)", G_MININT, 0, DEFAULT_ECHO_SUPPRESS_ACTIVE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_NOISE_SUPPRESS, g_param_spec_int ("noise-suppress", "Maximum attenuation of the noise in dB", "Maximum attenuation of the noise in dB (negative number)", G_MININT, 0, DEFAULT_NOISE_SUPPRESS, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); } static void gst_speex_dsp_init (GstSpeexDSP * self, GstSpeexDSPClass *klass) { GstPadTemplate * template; template = gst_static_pad_template_get (&gst_speex_dsp_rec_src_template); self->rec_srcpad = gst_pad_new_from_template (template, "src"); gst_object_unref (template); gst_pad_set_getcaps_function (self->rec_srcpad, GST_DEBUG_FUNCPTR (gst_speex_dsp_getcaps)); gst_pad_set_event_function (self->rec_srcpad, GST_DEBUG_FUNCPTR (gst_speex_dsp_rec_event)); gst_pad_set_query_function (self->rec_srcpad, GST_DEBUG_FUNCPTR (gst_speex_dsp_query)); gst_pad_set_query_type_function 
(self->rec_srcpad, GST_DEBUG_FUNCPTR (gst_speex_dsp_query_type)); gst_element_add_pad (GST_ELEMENT (self), self->rec_srcpad); template = gst_static_pad_template_get (&gst_speex_dsp_rec_sink_template); self->rec_sinkpad = gst_pad_new_from_template (template, "sink"); gst_object_unref (template); gst_pad_set_chain_function (self->rec_sinkpad, GST_DEBUG_FUNCPTR (gst_speex_dsp_rec_chain)); gst_pad_set_getcaps_function (self->rec_sinkpad, GST_DEBUG_FUNCPTR (gst_speex_dsp_getcaps)); gst_pad_set_setcaps_function (self->rec_sinkpad, GST_DEBUG_FUNCPTR (gst_speex_dsp_setcaps)); gst_pad_set_event_function (self->rec_sinkpad, GST_DEBUG_FUNCPTR (gst_speex_dsp_rec_event)); gst_element_add_pad (GST_ELEMENT (self), self->rec_sinkpad); self->channels = 1; self->frame_size_ms = 20; self->filter_length_ms = 200; self->rec_adapter = gst_adapter_new (); self->rec_time = GST_CLOCK_TIME_NONE; self->rec_offset = GST_BUFFER_OFFSET_NONE; self->probe = NULL; self->latency_tune = DEFAULT_LATENCY_TUNE; self->agc = DEFAULT_AGC; self->agc_increment = DEFAULT_AGC_INCREMENT; self->agc_decrement = DEFAULT_AGC_DECREMENT; self->agc_level = DEFAULT_AGC_LEVEL; self->agc_max_gain = DEFAULT_AGC_MAX_GAIN; self->denoise = DEFAULT_DENOISE; self->echo_suppress = DEFAULT_ECHO_SUPPRESS; self->echo_suppress_active = DEFAULT_ECHO_SUPPRESS_ACTIVE; self->noise_suppress = DEFAULT_NOISE_SUPPRESS; self->buffers = g_queue_new(); g_static_mutex_lock (&pairlog_mutex); if (!pairlog && have_env("SPEEXDSP_LOG")) pairlog = pairlog_new ("gst_play.raw", "gst_rec.raw"); g_static_mutex_unlock (&pairlog_mutex); g_static_mutex_lock (&global_mutex); if (!global_dsp) { global_dsp = self; try_auto_attach (); } g_static_mutex_unlock (&global_mutex); } static void gst_speex_dsp_finalize (GObject * object) { GstSpeexDSP * self = GST_SPEEX_DSP (object); g_static_mutex_lock (&global_mutex); if (global_dsp && global_dsp == self) { if (global_probe && global_probe == self->probe) GST_DEBUG_OBJECT (self, "speexdsp detaching from globally 
discovered speexechoprobe"); global_dsp = NULL; } g_static_mutex_unlock (&global_mutex); if (self->probe) { GST_OBJECT_LOCK (self->probe); self->probe->dsp = NULL; GST_OBJECT_UNLOCK (self->probe); g_object_unref (self->probe); self->probe = NULL; } g_queue_foreach (self->buffers, (GFunc) gst_mini_object_unref, NULL); g_queue_free (self->buffers); if (self->preprocstate) speex_preprocess_state_destroy (self->preprocstate); if (self->echostate) speex_echo_state_destroy (self->echostate); g_object_unref (self->rec_adapter); g_static_mutex_lock (&pairlog_mutex); if (pairlog) { pairlog_delete (pairlog); pairlog = NULL; } g_static_mutex_unlock (&pairlog_mutex); G_OBJECT_CLASS (parent_class)->finalize (object); } static void gst_speex_dsp_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) { GstSpeexDSP * self = GST_SPEEX_DSP (object); GST_OBJECT_LOCK (self); switch (prop_id) { case PROP_PROBE: if (G_LIKELY (g_value_get_object (value) != self->probe)) { if (self->probe) gst_speex_dsp_detach (self); if (g_value_get_object (value)) gst_speex_dsp_attach (self, g_value_get_object (value)); } break; case PROP_LATENCY_TUNE: self->latency_tune = g_value_get_int (value); break; case PROP_AGC: self->agc = g_value_get_boolean (value); if (self->preprocstate) speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_SET_AGC, &self->agc); break; case PROP_AGC_INCREMENT: self->agc_increment = g_value_get_int (value); if (self->preprocstate) speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_SET_AGC_INCREMENT, &self->agc_increment); break; case PROP_AGC_DECREMENT: self->agc_decrement = g_value_get_int (value); if (self->preprocstate) speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_SET_AGC_DECREMENT, &self->agc_decrement); break; case PROP_AGC_LEVEL: self->agc_level = g_value_get_float (value); if (self->preprocstate) speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_SET_AGC_LEVEL, &self->agc_level); break; case 
PROP_AGC_MAX_GAIN: self->agc_max_gain = g_value_get_int (value); if (self->preprocstate) speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_SET_AGC_MAX_GAIN, &self->agc_max_gain); break; case PROP_DENOISE: self->denoise = g_value_get_boolean (value); if (self->preprocstate) speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_SET_DENOISE, &self->denoise); break; case PROP_ECHO_SUPPRESS: self->echo_suppress = g_value_get_int (value); if (self->preprocstate) speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_SET_ECHO_SUPPRESS, &self->echo_suppress); break; case PROP_ECHO_SUPPRESS_ACTIVE: self->echo_suppress_active= g_value_get_int (value); if (self->preprocstate) speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_SET_ECHO_SUPPRESS_ACTIVE, &self->echo_suppress_active); break; case PROP_NOISE_SUPPRESS: self->noise_suppress = g_value_get_int (value); if (self->preprocstate) speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_SET_NOISE_SUPPRESS, &self->noise_suppress); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } GST_OBJECT_UNLOCK (self); } static void gst_speex_dsp_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec) { GstSpeexDSP * self = GST_SPEEX_DSP (object); GST_OBJECT_LOCK (self); switch (prop_id) { case PROP_PROBE: g_value_set_object (value, self->probe); break; case PROP_LATENCY_TUNE: g_value_set_int (value, self->latency_tune); break; case PROP_AGC: if (self->preprocstate) speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_GET_AGC, &self->agc); g_value_set_boolean (value, self->agc); break; case PROP_AGC_INCREMENT: if (self->preprocstate) speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_GET_AGC_INCREMENT, &self->agc_increment); g_value_set_int (value, self->agc_increment); break; case PROP_AGC_DECREMENT: if (self->preprocstate) speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_GET_AGC_DECREMENT, &self->agc_decrement); 
g_value_set_int (value, self->agc_decrement); break; case PROP_AGC_LEVEL: if (self->preprocstate) speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_GET_AGC_LEVEL, &self->agc_level); g_value_set_float (value, self->agc_level); break; case PROP_AGC_MAX_GAIN: if (self->preprocstate) speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_GET_AGC_MAX_GAIN, &self->agc_max_gain); g_value_set_int (value, self->agc_max_gain); break; case PROP_DENOISE: if (self->preprocstate) speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_GET_DENOISE, &self->denoise); g_value_set_boolean (value, self->denoise); break; case PROP_ECHO_SUPPRESS: if (self->preprocstate) speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_GET_ECHO_SUPPRESS, &self->echo_suppress); g_value_set_int (value, self->echo_suppress); break; case PROP_ECHO_SUPPRESS_ACTIVE: if (self->preprocstate) speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_GET_ECHO_SUPPRESS_ACTIVE, &self->echo_suppress_active); g_value_set_int (value, self->echo_suppress_active); break; case PROP_NOISE_SUPPRESS: if (self->preprocstate) speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_GET_NOISE_SUPPRESS, &self->noise_suppress); g_value_set_int (value, self->noise_suppress); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } GST_OBJECT_UNLOCK (self); } /* we can only accept caps that we and downstream can handle. 
*/ static GstCaps * gst_speex_dsp_getcaps (GstPad * pad) { GstSpeexDSP * self; GstCaps * result, * peercaps, * tmpcaps; self = GST_SPEEX_DSP (gst_pad_get_parent (pad)); result = gst_caps_copy (gst_pad_get_pad_template_caps (pad)); if (self->echostate != NULL) { GST_OBJECT_LOCK (self); gst_caps_set_simple (result, "rate", G_TYPE_INT, self->rate, "channels", G_TYPE_INT, self->channels, NULL); GST_OBJECT_UNLOCK (self); goto out; } GST_OBJECT_LOCK (self); if (self->probe) { GST_OBJECT_LOCK (self->probe); if (self->probe->rate) gst_caps_set_simple (result, "rate", G_TYPE_INT, self->probe->rate, NULL); GST_OBJECT_UNLOCK (self->probe); } GST_OBJECT_UNLOCK (self); if (pad == self->rec_sinkpad) { peercaps = gst_pad_peer_get_caps (self->rec_srcpad); if (peercaps) { tmpcaps = result; result = gst_caps_intersect (result, peercaps); gst_caps_unref (tmpcaps); gst_caps_unref (peercaps); } } else if (pad == self->rec_srcpad) { peercaps = gst_pad_peer_get_caps (self->rec_sinkpad); if (peercaps) { tmpcaps = result; result = gst_caps_intersect (result, peercaps); gst_caps_unref (tmpcaps); gst_caps_unref (peercaps); } } out: gst_object_unref (self); return result; } static gboolean gst_speex_dsp_setcaps (GstPad * pad, GstCaps * caps) { GstSpeexDSP * self; GstStructure * structure; gint rate; gint channels = 1; gboolean ret = TRUE; self = GST_SPEEX_DSP (gst_pad_get_parent (pad)); GST_LOG_OBJECT (self, "setting caps on pad %p,%s to %" GST_PTR_FORMAT, pad, GST_PAD_NAME (pad), caps); structure = gst_caps_get_structure (caps, 0); if (!gst_structure_get_int (structure, "rate", &rate)) { GST_WARNING_OBJECT (self, "Tried to set caps without a rate"); gst_object_unref (self); return FALSE; } gst_structure_get_int (structure, "channels", &channels); GST_OBJECT_LOCK (self); if (self->echostate) { if (self->rate != rate) { GST_WARNING_OBJECT (self, "Wrong rate, got %d, expected %d", rate, self->rate); ret = FALSE; } if (self->channels != channels) { GST_WARNING_OBJECT (self, "Wrong channel count, 
got %d, expected %d", channels, self->channels); ret = FALSE; } goto done; } if (self->probe) { GST_OBJECT_LOCK (self->probe); if (self->probe->rate) { if (self->probe->rate != rate) { GST_WARNING_OBJECT (self, "Wrong rate, probe has %d, we have %d", self->probe->rate, rate); ret = FALSE; } else { self->probe->rate = rate; } } GST_OBJECT_UNLOCK (self->probe); if (!ret) goto done; } self->rate = rate; /*if (self->probe)*/ { guint probe_channels = 1; guint frame_size, filter_length; frame_size = rate * self->frame_size_ms / 1000; filter_length = rate * self->filter_length_ms / 1000; if (self->probe) { GST_OBJECT_LOCK (self->probe); probe_channels = self->probe->channels; GST_OBJECT_UNLOCK (self->probe); } // FIXME: if this is -1, then probe caps aren't set yet. there should // be a better solution besides forcing this to 1 if (probe_channels == -1) probe_channels = 1; if (self->channels == 1 && probe_channels == 1) { GST_DEBUG_OBJECT (self, "speex_echo_state_init (%d, %d)", frame_size, filter_length); self->echostate = speex_echo_state_init (frame_size, filter_length); } else { GST_DEBUG_OBJECT (self, "speex_echo_state_init_mc (%d, %d, %d, %d)", frame_size, filter_length, self->channels, probe_channels); self->echostate = speex_echo_state_init_mc (frame_size, filter_length, self->channels, probe_channels); } } self->preprocstate = speex_preprocess_state_init ( rate * self->frame_size_ms / 1000, rate); if (self->echostate) { speex_echo_ctl (self->echostate, SPEEX_ECHO_SET_SAMPLING_RATE, &rate); speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_SET_ECHO_STATE, self->echostate); } speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_SET_AGC, &self->agc); speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_SET_AGC_INCREMENT, &self->agc_increment); speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_SET_AGC_DECREMENT, &self->agc_decrement); speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_SET_AGC_LEVEL, &self->agc_level); 
speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_SET_AGC_MAX_GAIN, &self->agc_max_gain); speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_SET_DENOISE, &self->denoise); speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_SET_ECHO_SUPPRESS, &self->echo_suppress); speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_SET_ECHO_SUPPRESS_ACTIVE, &self->echo_suppress_active); speex_preprocess_ctl (self->preprocstate, SPEEX_PREPROCESS_SET_NOISE_SUPPRESS, &self->noise_suppress); done: GST_OBJECT_UNLOCK (self); gst_object_unref (self); return ret; } static gboolean gst_speex_dsp_rec_event (GstPad * pad, GstEvent * event) { GstSpeexDSP * self = GST_SPEEX_DSP (gst_pad_get_parent (pad)); gboolean res = FALSE; switch (GST_EVENT_TYPE (event)) { case GST_EVENT_FLUSH_STOP: /* synchronized */ gst_adapter_clear (self->rec_adapter); self->rec_offset = 0; self->rec_time = GST_CLOCK_TIME_NONE; gst_segment_init (&self->rec_segment, GST_FORMAT_UNDEFINED); g_queue_foreach (self->buffers, (GFunc) gst_mini_object_unref, NULL); g_queue_clear (self->buffers); GST_OBJECT_LOCK (self); gst_speex_dsp_reset_locked (self); GST_OBJECT_UNLOCK (self); break; case GST_EVENT_NEWSEGMENT: /* synchronized */ { gboolean update; gdouble rate; gdouble applied_rate; GstFormat format; gint64 start; gint64 stop; gint64 position; gst_event_parse_new_segment_full (event, &update, &rate, &applied_rate, &format, &start, &stop, &position); if (rate != 1.0 || applied_rate != 1.0) { GST_ERROR_OBJECT (self, "Only a rate of 1.0 is allowed"); goto out; } if (format != GST_FORMAT_TIME) { GST_ERROR_OBJECT (self, "Only times segments are allowed"); goto out; } gst_segment_set_newsegment_full (&self->rec_segment, update, rate, applied_rate, format, start, stop, position); } default: break; } if (pad == self->rec_sinkpad) res = gst_pad_push_event (self->rec_srcpad, event); else res = gst_pad_push_event (self->rec_sinkpad, event); out: gst_object_unref (self); return res; } static const 
GstQueryType * gst_speex_dsp_query_type (GstPad * pad) { static const GstQueryType types[] = { GST_QUERY_LATENCY, 0 }; return types; } static gboolean gst_speex_dsp_query (GstPad * pad, GstQuery * query) { GstSpeexDSP * self = GST_SPEEX_DSP (gst_pad_get_parent (pad)); gboolean res = TRUE; switch (GST_QUERY_TYPE (query)) { case GST_QUERY_LATENCY: { GstClockTime min, max; gboolean live; guint64 latency; GstPad * peer; if ((peer = gst_pad_get_peer (self->rec_sinkpad))) { if ((res = gst_pad_query (peer, query))) { gst_query_parse_latency (query, &live, &min, &max); GST_DEBUG_OBJECT (self, "Peer latency: min %" GST_TIME_FORMAT " max %" GST_TIME_FORMAT, GST_TIME_ARGS (min), GST_TIME_ARGS (max)); /* add our own latency */ latency = ((guint64)self->frame_size_ms) * 1000000; /* to nanos */ GST_DEBUG_OBJECT (self, "Our latency: %" GST_TIME_FORMAT, GST_TIME_ARGS (latency)); min += latency; if (max != GST_CLOCK_TIME_NONE) max += latency; GST_DEBUG_OBJECT (self, "Calculated total latency : min %" GST_TIME_FORMAT " max %" GST_TIME_FORMAT, GST_TIME_ARGS (min), GST_TIME_ARGS (max)); gst_query_set_latency (query, live, min, max); } gst_object_unref (peer); } break; } default: res = gst_pad_query_default (pad, query); break; } gst_object_unref (self); return res; } // TODO static GstFlowReturn gst_speex_dsp_rec_chain (GstPad * pad, GstBuffer * buffer) { GstSpeexDSP * self = GST_SPEEX_DSP (gst_pad_get_parent (pad)); GstFlowReturn res = GST_FLOW_OK; GstBuffer * recbuffer = NULL; gint sampsize, bufsize; GstClockTime duration; GstSpeexEchoProbe * probe = NULL; gint rate = 0; GstClockTime base_time; GstClockTime skew_fix; gint buffer_offset; base_time = gst_element_get_base_time (GST_ELEMENT_CAST (self)); skew_fix = base_time; // FIXME FIXME FIXME FIXME! 
GST_OBJECT_LOCK (self); if (self->probe) probe = g_object_ref (self->probe); rate = self->rate; GST_OBJECT_UNLOCK (self); sampsize = rate * self->frame_size_ms / 1000; bufsize = 2 * sampsize; duration = self->frame_size_ms * GST_MSECOND; // FIXME to the max if (GST_BUFFER_IS_DISCONT (buffer)) { GST_LOG_OBJECT (self, "***discontinuous! starting over"); // clear adapter, because otherwise new data will be considered relative // to what's left in the adapter, which is of course wrong. we need // timestamps in the adapter or something.. gst_adapter_clear (self->rec_adapter); // clear played buffers, in case the discontinuity is due to a clock // change, which means existing buffers are timestamped wrong (there's // probably a better way to handle this...) GST_OBJECT_LOCK (self); g_queue_foreach (self->buffers, (GFunc) gst_mini_object_unref, NULL); g_queue_clear (self->buffers); GST_OBJECT_UNLOCK (self); } if (gst_adapter_available (self->rec_adapter) == 0) { GST_LOG_OBJECT (self, "The adapter is empty, it's a good time to reset the" " timestamp and offset"); self->rec_time = GST_CLOCK_TIME_NONE; self->rec_offset = GST_BUFFER_OFFSET_NONE; } buffer_offset = 0; if (self->rec_time != GST_CLOCK_TIME_NONE) { GstClockTime a, b; gint64 i; a = self->rec_time + ((gst_adapter_available (self->rec_adapter) / 2) * GST_SECOND / rate); b = GST_BUFFER_TIMESTAMP (buffer); i = (((gint64)GST_BUFFER_TIMESTAMP (buffer) - (gint64)self->rec_time) * rate / GST_SECOND) * 2; buffer_offset = (gint)i; if (!near_enough_to (a, b)) { GST_LOG_OBJECT (self, "***continuous buffer with wrong timestamp" " (want=%"GST_TIME_FORMAT", got=%"GST_TIME_FORMAT")," " compensating %d bytes", GST_TIME_ARGS (a), GST_TIME_ARGS (b), buffer_offset); } } if (self->rec_time == GST_CLOCK_TIME_NONE) self->rec_time = GST_BUFFER_TIMESTAMP (buffer); if (self->rec_offset == GST_BUFFER_OFFSET_NONE) self->rec_offset = GST_BUFFER_OFFSET (buffer); { GstClockTime rec_rt; rec_rt = gst_segment_to_running_time (&self->rec_segment,
GST_FORMAT_TIME, GST_BUFFER_TIMESTAMP (buffer)); GST_LOG_OBJECT (self, "Captured buffer at %"GST_TIME_FORMAT " (len=%"GST_TIME_FORMAT", offset=%lld, base=%lld)", //GST_TIME_ARGS (GST_BUFFER_TIMESTAMP (buffer) /*+ base_time*/), GST_TIME_ARGS (rec_rt), GST_TIME_ARGS (GST_BUFFER_DURATION (buffer)), GST_BUFFER_OFFSET (buffer), base_time); } g_static_mutex_lock (&pairlog_mutex); if (pairlog) { pairlog_append_capture (pairlog, (const unsigned char *)GST_BUFFER_DATA (buffer), GST_BUFFER_OFFSET (buffer) * 2, GST_BUFFER_SIZE (buffer), GST_BUFFER_TIMESTAMP (buffer) + skew_fix, rate); } g_static_mutex_unlock (&pairlog_mutex); // TODO: handle gaps (see above about discontinuity) //gst_adapter_push (self->rec_adapter, buffer); adapter_push_at (self->rec_adapter, buffer, buffer_offset); //res = gst_pad_push (self->rec_srcpad, buffer); //goto out; while (TRUE) { GstBuffer * outbuffer = NULL; //GstClockTime rec_rt = 0; // buffer at least 500ms + 1 frame before processing //if (gst_adapter_available (self->rec_adapter) < (2 * rate * 2000 / 1000) + bufsize) // break; recbuffer = gst_adapter_take_buffer (self->rec_adapter, bufsize); if (!recbuffer) break; GST_BUFFER_TIMESTAMP (recbuffer) = self->rec_time; GST_BUFFER_OFFSET (recbuffer) = self->rec_offset; GST_BUFFER_DURATION (recbuffer) = duration; // FIXME: don't need this? //rec_rt = gst_segment_to_running_time (&self->rec_segment, GST_FORMAT_TIME, // self->rec_time); GST_OBJECT_LOCK (self); outbuffer = try_echo_cancel ( self->echostate, recbuffer, self->rec_time + skew_fix /*- (((GstClockTime)self->latency_tune) * GST_MSECOND)*/, base_time, self->buffers, rate, self->rec_srcpad, GST_PAD_CAPS (self->rec_sinkpad), &res, self); GST_OBJECT_UNLOCK (self); if (outbuffer) { /* if cancel succeeds, then post-processing occurs on the newly returned * buffer and we can free the original one. newly returned buffer has * appropriate caps. 
*/ gst_buffer_unref (recbuffer); } else { /* if cancel fails, it's possible it was due to a flow error when * creating a new buffer */ if (res != GST_FLOW_OK) goto out; /* if cancel fails, then post-processing occurs on the original buffer, * just make it writable and set appropriate caps. */ outbuffer = gst_buffer_make_writable (recbuffer); gst_buffer_set_caps (outbuffer, GST_PAD_CAPS (self->rec_sinkpad)); } GST_OBJECT_LOCK (self); speex_preprocess_run (self->preprocstate, (spx_int16_t *) GST_BUFFER_DATA (outbuffer)); GST_OBJECT_UNLOCK (self); self->rec_time += duration; self->rec_offset += sampsize; // FIXME: does this work for >1 channels? GST_LOG_OBJECT (self, "Sending out buffer %p", outbuffer); res = gst_pad_push (self->rec_srcpad, outbuffer); if (res != GST_FLOW_OK) break; } out: if (probe) gst_object_unref (probe); gst_object_unref (self); return res; } static void gst_speex_dsp_reset_locked (GstSpeexDSP * self) { if (self->preprocstate) speex_preprocess_state_destroy (self->preprocstate); self->preprocstate = NULL; if (self->echostate) speex_echo_state_destroy (self->echostate); self->echostate = NULL; self->rate = 0; } static GstStateChangeReturn gst_speex_dsp_change_state (GstElement * element, GstStateChange transition) { GstSpeexDSP * self; GstStateChangeReturn ret; self = GST_SPEEX_DSP (element); switch (transition) { case GST_STATE_CHANGE_READY_TO_PAUSED: GST_OBJECT_LOCK (self); gst_speex_dsp_reset_locked (self); GST_OBJECT_UNLOCK (self); gst_segment_init (&self->rec_segment, GST_FORMAT_UNDEFINED); break; default: break; } ret = GST_ELEMENT_CLASS (parent_class)->change_state (element, transition); return ret; } /* lock global_mutex during this call */ static void try_auto_attach () { if (global_probe) { gst_speex_dsp_attach (global_dsp, global_probe); GST_DEBUG_OBJECT (global_dsp, "speexdsp attaching to globally discovered speexechoprobe"); } } void gst_speex_dsp_set_auto_attach (GstSpeexDSP * self, gboolean enabled) { g_static_mutex_lock 
(&global_mutex); if (enabled) { if (!global_dsp) { global_dsp = self; try_auto_attach (); } } else { if (global_dsp == self) global_dsp = NULL; } g_static_mutex_unlock (&global_mutex); } void gst_speex_dsp_add_capture_buffer (GstSpeexDSP * self, GstBuffer * buf) { GstClockTime base_time = gst_element_get_base_time (GST_ELEMENT_CAST (self)); GstClockTime duration = GST_CLOCK_TIME_NONE; GstCaps * caps; GstStructure * structure; int rate = 0; GST_OBJECT_LOCK (self); if (self->rate != 0) { rate = self->rate; GST_OBJECT_UNLOCK (self); } else { GST_OBJECT_UNLOCK (self); caps = GST_BUFFER_CAPS (buf); if (caps) { structure = gst_caps_get_structure (GST_BUFFER_CAPS (buf), 0); if (structure) gst_structure_get_int (structure, "rate", &rate); } } if (rate != 0) duration = GST_BUFFER_SIZE (buf) * GST_SECOND / (rate * 2); GST_LOG_OBJECT (self, "Played buffer at %"GST_TIME_FORMAT " (len=%"GST_TIME_FORMAT", offset=%lld)", GST_TIME_ARGS (GST_BUFFER_TIMESTAMP (buf) - base_time), GST_TIME_ARGS (duration), GST_BUFFER_OFFSET (buf)); g_static_mutex_lock (&pairlog_mutex); if (pairlog && rate != 0) { pairlog_append_playback (pairlog, (const unsigned char *)GST_BUFFER_DATA (buf), GST_BUFFER_OFFSET (buf) * 2, GST_BUFFER_SIZE (buf), GST_BUFFER_TIMESTAMP (buf) - base_time, rate); } g_static_mutex_unlock (&pairlog_mutex); GST_OBJECT_LOCK (self); g_queue_push_head (self->buffers, buf); GST_OBJECT_UNLOCK (self); } /* global_mutex locked during this call */ void gst_speex_dsp_attach (GstSpeexDSP * self, GstSpeexEchoProbe * probe) { g_object_ref (probe); self->probe = probe; GST_OBJECT_LOCK (probe); probe->dsp = self; GST_OBJECT_UNLOCK (probe); } /* global_mutex locked during this call */ void gst_speex_dsp_detach (GstSpeexDSP * self) { if (self->probe) { GST_OBJECT_LOCK (self->probe); self->probe->dsp = NULL; GST_OBJECT_UNLOCK (self->probe); g_object_unref (self->probe); self->probe = NULL; } } /* this function attempts to cancel echo. 
* * echostate: AEC state * recbuf: recorded buffer to strip echo from * rec_adj: time of recorded buffer with latency/skew adjustment * rec_base: base time of element that received the recorded buffer * buffers: a queue of recently played buffers, in clock time * rate: sample rate being used (both rec/play are the same rate) * srcpad: the pad that the resulting data should go out on * outcaps: the caps of the resulting data * res: if this function returns null, a flow value may be stored here * obj: object to log debug messages against * * returns: new buffer with echo cancelled, or NULL if cancelling was not * possible. check 'res' to see if the reason was a flow problem. * * note that while this function takes a pad as an argument, it does not * actually send a buffer out on the pad. it uses gst_pad_alloc_buffer() * to create the output buffer, which requires a pad as input. */ GstBuffer * try_echo_cancel (SpeexEchoState * echostate, const GstBuffer * recbuf, GstClockTime rec_adj, GstClockTime rec_base, GQueue * buffers, int rate, GstPad * srcpad, const GstCaps * outcaps, GstFlowReturn * res, GstSpeexDSP * obj) { GstFlowReturn res_ = GST_FLOW_OK; GstBuffer * recbuffer = NULL; GstBuffer * play_buffer = NULL; gchar * buf = NULL; gint bufsize; GstClockTime duration; GstClockTime play_rt = 0, rec_rt = 0, rec_end = 0; gint rec_offset; GstBuffer * outbuffer = NULL; recbuffer = (GstBuffer *)recbuf; rec_rt = rec_adj; rec_offset = GST_BUFFER_OFFSET (recbuffer); bufsize = GST_BUFFER_SIZE (recbuffer); duration = GST_BUFFER_DURATION (recbuffer); if (!echostate) { GST_LOG_OBJECT (obj, "No echostate, not doing echo cancellation"); return NULL; } /* clean out the queue, throwing out any buffers that are too old */ while (TRUE) { play_buffer = g_queue_peek_tail (buffers); if (!play_buffer || GST_BUFFER_TIMESTAMP (play_buffer) - rec_base + (GST_BUFFER_SIZE (play_buffer) * GST_SECOND / (rate * 2)) >= rec_rt) break; GST_LOG_OBJECT (obj, "Throwing out old played buffer");
g_queue_pop_tail (buffers); gst_buffer_unref (play_buffer); } play_buffer = g_queue_peek_tail (buffers); if (!play_buffer) { GST_LOG_OBJECT (obj, "No playout buffer, not doing echo cancellation"); return NULL; } play_rt = GST_BUFFER_TIMESTAMP (play_buffer) - rec_base; GST_LOG_OBJECT (obj, "rec_start=%"GST_TIME_FORMAT"," " play_start=%"GST_TIME_FORMAT"", GST_TIME_ARGS (rec_rt), GST_TIME_ARGS (play_rt)); if (play_rt > rec_rt + duration) { GST_LOG_OBJECT (obj, "Have no buffers to compare, not cancelling echo"); return NULL; } res_ = gst_pad_alloc_buffer (srcpad, rec_offset, bufsize, (GstCaps *)outcaps, &outbuffer); if (res_ != GST_FLOW_OK) { *res = res_; return NULL; } g_assert (outbuffer); // FIXME: what if GST_BUFFER_SIZE (outbuffer) != bufsize ? GST_BUFFER_TIMESTAMP (outbuffer) = GST_BUFFER_TIMESTAMP (recbuffer); GST_BUFFER_OFFSET (outbuffer) = GST_BUFFER_OFFSET (recbuffer); GST_BUFFER_DURATION (outbuffer) = GST_BUFFER_DURATION (recbuffer); /* here's a buffer we'll fill up with played data. we initialize it to * silence in case we don't have enough played data to populate the whole * thing. 
*/ buf = g_malloc0 (bufsize); /* canceling is done relative to rec_rt, even though it may not be the same * as the actual timestamp of the recorded buffer (rec_rt has latency_tune * applied) */ rec_end = rec_rt + duration; rec_offset = 0; // FIXME: slightly confusing, reusing this variable for // different purpose while (rec_offset < bufsize) { GstClockTime play_duration, time; gint play_offset, size; play_buffer = g_queue_peek_tail (buffers); if (!play_buffer) { GST_LOG_OBJECT (obj, "Queue empty, can't cancel everything"); break; } play_rt = GST_BUFFER_TIMESTAMP (play_buffer) - rec_base; if (rec_end < play_rt) { GST_LOG_OBJECT (obj, "End of recorded buffer (at %"GST_TIME_FORMAT")" " is before any played buffer" " (which start at %"GST_TIME_FORMAT")", GST_TIME_ARGS (rec_end), GST_TIME_ARGS (play_rt)); break; } play_duration = GST_BUFFER_SIZE (play_buffer) * GST_SECOND / (rate * 2); // FIXME: it seems we already do something like this earlier. we // shouldn't need it in two spots, and the one here is probably // more appropriate if (play_rt + play_duration < rec_rt) { GST_LOG_OBJECT (obj, "Start of rec data (at %"GST_TIME_FORMAT")" " after the end of played data (at %"GST_TIME_FORMAT")", GST_TIME_ARGS (rec_rt), GST_TIME_ARGS (play_rt + play_duration)); g_queue_pop_tail (buffers); gst_buffer_unref (play_buffer); continue; } if (rec_rt > play_rt) { GstClockTime time_diff = rec_rt - play_rt; time_diff *= 2 * rate; time_diff /= GST_SECOND; time_diff &= ~0x01; // ensure even play_offset = time_diff; GST_LOG_OBJECT (obj, "rec>play off: %d", play_offset); } else { gint rec_skip; GstClockTime time_diff = play_rt - rec_rt; time_diff *= 2 * rate; time_diff /= GST_SECOND; time_diff &= ~0x01; // ensure even rec_skip = time_diff; rec_rt += rec_skip * GST_SECOND / (rate * 2); rec_offset += rec_skip; play_offset = 0; GST_LOG_OBJECT (obj, "rec<=play off: %d", rec_skip); } play_offset = MIN (play_offset, GST_BUFFER_SIZE (play_buffer)); size = MIN (GST_BUFFER_SIZE (play_buffer) - 
play_offset, bufsize - rec_offset); time = play_offset; time *= GST_SECOND; time /= rate * 2; time += play_rt; GST_LOG_OBJECT (obj, "Cancelling data recorded at %"GST_TIME_FORMAT " with data played at %"GST_TIME_FORMAT " (difference %"GST_TIME_FORMAT") for %d bytes", GST_TIME_ARGS (rec_rt), GST_TIME_ARGS (time), GST_TIME_ARGS (rec_rt - time), size); GST_LOG_OBJECT (obj, "using play buffer %p (size=%d), mid(%d, %d)", play_buffer, GST_BUFFER_SIZE (play_buffer), play_offset, size); if(rec_offset < 0 || play_offset < 0 || rec_offset + size > bufsize || play_offset + size > GST_BUFFER_SIZE (play_buffer)) { fprintf (stderr, "***speexdsp explosions!\n"); abort(); return NULL; } memcpy (buf + rec_offset, GST_BUFFER_DATA (play_buffer) + play_offset, size); rec_rt += size * GST_SECOND / (rate * 2); rec_offset += size; if (GST_BUFFER_SIZE (play_buffer) == play_offset + size) { GstBuffer *pb = g_queue_pop_tail (buffers); gst_buffer_unref (play_buffer); g_assert (pb == play_buffer); } } GST_LOG_OBJECT (obj, "Cancelling echo"); speex_echo_cancellation (echostate, (const spx_int16_t *) GST_BUFFER_DATA (recbuffer), (const spx_int16_t *) buf, (spx_int16_t *) GST_BUFFER_DATA (outbuffer)); g_free (buf); return outbuffer; } psimedia-master/gstprovider/gstelements/speexdsp/speexdsp.h000066400000000000000000000064151220046403000247000ustar00rootroot00000000000000/* * Farsight Voice+Video library * * Copyright 2008 Collabora Ltd * Copyright 2008 Nokia Corporation * @author: Olivier Crete * Copyright 2009 Barracuda Networks, Inc * @author: Justin Karneges * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. 
* * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. * */ #ifndef __GST_SPEEX_DSP_H__ #define __GST_SPEEX_DSP_H__ #include <gst/gst.h> #include <gst/base/gstadapter.h> #include <speex/speex_echo.h> #include <speex/speex_preprocess.h> #include "speexechoprobe.h" G_BEGIN_DECLS GST_DEBUG_CATEGORY_EXTERN (speex_dsp_debug); #define GST_TYPE_SPEEX_DSP \ (gst_speex_dsp_get_type()) #define GST_SPEEX_DSP(obj) \ (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_SPEEX_DSP,GstSpeexDSP)) #define GST_IS_SPEEX_DSP(obj) \ (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_SPEEX_DSP)) #define GST_SPEEX_DSP_CLASS(klass) \ (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_SPEEX_DSP,GstSpeexDSPClass)) #define GST_IS_SPEEX_DSP_CLASS(klass) \ (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_SPEEX_DSP)) #define GST_SPEEX_DSP_GET_CLASS(obj) \ (G_TYPE_INSTANCE_GET_CLASS((obj),GST_TYPE_SPEEX_DSP,GstSpeexDSPClass)) typedef struct _GstSpeexDSP GstSpeexDSP; typedef struct _GstSpeexDSPClass GstSpeexDSPClass; struct _GstSpeexDSP { GstElement element; GstPad * rec_srcpad; GstPad * rec_sinkpad; /* Protected by the stream lock */ guint frame_size_ms; /* frame size in ms */ guint filter_length_ms; /* filter length in ms */ /* Protected by the object lock */ gint rate; gint channels; /* Protected by the stream lock */ GstSegment rec_segment; GstAdapter * rec_adapter; GstClockTime rec_time; guint64 rec_offset; /* Protected by the object lock */ SpeexPreprocessState * preprocstate; /* Protected by the stream lock */ SpeexEchoState * echostate; /* Protected by the object lock */ GstSpeexEchoProbe * probe; GQueue * buffers; /* Protected by the object lock */ gint latency_tune; gboolean agc; gint
agc_increment; gint agc_decrement; gfloat agc_level; gint agc_max_gain; gboolean denoise; gint echo_suppress; gint echo_suppress_active; gint noise_suppress; }; struct _GstSpeexDSPClass { GstElementClass parent_class; }; GType gst_speex_dsp_get_type (void); void gst_speex_dsp_set_auto_attach (GstSpeexDSP * self, gboolean enabled); void gst_speex_dsp_add_capture_buffer (GstSpeexDSP * self, GstBuffer * buf); /* called by probe, with global_mutex locked */ void gst_speex_dsp_attach (GstSpeexDSP * self, GstSpeexEchoProbe * probe); void gst_speex_dsp_detach (GstSpeexDSP * self); G_END_DECLS #endif /* __GST_SPEEX_DSP_H__ */ psimedia-master/gstprovider/gstelements/speexdsp/speexdspplugin.c000066400000000000000000000034671220046403000261160ustar00rootroot00000000000000/* * Farsight Voice+Video library * * Copyright 2008 Collabora Ltd * Copyright 2008 Nokia Corporation * @author: Olivier Crete * Copyright 2009 Barracuda Networks, Inc * @author: Justin Karneges * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Lesser General Public * License as published by the Free Software Foundation; either * version 2.1 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Lesser General Public License for more details. 
* * You should have received a copy of the GNU Lesser General Public * License along with this library; if not, write to the Free Software * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA * */ #ifdef HAVE_CONFIG_H #include "config.h" #endif #include #include #include "speexdsp.h" #include "speexechoprobe.h" /* dsp/probe use these to discover each other */ GStaticMutex global_mutex = G_STATIC_MUTEX_INIT; GstSpeexDSP * global_dsp = NULL; GstSpeexEchoProbe * global_probe = NULL; static gboolean plugin_init (GstPlugin * plugin) { if (!gst_element_register (plugin, "speexdsp", GST_RANK_NONE, GST_TYPE_SPEEX_DSP)) { return FALSE; } if (!gst_element_register (plugin, "speexechoprobe", GST_RANK_NONE, GST_TYPE_SPEEX_ECHO_PROBE)) { return FALSE; } return TRUE; } GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR, "speexdsp", "Voice preprocessing using libspeex", plugin_init, VERSION, "LGPL", "Farsight", "http://farsight.sf.net") psimedia-master/gstprovider/gstelements/speexdsp/speexechoprobe.c000066400000000000000000000346471220046403000260630ustar00rootroot00000000000000/* * Farsight Voice+Video library * * Copyright 2008 Collabora Ltd * Copyright 2008 Nokia Corporation * @author: Olivier Crete * Copyright 2009 Barracuda Networks, Inc * @author: Justin Karneges * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Lesser General Public * License as published by the Free Software Foundation; either * version 2.1 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Lesser General Public License for more details. 
* * You should have received a copy of the GNU Lesser General Public * License along with this library; if not, write to the Free Software * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA * */ #ifdef HAVE_CONFIG_H #include "config.h" #endif #include "speexechoprobe.h" #include <gst/gst.h> #include <string.h> /* assumption: one original include name was lost in extraction */ #include "speexdsp.h" extern GStaticMutex global_mutex; extern GstSpeexDSP * global_dsp; extern GstSpeexEchoProbe * global_probe; #define GST_CAT_DEFAULT (speex_dsp_debug) #define DEFAULT_LATENCY_TUNE (0) static const GstElementDetails gst_speex_echo_probe_details = GST_ELEMENT_DETAILS ( "Acoustic Echo canceller probe", "Generic/Audio", "Gathers playback buffers for speexdsp", "Olivier Crete "); static GstStaticPadTemplate gst_speex_echo_probe_sink_template = GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, GST_STATIC_CAPS ("audio/x-raw-int, " "rate = (int) [ 6000, 48000 ], " "channels = (int) [1, MAX], " "endianness = (int) BYTE_ORDER, " "signed = (boolean) TRUE, " "width = (int) 16, " "depth = (int) 16") ); static GstStaticPadTemplate gst_speex_echo_probe_src_template = GST_STATIC_PAD_TEMPLATE ("src", GST_PAD_SRC, GST_PAD_ALWAYS, GST_STATIC_CAPS ("audio/x-raw-int, " "rate = (int) [ 6000, 48000 ], " "channels = (int) [1, MAX], " "endianness = (int) BYTE_ORDER, " "signed = (boolean) TRUE, " "width = (int) 16, " "depth = (int) 16") ); enum { /* FILL ME */ LAST_SIGNAL }; enum { PROP_0, PROP_LATENCY_TUNE }; GST_BOILERPLATE(GstSpeexEchoProbe, gst_speex_echo_probe, GstElement, GST_TYPE_ELEMENT); static void gst_speex_echo_probe_finalize (GObject * object); static void gst_speex_echo_probe_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); static void gst_speex_echo_probe_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec); static GstStateChangeReturn gst_speex_echo_probe_change_state (GstElement * element, GstStateChange transition); static GstFlowReturn
gst_speex_echo_probe_chain (GstPad * pad, GstBuffer * buffer); static gboolean gst_speex_echo_probe_event (GstPad * pad, GstEvent * event); static gboolean gst_speex_echo_probe_setcaps (GstPad * pad, GstCaps * caps); static GstCaps * gst_speex_echo_probe_getcaps (GstPad * pad); static void try_auto_attach (); static void gst_speex_echo_probe_base_init (gpointer klass) { } static void gst_speex_echo_probe_class_init (GstSpeexEchoProbeClass * klass) { GObjectClass * gobject_class; GstElementClass * gstelement_class; gobject_class = (GObjectClass *) klass; gobject_class->finalize = gst_speex_echo_probe_finalize; gobject_class->set_property = gst_speex_echo_probe_set_property; gobject_class->get_property = gst_speex_echo_probe_get_property; gstelement_class = (GstElementClass *) klass; gst_element_class_add_pad_template (gstelement_class, gst_static_pad_template_get (&gst_speex_echo_probe_src_template)); gst_element_class_add_pad_template (gstelement_class, gst_static_pad_template_get (&gst_speex_echo_probe_sink_template)); gst_element_class_set_details (gstelement_class, &gst_speex_echo_probe_details); gstelement_class->change_state = gst_speex_echo_probe_change_state; parent_class = g_type_class_peek_parent (klass); g_object_class_install_property (gobject_class, PROP_LATENCY_TUNE, g_param_spec_int ("latency-tune", "Add/remove latency", "Use this to tune the latency value, in milliseconds, in case it is" " detected incorrectly", G_MININT, G_MAXINT, DEFAULT_LATENCY_TUNE, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); } static void gst_speex_echo_probe_init (GstSpeexEchoProbe * self, GstSpeexEchoProbeClass * klass) { GstPadTemplate * template; template = gst_static_pad_template_get (&gst_speex_echo_probe_src_template); self->srcpad = gst_pad_new_from_template (template, "src"); gst_object_unref (template); gst_pad_set_event_function (self->srcpad, GST_DEBUG_FUNCPTR (gst_speex_echo_probe_event)); gst_pad_set_getcaps_function (self->srcpad, GST_DEBUG_FUNCPTR 
(gst_speex_echo_probe_getcaps)); gst_element_add_pad (GST_ELEMENT (self), self->srcpad); template = gst_static_pad_template_get (&gst_speex_echo_probe_sink_template); self->sinkpad = gst_pad_new_from_template (template, "sink"); gst_object_unref (template); gst_pad_set_chain_function (self->sinkpad, GST_DEBUG_FUNCPTR (gst_speex_echo_probe_chain)); gst_pad_set_event_function (self->sinkpad, GST_DEBUG_FUNCPTR (gst_speex_echo_probe_event)); gst_pad_set_setcaps_function (self->sinkpad, GST_DEBUG_FUNCPTR (gst_speex_echo_probe_setcaps)); gst_pad_set_getcaps_function (self->sinkpad, GST_DEBUG_FUNCPTR (gst_speex_echo_probe_getcaps)); gst_element_add_pad (GST_ELEMENT (self), self->sinkpad); self->latency = -1; self->latency_tune = DEFAULT_LATENCY_TUNE; self->rate = 0; self->channels = -1; self->dsp = NULL; g_static_mutex_lock (&global_mutex); if (!global_probe) { global_probe = self; try_auto_attach (); } g_static_mutex_unlock (&global_mutex); } static void gst_speex_echo_probe_finalize (GObject * object) { GstSpeexEchoProbe * self = GST_SPEEX_ECHO_PROBE (object); g_static_mutex_lock (&global_mutex); if (global_probe && global_probe == self) { if (global_dsp) { gst_speex_dsp_detach (GST_SPEEX_DSP (global_dsp)); GST_DEBUG_OBJECT (self, "speexechoprobe detaching from globally discovered speexdsp"); } global_probe = NULL; } g_static_mutex_unlock (&global_mutex); G_OBJECT_CLASS (parent_class)->finalize (object); } static void gst_speex_echo_probe_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) { GstSpeexEchoProbe * self = GST_SPEEX_ECHO_PROBE (object); switch (prop_id) { case PROP_LATENCY_TUNE: GST_OBJECT_LOCK (self); self->latency_tune = g_value_get_int (value); GST_OBJECT_UNLOCK (self); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } static void gst_speex_echo_probe_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec) { GstSpeexEchoProbe * self = GST_SPEEX_ECHO_PROBE 
(object); switch (prop_id) { case PROP_LATENCY_TUNE: GST_OBJECT_LOCK (self); g_value_set_int (value, self->latency_tune); GST_OBJECT_UNLOCK (self); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } static gboolean gst_speex_echo_probe_setcaps (GstPad * pad, GstCaps * caps) { GstSpeexEchoProbe * self = GST_SPEEX_ECHO_PROBE (gst_pad_get_parent (pad)); gint rate, channels = 1; GstStructure * structure; gboolean ret = TRUE; GST_DEBUG_OBJECT (self, "setting caps on pad %p,%s to %" GST_PTR_FORMAT, pad, GST_PAD_NAME (pad), caps); structure = gst_caps_get_structure (caps, 0); if (!gst_structure_get_int (structure, "rate", &rate)) { GST_WARNING_OBJECT (self, "Tried to set caps without a rate"); gst_object_unref (self); return FALSE; } gst_structure_get_int (structure, "channels", &channels); GST_OBJECT_LOCK (self); if (self->rate && self->rate != rate) { GST_WARNING_OBJECT (self, "Wrong rate, got %d, expected %d", rate, self->rate); ret = FALSE; } else if (self->channels != -1 && self->channels != channels) { GST_WARNING_OBJECT (self, "Wrong channels, got %d, expected %d", channels, self->channels); ret = FALSE; } if (ret) { self->rate = rate; self->channels = channels; } GST_OBJECT_UNLOCK (self); gst_object_unref (self); return ret; } static GstCaps * gst_speex_echo_probe_getcaps (GstPad * pad) { GstSpeexEchoProbe * self; GstCaps * result, * peercaps, * tmpcaps; self = GST_SPEEX_ECHO_PROBE (gst_pad_get_parent (pad)); result = gst_caps_copy (gst_pad_get_pad_template_caps (pad)); GST_OBJECT_LOCK (self); if (self->rate) gst_caps_set_simple (result, "rate", G_TYPE_INT, self->rate, NULL); if (self->channels != -1) gst_caps_set_simple (result, "channels", G_TYPE_INT, self->channels, NULL); GST_OBJECT_UNLOCK (self); if (pad == self->sinkpad) { peercaps = gst_pad_peer_get_caps (self->srcpad); if (peercaps) { tmpcaps = result; result = gst_caps_intersect (result, peercaps); gst_caps_unref (tmpcaps); gst_caps_unref (peercaps); } } else if (pad == 
self->srcpad) { peercaps = gst_pad_peer_get_caps (self->sinkpad); if (peercaps) { tmpcaps = result; result = gst_caps_intersect (result, peercaps); gst_caps_unref (tmpcaps); gst_caps_unref (peercaps); } } gst_object_unref (self); return result; } static gboolean gst_speex_echo_probe_event (GstPad * pad, GstEvent * event) { GstSpeexEchoProbe * self = GST_SPEEX_ECHO_PROBE (gst_pad_get_parent (pad)); gboolean res = FALSE; GstClockTime latency; switch (GST_EVENT_TYPE (event)) { case GST_EVENT_LATENCY: gst_event_parse_latency (event, &latency); GST_OBJECT_LOCK (self); self->latency = latency; GST_OBJECT_UNLOCK (self); GST_DEBUG_OBJECT (self, "We have a latency of %"GST_TIME_FORMAT, GST_TIME_ARGS (latency)); break; case GST_EVENT_FLUSH_STOP: GST_OBJECT_LOCK (self); gst_segment_init (&self->segment, GST_FORMAT_UNDEFINED); self->rate = 0; self->channels = -1; GST_OBJECT_UNLOCK (self); break; case GST_EVENT_NEWSEGMENT: { gboolean update; gdouble rate; gdouble applied_rate; GstFormat format; gint64 start; gint64 stop; gint64 position; gst_event_parse_new_segment_full (event, &update, &rate, &applied_rate, &format, &start, &stop, &position); if (rate != 1.0 || applied_rate != 1.0) { GST_ERROR_OBJECT (self, "Only a rate of 1.0 is allowed"); goto out; } if (format != GST_FORMAT_TIME) { GST_ERROR_OBJECT (self, "Only time segments are allowed"); goto out; } GST_OBJECT_LOCK (self); gst_segment_set_newsegment_full (&self->segment, update, rate, applied_rate, format, start, stop, position); GST_OBJECT_UNLOCK (self); } break; default: break; } if (pad == self->sinkpad) res = gst_pad_push_event (self->srcpad, event); else res = gst_pad_push_event (self->sinkpad, event); out: gst_object_unref (self); return res; } static GstFlowReturn gst_speex_echo_probe_chain (GstPad * pad, GstBuffer * buffer) { GstSpeexEchoProbe * self = GST_SPEEX_ECHO_PROBE (gst_pad_get_parent (pad)); GstFlowReturn res; GstBuffer * newbuf = NULL; GstClockTime base_time; base_time = gst_element_get_base_time
(GST_ELEMENT_CAST (self)); /*{ GstClockTime rec_rt, duration; rec_rt = gst_segment_to_running_time (&self->segment, GST_FORMAT_TIME, GST_BUFFER_TIMESTAMP (buffer)); duration = GST_BUFFER_SIZE (buffer) * GST_SECOND / (self->rate * 2); GST_LOG_OBJECT (self, "Played buffer at %"GST_TIME_FORMAT " (len=%"GST_TIME_FORMAT", offset=%lld, base=%lld)", //GST_TIME_ARGS (GST_BUFFER_TIMESTAMP (buffer)), GST_TIME_ARGS (rec_rt) + self->latency, GST_TIME_ARGS (duration), GST_BUFFER_OFFSET (buffer), base_time); }*/ GST_OBJECT_LOCK (self); if (self->dsp) { /* fork the buffer, changing the timestamp to be in clock time, with * latency applied */ //gst_buffer_ref (buffer); // FIXME: don't need to ref this, right? newbuf = gst_buffer_create_sub (buffer, 0, GST_BUFFER_SIZE (buffer)); GST_BUFFER_TIMESTAMP (newbuf) += base_time; // FIXME: if we don't have latency yet, does it make sense to be passing // buffers without it applied? i'm not sure but i think if we manage // to get a buffer before latency is known, then it means latency will // end up being zero anyway, so maybe this is fine... 
    if (self->latency != -1)
      GST_BUFFER_TIMESTAMP (newbuf) += self->latency;
    GST_BUFFER_TIMESTAMP (newbuf) +=
        ((GstClockTime) self->latency_tune) * GST_MSECOND;
    gst_speex_dsp_add_capture_buffer (self->dsp, newbuf);
  }
  GST_OBJECT_UNLOCK (self);

  res = gst_pad_push (self->srcpad, buffer);

  gst_object_unref (self);
  return res;
}

static GstStateChangeReturn
gst_speex_echo_probe_change_state (GstElement * element,
    GstStateChange transition)
{
  GstSpeexEchoProbe * self;
  GstStateChangeReturn ret;

  self = GST_SPEEX_ECHO_PROBE (element);

  switch (transition) {
    case GST_STATE_CHANGE_READY_TO_PAUSED:
      GST_OBJECT_LOCK (self);
      gst_segment_init (&self->segment, GST_FORMAT_UNDEFINED);
      self->rate = 0;
      self->channels = -1;
      GST_OBJECT_UNLOCK (self);
      break;
    default:
      break;
  }

  ret = GST_ELEMENT_CLASS (parent_class)->change_state (element, transition);

  return ret;
}

/* lock global_mutex during this call */
static void
try_auto_attach ()
{
  if (global_dsp) {
    gst_speex_dsp_attach (global_dsp, global_probe);
    GST_DEBUG_OBJECT (global_probe,
        "speexechoprobe attaching to globally discovered speexdsp");
  }
}

void
gst_speex_echo_probe_set_auto_attach (GstSpeexEchoProbe * self,
    gboolean enabled)
{
  g_static_mutex_lock (&global_mutex);
  if (enabled) {
    if (!global_probe) {
      global_probe = self;
      try_auto_attach ();
    }
  } else {
    if (global_probe == self)
      global_probe = NULL;
  }
  g_static_mutex_unlock (&global_mutex);
}

psimedia-master/gstprovider/gstelements/speexdsp/speexechoprobe.h

/*
 * Farsight Voice+Video library
 *
 * Copyright 2008 Collabora Ltd
 * Copyright 2008 Nokia Corporation
 * @author: Olivier Crete
 * Copyright 2009 Barracuda Networks, Inc
 * @author: Justin Karneges
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 *
 */

#ifndef __GST_SPEEX_ECHO_PROBE_H__
#define __GST_SPEEX_ECHO_PROBE_H__

#include <gst/gst.h> /* assumed; the header name after #include was lost in
                        extraction, and the declarations below need GstElement,
                        GstPad and GstSegment */

G_BEGIN_DECLS

#define GST_TYPE_SPEEX_ECHO_PROBE \
  (gst_speex_echo_probe_get_type())
#define GST_SPEEX_ECHO_PROBE(obj) \
  (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_SPEEX_ECHO_PROBE,GstSpeexEchoProbe))
#define GST_IS_SPEEX_ECHO_PROBE(obj) \
  (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_SPEEX_ECHO_PROBE))
#define GST_SPEEX_ECHO_PROBE_CLASS(klass) \
  (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_SPEEX_ECHO_PROBE,GstSpeexEchoProbeClass))
#define GST_IS_SPEEX_ECHO_PROBE_CLASS(klass) \
  (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_SPEEX_ECHO_PROBE))
#define GST_SPEEX_ECHO_PROBE_GET_CLASS(obj) \
  (G_TYPE_INSTANCE_GET_CLASS((obj),GST_TYPE_SPEEX_ECHO_PROBE,GstSpeexEchoProbeClass))

struct _GstSpeexDSP;

typedef struct _GstSpeexEchoProbe GstSpeexEchoProbe;
typedef struct _GstSpeexEchoProbeClass GstSpeexEchoProbeClass;

struct _GstSpeexEchoProbe
{
  GstElement element;

  GstPad * srcpad;
  GstPad * sinkpad;

  GstSegment segment;

  /* protected by object lock */
  gint latency;
  gint rate, channels;
  gint latency_tune;

  struct _GstSpeexDSP * dsp;
};

struct _GstSpeexEchoProbeClass
{
  GstElementClass parent_class;
};

GType gst_speex_echo_probe_get_type (void);

void gst_speex_echo_probe_set_auto_attach (GstSpeexEchoProbe * self,
    gboolean enabled);

G_END_DECLS

#endif /* __GST_SPEEX_ECHO_PROBE_H__ */
psimedia-master/gstprovider/gstelements/static/audioresample_static.c

/* GStreamer
 * Copyright (C) 1999 Erik Walthinsen
 * Copyright (C) 2003,2004 David A. Schleef
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 */

/* Element-Checklist-Version: 5 */

/**
 * SECTION:element-legacyresample
 *
 * legacyresample resamples raw audio buffers to different sample rates using
 * a configurable windowing function to enhance quality.
 *
 * Example launch line
 * |[
 * gst-launch -v filesrc location=sine.ogg ! oggdemux ! vorbisdec ! audioconvert ! legacyresample ! audio/x-raw-int, rate=8000 ! alsasink
 * ]| Decode an Ogg/Vorbis downsample to 8Khz and play sound through alsa.
 * To create the Ogg/Vorbis file refer to the documentation of vorbisenc.
* * * Last reviewed on 2006-03-02 (0.10.4) */ #ifdef HAVE_CONFIG_H #include "config.h" #endif #include #include /*#define DEBUG_ENABLED */ #include "../audioresample/gstaudioresample.h" #include #include GST_DEBUG_CATEGORY_STATIC (audioresample_debug); #define GST_CAT_DEFAULT audioresample_debug /* elementfactory information */ static const GstElementDetails gst_audioresample_details = GST_ELEMENT_DETAILS ("Audio scaler", "Filter/Converter/Audio", "Resample audio", "David Schleef "); #define DEFAULT_FILTERLEN 16 enum { PROP_0, PROP_FILTERLEN }; #define SUPPORTED_CAPS \ GST_STATIC_CAPS ( \ "audio/x-raw-int, " \ "rate = (int) [ 1, MAX ], " \ "channels = (int) [ 1, MAX ], " \ "endianness = (int) BYTE_ORDER, " \ "width = (int) 16, " \ "depth = (int) 16, " \ "signed = (boolean) true;" \ "audio/x-raw-int, " \ "rate = (int) [ 1, MAX ], " \ "channels = (int) [ 1, MAX ], " \ "endianness = (int) BYTE_ORDER, " \ "width = (int) 32, " \ "depth = (int) 32, " \ "signed = (boolean) true;" \ "audio/x-raw-float, " \ "rate = (int) [ 1, MAX ], " \ "channels = (int) [ 1, MAX ], " \ "endianness = (int) BYTE_ORDER, " \ "width = (int) 32; " \ "audio/x-raw-float, " \ "rate = (int) [ 1, MAX ], " \ "channels = (int) [ 1, MAX ], " \ "endianness = (int) BYTE_ORDER, " \ "width = (int) 64" \ ) static GstStaticPadTemplate gst_audioresample_sink_template = GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, SUPPORTED_CAPS); static GstStaticPadTemplate gst_audioresample_src_template = GST_STATIC_PAD_TEMPLATE ("src", GST_PAD_SRC, GST_PAD_ALWAYS, SUPPORTED_CAPS); static void gst_audioresample_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); static void gst_audioresample_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec); /* vmethods */ static gboolean audioresample_get_unit_size (GstBaseTransform * base, GstCaps * caps, guint * size); static GstCaps *audioresample_transform_caps (GstBaseTransform * base, 
GstPadDirection direction, GstCaps * caps); static void audioresample_fixate_caps (GstBaseTransform * base, GstPadDirection direction, GstCaps * caps, GstCaps * othercaps); static gboolean audioresample_transform_size (GstBaseTransform * trans, GstPadDirection direction, GstCaps * incaps, guint insize, GstCaps * outcaps, guint * outsize); static gboolean audioresample_set_caps (GstBaseTransform * base, GstCaps * incaps, GstCaps * outcaps); static GstFlowReturn audioresample_pushthrough (GstAudioresample * audioresample); static GstFlowReturn audioresample_transform (GstBaseTransform * base, GstBuffer * inbuf, GstBuffer * outbuf); static gboolean audioresample_event (GstBaseTransform * base, GstEvent * event); static gboolean audioresample_start (GstBaseTransform * base); static gboolean audioresample_stop (GstBaseTransform * base); static gboolean audioresample_query (GstPad * pad, GstQuery * query); static const GstQueryType *audioresample_query_type (GstPad * pad); #define DEBUG_INIT(bla) \ GST_DEBUG_CATEGORY_INIT (audioresample_debug, "legacyresample", 0, "audio resampling element"); GST_BOILERPLATE_FULL (GstAudioresample, gst_audioresample, GstBaseTransform, GST_TYPE_BASE_TRANSFORM, DEBUG_INIT); static void gst_audioresample_base_init (gpointer g_class) { GstElementClass *gstelement_class = GST_ELEMENT_CLASS (g_class); gst_element_class_add_pad_template (gstelement_class, gst_static_pad_template_get (&gst_audioresample_src_template)); gst_element_class_add_pad_template (gstelement_class, gst_static_pad_template_get (&gst_audioresample_sink_template)); gst_element_class_set_details (gstelement_class, &gst_audioresample_details); } static void gst_audioresample_class_init (GstAudioresampleClass * klass) { GObjectClass *gobject_class; gobject_class = (GObjectClass *) klass; gobject_class->set_property = gst_audioresample_set_property; gobject_class->get_property = gst_audioresample_get_property; g_object_class_install_property (gobject_class, PROP_FILTERLEN, 
g_param_spec_int ("filter-length", "filter length", "Length of the resample filter", 0, G_MAXINT, DEFAULT_FILTERLEN, G_PARAM_READWRITE | G_PARAM_CONSTRUCT | G_PARAM_STATIC_STRINGS)); GST_BASE_TRANSFORM_CLASS (klass)->start = GST_DEBUG_FUNCPTR (audioresample_start); GST_BASE_TRANSFORM_CLASS (klass)->stop = GST_DEBUG_FUNCPTR (audioresample_stop); GST_BASE_TRANSFORM_CLASS (klass)->transform_size = GST_DEBUG_FUNCPTR (audioresample_transform_size); GST_BASE_TRANSFORM_CLASS (klass)->get_unit_size = GST_DEBUG_FUNCPTR (audioresample_get_unit_size); GST_BASE_TRANSFORM_CLASS (klass)->transform_caps = GST_DEBUG_FUNCPTR (audioresample_transform_caps); GST_BASE_TRANSFORM_CLASS (klass)->fixate_caps = GST_DEBUG_FUNCPTR (audioresample_fixate_caps); GST_BASE_TRANSFORM_CLASS (klass)->set_caps = GST_DEBUG_FUNCPTR (audioresample_set_caps); GST_BASE_TRANSFORM_CLASS (klass)->transform = GST_DEBUG_FUNCPTR (audioresample_transform); GST_BASE_TRANSFORM_CLASS (klass)->event = GST_DEBUG_FUNCPTR (audioresample_event); GST_BASE_TRANSFORM_CLASS (klass)->passthrough_on_same_caps = TRUE; } static void gst_audioresample_init (GstAudioresample * audioresample, GstAudioresampleClass * klass) { GstBaseTransform *trans; trans = GST_BASE_TRANSFORM (audioresample); /* buffer alloc passthrough is too impossible. FIXME, it * is trivial in the passthrough case. 
*/ gst_pad_set_bufferalloc_function (trans->sinkpad, NULL); audioresample->filter_length = DEFAULT_FILTERLEN; audioresample->need_discont = FALSE; gst_pad_set_query_function (trans->srcpad, audioresample_query); gst_pad_set_query_type_function (trans->srcpad, audioresample_query_type); } /* vmethods */ static gboolean audioresample_start (GstBaseTransform * base) { GstAudioresample *audioresample = GST_AUDIORESAMPLE (base); audioresample->resample = resample_new (); audioresample->ts_offset = -1; audioresample->offset = -1; audioresample->next_ts = -1; resample_set_filter_length (audioresample->resample, audioresample->filter_length); return TRUE; } static gboolean audioresample_stop (GstBaseTransform * base) { GstAudioresample *audioresample = GST_AUDIORESAMPLE (base); if (audioresample->resample) { resample_free (audioresample->resample); audioresample->resample = NULL; } gst_caps_replace (&audioresample->sinkcaps, NULL); gst_caps_replace (&audioresample->srccaps, NULL); return TRUE; } static gboolean audioresample_get_unit_size (GstBaseTransform * base, GstCaps * caps, guint * size) { gint width, channels; GstStructure *structure; gboolean ret; g_assert (size); /* this works for both float and int */ structure = gst_caps_get_structure (caps, 0); ret = gst_structure_get_int (structure, "width", &width); ret &= gst_structure_get_int (structure, "channels", &channels); g_return_val_if_fail (ret, FALSE); *size = width * channels / 8; return TRUE; } static GstCaps * audioresample_transform_caps (GstBaseTransform * base, GstPadDirection direction, GstCaps * caps) { GstCaps *res; GstStructure *structure; /* transform caps gives one single caps so we can just replace * the rate property with our range. 
*/ res = gst_caps_copy (caps); structure = gst_caps_get_structure (res, 0); gst_structure_set (structure, "rate", GST_TYPE_INT_RANGE, 1, G_MAXINT, NULL); return res; } /* Fixate rate to the allowed rate that has the smallest difference */ static void audioresample_fixate_caps (GstBaseTransform * base, GstPadDirection direction, GstCaps * caps, GstCaps * othercaps) { GstStructure *s; gint rate; s = gst_caps_get_structure (caps, 0); if (!gst_structure_get_int (s, "rate", &rate)) return; s = gst_caps_get_structure (othercaps, 0); gst_structure_fixate_field_nearest_int (s, "rate", rate); } static gboolean resample_set_state_from_caps (ResampleState * state, GstCaps * incaps, GstCaps * outcaps, gint * channels, gint * inrate, gint * outrate) { GstStructure *structure; gboolean ret; gint myinrate, myoutrate; int mychannels; gint width, depth; ResampleFormat format; GST_DEBUG ("incaps %" GST_PTR_FORMAT ", outcaps %" GST_PTR_FORMAT, incaps, outcaps); structure = gst_caps_get_structure (incaps, 0); /* get width */ ret = gst_structure_get_int (structure, "width", &width); if (!ret) goto no_width; /* figure out the format */ if (g_str_equal (gst_structure_get_name (structure), "audio/x-raw-float")) { if (width == 32) format = RESAMPLE_FORMAT_F32; else if (width == 64) format = RESAMPLE_FORMAT_F64; else goto wrong_depth; } else { /* for int, depth and width must be the same */ ret = gst_structure_get_int (structure, "depth", &depth); if (!ret || width != depth) goto not_equal; if (width == 16) format = RESAMPLE_FORMAT_S16; else if (width == 32) format = RESAMPLE_FORMAT_S32; else goto wrong_depth; } ret = gst_structure_get_int (structure, "rate", &myinrate); ret &= gst_structure_get_int (structure, "channels", &mychannels); if (!ret) goto no_in_rate_channels; structure = gst_caps_get_structure (outcaps, 0); ret = gst_structure_get_int (structure, "rate", &myoutrate); if (!ret) goto no_out_rate; if (channels) *channels = mychannels; if (inrate) *inrate = myinrate; if (outrate) 
*outrate = myoutrate; resample_set_format (state, format); resample_set_n_channels (state, mychannels); resample_set_input_rate (state, myinrate); resample_set_output_rate (state, myoutrate); return TRUE; /* ERRORS */ no_width: { GST_DEBUG ("failed to get width from caps"); return FALSE; } not_equal: { GST_DEBUG ("width %d and depth %d must be the same", width, depth); return FALSE; } wrong_depth: { GST_DEBUG ("unknown depth %d found", depth); return FALSE; } no_in_rate_channels: { GST_DEBUG ("could not get input rate and channels"); return FALSE; } no_out_rate: { GST_DEBUG ("could not get output rate"); return FALSE; } } static gboolean audioresample_transform_size (GstBaseTransform * base, GstPadDirection direction, GstCaps * caps, guint size, GstCaps * othercaps, guint * othersize) { GstAudioresample *audioresample = GST_AUDIORESAMPLE (base); ResampleState *state; GstCaps *srccaps, *sinkcaps; gboolean use_internal = FALSE; /* whether we use the internal state */ gboolean ret = TRUE; GST_LOG_OBJECT (base, "asked to transform size %d in direction %s", size, direction == GST_PAD_SINK ? 
"SINK" : "SRC"); if (direction == GST_PAD_SINK) { sinkcaps = caps; srccaps = othercaps; } else { sinkcaps = othercaps; srccaps = caps; } /* if the caps are the ones that _set_caps got called with; we can use * our own state; otherwise we'll have to create a state */ if (gst_caps_is_equal (sinkcaps, audioresample->sinkcaps) && gst_caps_is_equal (srccaps, audioresample->srccaps)) { use_internal = TRUE; state = audioresample->resample; } else { GST_DEBUG_OBJECT (audioresample, "caps are not the set caps, creating state"); state = resample_new (); resample_set_filter_length (state, audioresample->filter_length); resample_set_state_from_caps (state, sinkcaps, srccaps, NULL, NULL, NULL); } if (direction == GST_PAD_SINK) { /* asked to convert size of an incoming buffer */ *othersize = resample_get_output_size_for_input (state, size); } else { /* asked to convert size of an outgoing buffer */ *othersize = resample_get_input_size_for_output (state, size); } g_assert (*othersize % state->sample_size == 0); /* we make room for one extra sample, given that the resampling filter * can output an extra one for non-integral i_rate/o_rate */ GST_LOG_OBJECT (base, "transformed size %d to %d", size, *othersize); if (!use_internal) { resample_free (state); } return ret; } static gboolean audioresample_set_caps (GstBaseTransform * base, GstCaps * incaps, GstCaps * outcaps) { gboolean ret; gint inrate, outrate; int channels; GstAudioresample *audioresample = GST_AUDIORESAMPLE (base); GST_DEBUG_OBJECT (base, "incaps %" GST_PTR_FORMAT ", outcaps %" GST_PTR_FORMAT, incaps, outcaps); ret = resample_set_state_from_caps (audioresample->resample, incaps, outcaps, &channels, &inrate, &outrate); g_return_val_if_fail (ret, FALSE); audioresample->channels = channels; GST_DEBUG_OBJECT (audioresample, "set channels to %d", channels); audioresample->i_rate = inrate; GST_DEBUG_OBJECT (audioresample, "set i_rate to %d", inrate); audioresample->o_rate = outrate; GST_DEBUG_OBJECT (audioresample, "set 
o_rate to %d", outrate); /* save caps so we can short-circuit in the size_transform if the caps * are the same */ gst_caps_replace (&audioresample->sinkcaps, incaps); gst_caps_replace (&audioresample->srccaps, outcaps); return TRUE; } static gboolean audioresample_event (GstBaseTransform * base, GstEvent * event) { GstAudioresample *audioresample; audioresample = GST_AUDIORESAMPLE (base); switch (GST_EVENT_TYPE (event)) { case GST_EVENT_FLUSH_START: break; case GST_EVENT_FLUSH_STOP: if (audioresample->resample) resample_input_flush (audioresample->resample); audioresample->ts_offset = -1; audioresample->next_ts = -1; audioresample->offset = -1; break; case GST_EVENT_NEWSEGMENT: resample_input_pushthrough (audioresample->resample); audioresample_pushthrough (audioresample); audioresample->ts_offset = -1; audioresample->next_ts = -1; audioresample->offset = -1; break; case GST_EVENT_EOS: resample_input_eos (audioresample->resample); audioresample_pushthrough (audioresample); break; default: break; } return parent_class->event (base, event); } static GstFlowReturn audioresample_do_output (GstAudioresample * audioresample, GstBuffer * outbuf) { int outsize; int outsamples; ResampleState *r; r = audioresample->resample; outsize = resample_get_output_size (r); GST_LOG_OBJECT (audioresample, "audioresample can give me %d bytes", outsize); /* protect against mem corruption */ if (outsize > GST_BUFFER_SIZE (outbuf)) { GST_WARNING_OBJECT (audioresample, "overriding audioresample's outsize %d with outbuffer's size %d", outsize, GST_BUFFER_SIZE (outbuf)); outsize = GST_BUFFER_SIZE (outbuf); } /* catch possibly wrong size differences */ if (GST_BUFFER_SIZE (outbuf) - outsize > r->sample_size) { GST_WARNING_OBJECT (audioresample, "audioresample's outsize %d too far from outbuffer's size %d", outsize, GST_BUFFER_SIZE (outbuf)); } outsize = resample_get_output_data (r, GST_BUFFER_DATA (outbuf), outsize); outsamples = outsize / r->sample_size; GST_LOG_OBJECT (audioresample, 
"resample gave me %d bytes or %d samples", outsize, outsamples); GST_BUFFER_OFFSET (outbuf) = audioresample->offset; GST_BUFFER_TIMESTAMP (outbuf) = audioresample->next_ts; if (audioresample->ts_offset != -1) { audioresample->offset += outsamples; audioresample->ts_offset += outsamples; audioresample->next_ts = gst_util_uint64_scale_int (audioresample->ts_offset, GST_SECOND, audioresample->o_rate); GST_BUFFER_OFFSET_END (outbuf) = audioresample->offset; /* we calculate DURATION as the difference between "next" timestamp * and current timestamp so we ensure a contiguous stream, instead of * having rounding errors. */ GST_BUFFER_DURATION (outbuf) = audioresample->next_ts - GST_BUFFER_TIMESTAMP (outbuf); } else { /* no valid offset know, we can still sortof calculate the duration though */ GST_BUFFER_DURATION (outbuf) = gst_util_uint64_scale_int (outsamples, GST_SECOND, audioresample->o_rate); } /* check for possible mem corruption */ if (outsize > GST_BUFFER_SIZE (outbuf)) { /* this is an error that when it happens, would need fixing in the * resample library; we told it we wanted only GST_BUFFER_SIZE (outbuf), * and it gave us more ! */ GST_WARNING_OBJECT (audioresample, "audioresample, you memory corrupting bastard. 
" "you gave me outsize %d while my buffer was size %d", outsize, GST_BUFFER_SIZE (outbuf)); return GST_FLOW_ERROR; } /* catch possibly wrong size differences */ if (GST_BUFFER_SIZE (outbuf) - outsize > r->sample_size) { GST_WARNING_OBJECT (audioresample, "audioresample's written outsize %d too far from outbuffer's size %d", outsize, GST_BUFFER_SIZE (outbuf)); } GST_BUFFER_SIZE (outbuf) = outsize; if (G_UNLIKELY (audioresample->need_discont)) { GST_DEBUG_OBJECT (audioresample, "marking this buffer with the DISCONT flag"); GST_BUFFER_FLAG_SET (outbuf, GST_BUFFER_FLAG_DISCONT); audioresample->need_discont = FALSE; } GST_LOG_OBJECT (audioresample, "transformed to buffer of %d bytes, ts %" GST_TIME_FORMAT ", duration %" GST_TIME_FORMAT ", offset %" G_GINT64_FORMAT ", offset_end %" G_GINT64_FORMAT, outsize, GST_TIME_ARGS (GST_BUFFER_TIMESTAMP (outbuf)), GST_TIME_ARGS (GST_BUFFER_DURATION (outbuf)), GST_BUFFER_OFFSET (outbuf), GST_BUFFER_OFFSET_END (outbuf)); return GST_FLOW_OK; } static gboolean audioresample_check_discont (GstAudioresample * audioresample, GstClockTime timestamp) { if (timestamp != GST_CLOCK_TIME_NONE && audioresample->prev_ts != GST_CLOCK_TIME_NONE && audioresample->prev_duration != GST_CLOCK_TIME_NONE && timestamp != audioresample->prev_ts + audioresample->prev_duration) { /* Potentially a discontinuous buffer. 
However, it turns out that many * elements generate imperfect streams due to rounding errors, so we permit * a small error (up to one sample) without triggering a filter * flush/restart (if triggered incorrectly, this will be audible) */ GstClockTimeDiff diff = timestamp - (audioresample->prev_ts + audioresample->prev_duration); if (ABS (diff) > GST_SECOND / audioresample->i_rate) { GST_WARNING_OBJECT (audioresample, "encountered timestamp discontinuity of %" G_GINT64_FORMAT, diff); return TRUE; } } return FALSE; } static GstFlowReturn audioresample_transform (GstBaseTransform * base, GstBuffer * inbuf, GstBuffer * outbuf) { GstAudioresample *audioresample; ResampleState *r; guchar *data, *datacopy; gulong size; GstClockTime timestamp; audioresample = GST_AUDIORESAMPLE (base); r = audioresample->resample; data = GST_BUFFER_DATA (inbuf); size = GST_BUFFER_SIZE (inbuf); timestamp = GST_BUFFER_TIMESTAMP (inbuf); GST_LOG_OBJECT (audioresample, "transforming buffer of %ld bytes, ts %" GST_TIME_FORMAT ", duration %" GST_TIME_FORMAT ", offset %" G_GINT64_FORMAT ", offset_end %" G_GINT64_FORMAT, size, GST_TIME_ARGS (timestamp), GST_TIME_ARGS (GST_BUFFER_DURATION (inbuf)), GST_BUFFER_OFFSET (inbuf), GST_BUFFER_OFFSET_END (inbuf)); /* check for timestamp discontinuities and flush/reset if needed */ if (G_UNLIKELY (audioresample_check_discont (audioresample, timestamp))) { /* Flush internal samples */ audioresample_pushthrough (audioresample); /* Inform downstream element about discontinuity */ audioresample->need_discont = TRUE; /* We want to recalculate the offset */ audioresample->ts_offset = -1; } if (audioresample->ts_offset == -1) { /* if we don't know the initial offset yet, calculate it based on the * input timestamp. */ if (GST_CLOCK_TIME_IS_VALID (timestamp)) { GstClockTime stime; /* offset used to calculate the timestamps. We use the sample offset for * this to make it more accurate. We want the first buffer to have the * same timestamp as the incoming timestamp. 
*/ audioresample->next_ts = timestamp; audioresample->ts_offset = gst_util_uint64_scale_int (timestamp, r->o_rate, GST_SECOND); /* offset used to set as the buffer offset, this offset is always * relative to the stream time, note that timestamp is not... */ stime = (timestamp - base->segment.start) + base->segment.time; audioresample->offset = gst_util_uint64_scale_int (stime, r->o_rate, GST_SECOND); } } audioresample->prev_ts = timestamp; audioresample->prev_duration = GST_BUFFER_DURATION (inbuf); /* need to memdup, resample takes ownership. */ datacopy = g_memdup (data, size); resample_add_input_data (r, datacopy, size, g_free, datacopy); return audioresample_do_output (audioresample, outbuf); } /* push remaining data in the buffers out */ static GstFlowReturn audioresample_pushthrough (GstAudioresample * audioresample) { int outsize; ResampleState *r; GstBuffer *outbuf; GstFlowReturn res = GST_FLOW_OK; GstBaseTransform *trans; r = audioresample->resample; outsize = resample_get_output_size (r); if (outsize == 0) { GST_DEBUG_OBJECT (audioresample, "no internal buffers needing flush"); goto done; } trans = GST_BASE_TRANSFORM (audioresample); res = gst_pad_alloc_buffer (trans->srcpad, GST_BUFFER_OFFSET_NONE, outsize, GST_PAD_CAPS (trans->srcpad), &outbuf); if (G_UNLIKELY (res != GST_FLOW_OK)) { GST_WARNING_OBJECT (audioresample, "failed allocating buffer of %d bytes", outsize); goto done; } res = audioresample_do_output (audioresample, outbuf); if (G_UNLIKELY (res != GST_FLOW_OK)) goto done; res = gst_pad_push (trans->srcpad, outbuf); done: return res; } static gboolean audioresample_query (GstPad * pad, GstQuery * query) { GstAudioresample *audioresample = GST_AUDIORESAMPLE (gst_pad_get_parent (pad)); GstBaseTransform *trans = GST_BASE_TRANSFORM (audioresample); gboolean res = TRUE; switch (GST_QUERY_TYPE (query)) { case GST_QUERY_LATENCY: { GstClockTime min, max; gboolean live; guint64 latency; GstPad *peer; gint rate = audioresample->i_rate; gint 
resampler_latency = audioresample->filter_length / 2; if (gst_base_transform_is_passthrough (trans)) resampler_latency = 0; if ((peer = gst_pad_get_peer (trans->sinkpad))) { if ((res = gst_pad_query (peer, query))) { gst_query_parse_latency (query, &live, &min, &max); GST_DEBUG ("Peer latency: min %" GST_TIME_FORMAT " max %" GST_TIME_FORMAT, GST_TIME_ARGS (min), GST_TIME_ARGS (max)); /* add our own latency */ if (rate != 0 && resampler_latency != 0) latency = gst_util_uint64_scale (resampler_latency, GST_SECOND, rate); else latency = 0; GST_DEBUG ("Our latency: %" GST_TIME_FORMAT, GST_TIME_ARGS (latency)); min += latency; if (max != GST_CLOCK_TIME_NONE) max += latency; GST_DEBUG ("Calculated total latency : min %" GST_TIME_FORMAT " max %" GST_TIME_FORMAT, GST_TIME_ARGS (min), GST_TIME_ARGS (max)); gst_query_set_latency (query, live, min, max); } gst_object_unref (peer); } break; } default: res = gst_pad_query_default (pad, query); break; } gst_object_unref (audioresample); return res; } static const GstQueryType * audioresample_query_type (GstPad * pad) { static const GstQueryType types[] = { GST_QUERY_LATENCY, 0 }; return types; } static void gst_audioresample_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) { GstAudioresample *audioresample; audioresample = GST_AUDIORESAMPLE (object); switch (prop_id) { case PROP_FILTERLEN: audioresample->filter_length = g_value_get_int (value); GST_DEBUG_OBJECT (GST_ELEMENT (audioresample), "new filter length %d", audioresample->filter_length); if (audioresample->resample) { resample_set_filter_length (audioresample->resample, audioresample->filter_length); gst_element_post_message (GST_ELEMENT (audioresample), gst_message_new_latency (GST_OBJECT (audioresample))); } break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } static void gst_audioresample_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec) { GstAudioresample 
    *audioresample;

  audioresample = GST_AUDIORESAMPLE (object);

  switch (prop_id) {
    case PROP_FILTERLEN:
      g_value_set_int (value, audioresample->filter_length);
      break;
    default:
      G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
      break;
  }
}

static gboolean
plugin_init (GstPlugin * plugin)
{
  resample_init ();

  if (!gst_element_register (plugin, "legacyresample", GST_RANK_MARGINAL,
          GST_TYPE_AUDIORESAMPLE)) {
    return FALSE;
  }

  return TRUE;
}

void
gstelements_audioresample_register()
{
  gst_plugin_register_static(
      GST_VERSION_MAJOR,
      GST_VERSION_MINOR,
      "legacyresample",
      "Resamples audio",
      plugin_init,
      "1.0.0",
      "LGPL",
      "my-application",
      "my-application",
      "http://www.my-application.net/"
      );
}

psimedia-master/gstprovider/gstelements/static/directsound_static.c

/* GStreamer
 * Copyright (C) 2005 Sebastien Moutte
 * Copyright (C) 2007 Pioneers of the Inevitable
 *
 * gstdirectsoundplugin.c:
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 *
 *
 * The development of this code was made possible due to the involvement
 * of Pioneers of the Inevitable, the creators of the Songbird Music player
 *
 */

#ifdef HAVE_CONFIG_H
#include "config.h"
#endif

#include "../directsound/gstdirectsound.h"
#include "../directsound/gstdirectsoundsink.h"
#include "../directsound/gstdirectsoundsrc.h"

static gboolean
plugin_init (GstPlugin * plugin)
{
  if (!gst_element_register (plugin, "directsoundsink", GST_RANK_PRIMARY,
      GST_TYPE_DIRECTSOUND_SINK))
    return FALSE;

  if (!gst_element_register (plugin, "directsoundsrc", GST_RANK_PRIMARY,
      GST_TYPE_DIRECTSOUND_SRC))
    return FALSE;

  GST_DEBUG_CATEGORY_INIT (directsound, "directsound", 0,
      "DirectSound Elements");

  return TRUE;
}

void
gstelements_directsound_register()
{
  gst_plugin_register_static(
      GST_VERSION_MAJOR,
      GST_VERSION_MINOR,
      "directsound",
      "Direct Sound plugin library",
      plugin_init,
      "1.0.0",
      "LGPL",
      "my-application",
      "my-application",
      "http://www.my-application.net/"
      );
}

psimedia-master/gstprovider/gstelements/static/gstelements.c

/*
 * Copyright (C) 2008 Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301 USA
 *
 */

#include "gstelements.h"

#ifdef HAVE_VIDEOMAXRATE
void gstelements_videomaxrate_register();
#endif

#ifdef HAVE_LIVEADDER
void gstelements_liveadder_register();
#endif

#ifdef HAVE_SPEEXDSP
void gstelements_speexdsp_register();
#endif

#ifdef HAVE_DIRECTSOUND
void gstelements_directsound_register();
#endif

#ifdef HAVE_WINKS
void gstelements_winks_register();
#endif

#ifdef HAVE_OSXAUDIO
void gstelements_osxaudio_register();
#endif

#ifdef HAVE_OSXVIDEO
void gstelements_osxvideo_register();
#endif

void gstelements_register()
{
#ifdef HAVE_VIDEOMAXRATE
  gstelements_videomaxrate_register();
#endif
#ifdef HAVE_LIVEADDER
  gstelements_liveadder_register();
#endif
#ifdef HAVE_SPEEXDSP
  gstelements_speexdsp_register();
#endif
#ifdef HAVE_DIRECTSOUND
  gstelements_directsound_register();
#endif
#ifdef HAVE_WINKS
  gstelements_winks_register();
#endif
#ifdef HAVE_OSXAUDIO
  gstelements_osxaudio_register();
#endif
#ifdef HAVE_OSXVIDEO
  gstelements_osxvideo_register();
#endif
}

psimedia-master/gstprovider/gstelements/static/gstelements.h

/*
 * Copyright (C) 2008 Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301 USA
 *
 */

#ifndef PSI_GSTELEMENTS_H
#define PSI_GSTELEMENTS_H

#include <glib.h>

G_BEGIN_DECLS

void gstelements_register();

G_END_DECLS

#endif

psimedia-master/gstprovider/gstelements/static/liveadder_static.c

/*
 * Farsight Voice+Video library
 *
 * Copyright 2008 Collabora Ltd
 * Copyright 2008 Nokia Corporation
 * @author: Olivier Crete
 *
 * With parts copied from the adder plugin which is
 * Copyright (C) 1999,2000 Erik Walthinsen
 *               2001 Thomas
 *               2005,2006 Wim Taymans
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
 *
 */

#ifdef HAVE_CONFIG_H
#include "config.h"
#endif

#include "../liveadder/liveadder.h"

#include <gst/audio/audio.h>
#include <string.h>

#define DEFAULT_LATENCY_MS 60

GST_DEBUG_CATEGORY_STATIC (live_adder_debug);
#define GST_CAT_DEFAULT (live_adder_debug)

/* elementfactory information */
static const GstElementDetails gst_live_adder_details =
    GST_ELEMENT_DETAILS ("Live Adder element",
    "Generic/Audio",
    "Mixes live/discontinuous audio streams",
    "Olivier Crete");

static GstStaticPadTemplate gst_live_adder_sink_template =
    GST_STATIC_PAD_TEMPLATE ("sink%d",
    GST_PAD_SINK,
    GST_PAD_REQUEST,
    GST_STATIC_CAPS (GST_AUDIO_INT_PAD_TEMPLATE_CAPS "; "
        GST_AUDIO_FLOAT_PAD_TEMPLATE_CAPS)
    );

static GstStaticPadTemplate gst_live_adder_src_template =
    GST_STATIC_PAD_TEMPLATE ("src",
    GST_PAD_SRC,
    GST_PAD_ALWAYS,
    GST_STATIC_CAPS (GST_AUDIO_INT_PAD_TEMPLATE_CAPS "; "
        GST_AUDIO_FLOAT_PAD_TEMPLATE_CAPS)
    );

/* Valve signals and args */
enum
{
  /* FILL ME */
  LAST_SIGNAL
};

enum
{
  PROP_0,
  PROP_LATENCY,
};

typedef struct _GstLiveAdderPadPrivate
{
  GstSegment segment;
  gboolean eos;
  GstClockTime expected_timestamp;
} GstLiveAdderPadPrivate;

GST_BOILERPLATE (GstLiveAdder, gst_live_adder, GstElement, GST_TYPE_ELEMENT);

static void gst_live_adder_finalize (GObject * object);
static void gst_live_adder_set_property (GObject * object, guint prop_id,
    const GValue * value, GParamSpec * pspec);
static void gst_live_adder_get_property (GObject * object, guint prop_id,
    GValue * value, GParamSpec * pspec);

static GstPad *gst_live_adder_request_new_pad (GstElement * element,
    GstPadTemplate * templ, const gchar * unused);
static void gst_live_adder_release_pad (GstElement * element, GstPad * pad);
static GstStateChangeReturn gst_live_adder_change_state (GstElement * element,
    GstStateChange transition);

static gboolean
gst_live_adder_setcaps (GstPad * pad, GstCaps * caps); static GstCaps * gst_live_adder_sink_getcaps (GstPad * pad); static gboolean gst_live_adder_src_activate_push (GstPad * pad, gboolean active); static gboolean gst_live_adder_src_event (GstPad * pad, GstEvent * event); static void gst_live_adder_loop (gpointer data); static gboolean gst_live_adder_query (GstPad * pad, GstQuery * query); static gboolean gst_live_adder_sink_event (GstPad * pad, GstEvent * event); static void reset_pad_private (GstPad *pad); /* clipping versions */ #define MAKE_FUNC(name,type,ttype,min,max) \ static void name (type *out, type *in, gint bytes) { \ gint i; \ for (i = 0; i < bytes / sizeof (type); i++) \ out[i] = CLAMP ((ttype)out[i] + (ttype)in[i], min, max); \ } /* non-clipping versions (for float) */ #define MAKE_FUNC_NC(name,type,ttype) \ static void name (type *out, type *in, gint bytes) { \ gint i; \ for (i = 0; i < bytes / sizeof (type); i++) \ out[i] = (ttype)out[i] + (ttype)in[i]; \ } /* *INDENT-OFF* */ MAKE_FUNC (add_int32, gint32, gint64, G_MININT32, G_MAXINT32) MAKE_FUNC (add_int16, gint16, gint32, G_MININT16, G_MAXINT16) MAKE_FUNC (add_int8, gint8, gint16, G_MININT8, G_MAXINT8) MAKE_FUNC (add_uint32, guint32, guint64, 0, G_MAXUINT32) MAKE_FUNC (add_uint16, guint16, guint32, 0, G_MAXUINT16) MAKE_FUNC (add_uint8, guint8, guint16, 0, G_MAXUINT8) MAKE_FUNC_NC (add_float64, gdouble, gdouble) MAKE_FUNC_NC (add_float32, gfloat, gfloat) /* *INDENT-ON* */ static void gst_live_adder_base_init (gpointer klass) { } static void gst_live_adder_class_init (GstLiveAdderClass * klass) { GObjectClass *gobject_class; GstElementClass *gstelement_class; gobject_class = (GObjectClass *) klass; gobject_class->finalize = gst_live_adder_finalize; gobject_class->set_property = gst_live_adder_set_property; gobject_class->get_property = gst_live_adder_get_property; gstelement_class = (GstElementClass *) klass; gst_element_class_add_pad_template (gstelement_class, gst_static_pad_template_get 
(&gst_live_adder_src_template)); gst_element_class_add_pad_template (gstelement_class, gst_static_pad_template_get (&gst_live_adder_sink_template)); gst_element_class_set_details (gstelement_class, &gst_live_adder_details); parent_class = g_type_class_peek_parent (klass); gstelement_class->request_new_pad = gst_live_adder_request_new_pad; gstelement_class->release_pad = gst_live_adder_release_pad; gstelement_class->change_state = gst_live_adder_change_state; g_object_class_install_property (gobject_class, PROP_LATENCY, g_param_spec_uint ("latency", "Buffer latency in ms", "Amount of data to buffer", 0, G_MAXUINT, DEFAULT_LATENCY_MS, G_PARAM_READWRITE)); GST_DEBUG_CATEGORY_INIT (live_adder_debug, "liveadder", 0, "Live Adder"); } static void gst_live_adder_init (GstLiveAdder * adder, GstLiveAdderClass *klass) { GstPadTemplate *template; template = gst_static_pad_template_get (&gst_live_adder_src_template); adder->srcpad = gst_pad_new_from_template (template, "src"); gst_object_unref (template); gst_pad_set_getcaps_function (adder->srcpad, GST_DEBUG_FUNCPTR (gst_pad_proxy_getcaps)); gst_pad_set_setcaps_function (adder->srcpad, GST_DEBUG_FUNCPTR (gst_live_adder_setcaps)); gst_pad_set_query_function (adder->srcpad, GST_DEBUG_FUNCPTR (gst_live_adder_query)); gst_pad_set_event_function (adder->srcpad, GST_DEBUG_FUNCPTR (gst_live_adder_src_event)); gst_pad_set_activatepush_function (adder->srcpad, GST_DEBUG_FUNCPTR (gst_live_adder_src_activate_push)); gst_element_add_pad (GST_ELEMENT (adder), adder->srcpad); adder->format = GST_LIVE_ADDER_FORMAT_UNSET; adder->padcount = 0; adder->func = NULL; adder->not_empty_cond = g_cond_new (); adder->next_timestamp = GST_CLOCK_TIME_NONE; adder->latency_ms = DEFAULT_LATENCY_MS; adder->buffers = g_queue_new (); } static void gst_live_adder_finalize (GObject * object) { GstLiveAdder *adder = GST_LIVE_ADDER (object); g_cond_free (adder->not_empty_cond); g_queue_foreach (adder->buffers, (GFunc) gst_mini_object_unref, NULL); while 
(g_queue_pop_head (adder->buffers)) {} g_queue_free (adder->buffers); g_list_free (adder->sinkpads); G_OBJECT_CLASS (parent_class)->finalize (object); } static void gst_live_adder_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) { GstLiveAdder *adder = GST_LIVE_ADDER (object); switch (prop_id) { case PROP_LATENCY: { guint64 new_latency, old_latency; new_latency = g_value_get_uint (value); GST_OBJECT_LOCK (adder); old_latency = adder->latency_ms; adder->latency_ms = new_latency; GST_OBJECT_UNLOCK (adder); /* post message if latency changed, this will inform the parent pipeline * that a latency reconfiguration is possible/needed. */ if (new_latency != old_latency) { GST_DEBUG_OBJECT (adder, "latency changed to: %" GST_TIME_FORMAT, GST_TIME_ARGS (new_latency)); gst_element_post_message (GST_ELEMENT_CAST (adder), gst_message_new_latency (GST_OBJECT_CAST (adder))); } break; } default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } static void gst_live_adder_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec) { GstLiveAdder *adder = GST_LIVE_ADDER (object); switch (prop_id) { case PROP_LATENCY: GST_OBJECT_LOCK (adder); g_value_set_uint (value, adder->latency_ms); GST_OBJECT_UNLOCK (adder); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } /* we can only accept caps that we and downstream can handle. */ static GstCaps * gst_live_adder_sink_getcaps (GstPad * pad) { GstLiveAdder *adder; GstCaps *result, *peercaps, *sinkcaps; adder = GST_LIVE_ADDER (GST_PAD_PARENT (pad)); /* get the downstream possible caps */ peercaps = gst_pad_peer_get_caps (adder->srcpad); /* get the allowed caps on this sinkpad, we use the fixed caps function so * that it does not call recursively in this function. 
*/ sinkcaps = gst_pad_get_fixed_caps_func (pad); if (peercaps) { /* if the peer has caps, intersect */ GST_DEBUG_OBJECT (adder, "intersecting peer and template caps"); result = gst_caps_intersect (peercaps, sinkcaps); gst_caps_unref (peercaps); gst_caps_unref (sinkcaps); } else { /* the peer has no caps (or there is no peer), just use the allowed caps * of this sinkpad. */ GST_DEBUG_OBJECT (adder, "no peer caps, using sinkcaps"); result = sinkcaps; } return result; } /* the first caps we receive on any of the sinkpads will define the caps for all * the other sinkpads because we can only mix streams with the same caps. * */ static gboolean gst_live_adder_setcaps (GstPad * pad, GstCaps * caps) { GstLiveAdder *adder; GList *pads; GstStructure *structure; const char *media_type; adder = GST_LIVE_ADDER (GST_PAD_PARENT (pad)); GST_LOG_OBJECT (adder, "setting caps on pad %p,%s to %" GST_PTR_FORMAT, pad, GST_PAD_NAME (pad), caps); /* FIXME, see if the other pads can accept the format. Also lock the * format on the other pads to this new format. */ GST_OBJECT_LOCK (adder); pads = GST_ELEMENT (adder)->pads; while (pads) { GstPad *otherpad = GST_PAD (pads->data); if (otherpad != pad) gst_caps_replace (&GST_PAD_CAPS (otherpad), caps); pads = g_list_next (pads); } /* parse caps now */ structure = gst_caps_get_structure (caps, 0); media_type = gst_structure_get_name (structure); if (strcmp (media_type, "audio/x-raw-int") == 0) { GST_DEBUG_OBJECT (adder, "parse_caps sets adder to format int"); adder->format = GST_LIVE_ADDER_FORMAT_INT; gst_structure_get_int (structure, "width", &adder->width); gst_structure_get_int (structure, "depth", &adder->depth); gst_structure_get_int (structure, "endianness", &adder->endianness); gst_structure_get_boolean (structure, "signed", &adder->is_signed); if (adder->endianness != G_BYTE_ORDER) goto not_supported; switch (adder->width) { case 8: adder->func = (adder->is_signed ? 
(GstLiveAdderFunction) add_int8 : (GstLiveAdderFunction) add_uint8); break; case 16: adder->func = (adder->is_signed ? (GstLiveAdderFunction) add_int16 : (GstLiveAdderFunction) add_uint16); break; case 32: adder->func = (adder->is_signed ? (GstLiveAdderFunction) add_int32 : (GstLiveAdderFunction) add_uint32); break; default: goto not_supported; } } else if (strcmp (media_type, "audio/x-raw-float") == 0) { GST_DEBUG_OBJECT (adder, "parse_caps sets adder to format float"); adder->format = GST_LIVE_ADDER_FORMAT_FLOAT; gst_structure_get_int (structure, "width", &adder->width); switch (adder->width) { case 32: adder->func = (GstLiveAdderFunction) add_float32; break; case 64: adder->func = (GstLiveAdderFunction) add_float64; break; default: goto not_supported; } } else { goto not_supported; } gst_structure_get_int (structure, "channels", &adder->channels); gst_structure_get_int (structure, "rate", &adder->rate); /* precalc bps */ adder->bps = (adder->width / 8) * adder->channels; GST_OBJECT_UNLOCK (adder); return TRUE; /* ERRORS */ not_supported: { GST_OBJECT_UNLOCK (adder); GST_DEBUG_OBJECT (adder, "unsupported format set as caps"); return FALSE; } } static void gst_live_adder_flush_start (GstLiveAdder * adder) { GST_DEBUG_OBJECT (adder, "Disabling pop on queue"); GST_OBJECT_LOCK (adder); /* mark ourselves as flushing */ adder->srcresult = GST_FLOW_WRONG_STATE; /* Empty the queue */ g_queue_foreach (adder->buffers, (GFunc) gst_mini_object_unref, NULL); while (g_queue_pop_head (adder->buffers)) {} /* unlock clock, we just unschedule, the entry will be released by the * locking streaming thread. 
*/ if (adder->clock_id) gst_clock_id_unschedule (adder->clock_id); g_cond_broadcast (adder->not_empty_cond); GST_OBJECT_UNLOCK (adder); } static gboolean gst_live_adder_src_activate_push (GstPad * pad, gboolean active) { gboolean result = TRUE; GstLiveAdder *adder = NULL; adder = GST_LIVE_ADDER (gst_pad_get_parent (pad)); if (active) { /* Mark as non flushing */ GST_OBJECT_LOCK (adder); adder->srcresult = GST_FLOW_OK; GST_OBJECT_UNLOCK (adder); /* start pushing out buffers */ GST_DEBUG_OBJECT (adder, "Starting task on srcpad"); gst_pad_start_task (adder->srcpad, (GstTaskFunction) gst_live_adder_loop, adder); } else { /* make sure all data processing stops ASAP */ gst_live_adder_flush_start (adder); /* NOTE this will hardlock if the state change is called from the src pad * task thread because we will _join() the thread. */ GST_DEBUG_OBJECT (adder, "Stopping task on srcpad"); result = gst_pad_stop_task (pad); } gst_object_unref (adder); return result; } static gboolean gst_live_adder_sink_event (GstPad * pad, GstEvent * event) { gboolean ret = TRUE; GstLiveAdder *adder = NULL; GstLiveAdderPadPrivate *padprivate = NULL; adder = GST_LIVE_ADDER (gst_pad_get_parent (pad)); padprivate = gst_pad_get_element_private (pad); if (!padprivate) return FALSE; GST_LOG_OBJECT (adder, "received %s", GST_EVENT_TYPE_NAME (event)); switch (GST_EVENT_TYPE (event)) { case GST_EVENT_NEWSEGMENT: { GstFormat format; gdouble rate, arate; gint64 start, stop, time; gboolean update; gst_event_parse_new_segment_full (event, &update, &rate, &arate, &format, &start, &stop, &time); gst_event_unref (event); /* we need time for now */ if (format != GST_FORMAT_TIME) goto newseg_wrong_format; GST_DEBUG_OBJECT (adder, "newsegment: update %d, rate %g, arate %g, start %" GST_TIME_FORMAT ", stop %" GST_TIME_FORMAT ", time %" GST_TIME_FORMAT, update, rate, arate, GST_TIME_ARGS (start), GST_TIME_ARGS (stop), GST_TIME_ARGS (time)); /* now configure the values, we need these to time the release of the * 
buffers on the srcpad. */
      GST_OBJECT_LOCK (adder);
      gst_segment_set_newsegment_full (&padprivate->segment, update, rate,
          arate, format, start, stop, time);
      GST_OBJECT_UNLOCK (adder);
      break;
    }
    case GST_EVENT_FLUSH_START:
      gst_live_adder_flush_start (adder);
      ret = gst_pad_push_event (adder->srcpad, event);
      break;
    case GST_EVENT_FLUSH_STOP:
      GST_OBJECT_LOCK (adder);
      adder->segment_pending = TRUE;
      adder->next_timestamp = GST_CLOCK_TIME_NONE;
      reset_pad_private (pad);
      GST_OBJECT_UNLOCK (adder);
      ret = gst_pad_push_event (adder->srcpad, event);
      ret = gst_live_adder_src_activate_push (adder->srcpad, TRUE);
      break;
    case GST_EVENT_EOS:
    {
      GST_OBJECT_LOCK (adder);
      ret = adder->srcresult == GST_FLOW_OK;
      if (ret && !padprivate->eos) {
        GST_DEBUG_OBJECT (adder, "queuing EOS");
        padprivate->eos = TRUE;
        g_cond_broadcast (adder->not_empty_cond);
      } else if (padprivate->eos) {
        GST_DEBUG_OBJECT (adder, "dropping EOS, we are already EOS");
      } else {
        GST_DEBUG_OBJECT (adder, "dropping EOS, reason %s",
            gst_flow_get_name (adder->srcresult));
      }
      GST_OBJECT_UNLOCK (adder);
      gst_event_unref (event);
      break;
    }
    default:
      ret = gst_pad_push_event (adder->srcpad, event);
      break;
  }

done:
  gst_object_unref (adder);
  return ret;

  /* ERRORS */
newseg_wrong_format:
  {
    GST_DEBUG_OBJECT (adder, "received non TIME newsegment");
    ret = FALSE;
    goto done;
  }
}

static gboolean
gst_live_adder_query_pos_dur (GstLiveAdder * adder, GstFormat informat,
    gboolean position, gint64 * outvalue)
{
  gint64 max = G_MININT64;
  gboolean res = TRUE;
  GstIterator *it;
  gboolean done = FALSE;

  it = gst_element_iterate_sink_pads (GST_ELEMENT_CAST (adder));
  while (!done) {
    GstIteratorResult ires;
    gpointer item;
    GstFormat format = informat;

    ires = gst_iterator_next (it, &item);
    switch (ires) {
      case GST_ITERATOR_DONE:
        done = TRUE;
        break;
      case GST_ITERATOR_OK:
      {
        GstPad *pad = GST_PAD_CAST (item);
        gint64 value;
        gboolean curres;

        /* ask sink peer for duration */
        if (position)
          curres = gst_pad_query_peer_position (pad, &format,
&value); else curres = gst_pad_query_peer_duration (pad, &format, &value); /* take max from all valid return values */ /* Only if the format is the one we requested, otherwise ignore it ? */ if (curres && format == informat) { res &= curres; /* valid unknown length, stop searching */ if (value == -1) { max = value; done = TRUE; } else if (value > max) { max = value; } } break; } case GST_ITERATOR_RESYNC: max = -1; res = TRUE; break; default: res = FALSE; done = TRUE; break; } } gst_iterator_free (it); if (res) *outvalue = max; return res; } /* FIXME: * * When we add a new stream (or remove a stream) the duration might * also become invalid again and we need to post a new DURATION * message to notify this fact to the parent. * For now we take the max of all the upstream elements so the simple * cases work at least somewhat. */ static gboolean gst_live_adder_query_duration (GstLiveAdder * adder, GstQuery * query) { GstFormat format; gint64 max; gboolean res; /* parse format */ gst_query_parse_duration (query, &format, NULL); res = gst_live_adder_query_pos_dur (adder, format, FALSE, &max); if (res) { /* and store the max */ gst_query_set_duration (query, format, max); } return res; } static gboolean gst_live_adder_query_position (GstLiveAdder * adder, GstQuery * query) { GstFormat format; gint64 max; gboolean res; /* parse format */ gst_query_parse_position (query, &format, NULL); res = gst_live_adder_query_pos_dur (adder, format, TRUE, &max); if (res) { /* and store the max */ gst_query_set_position (query, format, max); } return res; } static gboolean gst_live_adder_query (GstPad * pad, GstQuery * query) { GstLiveAdder *adder; gboolean res = FALSE; adder = GST_LIVE_ADDER (gst_pad_get_parent (pad)); switch (GST_QUERY_TYPE (query)) { case GST_QUERY_LATENCY: { /* We need to send the query upstream and add the returned latency to our * own */ GstClockTime min_latency = 0, max_latency = G_MAXUINT64; gpointer item; GstIterator *iter = NULL; gboolean done = FALSE; iter = 
gst_element_iterate_sink_pads (GST_ELEMENT (adder)); while (!done) { switch (gst_iterator_next (iter, &item)) { case GST_ITERATOR_OK: { GstPad *sinkpad = item; GstClockTime pad_min_latency, pad_max_latency; gboolean pad_us_live; if (gst_pad_peer_query (sinkpad, query)) { gst_query_parse_latency (query, &pad_us_live, &pad_min_latency, &pad_max_latency); res = TRUE; GST_DEBUG_OBJECT (adder, "Peer latency for pad %s: min %" GST_TIME_FORMAT " max %" GST_TIME_FORMAT, GST_PAD_NAME (sinkpad), GST_TIME_ARGS (pad_min_latency), GST_TIME_ARGS (pad_max_latency)); min_latency = MAX (pad_min_latency, min_latency); max_latency = MIN (pad_max_latency, max_latency); } gst_object_unref (item); } break; case GST_ITERATOR_RESYNC: min_latency = 0; max_latency = G_MAXUINT64; gst_iterator_resync (iter); break; case GST_ITERATOR_ERROR: GST_ERROR_OBJECT (adder, "Error looping sink pads"); done = TRUE; break; case GST_ITERATOR_DONE: done = TRUE; break; } } gst_iterator_free (iter); if (res) { GstClockTime my_latency = adder->latency_ms * GST_MSECOND; GST_OBJECT_LOCK (adder); adder->peer_latency = min_latency; min_latency += my_latency; GST_OBJECT_UNLOCK (adder); /* Make sure we don't risk an overflow */ if (max_latency < G_MAXUINT64 - my_latency) max_latency += my_latency; else max_latency = G_MAXUINT64; gst_query_set_latency (query, TRUE, min_latency, max_latency); GST_DEBUG_OBJECT (adder, "Calculated total latency : min %" GST_TIME_FORMAT " max %" GST_TIME_FORMAT, GST_TIME_ARGS (min_latency), GST_TIME_ARGS (max_latency)); } break; } case GST_QUERY_DURATION: res = gst_live_adder_query_duration (adder, query); break; case GST_QUERY_POSITION: res = gst_live_adder_query_position (adder, query); break; default: res = gst_pad_query_default (pad, query); break; } gst_object_unref (adder); return res; } static gboolean forward_event_func (GstPad * pad, GValue * ret, GstEvent * event) { gst_event_ref (event); GST_LOG_OBJECT (pad, "About to send event %s", GST_EVENT_TYPE_NAME (event)); if 
(!gst_pad_push_event (pad, event)) { g_value_set_boolean (ret, FALSE); GST_WARNING_OBJECT (pad, "Sending event %p (%s) failed.", event, GST_EVENT_TYPE_NAME (event)); } else { GST_LOG_OBJECT (pad, "Sent event %p (%s).", event, GST_EVENT_TYPE_NAME (event)); } /* unref the pad because of a FIXME in gst_iterator_unfold * it does a gst_iterator_next which refs the pad, but it never unrefs it */ gst_object_unref (pad); return TRUE; } /* forwards the event to all sinkpads, takes ownership of the * event * * Returns: TRUE if the event could be forwarded on all * sinkpads. */ static gboolean forward_event (GstLiveAdder * adder, GstEvent * event) { gboolean ret; GstIterator *it; GValue vret = { 0 }; GST_LOG_OBJECT (adder, "Forwarding event %p (%s)", event, GST_EVENT_TYPE_NAME (event)); ret = TRUE; g_value_init (&vret, G_TYPE_BOOLEAN); g_value_set_boolean (&vret, TRUE); it = gst_element_iterate_sink_pads (GST_ELEMENT_CAST (adder)); gst_iterator_fold (it, (GstIteratorFoldFunction) forward_event_func, &vret, event); gst_iterator_free (it); ret = g_value_get_boolean (&vret); return ret; } static gboolean gst_live_adder_src_event (GstPad * pad, GstEvent * event) { GstLiveAdder *adder; gboolean result; adder = GST_LIVE_ADDER (gst_pad_get_parent (pad)); switch (GST_EVENT_TYPE (event)) { case GST_EVENT_QOS: /* TODO : QoS might be tricky */ result = FALSE; break; case GST_EVENT_NAVIGATION: /* TODO : navigation is rather pointless. 
*/ result = FALSE; break; default: /* just forward the rest for now */ result = forward_event (adder, event); break; } gst_event_unref (event); gst_object_unref (adder); return result; } static guint gst_live_adder_length_from_duration (GstLiveAdder *adder, GstClockTime duration) { guint64 ret = (duration * adder->rate / GST_SECOND) * adder->bps; return (guint) ret; } static GstFlowReturn gst_live_live_adder_chain (GstPad *pad, GstBuffer *buffer) { GstLiveAdder *adder = GST_LIVE_ADDER (gst_pad_get_parent_element (pad)); GstLiveAdderPadPrivate *padprivate = NULL; GstFlowReturn ret = GST_FLOW_OK; GList *item = NULL; GstClockTime skip = 0; gint64 drift = 0; /* Positive if new buffer after old buffer */ GST_OBJECT_LOCK (adder); ret = adder->srcresult; GST_DEBUG ("Incoming buffer time:%"GST_TIME_FORMAT" duration:%"GST_TIME_FORMAT, GST_TIME_ARGS(GST_BUFFER_TIMESTAMP(buffer)), GST_TIME_ARGS(GST_BUFFER_DURATION(buffer))); if (ret != GST_FLOW_OK) { GST_DEBUG_OBJECT (adder, "Passing non-ok result from src: %s", gst_flow_get_name (ret)); gst_buffer_unref (buffer); goto out; } padprivate = gst_pad_get_element_private (pad); if (!padprivate) { ret = GST_FLOW_NOT_LINKED; gst_buffer_unref (buffer); goto out; } if (padprivate->eos) { GST_DEBUG_OBJECT (adder, "Received buffer after EOS"); ret = GST_FLOW_UNEXPECTED; gst_buffer_unref (buffer); goto out; } if (!GST_BUFFER_TIMESTAMP_IS_VALID(buffer)) goto invalid_timestamp; if (padprivate->segment.format == GST_FORMAT_UNDEFINED) { GST_WARNING_OBJECT (adder, "No new-segment received," " initializing segment with time 0..-1"); gst_segment_init (&padprivate->segment, GST_FORMAT_TIME); gst_segment_set_newsegment (&padprivate->segment, FALSE, 1.0, GST_FORMAT_TIME, 0, -1, 0); } if (padprivate->segment.format != GST_FORMAT_TIME) goto invalid_segment; buffer = gst_buffer_make_metadata_writable (buffer); drift = GST_BUFFER_TIMESTAMP (buffer) - padprivate->expected_timestamp; /* Just see if we receive invalid timestamp/durations */ if 
(GST_CLOCK_TIME_IS_VALID (padprivate->expected_timestamp) &&
      !GST_BUFFER_FLAG_IS_SET (buffer, GST_BUFFER_FLAG_DISCONT) &&
      (drift != 0)) {
    GST_LOG_OBJECT (adder,
        "Timestamp discontinuity without the DISCONT flag set"
        " (expected %" GST_TIME_FORMAT ", got %" GST_TIME_FORMAT
        " drift:%ldms)",
        GST_TIME_ARGS (padprivate->expected_timestamp),
        GST_TIME_ARGS (GST_BUFFER_TIMESTAMP (buffer)),
        (long int) (drift / GST_MSECOND));

    /* We accept drifts of up to 10ms */
    if (ABS (drift) < (10 * GST_MSECOND)) {
      GST_DEBUG ("Correcting minor drift");
      GST_BUFFER_TIMESTAMP (buffer) = padprivate->expected_timestamp;
    }
  }

  /* If there is no duration, let's set one */
  if (!GST_BUFFER_DURATION_IS_VALID (buffer)) {
    GST_BUFFER_DURATION (buffer) =
        gst_audio_duration_from_pad_buffer (pad, buffer);
    padprivate->expected_timestamp = GST_CLOCK_TIME_NONE;
  } else {
    padprivate->expected_timestamp = GST_BUFFER_TIMESTAMP (buffer) +
        GST_BUFFER_DURATION (buffer);
  }

  /*
   * Let's clip the buffer to the segment (so we don't have to worry about
   * clipping afterwards).
 * This should also guarantee us that we'll have valid timestamps and
   * durations afterwards
   */
  buffer = gst_audio_buffer_clip (buffer, &padprivate->segment, adder->rate,
      adder->bps);

  /* buffer can be NULL if it's completely outside of the segment */
  if (!buffer) {
    GST_DEBUG ("Buffer completely outside of configured segment, dropping it");
    goto out;
  }

  /*
   * Make sure all incoming buffers share the same timestamping
   */
  GST_BUFFER_TIMESTAMP (buffer) =
      gst_segment_to_running_time (&padprivate->segment,
      padprivate->segment.format, GST_BUFFER_TIMESTAMP (buffer));

  if (GST_CLOCK_TIME_IS_VALID (adder->next_timestamp) &&
      GST_BUFFER_TIMESTAMP (buffer) < adder->next_timestamp) {
    if (GST_BUFFER_TIMESTAMP (buffer) + GST_BUFFER_DURATION (buffer) <
        adder->next_timestamp) {
      GST_DEBUG_OBJECT (adder, "Buffer is late, dropping (ts: %"
          GST_TIME_FORMAT " duration: %" GST_TIME_FORMAT ")",
          GST_TIME_ARGS (GST_BUFFER_TIMESTAMP (buffer)),
          GST_TIME_ARGS (GST_BUFFER_DURATION (buffer)));
      gst_buffer_unref (buffer);
      goto out;
    } else {
      skip = adder->next_timestamp - GST_BUFFER_TIMESTAMP (buffer);
      GST_DEBUG_OBJECT (adder, "Buffer is partially late, skipping %"
          GST_TIME_FORMAT, GST_TIME_ARGS (skip));
    }
  }

  /* If our new buffer's head is higher than the queue's head, let's wake up,
   * we may not have to wait for as long
   */
  if (adder->clock_id && g_queue_peek_head (adder->buffers) != NULL &&
      GST_BUFFER_TIMESTAMP (buffer) + skip <
      GST_BUFFER_TIMESTAMP (g_queue_peek_head (adder->buffers)))
    gst_clock_id_unschedule (adder->clock_id);

  for (item = g_queue_peek_head_link (adder->buffers);
      item; item = g_list_next (item)) {
    GstBuffer *oldbuffer = item->data;
    GstClockTime old_skip = 0;
    GstClockTime mix_duration = 0;
    GstClockTime mix_start = 0;
    GstClockTime mix_end = 0;

    /* We haven't reached our place yet */
    if (GST_BUFFER_TIMESTAMP (buffer) + skip >=
        GST_BUFFER_TIMESTAMP (oldbuffer) + GST_BUFFER_DURATION (oldbuffer))
      continue;

    /* We're past our place, let's insert ourselves here */
    if (GST_BUFFER_TIMESTAMP (buffer)
+ GST_BUFFER_DURATION (buffer) <= GST_BUFFER_TIMESTAMP (oldbuffer)) break; /* if we reach this spot, we have overlap, so we must mix */ /* First make a subbuffer with the non-overlapping part */ if (GST_BUFFER_TIMESTAMP (buffer) + skip < GST_BUFFER_TIMESTAMP (oldbuffer)) { GstBuffer *subbuffer = NULL; GstClockTime subbuffer_duration = GST_BUFFER_TIMESTAMP (oldbuffer) - (GST_BUFFER_TIMESTAMP (buffer) + skip); subbuffer = gst_buffer_create_sub (buffer, gst_live_adder_length_from_duration (adder, skip), gst_live_adder_length_from_duration (adder, subbuffer_duration)); GST_BUFFER_TIMESTAMP (subbuffer) = GST_BUFFER_TIMESTAMP (buffer) + skip; GST_BUFFER_DURATION (subbuffer) = subbuffer_duration; skip += subbuffer_duration; g_queue_insert_before (adder->buffers, item, subbuffer); } /* Now we are on the overlapping part */ oldbuffer = gst_buffer_make_writable (oldbuffer); item->data = oldbuffer; old_skip = GST_BUFFER_TIMESTAMP (buffer) + skip - GST_BUFFER_TIMESTAMP (oldbuffer); mix_start = GST_BUFFER_TIMESTAMP (oldbuffer) + old_skip; if (GST_BUFFER_TIMESTAMP (buffer) + GST_BUFFER_DURATION (buffer) < GST_BUFFER_TIMESTAMP (oldbuffer) + GST_BUFFER_DURATION (oldbuffer)) mix_end = GST_BUFFER_TIMESTAMP (buffer) + GST_BUFFER_DURATION (buffer); else mix_end = GST_BUFFER_TIMESTAMP (oldbuffer) + GST_BUFFER_DURATION (oldbuffer); mix_duration = mix_end - mix_start; adder->func (GST_BUFFER_DATA (oldbuffer) + gst_live_adder_length_from_duration (adder, old_skip), GST_BUFFER_DATA (buffer) + gst_live_adder_length_from_duration (adder, skip), gst_live_adder_length_from_duration (adder, mix_duration)); skip += mix_duration; } g_cond_broadcast (adder->not_empty_cond); if (skip == GST_BUFFER_DURATION (buffer)) { gst_buffer_unref (buffer); } else { if (skip) { GstClockTime subbuffer_duration = GST_BUFFER_DURATION (buffer) - skip; GstClockTime subbuffer_ts = GST_BUFFER_TIMESTAMP (buffer) + skip; buffer = gst_buffer_create_sub (buffer, gst_live_adder_length_from_duration (adder, skip), 
gst_live_adder_length_from_duration (adder, subbuffer_duration)); GST_BUFFER_TIMESTAMP (buffer) = subbuffer_ts; GST_BUFFER_DURATION (buffer) = subbuffer_duration; } if (item) g_queue_insert_before (adder->buffers, item, buffer); else g_queue_push_tail (adder->buffers, buffer); } out: GST_OBJECT_UNLOCK (adder); gst_object_unref (adder); return ret; invalid_timestamp: GST_OBJECT_UNLOCK (adder); gst_buffer_unref (buffer); GST_ELEMENT_ERROR (adder, STREAM, FAILED, ("Buffer without a valid timestamp received"), ("Invalid timestamp received on buffer")); return GST_FLOW_ERROR; invalid_segment: { const gchar *format = gst_format_get_name (padprivate->segment.format); GST_OBJECT_UNLOCK (adder); gst_buffer_unref (buffer); GST_ELEMENT_ERROR (adder, STREAM, FAILED, ("This element only supports TIME segments, received other type"), ("Received a segment of type %s, only support time segment", format)); return GST_FLOW_ERROR; } } /* * This only works because the GstObject lock is taken * * It checks if all sink pads are EOS */ static gboolean check_eos_locked (GstLiveAdder *adder) { GList *item; /* We can't be EOS if we have no sinkpads */ if (adder->sinkpads == NULL) return FALSE; for (item = adder->sinkpads; item; item = g_list_next (item)) { GstPad *pad = item->data; GstLiveAdderPadPrivate *padprivate = gst_pad_get_element_private (pad); if (padprivate && padprivate->eos != TRUE) return FALSE; } return TRUE; } static void gst_live_adder_loop (gpointer data) { GstLiveAdder *adder = GST_LIVE_ADDER (data); GstClockTime buffer_timestamp = 0; GstClockTime sync_time = 0; GstClock *clock = NULL; GstClockID id = NULL; GstClockReturn ret; GstBuffer *buffer = NULL; GstFlowReturn result; GstEvent *newseg_event = NULL; GST_OBJECT_LOCK (adder); again: for (;;) { if (adder->srcresult != GST_FLOW_OK) goto flushing; if (!g_queue_is_empty (adder->buffers)) break; if (check_eos_locked (adder)) goto eos; g_cond_wait (adder->not_empty_cond, GST_OBJECT_GET_LOCK(adder)); } buffer_timestamp = 
      GST_BUFFER_TIMESTAMP (g_queue_peek_head (adder->buffers));

  clock = GST_ELEMENT_CLOCK (adder);

  /* If we have no clock, then we can't do anything. Error out. */
  if (!clock) {
    if (adder->playing)
      goto no_clock;
    else
      goto push_buffer;
  }

  GST_DEBUG_OBJECT (adder, "sync to timestamp %" GST_TIME_FORMAT,
      GST_TIME_ARGS (buffer_timestamp));

  sync_time = buffer_timestamp + GST_ELEMENT_CAST (adder)->base_time;
  /* add latency, this includes our own latency and the peer latency. */
  sync_time += adder->latency_ms * GST_MSECOND;
  sync_time += adder->peer_latency;

  /* create an entry for the clock */
  id = adder->clock_id = gst_clock_new_single_shot_id (clock, sync_time);
  GST_OBJECT_UNLOCK (adder);

  ret = gst_clock_id_wait (id, NULL);

  GST_OBJECT_LOCK (adder);

  /* and free the entry */
  gst_clock_id_unref (id);
  adder->clock_id = NULL;

  /* at this point, the clock could have been unlocked by a timeout, a new
   * head element was added to the queue or because we are shutting down. Check
   * for shutdown first. */
  if (adder->srcresult != GST_FLOW_OK)
    goto flushing;

  if (ret == GST_CLOCK_UNSCHEDULED) {
    GST_DEBUG_OBJECT (adder,
        "Wait got unscheduled, will retry to push with new buffer");
    goto again;
  }

  if (ret != GST_CLOCK_OK && ret != GST_CLOCK_EARLY)
    goto clock_error;

push_buffer:
  buffer = g_queue_pop_head (adder->buffers);

  if (!buffer)
    goto again;

  /*
   * We make sure the timestamps are exactly contiguous.
   * If it's only a small skew (due to rounding errors), we correct it
   * silently. Otherwise we set the discont flag.
   */
  if (GST_CLOCK_TIME_IS_VALID (adder->next_timestamp) &&
      GST_BUFFER_TIMESTAMP (buffer) != adder->next_timestamp) {
    GstClockTimeDiff diff = GST_CLOCK_DIFF (GST_BUFFER_TIMESTAMP (buffer),
        adder->next_timestamp);
    if (diff < 0)
      diff = -diff;

    if (diff < GST_SECOND / adder->rate) {
      GST_BUFFER_TIMESTAMP (buffer) = adder->next_timestamp;
      GST_DEBUG_OBJECT (adder, "Correcting slight skew");
      GST_BUFFER_FLAG_UNSET (buffer, GST_BUFFER_FLAG_DISCONT);
    } else {
      GST_BUFFER_FLAG_SET (buffer, GST_BUFFER_FLAG_DISCONT);
      GST_DEBUG_OBJECT (adder, "Expected buffer at %" GST_TIME_FORMAT
          ", but is at %" GST_TIME_FORMAT ", setting discont",
          GST_TIME_ARGS (adder->next_timestamp),
          GST_TIME_ARGS (GST_BUFFER_TIMESTAMP (buffer)));
    }
  } else {
    GST_BUFFER_FLAG_UNSET (buffer, GST_BUFFER_FLAG_DISCONT);
  }

  GST_BUFFER_OFFSET (buffer) = GST_BUFFER_OFFSET_NONE;
  GST_BUFFER_OFFSET_END (buffer) = GST_BUFFER_OFFSET_NONE;

  if (GST_BUFFER_DURATION_IS_VALID (buffer))
    adder->next_timestamp = GST_BUFFER_TIMESTAMP (buffer) +
        GST_BUFFER_DURATION (buffer);
  else
    adder->next_timestamp = GST_CLOCK_TIME_NONE;

  if (adder->segment_pending) {
    /*
     * We set the start at 0, because we re-timestamp to the running time
     */
    newseg_event = gst_event_new_new_segment_full (FALSE, 1.0, 1.0,
        GST_FORMAT_TIME, 0, -1, 0);
    adder->segment_pending = FALSE;
  }

  GST_OBJECT_UNLOCK (adder);

  if (newseg_event)
    gst_pad_push_event (adder->srcpad, newseg_event);

  GST_DEBUG ("About to push buffer time:%" GST_TIME_FORMAT
      " duration:%" GST_TIME_FORMAT,
      GST_TIME_ARGS (GST_BUFFER_TIMESTAMP (buffer)),
      GST_TIME_ARGS (GST_BUFFER_DURATION (buffer)));

  result = gst_pad_push (adder->srcpad, buffer);
  if (result != GST_FLOW_OK)
    goto pause;

  return;

flushing:
  {
    GST_DEBUG_OBJECT (adder, "we are flushing");
    gst_pad_pause_task (adder->srcpad);
    GST_OBJECT_UNLOCK (adder);
    return;
  }

clock_error:
  {
    gst_pad_pause_task (adder->srcpad);
    GST_OBJECT_UNLOCK (adder);
    GST_ELEMENT_ERROR (adder, STREAM, MUX, ("Error with the clock"),
        ("Error with the clock: %d",
ret)); GST_ERROR_OBJECT (adder, "Error with the clock: %d", ret); return; } no_clock: { gst_pad_pause_task (adder->srcpad); GST_OBJECT_UNLOCK (adder); GST_ELEMENT_ERROR (adder, STREAM, MUX, ("No available clock"), ("No available clock")); GST_ERROR_OBJECT (adder, "No available clock"); return; } pause: { const gchar *reason = gst_flow_get_name (result); GST_DEBUG_OBJECT (adder, "pausing task, reason %s", reason); GST_OBJECT_LOCK (adder); /* store result */ adder->srcresult = result; /* we don't post errors or anything because upstream will do that for us * when we pass the return value upstream. */ gst_pad_pause_task (adder->srcpad); GST_OBJECT_UNLOCK (adder); return; } eos: { /* store result, we are flushing now */ GST_DEBUG_OBJECT (adder, "We are EOS, pushing EOS downstream"); adder->srcresult = GST_FLOW_UNEXPECTED; gst_pad_pause_task (adder->srcpad); GST_OBJECT_UNLOCK (adder); gst_pad_push_event (adder->srcpad, gst_event_new_eos ()); return; } } static GstPad * gst_live_adder_request_new_pad (GstElement * element, GstPadTemplate * templ, const gchar * unused) { gchar *name; GstLiveAdder *adder; GstPad *newpad; gint padcount; GstLiveAdderPadPrivate *padprivate = NULL; if (templ->direction != GST_PAD_SINK) goto not_sink; adder = GST_LIVE_ADDER (element); /* increment pad counter */ padcount = g_atomic_int_exchange_and_add (&adder->padcount, 1); name = g_strdup_printf ("sink%d", padcount); newpad = gst_pad_new_from_template (templ, name); GST_DEBUG_OBJECT (adder, "request new pad %s", name); g_free (name); gst_pad_set_getcaps_function (newpad, GST_DEBUG_FUNCPTR (gst_live_adder_sink_getcaps)); gst_pad_set_setcaps_function (newpad, GST_DEBUG_FUNCPTR (gst_live_adder_setcaps)); gst_pad_set_event_function (newpad, GST_DEBUG_FUNCPTR (gst_live_adder_sink_event)); padprivate = g_new0 (GstLiveAdderPadPrivate, 1); gst_segment_init (&padprivate->segment, GST_FORMAT_UNDEFINED); padprivate->eos = FALSE; padprivate->expected_timestamp = GST_CLOCK_TIME_NONE; 
gst_pad_set_element_private (newpad, padprivate); gst_pad_set_chain_function (newpad, gst_live_live_adder_chain); if (!gst_pad_set_active (newpad, TRUE)) goto could_not_activate; /* takes ownership of the pad */ if (!gst_element_add_pad (GST_ELEMENT (adder), newpad)) goto could_not_add; GST_OBJECT_LOCK (adder); adder->sinkpads = g_list_prepend (adder->sinkpads, newpad); GST_OBJECT_UNLOCK (adder); return newpad; /* errors */ not_sink: { g_warning ("gstadder: request new pad that is not a SINK pad\n"); return NULL; } could_not_add: { GST_DEBUG_OBJECT (adder, "could not add pad"); g_free (padprivate); gst_object_unref (newpad); return NULL; } could_not_activate: { GST_DEBUG_OBJECT (adder, "could not activate new pad"); g_free (padprivate); gst_object_unref (newpad); return NULL; } } static void gst_live_adder_release_pad (GstElement * element, GstPad * pad) { GstLiveAdder *adder; GstLiveAdderPadPrivate *padprivate; adder = GST_LIVE_ADDER (element); GST_DEBUG_OBJECT (adder, "release pad %s:%s", GST_DEBUG_PAD_NAME (pad)); GST_OBJECT_LOCK (element); padprivate = gst_pad_get_element_private (pad); gst_pad_set_element_private (pad, NULL); adder->sinkpads = g_list_remove_all (adder->sinkpads, pad); GST_OBJECT_UNLOCK (element); g_free (padprivate); gst_element_remove_pad (element, pad); } static void reset_pad_private (GstPad *pad) { GstLiveAdderPadPrivate *padprivate; padprivate = gst_pad_get_element_private (pad); if (!padprivate) return; gst_segment_init (&padprivate->segment, GST_FORMAT_UNDEFINED); padprivate->expected_timestamp = GST_CLOCK_TIME_NONE; padprivate->eos = FALSE; } static GstStateChangeReturn gst_live_adder_change_state (GstElement * element, GstStateChange transition) { GstLiveAdder *adder; GstStateChangeReturn ret; adder = GST_LIVE_ADDER (element); switch (transition) { case GST_STATE_CHANGE_READY_TO_PAUSED: GST_OBJECT_LOCK (adder); adder->segment_pending = TRUE; adder->peer_latency = 0; adder->next_timestamp = GST_CLOCK_TIME_NONE; g_list_foreach 
(adder->sinkpads, (GFunc) reset_pad_private, NULL); GST_OBJECT_UNLOCK (adder); break; case GST_STATE_CHANGE_PLAYING_TO_PAUSED: GST_OBJECT_LOCK (adder); adder->playing = FALSE; GST_OBJECT_UNLOCK (adder); break; default: break; } ret = GST_ELEMENT_CLASS (parent_class)->change_state (element, transition); switch (transition) { case GST_STATE_CHANGE_PAUSED_TO_PLAYING: GST_OBJECT_LOCK (adder); adder->playing = TRUE; GST_OBJECT_UNLOCK (adder); break; default: break; } return ret; } static gboolean plugin_init (GstPlugin * plugin) { if (!gst_element_register (plugin, "liveadder", GST_RANK_NONE, GST_TYPE_LIVE_ADDER)) { return FALSE; } return TRUE; } void gstelements_liveadder_register() { gst_plugin_register_static( GST_VERSION_MAJOR, GST_VERSION_MINOR, "liveadder", "Adds multiple live discontinuous streams", plugin_init, "1.0.0", "LGPL", "my-application", "my-application", "http://www.my-application.net/" ); } psimedia-master/gstprovider/gstelements/static/osxaudio_static.c000066400000000000000000000045221220046403000256730ustar00rootroot00000000000000/* GStreamer * Copyright (C) <1999> Erik Walthinsen * Copyright (C) 2007,2008 Pioneers of the Inevitable * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. 
 *
 * The development of this code was made possible due to the involvement of
 * Pioneers of the Inevitable, the creators of the Songbird Music player
 *
 */

/**
 * SECTION:element-osxaudiosink
 * @short_description: play audio to a CoreAudio device
 *
 * This element renders raw audio samples using the CoreAudio API.
 *
 * Example pipelines
 *
 * Play an Ogg/Vorbis file:
 *
 * gst-launch -v filesrc location=sine.ogg ! oggdemux ! vorbisdec ! audioconvert ! audioresample ! osxaudiosink
 *
 * Last reviewed on 2006-03-01 (0.10.4)
 */

#ifdef HAVE_CONFIG_H
#include "config.h"
#endif

#include "../osxaudio/gstosxaudioelement.h"
#include "../osxaudio/gstosxaudiosink.h"
#include "../osxaudio/gstosxaudiosrc.h"

static gboolean
plugin_init (GstPlugin * plugin)
{
  if (!gst_element_register (plugin, "osxaudiosink", GST_RANK_PRIMARY,
          GST_TYPE_OSX_AUDIO_SINK)) {
    return FALSE;
  }

  if (!gst_element_register (plugin, "osxaudiosrc", GST_RANK_PRIMARY,
          GST_TYPE_OSX_AUDIO_SRC)) {
    return FALSE;
  }

  return TRUE;
}

void
gstelements_osxaudio_register()
{
  gst_plugin_register_static(
      GST_VERSION_MAJOR,
      GST_VERSION_MINOR,
      "osxaudio",
      "OSX (Mac OS X) audio support for GStreamer",
      plugin_init,
      "1.0.0",
      "LGPL",
      "my-application",
      "my-application",
      "http://www.my-application.net/");
}

psimedia-master/gstprovider/gstelements/static/osxvideo_static.m

/* GStreamer
 * OSX video sink
 * Copyright (C) 2004-6 Zaheer Abbas Merali
 * Copyright (C) 2007 Pioneers of the Inevitable
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. * * The development of this code was made possible due to the involvement of * Pioneers of the Inevitable, the creators of the Songbird Music player. * */ #ifdef HAVE_CONFIG_H #include "config.h" #endif /* Object header */ #include "../osxvideo/osxvideosink.h" #include "../osxvideo/osxvideosrc.h" static gboolean plugin_init (GstPlugin * plugin) { if (!gst_element_register (plugin, "osxvideosink", GST_RANK_PRIMARY, GST_TYPE_OSX_VIDEO_SINK)) return FALSE; GST_DEBUG_CATEGORY_INIT (gst_debug_osx_video_sink, "osxvideosink", 0, "osxvideosink element"); if (!gst_element_register (plugin, "osxvideosrc", GST_RANK_PRIMARY, GST_TYPE_OSX_VIDEO_SRC)) return FALSE; GST_DEBUG_CATEGORY_INIT (gst_debug_osx_video_src, "osxvideosrc", 0, "osxvideosrc element"); return TRUE; } void gstelements_osxvideo_register() { gst_plugin_register_static( GST_VERSION_MAJOR, GST_VERSION_MINOR, "osxvideo", "OSX native video input/output plugin", plugin_init, "1.0.0", "LGPL", "my-application", "my-application", "http://www.my-application.net/" ); } psimedia-master/gstprovider/gstelements/static/rtpmanager_static.c000066400000000000000000000043011220046403000261730ustar00rootroot00000000000000/* GStreamer * Copyright (C) <2007> Wim Taymans * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. */ #ifdef HAVE_CONFIG_H #include "config.h" #endif #include "../rtpmanager/gstrtpbin.h" #include "../rtpmanager/gstrtpclient.h" #include "../rtpmanager/gstrtpjitterbuffer.h" #include "../rtpmanager/gstrtpptdemux.h" #include "../rtpmanager/gstrtpsession.h" #include "../rtpmanager/gstrtpssrcdemux.h" static gboolean plugin_init (GstPlugin * plugin) { if (!gst_element_register (plugin, "gstrtpbin", GST_RANK_NONE, GST_TYPE_RTP_BIN)) return FALSE; if (!gst_element_register (plugin, "gstrtpclient", GST_RANK_NONE, GST_TYPE_RTP_CLIENT)) return FALSE; if (!gst_element_register (plugin, "gstrtpjitterbuffer", GST_RANK_NONE, GST_TYPE_RTP_JITTER_BUFFER)) return FALSE; if (!gst_element_register (plugin, "gstrtpptdemux", GST_RANK_NONE, GST_TYPE_RTP_PT_DEMUX)) return FALSE; if (!gst_element_register (plugin, "gstrtpsession", GST_RANK_NONE, GST_TYPE_RTP_SESSION)) return FALSE; if (!gst_element_register (plugin, "gstrtpssrcdemux", GST_RANK_NONE, GST_TYPE_RTP_SSRC_DEMUX)) return FALSE; return TRUE; } void gstelements_rtpmanager_register() { gst_plugin_register_static( GST_VERSION_MAJOR, GST_VERSION_MINOR, "gstrtpmanager", "RTP session management plugin library", plugin_init, "1.0.0", "LGPL", "my-application", "my-application", "http://www.my-application.net/" ); } psimedia-master/gstprovider/gstelements/static/speexdsp_static.c000066400000000000000000000034621220046403000256750ustar00rootroot00000000000000/* * Farsight Voice+Video library * * Copyright 2008 Collabora Ltd * Copyright 2008 Nokia Corporation * @author: Olivier Crete * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Lesser General Public * License as published by the Free Software 
 * Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
 *
 */

#include <glib.h>
#include <gst/gst.h>

#include "../speexdsp/speexdsp.h"
#include "../speexdsp/speexechoprobe.h"

/* dsp/probe use these to discover each other */
GStaticMutex global_mutex = G_STATIC_MUTEX_INIT;
GstSpeexDSP * global_dsp = NULL;
GstSpeexEchoProbe * global_probe = NULL;

static gboolean
plugin_init (GstPlugin * plugin)
{
  if (!gst_element_register (plugin, "speexdsp", GST_RANK_NONE,
          GST_TYPE_SPEEX_DSP)) {
    return FALSE;
  }

  if (!gst_element_register (plugin, "speexechoprobe", GST_RANK_NONE,
          GST_TYPE_SPEEX_ECHO_PROBE)) {
    return FALSE;
  }

  return TRUE;
}

void
gstelements_speexdsp_register()
{
  gst_plugin_register_static(
      GST_VERSION_MAJOR,
      GST_VERSION_MINOR,
      "speexdsp",
      "Voice preprocessing using libspeex",
      plugin_init,
      "1.0.0",
      "LGPL",
      "my-application",
      "my-application",
      "http://www.my-application.net/");
}

psimedia-master/gstprovider/gstelements/static/static.pro

TEMPLATE = lib
CONFIG -= qt
CONFIG += staticlib create_prl
TARGET = gstelements_static
DESTDIR = lib

CONFIG += videomaxrate liveadder speexdsp
windows:CONFIG += directsound winks
mac:CONFIG += osxaudio osxvideo

*-g++*:QMAKE_CFLAGS_WARN_ON = -Wall -Wdeclaration-after-statement #-Werror

include(../../gstconf.pri)

videomaxrate {
    include(../videomaxrate.pri)
    DEFINES += HAVE_VIDEOMAXRATE
}

liveadder {
    include(../liveadder.pri)
    DEFINES += HAVE_LIVEADDER
}

speexdsp {
    include(../speexdsp.pri)
    DEFINES += HAVE_SPEEXDSP
}

directsound {
    include(../directsound.pri)
    DEFINES += HAVE_DIRECTSOUND
}

winks {
    include(../winks.pri)
    DEFINES += HAVE_WINKS
}

osxaudio {
    include(../osxaudio.pri)
    DEFINES += HAVE_OSXAUDIO
}

osxvideo {
    include(../osxvideo.pri)
    DEFINES += HAVE_OSXVIDEO
}

HEADERS += gstelements.h
SOURCES += gstelements.c

psimedia-master/gstprovider/gstelements/static/videomaxrate_static.c

/*
 * Copyright (C) 2008 Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301 USA
 *
 */

#ifdef HAVE_CONFIG_H
#include "config.h"
#endif

#include "../videomaxrate/videomaxrate.h"

static gboolean
plugin_init (GstPlugin * plugin)
{
  if (!gst_element_register (plugin, "videomaxrate", GST_RANK_NONE,
          GST_TYPE_VIDEOMAXRATE))
    return FALSE;

  return TRUE;
}

void
gstelements_videomaxrate_register()
{
  gst_plugin_register_static(
      GST_VERSION_MAJOR,
      GST_VERSION_MINOR,
      "videomaxrate",
      "Drop extra frames",
      plugin_init,
      "1.0.0",
      "LGPL",
      "my-application",
      "my-application",
      "http://www.my-application.net/");
}

psimedia-master/gstprovider/gstelements/static/winks_static.c

/*
 * Copyright (C) 2008 Ole André Vadla Ravnås
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 */

/**
 * SECTION:element-ksvideosrc
 *
 * Provides low-latency video capture from WDM cameras on Windows.
 *
 * Example pipelines
 * |[
 * gst-launch -v ksvideosrc do-stats=TRUE ! ffmpegcolorspace ! dshowvideosink
 * ]| Capture from a camera and render using dshowvideosink.
 * |[
 * gst-launch -v ksvideosrc do-stats=TRUE ! image/jpeg, width=640, height=480
 * ! jpegdec ! ffmpegcolorspace ! dshowvideosink
 * ]| Capture from an MJPEG camera and render using dshowvideosink.
 *
 */

#ifdef HAVE_CONFIG_H
# include <config.h>
#endif

#include "../winks/gstksvideosrc.h"
#include "../winks/gstksclock.h"
#include "../winks/gstksvideodevice.h"
#include "../winks/kshelpers.h"
#include "../winks/ksvideohelpers.h"

#define DEFAULT_DEVICE_PATH NULL
#define DEFAULT_DEVICE_NAME NULL
#define DEFAULT_DEVICE_INDEX -1
#define DEFAULT_DO_STATS FALSE
#define DEFAULT_ENABLE_QUIRKS TRUE

enum
{
  PROP_0,
  PROP_DEVICE_PATH,
  PROP_DEVICE_NAME,
  PROP_DEVICE_INDEX,
  PROP_DO_STATS,
  PROP_FPS,
  PROP_ENABLE_QUIRKS,
};

GST_DEBUG_CATEGORY (gst_ks_debug);
#define GST_CAT_DEFAULT gst_ks_debug

#define KS_WORKER_LOCK(priv) g_mutex_lock (priv->worker_lock)
#define KS_WORKER_UNLOCK(priv) g_mutex_unlock (priv->worker_lock)
#define KS_WORKER_WAIT(priv) \
    g_cond_wait (priv->worker_notify_cond, priv->worker_lock)
#define KS_WORKER_NOTIFY(priv) g_cond_signal (priv->worker_notify_cond)
#define KS_WORKER_WAIT_FOR_RESULT(priv) \
    g_cond_wait (priv->worker_result_cond, priv->worker_lock)
#define KS_WORKER_NOTIFY_RESULT(priv) \
    g_cond_signal (priv->worker_result_cond)

typedef enum
{
  KS_WORKER_STATE_STARTING,
  KS_WORKER_STATE_READY,
  KS_WORKER_STATE_STOPPING,
  KS_WORKER_STATE_ERROR,
} KsWorkerState;

typedef struct
{
  /* Properties */
  gchar *device_path;
  gchar *device_name;
  gint device_index;
  gboolean do_stats;
  gboolean enable_quirks;

  /* State */
  GstKsClock *ksclock;
  GstKsVideoDevice *device;
  guint64 offset;
  GstClockTime prev_ts;
  gboolean running;

  /* Worker thread */
  GThread *worker_thread;
  GMutex *worker_lock;
  GCond *worker_notify_cond;
  GCond *worker_result_cond;
  KsWorkerState worker_state;
  GstCaps *worker_pending_caps;
  gboolean worker_setcaps_result;
  gboolean worker_pending_run;
  gboolean worker_run_result;

  /* Statistics */
  GstClockTime last_sampling;
  guint count;
  guint fps;
} GstKsVideoSrcPrivate;

#define GST_KS_VIDEO_SRC_GET_PRIVATE(o) \
    (G_TYPE_INSTANCE_GET_PRIVATE ((o), GST_TYPE_KS_VIDEO_SRC, \
        GstKsVideoSrcPrivate))

static void gst_ks_video_src_finalize (GObject * object);
static void gst_ks_video_src_get_property (GObject * object, guint prop_id,
    GValue * value, GParamSpec * pspec);
static void gst_ks_video_src_set_property (GObject * object, guint prop_id,
    const GValue * value, GParamSpec * pspec);
static void gst_ks_video_src_reset (GstKsVideoSrc * self);

static GstStateChangeReturn gst_ks_video_src_change_state (GstElement * element,
    GstStateChange transition);
static gboolean gst_ks_video_src_set_clock (GstElement * element,
    GstClock * clock);

static GstCaps *gst_ks_video_src_get_caps (GstBaseSrc * basesrc);
static gboolean gst_ks_video_src_set_caps (GstBaseSrc * basesrc,
    GstCaps * caps);
static void gst_ks_video_src_fixate (GstBaseSrc * basesrc, GstCaps * caps);
static gboolean gst_ks_video_src_query (GstBaseSrc * basesrc, GstQuery * query);
static gboolean gst_ks_video_src_unlock (GstBaseSrc * basesrc);
static gboolean gst_ks_video_src_unlock_stop (GstBaseSrc * basesrc);

static GstFlowReturn gst_ks_video_src_create (GstPushSrc * pushsrc,
    GstBuffer ** buffer);

GST_BOILERPLATE (GstKsVideoSrc, gst_ks_video_src, GstPushSrc,
    GST_TYPE_PUSH_SRC);

static void
gst_ks_video_src_base_init (gpointer gclass)
{
  GstElementClass *element_class = GST_ELEMENT_CLASS (gclass);
  static GstElementDetails element_details = {
    "KsVideoSrc",
    "Source/Video",
    "Stream data from a video capture device through Windows kernel streaming",
    "Ole André Vadla Ravnås \n"
    "Haakon Sporsheim "
  };

  gst_element_class_set_details (element_class, &element_details);

  gst_element_class_add_pad_template (element_class,
      gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS,
          ks_video_get_all_caps ()));
}

static void
gst_ks_video_src_class_init (GstKsVideoSrcClass * klass)
{
  GObjectClass *gobject_class = G_OBJECT_CLASS (klass);
  GstElementClass *gstelement_class = GST_ELEMENT_CLASS (klass);
  GstBaseSrcClass *gstbasesrc_class =
GST_BASE_SRC_CLASS (klass); GstPushSrcClass *gstpushsrc_class = GST_PUSH_SRC_CLASS (klass); g_type_class_add_private (klass, sizeof (GstKsVideoSrcPrivate)); gobject_class->finalize = gst_ks_video_src_finalize; gobject_class->get_property = gst_ks_video_src_get_property; gobject_class->set_property = gst_ks_video_src_set_property; gstelement_class->change_state = GST_DEBUG_FUNCPTR (gst_ks_video_src_change_state); gstelement_class->set_clock = GST_DEBUG_FUNCPTR (gst_ks_video_src_set_clock); gstbasesrc_class->get_caps = GST_DEBUG_FUNCPTR (gst_ks_video_src_get_caps); gstbasesrc_class->set_caps = GST_DEBUG_FUNCPTR (gst_ks_video_src_set_caps); gstbasesrc_class->fixate = GST_DEBUG_FUNCPTR (gst_ks_video_src_fixate); gstbasesrc_class->query = GST_DEBUG_FUNCPTR (gst_ks_video_src_query); gstbasesrc_class->unlock = GST_DEBUG_FUNCPTR (gst_ks_video_src_unlock); gstbasesrc_class->unlock_stop = GST_DEBUG_FUNCPTR (gst_ks_video_src_unlock_stop); gstpushsrc_class->create = GST_DEBUG_FUNCPTR (gst_ks_video_src_create); g_object_class_install_property (gobject_class, PROP_DEVICE_PATH, g_param_spec_string ("device-path", "Device Path", "The device path", DEFAULT_DEVICE_PATH, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_DEVICE_NAME, g_param_spec_string ("device-name", "Device Name", "The human-readable device name", DEFAULT_DEVICE_NAME, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_DEVICE_INDEX, g_param_spec_int ("device-index", "Device Index", "The zero-based device index", -1, G_MAXINT, DEFAULT_DEVICE_INDEX, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_DO_STATS, g_param_spec_boolean ("do-stats", "Enable statistics", "Enable logging of statistics", DEFAULT_DO_STATS, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_FPS, g_param_spec_int ("fps", "Frames per second", "Last measured 
framerate, if statistics are enabled", -1, G_MAXINT, -1, G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_ENABLE_QUIRKS, g_param_spec_boolean ("enable-quirks", "Enable quirks", "Enable driver-specific quirks", DEFAULT_ENABLE_QUIRKS, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); GST_DEBUG_CATEGORY_INIT (gst_ks_debug, "ksvideosrc", 0, "Kernel streaming video source"); } static void gst_ks_video_src_init (GstKsVideoSrc * self, GstKsVideoSrcClass * gclass) { GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); GstBaseSrc *basesrc = GST_BASE_SRC (self); gst_base_src_set_live (basesrc, TRUE); gst_base_src_set_format (basesrc, GST_FORMAT_TIME); gst_ks_video_src_reset (self); priv->device_path = DEFAULT_DEVICE_PATH; priv->device_name = DEFAULT_DEVICE_NAME; priv->device_index = DEFAULT_DEVICE_INDEX; priv->do_stats = DEFAULT_DO_STATS; priv->enable_quirks = DEFAULT_ENABLE_QUIRKS; } static void gst_ks_video_src_finalize (GObject * object) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (object); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); g_free (priv->device_name); g_free (priv->device_path); G_OBJECT_CLASS (parent_class)->finalize (object); } static void gst_ks_video_src_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (object); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); switch (prop_id) { case PROP_DEVICE_PATH: g_value_set_string (value, priv->device_path); break; case PROP_DEVICE_NAME: g_value_set_string (value, priv->device_name); break; case PROP_DEVICE_INDEX: g_value_set_int (value, priv->device_index); break; case PROP_DO_STATS: GST_OBJECT_LOCK (object); g_value_set_boolean (value, priv->do_stats); GST_OBJECT_UNLOCK (object); break; case PROP_FPS: GST_OBJECT_LOCK (object); g_value_set_int (value, priv->fps); GST_OBJECT_UNLOCK (object); break; case PROP_ENABLE_QUIRKS: g_value_set_boolean (value, 
priv->enable_quirks); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } static void gst_ks_video_src_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (object); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); switch (prop_id) { case PROP_DEVICE_PATH: g_free (priv->device_path); priv->device_path = g_value_dup_string (value); break; case PROP_DEVICE_NAME: g_free (priv->device_name); priv->device_name = g_value_dup_string (value); break; case PROP_DEVICE_INDEX: priv->device_index = g_value_get_int (value); break; case PROP_DO_STATS: GST_OBJECT_LOCK (object); priv->do_stats = g_value_get_boolean (value); GST_OBJECT_UNLOCK (object); break; case PROP_ENABLE_QUIRKS: priv->enable_quirks = g_value_get_boolean (value); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } static void gst_ks_video_src_reset (GstKsVideoSrc * self) { GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); /* Reset statistics */ priv->last_sampling = GST_CLOCK_TIME_NONE; priv->count = 0; priv->fps = -1; /* Reset timestamping state */ priv->offset = 0; priv->prev_ts = GST_CLOCK_TIME_NONE; priv->running = FALSE; } static void gst_ks_video_src_apply_driver_quirks (GstKsVideoSrc * self) { GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); HMODULE mod; /* * Logitech's driver software injects the following DLL into all processes * spawned. This DLL does some nasty tricks, sitting in between the * application and the low-level ntdll API (NtCreateFile, NtClose, * NtDeviceIoControlFile, NtDuplicateObject, etc.), making all sorts * of assumptions. * * The only regression that this quirk causes is that the video effects * feature doesn't work. 
*/ mod = GetModuleHandle ("LVPrcInj.dll"); if (mod != NULL) { GST_DEBUG_OBJECT (self, "Logitech DLL detected, neutralizing it"); /* * We know that no-one's actually keeping this handle around to decrement * its reference count, so we'll take care of that job. The DLL's DllMain * implementation takes care of rolling back changes when it gets unloaded, * so this seems to be the cleanest and most future-proof way that we can * get rid of it... */ FreeLibrary (mod); /* Paranoia: verify that it's no longer there */ mod = GetModuleHandle ("LVPrcInj.dll"); if (mod != NULL) GST_WARNING_OBJECT (self, "failed to neutralize Logitech DLL"); } } static gboolean gst_ks_video_src_open_device (GstKsVideoSrc * self) { GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); GstKsVideoDevice *device = NULL; GList *devices, *cur; g_assert (priv->device == NULL); devices = ks_enumerate_devices (&KSCATEGORY_VIDEO); if (devices == NULL) goto error_no_devices; for (cur = devices; cur != NULL; cur = cur->next) { KsDeviceEntry *entry = cur->data; GST_DEBUG_OBJECT (self, "device %d: name='%s' path='%s'", entry->index, entry->name, entry->path); } for (cur = devices; cur != NULL && device == NULL; cur = cur->next) { KsDeviceEntry *entry = cur->data; gboolean match; if (priv->device_path != NULL) { match = g_strcasecmp (entry->path, priv->device_path) == 0; } else if (priv->device_name != NULL) { match = g_strcasecmp (entry->name, priv->device_name) == 0; } else if (priv->device_index >= 0) { match = entry->index == priv->device_index; } else { match = TRUE; /* pick the first entry */ } if (match) { priv->ksclock = g_object_new (GST_TYPE_KS_CLOCK, NULL); if (priv->ksclock != NULL && gst_ks_clock_open (priv->ksclock)) { GstClock *clock = GST_ELEMENT_CLOCK (self); if (clock != NULL) gst_ks_clock_provide_master_clock (priv->ksclock, clock); } else { GST_WARNING_OBJECT (self, "failed to create/open KsClock"); g_object_unref (priv->ksclock); priv->ksclock = NULL; } device = g_object_new 
(GST_TYPE_KS_VIDEO_DEVICE, "clock", priv->ksclock, "device-path", entry->path, NULL); } ks_device_entry_free (entry); } g_list_free (devices); if (device == NULL) goto error_no_match; if (!gst_ks_video_device_open (device)) goto error_open; priv->device = device; return TRUE; /* ERRORS */ error_no_devices: { GST_ELEMENT_ERROR (self, RESOURCE, NOT_FOUND, ("No video capture devices found"), (NULL)); return FALSE; } error_no_match: { if (priv->device_path != NULL) { GST_ELEMENT_ERROR (self, RESOURCE, SETTINGS, ("Specified video capture device with path '%s' not found", priv->device_path), (NULL)); } else if (priv->device_name != NULL) { GST_ELEMENT_ERROR (self, RESOURCE, SETTINGS, ("Specified video capture device with name '%s' not found", priv->device_name), (NULL)); } else { GST_ELEMENT_ERROR (self, RESOURCE, SETTINGS, ("Specified video capture device with index %d not found", priv->device_index), (NULL)); } return FALSE; } error_open: { GST_ELEMENT_ERROR (self, RESOURCE, OPEN_READ, ("Failed to open device"), (NULL)); g_object_unref (device); return FALSE; } } static void gst_ks_video_src_close_device (GstKsVideoSrc * self) { GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); g_assert (priv->device != NULL); gst_ks_video_device_close (priv->device); g_object_unref (priv->device); priv->device = NULL; if (priv->ksclock != NULL) { gst_ks_clock_close (priv->ksclock); g_object_unref (priv->ksclock); priv->ksclock = NULL; } gst_ks_video_src_reset (self); } /* * Worker thread that takes care of starting, configuring and stopping things. * * This is needed because Logitech's driver software injects a DLL that * intercepts API functions like NtCreateFile, NtClose, NtDeviceIoControlFile * and NtDuplicateObject so that they can provide in-place video effects to * existing applications. Their assumption is that at least one thread tainted * by their code stays around for the lifetime of the capture. 
*/ static gpointer gst_ks_video_src_worker_func (gpointer data) { GstKsVideoSrc *self = data; GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); if (!gst_ks_video_src_open_device (self)) goto open_failed; KS_WORKER_LOCK (priv); priv->worker_state = KS_WORKER_STATE_READY; KS_WORKER_NOTIFY_RESULT (priv); while (priv->worker_state != KS_WORKER_STATE_STOPPING) { KS_WORKER_WAIT (priv); if (priv->worker_pending_caps != NULL) { priv->worker_setcaps_result = gst_ks_video_device_set_caps (priv->device, priv->worker_pending_caps); priv->worker_pending_caps = NULL; KS_WORKER_NOTIFY_RESULT (priv); } else if (priv->worker_pending_run) { if (priv->ksclock != NULL) gst_ks_clock_start (priv->ksclock); priv->worker_run_result = gst_ks_video_device_set_state (priv->device, KSSTATE_RUN); priv->worker_pending_run = FALSE; KS_WORKER_NOTIFY_RESULT (priv); } } KS_WORKER_UNLOCK (priv); gst_ks_video_src_close_device (self); return NULL; /* ERRORS */ open_failed: { KS_WORKER_LOCK (priv); priv->worker_state = KS_WORKER_STATE_ERROR; KS_WORKER_NOTIFY_RESULT (priv); KS_WORKER_UNLOCK (priv); return NULL; } } static gboolean gst_ks_video_src_start_worker (GstKsVideoSrc * self) { GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); gboolean result; priv->worker_lock = g_mutex_new (); priv->worker_notify_cond = g_cond_new (); priv->worker_result_cond = g_cond_new (); priv->worker_pending_caps = NULL; priv->worker_pending_run = FALSE; priv->worker_state = KS_WORKER_STATE_STARTING; priv->worker_thread = g_thread_create (gst_ks_video_src_worker_func, self, TRUE, NULL); KS_WORKER_LOCK (priv); while (priv->worker_state < KS_WORKER_STATE_READY) KS_WORKER_WAIT_FOR_RESULT (priv); result = priv->worker_state == KS_WORKER_STATE_READY; KS_WORKER_UNLOCK (priv); return result; } static void gst_ks_video_src_stop_worker (GstKsVideoSrc * self) { GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); KS_WORKER_LOCK (priv); priv->worker_state = KS_WORKER_STATE_STOPPING; 
KS_WORKER_NOTIFY (priv); KS_WORKER_UNLOCK (priv); g_thread_join (priv->worker_thread); priv->worker_thread = NULL; g_cond_free (priv->worker_result_cond); priv->worker_result_cond = NULL; g_cond_free (priv->worker_notify_cond); priv->worker_notify_cond = NULL; g_mutex_free (priv->worker_lock); priv->worker_lock = NULL; } static GstStateChangeReturn gst_ks_video_src_change_state (GstElement * element, GstStateChange transition) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (element); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); GstStateChangeReturn ret; switch (transition) { case GST_STATE_CHANGE_NULL_TO_READY: if (priv->enable_quirks) gst_ks_video_src_apply_driver_quirks (self); if (!gst_ks_video_src_start_worker (self)) goto open_failed; break; } ret = GST_ELEMENT_CLASS (parent_class)->change_state (element, transition); switch (transition) { case GST_STATE_CHANGE_READY_TO_NULL: gst_ks_video_src_stop_worker (self); break; } return ret; /* ERRORS */ open_failed: { gst_ks_video_src_stop_worker (self); return GST_STATE_CHANGE_FAILURE; } } static gboolean gst_ks_video_src_set_clock (GstElement * element, GstClock * clock) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (element); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); GST_OBJECT_LOCK (element); if (clock != NULL && priv->ksclock != NULL) gst_ks_clock_provide_master_clock (priv->ksclock, clock); GST_OBJECT_UNLOCK (element); return TRUE; } static GstCaps * gst_ks_video_src_get_caps (GstBaseSrc * basesrc) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (basesrc); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); if (priv->device != NULL) return gst_ks_video_device_get_available_caps (priv->device); else return NULL; /* BaseSrc will return template caps */ } static gboolean gst_ks_video_src_set_caps (GstBaseSrc * basesrc, GstCaps * caps) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (basesrc); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); if (priv->device == 
NULL) return FALSE; KS_WORKER_LOCK (priv); priv->worker_pending_caps = caps; KS_WORKER_NOTIFY (priv); while (priv->worker_pending_caps == caps) KS_WORKER_WAIT_FOR_RESULT (priv); KS_WORKER_UNLOCK (priv); return priv->worker_setcaps_result; } static void gst_ks_video_src_fixate (GstBaseSrc * basesrc, GstCaps * caps) { GstStructure *structure = gst_caps_get_structure (caps, 0); gst_structure_fixate_field_nearest_int (structure, "width", G_MAXINT); gst_structure_fixate_field_nearest_int (structure, "height", G_MAXINT); gst_structure_fixate_field_nearest_fraction (structure, "framerate", G_MAXINT, 1); } static gboolean gst_ks_video_src_query (GstBaseSrc * basesrc, GstQuery * query) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (basesrc); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); gboolean result = FALSE; switch (GST_QUERY_TYPE (query)) { case GST_QUERY_LATENCY:{ GstClockTime min_latency, max_latency; if (priv->device == NULL) goto beach; result = gst_ks_video_device_get_latency (priv->device, &min_latency, &max_latency); if (!result) goto beach; GST_DEBUG_OBJECT (self, "reporting latency of min %" GST_TIME_FORMAT " max %" GST_TIME_FORMAT, GST_TIME_ARGS (min_latency), GST_TIME_ARGS (max_latency)); gst_query_set_latency (query, TRUE, min_latency, max_latency); break; } default: result = GST_BASE_SRC_CLASS (parent_class)->query (basesrc, query); break; } beach: return result; } static gboolean gst_ks_video_src_unlock (GstBaseSrc * basesrc) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (basesrc); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); GST_DEBUG_OBJECT (self, "%s", G_STRFUNC); gst_ks_video_device_cancel (priv->device); return TRUE; } static gboolean gst_ks_video_src_unlock_stop (GstBaseSrc * basesrc) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (basesrc); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); GST_DEBUG_OBJECT (self, "%s", G_STRFUNC); gst_ks_video_device_cancel_stop (priv->device); return TRUE; } static gboolean 
gst_ks_video_src_timestamp_buffer (GstKsVideoSrc * self, GstBuffer * buf, GstClockTime presentation_time) { GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); GstClockTime duration; GstClock *clock; GstClockTime timestamp; duration = gst_ks_video_device_get_duration (priv->device); GST_OBJECT_LOCK (self); clock = GST_ELEMENT_CLOCK (self); if (clock != NULL) { gst_object_ref (clock); timestamp = GST_ELEMENT (self)->base_time; if (GST_CLOCK_TIME_IS_VALID (presentation_time)) { if (presentation_time > GST_ELEMENT (self)->base_time) presentation_time -= GST_ELEMENT (self)->base_time; else presentation_time = 0; } } else { timestamp = GST_CLOCK_TIME_NONE; } GST_OBJECT_UNLOCK (self); if (clock != NULL) { /* The time according to the current clock */ timestamp = gst_clock_get_time (clock) - timestamp; if (timestamp > duration) timestamp -= duration; else timestamp = 0; if (GST_CLOCK_TIME_IS_VALID (presentation_time)) { /* * We don't use this for anything yet, need to ponder how to deal * with pins that use an internal clock and timestamp from 0. */ GstClockTimeDiff diff = GST_CLOCK_DIFF (presentation_time, timestamp); GST_DEBUG_OBJECT (self, "diff between gst and driver timestamp: %" G_GINT64_FORMAT, diff); } gst_object_unref (clock); clock = NULL; /* Unless it's the first frame, align the current timestamp on a multiple * of duration since the previous */ if (GST_CLOCK_TIME_IS_VALID (priv->prev_ts)) { GstClockTime delta; guint delta_remainder, delta_offset; /* REVISIT: I've seen this happen with the GstSystemClock on Windows, * scary... */ if (timestamp < priv->prev_ts) { GST_WARNING_OBJECT (self, "clock is ticking backwards"); return FALSE; } /* Round to a duration boundary */ delta = timestamp - priv->prev_ts; delta_remainder = delta % duration; if (delta_remainder < duration / 3) timestamp -= delta_remainder; else timestamp += duration - delta_remainder; /* How many frames are we off then? 
*/ delta = timestamp - priv->prev_ts; delta_offset = delta / duration; if (delta_offset == 1) /* perfect */ GST_BUFFER_FLAG_UNSET (buf, GST_BUFFER_FLAG_DISCONT); else if (delta_offset > 1) { guint lost = delta_offset - 1; GST_INFO_OBJECT (self, "lost %d frame%s, setting discont flag", lost, (lost > 1) ? "s" : ""); GST_BUFFER_FLAG_SET (buf, GST_BUFFER_FLAG_DISCONT); } else if (delta_offset == 0) { /* overproduction, skip this frame */ GST_INFO_OBJECT (self, "skipping frame"); return FALSE; } priv->offset += delta_offset; } priv->prev_ts = timestamp; } GST_BUFFER_OFFSET (buf) = priv->offset; GST_BUFFER_OFFSET_END (buf) = GST_BUFFER_OFFSET (buf) + 1; GST_BUFFER_TIMESTAMP (buf) = timestamp; GST_BUFFER_DURATION (buf) = duration; return TRUE; } static void gst_ks_video_src_update_statistics (GstKsVideoSrc * self) { GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); GstClock *clock; GST_OBJECT_LOCK (self); clock = GST_ELEMENT_CLOCK (self); if (clock != NULL) gst_object_ref (clock); GST_OBJECT_UNLOCK (self); if (clock != NULL) { GstClockTime now = gst_clock_get_time (clock); gst_object_unref (clock); priv->count++; if (GST_CLOCK_TIME_IS_VALID (priv->last_sampling)) { if (now - priv->last_sampling >= GST_SECOND) { GST_OBJECT_LOCK (self); priv->fps = priv->count; GST_OBJECT_UNLOCK (self); g_object_notify (G_OBJECT (self), "fps"); priv->last_sampling = now; priv->count = 0; } } else { priv->last_sampling = now; } } } static GstFlowReturn gst_ks_video_src_create (GstPushSrc * pushsrc, GstBuffer ** buffer) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (pushsrc); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); guint buf_size; GstCaps *caps; GstBuffer *buf = NULL; GstFlowReturn result; GstClockTime presentation_time; gulong error_code; gchar *error_str; g_assert (priv->device != NULL); if (!gst_ks_video_device_has_caps (priv->device)) goto error_no_caps; buf_size = gst_ks_video_device_get_frame_size (priv->device); g_assert (buf_size); caps = 
gst_pad_get_negotiated_caps (GST_BASE_SRC_PAD (self)); if (caps == NULL) goto error_no_caps; result = gst_pad_alloc_buffer (GST_BASE_SRC_PAD (self), priv->offset, buf_size, caps, &buf); gst_caps_unref (caps); if (G_UNLIKELY (result != GST_FLOW_OK)) goto error_alloc_buffer; if (G_UNLIKELY (!priv->running)) { KS_WORKER_LOCK (priv); priv->worker_pending_run = TRUE; KS_WORKER_NOTIFY (priv); while (priv->worker_pending_run) KS_WORKER_WAIT_FOR_RESULT (priv); priv->running = priv->worker_run_result; KS_WORKER_UNLOCK (priv); if (!priv->running) goto error_start_capture; } do { gulong bytes_read; result = gst_ks_video_device_read_frame (priv->device, GST_BUFFER_DATA (buf), buf_size, &bytes_read, &presentation_time, &error_code, &error_str); if (G_UNLIKELY (result != GST_FLOW_OK)) goto error_read_frame; GST_BUFFER_SIZE (buf) = bytes_read; } while (!gst_ks_video_src_timestamp_buffer (self, buf, presentation_time)); if (G_UNLIKELY (priv->do_stats)) gst_ks_video_src_update_statistics (self); gst_ks_video_device_postprocess_frame (priv->device, GST_BUFFER_DATA (buf), GST_BUFFER_SIZE (buf)); *buffer = buf; return GST_FLOW_OK; /* ERRORS */ error_no_caps: { GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, ("not negotiated"), ("maybe setcaps failed?")); return GST_FLOW_ERROR; } error_start_capture: { GST_ELEMENT_ERROR (self, RESOURCE, OPEN_READ, ("could not start capture"), ("failed to change pin state to KSSTATE_RUN")); return GST_FLOW_ERROR; } error_alloc_buffer: { GST_ELEMENT_ERROR (self, CORE, PAD, ("alloc_buffer failed"), (NULL)); return result; } error_read_frame: { if (result != GST_FLOW_WRONG_STATE && result != GST_FLOW_UNEXPECTED) { GST_ELEMENT_ERROR (self, RESOURCE, READ, ("read failed: %s [0x%08x]", error_str, error_code), ("gst_ks_video_device_read_frame failed")); } g_free (error_str); gst_buffer_unref (buf); return result; } } static gboolean plugin_init (GstPlugin * plugin) { return gst_element_register (plugin, "ksvideosrc", GST_RANK_NONE, GST_TYPE_KS_VIDEO_SRC); } void 
gstelements_winks_register()
{
  gst_plugin_register_static (
      GST_VERSION_MAJOR,
      GST_VERSION_MINOR,
      "winks",
      "Windows kernel streaming plugin",
      plugin_init,
      "1.0.0",
      "LGPL",
      "my-application",
      "my-application",
      "http://www.my-application.net/");
}

psimedia-master/gstprovider/gstelements/videomaxrate.pri

HEADERS += \
    $$PWD/videomaxrate/videomaxrate.h

SOURCES += \
    $$PWD/videomaxrate/videomaxrate.c

gstplugin:SOURCES += $$PWD/videomaxrate/videomaxrateplugin.c
!gstplugin:SOURCES += $$PWD/static/videomaxrate_static.c

psimedia-master/gstprovider/gstelements/videomaxrate/videomaxrate.c

/*
 * Copyright (C) 2008 Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Lesser General Public License for more details.
* * You should have received a copy of the GNU Lesser General Public * License along with this library; if not, write to the Free Software * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA * 02110-1301 USA * */ #include "videomaxrate.h" static const GstElementDetails videomaxrate_details = GST_ELEMENT_DETAILS ("Video maximum rate adjuster", "Filter/Effect/Video", "Drops extra frames", "Justin Karneges "); static GstStaticPadTemplate gst_videomaxrate_src_template = GST_STATIC_PAD_TEMPLATE ("src", GST_PAD_SRC, GST_PAD_ALWAYS, GST_STATIC_CAPS ("video/x-raw-yuv; video/x-raw-rgb") ); static GstStaticPadTemplate gst_videomaxrate_sink_template = GST_STATIC_PAD_TEMPLATE ("sink", GST_PAD_SINK, GST_PAD_ALWAYS, GST_STATIC_CAPS ("video/x-raw-yuv; video/x-raw-rgb") ); static gboolean gst_videomaxrate_sink_event (GstPad *pad, GstEvent *event); static GstCaps *gst_videomaxrate_transform_caps (GstBaseTransform *trans, GstPadDirection direction, GstCaps *caps); static gboolean gst_videomaxrate_set_caps (GstBaseTransform *trans, GstCaps *incaps, GstCaps *outcaps); static GstFlowReturn gst_videomaxrate_transform_ip (GstBaseTransform *trans, GstBuffer *buf); GST_BOILERPLATE (GstVideoMaxRate, gst_videomaxrate, GstBaseTransform, GST_TYPE_BASE_TRANSFORM); static void gst_videomaxrate_base_init (gpointer gclass) { GstElementClass *element_class = GST_ELEMENT_CLASS (gclass); gst_element_class_set_details (element_class, &videomaxrate_details); gst_element_class_add_pad_template (element_class, gst_static_pad_template_get (&gst_videomaxrate_sink_template)); gst_element_class_add_pad_template (element_class, gst_static_pad_template_get (&gst_videomaxrate_src_template)); } static void gst_videomaxrate_class_init (GstVideoMaxRateClass *klass) { GstBaseTransformClass *base_class; base_class = (GstBaseTransformClass *)klass; base_class->transform_caps = gst_videomaxrate_transform_caps; base_class->set_caps = gst_videomaxrate_set_caps; base_class->transform_ip = 
gst_videomaxrate_transform_ip; } static void gst_videomaxrate_init (GstVideoMaxRate *videomaxrate, GstVideoMaxRateClass *gclass) { (void)gclass; videomaxrate->to_rate_numerator = -1; videomaxrate->to_rate_denominator = -1; videomaxrate->have_last_ts = FALSE; gst_pad_set_event_function (GST_BASE_TRANSFORM_SINK_PAD (videomaxrate), gst_videomaxrate_sink_event); } gboolean gst_videomaxrate_sink_event (GstPad *pad, GstEvent *event) { GstVideoMaxRate *videomaxrate; gboolean ret; videomaxrate = (GstVideoMaxRate *)gst_pad_get_parent (pad); switch (GST_EVENT_TYPE(event)) { case GST_EVENT_NEWSEGMENT: case GST_EVENT_FLUSH_STOP: videomaxrate->have_last_ts = FALSE; break; default: break; } ret = gst_pad_push_event (GST_BASE_TRANSFORM_SRC_PAD (videomaxrate), event); gst_object_unref (videomaxrate); return ret; } GstCaps * gst_videomaxrate_transform_caps (GstBaseTransform *trans, GstPadDirection direction, GstCaps *caps) { GstCaps *ret; GstStructure *structure; (void)trans; (void)direction; /* this function is always called with a simple caps */ g_return_val_if_fail (GST_CAPS_IS_SIMPLE (caps), NULL); ret = gst_caps_copy (caps); /* set the framerate as a range */ structure = gst_structure_copy (gst_caps_get_structure (ret, 0)); gst_structure_set (structure, "framerate", GST_TYPE_FRACTION_RANGE, 0, 1, G_MAXINT, 1, NULL); gst_caps_merge_structure (ret, gst_structure_copy (structure)); gst_structure_free (structure); return ret; } gboolean gst_videomaxrate_set_caps (GstBaseTransform *trans, GstCaps *incaps, GstCaps *outcaps) { GstVideoMaxRate *videomaxrate; GstStructure *cs; gint numerator, denominator; videomaxrate = (GstVideoMaxRate *)trans; (void)incaps; // keep track of the outbound framerate cs = gst_caps_get_structure (outcaps, 0); if (!gst_structure_get_fraction (cs, "framerate", &numerator, &denominator)) return FALSE; videomaxrate->to_rate_numerator = numerator; videomaxrate->to_rate_denominator = denominator; return TRUE; } GstFlowReturn gst_videomaxrate_transform_ip 
(GstBaseTransform *trans, GstBuffer *buf)
{
  GstVideoMaxRate *videomaxrate;
  GstClockTime ts;

  videomaxrate = (GstVideoMaxRate *)trans;
  ts = GST_BUFFER_TIMESTAMP (buf);

  /* drop frames if they exceed our output rate */
  if (videomaxrate->have_last_ts) {
    if (ts < videomaxrate->last_ts + gst_util_uint64_scale (1,
            videomaxrate->to_rate_denominator * GST_SECOND,
            videomaxrate->to_rate_numerator)) {
      return GST_BASE_TRANSFORM_FLOW_DROPPED;
    }
  }

  videomaxrate->last_ts = ts;
  videomaxrate->have_last_ts = TRUE;
  return GST_FLOW_OK;
}

psimedia-master/gstprovider/gstelements/videomaxrate/videomaxrate.h

/*
 * Copyright (C) 2008 Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301 USA
 *
 */

#ifndef __GST_VIDEOMAXRATE_H__
#define __GST_VIDEOMAXRATE_H__

#include <gst/gst.h>
#include <gst/base/gstbasetransform.h>

G_BEGIN_DECLS

#define GST_TYPE_VIDEOMAXRATE \
  (gst_videomaxrate_get_type())
#define GST_VIDEOMAXRATE(obj) \
  (G_TYPE_CHECK_INSTANCE_CAST((obj),GST_TYPE_VIDEOMAXRATE,GstVideoMaxRate))
#define GST_VIDEOMAXRATE_CLASS(klass) \
  (G_TYPE_CHECK_CLASS_CAST((klass),GST_TYPE_VIDEOMAXRATE,GstVideoMaxRateClass))
#define GST_IS_VIDEOMAXRATE(obj) \
  (G_TYPE_CHECK_INSTANCE_TYPE((obj),GST_TYPE_VIDEOMAXRATE))
#define GST_IS_VIDEOMAXRATE_CLASS(klass) \
  (G_TYPE_CHECK_CLASS_TYPE((klass),GST_TYPE_VIDEOMAXRATE))

typedef struct _GstVideoMaxRate GstVideoMaxRate;
typedef struct _GstVideoMaxRateClass GstVideoMaxRateClass;

struct _GstVideoMaxRate
{
  GstBaseTransform parent;

  gint to_rate_numerator;
  gint to_rate_denominator;

  gboolean have_last_ts;
  GstClockTime last_ts;
};

struct _GstVideoMaxRateClass
{
  GstBaseTransformClass parent_class;
};

GType gst_videomaxrate_get_type(void);

G_END_DECLS

#endif /* __GST_VIDEOMAXRATE_H__ */

psimedia-master/gstprovider/gstelements/videomaxrate/videomaxrateplugin.c

/*
 * Copyright (C) 2008 Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301 USA
 *
 */

#ifdef HAVE_CONFIG_H
#include "config.h"
#endif

#include "videomaxrate.h"

static gboolean
plugin_init (GstPlugin * plugin)
{
  if (!gst_element_register (plugin, "videomaxrate", GST_RANK_NONE,
          GST_TYPE_VIDEOMAXRATE))
    return FALSE;

  return TRUE;
}

GST_PLUGIN_DEFINE (GST_VERSION_MAJOR,
    GST_VERSION_MINOR,
    "videomaxrate",
    "Drop extra frames",
    plugin_init, VERSION, "LGPL", GST_PACKAGE_NAME, GST_PACKAGE_ORIGIN)

psimedia-master/gstprovider/gstelements/winks.pri

# winks
HEADERS += \
    $$PWD/winks/kshelpers.h \
    $$PWD/winks/ksvideohelpers.h \
    $$PWD/winks/gstksclock.h \
    $$PWD/winks/gstksvideodevice.h \
    $$PWD/winks/gstksvideosrc.h

SOURCES += \
    $$PWD/winks/kshelpers.c \
    $$PWD/winks/ksvideohelpers.c \
    $$PWD/winks/gstksclock.c \
    $$PWD/winks/gstksvideodevice.c

gstplugin:SOURCES += $$PWD/winks/gstksvideosrc.c
!gstplugin:SOURCES += $$PWD/static/winks_static.c

LIBS *= \
    -lsetupapi \
    -lksuser \
    -lamstrmid

psimedia-master/gstprovider/gstelements/winks/Makefile.am

# This plugin isn't buildable with autotools at this point in time, so just
# ensure everything's listed in EXTRA_DIST
EXTRA_DIST = \
    gstksclock.c gstksclock.h \
    gstksvideodevice.c gstksvideodevice.h \
    gstksvideosrc.c gstksvideosrc.h \
    kshelpers.c kshelpers.h \
    ksvideohelpers.c ksvideohelpers.h

psimedia-master/gstprovider/gstelements/winks/gstksclock.c

/*
 * Copyright (C) 2008 Ole André Vadla Ravnås
 *
 * This library is free software; you can redistribute
it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. */ #ifdef UNICODE #undef UNICODE #endif #include "gstksclock.h" #include "kshelpers.h" GST_DEBUG_CATEGORY_EXTERN (gst_ks_debug); #define GST_CAT_DEFAULT gst_ks_debug typedef struct { GMutex *mutex; GCond *client_cond; GCond *worker_cond; HANDLE clock_handle; gboolean open; gboolean closing; KSSTATE state; GThread *worker_thread; gboolean worker_running; gboolean worker_initialized; GstClock *master_clock; } GstKsClockPrivate; #define GST_KS_CLOCK_GET_PRIVATE(o) \ (G_TYPE_INSTANCE_GET_PRIVATE ((o), GST_TYPE_KS_CLOCK, \ GstKsClockPrivate)) #define GST_KS_CLOCK_LOCK() g_mutex_lock (priv->mutex) #define GST_KS_CLOCK_UNLOCK() g_mutex_unlock (priv->mutex) static void gst_ks_clock_dispose (GObject * object); static void gst_ks_clock_finalize (GObject * object); GST_BOILERPLATE (GstKsClock, gst_ks_clock, GObject, G_TYPE_OBJECT); static void gst_ks_clock_base_init (gpointer gclass) { } static void gst_ks_clock_class_init (GstKsClockClass * klass) { GObjectClass *gobject_class = G_OBJECT_CLASS (klass); g_type_class_add_private (klass, sizeof (GstKsClockPrivate)); gobject_class->dispose = gst_ks_clock_dispose; gobject_class->finalize = gst_ks_clock_finalize; } static void gst_ks_clock_init (GstKsClock * self, GstKsClockClass * gclass) { GstKsClockPrivate *priv = GST_KS_CLOCK_GET_PRIVATE (self); priv->mutex = 
g_mutex_new (); priv->client_cond = g_cond_new (); priv->worker_cond = g_cond_new (); priv->clock_handle = INVALID_HANDLE_VALUE; priv->open = FALSE; priv->closing = FALSE; priv->state = KSSTATE_STOP; priv->worker_thread = NULL; priv->worker_running = FALSE; priv->worker_initialized = FALSE; priv->master_clock = NULL; } static void gst_ks_clock_dispose (GObject * object) { GstKsClock *self = GST_KS_CLOCK (object); GstKsClockPrivate *priv = GST_KS_CLOCK_GET_PRIVATE (self); g_assert (!priv->open); G_OBJECT_CLASS (parent_class)->dispose (object); } static void gst_ks_clock_finalize (GObject * object) { GstKsClock *self = GST_KS_CLOCK (object); GstKsClockPrivate *priv = GST_KS_CLOCK_GET_PRIVATE (self); g_cond_free (priv->worker_cond); g_cond_free (priv->client_cond); g_mutex_free (priv->mutex); G_OBJECT_CLASS (parent_class)->finalize (object); } static void gst_ks_clock_close_unlocked (GstKsClock * self); gboolean gst_ks_clock_open (GstKsClock * self) { GstKsClockPrivate *priv = GST_KS_CLOCK_GET_PRIVATE (self); gboolean ret = FALSE; GList *devices; KsDeviceEntry *device; KSSTATE state; GST_KS_CLOCK_LOCK (); g_assert (!priv->open); priv->state = KSSTATE_STOP; devices = ks_enumerate_devices (&KSCATEGORY_CLOCK); if (devices == NULL) goto error; device = devices->data; priv->clock_handle = CreateFile (device->path, GENERIC_READ | GENERIC_WRITE, 0, NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL | FILE_FLAG_OVERLAPPED, NULL); if (!ks_is_valid_handle (priv->clock_handle)) goto error; state = KSSTATE_STOP; if (!ks_object_set_property (priv->clock_handle, KSPROPSETID_Clock, KSPROPERTY_CLOCK_STATE, &state, sizeof (state))) goto error; ks_device_list_free (devices); priv->open = TRUE; GST_KS_CLOCK_UNLOCK (); return TRUE; error: ks_device_list_free (devices); gst_ks_clock_close_unlocked (self); GST_KS_CLOCK_UNLOCK (); return FALSE; } static gboolean gst_ks_clock_set_state_unlocked (GstKsClock * self, KSSTATE state) { GstKsClockPrivate *priv = GST_KS_CLOCK_GET_PRIVATE (self); KSSTATE 
initial_state; gint addend; g_assert (priv->open); if (state == priv->state) return TRUE; initial_state = priv->state; addend = (state > priv->state) ? 1 : -1; GST_DEBUG ("Initiating clock state change from %s to %s", ks_state_to_string (priv->state), ks_state_to_string (state)); while (priv->state != state) { KSSTATE next_state = priv->state + addend; GST_DEBUG ("Changing clock state from %s to %s", ks_state_to_string (priv->state), ks_state_to_string (next_state)); if (ks_object_set_property (priv->clock_handle, KSPROPSETID_Clock, KSPROPERTY_CLOCK_STATE, &next_state, sizeof (next_state))) { priv->state = next_state; GST_DEBUG ("Changed clock state to %s", ks_state_to_string (priv->state)); } else { GST_WARNING ("Failed to change clock state to %s", ks_state_to_string (next_state)); return FALSE; } } GST_DEBUG ("Finished clock state change from %s to %s", ks_state_to_string (initial_state), ks_state_to_string (state)); return TRUE; } static void gst_ks_clock_close_unlocked (GstKsClock * self) { GstKsClockPrivate *priv = GST_KS_CLOCK_GET_PRIVATE (self); if (priv->closing) return; priv->closing = TRUE; if (priv->worker_thread != NULL) { priv->worker_running = FALSE; g_cond_signal (priv->worker_cond); GST_KS_CLOCK_UNLOCK (); g_thread_join (priv->worker_thread); priv->worker_thread = NULL; GST_KS_CLOCK_LOCK (); } gst_ks_clock_set_state_unlocked (self, KSSTATE_STOP); if (ks_is_valid_handle (priv->clock_handle)) { CloseHandle (priv->clock_handle); priv->clock_handle = INVALID_HANDLE_VALUE; } if (priv->master_clock != NULL) { gst_object_unref (priv->master_clock); priv->master_clock = NULL; } priv->open = FALSE; priv->closing = FALSE; } void gst_ks_clock_close (GstKsClock * self) { GstKsClockPrivate *priv = GST_KS_CLOCK_GET_PRIVATE (self); GST_KS_CLOCK_LOCK (); gst_ks_clock_close_unlocked (self); GST_KS_CLOCK_UNLOCK (); } HANDLE gst_ks_clock_get_handle (GstKsClock * self) { GstKsClockPrivate *priv = GST_KS_CLOCK_GET_PRIVATE (self); HANDLE handle; GST_KS_CLOCK_LOCK (); 
g_assert (priv->open); handle = priv->clock_handle; GST_KS_CLOCK_UNLOCK (); return handle; } void gst_ks_clock_prepare (GstKsClock * self) { GstKsClockPrivate *priv = GST_KS_CLOCK_GET_PRIVATE (self); GST_KS_CLOCK_LOCK (); if (priv->state < KSSTATE_PAUSE) gst_ks_clock_set_state_unlocked (self, KSSTATE_PAUSE); GST_KS_CLOCK_UNLOCK (); } static gpointer gst_ks_clock_worker_thread_func (gpointer data) { GstKsClock *self = GST_KS_CLOCK (data); GstKsClockPrivate *priv = GST_KS_CLOCK_GET_PRIVATE (self); GST_KS_CLOCK_LOCK (); gst_ks_clock_set_state_unlocked (self, KSSTATE_RUN); while (priv->worker_running) { if (priv->master_clock != NULL) { GstClockTime now = gst_clock_get_time (priv->master_clock); now /= 100; if (ks_object_set_property (priv->clock_handle, KSPROPSETID_Clock, KSPROPERTY_CLOCK_TIME, &now, sizeof (now))) { GST_DEBUG ("clock synchronized"); gst_object_unref (priv->master_clock); priv->master_clock = NULL; } else { GST_WARNING ("failed to synchronize clock"); } } if (!priv->worker_initialized) { priv->worker_initialized = TRUE; g_cond_signal (priv->client_cond); } g_cond_wait (priv->worker_cond, priv->mutex); } priv->worker_initialized = FALSE; GST_KS_CLOCK_UNLOCK (); return NULL; } void gst_ks_clock_start (GstKsClock * self) { GstKsClockPrivate *priv = GST_KS_CLOCK_GET_PRIVATE (self); GST_KS_CLOCK_LOCK (); if (priv->worker_thread == NULL) { priv->worker_running = TRUE; priv->worker_initialized = FALSE; priv->worker_thread = g_thread_create (gst_ks_clock_worker_thread_func, self, TRUE, NULL); } while (!priv->worker_initialized) g_cond_wait (priv->client_cond, priv->mutex); GST_KS_CLOCK_UNLOCK (); } void gst_ks_clock_provide_master_clock (GstKsClock * self, GstClock * master_clock) { GstKsClockPrivate *priv = GST_KS_CLOCK_GET_PRIVATE (self); GST_KS_CLOCK_LOCK (); gst_object_ref (master_clock); if (priv->master_clock != NULL) gst_object_unref (priv->master_clock); priv->master_clock = master_clock; g_cond_signal (priv->worker_cond); GST_KS_CLOCK_UNLOCK (); } 
psimedia-master/gstprovider/gstelements/winks/gstksclock.h

/*
 * Copyright (C) 2008 Ole André Vadla Ravnås
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 */

#ifndef __GST_KS_CLOCK_H__
#define __GST_KS_CLOCK_H__

#include <gst/gst.h>
#include <windows.h>

#include "ksvideohelpers.h"

G_BEGIN_DECLS

#define GST_TYPE_KS_CLOCK \
  (gst_ks_clock_get_type ())
#define GST_KS_CLOCK(obj) \
  (G_TYPE_CHECK_INSTANCE_CAST ((obj), GST_TYPE_KS_CLOCK, GstKsClock))
#define GST_KS_CLOCK_CLASS(klass) \
  (G_TYPE_CHECK_CLASS_CAST ((klass), GST_TYPE_KS_CLOCK, GstKsClockClass))
#define GST_IS_KS_CLOCK(obj) \
  (G_TYPE_CHECK_INSTANCE_TYPE ((obj), GST_TYPE_KS_CLOCK))
#define GST_IS_KS_CLOCK_CLASS(klass) \
  (G_TYPE_CHECK_CLASS_TYPE ((klass), GST_TYPE_KS_CLOCK))

typedef struct _GstKsClock GstKsClock;
typedef struct _GstKsClockClass GstKsClockClass;

struct _GstKsClock
{
  GObject parent;
};

struct _GstKsClockClass
{
  GObjectClass parent_class;
};

GType gst_ks_clock_get_type (void);

gboolean gst_ks_clock_open (GstKsClock * self);
void gst_ks_clock_close (GstKsClock * self);

HANDLE gst_ks_clock_get_handle (GstKsClock * self);

void gst_ks_clock_prepare (GstKsClock * self);
void gst_ks_clock_start (GstKsClock * self);

void gst_ks_clock_provide_master_clock (GstKsClock * self, GstClock
* master_clock); G_END_DECLS #endif /* __GST_KS_CLOCK_H__ */ psimedia-master/gstprovider/gstelements/winks/gstksvideodevice.c000066400000000000000000000776031220046403000257100ustar00rootroot00000000000000/* * Copyright (C) 2008 Ole André Vadla Ravnås * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. */ #ifdef UNICODE #undef UNICODE #endif #include "gstksvideodevice.h" #include "gstksclock.h" #include "kshelpers.h" #include "ksvideohelpers.h" #define READ_TIMEOUT (10 * 1000) #define MJPEG_MAX_PADDING 128 #define MAX_OUTSTANDING_FRAMES 128 #define BUFFER_ALIGNMENT 512 #define DEFAULT_DEVICE_PATH NULL GST_DEBUG_CATEGORY_EXTERN (gst_ks_debug); #define GST_CAT_DEFAULT gst_ks_debug #define GST_DEBUG_IS_ENABLED() \ (gst_debug_category_get_threshold (GST_CAT_DEFAULT) >= GST_LEVEL_DEBUG) enum { PROP_0, PROP_CLOCK, PROP_DEVICE_PATH, }; typedef struct { KSSTREAM_HEADER header; KS_FRAME_INFO frame_info; } KSSTREAM_READ_PARAMS; typedef struct { KSSTREAM_READ_PARAMS params; guint8 *buf_unaligned; guint8 *buf; OVERLAPPED overlapped; } ReadRequest; typedef struct { gboolean open; KSSTATE state; GstKsClock *clock; gchar *dev_path; HANDLE filter_handle; GList *media_types; GstCaps *cached_caps; HANDLE cancel_event; KsVideoMediaType *cur_media_type; GstCaps *cur_fixed_caps; guint width; guint height; guint fps_n;
guint fps_d; guint8 *rgb_swap_buf; gboolean is_mjpeg; HANDLE pin_handle; gboolean requests_submitted; gulong num_requests; GArray *requests; GArray *request_events; GstClockTime last_timestamp; } GstKsVideoDevicePrivate; #define GST_KS_VIDEO_DEVICE_GET_PRIVATE(o) \ (G_TYPE_INSTANCE_GET_PRIVATE ((o), GST_TYPE_KS_VIDEO_DEVICE, \ GstKsVideoDevicePrivate)) static void gst_ks_video_device_dispose (GObject * object); static void gst_ks_video_device_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec); static void gst_ks_video_device_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); static void gst_ks_video_device_reset_caps (GstKsVideoDevice * self); GST_BOILERPLATE (GstKsVideoDevice, gst_ks_video_device, GObject, G_TYPE_OBJECT); static void gst_ks_video_device_base_init (gpointer gclass) { } static void gst_ks_video_device_class_init (GstKsVideoDeviceClass * klass) { GObjectClass *gobject_class = G_OBJECT_CLASS (klass); g_type_class_add_private (klass, sizeof (GstKsVideoDevicePrivate)); gobject_class->dispose = gst_ks_video_device_dispose; gobject_class->get_property = gst_ks_video_device_get_property; gobject_class->set_property = gst_ks_video_device_set_property; g_object_class_install_property (gobject_class, PROP_CLOCK, g_param_spec_object ("clock", "Clock to use", "Clock to use", GST_TYPE_KS_CLOCK, G_PARAM_READWRITE | G_PARAM_CONSTRUCT_ONLY | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_DEVICE_PATH, g_param_spec_string ("device-path", "Device Path", "The device path", DEFAULT_DEVICE_PATH, G_PARAM_READWRITE | G_PARAM_CONSTRUCT_ONLY | G_PARAM_STATIC_STRINGS)); } static void gst_ks_video_device_init (GstKsVideoDevice * self, GstKsVideoDeviceClass * gclass) { GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); priv->open = FALSE; priv->state = KSSTATE_STOP; } static void gst_ks_video_device_dispose (GObject * object) { GstKsVideoDevice *self = 
GST_KS_VIDEO_DEVICE (object); GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); gst_ks_video_device_reset_caps (self); gst_ks_video_device_close (self); if (priv->clock != NULL) { g_object_unref (priv->clock); priv->clock = NULL; } G_OBJECT_CLASS (parent_class)->dispose (object); } static void gst_ks_video_device_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec) { GstKsVideoDevice *self = GST_KS_VIDEO_DEVICE (object); GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); switch (prop_id) { case PROP_CLOCK: g_value_set_object (value, priv->clock); break; case PROP_DEVICE_PATH: g_value_set_string (value, priv->dev_path); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } static void gst_ks_video_device_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) { GstKsVideoDevice *self = GST_KS_VIDEO_DEVICE (object); GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); switch (prop_id) { case PROP_CLOCK: if (priv->clock != NULL) g_object_unref (priv->clock); priv->clock = g_value_dup_object (value); break; case PROP_DEVICE_PATH: g_free (priv->dev_path); priv->dev_path = g_value_dup_string (value); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } static void gst_ks_video_device_parse_win32_error (const gchar * func_name, DWORD error_code, gulong * ret_error_code, gchar ** ret_error_str) { if (ret_error_code != NULL) *ret_error_code = error_code; if (ret_error_str != NULL) { GString *message; gchar buf[1480]; DWORD result; message = g_string_sized_new (1600); g_string_append_printf (message, "%s returned ", func_name); result = FormatMessage (FORMAT_MESSAGE_FROM_SYSTEM | FORMAT_MESSAGE_IGNORE_INSERTS, NULL, error_code, 0, buf, sizeof (buf), NULL); if (result != 0) { g_string_append_printf (message, "0x%08x: %s", error_code, g_strchomp (buf)); } else { DWORD format_error_code = 
GetLastError (); g_string_append_printf (message, "<0x%08x (FormatMessage error code: %s)>", error_code, (format_error_code == ERROR_MR_MID_NOT_FOUND) ? "no system error message found" : "failed to retrieve system error message"); } *ret_error_str = message->str; g_string_free (message, FALSE); } } static void gst_ks_video_device_clear_buffers (GstKsVideoDevice * self) { GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); guint i; if (priv->requests == NULL) return; /* Cancel pending requests */ CancelIo (priv->pin_handle); for (i = 0; i < priv->num_requests; i++) { ReadRequest *req = &g_array_index (priv->requests, ReadRequest, i); DWORD bytes_returned; GetOverlappedResult (priv->pin_handle, &req->overlapped, &bytes_returned, TRUE); } /* Clean up */ for (i = 0; i < priv->requests->len; i++) { ReadRequest *req = &g_array_index (priv->requests, ReadRequest, i); HANDLE ev = g_array_index (priv->request_events, HANDLE, i); g_free (req->buf_unaligned); if (ev) CloseHandle (ev); } g_array_free (priv->requests, TRUE); priv->requests = NULL; g_array_free (priv->request_events, TRUE); priv->request_events = NULL; } static void gst_ks_video_device_prepare_buffers (GstKsVideoDevice * self) { GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); guint i; guint frame_size; g_assert (priv->cur_media_type != NULL); gst_ks_video_device_clear_buffers (self); priv->requests = g_array_sized_new (FALSE, TRUE, sizeof (ReadRequest), priv->num_requests); priv->request_events = g_array_sized_new (FALSE, TRUE, sizeof (HANDLE), priv->num_requests + 1); frame_size = gst_ks_video_device_get_frame_size (self); for (i = 0; i < priv->num_requests; i++) { ReadRequest req = { 0, }; req.buf_unaligned = g_malloc (frame_size + BUFFER_ALIGNMENT - 1); req.buf = (guint8 *) (((gsize) req.buf_unaligned + BUFFER_ALIGNMENT - 1) & ~(BUFFER_ALIGNMENT - 1)); req.overlapped.hEvent = CreateEvent (NULL, TRUE, FALSE, NULL); g_array_append_val (priv->requests, req); 
g_array_append_val (priv->request_events, req.overlapped.hEvent); } g_array_append_val (priv->request_events, priv->cancel_event); /* * REVISIT: Could probably remove this later, for now it's here to help * track down the case where we capture old frames. This has been * observed with UVC cameras, presumably with some system load. */ priv->last_timestamp = GST_CLOCK_TIME_NONE; } static void gst_ks_video_device_dump_supported_property_sets (GstKsVideoDevice * self, const gchar * obj_name, const GUID * propsets, gulong propsets_len) { guint i; GST_DEBUG ("%s supports %d property set%s", obj_name, propsets_len, (propsets_len != 1) ? "s" : ""); for (i = 0; i < propsets_len; i++) { gchar *propset_name = ks_property_set_to_string (&propsets[i]); GST_DEBUG ("[%d] %s", i, propset_name); g_free (propset_name); } } gboolean gst_ks_video_device_open (GstKsVideoDevice * self) { GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); GUID *propsets = NULL; gulong propsets_len; GList *cur; g_assert (!priv->open); g_assert (priv->dev_path != NULL); /* * Open the filter. */ priv->filter_handle = CreateFile (priv->dev_path, GENERIC_READ | GENERIC_WRITE, 0, NULL, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL | FILE_FLAG_OVERLAPPED, NULL); if (!ks_is_valid_handle (priv->filter_handle)) goto error; /* * Query the filter for supported property sets. */ if (ks_object_get_supported_property_sets (priv->filter_handle, &propsets, &propsets_len)) { gst_ks_video_device_dump_supported_property_sets (self, "filter", propsets, propsets_len); g_free (propsets); } else { GST_DEBUG ("failed to query filter for supported property sets"); } /* * Probe for supported media types. 
*/ priv->media_types = ks_video_probe_filter_for_caps (priv->filter_handle); priv->cached_caps = gst_caps_new_empty (); for (cur = priv->media_types; cur != NULL; cur = cur->next) { KsVideoMediaType *media_type = cur->data; gst_caps_append (priv->cached_caps, gst_caps_copy (media_type->translated_caps)); #if 1 { gchar *str; str = gst_caps_to_string (media_type->translated_caps); GST_DEBUG ("pin[%d]: found media type: %s", media_type->pin_id, str); g_free (str); } #endif } priv->cancel_event = CreateEvent (NULL, TRUE, FALSE, NULL); priv->open = TRUE; return TRUE; error: g_free (priv->dev_path); priv->dev_path = NULL; return FALSE; } void gst_ks_video_device_close (GstKsVideoDevice * self) { GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); GList *cur; gst_ks_video_device_reset_caps (self); g_free (priv->dev_path); priv->dev_path = NULL; if (ks_is_valid_handle (priv->filter_handle)) { CloseHandle (priv->filter_handle); priv->filter_handle = INVALID_HANDLE_VALUE; } for (cur = priv->media_types; cur != NULL; cur = cur->next) { KsVideoMediaType *mt = cur->data; ks_video_media_type_free (mt); } if (priv->media_types != NULL) { g_list_free (priv->media_types); priv->media_types = NULL; } if (priv->cached_caps != NULL) { gst_caps_unref (priv->cached_caps); priv->cached_caps = NULL; } if (ks_is_valid_handle (priv->cancel_event)) CloseHandle (priv->cancel_event); priv->cancel_event = INVALID_HANDLE_VALUE; priv->open = FALSE; } GstCaps * gst_ks_video_device_get_available_caps (GstKsVideoDevice * self) { GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); g_assert (priv->open); return gst_caps_ref (priv->cached_caps); } gboolean gst_ks_video_device_has_caps (GstKsVideoDevice * self) { GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); return (priv->cur_media_type != NULL) ? 
TRUE : FALSE; } static HANDLE gst_ks_video_device_create_pin (GstKsVideoDevice * self, KsVideoMediaType * media_type, gulong * num_outstanding) { GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); HANDLE pin_handle = INVALID_HANDLE_VALUE; KSPIN_CONNECT *pin_conn = NULL; DWORD ret; GUID *propsets = NULL; gulong propsets_len; gboolean supports_mem_transport = FALSE; KSALLOCATOR_FRAMING *framing = NULL; gulong framing_size = sizeof (KSALLOCATOR_FRAMING); KSALLOCATOR_FRAMING_EX *framing_ex = NULL; gulong alignment; DWORD mem_transport; /* * Instantiate the pin. */ pin_conn = ks_video_create_pin_conn_from_media_type (media_type); GST_DEBUG ("calling KsCreatePin with pin_id = %d", media_type->pin_id); ret = KsCreatePin (priv->filter_handle, pin_conn, GENERIC_READ, &pin_handle); if (ret != ERROR_SUCCESS) goto error_create_pin; GST_DEBUG ("KsCreatePin succeeded, pin %p created", pin_handle); g_free (pin_conn); pin_conn = NULL; /* * Query the pin for supported property sets. */ if (ks_object_get_supported_property_sets (pin_handle, &propsets, &propsets_len)) { guint i; gst_ks_video_device_dump_supported_property_sets (self, "pin", propsets, propsets_len); for (i = 0; i < propsets_len; i++) { if (IsEqualGUID (&propsets[i], &KSPROPSETID_MemoryTransport)) supports_mem_transport = TRUE; } g_free (propsets); } else { GST_DEBUG ("failed to query pin for supported property sets"); } /* * Figure out how many simultaneous requests it prefers. * * This is really important as it depends on the driver and the device. * Doing too few will result in poor capture performance, whilst doing too * many will make some drivers crash really horribly and leave you with a * BSOD. I've experienced the latter with older Logitech drivers.
*/ *num_outstanding = 0; alignment = 0; if (ks_object_get_property (pin_handle, KSPROPSETID_Connection, KSPROPERTY_CONNECTION_ALLOCATORFRAMING_EX, &framing_ex, NULL)) { if (framing_ex->CountItems >= 1) { *num_outstanding = framing_ex->FramingItem[0].Frames; alignment = framing_ex->FramingItem[0].FileAlignment; } else { GST_DEBUG ("ignoring empty ALLOCATORFRAMING_EX"); } } else { GST_DEBUG ("query for ALLOCATORFRAMING_EX failed, trying " "ALLOCATORFRAMING"); if (ks_object_get_property (pin_handle, KSPROPSETID_Connection, KSPROPERTY_CONNECTION_ALLOCATORFRAMING, &framing, &framing_size)) { *num_outstanding = framing->Frames; alignment = framing->FileAlignment; } else { GST_DEBUG ("query for ALLOCATORFRAMING failed"); } } GST_DEBUG ("num_outstanding: %d alignment: 0x%08x", *num_outstanding, alignment); if (*num_outstanding == 0 || *num_outstanding > MAX_OUTSTANDING_FRAMES) { GST_DEBUG ("setting number of allowable outstanding frames to 1"); *num_outstanding = 1; } g_free (framing); g_free (framing_ex); /* * TODO: We also need to respect alignment, but for now we just align * on FILE_512_BYTE_ALIGNMENT. */ /* Set the memory transport to use. */ if (supports_mem_transport) { mem_transport = 0; /* REVISIT: use the constant here */ if (!ks_object_set_property (pin_handle, KSPROPSETID_MemoryTransport, KSPROPERTY_MEMORY_TRANSPORT, &mem_transport, sizeof (mem_transport))) { GST_DEBUG ("failed to set memory transport, sticking with the default"); } } /* * Override the clock if we have one and the pin doesn't have any either. 
*/ if (priv->clock != NULL) { HANDLE *cur_clock_handle = NULL; gulong cur_clock_handle_size = sizeof (HANDLE); if (ks_object_get_property (pin_handle, KSPROPSETID_Stream, KSPROPERTY_STREAM_MASTERCLOCK, (gpointer *) & cur_clock_handle, &cur_clock_handle_size)) { GST_DEBUG ("current master clock handle: 0x%08x", *cur_clock_handle); CloseHandle (*cur_clock_handle); g_free (cur_clock_handle); } else { HANDLE new_clock_handle = gst_ks_clock_get_handle (priv->clock); if (ks_object_set_property (pin_handle, KSPROPSETID_Stream, KSPROPERTY_STREAM_MASTERCLOCK, &new_clock_handle, sizeof (new_clock_handle))) { gst_ks_clock_prepare (priv->clock); } else { GST_WARNING ("failed to set pin's master clock"); } } } return pin_handle; /* ERRORS */ error_create_pin: { gchar *str; gst_ks_video_device_parse_win32_error ("KsCreatePin", ret, NULL, &str); GST_ERROR ("%s", str); g_free (str); goto beach; } beach: { g_free (framing); if (ks_is_valid_handle (pin_handle)) CloseHandle (pin_handle); g_free (pin_conn); return INVALID_HANDLE_VALUE; } } static void gst_ks_video_device_close_current_pin (GstKsVideoDevice * self) { GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); if (!ks_is_valid_handle (priv->pin_handle)) return; gst_ks_video_device_set_state (self, KSSTATE_STOP); CloseHandle (priv->pin_handle); priv->pin_handle = INVALID_HANDLE_VALUE; } static void gst_ks_video_device_reset_caps (GstKsVideoDevice * self) { GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); gst_ks_video_device_close_current_pin (self); ks_video_media_type_free (priv->cur_media_type); priv->cur_media_type = NULL; priv->width = priv->height = priv->fps_n = priv->fps_d = 0; g_free (priv->rgb_swap_buf); priv->rgb_swap_buf = NULL; if (priv->cur_fixed_caps != NULL) { gst_caps_unref (priv->cur_fixed_caps); priv->cur_fixed_caps = NULL; } } gboolean gst_ks_video_device_set_caps (GstKsVideoDevice * self, GstCaps * caps) { GstKsVideoDevicePrivate *priv = 
GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); GList *cur; GstStructure *s; /* State to be committed on success */ KsVideoMediaType *media_type = NULL; guint width, height, fps_n, fps_d; HANDLE pin_handle = INVALID_HANDLE_VALUE; /* Reset? */ if (caps == NULL) { gst_ks_video_device_reset_caps (self); return TRUE; } /* Validate the caps */ if (!gst_caps_is_subset (caps, priv->cached_caps)) { gchar *string_caps = gst_caps_to_string (caps); gchar *string_c_caps = gst_caps_to_string (priv->cached_caps); GST_ERROR ("caps (%s) is not a subset of device caps (%s)", string_caps, string_c_caps); g_free (string_caps); g_free (string_c_caps); goto error; } for (cur = priv->media_types; cur != NULL; cur = cur->next) { KsVideoMediaType *mt = cur->data; if (gst_caps_is_subset (caps, mt->translated_caps)) { media_type = ks_video_media_type_dup (mt); break; } } if (media_type == NULL) goto error; s = gst_caps_get_structure (caps, 0); if (!gst_structure_get_int (s, "width", &width) || !gst_structure_get_int (s, "height", &height) || !gst_structure_get_fraction (s, "framerate", &fps_n, &fps_d)) { GST_ERROR ("Failed to get width/height/fps"); goto error; } if (!ks_video_fixate_media_type (media_type->range, media_type->format, width, height, fps_n, fps_d)) goto error; if (priv->cur_media_type != NULL) { if (media_type->format_size == priv->cur_media_type->format_size && memcmp (media_type->format, priv->cur_media_type->format, priv->cur_media_type->format_size) == 0) { GST_DEBUG ("%s: re-using existing pin", G_STRFUNC); goto same_caps; } else { GST_DEBUG ("%s: re-creating pin", G_STRFUNC); } } gst_ks_video_device_close_current_pin (self); pin_handle = gst_ks_video_device_create_pin (self, media_type, &priv->num_requests); if (!ks_is_valid_handle (pin_handle)) { /* Re-create the old pin */ if (priv->cur_media_type != NULL) priv->pin_handle = gst_ks_video_device_create_pin (self, priv->cur_media_type, &priv->num_requests); goto error; } /* Commit state: no turning back past this */ 
gst_ks_video_device_reset_caps (self); priv->cur_media_type = media_type; priv->width = width; priv->height = height; priv->fps_n = fps_n; priv->fps_d = fps_d; if (gst_structure_has_name (s, "video/x-raw-rgb")) priv->rgb_swap_buf = g_malloc (media_type->sample_size / priv->height); else priv->rgb_swap_buf = NULL; priv->is_mjpeg = gst_structure_has_name (s, "image/jpeg"); priv->pin_handle = pin_handle; priv->cur_fixed_caps = gst_caps_copy (caps); return TRUE; error: { ks_video_media_type_free (media_type); return FALSE; } same_caps: { ks_video_media_type_free (media_type); return TRUE; } } gboolean gst_ks_video_device_set_state (GstKsVideoDevice * self, KSSTATE state) { GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); KSSTATE initial_state; gint addend; g_assert (priv->cur_media_type != NULL); if (state == priv->state) return TRUE; initial_state = priv->state; addend = (state > priv->state) ? 1 : -1; GST_DEBUG ("Initiating pin state change from %s to %s", ks_state_to_string (priv->state), ks_state_to_string (state)); while (priv->state != state) { KSSTATE next_state = priv->state + addend; /* Skip the ACQUIRE step on the way down like DirectShow does */ if (addend < 0 && next_state == KSSTATE_ACQUIRE) next_state = KSSTATE_STOP; GST_DEBUG ("Changing pin state from %s to %s", ks_state_to_string (priv->state), ks_state_to_string (next_state)); if (ks_object_set_connection_state (priv->pin_handle, next_state)) { priv->state = next_state; GST_DEBUG ("Changed pin state to %s", ks_state_to_string (priv->state)); if (priv->state == KSSTATE_PAUSE && addend > 0) gst_ks_video_device_prepare_buffers (self); else if (priv->state == KSSTATE_STOP && addend < 0) gst_ks_video_device_clear_buffers (self); } else { GST_WARNING ("Failed to change pin state to %s", ks_state_to_string (next_state)); return FALSE; } } GST_DEBUG ("Finished pin state change from %s to %s", ks_state_to_string (initial_state), ks_state_to_string (state)); return TRUE; } guint 
gst_ks_video_device_get_frame_size (GstKsVideoDevice * self) { GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); g_assert (priv->cur_media_type != NULL); return priv->cur_media_type->sample_size; } GstClockTime gst_ks_video_device_get_duration (GstKsVideoDevice * self) { GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); g_assert (priv->cur_media_type != NULL); return gst_util_uint64_scale_int (GST_SECOND, priv->fps_d, priv->fps_n); } gboolean gst_ks_video_device_get_latency (GstKsVideoDevice * self, GstClockTime * min_latency, GstClockTime * max_latency) { GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); if (priv->cur_media_type == NULL) return FALSE; *min_latency = gst_util_uint64_scale_int (GST_SECOND, priv->fps_d, priv->fps_n); *max_latency = *min_latency; return TRUE; } static gboolean gst_ks_video_device_request_frame (GstKsVideoDevice * self, ReadRequest * req, gulong * error_code, gchar ** error_str) { GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); HANDLE event; KSSTREAM_READ_PARAMS *params; BOOL success; DWORD bytes_returned = 0; /* Reset the OVERLAPPED structure */ event = req->overlapped.hEvent; memset (&req->overlapped, 0, sizeof (OVERLAPPED)); req->overlapped.hEvent = event; /* Fill out KSSTREAM_HEADER and KS_FRAME_INFO */ params = &req->params; memset (params, 0, sizeof (KSSTREAM_READ_PARAMS)); params->header.Size = sizeof (KSSTREAM_HEADER) + sizeof (KS_FRAME_INFO); params->header.PresentationTime.Numerator = 1; params->header.PresentationTime.Denominator = 1; params->header.FrameExtent = gst_ks_video_device_get_frame_size (self); params->header.Data = req->buf; params->frame_info.ExtendedHeaderSize = sizeof (KS_FRAME_INFO); /* * Clear the buffer like DirectShow does * * REVISIT: Could probably remove this later, for now it's here to help * track down the case where we capture old frames. 
This has been * observed with UVC cameras, presumably with some system load. */ memset (params->header.Data, 0, params->header.FrameExtent); success = DeviceIoControl (priv->pin_handle, IOCTL_KS_READ_STREAM, NULL, 0, params, params->header.Size, &bytes_returned, &req->overlapped); if (!success && GetLastError () != ERROR_IO_PENDING) goto error_ioctl; return TRUE; /* ERRORS */ error_ioctl: { gst_ks_video_device_parse_win32_error ("DeviceIoControl", GetLastError (), error_code, error_str); return FALSE; } } GstFlowReturn gst_ks_video_device_read_frame (GstKsVideoDevice * self, guint8 * buf, gulong buf_size, gulong * bytes_read, GstClockTime * presentation_time, gulong * error_code, gchar ** error_str) { GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); guint req_idx; DWORD wait_ret; BOOL success; DWORD bytes_returned; g_assert (priv->cur_media_type != NULL); /* First time we're called, submit the requests. */ if (G_UNLIKELY (!priv->requests_submitted)) { priv->requests_submitted = TRUE; for (req_idx = 0; req_idx < priv->num_requests; req_idx++) { ReadRequest *req = &g_array_index (priv->requests, ReadRequest, req_idx); if (!gst_ks_video_device_request_frame (self, req, error_code, error_str)) goto error_request_failed; } } do { /* Wait for either a request to complete, a cancel or a timeout */ wait_ret = WaitForMultipleObjects (priv->request_events->len, (HANDLE *) priv->request_events->data, FALSE, READ_TIMEOUT); if (wait_ret == WAIT_TIMEOUT) goto error_timeout; else if (wait_ret == WAIT_FAILED) goto error_wait; /* Stopped? */ if (WaitForSingleObject (priv->cancel_event, 0) == WAIT_OBJECT_0) goto error_cancel; *bytes_read = 0; /* Find the last ReadRequest that finished and get the result, immediately * re-issuing each request that has completed. */ for (req_idx = wait_ret - WAIT_OBJECT_0; req_idx < priv->num_requests; req_idx++) { ReadRequest *req = &g_array_index (priv->requests, ReadRequest, req_idx); /* * Completed? 
WaitForMultipleObjects() returns the lowest index if * multiple objects are in the signaled state, and we know that requests * are processed one by one so there's no point in looking further once * we've found the first that's non-signaled. */ if (WaitForSingleObject (req->overlapped.hEvent, 0) != WAIT_OBJECT_0) break; success = GetOverlappedResult (priv->pin_handle, &req->overlapped, &bytes_returned, TRUE); ResetEvent (req->overlapped.hEvent); if (success) { KSSTREAM_HEADER *hdr = &req->params.header; KS_FRAME_INFO *frame_info = &req->params.frame_info; GstClockTime timestamp = GST_CLOCK_TIME_NONE; GstClockTime duration = GST_CLOCK_TIME_NONE; if (hdr->OptionsFlags & KSSTREAM_HEADER_OPTIONSF_TIMEVALID) timestamp = hdr->PresentationTime.Time * 100; if (hdr->OptionsFlags & KSSTREAM_HEADER_OPTIONSF_DURATIONVALID) duration = hdr->Duration * 100; /* Assume it's a good frame */ *bytes_read = hdr->DataUsed; if (G_LIKELY (presentation_time != NULL)) *presentation_time = timestamp; if (G_UNLIKELY (GST_DEBUG_IS_ENABLED ())) { gchar *options_flags_str = ks_options_flags_to_string (hdr->OptionsFlags); GST_DEBUG ("PictureNumber=%" G_GUINT64_FORMAT ", DropCount=%" G_GUINT64_FORMAT ", PresentationTime=%" GST_TIME_FORMAT ", Duration=%" GST_TIME_FORMAT ", OptionsFlags=%s: %d bytes", frame_info->PictureNumber, frame_info->DropCount, GST_TIME_ARGS (timestamp), GST_TIME_ARGS (duration), options_flags_str, hdr->DataUsed); g_free (options_flags_str); } /* Protect against old frames. This should never happen, see previous * comment on last_timestamp. 
*/ if (G_LIKELY (GST_CLOCK_TIME_IS_VALID (timestamp))) { if (G_UNLIKELY (GST_CLOCK_TIME_IS_VALID (priv->last_timestamp) && timestamp < priv->last_timestamp)) { GST_WARNING ("got an old frame (last_timestamp=%" GST_TIME_FORMAT ", timestamp=%" GST_TIME_FORMAT ")", GST_TIME_ARGS (priv->last_timestamp), GST_TIME_ARGS (timestamp)); *bytes_read = 0; } else { priv->last_timestamp = timestamp; } } if (*bytes_read > 0) { /* Grab the frame data */ g_assert (buf_size >= hdr->DataUsed); memcpy (buf, req->buf, hdr->DataUsed); if (priv->is_mjpeg) { /* * Workaround for cameras/drivers that intermittently provide us * with incomplete or corrupted MJPEG frames. * * Happens with for instance Microsoft LifeCam VX-7000. */ gboolean valid = FALSE; guint padding = 0; /* JFIF SOI marker */ if (*bytes_read > MJPEG_MAX_PADDING && buf[0] == 0xff && buf[1] == 0xd8) { guint8 *p = buf + *bytes_read - 2; /* JFIF EOI marker (but skip any padding) */ while (padding < MJPEG_MAX_PADDING - 1 - 2 && !valid) { if (p[0] == 0xff && p[1] == 0xd9) { valid = TRUE; } else { padding++; p--; } } } if (valid) *bytes_read -= padding; else *bytes_read = 0; } } } else if (GetLastError () != ERROR_OPERATION_ABORTED) goto error_get_result; /* Submit a new request immediately */ if (!gst_ks_video_device_request_frame (self, req, error_code, error_str)) goto error_request_failed; } } while (*bytes_read == 0); return GST_FLOW_OK; /* ERRORS */ error_request_failed: { return GST_FLOW_ERROR; } error_timeout: { GST_DEBUG ("IOCTL_KS_READ_STREAM timed out"); if (error_code != NULL) *error_code = 0; if (error_str != NULL) *error_str = NULL; return GST_FLOW_UNEXPECTED; } error_wait: { gst_ks_video_device_parse_win32_error ("WaitForMultipleObjects", GetLastError (), error_code, error_str); return GST_FLOW_ERROR; } error_cancel: { if (error_code != NULL) *error_code = 0; if (error_str != NULL) *error_str = NULL; return GST_FLOW_WRONG_STATE; } error_get_result: { gst_ks_video_device_parse_win32_error ("GetOverlappedResult", 
GetLastError (), error_code, error_str); return GST_FLOW_ERROR; } } void gst_ks_video_device_postprocess_frame (GstKsVideoDevice * self, guint8 * buf, guint buf_size) { GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); /* If it's RGB we need to flip the image */ if (priv->rgb_swap_buf != NULL) { gint stride, line; guint8 *dst, *src; stride = buf_size / priv->height; dst = buf; src = buf + buf_size - stride; for (line = 0; line < priv->height / 2; line++) { memcpy (priv->rgb_swap_buf, dst, stride); memcpy (dst, src, stride); memcpy (src, priv->rgb_swap_buf, stride); dst += stride; src -= stride; } } } void gst_ks_video_device_cancel (GstKsVideoDevice * self) { GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); SetEvent (priv->cancel_event); } void gst_ks_video_device_cancel_stop (GstKsVideoDevice * self) { GstKsVideoDevicePrivate *priv = GST_KS_VIDEO_DEVICE_GET_PRIVATE (self); ResetEvent (priv->cancel_event); } psimedia-master/gstprovider/gstelements/winks/gstksvideodevice.h000066400000000000000000000056121220046403000257050ustar00rootroot00000000000000/* * Copyright (C) 2008 Ole André Vadla Ravnås * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA.
*/ #ifndef __GST_KS_VIDEO_DEVICE_H__ #define __GST_KS_VIDEO_DEVICE_H__ #include <gst/gst.h> #include <windows.h> #include <ks.h> G_BEGIN_DECLS #define GST_TYPE_KS_VIDEO_DEVICE \ (gst_ks_video_device_get_type ()) #define GST_KS_VIDEO_DEVICE(obj) \ (G_TYPE_CHECK_INSTANCE_CAST ((obj), GST_TYPE_KS_VIDEO_DEVICE, GstKsVideoDevice)) #define GST_KS_VIDEO_DEVICE_CLASS(klass) \ (G_TYPE_CHECK_CLASS_CAST ((klass), GST_TYPE_KS_VIDEO_DEVICE, GstKsVideoDeviceClass)) #define GST_IS_KS_VIDEO_DEVICE(obj) \ (G_TYPE_CHECK_INSTANCE_TYPE ((obj), GST_TYPE_KS_VIDEO_DEVICE)) #define GST_IS_KS_VIDEO_DEVICE_CLASS(klass) \ (G_TYPE_CHECK_CLASS_TYPE ((klass), GST_TYPE_KS_VIDEO_DEVICE)) typedef struct _GstKsVideoDevice GstKsVideoDevice; typedef struct _GstKsVideoDeviceClass GstKsVideoDeviceClass; struct _GstKsVideoDevice { GObject parent; }; struct _GstKsVideoDeviceClass { GObjectClass parent_class; }; GType gst_ks_video_device_get_type (void); gboolean gst_ks_video_device_open (GstKsVideoDevice * self); void gst_ks_video_device_close (GstKsVideoDevice * self); GstCaps * gst_ks_video_device_get_available_caps (GstKsVideoDevice * self); gboolean gst_ks_video_device_has_caps (GstKsVideoDevice * self); gboolean gst_ks_video_device_set_caps (GstKsVideoDevice * self, GstCaps * caps); gboolean gst_ks_video_device_set_state (GstKsVideoDevice * self, KSSTATE state); guint gst_ks_video_device_get_frame_size (GstKsVideoDevice * self); GstClockTime gst_ks_video_device_get_duration (GstKsVideoDevice * self); gboolean gst_ks_video_device_get_latency (GstKsVideoDevice * self, GstClockTime * min_latency, GstClockTime * max_latency); GstFlowReturn gst_ks_video_device_read_frame (GstKsVideoDevice * self, guint8 * buf, gulong buf_size, gulong * bytes_read, GstClockTime * presentation_time, gulong * error_code, gchar ** error_str); void gst_ks_video_device_postprocess_frame (GstKsVideoDevice * self, guint8 * buf, guint buf_size); void gst_ks_video_device_cancel (GstKsVideoDevice * self); void gst_ks_video_device_cancel_stop (GstKsVideoDevice *
self); G_END_DECLS #endif /* __GST_KS_VIDEO_DEVICE_H__ */ psimedia-master/gstprovider/gstelements/winks/gstksvideosrc.c000066400000000000000000000722301220046403000252300ustar00rootroot00000000000000/* * Copyright (C) 2008 Ole André Vadla RavnÃ¥s * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Library General Public * License as published by the Free Software Foundation; either * version 2 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Library General Public License for more details. * * You should have received a copy of the GNU Library General Public * License along with this library; if not, write to the * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, * Boston, MA 02110-1301, USA. */ /** * SECTION:element-ksvideosrc * * Provides low-latency video capture from WDM cameras on Windows. * * * Example pipelines * |[ * gst-launch -v ksvideosrc do-stats=TRUE ! ffmpegcolorspace ! dshowvideosink * ]| Capture from a camera and render using dshowvideosink. * |[ * gst-launch -v ksvideosrc do-stats=TRUE ! image/jpeg, width=640, height=480 * ! jpegdec ! ffmpegcolorspace ! dshowvideosink * ]| Capture from an MJPEG camera and render using dshowvideosink. 
* */ #ifdef UNICODE #undef UNICODE #endif #ifdef HAVE_CONFIG_H # include #endif #include "gstksvideosrc.h" #include "gstksclock.h" #include "gstksvideodevice.h" #include "kshelpers.h" #include "ksvideohelpers.h" #define DEFAULT_DEVICE_PATH NULL #define DEFAULT_DEVICE_NAME NULL #define DEFAULT_DEVICE_INDEX -1 #define DEFAULT_DO_STATS FALSE #define DEFAULT_ENABLE_QUIRKS TRUE enum { PROP_0, PROP_DEVICE_PATH, PROP_DEVICE_NAME, PROP_DEVICE_INDEX, PROP_DO_STATS, PROP_FPS, PROP_ENABLE_QUIRKS, }; GST_DEBUG_CATEGORY (gst_ks_debug); #define GST_CAT_DEFAULT gst_ks_debug #define KS_WORKER_LOCK(priv) g_mutex_lock (priv->worker_lock) #define KS_WORKER_UNLOCK(priv) g_mutex_unlock (priv->worker_lock) #define KS_WORKER_WAIT(priv) \ g_cond_wait (priv->worker_notify_cond, priv->worker_lock) #define KS_WORKER_NOTIFY(priv) g_cond_signal (priv->worker_notify_cond) #define KS_WORKER_WAIT_FOR_RESULT(priv) \ g_cond_wait (priv->worker_result_cond, priv->worker_lock) #define KS_WORKER_NOTIFY_RESULT(priv) \ g_cond_signal (priv->worker_result_cond) typedef enum { KS_WORKER_STATE_STARTING, KS_WORKER_STATE_READY, KS_WORKER_STATE_STOPPING, KS_WORKER_STATE_ERROR, } KsWorkerState; typedef struct { /* Properties */ gchar *device_path; gchar *device_name; gint device_index; gboolean do_stats; gboolean enable_quirks; /* State */ GstKsClock *ksclock; GstKsVideoDevice *device; guint64 offset; GstClockTime prev_ts; gboolean running; /* Worker thread */ GThread *worker_thread; GMutex *worker_lock; GCond *worker_notify_cond; GCond *worker_result_cond; KsWorkerState worker_state; GstCaps *worker_pending_caps; gboolean worker_setcaps_result; gboolean worker_pending_run; gboolean worker_run_result; /* Statistics */ GstClockTime last_sampling; guint count; guint fps; } GstKsVideoSrcPrivate; #define GST_KS_VIDEO_SRC_GET_PRIVATE(o) \ (G_TYPE_INSTANCE_GET_PRIVATE ((o), GST_TYPE_KS_VIDEO_SRC, \ GstKsVideoSrcPrivate)) static void gst_ks_video_src_finalize (GObject * object); static void 
gst_ks_video_src_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec); static void gst_ks_video_src_set_property (GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec); static void gst_ks_video_src_reset (GstKsVideoSrc * self); static GstStateChangeReturn gst_ks_video_src_change_state (GstElement * element, GstStateChange transition); static gboolean gst_ks_video_src_set_clock (GstElement * element, GstClock * clock); static GstCaps *gst_ks_video_src_get_caps (GstBaseSrc * basesrc); static gboolean gst_ks_video_src_set_caps (GstBaseSrc * basesrc, GstCaps * caps); static void gst_ks_video_src_fixate (GstBaseSrc * basesrc, GstCaps * caps); static gboolean gst_ks_video_src_query (GstBaseSrc * basesrc, GstQuery * query); static gboolean gst_ks_video_src_unlock (GstBaseSrc * basesrc); static gboolean gst_ks_video_src_unlock_stop (GstBaseSrc * basesrc); static GstFlowReturn gst_ks_video_src_create (GstPushSrc * pushsrc, GstBuffer ** buffer); GST_BOILERPLATE (GstKsVideoSrc, gst_ks_video_src, GstPushSrc, GST_TYPE_PUSH_SRC); static void gst_ks_video_src_base_init (gpointer gclass) { GstElementClass *element_class = GST_ELEMENT_CLASS (gclass); static GstElementDetails element_details = { "KsVideoSrc", "Source/Video", "Stream data from a video capture device through Windows kernel streaming", "Ole André Vadla RavnÃ¥s \n" "Haakon Sporsheim " }; gst_element_class_set_details (element_class, &element_details); gst_element_class_add_pad_template (element_class, gst_pad_template_new ("src", GST_PAD_SRC, GST_PAD_ALWAYS, ks_video_get_all_caps ())); } static void gst_ks_video_src_class_init (GstKsVideoSrcClass * klass) { GObjectClass *gobject_class = G_OBJECT_CLASS (klass); GstElementClass *gstelement_class = GST_ELEMENT_CLASS (klass); GstBaseSrcClass *gstbasesrc_class = GST_BASE_SRC_CLASS (klass); GstPushSrcClass *gstpushsrc_class = GST_PUSH_SRC_CLASS (klass); g_type_class_add_private (klass, sizeof (GstKsVideoSrcPrivate)); 
gobject_class->finalize = gst_ks_video_src_finalize; gobject_class->get_property = gst_ks_video_src_get_property; gobject_class->set_property = gst_ks_video_src_set_property; gstelement_class->change_state = GST_DEBUG_FUNCPTR (gst_ks_video_src_change_state); gstelement_class->set_clock = GST_DEBUG_FUNCPTR (gst_ks_video_src_set_clock); gstbasesrc_class->get_caps = GST_DEBUG_FUNCPTR (gst_ks_video_src_get_caps); gstbasesrc_class->set_caps = GST_DEBUG_FUNCPTR (gst_ks_video_src_set_caps); gstbasesrc_class->fixate = GST_DEBUG_FUNCPTR (gst_ks_video_src_fixate); gstbasesrc_class->query = GST_DEBUG_FUNCPTR (gst_ks_video_src_query); gstbasesrc_class->unlock = GST_DEBUG_FUNCPTR (gst_ks_video_src_unlock); gstbasesrc_class->unlock_stop = GST_DEBUG_FUNCPTR (gst_ks_video_src_unlock_stop); gstpushsrc_class->create = GST_DEBUG_FUNCPTR (gst_ks_video_src_create); g_object_class_install_property (gobject_class, PROP_DEVICE_PATH, g_param_spec_string ("device-path", "Device Path", "The device path", DEFAULT_DEVICE_PATH, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_DEVICE_NAME, g_param_spec_string ("device-name", "Device Name", "The human-readable device name", DEFAULT_DEVICE_NAME, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_DEVICE_INDEX, g_param_spec_int ("device-index", "Device Index", "The zero-based device index", -1, G_MAXINT, DEFAULT_DEVICE_INDEX, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_DO_STATS, g_param_spec_boolean ("do-stats", "Enable statistics", "Enable logging of statistics", DEFAULT_DO_STATS, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, PROP_FPS, g_param_spec_int ("fps", "Frames per second", "Last measured framerate, if statistics are enabled", -1, G_MAXINT, -1, G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)); g_object_class_install_property (gobject_class, 
PROP_ENABLE_QUIRKS, g_param_spec_boolean ("enable-quirks", "Enable quirks", "Enable driver-specific quirks", DEFAULT_ENABLE_QUIRKS, G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)); GST_DEBUG_CATEGORY_INIT (gst_ks_debug, "ksvideosrc", 0, "Kernel streaming video source"); } static void gst_ks_video_src_init (GstKsVideoSrc * self, GstKsVideoSrcClass * gclass) { GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); GstBaseSrc *basesrc = GST_BASE_SRC (self); gst_base_src_set_live (basesrc, TRUE); gst_base_src_set_format (basesrc, GST_FORMAT_TIME); gst_ks_video_src_reset (self); priv->device_path = DEFAULT_DEVICE_PATH; priv->device_name = DEFAULT_DEVICE_NAME; priv->device_index = DEFAULT_DEVICE_INDEX; priv->do_stats = DEFAULT_DO_STATS; priv->enable_quirks = DEFAULT_ENABLE_QUIRKS; } static void gst_ks_video_src_finalize (GObject * object) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (object); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); g_free (priv->device_name); g_free (priv->device_path); G_OBJECT_CLASS (parent_class)->finalize (object); } static void gst_ks_video_src_get_property (GObject * object, guint prop_id, GValue * value, GParamSpec * pspec) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (object); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); switch (prop_id) { case PROP_DEVICE_PATH: g_value_set_string (value, priv->device_path); break; case PROP_DEVICE_NAME: g_value_set_string (value, priv->device_name); break; case PROP_DEVICE_INDEX: g_value_set_int (value, priv->device_index); break; case PROP_DO_STATS: GST_OBJECT_LOCK (object); g_value_set_boolean (value, priv->do_stats); GST_OBJECT_UNLOCK (object); break; case PROP_FPS: GST_OBJECT_LOCK (object); g_value_set_int (value, priv->fps); GST_OBJECT_UNLOCK (object); break; case PROP_ENABLE_QUIRKS: g_value_set_boolean (value, priv->enable_quirks); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } static void gst_ks_video_src_set_property 
(GObject * object, guint prop_id, const GValue * value, GParamSpec * pspec) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (object); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); switch (prop_id) { case PROP_DEVICE_PATH: g_free (priv->device_path); priv->device_path = g_value_dup_string (value); break; case PROP_DEVICE_NAME: g_free (priv->device_name); priv->device_name = g_value_dup_string (value); break; case PROP_DEVICE_INDEX: priv->device_index = g_value_get_int (value); break; case PROP_DO_STATS: GST_OBJECT_LOCK (object); priv->do_stats = g_value_get_boolean (value); GST_OBJECT_UNLOCK (object); break; case PROP_ENABLE_QUIRKS: priv->enable_quirks = g_value_get_boolean (value); break; default: G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec); break; } } static void gst_ks_video_src_reset (GstKsVideoSrc * self) { GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); /* Reset statistics */ priv->last_sampling = GST_CLOCK_TIME_NONE; priv->count = 0; priv->fps = -1; /* Reset timestamping state */ priv->offset = 0; priv->prev_ts = GST_CLOCK_TIME_NONE; priv->running = FALSE; } static void gst_ks_video_src_apply_driver_quirks (GstKsVideoSrc * self) { GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); HMODULE mod; /* * Logitech's driver software injects the following DLL into all processes * spawned. This DLL does some nasty tricks, sitting in between the * application and the low-level ntdll API (NtCreateFile, NtClose, * NtDeviceIoControlFile, NtDuplicateObject, etc.), making all sorts * of assumptions. * * The only regression that this quirk causes is that the video effects * feature doesn't work. */ mod = GetModuleHandle ("LVPrcInj.dll"); if (mod != NULL) { GST_DEBUG_OBJECT (self, "Logitech DLL detected, neutralizing it"); /* * We know that no-one's actually keeping this handle around to decrement * its reference count, so we'll take care of that job. 
The DLL's DllMain * implementation takes care of rolling back changes when it gets unloaded, * so this seems to be the cleanest and most future-proof way that we can * get rid of it... */ FreeLibrary (mod); /* Paranoia: verify that it's no longer there */ mod = GetModuleHandle ("LVPrcInj.dll"); if (mod != NULL) GST_WARNING_OBJECT (self, "failed to neutralize Logitech DLL"); } } static gboolean gst_ks_video_src_open_device (GstKsVideoSrc * self) { GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); GstKsVideoDevice *device = NULL; GList *devices, *cur; g_assert (priv->device == NULL); devices = ks_enumerate_devices (&KSCATEGORY_VIDEO); if (devices == NULL) goto error_no_devices; for (cur = devices; cur != NULL; cur = cur->next) { KsDeviceEntry *entry = cur->data; GST_DEBUG_OBJECT (self, "device %d: name='%s' path='%s'", entry->index, entry->name, entry->path); } for (cur = devices; cur != NULL && device == NULL; cur = cur->next) { KsDeviceEntry *entry = cur->data; gboolean match; if (priv->device_path != NULL) { match = g_strcasecmp (entry->path, priv->device_path) == 0; } else if (priv->device_name != NULL) { match = g_strcasecmp (entry->name, priv->device_name) == 0; } else if (priv->device_index >= 0) { match = entry->index == priv->device_index; } else { match = TRUE; /* pick the first entry */ } if (match) { priv->ksclock = g_object_new (GST_TYPE_KS_CLOCK, NULL); if (priv->ksclock != NULL && gst_ks_clock_open (priv->ksclock)) { GstClock *clock = GST_ELEMENT_CLOCK (self); if (clock != NULL) gst_ks_clock_provide_master_clock (priv->ksclock, clock); } else { GST_WARNING_OBJECT (self, "failed to create/open KsClock"); g_object_unref (priv->ksclock); priv->ksclock = NULL; } device = g_object_new (GST_TYPE_KS_VIDEO_DEVICE, "clock", priv->ksclock, "device-path", entry->path, NULL); } ks_device_entry_free (entry); } g_list_free (devices); if (device == NULL) goto error_no_match; if (!gst_ks_video_device_open (device)) goto error_open; priv->device = 
device; return TRUE; /* ERRORS */ error_no_devices: { GST_ELEMENT_ERROR (self, RESOURCE, NOT_FOUND, ("No video capture devices found"), (NULL)); return FALSE; } error_no_match: { if (priv->device_path != NULL) { GST_ELEMENT_ERROR (self, RESOURCE, SETTINGS, ("Specified video capture device with path '%s' not found", priv->device_path), (NULL)); } else if (priv->device_name != NULL) { GST_ELEMENT_ERROR (self, RESOURCE, SETTINGS, ("Specified video capture device with name '%s' not found", priv->device_name), (NULL)); } else { GST_ELEMENT_ERROR (self, RESOURCE, SETTINGS, ("Specified video capture device with index %d not found", priv->device_index), (NULL)); } return FALSE; } error_open: { GST_ELEMENT_ERROR (self, RESOURCE, OPEN_READ, ("Failed to open device"), (NULL)); g_object_unref (device); return FALSE; } } static void gst_ks_video_src_close_device (GstKsVideoSrc * self) { GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); g_assert (priv->device != NULL); gst_ks_video_device_close (priv->device); g_object_unref (priv->device); priv->device = NULL; if (priv->ksclock != NULL) { gst_ks_clock_close (priv->ksclock); g_object_unref (priv->ksclock); priv->ksclock = NULL; } gst_ks_video_src_reset (self); } /* * Worker thread that takes care of starting, configuring and stopping things. * * This is needed because Logitech's driver software injects a DLL that * intercepts API functions like NtCreateFile, NtClose, NtDeviceIoControlFile * and NtDuplicateObject so that they can provide in-place video effects to * existing applications. Their assumption is that at least one thread tainted * by their code stays around for the lifetime of the capture. 
*/ static gpointer gst_ks_video_src_worker_func (gpointer data) { GstKsVideoSrc *self = data; GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); if (!gst_ks_video_src_open_device (self)) goto open_failed; KS_WORKER_LOCK (priv); priv->worker_state = KS_WORKER_STATE_READY; KS_WORKER_NOTIFY_RESULT (priv); while (priv->worker_state != KS_WORKER_STATE_STOPPING) { KS_WORKER_WAIT (priv); if (priv->worker_pending_caps != NULL) { priv->worker_setcaps_result = gst_ks_video_device_set_caps (priv->device, priv->worker_pending_caps); priv->worker_pending_caps = NULL; KS_WORKER_NOTIFY_RESULT (priv); } else if (priv->worker_pending_run) { if (priv->ksclock != NULL) gst_ks_clock_start (priv->ksclock); priv->worker_run_result = gst_ks_video_device_set_state (priv->device, KSSTATE_RUN); priv->worker_pending_run = FALSE; KS_WORKER_NOTIFY_RESULT (priv); } } KS_WORKER_UNLOCK (priv); gst_ks_video_src_close_device (self); return NULL; /* ERRORS */ open_failed: { KS_WORKER_LOCK (priv); priv->worker_state = KS_WORKER_STATE_ERROR; KS_WORKER_NOTIFY_RESULT (priv); KS_WORKER_UNLOCK (priv); return NULL; } } static gboolean gst_ks_video_src_start_worker (GstKsVideoSrc * self) { GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); gboolean result; priv->worker_lock = g_mutex_new (); priv->worker_notify_cond = g_cond_new (); priv->worker_result_cond = g_cond_new (); priv->worker_pending_caps = NULL; priv->worker_pending_run = FALSE; priv->worker_state = KS_WORKER_STATE_STARTING; priv->worker_thread = g_thread_create (gst_ks_video_src_worker_func, self, TRUE, NULL); KS_WORKER_LOCK (priv); while (priv->worker_state < KS_WORKER_STATE_READY) KS_WORKER_WAIT_FOR_RESULT (priv); result = priv->worker_state == KS_WORKER_STATE_READY; KS_WORKER_UNLOCK (priv); return result; } static void gst_ks_video_src_stop_worker (GstKsVideoSrc * self) { GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); KS_WORKER_LOCK (priv); priv->worker_state = KS_WORKER_STATE_STOPPING; 
KS_WORKER_NOTIFY (priv); KS_WORKER_UNLOCK (priv); g_thread_join (priv->worker_thread); priv->worker_thread = NULL; g_cond_free (priv->worker_result_cond); priv->worker_result_cond = NULL; g_cond_free (priv->worker_notify_cond); priv->worker_notify_cond = NULL; g_mutex_free (priv->worker_lock); priv->worker_lock = NULL; } static GstStateChangeReturn gst_ks_video_src_change_state (GstElement * element, GstStateChange transition) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (element); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); GstStateChangeReturn ret; switch (transition) { case GST_STATE_CHANGE_NULL_TO_READY: if (priv->enable_quirks) gst_ks_video_src_apply_driver_quirks (self); if (!gst_ks_video_src_start_worker (self)) goto open_failed; break; } ret = GST_ELEMENT_CLASS (parent_class)->change_state (element, transition); switch (transition) { case GST_STATE_CHANGE_READY_TO_NULL: gst_ks_video_src_stop_worker (self); break; } return ret; /* ERRORS */ open_failed: { gst_ks_video_src_stop_worker (self); return GST_STATE_CHANGE_FAILURE; } } static gboolean gst_ks_video_src_set_clock (GstElement * element, GstClock * clock) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (element); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); GST_OBJECT_LOCK (element); if (clock != NULL && priv->ksclock != NULL) gst_ks_clock_provide_master_clock (priv->ksclock, clock); GST_OBJECT_UNLOCK (element); return TRUE; } static GstCaps * gst_ks_video_src_get_caps (GstBaseSrc * basesrc) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (basesrc); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); if (priv->device != NULL) return gst_ks_video_device_get_available_caps (priv->device); else return NULL; /* BaseSrc will return template caps */ } static gboolean gst_ks_video_src_set_caps (GstBaseSrc * basesrc, GstCaps * caps) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (basesrc); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); if (priv->device == 
NULL) return FALSE; KS_WORKER_LOCK (priv); priv->worker_pending_caps = caps; KS_WORKER_NOTIFY (priv); while (priv->worker_pending_caps == caps) KS_WORKER_WAIT_FOR_RESULT (priv); KS_WORKER_UNLOCK (priv); return priv->worker_setcaps_result; } static void gst_ks_video_src_fixate (GstBaseSrc * basesrc, GstCaps * caps) { GstStructure *structure = gst_caps_get_structure (caps, 0); gst_structure_fixate_field_nearest_int (structure, "width", G_MAXINT); gst_structure_fixate_field_nearest_int (structure, "height", G_MAXINT); gst_structure_fixate_field_nearest_fraction (structure, "framerate", G_MAXINT, 1); } static gboolean gst_ks_video_src_query (GstBaseSrc * basesrc, GstQuery * query) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (basesrc); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); gboolean result = FALSE; switch (GST_QUERY_TYPE (query)) { case GST_QUERY_LATENCY:{ GstClockTime min_latency, max_latency; if (priv->device == NULL) goto beach; result = gst_ks_video_device_get_latency (priv->device, &min_latency, &max_latency); if (!result) goto beach; GST_DEBUG_OBJECT (self, "reporting latency of min %" GST_TIME_FORMAT " max %" GST_TIME_FORMAT, GST_TIME_ARGS (min_latency), GST_TIME_ARGS (max_latency)); gst_query_set_latency (query, TRUE, min_latency, max_latency); break; } default: result = GST_BASE_SRC_CLASS (parent_class)->query (basesrc, query); break; } beach: return result; } static gboolean gst_ks_video_src_unlock (GstBaseSrc * basesrc) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (basesrc); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); GST_DEBUG_OBJECT (self, "%s", G_STRFUNC); gst_ks_video_device_cancel (priv->device); return TRUE; } static gboolean gst_ks_video_src_unlock_stop (GstBaseSrc * basesrc) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (basesrc); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); GST_DEBUG_OBJECT (self, "%s", G_STRFUNC); gst_ks_video_device_cancel_stop (priv->device); return TRUE; } static gboolean 
gst_ks_video_src_timestamp_buffer (GstKsVideoSrc * self, GstBuffer * buf, GstClockTime presentation_time) { GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); GstClockTime duration; GstClock *clock; GstClockTime timestamp; duration = gst_ks_video_device_get_duration (priv->device); GST_OBJECT_LOCK (self); clock = GST_ELEMENT_CLOCK (self); if (clock != NULL) { gst_object_ref (clock); timestamp = GST_ELEMENT (self)->base_time; if (GST_CLOCK_TIME_IS_VALID (presentation_time)) { if (presentation_time > GST_ELEMENT (self)->base_time) presentation_time -= GST_ELEMENT (self)->base_time; else presentation_time = 0; } } else { timestamp = GST_CLOCK_TIME_NONE; } GST_OBJECT_UNLOCK (self); if (clock != NULL) { /* The time according to the current clock */ timestamp = gst_clock_get_time (clock) - timestamp; if (timestamp > duration) timestamp -= duration; else timestamp = 0; if (GST_CLOCK_TIME_IS_VALID (presentation_time)) { /* * We don't use this for anything yet, need to ponder how to deal * with pins that use an internal clock and timestamp from 0. */ GstClockTimeDiff diff = GST_CLOCK_DIFF (presentation_time, timestamp); GST_DEBUG_OBJECT (self, "diff between gst and driver timestamp: %" G_GINT64_FORMAT, diff); } gst_object_unref (clock); clock = NULL; /* Unless it's the first frame, align the current timestamp on a multiple * of duration since the previous */ if (GST_CLOCK_TIME_IS_VALID (priv->prev_ts)) { GstClockTime delta; guint delta_remainder, delta_offset; /* REVISIT: I've seen this happen with the GstSystemClock on Windows, * scary... */ if (timestamp < priv->prev_ts) { GST_WARNING_OBJECT (self, "clock is ticking backwards"); return FALSE; } /* Round to a duration boundary */ delta = timestamp - priv->prev_ts; delta_remainder = delta % duration; if (delta_remainder < duration / 3) timestamp -= delta_remainder; else timestamp += duration - delta_remainder; /* How many frames are we off then? 
*/ delta = timestamp - priv->prev_ts; delta_offset = delta / duration; if (delta_offset == 1) /* perfect */ GST_BUFFER_FLAG_UNSET (buf, GST_BUFFER_FLAG_DISCONT); else if (delta_offset > 1) { guint lost = delta_offset - 1; GST_INFO_OBJECT (self, "lost %d frame%s, setting discont flag", lost, (lost > 1) ? "s" : ""); GST_BUFFER_FLAG_SET (buf, GST_BUFFER_FLAG_DISCONT); } else if (delta_offset == 0) { /* overproduction, skip this frame */ GST_INFO_OBJECT (self, "skipping frame"); return FALSE; } priv->offset += delta_offset; } priv->prev_ts = timestamp; } GST_BUFFER_OFFSET (buf) = priv->offset; GST_BUFFER_OFFSET_END (buf) = GST_BUFFER_OFFSET (buf) + 1; GST_BUFFER_TIMESTAMP (buf) = timestamp; GST_BUFFER_DURATION (buf) = duration; return TRUE; } static void gst_ks_video_src_update_statistics (GstKsVideoSrc * self) { GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); GstClock *clock; GST_OBJECT_LOCK (self); clock = GST_ELEMENT_CLOCK (self); if (clock != NULL) gst_object_ref (clock); GST_OBJECT_UNLOCK (self); if (clock != NULL) { GstClockTime now = gst_clock_get_time (clock); gst_object_unref (clock); priv->count++; if (GST_CLOCK_TIME_IS_VALID (priv->last_sampling)) { if (now - priv->last_sampling >= GST_SECOND) { GST_OBJECT_LOCK (self); priv->fps = priv->count; GST_OBJECT_UNLOCK (self); g_object_notify (G_OBJECT (self), "fps"); priv->last_sampling = now; priv->count = 0; } } else { priv->last_sampling = now; } } } static GstFlowReturn gst_ks_video_src_create (GstPushSrc * pushsrc, GstBuffer ** buffer) { GstKsVideoSrc *self = GST_KS_VIDEO_SRC (pushsrc); GstKsVideoSrcPrivate *priv = GST_KS_VIDEO_SRC_GET_PRIVATE (self); guint buf_size; GstCaps *caps; GstBuffer *buf = NULL; GstFlowReturn result; GstClockTime presentation_time; gulong error_code; gchar *error_str; g_assert (priv->device != NULL); if (!gst_ks_video_device_has_caps (priv->device)) goto error_no_caps; buf_size = gst_ks_video_device_get_frame_size (priv->device); g_assert (buf_size); caps = 
gst_pad_get_negotiated_caps (GST_BASE_SRC_PAD (self)); if (caps == NULL) goto error_no_caps; result = gst_pad_alloc_buffer (GST_BASE_SRC_PAD (self), priv->offset, buf_size, caps, &buf); gst_caps_unref (caps); if (G_UNLIKELY (result != GST_FLOW_OK)) goto error_alloc_buffer; if (G_UNLIKELY (!priv->running)) { KS_WORKER_LOCK (priv); priv->worker_pending_run = TRUE; KS_WORKER_NOTIFY (priv); while (priv->worker_pending_run) KS_WORKER_WAIT_FOR_RESULT (priv); priv->running = priv->worker_run_result; KS_WORKER_UNLOCK (priv); if (!priv->running) goto error_start_capture; } do { gulong bytes_read; result = gst_ks_video_device_read_frame (priv->device, GST_BUFFER_DATA (buf), buf_size, &bytes_read, &presentation_time, &error_code, &error_str); if (G_UNLIKELY (result != GST_FLOW_OK)) goto error_read_frame; GST_BUFFER_SIZE (buf) = bytes_read; } while (!gst_ks_video_src_timestamp_buffer (self, buf, presentation_time)); if (G_UNLIKELY (priv->do_stats)) gst_ks_video_src_update_statistics (self); gst_ks_video_device_postprocess_frame (priv->device, GST_BUFFER_DATA (buf), GST_BUFFER_SIZE (buf)); *buffer = buf; return GST_FLOW_OK; /* ERRORS */ error_no_caps: { GST_ELEMENT_ERROR (self, CORE, NEGOTIATION, ("not negotiated"), ("maybe setcaps failed?")); return GST_FLOW_ERROR; } error_start_capture: { GST_ELEMENT_ERROR (self, RESOURCE, OPEN_READ, ("could not start capture"), ("failed to change pin state to KSSTATE_RUN")); return GST_FLOW_ERROR; } error_alloc_buffer: { GST_ELEMENT_ERROR (self, CORE, PAD, ("alloc_buffer failed"), (NULL)); return result; } error_read_frame: { if (result != GST_FLOW_WRONG_STATE && result != GST_FLOW_UNEXPECTED) { GST_ELEMENT_ERROR (self, RESOURCE, READ, ("read failed: %s [0x%08x]", error_str, error_code), ("gst_ks_video_device_read_frame failed")); } g_free (error_str); gst_buffer_unref (buf); return result; } } static gboolean plugin_init (GstPlugin * plugin) { return gst_element_register (plugin, "ksvideosrc", GST_RANK_NONE, GST_TYPE_KS_VIDEO_SRC); } 
GST_PLUGIN_DEFINE (GST_VERSION_MAJOR, GST_VERSION_MINOR,
    "winks",
    "Windows kernel streaming plugin",
    plugin_init, VERSION, "LGPL", "GStreamer", "http://gstreamer.net/")

psimedia-master/gstprovider/gstelements/winks/gstksvideosrc.h

/*
 * Copyright (C) 2008 Ole André Vadla Ravnås
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 */

#ifndef __GST_KS_VIDEO_SRC_H__
#define __GST_KS_VIDEO_SRC_H__

#include

G_BEGIN_DECLS

#define GST_TYPE_KS_VIDEO_SRC \
  (gst_ks_video_src_get_type ())
#define GST_KS_VIDEO_SRC(obj) \
  (G_TYPE_CHECK_INSTANCE_CAST ((obj), GST_TYPE_KS_VIDEO_SRC, GstKsVideoSrc))
#define GST_KS_VIDEO_SRC_CLASS(klass) \
  (G_TYPE_CHECK_CLASS_CAST ((klass), GST_TYPE_KS_VIDEO_SRC, GstKsVideoSrcClass))
#define GST_IS_KS_VIDEO_SRC(obj) \
  (G_TYPE_CHECK_INSTANCE_TYPE ((obj), GST_TYPE_KS_VIDEO_SRC))
#define GST_IS_KS_VIDEO_SRC_CLASS(klass) \
  (G_TYPE_CHECK_CLASS_TYPE ((klass), GST_TYPE_KS_VIDEO_SRC))

typedef struct _GstKsVideoSrc GstKsVideoSrc;
typedef struct _GstKsVideoSrcClass GstKsVideoSrcClass;

struct _GstKsVideoSrc
{
  GstPushSrc push_src;
};

struct _GstKsVideoSrcClass
{
  GstPushSrcClass parent_class;
};

GType gst_ks_video_src_get_type (void);

G_END_DECLS

#endif /* __GST_KS_VIDEO_SRC_H__ */

psimedia-master/gstprovider/gstelements/winks/kshelpers.c

/*
 * Copyright (C) 2008 Ole André Vadla Ravnås
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
*/ #ifdef UNICODE #undef UNICODE #endif #include "kshelpers.h" #include #include GST_DEBUG_CATEGORY_EXTERN (gst_ks_debug); #define GST_CAT_DEFAULT gst_ks_debug gboolean ks_is_valid_handle (HANDLE h) { return (h != INVALID_HANDLE_VALUE && h != NULL); } GList * ks_enumerate_devices (const GUID * category) { GList *result = NULL; HDEVINFO devinfo; gint i; devinfo = SetupDiGetClassDevsW (category, NULL, NULL, DIGCF_PRESENT | DIGCF_DEVICEINTERFACE); if (!ks_is_valid_handle (devinfo)) return NULL; /* no devices */ for (i = 0;; i++) { BOOL success; SP_DEVICE_INTERFACE_DATA if_data = { 0, }; SP_DEVICE_INTERFACE_DETAIL_DATA_W *if_detail_data; DWORD if_detail_data_size; SP_DEVINFO_DATA devinfo_data = { 0, }; DWORD req_size; if_data.cbSize = sizeof (SP_DEVICE_INTERFACE_DATA); success = SetupDiEnumDeviceInterfaces (devinfo, NULL, category, i, &if_data); if (!success) /* all devices enumerated? */ break; if_detail_data_size = (MAX_PATH - 1) * sizeof (gunichar2); if_detail_data = g_malloc0 (if_detail_data_size); if_detail_data->cbSize = sizeof (SP_DEVICE_INTERFACE_DETAIL_DATA_W); devinfo_data.cbSize = sizeof (SP_DEVINFO_DATA); success = SetupDiGetDeviceInterfaceDetailW (devinfo, &if_data, if_detail_data, if_detail_data_size, &req_size, &devinfo_data); if (success) { KsDeviceEntry *entry; WCHAR buf[512]; entry = g_new0 (KsDeviceEntry, 1); entry->index = i; entry->path = g_utf16_to_utf8 (if_detail_data->DevicePath, -1, NULL, NULL, NULL); if (SetupDiGetDeviceRegistryPropertyW (devinfo, &devinfo_data, SPDRP_FRIENDLYNAME, NULL, (BYTE *) buf, sizeof (buf), NULL)) { entry->name = g_utf16_to_utf8 (buf, -1, NULL, NULL, NULL); } if (entry->name == NULL) { if (SetupDiGetDeviceRegistryPropertyW (devinfo, &devinfo_data, SPDRP_DEVICEDESC, NULL, (BYTE *) buf, sizeof (buf), NULL)) { entry->name = g_utf16_to_utf8 (buf, -1, NULL, NULL, NULL); } } if (entry->name != NULL) result = g_list_prepend (result, entry); else ks_device_entry_free (entry); } g_free (if_detail_data); } 
SetupDiDestroyDeviceInfoList (devinfo); return g_list_reverse (result); } void ks_device_entry_free (KsDeviceEntry * entry) { if (entry == NULL) return; g_free (entry->path); g_free (entry->name); g_free (entry); } void ks_device_list_free (GList * devices) { GList *cur; for (cur = devices; cur != NULL; cur = cur->next) ks_device_entry_free (cur->data); g_list_free (devices); } static gboolean ks_sync_device_io_control (HANDLE device, gulong io_control_code, gpointer in_buffer, gulong in_buffer_size, gpointer out_buffer, gulong out_buffer_size, gulong * bytes_returned) { OVERLAPPED overlapped = { 0, }; BOOL success; overlapped.hEvent = CreateEvent (NULL, TRUE, FALSE, NULL); success = DeviceIoControl (device, io_control_code, in_buffer, in_buffer_size, out_buffer, out_buffer_size, bytes_returned, &overlapped); if (!success && GetLastError () == ERROR_IO_PENDING) success = GetOverlappedResult (device, &overlapped, bytes_returned, TRUE); CloseHandle (overlapped.hEvent); return success ? TRUE : FALSE; } gboolean ks_filter_get_pin_property (HANDLE filter_handle, gulong pin_id, GUID prop_set, gulong prop_id, gpointer value, gulong value_size) { KSP_PIN prop = { 0, }; DWORD bytes_returned = 0; prop.PinId = pin_id; prop.Property.Set = prop_set; prop.Property.Id = prop_id; prop.Property.Flags = KSPROPERTY_TYPE_GET; return ks_sync_device_io_control (filter_handle, IOCTL_KS_PROPERTY, &prop, sizeof (prop), value, value_size, &bytes_returned); } gboolean ks_filter_get_pin_property_multi (HANDLE filter_handle, gulong pin_id, GUID prop_set, gulong prop_id, KSMULTIPLE_ITEM ** items) { KSP_PIN prop = { 0, }; DWORD items_size = 0, bytes_written = 0; gboolean ret; *items = NULL; prop.PinId = pin_id; prop.Property.Set = prop_set; prop.Property.Id = prop_id; prop.Property.Flags = KSPROPERTY_TYPE_GET; ret = ks_sync_device_io_control (filter_handle, IOCTL_KS_PROPERTY, &prop.Property, sizeof (prop), NULL, 0, &items_size); if (!ret) { DWORD err = GetLastError (); if (err != 
        ERROR_INSUFFICIENT_BUFFER && err != ERROR_MORE_DATA)
      goto error;
  }

  *items = g_malloc0 (items_size);

  ret = ks_sync_device_io_control (filter_handle, IOCTL_KS_PROPERTY,
      &prop, sizeof (prop), *items, items_size, &bytes_written);
  if (!ret)
    goto error;

  return ret;

error:
  g_free (*items);
  *items = NULL;
  return FALSE;
}

gboolean
ks_object_query_property (HANDLE handle, GUID prop_set, gulong prop_id,
    gulong prop_flags, gpointer * value, gulong * value_size)
{
  KSPROPERTY prop = { 0, };
  DWORD req_value_size = 0, bytes_written = 0;
  gboolean ret;

  *value = NULL;

  prop.Set = prop_set;
  prop.Id = prop_id;
  prop.Flags = prop_flags;

  if (value_size == NULL || *value_size == 0) {
    ret = ks_sync_device_io_control (handle, IOCTL_KS_PROPERTY,
        &prop, sizeof (prop), NULL, 0, &req_value_size);
    if (!ret) {
      DWORD err = GetLastError ();
      if (err != ERROR_INSUFFICIENT_BUFFER && err != ERROR_MORE_DATA)
        goto error;
    }
  } else {
    req_value_size = *value_size;
  }

  *value = g_malloc0 (req_value_size);

  ret = ks_sync_device_io_control (handle, IOCTL_KS_PROPERTY,
      &prop, sizeof (prop), *value, req_value_size, &bytes_written);
  if (!ret)
    goto error;

  if (value_size != NULL)
    *value_size = bytes_written;

  return ret;

error:
  g_free (*value);
  *value = NULL;
  if (value_size != NULL)
    *value_size = 0;
  return FALSE;
}

gboolean
ks_object_get_property (HANDLE handle, GUID prop_set, gulong prop_id,
    gpointer * value, gulong * value_size)
{
  return ks_object_query_property (handle, prop_set, prop_id,
      KSPROPERTY_TYPE_GET, value, value_size);
}

gboolean
ks_object_set_property (HANDLE handle, GUID prop_set, gulong prop_id,
    gpointer value, gulong value_size)
{
  KSPROPERTY prop = { 0, };
  DWORD bytes_returned;

  prop.Set = prop_set;
  prop.Id = prop_id;
  prop.Flags = KSPROPERTY_TYPE_SET;

  return ks_sync_device_io_control (handle, IOCTL_KS_PROPERTY,
      &prop, sizeof (prop), value, value_size, &bytes_returned);
}

gboolean
ks_object_get_supported_property_sets (HANDLE handle, GUID ** propsets,
    gulong * len)
{
  gulong size = 0;

  *propsets = NULL;
  *len
      = 0;

  if (ks_object_query_property (handle, GUID_NULL, 0,
          KSPROPERTY_TYPE_SETSUPPORT, propsets, &size)) {
    if (size % sizeof (GUID) == 0) {
      *len = size / sizeof (GUID);
      return TRUE;
    }
  }

  g_free (*propsets);
  *propsets = NULL;
  *len = 0;
  return FALSE;
}

gboolean
ks_object_set_connection_state (HANDLE handle, KSSTATE state)
{
  return ks_object_set_property (handle, KSPROPSETID_Connection,
      KSPROPERTY_CONNECTION_STATE, &state, sizeof (state));
}

const gchar *
ks_state_to_string (KSSTATE state)
{
  switch (state) {
    case KSSTATE_STOP:
      return "KSSTATE_STOP";
    case KSSTATE_ACQUIRE:
      return "KSSTATE_ACQUIRE";
    case KSSTATE_PAUSE:
      return "KSSTATE_PAUSE";
    case KSSTATE_RUN:
      return "KSSTATE_RUN";
    default:
      g_assert_not_reached ();
  }

  return "UNKNOWN";
}

#define CHECK_OPTIONS_FLAG(flag) \
  if (flags & KSSTREAM_HEADER_OPTIONSF_##flag) \
  { \
    if (str->len > 0) \
      g_string_append (str, "|"); \
    g_string_append (str, G_STRINGIFY (flag)); \
    flags &= ~KSSTREAM_HEADER_OPTIONSF_##flag; \
  }

gchar *
ks_options_flags_to_string (gulong flags)
{
  gchar *ret;
  GString *str;

  str = g_string_sized_new (128);

  CHECK_OPTIONS_FLAG (SPLICEPOINT);
  CHECK_OPTIONS_FLAG (PREROLL);
  CHECK_OPTIONS_FLAG (DATADISCONTINUITY);
  CHECK_OPTIONS_FLAG (TYPECHANGED);
  CHECK_OPTIONS_FLAG (TIMEVALID);
  CHECK_OPTIONS_FLAG (TIMEDISCONTINUITY);
  CHECK_OPTIONS_FLAG (FLUSHONPAUSE);
  CHECK_OPTIONS_FLAG (DURATIONVALID);
  CHECK_OPTIONS_FLAG (ENDOFSTREAM);
  CHECK_OPTIONS_FLAG (BUFFEREDTRANSFER);
  CHECK_OPTIONS_FLAG (VRAM_DATA_TRANSFER);
  CHECK_OPTIONS_FLAG (LOOPEDDATA);

  if (flags != 0)
    g_string_append_printf (str, "|0x%08x", flags);

  ret = str->str;
  g_string_free (str, FALSE);

  return ret;
}

typedef struct
{
  const GUID guid;
  const gchar *name;
} KsPropertySetMapping;

#ifndef STATIC_KSPROPSETID_GM
#define STATIC_KSPROPSETID_GM \
  0xAF627536, 0xE719, 0x11D2, 0x8A, 0x1D, 0x00, 0x60, 0x97, 0xD2, 0xDF, 0x5D
#endif
#ifndef STATIC_KSPROPSETID_Jack
#define STATIC_KSPROPSETID_Jack \
  0x4509F757, 0x2D46, 0x4637, 0x8E, 0x62, 0xCE, 0x7D, 0xB9, 0x44, 0xF5, 0x7B
#endif
#ifndef STATIC_PROPSETID_VIDCAP_SELECTOR
#define STATIC_PROPSETID_VIDCAP_SELECTOR \
  0x1ABDAECA, 0x68B6, 0x4F83, 0x93, 0x71, 0xB4, 0x13, 0x90, 0x7C, 0x7B, 0x9F
#endif
#ifndef STATIC_PROPSETID_EXT_DEVICE
#define STATIC_PROPSETID_EXT_DEVICE \
  0xB5730A90, 0x1A2C, 0x11cf, 0x8c, 0x23, 0x00, 0xAA, 0x00, 0x6B, 0x68, 0x14
#endif
#ifndef STATIC_PROPSETID_EXT_TRANSPORT
#define STATIC_PROPSETID_EXT_TRANSPORT \
  0xA03CD5F0, 0x3045, 0x11cf, 0x8c, 0x44, 0x00, 0xAA, 0x00, 0x6B, 0x68, 0x14
#endif
#ifndef STATIC_PROPSETID_TIMECODE_READER
#define STATIC_PROPSETID_TIMECODE_READER \
  0x9B496CE1, 0x811B, 0x11cf, 0x8C, 0x77, 0x00, 0xAA, 0x00, 0x6B, 0x68, 0x14
#endif

static const KsPropertySetMapping known_property_sets[] = {
  {{STATIC_KSPROPSETID_General}, "General"},
  {{STATIC_KSPROPSETID_MediaSeeking}, "MediaSeeking"},
  {{STATIC_KSPROPSETID_Topology}, "Topology"},
  {{STATIC_KSPROPSETID_GM}, "GM"},
  {{STATIC_KSPROPSETID_Pin}, "Pin"},
  {{STATIC_KSPROPSETID_Quality}, "Quality"},
  {{STATIC_KSPROPSETID_Connection}, "Connection"},
  {{STATIC_KSPROPSETID_MemoryTransport}, "MemoryTransport"},
  {{STATIC_KSPROPSETID_StreamAllocator}, "StreamAllocator"},
  {{STATIC_KSPROPSETID_StreamInterface}, "StreamInterface"},
  {{STATIC_KSPROPSETID_Stream}, "Stream"},
  {{STATIC_KSPROPSETID_Clock}, "Clock"},
  {{STATIC_KSPROPSETID_DirectSound3DListener}, "DirectSound3DListener"},
  {{STATIC_KSPROPSETID_DirectSound3DBuffer}, "DirectSound3DBuffer"},
  {{STATIC_KSPROPSETID_Hrtf3d}, "Hrtf3d"},
  {{STATIC_KSPROPSETID_Itd3d}, "Itd3d"},
  {{STATIC_KSPROPSETID_Bibliographic}, "Bibliographic"},
  {{STATIC_KSPROPSETID_TopologyNode}, "TopologyNode"},
  {{STATIC_KSPROPSETID_RtAudio}, "RtAudio"},
  {{STATIC_KSPROPSETID_DrmAudioStream}, "DrmAudioStream"},
  {{STATIC_KSPROPSETID_Audio}, "Audio"},
  //{{STATIC_KSPROPSETID_Acoustic_Echo_Cancel}, "Acoustic_Echo_Cancel"},
  //{{STATIC_KSPROPSETID_Wave_Queued}, "Wave_Queued"},
  {{STATIC_KSPROPSETID_Wave}, "Wave"},
  //{{STATIC_KSPROPSETID_WaveTable}, "WaveTable"},
  {{STATIC_KSPROPSETID_Cyclic}, "Cyclic"},
  //{{STATIC_KSPROPSETID_Sysaudio}, "Sysaudio"},
  //{{STATIC_KSPROPSETID_Sysaudio_Pin}, "Sysaudio_Pin"},
  //{{STATIC_KSPROPSETID_AudioGfx}, "AudioGfx"},
  //{{STATIC_KSPROPSETID_Linear}, "Linear"},
  {{STATIC_KSPROPSETID_Mpeg2Vid}, "Mpeg2Vid"},
  {{STATIC_KSPROPSETID_AC3}, "AC3"},
  {{STATIC_KSPROPSETID_AudioDecoderOut}, "AudioDecoderOut"},
  {{STATIC_KSPROPSETID_DvdSubPic}, "DvdSubPic"},
  {{STATIC_KSPROPSETID_CopyProt}, "CopyProt"},
  {{STATIC_KSPROPSETID_VBICAP_PROPERTIES}, "VBICAP_PROPERTIES"},
  {{STATIC_KSPROPSETID_VBICodecFiltering}, "VBICodecFiltering"},
  {{STATIC_KSPROPSETID_VramCapture}, "VramCapture"},
  {{STATIC_KSPROPSETID_OverlayUpdate}, "OverlayUpdate"},
  {{STATIC_KSPROPSETID_VPConfig}, "VPConfig"},
  {{STATIC_KSPROPSETID_VPVBIConfig}, "VPVBIConfig"},
  {{STATIC_KSPROPSETID_TSRateChange}, "TSRateChange"},
  {{STATIC_KSPROPSETID_Jack}, "Jack"},
  {{STATIC_PROPSETID_ALLOCATOR_CONTROL}, "ALLOCATOR_CONTROL"},
  {{STATIC_PROPSETID_VIDCAP_VIDEOPROCAMP}, "VIDCAP_VIDEOPROCAMP"},
  {{STATIC_PROPSETID_VIDCAP_SELECTOR}, "VIDCAP_SELECTOR"},
  {{STATIC_PROPSETID_TUNER}, "TUNER"},
  {{STATIC_PROPSETID_VIDCAP_VIDEOENCODER}, "VIDCAP_VIDEOENCODER"},
  {{STATIC_PROPSETID_VIDCAP_VIDEODECODER}, "VIDCAP_VIDEODECODER"},
  {{STATIC_PROPSETID_VIDCAP_CAMERACONTROL}, "VIDCAP_CAMERACONTROL"},
  {{STATIC_PROPSETID_EXT_DEVICE}, "EXT_DEVICE"},
  {{STATIC_PROPSETID_EXT_TRANSPORT}, "EXT_TRANSPORT"},
  {{STATIC_PROPSETID_TIMECODE_READER}, "TIMECODE_READER"},
  {{STATIC_PROPSETID_VIDCAP_CROSSBAR}, "VIDCAP_CROSSBAR"},
  {{STATIC_PROPSETID_VIDCAP_TVAUDIO}, "VIDCAP_TVAUDIO"},
  {{STATIC_PROPSETID_VIDCAP_VIDEOCOMPRESSION}, "VIDCAP_VIDEOCOMPRESSION"},
  {{STATIC_PROPSETID_VIDCAP_VIDEOCONTROL}, "VIDCAP_VIDEOCONTROL"},
  {{STATIC_PROPSETID_VIDCAP_DROPPEDFRAMES}, "VIDCAP_DROPPEDFRAMES"},
};

gchar *
ks_property_set_to_string (const GUID * guid)
{
  guint i;

  for (i = 0;
      i < sizeof (known_property_sets) / sizeof (known_property_sets[0]);
      i++) {
    if (IsEqualGUID (guid, &known_property_sets[i].guid))
      return g_strdup_printf ("KSPROPSETID_%s",
          known_property_sets[i].name);
  }

  return g_strdup_printf
      ("{%08X-%04X-%04X-%02X%02X-%02X%02X%02X%02X%02X%02X}",
      guid->Data1, guid->Data2, guid->Data3,
      guid->Data4[0], guid->Data4[1], guid->Data4[2], guid->Data4[3],
      guid->Data4[4], guid->Data4[5], guid->Data4[6], guid->Data4[7]);
}

psimedia-master/gstprovider/gstelements/winks/kshelpers.h

/*
 * Copyright (C) 2008 Ole André Vadla Ravnås
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 */

#ifndef __KSHELPERS_H__
#define __KSHELPERS_H__

#include
#include
#include
#include

G_BEGIN_DECLS

typedef struct _KsDeviceEntry KsDeviceEntry;

struct _KsDeviceEntry
{
  guint index;
  gchar *name;
  gchar *path;
};

gboolean ks_is_valid_handle (HANDLE h);

GList * ks_enumerate_devices (const GUID * category);
void ks_device_entry_free (KsDeviceEntry * entry);
void ks_device_list_free (GList * devices);

gboolean ks_filter_get_pin_property (HANDLE filter_handle, gulong pin_id,
    GUID prop_set, gulong prop_id, gpointer value, gulong value_size);
gboolean ks_filter_get_pin_property_multi (HANDLE filter_handle,
    gulong pin_id, GUID prop_set, gulong prop_id, KSMULTIPLE_ITEM ** items);

gboolean ks_object_query_property (HANDLE handle, GUID prop_set,
    gulong prop_id, gulong prop_flags, gpointer * value, gulong * value_size);
gboolean ks_object_get_property (HANDLE handle, GUID prop_set,
    gulong prop_id, gpointer * value, gulong * value_size);
gboolean ks_object_set_property (HANDLE handle, GUID prop_set,
    gulong prop_id, gpointer value, gulong value_size);

gboolean ks_object_get_supported_property_sets (HANDLE handle,
    GUID ** propsets, gulong * len);

gboolean ks_object_set_connection_state (HANDLE handle, KSSTATE state);

const gchar * ks_state_to_string (KSSTATE state);
gchar * ks_options_flags_to_string (gulong flags);
gchar * ks_property_set_to_string (const GUID * guid);

G_END_DECLS

#endif /* __KSHELPERS_H__ */

psimedia-master/gstprovider/gstelements/winks/ksvideohelpers.c

/*
 * Copyright (C) 2007 Haakon Sporsheim
 *               2008 Ole André Vadla Ravnås
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 */

#ifdef UNICODE
#undef UNICODE
#endif

#include "ksvideohelpers.h"

#include
#include "kshelpers.h"

GST_DEBUG_CATEGORY_EXTERN (gst_ks_debug);
#define GST_CAT_DEFAULT gst_ks_debug

static const GUID MEDIASUBTYPE_FOURCC =
    { 0x0 /* FourCC here */ , 0x0000, 0x0010,
  {0x80, 0x00, 0x00, 0xAA, 0x00, 0x38, 0x9B, 0x71}
};

extern const GUID MEDIASUBTYPE_I420 =
    { 0x30323449, 0x0000, 0x0010,
  {0x80, 0x00, 0x00, 0xAA, 0x00, 0x38, 0x9B, 0x71}
};

static GstStructure *
ks_video_format_to_structure (GUID subtype_guid, GUID format_guid)
{
  GstStructure *structure = NULL;

  if (IsEqualGUID (&subtype_guid, &MEDIASUBTYPE_MJPG) ||
      IsEqualGUID (&subtype_guid, &MEDIASUBTYPE_TVMJ) ||  /* FIXME: NOT tested */
      IsEqualGUID (&subtype_guid, &MEDIASUBTYPE_WAKE) ||  /* FIXME: NOT tested */
      IsEqualGUID (&subtype_guid, &MEDIASUBTYPE_CFCC) ||  /* FIXME: NOT tested */
      IsEqualGUID (&subtype_guid, &MEDIASUBTYPE_IJPG)) {  /* FIXME: NOT tested */
    structure = gst_structure_new ("image/jpeg", NULL);
  } else if (IsEqualGUID (&subtype_guid, &MEDIASUBTYPE_RGB555) ||  /* FIXME: NOT tested */
      IsEqualGUID (&subtype_guid, &MEDIASUBTYPE_RGB565) ||  /* FIXME: NOT tested */
      IsEqualGUID (&subtype_guid, &MEDIASUBTYPE_RGB24) ||
      IsEqualGUID (&subtype_guid, &MEDIASUBTYPE_RGB32) ||  /* FIXME: NOT tested */
      IsEqualGUID (&subtype_guid, &MEDIASUBTYPE_ARGB1555) ||  /* FIXME: NOT tested */
      IsEqualGUID (&subtype_guid, &MEDIASUBTYPE_ARGB32) ||  /* FIXME: NOT tested */
      IsEqualGUID (&subtype_guid, &MEDIASUBTYPE_ARGB4444)) {  /* FIXME: NOT tested */
    guint depth = 0, bpp = 0;
    gint
        endianness = 0;
    guint32 r_mask = 0, b_mask = 0, g_mask = 0;

    if (IsEqualGUID (&subtype_guid, &MEDIASUBTYPE_RGB555)) {
      bpp = 16;
      depth = 15;
      endianness = G_BIG_ENDIAN;
      r_mask = 0x7c00;
      g_mask = 0x03e0;
      b_mask = 0x001f;
    } else if (IsEqualGUID (&subtype_guid, &MEDIASUBTYPE_RGB565)) {
      bpp = depth = 16;
      endianness = G_BIG_ENDIAN;
      r_mask = 0xf800;
      g_mask = 0x07e0;
      b_mask = 0x001f;
    } else if (IsEqualGUID (&subtype_guid, &MEDIASUBTYPE_RGB24)) {
      bpp = depth = 24;
      endianness = G_BIG_ENDIAN;
      r_mask = 0x0000ff;
      g_mask = 0x00ff00;
      b_mask = 0xff0000;
    } else if (IsEqualGUID (&subtype_guid, &MEDIASUBTYPE_RGB32)) {
      bpp = 32;
      depth = 24;
      endianness = G_BIG_ENDIAN;
      r_mask = 0x000000ff;
      g_mask = 0x0000ff00;
      b_mask = 0x00ff0000;
      /* FIXME: check
       * r_mask = 0xff000000;
       * g_mask = 0x00ff0000;
       * b_mask = 0x0000ff00;
       */
    } else if (IsEqualGUID (&subtype_guid, &MEDIASUBTYPE_ARGB1555)) {
      bpp = 16;
      depth = 15;
      endianness = G_BIG_ENDIAN;
      r_mask = 0x7c00;
      g_mask = 0x03e0;
      b_mask = 0x001f;
    } else if (IsEqualGUID (&subtype_guid, &MEDIASUBTYPE_ARGB32)) {
      bpp = depth = 32;
      endianness = G_BIG_ENDIAN;
      r_mask = 0x000000ff;
      g_mask = 0x0000ff00;
      b_mask = 0x00ff0000;
      /* FIXME: check
       * r_mask = 0xff000000;
       * g_mask = 0x00ff0000;
       * b_mask = 0x0000ff00;
       */
    } else if (IsEqualGUID (&subtype_guid, &MEDIASUBTYPE_ARGB4444)) {
      bpp = 16;
      depth = 12;
      endianness = G_BIG_ENDIAN;
      r_mask = 0x0f00;
      g_mask = 0x00f0;
      b_mask = 0x000f;
      //r_mask = 0x000f;
      //g_mask = 0x00f0;
      //b_mask = 0x0f00;
    } else {
      g_assert_not_reached ();
    }

    structure = gst_structure_new ("video/x-raw-rgb",
        "bpp", G_TYPE_INT, bpp,
        "depth", G_TYPE_INT, depth,
        "red_mask", G_TYPE_INT, r_mask,
        "green_mask", G_TYPE_INT, g_mask,
        "blue_mask", G_TYPE_INT, b_mask,
        "endianness", G_TYPE_INT, endianness, NULL);
  } else if (IsEqualGUID (&subtype_guid, &MEDIASUBTYPE_dvsd)) {
    if (IsEqualGUID (&format_guid, &FORMAT_DvInfo)) {
      structure = gst_structure_new ("video/x-dv",
          "systemstream", G_TYPE_BOOLEAN, TRUE, NULL);
    } else if (IsEqualGUID (&format_guid, &FORMAT_VideoInfo)) {
      structure = gst_structure_new ("video/x-dv",
          "systemstream", G_TYPE_BOOLEAN, FALSE,
          "format", GST_TYPE_FOURCC, GST_MAKE_FOURCC ('d', 'v', 's', 'd'),
          NULL);
    }
  } else if (memcmp (&subtype_guid.Data2, &MEDIASUBTYPE_FOURCC.Data2,
          sizeof (subtype_guid) - sizeof (subtype_guid.Data1)) == 0) {
    guint8 *p = (guint8 *) & subtype_guid.Data1;

    structure = gst_structure_new ("video/x-raw-yuv",
        "format", GST_TYPE_FOURCC,
        GST_MAKE_FOURCC (p[0], p[1], p[2], p[3]), NULL);
  }

  if (!structure) {
    GST_DEBUG ("Unknown DirectShow Video GUID %08x-%04x-%04x-%04x-%08x%04x",
        subtype_guid.Data1, subtype_guid.Data2, subtype_guid.Data3,
        *(WORD *) subtype_guid.Data4, *(DWORD *) & subtype_guid.Data4[2],
        *(WORD *) & subtype_guid.Data4[6]);
  }

  return structure;
}

static gboolean
ks_video_append_video_stream_cfg_fields (GstStructure * structure,
    const KS_VIDEO_STREAM_CONFIG_CAPS * vscc)
{
  g_return_val_if_fail (structure, FALSE);
  g_return_val_if_fail (vscc, FALSE);

  /* width */
  if (vscc->MinOutputSize.cx == vscc->MaxOutputSize.cx) {
    gst_structure_set (structure,
        "width", G_TYPE_INT, vscc->MaxOutputSize.cx, NULL);
  } else {
    gst_structure_set (structure,
        "width", GST_TYPE_INT_RANGE,
        vscc->MinOutputSize.cx, vscc->MaxOutputSize.cx, NULL);
  }

  /* height */
  if (vscc->MinOutputSize.cy == vscc->MaxOutputSize.cy) {
    gst_structure_set (structure,
        "height", G_TYPE_INT, vscc->MaxOutputSize.cy, NULL);
  } else {
    gst_structure_set (structure,
        "height", GST_TYPE_INT_RANGE,
        vscc->MinOutputSize.cy, vscc->MaxOutputSize.cy, NULL);
  }

  /* framerate */
  if (vscc->MinFrameInterval == vscc->MaxFrameInterval) {
    gst_structure_set (structure,
        "framerate", GST_TYPE_FRACTION,
        (gint) (10000000 / vscc->MaxFrameInterval), 1, NULL);
  } else {
    gst_structure_set (structure,
        "framerate", GST_TYPE_FRACTION_RANGE,
        (gint) (10000000 / vscc->MaxFrameInterval), 1,
        (gint) (10000000 / vscc->MinFrameInterval), 1, NULL);
  }

  return TRUE;
}

KsVideoMediaType *
ks_video_media_type_dup (KsVideoMediaType * media_type)
{
  KsVideoMediaType *result = g_new
      (KsVideoMediaType, 1);

  memcpy (result, media_type, sizeof (KsVideoMediaType));

  result->range = g_malloc (media_type->range->FormatSize);
  memcpy ((gpointer) result->range, media_type->range,
      media_type->range->FormatSize);

  result->format = g_malloc (media_type->format_size);
  memcpy (result->format, media_type->format, media_type->format_size);

  result->translated_caps = gst_caps_ref (media_type->translated_caps);

  return result;
}

void
ks_video_media_type_free (KsVideoMediaType * media_type)
{
  if (media_type == NULL)
    return;

  g_free ((gpointer) media_type->range);
  g_free (media_type->format);

  if (media_type->translated_caps != NULL)
    gst_caps_unref (media_type->translated_caps);

  g_free (media_type);
}

static GList *
ks_video_media_type_list_remove_duplicates (GList * media_types)
{
  GList *master, *duplicates;

  do {
    GList *entry;

    master = duplicates = NULL;

    /* Find the first set of duplicates and their master */
    for (entry = media_types; entry != NULL && duplicates == NULL;
        entry = entry->next) {
      KsVideoMediaType *mt = entry->data;
      GList *other_entry;

      for (other_entry = media_types; other_entry != NULL;
          other_entry = other_entry->next) {
        KsVideoMediaType *other_mt = other_entry->data;

        if (other_mt == mt)
          continue;

        if (gst_caps_is_equal (mt->translated_caps, other_mt->translated_caps))
          duplicates = g_list_prepend (duplicates, other_mt);
      }

      if (duplicates != NULL)
        master = entry;
    }

    if (duplicates != NULL) {
      KsVideoMediaType *selected_mt = master->data;

      /*
       * Pick a FORMAT_VideoInfo2 if present, if not we just stay with the
       * first entry
       */
      for (entry = duplicates; entry != NULL; entry = entry->next) {
        KsVideoMediaType *mt = entry->data;

        if (IsEqualGUID (&mt->range->Specifier, &FORMAT_VideoInfo2)) {
          ks_video_media_type_free (selected_mt);
          selected_mt = mt;
        } else {
          ks_video_media_type_free (mt);
        }

        /* Remove the dupe from the main list */
        media_types = g_list_remove (media_types, mt);
      }

      /* Update master node with the selected MediaType */
      master->data = selected_mt;

      g_list_free
          (duplicates);
    }
  } while (master != NULL);

  return media_types;
}

GList *
ks_video_probe_filter_for_caps (HANDLE filter_handle)
{
  GList *ret = NULL;
  gulong pin_count;
  guint pin_id;

  if (!ks_filter_get_pin_property (filter_handle, 0, KSPROPSETID_Pin,
          KSPROPERTY_PIN_CTYPES, &pin_count, sizeof (pin_count)))
    goto beach;

  GST_DEBUG ("pin_count = %d", pin_count);

  for (pin_id = 0; pin_id < pin_count; pin_id++) {
    KSPIN_COMMUNICATION pin_comm;
    KSPIN_DATAFLOW pin_flow;
    GUID pin_cat;

    if (!ks_filter_get_pin_property (filter_handle, pin_id, KSPROPSETID_Pin,
            KSPROPERTY_PIN_COMMUNICATION, &pin_comm, sizeof (pin_comm)))
      continue;

    if (!ks_filter_get_pin_property (filter_handle, pin_id, KSPROPSETID_Pin,
            KSPROPERTY_PIN_DATAFLOW, &pin_flow, sizeof (pin_flow)))
      continue;

    if (!ks_filter_get_pin_property (filter_handle, pin_id, KSPROPSETID_Pin,
            KSPROPERTY_PIN_CATEGORY, &pin_cat, sizeof (pin_cat)))
      continue;

    GST_DEBUG ("pin[%d]: pin_comm=%d, pin_flow=%d", pin_id, pin_comm,
        pin_flow);

    if (pin_flow == KSPIN_DATAFLOW_OUT &&
        memcmp (&pin_cat, &PINNAME_CAPTURE, sizeof (GUID)) == 0) {
      KSMULTIPLE_ITEM *items;

      if (ks_filter_get_pin_property_multi (filter_handle, pin_id,
              KSPROPSETID_Pin, KSPROPERTY_PIN_DATARANGES, &items)) {
        KSDATARANGE *range = (KSDATARANGE *) (items + 1);
        guint i;

        for (i = 0; i < items->Count; i++) {
          if (IsEqualGUID (&range->MajorFormat, &KSDATAFORMAT_TYPE_VIDEO)) {
            KsVideoMediaType *entry;
            gpointer src_vscc, src_format;
            GstStructure *media_structure;

            entry = g_new0 (KsVideoMediaType, 1);
            entry->pin_id = pin_id;

            entry->range = g_malloc (range->FormatSize);
            memcpy ((gpointer) entry->range, range, range->FormatSize);

            if (IsEqualGUID (&range->Specifier, &FORMAT_VideoInfo)) {
              KS_DATARANGE_VIDEO *vr = (KS_DATARANGE_VIDEO *) entry->range;

              src_vscc = &vr->ConfigCaps;
              src_format = &vr->VideoInfoHeader;

              entry->format_size = sizeof (vr->VideoInfoHeader);
              entry->sample_size = vr->VideoInfoHeader.bmiHeader.biSizeImage;
            } else if (IsEqualGUID (&range->Specifier, &FORMAT_VideoInfo2)) {
              KS_DATARANGE_VIDEO2 *vr = (KS_DATARANGE_VIDEO2 *) entry->range;

              src_vscc = &vr->ConfigCaps;
              src_format = &vr->VideoInfoHeader;

              entry->format_size = sizeof (vr->VideoInfoHeader);
              entry->sample_size = vr->VideoInfoHeader.bmiHeader.biSizeImage;
            } else if (IsEqualGUID (&range->Specifier, &FORMAT_MPEGVideo)) {
              /* Untested and probably wrong... */
              KS_DATARANGE_MPEG1_VIDEO *vr =
                  (KS_DATARANGE_MPEG1_VIDEO *) entry->range;

              src_vscc = &vr->ConfigCaps;
              src_format = &vr->VideoInfoHeader;

              entry->format_size = sizeof (vr->VideoInfoHeader);
              entry->sample_size =
                  vr->VideoInfoHeader.hdr.bmiHeader.biSizeImage;
            } else if (IsEqualGUID (&range->Specifier, &FORMAT_MPEG2Video)) {
              /* Untested and probably wrong... */
              KS_DATARANGE_MPEG2_VIDEO *vr =
                  (KS_DATARANGE_MPEG2_VIDEO *) entry->range;

              src_vscc = &vr->ConfigCaps;
              src_format = &vr->VideoInfoHeader;

              entry->format_size = sizeof (vr->VideoInfoHeader);
              entry->sample_size =
                  vr->VideoInfoHeader.hdr.bmiHeader.biSizeImage;
            } else
              g_assert_not_reached ();

            g_assert (entry->sample_size != 0);

            memcpy ((gpointer) & entry->vscc, src_vscc, sizeof (entry->vscc));

            entry->format = g_malloc (entry->format_size);
            memcpy (entry->format, src_format, entry->format_size);

            media_structure =
                ks_video_format_to_structure (range->SubFormat,
                range->MajorFormat);
            if (media_structure == NULL) {
              g_warning ("ks_video_format_to_structure returned NULL");
              ks_video_media_type_free (entry);
              entry = NULL;
            } else if (ks_video_append_video_stream_cfg_fields
                (media_structure, &entry->vscc)) {
              entry->translated_caps = gst_caps_new_empty ();
              gst_caps_append_structure (entry->translated_caps,
                  media_structure);
            } else {
              gst_structure_free (media_structure);
              ks_video_media_type_free (entry);
              entry = NULL;
            }

            if (entry != NULL)
              ret = g_list_prepend (ret, entry);
          }

          /* REVISIT: Each KSDATARANGE should start on a 64-bit boundary */
          range = (KSDATARANGE *) (((guchar *) range) + range->FormatSize);
        }

        g_free (items);
      }
    }
  }

  if (ret != NULL) {
    ret = g_list_reverse (ret);
    ret =
        ks_video_media_type_list_remove_duplicates (ret);
  }

beach:
  return ret;
}

KSPIN_CONNECT *
ks_video_create_pin_conn_from_media_type (KsVideoMediaType * media_type)
{
  KSPIN_CONNECT *conn = NULL;
  KSDATAFORMAT *format = NULL;
  guint8 *vih;

  conn = g_malloc0 (sizeof (KSPIN_CONNECT) + sizeof (KSDATAFORMAT) +
      media_type->format_size);

  conn->Interface.Set = KSINTERFACESETID_Standard;
  conn->Interface.Id = KSINTERFACE_STANDARD_STREAMING;
  conn->Interface.Flags = 0;

  conn->Medium.Set = KSMEDIUMSETID_Standard;
  conn->Medium.Id = KSMEDIUM_TYPE_ANYINSTANCE;
  conn->Medium.Flags = 0;

  conn->PinId = media_type->pin_id;
  conn->PinToHandle = NULL;
  conn->Priority.PriorityClass = KSPRIORITY_NORMAL;
  conn->Priority.PrioritySubClass = 1;

  format = (KSDATAFORMAT *) (conn + 1);
  memcpy (format, media_type->range, sizeof (KSDATAFORMAT));
  format->FormatSize = sizeof (KSDATAFORMAT) + media_type->format_size;

  vih = (guint8 *) (format + 1);
  memcpy (vih, media_type->format, media_type->format_size);

  return conn;
}

gboolean
ks_video_fixate_media_type (const KSDATARANGE * range, guint8 * format,
    gint width, gint height, gint fps_n, gint fps_d)
{
  DWORD dwRate = (width * height * fps_n) / fps_d;

  g_return_val_if_fail (format != NULL, FALSE);

  if (IsEqualGUID (&range->Specifier, &FORMAT_VideoInfo)) {
    KS_VIDEOINFOHEADER *vih = (KS_VIDEOINFOHEADER *) format;

    vih->AvgTimePerFrame = gst_util_uint64_scale_int (10000000, fps_d, fps_n);
    vih->dwBitRate = dwRate * vih->bmiHeader.biBitCount;

    g_assert (vih->bmiHeader.biWidth == width);
    g_assert (vih->bmiHeader.biHeight == height);
  } else if (IsEqualGUID (&range->Specifier, &FORMAT_VideoInfo2)) {
    KS_VIDEOINFOHEADER2 *vih = (KS_VIDEOINFOHEADER2 *) format;

    vih->AvgTimePerFrame = gst_util_uint64_scale_int (10000000, fps_d, fps_n);
    vih->dwBitRate = dwRate * vih->bmiHeader.biBitCount;

    g_assert (vih->bmiHeader.biWidth == width);
    g_assert (vih->bmiHeader.biHeight == height);
  } else if (IsEqualGUID (&range->Specifier, &FORMAT_MPEGVideo)) {
    KS_MPEG1VIDEOINFO *vih = (KS_MPEG1VIDEOINFO
        *) format;

    vih->hdr.AvgTimePerFrame =
        gst_util_uint64_scale_int (10000000, fps_d, fps_n);
    vih->hdr.dwBitRate = dwRate * vih->hdr.bmiHeader.biBitCount;

    /* FIXME: set height and width? */
    g_assert (vih->hdr.bmiHeader.biWidth == width);
    g_assert (vih->hdr.bmiHeader.biHeight == height);
  } else if (IsEqualGUID (&range->Specifier, &FORMAT_MPEG2Video)) {
    KS_MPEGVIDEOINFO2 *vih = (KS_MPEGVIDEOINFO2 *) format;

    vih->hdr.AvgTimePerFrame =
        gst_util_uint64_scale_int (10000000, fps_d, fps_n);
    vih->hdr.dwBitRate = dwRate * vih->hdr.bmiHeader.biBitCount;

    /* FIXME: set height and width? */
    g_assert (vih->hdr.bmiHeader.biWidth == width);
    g_assert (vih->hdr.bmiHeader.biHeight == height);
  } else {
    return FALSE;
  }

  return TRUE;
}

static GstStructure *
ks_video_append_var_video_fields (GstStructure * structure)
{
  if (structure) {
    gst_structure_set (structure,
        "width", GST_TYPE_INT_RANGE, 1, G_MAXINT,
        "height", GST_TYPE_INT_RANGE, 1, G_MAXINT,
        "framerate", GST_TYPE_FRACTION_RANGE, 0, 1, G_MAXINT, 1, NULL);
  }

  return structure;
}

GstCaps *
ks_video_get_all_caps (void)
{
  static GstCaps *caps = NULL;

  if (caps == NULL) {
    GstStructure *structure;

    caps = gst_caps_new_empty ();

    /* from Windows SDK 6.0 uuids.h */
    /* RGB formats */
    structure =
        ks_video_append_var_video_fields (ks_video_format_to_structure
        (MEDIASUBTYPE_RGB555, FORMAT_VideoInfo));
    gst_caps_append_structure (caps, structure);

    structure =
        ks_video_append_var_video_fields (ks_video_format_to_structure
        (MEDIASUBTYPE_RGB565, FORMAT_VideoInfo));
    gst_caps_append_structure (caps, structure);

    structure =
        ks_video_append_var_video_fields (ks_video_format_to_structure
        (MEDIASUBTYPE_RGB24, FORMAT_VideoInfo));
    gst_caps_append_structure (caps, structure);

    structure =
        ks_video_append_var_video_fields (ks_video_format_to_structure
        (MEDIASUBTYPE_RGB32, FORMAT_VideoInfo));
    gst_caps_append_structure (caps, structure);

    /* YUV formats */
    structure =
        ks_video_append_var_video_fields (gst_structure_new
        ("video/x-raw-yuv", NULL));
    gst_caps_append_structure
        (caps, structure);

    /* Other formats */
    structure =
        ks_video_append_var_video_fields (ks_video_format_to_structure
        (MEDIASUBTYPE_MJPG, FORMAT_VideoInfo));
    gst_caps_append_structure (caps, structure);

    structure =
        ks_video_append_var_video_fields (ks_video_format_to_structure
        (MEDIASUBTYPE_dvsd, FORMAT_VideoInfo));
    gst_caps_append_structure (caps, structure);

    structure =                 /* no variable video fields (width, height, framerate) */
        ks_video_format_to_structure (MEDIASUBTYPE_dvsd, FORMAT_DvInfo);
    gst_caps_append_structure (caps, structure);
  }

  return caps;
}

psimedia-master/gstprovider/gstelements/winks/ksvideohelpers.h

/*
 * Copyright (C) 2007 Haakon Sporsheim
 *               2008 Ole André Vadla Ravnås
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Library General Public
 * License as published by the Free Software Foundation; either
 * version 2 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
 * Boston, MA 02110-1301, USA.
 */

#ifndef __KSVIDEOHELPERS_H__
#define __KSVIDEOHELPERS_H__

#include
#include
#include
#include

G_BEGIN_DECLS

DEFINE_GUID (MEDIASUBTYPE_I420, 0x30323449, 0x0000, 0x0010,
    0x80, 0x00, 0x00, 0xAA, 0x00, 0x38, 0x9B, 0x71);

typedef struct _KsVideoMediaType KsVideoMediaType;

/**
 * A structure that contains metadata about capabilities
 * for both KS and GStreamer for video only.
 */
struct _KsVideoMediaType
{
  guint pin_id;

  const KSDATARANGE *range;
  const KS_VIDEO_STREAM_CONFIG_CAPS vscc;

  guint8 *format;
  guint format_size;
  guint sample_size;

  GstCaps *translated_caps;
};

KsVideoMediaType * ks_video_media_type_dup (KsVideoMediaType * media_type);
void ks_video_media_type_free (KsVideoMediaType * media_type);

GList * ks_video_probe_filter_for_caps (HANDLE filter_handle);

KSPIN_CONNECT * ks_video_create_pin_conn_from_media_type (
    KsVideoMediaType * media_type);

gboolean ks_video_fixate_media_type (const KSDATARANGE * range,
    guint8 * format, gint width, gint height, gint fps_n, gint fps_d);

GstCaps * ks_video_get_all_caps (void);

G_END_DECLS

#endif /* __KSVIDEOHELPERS_H__ */

psimedia-master/gstprovider/gstprovider.cpp

/*
 * Copyright (C) 2008 Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301 USA
 *
 */

#include "psimediaprovider.h"

#include
#include
#include
#include
#include
#include
#include

#include "devices.h"
#include "modes.h"
#include "gstthread.h"
#include "rwcontrol.h"

#ifdef QT_GUI_LIB
#include
#include
#endif

namespace PsiMedia {

static PDevice gstDeviceToPDevice(const GstDevice &dev, PDevice::Type type)
{
    PDevice out;
    out.type = type;
    out.name = dev.name;
    out.id = dev.id;
    return out;
}

//----------------------------------------------------------------------------
// GstVideoWidget
//----------------------------------------------------------------------------
class GstVideoWidget : public QObject
{
    Q_OBJECT

public:
    VideoWidgetContext *context;
    QImage curImage;

    GstVideoWidget(VideoWidgetContext *_context, QObject *parent = 0) :
        QObject(parent),
        context(_context)
    {
        QPalette palette;
        palette.setColor(context->qwidget()->backgroundRole(), Qt::black);
        context->qwidget()->setPalette(palette);
        context->qwidget()->setAutoFillBackground(true);

        connect(context->qobject(), SIGNAL(resized(const QSize &)),
            SLOT(context_resized(const QSize &)));
        connect(context->qobject(), SIGNAL(paintEvent(QPainter *)),
            SLOT(context_paintEvent(QPainter *)));
    }

    void show_frame(const QImage &image)
    {
        curImage = image;
        context->qwidget()->update();
    }

private slots:
    void context_resized(const QSize &newSize)
    {
        Q_UNUSED(newSize);
    }

    void context_paintEvent(QPainter *p)
    {
        if(curImage.isNull())
            return;

        QSize size = context->qwidget()->size();
        QSize newSize = curImage.size();
        newSize.scale(size, Qt::KeepAspectRatio);
        int xoff = 0;
        int yoff = 0;
        if(newSize.width() < size.width())
            xoff = (size.width() - newSize.width()) / 2;
        else if(newSize.height() < size.height())
            yoff = (size.height() - newSize.height()) / 2;

        // ideally, the backend will follow desired_size() and give
// us images that generally don't need resizing QImage i; if(curImage.size() != newSize) { // the IgnoreAspectRatio is okay here, since we // used KeepAspectRatio earlier i = curImage.scaled(newSize, Qt::IgnoreAspectRatio, Qt::SmoothTransformation); } else i = curImage; p->drawImage(xoff, yoff, i); } }; //---------------------------------------------------------------------------- // GstFeaturesContext //---------------------------------------------------------------------------- static QList<PDevice> get_audioOutputDevices() { QList<PDevice> list; foreach(const GstDevice &i, devices_list(PDevice::AudioOut)) list += gstDeviceToPDevice(i, PDevice::AudioOut); return list; } static QList<PDevice> get_audioInputDevices() { QList<PDevice> list; foreach(const GstDevice &i, devices_list(PDevice::AudioIn)) list += gstDeviceToPDevice(i, PDevice::AudioIn); return list; } static QList<PDevice> get_videoInputDevices() { QList<PDevice> list; foreach(const GstDevice &i, devices_list(PDevice::VideoIn)) list += gstDeviceToPDevice(i, PDevice::VideoIn); return list; } class FeaturesThread : public QThread { Q_OBJECT public: int types; PFeatures results; FeaturesThread(QObject *parent = 0) : QThread(parent) { } virtual void run() { PFeatures out; if(types & FeaturesContext::AudioOut) out.audioOutputDevices = get_audioOutputDevices(); if(types & FeaturesContext::AudioIn) out.audioInputDevices = get_audioInputDevices(); if(types & FeaturesContext::VideoIn) out.videoInputDevices = get_videoInputDevices(); if(types & FeaturesContext::AudioModes) out.supportedAudioModes = modes_supportedAudio(); if(types & FeaturesContext::VideoModes) out.supportedVideoModes = modes_supportedVideo(); results = out; } }; class GstFeaturesContext : public QObject, public FeaturesContext { Q_OBJECT Q_INTERFACES(PsiMedia::FeaturesContext) public: GstThread *gstThread; FeaturesThread *thread; GstFeaturesContext(GstThread *_gstThread, QObject *parent = 0) : QObject(parent), gstThread(_gstThread) { thread = new FeaturesThread(this); connect(thread,
SIGNAL(finished()), SIGNAL(finished())); } ~GstFeaturesContext() { thread->wait(); delete thread; } virtual QObject *qobject() { return this; } virtual void lookup(int types) { if(types > 0) { thread->types = types; thread->start(); } } virtual bool waitForFinished(int msecs) { return thread->wait(msecs < 0 ? ULONG_MAX : msecs); } virtual PFeatures results() const { return thread->results; } signals: void finished(); }; //---------------------------------------------------------------------------- // GstRtpChannel //---------------------------------------------------------------------------- // for a live transmission we really shouldn't have excessive queuing (or // *any* queuing!), so we'll cap the queue sizes. if the system gets // overloaded and the thread scheduling skews such that our queues get // filled before they can be emptied, then we'll start dropping old // items making room for new ones. on a live transmission there's no // sense in keeping ancient data around. we just drop and move on. 
#define QUEUE_PACKET_MAX 25 // don't wake the main thread more often than this, for performance reasons #define WAKE_PACKET_MIN 40 class GstRtpSessionContext; class GstRtpChannel : public QObject, public RtpChannelContext { Q_OBJECT Q_INTERFACES(PsiMedia::RtpChannelContext) public: bool enabled; QMutex m; GstRtpSessionContext *session; QList<PRtpPacket> in; //QTime wake_time; bool wake_pending; QList<PRtpPacket> pending_in; int written_pending; GstRtpChannel() : QObject(), enabled(false), wake_pending(false), written_pending(0) { } virtual QObject *qobject() { return this; } virtual void setEnabled(bool b) { QMutexLocker locker(&m); enabled = b; } virtual int packetsAvailable() const { return in.count(); } virtual PRtpPacket read() { return in.takeFirst(); } virtual void write(const PRtpPacket &rtp) { // scope the lock so the early return can't leave the mutex held { QMutexLocker locker(&m); if(!enabled) return; } receiver_push_packet_for_write(rtp); ++written_pending; // only queue one call per eventloop pass if(written_pending == 1) QMetaObject::invokeMethod(this, "processOut", Qt::QueuedConnection); } // session calls this, which may be in another thread void push_packet_for_read(const PRtpPacket &rtp) { QMutexLocker locker(&m); if(!enabled) return; // if the queue is full, bump off the oldest to make room if(pending_in.count() >= QUEUE_PACKET_MAX) pending_in.removeFirst(); pending_in += rtp; // TODO: use WAKE_PACKET_MIN and wake_time ?
if(!wake_pending) { wake_pending = true; QMetaObject::invokeMethod(this, "processIn", Qt::QueuedConnection); } } signals: void readyRead(); void packetsWritten(int count); private slots: void processIn() { int oldcount = in.count(); m.lock(); wake_pending = false; in += pending_in; pending_in.clear(); m.unlock(); if(in.count() > oldcount) emit readyRead(); } void processOut() { int count = written_pending; written_pending = 0; emit packetsWritten(count); } private: void receiver_push_packet_for_write(const PRtpPacket &rtp); }; //---------------------------------------------------------------------------- // GstRecorder //---------------------------------------------------------------------------- class GstRecorder : public QObject { Q_OBJECT public: RwControlLocal *control; QIODevice *recordDevice, *nextRecordDevice; bool record_cancel; QMutex m; bool wake_pending; QList pending_in; GstRecorder(QObject *parent = 0) : QObject(parent), control(0), recordDevice(0), nextRecordDevice(0), record_cancel(false), wake_pending(false) { } void setDevice(QIODevice *dev) { Q_ASSERT(!recordDevice); Q_ASSERT(!nextRecordDevice); if(control) { recordDevice = dev; RwControlRecord record; record.enabled = true; control->setRecord(record); } else { // queue up the device for later nextRecordDevice = dev; } } void stop() { Q_ASSERT(recordDevice || nextRecordDevice); Q_ASSERT(!record_cancel); if(nextRecordDevice) { // if there was only a queued device, then there's // nothing to do but dequeue it nextRecordDevice = 0; } else { record_cancel = true; RwControlRecord record; record.enabled = false; control->setRecord(record); } } void startNext() { if(control && !recordDevice && nextRecordDevice) { recordDevice = nextRecordDevice; nextRecordDevice = 0; RwControlRecord record; record.enabled = true; control->setRecord(record); } } // session calls this, which may be in another thread void push_data_for_read(const QByteArray &buf) { QMutexLocker locker(&m); pending_in += buf; 
if(!wake_pending) { wake_pending = true; QMetaObject::invokeMethod(this, "processIn", Qt::QueuedConnection); } } signals: void stopped(); private slots: void processIn() { m.lock(); wake_pending = false; QList in = pending_in; pending_in.clear(); m.unlock(); QPointer self = this; while(!in.isEmpty()) { QByteArray buf = in.takeFirst(); if(!buf.isEmpty()) { recordDevice->write(buf); } else // EOF { recordDevice->close(); recordDevice = 0; bool wasCancelled = record_cancel; record_cancel = false; if(wasCancelled) { emit stopped(); if(!self) return; } } } } }; //---------------------------------------------------------------------------- // GstRtpSessionContext //---------------------------------------------------------------------------- class GstRtpSessionContext : public QObject, public RtpSessionContext { Q_OBJECT Q_INTERFACES(PsiMedia::RtpSessionContext) public: GstThread *gstThread; RwControlLocal *control; RwControlConfigDevices devices; RwControlConfigCodecs codecs; RwControlTransmit transmit; RwControlStatus lastStatus; bool isStarted; bool isStopping; bool pending_status; #ifdef QT_GUI_LIB GstVideoWidget *outputWidget, *previewWidget; #endif GstRecorder recorder; // keep these parentless, so they can switch threads GstRtpChannel audioRtp; GstRtpChannel videoRtp; QMutex write_mutex; bool allow_writes; GstRtpSessionContext(GstThread *_gstThread, QObject *parent = 0) : QObject(parent), gstThread(_gstThread), control(0), isStarted(false), isStopping(false), pending_status(false), recorder(this), allow_writes(false) { #ifdef QT_GUI_LIB outputWidget = 0; previewWidget = 0; #endif devices.audioOutVolume = 100; devices.audioInVolume = 100; codecs.useLocalAudioParams = true; codecs.useLocalVideoParams = true; audioRtp.session = this; videoRtp.session = this; connect(&recorder, SIGNAL(stopped()), SLOT(recorder_stopped())); } ~GstRtpSessionContext() { cleanup(); } virtual QObject *qobject() { return this; } void cleanup() { if(outputWidget) 
outputWidget->show_frame(QImage()); if(previewWidget) previewWidget->show_frame(QImage()); codecs = RwControlConfigCodecs(); isStarted = false; isStopping = false; pending_status = false; recorder.control = 0; write_mutex.lock(); allow_writes = false; delete control; control = 0; write_mutex.unlock(); } virtual void setAudioOutputDevice(const QString &deviceId) { devices.audioOutId = deviceId; if(control) control->updateDevices(devices); } virtual void setAudioInputDevice(const QString &deviceId) { devices.audioInId = deviceId; devices.fileNameIn.clear(); devices.fileDataIn.clear(); if(control) control->updateDevices(devices); } virtual void setVideoInputDevice(const QString &deviceId) { devices.videoInId = deviceId; devices.fileNameIn.clear(); devices.fileDataIn.clear(); if(control) control->updateDevices(devices); } virtual void setFileInput(const QString &fileName) { devices.fileNameIn = fileName; devices.audioInId.clear(); devices.videoInId.clear(); devices.fileDataIn.clear(); if(control) control->updateDevices(devices); } virtual void setFileDataInput(const QByteArray &fileData) { devices.fileDataIn = fileData; devices.audioInId.clear(); devices.videoInId.clear(); devices.fileNameIn.clear(); if(control) control->updateDevices(devices); } virtual void setFileLoopEnabled(bool enabled) { devices.loopFile = enabled; if(control) control->updateDevices(devices); } #ifdef QT_GUI_LIB virtual void setVideoOutputWidget(VideoWidgetContext *widget) { // no change? if(!outputWidget && !widget) return; if(outputWidget && outputWidget->context == widget) return; delete outputWidget; outputWidget = 0; if(widget) outputWidget = new GstVideoWidget(widget, this); devices.useVideoOut = widget ? true : false; if(control) control->updateDevices(devices); } virtual void setVideoPreviewWidget(VideoWidgetContext *widget) { // no change? 
if(!previewWidget && !widget) return; if(previewWidget && previewWidget->context == widget) return; delete previewWidget; previewWidget = 0; if(widget) previewWidget = new GstVideoWidget(widget, this); devices.useVideoPreview = widget ? true : false; if(control) control->updateDevices(devices); } #endif virtual void setRecorder(QIODevice *recordDevice) { // can't assign a new recording device after stopping Q_ASSERT(!isStopping); recorder.setDevice(recordDevice); } virtual void stopRecording() { recorder.stop(); } virtual void setLocalAudioPreferences(const QList<PAudioParams> &params) { codecs.useLocalAudioParams = true; codecs.localAudioParams = params; } virtual void setLocalVideoPreferences(const QList<PVideoParams> &params) { codecs.useLocalVideoParams = true; codecs.localVideoParams = params; } virtual void setMaximumSendingBitrate(int kbps) { codecs.maximumSendingBitrate = kbps; } virtual void setRemoteAudioPreferences(const QList<PPayloadInfo> &info) { codecs.useRemoteAudioPayloadInfo = true; codecs.remoteAudioPayloadInfo = info; } virtual void setRemoteVideoPreferences(const QList<PPayloadInfo> &info) { codecs.useRemoteVideoPayloadInfo = true; codecs.remoteVideoPayloadInfo = info; } virtual void start() { Q_ASSERT(!control && !isStarted); write_mutex.lock(); control = new RwControlLocal(gstThread, this); connect(control, SIGNAL(statusReady(const RwControlStatus &)), SLOT(control_statusReady(const RwControlStatus &))); connect(control, SIGNAL(previewFrame(const QImage &)), SLOT(control_previewFrame(const QImage &))); connect(control, SIGNAL(outputFrame(const QImage &)), SLOT(control_outputFrame(const QImage &))); connect(control, SIGNAL(audioOutputIntensityChanged(int)), SLOT(control_audioOutputIntensityChanged(int))); connect(control, SIGNAL(audioInputIntensityChanged(int)), SLOT(control_audioInputIntensityChanged(int))); control->app = this; control->cb_rtpAudioOut = cb_control_rtpAudioOut; control->cb_rtpVideoOut = cb_control_rtpVideoOut; control->cb_recordData = cb_control_recordData; allow_writes = true;
write_mutex.unlock(); recorder.control = control; lastStatus = RwControlStatus(); isStarted = false; pending_status = true; control->start(devices, codecs); } virtual void updatePreferences() { Q_ASSERT(control && !pending_status); pending_status = true; control->updateCodecs(codecs); } virtual void transmitAudio() { transmit.useAudio = true; control->setTransmit(transmit); } virtual void transmitVideo() { transmit.useVideo = true; control->setTransmit(transmit); } virtual void pauseAudio() { transmit.useAudio = false; control->setTransmit(transmit); } virtual void pauseVideo() { transmit.useVideo = false; control->setTransmit(transmit); } virtual void stop() { Q_ASSERT(control && !isStopping); // note: it's possible to stop even if pending_status is // already true. this is so we can stop a session that // is in the middle of starting. isStopping = true; pending_status = true; control->stop(); } virtual QList<PPayloadInfo> localAudioPayloadInfo() const { return lastStatus.localAudioPayloadInfo; } virtual QList<PPayloadInfo> localVideoPayloadInfo() const { return lastStatus.localVideoPayloadInfo; } virtual QList<PPayloadInfo> remoteAudioPayloadInfo() const { return lastStatus.remoteAudioPayloadInfo; } virtual QList<PPayloadInfo> remoteVideoPayloadInfo() const { return lastStatus.remoteVideoPayloadInfo; } virtual QList<PAudioParams> audioParams() const { return lastStatus.localAudioParams; } virtual QList<PVideoParams> videoParams() const { return lastStatus.localVideoParams; } virtual bool canTransmitAudio() const { return lastStatus.canTransmitAudio; } virtual bool canTransmitVideo() const { return lastStatus.canTransmitVideo; } virtual int outputVolume() const { return devices.audioOutVolume; } virtual void setOutputVolume(int level) { devices.audioOutVolume = level; if(control) control->updateDevices(devices); } virtual int inputVolume() const { return devices.audioInVolume; } virtual void setInputVolume(int level) { devices.audioInVolume = level; if(control) control->updateDevices(devices); } virtual Error errorCode() const { return
(Error)lastStatus.errorCode; } virtual RtpChannelContext *audioRtpChannel() { return &audioRtp; } virtual RtpChannelContext *videoRtpChannel() { return &videoRtp; } // channel calls this, which may be in another thread void push_packet_for_write(GstRtpChannel *from, const PRtpPacket &rtp) { QMutexLocker locker(&write_mutex); if(!allow_writes || !control) return; if(from == &audioRtp) control->rtpAudioIn(rtp); else if(from == &videoRtp) control->rtpVideoIn(rtp); } signals: void started(); void preferencesUpdated(); void audioOutputIntensityChanged(int intensity); void audioInputIntensityChanged(int intensity); void stoppedRecording(); void stopped(); void finished(); void error(); private slots: void control_statusReady(const RwControlStatus &status) { lastStatus = status; if(status.finished) { // finished status just means the file is done // sending. the session still remains active. emit finished(); } else if(status.error) { cleanup(); emit error(); } else if(pending_status) { if(status.stopped) { pending_status = false; cleanup(); emit stopped(); return; } // if we're currently stopping, ignore all other // pending status events except for stopped // (handled above) if(isStopping) return; pending_status = false; if(!isStarted) { isStarted = true; // if there was a pending record, start it recorder.startNext(); emit started(); } else emit preferencesUpdated(); } } void control_previewFrame(const QImage &img) { if(previewWidget) previewWidget->show_frame(img); } void control_outputFrame(const QImage &img) { if(outputWidget) outputWidget->show_frame(img); } void control_audioOutputIntensityChanged(int intensity) { emit audioOutputIntensityChanged(intensity); } void control_audioInputIntensityChanged(int intensity) { emit audioInputIntensityChanged(intensity); } void recorder_stopped() { emit stoppedRecording(); } private: static void cb_control_rtpAudioOut(const PRtpPacket &packet, void *app) { ((GstRtpSessionContext *)app)->control_rtpAudioOut(packet); } static 
void cb_control_rtpVideoOut(const PRtpPacket &packet, void *app) { ((GstRtpSessionContext *)app)->control_rtpVideoOut(packet); } static void cb_control_recordData(const QByteArray &packet, void *app) { ((GstRtpSessionContext *)app)->control_recordData(packet); } // note: this is executed from a different thread void control_rtpAudioOut(const PRtpPacket &packet) { audioRtp.push_packet_for_read(packet); } // note: this is executed from a different thread void control_rtpVideoOut(const PRtpPacket &packet) { videoRtp.push_packet_for_read(packet); } // note: this is executed from a different thread void control_recordData(const QByteArray &packet) { recorder.push_data_for_read(packet); } }; void GstRtpChannel::receiver_push_packet_for_write(const PRtpPacket &rtp) { if(session) session->push_packet_for_write(this, rtp); } //---------------------------------------------------------------------------- // GstProvider //---------------------------------------------------------------------------- class GstProvider : public QObject, public Provider { Q_OBJECT Q_INTERFACES(PsiMedia::Provider) public: GstThread *thread; GstProvider() : thread(0) { } virtual QObject *qobject() { return this; } virtual bool init(const QString &resourcePath) { thread = new GstThread(this); if(!thread->start(resourcePath)) { delete thread; thread = 0; return false; } return true; } ~GstProvider() { delete thread; } virtual QString creditName() { return "GStreamer"; } virtual QString creditText() { QString str = QString( "This application uses GStreamer %1, a comprehensive " "open-source and cross-platform multimedia framework. For " "more information, see http://www.gstreamer.net/\n\n" "If you enjoy this software, please give the GStreamer " "people a million dollars." 
).arg(thread->gstVersion()); return str; } virtual FeaturesContext *createFeatures() { return new GstFeaturesContext(thread); } virtual RtpSessionContext *createRtpSession() { return new GstRtpSessionContext(thread); } }; class GstPlugin : public QObject, public Plugin { Q_OBJECT Q_INTERFACES(PsiMedia::Plugin) public: virtual Provider *createProvider() { return new GstProvider; } }; } Q_EXPORT_PLUGIN2(gstprovider, PsiMedia::GstPlugin) #include "gstprovider.moc" psimedia-master/gstprovider/gstprovider.pri000066400000000000000000000011031220046403000215600ustar00rootroot00000000000000CONFIG += link_prl depend_prl LIBS += -L$$PWD/gstelements/static/lib -lgstelements_static include(gstconf.pri) include(deviceenum/deviceenum.pri) include(gstcustomelements/gstcustomelements.pri) HEADERS += \ $$PWD/devices.h \ $$PWD/modes.h \ $$PWD/payloadinfo.h \ $$PWD/pipeline.h \ $$PWD/bins.h \ $$PWD/rtpworker.h \ $$PWD/gstthread.h \ $$PWD/rwcontrol.h SOURCES += \ $$PWD/devices.cpp \ $$PWD/modes.cpp \ $$PWD/payloadinfo.cpp \ $$PWD/pipeline.cpp \ $$PWD/bins.cpp \ $$PWD/rtpworker.cpp \ $$PWD/gstthread.cpp \ $$PWD/rwcontrol.cpp \ $$PWD/gstprovider.cpp psimedia-master/gstprovider/gstprovider.pro000066400000000000000000000001261220046403000215720ustar00rootroot00000000000000TEMPLATE = lib CONFIG += plugin INCLUDEPATH += ../psimedia include(gstprovider.pri) psimedia-master/gstprovider/gstthread.cpp000066400000000000000000000212471220046403000212000ustar00rootroot00000000000000/* * Copyright (C) 2008 Barracuda Networks, Inc. * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Lesser General Public * License as published by the Free Software Foundation; either * version 2.1 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
See the GNU * Lesser General Public License for more details. * * You should have received a copy of the GNU Lesser General Public * License along with this library; if not, write to the Free Software * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA * 02110-1301 USA * */ #include "gstthread.h" #include #include #include #include #include #include #include #include #include #include #include "gstcustomelements/gstcustomelements.h" #include "gstelements/static/gstelements.h" namespace PsiMedia { //---------------------------------------------------------------------------- // GstSession //---------------------------------------------------------------------------- // converts Qt-ified commandline args back to C style class CArgs { public: int argc; char **argv; CArgs() { argc = 0; argv = 0; } ~CArgs() { if(count > 0) { for(int n = 0; n < count; ++n) delete [] data[n]; free(argv); free(data); } } void set(const QStringList &args) { count = args.count(); if(count == 0) { data = 0; argc = 0; argv = 0; } else { data = (char **)malloc(sizeof(char **) * count); argv = (char **)malloc(sizeof(char **) * count); for(int n = 0; n < count; ++n) { QByteArray cs = args[n].toLocal8Bit(); data[n] = (char *)qstrdup(cs.data()); argv[n] = data[n]; } argc = count; } } private: int count; char **data; }; static void loadPlugins(const QString &pluginPath, bool print = false) { if(print) printf("Loading plugins in [%s]\n", qPrintable(pluginPath)); QDir dir(pluginPath); QStringList entryList = dir.entryList(QDir::Files); foreach(QString entry, entryList) { if(!QLibrary::isLibrary(entry)) continue; QString filePath = dir.filePath(entry); GError *err = 0; GstPlugin *plugin = gst_plugin_load_file( filePath.toUtf8().data(), &err); if(!plugin) { if(print) { printf("**FAIL**: %s: %s\n", qPrintable(entry), err->message); } g_error_free(err); continue; } if(print) { printf(" OK : %s name=[%s]\n", qPrintable(entry), gst_plugin_get_name(plugin)); } gst_object_unref(plugin); } 
if(print) printf("\n"); } static int compare_gst_version(int a1, int a2, int a3, int b1, int b2, int b3) { if(a1 > b1) return 1; else if(a1 < b1) return -1; if(a2 > b2) return 1; else if(a2 < b2) return -1; if(a3 > b3) return 1; else if(a3 < b3) return -1; return 0; } class GstSession { public: CArgs args; QString version; bool success; GstSession(const QString &pluginPath = QString()) { args.set(QCoreApplication::instance()->arguments()); // make sure glib threads are available if(!g_thread_supported()) g_thread_init(NULL); // ignore "system" plugins if(!pluginPath.isEmpty()) { qputenv("GST_PLUGIN_SYSTEM_PATH", ""); qputenv("GST_PLUGIN_PATH", ""); } // you can also use NULLs here if you don't want to pass args GError *error = NULL; if(!gst_init_check(&args.argc, &args.argv, &error)) { success = false; return; } guint major, minor, micro, nano; gst_version(&major, &minor, &micro, &nano); QString nano_str; if(nano == 1) nano_str = " (CVS)"; else if(nano == 2) nano_str = " (Prerelease)"; version.sprintf("%d.%d.%d%s", major, minor, micro, !nano_str.isEmpty() ? qPrintable(nano_str) : ""); int need_maj = 0; int need_min = 10; int need_mic = 22; if(compare_gst_version(major, minor, micro, need_maj, need_min, need_mic) < 0) { printf("Need GStreamer version %d.%d.%d\n", need_maj, need_min, need_mic); success = false; return; } // manually load plugins?
if(!pluginPath.isEmpty()) loadPlugins(pluginPath); gstcustomelements_register(); gstelements_register(); QStringList reqelem = QStringList() << "speexenc" << "speexdec" << "vorbisenc" << "vorbisdec" << "theoraenc" << "theoradec" << "rtpspeexpay" << "rtpspeexdepay" << "rtpvorbispay" << "rtpvorbisdepay" << "rtptheorapay" << "rtptheoradepay" << "filesrc" << "decodebin" << "jpegdec" << "oggmux" << "oggdemux" << "audioconvert" << "audioresample" << "volume" << "level" << "ffmpegcolorspace" << "videorate" << "videomaxrate" << "videoscale" << "gstrtpjitterbuffer" << "liveadder"; #if defined(Q_OS_MAC) reqelem << "osxaudiosrc" << "osxaudiosink" << "osxvideosrc"; #elif defined(Q_OS_LINUX) reqelem << "alsasrc" << "alsasink" << "v4l2src"; #elif defined(Q_OS_UNIX) reqelem << "osssrc" << "osssink"; #elif defined(Q_OS_WIN) reqelem << "directsoundsrc" << "directsoundsink" << "ksvideosrc"; #endif foreach(const QString &name, reqelem) { GstElement *e = gst_element_factory_make(name.toLatin1().data(), NULL); if(!e) { printf("Unable to load element '%s'.\n", qPrintable(name)); success = false; return; } g_object_unref(G_OBJECT(e)); } success = true; } ~GstSession() { // docs say to not bother with gst_deinit, but we'll do it // anyway in case there's an issue with plugin unloading // update: there could be other gstreamer users, so we // probably shouldn't call this. also, it appears to crash // on mac for at least one user.. maybe the function is // not very well tested. 
//gst_deinit(); } }; //---------------------------------------------------------------------------- // GstThread //---------------------------------------------------------------------------- class GstThread::Private { public: QString pluginPath; GstSession *gstSession; bool success; GMainContext *mainContext; GMainLoop *mainLoop; QMutex m; QWaitCondition w; Private() : mainContext(0), mainLoop(0) { } static gboolean cb_loop_started(gpointer data) { return ((Private *)data)->loop_started(); } gboolean loop_started() { w.wakeOne(); m.unlock(); return FALSE; } }; GstThread::GstThread(QObject *parent) : QThread(parent) { d = new Private; // HACK: if gstreamer initializes before certain Qt internal // initialization occurs, then the app becomes unstable. // I don't know what exactly needs to happen, or where the // bug is, but if I fiddle with the default QStyle before // initializing gstreamer, then this seems to solve it. // it could be a bug in QCleanlooksStyle or QGtkStyle, which // may conflict with separate Gtk initialization that may // occur through gstreamer plugin loading. { QIcon icon = QApplication::style()->standardIcon(QStyle::SP_MessageBoxCritical, 0, 0); } } GstThread::~GstThread() { stop(); delete d; } bool GstThread::start(const QString &pluginPath) { QMutexLocker locker(&d->m); d->pluginPath = pluginPath; QThread::start(); d->w.wait(&d->m); return d->success; } void GstThread::stop() { QMutexLocker locker(&d->m); if(d->mainLoop) { // thread-safe ? 
g_main_loop_quit(d->mainLoop); d->w.wait(&d->m); } wait(); } QString GstThread::gstVersion() const { QMutexLocker locker(&d->m); return d->gstSession->version; } GMainContext *GstThread::mainContext() { QMutexLocker locker(&d->m); return d->mainContext; } void GstThread::run() { //printf("GStreamer thread started\n"); // this will be unlocked as soon as the mainloop runs d->m.lock(); d->gstSession = new GstSession(d->pluginPath); // report error if(!d->gstSession->success) { d->success = false; delete d->gstSession; d->gstSession = 0; d->w.wakeOne(); d->m.unlock(); //printf("GStreamer thread completed (error)\n"); return; } d->success = true; //printf("Using GStreamer version %s\n", qPrintable(d->gstSession->version)); d->mainContext = g_main_context_new(); d->mainLoop = g_main_loop_new(d->mainContext, FALSE); // deferred call to loop_started() GSource *timer = g_timeout_source_new(0); g_source_attach(timer, d->mainContext); g_source_set_callback(timer, GstThread::Private::cb_loop_started, d, NULL); // kick off the event loop g_main_loop_run(d->mainLoop); QMutexLocker locker(&d->m); g_main_loop_unref(d->mainLoop); d->mainLoop = 0; g_main_context_unref(d->mainContext); d->mainContext = 0; delete d->gstSession; d->gstSession = 0; d->w.wakeOne(); //printf("GStreamer thread completed\n"); } } psimedia-master/gstprovider/gstthread.h000066400000000000000000000027421220046403000206440ustar00rootroot00000000000000/* * Copyright (C) 2008 Barracuda Networks, Inc. * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Lesser General Public * License as published by the Free Software Foundation; either * version 2.1 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Lesser General Public License for more details.
* * You should have received a copy of the GNU Lesser General Public * License along with this library; if not, write to the Free Software * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA * 02110-1301 USA * */ #ifndef PSI_GSTTHREAD_H #define PSI_GSTTHREAD_H #include #include namespace PsiMedia { // this class is kind of like QCA::SyncThread but for glib. It atomically // starts up a thread, initializes gstreamer, and sets up a glib eventloop // ready for use. if you want to do stuff in the other thread, set // up a glib timeout of 0 against mainContext(), and go from there. class GstThread : public QThread { Q_OBJECT public: GstThread(QObject *parent = 0); ~GstThread(); bool start(const QString &pluginPath); void stop(); QString gstVersion() const; GMainContext *mainContext(); protected: virtual void run(); private: class Private; Private *d; }; } #endif psimedia-master/gstprovider/modes.cpp000066400000000000000000000057521220046403000203230ustar00rootroot00000000000000/* * Copyright (C) 2008 Barracuda Networks, Inc. * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Lesser General Public * License as published by the Free Software Foundation; either * version 2.1 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Lesser General Public License for more details. * * You should have received a copy of the GNU Lesser General Public * License along with this library; if not, write to the Free Software * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA * 02110-1301 USA * */ #include "modes.h" #include namespace PsiMedia { // FIXME: any better way besides hardcoding?
/*static bool have_element(const QString &name) { GstElement *e = gst_element_factory_make(name.toLatin1().data(), NULL); if(!e) return false; g_object_unref(G_OBJECT(e)); return true; } static bool have_codec(const QString &enc, const QString &dec, const QString &pay, const QString &depay) { if(have_element(enc) && have_element(dec) && have_element(pay) && have_element(depay)) return true; else return false; } static bool have_pcmu() { return have_codec("mulawenc", "mulawdec", "rtppcmupay", "rtppcmudepay"); } static bool have_h263p() { return have_codec("ffenc_h263p", "ffdec_h263", "rtph263ppay", "rtph263pdepay"); }*/ // speex, theora, and vorbis are guaranteed to exist QList<PAudioParams> modes_supportedAudio() { QList<PAudioParams> list; /*if(have_pcmu()) { PAudioParams p; p.codec = "pcmu"; p.sampleRate = 8000; p.sampleSize = 16; p.channels = 1; list += p; }*/ { PAudioParams p; p.codec = "speex"; p.sampleRate = 8000; p.sampleSize = 16; p.channels = 1; list += p; } { PAudioParams p; p.codec = "speex"; p.sampleRate = 16000; p.sampleSize = 16; p.channels = 1; list += p; } /*{ PAudioParams p; p.codec = "speex"; p.sampleRate = 32000; p.sampleSize = 16; p.channels = 1; list += p; } { PAudioParams p; p.codec = "vorbis"; p.sampleRate = 44100; p.sampleSize = 32; p.channels = 2; list += p; }*/ return list; } QList<PVideoParams> modes_supportedVideo() { QList<PVideoParams> list; /*if(have_h263p()) { PVideoParams p; p.codec = "h263p"; p.size = QSize(160, 120); p.fps = 30; list += p; } { PVideoParams p; p.codec = "theora"; p.size = QSize(160, 120); p.fps = 30; list += p; } { PVideoParams p; p.codec = "theora"; p.size = QSize(320, 240); p.fps = 15; list += p; }*/ { PVideoParams p; p.codec = "theora"; p.size = QSize(320, 240); p.fps = 30; list += p; } /*{ PVideoParams p; p.codec = "theora"; p.size = QSize(640, 480); p.fps = 15; list += p; } { PVideoParams p; p.codec = "theora"; p.size = QSize(640, 480); p.fps = 30; list += p; }*/ return list; } }
psimedia-master/gstprovider/modes.h000066400000000000000000000017531220046403000177670ustar00rootroot00000000000000/* * Copyright (C) 2008 Barracuda Networks, Inc. * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Lesser General Public * License as published by the Free Software Foundation; either * version 2.1 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Lesser General Public License for more details. * * You should have received a copy of the GNU Lesser General Public * License along with this library; if not, write to the Free Software * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA * 02110-1301 USA * */ #ifndef MODES_H #define MODES_H #include #include "psimediaprovider.h" namespace PsiMedia { QList<PAudioParams> modes_supportedAudio(); QList<PVideoParams> modes_supportedVideo(); } #endif psimedia-master/gstprovider/payloadinfo.cpp000066400000000000000000000144451220046403000215200ustar00rootroot00000000000000/* * Copyright (C) 2008 Barracuda Networks, Inc. * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Lesser General Public * License as published by the Free Software Foundation; either * version 2.1 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301  USA
 *
 */

#include "payloadinfo.h"

#include <QStringList>
#include <gst/gst.h>

namespace PsiMedia {

static QString hexEncode(const QByteArray &in)
{
	QString out;
	for(int n = 0; n < in.size(); ++n)
		out += QString().sprintf("%02x", (unsigned char)in[n]);
	return out;
}

static int hexValue(char c)
{
	if(c >= '0' && c <= '9')
		return c - '0';
	else if(c >= 'a' && c <= 'f')
		return c - 'a' + 10;
	else if(c >= 'A' && c <= 'F')
		return c - 'A' + 10;
	else
		return -1;
}

static int hexByte(char hi, char lo)
{
	int nhi = hexValue(hi);
	if(nhi < 0)
		return -1;
	int nlo = hexValue(lo);
	if(nlo < 0)
		return -1;
	return (nhi << 4) + nlo;
}

static QByteArray hexDecode(const QString &in)
{
	QByteArray out;
	for(int n = 0; n + 1 < in.length(); n += 2)
	{
		int value = hexByte(in[n].toLatin1(), in[n + 1].toLatin1());
		if(value < 0)
			return QByteArray(); // error
		out += (unsigned char)value;
	}
	return out;
}

class my_foreach_state
{
public:
	PPayloadInfo *out;
	QStringList *whitelist;
	QList<PPayloadInfo::Parameter> *list;
};

gboolean my_foreach_func(GQuark field_id, const GValue *value, gpointer user_data)
{
	my_foreach_state &state = *((my_foreach_state *)user_data);

	QString name = QString::fromLatin1(g_quark_to_string(field_id));
	if(G_VALUE_TYPE(value) == G_TYPE_STRING && state.whitelist->contains(name))
	{
		QString svalue = QString::fromLatin1(g_value_get_string(value));

		// FIXME: is there a better way to detect when we should do this conversion?
		if(name == "configuration" && (state.out->name == "THEORA" || state.out->name == "VORBIS"))
		{
			QByteArray config = QByteArray::fromBase64(svalue.toLatin1());
			svalue = hexEncode(config);
		}

		PPayloadInfo::Parameter i;
		i.name = name;
		i.value = svalue;
		state.list->append(i);
	}

	return TRUE;
}

GstStructure *payloadInfoToStructure(const PPayloadInfo &info, const QString &media)
{
	GstStructure *out = gst_structure_empty_new("application/x-rtp");

	{
		GValue gv;
		memset(&gv, 0, sizeof(GValue));
		g_value_init(&gv, G_TYPE_STRING);
		g_value_set_string(&gv, media.toLatin1().data());
		gst_structure_set_value(out, "media", &gv);
	}

	// payload id field required
	if(info.id == -1)
	{
		gst_structure_free(out);
		return 0;
	}

	{
		GValue gv;
		memset(&gv, 0, sizeof(GValue));
		g_value_init(&gv, G_TYPE_INT);
		g_value_set_int(&gv, info.id);
		gst_structure_set_value(out, "payload", &gv);
	}

	// name required for payload values 96 or greater
	if(info.id >= 96 && info.name.isEmpty())
	{
		gst_structure_free(out);
		return 0;
	}

	{
		GValue gv;
		memset(&gv, 0, sizeof(GValue));
		g_value_init(&gv, G_TYPE_STRING);
		g_value_set_string(&gv, info.name.toLatin1().data());
		gst_structure_set_value(out, "encoding-name", &gv);
	}

	if(info.clockrate != -1)
	{
		GValue gv;
		memset(&gv, 0, sizeof(GValue));
		g_value_init(&gv, G_TYPE_INT);
		g_value_set_int(&gv, info.clockrate);
		gst_structure_set_value(out, "clock-rate", &gv);
	}

	if(info.channels != -1)
	{
		GValue gv;
		memset(&gv, 0, sizeof(GValue));
		g_value_init(&gv, G_TYPE_STRING);
		g_value_set_string(&gv, QString::number(info.channels).toLatin1().data());
		gst_structure_set_value(out, "encoding-params", &gv);
	}

	foreach(const PPayloadInfo::Parameter &i, info.parameters)
	{
		QString value = i.value;

		// FIXME: is there a better way to detect when we should do this conversion?
		if(i.name == "configuration" && (info.name.toUpper() == "THEORA" || info.name.toUpper() == "VORBIS"))
		{
			QByteArray config = hexDecode(value);
			if(config.isEmpty())
			{
				gst_structure_free(out);
				return 0;
			}
			value = QString::fromLatin1(config.toBase64());
		}

		GValue gv;
		memset(&gv, 0, sizeof(GValue));
		g_value_init(&gv, G_TYPE_STRING);
		g_value_set_string(&gv, value.toLatin1().data());
		gst_structure_set_value(out, i.name.toLatin1().data(), &gv);
	}

	return out;
}

PPayloadInfo structureToPayloadInfo(GstStructure *structure, QString *media)
{
	PPayloadInfo out;
	QString media_;
	gint x;
	const gchar *str;

	str = gst_structure_get_name(structure);
	QString sname = QString::fromLatin1(str);
	if(sname != "application/x-rtp")
		return PPayloadInfo();

	str = gst_structure_get_string(structure, "media");
	if(!str)
		return PPayloadInfo();
	media_ = QString::fromLatin1(str);

	// payload field is required
	if(!gst_structure_get_int(structure, "payload", &x))
		return PPayloadInfo();
	out.id = x;

	str = gst_structure_get_string(structure, "encoding-name");
	if(str)
	{
		out.name = QString::fromLatin1(str);
	}
	else
	{
		// encoding-name field is required for payload values 96 or greater
		if(out.id >= 96)
			return PPayloadInfo();
	}

	if(gst_structure_get_int(structure, "clock-rate", &x))
		out.clockrate = x;

	str = gst_structure_get_string(structure, "encoding-params");
	if(str)
	{
		QString qstr = QString::fromLatin1(str);
		bool ok;
		int n = qstr.toInt(&ok);
		if(!ok)
			return PPayloadInfo();
		out.channels = n;
	}

	// TODO: vbr, cng, mode?
	// see: http://tools.ietf.org/html/draft-ietf-avt-rtp-speex-05

	// note: if we ever change away from the whitelist approach, be sure
	//   not to grab the earlier static fields (e.g.
	//   clock-rate) as dynamic parameters
	QStringList whitelist;
	whitelist << "sampling" << "width" << "height" << "delivery-method" << "configuration";

	QList<PPayloadInfo::Parameter> list;
	my_foreach_state state;
	state.out = &out;
	state.whitelist = &whitelist;
	state.list = &list;
	if(!gst_structure_foreach(structure, my_foreach_func, &state))
		return PPayloadInfo();
	out.parameters = list;

	if(media)
		*media = media_;

	return out;
}

}

==> psimedia-master/gstprovider/payloadinfo.h <==

/*
 * Copyright (C) 2008 Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301  USA
 *
 */

#ifndef PAYLOADINFO_H
#define PAYLOADINFO_H

#include <QString>
#include <gst/gst.h>
#include "psimediaprovider.h"

namespace PsiMedia {

GstStructure *payloadInfoToStructure(const PPayloadInfo &info, const QString &media);
PPayloadInfo structureToPayloadInfo(GstStructure *structure, QString *media = 0);

}

#endif

==> psimedia-master/gstprovider/pipeline.cpp <==

/*
 * Copyright (C) 2009 Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301  USA
 *
 */

#include "pipeline.h"

#include <QList>
#include <QSet>
#include <QString>
#include <gst/gst.h>
#include "devices.h"

// FIXME: this file is heavily commented out and a mess, mainly because
//   all of my attempts at a dynamic pipeline were futile.  someday we
//   can uncomment and clean this up...

#define PIPELINE_DEBUG

// rates lower than 22050 (e.g.
//   16000) might not work with echo-cancel
#define DEFAULT_FIXED_RATE 22050

// in milliseconds
#define DEFAULT_LATENCY 20

//#define USE_LIVEADDER

namespace PsiMedia {

static int get_fixed_rate()
{
	QString val = QString::fromLatin1(qgetenv("PSI_FIXED_RATE"));
	if(!val.isEmpty())
	{
		int rate = val.toInt();
		if(rate > 0)
			return rate;
		else
			return 0;
	}
	else
		return DEFAULT_FIXED_RATE;
}

static int get_latency_time()
{
	QString val = QString::fromLatin1(qgetenv("PSI_AUDIO_LTIME"));
	if(!val.isEmpty())
	{
		int x = val.toInt();
		if(x > 0)
			return x;
		else
			return 0;
	}
	else
		return DEFAULT_LATENCY;
}

static const char *type_to_str(PDevice::Type type)
{
	switch(type)
	{
		case PDevice::AudioIn:  return "AudioIn";
		case PDevice::AudioOut: return "AudioOut";
		case PDevice::VideoIn:  return "VideoIn";
		default:
			Q_ASSERT(0);
			return 0;
	}
}

static void videosrcbin_pad_added(GstElement *element, GstPad *pad, gpointer data)
{
	Q_UNUSED(element);
	GstPad *gpad = (GstPad *)data;

	//gchar *name = gst_pad_get_name(pad);
	//printf("videosrcbin pad-added: %s\n", name);
	//g_free(name);

	GstCaps *caps = gst_pad_get_caps(pad);
	//gchar *gstr = gst_caps_to_string(caps);
	//QString capsString = QString::fromUtf8(gstr);
	//g_free(gstr);
	//printf("  caps: [%s]\n", qPrintable(capsString));

	gst_ghost_pad_set_target(GST_GHOST_PAD(gpad), pad);
	gst_caps_unref(caps);
}

static GstStaticPadTemplate videosrcbin_template = GST_STATIC_PAD_TEMPLATE("src",
	GST_PAD_SRC,
	GST_PAD_ALWAYS,
	GST_STATIC_CAPS(
		"video/x-raw-yuv; "
		"video/x-raw-rgb"
	)
);

static GstElement *filter_for_capture_size(const QSize &size)
{
	GstElement *capsfilter = gst_element_factory_make("capsfilter", NULL);
	GstCaps *caps = gst_caps_new_empty();

	GstStructure *cs;
	cs = gst_structure_new("video/x-raw-yuv",
		"width", G_TYPE_INT, size.width(),
		"height", G_TYPE_INT, size.height(), NULL);
	gst_caps_append_structure(caps, cs);

	cs = gst_structure_new("video/x-raw-rgb",
		"width", G_TYPE_INT, size.width(),
		"height", G_TYPE_INT, size.height(), NULL);
	gst_caps_append_structure(caps, cs);
	cs = gst_structure_new("image/jpeg",
		"width", G_TYPE_INT, size.width(),
		"height", G_TYPE_INT, size.height(), NULL);
	gst_caps_append_structure(caps, cs);

	g_object_set(G_OBJECT(capsfilter), "caps", caps, NULL);
	gst_caps_unref(caps);
	return capsfilter;
}

static GstElement *filter_for_desired_size(const QSize &size)
{
	QList<int> widths;
	widths << 160 << 320 << 640 << 800 << 1024;
	for(int n = 0; n < widths.count(); ++n)
	{
		if(widths[n] < size.width())
		{
			widths.removeAt(n);
			--n; // adjust position
		}
	}

	GstElement *capsfilter = gst_element_factory_make("capsfilter", NULL);
	GstCaps *caps = gst_caps_new_empty();

//	for(int n = 0; n < widths.count(); ++n)
//	{
//		GstStructure *cs;
//		cs = gst_structure_new("video/x-raw-yuv",
//			"width", GST_TYPE_INT_RANGE, 1, widths[n],
//			"height", GST_TYPE_INT_RANGE, 1, G_MAXINT, NULL);
//		gst_caps_append_structure(caps, cs);
//
//		cs = gst_structure_new("video/x-raw-rgb",
//			"width", GST_TYPE_INT_RANGE, 1, widths[n],
//			"height", GST_TYPE_INT_RANGE, 1, G_MAXINT, NULL);
//		gst_caps_append_structure(caps, cs);
//	}

	caps = gst_caps_from_string("video/x-raw-yuv , width=[320] , "
		"height=[240] , framerate=[30/1]");
	g_object_set(G_OBJECT(capsfilter), "caps", caps, NULL);

	GstStructure *cs = gst_structure_new("image/jpeg", NULL);
	gst_caps_append_structure(caps, cs);

	g_object_set(G_OBJECT(capsfilter), "caps", caps, NULL);
	gst_caps_unref(caps);
	return capsfilter;
}

static GstElement *make_devicebin(const QString &id, PDevice::Type type, const QSize &desiredSize)
{
	QSize captureSize;
	GstElement *e = devices_makeElement(id, type, &captureSize);
	if(!e)
		return 0;

	// explicitly set audio devices to be low-latency
	if(/*type == PDevice::AudioIn ||*/ type == PDevice::AudioOut)
	{
		int latency_ms = get_latency_time();
		if(latency_ms > 0)
		{
			gint64 lt = latency_ms * 1000; // microseconds
			g_object_set(G_OBJECT(e), "latency-time", lt, NULL);
			//g_object_set(G_OBJECT(e), "buffer-time", 2 * lt, NULL);
		}
	}

	GstElement *bin = gst_bin_new(NULL);

	if(type == PDevice::AudioIn)
	{
		gst_bin_add(GST_BIN(bin), e);

		GstElement *audioconvert = gst_element_factory_make("audioconvert", NULL);
		GstElement *audioresample = gst_element_factory_make("audioresample", NULL);

		GstElement *capsfilter = gst_element_factory_make("capsfilter", NULL);
		GstCaps *caps = gst_caps_new_empty();

		int rate = get_fixed_rate();
		GstStructure *cs;
		if(rate > 0)
		{
			cs = gst_structure_new("audio/x-raw-int",
				"rate", G_TYPE_INT, rate,
				"width", G_TYPE_INT, 16,
				"channels", G_TYPE_INT, 1, NULL);
		}
		else
		{
			cs = gst_structure_new("audio/x-raw-int",
				"width", G_TYPE_INT, 16,
				"channels", G_TYPE_INT, 1, NULL);
		}
		gst_caps_append_structure(caps, cs);

		g_object_set(G_OBJECT(capsfilter), "caps", caps, NULL);
		gst_caps_unref(caps);

		gst_bin_add(GST_BIN(bin), audioconvert);
		gst_bin_add(GST_BIN(bin), audioresample);
		gst_bin_add(GST_BIN(bin), capsfilter);
		gst_element_link_many(e, audioconvert, audioresample, capsfilter, NULL);

		GstPad *pad = gst_element_get_static_pad(capsfilter, "src");
		gst_element_add_pad(bin, gst_ghost_pad_new("src", pad));
		gst_object_unref(GST_OBJECT(pad));
	}
	else if(type == PDevice::VideoIn)
	{
		GstElement *capsfilter = 0;

#ifdef Q_OS_MAC
		// FIXME: hardcode resolution because filter_for_desired_size
		//   doesn't really work with osxvideosrc due to the fact that
		//   it can handle any resolution.  for example, setting
		//   desiredSize to 320x240 yields a caps of 320x480 which is
		//   wrong (and may crash videoscale, but that's another
		//   matter).  We'll hardcode the caps to 320x240, since that's
		//   the resolution psimedia currently wants anyway,
		//   as opposed to not specifying a captureSize, which would
		//   also work fine but may result in double-resizing.
		captureSize = QSize(320, 240);
#endif

		if(captureSize.isValid())
			capsfilter = filter_for_capture_size(captureSize);
		else if(desiredSize.isValid())
			capsfilter = filter_for_desired_size(desiredSize);

		gst_bin_add(GST_BIN(bin), e);
		if(capsfilter)
			gst_bin_add(GST_BIN(bin), capsfilter);

		GstElement *decodebin = gst_element_factory_make("decodebin", NULL);
		gst_bin_add(GST_BIN(bin), decodebin);

		GstPad *pad = gst_ghost_pad_new_no_target_from_template("src",
			gst_static_pad_template_get(&videosrcbin_template));
		gst_element_add_pad(bin, pad);
		g_signal_connect(G_OBJECT(decodebin), "pad-added",
			G_CALLBACK(videosrcbin_pad_added), pad);

		if(capsfilter)
			gst_element_link_many(e, capsfilter, decodebin, NULL);
		else
			gst_element_link(e, decodebin);
	}
	else // AudioOut
	{
		GstElement *audioconvert = gst_element_factory_make("audioconvert", NULL);
		GstElement *audioresample = gst_element_factory_make("audioresample", NULL);
		gst_bin_add(GST_BIN(bin), audioconvert);
		gst_bin_add(GST_BIN(bin), audioresample);
		gst_bin_add(GST_BIN(bin), e);
		gst_element_link_many(audioconvert, audioresample, e, NULL);

		GstPad *pad = gst_element_get_static_pad(audioconvert, "sink");
		gst_element_add_pad(bin, gst_ghost_pad_new("sink", pad));
		gst_object_unref(GST_OBJECT(pad));
	}

	return bin;
}

//----------------------------------------------------------------------------
// PipelineContext
//----------------------------------------------------------------------------
static GstElement *g_speexdsp = 0;
static GstElement *g_speexprobe = 0;

class PipelineDevice;

class PipelineDeviceContextPrivate
{
public:
	PipelineContext *pipeline;
	PipelineDevice *device;
	PipelineDeviceOptions opts;
	bool activated;

	// queue for srcs, adder for sinks
	GstElement *element;
};

class PipelineDevice
{
public:
	int refs;
	QString id;
	PDevice::Type type;
	GstElement *pipeline;
	GstElement *bin;
	bool activated;
	QSet<PipelineDeviceContextPrivate*> contexts;

	// for srcs
	GstElement *speexdsp;
	GstElement *tee;

	// for sinks (audio only, video sinks are always unshared)
	GstElement *adder;
	GstElement *audioconvert;
	GstElement *audioresample;
	GstElement *capsfilter;
	GstElement *speexprobe;

	PipelineDevice(const QString &_id, PDevice::Type _type, PipelineDeviceContextPrivate *context) :
		refs(0),
		id(_id),
		type(_type),
		activated(false),
		speexdsp(0),
		tee(0),
		adder(0),
		speexprobe(0)
	{
		pipeline = context->pipeline->element();

		bin = make_devicebin(id, type, context->opts.videoSize);
		if(!bin)
			return;

		// TODO: use context->opts.fps?

		if(type == PDevice::AudioIn || type == PDevice::VideoIn)
		{
			if(type == PDevice::AudioIn && !g_speexdsp)
			{
				speexdsp = gst_element_factory_make("speexdsp", NULL);
				if(speexdsp)
				{
#ifdef PIPELINE_DEBUG
					printf("using speexdsp\n");
#endif
					g_speexdsp = speexdsp;
				}
			}

			if(speexdsp)
			{
				//gst_element_set_locked_state(speexdsp, TRUE);
				gst_bin_add(GST_BIN(pipeline), speexdsp);
			}

			tee = gst_element_factory_make("tee", NULL);
			//gst_element_set_locked_state(tee, TRUE);
			gst_bin_add(GST_BIN(pipeline), tee);

			//gst_element_set_locked_state(bin, TRUE);
			gst_bin_add(GST_BIN(pipeline), bin);

			if(speexdsp)
				gst_element_link_many(bin, speexdsp, tee, NULL);
			else
				gst_element_link(bin, tee);
		}
		else // AudioOut
		{
#ifdef USE_LIVEADDER
			adder = gst_element_factory_make("liveadder", NULL);

			audioconvert = gst_element_factory_make("audioconvert", NULL);
			audioresample = gst_element_factory_make("audioresample", NULL);
#endif

			capsfilter = gst_element_factory_make("capsfilter", NULL);
			GstCaps *caps = gst_caps_new_empty();

			int rate = get_fixed_rate();
			GstStructure *cs;
			if(rate > 0)
			{
				cs = gst_structure_new("audio/x-raw-int",
					"rate", G_TYPE_INT, rate,
					"width", G_TYPE_INT, 16,
					"channels", G_TYPE_INT, 1, NULL);
			}
			else
			{
				cs = gst_structure_new("audio/x-raw-int",
					"width", G_TYPE_INT, 16,
					"channels", G_TYPE_INT, 1, NULL);
			}
			gst_caps_append_structure(caps, cs);

			g_object_set(G_OBJECT(capsfilter), "caps", caps, NULL);
			gst_caps_unref(caps);

			if(!g_speexprobe && QString::fromLatin1(qgetenv("PSI_NO_ECHO_CANCEL")) != "1")
			{
				speexprobe = gst_element_factory_make("speexechoprobe", NULL);
				if(speexprobe)
				{
					printf("using speexechoprobe\n");
					g_speexprobe = speexprobe;

					QString latency_tune = qgetenv("PSI_AUDIO_LTUNE");
					if(!latency_tune.isEmpty())
						g_object_set(G_OBJECT(speexprobe), "latency-tune", latency_tune.toInt(), NULL);
				}
			}

			gst_bin_add(GST_BIN(pipeline), bin);
#ifdef USE_LIVEADDER
			gst_bin_add(GST_BIN(pipeline), adder);
			gst_bin_add(GST_BIN(pipeline), audioconvert);
			gst_bin_add(GST_BIN(pipeline), audioresample);
#endif
			gst_bin_add(GST_BIN(pipeline), capsfilter);
			if(speexprobe)
				gst_bin_add(GST_BIN(pipeline), speexprobe);

#ifdef USE_LIVEADDER
			gst_element_link_many(adder, audioconvert, audioresample, capsfilter, NULL);
#endif
			if(speexprobe)
				gst_element_link_many(capsfilter, speexprobe, bin, NULL);
			else
				gst_element_link(capsfilter, bin);

#ifndef USE_LIVEADDER
			// HACK
			adder = capsfilter;
#endif

			/*gst_element_set_state(bin, GST_STATE_PLAYING);
			if(speexprobe)
				gst_element_set_state(speexprobe, GST_STATE_PLAYING);
			gst_element_set_state(capsfilter, GST_STATE_PLAYING);
			gst_element_set_state(audioresample, GST_STATE_PLAYING);
			gst_element_set_state(audioconvert, GST_STATE_PLAYING);
			gst_element_set_state(adder, GST_STATE_PLAYING);*/

			// sink starts out activated
			activated = true;
		}

		addRef(context);
	}

	~PipelineDevice()
	{
		Q_ASSERT(contexts.isEmpty());

		if(!bin)
			return;

		if(type == PDevice::AudioIn || type == PDevice::VideoIn)
		{
			gst_bin_remove(GST_BIN(pipeline), bin);

			if(speexdsp)
			{
				gst_bin_remove(GST_BIN(pipeline), speexdsp);
				g_speexdsp = 0;
			}

			if(tee)
				gst_bin_remove(GST_BIN(pipeline), tee);
		}
		else // AudioOut
		{
			if(adder)
			{
#ifdef USE_LIVEADDER
				gst_element_set_state(adder, GST_STATE_NULL);
				gst_element_set_state(audioconvert, GST_STATE_NULL);
				gst_element_set_state(audioresample, GST_STATE_NULL);
#endif
				gst_element_set_state(capsfilter, GST_STATE_NULL);
				if(speexprobe)
					gst_element_set_state(speexprobe, GST_STATE_NULL);
			}

			gst_element_set_state(bin, GST_STATE_NULL);

			if(adder)
			{
				/*gst_element_get_state(adder, NULL, NULL, GST_CLOCK_TIME_NONE);
				gst_bin_remove(GST_BIN(pipeline),
				adder);
				gst_element_get_state(audioconvert, NULL, NULL, GST_CLOCK_TIME_NONE);
				gst_bin_remove(GST_BIN(pipeline), audioconvert);
				gst_element_get_state(audioresample, NULL, NULL, GST_CLOCK_TIME_NONE);
				gst_bin_remove(GST_BIN(pipeline), audioresample);*/

				gst_element_get_state(capsfilter, NULL, NULL, GST_CLOCK_TIME_NONE);
				gst_bin_remove(GST_BIN(pipeline), capsfilter);

				if(speexprobe)
				{
					gst_element_get_state(speexprobe, NULL, NULL, GST_CLOCK_TIME_NONE);
					gst_bin_remove(GST_BIN(pipeline), speexprobe);
					g_speexprobe = 0;
				}
			}

			gst_bin_remove(GST_BIN(pipeline), bin);
		}
	}

	void addRef(PipelineDeviceContextPrivate *context)
	{
		Q_ASSERT(!contexts.contains(context));

		// TODO: consider context->opts for refs after first

		if(type == PDevice::AudioIn || type == PDevice::VideoIn)
		{
			// create a queue from the tee, and hand it off.  app
			//   uses this queue element as if it were the actual
			//   device
			GstElement *queue = gst_element_factory_make("queue", NULL);
			context->element = queue;
			//gst_element_set_locked_state(queue, TRUE);
			gst_bin_add(GST_BIN(pipeline), queue);
			gst_element_link(tee, queue);
		}
		else // AudioOut
		{
			context->element = adder;

			// sink starts out activated
			context->activated = true;
		}

		contexts += context;
		++refs;
	}

	void removeRef(PipelineDeviceContextPrivate *context)
	{
		Q_ASSERT(contexts.contains(context));

		// TODO: recalc video properties

		if(type == PDevice::AudioIn || type == PDevice::VideoIn)
		{
			// deactivate if not done so already
			deactivate(context);

			GstElement *queue = context->element;
			gst_bin_remove(GST_BIN(pipeline), queue);
		}

		contexts.remove(context);
		--refs;
	}

	void activate(PipelineDeviceContextPrivate *context)
	{
		// activate the context
		if(!context->activated)
		{
			//GstElement *queue = context->element;
			//gst_element_set_locked_state(queue, FALSE);
			//gst_element_set_state(queue, GST_STATE_PLAYING);
			context->activated = true;
		}

		// activate the device
		if(!activated)
		{
			//gst_element_set_locked_state(tee, FALSE);
			//gst_element_set_locked_state(bin, FALSE);
			//gst_element_set_state(tee, GST_STATE_PLAYING);
			//gst_element_set_state(bin, GST_STATE_PLAYING);
			activated = true;
		}
	}

	void deactivate(PipelineDeviceContextPrivate *context)
	{
#if 0
		if(activated && refs == 1)
		{
			if(type == PDevice::AudioIn || type == PDevice::VideoIn)
			{
				gst_element_set_locked_state(bin, TRUE);
				if(speexdsp)
					gst_element_set_locked_state(speexdsp, TRUE);
				if(tee)
					gst_element_set_locked_state(tee, TRUE);
			}
		}

		if(context->activated)
		{
			if(type == PDevice::AudioIn || type == PDevice::VideoIn)
			{
				GstElement *queue = context->element;
				gst_element_set_locked_state(queue, TRUE);
			}
		}

		if(activated && refs == 1)
		{
			if(type == PDevice::AudioIn || type == PDevice::VideoIn)
			{
				gst_element_set_state(bin, GST_STATE_NULL);
				gst_element_get_state(bin, NULL, NULL, GST_CLOCK_TIME_NONE);
				//printf("set to null\n");

				if(speexdsp)
				{
					gst_element_set_state(speexdsp, GST_STATE_NULL);
					gst_element_get_state(speexdsp, NULL, NULL, GST_CLOCK_TIME_NONE);
				}

				if(tee)
				{
					gst_element_set_state(tee, GST_STATE_NULL);
					gst_element_get_state(tee, NULL, NULL, GST_CLOCK_TIME_NONE);
				}
			}
		}

		if(context->activated)
		{
			if(type == PDevice::AudioIn || type == PDevice::VideoIn)
			{
				GstElement *queue = context->element;

				// FIXME: until we fix this, we only support 1 ref

				// get tee and prepare srcpad
				/*GstPad *sinkpad = gst_element_get_pad(queue, "sink");
				GstPad *srcpad = gst_pad_get_peer(sinkpad);
				gst_object_unref(GST_OBJECT(sinkpad));
				gst_element_release_request_pad(tee, srcpad);
				gst_object_unref(GST_OBJECT(srcpad));*/

				// set queue to null state
				gst_element_set_state(queue, GST_STATE_NULL);
				gst_element_get_state(queue, NULL, NULL, GST_CLOCK_TIME_NONE);

				context->activated = false;
			}
		}

		if(activated && refs == 1)
		{
			if(type == PDevice::AudioIn || type == PDevice::VideoIn)
				activated = false;
		}
#endif
		// FIXME
		context->activated = false;
		activated = false;
	}

	void update()
	{
		// TODO: change video properties based on options
	}
};

class PipelineContext::Private
{
public:
	GstElement *pipeline;
	bool activated;
	QSet<PipelineDevice*> devices;
	Private() :
		activated(false)
	{
		pipeline = gst_pipeline_new(NULL);
	}

	~Private()
	{
		Q_ASSERT(devices.isEmpty());
		deactivate();
		g_object_unref(G_OBJECT(pipeline));
	}

	void activate()
	{
		if(!activated)
		{
			gst_element_set_state(pipeline, GST_STATE_PLAYING);
			//gst_element_get_state(pipeline, NULL, NULL, GST_CLOCK_TIME_NONE);
			activated = true;
		}
	}

	void deactivate()
	{
		if(activated)
		{
			gst_element_set_state(pipeline, GST_STATE_NULL);
			gst_element_get_state(pipeline, NULL, NULL, GST_CLOCK_TIME_NONE);
			activated = false;
		}
	}
};

PipelineContext::PipelineContext()
{
	d = new Private;
}

PipelineContext::~PipelineContext()
{
	delete d;
}

void PipelineContext::activate()
{
	d->activate();
}

void PipelineContext::deactivate()
{
	d->deactivate();
}

GstElement *PipelineContext::element()
{
	return d->pipeline;
}

//----------------------------------------------------------------------------
// PipelineDeviceContext
//----------------------------------------------------------------------------
PipelineDeviceContext::PipelineDeviceContext()
{
	d = new PipelineDeviceContextPrivate;
	d->device = 0;
}

PipelineDeviceContext *PipelineDeviceContext::create(PipelineContext *pipeline, const QString &id, PDevice::Type type, const PipelineDeviceOptions &opts)
{
	PipelineDeviceContext *that = new PipelineDeviceContext;

	that->d->pipeline = pipeline;
	that->d->opts = opts;
	that->d->activated = false;

	// see if we're already using this device, so we can attempt to share
	PipelineDevice *dev = 0;
	foreach(PipelineDevice *i, pipeline->d->devices)
	{
		if(i->id == id && i->type == type)
		{
			dev = i;
			break;
		}
	}

	if(!dev)
	{
		dev = new PipelineDevice(id, type, that->d);
		if(!dev->bin)
		{
			delete dev;
			delete that;
			return 0;
		}

		pipeline->d->devices += dev;
	}
	else
	{
		// FIXME: make sharing work
		//dev->addRef(that->d);
		delete that;
		return 0;
	}

	that->d->device = dev;

#ifdef PIPELINE_DEBUG
	printf("Readying %s:[%s], refs=%d\n", type_to_str(dev->type), qPrintable(dev->id), dev->refs);
#endif
	return that;
}
PipelineDeviceContext::~PipelineDeviceContext()
{
	PipelineDevice *dev = d->device;

	if(dev)
	{
		dev->removeRef(d);
#ifdef PIPELINE_DEBUG
		printf("Releasing %s:[%s], refs=%d\n", type_to_str(dev->type), qPrintable(dev->id), dev->refs);
#endif
		if(dev->refs == 0)
		{
			d->pipeline->d->devices.remove(dev);
			delete dev;
		}
	}

	delete d;
}

void PipelineDeviceContext::activate()
{
	d->device->activate(d);
}

void PipelineDeviceContext::deactivate()
{
	d->device->deactivate(d);
}

GstElement *PipelineDeviceContext::element()
{
	return d->element;
}

void PipelineDeviceContext::setOptions(const PipelineDeviceOptions &opts)
{
	d->opts = opts;
	d->device->update();
}

}

==> psimedia-master/gstprovider/pipeline.h <==

/*
 * Copyright (C) 2009 Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301  USA
 *
 */

#ifndef PSI_PIPELINE_H
#define PSI_PIPELINE_H

#include <QSize>
#include <gst/gst.h>
#include "psimediaprovider.h"

namespace PsiMedia {

class PipelineDeviceContext;
class PipelineDeviceContextPrivate;

class PipelineContext
{
public:
	PipelineContext();
	~PipelineContext();

	// set the pipeline to playing (activate) or to null (deactivate)
	// FIXME: when we make dynamic pipelines work, we can remove these
	//   functions.
	void activate();
	void deactivate();

	GstElement *element();

private:
	friend class PipelineDeviceContext;
	friend class PipelineDeviceContextPrivate;

	class Private;
	Private *d;
};

// this is for hinting video input properties.  the actual video quality may
//   end up being something else, but in general it will try to be the closest
//   possible quality to satisfy the maximum hinted of all the references.
//   thus, if one ref is made, set to 640x480, and another ref is made, set to
//   320x240, the quality generated by the device (and therefore, both refs)
//   will probably be 640x480.
class PipelineDeviceOptions
{
public:
	QSize videoSize;
	int fps;

	PipelineDeviceOptions() :
		fps(-1)
	{
	}
};

class PipelineDeviceContext
{
public:
	static PipelineDeviceContext *create(PipelineContext *pipeline, const QString &id, PDevice::Type type, const PipelineDeviceOptions &opts = PipelineDeviceOptions());
	~PipelineDeviceContext();

	// after creation, the device element is in the NULL state, and
	//   potentially not linked to dependent internal elements.  call
	//   activate() to cause internals to be finalized and set to
	//   PLAYING.  the purpose of the activate() call is to give you time
	//   to get your own elements into the pipeline, linked, and perhaps
	//   set to PLAYING before the device starts working.
	//
	// note: this only applies to input (src) elements.  output (sink)
	//   elements start out activated.
	// FIXME: this function currently does nothing
	void activate();

	// call this in order to stop the device element.  it will be safely
	//   set to the NULL state, so that you may then unlink your own
	//   elements from it.
	// FIXME: this function currently does nothing
	void deactivate();

	GstElement *element();

	void setOptions(const PipelineDeviceOptions &opts);

private:
	PipelineDeviceContext();

	PipelineDeviceContextPrivate *d;
};

}

#endif

==> psimedia-master/gstprovider/rtpworker.cpp <==

/*
 * Copyright (C) 2008-2009 Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301  USA
 *
 */

#include "rtpworker.h"

#include <QStringList>
#include <QTime>
#include <gst/gst.h>
#include "devices.h"
#include "payloadinfo.h"
#include "pipeline.h"
#include "bins.h"

// TODO: support playing from bytearray
// TODO: support recording

#define RTPWORKER_DEBUG

namespace PsiMedia {

static GstStaticPadTemplate raw_audio_src_template = GST_STATIC_PAD_TEMPLATE("src",
	GST_PAD_SRC,
	GST_PAD_ALWAYS,
	GST_STATIC_CAPS(
		"audio/x-raw-int; "
		"audio/x-raw-float"
	)
);

static GstStaticPadTemplate raw_audio_sink_template = GST_STATIC_PAD_TEMPLATE("sink",
	GST_PAD_SINK,
	GST_PAD_ALWAYS,
	GST_STATIC_CAPS(
		"audio/x-raw-int; "
		"audio/x-raw-float"
	)
);

static GstStaticPadTemplate raw_video_sink_template = GST_STATIC_PAD_TEMPLATE("sink",
	GST_PAD_SINK,
	GST_PAD_ALWAYS,
	GST_STATIC_CAPS(
		"video/x-raw-yuv; "
		"video/x-raw-rgb"
	)
);

static const char *state_to_str(GstState state)
{
	switch(state)
	{
		case GST_STATE_NULL:    return
			"NULL";
		case GST_STATE_READY:   return "READY";
		case GST_STATE_PAUSED:  return "PAUSED";
		case GST_STATE_PLAYING: return "PLAYING";
		case GST_STATE_VOID_PENDING:
		default:
			return 0;
	}
}

class Stats
{
public:
	QString name;
	int calls;
	int sizes[30];
	int sizes_at;
	QTime calltime;

	Stats(const QString &_name) :
		name(_name),
		calls(-1),
		sizes_at(0)
	{
	}

	void print_stats(int current_size)
	{
		// -2 means quit
		if(calls == -2)
			return;

		if(sizes_at >= 30)
		{
			memmove(sizes, sizes + 1, sizeof(int) * (sizes_at - 1));
			--sizes_at;
		}
		sizes[sizes_at++] = current_size;

		// set timer on first call
		if(calls == -1)
		{
			calls = 0;
			calltime.start();
		}

		// print bitrate after 10 seconds
		if(calltime.elapsed() >= 10000)
		{
			int avg = 0;
			for(int n = 0; n < sizes_at; ++n)
				avg += sizes[n];
			avg /= sizes_at;
			int bytesPerSec = (calls * avg) / 10;
			int bps = bytesPerSec * 10;
			int kbps = bps / 1000;
			calls = -2;
			calltime.restart();
			printf("%s: average packet size=%d, kbps=%d\n", qPrintable(name), avg, kbps);
		}
		else
			++calls;
	}
};

#ifdef RTPWORKER_DEBUG
static void dump_pipeline(GstElement *in, int indent = 0)
{
	GstIterator *it = gst_bin_iterate_elements(GST_BIN(in));
	gboolean done = FALSE;
	void *item;
	while(!done)
	{
		switch(gst_iterator_next(it, &item))
		{
			case GST_ITERATOR_OK:
			{
				GstElement *e = (GstElement *)item;
				for(int n = 0; n < indent; ++n)
					printf(" ");
				if(GST_IS_BIN(item))
				{
					printf("%s:\n", gst_element_get_name(e));
					dump_pipeline(e, indent + 2);
				}
				else
					printf("%s\n", gst_element_get_name(e));
				gst_object_unref(item);
				break;
			}
			case GST_ITERATOR_RESYNC:
				gst_iterator_resync(it);
				break;
			case GST_ITERATOR_ERROR:
				done = TRUE;
				break;
			case GST_ITERATOR_DONE:
				done = TRUE;
				break;
		}
	}
	gst_iterator_free(it);
}
#endif

//----------------------------------------------------------------------------
// RtpWorker
//----------------------------------------------------------------------------
static int worker_refs = 0;
static PipelineContext *send_pipelineContext = 0;
static PipelineContext *recv_pipelineContext = 0;
static
GstElement *spipeline = 0; static GstElement *rpipeline = 0; //static GstBus *sbus = 0; static bool send_in_use = false; static bool recv_in_use = false; static bool use_shared_clock = true; static GstClock *shared_clock = 0; static bool send_clock_is_shared = false; //static bool recv_clock_is_shared = false; RtpWorker::RtpWorker(GMainContext *mainContext) : loopFile(false), maxbitrate(-1), canTransmitAudio(false), canTransmitVideo(false), outputVolume(100), inputVolume(100), cb_started(0), cb_updated(0), cb_stopped(0), cb_finished(0), cb_error(0), cb_audioOutputIntensity(0), cb_audioInputIntensity(0), cb_previewFrame(0), cb_outputFrame(0), cb_rtpAudioOut(0), cb_rtpVideoOut(0), mainContext_(mainContext), timer(0), pd_audiosrc(0), pd_videosrc(0), pd_audiosink(0), sendbin(0), recvbin(0), audiortpsrc(0), videortpsrc(0), volumein(0), volumeout(0), rtpaudioout(false), rtpvideoout(false) //recordTimer(0) { audioStats = new Stats("audio"); videoStats = new Stats("video"); if(worker_refs == 0) { send_pipelineContext = new PipelineContext; recv_pipelineContext = new PipelineContext; spipeline = send_pipelineContext->element(); rpipeline = recv_pipelineContext->element(); #ifdef RTPWORKER_DEBUG /*sbus = gst_pipeline_get_bus(GST_PIPELINE(spipeline)); GSource *source = gst_bus_create_watch(bus); gst_object_unref(bus); g_source_set_callback(source, (GSourceFunc)cb_bus_call, this, NULL); g_source_attach(source, mainContext_);*/ #endif QByteArray val = qgetenv("PSI_NO_SHARED_CLOCK"); if(!val.isEmpty()) use_shared_clock = false; } ++worker_refs; } RtpWorker::~RtpWorker() { if(timer) { g_source_destroy(timer); timer = 0; } /*if(recordTimer) { g_source_destroy(recordTimer); recordTimer = 0; }*/ cleanup(); --worker_refs; if(worker_refs == 0) { delete send_pipelineContext; send_pipelineContext = 0; delete recv_pipelineContext; recv_pipelineContext = 0; //sbus = 0; } delete audioStats; delete videoStats; } void RtpWorker::cleanup() { #ifdef RTPWORKER_DEBUG printf("cleaning up...\n"); 
#endif volumein_mutex.lock(); volumein = 0; volumein_mutex.unlock(); volumeout_mutex.lock(); volumeout = 0; volumeout_mutex.unlock(); audiortpsrc_mutex.lock(); audiortpsrc = 0; audiortpsrc_mutex.unlock(); videortpsrc_mutex.lock(); videortpsrc = 0; videortpsrc_mutex.unlock(); rtpaudioout_mutex.lock(); rtpaudioout = false; rtpaudioout_mutex.unlock(); rtpvideoout_mutex.lock(); rtpvideoout = false; rtpvideoout_mutex.unlock(); //if(pd_audiosrc) // pd_audiosrc->deactivate(); //if(pd_videosrc) // pd_videosrc->deactivate(); if(sendbin) { if(shared_clock && send_clock_is_shared) { gst_object_unref(shared_clock); shared_clock = 0; send_clock_is_shared = false; if(recv_in_use) { printf("recv clock reverts to auto\n"); gst_element_set_state(rpipeline, GST_STATE_READY); gst_element_get_state(rpipeline, NULL, NULL, GST_CLOCK_TIME_NONE); gst_pipeline_auto_clock(GST_PIPELINE(rpipeline)); // only restart the receive pipeline if it is // owned by a separate session if(!recvbin) { gst_element_set_state(rpipeline, GST_STATE_PLAYING); //gst_element_get_state(rpipeline, NULL, NULL, GST_CLOCK_TIME_NONE); } } } send_pipelineContext->deactivate(); gst_pipeline_auto_clock(GST_PIPELINE(spipeline)); //gst_element_set_state(sendbin, GST_STATE_NULL); //gst_element_get_state(sendbin, NULL, NULL, GST_CLOCK_TIME_NONE); gst_bin_remove(GST_BIN(spipeline), sendbin); sendbin = 0; send_in_use = false; } if(recvbin) { // NOTE: commenting this out because recv clock is no longer // ever shared /*if(shared_clock && recv_clock_is_shared) { gst_object_unref(shared_clock); shared_clock = 0; recv_clock_is_shared = false; if(send_in_use) { // FIXME: do we really need to restart the pipeline? 
printf("send clock becomes master\n"); send_pipelineContext->deactivate(); gst_pipeline_auto_clock(GST_PIPELINE(spipeline)); send_pipelineContext->activate(); //gst_element_get_state(spipeline, NULL, NULL, GST_CLOCK_TIME_NONE); // send clock becomes shared shared_clock = gst_pipeline_get_clock(GST_PIPELINE(spipeline)); gst_object_ref(GST_OBJECT(shared_clock)); gst_pipeline_use_clock(GST_PIPELINE(spipeline), shared_clock); send_clock_is_shared = true; } }*/ recv_pipelineContext->deactivate(); gst_pipeline_auto_clock(GST_PIPELINE(rpipeline)); //gst_element_set_state(recvbin, GST_STATE_NULL); //gst_element_get_state(recvbin, NULL, NULL, GST_CLOCK_TIME_NONE); gst_bin_remove(GST_BIN(rpipeline), recvbin); recvbin = 0; recv_in_use = false; } if(pd_audiosrc) { delete pd_audiosrc; pd_audiosrc = 0; audiosrc = 0; } if(pd_videosrc) { delete pd_videosrc; pd_videosrc = 0; videosrc = 0; } if(pd_audiosink) { delete pd_audiosink; pd_audiosink = 0; } #ifdef RTPWORKER_DEBUG printf("cleaning done.\n"); #endif } void RtpWorker::start() { Q_ASSERT(!timer); timer = g_timeout_source_new(0); g_source_set_callback(timer, cb_doStart, this, NULL); g_source_attach(timer, mainContext_); } void RtpWorker::update() { Q_ASSERT(!timer); timer = g_timeout_source_new(0); g_source_set_callback(timer, cb_doUpdate, this, NULL); g_source_attach(timer, mainContext_); } void RtpWorker::transmitAudio() { QMutexLocker locker(&rtpaudioout_mutex); rtpaudioout = true; } void RtpWorker::transmitVideo() { QMutexLocker locker(&rtpvideoout_mutex); rtpvideoout = true; } void RtpWorker::pauseAudio() { QMutexLocker locker(&rtpaudioout_mutex); rtpaudioout = false; } void RtpWorker::pauseVideo() { QMutexLocker locker(&rtpvideoout_mutex); rtpvideoout = false; } void RtpWorker::stop() { // cancel any current operation if(timer) g_source_destroy(timer); timer = g_timeout_source_new(0); g_source_set_callback(timer, cb_doStop, this, NULL); g_source_attach(timer, mainContext_); } void RtpWorker::rtpAudioIn(const PRtpPacket 
&packet) { QMutexLocker locker(&audiortpsrc_mutex); if(packet.portOffset == 0 && audiortpsrc) gst_apprtpsrc_packet_push((GstAppRtpSrc *)audiortpsrc, (const unsigned char *)packet.rawValue.data(), packet.rawValue.size()); } void RtpWorker::rtpVideoIn(const PRtpPacket &packet) { QMutexLocker locker(&videortpsrc_mutex); if(packet.portOffset == 0 && videortpsrc) gst_apprtpsrc_packet_push((GstAppRtpSrc *)videortpsrc, (const unsigned char *)packet.rawValue.data(), packet.rawValue.size()); } void RtpWorker::setOutputVolume(int level) { QMutexLocker locker(&volumeout_mutex); outputVolume = level; if(volumeout) { double vol = (double)level / 100; g_object_set(G_OBJECT(volumeout), "volume", vol, NULL); } } void RtpWorker::setInputVolume(int level) { QMutexLocker locker(&volumein_mutex); inputVolume = level; if(volumein) { double vol = (double)level / 100; g_object_set(G_OBJECT(volumein), "volume", vol, NULL); } } void RtpWorker::recordStart() { // FIXME: for now we just send EOF/error if(cb_recordData) cb_recordData(QByteArray(), app); } void RtpWorker::recordStop() { // TODO: assert recording // FIXME: don't just do nothing } gboolean RtpWorker::cb_doStart(gpointer data) { return ((RtpWorker *)data)->doStart(); } gboolean RtpWorker::cb_doUpdate(gpointer data) { return ((RtpWorker *)data)->doUpdate(); } gboolean RtpWorker::cb_doStop(gpointer data) { return ((RtpWorker *)data)->doStop(); } void RtpWorker::cb_fileDemux_no_more_pads(GstElement *element, gpointer data) { ((RtpWorker *)data)->fileDemux_no_more_pads(element); } void RtpWorker::cb_fileDemux_pad_added(GstElement *element, GstPad *pad, gpointer data) { ((RtpWorker *)data)->fileDemux_pad_added(element, pad); } void RtpWorker::cb_fileDemux_pad_removed(GstElement *element, GstPad *pad, gpointer data) { ((RtpWorker *)data)->fileDemux_pad_removed(element, pad); } gboolean RtpWorker::cb_bus_call(GstBus *bus, GstMessage *msg, gpointer data) { return ((RtpWorker *)data)->bus_call(bus, msg); } void 
RtpWorker::cb_show_frame_preview(int width, int height, const unsigned char *rgb32, gpointer data) { ((RtpWorker *)data)->show_frame_preview(width, height, rgb32); } void RtpWorker::cb_show_frame_output(int width, int height, const unsigned char *rgb32, gpointer data) { ((RtpWorker *)data)->show_frame_output(width, height, rgb32); } void RtpWorker::cb_packet_ready_rtp_audio(const unsigned char *buf, int size, gpointer data) { ((RtpWorker *)data)->packet_ready_rtp_audio(buf, size); } void RtpWorker::cb_packet_ready_rtp_video(const unsigned char *buf, int size, gpointer data) { ((RtpWorker *)data)->packet_ready_rtp_video(buf, size); } gboolean RtpWorker::cb_fileReady(gpointer data) { return ((RtpWorker *)data)->fileReady(); } gboolean RtpWorker::doStart() { timer = 0; fileDemux = 0; audiosrc = 0; videosrc = 0; audiortpsrc = 0; videortpsrc = 0; audiortppay = 0; videortppay = 0; // default to 400kbps if(maxbitrate == -1) maxbitrate = 400; if(!setupSendRecv()) { if(cb_error) cb_error(app); } else { // don't signal started here if using files if(!fileDemux && cb_started) cb_started(app); } return FALSE; } gboolean RtpWorker::doUpdate() { timer = 0; if(!setupSendRecv()) { if(cb_error) cb_error(app); } else { if(cb_updated) cb_updated(app); } return FALSE; } gboolean RtpWorker::doStop() { timer = 0; cleanup(); if(cb_stopped) cb_stopped(app); return FALSE; } void RtpWorker::fileDemux_no_more_pads(GstElement *element) { Q_UNUSED(element); #ifdef RTPWORKER_DEBUG printf("no more pads\n"); #endif // FIXME: make this get canceled on cleanup? 
	GSource *ftimer = g_timeout_source_new(0);
	g_source_set_callback(ftimer, cb_fileReady, this, NULL);
	g_source_attach(ftimer, mainContext_);
}

void RtpWorker::fileDemux_pad_added(GstElement *element, GstPad *pad)
{
	Q_UNUSED(element);

#ifdef RTPWORKER_DEBUG
	gchar *name = gst_pad_get_name(pad);
	printf("pad-added: %s\n", name);
	g_free(name);
#endif

	GstCaps *caps = gst_pad_get_caps(pad);
#ifdef RTPWORKER_DEBUG
	gchar *gstr = gst_caps_to_string(caps);
	QString capsString = QString::fromUtf8(gstr);
	g_free(gstr);
	printf("  caps: [%s]\n", qPrintable(capsString));
#endif

	int num = gst_caps_get_size(caps);
	for(int n = 0; n < num; ++n)
	{
		GstStructure *cs = gst_caps_get_structure(caps, n);
		QString mime = gst_structure_get_name(cs);

		QStringList parts = mime.split('/');
		if(parts.count() != 2)
			continue;
		QString type = parts[0];
		QString subtype = parts[1];

		GstElement *decoder = 0;
		bool isAudio = false;

		// FIXME: we should really just use decodebin
		if(type == "audio")
		{
			isAudio = true;
			if(subtype == "x-speex")
				decoder = gst_element_factory_make("speexdec", NULL);
			else if(subtype == "x-vorbis")
				decoder = gst_element_factory_make("vorbisdec", NULL);
		}
		else if(type == "video")
		{
			isAudio = false;
			if(subtype == "x-theora")
				decoder = gst_element_factory_make("theoradec", NULL);
		}

		if(decoder)
		{
			if(!gst_bin_add(GST_BIN(sendbin), decoder))
				continue;

			// release the sink pad reference before checking the
			//   link result, so it is not leaked when the link fails
			GstPad *sinkpad = gst_element_get_static_pad(decoder, "sink");
			GstPadLinkReturn linkret = gst_pad_link(pad, sinkpad);
			gst_object_unref(sinkpad);
			if(!GST_PAD_LINK_SUCCESSFUL(linkret))
				continue;

			// FIXME
			// by default the element is not in a working state.
			//   we set to 'paused' which hopefully means it'll
			//   do the right thing.
			gst_element_set_state(decoder, GST_STATE_PAUSED);

			if(isAudio)
			{
				audiosrc = decoder;
				addAudioChain();
			}
			else
			{
				videosrc = decoder;
				addVideoChain();
			}

			// decoder set up, we're done
			break;
		}
	}

	gst_caps_unref(caps);
}

void RtpWorker::fileDemux_pad_removed(GstElement *element, GstPad *pad)
{
	Q_UNUSED(element);

	// TODO: do we need to do anything here?
#ifdef RTPWORKER_DEBUG gchar *name = gst_pad_get_name(pad); printf("pad-removed: %s\n", name); g_free(name); #endif } gboolean RtpWorker::bus_call(GstBus *bus, GstMessage *msg) { Q_UNUSED(bus); //GMainLoop *loop = (GMainLoop *)data; switch(GST_MESSAGE_TYPE(msg)) { case GST_MESSAGE_EOS: { g_print("End-of-stream\n"); //g_main_loop_quit(loop); break; } case GST_MESSAGE_ERROR: { gchar *debug; GError *err; gst_message_parse_error(msg, &err, &debug); g_free(debug); g_print("Error: %s: %s\n", gst_element_get_name(GST_MESSAGE_SRC(msg)), err->message); g_error_free(err); //g_main_loop_quit(loop); break; } case GST_MESSAGE_SEGMENT_DONE: { // FIXME: we seem to get this event too often? printf("Segment-done\n"); /*gst_element_seek(sendPipeline, 1, GST_FORMAT_TIME, (GstSeekFlags)(GST_SEEK_FLAG_SEGMENT), GST_SEEK_TYPE_SET, 0, GST_SEEK_TYPE_END, 0);*/ break; } case GST_MESSAGE_WARNING: { gchar *debug; GError *err; gst_message_parse_warning(msg, &err, &debug); g_free(debug); g_print("Warning: %s: %s\n", gst_element_get_name(GST_MESSAGE_SRC(msg)), err->message); g_error_free(err); //g_main_loop_quit(loop); break; } case GST_MESSAGE_STATE_CHANGED: { GstState oldstate, newstate, pending; gst_message_parse_state_changed(msg, &oldstate, &newstate, &pending); printf("State changed: %s: %s->%s", gst_element_get_name(GST_MESSAGE_SRC(msg)), state_to_str(oldstate), state_to_str(newstate)); if(pending != GST_STATE_VOID_PENDING) printf(" (%s)", state_to_str(pending)); printf("\n"); break; } case GST_MESSAGE_ASYNC_DONE: { printf("Async done: %s\n", gst_element_get_name(GST_MESSAGE_SRC(msg))); break; } default: printf("Bus message: %s\n", GST_MESSAGE_TYPE_NAME(msg)); break; } return TRUE; } void RtpWorker::show_frame_preview(int width, int height, const unsigned char *rgb32) { QImage image(width, height, QImage::Format_RGB32); memcpy(image.bits(), rgb32, image.numBytes()); Frame frame; frame.image = image; if(cb_previewFrame) cb_previewFrame(frame, app); } void RtpWorker::show_frame_output(int 
width, int height, const unsigned char *rgb32) { QImage image(width, height, QImage::Format_RGB32); memcpy(image.bits(), rgb32, image.numBytes()); Frame frame; frame.image = image; if(cb_outputFrame) cb_outputFrame(frame, app); } void RtpWorker::packet_ready_rtp_audio(const unsigned char *buf, int size) { QByteArray ba((const char *)buf, size); PRtpPacket packet; packet.rawValue = ba; packet.portOffset = 0; #ifdef RTPWORKER_DEBUG audioStats->print_stats(packet.rawValue.size()); #endif QMutexLocker locker(&rtpaudioout_mutex); if(cb_rtpAudioOut && rtpaudioout) cb_rtpAudioOut(packet, app); } void RtpWorker::packet_ready_rtp_video(const unsigned char *buf, int size) { QByteArray ba((const char *)buf, size); PRtpPacket packet; packet.rawValue = ba; packet.portOffset = 0; #ifdef RTPWORKER_DEBUG videoStats->print_stats(packet.rawValue.size()); #endif QMutexLocker locker(&rtpvideoout_mutex); if(cb_rtpVideoOut && rtpvideoout) cb_rtpVideoOut(packet, app); } gboolean RtpWorker::fileReady() { if(loopFile) { //gst_element_set_state(sendPipeline, GST_STATE_PAUSED); //gst_element_get_state(sendPipeline, NULL, NULL, GST_CLOCK_TIME_NONE); /*gst_element_seek(sendPipeline, 1, GST_FORMAT_TIME, (GstSeekFlags)(GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_SEGMENT), GST_SEEK_TYPE_SET, 0, GST_SEEK_TYPE_END, 0);*/ } send_pipelineContext->activate(); gst_element_get_state(send_pipelineContext->element(), NULL, NULL, GST_CLOCK_TIME_NONE); //gst_element_set_state(sendPipeline, GST_STATE_PLAYING); //gst_element_get_state(sendPipeline, NULL, NULL, GST_CLOCK_TIME_NONE); if(!getCaps()) { error = RtpSessionContext::ErrorCodec; if(cb_error) cb_error(app); return FALSE; } if(cb_started) cb_started(app); return FALSE; } bool RtpWorker::setupSendRecv() { // FIXME: // this code is not really correct, but it will suffice for our // modest purposes. 
basically the way it works is: // - non-empty params indicate desire for a media type // - the only control you have over quality is maxbitrate // - input device/file indicates desire to send // - remote payloadinfo indicates desire to receive (we need this // to support theora) // - once sending or receiving is started, media types cannot // be added or removed (doing so will throw an error) // - once sending or receiving is started, codecs can't be changed // (changes will be rejected). one exception: remote theora // config can be updated. // - once sending or receiving is started, devices can't be changed // (changes will be ignored) if(!sendbin) { if(!localAudioParams.isEmpty() || !localVideoParams.isEmpty()) { if(!startSend()) return false; } } else { // TODO: support adding/removing audio/video to existing session /*if((localAudioParams.isEmpty() != actual_localAudioPayloadInfo.isEmpty()) || (localVideoParams.isEmpty() != actual_videoPayloadInfo.isEmpty())) { error = RtpSessionContext::ErrorGeneric; return false; }*/ } if(!recvbin) { if((!localAudioParams.isEmpty() && !remoteAudioPayloadInfo.isEmpty()) || (!localVideoParams.isEmpty() && !remoteVideoPayloadInfo.isEmpty())) { if(!startRecv()) return false; } } else { // TODO: support adding/removing audio/video to existing session // see if theora was updated in the remote config updateTheoraConfig(); } // apply actual settings back to these variables, so the user can // read them localAudioPayloadInfo = actual_localAudioPayloadInfo; localVideoPayloadInfo = actual_localVideoPayloadInfo; remoteAudioPayloadInfo = actual_remoteAudioPayloadInfo; remoteVideoPayloadInfo = actual_remoteVideoPayloadInfo; return true; } bool RtpWorker::startSend() { return startSend(16000); } bool RtpWorker::startSend(int rate) { // file source if(!infile.isEmpty() || !indata.isEmpty()) { if(send_in_use) return false; sendbin = gst_bin_new("sendbin"); GstElement *fileSource = gst_element_factory_make("filesrc", NULL); 
g_object_set(G_OBJECT(fileSource), "location", infile.toUtf8().data(), NULL); fileDemux = gst_element_factory_make("oggdemux", NULL); g_signal_connect(G_OBJECT(fileDemux), "no-more-pads", G_CALLBACK(cb_fileDemux_no_more_pads), this); g_signal_connect(G_OBJECT(fileDemux), "pad-added", G_CALLBACK(cb_fileDemux_pad_added), this); g_signal_connect(G_OBJECT(fileDemux), "pad-removed", G_CALLBACK(cb_fileDemux_pad_removed), this); gst_bin_add(GST_BIN(sendbin), fileSource); gst_bin_add(GST_BIN(sendbin), fileDemux); gst_element_link(fileSource, fileDemux); } // device source else if(!ain.isEmpty() || !vin.isEmpty()) { if(send_in_use) return false; sendbin = gst_bin_new("sendbin"); if(!ain.isEmpty() && !localAudioParams.isEmpty()) { pd_audiosrc = PipelineDeviceContext::create(send_pipelineContext, ain, PDevice::AudioIn); if(!pd_audiosrc) { #ifdef RTPWORKER_DEBUG printf("Failed to create audio input element '%s'.\n", qPrintable(ain)); #endif g_object_unref(G_OBJECT(sendbin)); sendbin = 0; error = RtpSessionContext::ErrorGeneric; return false; } audiosrc = pd_audiosrc->element(); } if(!vin.isEmpty() && !localVideoParams.isEmpty()) { PipelineDeviceOptions opts; //opts.videoSize = localVideoParams[0].size; opts.videoSize = QSize(320, 240); opts.fps = 30; pd_videosrc = PipelineDeviceContext::create(send_pipelineContext, vin, PDevice::VideoIn, opts); if(!pd_videosrc) { #ifdef RTPWORKER_DEBUG printf("Failed to create video input element '%s'.\n", qPrintable(vin)); #endif delete pd_audiosrc; pd_audiosrc = 0; g_object_unref(G_OBJECT(sendbin)); sendbin = 0; error = RtpSessionContext::ErrorGeneric; return false; } videosrc = pd_videosrc->element(); } } // no desire to send if(!sendbin) return true; send_in_use = true; if(audiosrc) { if(!addAudioChain(rate)) { delete pd_audiosrc; pd_audiosrc = 0; delete pd_videosrc; pd_videosrc = 0; g_object_unref(G_OBJECT(sendbin)); sendbin = 0; error = RtpSessionContext::ErrorGeneric; return false; } } if(videosrc) { if(!addVideoChain()) { delete 
pd_audiosrc; pd_audiosrc = 0; delete pd_videosrc; pd_videosrc = 0; g_object_unref(G_OBJECT(sendbin)); sendbin = 0; error = RtpSessionContext::ErrorGeneric; return false; } } gst_bin_add(GST_BIN(spipeline), sendbin); if(!audiosrc && !videosrc) { // in the case of files, preroll gst_element_set_state(spipeline, GST_STATE_PAUSED); gst_element_get_state(spipeline, NULL, NULL, GST_CLOCK_TIME_NONE); //gst_element_set_state(sendbin, GST_STATE_PAUSED); //gst_element_get_state(sendbin, NULL, NULL, GST_CLOCK_TIME_NONE); /*if(loopFile) { gst_element_seek(sendPipeline, 1, GST_FORMAT_TIME, (GstSeekFlags)(GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_SEGMENT), GST_SEEK_TYPE_SET, 0, GST_SEEK_TYPE_END, 0); }*/ } else { // in the case of live transmission, wait for it to start and signal //gst_element_set_state(sendbin, GST_STATE_READY); //gst_element_get_state(sendbin, NULL, NULL, GST_CLOCK_TIME_NONE); #ifdef RTPWORKER_DEBUG printf("changing state...\n"); #endif //gst_element_set_state(sendbin, GST_STATE_PLAYING); if(audiosrc) { gst_element_link(audiosrc, sendbin); //pd_audiosrc->activate(); } if(videosrc) { gst_element_link(videosrc, sendbin); //pd_videosrc->activate(); } /*if(shared_clock && recv_clock_is_shared) { printf("send pipeline slaving to recv clock\n"); gst_pipeline_use_clock(GST_PIPELINE(spipeline), shared_clock); }*/ //gst_element_set_state(pipeline, GST_STATE_PLAYING); //gst_element_get_state(pipeline, NULL, NULL, GST_CLOCK_TIME_NONE); send_pipelineContext->activate(); // 6 seconds ought to be enough time to init int ret = gst_element_get_state(spipeline, NULL, NULL, 6 * GST_SECOND); //gst_element_get_state(sendbin, NULL, NULL, GST_CLOCK_TIME_NONE); if(ret != GST_STATE_CHANGE_SUCCESS && ret != GST_STATE_CHANGE_NO_PREROLL) { #ifdef RTPWORKER_DEBUG printf("error/timeout while setting send pipeline to PLAYING\n"); #endif cleanup(); error = RtpSessionContext::ErrorGeneric; return false; } if(!shared_clock && use_shared_clock) { printf("send clock is master\n"); shared_clock = 
gst_pipeline_get_clock(GST_PIPELINE(spipeline));
			gst_pipeline_use_clock(GST_PIPELINE(spipeline), shared_clock);
			send_clock_is_shared = true;

			// if recv active, apply this clock to it
			if(recv_in_use)
			{
				printf("recv pipeline slaving to send clock\n");
				gst_element_set_state(rpipeline, GST_STATE_READY);
				gst_element_get_state(rpipeline, NULL, NULL, GST_CLOCK_TIME_NONE);
				gst_pipeline_use_clock(GST_PIPELINE(rpipeline), shared_clock);
				gst_element_set_state(rpipeline, GST_STATE_PLAYING);
			}
		}

#ifdef RTPWORKER_DEBUG
		printf("state changed\n");
		dump_pipeline(spipeline);
#endif

		if(!getCaps())
		{
			error = RtpSessionContext::ErrorCodec;
			return false;
		}

		actual_localAudioPayloadInfo = localAudioPayloadInfo;
		actual_localVideoPayloadInfo = localVideoPayloadInfo;
	}

	return true;
}

bool RtpWorker::startRecv()
{
	QString acodec, vcodec;
	GstElement *audioout = 0;
	GstElement *asrc = 0;

	// TODO: support more than speex
	int speex_at = -1;
	int samplerate = -1;
	for(int n = 0; n < remoteAudioPayloadInfo.count(); ++n)
	{
		const PPayloadInfo &ri = remoteAudioPayloadInfo[n];
		if(ri.name.toUpper() == "SPEEX")
		{
			if(ri.clockrate > samplerate)
			{
				speex_at = n;
				samplerate = ri.clockrate;
			}
		}
	}

	// guard against samplerate == -1 (no speex entry found); without
	//   this check we would tear down and restart the send pipeline
	//   with an invalid rate
	if(samplerate != -1 && samplerate != 16000)
	{
		cleanup();
		startSend(samplerate);
	}

	// TODO: support more than theora
	int theora_at = -1;
	for(int n = 0; n < remoteVideoPayloadInfo.count(); ++n)
	{
		const PPayloadInfo &ri = remoteVideoPayloadInfo[n];
		if(ri.name.toUpper() == "THEORA" && ri.clockrate == 90000)
		{
			theora_at = n;
			break;
		}
	}

	// if remote does not support our codecs, error out
	// FIXME: again, support more than speex/theora
	if((!remoteAudioPayloadInfo.isEmpty() && speex_at == -1) ||
		(!remoteVideoPayloadInfo.isEmpty() && theora_at == -1))
	{
		return false;
	}

	if(!remoteAudioPayloadInfo.isEmpty() && speex_at != -1)
	{
#ifdef RTPWORKER_DEBUG
		printf("setting up audio recv\n");
#endif
		int at = speex_at;

		GstStructure *cs = payloadInfoToStructure(remoteAudioPayloadInfo[at], "audio");
		if(!cs)
		{
#ifdef RTPWORKER_DEBUG
			printf("cannot parse payload
info\n"); #endif return false; } if(recv_in_use) return false; if(!recvbin) recvbin = gst_bin_new("recvbin"); audiortpsrc_mutex.lock(); audiortpsrc = gst_element_factory_make("apprtpsrc", NULL); audiortpsrc_mutex.unlock(); GstCaps *caps = gst_caps_new_empty(); gst_caps_append_structure(caps, cs); g_object_set(G_OBJECT(audiortpsrc), "caps", caps, NULL); gst_caps_unref(caps); // FIXME: what if we don't have a name and just id? // it's okay, for now we only support speex which requires // the name.. acodec = remoteAudioPayloadInfo[at].name.toLower(); } if(!remoteVideoPayloadInfo.isEmpty() && theora_at != -1) { #ifdef RTPWORKER_DEBUG printf("setting up video recv\n"); #endif int at = theora_at; GstStructure *cs = payloadInfoToStructure(remoteVideoPayloadInfo[at], "video"); if(!cs) { #ifdef RTPWORKER_DEBUG printf("cannot parse payload info\n"); #endif goto fail1; } if(recv_in_use) return false; if(!recvbin) recvbin = gst_bin_new("recvbin"); videortpsrc_mutex.lock(); videortpsrc = gst_element_factory_make("apprtpsrc", NULL); videortpsrc_mutex.unlock(); GstCaps *caps = gst_caps_new_empty(); gst_caps_append_structure(caps, cs); g_object_set(G_OBJECT(videortpsrc), "caps", caps, NULL); gst_caps_unref(caps); // FIXME: what if we don't have a name and just id? // it's okay, for now we only really support theora which // requires the name.. 
vcodec = remoteVideoPayloadInfo[at].name; if(vcodec == "H263-1998") // FIXME: gross vcodec = "h263p"; else vcodec = vcodec.toLower(); } // no desire to receive if(!recvbin) return true; recv_in_use = true; if(audiortpsrc) { GstElement *audiodec = bins_audiodec_create(acodec); if(!audiodec) goto fail1; if(!aout.isEmpty()) { #ifdef RTPWORKER_DEBUG printf("creating audioout\n"); #endif pd_audiosink = PipelineDeviceContext::create(recv_pipelineContext, aout, PDevice::AudioOut); if(!pd_audiosink) { #ifdef RTPWORKER_DEBUG printf("failed to create audio output element\n"); #endif goto fail1; } audioout = pd_audiosink->element(); } else audioout = gst_element_factory_make("fakesink", NULL); { QMutexLocker locker(&volumeout_mutex); volumeout = gst_element_factory_make("volume", NULL); double vol = (double)outputVolume / 100; g_object_set(G_OBJECT(volumeout), "volume", vol, NULL); } GstElement *audioconvert = gst_element_factory_make("audioconvert", NULL); GstElement *audioresample = gst_element_factory_make("audioresample", NULL); if(pd_audiosink) asrc = audioresample; gst_bin_add(GST_BIN(recvbin), audiortpsrc); gst_bin_add(GST_BIN(recvbin), audiodec); gst_bin_add(GST_BIN(recvbin), volumeout); gst_bin_add(GST_BIN(recvbin), audioconvert); gst_bin_add(GST_BIN(recvbin), audioresample); if(!asrc) gst_bin_add(GST_BIN(recvbin), audioout); gst_element_link_many(audiortpsrc, audiodec, volumeout, audioconvert, audioresample, NULL); if(!asrc) gst_element_link(audioresample, audioout); actual_remoteAudioPayloadInfo = remoteAudioPayloadInfo; } if(videortpsrc) { GstElement *videodec = bins_videodec_create(vcodec); if(!videodec) goto fail1; GstElement *videoconvert = gst_element_factory_make("ffmpegcolorspace", NULL); GstElement *videosink = gst_element_factory_make("appvideosink", NULL); GstAppVideoSink *appVideoSink = (GstAppVideoSink *)videosink; appVideoSink->appdata = this; appVideoSink->show_frame = cb_show_frame_output; gst_bin_add(GST_BIN(recvbin), videortpsrc); 
gst_bin_add(GST_BIN(recvbin), videodec); gst_bin_add(GST_BIN(recvbin), videoconvert); gst_bin_add(GST_BIN(recvbin), videosink); gst_element_link_many(videortpsrc, videodec, videoconvert, videosink, NULL); actual_remoteVideoPayloadInfo = remoteVideoPayloadInfo; } //gst_element_set_locked_state(recvbin, TRUE); gst_bin_add(GST_BIN(rpipeline), recvbin); if(asrc) { GstPad *pad = gst_element_get_static_pad(asrc, "src"); gst_element_add_pad(recvbin, gst_ghost_pad_new_from_template("src", pad, gst_static_pad_template_get(&raw_audio_src_template))); gst_object_unref(GST_OBJECT(pad)); gst_element_link(recvbin, audioout); } if(shared_clock && send_clock_is_shared) { printf("recv pipeline slaving to send clock\n"); gst_pipeline_use_clock(GST_PIPELINE(rpipeline), shared_clock); } //gst_element_set_locked_state(recvbin, FALSE); //gst_element_set_state(recvbin, GST_STATE_PLAYING); #ifdef RTPWORKER_DEBUG printf("activating\n"); #endif gst_element_set_state(rpipeline, GST_STATE_READY); gst_element_get_state(rpipeline, NULL, NULL, GST_CLOCK_TIME_NONE); recv_pipelineContext->activate(); /*if(!shared_clock && use_shared_clock) { printf("recv clock is master\n"); shared_clock = gst_pipeline_get_clock(GST_PIPELINE(rpipeline)); gst_pipeline_use_clock(GST_PIPELINE(rpipeline), shared_clock); recv_clock_is_shared = true; }*/ #ifdef RTPWORKER_DEBUG printf("receive pipeline started\n"); #endif return true; fail1: audiortpsrc_mutex.lock(); if(audiortpsrc) { g_object_unref(G_OBJECT(audiortpsrc)); audiortpsrc = 0; } audiortpsrc_mutex.unlock(); videortpsrc_mutex.lock(); if(videortpsrc) { g_object_unref(G_OBJECT(videortpsrc)); videortpsrc = 0; } videortpsrc_mutex.unlock(); if(recvbin) { g_object_unref(G_OBJECT(recvbin)); recvbin = 0; } delete pd_audiosink; pd_audiosink = 0; recv_in_use = false; return false; } bool RtpWorker::addAudioChain() { return addAudioChain(16000); } bool RtpWorker::addAudioChain(int rate) { // TODO: support other codecs. 
for now, we only support speex 16khz QString codec = "speex"; int size = 16; int channels = 1; //QString codec = localAudioParams[0].codec; //int rate = localAudioParams[0].sampleRate; //int size = localAudioParams[0].sampleSize; //int channels = localAudioParams[0].channels; #ifdef RTPWORKER_DEBUG printf("codec=%s\n", qPrintable(codec)); #endif // see if we need to match a pt id int pt = -1; for(int n = 0; n < remoteAudioPayloadInfo.count(); ++n) { const PPayloadInfo &ri = remoteAudioPayloadInfo[n]; if(ri.name.toUpper() == "SPEEX" && ri.clockrate == rate) { pt = ri.id; break; } } // NOTE: we don't bother with a maxbitrate constraint on audio yet GstElement *audioenc = bins_audioenc_create(codec, pt, rate, size, channels); if(!audioenc) return false; { QMutexLocker locker(&volumein_mutex); volumein = gst_element_factory_make("volume", NULL); double vol = (double)inputVolume / 100; g_object_set(G_OBJECT(volumein), "volume", vol, NULL); } GstElement *audiortpsink = gst_element_factory_make("apprtpsink", NULL); GstAppRtpSink *appRtpSink = (GstAppRtpSink *)audiortpsink; if(!fileDemux) g_object_set(G_OBJECT(appRtpSink), "sync", FALSE, NULL); appRtpSink->appdata = this; appRtpSink->packet_ready = cb_packet_ready_rtp_audio; GstElement *queue = 0; if(fileDemux) queue = gst_element_factory_make("queue", NULL); if(queue) gst_bin_add(GST_BIN(sendbin), queue); gst_bin_add(GST_BIN(sendbin), volumein); gst_bin_add(GST_BIN(sendbin), audioenc); gst_bin_add(GST_BIN(sendbin), audiortpsink); gst_element_link_many(volumein, audioenc, audiortpsink, NULL); audiortppay = audioenc; if(fileDemux) { gst_element_link(queue, volumein); gst_element_set_state(queue, GST_STATE_PAUSED); gst_element_set_state(volumein, GST_STATE_PAUSED); gst_element_set_state(audioenc, GST_STATE_PAUSED); gst_element_set_state(audiortpsink, GST_STATE_PAUSED); gst_element_link(audiosrc, queue); } else { GstPad *pad = gst_element_get_static_pad(volumein, "sink"); gst_element_add_pad(sendbin, 
gst_ghost_pad_new_from_template("sink0", pad, gst_static_pad_template_get(&raw_audio_sink_template))); gst_object_unref(GST_OBJECT(pad)); } return true; } bool RtpWorker::addVideoChain() { // TODO: support other codecs. for now, we only support theora QString codec = "theora"; QSize size = QSize(320, 240); int fps = 30; //QString codec = localVideoParams[0].codec; //QSize size = localVideoParams[0].size; //int fps = localVideoParams[0].fps; #ifdef RTPWORKER_DEBUG printf("codec=%s\n", qPrintable(codec)); #endif // see if we need to match a pt id int pt = -1; for(int n = 0; n < remoteVideoPayloadInfo.count(); ++n) { const PPayloadInfo &ri = remoteVideoPayloadInfo[n]; if(ri.name.toUpper() == "THEORA" && ri.clockrate == 90000) { pt = ri.id; break; } } int videokbps = maxbitrate; // NOTE: we assume audio takes 45kbps if(audiortppay) videokbps -= 45; GstElement *videoprep = bins_videoprep_create(size, fps, fileDemux ? false : true); if(!videoprep) return false; GstElement *videoenc = bins_videoenc_create(codec, pt, videokbps); if(!videoenc) { g_object_unref(G_OBJECT(videoprep)); return false; } GstElement *videotee = gst_element_factory_make("tee", NULL); GstElement *playqueue = gst_element_factory_make("queue", NULL); GstElement *videoconvertplay = gst_element_factory_make("ffmpegcolorspace", NULL); GstElement *videoplaysink = gst_element_factory_make("appvideosink", NULL); GstAppVideoSink *appVideoSink = (GstAppVideoSink *)videoplaysink; appVideoSink->appdata = this; appVideoSink->show_frame = cb_show_frame_preview; GstElement *rtpqueue = gst_element_factory_make("queue", NULL); GstElement *videortpsink = gst_element_factory_make("apprtpsink", NULL); GstAppRtpSink *appRtpSink = (GstAppRtpSink *)videortpsink; if(!fileDemux) g_object_set(G_OBJECT(appRtpSink), "sync", FALSE, NULL); appRtpSink->appdata = this; appRtpSink->packet_ready = cb_packet_ready_rtp_video; GstElement *queue = 0; if(fileDemux) queue = gst_element_factory_make("queue", NULL); if(queue) 
gst_bin_add(GST_BIN(sendbin), queue); gst_bin_add(GST_BIN(sendbin), videoprep); gst_bin_add(GST_BIN(sendbin), videotee); gst_bin_add(GST_BIN(sendbin), playqueue); gst_bin_add(GST_BIN(sendbin), videoconvertplay); gst_bin_add(GST_BIN(sendbin), videoplaysink); gst_bin_add(GST_BIN(sendbin), rtpqueue); gst_bin_add(GST_BIN(sendbin), videoenc); gst_bin_add(GST_BIN(sendbin), videortpsink); gst_element_link(videoprep, videotee); gst_element_link_many(videotee, playqueue, videoconvertplay, videoplaysink, NULL); gst_element_link_many(videotee, rtpqueue, videoenc, videortpsink, NULL); videortppay = videoenc; if(fileDemux) { gst_element_link(queue, videoprep); gst_element_set_state(queue, GST_STATE_PAUSED); gst_element_set_state(videoprep, GST_STATE_PAUSED); gst_element_set_state(videotee, GST_STATE_PAUSED); gst_element_set_state(playqueue, GST_STATE_PAUSED); gst_element_set_state(videoconvertplay, GST_STATE_PAUSED); gst_element_set_state(videoplaysink, GST_STATE_PAUSED); gst_element_set_state(rtpqueue, GST_STATE_PAUSED); gst_element_set_state(videoenc, GST_STATE_PAUSED); gst_element_set_state(videortpsink, GST_STATE_PAUSED); gst_element_link(videosrc, queue); } else { GstPad *pad = gst_element_get_static_pad(videoprep, "sink"); gst_element_add_pad(sendbin, gst_ghost_pad_new_from_template("sink1", pad, gst_static_pad_template_get(&raw_video_sink_template))); gst_object_unref(GST_OBJECT(pad)); } return true; } bool RtpWorker::getCaps() { if(audiortppay) { GstPad *pad = gst_element_get_static_pad(audiortppay, "src"); GstCaps *caps = gst_pad_get_negotiated_caps(pad); if(!caps) { #ifdef RTPWORKER_DEBUG printf("can't get audio caps\n"); #endif return false; } #ifdef RTPWORKER_DEBUG gchar *gstr = gst_caps_to_string(caps); QString capsString = QString::fromUtf8(gstr); g_free(gstr); printf("rtppay caps audio: [%s]\n", qPrintable(capsString)); #endif gst_object_unref(pad); GstStructure *cs = gst_caps_get_structure(caps, 0); PPayloadInfo pi = structureToPayloadInfo(cs); if(pi.id == -1) { 
			gst_caps_unref(caps);
			return false;
		}

		gst_caps_unref(caps);

		PPayloadInfo speexnb;
		speexnb.id = 97;
		speexnb.name = "SPEEX";
		speexnb.clockrate = 8000;
		speexnb.channels = 1;
		speexnb.ptime = pi.ptime;
		speexnb.maxptime = pi.maxptime;

		QList<PPayloadInfo> ppil;
		ppil << pi;
		ppil << speexnb;
		localAudioPayloadInfo = ppil;
		canTransmitAudio = true;
	}

	if(videortppay)
	{
		GstPad *pad = gst_element_get_static_pad(videortppay, "src");
		GstCaps *caps = gst_pad_get_negotiated_caps(pad);
		if(!caps)
		{
#ifdef RTPWORKER_DEBUG
			printf("can't get video caps\n");
#endif
			return false;
		}

#ifdef RTPWORKER_DEBUG
		gchar *gstr = gst_caps_to_string(caps);
		QString capsString = QString::fromUtf8(gstr);
		g_free(gstr);
		printf("rtppay caps video: [%s]\n", qPrintable(capsString));
#endif

		gst_object_unref(pad);

		GstStructure *cs = gst_caps_get_structure(caps, 0);
		PPayloadInfo pi = structureToPayloadInfo(cs);
		if(pi.id == -1)
		{
			gst_caps_unref(caps);
			return false;
		}

		gst_caps_unref(caps);

		localVideoPayloadInfo = QList<PPayloadInfo>() << pi;
		canTransmitVideo = true;
	}

	return true;
}

bool RtpWorker::updateTheoraConfig()
{
	// first, are we using theora currently?
	int theora_at = -1;
	for(int n = 0; n < actual_remoteVideoPayloadInfo.count(); ++n)
	{
		const PPayloadInfo &ri = actual_remoteVideoPayloadInfo[n];
		if(ri.name.toUpper() == "THEORA" && ri.clockrate == 90000)
		{
			theora_at = n;
			break;
		}
	}
	if(theora_at == -1)
		return false;

	// if so, update the videortpsrc caps
	for(int n = 0; n < remoteVideoPayloadInfo.count(); ++n)
	{
		const PPayloadInfo &ri = remoteVideoPayloadInfo[n];
		if(ri.name.toUpper() == "THEORA" && ri.clockrate == 90000 && ri.id == actual_remoteVideoPayloadInfo[theora_at].id)
		{
			GstStructure *cs = payloadInfoToStructure(remoteVideoPayloadInfo[n], "video");
			if(!cs)
			{
#ifdef RTPWORKER_DEBUG
				printf("cannot parse payload info\n");
#endif
				continue;
			}

			QMutexLocker locker(&videortpsrc_mutex);
			if(!videortpsrc)
				continue;

			GstCaps *caps = gst_caps_new_empty();
			gst_caps_append_structure(caps, cs);
			g_object_set(G_OBJECT(videortpsrc), "caps", caps, NULL);
			gst_caps_unref(caps);

			actual_remoteVideoPayloadInfo[theora_at] = ri;
			return true;
		}
	}

	return false;
}

}

psimedia-master/gstprovider/rtpworker.h

/*
 * Copyright (C) 2008-2009 Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301 USA
 *
 */

#ifndef RTPWORKER_H
#define RTPWORKER_H

#include <QString>
#include <QByteArray>
#include <QImage>
#include <QMutex>
#include <gst/gst.h>
#include "psimediaprovider.h"
#include "gstcustomelements/gstcustomelements.h"

namespace PsiMedia {

class PipelineDeviceContext;
class Stats;

// Note: do not destruct this class during one of its callbacks
class RtpWorker
{
public:
	// this class exists in case we want to add metadata to the image,
	// such as a timestamp
	class Frame
	{
	public:
		QImage image;
	};

	void *app; // for callbacks

	QString aout;
	QString ain;
	QString vin;
	QString infile;
	QByteArray indata;
	bool loopFile;
	QList<PAudioParams> localAudioParams;
	QList<PVideoParams> localVideoParams;
	QList<PPayloadInfo> localAudioPayloadInfo;
	QList<PPayloadInfo> localVideoPayloadInfo;
	QList<PPayloadInfo> remoteAudioPayloadInfo;
	QList<PPayloadInfo> remoteVideoPayloadInfo;
	int maxbitrate;

	// read-only
	bool canTransmitAudio;
	bool canTransmitVideo;
	int outputVolume;
	int inputVolume;
	int error;

	RtpWorker(GMainContext *mainContext);
	~RtpWorker();

	void start(); // must wait until cb_started before calling update
	void update(); // must wait until cb_updated before calling update
	void transmitAudio();
	void transmitVideo();
	void pauseAudio();
	void pauseVideo();
	void stop(); // can be called at any time after calling start

	// the rtp input functions are safe to call from any thread
	void rtpAudioIn(const PRtpPacket &packet);
	void rtpVideoIn(const PRtpPacket &packet);

	void setOutputVolume(int level);
	void setInputVolume(int level);

	void recordStart();
	void recordStop();

	// callbacks
	void (*cb_started)(void *app);
	void (*cb_updated)(void *app);
	void (*cb_stopped)(void *app);
	void (*cb_finished)(void *app);
	void (*cb_error)(void *app);
	void (*cb_audioOutputIntensity)(int value, void *app);
	void (*cb_audioInputIntensity)(int value, void *app);

	// callbacks - from alternate thread, be safe!
	// also, it is not safe to assign callbacks except before starting
	void (*cb_previewFrame)(const Frame &frame, void *app);
	void (*cb_outputFrame)(const Frame &frame, void *app);
	void (*cb_rtpAudioOut)(const PRtpPacket &packet, void *app);
	void (*cb_rtpVideoOut)(const PRtpPacket &packet, void *app);

	// empty record packet = EOF/error
	void (*cb_recordData)(const QByteArray &packet, void *app);

private:
	GMainContext *mainContext_;
	GSource *timer;
	PipelineDeviceContext *pd_audiosrc, *pd_videosrc, *pd_audiosink;
	GstElement *sendbin, *recvbin;
	GstElement *fileDemux;
	GstElement *audiosrc;
	GstElement *videosrc;
	GstElement *audiortpsrc;
	GstElement *videortpsrc;
	GstElement *audiortppay;
	GstElement *videortppay;
	GstElement *volumein;
	GstElement *volumeout;
	bool rtpaudioout;
	bool rtpvideoout;
	QMutex audiortpsrc_mutex;
	QMutex videortpsrc_mutex;
	QMutex volumein_mutex;
	QMutex volumeout_mutex;
	QMutex rtpaudioout_mutex;
	QMutex rtpvideoout_mutex;
	//GSource *recordTimer;
	QList<PPayloadInfo> actual_localAudioPayloadInfo;
	QList<PPayloadInfo> actual_localVideoPayloadInfo;
	QList<PPayloadInfo> actual_remoteAudioPayloadInfo;
	QList<PPayloadInfo> actual_remoteVideoPayloadInfo;
	Stats *audioStats;
	Stats *videoStats;

	void cleanup();

	static gboolean cb_doStart(gpointer data);
	static gboolean cb_doUpdate(gpointer data);
	static gboolean cb_doStop(gpointer data);
	static void cb_fileDemux_no_more_pads(GstElement *element, gpointer data);
	static void cb_fileDemux_pad_added(GstElement *element, GstPad *pad, gpointer data);
	static void cb_fileDemux_pad_removed(GstElement *element, GstPad *pad, gpointer data);
	static gboolean cb_bus_call(GstBus *bus, GstMessage *msg, gpointer data);
	static void cb_show_frame_preview(int width, int height, const unsigned char *rgb32, gpointer data);
	static void cb_show_frame_output(int width, int height, const unsigned char *rgb32, gpointer data);
	static void cb_packet_ready_rtp_audio(const unsigned char *buf, int size, gpointer data);
	static void cb_packet_ready_rtp_video(const unsigned char *buf, int size, gpointer data);
	static gboolean cb_fileReady(gpointer data);

	gboolean doStart();
	gboolean doUpdate();
	gboolean doStop();
	void fileDemux_no_more_pads(GstElement *element);
	void fileDemux_pad_added(GstElement *element, GstPad *pad);
	void fileDemux_pad_removed(GstElement *element, GstPad *pad);
	gboolean bus_call(GstBus *bus, GstMessage *msg);
	void show_frame_preview(int width, int height, const unsigned char *rgb32);
	void show_frame_output(int width, int height, const unsigned char *rgb32);
	void packet_ready_rtp_audio(const unsigned char *buf, int size);
	void packet_ready_rtp_video(const unsigned char *buf, int size);
	gboolean fileReady();
	bool setupSendRecv();
	bool startSend();
	bool startSend(int rate);
	bool startRecv();
	bool addAudioChain();
	bool addAudioChain(int rate);
	bool addVideoChain();
	bool getCaps();
	bool updateTheoraConfig();
};

}

#endif

psimedia-master/gstprovider/rwcontrol.cpp

/*
 * Copyright (C) 2008-2009 Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301 USA
 *
 */

#include "rwcontrol.h"

#include <QPointer>
#include "gstthread.h"
#include "rtpworker.h"

// note: queuing frames doesn't really make much sense, since if the UI
// receives 5 frames at once, they'll just get painted on each other in
// succession and you'd only really see the last one. however, we'll queue
// frames in case we ever want to do timestamped frames.
#define QUEUE_FRAME_MAX 10

namespace PsiMedia {

static int queuedFrameInfo(const QList<RwControlMessage*> &list, RwControlFrame::Type type, int *firstPos)
{
	int count = 0;
	bool first = true;
	for(int n = 0; n < list.count(); ++n)
	{
		const RwControlMessage *msg = list[n];
		if(msg->type == RwControlMessage::Frame && ((RwControlFrameMessage *)msg)->frame.type == type)
		{
			if(first)
				*firstPos = n;
			++count;
			first = false;
		}
	}
	return count;
}

static RwControlFrameMessage *getLatestFrameAndRemoveOthers(QList<RwControlMessage*> *list, RwControlFrame::Type type)
{
	RwControlFrameMessage *fmsg = 0;
	for(int n = 0; n < list->count(); ++n)
	{
		RwControlMessage *msg = list->at(n);
		if(msg->type == RwControlMessage::Frame && ((RwControlFrameMessage *)msg)->frame.type == type)
		{
			// if we already had a frame, discard it and take the next
			if(fmsg)
				delete fmsg;
			fmsg = (RwControlFrameMessage *)msg;
			list->removeAt(n);
			--n; // adjust position
		}
	}
	return fmsg;
}

static RwControlAudioIntensityMessage *getLatestAudioIntensityAndRemoveOthers(QList<RwControlMessage*> *list, RwControlAudioIntensity::Type type)
{
	RwControlAudioIntensityMessage *amsg = 0;
	for(int n = 0; n < list->count(); ++n)
	{
		RwControlMessage *msg = list->at(n);
		if(msg->type == RwControlMessage::AudioIntensity && ((RwControlAudioIntensityMessage *)msg)->intensity.type == type)
		{
			// if we already had a msg, discard it and take the next
			if(amsg)
				delete amsg;
			amsg = (RwControlAudioIntensityMessage *)msg;
			list->removeAt(n);
			--n; // adjust position
		}
	}
	return amsg;
}

static void simplifyQueue(QList<RwControlMessage*> *list)
{
	// is there a stop message?
	int at = -1;
	for(int n = 0; n < list->count(); ++n)
	{
		if(list->at(n)->type == RwControlMessage::Stop)
		{
			at = n;
			break;
		}
	}

	// if there is, remove all messages after it
	if(at != -1)
	{
		for(int n = at + 1; n < list->count();)
			list->removeAt(n);
	}
}

static RwControlStatusMessage *statusFromWorker(RtpWorker *worker)
{
	RwControlStatusMessage *msg = new RwControlStatusMessage;
	msg->status.localAudioParams = worker->localAudioParams;
	msg->status.localVideoParams = worker->localVideoParams;
	msg->status.localAudioPayloadInfo = worker->localAudioPayloadInfo;
	msg->status.localVideoPayloadInfo = worker->localVideoPayloadInfo;
	msg->status.canTransmitAudio = worker->canTransmitAudio;
	msg->status.canTransmitVideo = worker->canTransmitVideo;
	return msg;
}

static void applyDevicesToWorker(RtpWorker *worker, const RwControlConfigDevices &devices)
{
	worker->aout = devices.audioOutId;
	worker->ain = devices.audioInId;
	worker->vin = devices.videoInId;
	worker->infile = devices.fileNameIn;
	worker->indata = devices.fileDataIn;
	worker->loopFile = devices.loopFile;
	worker->setOutputVolume(devices.audioOutVolume);
	worker->setInputVolume(devices.audioInVolume);
}

static void applyCodecsToWorker(RtpWorker *worker, const RwControlConfigCodecs &codecs)
{
	if(codecs.useLocalAudioParams)
		worker->localAudioParams = codecs.localAudioParams;
	if(codecs.useLocalVideoParams)
		worker->localVideoParams = codecs.localVideoParams;
	if(codecs.useRemoteAudioPayloadInfo)
		worker->remoteAudioPayloadInfo = codecs.remoteAudioPayloadInfo;
	if(codecs.useRemoteVideoPayloadInfo)
		worker->remoteVideoPayloadInfo = codecs.remoteVideoPayloadInfo;
	worker->maxbitrate = codecs.maximumSendingBitrate;
}

//----------------------------------------------------------------------------
// RwControlLocal
//----------------------------------------------------------------------------
RwControlLocal::RwControlLocal(GstThread
*thread, QObject *parent) : QObject(parent), app(0), cb_rtpAudioOut(0), cb_rtpVideoOut(0), cb_recordData(0), wake_pending(false) { thread_ = thread; remote_ = 0; // create RwControlRemote, block until ready QMutexLocker locker(&m); timer = g_timeout_source_new(0); g_source_set_callback(timer, cb_doCreateRemote, this, NULL); g_source_attach(timer, thread_->mainContext()); w.wait(&m); } RwControlLocal::~RwControlLocal() { // delete RwControlRemote, block until done QMutexLocker locker(&m); timer = g_timeout_source_new(0); g_source_set_callback(timer, cb_doDestroyRemote, this, NULL); g_source_attach(timer, thread_->mainContext()); w.wait(&m); qDeleteAll(in); } void RwControlLocal::start(const RwControlConfigDevices &devices, const RwControlConfigCodecs &codecs) { RwControlStartMessage *msg = new RwControlStartMessage; msg->devices = devices; msg->codecs = codecs; remote_->postMessage(msg); } void RwControlLocal::stop() { RwControlStopMessage *msg = new RwControlStopMessage; remote_->postMessage(msg); } void RwControlLocal::updateDevices(const RwControlConfigDevices &devices) { RwControlUpdateDevicesMessage *msg = new RwControlUpdateDevicesMessage; msg->devices = devices; remote_->postMessage(msg); } void RwControlLocal::updateCodecs(const RwControlConfigCodecs &codecs) { RwControlUpdateCodecsMessage *msg = new RwControlUpdateCodecsMessage; msg->codecs = codecs; remote_->postMessage(msg); } void RwControlLocal::setTransmit(const RwControlTransmit &transmit) { RwControlTransmitMessage *msg = new RwControlTransmitMessage; msg->transmit = transmit; remote_->postMessage(msg); } void RwControlLocal::setRecord(const RwControlRecord &record) { RwControlRecordMessage *msg = new RwControlRecordMessage; msg->record = record; remote_->postMessage(msg); } void RwControlLocal::rtpAudioIn(const PRtpPacket &packet) { remote_->rtpAudioIn(packet); } void RwControlLocal::rtpVideoIn(const PRtpPacket &packet) { remote_->rtpVideoIn(packet); } // note: this is executed in the remote thread 
gboolean RwControlLocal::cb_doCreateRemote(gpointer data)
{
	return ((RwControlLocal *)data)->doCreateRemote();
}

// note: this is executed in the remote thread
gboolean RwControlLocal::doCreateRemote()
{
	QMutexLocker locker(&m);
	timer = 0;
	remote_ = new RwControlRemote(thread_->mainContext(), this);
	w.wakeOne();
	return FALSE;
}

// note: this is executed in the remote thread
gboolean RwControlLocal::cb_doDestroyRemote(gpointer data)
{
	return ((RwControlLocal *)data)->doDestroyRemote();
}

// note: this is executed in the remote thread
gboolean RwControlLocal::doDestroyRemote()
{
	QMutexLocker locker(&m);
	timer = 0;
	delete remote_;
	remote_ = 0;
	w.wakeOne();
	return FALSE;
}

void RwControlLocal::processMessages()
{
	in_mutex.lock();
	wake_pending = false;
	QList<RwControlMessage*> list = in;
	in.clear();
	in_mutex.unlock();

	QPointer<RwControlLocal> self = this;

	// we only care about the latest preview frame
	RwControlFrameMessage *fmsg;
	fmsg = getLatestFrameAndRemoveOthers(&list, RwControlFrame::Preview);
	if(fmsg)
	{
		QImage i = fmsg->frame.image;
		delete fmsg;
		emit previewFrame(i);
		if(!self)
		{
			qDeleteAll(list);
			return;
		}
	}

	// we only care about the latest output frame
	fmsg = getLatestFrameAndRemoveOthers(&list, RwControlFrame::Output);
	if(fmsg)
	{
		QImage i = fmsg->frame.image;
		delete fmsg;
		emit outputFrame(i);
		if(!self)
		{
			qDeleteAll(list);
			return;
		}
	}

	// we only care about the latest audio output intensity
	RwControlAudioIntensityMessage *amsg = getLatestAudioIntensityAndRemoveOthers(&list, RwControlAudioIntensity::Output);
	if(amsg)
	{
		int i = amsg->intensity.value;
		delete amsg;
		emit audioOutputIntensityChanged(i);
		if(!self)
		{
			qDeleteAll(list);
			return;
		}
	}

	// we only care about the latest audio input intensity
	amsg = getLatestAudioIntensityAndRemoveOthers(&list, RwControlAudioIntensity::Input);
	if(amsg)
	{
		int i = amsg->intensity.value;
		delete amsg;
		emit audioInputIntensityChanged(i);
		if(!self)
		{
			qDeleteAll(list);
			return;
		}
	}

	// process the remaining messages
	while(!list.isEmpty())
	{
		RwControlMessage *msg =
list.takeFirst(); if(msg->type == RwControlMessage::Status) { RwControlStatusMessage *smsg = (RwControlStatusMessage *)msg; RwControlStatus status = smsg->status; delete smsg; emit statusReady(status); if(!self) { qDeleteAll(list); return; } } else delete msg; } } // note: this may be called from the remote thread void RwControlLocal::postMessage(RwControlMessage *msg) { QMutexLocker locker(&in_mutex); // if this is a frame, and the queue is maxed, then bump off the // oldest frame to make room if(msg->type == RwControlMessage::Frame) { RwControlFrameMessage *fmsg = (RwControlFrameMessage *)msg; int firstPos = -1; if(queuedFrameInfo(in, fmsg->frame.type, &firstPos) >= QUEUE_FRAME_MAX) in.removeAt(firstPos); } in += msg; if(!wake_pending) { QMetaObject::invokeMethod(this, "processMessages", Qt::QueuedConnection); wake_pending = true; } } //---------------------------------------------------------------------------- // RwControlRemote //---------------------------------------------------------------------------- RwControlRemote::RwControlRemote(GMainContext *mainContext, RwControlLocal *local) : timer(0), start_requested(false), blocking(false), pending_status(false) { mainContext_ = mainContext; local_ = local; worker = new RtpWorker(mainContext_); worker->app = this; worker->cb_started = cb_worker_started; worker->cb_updated = cb_worker_updated; worker->cb_stopped = cb_worker_stopped; worker->cb_finished = cb_worker_finished; worker->cb_error = cb_worker_error; worker->cb_audioOutputIntensity = cb_worker_audioOutputIntensity; worker->cb_audioInputIntensity = cb_worker_audioInputIntensity; worker->cb_previewFrame = cb_worker_previewFrame; worker->cb_outputFrame = cb_worker_outputFrame; worker->cb_rtpAudioOut = cb_worker_rtpAudioOut; worker->cb_rtpVideoOut = cb_worker_rtpVideoOut; worker->cb_recordData = cb_worker_recordData; } RwControlRemote::~RwControlRemote() { delete worker; qDeleteAll(in); } gboolean RwControlRemote::cb_processMessages(gpointer data) { return 
((RwControlRemote *)data)->processMessages(); } void RwControlRemote::cb_worker_started(void *app) { ((RwControlRemote *)app)->worker_started(); } void RwControlRemote::cb_worker_updated(void *app) { ((RwControlRemote *)app)->worker_updated(); } void RwControlRemote::cb_worker_stopped(void *app) { ((RwControlRemote *)app)->worker_stopped(); } void RwControlRemote::cb_worker_finished(void *app) { ((RwControlRemote *)app)->worker_finished(); } void RwControlRemote::cb_worker_error(void *app) { ((RwControlRemote *)app)->worker_error(); } void RwControlRemote::cb_worker_audioOutputIntensity(int value, void *app) { ((RwControlRemote *)app)->worker_audioOutputIntensity(value); } void RwControlRemote::cb_worker_audioInputIntensity(int value, void *app) { ((RwControlRemote *)app)->worker_audioInputIntensity(value); } void RwControlRemote::cb_worker_previewFrame(const RtpWorker::Frame &frame, void *app) { ((RwControlRemote *)app)->worker_previewFrame(frame); } void RwControlRemote::cb_worker_outputFrame(const RtpWorker::Frame &frame, void *app) { ((RwControlRemote *)app)->worker_outputFrame(frame); } void RwControlRemote::cb_worker_rtpAudioOut(const PRtpPacket &packet, void *app) { ((RwControlRemote *)app)->worker_rtpAudioOut(packet); } void RwControlRemote::cb_worker_rtpVideoOut(const PRtpPacket &packet, void *app) { ((RwControlRemote *)app)->worker_rtpVideoOut(packet); } void RwControlRemote::cb_worker_recordData(const QByteArray &packet, void *app) { ((RwControlRemote *)app)->worker_recordData(packet); } gboolean RwControlRemote::processMessages() { m.lock(); timer = 0; m.unlock(); while(1) { m.lock(); if(in.isEmpty()) { m.unlock(); break; } // if there is a stop message in the queue, remove all others // because they are unnecessary simplifyQueue(&in); RwControlMessage *msg = in.takeFirst(); m.unlock(); bool ret = processMessage(msg); delete msg; if(!ret) { m.lock(); blocking = true; if(timer) { g_source_destroy(timer); timer = 0; } m.unlock(); break; } } return FALSE; 
} bool RwControlRemote::processMessage(RwControlMessage *msg) { if(msg->type == RwControlMessage::Start) { RwControlStartMessage *smsg = (RwControlStartMessage *)msg; applyDevicesToWorker(worker, smsg->devices); applyCodecsToWorker(worker, smsg->codecs); start_requested = true; pending_status = true; worker->start(); return false; } else if(msg->type == RwControlMessage::Stop) { RwControlStopMessage *smsg = (RwControlStopMessage *)msg; Q_UNUSED(smsg); if(start_requested) { pending_status = true; worker->stop(); } else { // this can happen if we stop before we even start. // just send back a stopped status and don't muck // with the worker. RwControlStatusMessage *msg = new RwControlStatusMessage; msg->status.stopped = true; local_->postMessage(msg); } return false; } else if(msg->type == RwControlMessage::UpdateDevices) { RwControlUpdateDevicesMessage *umsg = (RwControlUpdateDevicesMessage *)msg; applyDevicesToWorker(worker, umsg->devices); worker->update(); return false; } else if(msg->type == RwControlMessage::UpdateCodecs) { RwControlUpdateCodecsMessage *umsg = (RwControlUpdateCodecsMessage *)msg; applyCodecsToWorker(worker, umsg->codecs); pending_status = true; worker->update(); return false; } else if(msg->type == RwControlMessage::Transmit) { RwControlTransmitMessage *tmsg = (RwControlTransmitMessage *)msg; if(tmsg->transmit.useAudio) worker->transmitAudio(); else worker->pauseAudio(); if(tmsg->transmit.useVideo) worker->transmitVideo(); else worker->pauseVideo(); } else if(msg->type == RwControlMessage::Record) { RwControlRecordMessage *rmsg = (RwControlRecordMessage *)msg; if(rmsg->record.enabled) worker->recordStart(); else worker->recordStop(); } return true; } void RwControlRemote::worker_started() { pending_status = false; RwControlStatusMessage *msg = statusFromWorker(worker); local_->postMessage(msg); resumeMessages(); } void RwControlRemote::worker_updated() { // only reply with status message if we were asking for one if(pending_status) { 
pending_status = false; RwControlStatusMessage *msg = statusFromWorker(worker); local_->postMessage(msg); } resumeMessages(); } void RwControlRemote::worker_stopped() { pending_status = false; RwControlStatusMessage *msg = statusFromWorker(worker); msg->status.stopped = true; local_->postMessage(msg); } void RwControlRemote::worker_finished() { RwControlStatusMessage *msg = statusFromWorker(worker); msg->status.finished = true; local_->postMessage(msg); } void RwControlRemote::worker_error() { RwControlStatusMessage *msg = statusFromWorker(worker); msg->status.error = true; msg->status.errorCode = worker->error; local_->postMessage(msg); } void RwControlRemote::worker_audioOutputIntensity(int value) { RwControlAudioIntensityMessage *msg = new RwControlAudioIntensityMessage; msg->intensity.type = RwControlAudioIntensity::Output; msg->intensity.value = value; local_->postMessage(msg); } void RwControlRemote::worker_audioInputIntensity(int value) { RwControlAudioIntensityMessage *msg = new RwControlAudioIntensityMessage; msg->intensity.type = RwControlAudioIntensity::Input; msg->intensity.value = value; local_->postMessage(msg); } void RwControlRemote::worker_previewFrame(const RtpWorker::Frame &frame) { RwControlFrameMessage *msg = new RwControlFrameMessage; msg->frame.type = RwControlFrame::Preview; msg->frame.image = frame.image; local_->postMessage(msg); } void RwControlRemote::worker_outputFrame(const RtpWorker::Frame &frame) { RwControlFrameMessage *msg = new RwControlFrameMessage; msg->frame.type = RwControlFrame::Output; msg->frame.image = frame.image; local_->postMessage(msg); } void RwControlRemote::worker_rtpAudioOut(const PRtpPacket &packet) { if(local_->cb_rtpAudioOut) local_->cb_rtpAudioOut(packet, local_->app); } void RwControlRemote::worker_rtpVideoOut(const PRtpPacket &packet) { if(local_->cb_rtpVideoOut) local_->cb_rtpVideoOut(packet, local_->app); } void RwControlRemote::worker_recordData(const QByteArray &packet) { if(local_->cb_recordData) 
		local_->cb_recordData(packet, local_->app);
}

void RwControlRemote::resumeMessages()
{
	QMutexLocker locker(&m);
	if(blocking)
	{
		blocking = false;
		if(!in.isEmpty() && !timer)
		{
			timer = g_timeout_source_new(0);
			g_source_set_callback(timer, cb_processMessages, this, NULL);
			g_source_attach(timer, mainContext_);
		}
	}
}

// note: this may be called from the local thread
void RwControlRemote::postMessage(RwControlMessage *msg)
{
	QMutexLocker locker(&m);

	// if a stop message is sent, unblock so that it can get processed.
	// this is so we can stop a session that is in the middle of
	// starting. note: care must be taken in the message handler, as
	// this will cause processing to resume before resumeMessages() has
	// been called.
	if(msg->type == RwControlMessage::Stop)
		blocking = false;

	in += msg;
	if(!blocking && !timer)
	{
		timer = g_timeout_source_new(0);
		g_source_set_callback(timer, cb_processMessages, this, NULL);
		g_source_attach(timer, mainContext_);
	}
}

// note: this may be called from the local thread
void RwControlRemote::rtpAudioIn(const PRtpPacket &packet)
{
	worker->rtpAudioIn(packet);
}

// note: this may be called from the local thread
void RwControlRemote::rtpVideoIn(const PRtpPacket &packet)
{
	worker->rtpVideoIn(packet);
}

}

psimedia-master/gstprovider/rwcontrol.h

/*
 * Copyright (C) 2008-2009 Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301 USA
 *
 */

#ifndef RWCONTROL_H
#define RWCONTROL_H

#include <QObject>
#include <QString>
#include <QByteArray>
#include <QImage>
#include <QList>
#include <QMutex>
#include <QWaitCondition>
#include <glib.h>
#include "psimediaprovider.h"
#include "rtpworker.h"

namespace PsiMedia {

// These classes allow controlling RtpWorker from across the qt<->glib
// thread boundary.
//
// RwControlLocal  - object to live in "local" Qt eventloop
// RwControlRemote - object to live in "remote" glib eventloop
//
// When RwControlLocal is created, you pass it the GstThread. The constructor
// atomically creates a corresponding RwControlRemote in the remote thread and
// associates the two objects.
//
// The possible exchanges are made clear here. Things you can do:
//
// - Start a session. This requires device and codec configuration to begin.
//   This operation is a transaction, you'll receive a status message when it
//   completes.
//
// - Stop a session. This operation is a transaction, you'll receive a
//   status message when it completes.
//
// - Update complete device configuration. This is fire and forget.
//   Eventually it will take effect, and you won't be notified when it
//   happens. From a local standpoint you simply assume it took effect
//   immediately.
//
// - Update codec configuration. This is a transaction, you'll receive a
//   status message when it completes.
//
// - Transmit/pause the audio/video streams. This is fire and forget.
//
// - Start/stop recording a session. For starting, this is somewhat fire
//   and forget. You'll eventually start receiving data packets, but the
//   assumption is that recording is occurring even before the first packet
//   is received. For stopping, this is somewhat transactional. The record
//   is not considered stopped until an EOF packet is received.
//
// - At any time, it is possible to receive a spontaneous status message.
//   This is to indicate an error or a completed file playback.
//
// - Preview and output video frames are signaled normally and are intended
//   for immediate display.
//
// - RTP packets and recording data bypass the event-based message-passing
//   mechanisms described above. Instead, special methods and callbacks are
//   used which require special care.

class GstThread;
class RwControlRemote;

class RwControlConfigDevices
{
public:
	QString audioOutId;
	QString audioInId;
	QString videoInId;
	QString fileNameIn;
	QByteArray fileDataIn;
	bool loopFile;
	bool useVideoPreview;
	bool useVideoOut;
	int audioOutVolume;
	int audioInVolume;

	RwControlConfigDevices() :
		loopFile(false),
		useVideoPreview(false),
		useVideoOut(false),
		audioOutVolume(-1),
		audioInVolume(-1)
	{
	}
};

class RwControlConfigCodecs
{
public:
	bool useLocalAudioParams;
	bool useLocalVideoParams;
	bool useRemoteAudioPayloadInfo;
	bool useRemoteVideoPayloadInfo;

	QList<PAudioParams> localAudioParams;
	QList<PVideoParams> localVideoParams;
	QList<PPayloadInfo> remoteAudioPayloadInfo;
	QList<PPayloadInfo> remoteVideoPayloadInfo;

	int maximumSendingBitrate;

	RwControlConfigCodecs() :
		useLocalAudioParams(false),
		useLocalVideoParams(false),
		useRemoteAudioPayloadInfo(false),
		useRemoteVideoPayloadInfo(false),
		maximumSendingBitrate(-1)
	{
	}
};

class RwControlTransmit
{
public:
	bool useAudio;
	bool useVideo;

	RwControlTransmit() :
		useAudio(false),
		useVideo(false)
	{
	}
};

class RwControlRecord
{
public:
	bool enabled;

	RwControlRecord() :
		enabled(false)
	{
	}
};

// note: if this is received spontaneously, then only finished, error, and
// errorCode are valid
class RwControlStatus
{
public:
	QList<PAudioParams> localAudioParams;
	QList<PVideoParams> localVideoParams;
	QList<PPayloadInfo> localAudioPayloadInfo;
	QList<PPayloadInfo> localVideoPayloadInfo;
	QList<PPayloadInfo> remoteAudioPayloadInfo;
	QList<PPayloadInfo> remoteVideoPayloadInfo;
	bool canTransmitAudio;
	bool canTransmitVideo;
	bool stopped;
	bool finished;
	bool error;
	int errorCode;

	RwControlStatus() :
		canTransmitAudio(false),
		canTransmitVideo(false),
stopped(false), finished(false), error(false), errorCode(-1) { } }; class RwControlAudioIntensity { public: enum Type { Output, Input }; Type type; int value; RwControlAudioIntensity() : type((Type)-1), value(-1) { } }; // always remote -> local, for internal use class RwControlFrame { public: enum Type { Preview, Output }; Type type; QImage image; }; // internal class RwControlMessage { public: enum Type { Start, Stop, UpdateDevices, UpdateCodecs, Transmit, Record, Status, AudioIntensity, Frame }; Type type; RwControlMessage(Type _type) : type(_type) { } virtual ~RwControlMessage() { } }; class RwControlStartMessage : public RwControlMessage { public: RwControlConfigDevices devices; RwControlConfigCodecs codecs; RwControlStartMessage() : RwControlMessage(RwControlMessage::Start) { } }; class RwControlStopMessage : public RwControlMessage { public: RwControlStopMessage() : RwControlMessage(RwControlMessage::Stop) { } }; class RwControlUpdateDevicesMessage : public RwControlMessage { public: RwControlConfigDevices devices; RwControlUpdateDevicesMessage() : RwControlMessage(RwControlMessage::UpdateDevices) { } }; class RwControlUpdateCodecsMessage : public RwControlMessage { public: RwControlConfigCodecs codecs; RwControlUpdateCodecsMessage() : RwControlMessage(RwControlMessage::UpdateCodecs) { } }; class RwControlTransmitMessage : public RwControlMessage { public: RwControlTransmit transmit; RwControlTransmitMessage() : RwControlMessage(RwControlMessage::Transmit) { } }; class RwControlRecordMessage : public RwControlMessage { public: RwControlRecord record; RwControlRecordMessage() : RwControlMessage(RwControlMessage::Record) { } }; class RwControlStatusMessage : public RwControlMessage { public: RwControlStatus status; RwControlStatusMessage() : RwControlMessage(RwControlMessage::Status) { } }; class RwControlAudioIntensityMessage : public RwControlMessage { public: RwControlAudioIntensity intensity; RwControlAudioIntensityMessage() : 
RwControlMessage(RwControlMessage::AudioIntensity) { } }; class RwControlFrameMessage : public RwControlMessage { public: RwControlFrame frame; RwControlFrameMessage() : RwControlMessage(RwControlMessage::Frame) { } }; class RwControlLocal : public QObject { Q_OBJECT public: RwControlLocal(GstThread *thread, QObject *parent = 0); ~RwControlLocal(); void start(const RwControlConfigDevices &devices, const RwControlConfigCodecs &codecs); void stop(); // if called, may still receive many status messages before stopped void updateDevices(const RwControlConfigDevices &devices); void updateCodecs(const RwControlConfigCodecs &codecs); void setTransmit(const RwControlTransmit &transmit); void setRecord(const RwControlRecord &record); // can be called from any thread void rtpAudioIn(const PRtpPacket &packet); void rtpVideoIn(const PRtpPacket &packet); // can come from any thread. // note that it is only safe to assign callbacks prior to starting. // note if the stream is stopped while recording is active, then // stopped status will not be reported until EOF is delivered. 
void *app; void (*cb_rtpAudioOut)(const PRtpPacket &packet, void *app); void (*cb_rtpVideoOut)(const PRtpPacket &packet, void *app); void (*cb_recordData)(const QByteArray &packet, void *app); signals: // response to start, stop, updateCodecs, or it could be spontaneous void statusReady(const RwControlStatus &status); void previewFrame(const QImage &img); void outputFrame(const QImage &img); void audioOutputIntensityChanged(int intensity); void audioInputIntensityChanged(int intensity); private slots: void processMessages(); private: GstThread *thread_; GSource *timer; QMutex m; QWaitCondition w; RwControlRemote *remote_; bool wake_pending; QMutex in_mutex; QList in; static gboolean cb_doCreateRemote(gpointer data); static gboolean cb_doDestroyRemote(gpointer data); gboolean doCreateRemote(); gboolean doDestroyRemote(); friend class RwControlRemote; void postMessage(RwControlMessage *msg); }; class RwControlRemote { public: RwControlRemote(GMainContext *mainContext, RwControlLocal *local); ~RwControlRemote(); private: GSource *timer; GMainContext *mainContext_; QMutex m; RwControlLocal *local_; bool start_requested; bool blocking; bool pending_status; RtpWorker *worker; QList in; static gboolean cb_processMessages(gpointer data); static void cb_worker_started(void *app); static void cb_worker_updated(void *app); static void cb_worker_stopped(void *app); static void cb_worker_finished(void *app); static void cb_worker_error(void *app); static void cb_worker_audioOutputIntensity(int value, void *app); static void cb_worker_audioInputIntensity(int value, void *app); static void cb_worker_previewFrame(const RtpWorker::Frame &frame, void *app); static void cb_worker_outputFrame(const RtpWorker::Frame &frame, void *app); static void cb_worker_rtpAudioOut(const PRtpPacket &packet, void *app); static void cb_worker_rtpVideoOut(const PRtpPacket &packet, void *app); static void cb_worker_recordData(const QByteArray &packet, void *app); gboolean processMessages(); void 
worker_started(); void worker_updated(); void worker_stopped(); void worker_finished(); void worker_error(); void worker_audioOutputIntensity(int value); void worker_audioInputIntensity(int value); void worker_previewFrame(const RtpWorker::Frame &frame); void worker_outputFrame(const RtpWorker::Frame &frame); void worker_rtpAudioOut(const PRtpPacket &packet); void worker_rtpVideoOut(const PRtpPacket &packet); void worker_recordData(const QByteArray &packet); void resumeMessages(); // return false to block further message processing bool processMessage(RwControlMessage *msg); friend class RwControlLocal; void postMessage(RwControlMessage *msg); void rtpAudioIn(const PRtpPacket &packet); void rtpVideoIn(const PRtpPacket &packet); }; } #endif

psimedia-master/psimedia.pro:
TEMPLATE = subdirs sub_gstelements.subdir = gstprovider/gstelements/static sub_demo.subdir = demo sub_gstprovider.subdir = gstprovider sub_gstprovider.depends = sub_gstelements SUBDIRS += sub_gstelements SUBDIRS += sub_demo SUBDIRS += sub_gstprovider

psimedia-master/psimedia.qc:
PsiMedia psimedia.pro qcm

psimedia-master/psimedia/psimedia.cpp:
/* * Copyright (C) 2008-2009 Barracuda Networks, Inc. * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Lesser General Public * License as published by the Free Software Foundation; either * version 2.1 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
See the GNU * Lesser General Public License for more details. * * You should have received a copy of the GNU Lesser General Public * License along with this library; if not, write to the Free Software * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA * 02110-1301 USA * */ #include "psimedia.h" #include <QCoreApplication> #include <QPluginLoader> #ifdef QT_GUI_LIB #include <QPainter> #endif #include "psimediaprovider.h" namespace PsiMedia { static AudioParams importAudioParams(const PAudioParams &pp) { AudioParams out; out.setCodec(pp.codec); out.setSampleRate(pp.sampleRate); out.setSampleSize(pp.sampleSize); out.setChannels(pp.channels); return out; } static PAudioParams exportAudioParams(const AudioParams &p) { PAudioParams out; out.codec = p.codec(); out.sampleRate = p.sampleRate(); out.sampleSize = p.sampleSize(); out.channels = p.channels(); return out; } static VideoParams importVideoParams(const PVideoParams &pp) { VideoParams out; out.setCodec(pp.codec); out.setSize(pp.size); out.setFps(pp.fps); return out; } static PVideoParams exportVideoParams(const VideoParams &p) { PVideoParams out; out.codec = p.codec(); out.size = p.size(); out.fps = p.fps(); return out; } static PayloadInfo importPayloadInfo(const PPayloadInfo &pp) { PayloadInfo out; out.setId(pp.id); out.setName(pp.name); out.setClockrate(pp.clockrate); out.setChannels(pp.channels); out.setPtime(pp.ptime); out.setMaxptime(pp.maxptime); QList<PayloadInfo::Parameter> list; foreach(const PPayloadInfo::Parameter &pi, pp.parameters) { PayloadInfo::Parameter i; i.name = pi.name; i.value = pi.value; list += i; } out.setParameters(list); return out; } static PPayloadInfo exportPayloadInfo(const PayloadInfo &p) { PPayloadInfo out; out.id = p.id(); out.name = p.name(); out.clockrate = p.clockrate(); out.channels = p.channels(); out.ptime = p.ptime(); out.maxptime = p.maxptime(); QList<PPayloadInfo::Parameter> list; foreach(const PayloadInfo::Parameter &i, p.parameters()) { PPayloadInfo::Parameter pi; pi.name = i.name; pi.value = i.value; list += pi; } out.parameters = list; return out; }
//---------------------------------------------------------------------------- // Global //---------------------------------------------------------------------------- static Provider *g_provider = 0; static QPluginLoader *g_pluginLoader = 0; static void cleanupProvider(); static Provider *provider() { if(!g_provider) { // static plugin around? Provider *provider = 0; QObjectList list = QPluginLoader::staticInstances(); foreach(QObject *obj, list) { Plugin *instance = qobject_cast<Plugin *>(obj); if(!instance) continue; Provider *p = instance->createProvider(); if(p) { provider = p; break; } } if(provider) { if(!provider->init(QString())) { delete provider; return 0; } g_provider = provider; qAddPostRoutine(cleanupProvider); } } return g_provider; } bool isSupported() { return (provider() ? true : false); } PluginResult loadPlugin(const QString &fname, const QString &resourcePath) { if(g_provider) return PluginSuccess; QPluginLoader *loader = new QPluginLoader(fname); if(!loader->load()) { delete loader; return ErrorLoad; } Plugin *instance = qobject_cast<Plugin *>(loader->instance()); if(!instance) { delete loader; return ErrorVersion; } Provider *provider = instance->createProvider(); if(!provider) { loader->unload(); delete loader; return ErrorInit; } if(!provider->init(resourcePath)) { delete provider; loader->unload(); delete loader; return ErrorInit; } g_provider = provider; g_pluginLoader = loader; qAddPostRoutine(cleanupProvider); return PluginSuccess; } void cleanupProvider() { if(!g_provider) return; delete g_provider; g_provider = 0; if(g_pluginLoader) { g_pluginLoader->unload(); delete g_pluginLoader; g_pluginLoader = 0; } } void unloadPlugin() { cleanupProvider(); } QString creditName() { return provider()->creditName(); } QString creditText() { return provider()->creditText(); } class Device::Private { public: Device::Type type; QString id; QString name; }; class Global { public: static Device importDevice(const PDevice &pd) { Device dev; dev.d = new Device::Private;
dev.d->type = (Device::Type)pd.type; dev.d->id = pd.id; dev.d->name = pd.name; return dev; } }; //---------------------------------------------------------------------------- // Device //---------------------------------------------------------------------------- Device::Device() : d(0) { } Device::Device(const Device &other) : d(other.d ? new Private(*other.d) : 0) { } Device::~Device() { delete d; } Device & Device::operator=(const Device &other) { if(d) { if(other.d) { *d = *other.d; } else { delete d; d = 0; } } else { if(other.d) d = new Private(*other.d); } return *this; } bool Device::isNull() const { return (d ? false : true); } Device::Type Device::type() const { return d->type; } QString Device::name() const { return d->name; } QString Device::id() const { return d->id; } #ifdef QT_GUI_LIB //---------------------------------------------------------------------------- // VideoWidget //---------------------------------------------------------------------------- class VideoWidgetPrivate : public QObject, public VideoWidgetContext { Q_OBJECT public: friend class VideoWidget; VideoWidget *q; QSize videoSize; VideoWidgetPrivate(VideoWidget *_q) : QObject(_q), q(_q) { } virtual QObject *qobject() { return this; } virtual QWidget *qwidget() { return q; } virtual void setVideoSize(const QSize &size) { videoSize = size; emit q->videoSizeChanged(); } signals: void resized(const QSize &newSize); void paintEvent(QPainter *p); }; VideoWidget::VideoWidget(QWidget *parent) : QWidget(parent) { d = new VideoWidgetPrivate(this); } VideoWidget::~VideoWidget() { delete d; } QSize VideoWidget::sizeHint() const { return d->videoSize; } void VideoWidget::paintEvent(QPaintEvent *event) { Q_UNUSED(event); QPainter p(this); emit d->paintEvent(&p); } void VideoWidget::resizeEvent(QResizeEvent *event) { Q_UNUSED(event); emit d->resized(size()); } #endif //---------------------------------------------------------------------------- // AudioParams 
//---------------------------------------------------------------------------- class AudioParams::Private { public: QString codec; int sampleRate; int sampleSize; int channels; Private() : sampleRate(0), sampleSize(0), channels(0) { } }; AudioParams::AudioParams() : d(new Private) { } AudioParams::AudioParams(const AudioParams &other) : d(new Private(*other.d)) { } AudioParams::~AudioParams() { delete d; } AudioParams & AudioParams::operator=(const AudioParams &other) { *d = *other.d; return *this; } QString AudioParams::codec() const { return d->codec; } int AudioParams::sampleRate() const { return d->sampleRate; } int AudioParams::sampleSize() const { return d->sampleSize; } int AudioParams::channels() const { return d->channels; } void AudioParams::setCodec(const QString &s) { d->codec = s; } void AudioParams::setSampleRate(int n) { d->sampleRate = n; } void AudioParams::setSampleSize(int n) { d->sampleSize = n; } void AudioParams::setChannels(int n) { d->channels = n; } bool AudioParams::operator==(const AudioParams &other) const { if(d->codec == other.d->codec && d->sampleRate == other.d->sampleRate && d->sampleSize == other.d->sampleSize && d->channels == other.d->channels) { return true; } else return false; } //---------------------------------------------------------------------------- // VideoParams //---------------------------------------------------------------------------- class VideoParams::Private { public: QString codec; QSize size; int fps; Private() : fps(0) { } }; VideoParams::VideoParams() : d(new Private) { } VideoParams::VideoParams(const VideoParams &other) : d(new Private(*other.d)) { } VideoParams::~VideoParams() { delete d; } VideoParams & VideoParams::operator=(const VideoParams &other) { *d = *other.d; return *this; } QString VideoParams::codec() const { return d->codec; } QSize VideoParams::size() const { return d->size; } int VideoParams::fps() const { return d->fps; } void VideoParams::setCodec(const QString &s) { d->codec = s; } 
void VideoParams::setSize(const QSize &s) { d->size = s; } void VideoParams::setFps(int n) { d->fps = n; } bool VideoParams::operator==(const VideoParams &other) const { if(d->codec == other.d->codec && d->size == other.d->size && d->fps == other.d->fps) { return true; } else return false; } //---------------------------------------------------------------------------- // Features //---------------------------------------------------------------------------- static QList importDevices(const QList &in) { QList out; foreach(const PDevice &pd, in) out += Global::importDevice(pd); return out; } static QList importAudioModes(const QList &in) { QList out; foreach(const PAudioParams &pp, in) out += importAudioParams(pp); return out; } static QList importVideoModes(const QList &in) { QList out; foreach(const PVideoParams &pp, in) out += importVideoParams(pp); return out; } class Features::Private : public QObject { Q_OBJECT public: Features *q; FeaturesContext *c; QList audioOutputDevices; QList audioInputDevices; QList videoInputDevices; QList supportedAudioModes; QList supportedVideoModes; Private(Features *_q) : QObject(_q), q(_q) { c = provider()->createFeatures(); c->qobject()->setParent(this); connect(c->qobject(), SIGNAL(finished()), SLOT(c_finished())); } ~Private() { delete c; } void clearResults() { audioOutputDevices.clear(); audioInputDevices.clear(); videoInputDevices.clear(); supportedAudioModes.clear(); supportedVideoModes.clear(); } void importResults(const PFeatures &in) { audioOutputDevices = importDevices(in.audioOutputDevices); audioInputDevices = importDevices(in.audioInputDevices); videoInputDevices = importDevices(in.videoInputDevices); supportedAudioModes = importAudioModes(in.supportedAudioModes); supportedVideoModes = importVideoModes(in.supportedVideoModes); } private slots: void c_finished() { importResults(c->results()); emit q->finished(); } }; Features::Features(QObject *parent) : QObject(parent) { d = new Private(this); } 
Features::~Features() { delete d; } void Features::lookup(int types) { int ptypes = 0; if(types & AudioOut) ptypes |= FeaturesContext::AudioOut; if(types & AudioIn) ptypes |= FeaturesContext::AudioIn; if(types & VideoIn) ptypes |= FeaturesContext::VideoIn; if(types & AudioModes) ptypes |= FeaturesContext::AudioModes; if(types & VideoModes) ptypes |= FeaturesContext::VideoModes; d->clearResults(); d->c->lookup(ptypes); } bool Features::waitForFinished(int msecs) { bool ok = d->c->waitForFinished(msecs); if(ok) d->importResults(d->c->results()); return ok; } QList Features::audioOutputDevices() { return d->audioOutputDevices; } QList Features::audioInputDevices() { return d->audioInputDevices; } QList Features::videoInputDevices() { return d->videoInputDevices; } QList Features::supportedAudioModes() { return d->supportedAudioModes; } QList Features::supportedVideoModes() { return d->supportedVideoModes; } //---------------------------------------------------------------------------- // RtpPacket //---------------------------------------------------------------------------- class RtpPacket::Private : public QSharedData { public: QByteArray rawValue; int portOffset; Private(const QByteArray &_rawValue, int _portOffset) : rawValue(_rawValue), portOffset(_portOffset) { } }; RtpPacket::RtpPacket() : d(0) { } RtpPacket::RtpPacket(const QByteArray &rawValue, int portOffset) : d(new Private(rawValue, portOffset)) { } RtpPacket::RtpPacket(const RtpPacket &other) : d(other.d) { } RtpPacket::~RtpPacket() { } RtpPacket & RtpPacket::operator=(const RtpPacket &other) { d = other.d; return *this; } bool RtpPacket::isNull() const { return (d ? 
false : true); } QByteArray RtpPacket::rawValue() const { return d->rawValue; } int RtpPacket::portOffset() const { return d->portOffset; } //---------------------------------------------------------------------------- // RtpChannel //---------------------------------------------------------------------------- class RtpChannelPrivate : public QObject { Q_OBJECT public: RtpChannel *q; RtpChannelContext *c; bool enabled; int readyReadListeners; RtpChannelPrivate(RtpChannel *_q) : QObject(_q), q(_q), c(0), enabled(false), readyReadListeners(0) { } void setContext(RtpChannelContext *_c) { if(c) { c->qobject()->disconnect(this); c->qobject()->setParent(0); enabled = false; c = 0; } if(!_c) return; c = _c; c->qobject()->setParent(this); connect(c->qobject(), SIGNAL(readyRead()), SLOT(c_readyRead())); connect(c->qobject(), SIGNAL(packetsWritten(int)), SLOT(c_packetsWritten(int))); connect(c->qobject(), SIGNAL(destroyed()), SLOT(c_destroyed())); if(readyReadListeners > 0) { enabled = true; c->setEnabled(true); } } private slots: void c_readyRead() { emit q->readyRead(); } void c_packetsWritten(int count) { emit q->packetsWritten(count); } void c_destroyed() { enabled = false; c = 0; } }; RtpChannel::RtpChannel() { d = new RtpChannelPrivate(this); } RtpChannel::~RtpChannel() { delete d; } int RtpChannel::packetsAvailable() const { if(d->c) return d->c->packetsAvailable(); else return 0; } RtpPacket RtpChannel::read() { if(d->c) { PRtpPacket pp = d->c->read(); return RtpPacket(pp.rawValue, pp.portOffset); } else return RtpPacket(); } void RtpChannel::write(const RtpPacket &rtp) { if(d->c) { if(!d->enabled) { d->enabled = true; d->c->setEnabled(true); } PRtpPacket pp; pp.rawValue = rtp.rawValue(); pp.portOffset = rtp.portOffset(); d->c->write(pp); } } void RtpChannel::connectNotify(const char *signal) { int oldtotal = d->readyReadListeners; if(QLatin1String(signal) == QMetaObject::normalizedSignature(SIGNAL(readyRead())).data()) ++d->readyReadListeners; int total = 
d->readyReadListeners; if(d->c && oldtotal == 0 && total > 0) { d->enabled = true; d->c->setEnabled(true); } } void RtpChannel::disconnectNotify(const char *signal) { int oldtotal = d->readyReadListeners; if(QLatin1String(signal) == QMetaObject::normalizedSignature(SIGNAL(readyRead())).data()) --d->readyReadListeners; int total = d->readyReadListeners; if(d->c && oldtotal > 0 && total == 0) { d->enabled = false; d->c->setEnabled(false); } } //---------------------------------------------------------------------------- // PayloadInfo //---------------------------------------------------------------------------- bool PayloadInfo::Parameter::operator==(const PayloadInfo::Parameter &other) const { // according to xep-167, parameter names are case-sensitive if(name == other.name && value == other.value) return true; else return false; } class PayloadInfo::Private { public: int id; QString name; int clockrate; int channels; int ptime; int maxptime; QList<PayloadInfo::Parameter> parameters; Private() : id(-1), clockrate(-1), channels(-1), ptime(-1), maxptime(-1) { } bool operator==(const Private &other) const { // according to xep-167, parameters are unordered if(id == other.id && name.compare(other.name, Qt::CaseInsensitive) == 0 && clockrate == other.clockrate && channels == other.channels && ptime == other.ptime && maxptime == other.maxptime && compareUnordered(parameters, other.parameters)) { return true; } else return false; } static bool compareUnordered(const QList<PayloadInfo::Parameter> &a, const QList<PayloadInfo::Parameter> &b) { if(a.count() != b.count()) return false; // for every parameter in 'a' foreach(const PayloadInfo::Parameter &p, a) { // make sure it is found in 'b' if(!b.contains(p)) return false; } return true; } }; PayloadInfo::PayloadInfo() : d(new Private) { } PayloadInfo::PayloadInfo(const PayloadInfo &other) : d(new Private(*other.d)) { } PayloadInfo::~PayloadInfo() { delete d; } PayloadInfo & PayloadInfo::operator=(const PayloadInfo &other) { *d = *other.d; return *this; } bool PayloadInfo::isNull() const { return
(d->id == -1); } int PayloadInfo::id() const { return d->id; } QString PayloadInfo::name() const { return d->name; } int PayloadInfo::clockrate() const { return d->clockrate; } int PayloadInfo::channels() const { return d->channels; } int PayloadInfo::ptime() const { return d->ptime; } int PayloadInfo::maxptime() const { return d->maxptime; } QList PayloadInfo::parameters() const { return d->parameters; } void PayloadInfo::setId(int i) { d->id = i; } void PayloadInfo::setName(const QString &str) { d->name = str; } void PayloadInfo::setClockrate(int i) { d->clockrate = i; } void PayloadInfo::setChannels(int num) { d->channels = num; } void PayloadInfo::setPtime(int i) { d->ptime = i; } void PayloadInfo::setMaxptime(int i) { d->maxptime = i; } void PayloadInfo::setParameters(const QList ¶ms) { d->parameters = params; } bool PayloadInfo::operator==(const PayloadInfo &other) const { return (*d == *other.d); } //---------------------------------------------------------------------------- // RtpSession //---------------------------------------------------------------------------- class RtpSessionPrivate : public QObject { Q_OBJECT public: RtpSession *q; RtpSessionContext *c; RtpChannel audioRtpChannel; RtpChannel videoRtpChannel; RtpSessionPrivate(RtpSession *_q) : QObject(_q), q(_q) { c = provider()->createRtpSession(); c->qobject()->setParent(this); connect(c->qobject(), SIGNAL(started()), SLOT(c_started())); connect(c->qobject(), SIGNAL(preferencesUpdated()), SLOT(c_preferencesUpdated())); connect(c->qobject(), SIGNAL(audioOutputIntensityChanged(int)), SLOT(c_audioOutputIntensityChanged(int))); connect(c->qobject(), SIGNAL(audioInputIntensityChanged(int)), SLOT(c_audioInputIntensityChanged(int))); connect(c->qobject(), SIGNAL(stoppedRecording()), SLOT(c_stoppedRecording())); connect(c->qobject(), SIGNAL(stopped()), SLOT(c_stopped())); connect(c->qobject(), SIGNAL(finished()), SLOT(c_finished())); connect(c->qobject(), SIGNAL(error()), SLOT(c_error())); } 
~RtpSessionPrivate() { delete c; } private slots: void c_started() { audioRtpChannel.d->setContext(c->audioRtpChannel()); videoRtpChannel.d->setContext(c->videoRtpChannel()); emit q->started(); } void c_preferencesUpdated() { emit q->preferencesUpdated(); } void c_audioOutputIntensityChanged(int intensity) { emit q->audioOutputIntensityChanged(intensity); } void c_audioInputIntensityChanged(int intensity) { emit q->audioInputIntensityChanged(intensity); } void c_stoppedRecording() { emit q->stoppedRecording(); } void c_stopped() { audioRtpChannel.d->setContext(0); videoRtpChannel.d->setContext(0); emit q->stopped(); } void c_finished() { audioRtpChannel.d->setContext(0); videoRtpChannel.d->setContext(0); emit q->finished(); } void c_error() { audioRtpChannel.d->setContext(0); videoRtpChannel.d->setContext(0); emit q->error(); } }; RtpSession::RtpSession(QObject *parent) : QObject(parent) { d = new RtpSessionPrivate(this); } RtpSession::~RtpSession() { delete d; } void RtpSession::reset() { delete d; d = new RtpSessionPrivate(this); } void RtpSession::setAudioOutputDevice(const QString &deviceId) { d->c->setAudioOutputDevice(deviceId); } #ifdef QT_GUI_LIB void RtpSession::setVideoOutputWidget(VideoWidget *widget) { d->c->setVideoOutputWidget(widget ? widget->d : 0); } #endif void RtpSession::setAudioInputDevice(const QString &deviceId) { d->c->setAudioInputDevice(deviceId); } void RtpSession::setVideoInputDevice(const QString &deviceId) { d->c->setVideoInputDevice(deviceId); } void RtpSession::setFileInput(const QString &fileName) { d->c->setFileInput(fileName); } void RtpSession::setFileDataInput(const QByteArray &fileData) { d->c->setFileDataInput(fileData); } void RtpSession::setFileLoopEnabled(bool enabled) { d->c->setFileLoopEnabled(enabled); } #ifdef QT_GUI_LIB void RtpSession::setVideoPreviewWidget(VideoWidget *widget) { d->c->setVideoPreviewWidget(widget ? 
widget->d : 0); } #endif void RtpSession::setRecordingQIODevice(QIODevice *dev) { d->c->setRecorder(dev); } void RtpSession::stopRecording() { d->c->stopRecording(); } void RtpSession::setLocalAudioPreferences(const QList ¶ms) { QList list; foreach(const AudioParams &p, params) list += exportAudioParams(p); d->c->setLocalAudioPreferences(list); } void RtpSession::setLocalVideoPreferences(const QList ¶ms) { QList list; foreach(const VideoParams &p, params) list += exportVideoParams(p); d->c->setLocalVideoPreferences(list); } void RtpSession::setMaximumSendingBitrate(int kbps) { d->c->setMaximumSendingBitrate(kbps); } void RtpSession::setRemoteAudioPreferences(const QList &info) { QList list; foreach(const PayloadInfo &p, info) list += exportPayloadInfo(p); d->c->setRemoteAudioPreferences(list); } void RtpSession::setRemoteVideoPreferences(const QList &info) { QList list; foreach(const PayloadInfo &p, info) list += exportPayloadInfo(p); d->c->setRemoteVideoPreferences(list); } void RtpSession::start() { d->c->start(); } void RtpSession::updatePreferences() { d->c->updatePreferences(); } void RtpSession::transmitAudio() { d->c->transmitAudio(); } void RtpSession::transmitVideo() { d->c->transmitVideo(); } void RtpSession::pauseAudio() { d->c->pauseAudio(); } void RtpSession::pauseVideo() { d->c->pauseVideo(); } void RtpSession::stop() { d->c->stop(); } QList RtpSession::localAudioPayloadInfo() const { QList out; foreach(const PPayloadInfo &pp, d->c->localAudioPayloadInfo()) out += importPayloadInfo(pp); return out; } QList RtpSession::localVideoPayloadInfo() const { QList out; foreach(const PPayloadInfo &pp, d->c->localVideoPayloadInfo()) out += importPayloadInfo(pp); return out; } QList RtpSession::remoteAudioPayloadInfo() const { QList out; foreach(const PPayloadInfo &pp, d->c->remoteAudioPayloadInfo()) out += importPayloadInfo(pp); return out; } QList RtpSession::remoteVideoPayloadInfo() const { QList out; foreach(const PPayloadInfo &pp, 
d->c->remoteVideoPayloadInfo()) out += importPayloadInfo(pp); return out; } QList<AudioParams> RtpSession::audioParams() const { QList<AudioParams> out; foreach(const PAudioParams &pp, d->c->audioParams()) out += importAudioParams(pp); return out; } QList<VideoParams> RtpSession::videoParams() const { QList<VideoParams> out; foreach(const PVideoParams &pp, d->c->videoParams()) out += importVideoParams(pp); return out; } bool RtpSession::canTransmitAudio() const { return d->c->canTransmitAudio(); } bool RtpSession::canTransmitVideo() const { return d->c->canTransmitVideo(); } int RtpSession::outputVolume() const { return d->c->outputVolume(); } void RtpSession::setOutputVolume(int level) { d->c->setOutputVolume(level); } int RtpSession::inputVolume() const { return d->c->inputVolume(); } void RtpSession::setInputVolume(int level) { d->c->setInputVolume(level); } RtpSession::Error RtpSession::errorCode() const { return (RtpSession::Error)d->c->errorCode(); } RtpChannel *RtpSession::audioRtpChannel() { return &d->audioRtpChannel; } RtpChannel *RtpSession::videoRtpChannel() { return &d->videoRtpChannel; } } #include "psimedia.moc"

psimedia-master/psimedia/psimedia.h:
/* * Copyright (C) 2008-2009 Barracuda Networks, Inc. * * This library is free software; you can redistribute it and/or * modify it under the terms of the GNU Lesser General Public * License as published by the Free Software Foundation; either * version 2.1 of the License, or (at your option) any later version. * * This library is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301 USA
 *
 */

#ifndef PSIMEDIA_H
#define PSIMEDIA_H

#include
#include
#include

#ifdef QT_GUI_LIB
#include
#endif

namespace PsiMedia {

class RtpSession;
class RtpSessionPrivate;
class VideoWidgetPrivate;
class RtpChannelPrivate;

enum PluginResult
{
    PluginSuccess,
    ErrorLoad,
    ErrorVersion,
    ErrorInit
};

bool isSupported();
PluginResult loadPlugin(const QString &fname, const QString &resourcePath);
void unloadPlugin();
QString creditName();
QString creditText();

class Device
{
public:
    enum Type
    {
        AudioOut, // speaker
        AudioIn,  // microphone
        VideoIn   // camera
    };

    Device();
    Device(const Device &other);
    ~Device();
    Device & operator=(const Device &other);

    bool isNull() const;
    Type type() const;
    QString name() const;
    QString id() const;

private:
    class Private;
    friend class Global;
    Private *d;
};

#ifdef QT_GUI_LIB
class VideoWidget : public QWidget
{
    Q_OBJECT

public:
    VideoWidget(QWidget *parent = 0);
    ~VideoWidget();

    virtual QSize sizeHint() const;

protected:
    virtual void paintEvent(QPaintEvent *event);
    virtual void resizeEvent(QResizeEvent *event);

signals:
    // check the size hint after receiving this signal
    void videoSizeChanged();

private:
    Q_DISABLE_COPY(VideoWidget);

    friend class VideoWidgetPrivate;
    friend class RtpSession;
    VideoWidgetPrivate *d;
};
#endif

class AudioParams
{
public:
    AudioParams();
    AudioParams(const AudioParams &other);
    ~AudioParams();
    AudioParams & operator=(const AudioParams &other);

    QString codec() const;
    int sampleRate() const;
    int sampleSize() const;
    int channels() const;

    void setCodec(const QString &s);
    void setSampleRate(int n);
    void setSampleSize(int n);
    void setChannels(int n);

    bool operator==(const AudioParams &other) const;

    inline bool operator!=(const AudioParams &other) const
    {
        return !(*this == other);
    }

private:
    class Private;
    Private *d;
};

class VideoParams
{
public:
    VideoParams();
    VideoParams(const VideoParams &other);
    ~VideoParams();
    VideoParams & operator=(const VideoParams &other);

    QString codec() const;
    QSize size() const;
    int fps() const;

    void setCodec(const QString &s);
    void setSize(const QSize &s);
    void setFps(int n);

    bool operator==(const VideoParams &other) const;

    inline bool operator!=(const VideoParams &other) const
    {
        return !(*this == other);
    }

private:
    class Private;
    Private *d;
};

class Features : public QObject
{
    Q_OBJECT

public:
    enum Type
    {
        AudioOut   = 0x01,
        AudioIn    = 0x02,
        VideoIn    = 0x04,
        AudioModes = 0x08,
        VideoModes = 0x10,
        All        = 0xff
    };

    Features(QObject *parent = 0);
    ~Features();

    void lookup(int types = All);
    bool waitForFinished(int msecs = -1);

    QList<Device> audioOutputDevices();
    QList<Device> audioInputDevices();
    QList<Device> videoInputDevices();
    QList<AudioParams> supportedAudioModes();
    QList<VideoParams> supportedVideoModes();

signals:
    void finished();

private:
    class Private;
    friend class Private;
    Private *d;
};

class RtpPacket
{
public:
    RtpPacket();
    RtpPacket(const QByteArray &rawValue, int portOffset);
    RtpPacket(const RtpPacket &other);
    ~RtpPacket();
    RtpPacket & operator=(const RtpPacket &other);

    bool isNull() const;

    QByteArray rawValue() const;
    int portOffset() const;

private:
    class Private;
    QSharedDataPointer<Private> d;
};

// may drop packets if not read fast enough.
// may queue no packets at all, if nobody is listening to readyRead.
class RtpChannel : public QObject
{
    Q_OBJECT

public:
    int packetsAvailable() const;
    RtpPacket read();
    void write(const RtpPacket &rtp);

signals:
    void readyRead();
    void packetsWritten(int count);

protected:
    virtual void connectNotify(const char *signal);
    virtual void disconnectNotify(const char *signal);

private:
    RtpChannel();
    ~RtpChannel();
    Q_DISABLE_COPY(RtpChannel);

    friend class RtpSession;
    friend class RtpSessionPrivate;
    friend class RtpChannelPrivate;
    RtpChannelPrivate *d;
};

// this class essentially follows jingle's notion of payload information,
// though it's not really jingle-specific and should be usable for any RTP
// purpose
class PayloadInfo
{
public:
    class Parameter
    {
    public:
        QString name;
        QString value;

        bool operator==(const Parameter &other) const;

        inline bool operator!=(const Parameter &other) const
        {
            return !(*this == other);
        }
    };

    PayloadInfo();
    PayloadInfo(const PayloadInfo &other);
    ~PayloadInfo();
    PayloadInfo & operator=(const PayloadInfo &other);

    bool isNull() const;

    int id() const;
    QString name() const;
    int clockrate() const;
    int channels() const;
    int ptime() const;
    int maxptime() const;
    QList<Parameter> parameters() const;

    void setId(int i);
    void setName(const QString &str);
    void setClockrate(int i);
    void setChannels(int num);
    void setPtime(int i);
    void setMaxptime(int i);
    void setParameters(const QList<Parameter> &params);

    bool operator==(const PayloadInfo &other) const;

    inline bool operator!=(const PayloadInfo &other) const
    {
        return !(*this == other);
    }

private:
    class Private;
    Private *d;
};

class RtpSession : public QObject
{
    Q_OBJECT

public:
    enum Error
    {
        ErrorGeneric,
        ErrorSystem,
        ErrorCodec
    };

    RtpSession(QObject *parent = 0);
    ~RtpSession();

    void reset();

    void setAudioOutputDevice(const QString &deviceId);
#ifdef QT_GUI_LIB
    void setVideoOutputWidget(VideoWidget *widget);
#endif
    void setAudioInputDevice(const QString &deviceId);
    void setVideoInputDevice(const QString &deviceId);
    void setFileInput(const QString &fileName);
    void setFileDataInput(const QByteArray &fileData);
    void setFileLoopEnabled(bool enabled);
#ifdef QT_GUI_LIB
    void setVideoPreviewWidget(VideoWidget *widget);
#endif

    // pass a QIODevice to record to. if a device is set before starting
    // the session, then recording will wait until it starts.
    // records in ogg theora+vorbis format
    void setRecordingQIODevice(QIODevice *dev);

    // stop recording operation. wait for stoppedRecording signal before
    // QIODevice is released.
    void stopRecording();

    // set local preferences, using fuzzy *params structures.
    void setLocalAudioPreferences(const QList<AudioParams> &params);
    void setLocalVideoPreferences(const QList<VideoParams> &params);
    void setMaximumSendingBitrate(int kbps);

    // set remote preferences, using payloadinfo.
    void setRemoteAudioPreferences(const QList<PayloadInfo> &info);
    void setRemoteVideoPreferences(const QList<PayloadInfo> &info);

    // usage strategy:
    //   - initiator sets local prefs / bitrate
    //   - initiator starts(), waits for started()
    //   - initiator obtains the corresponding payloadinfos and sends to
    //     target.
    //   - target receives payloadinfos
    //   - target sets local prefs / bitrate, and remote prefs
    //   - target starts(), waits for started()
    //   - target obtains the corresponding payloadinfos, which is mostly
    //     an intersection of initiator/target preferences, and sends to
    //     initiator
    //   - target is ready for use
    //   - initiator receives payloadinfos, sets remote prefs, calls
    //     updatePreferences() and waits for preferencesUpdated()
    //   - initiator ready for use
    //
    // after starting, params getter functions will return a number
    // of objects matching that of the local payloadinfo getters. note
    // that these objects may not match the original local prefs
    // params (if any).
    //
    // you must set at least one local pref for each media type you want
    // to support. any fields in params may be left unset, even all of
    // them. if multiple params structs are specified for a media type,
    // then this means configurations "in between" what is specified are
    // allowed.
    //
    // note: targets should leave room in the prefs for flexibility in
    // choosing among the initiator payloadinfos. if a target
    // specifies exactly one params struct, and leaves no fields empty,
    // then this will result in very strict choosing. for a given media
    // type, targets should leave some fields blank or set at least two
    // params.
    //
    // adding audio/video to existing session lacking it:
    //   - set new local prefs as params
    //   - call updatePreferences(), wait for preferencesUpdated()
    //   - obtain corresponding payloadinfos, send to peer
    //   - peer receives payloadinfos, sets local prefs as params, and
    //     remote prefs
    //   - peer calls updatePreferences(), waits for preferencesUpdated()
    //   - peer obtains corresponding payloadinfos (intersection), and
    //     sends back
    //   - receive payloadinfos, set remote prefs, call
    //     updatePreferences() and wait for preferencesUpdated()
    //
    // modifying params of existing media types:
    //   - set new local prefs as params
    //   - save original payloadinfos
    //   - call updatePreferences(), wait for preferencesUpdated()
    //   - obtain corresponding payloadinfos, and compare to original to
    //     determine what was added or removed
    //   - send adds/removes to peer
    //   - peer receives payloadinfos, sets remote prefs based on
    //     adds/removes to the original
    //   - peer calls updatePreferences(), waits for preferencesUpdated()
    //   - if there were any adds, peer obtains corresponding payloadinfos
    //     (intersection), and compares to original to determine what was
    //     agreed to be added.
    //   - peer acks back with accepted adds, or rejects
    //   - if reject is received, set original remote prefs
    //   - if accept is received, add the 'adds' to the original remote
    //     prefs and set them
    //   - call updatePreferences(), wait for preferencesUpdated()
    //
    // during modification, if a payloadinfo is being removed, then it
    // is removed from both local/remote payloadinfo. if the peer
    // transmits with the removed payload type, then it will be
    // ignored. the local app won't transmit with a removed type.
    //
    // during modification, if a payloadinfo is being added, then it
    // is added only to the local payloadinfo. the app must explicitly
    // set remote prefs to update the remote payloadinfo (it would
    // do this after receiving a peer ack). the app won't transmit
    // using the added payloadinfo until the remote list is updated
    // as appropriate (more generally, the app won't transmit using a
    // payloadinfo that is not in the remote list).
    void start();

    // if prefs are changed after starting, this function needs to be
    // called for them to take effect
    void updatePreferences();

    void transmitAudio();
    void transmitVideo();

    void pauseAudio();
    void pauseVideo();

    void stop();

    // in a correctly negotiated session, there will be an equal amount of
    // local/remote values for each media type (during negotiation there
    // may be a mismatch). however, the payloadinfo for each won't
    // necessarily match exactly. for example, both sides could be
    // using theora, but they'll almost certainly have different
    // parameters.
    QList<PayloadInfo> localAudioPayloadInfo() const;
    QList<PayloadInfo> localVideoPayloadInfo() const;
    QList<PayloadInfo> remoteAudioPayloadInfo() const;
    QList<PayloadInfo> remoteVideoPayloadInfo() const;

    // maps to local payloadinfo
    QList<AudioParams> audioParams() const;
    QList<VideoParams> videoParams() const;

    // parameter negotiation is independent of the existence of input and
    // output devices. you could perform a negotiation without
    // specifying any input devices, and this just means you won't be
    // able to transmit until you eventually do specify them. similarly,
    // you could have specified input devices but then later removed them
    // (by setting the device id to an empty string). the following
    // methods can be used to know what media types you're able to send.
    // in the case of devices, this is somewhat redundant information,
    // but the methods are useful in the case of using a file as input,
    // which might have varying media contained.
    bool canTransmitAudio() const;
    bool canTransmitVideo() const;

    // speaker
    int outputVolume() const; // 0 (mute) to 100
    void setOutputVolume(int level);

    // microphone
    int inputVolume() const; // 0 (mute) to 100
    void setInputVolume(int level);

    Error errorCode() const;

    RtpChannel *audioRtpChannel();
    RtpChannel *videoRtpChannel();

signals:
    void started();
    void preferencesUpdated();
    void audioOutputIntensityChanged(int intensity); // 0-100, -1 for no signal
    void audioInputIntensityChanged(int intensity);  // 0-100
    void stoppedRecording();
    void stopped();
    void finished(); // for file playback only
    void error();

private:
    Q_DISABLE_COPY(RtpSession);

    friend class RtpSessionPrivate;
    RtpSessionPrivate *d;
};

}

#endif

psimedia-master/psimedia/psimedia.pri

HEADERS += \
    $$PWD/psimedia.h \
    $$PWD/psimediaprovider.h

SOURCES += \
    $$PWD/psimedia.cpp

psimedia-master/psimedia/psimediaprovider.h

/*
 * Copyright (C) 2008-2009 Barracuda Networks, Inc.
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
 * 02110-1301 USA
 *
 */

#ifndef PSIMEDIAPROVIDER_H
#define PSIMEDIAPROVIDER_H

#include
#include
#include
#include
#include

class QImage;
class QIODevice;

#ifdef QT_GUI_LIB
class QWidget;
class QPainter;
#endif

// since we cannot put signals/slots in Qt "interfaces", we use the following
// defines to hint about signals/slots that derived classes should provide
#define HINT_SIGNALS protected
#define HINT_PUBLIC_SLOTS public
#define HINT_METHOD(x)

namespace PsiMedia {

class Provider;
class FeaturesContext;
class RtpSessionContext;

class Plugin
{
public:
    virtual ~Plugin() {}
    virtual Provider *createProvider() = 0;
};

class QObjectInterface
{
public:
    virtual ~QObjectInterface() {}
    virtual QObject *qobject() = 0;
};

class PDevice
{
public:
    enum Type
    {
        AudioOut,
        AudioIn,
        VideoIn
    };

    Type type;
    QString name;
    QString id;
};

class PAudioParams
{
public:
    QString codec;
    int sampleRate;
    int sampleSize;
    int channels;

    inline PAudioParams() :
        sampleRate(0),
        sampleSize(0),
        channels(0)
    {
    }
};

class PVideoParams
{
public:
    QString codec;
    QSize size;
    int fps;

    inline PVideoParams() :
        fps(0)
    {
    }
};

class PFeatures
{
public:
    QList<PDevice> audioOutputDevices;
    QList<PDevice> audioInputDevices;
    QList<PDevice> videoInputDevices;
    QList<PAudioParams> supportedAudioModes;
    QList<PVideoParams> supportedVideoModes;
};

class PPayloadInfo
{
public:
    class Parameter
    {
    public:
        QString name;
        QString value;
    };

    int id;
    QString name;
    int clockrate;
    int channels;
    int ptime;
    int maxptime;
    QList<Parameter> parameters;

    inline PPayloadInfo() :
        id(-1),
        clockrate(-1),
        channels(-1),
        ptime(-1),
        maxptime(-1)
    {
    }
};

class PRtpPacket
{
public:
    QByteArray rawValue;
    int portOffset;

    inline PRtpPacket() :
        portOffset(0)
    {
    }
};

class Provider : public QObjectInterface
{
public:
    virtual bool init(const QString &resourcePath) = 0;
    virtual QString creditName() = 0;
    virtual QString creditText() = 0;
    virtual FeaturesContext *createFeatures() = 0;
    virtual RtpSessionContext *createRtpSession() = 0;
};

class FeaturesContext : public QObjectInterface
{
public:
    enum Type
    {
        AudioOut   = 0x01,
        AudioIn    = 0x02,
        VideoIn    = 0x04,
        AudioModes = 0x08,
        VideoModes = 0x10
    };

    virtual void lookup(int types) = 0;
    virtual bool waitForFinished(int msecs) = 0; // -1 = no timeout
    virtual PFeatures results() const = 0;

HINT_SIGNALS:
    HINT_METHOD(finished())
};

class RtpChannelContext : public QObjectInterface
{
public:
    virtual void setEnabled(bool b) = 0;

    virtual int packetsAvailable() const = 0;
    virtual PRtpPacket read() = 0;
    virtual void write(const PRtpPacket &rtp) = 0;

HINT_SIGNALS:
    HINT_METHOD(readyRead())
    HINT_METHOD(packetsWritten(int count))
};

#ifdef QT_GUI_LIB
class VideoWidgetContext : public QObjectInterface
{
public:
    virtual QWidget *qwidget() = 0;

    // this function causes VideoWidget::videoSizeChanged() to be emitted
    virtual void setVideoSize(const QSize &size) = 0;

HINT_SIGNALS:
    HINT_METHOD(resized(const QSize &newSize))

    // listener must use a direct connection and paint during the signal
    HINT_METHOD(paintEvent(QPainter *p))
};
#endif

class RtpSessionContext : public QObjectInterface
{
public:
    enum Error
    {
        ErrorGeneric,
        ErrorSystem,
        ErrorCodec
    };

    virtual void setAudioOutputDevice(const QString &deviceId) = 0;
    virtual void setAudioInputDevice(const QString &deviceId) = 0;
    virtual void setVideoInputDevice(const QString &deviceId) = 0;
    virtual void setFileInput(const QString &fileName) = 0;
    virtual void setFileDataInput(const QByteArray &fileData) = 0;
    virtual void setFileLoopEnabled(bool enabled) = 0;

#ifdef QT_GUI_LIB
    virtual void setVideoOutputWidget(VideoWidgetContext *widget) = 0;
    virtual void setVideoPreviewWidget(VideoWidgetContext *widget) = 0;
#endif

    virtual void setRecorder(QIODevice *recordDevice) = 0;
    virtual void stopRecording() = 0;

    virtual void setLocalAudioPreferences(const QList<PAudioParams> &params) = 0;
    virtual void setLocalVideoPreferences(const QList<PVideoParams> &params) = 0;
    virtual void setMaximumSendingBitrate(int kbps) = 0;

    virtual void setRemoteAudioPreferences(const QList<PPayloadInfo> &info) = 0;
    virtual void setRemoteVideoPreferences(const QList<PPayloadInfo> &info) = 0;

    virtual void start() = 0;
    virtual void updatePreferences() = 0;

    virtual void transmitAudio() = 0;
    virtual void transmitVideo() = 0;

    virtual void pauseAudio() = 0;
    virtual void pauseVideo() = 0;

    virtual void stop() = 0;

    virtual QList<PPayloadInfo> localAudioPayloadInfo() const = 0;
    virtual QList<PPayloadInfo> localVideoPayloadInfo() const = 0;
    virtual QList<PPayloadInfo> remoteAudioPayloadInfo() const = 0;
    virtual QList<PPayloadInfo> remoteVideoPayloadInfo() const = 0;

    virtual QList<PAudioParams> audioParams() const = 0;
    virtual QList<PVideoParams> videoParams() const = 0;

    virtual bool canTransmitAudio() const = 0;
    virtual bool canTransmitVideo() const = 0;

    virtual int outputVolume() const = 0; // 0 (mute) to 100
    virtual void setOutputVolume(int level) = 0;

    virtual int inputVolume() const = 0; // 0 (mute) to 100
    virtual void setInputVolume(int level) = 0;

    virtual Error errorCode() const = 0;

    virtual RtpChannelContext *audioRtpChannel() = 0;
    virtual RtpChannelContext *videoRtpChannel() = 0;

HINT_SIGNALS:
    HINT_METHOD(started())
    HINT_METHOD(preferencesUpdated())
    HINT_METHOD(audioOutputIntensityChanged(int intensity))
    HINT_METHOD(audioInputIntensityChanged(int intensity))
    HINT_METHOD(stoppedRecording())
    HINT_METHOD(stopped())
    HINT_METHOD(finished()) // for file playback only
    HINT_METHOD(error())
};

}

Q_DECLARE_INTERFACE(PsiMedia::Plugin, "org.psi-im.psimedia.Plugin/1.0")
Q_DECLARE_INTERFACE(PsiMedia::Provider, "org.psi-im.psimedia.Provider/1.0")
Q_DECLARE_INTERFACE(PsiMedia::FeaturesContext, "org.psi-im.psimedia.FeaturesContext/1.0")
Q_DECLARE_INTERFACE(PsiMedia::RtpChannelContext, "org.psi-im.psimedia.RtpChannelContext/1.0")
Q_DECLARE_INTERFACE(PsiMedia::RtpSessionContext, "org.psi-im.psimedia.RtpSessionContext/1.0")

#endif
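As a worked illustration of the "usage strategy" described in the psimedia.h comments, the initiator side of an audio-only negotiation might look roughly like the following. This is an uncompiled sketch against the public RtpSession API declared above: the `Negotiator` class, the `sendPayloadInfoToPeer()` signaling helper, and the "speex" codec choice are hypothetical, and error handling is elided.

```cpp
#include "psimedia.h"

// Hypothetical helper that ships payloadinfo to the peer over the
// application's signaling channel (e.g. jingle); not part of psimedia.
void sendPayloadInfoToPeer(const QList<PsiMedia::PayloadInfo> &info);

class Negotiator : public QObject
{
    Q_OBJECT

public:
    PsiMedia::RtpSession session;

    void begin()
    {
        // initiator sets local prefs / bitrate (fuzzy params; unset
        // fields mean "flexible")
        PsiMedia::AudioParams audio;
        audio.setCodec("speex"); // hypothetical codec choice
        session.setLocalAudioPreferences(
            QList<PsiMedia::AudioParams>() << audio);
        session.setMaximumSendingBitrate(256);

        // initiator starts(), waits for started()
        connect(&session, SIGNAL(started()), SLOT(sess_started()));
        session.start();
    }

public slots:
    void sess_started()
    {
        // obtain the corresponding payloadinfos and send to target
        sendPayloadInfoToPeer(session.localAudioPayloadInfo());
    }

    // called by the signaling layer when the target's (intersected)
    // payloadinfos arrive back
    void peer_payloadInfoReceived(const QList<PsiMedia::PayloadInfo> &remote)
    {
        session.setRemoteAudioPreferences(remote);
        connect(&session, SIGNAL(preferencesUpdated()), SLOT(sess_ready()));
        session.updatePreferences();
    }

    void sess_ready()
    {
        session.transmitAudio(); // initiator ready for use
    }
};
```

The target side would mirror this, except that it sets both local and remote preferences before calling start(), as the header comments describe.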
psimedia-master/qcm/

psimedia-master/qcm/buildmodeapp.qcm

/*
-----BEGIN QCMOD-----
name: buildmodeapp
section: project
arg: release,Build with debugging turned off (default).
arg: debug,Build with debugging turned on.
arg: debug-and-release,Build two versions, with and without debugging turned on (mac only).
arg: no-separate-debug-info,Do not store debug information in a separate file (default for mac).
arg: separate-debug-info,Strip debug information into a separate .debug file (default for non-mac).
-----END QCMOD-----
*/

#define QC_BUILDMODE
bool qc_buildmode_release = false;
bool qc_buildmode_debug = false;
bool qc_buildmode_separate_debug_info = false;

class qc_buildmodeapp : public ConfObj
{
public:
    qc_buildmodeapp(Conf *c) : ConfObj(c) {}
    QString name() const { return "buildmodeapp"; }
    QString shortname() const { return "buildmodeapp"; }

    // no output
    QString checkString() const { return QString(); }

    bool exec()
    {
        // first, parse out the options
        bool opt_release = false;
        bool opt_debug = false;
        bool opt_debug_and_release = false;
        bool opt_no_separate_debug_info = false;
        bool opt_separate_debug_info = false;

        if(conf->getenv("QC_RELEASE") == "Y")
            opt_release = true;
        if(conf->getenv("QC_DEBUG") == "Y")
            opt_debug = true;
        if(conf->getenv("QC_DEBUG_AND_RELEASE") == "Y")
            opt_debug_and_release = true;
        if(conf->getenv("QC_NO_SEPARATE_DEBUG_INFO") == "Y")
            opt_no_separate_debug_info = true;
        if(conf->getenv("QC_SEPARATE_DEBUG_INFO") == "Y")
            opt_separate_debug_info = true;

        bool staticmode = false;
        if(conf->getenv("QC_STATIC") == "Y")
            staticmode = true;

#ifndef Q_OS_MAC
        if(opt_debug_and_release)
        {
            printf("\nError: The --debug-and-release option is for mac only.\n");
            exit(1);
        }
#endif

        // sanity check exclusive options
        int x;

        // build mode
        x = 0;
        if(opt_release)
            ++x;
        if(opt_debug)
            ++x;
        if(opt_debug_and_release)
            ++x;
        if(x > 1)
        {
            printf("\nError: Use only one of --release, --debug, or --debug-and-release.\n");
            exit(1);
        }

        // debug info
        x = 0;
        if(opt_no_separate_debug_info)
            ++x;
        if(opt_separate_debug_info)
            ++x;
        if(x > 1)
        {
            printf("\nError: Use only one of --separate-debug-info or --no-separate-debug-info\n");
            exit(1);
        }

        // now process the options
        if(opt_release)
            qc_buildmode_release = true;
        else if(opt_debug)
            qc_buildmode_debug = true;
        else if(opt_debug_and_release)
        {
            qc_buildmode_release = true;
            qc_buildmode_debug = true;
        }
        else // default
            qc_buildmode_release = true;

        if(opt_separate_debug_info)
            qc_buildmode_separate_debug_info = true;
        else if(opt_no_separate_debug_info)
        {
            // nothing to do
        }
        else // default
        {
#ifndef Q_OS_MAC
            qc_buildmode_separate_debug_info = true;
#endif
        }

        // make the string
        QStringList opts;
        QString other;

        if(qc_buildmode_release && qc_buildmode_debug)
        {
            opts += "debug_and_release";
            opts += "build_all";
        }
        else if(qc_buildmode_release)
            opts += "release";
        else // qc_buildmode_debug
            opts += "debug";

        if(qc_buildmode_separate_debug_info)
        {
            opts += "separate_debug_info";
            other += "QMAKE_CFLAGS += -g\n";
            other += "QMAKE_CXXFLAGS += -g\n";
        }

        QString str = QString("CONFIG += ") + opts.join(" ") + '\n';
        conf->addExtra(str);

        if(!other.isEmpty())
            conf->addExtra(other);

        return true;
    }
};

psimedia-master/qcm/qt4.qcm

/*
-----BEGIN QCMOD-----
name: Qt >= 4.4
-----END QCMOD-----
*/

class qc_qt4 : public ConfObj
{
public:
    qc_qt4(Conf *c) : ConfObj(c) {}
    QString name() const { return "Qt >= 4.4.0"; }
    QString shortname() const { return "qt4"; }
    bool exec()
    {
        return(QT_VERSION >= 0x040400);
    }
};

psimedia-master/qcm/universal.qcm

/*
-----BEGIN QCMOD-----
name: Mac universal binary support
section: project
arg: universal,Build with Mac universal binary support.
arg: mac-sdk=[path],Path to Mac universal SDK (PPC host only).
-----END QCMOD-----
*/

#define QC_UNIVERSAL
bool qc_universal_enabled = false;
QString qc_universal_sdk;

//----------------------------------------------------------------------------
// qc_universal
//----------------------------------------------------------------------------
class qc_universal : public ConfObj
{
public:
    qc_universal(Conf *c) : ConfObj(c) {}
    QString name() const { return "Mac universal binary support"; }
    QString shortname() const { return "universal"; }
    QString checkString() const { return QString(); }

    bool exec()
    {
#ifdef Q_OS_MAC
        if(qc_getenv("QC_UNIVERSAL") == "Y")
        {
            qc_universal_enabled = true;

            QString str =
                "contains(QT_CONFIG,x86):contains(QT_CONFIG,ppc) {\n"
                " CONFIG += x86 ppc\n"
                "}\n";

            QString sdk = qc_getenv("QC_MAC_SDK");
            if(!sdk.isEmpty())
            {
                str += QString("QMAKE_MAC_SDK = %1\n").arg(sdk);
                qc_universal_sdk = sdk;
            }

            conf->addExtra(str);
        }
#endif
        return true;
    }
};
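The qcm modules above define options for a qconf-generated configure script. Assuming such a generated ./configure exists in the project root, typical invocations would look like this (illustrative only; the SDK path shown is a hypothetical example, and the exact option set depends on which qcm modules the project's qconf file includes):

```shell
# Debug build, keeping debug info inside the binaries
# rather than in separate .debug files
./configure --debug --no-separate-debug-info

# Mac only: build a universal binary against a specific SDK
./configure --universal --mac-sdk=/Developer/SDKs/MacOSX10.4u.sdk
```

Each `arg:` line in a QCMOD header maps to one such command-line option; the configure script exposes it to the module by exporting a corresponding QC_* environment variable (e.g. `--debug` becomes `QC_DEBUG=Y`), which the module reads via `conf->getenv()`.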