==== fannj-fannj-0.7/.gitignore ====

target/*
.nbactions
hs_err_*

==== fannj-fannj-0.7/LICENSE ====

GNU LESSER GENERAL PUBLIC LICENSE
Version 3, 29 June 2007

Copyright (C) 2007 Free Software Foundation, Inc.

Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed.

This version of the GNU Lesser General Public License incorporates the terms and conditions of version 3 of the GNU General Public License, supplemented by the additional permissions listed below.

0. Additional Definitions.

As used herein, "this License" refers to version 3 of the GNU Lesser General Public License, and the "GNU GPL" refers to version 3 of the GNU General Public License.

"The Library" refers to a covered work governed by this License, other than an Application or a Combined Work as defined below.

An "Application" is any work that makes use of an interface provided by the Library, but which is not otherwise based on the Library. Defining a subclass of a class defined by the Library is deemed a mode of using an interface provided by the Library.

A "Combined Work" is a work produced by combining or linking an Application with the Library. The particular version of the Library with which the Combined Work was made is also called the "Linked Version".

The "Minimal Corresponding Source" for a Combined Work means the Corresponding Source for the Combined Work, excluding any source code for portions of the Combined Work that, considered in isolation, are based on the Application, and not on the Linked Version.

The "Corresponding Application Code" for a Combined Work means the object code and/or source code for the Application, including any data and utility programs needed for reproducing the Combined Work from the Application, but excluding the System Libraries of the Combined Work.

1. Exception to Section 3 of the GNU GPL.

You may convey a covered work under sections 3 and 4 of this License without being bound by section 3 of the GNU GPL.

2. Conveying Modified Versions.

If you modify a copy of the Library, and, in your modifications, a facility refers to a function or data to be supplied by an Application that uses the facility (other than as an argument passed when the facility is invoked), then you may convey a copy of the modified version:

a) under this License, provided that you make a good faith effort to ensure that, in the event an Application does not supply the function or data, the facility still operates, and performs whatever part of its purpose remains meaningful, or

b) under the GNU GPL, with none of the additional permissions of this License applicable to that copy.

3. Object Code Incorporating Material from Library Header Files.

The object code form of an Application may incorporate material from a header file that is part of the Library.
You may convey such object code under terms of your choice, provided that, if the incorporated material is not limited to numerical parameters, data structure layouts and accessors, or small macros, inline functions and templates (ten or fewer lines in length), you do both of the following:

a) Give prominent notice with each copy of the object code that the Library is used in it and that the Library and its use are covered by this License.

b) Accompany the object code with a copy of the GNU GPL and this license document.

4. Combined Works.

You may convey a Combined Work under terms of your choice that, taken together, effectively do not restrict modification of the portions of the Library contained in the Combined Work and reverse engineering for debugging such modifications, if you also do each of the following:

a) Give prominent notice with each copy of the Combined Work that the Library is used in it and that the Library and its use are covered by this License.

b) Accompany the Combined Work with a copy of the GNU GPL and this license document.

c) For a Combined Work that displays copyright notices during execution, include the copyright notice for the Library among these notices, as well as a reference directing the user to the copies of the GNU GPL and this license document.

d) Do one of the following:

0) Convey the Minimal Corresponding Source under the terms of this License, and the Corresponding Application Code in a form suitable for, and under terms that permit, the user to recombine or relink the Application with a modified version of the Linked Version to produce a modified Combined Work, in the manner specified by section 6 of the GNU GPL for conveying Corresponding Source.

1) Use a suitable shared library mechanism for linking with the Library. A suitable mechanism is one that (a) uses at run time a copy of the Library already present on the user's computer system, and (b) will operate properly with a modified version of the Library that is interface-compatible with the Linked Version.

e) Provide Installation Information, but only if you would otherwise be required to provide such information under section 6 of the GNU GPL, and only to the extent that such information is necessary to install and execute a modified version of the Combined Work produced by recombining or relinking the Application with a modified version of the Linked Version. (If you use option 4d0, the Installation Information must accompany the Minimal Corresponding Source and Corresponding Application Code. If you use option 4d1, you must provide the Installation Information in the manner specified by section 6 of the GNU GPL for conveying Corresponding Source.)

5. Combined Libraries.

You may place library facilities that are a work based on the Library side by side in a single library together with other library facilities that are not Applications and are not covered by this License, and convey such a combined library under terms of your choice, if you do both of the following:

a) Accompany the combined library with a copy of the same work based on the Library, uncombined with any other library facilities, conveyed under the terms of this License.

b) Give prominent notice with the combined library that part of it is a work based on the Library, and explaining where to find the accompanying uncombined form of the same work.

6. Revised Versions of the GNU Lesser General Public License.
The Free Software Foundation may publish revised and/or new versions of the GNU Lesser General Public License from time to time. Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns.

Each version is given a distinguishing version number. If the Library as you received it specifies that a certain numbered version of the GNU Lesser General Public License "or any later version" applies to it, you have the option of following the terms and conditions either of that published version or of any later version published by the Free Software Foundation. If the Library as you received it does not specify a version number of the GNU Lesser General Public License, you may choose any version of the GNU Lesser General Public License ever published by the Free Software Foundation.

If the Library as you received it specifies that a proxy can decide whether future versions of the GNU Lesser General Public License shall apply, that proxy's public statement of acceptance of any version is permanent authorization for you to choose that version for the Library.

==== fannj-fannj-0.7/README.md ====

# fannj

Java bindings to [FANN](http://leenissen.dk/fann), the Fast Artificial Neural Network C library.

## Overview

Use FannJ if you have an existing ANN from the FANN project that you would like to access from Java. There are several GUI tools that will help you create and train an ANN.

## Installation

Before using FannJ, you must build and install the FANN C library. FannJ has been tested on FANN 2.2.0. See the FANN site for instructions and help: http://leenissen.dk/fann

## Code Example

    Fann fann = new Fann( "/path/to/file" );
    float[] inputs = new float[]{ -1, 1 };
    float[] outputs = fann.run( inputs );
    fann.close();

## Dependencies

[FANN](http://leenissen.dk/fann) - Does all the work.

[JNA](https://github.com/twall/jna) - Provides the native access to FANN.

## Maven 2 Support

This project is now in the Maven Central Repository. If you use Maven 2 for your builds, add this dependency to your pom.xml:

    <dependency>
        <groupId>com.googlecode.fannj</groupId>
        <artifactId>fannj</artifactId>
        <version>0.7</version>
    </dependency>

## Running

JNA provides the binding from Java to the FANN C library via JNI. You must set the `jna.library.path` system property to the path of the FANN library. This property is similar to `java.library.path`, but applies only to libraries loaded by JNA. You should also set the appropriate library access environment variable before launching the VM: `PATH` on Windows, `LD_LIBRARY_PATH` on Linux, and `DYLD_LIBRARY_PATH` on OS X.

On Linux, something like:

    LD_LIBRARY_PATH=/usr/local/lib java -Djna.library.path=/usr/local/lib -cp fannj-0.7.jar:jna-3.2.7.jar YourClass
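The README example loads a pre-trained network; creating and training one from Java also works. The sketch below is modeled on this repository's own tests (see FannTrainerTest further down); the XorExample class name and the training-file path are placeholders:

    import java.util.ArrayList;
    import java.util.List;

    import com.googlecode.fannj.*;

    public class XorExample {
        public static void main(String[] args) {
            // 2 inputs -> 3 hidden neurons (symmetric sigmoid) -> 1 output.
            List<Layer> layers = new ArrayList<Layer>();
            layers.add(Layer.create(2));
            layers.add(Layer.create(3, ActivationFunction.FANN_SIGMOID_SYMMETRIC));
            layers.add(Layer.create(1, ActivationFunction.FANN_SIGMOID_SYMMETRIC));
            Fann fann = new Fann(layers);

            // Train until the mean squared error drops below .001
            // (at most 500000 epochs, reporting every 1000).
            Trainer trainer = new Trainer(fann);
            float mse = trainer.train("/path/to/xor.data", 500000, 1000, .001f);

            System.out.println("MSE: " + mse);
            System.out.println("-1 XOR 1 => " + fann.run(new float[]{ -1, 1 })[0]);
            fann.close();
        }
    }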
==== fannj-fannj-0.7/pom.xml ====

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.googlecode.fannj</groupId>
    <artifactId>fannj</artifactId>
    <packaging>jar</packaging>
    <version>0.7</version>
    <name>fannj</name>
    <description>Provides Java bindings to the Fast Artificial Neural Network Library (FANN).</description>
    <url>https://github.com/krenfro/fannj</url>
    <inceptionYear>2009</inceptionYear>

    <licenses>
        <license>
            <name>GNU LGPL v3</name>
            <url>http://www.gnu.org/licenses/lgpl.txt</url>
            <distribution>repo</distribution>
        </license>
    </licenses>

    <developers>
        <developer>
            <id>krenfro</id>
            <name>Kyle Renfro</name>
            <email>kylerenfro@gmail.com</email>
        </developer>
    </developers>

    <scm>
        <connection>scm:git:https://github.com/krenfro/fannj</connection>
        <url>https://github.com/krenfro/fannj</url>
        <tag>fannj-0.7</tag>
    </scm>

    <distributionManagement>
        <repository>
            <id>sonatype-nexus-staging</id>
            <name>Nexus Staging Repository</name>
            <url>http://oss.sonatype.org/service/local/staging/deploy/maven2/</url>
        </repository>
    </distributionManagement>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.1</version>
                <configuration>
                    <source>1.6</source>
                    <target>1.6</target>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-javadoc-plugin</artifactId>
                <version>2.6.1</version>
            </plugin>
            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>versions-maven-plugin</artifactId>
                <version>2.1</version>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-scm-plugin</artifactId>
                <version>1.9.1</version>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-release-plugin</artifactId>
                <version>2.5</version>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-gpg-plugin</artifactId>
                <version>1.5</version>
                <executions>
                    <execution>
                        <id>sign-artifacts</id>
                        <phase>verify</phase>
                        <goals>
                            <goal>sign</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <keyname>277117C3</keyname>
                </configuration>
            </plugin>
        </plugins>
        <extensions>
            <extension>
                <groupId>org.jvnet.wagon-svn</groupId>
                <artifactId>wagon-svn</artifactId>
                <version>1.9</version>
            </extension>
        </extensions>
    </build>

    <issueManagement>
        <url>https://github.com/krenfro/fannj/issues</url>
    </issueManagement>

    <dependencies>
        <dependency>
            <groupId>net.java.dev.jna</groupId>
            <artifactId>jna</artifactId>
            <version>3.2.7</version>
        </dependency>
        <dependency>
            <groupId>commons-io</groupId>
            <artifactId>commons-io</artifactId>
            <version>1.3.2</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.5</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
</project>

==== fannj-fannj-0.7/releases/fannj-0.6.jar ====

(binary jar contents omitted)
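JNA resolves the native FANN library when the binding classes initialize (see the static blocks in Fann and Trainer below), so a quick sanity check that the library can be found is useful. A minimal sketch; the CheckFannLoads class and the /usr/local/lib path are illustrative, and jna.library.path must be set before any FannJ class is touched:

    import com.sun.jna.NativeLibrary;

    public class CheckFannLoads {
        public static void main(String[] args) {
            // Equivalent to passing -Djna.library.path=/usr/local/lib on the command line.
            System.setProperty("jna.library.path", "/usr/local/lib");
            // Throws UnsatisfiedLinkError if the FANN shared library cannot be found.
            NativeLibrary lib = NativeLibrary.getInstance("fann");
            System.out.println("Resolved native library: " + lib.getName());
        }
    }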
==== fannj-fannj-0.7/src/main/java/com/googlecode/fannj/Fann.java ====

/* FannJ
 * Copyright (C) 2009 Kyle Renfro
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 3 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 59 Temple Place - Suite 330,
 * Boston, MA 02111-1307, USA. The text of license can be also found
 * at http://www.gnu.org/copyleft/lgpl.html
 */
package com.googlecode.fannj;

import java.util.List;
import java.util.Map;

import com.sun.jna.Library;
import com.sun.jna.Native;
import com.sun.jna.NativeLibrary;
import com.sun.jna.Platform;
import com.sun.jna.Pointer;
import com.sun.jna.win32.StdCallLibrary;

/**
 * <p>
 * A standard fully connected back-propagation neural network.
 * </p>
 * <p>
 * Not thread safe.
 * </p>
 * <p>
 * A Java binding to the Fast Artificial Neural Network (FANN) native library.
 * </p>
 * <p>
 * This class invokes native code. You must call close() to prevent memory
 * leakage.
 * </p>
 *
 * @author krenfro
 * @see <a href="http://leenissen.dk/fann">Fast Artificial Neural Network</a>
 * @see <a href="https://github.com/twall/jna">JNA Direct Mapping</a>
 */
public class Fann {

    static {
        NativeLibrary fann;
        if (Platform.isWindows()) {
            fann = NativeLibrary.getInstance("fannfloat");
            Map options = fann.getOptions();
            options.put(Library.OPTION_CALLING_CONVENTION,
                    StdCallLibrary.STDCALL_CONVENTION);
            options.put(Library.OPTION_FUNCTION_MAPPER,
                    new WindowsFunctionMapper());
        } else {
            fann = NativeLibrary.getInstance("fann");
        }
        Native.register(fann);
    }

    protected Pointer ann;

    protected Fann() {
    }

    /**
     * Load an existing FANN definition from a file.
     *
     * @param file
     */
    public Fann(String file) {
        ann = fann_create_from_file(file);
    }

    /**
     * Create a new ANN with the provided layers.
     *
     * @param layers
     */
    public Fann(List<Layer> layers) {
        if (layers == null)
            throw new IllegalArgumentException("layers == null");
        if (layers.isEmpty())
            throw new IllegalArgumentException("layers is empty");
        int[] neurons = new int[layers.size()];
        for (int x = 0; x < neurons.length; x++)
            neurons[x] = layers.get(x).size();
        ann = fann_create_standard_array(neurons.length, neurons);
        addLayers(layers);
    }

    protected void addLayers(List<Layer> layers) {
        for (int x = 1; x < layers.size(); x++) {
            Layer layer = layers.get(x);
            for (int n = 0; n < layer.size(); n++) {
                fann_set_activation_function(ann,
                        layer.get(n).getActivationFunction().ordinal(), x, n);
                fann_set_activation_steepness(ann,
                        layer.get(n).getSteepness(), x, n);
            }
        }
    }

    public int getNumInputNeurons() {
        return fann_get_num_input(ann);
    }

    public int getNumOutputNeurons() {
        return fann_get_num_output(ann);
    }

    public int getTotalNumNeurons() {
        return fann_get_total_neurons(ann);
    }

    /**
     * Save this FANN to a file.
     *
     * @param file
     * @return true on success
     */
    public boolean save(String file) {
        return fann_save(ann, file) == 0;
    }

    /**
     * Run the ANN on a set of inputs.
     *
     * @param input length == numInputNeurons
     * @return the output of the ANN (length == numOutputNeurons)
     */
    public float[] run(float[] input) {
        Pointer result = fann_run(ann, input);
        float[] output = result.getFloatArray(0, getNumOutputNeurons());
        return output;
    }

    /**
     * <p>
     * Frees allocated memory.
     * </p>
     * <p>
     * You must call this method when you are finished to prevent memory leaks.
     * </p>
     */
    public void close() {
        if (ann != null)
            fann_destroy(ann);
    }

    /**
     * Call {@link #close()} on garbage collection to catch memory leaks.
     */
    @Override
    public void finalize() throws Throwable {
        close();
        super.finalize();
    }

    /*
     * A JNA Direct Mapping implementation of the FANN library. This instance
     * should be more performant than #com.googlecode.fannj.jna.FannLibrary
     */
    protected static native Pointer fann_create_standard_array(int numLayers, int[] layers);

    protected static native Pointer fann_create_sparse_array(float connection_rate,
            int numLayers, int[] layers);

    protected static native Pointer fann_create_shortcut_array(int numLayers, int[] layers);

    protected static native float fann_get_MSE(Pointer ann);

    protected static native Pointer fann_run(Pointer ann, float[] input);

    protected static native void fann_destroy(Pointer ann);

    protected static native int fann_get_num_input(Pointer ann);

    protected static native int fann_get_num_output(Pointer ann);

    protected static native int fann_get_total_neurons(Pointer ann);

    protected static native void fann_set_activation_function(Pointer ann,
            int activation_function, int layer, int neuron);

    protected static native void fann_set_activation_steepness(Pointer ann,
            float steepness, int layer, int neuron);

    protected static native Pointer fann_get_neuron(Pointer ann, int layer, int neuron);

    protected static native Pointer fann_create_from_file(String configuration_file);

    protected static native int fann_save(Pointer ann, String file);
}
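Fann binds its protected static native methods with JNA direct mapping: Native.register resolves each of them against the loaded library once, at class-initialization time, instead of routing calls through a proxy interface. The same technique in isolation, bound against the C math library rather than FANN, so it runs without FANN installed (DirectMappingSketch is a hypothetical class, not part of FannJ):

    import com.sun.jna.Native;
    import com.sun.jna.Platform;

    public class DirectMappingSketch {
        static {
            // Fann registers "fann" (or "fannfloat" on Windows); this sketch
            // registers libm / msvcrt instead.
            Native.register(Platform.isWindows() ? "msvcrt" : "m");
        }

        // Resolved by name in the registered library, just like fann_run et al.
        public static native double cos(double x);

        public static void main(String[] args) {
            System.out.println(cos(0.0)); // prints 1.0
        }
    }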

==== fannj-fannj-0.7/src/main/java/com/googlecode/fannj/FannShortcut.java ====

/* FannJ
 * Copyright (C) 2009 Kyle Renfro
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 3 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 59 Temple Place - Suite 330,
 * Boston, MA 02111-1307, USA. The text of license can be also found
 * at http://www.gnu.org/copyleft/lgpl.html
 */
package com.googlecode.fannj;

import java.util.List;

/**
 * <p>
 * A standard backpropagation neural network, which is not fully connected and
 * which also has shortcut connections.
 * </p>
 *
 * @author krenfro, brandstaetter
 */
public class FannShortcut extends Fann {

    public FannShortcut(List<Layer> layers) {
        super();
        if (layers == null)
            throw new IllegalArgumentException("layers == null");
        if (layers.isEmpty())
            throw new IllegalArgumentException("layers is empty");
        int[] neurons = new int[layers.size()];
        for (int x = 0; x < neurons.length; x++)
            neurons[x] = layers.get(x).size();
        ann = fann_create_shortcut_array(neurons.length, neurons);
        addLayers(layers);
    }

    /**
     * Create a new ANN with just the input and output layers for Cascade
     * Training.
     *
     * @param inputs  the number of input neurons
     * @param outputs the number of output neurons
     */
    public FannShortcut(int inputs, int outputs) {
        int[] layers = new int[2];
        layers[0] = inputs;
        layers[1] = outputs;
        int numLayers = 2;
        ann = fann_create_shortcut_array(numLayers, layers);
    }
}
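The two-argument FannShortcut constructor pairs with Trainer.cascadeTrain (defined further down), which starts from input and output layers only and lets FANN grow hidden neurons during training. A short usage sketch, mirroring FannTrainerTest.testCascadeTraining; the data-file path is a placeholder:

    // 8 inputs, 1 output, no hidden neurons yet.
    Fann fann = new FannShortcut(8, 1);
    Trainer trainer = new Trainer(fann);
    // Add up to 30 candidate neurons, reporting after each one.
    float mse = trainer.cascadeTrain("/path/to/parity8.train", 30, 1, .01f);
    System.out.println("MSE after cascade training: " + mse);
    fann.close();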

==== fannj-fannj-0.7/src/main/java/com/googlecode/fannj/FannSparse.java ====

/* FannJ
 * Copyright (C) 2009 Kyle Renfro
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 3 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 59 Temple Place - Suite 330,
 * Boston, MA 02111-1307, USA. The text of license can be also found
 * at http://www.gnu.org/copyleft/lgpl.html
 */
package com.googlecode.fannj;

import java.util.List;

/**
 * <p>
 * A standard backpropagation neural network, which is not fully connected.
 * </p>
 *
 * @author krenfro
 */
public class FannSparse extends Fann {

    public static final float DEFAULT_CONNECTION_RATE = 1f;

    float connectionRate = 1f;

    public FannSparse(List<Layer> layers) {
        this(DEFAULT_CONNECTION_RATE, layers);
    }

    public FannSparse(float connectionRate, List<Layer> layers) {
        super();
        if (layers == null)
            throw new IllegalArgumentException("layers == null");
        if (layers.isEmpty())
            throw new IllegalArgumentException("layers is empty");
        this.connectionRate = connectionRate;
        int[] neurons = new int[layers.size()];
        for (int x = 0; x < neurons.length; x++)
            neurons[x] = layers.get(x).size();
        ann = fann_create_sparse_array(connectionRate, neurons.length, neurons);
        addLayers(layers);
    }
}

==== fannj-fannj-0.7/src/main/java/com/googlecode/fannj/Layer.java ====

/* FannJ
 * Copyright (C) 2009 Kyle Renfro
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 3 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 59 Temple Place - Suite 330,
 * Boston, MA 02111-1307, USA. The text of license can be also found
 * at http://www.gnu.org/copyleft/lgpl.html
 */
package com.googlecode.fannj;

import java.util.ArrayList;

/**
 * A Layer of Neurons in an ANN.
 *
 * @author krenfro
 */
public class Layer extends ArrayList<Neuron> {

    private static final long serialVersionUID = -6467294440860703773L;

    /**
     * Create a Layer with the specified number of neurons with the default
     * Activation Function: {@link Neuron#DEFAULT_ACTIVATION_FUNCTION} with
     * steepness: {@link Neuron#DEFAULT_ACTIVATION_STEEPNESS}.
     *
     * @param numNeurons
     * @return
     */
    public static Layer create(int numNeurons) {
        return create(numNeurons, Neuron.DEFAULT_ACTIVATION_FUNCTION,
                Neuron.DEFAULT_ACTIVATION_STEEPNESS);
    }

    /**
     * Create a Layer with the specified number of neurons and a particular
     * ActivationFunction with the steepness:
     * {@link Neuron#DEFAULT_ACTIVATION_STEEPNESS}.
     *
     * @param numNeurons
     * @param activationFunction
     * @return
     */
    public static Layer create(int numNeurons, ActivationFunction activationFunction) {
        return create(numNeurons, activationFunction,
                Neuron.DEFAULT_ACTIVATION_STEEPNESS);
    }

    /**
     * Create a Layer with the specified number of neurons and a particular
     * ActivationFunction with specified steepness.
     *
     * @param numNeurons
     * @param activationFunction
     * @param steepness
     * @return
     */
    public static Layer create(int numNeurons, ActivationFunction activationFunction,
            float steepness) {
        Layer layer = new Layer();
        for (int i = 0; i < numNeurons; i++)
            layer.add(new Neuron(activationFunction, steepness));
        return layer;
    }
}

==== fannj-fannj-0.7/src/main/java/com/googlecode/fannj/Neuron.java ====

/* FannJ
 * Copyright (C) 2009 Kyle Renfro
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 3 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 59 Temple Place - Suite 330,
 * Boston, MA 02111-1307, USA. The text of license can be also found
 * at http://www.gnu.org/copyleft/lgpl.html
 */
package com.googlecode.fannj;

/**
 * @author krenfro
 */
public class Neuron {

    public static final ActivationFunction DEFAULT_ACTIVATION_FUNCTION =
            ActivationFunction.FANN_SIGMOID_STEPWISE;
    public static final float DEFAULT_ACTIVATION_STEEPNESS = .5f;

    ActivationFunction activationFunction;
    float steepness;

    /**
     * Create a neuron with the default activation function
     * (FANN_SIGMOID_STEPWISE) and activation steepness = .5.
     */
    public Neuron() {
        this(DEFAULT_ACTIVATION_FUNCTION);
    }

    /**
     * Create a neuron with the specified activation function and the default
     * activation steepness = .5.
     *
     * @param activationFunction
     */
    public Neuron(ActivationFunction activationFunction) {
        this(activationFunction, DEFAULT_ACTIVATION_STEEPNESS);
    }

    /**
     * Create a neuron with the specified activation function and steepness.
     *
     * @param activationFunction
     * @param steepness
     */
    public Neuron(ActivationFunction activationFunction, float steepness) {
        if (activationFunction == null)
            throw new IllegalArgumentException("activationFunction is null");
        this.activationFunction = activationFunction;
        this.steepness = steepness;
    }

    public float getSteepness() {
        return steepness;
    }

    public ActivationFunction getActivationFunction() {
        return activationFunction;
    }

    @Override
    public int hashCode() {
        final int prime = 31;
        int result = 1;
        result = prime * result
                + ((activationFunction == null) ? 0 : activationFunction.hashCode());
        result = prime * result + Float.floatToIntBits(steepness);
        return result;
    }

    @Override
    public boolean equals(Object obj) {
        if (this == obj)
            return true;
        if (obj == null)
            return false;
        if (getClass() != obj.getClass())
            return false;
        Neuron other = (Neuron) obj;
        if (activationFunction == null) {
            if (other.activationFunction != null)
                return false;
        } else if (!activationFunction.equals(other.activationFunction))
            return false;
        if (Float.floatToIntBits(steepness) != Float.floatToIntBits(other.steepness))
            return false;
        return true;
    }
}
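Layer.create and the Neuron defaults above are the intended way to describe a topology before handing it to a Fann constructor. A small sketch; note that addLayers in Fann starts at index 1, so activation settings on the input layer are never applied:

    // 2-3-1 network: symmetric sigmoid throughout, with a steeper output layer.
    List<Layer> layers = new ArrayList<Layer>();
    layers.add(Layer.create(2)); // input layer; activation settings are skipped
    layers.add(Layer.create(3, ActivationFunction.FANN_SIGMOID_SYMMETRIC));
    layers.add(Layer.create(1, ActivationFunction.FANN_SIGMOID_SYMMETRIC, 1.0f));
    Fann fann = new Fann(layers);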
==== fannj-fannj-0.7/src/main/java/com/googlecode/fannj/Trainer.java ====

/* FannJ
 * Copyright (C) 2009 Kyle Renfro, 2011 Daniel Thomas
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 3 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 59 Temple Place - Suite 330,
 * Boston, MA 02111-1307, USA. The text of license can be also found
 * at http://www.gnu.org/copyleft/lgpl.html
 */
package com.googlecode.fannj;

import java.util.Map;

import com.sun.jna.Library;
import com.sun.jna.Native;
import com.sun.jna.NativeLibrary;
import com.sun.jna.Platform;
import com.sun.jna.Pointer;
import com.sun.jna.win32.StdCallLibrary;

/**
 * Trains an ANN. Currently only file-based training is supported.
 *
 * @author krenfro, drt24, brandstaetter
 */
public class Trainer {

    static {
        NativeLibrary fann;
        if (Platform.isWindows()) {
            fann = NativeLibrary.getInstance("fannfloat");
            Map options = fann.getOptions();
            options.put(Library.OPTION_CALLING_CONVENTION,
                    StdCallLibrary.STDCALL_CONVENTION);
            options.put(Library.OPTION_FUNCTION_MAPPER,
                    new WindowsFunctionMapper());
        } else {
            fann = NativeLibrary.getInstance("fann");
        }
        Native.register(fann);
    }

    Fann fann;

    public Trainer(Fann fann) {
        this.fann = fann;
    }

    /**
     * @param trainingFile
     * @param maxEpochs
     * @param epochsBetweenReports
     * @param desiredError
     * @return MSE for the ann once trained
     */
    public float train(String trainingFile, int maxEpochs,
            int epochsBetweenReports, float desiredError) {
        fann_train_on_file(fann.ann, trainingFile, maxEpochs,
                epochsBetweenReports, desiredError);
        return fann_get_MSE(fann.ann);
    }

    /**
     * @param dataFile
     * @param maxNeurons
     * @param neuronsBetweenReports
     * @param desiredError
     * @return MSE for the ann once trained
     */
    public float cascadeTrain(String dataFile, int maxNeurons,
            int neuronsBetweenReports, float desiredError) {
        setTrainingAlgorithm(TrainingAlgorithm.FANN_TRAIN_RPROP);
        fann_cascadetrain_on_file(fann.ann, dataFile, maxNeurons,
                neuronsBetweenReports, desiredError);
        return fann_get_MSE(fann.ann);
    }

    public void setTrainingAlgorithm(TrainingAlgorithm algorithm) {
        fann_set_training_algorithm(fann.ann, algorithm.ordinal());
    }

    /**
     * @param testingFile
     * @return MSE for the Fann which has been tested
     */
    public float test(String testingFile) {
        // Reset first so that the only influence on the MSE is the testing data.
        fann_reset_MSE(fann.ann);
        Pointer testingData = fann_read_train_from_file(testingFile);
        fann_test_data(fann.ann, testingData);
        fann_destroy_train(testingData); // deallocate it
        return fann_get_MSE(fann.ann);
    }

    /* A JNA Direct Mapping implementation of the FANN library. */

    protected static native void fann_train_on_file(Pointer ann, String filename,
            int max_epochs, int epochs_between_reports, float desired_error);

    protected static native void fann_cascadetrain_on_file(Pointer ann, String filename,
            int max_neurons, int neurons_between_reports, float desired_error);

    protected static native void fann_set_training_algorithm(Pointer ann,
            int training_algorithm);

    protected static native int fann_get_training_algorithm(Pointer ann);

    /**
     * Resets the mean square error from the network.
     *
     * @param ann
     */
    protected static native void fann_reset_MSE(Pointer ann);

    /**
     * Reads the mean square error from the network.
     *
     * @param ann
     * @return the mean square error of the network
     */
    protected static native float fann_get_MSE(Pointer ann);

    /**
     * Test the network using data and return the MSE of the network.
     * <p>
     * You might need to run {@link #fann_reset_MSE(Pointer)} first.
     *
     * @param ann
     * @param data the data to test with
     * @return the mean square error of the network
     */
    protected static native float fann_test_data(Pointer ann, Pointer data);

    /**
     * Read the training or testing data from a file.
     * <p>
     * You must call {@link #fann_destroy_train(Pointer)} on the {@link Pointer}
     * you get from this after you have finished with it.
     *
     * @param filename the file name of the file to read the data from
     * @return pointer to the data which has been read for use with
     *         {@link #fann_test_data(Pointer, Pointer)}
     */
    protected static native Pointer fann_read_train_from_file(String filename);

    /**
     * Deallocate the data.
     *
     * @param data the training/testing data to deallocate
     */
    protected static native void fann_destroy_train(Pointer data);
}

==== fannj-fannj-0.7/src/main/java/com/googlecode/fannj/TrainingAlgorithm.java ====

/* FannJ
 * Copyright (C) 2009 Kyle Renfro
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 3 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 59 Temple Place - Suite 330,
 * Boston, MA 02111-1307, USA. The text of license can be also found
 * at http://www.gnu.org/copyleft/lgpl.html
 */
package com.googlecode.fannj;

public enum TrainingAlgorithm {
    /* The order these appear in must match the order in the FANN src! */
    FANN_TRAIN_INCREMENTAL,
    FANN_TRAIN_BATCH,
    FANN_TRAIN_RPROP,
    FANN_TRAIN_QUICKPROP
}

==== fannj-fannj-0.7/src/main/java/com/googlecode/fannj/WindowsFunctionMapper.java ====

package com.googlecode.fannj;

import com.sun.jna.FunctionMapper;
import com.sun.jna.NativeLibrary;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.lang.reflect.Method;
import java.util.HashMap;
import java.util.Map;
import java.util.logging.Level;
import java.util.logging.Logger;

public class WindowsFunctionMapper implements FunctionMapper {

    private static final Logger logger =
            Logger.getLogger(WindowsFunctionMapper.class.getName());

    private final Map<String, String> translations;

    public WindowsFunctionMapper() {
        translations = new HashMap<String, String>();
        loadTranslations();
    }

    private void loadTranslations() {
        try {
            BufferedReader in = new BufferedReader(new InputStreamReader(
                    this.getClass().getResourceAsStream("WindowsFunctionNames.txt")));
            String s = in.readLine();
            while (s != null) {
                addTranslation(s);
                s = in.readLine();
            }
            in.close();
        } catch (IOException ex) {
            throw new IllegalStateException("Unable to load windows function names", ex);
        }
    }

    private void addTranslation(String windowsName) {
        if (windowsName != null && !windowsName.isEmpty()
                && !windowsName.startsWith("#")) {
            String cleanName = windowsName;
            if (windowsName.startsWith("_"))
                cleanName = windowsName.substring(1);
            int pos = cleanName.indexOf("@");
            if (pos > 0)
                cleanName = cleanName.substring(0, pos);
            logger.log(Level.FINE, "{0} = {1}",
                    new Object[]{ cleanName.trim(), windowsName.trim() });
            translations.put(cleanName.trim(), windowsName.trim());
        }
    }

    @Override
    public String getFunctionName(NativeLibrary nl, Method method) {
        String result = translations.get(method.getName());
        return result == null ? method.getName() : result;
    }
}

==== fannj-fannj-0.7/src/main/resources/com/googlecode/fannj/WindowsFunctionNames.txt ====

#copied from 'depends' windows utility
#these are processed by the WindowsFunctionMapper
_fann_cascadetrain_on_data@20
_fann_cascadetrain_on_file@20
_fann_clear_scaling_params@4
_fann_create_from_file@4
_fann_create_shortcut_array@8
_fann_create_sparse_array@12
_fann_create_standard_array@8
_fann_create_train_from_callback@16
_fann_descale_input@8
_fann_descale_output@8
_fann_descale_train@8
_fann_destroy@4
_fann_destroy_train@4
_fann_duplicate_train_data@4
_fann_get_activation_function@12
_fann_get_activation_steepness@12
_fann_get_bias_array@8
_fann_get_bit_fail@4
_fann_get_bit_fail_limit@4
_fann_get_callback@4
_fann_get_cascade_activation_functions@4
_fann_get_cascade_activation_functions_count@4
_fann_get_cascade_activation_steepnesses@4
_fann_get_cascade_activation_steepnesses_count@4
_fann_get_cascade_candidate_change_fraction@4
_fann_get_cascade_candidate_limit@4
_fann_get_cascade_candidate_stagnation_epochs@4
_fann_get_cascade_max_cand_epochs@4
_fann_get_cascade_max_out_epochs@4
_fann_get_cascade_num_candidate_groups@4
_fann_get_cascade_num_candidates@4
_fann_get_cascade_output_change_fraction@4
_fann_get_cascade_output_stagnation_epochs@4
_fann_get_cascade_weight_multiplier@4
_fann_get_connection_array@8
_fann_get_connection_rate@4
_fann_get_errno@4
_fann_get_errstr@4
_fann_get_layer@8
_fann_get_layer_array@8
_fann_get_learning_momentum@4
_fann_get_learning_rate@4
_fann_get_MSE@4
_fann_get_network_type@4
_fann_get_neuron@12
_fann_get_neuron_layer@12
_fann_get_num_input@4
_fann_get_num_layers@4
_fann_get_num_output@4
_fann_get_quickprop_decay@4
_fann_get_quickprop_mu@4
_fann_get_rprop_decrease_factor@4
_fann_get_rprop_delta_max@4
_fann_get_rprop_delta_min@4
_fann_get_rprop_delta_zero@4
_fann_get_rprop_increase_factor@4
_fann_get_total_connections@4
_fann_get_total_neurons@4
_fann_get_train_error_function@4
_fann_get_train_stop_function@4
_fann_get_training_algorithm@4
_fann_get_user_data@4
_fann_init_weights@8
_fann_length_train_data@4
_fann_merge_train_data@8
_fann_num_input_train_data@4
_fann_num_output_train_data@4
_fann_print_connections@4
_fann_print_error@4
_fann_print_parameters@4
_fann_randomize_weights@12
_fann_read_train_from_file@4
_fann_reset_errno@4
_fann_reset_errstr@4
_fann_reset_MSE@4
_fann_run@8
_fann_save@8
_fann_save_to_fixed@8
_fann_save_train@8
_fann_save_train_to_fixed@12
_fann_scale_input@8
_fann_scale_input_train_data@12
_fann_scale_output@8
_fann_scale_output_train_data@12
_fann_scale_train@8
_fann_scale_train_data@12
_fann_set_activation_function@16
_fann_set_activation_function_hidden@8
_fann_set_activation_function_layer@12
_fann_set_activation_function_output@8
_fann_set_activation_steepness@16
_fann_set_activation_steepness_hidden@8
_fann_set_activation_steepness_layer@12
_fann_set_activation_steepness_output@8
_fann_set_bit_fail_limit@8
_fann_set_callback@8
_fann_set_cascade_activation_functions@12
_fann_set_cascade_activation_steepnesses@12
_fann_set_cascade_candidate_change_fraction@8
_fann_set_cascade_candidate_limit@8
_fann_set_cascade_candidate_stagnation_epochs@8
_fann_set_cascade_max_cand_epochs@8
_fann_set_cascade_max_out_epochs@8
_fann_set_cascade_num_candidate_groups@8
_fann_set_cascade_output_change_fraction@8
_fann_set_cascade_output_stagnation_epochs@8
_fann_set_cascade_weight_multiplier@8
_fann_set_error_log@8
_fann_set_input_scaling_params@16
_fann_set_learning_momentum@8
_fann_set_learning_rate@8
_fann_set_output_scaling_params@16
_fann_set_quickprop_decay@8
_fann_set_quickprop_mu@8
_fann_set_rprop_decrease_factor@8
_fann_set_rprop_delta_max@8
_fann_set_rprop_delta_min@8
_fann_set_rprop_delta_zero@8
_fann_set_rprop_increase_factor@8
_fann_set_scaling_params@24
_fann_set_train_error_function@8
_fann_set_train_stop_function@8
_fann_set_training_algorithm@8
_fann_set_user_data@8
_fann_set_weight@16
_fann_set_weight_array@12
_fann_shuffle_train_data@4
_fann_subset_train_data@12
_fann_test@12
_fann_test_data@8
_fann_train@12
_fann_train_epoch@8
_fann_train_on_data@20
_fann_train_on_file@20
#fann_create_shortcut
#fann_create_sparse
#fann_create_standard
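The decorated names above follow the Windows stdcall convention: a leading underscore, the C function name, then '@' and the total size in bytes of the arguments (4 bytes per pointer, int, or float on 32-bit Windows). WindowsFunctionMapper strips that decoration so plain method names resolve to the exported symbols. A hypothetical check (MapperSketch is not part of FannJ); reflection here does not trigger Fann's static initializer, so no native library is needed to run it:

    import java.lang.reflect.Method;

    import com.googlecode.fannj.Fann;
    import com.googlecode.fannj.WindowsFunctionMapper;

    public class MapperSketch {
        public static void main(String[] args) throws Exception {
            // fann_run(Pointer, float[]) passes two 4-byte arguments: "@8".
            Method run = Fann.class.getDeclaredMethod("fann_run",
                    com.sun.jna.Pointer.class, float[].class);
            String decorated = new WindowsFunctionMapper().getFunctionName(null, run);
            System.out.println(decorated); // prints _fann_run@8
        }
    }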
==== fannj-fannj-0.7/src/test/java/com/googlecode/fannj/FannTest.java ====

/* FannJ
 * Copyright (C) 2009 Kyle Renfro
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 3 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 59 Temple Place - Suite 330,
 * Boston, MA 02111-1307, USA. The text of license can be also found
 * at http://www.gnu.org/copyleft/lgpl.html
 */
package com.googlecode.fannj;

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

import static org.junit.Assert.*;

import org.apache.commons.io.IOUtils;
import org.junit.Test;

public class FannTest {

    @Test
    public void testFromFile() throws IOException {
        File temp = File.createTempFile("fannj_", ".tmp");
        temp.deleteOnExit();
        IOUtils.copy(this.getClass().getResourceAsStream("xor_float.net"),
                new FileOutputStream(temp));

        Fann fann = new Fann(temp.getPath());
        assertEquals(2, fann.getNumInputNeurons());
        assertEquals(1, fann.getNumOutputNeurons());
        assertEquals(-1f, fann.run(new float[]{ -1, -1 })[0], .2f);
        assertEquals(1f, fann.run(new float[]{ -1, 1 })[0], .2f);
        assertEquals(1f, fann.run(new float[]{ 1, -1 })[0], .2f);
        assertEquals(-1f, fann.run(new float[]{ 1, 1 })[0], .2f);
        fann.close();
    }
}

==== fannj-fannj-0.7/src/test/java/com/googlecode/fannj/FannTrainerTest.java ====

/* FannJ
 * Copyright (C) 2009 Kyle Renfro
 *
 * This library is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 3 of the License, or (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Library General Public License for more details.
 *
 * You should have received a copy of the GNU Library General Public
 * License along with this library; if not, write to the
 * Free Software Foundation, Inc., 59 Temple Place - Suite 330,
 * Boston, MA 02111-1307, USA. The text of license can be also found
 * at http://www.gnu.org/copyleft/lgpl.html
 */
package com.googlecode.fannj;

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import static org.junit.Assert.assertTrue;

import org.apache.commons.io.IOUtils;
import org.junit.Test;

public class FannTrainerTest {

    @Test
    public void testTrainingDefault() throws IOException {
        File temp = File.createTempFile("fannj_", ".tmp");
        temp.deleteOnExit();
        IOUtils.copy(this.getClass().getResourceAsStream("xor.data"),
                new FileOutputStream(temp));

        List<Layer> layers = new ArrayList<Layer>();
        layers.add(Layer.create(2));
        layers.add(Layer.create(3, ActivationFunction.FANN_SIGMOID_SYMMETRIC));
        layers.add(Layer.create(1, ActivationFunction.FANN_SIGMOID_SYMMETRIC));

        Fann fann = new Fann(layers);
        Trainer trainer = new Trainer(fann);
        float desiredError = .001f;
        float mse = trainer.train(temp.getPath(), 500000, 1000, desiredError);
        assertTrue("" + mse, mse <= desiredError);
    }

    @Test
    public void testTrainingQuickprop() throws IOException {
        File temp = File.createTempFile("fannj_", ".tmp");
        temp.deleteOnExit();
        IOUtils.copy(this.getClass().getResourceAsStream("xor.data"),
                new FileOutputStream(temp));

        List<Layer> layers = new ArrayList<Layer>();
        layers.add(Layer.create(2));
        layers.add(Layer.create(3, ActivationFunction.FANN_SIGMOID_SYMMETRIC));
        layers.add(Layer.create(1, ActivationFunction.FANN_SIGMOID_SYMMETRIC));

        Fann fann = new Fann(layers);
        Trainer trainer = new Trainer(fann);
        trainer.setTrainingAlgorithm(TrainingAlgorithm.FANN_TRAIN_QUICKPROP);
        float desiredError = .001f;
        float mse = trainer.train(temp.getPath(), 500000, 1000, desiredError);
        assertTrue("" + mse, mse <= desiredError);
    }

    @Test
    public void testTrainingBackprop() throws IOException {
        File temp = File.createTempFile("fannj_", ".tmp");
        temp.deleteOnExit();
        IOUtils.copy(this.getClass().getResourceAsStream("xor.data"),
                new FileOutputStream(temp));

        List<Layer> layers = new ArrayList<Layer>();
        layers.add(Layer.create(2));
        layers.add(Layer.create(3, ActivationFunction.FANN_SIGMOID_SYMMETRIC));
        layers.add(Layer.create(2, ActivationFunction.FANN_SIGMOID_SYMMETRIC));
        layers.add(Layer.create(1, ActivationFunction.FANN_SIGMOID_SYMMETRIC));

        Fann fann = new Fann(layers);
        Trainer trainer = new Trainer(fann);
        trainer.setTrainingAlgorithm(TrainingAlgorithm.FANN_TRAIN_INCREMENTAL);
        float desiredError = .001f;
        float mse = trainer.train(temp.getPath(), 500000, 1000, desiredError);
        assertTrue("" + mse, mse <= desiredError);
    }

    @Test
    public void testCascadeTraining() throws IOException {
        File temp = File.createTempFile("fannj_", ".tmp");
        temp.deleteOnExit();
        IOUtils.copy(this.getClass().getResourceAsStream("parity8.train"),
                new FileOutputStream(temp));

        Fann fann = new FannShortcut(8, 1);
        Trainer trainer = new Trainer(fann);
        float desiredError = .00f;
        float mse = trainer.cascadeTrain(temp.getPath(), 30, 1, desiredError);
        assertTrue("" + mse, mse <= desiredError);
    }
}
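The tests above copy FANN training files out of the test resources that follow. The format is plain text: a header line giving the number of pairs, inputs, and outputs, then alternating input and output lines. A sketch of a hypothetical helper (WriteXorData, not part of FannJ) that writes xor.data-style files:

    import java.io.FileWriter;
    import java.io.IOException;
    import java.io.PrintWriter;

    public class WriteXorData {
        public static void main(String[] args) throws IOException {
            float[][] in  = { { -1, -1 }, { -1, 1 }, { 1, -1 }, { 1, 1 } };
            float[][] out = { { -1 }, { 1 }, { 1 }, { -1 } };
            PrintWriter w = new PrintWriter(new FileWriter("xor.data"));
            // Header: number of pairs, inputs per pair, outputs per pair.
            w.println(in.length + " " + in[0].length + " " + out[0].length);
            for (int i = 0; i < in.length; i++) {
                w.println(row(in[i]));
                w.println(row(out[i]));
            }
            w.close();
        }

        private static String row(float[] values) {
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < values.length; i++) {
                if (i > 0) sb.append(' ');
                sb.append((int) values[i]); // xor.data holds integer-valued samples
            }
            return sb.toString();
        }
    }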
fannj-fannj-0.7/src/test/resources/000077500000000000000000000000001237121363200173265ustar00rootroot00000000000000fannj-fannj-0.7/src/test/resources/com/000077500000000000000000000000001237121363200201045ustar00rootroot00000000000000fannj-fannj-0.7/src/test/resources/com/googlecode/000077500000000000000000000000001237121363200222135ustar00rootroot00000000000000fannj-fannj-0.7/src/test/resources/com/googlecode/fannj/000077500000000000000000000000001237121363200233075ustar00rootroot00000000000000fannj-fannj-0.7/src/test/resources/com/googlecode/fannj/jna/000077500000000000000000000000001237121363200240575ustar00rootroot00000000000000fannj-fannj-0.7/src/test/resources/com/googlecode/fannj/jna/xor.data000066400000000000000000000000441237121363200255200ustar00rootroot000000000000004 2 1 -1 -1 -1 -1 1 1 1 -1 1 1 1 -1 fannj-fannj-0.7/src/test/resources/com/googlecode/fannj/parity8.train000066400000000000000000000120111237121363200257410ustar00rootroot00000000000000256 8 1 1 1 0 0 0 0 0 1 1 1 0 0 1 0 0 1 0 1 0 0 0 0 0 1 1 0 0 1 1 0 0 1 1 1 0 1 1 0 1 1 0 1 1 1 0 1 1 1 1 1 0 1 1 1 0 1 0 1 0 0 1 0 1 0 0 1 1 0 1 1 1 1 1 0 0 1 0 1 0 0 1 0 1 1 0 0 1 0 1 0 0 1 0 1 1 1 0 1 1 0 0 0 0 0 0 1 1 0 0 1 0 0 1 1 1 1 1 0 1 0 0 0 1 0 0 0 0 0 0 1 1 1 0 0 1 0 1 0 1 1 1 1 0 1 1 0 1 1 0 1 0 1 1 0 0 1 0 0 1 0 1 1 0 0 1 0 1 0 1 0 1 0 1 0 1 0 0 0 1 1 0 0 0 1 1 0 0 1 1 1 1 1 0 0 1 0 1 1 1 1 1 0 0 1 1 1 0 1 1 0 1 0 1 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 1 1 1 0 1 0 1 1 1 1 1 1 0 0 0 0 0 0 1 0 0 0 1 0 1 1 0 0 0 0 1 1 0 1 1 1 1 1 0 0 1 0 0 1 0 0 1 0 0 1 1 0 0 0 0 1 0 0 1 0 0 1 1 0 0 1 1 0 0 0 1 0 0 0 1 1 1 0 0 1 0 1 0 0 0 0 0 0 0 1 0 1 1 0 0 1 0 1 1 0 0 1 0 0 1 0 0 0 1 0 1 0 0 0 1 1 0 0 0 1 0 1 0 1 0 0 1 1 0 0 0 1 0 0 0 1 1 0 1 0 1 1 1 1 0 0 0 1 1 1 1 1 0 0 1 0 1 0 0 0 0 1 0 0 1 0 0 0 1 0 1 0 0 1 0 0 1 1 0 0 1 0 1 1 1 1 0 1 1 1 1 0 1 0 1 0 0 1 0 1 0 1 1 0 1 1 0 1 0 0 0 0 1 0 1 0 0 1 1 0 1 0 0 0 0 0 0 0 1 0 1 1 0 1 0 1 0 1 0 0 1 1 1 0 1 0 1 1 0 0 0 1 0 1 1 0 1 0 0 0 0 0 1 1 1 1 0 0 1 0 0 0 0 1 1 1 0 1 0 0 1 0 1 1 0 0 1 0 0 0 0 0 1 0 0 1 1 0 0 0 0 1 1 0 1 0 0 1 0 1 0 1 1 0 1 1 0 1 1 0 1 0 1 1 0 1 0 1 1 1 0 0 0 1 0 0 1 0 0 1 1 1 0 0 0 0 1 0 0 0 0 0 1 1 1 0 1 0 0 1 0 0 0 1 0 0 0 0 1 1 1 0 1 1 1 0 0 1 0 1 0 0 0 0 1 0 1 1 0 1 0 0 0 1 0 0 0 0 1 1 1 0 0 0 1 1 1 1 1 1 0 1 0 1 0 1 1 0 1 1 1 0 1 0 1 1 0 0 0 0 0 0 1 0 1 1 0 1 0 1 1 1 1 1 1 0 0 1 0 1 1 1 0 0 1 1 0 1 0 1 0 1 1 1 1 1 0 1 0 1 0 1 1 1 0 1 0 0 0 1 1 0 1 0 0 0 1 0 0 1 1 1 1 1 1 1 1 1 1 0 1 1 1 0 0 0 1 1 1 1 0 0 1 1 1 0 1 0 0 1 1 1 1 1 1 1 1 0 1 1 0 1 1 0 1 0 1 0 1 1 1 0 0 1 0 0 1 0 0 0 0 0 1 0 0 1 0 1 1 0 0 0 0 1 1 0 0 1 0 0 0 0 0 1 1 1 1 0 1 0 1 0 1 1 1 1 1 0 0 0 1 1 1 1 0 0 0 0 0 1 0 1 1 1 0 1 1 1 0 0 1 1 0 1 1 1 0 1 0 0 1 1 1 0 0 1 0 1 0 0 1 1 0 1 1 1 1 0 0 1 0 1 1 0 0 1 1 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 0 1 1 1 0 0 0 0 0 1 0 0 0 1 0 1 1 0 1 0 1 1 1 1 1 0 0 1 0 1 0 1 0 1 0 0 0 1 0 1 1 1 0 0 0 0 1 0 0 1 1 0 1 1 0 1 1 1 0 1 0 1 1 0 0 0 1 0 1 1 0 0 1 1 0 0 1 1 1 1 0 1 1 1 1 0 1 1 0 1 1 1 0 0 1 0 0 0 0 0 0 0 1 0 0 0 1 1 0 0 1 1 1 0 1 1 0 0 1 0 0 0 0 1 0 0 0 0 1 1 0 1 1 0 0 1 1 1 1 1 0 1 0 1 0 1 1 1 1 1 0 0 1 0 0 1 0 0 0 1 1 1 0 0 1 1 1 1 1 0 1 1 1 1 1 0 1 0 0 0 1 0 1 1 0 0 0 1 0 0 0 1 0 1 0 1 1 1 0 0 1 0 1 0 1 0 0 1 1 0 0 0 1 0 1 1 1 0 1 1 1 1 0 0 1 0 0 0 1 1 0 1 0 0 1 0 0 1 1 1 0 0 1 0 0 0 1 1 1 1 1 0 0 1 0 0 0 1 1 1 0 0 1 1 0 1 1 0 0 1 1 1 1 0 1 1 1 1 1 1 1 1 0 0 0 0 0 1 1 1 0 0 1 1 0 1 1 1 0 1 1 1 0 0 1 1 1 0 0 0 0 1 0 1 1 0 0 0 0 0 1 1 1 1 0 1 0 0 1 1 1 1 0 1 0 1 0 0 1 1 0 1 1 0 1 0 1 0 1 1 1 1 0 1 0 0 1 1 1 0 0 0 0 1 0 1 1 1 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 1 1 0 1 0 1 0 1 
1 0 1 1 1 1 0 1 0 0 1 0 1 1 0 0 1 0 1 1 1 0 1 1 0 1 0 1 0 0 1 0 0 1 1 0 0 0 0 1 1 1 1 0 0 0 1 1 0 0 0 0 0 0 1 0 1 0 1 0 0 1 0 0 1 1 1 0 0 0 1 0 1 0 1 0 0 0 1 1 0 1 1 0 1 0 0 0 1 0 0 1 1 1 0 1 0 0 0 0 1 0 0 1 1 0 0 1 0 0 0 0 1 0 1 0 0 1 1 0 1 1 0 1 1 0 1 1 1 0 1 0 0 1 1 1 0 0 1 1 1 0 0 0 1 0 0 0 0 1 0 1 1 0 1 1 0 1 0 1 0 0 1 0 1 0 1 1 1 0 1 0 1 1 0 1 0 0 0 1 1 0 0 0 1 1 0 0 1 0 1 1 1 1 0 0 0 0 1 0 0 1 1 0 0 1 0 1 0 1 1 0 0 1 1 1 1 0 1 1 0 0 1 0 0 0 0 0 0 0 1 0 0 1 1 0 1 0 0 0 0 0 0 0 0 1 0 0 1 1 1 0 0 0 1 1 0 0 0 0 0 1 1 0 0 1 0 1 1 1 0 0 1 1 1 1 1 0 1 1 0 0 1 1 0 1 0 0 1 1 0 0 1 1 1 1 0 1 0 0 0 0 0 0 0 1 1 0 0 0 0 1 0 0 0 0 1 1 1 0 0 0 0 1 0 0 0 0 1 1 0 0 0 0 0 1 1 0 0 0 1 1 0 0 1 1 0 0 1 0 1 0 0 0 1 0 1 1 1 0 0 0 0 1 1 0 0 0 0 1 1 1 0 1 1 1 0 0 0 0 1 0 1 0 0 1 1 0 0 1 1 1 1 0 1 1 0 1 0 1 1 0 1 0 0 1 0 0 0 1 1 1 0 0 1 1 0 1 0 0 1 0 0 1 1 1 1 0 0 1 1 0 1 0 1 0 1 1 1 1 0 0 1 0 1 1 1 0 0 0 1 0 1 0 0 0 1 0 1 1 0 0 0 0 1 1 0 1 1 1 1 1 0 1 1 0 0 1 1 0 0 1 1 0 1 1 1 0 0 0 1 0 0 1 1 1 1 0 0 1 0 0 0 1 0 1 0 0 0 0 0 0 1 0 0 1 1 1 0 0 0 1 1 1 1 0 1 1 0 0 1 0 1 1 1 0 1 1 0 1 1 0 1 1 1 0 1 1 0 1 1 1 0 1 0 0 0 0 1 1 0 0 1 0 0 1 0 1 1 0 1 0 1 1 0 1 1 0 1 0 1 0 0 0 1 1 1 0 1 1 0 0 0 0 0 0 1 1 0 1 0 0 1 0 0 1 0 0 0 0 0 1 1 0 0 0 0 0 0 1 0 0 1 1 1 1 0 0 1 1 0 1 1 1 1 1 0 0 1 1 0 1 1 1 1 0 0 1 1 1 0 1 1 0 0 1 1 1 1 0 1 1 1 1 1 1 0 1 1 0 1 1 0 0 0 0 1 1 1 0 0 1 1 1 1 1 0 1 1 1 0 1 0 1 0 1 0 0 1 1 0 0 0 0 0 1 1 1 1 1 1 0 1 0 1 0 1 0 0 1 0 0 0 0 0 0 0 1 1 1 1 0 0 0 1 1 0 0 1 1 1 0 0 0 1 0 0 0 0 1 0 1 1 1 1 1 1 0 1 1 0 0 0 1 0 0 1 1 0 1 1 1 1 0 1 0 0 0 1 0 0 0 0 0 0 0 0 1 1 0 1 1 1 1 1 1 0 0 1 0 1 1 1 0 1 0 1 1 1 0 1 0 1 0 0 0 0 1 1 0 0 1 1 0 0 1 1 0 1 0 0 1 1 1 1 0 1 0 1 1 1 0 1 0 1 1 1 0 0 0 1 0 1 0 1 1 1 0 0 0 1 0 0 1 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 0 0 1 1 0 0 0 1 1 1 1 0 0 0 1 1 0 0 0 0 0 1 0 1 0 0 0 1 1 1 1 0 0 0 0 1 0 0 1 0 0 0 0 0 0 1 1 0 1 0 1 0 1 0 1 0 0 1 0 1 0 0 1 1 0 1 0 0 1 0 0 1 0 1 0 1 1 0 0 0 1 1 0 1 1 0 1 1 0 1 1 0 0 0 1 1 0 1 0 1 1 1 1 1 1 1 1 1 1 0 1 1 1 1 1 0 1 0 1 1 0 1 1 1 1 1 1 1 0 1 0 0 1 fannj-fannj-0.7/src/test/resources/com/googlecode/fannj/xor.data000066400000000000000000000000441237121363200247500ustar00rootroot000000000000004 2 1 -1 -1 -1 -1 1 1 1 -1 1 1 1 -1 fannj-fannj-0.7/src/test/resources/com/googlecode/fannj/xor_float.net000066400000000000000000000035731237121363200260240ustar00rootroot00000000000000FANN_FLO_2.1 num_layers=3 learning_rate=0.700000 connection_rate=1.000000 network_type=0 learning_momentum=0.000000 training_algorithm=2 train_error_function=1 train_stop_function=1 cascade_output_change_fraction=0.010000 quickprop_decay=-0.000100 quickprop_mu=1.750000 rprop_increase_factor=1.200000 rprop_decrease_factor=0.500000 rprop_delta_min=0.000000 rprop_delta_max=50.000000 rprop_delta_zero=0.100000 cascade_output_stagnation_epochs=12 cascade_candidate_change_fraction=0.010000 cascade_candidate_stagnation_epochs=12 cascade_max_out_epochs=150 cascade_max_cand_epochs=150 cascade_num_candidate_groups=2 bit_fail_limit=9.99999977648258209229e-03 cascade_candidate_limit=1.00000000000000000000e+03 cascade_weight_multiplier=4.00000005960464477539e-01 cascade_activation_functions_count=10 cascade_activation_functions=3 5 7 8 10 11 14 15 16 17 cascade_activation_steepnesses_count=4 cascade_activation_steepnesses=2.50000000000000000000e-01 5.00000000000000000000e-01 7.50000000000000000000e-01 1.00000000000000000000e+00 layer_sizes=3 4 2 scale_included=0 neurons (num_inputs, activation_function, activation_steepness)=(0, 0, 0.00000000000000000000e+00) (0, 0, 
0.00000000000000000000e+00) (0, 0, 0.00000000000000000000e+00) (3, 5, 1.00000000000000000000e+00) (3, 5, 1.00000000000000000000e+00) (3, 5, 1.00000000000000000000e+00) (0, 5, 1.00000000000000000000e+00) (4, 5, 1.00000000000000000000e+00) (0, 5, 1.00000000000000000000e+00) connections (connected_to_neuron, weight)=(0, -2.22661828994750976562e+00) (1, 1.85336101055145263672e+00) (2, -1.52579700946807861328e+00) (0, -1.53036403656005859375e+00) (1, 2.32853889465332031250e+00) (2, 1.64283800125122070312e+00) (0, -9.78324294090270996094e-01) (1, -1.17475807666778564453e+00) (2, -1.95483958721160888672e+00) (3, 3.02897882461547851562e+00) (4, -2.84859561920166015625e+00) (5, -1.93521940708160400391e+00) (6, 1.01287376880645751953e+00)