pycorrfit-0.8.1/.gitignore

*.py[cod]

# C extensions
*.so

# Packages
*.egg
*.egg-info
dist
build
eggs
parts
bin
var
sdist
develop-eggs
.installed.cfg
lib
lib64

# Installer logs
pip-log.txt

# Unit test / coverage reports
.coverage
.tox
nosetests.xml

# Translations
*.mo

# Mr Developer
.mr.developer.cfg
.project
.pydevproject

# LaTeX
*.aux
*.glo
*.idx
*.log
*.toc
*.ist
*.acn
*.acr
*.alg
*.bbl
*.blg
*.dvi
*.glg
*.gls
*.ilg
*.ind
*.lof
*.lot
*.maf
*.mtc
*.mtc1
*.out
*.synctex.gz
*.pdf
*.bak

# Backup files
*.py~
*.md~

pycorrfit-0.8.1/README.md

![PyCorrFit](https://raw.github.com/paulmueller/PyCorrFit/master/doc-src/Images/PyCorrFit_logo_dark.png)
=========

This repository contains the source code of PyCorrFit - a scientific tool for fitting correlation curves on a logarithmic plot.

In current biomedical research, fluorescence correlation spectroscopy (FCS) is applied to characterize molecular dynamic processes in vitro and in living cells. Commercial FCS setups only permit data analysis that is limited to a specific instrument by the use of in-house file formats or a finite number of implemented correlation model functions. PyCorrFit is a general-purpose FCS evaluation software that, amongst other formats, supports the established Zeiss ConfoCor3 ~.fcs file format. PyCorrFit comes with several built-in model functions, covering a wide range of applications in standard confocal FCS. In addition, it contains equations dealing with different excitation geometries like total internal reflection (TIR). For more information, visit the official homepage at http://pycorrfit.craban.de.
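For orientation, the following is a textbook example of such a correlation model (an illustrative sketch; PyCorrFit's built-in models and their exact parameterizations are described in the documentation linked below). The symbols `N` (mean particle number in the detection volume), `tau_D` (diffusion time), and `S` (structure parameter of the focus) are standard FCS conventions, not identifiers from the package. The autocorrelation for free three-dimensional diffusion through a Gaussian confocal volume is:

```latex
% Free 3D diffusion in a confocal (3D Gaussian) detection volume.
% N     : mean number of particles in the effective detection volume
% tau_D : diffusion time through the focus
% S     : structure parameter (axial over lateral extension of the focus)
G(\tau) = \frac{1}{N}
          \left(1 + \frac{\tau}{\tau_D}\right)^{-1}
          \left(1 + \frac{\tau}{S^{2}\,\tau_D}\right)^{-1/2}
```

Fitting such a curve to measured data on a logarithmic lag-time axis yields, for example, the particle concentration (via N) and the diffusion coefficient (via tau_D).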
- [Download the latest version](https://github.com/paulmueller/PyCorrFit/releases)
- [Documentation](https://github.com/paulmueller/PyCorrFit/raw/master/PyCorrFit_doc.pdf)
- [Run PyCorrFit from source](https://github.com/paulmueller/PyCorrFit/wiki/Running-PyCorrFit-from-source)
- [Write model functions](https://github.com/paulmueller/PyCorrFit/wiki/Writing-model-functions)
- [Need help?](https://github.com/paulmueller/PyCorrFit/wiki/Creating-a-new-issue)

pycorrfit-0.8.1/MANIFEST.in

include doc-src/*.tex
include doc-src/*.bib
include doc-src/Images/*
include external_model_functions/*
include README.md
include ChangeLog.txt
include PyCorrFit_doc.pdf

pycorrfit-0.8.1/ChangeLog.txt

0.8.1
- Thanks to Alex Mestiashvili for providing initial setup.py files and for debianizing PyCorrFit (@mestia)
- Thanks to Thomas Weidemann for his contributions to the documentation (@weidemann)
- Bugfixes:
  - Some ConfoCor files were not imported
  - The cpp was not calculated correctly in case of background correction (#45)
  - Enabled averaging of single pages (#58)
  - Background correction for cross-correlation data is now computed (#46)
- Improvements of the user interface:
  - The menus have been reordered (#47, #50)
  - The fitting panel has been optimized (#49)
  - The slider simulation got a reset button (#51)
  - The Help menu contains documentation and wiki (#56)
  - Model functions that are somehow redundant have been removed from the menu, but are still supported
  - The model doc strings were fully converted to Unicode
  - Several text messages were modified for better coherence
  - The background correction tool is more intuitive
  - Statistics panel improvements (#43):
    - Run information is included in the data set title
    - The page counter starts at "1" instead of "0" (#44)
  - New handling of background correction (#46, #53)
0.8.0
- Filename/title of each tab now shows up in the notebook (#39)
- Statistics tool can plot parameters, and page selection with the Overlay tool is possible (#31)
0.7.9
- Support for Mac OS X
- Enhancements:
  - Export file format (.csv) layout improved
  - Model function info text in UTF-8
  - Improved warning message when opening sessions from future versions
  - New feature lets the user set the range for the fitting parameters
- Bugfixes:
  - Cleaned minor tracebacks and exceptions created by the frontend
  - Mac version now works as expected, but .app bundling failed
  - LaTeX plotting features now support more characters such as "[]{}^"
0.7.8
- Enhancements:
  - Averages can now be calculated from user-selected pages
  - Pages selected in the Overlay tool are now automatically set for computation of the average and for global fitting
  - Source pages are now displayed in the average title
  - Graph normalization with particle numbers is now possible
- Bugfixes:
  - Errors during fitting with weights equal to zero
  - Overlay tool displayed the last curve after all pages had been removed
  - Global fit did not work with weights
- Session saving now uses 20 digits accuracy
- CSV export now uses tab-delimited data for easier Excel import
- Added version checking for session management
0.7.7
- Fixed: Tools windows could not be closed (or moved on MS Windows)
- Fixed: .csv export failed in some cases where no weights were used
- Enhancement: The user is now asked before a page is closed
- Enhancement: In the tool "Page Info" and in exported .csv files, variables and values are now separated by a tab stop instead of a "="
- Fixed: Opening a session with an empty page failed in some cases
- Fixed: Tool "Statistics" failed to output the column "filename/title" if that key value was empty - empty strings are replaced with "NoName"
- Enhancement: Tool "Overlay" now asks the user to check kept curves instead of showing the curves to be removed
- Enhancement: Tool "Overlay" now has a "Cancel" button
0.7.6
- Improved handling:
  - Tools are now sorted according to a standard workflow
  - Renamed "Curve selection" to "Overlay tool" - this is more intuitive
  - Tools will now stay open or may be opened when there are no open pages (#25)
  - Filenames and runs are now displayed on each page (also added a filename/title tag) (#23)
  - Notebook: moved the close button onto each tab to prevent accidental closing of tabs
- Improved tool "Statistics" (#21):
  - Fixed the case where "useless" data was produced - instead, "NaN" data is written, and the warning message was removed accordingly
  - Row-wise ordering according to page numbers (#22)
  - Column-wise ordering is now more intuitive (fitted parameters with errors first)
  - Some columns are now checked by default
  - PyCorrFit remembers checked parameters for a page (not saved in the session)
- Improved tool "Overlay" (#23):
  - New feature: Overlay shows the run number of each file upon import; the run (or index) of an experimental file is unique to PyCorrFit
  - Upon import, filenames and runs are displayed
  - In a session, the filename/title is displayed
- Web address of PyCorrFit changed from "fcstools.dyndns.org/pycorrfit" to "pycorrfit.craban.de"
- Minor bugfixes: Batch control, Global fitting, import dialog
0.7.5
- Added model functions to the documentation
- Weights from fitting are now exported in .csv files
- Rework of the info panel for fitting
- Cleared unintuitive behavior of session saving: the fitting parameters were read from the frontend, which could have led to saving false fit meta data
- During fitting, units are now displayed as "human readable" (#17)
- Slider simulation now also uses human readable units (#17)
- Secured support for Ubuntu 12.10 and 13.04
- Fixed: new line (\n) characters for LaTeX plotting on Windows
0.7.4
- New tool: Colorful curve selection
- Import data: Curve selection possible
- Average: Crop the average according to the current page
- Fixed: Page now displays Chi-squared of the global fit
- Fixed: Chi-squared and parameters of global fitting are now stored in sessions
0.7.3
- Issue closed: External weights from averages are now saved in the session (#11)
- Solved minor bugs
- Added estimation of errors of the fit (#12, #14)
- Fixed: Some .fcs files containing averages were not imported
0.7.2
- Bugfix: Issue #10; we now have a goodness of the fit if weighted fitting is performed
- Bugfix: Weights for fitting were not calculated properly (sqrt(std))
- Bugfix: Batch control IndexError with the Info window opened
- Tool Statistics: Sort values according to page numbers
- Tool Global: Added weighted fitting
- Residuals: According to weighted fitting, weighted residuals are plotted
- Average: Variances from averaging can be used for weighted fitting
0.7.1
- Feature: Added Python shell
- Bugfix: Saving an image was not performed using WXAgg
- Bugfix: Notebook pages were drag'n'droppable
- The update function now works in its own thread
- Code cleanup: documentation of model functions
- Added program icon
0.7.0
- The file import dialog was enhanced (#4, #5 - subsequently #7, #8):
  - Now there is only one "load data" dialog in the file menu.
  - The model function is chosen for each type of data that is to be imported (AC, CC, etc.).
  - Files that do not contain data are pointed out to the user, and the program continues with the other files.
- Bugfix: Channel selection window caused a crash on file import (#1)
- Bugfix: Hidden feature changed fixed parameters during fitting (#2)
- Feature: Convert the TIR model function parameters lambda and NA to sigma (#3)
- Code cleanup: Opening data files is now handled internally differently
0.6.9
- Initial GitHub commit

pycorrfit-0.8.1/PyCorrFit_doc.pdf

[binary content of the bundled PDF documentation; not reproduced here]
3o-%-=@ʖ.&ZL=Ht7gb8o7_:9=X~ wdgx' 0wBk\fDB<f?,_`XR;Y0A\ S3(gP̪{?3hA>лz俈]gbo>H?{@ {:[䝂ٟ!{W}Y?{Wo"fBw+s~ux[}gl3l\ѻq{$}g^t~󝼳4=:_f 'Vn t|o?O5^ s4quzo6yLM>XՆW3BH[=hV$qmmWFx65A0^ugdH]?1<ϋf5d x,e+r.{G]XO*.9h9y<2Fb8:sԹY7xzo-؇yru6n| |;QHr6joGBc A!U@`~RD.5Asuѝ21Dtg_U_}eJNC ?+VMX3ص#Ʀdp+SCh3vtzn-rS4^ݫ.v)~Lmk)2h)ew6@ω]9()7ݕ"vzyo/to0u%vUVtuZp(XiQ2(kj&Cc,Ҡ_^~w+]Sb~ea{4Gk)N^r)MZ$R\Z,3TG 8mvyB$bְI.8(rI| ltU}}::Ķ>&!5qA2za`a~v~6 J?Z)k^=?cݏhql}cӆ} b Lt= _A`:Ɔ-/nT#a\8L2}{mD+SMbuN؆d8ώa -H\z:3W+aB7 nO4qa)4E 8J81I#}K {[ل4) 5C}e ]!6Q 0mb|SN栘_ݠMy$Fi*k8s#kc7&Ee]Йcp$Uи Fӑ~{yF>_~Ίij~{2-G'.~$/:b)$O"--%|CtQ lU)g!u%91CT~0· el-JAfMEpL b,DwT?WD9(6 gZS+ArtZU.PÌ'tLX'F}P<)p Y"%T`!x5ks={]A "\^aH o3v6 F `$>ӣq8rA5بtMCU+؄qYL>6`% )JP{#7D 'lг<'~>8b+ udS{G񽍰r6nSֆy(VuL%]vWj0;}6\ (FTGr.?u7JRX ($ΔE<;h [TDy_^d~e#"bYOH;Om0צ%ͬ;mۚU}H(UGn 3-c; |SGW^:Wip2*2x O$]v ՞>S+0 kIP C D0xM^OD4pa\+*hV] o7l'^IwXN$ a +3S~KM%"Oit0K'd-W~IL u2pg @8WWXzpϊtER-kVlު>EbFZJeSoz٫D9<^]Fm9t?ZH+B>JZTLrH1l=Ę!p4{ yGW^n:t7UT'#$yEIw3'uN!B*!8`0|5#$a?¤ 6zio:8uJn!A/3 ٌPk JqUβ)*=ްOMlΛ3O+їD WYWWȤ3&PPelhԲ54gmăӗ&2qwL6*sVÒVYm^lÕ5Z rM!-f dk~y}Uns{*FT)"쒈dE߸DX~ZtcJVg/ yRSl^[tܽ|Qx>-BZ"< y[,3#dƷu' rPkFM/]׾ӵJn܀=Ú^MEN&F$00 I5XQTN([t3IŸRt'g'Udtfx"#H BݻI\]o@ҫHϻCO/ÉîQƹ R]uki;=#y['Nq8B;Xt{LQ GܕLٜ%x2Ժ^ &).:~Qcu@Rq4#@=,}]Z2mO٪&fN=fuP;z"89߸` _Q.(:cAA票&+Idem+qx))9HGe/A08~4~;T? 6zv ;i;?q X@*іjK(cXw`X݄8oqH__zrw8;;~ĀH^z^:rWѥ׉،@߲-xkkǰi`'a&dX*E\UQRӛv[!Gߗ/JZڏ( yv@(tkͯ<ԛ,8wD>0lۮ.x2- 'dMP=wk# )4(_рU!m <W RFLbɟxH@li]P,0o$L!xH'<ԍ:=@JsSd68C9#lKܨŌ[Fbd7/F4{ +q?S6Iӗ)_B z ]U)xzď䳜#9Ž%zV( LG$~"g?ܮコ.<<%$P8Ⱥ uBspS.X34ZGPWC&b ~މG^*e_I lsD(\m%U}ʀ…sX6X}n.?i#O4vZjfJ?3,X-t:\懲>&jn@Lj^mŬS8n6Z8/S -j6Y'Y3D ?1Xl(:D&lx{_JI (h&e 5X4<:p2z.}o2c?]QCT{6/w &R*–῁*I ':9̏am4o]#oN͓C;yv3r~k);@E \)$] ފW Ӄ# ةwuQ*C֏tM\7; 3OR-7DE\6vzp>i8jIqUza~‖*%;>`gtAC CIZ I&E\n|t>HpcUN0sHmbHaV;>{9Ef'i̖zs!n#g&l\Sg}J7ۜ5;^_[)}ohEk:brW"fC,I]=JzpFNLG7 LS0еn E˼ vfL4[~em!'e~U #bYc}[lFyYbR~w쬿GgϩT}P ?L\9bTP(x;cQ^/hBEP|ʦLR2 cNZAWJj#g}rW=4VnQ yXPh:b'ě? ͡vJ 3ψUaxTadj>1;jI NVwz]0@x\rހ7?!<}DXd!c= LpO6a.."p*velY!z]ah1/o0HECT BI濨P%k\Q@m ' 2bOڨC'5K[,R<'~YڙGSAᣜ8|rtwX$ Zs7ng?ϩ6IQ4A3ȣC{.33ϞA̸*X7> 8n|s1 :Eɑis$ #y|~mWkni'GJxSITخ6Yb,F#W}ʃhgp(X%>`_ Ta7ş4FDKT68ֻZL_Pn/%My㮴!jnw܁n/*& ad*!+Ci(i;Mt -73D.\dU{ͻMhp[9׻aR:D9م0Odr>x o-Ib5K,M&Os#`c#U.ƍ~Y:qPW9;m o i4"XZNŏ$W4}ِX$UآwsQVsIy{p[p5{Q'AIeHPmFĒ_8=UN*"{glaCs>i4WaE^`=)T }gAΎ92b p'jG$Ikj@blG3 saAA+c.HF)1%eo~?60)힔KQcghpr ƲlA);J+=EbWu/"(9f%8k8*eyˬ-0LFH${D"g*(-~ 8}!֏-!2+sC'>}#\lrR'֧8b?UYxIZf'3dVSGWoDkĄ$6'cXfQ0GYP,gb4[EArfR;@7:,h|]!|а2& u\+ VR'i/\DdIeo+9AɛV 3^u/bwQ~[0kTi2"埳,z`pnhqq4&WUXtct6&XdoNd;_-zu5ޠWwЙŊ<[wvN֍( F}0vb.|nϥ}+<{^*Rd>QX(R~} LlH>IǪ45êw(9srGp ͩ޶)#ڢJQg}#^+{mvQAUD*e[j ˔&H#C;5F{=N1E1PDOeN`HS&^O֨$boW2l$uO~B*CsPl8Uc)~,1Du^wFM$ATYnH"]iDYd mM!9πʴ OD{[Y\0y6qv,Iy4džpG+0PTRƞ|DM8E f•ДxrraX Rۚ~ȕ`k# VȄZ|SM ΫI˄G4B}K!?Jjns[ԡ٦![kmQ2-h%>] V+$H 42 xY{Elk?wgZ6踪Dg_$C/ɪcv$$^EN5i͉RJ /EWUE4 {t, ۵-S i\ղ*kms~eD 833kQ_xByͮb&;Gg4 OHjmA~vɒܭ-n8&z>w uc3]4bˮ*)Ý&:(&d3~DN GSկd#gzax oFҍ]L W8 6ZD$*4|Ўq=1w/"e{g,YuwRJ4)<.Xbd>jZ#4 C1\%Q򰊘naR"XZ}55\H17N}V'&Qb)F39_AjwH%[{Wt!]at#N~:rcqomDI~f!>Xꗓm9dfJ ԩW2* O Z[wro*fw蘙A/KP9z?_Fyڛ,;T0$ KE̯w'ׇvsMP7n=̲=P7KI,L;!HIEL/v+0MlgAy\=,ɢ1S^\Sԅb(ie_kQ? 1|~=o5]UmDg٧k`:y4\J a:1pĴZ˰m' ?r>l䚥uaY)dM}ć]ϾH'dp~[=K3o Vs 2~'=ࣙ>@[fحGNRo_ OTZEoK*yf@hLq]WAVIW 1&PJ$Qۺّ$nOp>Fjxͷ%E((^߇[~a!]2 Dʼh&]` [z!ausAvFn}J)US(I̦W**jnG SA_~cUXqȀ*NO-FGlz6~/88|%1FN ˓l{bѓfrOH/<+EBXD}8¶`9tCھҔMSHOMU #L߿TExg_VP?`eGJ|:~YˤtH{Ukc"|R Љ%-dܸ 4BHsڮ0/k HYsfDh^U %kw>t F$?ebNF=&7͇9ys>:.J/&.' 6ލKΨn\ٰތêwX. Yp˰ Bgi c`cR '˭Ӻ>g@@Z6E|!VO9@#GxP.q_^X4=~h[]\oF#9B#C16wh 1L/a'!fojiG-"7nQA~!!i?:.#E&IѽUw<ԭ; Yٙ:fkx)&ܶ_ĽzMfWs+H-[̈e7υ; upo0Yn9M\,,e>r]n*! 
*] wWb4 1\DE,KV:;q[ yBHZ7BKT OFBD[+lfK$aSaDbP\v 9Ie㸰`r263Qڠx*B Nc' !tHP!4)_x|JTVN4Q(OI\|=r1Z~-67Tʚjj-_-5&vIC{f1DÖ!taXoΨzz#8pRqM;)$^q)fנ".Ю3-AWY/S}ycRqW>Fy#8B3p}#コ^ee9>`@"A*_$JEk3HqBޒVV8J҆!:XcV8>PskRa?v^onkNJ4滫buEdKcTxI=`KF5ǃiCB[rȡvY(ȹ\P8V% w5]7jiɰO,SA-=)j6/ \QU';2ax=:Qz| =tpU32> Dzެ8yx@^V#z}Oz)%zA۽Ppx`IUXl$*5ۜtUx'By#L^<[Df5j5SPoC-t-pA0BP4!hcZ +.!\#&]@@sIqhU=K] FMڭM'xhWyJҏ[{A6b$ <_J}X7uqB POmym߲޳ cs T6:j;x\+^n `SQX_%OZ'3n+[0uёn~1w}G 26ʄQ;<4V-M5/u Us K∾ fmIBXA9TRF9d$DCXt[9%IO2phc&-=9 'F~5G<_A.)l'G\q:-QP_1}l$mxcF(܄ Yw5;O{YG'*tX6CnԓG81Crm:BAuxh?$TXWxexѕk+JrlS:/-㝃ـOT$K]xW(GUUѭTf6&4g똡ATVv;&}Zr!b?2) б}gp$$9`[jiati_m[ 5u^Ye#SudchXcuxMָ.|oo{2HPNQP.i+|l?ןqڱ''E y nE-@M8Ȇ vbn.=C {SɌu>N2H,EI$Ҟ`1(znnLSH(y.z{Ex8tF3OO J>|) ڷyU&ͲjN[oX}?1TCcE? QwV~FUk4+}&z:=oDJۖvz@%bL/޽FK\4Vl|%ЗhS&ϭ?̍y>3Y !1 ncFS&y*@F 2X)=6cÌ!r5T5(*V>|uap0=2}K㤒wJbXlƟȘku@ Q+\TǛ9 &*lxr&7 )!iuLX~'Q)bᖵ^o 'PFw?3U":)mIl @&Q(R~Rt([9KFːgă4| wų,:[6K&mKJGz(/k h|GlcT;ܴm,Oi4B@&_~ 8EqfO"#!*VK1Q_jwbe۬s)V\XHU|ȐyUb{ ~ݤHą$ʸTR!$6FQMJ@+sČtJO< 1f4) lgRGςyt 4xތQI.lK0iب:{IjkrFFayhTu*A36kyx&Y?SZ _-Fp!IvpcL)Mb9΅>tAn]x (7tz_/eVUMN8]H`4|`&Nn/X. endstream endobj 693 0 obj << /Length1 1383 /Length2 6023 /Length3 0 /Length 6967 /Filter /FlateDecode >> stream xڍW4X!Hzg-z轗Xv.dEHQ&$D-ZDw7=޳ޙgfgf9g(AAUp PI$DF0+aE`PBBhMp+PX(,~GX"w`/-@( stBc KII *A0{0 F;Aݰ@C= W n'4 %@:ah'EzA!:`7of np@{P QO8b յPoo?Oo‚3 +lops}apG UD`8'B`^`+ us0PUQCeQ(OB?`(!ܠp4 ~0$v_ߓu#08' 1 UWhB=P{'|ݡ?Xw4Q`/(wǿOaa fAap?ٱf3vH=a 'kzApW+end߾w>@1)(,"*>; /r2@ן#+ rV 1=K-_!^Hn0W?*{F`O)"\!SGP;DJN7`P= UÇ uN UӿKH/;xI /'kBpĒ :  )4Īر5:6LO CkC[+{ ~X5}#?lX_$>+3 )xfd-xύ9CI|/q"~sF"W4xG8ﮧkPOEx66^|weeowL+G[2H:L# r7}f}QNMdiMl/ 5W8%{ !b]{'o\,v7g&XomjMxv7YO^"%s% '؁4č(pToZlkhLC36-6+G "YНs D(IكC+oǷBwXXsDR,QGɗ)T]u(d31>( =zL O 6rPŠc_tQG0!N~Oԋ1q8DF8Gm u;Y/3N4oϙdWEdJ(v.9W9I1|oo.?Q7PϮP|_/y:k zbLCHՌa~GC)DYC<_w^r]Zq׫u+:>\n%U:2J0 I#.H^PbiCC8]oY[)4-HYi&ݝ*q]PBgdƺ8gW/ \iw(Srf+ʼ2ly56s?Aۂ&O:E-g%^VcuzO8G2gx9e`Mus3u"ґϥ\E^YEW\tۖ/|k2w9Y$'{Jx7cH 43؂.F;b;]ݽn5Njr?>Z|eOKA[sa;lnR ̽JVn~M|8Fc]"Q#Wۘ\#}:xܓVDhG}yͲ[=􁗕}He~YWeX-W 4*#P[qQñ9YBth-$=j]2bhXBEQVMҹF8UKHZ5GLji9qrr3x{zkc6m?u~@Ǐ+2!]hLbXh_$v,uƵ_ػN7#HFmOT#Y@4o,~zCi6 BЀm.Iztsl8k6//G-G?ϒ{Pm"Vu7QjE,MU.@[6-]qZQBF N2'_}9w,fӭ= Oq{C|'D(ϭ;3(Nbxe;荭#kp膊4[|-[WŃPӠS^s ̳Xs7go9"n&ŏz0A3·G֛$|y.pN?£opK{Bqhu.Q5s!1?);Tk][~6Sb{M( "Yo^΄,z;F3zyoZ{H^1)y)8'B@-/PdA\W⇸UCXce看{''m?e[3pV ]<~02E'r_~j0IbnavW_Ǵ+R qhB>2*IFङ^}`-B9ހLM󮩝N NK2oyÎI~/iB$*0T3בMq*ᡙ/?$j 9ĝ}2z'0lIq_p2p8 3zv<ᘵ:͹fk}SUH0{dJH1]^Xj)C·kwk6#݇+ HEq]jLOFL`bp܄H!R'G7g&޸o/=JJx@Vg#Gmc;^4L{hppN^la>] lҾZNf& L%C_s➩JS(b.ZJO{#˴)BM}>t=ǢtɯBfBDY&<_%M.Mc:v% kc>qmAR<==2)y!B#FNufx|jL_ LijfQ}%hj&CL7:w^/ t#zWλǶ_ !@(mP =ݨ*0ޠG@[oa{xlq'8vstRHbu8kR Ha=ݓkלy~<K!ᦔR||}ʲ9.p "%7|{mRp&&?癋oWSSV>՞p0-FU,G pw&|xCܤ{|$oaȫFLԙt_ [+~AYց%~>k+dc01sv8kYmvQڕ g=N.V,V힗J:mKa t:ptSs"\n qu5;=&Y.ck }8JN0rOD܉β1)kb k` C=ZcŘiƣ7|74ga5ÿ\:aSxqH,'X/(0͢WP:c*G2,`׊ _$W_^Z&:0&9`8ZP#n͒>F]&^VZz!b JuӓA6JsFɒIk&ƞt"3  }Zkk%.}L,^0ݻ؂t:чe&]&OPJ" n]Hdbv.8^@gi؜qz:d+Lo5%P5GF:ݦO7B(K05$sIҖu$OKf'댪.SBئM,XubnXbdWE>s`ٹwbԆ>U(0r0,[| ໖dOOx?bɱ[_AB_A^4-/}=UO!73n"^vCYEt‰9JO{/;3bA7d͙~0e3Rv=s0j!^YG FK4auQwMgΨDThUK3J7/#s:z7ߞѹ'^?vSet[iZ}Wnk IBhYZ:|~BHRfP^!OiDs{\\υt?\) >s%tG!M&>QbVCwSt-ʕ'4ar)<`H1RtHA'drN=}j53f㈜~0%8Y7Jޠ,">hc2邹w;NوN\cgI$B4i68/>d.~t Ba!h:h5=9!r:j񧶄{Luo/*?Iz;;*u8KL3UwQx0E f22Cǟ&͛F{b;B8ݒq%j&?1t 58?œ1UHM֠m SsېcuXƁgU0O%:635gQFr] ^{Eh&%9Ւ\[VQVUgi |2LA>xeǐ|pĩ4=n_ {ϩkk̞sK|fhjxjĤ]ilip.TDe+ڏf=`ۉF^G0JF_K=[vF\na6#6ڪAb!}MV%ȋY!,5 > NIBR'+q#"x o|z%꣪&yVyd[b}]zЈ321&)TZ?tz~~2Ŕ$pE|i=Œ@F$ [%ї2R ųgC oy0٨9 } JG0hy6N`bR(iVWg?)*"ZV|6{3QdVqp0"d-2UG 4"楌˜nH7ؕcSEX|?˒<\#}~oj1b ۝E9M\++Dži56{#VvN#%8Y)=FM&#HאpvƝ 8NŻoLp|+}Jד:˸\9&NFGI[r?ʺ ms endstream endobj 695 0 obj << 
/Length1 1808 /Length2 8080 /Length3 0 /Length 9183 /Filter /FlateDecode >> stream xڍT6NJ#"0@JjS:0b1) iDJQ$E@R@Z[_gl羮IGG*#(/Pd @^ r-g5"` H(u#Sn8$ Dā@?(7(=`&/ugU@x#av{~pp@bb"ܿrP$ hQPmN} = " vvE 90=@Ez@!!ߡ an[' `6Pۍ;Enni]1[!2{v[BqpO" qÜ7_rMsA\Pnn014sf%8D CBmnp'- ݅!US͹+B@1~aay )ᱯ `{1zPHwcA fXC`p߈oꏄỳ7~y0/Wt䌕4U~(/ xń @DOCIuǤ+=7?oB44o67_)?_=RvwrsE?p37㦡Q7áRM&sw_T 9?)ü>7`p sx@@`7cgxWn L՟W*m/$ #`oߜ9@~7@ݨn{ E T@g8Bov-Jfg YU?AhXi68DPP/?Hr࿝??<F_kGIП@7N l \ܝk?n㎼qk0o: zAm'6!!rt<+R_XWR9y|'M?q8^!咺?L/)qN1^nW7<}waLoh= U-z<~~A-X\EurO=?xUu~ []-V'(ymT0šm>J| C{l׌q?ގ5=)6wkGkJCyD7+zMlS˵ٱA6ߌT[>i[L'3DH~ܢ;*VЄ c371e(&r4RiP& 0})-БN%殕m!JJ7pql)45bem>lGSN׎A}^7v|(`2Ԏm,.b\~HrרND.2T6o:64['t>pg-щ3_DeÜӀ:45Mϗŋ]#W[jhh9bRnt(3bt!\q| ^46&{Chr+gvsLjC;?Zr*:8grI咆p۬8G8 N R~ ,i,R#?N>ir0T2 LpGʯ8]-۾i 1Sl <²37Լ 'Ml|kWD5m{HljI^ /~Q4 O|6;z3#w W>{5Ǚw|>9w'~_ʶ!>%d1ʰb!^WW<`,NQc]f)|W%>mH$3w stRQ`ꗖl%Tei ^ K gBFxT7m!beIؾv}EvT >c0o#FXzaC;\f&ʔsDn);شgy{N)KvfdcW 6?{Y\c33gg=׏Sh}Aq=*RRx4iv{0nYV ]̆o4DafQ# ;ṿ^bkg o7ȶ!D"Ϩ Y@}n 澷>*Mw|U?tJV hBw,+)"@r#td7?0\6ޕo u--0}wEuLcCKWTԭo/l>uR.iZ[N~TaZDM.eAVn?8N.]>}P/^Jn3 T]e&/pnR HV$P# lIof4VvΊF_pUgs1 P )/"e~G JAY5?z޳X 4ppzQ HpЮ{G4#N]ug,R;KdbQSEmVG ^stKڮc|*]Vӊ( l XQ\Mfq/c)]Bq"='.cEp[u K^1+n؍f(#X|\A &Mw_q~,/9mhn.2&l& +G~hJٲ+ALUXLz' Vz"xXj% <!"FJbUZmj5sŮU>l}=a;eGD.N73gĦjf>a[$cb8́@KI>;ؑ@9O%6M.,b`}ߋ ӐrmC}n2E.=%VINGgKJzǹcDŽ.Kұ~ĵRWC}e fO +}i6q W`-u~.~WI]*b@WS{9ehmښ{&_X,4ܢ'}'~eV =e(P9H~Az1e%Lm !6YQbkOg^ahߘzs}%Mю(.yۆ;{=rYkoZYOf"%.#1сmw}Q9y+;E O9dH`ž,?ϨQѢYŴTKfqU[zN={I>'YM۳v>SϔJNv}}:Dv~2:zMy7xjD˜ r~.%gw"[iœT(L/CTE̖?3! 1ꡬ!iyW))2 xMKbNO͇1}i:01Yz̴€JW)0vEր!2BoVRx 2RxHtX(U.p)17_sbu#a^&hz(a C˶m d}GN[^%Id¸8mwM A =gn:Sol9veÃ1T}W[s% 'p3o^)}0O+Ά3{}+!Lw_ˊ]G4HsؙgkR̓|vQ.H鲂m) -KL6(IנCBz{wvE yXl*<˜B=RE뻁@ /mj6K^JY;|{`2ҝ8*82||̏ꄯ]d͡l"گBcbGOViF2Zַ;OW)Qt4X#3[8of%\[*u8B"_-B*\sr#=zd v5Фףt9 tFlwT3+'*zV Iv۪m CPu=" tYڅYRzi{U-[do l1Gah<65y8?Ȩ0;&xƫ%FB0Z湭{bcoGg [v7}AM=ҥ)wOh? ٚ^&&{_L*Q\U1 y5npW9;,C?8w9;4q4!IW@G|unr .\VqMb6)S^vd2S4}hd-e!ic?#k-:T;<Ǯ3Sk$s4x2Q]~8ɡQPKt9;!QXB'2Fnz`r| }%*OmP.!E½Sf*AXcֵY' 3J񐊳tkdPw+F@Jfo]+lf=5 hiϣEu']7ڳzpf2SWv$=|1u[(je+xKrfPV<%AEJ!! ^[ =`Ldq\O7J2ҎgKZ2Pz.ረ̋.u)DDz%sF(!Ozٳd~UΪ&.֦wbA`mZ0L!1SxbPT?ǒ;#*>fVw 9Y&0&cw ~I[0C2sIޗ'y=)Gd]V^߹&ȃ󻚊c9,U@i4Jtu{Ezap}k耭 ` 8%݃ѰSKE+o724$lRݢ <\Ĕ X;} siձ4WǦ<=x-QN/vf;JMRR|Ƙ{/(9X.yin _M@hgQ2u&cg;QioZqoMIzj◸)ejKA!p='" ïёW9Ff|(|#@"G ѶOV)VXzi)SaFAԜ8X@438lRN/\8?Aj^ ,TN(E-۪*##ijXvBմ;#YQR<7I쬥h?k]񑈓TymПx-D.CUU¥+;8 b>`ȱYʻn`OUM'i-(6;v\{~0K¾BvRGずNp:nCc뀦ExuTS&ĩgW֑͆xAGEcgA.kƧwgV: exujެ&+ۼ |0XbH<?ZC͹ .o ġ6V1&zgE54ӂKRT T%1ft{Ӕtn%1,:{65Avv!cY^_ߥkTvt.=yӛX* Tlw'7<^Iy gj#ryog QJ"tkK6-} ד2R S4%(1F&}@[nyx\Eu?!VA)`tQ\Plw[Re+}xwWK+_'_(IsΒn}v km&ܰq2S_k5O8%@0Pwtc/'j0Sj&UP>;{!i OGoBtVlbNZ$=,p[&]Uw\9e?֫a@fT#?WZqzYШ0ڕ*h3,A ί{A5e1:=H~=E~dC$ǦCq4åV1;w?4d`߯|;+]]aM "wKe-go*hdWus#67}8)%"TeOpЗ QY;F4П7ny콦`o)9qH[)pGyMcLIЃ8IHmWw=Y&Y8dUd۟]Əf\N?61r4gA񰩿ZgݧdRJ8)][u5T]H1l^5S2G9FSQNGC% 櫹)9ڋ"DoDCGm'1vfqOZ{(KC@F9S~JCoDgEXu tpm62I6oS9Pv8)jvoUBHnm dB(=fg*Cw[R[/Y\x|^WD dB_c:M+V5!1W4 _%Av.Pb[Y$ҎXn^T0np=t#Z{aveX9[4f3QY%H!]# ׯ1g;Im )[;nMB>=_hZ=K,7[0/%åvArgӛw龕Q8o8{t*ٔ -]FK8p.ZN2*(UC7W5l\v`maU/=EE#UDۅ1`hKNӟRF^8^qY b'0\{ѓnz@͚1C[^N(CIp-GMdIwz1|)vÌ96Sg0{!˜y>vܰZ)?[Nv#XJCLi6 P53q-y M=/]u#îyQXSMڀ$XwI ߈^VX|hm ^`9,2t\L ޚlyũRtIuktD}B1v^" X/''{_CЃuiy iW5°>vkh+%~[v̽";KHӆ>*2`:aXKΧRtu Z|;Pݣko n2Ӱ,\{8Cy[==`Cdk;MU;ة]'#\u&P%%]ϮBR8|QP$l]4q|msbRD؞ Y,Eviʕ) 3O-S8SP"Ȗ,(&P0bR $DX&x `U%`$@{htw! 
I~Piƀx\gk,;PN"-?-ΰ:d2Ll\F* /؉Z-)x$V'*0:ҠԖ+lc9ݽƥt ^/E2"gzs J)릞;8[V}ƻ@,rZy5'鐔/{5UD0]~{{3:gJr!Z際3p?|ۓ̺t z>FՌ`E3o qˍp5^1SDΑ.C>Ή3HWVa^P(#utDd  endstream endobj 697 0 obj << /Length1 1739 /Length2 10086 /Length3 0 /Length 11205 /Filter /FlateDecode >> stream xڍT.L)V Bq E,@;EkPܽS8Rҙ93suZl~QkhIZBrPG8';@ZUUAGۃ#Ǡ PGYHAG h u(8|BB.B]2 7%@u0褡N.kc<-$. #@; `F_[G+sCGr{eHhWrBWMn*ZE8qI$x-5 p ?i<氇850ыtgaxg {\Du:Z@-$/x#xs>.%P g_@ߢ??(F<? #qAK~w >m / Z C..##:=<_!c%0{_{k4..c {-0g!!Wdl[c"iW\lc&h^IDr=r&jRW9+sޛ5T<lTrTO/_6"K*vEРDզaNOT"ǯڪYYBL.{M d>{FMFQdZKekD %сJnA[?h2<Kq~dxJX2\UlS31{kyuMD$撗eN_,۝QZN)Z i=.C4V%}|nnl7HZ+`N?jv)hX%w7ǜZi/w uc~h'Vޣ9*~GTH, {sO=e%jy7gģ}Vn  SGgº|{1Y~5(8Hriq61 x'n$_nTe#\d"K|*v֔4UE6}Z".a'ۚi(a)Ճ)r݂՛aTGٙ]k=k9zScK&OS b2?z qZj8m_[tp7e)}3}v7>0}Z0͆T!==/=լ t;kKf㥻wuAa aFIIHփ70gJiǭ¸߼c51O5܄1B'D&XVֽ\ yS;+ĺqFDcyiVk~LyԪ{ȹϜo3ViL>Te2A4bߪ^䇭)ZF:3Zct ༭v aKi:_ _9E-x봍Z};2y@ى1-1TvsnUL%37+s]O/B畹/@8` ||9VĠ.:#9N#'|S0Cp#h# ؔc|iNDXe'eٳ-hɶT#4s~Z`+ɜif=hGx8RUU4 x^ΐ1"1XfӳPTI+u4/%%˗ vr sɒ뺙Ul*?pNUTSvķ_OxZyP Q p{ DZʊrSS*7ɼ_y)QDP|?.=W2S9+>< tOjwߪ ` 3͢y]R/*OǗzmiO<7)ՓR^P فB FҪR HEU Z~(KrX^Έa:2@CE"B"_\(6kGpHr)^W PW* uCդ˷U/f 3 b8}[X[?lieviO|NrVF_-$ cXj()7t 0~%!vr%W2$I{*ȚRW)M;6+8`4YUO{Ơ 'M<ǯ w:HL2 /6![-~Ts&)PD3*NNQ"{gsFrT%Mz)xLnYJYꊖ{s/t*nVpݹ;9RyChMO۫T=3!!CDYg߳lbیx@ʲaL q̫^Z,~o vVTӲs@7.ȸkR1UQ;BRٕ9y1NW>cY FYOFsc1Oj1;6|֖ereIShZ`tiçb;cD_Jp?BlW~G2XzTZ' IKne\Ϫ0@;^tf97oȌKm'-$ oi:=ٷ e@ >N4*5hζ }aWG=LbȬoZн! e"*|ؽS-*8#X7Ͽnw~LSu@`Z}R\_ڂdryQI 혎LHuaˋ|gl O%\IL_y?'_en[!GtBkf6૑C'e-|d{)~q4ti7}؝>b~ftKb}W÷[-0-qtyEՌ3MzNaӧv($U7Iq.M?M̨'W/%?+1╍ $8Zӎrz&Ѧa4to4"/k3M+,8OVNf!!M5PIk1q:*^0|%&;b+4 ڢcmwWvCegµASx1ŧ̽/>zst#۲ oُL\j r) #5_ޅ:eS]}sbaDIV bjΆ\DGzlWi#y3' JZƕW܄/K:mbro%)yK`Bd \mDg`}68gW `HXrҝ-p{eo-4s,0hӍBy:H՟BS7w]ds9-HRmʠOc;}8!QWZKxP,~{0*kjwY!"}'_:)zLZJTg m&s_̀ 9u0:Lxw'`RRC6m} `Es0R+a'Y'ƫFYK0.g9_ o.ux:s?[Ci;X\h#+\QhhInt{ɵLZN]~k^1hzJ+niIP$Yh`GXOtn[/ľ[[\9wgLzȋDLSa [f>]qq=\q:)Q'O%s`VUtKNc2 /؜0*#7θE]eO5ަ1ȈV 5ܬ+n]s(ׯJZ'iGYqMjs!`[3[ޭ$]v0K9!ʲweܑ&V}QCq(^<ęr$62Yf&} id-:\ c։weüfy.^ B㙝`Xزv,#Z 19\a!-{_uuiJ^$; eLZ`&vu*E+|Ϥqw}YFt?wcѢsR!FXq`3q߀rC_~>' g:@"cM2:{3 l'IFCZG٨~d[\槖eu;ܳD %L3#$2"_ַneDS 2 'H,'s2BCźg4B1P§&WԌE2"rpLn~E:9+ F.I (v&}X5mZ]QZ4H5 +9P56̼e:&\%\`#u(g|j]i\2=E=1_ԣ9zkvCKjŤV;#uO\VQӘϥ[tSG ZْAnEH 7;y;߁WdJh zHqgW4[ |@}66vl۞mGߦ(()P Lsbfcq+M3N7z@SΓETK}MB\U֤H?[[/q"J^glLğe*)zܻp;9t|er[-֦#Wm7.xn~X˂.L9 @)! =_o^%\JuNDC[ŭ'Z藜1|*-TBjus{ #f=ܼ.$s8W*j5c_~$PT?<&mr>KY=Uuo[mC<[Ģ>2V$^Qn;BY?wTToHORJkK5eOdn軉\4s.l,hIc!;u2 E 1 +p=u0<;?IʆLsPBϼ CQZYNRkZ;~ |9딵,쟨$ D'6?ԩXD#*fbKr}VxO`WJ|isZ˜nv+ŇǔQ_ߝwrx.LJ $YQ*ml\K?CΡzq٣9z¦(VYr8aӺ{LS#r6TM=_ӆޏyf·Ozi~е1&_\}'׽,ԅ^GK3F-#TJDŽJ"THiz#NeI|#>qexY!LΎK691V 8fW>Jci-@?Nٽ&-'EJy`?w7aG֭"S`vx "\qMbLyWI!-Z3 8 IV{S$ѴG>PÖDzm% end7 Sz @C$(#Hk|Wj<Dtjmfr|樂]NNKL Ws=!Xj*{DϻgN1L~]Ũbbg\2r2H`)h8X~~Ge _+bLan0LH< AdeM>.c +Xna4 ~ k{o_ 1)i5->]U|Zέ$Q1GEujuTuPkߓ[V MO>]9rtΪ jKJ_{=,EǬݯN':ך0ǣU^%&4BSr^$eU8M !E6g5H@i@+^^K^x܈szՇ6]NzLS(wE^-ј8Pf?eos:e*;T"܎eP&dFs >hpyDupF#:˞u,+}*"K!)expRC**y,"WxdswO5nRK8s@D4W D񊉤eszBuiqa1ƦXbV6ٔMܑK*+8fFO\"/r6P c7h3Nɿ0=Y?#K"ϟ2}M%R|xp)Ȁ1udG9wΞJzn2ܾ2ɫ뭎2eWNunwTlYxuR>$(I|MF\XsbuwfO'*8!s^ |#iXƔDZV=/H،d"\mH.pCKa. #4xf"IYw $NN %ᇆۂ_Y;"[z%D.n0J1^H[m8[ F|#ь&hM3S <#eFڶ9aYIEa(وG %c&K1SnL/G_]${n_=(~%#0V3I56C|N&6ޠcf%0ġǀU&OC-.B}1DjUTG³$qwkt]qCvo5hN==[Abs,8ѕI[Y+ƆRʻn0v4ZH&ƥ~ ŊtPEG ]O3{װK xm}$#zpd^ IxqkDl=}9 p_sx-y66&}V#3@<{q$؈瓋,BD5P/UEyk-(rue {eL3ҽ*AEx 镝-t3VfdQ;+)Bvf-ܟM]Q%t;vA)Nb !кص.+%Pkw#e)i} KۈC-Vd`\5 5F?&ZRrAC[ dWPcdr>eh#>]g;.WC^P`? 
9K>د%F$Ֆ\VU,~Đ CNzIhZee3LuYhޏ ii5.g;!pXj=kխD6:°uy=9KX:(S"?"sN8MJ"cz t$꘏&37/bJf372@;x :OHZ1l><(!{ǵɰI"GVA6ۓy{G6: )ɳS7įLtKdi-5>Rܑċ'Ng*/>u22P6 vue7}:z4|I;1'2\3'yq% =NJx.Y*3u-nѤ sa9DdJ9't ;(b~C6.Bgg?J8#o}V*EMn׬2V1`?6#$8_h&42}WkLݵQ 3ci=/c s]fI]ׇJ,vd{h?3z#FWw^+IXOlD}ZdlƮwdd#3JgOdޮ čfNjt+R!r|9F6^,ɫ b45UqN3{ sӁR"#cd_*#$Re6~vv)ޢs5UF<b( {9?gI-mxc[:"1dx܎]bj$mHÌ,˂%X$[ %w~(;h]uB85jjV>C[L&\P@eR`1 lݭQYPe7ו(')3c>z/(_Ѩf8P|p֭pH4ȼdE/ȡoa;@6HЪߐ2U@P7 \qrxbzB"TcxF@ĺ~zeZs?#&G5  i$S_$m!`Eu ]R,i}#X9O;-lQx ]aZ8"{"1 I, @@nvB$v_Ί~;X+4,%=Y i(s+]]v;9Uff,`cB G-Dߣ2+N? tO` 1gϷDgitӚ;4<Úވw4I:X,+-qG&8X]0,+YB?;7KAh]?LCL28ST~r Jt u~X`4pgC`(mu>r/eZW*> stream xڍtm ǍƘv7hldb۶6'm~y[ul֐+ lx_dd$YLL LL, GK@{5?, e"26I'K3+ C{3 :u9#ʈI;@ h2283Xl@@G Akh```coOMp9@{g1/Y+࿩1@R(٘8KhxPe,/:`f`o{d`Ĥ]X:ؼ8, . &0xgo~F [G_ fQk/6VV@kGõq2YEɖQdͻ`gd@W#3ƿ(V2%~akc 0y= G{'? #G!d 'h/~ W1I}Œm-}Č_Ծ|U7*m\zv&3 aTD6y/տ)dm3LLF_oW9YZ dovr|_jm dK"dm>l Ll@@cyٿ?d qu{11Y_.g/":}aoF6$ ; }$;}y< `bcAs q "ELF?(A~E\:? gW/~n_bf$hd/f~7?|h'3Z:=h{]_k/dyOoa`k'!{>K+C޳X9结Rͳ}_?4X+G ,Fwey/dp`i`w\S:ƺrc}nG?  4_^1h^Fυ~o": D.cVRfƊTry﬍z7_DbD oqJvvהdHcsn+:)bSfo[Dj~#|wFZC#'wI&y_CZsv7tCq/J2ĊC_[.I?g%`t};(4My`e Jɞ# 1 0]SlY _ٵDj 4(R0O2ҤgP.`Ӹ1!ܷ3f\vf>u;܅o憎PX28wӮW[Z څLs9p'~HLJ>dmH1@v |i숊(vs@qwf dsBPcwTx4Vt;P6}?U->6Q8"yӴ@ NOׯ"T4r՞򾓬O X/s׈ l!~4OkY|Vݗ'bK S!Ut6hU?Ȁ$s4H[p/r2U ѡtR$%Dc卥d|;O=| A9:DQj*6L4g3A:A/~e<2gBN>3E-ѶINhq+5kA9rky؆#8xl"xo&vd .)MNT,QǶ~I9cK~ t”Kpe_DzQ#Lf#{=Ok抣I"P̍r*TSΰ\TSG@|2/9x2ȧʯJOvM^OFCO:cgx05}`f CThh%3AɕA^wcZ}P WKZ5_!^VS y g_l*g$3d%4F샲fM^SsR;Ŋq|`9Β,OC?,/ǤIF憑s9Um-Fp)z.7[I&u Ǵy~sabDYVYZ:<'@fo)G`5z8^'"b8rKt^ݾRhqզq^I SoGeVɍ53!Zm !J wb0 XnuG4OctH֜g%JGM\66`d*4f_wOq卖ðg|8k\~IlcR˸$'UDݛPpkH(xY-X0)-`',Ǭ'W(gNaȎWy:<FLq=ʱcej7K@'f6f3LŌSY-A\Ql\ O@hs~ku{VXi~r'5uoGf4 PLY2tT0಄ :M_.w؎[`~M**n*Q542XZ~cj@/tnuYu[iD+z,n&zN^f?{Q~v?va@m)rD)A>p j,G8 A~ ?4ڧ;# .=8 J _)--cGb#;0PX8\%>8.IV8~nyׅ&n߶!E݉{8b%ݕE%a ]Lǻjf0O?oS(87 oC=iA=Fr*4*lcyDQ's7i)U zJ'Q y>U:O s/sk~<Ù>QDTJ~j+ Uz]4O`|Kef3GQrDo*9xPSt'8:eS+E&mACe)˴хҤg<;zSpJ 3MQ3a|V*eghcWT;jzǃX3 M{"nXOׅ4P?sω03 ;.A r1/tBA%'!ɲO:a 7@z0UּOJنJZ$!҈鋫;_OFec(1F|Ŝ$^I<4ƫEq\D26vR^mw{%9#+B `0KEP/b-cRL#8FE0B緻_1 VA)Fj['̈́'F{1<+& syp}>M%=[f@ Nuɺmơhsp5Mme(BV<-B0:]잮=GG5a9TUJŴ^kṭv?ubo3q3^N{C\e,^B,sƚvI/dn/Ƌ@{?EPPb Sqfb!vi]ƢǩJ4ء;L4Lz}r-'<hc -I®g~09q! =x%G'F=N IvqF|w!v ~~\]Bfiݒ?5۟^e]B~YpSO;ghjMJ abYHM|qMVR v\ B}N ׃88-/?ݗWH(E hbcTdS*W4KGF_ÆUrTǩ -0qB~k &@H]gE)U$jPqXP< 3@N ~XlT5[ԨW8$|c.S/\5*L^1Y +fk=52D# ӱ5̛#O<%V3=g`rfl==,o6X/;ʐK{xC\OCNHk**q?(/=<~n.Y"=t^^:t{c`\2f6Y~k! OΩMy1S2]K該K4NoKmF^\yn į+ ; ?)L!rµk߿읪s8X5{ \Tv`?BOHJlq kr ^kr6Dc-JjݮO[p8BӤ;~@;f* 썜=3zxePwpd'7BuE}kCz⦡U:F]ʖN"_i/ZCȚ/VX\.G:ⲚnL:Pȅ sILgM6y!nTN]>Mݧ&p#+\͸p/,SkUK֌.ӆffIi ]+[[M+Lc2@%Kﻌ ;.۫r nJd4Z5::W뒡_Ym9 }M4IPbƏ4TH :(!śhf [Axo`$r>o5 ݚ ~Eњ vd Zp1aow1s*$66i4  ;N{āS⺤Is ;,i`Gg)])Ui0\mZϔ+2aEVX]h$n"z&snG N|M9w|^a v- Ӥ:~Z'tVYNv^ Sd؄$In&(1x - [VZC!%DsQ8*[Wz*Īk~MMrV4}GJ^,QKW9eP7}~>zW>#@' 6[xk[!}%Khql`r-Fph޶wDg϶(S&l@'0 "jKלP7)B5U78BY5: hrPp[KE^+z7go"N?_54'&<5ZHdc.L9h$s~+@-x+ݗ'ƫR9b)kQh>?>mW0+g DwH"@F *D})R+ F0nډk)|s-*zjG1҉o9]Jjcvy͖!&"ِNu͸*{0P)$7Q-=!7ܰ0aLr4Nw~ *:O#x†޾n ޓ9;q_FrIP-b@b`FΏ㉪A`nReءqFp$&--x/$*RVe?"%(V8d񳴼&g7QAWW px.} s]Eo,1Ϣ;e~wevXbR}?XRBj&PM 3&{u!zBxX272{l=G}?rS OD+E p8wwYUW4v9 c[yl !=2#} PLn3&<~yjҒ4ɓQ ܬu\TI#E5Ad.Wvs>J;Ajz탇!þ;(fu'ܖ^y^ +.68ǒi ںiҗULZO\/il/U! <32,Ι@b/ ^EXV*"bv&ґGIX! %l~X*Y c5BM!  
Z ͅCjJ,RꭵmB4k|RVe l ">(;\ -6P`DtOSZw-;& !uTD0{ z}Dn>1h_{na ){+ }:n$ƹ{S'%ڡ@`Jhr}N@{}TOlr!*>/tl+?}bGFX|B\Owronfb쫱Lik&&+(:גt jMoi^; l15/ӹe5 dkEuq݉%B:z@tOn4Ô!ü+;lB!M"8 =)=C(#,V[[[L~= 8Cܖ]>ᙯJQwC5& IIޔrW2uN5VwJ+ìC%Ѿ_]ve2+:ؔ%?o9dRGz fa05e,[ پڎ 'Juo(cO25w"n#lw73 H1YSL n3BI t/<"?A\VI`3)hӨx ݣ&L{|/h "@ERE:/}/#=l % ֹ#(2%mVD&wd$/^QŒ>%SqȔkNv3-"J8%b -)#"!Τs8%8W E\婈o&0(u{[pZrjG;t$L#1 03FDX^WJ6> ޒq#!ȨEE%)JA#7Ǥfxw< тQ5>[9ɬk0v!`c}A͢K 4L4dϰNG/Q¯@| 鄑Õ%/NY\NS=;ž̖ J0W3dW0`x\n:h}hk!>FLhU{-?*˶i6U:59qw!gT+XӇ Ai !j\,`j< 1Af3c:IQixd!ۋ̘Zc^AĢ2jowD%8 =0mT>:wr 6]2o8 pst P:_FB: = ?GnL&+G%ss4c߯P~ Q*;|~O}Lq 85U у}lp)ί䕘7~+km3cXM %^OtqYJ v+_YEѷ:ҿ qAg,P>Ϗ$u7inW!b}qM~K9R"XNw:^>c+TGv Wĸ/7X?Sz!tI_ʝ&*?Lvek7!݌Dd]K!&=JV7ֈ/@ g,'iV3-^ni+p mFHD5ԱԾEZ$p)5Mx N1&/6zxZgf<\p|bc>QBm0]׌tFa炉+ذl\=k\Uϟv 8D>W x!l2íy,(|{V)K4TͤsHtl 7pIŏ JcJӚy m#5 "aJ.8&ܼAcZ=BZ`[)]o"YeWLΒOc4WUV/[f5~!͂=n=nE9$E ]HAa kgZ(0z9n v|j݈Gn*]:Q$%(m|h5)pN+/$o]֭ M6$!pb! gE'}P{Q>W_uBTNOggrA.)V! it4[:Õ c=v40 Q@_I>s.U4^;!q[6f*%gz?>ZFo뱗"F.*7"LB/K$ֻ/Wk.:ޡwzWIky9,-;/( $uE"M1Ka35k3Zl޴)7Ws` ^ 4%k|B.7sAK mCQ巭q+\L4`jWiMRQqbt9a|k!hpȐJlILY) Gw2B(2j {^{[}b5hSaљ>)I:P3,X,h|k{e?3}yJ"&(oأj42L\6/hH}91BWGL'VH}L{Bx~ZAEm|*!.HlXعms#ǘc3VSTA`nlGF95HO 6rb3T1DBs;/92 0l 'zH?"жFaLOdN ~CEpi J=҄TMQxc%dR-|@b]*5'c#=9䥤;nZG9w"û9 E5aݱ]׆\i8|km>+XIuWNt~~0NPa ~ۂпh1ͧF9xXږح%ʿ:[(K <ͦA YRQLOx(މFF4С;ΞzX:0{lՃKd9t6гw-ވP=m4kݠ2VLCA*C| JMm؃Vz+Vb<b[0>aeг[?lF&hu=o!rC'B5'"ԘivCW<]{}^h<񏔣A-w-P=҄t1Xdgzq!oձrm YEԋg0u.0;YZ攻m2]t_,6J6A8Fz3:AZ{8 ݒO7Y&̟)[P*JaMgD-HVF‚8?팗a]+{?8yzX3_t WW&YT~5f`,2뺸z 5.z'4+IS.m}VuYO:Ԏh9[x:B:X[*l,`:fD?/aLE. mM/&+o\:t;JO5xrP| {k#lLw@`27,94`*q!vЃ.2eBU.c(0W2P36^5*k U961LMwLDF*=|QWD<ߴYк}f2opog{تu3_?"^^[3m'7Y:*zL`;ßvHQY)dlc%aV ѣϗL΍|e wJߙDl0ETu8]as` ˬyǑ  E=ި,nUs|c1v8rEhy6eC`© 2|7JRKktiO`lk8ChYg)I6UGVbmZn%G ,ME( 7>D%_Ъ63bAN̫! ;vXs`4ZXE꽙w7.=h{>Ռ_>]"N-ZsT{Y?:] OGe,ų J?PoL6$S"Єp@}2*T2 u9 Lm5nhg#1."sA}FHdb̋QҘE+i[N,u%he4g糕sbے- 7ZO쁙?pҨXR\:iNO5s!r|}TStR|C)4M Wf*Z:8(g_Ԉ慙8Z{ʒwa?at)1^R:2Jܑ%J\ fV2N^3C`.^*o='dc]ֆSKo, D5%AW78dmW 6qɚP3?/b&`f.szcdIT3"V5ie*%ɯ٨$:yR.b2FBw8vUqg pD6gcP~ Of&_U6'P4SkSa"!IGV²:Xe!] 
C;>ie鿥6}+1'4:D=C#o>s~h pe8wS4LL ^n䞩"p['?ڧkX4$wZn=_Jh$x))j/7^زRT24+цwqu: }CA nFmJ|Tݗe¯sZݣ|<^SІzA3X^cܯkl%9/1L78[T2 rϒ}3#EҜ4AGJka'9eYFG7Zqd哅 $DžN%4>]~|q{O VcFO8z X%,8d|2۵Z\\` KA7kĴm; LZ9%d"1AgJvrpncn+.elV1 3Յ7%^ Oϓ[EAq!VOԊfG :19=nkSpb2KtKd3H|Jh3>hodF{mDbIKk jLjy,+-1"ꏏM7N}KPSqY9XCp-1TJrmìLpr6lOCen͵4gtą-JT M87IpG/ ~MtpɯT:~=cvXuUgɊeJ8cC b?ЧbV0!_;4 yyz U SfZUv@ endstream endobj 701 0 obj << /Length1 1407 /Length2 6093 /Length3 0 /Length 7046 /Filter /FlateDecode >> stream xڍtP[.UNPJ  $B*H$44tA4iJGA|?߽3Nf׳׳6 e(P- %D@qJ>>SDrA `:u紐Bt\ HZ$#ā@ٿQh9:tPH'%pt}@ YY78  _ q8_)0w911Q( $( A`pO8 0 :!p@)Q+/ M|ѿ+¡ӓ(|sMx 6a݌Kqg1=㶋I&938t7|߹@$Ӈϵ\u~G"\uG.ɋ#'i Zrʕ{ 9 s1y_5s9XF'Š9z[KƘ&~f%t mI>UWgustDx|}!Ϯ.IVe>[x,{36^ND)13eZaJ.sNS2&3Mka ~*!v+=ǎO` s`ඦƙI>RmyT|a[hJE|i7M(Il/-B̉"DoAS'LX l|OZC2'$1-op&!s+^{.L{iԗ3xSYXv>Go $0iH>˫y;}rx.ULԡ_6N2Z"˨ ޫ>y5L#I' B]{c⃑IeJ-F637(UT:߻•*'L/@!V;ϱ|4+a{6E\GANU 0J;% ٍRK7~m*抨+iy{]=9o+Γ?fbE}x# IdzBO=\G͓Eja,&MPupFbuϧSf\Jk-i~D$y Ʀ𷚽neL ;n`ATO<[yln~.% ,ޑMRa皈@F[xbR#W(Peg|S]q&-~oo1J>q x,S{7)ilOAQ+B N^۫jA@O@S!z2mŚcOiUY-L(t!L6ۺź##^~|4{DclOC^#쌉-`x)Atk=Qnތ;_LUmxל0B}‡v C ,oKƐ\ub+v6>") G,]n ¿\ogܫvz[}ա]V0ds FϧDeJʏ1r'k7Z8fc{$>6OA:CMCnjGjxl ћؗӽcA}l@}{X]_ohPy=qS"j~^qĕ:]>?=2>+5XLu1bW3]hQ _cubܐ~st0`jib&tP #ټ۬L=}|5ʅ)zAbor^9Jx!}t,7 @MryJk͏ &^q~ X7|}xG,{NVx X,JY<먟3'I#ܙIwf9?M#[QmD^M+l yk+ @ܷeJA\֡DDۍb|\=e_>PFtZ97*ID_q$P`߸}Iev*_B'Z_lJ{y u\oKt@w=1z1 5w- 7ιֳv[KDk_mȋa0Vsn'C D2^;k^1ԕI=["1 l&{MݾIhD LuO3&piNh1ާ\y^{DjQ~Y3N4SpsE 92k Jg!bQ Uw*gP#u o [-]~R>BpY8>T=Y:_;K }6!\~gjm#+i0DH0du)ͷׂFx3JLxwZ޶=œ7<{gA]qŠ% Q3Z̠8MծT8ԷA E[0-5aOsjGV7U[@`9xZ+8{ŸCdy-5/3t 1t~r645("2xhi0:8.MEW4FSc_'dF.Lnpai2"[  7&=7`T~(JjVSv.}?=tAٙ\͹X zWO&.M&Prټ7kgD,_ZqE[&Kjr eMe7.6D'Ģ0\1k+'d+ܒSn>t2u5H@H0%JșEd_}AZ~V,xye]j} LÚ~c[yBmrewPM}PAZ/荩Hj}efn^r7ݑ䙭?""Egv"GZ.aX+#\YHT/aQӭVf}.ޡaIs0.RcaO/ݕD|:xjfH̛{IibSI]}VG!sk%-Ϗ(|k9҅[so?Y5ku zV}f P!H^B${/Z*UThNw<ؠXPE7SMg*D=gko'm՚Pձ1I؆n"qP{& rzÌ;ӎ_ `_{"ߑx֞ l%YOD|qU_ƆdPU.ot%'B3TRP%J2B+uI4f?^^M\3QM/D*>}!"WxX-)84,X9yFdY6SO 3 n]Ax?^]fmK#:8mtEKw$_]<*=gPM5':~ Ӏr0ȣ&Uc6kѲ:yX>̍VB#C3MBT~AZY?2kt4[x3@ y# enD 1H,܆A+!scaY}}!}d^W RcYfkv6x1ԈK/,e$c攎J9V 2Œ-=}z5ХLh@Ms'eݝdSKp8nkI}Suޞ 9J d ޱm(T\"-}[)^ؕ'7)- P /7Ʈ:~!^)k{즎̅+N}0e2 13\ᓟ`>"<՛=N/ߒj]boG1GybfDB2%RES,oeHC?p{hZKNl5:4+Jpm88!N&7&~Tg` &K2htTx{si0;DWgAɋGN/bRs ml1+3Ũ̡I뽩irJVsR릇T[R Z#|w1Kt:>~EOP[J٭8Ttc<wuT/mqCS1~W3јYm(㒴AsΊ2.MJP"}-PdlTMVMW9ּ׺tR&~D\a2Isu!k :\Su&@8Oܛ!羜Ml_'m· șĨXh}In <I,#~>;O$J/,]s.FhRq&qK$p1^N|o/Kyś.WBk&Yѡ+癤O匹:ū!|PF6W͓؜LjN7UOg,29U'֍Z9c{5Np.rPBϱ8k0"2spW9yp5-QU!`|b#Nsr["5z={/NzԮdֆj鶌n=vP?5XѨ"6BSDyz.1QH=*d6ќ#r8 [ee)=|eۻ+]L[%&HcpMVswۤk,v>׵ڼ£[FQCf_NN}wmJ ] 98[2}&FgEob,f{ffYe Ͽh68(oCE@%JISddNŃ1ֿy1(FS(t&q02=_Km}ftI&AedB9aR88LtщRcAS6l|Zm̛XHXح!gr׶P;eq$_7Z+.(..H|;*mL4Z5F4q#+{M!wkB*7ӵ endstream endobj 703 0 obj << /Length1 1672 /Length2 9745 /Length3 0 /Length 10841 /Filter /FlateDecode >> stream xڍP-޸KKӍ4\w@,Hp'8ᑙ3sWWT5ڲg}Ե$  Y) RQQprsrriAAQtAnv\A@M}S.00''S?Wa4 PAntRgoW;|-\BB$@v@0@9=W: v Q0B@'7vk&V rz@vƎJжsӮz]Ag%j ;te& 89v`#&`߁@G7s>hx@_Y9CeVR''|v ks`'/`m݄3 Wȳ   \ /K[Π?\:CMAP}݀ o#T..%`lYj0|_&򲂀 c*,vOReqq8~M?@xfO#aw UȳAoi 7M{ YwG?܌?n_Zv> y;sU@VvNUClq6.^vN?vnv^ +u;J,k8ځA7Os'Yq{؟.Bo z޲> b{@WW7 yo@^@S=!ߦ??Co$  8ABѳ9ζ-| s)Zr8=v~V꟣>tsm~~܍׿s?x r8|uc-/?Y./@,EBCo%H=v&DN2n&M23fIZrV*edMT%]nr.^n7Q6{^Q~HXxK&^S&I!\NEʦ_ CcR'=^bXje2!ЏC;|D yN![1oUD;oQ<GVȻ=wNiá2"CdҼHuyeY8rRZ*TDCR)Ohs&avfBoyzB#l|Dd5;?.F^/#l;}jc̡yc[]~?*mQkiƮZ-aD(Ǖ._|&c |qϯ~mNIVFఋ-1n^^NkMqYCvaX^faA,BpܷN5*ѿ d>dk}V h ir9Xa'-١xARfv#N(jtm[ȥgs'> uj &9g,~& cY=iywctn8 Qű^c9"7P|˹F_ +cXAOg N7M@^r`{W\'srg$$VܷnrnUۑXJ}:tp+m|f 9_kprU\ʤot>rτ>P$| 9d{ d] LK,3X6+hLo8w%R[1l_!v.vL d!2$^\C! 
>UOXWnlpyN..a} 8Nks.0P!`^*^4 lu-nIk2?ଡ଼z[Z¬]ڬxm~6^3Y}a.xw­eVY "!`".Wnx5y_`s|bһeXr{Ϊ̀ CHmHåv Zᵘn,5cNIo}=HT??2݂l{cmoYlG'V;g'Rc_{1CZ8;d&AE5ݣv>: *`&IZDTŧ$O~2zvd0[18]fcj*ClMD$e.G]I%N(t>vKr>l <(NN9+ =`?(MlSQ_؁\d|t0,GCɈ}Pۘ7eg&D0IA]6(17 '**G~H"ڂDX1n pG^GM;L bG,G5;1p@cywƎ (-+A2%oPa<0qB#0u;MQ8q ~|hg9)L[1:ȇUL7vi&~/E#Y?::s+vZ>d(P%F x[CSv {z ;]7oZzuk)f[ՃakY=T^5䍧Eza\Zӧ2^~^g3we.C{yF0;5quh[S*Yae bJCUA +V,ߌ+CE.`G>]cM1 Ԙr&Aez Pn))"!M[9ܣ@2z{\&>D9X箦zcߠ<ڲL`r0IT =Zp01BՏg=~DDy3E 3?6]ؗ:JYִJ _2YK•tŸY#a l aJoAj򷩛VR驅ԛ6y{"{+D?_,NyB(豎]T+;ZM`U4.OZvfll1 1QЊW OG?-AW@toCA˜|ze:cs"Dc1Jx+'!WG> Ru_)rW jaNCugqY-@B';MGIvȬaۼO^y$_U:CDZYQ^byM`N|hⲾ5g2S_W*y;AH:ve1FzOwON;6pdE*$1?JUWGBnaWŤtA\x FfjL{tF:Sj BfR隹ŀg6mS2nq[yF̑o"kEcc.Mu{%x$2t\cz!$纯 940]Q~&^֭L""7kRnbh/SK/[^O"Zu]^|_!7(|0;PWoOLc+G̸j4de}Ù6.T]&K:N9r>ja04j4\8'{0.U:Ʒٽ#: ^J`ln. ˖*Kfa^q,919{r=$I7p9޽K i邗%1O0|4(Qv=e&Ms3Ugͤiw+22'LATSY*zWw H_N;ݭ'HI r.'WS_BD/uk^1 аܰx 싁Ŧ%Jjfo7eʑ?ՉBSsd`{5kd'Tf=ڽPdv{WLR?99-gKo''7(!.]EGݛ] ͽR6҃g-h+_c md-M f37"5&&cwx8Y6V!Q?C3y2@.RIaru;Y7mj ns|svA|~VB3N3 M$:fZQN+!?L_ '9NS@;EO?֎:@ܵ÷WgcI~ڭjXR}xMÏ#K|a>rC>>&ULK5Ix"u{q 'Vsu+¶ F*uCK Kۃ?fV: ;C9'IjP}].%3L[ҔWL]u6U)Թכ+6n&"8s50>:4KFRWaN3]9e4N0UjJrVyFI=ǓO6B`Ӓf.0ц®=!> ,H x|y9ĀH9,ce4yHT8L~l+![cBobMSkX"#CJU{csUJHl t+shdภPTēJ$hB_M֕`(8Y2c;) cc* f̮;#z7lIT a" ȊiAJ -Vާgꔧƭ9pꮟ#wظ!;94E0n Zȕ˅7 0[U}ImߦorO˜oBlaڱmhL7}clbfIpR S4c34I&& -xkL4{ty` Ț؞kuLu_$*)g ULS.Am^zS391j46Tvn@#~c Hf>+mJ y.C$ua\}WqX[+j"PHΛRLϔX2ُi_)PdYHJ}1c4lNόw$34]^IrAH4~fDp }0i}IGq SX'fޒ@T|wrܵC< .>L.G&3U-~mqcTd/flrڥG'\)|OCD9ƯKglivebs%j?o- hn6a/#뉶O4֍69tR[ HQi6C{0ݞ^ PI/= }Ku܈9 b?)4f6t ?)2Տ']J}W}Iu,4u;ھB=uTeخl>LE~§:fs kcoroYwv_EtF$Pn#0z&VO׿ҁfNMYld39V;~|,qϜz[:w Y e'Qtϱ'cu#gw_ٓr[b-v1U`_Gs{fȷa25'YY'U8L8@k_ \Qā2[.%ϏZ[@d[)a&=E*,G7ۀ qdB/x/F1;1h$:!'0ܴt0Ϋϻ}܋ o2*(--@_0n߾>Rk싖uWPۨ-1AύYXA!{Ŋ4`%wkoI#2K/y]UiՇ3K[7l&ٓ1Mec-t"veIc_?Ŀ&\:LD\>N0 ;+] %vtW[rA؄iE=P0FM˜ @z ΕQ$)SH^Oq@r7<=~qݼh%qOvI,{eLBL-_8.'-K V5NR(GG f@NJ=nx]LX,!b>9PN5Xe6*95gϢTpă9&:0%ZqYL7,JlY% 0pu}[a N(G<ȘS9ތ =gi3 6!{zJ1;i8>[>I| DУMxV@\$Ner7"}VqLDG!~7tb%^8`s܃ ~Qlm\F=obc0 *SWf§9k4lb&.?9dC4ASEQLK%)RE%_,)YnS;FNwuo7zoU<~ in< u{:-*,|B`[K `|74w0%>mZ`q Vq~^o[ꍅ2H0p5INI˱xߠu@ɼrKL%\WGʵ{Xۨ6g?(gI2)R1[kj~ƦP)9 nuh~~U8q:!|dcRa^+L?x:%VXʝF63t92]VD1ج\XQYիnMܽ_ٲ_ a/%T~Rr0(o猪`0??WH䵛6Cgb I@,f꫇SX<1|5`1~,(2su(T!襹Ѧ4N^jKU *{ C  ]VVIϲ4hP~a<):S3<9-$=DY$OatGxvfl+1^=Ae}ŧTO:"ԭS?;(I;IӓWTODx vh%sDգ5!^*Vw=+^*2dSGa݆0|dRAu+{\t$_IgA X/gqpsFb͎V)t"4k&Fw) u~3WuǴe G4s0J~8HnG. Ok bś)'v]%YmnD'VdH^E|cCJ<-SG_"* jbFb|߿P7}7w9T"/kT n (F, ]+%1D O:JM ,Σ!(@?zVriǾȐz6$0CIjj2>^zg&Kgg&o*)D*s,tYŒc\+ yQ_z%A7R?qN 32[M%QLӫlѻվAo*Ǿ[;?'0v^-$+t# S*X"7K n5u'9+`c-xN@l\[} uoPS37rr7\ϸ.J=Kn$t}r0T4ݨ!rbn[`~ԸfŎl" dCDq"a`~E;n$d5IpUM* NX @y+XуqƊWucՕDeun8Y!t9?(BGẟ-Ř1^ٰkLMGz,Z9ͱb.\M} ⱻŰ'+D|tPM)[ #<ڧ]Bzobqak V,my0Aa_K& _Ș=m~cu&-Ն0AKg~?PZk&$hR!h0!I=pgp$p]Ӣu3: /]ba4QY1D5՚smWE*/ \@ - [gFRaYY/ |*E,Uҁ\djDˎO;ol]-Lwlǒ;U`{{?OO*P7./>SQ,<Vy]nH,_\<7'.h^ހZ֑YB6iY_[K" ًT^3ZG~5*񉩵oL!=TxH#eJ앬"QfgB=$O+9ۿae0&wQ6`K0s t~FI \,3*-?\iyCEJBѬؿWۄ:)=NXBv?z5zIXQ_dy"nkAip-g$PPx2`pk[ wƓ۬0ۯcԋx>D\~+Ytn[βGgNJ 8+s'[ endstream endobj 705 0 obj << /Length1 2355 /Length2 19859 /Length3 0 /Length 21228 /Filter /FlateDecode >> stream xڌt llضm1gƶӦA6ilm۶m̵ޫqΘcdI(IUEMRv.L,|qE5V ; % ?bJ-N@LdhosXXXXl,,5wHYrv@gJq{O'kK+P>hh DmNf&vE+-( @?!h\\ݙLl,h.V53 h0@3&J?ru{ w' $69<\́NPr@h? oxog33{[;Ok;K ,03ofbmcb 2&)QU 99Y;839[E0*Kڙ\: TvO:?/̚v֎@YDe@' 7/;zY1^/1Dkm}!x;.N@_?XYf.S 1 j@4{>>eT:11{7# אq|7̿/*&9e,p<34YZfP2@{Y8Y@?X?/.(o$jco?Qvu=h96UV[_ h=D,m-\o@m*]6FPpfA3W}ߔvf-'zx6h]@.=__0%q#Ŀ,#vo`@#PL߈/2-_ HoJjF~E5@LL>A r N @P2fK ڃ.lfd` . 
_@й-~KkGp7P<@ Y@ >AO&bT?"wn-]tvj}f?m`:讱U_jqdTDt1X9h/.8:y{A=rxft'\fNz}n_@ aaތ?ïJQwƝQ) N-Ȱi+R֜nEt.oI܈,x5T~nWmzy6UiB=!'dyq &Gʃq-QS4ivGuKx1J3R?`24m ƅev =cD.8=[wÌJs;.. ~ΜwarG@IC2#*>[JZ]]OVj^EuFNdXMF.??[-Vo:,>$.NVt;W{>%6 nujZIZ//%B@ke1QݾkknrdthbۦLiKE:|&Hj^.ZH3ev? %F΂)4z,-N]Uȣ|O&H]/XkF DP]bMB8L~|{Ƥt]_ Ar8޶utҹ 60zh[y"f1E'\aKf0DF[EC`^%.7 &R F #d6sdR,{, 65@ы~ _ +2|9\Q']P*H 0WSҝh"]뾐W:nfYJz{[Rl ں&Lж[N{Snn)%hG#xf4JI,>b-QZu1]bSNbtќ4rvDvR0qdp`?Bx:5ph!LE_}0f^ c[<߸J2ܙ(e1I]|e$&O2CQT}Yу-}kG)~(!*BkEki@-/Y1dcc4IEA1(@MxC鍾 C黺Vͬ@Pyq |@29.;[?ǻb_ĄDxUfcD,peTL&4G*3F0{uKX銷+L>{ckig5M!#+3)[34w} \!Z2}۪ՆƲ#mt,j~x&afHCrP»騕W_0s6xl5+,g/oP! +ybx^vqDk47/+U% 'g9h)Ky#\寢nW'mjuv檩FwڳYxX <}^2y3Lz +.eiDǘ8rLݿћ\rOVÿV{ȒHKs=u6 W8%G 3k6hEaL/2]ҟn-=^ 3{ TOP"7Uez@pBCf;O\XFxhy,M{&8]v }M)drBeW#^&Rl.B~W6Qt|%xNtM# |Ux>8Fki x\]s@o~bH|^h$qIPlnJ,Q<{ ` v;9c(g[eHN|JnՑXK5$^ bpG/Bܥf>{kC18E'nœoY>lN)ek 싨}C%Qqp'uJ*Fy 8zc=~Goywyw:UF5%^#ϧgPd"몊.Pl%rwS>gZڥwC=dYuAz-;B6"M¯̡d%"oysÏ)v~' 9pK&Y =ang kˍ׬?(1lENcv^]A%5  :ߜۧ`xz"‘Dx1i!:Z:0`пNRphBGxRͳ4НPKo1YLaK̫|sƧk6af8ë曹]MrHyfrm>Z/"sHx]f@$=؋Ó$<ׅK} zxZ$fdq s\Ϳ$Mgn9c9ay,' 2u -VTd}L,_J [LZLlKy goɚP%A7ǝBx0Ӥ~\zz6o]LcxC')~`e_KH9".J@+ۣPt&y|[SW]e0=GaT,BY n`+Cm_rnYȿ (`Qŭ9Bq_N*;.DpMJ1D?kW+\F8/ύxX2fp#*7n|_ISu39x-FǙਃɾ!bՑGH㴺Ӕ-we<N +W],|9;֫1ܼQ2SouJ4 /x>rcgӹ*=1v4NY`BqaF|]/[C=Ŷ@|r&Gc_p',M[TNbv_+vX/M ͱ&(R]ѳܞTֶ`;AT1 r~ur")sēyz~?R^h8ddzN;~ 1,]Kt{"O9;=(gڵаI;v`CF->!TsB0A՟y&o,0.4N>]ip.ntj5B5:QdjJܺnOP 5qeyh9b5ƵZ*s"I J}q8#]3D{6B씹dÎqG'W~ƹi{sdb! D%> 0(]?8<}}gS:6D^ p}NnI}Qp;ݰ&8Et ._:h_ȗp8F NSƨop̵fJ_s n"{PJtKB$\DޱѓS2E> h[o+6 emO$f|ߣN;j#Z@ȉN6,;E>Y,urg8IȟsBqzSyAoA^D})e\,,`i*j`'AeLYFGAM$}+^ָ?*U!5uQO%Kqva y3=7DX+,?Jiֵga[8,$ꬢ4 h ա'C}_t%+uԠ5^ӷ͓.':8R~F)Q>j>Hh΢r"$- = h=A ɲbOyck.z^ nKZ@/ިJƄ=k>Ӗ[ba9N kX uCKmvGblC"+Q [нr VV (m.e?=WGdċի2K_&.#6;^ x~@ri:ћ"U>Q7, OY:]k5&&sjHf{>B%a3['JMbꜼ;ɺfS,5W M{`h~Q"aJi#B 1Hʺt>Q8%ˊ3"0~sy\ sӍ3ͭhDKSwcw/?TOK͏˨pwTlk0@T3HAfpF)ڰ ieAu0|\Gm>Bfh}Hb?m5˥np%buFЬ᠚W~lUd޻Q/B7/j_'X,^O<< N_݁W@:zqTFtOQk1qRT3VUļ'x p608|mOue- 9cyyqEfXi\S|>QB$̇Ӛ[.-6*b(2S࠸`[ Sr.0oYM\ZbԵ%Svl^7ƢG gٺ˩ `32At+/v7Y-7]^n'og{qk徭|ӻ--e{Tv5ƽ|LCu* Yz~iUl{/1~-QAX)uсbKߩX!7f"ߞڶ`+k~Y:z WuS@TIo-9; 9_^lEۺf3 wP|%'O z׉~9$# 74|< EJi/ffƇJtJZG13M't-W\ܦy8 -zMňEtM8UcXtO%!+U[e7Iqۤgbմ{XŻ@ g$]n2.,RdiI .6HyߞG&B+p-\ m0| \/3hw< -6qHPo𼞇S?! Q==nJBnƠXt(o6<;qZ^_Id!s:f;-VWek 7$3Vb9%wc [t,M)И ݤ{I^f'XgET`&[uZX3I*R?Z/$ơ~A]ROEVBDm` >8NKͭ$3*bEɫ/tjI# nmC0 0K:JU0=Y[eb:nMfe.o|e]\e!j/* /4fhsl_#?t- D<sAhS{B{V$ҨA)~LTioF|kRBp ~2>c߳?-ŜVHUuS1k7} 祯FFA&Bqc2IKpm讅qGk Z|/IFP"9$?Tl1CF${7eRͼ [3.-nT:^E5Vh#lُLPMvX@:Ĺ$? >&'0 w0+EQhWUqU|^xZjh<e쪂 5/q`#bkI-J{X@ '5U_0/3N:XEmxɇ9%gIcg V[U'#֪'<0Ԛϙ%%ޕIKv%Wqj;>CyѸ"`0@Cfi}_i+n-xI|o@`Dn%N#7nV ,&7.Cy%=LVug<6?/k 4AODj!ŀwRF|͗6p$UKs;]DyTAUū>fVvգ=Yq5)4#/kƷ8.0 {O|h2 Zґpa)Y1T ޶,nSX,|AR$.'mLd׍SP8ת[pL"X(~C;nj~{gDm돡)\ͶS1u8XtvSkMƥ!vTzk32!;_Kv)6FՖ_U?.Ua-O'x;$) ~bYZJ9Cv}R.&0LePҊ%a4eDPӄT j5sk,8opvq&5iꀖ-J;ljyKw@v9îv{kHX,Sx}G:i?;} ê,so2s23??ƆKics;AM2 ĝ k ҢomzC1o >"%JAb|eGfnFz6|茜=д\c,{LNqDe(&I~Vq^U 1|mxZ?t.  %u[K)5'FHtHGf`/taE8dPDIiSwYTPן#Ȇ7_k>R;%eljE"_Z"LN=(OBA]}A^n=X5ZjOxF8z_XWfHA,CX3q0:>AGv}xgHrs70D7eEnL4n'9~z*ɚDtUj 5j3#yGaBPȥdPzxn. oT?ُ6C`y֙ 7I^˃OU~5o- )w W,ʝO&6tޱ. @C ;<sʤ?}1]۹{l~QkQ/59CXDTQp!^^b{joG[6s) ֘@aŮ)Ͻ]ǻ>^`9)– l˼+;TGS)DfMvŀ`qIM;$(Mk,6\RcyShWYZ7֪e7 ,7'?de>ƻ,fla[err!~HA`'g<]lY 34a~3T|^!@2?w2dN|bv[OqD$/K$92R)dB*mBU0Zerp ^۳~+ǼZFOTJ͗/4PcT6 0Ƶ:U0w2 ,Ct0ԖbJsє,Yly.QCδZVEBLn&awZ]Yw"'nPdXn`:j;5 :> @0|@^_ݠ*|9TK/f.)zu`kF|(s"bO!"}qt؂2˰eAArIf_B|O5uUq*'p>YcF̵O"5) X:yKnLz) #(הnqxehr;еR8ͣ{d1~8}#L#Cv/ͷM"Zѷah fRL0~*r*21/U32#>a=B"қxh{Qo(y4 SK@V_Ѻͅ<]6ZPiaeP<dX+)Pv9S..^&8e bqBlr;!Ԋg~t\1+k/P2/=Iב*&[\-[ʈ9V{~}7w@VSJ lpv[ҨkU:%NGO>pj:ҝtIOo\BYd>mRK% /1!0n;Sth%w p7uLWsn)b:呑#. 
QE %XXDIfCWI{zA).9e֣ MVONp/rECޱ^qAS'*ס,2xn裳|Ũ^z/OJf?Z'4 j"Lm7>y8yBzоM mvoxFV0™%--搙zHebʺ*`}UG#伅x8 -/gm 8/#\u*h9П UC?{~Qg=IO_[ZHUQ\{ǩCP*B7 yx;b/tW۽t?@0( y8`ېedw1;e PB굼&2ǕEã}v؉-aWB5ݖ`ĘO)R~C:Jbic7Α~}O.\cŢHiDjXKvD="îD$gMeJC>nK9P2(nT)X Q[ ,<ޫ'$6p0uDT۸%"`m݆~P50M|\PzrM͌3쩵|m\!JUe$jzܤoffbkװ co W 73lH[Ta;NbIG#Fyk ,RSVrcX7m~I0NXIǠ\ g#rC`1Jo!}3qBC>05ݚyWsw.|r?T &2w!-"R(RB2-pB6[$ŬCTd'VC 2^z~N59͚12[fݱ[`)h* dq#?k$R:1E!p;m!a<+@L҄cNg@kCpkRįi9Jc)MJʝ;V鍺hCĢVM Dj.̓!*bs2迈87@8M D4%biBz> aǾPY63Ȑ5% ÛKE*+ y5 CSf˻rfSAD?J]aiZˁiZpj`lY7|B$# qmiͼ 1 4(Zs'?fN #<৆͌>Do ޮ`,`hX{GrpV U3l^Dk ӿJ]F$Qbk:/SÂ{ݯX<.ǿx>|;ZuS #lD;*G|cB~G(V珞Pq|j1:Pi<%ČYyH]U ֓?hYGQ_||QAR$V \%)Ϲ%"=FEGN-t| )0 T:DW6 ^w_ H7_YK 1,!֜ff7i RO&c2ii!Z,0GGQn<;ά~cePtSȇzٗJR*_jB6i3'Xڢ5ÇKw7"fPVF߁9jRbIa>!jZJJ|w3Kb L qo63Ur,Ɉz,EJS^ +]9}"wocY(hB}y8ٖ`iwN!Bqd|kg?qTeҕ:%ꬔ"Vk~Lhz/˫~,ڜA7wro9goSF ¤9нI([,h c&[m>oœG(M5UN oeid`oWm5C]}QY_}F˧__. XѲ$L"Y゠G VZrMl$¡r!IÌ0|S0/_1s. xlsRNauddL}K51(JZKU~N)4{f2=M"b5 ͣ S }ZzѤ΂Ġc dﶵsX3VWmnEpqiAS:%&i2F(ɛӣմñ#qCXz0 m^']6/.ು+"Gʇk#ζ{{>`v{Vu;+r|{ݵmGC#5fz(Uދ |C9s߾w2cF\rpwp1s#c}">ُؐ/:yysN&C{ LNJ ǟwkЗQ1AmO}O0}p̗’eT%>, E6將,b`mz΍Bm%*Ɉfd‰1Qk쪠'5IϘᬤ~QYI}9:+>x$uUk#{:trBkuZd Fl A=(| z_ zwB"I t[Q $΃Z7NDFzHwSz%eS.o5htisxd/z[GSV=l ,yظLx_|):nE,F|r.tDHd[T4i¯d[7=ܢ S}`:{בb=[[si?$q0ѢhY8a] ɕÝ}w/K|U  \!b/,hO3罠i%|oe 75 K-A*q!-lYc3)aҞyK$wҒ?E|229L5!8Sc`Dum&gfr$!Q4'Q|=zT YsFlXp5CS7dyyTN|#.;zDmE,Ua#؉'!pht<SLwsKzD90=8A91^qMCleX{VHceWwWvn̔o]1DG{p~Un.1JW<[<35y,R%%.r2(f-O4Z6gEJ[|u ;\^J}7frݳX7hȦДM C(At8*P&xr`~fODab`]+s셠( 입so6#7t~DBHglʹI/by3]Lܪ뜉N 10Ī?'<.(/8zJ\NZJ7?ohy}U$7Wf@MW ~6dl=kE) 3>{E\~߂8! ,'rHqUN ,)tc*)1> h qMB5J?N~=wA,P^g6$& %H{1a/TwXCl{ث976#өWD!Ye<9P@ʱ< onE${Pfxqj7#$wH+:Ƀ1(h`}S!שы_ۧzY7(iAslP@GȹzzAO@b j-!GV[)dB,h"Έ]Dьlmt@(5N^EXG: `oEGeH*V j^] .[0;Z_toHERwNߵ;3A7NW-R V-PWB&Le+ൺnMɧDtH$2n -L5j Vmb,uaoǵDQÂ}PA1=Osy/ķ#d4(`C }{3jATí|1I7'φK9>uU pX^֣*}qsba@Vva}DF胴+" rkr[<%LؾZ]$Q0O]ɫY2y3HQ Z6_}-F,rH~qˡbTfr:@sR&6τR7c-)C=(IrÏeXқ-R(4sb1vgέgg/ʓ措$ sdWl2WQhUR^/3Ze})[iDžx*}G^ij8qf !}Qځnpٙ0aƹWb*4zk&u?.Nr P5@¡stҼcKa-ٸ#{%=C*x8͡ȲХx }X#Ui}#o$#FK \ p^m6w.~]eo2 UD{OCFIaO>̜=HYrJKJxkx~Wblg1R;{ʞ+ Ǝ Y ^O1A2I#hg0EY݂ܯJ_}Bk%0ʣe8]<7d:SL{RLb-' )e9hj 178!!K?VI7g'~9[N<6tm=ě)n4PgH^%?d2Z|V6,m޹ocEiy1z'*'Vdch&jnsgܭfcG@ }-O&KC8ğO e3s nWMuXs}\BI9#b-*O';w9jRYAKzp^o+lg% X2qp٣-\)"JXW 3l"IJ̅!XLl8n3ςi.byYFrx=;^=~E@Vnek^w H(!If 3fs'?fԬLu.,WB3_8d AYmʌx0;f3W&8SɏMg@Fҿ Pd&E%.r)KKmwSD[ {ͯЉf7 lUVKipBSaܚ3&$;)B 9?Sb/D\}{! Qr,ǐ-|v(^$KnqУgd|s9qҿFZ[[nջ g| յl֌Z1jd-#YƪF-}5WT %-B2Y1;{bQ2P<,/1n /:GaW % O8j3mߘ{1.&_ 5KʳH9u)IU3D9[^t;#timNr`Q9X3 (@>^vg>Xl^& ܎N-ha?#7FxsiWaBFJEdt;!RM[>qґ |FQb5lIQ>M;uPH W(jFp;+ Mr .+gb 怈=}n@n؊F7"+?6qE.QIȂ#ݬGOv^-f8cVgjs~P@Y2O?ފ=#ץ5|Ls"GiidPXX*\W3rg9AvYn1?C%??ZORq8V x]莜Wk=eW*`TȖ9ӃsD=I.7ݏ%D1%',r4o9\ {4d*"Ϻp.T9 v!2Ol>Bý{zC-)$hnDO0g˙ZO8u:pc?d[^@ endstream endobj 707 0 obj << /Length1 2883 /Length2 21876 /Length3 0 /Length 23481 /Filter /FlateDecode >> stream xڌP 4 Npgpwww]w5hp$țo_{E̹==P)1ٛ%A. ,̼1UV33#33+-3&Gl7v)؃6 '/ /33Nqc7+3#@tFwtt/jS _;1`b hjl P7x5/;3 = t:(ni] lL gp+ /P(9A@go,,'W!+_Ʀv O+Pgtp~:ۃ݌lM17H Ӟ3~LrqFO h vO'kwy̭@f濚0su`Y9e ~,.fff.v0dU^܁l8}t!̬L]&@ +`3o @=?ef|4d_pq0|[_Vec1.(2x׆?=X@oc`6bRW7_Bl= +|*gCdʸOmr)[Z-M\ TwuX|Lm3xV%%@f+',/V7 =R6dN;!('Io `Lb7I70IL I7b0IFl&߈$F`.o(X\Z ,~C0YLV7Ud#0YLV7+jF`:"056 8-`v`qZ9N78Oo;ۿO*"V0Sc'g7bgSp~CVpfV@'3Za oqt_ 052x_Cpc0 9~AX~~+@p߄tXܹY6@~_c6T7pf~_ @Q ;?  
`B@n5h43KȖy7 x rvVU9ǯsq g?f{YkbCwq#\Ⴓ=@&z&t{NO__ۀ@))_uCH];gY] e.Tڜ "}hk;7+OMpU:~<&Nv ,}*>i$'bPyr j-KQʍ}@ʣqbu"laWeSb!V#]`EI<X״s7?f1^He|ObJu7Y罾V:C`LLSz.zl8MtMn٥P{*9^PRLR1AaDl|N+#Ke=_j7qWJzQjPE}o:nЦ WNm$soD(ڵ~]7lu+Vsq{WEN{Wu5 YZ[klIg3;|WAuSqs]ľM*hgkVF6o=Fɮ=tG<Шx.A/o ߚtő+%LJjՙ~3?c(*m nRlaOS?3oGZA!/ ]i *TVH|7.m;!q_IX4^W_1[,zγN=aZp AA HYM^M!uWBC,ʓ8FtSS8^(0J 4:=tϾ'xfdUNV& [m ?K\Ku t l+QEJ?wdbEi>gT vTaekD >O#U"z M`&;Í,_)2bO rQOiOv\?IFdr9d,N=x/ $nY۶>4m(v+s10(;pe3B &2a< |Ur4卥sYd<5cE>Q ds# @`S'u2 $5ʛr١Hu k@Z`C'M2ho.OlgpXr}BXf_N풊N08 w<D+D9%bl(cSKfJ.jeOW]T=ьH/I3IYeLE3hwg֎RPkʦnkh( 7q3_'Y lۚ=$A;RՍkvB4 u0 D BW9Y\iYz"i,bV"9PevԝQ0'dz<1qWZ޼;X3EMBb@ʼnᇵ.IoK"l-(Ggըi&fӴJ _mݺy: i^O;eg MP)w(ԌNRr(v138_͜7PQ#B=Hu"HDP2+0p2vDfܐ}Na-]قf4 &>M(?@z6^Jh-L7 & )"H%szbZ1Fԩľ+|ov]סe2i˥9x̐9;`]hC<#|OԒ%~B>dV=Ѩ閻!\9$${G0T{yѺ)^5\=_;2cO >kb'qʋ1.J6ߓa)'veɓ*_k@+lp㑷d8._ӽ/ t (ح og$Aw? jfx)5*ˑoF=B-hx( wXHbÙ\\ Fе npGG8>1j8,ނ̦ѿC̖Io CORYQvvTCF #B{($t t]W둛.#[n%[(1K𸛨A~tJZi"kb8,^x,G\43u m\M"`Ay~?q=(Gܫ$ƀ(aKׂCTcr#ie%OŸwŅW1zA&N|,$%)#2,0or&Ie uL;U+~>'6cA2(J-loMU ́fL-S@]%|DFB& wjˮ/rh $} H !)ŵ0>Y$t#Fʝ/1yc(Z7(*382bg.ke܉ u%@T=eҋ"K PRu> J L!xA*ߙ DCHG& 敭ƣ[ST>z /r WزR?ƚ&ֿ5Rw7`W (< <`aRA"른`JdbU{>&Iq$SaK`㽑NI͚XV.[U#`f&/F12H E=u 1d]b '(sqk^;}0z]qՂS!4L{LISӐ6~]o*/D?kсFgqW}OĬ[w~{H͍2pC`%A;HfV0'^J)F . rg>{8fl^P#&$#Q-_n`:锆e9]1yӴrxdB !(ih"~siCglUP%GqQJ .j6'"=A;lu$yNzRzUk'[}iGSFa֜v(ntfŀ?=GcҍvWu)>*%H})UзAĪpVߤ6["R* Dİx5%|G{\]dt0QhLyM0e1G1kwbVFsn3nb{1u2[biM-nlŸ c!#; O AXq9VW6r5; ӨX م:=(Jgvjn]{-(*Ju<wPS;Lsq|9uJyy9#b&rwyH 3]1إ,DM FB$!ND7\;h/\bUEe%7?oU^8ݔi0jwv(P)\wwl;El 1GCI}[# Zv^`>\iaQP|4u0h᰾)A@tǐ&]Bΰw#ϰCWqWv5>Зu^-/Q :ŭ|Zn,@Wxb~: zKjع@ L[ ֶޅkFm[xqmWRyR$& jHZ݆ꪶwAH #*v!lgj`vTf_0ԣ: ޙrrSm򵾺+sצBھ@]B2WX؁;R%&&w7; n2_oF![ U01r_]sN{o)ؒeoAϚB&mdTYJ6U>pZ237[}诉({,]o (HD^Q%vL)ƹܾ L4# \X0{p9:Lnz:m)7U F1)d%e >nT58ѭco+kcO)\"$#]GhZPA^ӫ@̀{oXJuax}Jw(6Q/Esh> 㾰ڌofȰ*C!!%J鴌4$Zv*V/ |p6RvRK@!TI 룙@%65-m+ӭ 5w(,6{9A'\R§yk%3IMHgXeIߊ)85_A8VPj* 6rA ^\a;lt]!H NiZCJD:Jn%]j SRc'P97I[e}iUȇa%n]]//kjz3JD^vNJpbL?>vX}$!{Õ"f*^nd43Y>rMʼnGg\#nLg5G^G>Q?!m"#JK?le>AJ*Ef ~M]>]t W}#pL*.l(OqI^eKSzqEwW=`YYGSSUncBn,ҡ}LTtAG8gj3N94.KXȪL wCVU-;.Fj^*Ȉ%o$WS"yƨFD |^UpnRX}0eˈU%o+}Tm>279٪@kx~I taL?Vѓ':}b %=ePH[)u2`Aݹ|Q˱WF|R?\v^}>}hrY!'ݓdC^s͜ZFl3lQ!4Ab>^q;o 60\e=ꘓw0 | :>A)97`h"#GGRYvBe<3fē 61K(?7Emx-MƨbM(ҽ[F=ëk *uY}3Q51 !*}T6[Cܚ,j0U$サxO5DWX6%=t!jt3g7 .Oxo-!B`CQWr""+qjwk^ǢHkW-FOT4dJ[!k,p 64|!Ct[/i{##&G:߳>$36/cY阙pcmFg_XZ_j^yK8{0zUW>PLSGp#dc11U({b%7ʐA k5#> ]unܺ?ؘkg}6o2}MA B8:DpA2VU|Dq>3@ IKǩ0T@̓V w:*wzi'9-d4wZXAa)p=`-D-Ze1zm[i ЦYqv23.F 2z# ׽7./_e=] CŻXG1˚ZRpsuŔh,Qy=xCiPvv-}$޴Conf6u"oD!& RF.F2aDb3kCLnN=w,Rat2_r2~S&{G*6rC_ռk2I\TaWDc zfکj:.[VC>{VYnVlW? /Lyy2g6Փ0Hp&t'D?JNd6 *0Šyu?E e\@sgt -DJ=Akf͏Qu+S䗩i!4=:_\FSs%P}ؽn$ú]\aɳl5hcU3_}4ʩgL~uk\ I62oHI.X4c#[ͽ^ԭl7Fdd믠4Rdt>{R}EK<]ŋL=7 ]\5YOa:#/Y63Lڎ]8"QJU-od>^9"idبCsZ(==G(,V jtKgonGS_աk͔21&Wt]N!hi jrE| &TcLWT^w\?'uPy8$a$caXF!E?:|{5F~^Sg M]jh EW!#{TkF(T ~1o*uYkf!te~cAZC0v s.ڤD3>I} Sj+Q9U1CR[ #-KlAmNAO;^n[f \[J\U1Y.cT~?rđөxސEŕ4CGy: FOiFF$D;[Ln GDٖKJ$M`ҚUő<ުe1LRYSabAm-o>LZnh@5rwh?Q]{=Y'a DžgȂ/e)%|ѣ'CLp}J1$wdMx(-vIo]BfJԄN7"~6RJȗ/Q %=5DlBH%6 Q٪R峍-m{m?:wnl^zosY)6^j>PevÅtȵBmC>Bws\ E4r*8xj2+8ubV?V4ڐ0uXJi@r7I"PY~2Yd4lt ] ykpAFt  FU/sK*$VVP֊#E?1P"S؛Xv%^b3,NI~d0p1B3*II4 mfgq[VD>]f2sQ(ݷfsovG(Uyy6֜gfCbbFJM?! ju>%;6j[@}lU$LjNJ#/14rHm +֒"maMg Rlo^$E.qTy#v[ GP?)HP@Ƨ+1/,pH* !,ưu0RԾ.oJcu}8-€~w7Fl#iI _VcFdi(ahbN}@]¾k"ByN%n_ I%<R;x)i@DRc1Bt6(<7frlqXT@qE%cEp(ɪN$ MpvHߏݽ3-R39lFW͂ _/*,j& *Ⱦ ze.%@{e&o襞n{AvsnNjyݹ0~k;v* +u > :_Ӈ7]b<g\uRmTD&Td:w\uVPEOrb"s2F)W,7Hޜ>2X> 0e]쫺2ˊXDʰF+A8K$`c&Yd"஢l1RF<rS鹉^o ! 
H 60Ԝ zϹ„p+J1h.Ku7K^8\;~51P)ZqɧkcWrQ4,.22Fa[\)%)&51`+ ȼI8 TF9'Ҷ̍LIFa YKDDԼ=Iǀv+Uf:g!g䛪Mʦ2t%q3j蝱]"6נ=;Fv\klEcΌ>9tQjsWt>b|M.t*]ގ(uoE4hS9NriJf~CHnRfNJόjh#%6 ֮z5/E ,Tܻ1@xןD#41`Uq +|']'.z]҂ްL7j˦ wϓrOyH1ÒvdEեu#)'"Rl^5֓,vIpd_$LO 5{wd DD,x rr{ّWZRw;-$Q$ >OX|}[qQH ˊ)̏)Ah,(n>\AN@mi4ˆie; îwiÊ67c'j Q"%b}aẅfژ\H#hO/u4L $ov+5o)Ч.,e\KuȤ\ jPA,οz5W c{0&hT>8T!gեW7% c$n/5{Kj6cj슔y`Ww*il#6,"U3Gu&aF#"TlS\ z٣vWcocrBʋӳZEc{$Y&(X]b:z8^~2^V&ކ/ZN !cM{G$a{c|賰͙?Lܙ~4b0+:7*hRU6rPV&!'jRj{/P2ʝOCŃ%=fɯA!f}{O$M"=w)9)U)h4K<\YC5$ -m:({EY )L̏Di|NaVC02O+*tguXƱJEW_}?\a,\&MC\ݥZv1XcɏkLMq 8C ×>Tt~;񀲛` : w㨝5=#j o :. ^xA|+}XT}K y7gp)vRǠ$KN~BQgA1.S?L[oW>vq ޒ&UϵP"? z3P3o3[NW~փگb8L:baPǂy~c#3t9SY%ɱz,rztYmA;o7*8~`'} >slaV˕H͢`1)ռ`ZT~-eYd%Ynl빔h)[ ÖnTzmB\+|GKTcF.6^Od7+)N*-vF$"dѯ|w+}K+t]'DW@Z@l!~H1yXtq('5 v rys-*HQ#g<ۥ:W@ dZ5]BwaZqE4Ǒ׿[+D֫jz0A| aȜKאq J,Ц'̧}\W&o! sݨVx\8m>Z^ȉU`y#y^x})$R"Nz`1{2r%oҴғ{s`m+eH{Wـbݓv_w?>/%XB:6JJǹAܠ>@^D Ĭ -S+7$µg0:j"ӡqR~O|!J9uG# JӨp'No.ERuLp YC{^T1:qrHڤuY]fJoɬŨ[>B׾ЎcAɓ)hdGG#Z #/4Wv=m"hQy h@.ZsI[>Yhܬz΂rM N2'%gäпn6#^nJsTBaDRYVG~H~҆kV Yn%֐g=A/Y}m@f j 9FjO{D%m^9bKG+ΠOſ Z1XsEP3Fhwm}ĺvIx8[;D77Q|ɹRBVr/,ZPvtAX1@0bސ'~J q]BLѪ=XCsF dk^1[)piv]+T8Ʀ>@ #%e ۧ#: /BΑ03^v_O)maϟ d#BpͻۊIy ظ#T[8e|P#]:AK;X JK_~!]Kҙ+SUtE޲$ңH᏶aCD5TPKnZl^w>?x(X7#E Ø6!}OK7g8.6͍xl];Q@y覆j-YA9)pxj[)j1Ri#ss\Ej/+շf{%&Ȓv rt!9pĨ/gPRV1T$4?(nP.-x(ɷyqb^e՟"LcW2^!s2wOY My_Oh|i(]akAZ-]Uh3! qyVxkL(F Fļ6`wX6{! Pn]vΓUIU?;=U|1k9i֑B#> DzE ' hb$} SI#)I7n_Pɽ{?[yU<> Сu ڈ%4xfC݋Xg(eXXH̚'>%u'vueOˉg Ux˜}p GT߇N~6]YaiI| +W#xp\[ՖD{\ Au[~g)(z#(;~Osb61h`^P)h)])]izl M۬b@UEynK/=UA\93 ƁqɃA[X]|_Ա.݆C)QIjI?= mx٩pEx˗yc628׽IWZETd慫T ;`WHkHq[5VSmrKi$!F6[ZG@͖ǫGx_GFK9ʂNC=v:kuP4KhL CF5":̫p\Den_qUB;~a/AaF0yqG+N˃IJEb*KV\?{'.̂Sf|y3SE:E} xhZL/0&?rԥo7;IrF<[qk|,C?#u>?|y_q3f7> Cˆ8qg;\J!)QyNk860 Jxb"T֒M|`s=A\\v5~0"İ˓r:(m8dNF|r$$[ot}xmI.^U!'~ΧOHBQ.pxvI/ZZDF,NH"UM6@ZQN_\b'-LTj+GRQ>KŮz\ӿ!mD@x@B>Y \秃_V#]E'"n{ثrQNԌ'onןWNPjYFVLK?MOύ`v9Z>Tp%Uf:2MFp0 %a8}fw``?_ʼnb6n8^5s6-z_)L ,Ê㚓M_' zw nQX_+g)vgD̫_}cڠjTDC75Ȼ8݊Z3^q_reu5ZFIM6DP=MCG YfAeZGw@or"•' }忔t,DȦ,[U.LJWh:=Rh"^nȼTti~|T}>ſyiyKC,-hr#XQla>0~P>pð#ͻnp9p`t) ^ՔvH'K/yk'Xhd:- OJp>.3}5|@ꉾOȠEET{) .AR-4欷(qk{.vZd״8cG܌\{ScO# w'JԳ%rj&xGҽdA2t4*Tj9M$)diKc +0#4wZq=LgҳӽnXϲʋ`63K|*,dw hJ1Iyx6[$.(=hU`B +&2**iwka]ҥz'}[͘jtͶf8.[+,H$H¡%9FP.kuM~W枉ջ <97at_L޽{5C#}w3ʧJitOu/.:7tv?"p'^[r‘P @K{f'Xa05B)mրk0 ؅*|[/0ҺR!Dfhaو:2 yWczSTUNN^מ#7pdsM3w{[#Wbx3meɐ/ixlZYRJW=/ו)!G>IEAyyw7r7^jƱ4dI޽b}g~~c_u9L̶eZ yJj7~Z[ /?&gF0db*zNT;r&AVqHA'b/ 堗oJ'MJȼK/M5%?$lq 9J-=Mcb̀{ga|͓ʇ qp~@Ǿ5KCx85\Wr;PmlG3&CIYtϲ}%0hx&vtڢَwYT X7\;U*,Jhet>MPߌI$E]vi]䥼3퐢-]?_p1['B1y] N=_24t5U`Gl_fL3زLlaNͫ_{{PZ~=]S$>!etTMdM &bMɨd)nw1;;Dge.ltQ;a͡c@_n,\OHtv3x*~;wS#ustq]c4P^tmx;%%#/D,݈) w8 H:VZӡpb+{1BЬoZ_v" 73+@u0Q bzbv@p'%5U[p BUU^QqiUW-:٤ o*죬mPBz;Q^_.hxgp0X3 iu(X endstream endobj 709 0 obj << /Length1 1472 /Length2 6813 /Length3 0 /Length 7801 /Filter /FlateDecode >> stream xڍwT?(in)a l0ltKK !tHt %[?}ٞ9b34Tv;@40 H(P3B@1)CKeAxC0" `$FFb0]wHI P/G8B:w071*"1u8@Ғ 聑̍`w  AG{x . ?(` |!N_p`o`B\SWj30 w# 9Am]'?[ NgDP`# CCa.g;`+D!`/G7C߅F0ߟPO7B_i0MV9=< 07Ԡ#hcu`0'_|<`P/ڟ?: %/*+) 8c @q7@"| AiD H #';F qCLE_On9aWT oTT(@ ("@ҢI!YZC0ڀdԆ9@/ru}8?1_38+BG=? 1]- , Vm$0ր NPkP A@0yJ1m`VWNvND\F hb18 YN'7B08 8Ŀ tE1i0[Dž!!J~I!?loVa %~6 đxn(!{2$׺E6`r ڼ7匁yNY/vZ xabzzhVBbB0,&WC=;RIkֻ)AV"ISKU"] V̸Tlf޽5t>kG:}];bQza\6T2TŻ+(#VoT赁d^3 {twV6GEː r2od,^33P|#^82uRGU=uChz),]oKtz:V'Da!y:B\k:Wr]4 Q`[ݒG!:fީd}F.wEdޔ*|]XX燁w񴷉K"]I>KFoQLY0aFUI`_cgՑꢹ)M {Gryw@hc}e9BB'3hv$c߳{xCqF$>R<834j&\hhQUñ'u16'k DKڟf&nYRZu^ S5VdCV BB ½2b p[\uCO zRur@=\&n4.‘U'JsXau:E,vx58PiШa46!Ҝ]VGSNh[w$Kf!|}nB5N( Mk=$':!uG<1zS$fe;T{kAghNnV3q7y!:.G(& p_ bFiNėIG;UC[V {i|8$O( o}~?Bh|e{)/L0u%ny5k'^e. 
Z3L$OGKIC՗tEӬ2&N `5+ҲjC7wW6j9oKC^򪚜[-JNs #peXu,@cr³ +B}r}ޝ9/osrr Հ7]usMx%6?fQ_slf^bO Ÿys$NnfUѠo@ yKo"z:煉аGg\owq`L3՗.úR闪רimG\kgؔ7tv'g/ަHQH:I- јίrpՕ%Tظs=rl"Oz>ow3fw\cZ#YΓ~3hskz_=d}FZ:֐#Τ~Q7d!*C)%:xwj+24XXĐ{=aB\#,)R%Џ'. d)W7AYrӉΰjAilϫ+ܭݏ>c |&њȺHvV[g7/&pN>wJ\n= ,W~]0J Fに{VD8FGux*~?vhNS9x.@$ ޻mN`9sy7^s*g5L2<\;mzfa|zXd*f!ZŎ_$7\ 擭;!@faF*A'(ã)CyLƌodN,_ѸfXȚ\ULXO_23\$.n6,;oԠh.bvŋ(pQ69p}*տQW7{w.uIURZֵ V0}s}DȘZd%?VDKVЦu06|Zf {ӻl`1{Smg5Ҧr? >}_ MgmC!1B^ݷIbo>eaaA% /PCĢpXbUbyZ\ Xxյռt&`axPHS'3!Ov8e>|0̈́ϳ6 w_xT}{qf JQȯ?&#~ĩZ(*'"DykWs( PQŒɓ%\ga Lomܫ릺:\8Wm^-Kp/lK z/x\&[~tXa5%Kht:pMv(-<ɀY'q1. I|yy뙼RZ݋Vպg,SZ$Iy< ~2CߛFڨHZ]eƴ F2KeXj^WӫWkP光1F!ud4Iw:7T!!ugzWYy!: :z!'TgY_-M =DiS =b(+MB/MwݾOoq^ %Z߸3񭅮 lVTC52i:&zy]*G3íF4 3T;'J9DQ}ciuu^Q=oZOQL=܄BIr%[toh66;'[)/:NޫdсWqSu 8,4H0qL|II+HT쏰R5qxxZHu8^HM< =Oy@9v3Ys: ̺ۑFZyF}aW ݿ!X A.o k ?iP7cY9WnPGF>ÏđVЛ#g+i\sZ˱B󉦼#aۯg@|H6,O3;J_^<%`'F:aEզn0w⃅o {WxJVYλ2AG;xڤ=uLsNIʰ(7;_blhep c76R!_NA[wLE@z;Ko9K-9kQ+֓sѨlu[*mNCɢ5 aD{xA\h+fn:Ӽt*vįL5EE -y+/zě9eC1(?XCvdr#PIJĚ}5w| ;|IsxyZ=(/ϥ>k:T[n4-#Z<ﭩWc!_!g-uonm5 3]gl K KqmFn̞J?7r^',w@[?{(l agY0SsX™jp)37Z"jTZN8+x*7vY"b6hVn껢djLA=ݸ[[[szD7[%x [j<o>5F)`wݔBS O}.:>e*6Q3Kh'=ξ0ڕhpQtD9o>y{w ]j$|l]#F/}xB6fv]9[:S]崁X؞/WL:ռ72!!YDێ n*lkLMBcBBMo^"LcHmNӧNd\3w 2^l8)e <ު #I-^D)0UUXa2=o<%psltQYr?dK{y3)UArIYȁ<I8̇fp+jSBu*yqK4!~癁jzX%~6Ry1,{k6ZCn7ZVyESu'[j(&'!η":eW`Aj?m endstream endobj 711 0 obj << /Length1 1533 /Length2 7476 /Length3 0 /Length 8498 /Filter /FlateDecode >> stream xڍ4n6]{֊$D̠Jk+jKڵ)VڳFj7s4 PtF@=w_O IWDp`BC~pj a|1@#xCn{C @_H쟵# PŜ`8a@/YWrc?wo# tQa b^@{p(hs#f0?{\p_}Nx` 矕5N[(߿+ck?&EE/+(" `$nY7ߨ>@f@:R(oiv (a@eoAq_g,}T~[y~?Vb/ #tP@O^j` P@:pOU/A:5w#a(O1@c7JAd#;8dLnNJ`-EϽ4lcbdd#80r [\0?fU):ۆg4L*yp\/5ۭdHke*@ٽwlUg/$ޤ/KDkrd9ľffWRq3L݀BRMP0 _.<'&=,N'KޝF(iSV󘅉P`:&[ s(F6>-5[}HqDެĔhv6A6BXΔdXNqFd)|r5DğVͺP- +waG-TG}b t듛_os&۹k˨iW)3En֘퓹(:FϴՇL5~r}.Rv6)Ā0UK|#طcqiaDEv xrk$èFF,7FUKޒ z. KŧՂ:Cof2,4%ʗe+?ISrƣ'篁`-V?rwtv6@_)cdjQsjnc;-z*}BC2[<6"m -7?v|YA52962_e5|"b3Frـ!Z:l 2&ImOx+TT/|KYw4|~A{2.;Qں%>BcO͗Dk]I<?FC`_ߎ6 Fl=Qyt<-c%)&Ur̦^E]H auNke""n<$AxfND& BHt3r#`7YCQZ箺fQFivV6T̗N \Ո`) .&Oi{TٟXל>9Qu˔]>ĬȴNrAn8:͕!~Rm7W\Ym ܽ4(}+pΡ:k3NqdeO(f.ȼ^|=|$]Z*@"2WQȘ0ۙȏSd>nd.|RbVA<5Q+|眎Ѣ 02܈3Z=ƛ"?NfM' ߬jIJx@X< InqQse$?ZaM%l:Ob}Mh6-&ImnUϫ[VlfVe:@S:8r-4acjt]X㹆C& 9.%h͞Y[RbF.7, `vHP;Tǀ+آ m:öR}2]̈Mh~H뇱'릛e.[H_A,Zz:^4oU.~8es(fV,[xrƙrB#69iiPG*wP1X˪&Fj3V3Xo_k~8:z(5՝=qjJ?T9tã-L yál Lei:Y跀SwD:ArS S}SdBL+-Z75Gt d)}Mcc]s吜):o\zw%J p,+'zJ\ZdoioQgY_9:D|cϏ#+dʅg*}8/y4zJ塚yGԝiULPM eTK"`/FsC f1sKsjrl)AބBl )f~6ԁ1fc"zkEE<*CcX8/}A}\Ek;DWAǍ'%uX'tTm+ȴ)m0; ՞R "QR%̣?e]0M_in%j;X 3A#%V$ThɆ{`zT\#GOW(鯀]#5[莒s6ʲ}̜-o{F|׊J׎ -:c;'e]ωF0^3".4ܹ>c_IJ(0LB\ ]xzpicD@WSk!F~wcْ<.2rQةXZeSqm WXQpVRi&"`\<#o,cnx^5;F`do/%[K֦ 7'RPKDnAnpڕt5??USbv/jv8\yC[N"OlE̢dy1wi5^C?}/ ?9?>ŏ Y.) l?t\1[Zf! 0^D,F\/fFB.kh} :6GIQXaSb/ț5 *ޡw䂩}Dȕu>zQ>j&9nc c2e.0w/|FJ>kV|1{JDdn??{i;8f/4SEz.Y kL ? y 9cޠq|G)Pɦ0܀ҫt!{eE -#j3"Ij\[S]cz5Z)XUg qk]~22ofy< j@w:nZa pV z#0#e'* [ؤK路!4Xz3I=ᅺ\)8JUo^`{;,Pԟu}Hq;M}/Exd HnO([ K^D"oqv=GbÖ|y'oи_ߴicvnc }`EAJe9BTew!l.h&}tTR+>s(4a$ns+]P?ڮJ3gn (:vs7đ9`TwuZJ:##ܿLq}rQ-2ʼy FɤpN,3FU%* d—Yipq.%o4HD.~8e؀mbV;\[_gFi' =\>4.E IJqub1-(6& ̞UDPFX}I2ZQmx aNJ@wJ7,qCjҚ"xRS!c:9#"!} ;Ęem!Kgh#@ުqLv[T[mG,yA62F;4au20а9AXgd`ݨo{?D݌Pq hBSd8aB|?@w^6ß=dϪpMW|:={jRǺ &9*9NBlv5/d˫4̞F1q#Nk} />rR.cx^kuUCTJ-`$- Jӹ1<ܟ ' !#t+=*総39INJ܅5yI%'l#M`=1 u2ٿ82{ l_ 0Y^}Ԓ/3pnjL]f9Gx dx.yiud$F"/.SXIzwaVkdʣ OCɚ7{c]ѷIh3>*^aZ:}N8]MtU6b_`e}>G*gVsO}{u^!^h}s2q9c1<. rʜwao";aaŽ2![ObSQ?#SP{?w"@dWF9'1DfY>7ۋ$W6]Hbߝyґg%/kQ_D>YW${g[E{>DG+w*lmHezA>Q/dJ1\^ge9@[㞮L }jQC/BROnk:x ҥ6j hn|/s3]njvcﰤ"]a>-J?B#񈲮c4v9X ",\ uju ոwSsLu%Os.[=M'=qo/br(K[Yvt@x \>q/2?qQCav*F҃9khoٶ[>Xj?2> stream xڍPk-[иCp'7N7Ҹn! 
wMKrf朙׶o-_OC"n 6ʀAVvA:?FKB tqtA^dRf3%0p r 8mvHZX `Fbkmy9ߏz w# P2_N0sh-l Aqdc`5steX00-aIjm& ӻm(߾ N7 !h>:jö@w)9cyz6 /E}(aъ6 *7JHa!CbĹĜɝz&WH`B;*{RӵH g|G«Ңuԯ޻w.?ii`䣢[K(/V=𵘻2$>8c_ZO7u f66WAp<"rNF+~ 7gm8! {&'?+Ѹs5_#iJBl\]%>Qr}]'Q=GɠiP]{;-V?͂]@jw,n%S*F~ٲގP5mX> . eGчY=-DЈx6^k猤yh'9[aԏ#hyvܯrg;Tw{y{{O_1b\u)bxYB0p)9/[]=G>k|LϺ~4>ʒuFHO+4>{{R`/}y vв>(ljq# 8rVɸ>5tbCpw fGʷRWܐeM8WϺ~6@54>dl}RN"qV/Pe=dBʳ>ϧp<ו;݊5=<~.o7h2ҩ@h.b=f$ߊ%bm YɋĒGoQcũOSis?N1_Z7@ۗ$("HW#3񙘍Ɣّ6kOcc˕*UIrbu//'& |Dd{a z*2EcQX@nDzATsZ#;+bܘM.*]h0~%_'^VS:@ZGjoL熄N!p-=]4^ya`97}F{D Δ]aYV=/跡qU^J?An^ĿjRb0w ̧/g(0C1ży-naZt+X\ɸ_ $w&E$m\S.@lݣMX*Mt<3jm/|<ȣCt.gܙCG*nc-*E0H=sfi~ 8:T K#Ms=wkFz$FΉPv3=PG}i}Zl۾WS&8&9q'ed7=`"(뛊kdn obj^}\Z^\7\% (r;%~aZ^; 4 era& iP9P@|i~xo-,4+n˂T(A<Ң&sf=m7aj3Ofo75ϮK[yiOV;'~Cr+]UkO )wϼ}8Ib?=g2‡+<=[L,uzH7GhaL:;d+Xkb .ZKR/)i)''+qbh<+5u+guU lE1@?`} V!3ƚȈ~n6txz* )"+3`ն#uv qFIxT GL3w_nzAV>cAD22Qr$t]'; rfcy 5%l9Q+qc0I|'ԯxd߂|26 Ӄk+=tU0SRҩ-."ngP=qFyShgQu'!YFpa5ފ4Xgau׌v*ɝR@/ _9sIX" B V~),l"@|)C^!.·EܡFܑJ5%;S9EP,tԜ/GDfffiΨՠKƫ& DCtn~tGAOK8.Nhe8EiLm2 F*@n~ٹgvxRGL`s"[V9"DF\xf.jhx}241=UV7#t`Q~YyO4K596JZ#jpS(ޖQxa2 kGBU{tՎOPo^(2\e=Ξ]χm> iJ"(kE|_CAU1g[0Bk3M!}0@ReS8 c,4,V@/Vu6|AWACUǁ0ȎhQ]0D3i:' U۬*giFC2ԂU Z7TVm4OLyRԡbbS/5\c3}A+ޜ_8sqUꊯW1zKx+wZdrMgOgp3O*V N:MO 4^H&鸌b& AvДP: 47\W.+ :bo-ZhbP#᭴X(+B59ĝSm˲ʨLЙSp!v;S2)G?W,)Jم>@G]P9,cUDY*+,u|AH_둿oId;БU~P!^=%iڀyBDWϘxv^ :\"9izMpn ?ъM4*Ca'kQ/▮=]zLo3"e >{4fqmD5Eym4Dktg{z& V:?:8=UiW&R]DÙ3byj$@]'-D v}nZ+C)ڃwctG>w6 ikpݳ *įer.zT *rnWV8M2"c).>+^H~\~<ɴ푕fTĤ|=xح)/7EfK>a]jLq$\$m߆NzChǯ鵏|LpSp }t;ufKK ڝ #U-vEM[RzG^>MF˛t-^=>t=+[Q,PCA+Mu5)}̰)-Ǻ!3ߌ5en$^&%ڔS14Ru.%v%r:|D%(: oͦ=.kHw?i+G) 晪i٬/e)Bly$jt:RE)/1h뇽L!-o:`ImC~:P@cĩᬷad79_N䂚6&%*.5͍Ek"pa ~TRv1k'T=T4k_U|C6{&ܰ=c#?f!r!\ZPh/ѭ^A}i7=M E{3]XxpN#.w8M7_\:I;OUV4b(>xzIՅk4&9!aj|zM^&>[J`y$:Ggl Vs^ %Ϧ3MYi2?t5H g}*`Zj\H6`Q7C_H-2n'%{`E.~N G'1UU[Qڧr,eU|ws{ }R('F}g_ş# l}Cb@kPIeb2x{%6QpMx5B*lYI !+).<=EZ׎tvX;Wc=fB{DZb wr\IR: 7I.MebefE|}>|\㱯fS;z$ WU駣ߝ! htmNL"MnvWJ[(f;Ѩ@ϡgBXlJ4 kYgt/DD:QLMv"1pdpKY2wh;Α Z.WO%ʑ8ZySz6k4TC>li~ *Ѻ ~\iz[3eHV*n i EC\A}b/*!%V*lR!)={^v0 "Te^/ȃXȅm'fDAm*? oB4e ?kTI̍/ك3Y)c4{buL ՌIJji m@S2`Wm;ˆ@_eU.y=NNywF(",Ltgy}$n*ﻶzN?цKH/Nف\vYɈ$㆗ct¯Jp=8F~AoU.z=SKoI(;N%ƞG t3%Z@Ҝ 㵇`T_kU5uX`5͚߷TC+7!\tKAާ(6a{O$b5_9!7~1+M:N)YQ)H0USe- 1ku\V}l8Hþ9FKN_i+}[/`F׃.fw9nxu;!R nplGE_u?ZK7KoXTRȃndhMpZU6#TjGAhG܁$ΖD_%AGhC C[b.`D [&'v^8ٲKc.Y&ZmMtpJ3)U)dj ?F<쏃`pddD~8DV,{wcͧ{,tk:! ژ_:mgԄy IO>&1'Vqk7I@1ڏlҎZ,x:kd]X|0]ɮͿxBoǹ)=e Ϟ̔VFQ俺-ulm ȯG9CNq?W 0$.m~,lj52(]Nm\L/.lxnēL<>aۧ[ v*h$sf镉Ogl>EMd~AS8L)G>[yz`v0ɤc)D&٣>NIov=K;7W3 I+ Ԏ!|8>Ǥעv?b%4cY CnRd((4 N(ہa@>TWr]-"=?*_u&"SIZ1&@O-orqA_۹)&dM˯hsNBNT@wocU?#.69τÓ1- 5p%=N#VƽH$̈'`VayrGLIe_1vFI;+ .Hck׃׺O>GV@Ic_7Q~Ge2mQ},* WH{ IEߞ,KxJ=sd8IR$7G<73!Tŀ[ER*XQ^Y! uqVoB>f@IQAh6]Mxfc}P}T[iIgPTjqc+iIXatZUmG&܅E!ۖY>2 ?%X;SO WknUXgƵ\$%3йƄR +VB!z5/5+WQᐲZIUi:sYd_W~3U@>+~H#5J @%Z)A܃^+_<>`T ;&n[8]''@Mzc]#)TTÏ=U5|N|VRTMQm?rbrϧ` |: PyUƊR ;ݽCt%\򲾹K_*L8'OR7rs;!%]dSB Û+ytDCD>Opk$pQ2_Q⢭Sd`z8aYa)'LZdE'$ 9ZZfrtDϊ|UXqZ^-H[ߊ(9'hB$*BFBe" SGLdCr][<\ }Pj/M!Ʊ#}T:߲->Zh! DCi"HskW%*듫~d|WK/އZfq MI%Ye7[*hoǛәWx§sYim襉dwfeD5k< /Dh ȧm,wh`B$q)(53{][:ds`.S0}j6usңi\aR8jة&09l`$dŀg[8e,B'RO ebܥ,hP/iEMu&@KLd$oVlZ]'`6L+I(ݸ(荏 Sp !ߎjU.xW̞Cs?͖j>.p}uJn_~}XUDN$ZEw7b 1H ՁzL}L lM> stream xڍeT҆@p]wewI$$Xp _}{ZU]ouUP3CLA{f ?@BQ]]\`c`accGv Qk{H8L\^$M\^]! gc4qY! 
[binary data omitted: FlateDecode-compressed PDF streams (embedded font programs, object streams, and the closing cross-reference stream), followed by extraction residue of an SVG image whose tar header was lost; recoverable SVG metadata: image/svg+xml, title: PyCorrFit]
pycorrfit-0.8.1/doc-src/Images/PyCorrFit_Screenshot_Main.png0000755000175000017500000023774412262516600022574 0ustar toortoor[binary PNG data omitted; recoverable tEXt metadata: Software = gnome-screenshot]
pycorrfit-0.8.1/doc-src/Images/PyCorrFit_icon.png0000644000175000017500000003055312262516600020435 0ustar toortoor[binary PNG data omitted; recoverable tEXt metadata: Software = www.inkscape.org]
pycorrfit-0.8.1/doc-src/Images/PyCorrFit_icon_dark.svg0000644000175000017500000001135712262516600021452 0ustar toortoor[SVG markup omitted; recoverable metadata: image/svg+xml]
pycorrfit-0.8.1/doc-src/Images/PyCorrFit_logo_dark.pdf0000644000175000017500000002265612262516600021440 0ustar toortoor[PDF data omitted; recoverable metadata: PDF-1.5, Creator/Producer cairo 1.10.2 (http://cairographics.org), embedded subset font MEWPKO+CourierNewPSMT (Courier New)]
pycorrfit-0.8.1/doc-src/Images/PyCorrFit_logo_dark.png0000644000175000017500000001047412262516600021446 0ustar toortoor[binary PNG data omitted; recoverable tEXt metadata: Software = www.inkscape.org]
pycorrfit-0.8.1/doc-src/Images/PyCorrFit_logo.svg0000644000175000017500000001250212262516600020452 0ustar toortoor[SVG markup omitted; recoverable metadata: image/svg+xml, title: PyCorrFit]
pycorrfit-0.8.1/doc-src/Images/PyCorrFit_icon.svg0000644000175000017500000001131712262516600020445 0ustar toortoor[SVG markup omitted; recoverable metadata: image/svg+xml]
pycorrfit-0.8.1/doc-src/Bibliography.bib0000755000175000017500000017773612262516600016734 0ustar toortoor% This file was created with JabRef 2.7b.
% Encoding: UTF-8

@ARTICLE{Aragon1976,
8)=RZY}uoQs"ˁˁ:98Iyd9YA,YfDue22,%FFu_ux*s6;fy`VˆNGZr:ȏ䴭c;bop[ʫu~IhGмz.6<2`a_鬍wVǚsTsXXXssdGT^ie:LuZtm3vU\w>"E=*)UxU&HTGȏUFd__ߞtolZճS۳i37@&45hcN9FئO=o:ی=kzpB $LmzVb7'= endstream endobj 8 0 obj 6610 endobj 9 0 obj << /Length 10 0 R /Filter /FlateDecode >> stream x]j >$n!Cд 1}PA|83[{+: gx0 Z)1:Z,k18I7/k3U8kb޽Oc+c\m!s*IXތ,jW+%{yTg9k.DD剨LT)3) х{+)TV6{b>vfZݚ\igs` endstream endobj 10 0 obj 264 endobj 11 0 obj << /Type /FontDescriptor /FontName /MEWPKO+CourierNewPSMT /FontFamily (Courier New) /Flags 4 /FontBBox [ -21 -679 637 1020 ] /ItalicAngle 0 /Ascent 832 /Descent -300 /CapHeight 1020 /StemV 80 /StemH 80 /FontFile2 7 0 R >> endobj 12 0 obj << /Type /Font /Subtype /CIDFontType2 /BaseFont /MEWPKO+CourierNewPSMT /CIDSystemInfo << /Registry (Adobe) /Ordering (Identity) /Supplement 0 >> /FontDescriptor 11 0 R /W [0 [ 600 600 600 600 600 600 600 600 600 ]] >> endobj 5 0 obj << /Type /Font /Subtype /Type0 /BaseFont /MEWPKO+CourierNewPSMT /Encoding /Identity-H /DescendantFonts [ 12 0 R] /ToUnicode 9 0 R >> endobj 1 0 obj << /Type /Pages /Kids [ 6 0 R ] /Count 1 >> endobj 13 0 obj << /Creator (cairo 1.10.2 (http://cairographics.org)) /Producer (cairo 1.10.2 (http://cairographics.org)) >> endobj 14 0 obj << /Type /Catalog /Pages 1 0 R >> endobj xref 0 15 0000000000 65535 f 0000009014 00000 n 0000000773 00000 n 0000000015 00000 n 0000000751 00000 n 0000008851 00000 n 0000001008 00000 n 0000001221 00000 n 0000007925 00000 n 0000007948 00000 n 0000008290 00000 n 0000008313 00000 n 0000008584 00000 n 0000009079 00000 n 0000009207 00000 n trailer << /Size 15 /Root 14 0 R /Info 13 0 R >> startxref 9260 %%EOF pycorrfit-0.8.1/doc-src/Images/PyCorrFit_logo_dark.png0000644000175000017500000001047412262516600021446 0ustar toortoorPNG  IHDR<r>sBIT|d pHYstEXtSoftwarewww.inkscape.org<IDATxyTzwٔMfiE@a28рK4Dq\A#q' 1J9x4$qd5.aId# Bw]ukW]@9}h}w+QUQZtq|nyL Q#.|ͩ:xN˴1x+nM8NiWJp 0Xey8>94" UձILlc{/Ң4"@D:~ |[L?K&y$;EYe ?Cn׀9TJ`lq |t֥AZyV`-^&q׈fwI#ȭLc*ZKU2۾UO5Q7-K"a4c-.˒]h6swͣ+s^dp|_Kd㛸FDð)f63R2:,mYb!0Ļ /w@҅k>d K1(g"B-頪mk&h39W!` X椛L*s?*YލkX1H:c[Xļ҈HGLBPB(Աmc)ihR=.'fc3c)%XB+D42B4`yAUS#"C0Y1Z5,u.N mbeY\2d;ų=돥sҜMg."rw,jJUHcU|)TљsβfRCgDz5n|E0M UՄj+8۠ s,6)"jA:U8Lqdy5","?Kr<&Ui,f$ nrX%pUJU \D*ˁכM,Kk2PTb+A<;0{Hbf&~)!V9=`Q0F`k5Ͱ 36QEs͍[!)vx C'Q@tJBa'jEt$:!cZle-}ZW7rh'!UMb6oHl[O!;KqqqRM["t!U= <ƴp>%kpKi:ґ@\0b+9jFg.=,f7W2S8|AgHY1s 2MlaLJ"̫YqvsE`c*+"ضf <˯("wOuVɘq0V.33F4s 1|АZw0foPxB5mc=B6r.A=4u tzpDd40 G3)" L%kTDaX͘#u8f{@UCDPѨ#0Cƃ"E[чzкZjpOKC|‘[`!SN`:;Ե|%ϯT㜏9c7aN1"|LJ#?|ЭǤLJ3|ta?~?q_8drIt&|K6/G8dPJ5Y%XhPo:uK }R:?jΈYPyWE4z;}7 Yh@aua>la-qqڭ3~HmaoXV*ɴ)W^ U| J=4.9G#˕y!s )oPpX4cgdpwNvظ>r5PwT*K|mR_(WK=91Ei&AaL=;8J% ]Vujশ#DTgk6|Q a xڊ!:aL88Zd݇ERi^zqyɠ/~ gsE4# `Y`pB/L'6مp"9ɰ@E2RyUƣ}QiLPџԩLL.?PՔGx؄U%dJW8\#T7UzִNGM7mGٙF!.B(n,<33q!R塺zjAVsC>\ pQU䗒!nBGr$H (ee/Y Ԩ?tɭXʋ^sf0 e&'bDvMu/|J04ðГb<<AG!SZa:K%Ӏjnb ~B:=ߙؖx'+2Uy$Ko #N8`4V(ߒ慭x -"#?XH{[{?_Uו@D`N,3*4ؚ>XceIHJ$%H`6N8[Y*﫪[j<<’292H&m , +պJda}=sG[r92Ѻ%/3`1aXpeyj[/b*m4&Sq*caE0V ~KG*w>tIENDB`pycorrfit-0.8.1/doc-src/Images/PyCorrFit_logo.svg0000644000175000017500000001250212262516600020452 0ustar toortoor image/svg+xml PyCorrFit pycorrfit-0.8.1/doc-src/Images/PyCorrFit_icon.svg0000644000175000017500000001131712262516600020445 0ustar toortoor image/svg+xml pycorrfit-0.8.1/doc-src/Bibliography.bib0000755000175000017500000017773612262516600016744 0ustar toortoor% This file was created with JabRef 2.7b. % Encoding: UTF-8 @ARTICLE{Aragon1976, author = {S. R. Aragon and R. 
Pecora}, title = {Fluorescence correlation spectroscopy as a probe of molecular dynamics}, journal = {The Journal of Chemical Physics}, year = {1976}, volume = {64}, pages = {1791-1803}, number = {4}, doi = {10.1063/1.432357}, owner = {paul}, publisher = {AIP}, timestamp = {2012.11.02} } @ARTICLE{Ashkin1970, author = {Ashkin, A.}, title = {Acceleration and Trapping of Particles by Radiation Pressure}, journal = {Physical Review Letters}, year = {1970}, volume = {24}, pages = {156--159}, month = {Jan}, doi = {10.1103/PhysRevLett.24.156}, issue = {4}, owner = {paul}, publisher = {American Physical Society}, timestamp = {2012.11.13} } @ARTICLE{Axelrod1984, author = {Axelrod, D and Burghardt, T P and Thompson, N L}, title = {Total Internal Reflection Fluorescence}, journal = {Annual Review of Biophysics and Biomolecular Structure}, year = {1984}, volume = {13}, pages = {247--268}, number = {1}, month = jun, booktitle = {Annual Review of Biophysics and Bioengineering}, comment = {doi: 10.1146/annurev.bb.13.060184.001335}, doi = {10.1146/annurev.bb.13.060184.001335}, issn = {0084-6589}, owner = {paul}, publisher = {Annual Reviews}, timestamp = {2012.02.14} } @ARTICLE{Bag2012, author = {Bag, Nirmalya and Sankaran, Jagadish and Paul, Alexandra and Kraut, Rachel S. and Wohland, Thorsten}, title = {Calibration and Limits of Camera-Based Fluorescence Correlation Spectroscopy: A Supported Lipid Bilayer Study}, journal = {ChemPhysChem}, year = {2012}, volume = {13}, pages = {2784--2794}, number = {11}, doi = {10.1002/cphc.201200032}, issn = {1439-7641}, keywords = {fluorescence spectroscopy, membrane, multiplexing, point spread function, total internal reflection}, owner = {paul}, publisher = {WILEY-VCH Verlag}, timestamp = {2012.09.20} } @ARTICLE{Bestvater2010, author = {Felix Bestvater and Zahir Seghiri and Moon Sik Kang and Nadine Gr\"{o}ner and Ji Young Lee and Kang-Bin Im and Malte Wachsmuth}, title = {EMCCD-based spectrally resolved fluorescence correlation spectroscopy}, journal = {Optics Express}, year = {2010}, volume = {18}, pages = {23818--23828}, number = {23}, month = {Nov}, abstract = {We present an implementation of fluorescence correlation spectroscopy with spectrally resolved detection based on a combined commercial confocal laser scanning/fluorescence correlation spectroscopy microscope. We have replaced the conventional detection scheme by a prism-based spectrometer and an electron-multiplying charge-coupled device camera used to record the photons. This allows us to read out more than 80,000 full spectra per second with a signal-to-noise ratio and a quantum efficiency high enough to allow single photon counting. We can identify up to four spectrally different quantum dots in vitro and demonstrate that spectrally resolved detection can be used to characterize photophysical properties of fluorophores by measuring the spectral dependence of quantum dot fluorescence emission intermittence. Moreover, we can confirm intracellular cross-correlation results as acquired with a conventional setup and show that spectral flexibility can help to optimize the choice of the detection windows.}, doi = {10.1364/OE.18.023818}, keywords = {CCD, charge-coupled device; Confocal microscopy; Spectroscopy, fluorescence and luminescence}, owner = {paul}, publisher = {OSA}, timestamp = {2012.11.07} } @ARTICLE{Blom2009, author = {Blom, Hans and Chmyrov, Andriy and Hassler, Kai and Davis, Lloyd M. 
and Widengren, Jerker}, title = {Triplet-State Investigations of Fluorescent Dyes at Dielectric Interfaces Using Total Internal Reflection Fluorescence Correlation Spectroscopy}, journal = {The Journal of Physical Chemistry A}, year = {2009}, volume = {113}, pages = {5554-5566}, number = {19}, doi = {10.1021/jp8110088}, owner = {paul}, timestamp = {2012.11.02} } @ARTICLE{Blom2002, author = {Hans Blom and Mathias Johansson and Anna-Sara Hedman and Liselotte Lundberg and Anders Hanning and Sverker H{\aa}rd and Rudolf Rigler}, title = {Parallel Fluorescence Detection of Single Biomolecules in Microarrays by a Diffractive-Optical-Designed 2 x 2 Fan-Out Element}, journal = {Applied Optics}, year = {2002}, volume = {41}, pages = {3336--3342}, number = {16}, month = {Jun}, abstract = {We have developed a multifocal diffractive-optical fluorescence correlation spectroscopy system for parallel excitation and detection of single tetramethylrhodamine biomolecules in microarrays. Multifocal excitation was made possible through the use of a 2 {\texttimes} 2 fan-out diffractive-optical element with uniform intensity in all foci. Characterization of the 2 {\texttimes} 2 fan-out diffractive-optical element shows formation of almost perfect Gaussian foci of submicrometer lateral diameter, as analyzed by thermal motion of tetramethylrhodamine dye molecules in solution. Results of parallel excitation and detection in a high-density microarray of circular wells show single-biomolecule sensitivity in all four foci simultaneously.}, doi = {10.1364/AO.41.003336}, keywords = {Diffractive optics; Confocal microscopy; Fluorescence microscopy; Fluorescence, laser-induced}, owner = {paul}, publisher = {OSA}, timestamp = {2012.11.07} } @ARTICLE{Brinkmeier1999, author = {M. Brinkmeier and K. Dörre and J. Stephan and M. Eigen}, title = {Two-beam cross-correlation:  a method to characterize transport phenomena in micrometer-sized structures.}, journal = {Analytical Chemistry}, year = {1999}, volume = {71}, pages = {609--616}, number = {3}, month = {Feb}, abstract = {To determine flow properties, namely, the velocity and angle of the flow in microstructured channels, an experimental realization based on fluorescence correlation spectroscopy is described. For this purpose, two micrometer-sized spatially separated volume elements have been created. The cross-correlation signal from these has been recorded and evaluated mathematically. In addition to previous results, two-beam cross-correlation allows for fast and easy determination of even small (down to 200 μm/s) flow velocities, as well as simultaneous measurement of diffusion properties of single dye molecules within a rather short detection time of 5-100 s and an error rate of less than 20\%. The spatial flow resolution is around 1-2 μm, limited by the diameter of the volume element. Furthermore, vectorial flow data can be obtained and evaluated. A discussion of the theoretical background and an experimental verification of the theoretical results is performed. The feasibility of fast and easy data processing is shown if the flow time is the only desired information. 
Possible applications of this precise and simple method are the determination of transportation effects within artificial microstructures for CE and HPLC, fast chemical kinetics, and high-throughput screening.}, doi = {10.1021/ac980820i}, institution = {Max-Planck-Institut für biophysikalische Chemie, Am Fassberg, D-37077 Göttingen, Germany.}, language = {eng}, medline-pst = {ppublish}, owner = {paul}, pmid = {21662718}, timestamp = {2012.11.07} } @ARTICLE{Brutzer2012, author = {Brutzer, Hergen and Schwarz, Friedrich W. and Seidel, Ralf}, title = {Scanning Evanescent Fields Using a pointlike Light Source and a Nanomechanical DNA Gear}, journal = {Nano Letters}, year = {2012}, volume = {12}, pages = {473-478}, number = {1}, doi = {10.1021/nl203876w}, owner = {paul}, timestamp = {2012.08.09} } @ARTICLE{Buchholz2012, author = {Jan Buchholz and Jan Wolfgang Krieger and G\'{a}bor Mocs\'{a}r and Bal\'{a}zs Kreith and Edoardo Charbon and Gy\"{o}rgy V\'{a}mosi and Udo Kebschull and J\"{o}rg Langowski}, title = {FPGA implementation of a 32x32 autocorrelator array for analysis of fast image series}, journal = {Optics Express}, year = {2012}, volume = {20}, pages = {17767--17782}, number = {16}, month = {Jul}, abstract = {With the evolving technology in CMOS integration, new classes of 2D-imaging detectors have recently become available. In particular, single photon avalanche diode (SPAD) arrays allow detection of single photons at high acquisition rates (≥ 100kfps), which is about two orders of magnitude higher than with currently available cameras. Here we demonstrate the use of a SPAD array for imaging fluorescence correlation spectroscopy (imFCS), a tool to create 2D maps of the dynamics of fluorescent molecules inside living cells. Time-dependent fluorescence fluctuations, due to fluorophores entering and leaving the observed pixels, are evaluated by means of autocorrelation analysis. The multi-τ correlation algorithm is an appropriate choice, as it does not rely on the full data set to be held in memory. Thus, this algorithm can be efficiently implemented in custom logic. We describe a new implementation for massively parallel multi-τ correlation hardware.
Our current implementation can calculate 1024 correlation functions at a resolution of 10 μs in real-time and therefore correlate real-time image streams from high speed single photon cameras with thousands of pixels.}, doi = {10.1364/OE.20.017767}, keywords = {Detectors; Arrays; Cameras; Correlators; Fluorescence microscopy; Three-dimensional microscopy; Spectroscopy, fluorescence and luminescence; Avalanche photodiodes (APDs)}, owner = {paul}, publisher = {OSA}, timestamp = {2012.10.24} } @PHDTHESIS{Burkhardt2010, author = {Burkhardt, Markus}, title = {Electron multiplying CCD – based detection in Fluorescence Correlation Spectroscopy and measurements in living zebrafish embryos}, school = {Biophysics, BIOTEC, Technische Universität Dresden, Tatzberg 47–51, 01307 Dresden, Germany}, year = {2010}, note = {\url{http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-61021}}, owner = {paul}, timestamp = {2012.10.24} } @ARTICLE{Burkhardt:06, author = {Markus Burkhardt and Petra Schwille}, title = {Electron multiplying CCD based detection for spatially resolved fluorescence correlation spectroscopy}, journal = {Optics Express}, year = {2006}, volume = {14}, pages = {5013--5020}, number = {12}, month = {Jun}, abstract = {Fluorescence correlation spectroscopy (FCS) is carried out with an electron multiplying CCD (EMCCD). This new strategy is compared to standard detection by an avalanche photo diode showing good agreement with respect to the resulting autocorrelation curves. Applying different readout modes, a time resolution of 20 {\textmu}s can be achieved, which is sufficient to resolve the diffusion of free dye in solution. The advantages of implementing EMCCD cameras in wide-field ultra low light imaging, as well as in multi-spot confocal laser scanning microscopy, can consequently also be exploited for spatially resolved FCS.
First proof-of-principle FCS measurements with two excitation volumes demonstrate the advantage of the flexible CCD area detection.}, doi = {10.1364/OE.14.005013}, keywords = {CCD, charge-coupled device; Medical optics and biotechnology; Fluorescence, laser-induced}, publisher = {OSA} } @ARTICLE{Chiantia2006, author = {Chiantia, Salvatore and Ries, Jonas and Kahya, Nicoletta and Schwille, Petra}, title = {Combined AFM and Two-Focus SFCS Study of Raft-Exhibiting Model Membranes}, journal = {ChemPhysChem}, year = {2006}, volume = {7}, pages = {2409--2418}, number = {11}, doi = {10.1002/cphc.200600464}, issn = {1439-7641}, keywords = {fluorescent probes, force measurements, membranes, sphingolipids}, owner = {paul}, publisher = {WILEY-VCH Verlag}, timestamp = {2012.10.24} } @ARTICLE{Dertinger2007, author = {Dertinger, Thomas and Pacheco, Victor and von der Hocht, Iris and Hartmann, Rudolf and Gregor, Ingo and Enderlein, Jörg}, title = {Two-Focus Fluorescence Correlation Spectroscopy: A New Tool for Accurate and Absolute Diffusion Measurements}, journal = {ChemPhysChem}, year = {2007}, volume = {8}, pages = {433--443}, number = {3}, doi = {10.1002/cphc.200600638}, issn = {1439-7641}, keywords = {diffusion coefficients, fluorescence spectroscopy, fluorescent dyes, time-resolved spectroscopy}, owner = {paul}, publisher = {WILEY-VCH Verlag}, timestamp = {2012.02.14} } @ARTICLE{Einstein1905, author = {Einstein, A.}, title = {Über die von der molekularkinetischen Theorie der Wärme geforderte Bewegung von in ruhenden Flüssigkeiten suspendierten Teilchen}, journal = {Annalen der Physik}, year = {1905}, volume = {322}, pages = {549--560}, number = {8}, doi = {10.1002/andp.19053220806}, issn = {1521-3889}, owner = {paul}, publisher = {WILEY-VCH Verlag}, timestamp = {2012.11.02} } @ARTICLE{Elson1974, author = {Elson, Elliot L. and Magde, Douglas}, title = {Fluorescence correlation spectroscopy. I. Conceptual basis and theory}, journal = {Biopolymers}, year = {1974}, volume = {13}, pages = {1--27}, number = {1}, doi = {10.1002/bip.1974.360130102}, issn = {1097-0282}, owner = {paul}, publisher = {Wiley Subscription Services, Inc., A Wiley Company}, timestamp = {2012.09.24} } @ARTICLE{Enderlein1999, author = {J\"{o}rg Enderlein and Thomas Ruckstuhl and Stefan Seeger}, title = {Highly Efficient Optical Detection of Surface-Generated Fluorescence}, journal = {Applied Optics}, year = {1999}, volume = {38}, pages = {724--732}, number = {4}, month = {Feb}, abstract = {We present a theoretical study of a new highly efficient system for optical light collection, designed for ultrasensitive fluorescence detection of surface-bound molecules. The main core of the system is a paraboloid glass segment acting as a mirror for collecting the fluorescence. A special feature of the system is its ability to sample not only fluorescence that is emitted below the angle of total internal reflection (the critical angle) but also particularly the light above the critical angle. As shown, this is especially advantageous for collecting the fluorescence of surface-bound molecules. A comparison is made with conventional high-aperture microscope objectives.
Furthermore, it is shown that the system allows not only for highly efficient light collection but also for confocal imaging of the detection region, which is of great importance for rejecting scattered light in potential applications such as the detection of only a few molecules.}, doi = {10.1364/AO.38.000724}, keywords = {Geometric optical design; Microscopy; Detection; Fluorescence microscopy}, owner = {paul}, publisher = {OSA}, timestamp = {2012.11.02} } @ARTICLE{Hansen1998, author = {Hansen, Richard L and Harris, Joel M}, title = {Measuring Reversible Adsorption Kinetics of Small Molecules at Solid/Liquid Interfaces by Total Internal Reflection Fluorescence Correlation Spectroscopy}, journal = {Analytical Chemistry}, year = {1998}, volume = {70}, pages = {4247--4256}, number = {20}, doi = {10.1021/ac980925l}, owner = {paul}, timestamp = {2012.02.14} } @ARTICLE{Hashmi2007, author = {Sara M. Hashmi and Michael Loewenberg and Eric R. Dufresne}, title = {Spatially extended FCS for visualizing and quantifying high-speed multiphase flows in microchannels}, journal = {Optics Express}, year = {2007}, volume = {15}, pages = {6528--6533}, number = {10}, month = {May}, abstract = {We report the development of spatially extended fluorescence correlation spectroscopy for visualizing and quantifying multiphase flows in microchannels. We employ simultaneous detection with a high-speed camera across the width of the channel, enabling investigation of the dynamics of the flow at short time scales. We take advantage of the flow to scan the sample past the fixed illumination, capturing frames up to 100 KHz. At these rates, we can resolve the motion of sub-micron particles at velocities up to the order of 1 cm/s. We visualize flows with kymographs and quantify velocity profiles by cross-correlations within the focal volume. We demonstrate the efficacy of our approach by measuring the depth-resolved velocity profile of suspensions of sub-micron diameter silica particles flowing up to 1.5 mm/s.}, doi = {10.1364/OE.15.006528}, keywords = {Velocimetry; Flow diagnostics; Fluorescence, laser-induced}, owner = {paul}, publisher = {OSA}, timestamp = {2012.11.07} } @ARTICLE{Hassler2005, author = {Hassler, Kai and Anhut, Tiemo and Rigler, Rudolf and G\"{o}sch, Michael and Lasser, Theo}, title = {High Count Rates with Total Internal Reflection Fluorescence Correlation Spectroscopy}, journal = {Biophysical Journal}, year = {2005}, volume = {88}, pages = {L01--L03}, number = {1}, month = jan, doi = {10.1529/biophysj.104.053884}, issn = {0006-3495}, owner = {paul}, publisher = {Cell Press}, refid = {S0006-3495(05)73079-4 DOI - 10.1529/biophysj.104.053884}, timestamp = {2012.05.02} } @ARTICLE{Hassler2005a, author = {Kai Hassler and Marcel Leutenegger and Per Rigler and Ramachandra Rao and Rudolf Rigler and Michael G\"{o}sch and Theo Lasser}, title = {Total internal reflection fluorescence correlation spectroscopy (TIR-FCS) with low background and high count-rate per molecule}, journal = {Optics Express}, year = {2005}, volume = {13}, pages = {7415--7423}, number = {19}, month = {Sep}, abstract = {We designed a fluorescence correlation spectroscopy (FCS) system for measurements on surfaces. The system consists of an objective-type total internal reflection fluorescence (TIRF) microscopy setup, adapted to measure FCS. Here, the fluorescence exciting evanescent wave is generated by epi-illumination through the periphery of a high NA oil-immersion objective. 
The main advantages with respect to conventional FCS systems are an improvement in terms of counts per molecule (cpm) and a high signal to background ratio. This is demonstrated by investigating diffusion as well as binding and release of single molecules on a glass surface. Furthermore, the size and shape of the molecule detection efficiency (MDE) function was calculated, using a wave-vectorial approach and taking into account the influence of the dielectric interface on the emission properties of fluorophores.}, doi = {10.1364/OPEX.13.007415}, keywords = {Spectroscopy, fluorescence and luminescence; Spectroscopy, surface; Fluorescence, laser-induced}, owner = {paul}, publisher = {OSA}, timestamp = {2012.09.21} } @OTHER{HaunertG, author = {Gerhard Haunert}, howpublished = {Personal communication}, note = {Acal BFi Germany GmbH}, owner = {paul}, timestamp = {2012.10.11}, year = {2012} } @ARTICLE{Haupts1998, author = {Haupts, Ulrich and Maiti, Sudipta and Schwille, Petra and Webb, Watt W.}, title = {Dynamics of fluorescence fluctuations in green fluorescent protein observed by fluorescence correlation spectroscopy}, journal = {Proceedings of the National Academy of Sciences}, year = {1998}, volume = {95}, pages = {13573-13578}, number = {23}, abstract = {We have investigated the pH dependence of the dynamics of conformational fluctuations of green fluorescent protein mutants EGFP (F64L/S65T) and GFP-S65T in small ensembles of molecules in solution by using fluorescence correlation spectroscopy (FCS). FCS utilizes time-resolved measurements of fluctuations in the molecular fluorescence emission for determination of the intrinsic dynamics and thermodynamics of all processes that affect the fluorescence. Fluorescence excitation of a bulk solution of EGFP decreases to zero at low pH (pK$_a$ = 5.8) paralleled by a decrease of the absorption at 488 nm and an increase at 400 nm. Protonation of the hydroxyl group of Tyr-66, which is part of the chromophore, induces these changes. When FCS is used the fluctuations in the protonation state of the chromophore are time resolved. The autocorrelation function of fluorescence emission shows contributions from two chemical relaxation processes as well as diffusional concentration fluctuations. The time constant of the fast, pH-dependent chemical process decreases with pH from 300 μs at pH 7 to 45 μs at pH 5, while the time-average fraction of molecules in a nonfluorescent state increases to 80\% in the same range. A second, pH-independent, process with a time constant of 340 μs and an associated fraction of 13\% nonfluorescent molecules is observed between pH 8 and 11, possibly representing an internal proton transfer process and associated conformational rearrangements. The FCS data provide direct measures of the dynamics and the equilibrium properties of the protonation processes.
Thus FCS is a convenient, intrinsically calibrated method for pH measurements in subfemtoliter volumes with nanomolar concentrations of EGFP.}, doi = {10.1073/pnas.95.23.13573}, owner = {paul}, timestamp = {2012.11.01} } @ARTICLE{Haustein2007, author = {Haustein, Elke and Schwille, Petra}, title = {Fluorescence Correlation Spectroscopy: Novel Variations of an Established Technique}, journal = {Annual Review of Biophysics and Biomolecular Structure}, year = {2007}, volume = {36}, pages = {151-169}, number = {1}, doi = {10.1146/annurev.biophys.36.040306.132612}, owner = {paul}, timestamp = {2012.02.14} } @ARTICLE{Helmers2003, author = {Heinz Helmers and Markus Schellenberg}, title = {CMOS vs. CCD sensors in speckle interferometry}, journal = {Optics \& Laser Technology}, year = {2003}, volume = {35}, pages = {587 - 595}, number = {8}, doi = {10.1016/S0030-3992(03)00078-1}, issn = {0030-3992}, keywords = {CCD sensors}, owner = {paul}, timestamp = {2012.10.06} } @ARTICLE{Holekamp2008, author = {Terrence F. Holekamp and Diwakar Turaga and Timothy E. Holy}, title = {Fast Three-Dimensional Fluorescence Imaging of Activity in Neural Populations by Objective-Coupled Planar Illumination Microscopy}, journal = {Neuron}, year = {2008}, volume = {57}, pages = {661 - 672}, number = {5}, doi = {10.1016/j.neuron.2008.01.011}, issn = {0896-6273}, keywords = {SYSBIO}, owner = {paul}, timestamp = {2012.11.13} } @ARTICLE{Humpolickova2006, author = {Jana Humpol\'{i}\v{c}kov\'{a} and Ellen Gielen and Ale\v{s} Benda and Veronika Fagulova and Jo Vercammen and Martin vandeVen and Martin Hof and Marcel Ameloot and Yves Engelborghs}, title = {Probing Diffusion Laws within Cellular Membranes by Z-Scan Fluorescence Correlation Spectroscopy}, journal = {Biophysical Journal}, year = {2006}, volume = {91}, pages = {L23 - L25}, number = {3}, doi = {10.1529/biophysj.106.089474}, issn = {0006-3495}, owner = {paul}, timestamp = {2012.10.25} } @ARTICLE{Jin2004, author = {Jin, S. and Huang, P. and Park, J. and Yoo, J. Y. and Breuer, K. 
S.}, title = {Near-surface velocimetry using evanescent wave illumination}, journal = {Experiments in Fluids}, year = {2004}, volume = {37}, pages = {825-833}, affiliation = {School of Mechanical and Aerospace Engineering Seoul National University Seoul 151-742 Korea}, doi = {10.1007/s00348-004-0870-7}, issn = {0723-4864}, issue = {6}, keyword = {Technik}, owner = {paul}, publisher = {Springer Berlin / Heidelberg}, timestamp = {2012.02.14} } @ARTICLE{Kannan2006, author = {Kannan, Balakrishnan and Har, Jia Yi and Liu, Ping and Maruyama, Ichiro and Ding, Jeak Ling and Wohland, Thorsten}, title = {Electron Multiplying Charge-Coupled Device Camera Based Fluorescence Correlation Spectroscopy}, journal = {Analytical Chemistry}, year = {2006}, volume = {78}, pages = {3444-3451}, number = {10}, doi = {10.1021/ac0600959}, owner = {paul}, timestamp = {2012.11.07} } @INCOLLECTION{Kohl2005, author = {Kohl, Tobias and Schwille, Petra}, title = {Fluorescence Correlation Spectroscopy with Autofluorescent Proteins}, booktitle = {Microscopy Techniques}, publisher = {Springer Berlin / Heidelberg}, year = {2005}, editor = {Rietdorf, Jens}, volume = {95}, series = {Advances in Biochemical Engineering/Biotechnology}, pages = {1316-1317}, affiliation = {Pastor-Sander-Bogen 92 37083 Göttingen Germany}, doi = {10.1007/b102212}, isbn = {978-3-540-23698-6}, keyword = {Chemistry and Materials Science}, owner = {paul}, timestamp = {2012.02.14} } @ARTICLE{Korson1969, author = {Korson, Lawrence and Drost-Hansen, Walter and Millero, Frank J.}, title = {Viscosity of water at various temperatures}, journal = {The Journal of Physical Chemistry}, year = {1969}, volume = {73}, pages = {34-39}, number = {1}, doi = {10.1021/j100721a006}, owner = {paul}, timestamp = {2012.10.29} } @BOOK{LandauLifshitsStatPhys, title = {{Statistical Physics, Third Edition, Part 1: Volume 5 (Course of Theoretical Physics, Volume 5)}}, publisher = {Butterworth-Heinemann}, year = {1980}, author = {Landau, L. D. and Lifshitz, E. M.}, edition = {3}, month = jan, abstract = {{A lucid presentation of statistical physics and thermodynamics which develops from the general principles to give a large number of applications of the theory.}}, citeulike-article-id = {1284487}, citeulike-linkout-0 = {http://www.amazon.ca/exec/obidos/redirect?tag=citeulike09-20\&path=ASIN/0750633727}, citeulike-linkout-1 = {http://www.amazon.de/exec/obidos/redirect?tag=citeulike01-21\&path=ASIN/0750633727}, citeulike-linkout-2 = {http://www.amazon.fr/exec/obidos/redirect?tag=citeulike06-21\&path=ASIN/0750633727}, citeulike-linkout-3 = {http://www.amazon.jp/exec/obidos/ASIN/0750633727}, citeulike-linkout-4 = {http://www.amazon.co.uk/exec/obidos/ASIN/0750633727/citeulike00-21}, citeulike-linkout-5 = {http://www.amazon.com/exec/obidos/redirect?tag=citeulike07-20\&path=ASIN/0750633727}, citeulike-linkout-6 = {http://www.worldcat.org/isbn/0750633727}, citeulike-linkout-7 = {http://books.google.com/books?vid=ISBN0750633727}, citeulike-linkout-8 = {http://www.amazon.com/gp/search?keywords=0750633727\&index=books\&linkCode=qs}, citeulike-linkout-9 = {http://www.librarything.com/isbn/0750633727}, day = {15}, howpublished = {Paperback}, isbn = {0750633727}, keywords = {fermi\_statistics, statistical\_physics}, owner = {paul}, posted-at = {2011-03-03 11:38:41}, priority = {2}, timestamp = {2012.02.03} } @ARTICLE{Leutenegger2012, author = {Marcel Leutenegger and Christian Ringemann and Theo Lasser and Stefan W. 
Hell and Christian Eggeling}, title = {Fluorescence correlation spectroscopy with a total internal reflection fluorescence STED microscope (TIRF-STED-FCS)}, journal = {Optics Express}, year = {2012}, volume = {20}, pages = {5243--5263}, number = {5}, month = {Feb}, abstract = {We characterize a novel fluorescence microscope which combines the high spatial discrimination of a total internal reflection epi-fluorescence (epi-TIRF) microscope with that of stimulated emission depletion (STED) nanoscopy. This combination of high axial confinement and dynamic-active lateral spatial discrimination of the detected fluorescence emission promises imaging and spectroscopy of the structure and function of cell membranes at the macro-molecular scale. Following a full theoretical description of the sampling volume and the recording of images of fluorescent beads, we exemplify the performance and limitations of the TIRF-STED nanoscope with particular attention to the polarization state of the laser excitation light. We demonstrate fluorescence correlation spectroscopy (FCS) with the TIRF-STED nanoscope by observing the diffusion of dye molecules in aqueous solutions and of fluorescent lipid analogs in supported lipid bilayers in the presence of background signal. The nanoscope reduced the out-of-focus background signal. A lateral resolution down to 40--50 nm was attained which was ultimately limited by the low lateral signal-to-background ratio inherent to the confocal epi-TIRF scheme. Together with the estimated axial confinement of about 55 nm, our TIRF-STED nanoscope achieved an almost isotropic and less than 1 attoliter small all-optically induced measurement volume.}, doi = {10.1364/OE.20.005243}, keywords = {Diffraction; Fluorescence microscopy; Fluorescence}, owner = {paul}, publisher = {OSA}, timestamp = {2012.09.21} } @ARTICLE{Lieto2003a, author = {Lieto, Alena M. and Cush, Randall C. and Thompson, Nancy L.}, title = {Ligand-Receptor Kinetics Measured by Total Internal Reflection with Fluorescence Correlation Spectroscopy}, journal = {Biophysical Journal}, year = {2003}, volume = {85}, pages = {3294--3302}, number = {5}, month = nov, doi = {10.1016/S0006-3495(03)74748-1}, issn = {0006-3495}, owner = {paul}, publisher = {Cell Press}, refid = {S0006-3495(03)74748-1 DOI - 10.1016/S0006-3495(03)74748-1}, timestamp = {2012.09.21} } @ARTICLE{Lieto2003, author = {Lieto, Alena M. and Lagerholm, B. Christoffer and Thompson, Nancy L.}, title = {Lateral Diffusion from Ligand Dissociation and Rebinding at Surfaces†}, journal = {Langmuir}, year = {2003}, volume = {19}, pages = {1782-1787}, number = {5}, doi = {10.1021/la0261601}, owner = {paul}, timestamp = {2012.02.14} } @ARTICLE{Lieto2004, author = {Alena M. Lieto and Nancy L. Thompson}, title = {Total Internal Reflection with Fluorescence Correlation Spectroscopy: Nonfluorescent Competitors}, journal = {Biophysical Journal}, year = {2004}, volume = {87}, pages = {1268 - 1278}, number = {2}, doi = {10.1529/biophysj.103.035030}, issn = {0006-3495}, owner = {paul}, timestamp = {2012.02.14} } @ARTICLE{Magde1972, author = {Magde, Douglas and Elson, Elliot and Webb, W. 
W.}, title = {Thermodynamic Fluctuations in a Reacting System - Measurement by Fluorescence Correlation Spectroscopy}, journal = {Physical Review Letters}, year = {1972}, volume = {29}, pages = {705--708}, month = {Sep}, doi = {10.1103/PhysRevLett.29.705}, issue = {11}, owner = {paul}, publisher = {American Physical Society}, timestamp = {2012.11.01} } @ARTICLE{Magde1974, author = {Magde, Douglas and Elson, Elliot L. and Webb, Watt W.}, title = {Fluorescence correlation spectroscopy. II. An experimental realization}, journal = {Biopolymers}, year = {1974}, volume = {13}, pages = {29--61}, number = {1}, doi = {10.1002/bip.1974.360130103}, issn = {1097-0282}, owner = {paul}, publisher = {Wiley Subscription Services, Inc., A Wiley Company}, timestamp = {2012.09.21} } @ARTICLE{Nitsche2004, author = {Johannes M. Nitsche and Hou-Chien Chang and Paul A. Weber and Bruce J. Nicholson}, title = {A Transient Diffusion Model Yields Unitary Gap Junctional Permeabilities from Images of Cell-to-Cell Fluorescent Dye Transfer Between Xenopus Oocytes}, journal = {Biophysical Journal}, year = {2004}, volume = {86}, pages = {2058 - 2077}, number = {4}, doi = {10.1016/S0006-3495(04)74267-8}, issn = {0006-3495}, owner = {paul}, timestamp = {2012.11.08} } @ARTICLE{Ohsugi2009, author = {Ohsugi, Yu and Kinjo, Masataka}, title = {Multipoint fluorescence correlation spectroscopy with total internal reflection fluorescence microscope}, journal = {Journal of Biomedical Optics}, year = {2009}, volume = {14}, pages = {014030-014030-4}, number = {1}, doi = {10.1117/1.3080723}, owner = {paul}, timestamp = {2012.11.12} } @ARTICLE{Ohsugi2006, author = {Ohsugi, Yu and Saito, Kenta and Tamura, Mamoru and Kinjo, Masataka}, title = {Lateral mobility of membrane-binding proteins in living cells measured by total internal reflection fluorescence correlation spectroscopy.}, journal = {Biophysical Journal}, year = {2006}, volume = {91}, pages = {3456--3464}, number = {9}, doi = {10.1529/biophysj.105.074625}, owner = {paul}, publisher = {Biophysical Society}, timestamp = {2012.02.14} } @ARTICLE{Palmer1987, author = {A. G. Palmer and N. L. Thompson}, title = {Theory of sample translation in fluorescence correlation spectroscopy.}, journal = {Biophysical Journal}, year = {1987}, volume = {51}, pages = {339--343}, number = {2}, month = {Feb}, abstract = {New applications of the technique of fluorescence correlation spectroscopy (FCS) require lateral translation of the sample through a focused laser beam (Peterson, N.O., D.C. Johnson, and M.J. Schlesinger, 1986, Biophys. J., 49:817-820). Here, the effect of sample translation on the shape of the FCS autocorrelation function is examined in general. It is found that if the lateral diffusion coefficients of the fluorescent species obey certain conditions, then the FCS autocorrelation function is a simple product of one function that depends only on transport coefficients and another function that depends only on the rate constants of chemical reactions that occur in the sample. 
This simple form should allow manageable data analyses in new FCS experiments that involve sample translation.}, doi = {10.1016/S0006-3495(87)83340-4}, keywords = {Kinetics; Lasers; Mathematics; Models, Theoretical; Spectrometry, Fluorescence, methods}, language = {eng}, medline-pst = {ppublish}, owner = {paul}, pii = {S0006-3495(87)83340-4}, pmid = {3828464}, timestamp = {2012.11.02} } @ARTICLE{Pero2006-06, author = {Pero, JK and Haas, EM and Thompson, NL}, title = {Size dependence of protein diffusion very close to membrane surfaces: measurement by total internal reflection with fluorescence correlation spectroscopy.}, journal = {The Journal of Physical Chemistry. B}, year = {2006}, volume = {110}, pages = {10910-8}, number = {5}, doi = {10.1021/jp056990y}, issn = {1520-6106}, owner = {paul}, timestamp = {2012.09.21} } @ARTICLE{Petrasek2008, author = {Petr\'{a}\v{s}ek, Zden\v{e}k and Schwille, Petra}, title = {Precise Measurement of Diffusion Coefficients using Scanning Fluorescence Correlation Spectroscopy}, journal = {Biophysical Journal}, year = {2008}, volume = {94}, pages = {1437--1448}, number = {4}, month = feb, doi = {10.1529/biophysj.107.108811}, issn = {0006-3495}, owner = {paul}, publisher = {Cell Press}, refid = {S0006-3495(08)70660-X DOI - 10.1529/biophysj.107.108811}, timestamp = {2012.05.20} } @INCOLLECTION{Petrov:2008, author = {Petrov, E. P. and Schwille, P.}, title = {State of the Art and Novel Trends in Fluorescence Correlation Spectroscopy}, booktitle = {Standardization and Quality Assurance in Fluorescence Measurements II}, publisher = {Springer Berlin Heidelberg}, year = {2008}, editor = {Resch-Genger, Ute}, volume = {6}, series = {Springer Series on Fluorescence}, pages = {145-197}, affiliation = {Biophysics, BIOTEC, Technische Universität Dresden, Tatzberg 47–51, 01307 Dresden, Germany}, doi = {10.1007/4243_2008_032}, isbn = {978-3-540-70571-0}, keyword = {Chemistry} } @ARTICLE{Qian1991, author = {Hong Qian and Elliot L. Elson}, title = {Analysis of confocal laser-microscope optics for 3-D fluorescence correlation spectroscopy}, journal = {Applied Optics}, year = {1991}, volume = {30}, pages = {1185--1195}, number = {10}, month = {Apr}, abstract = {Quantitative fluorescence correlation spectroscopy (FCS) and fluorescence photobleaching recovery (FPR) measurements in bulk solution require a well characterized confocal laser microscope optical system. The introduction of a characteristic function, the collection efficiency function (CEF), provides a quantitative theoretical analysis of this system, which yields an interpretation of the FCS and FPR measurements in three dimensions. We demonstrate that when the proper field diaphragm is introduced, the 3-D FCS measurements can be mimicked by a 2-D theory with only minor error. The FPR characteristic recovery time for diffusion is expected to be slightly longer than the corresponding time measured by FCS in the same conditions. This is because the profile of the laser beam used for photobleaching is not affected by the field diaphragm. The CEF is also important for quantitative analysis of standard scanning confocal microscopy when it is carried out using a finite detection pinhole.}, doi = {10.1364/AO.30.001185}, owner = {paul}, publisher = {OSA}, timestamp = {2012.11.02} } @ELECTRONIC{ImageJ, author = {Rasband, W.S.}, year = {1997-2012}, title = {ImageJ}, organization = {U. S. 
National Institutes of Health}, note = {\url{http://imagej.nih.gov/ij/}}, owner = {paul}, timestamp = {2012.10.16} } @ARTICLE{Richter2006, author = {Richter, Ralf P. and Bérat, Rémi and Brisson, Alain R.}, title = {Formation of Solid-Supported Lipid Bilayers:  An Integrated View}, journal = {Langmuir}, year = {2006}, volume = {22}, pages = {3497-3505}, number = {8}, doi = {10.1021/la052687c}, owner = {paul}, timestamp = {2012.11.12} } @PHDTHESIS{Ries:08, author = {Ries, E.}, title = {Advanced Fluorescence Correlation Techniques to Study Membrane Dynamics}, school = {Biophysics, BIOTEC, Technische Universität Dresden, Tatzberg 47–51, 01307 Dresden, Germany}, year = {2008}, note = {\url{http://nbn-resolving.de/urn:nbn:de:bsz:14-ds-1219846317196-73420}} } @ARTICLE{Ries2009, author = {Jonas Ries and Salvatore Chiantia and Petra Schwille}, title = {Accurate Determination of Membrane Dynamics with Line-Scan FCS}, journal = {Biophysical Journal}, year = {2009}, volume = {96}, pages = {1999 - 2008}, number = {5}, doi = {10.1016/j.bpj.2008.12.3888}, issn = {0006-3495}, owner = {paul}, timestamp = {2012.11.08} } @ARTICLE{Ries2008390, author = {Jonas Ries and Eugene P. Petrov and Petra Schwille}, title = {Total Internal Reflection Fluorescence Correlation Spectroscopy: Effects of Lateral Diffusion and Surface-Generated Fluorescence}, journal = {Biophysical Journal}, year = {2008}, volume = {95}, pages = {390 - 399}, number = {1}, doi = {10.1529/biophysj.107.126193}, issn = {0006-3495} } @ARTICLE{Ries2008, author = {Ries, Jonas and Schwille, Petra}, title = {New concepts for fluorescence correlation spectroscopy on membranes}, journal = {Physical Chemistry Chemical Physics}, year = {2008}, volume = {10}, pages = {--}, number = {24}, abstract = {Fluorescence correlation spectroscopy (FCS) is a powerful tool to measure useful physical quantities such as concentrations, diffusion coefficients, diffusion modes or binding parameters, both in model and cell membranes. However, it can suffer from severe artifacts, especially in non-ideal systems. Here we assess the potential and limitations of standard confocal FCS on lipid membranes and present recent developments which facilitate accurate and quantitative measurements on such systems. In particular, we discuss calibration-free diffusion and concentration measurements using z-scan FCS and two focus FCS and present several approaches using scanning FCS to accurately measure slow dynamics. We also show how surface confined FCS enables the study of membrane dynamics even in presence of a strong cytosolic background and how FCS with a variable detection area can reveal submicroscopic heterogeneities in cell membranes.}, issn = {1463-9076}, owner = {paul}, publisher = {The Royal Society of Chemistry}, timestamp = {2012.02.14} } @ARTICLE{Rigler1993, author = {Rigler, R. and Mets, {\"U}. and Widengren, J. and Kask, P.}, title = {Fluorescence correlation spectroscopy with high count rate and low background: analysis of translational diffusion}, journal = {European Biophysics Journal}, year = {1993}, volume = {22}, pages = {169-175}, doi = {10.1007/BF00185777}, issn = {0175-7571}, issue = {3}, keywords = {Fluorescence correlation spectroscopy; Fluorescence intensity fluctuations; Translational diffusion; Epifluorescence microscope; Silicon photon counter}, language = {English}, owner = {paul}, publisher = {Springer-Verlag}, timestamp = {2012.11.02} } @ARTICLE{Ruan2004, author = {Ruan, Qiaoqiao and Cheng, Melanie A. 
and Levi, Moshe and Gratton, Enrico and Mantulin, William W.}, title = {Spatial-Temporal Studies of Membrane Dynamics: Scanning Fluorescence Correlation Spectroscopy (SFCS)}, journal = {Biophysical Journal}, year = {2004}, volume = {87}, pages = {1260--1267}, number = {2}, month = aug, issn = {0006-3495}, owner = {paul}, publisher = {Cell Press}, refid = {S0006-3495(04)73605-X DOI - 10.1529/biophysj.103.036483}, timestamp = {2012.02.14} } @ARTICLE{Sankaran2009, author = {Sankaran, Jagadish and Manna, Manoj and Guo, Lin and Kraut, Rachel and Wohland, Thorsten}, title = {Diffusion, Transport, and Cell Membrane Organization Investigated by Imaging Fluorescence Cross-Correlation Spectroscopy}, journal = {Biophysical Journal}, year = {2009}, volume = {97}, pages = {2630--2639}, number = {9}, month = nov, doi = {10.1016/j.bpj.2009.08.025}, issn = {0006-3495}, owner = {paul}, publisher = {Cell Press}, refid = {S0006-3495(09)01387-3 DOI - 10.1016/j.bpj.2009.08.025}, timestamp = {2012.09.21} } @ARTICLE{Sankaran2010, author = {Jagadish Sankaran and Xianke Shi and Liang Yoong Ho and Ernst H K Stelzer and Thorsten Wohland}, title = {ImFCS: a software for imaging FCS data analysis and visualization.}, journal = {Optics Express}, year = {2010}, volume = {18}, pages = {25468--25481}, number = {25}, month = {Dec}, abstract = {The multiplexing of fluorescence correlation spectroscopy (FCS), especially in imaging FCS using fast, sensitive array detectors, requires the handling of large amounts of data. One can easily collect in excess of 100,000 FCS curves a day, too many to be treated manually. Therefore, ImFCS, an open-source software which relies on standard image files was developed and provides a wide range of options for the calculation of spatial and temporal auto- and cross-correlations, as well as differences in Cross-Correlation Functions (ΔCCF). ImFCS permits fitting of standard models to correlation functions and provides optimized histograms of fitted parameters. Applications include the measurement of diffusion and flow with Imaging Total Internal Reflection FCS (ITIR-FCS) and Single Plane Illumination Microscopy FCS (SPIM-FCS) in biologically relevant samples. As a compromise between ITIR-FCS and SPIM-FCS, we extend the applications to Imaging Variable Angle-FCS (IVA-FCS) where sub-critical oblique illumination provides sample sectioning close to the cover slide.}, doi = {10.1364/OE.18.025468}, institution = {Singapore-MIT Alliance, National University of Singapore, E4-04-10, 4 Engineering Drive 3, 117576 Singapore.}, keywords = {Algorithms; Pattern Recognition, Automated, methods; Software; Spectrometry, Fluorescence, methods}, language = {eng}, medline-pst = {ppublish}, owner = {paul}, pii = {208325}, pmid = {21164894}, timestamp = {2012.10.24} } @ARTICLE{SbalzariniSPT, author = {I. F. Sbalzarini and P. Koumoutsakos}, title = {Feature Point Tracking and Trajectory Analysis for Video Imaging in Cell Biology}, journal = {Journal of Structural Biology}, year = {2005}, volume = {151(2)}, pages = {182-195}, doi = {10.1016/j.jsb.2005.06.002}, owner = {paul}, timestamp = {2012.10.16} } @ARTICLE{Schwille2000, author = {Schwille, Petra and Kummer, Susanne and Heikal, Ahmed A. and Moerner, W. E. 
and Webb, Watt W.}, title = {Fluorescence correlation spectroscopy reveals fast optical excitation-driven intramolecular dynamics of yellow fluorescent proteins}, journal = {Proceedings of the National Academy of Sciences}, year = {2000}, volume = {97}, pages = {151-156}, number = {1}, abstract = {Fast excitation-driven fluctuations in the fluorescence emission of yellow-shifted green fluorescent protein mutants T203Y and T203F, with S65G/S72A, are discovered in the 10$^{-6}$–10$^{-3}$ s time range, by using fluorescence correlation spectroscopy at 10$^{-8}$ M. This intensity-dependent flickering is conspicuous at high pH, with rate constants independent of pH and viscosity with a minor temperature effect. The mean flicker rate increases linearly with excitation intensity for at least three decades, but the mean dark fraction of the molecules undergoing these dynamics is independent of illumination intensity over ≈6 × 10$^2$ to 5 × 10$^6$ W/cm$^2$. These results suggest that optical excitation establishes an equilibration between two molecular states of different spectroscopic properties that are coupled only via the excited state as a gateway. This reversible excitation-driven transition has a quantum efficiency of ≈10$^{-3}$. Dynamics of external protonation, reversibly quenching the fluorescence, are also observed at low pH in the 10- to 100-μs time range. The independence of these two bright–dark flicker processes implies the existence of at least two separate dark states of these green fluorescent protein mutants. Time-resolved fluorescence measurements reveal a single exponential decay of the excited state population with 3.8-ns lifetime, after 500-nm excitation, that is pH independent. Our fluorescence correlation spectroscopy results are discussed in terms of recent theoretical studies that invoke isomerization of the chromophore as a nonradiative channel of the excited state relaxation.}, doi = {10.1073/pnas.97.1.151}, owner = {paul}, timestamp = {2012.09.24} } @ARTICLE{Schwille1997, author = {Schwille, P. and Meyer-Almes, F.J. and Rigler, R.}, title = {Dual-color fluorescence cross-correlation spectroscopy for multicomponent diffusional analysis in solution}, journal = {Biophysical Journal}, year = {1997}, volume = {72}, pages = {1878--1886}, number = {4}, month = apr, issn = {0006-3495}, owner = {paul}, publisher = {Cell Press}, refid = {S0006-3495(97)78833-7 DOI - 10.1016/S0006-3495(97)78833-7}, timestamp = {2012.02.14} } @ARTICLE{Schatzel1990, author = {K. Sch{\"a}tzel}, title = {Noise on photon correlation data. I. Autocorrelation functions}, journal = {Quantum Optics: Journal of the European Optical Society Part B}, year = {1990}, volume = {2}, pages = {287}, number = {4}, abstract = {An adequate analysis of photon correlation data requires knowledge about the statistical accuracy of the measured data. For the model of gamma-distributed intensities, that is including the effect of a finite intercept, the full covariance matrix is calculated for all the channels of the photon autocorrelation functions. A thorough discussion of multiple sample time correlation illuminates the importance of temporal averaging effects at large lag times. A practical estimation scheme is given for the noise in photon correlation data from a multiple sample time measurement.}, doi = {10.1088/0954-8998/2/4/002}, owner = {paul}, timestamp = {2012.11.02} } @ARTICLE{Scomparin2009, author = {Scomparin, C. and Lecuyer, S. and Ferreira, M. and Charitat, T.
and Tinland, B.}, title = {Diffusion in supported lipid bilayers: Influence of substrate and preparation technique on the internal dynamics}, journal = {The European Physical Journal E: Soft Matter and Biological Physics}, year = {2009}, volume = {28}, pages = {211-220}, affiliation = {CNRS UPR 3118 CINAM 13288 Marseille Cedex 09 France}, doi = {10.1140/epje/i2008-10407-3}, issn = {1292-8941}, issue = {2}, keyword = {Physik und Astronomie}, owner = {paul}, publisher = {Springer Berlin / Heidelberg}, timestamp = {2012.10.22} } @ARTICLE{Seu2007, author = {Seu, Kalani J. and Pandey, Anjan P. and Haque, Farzin and Proctor, Elizabeth A. and Ribbe, Alexander E. and Hovis, Jennifer S.}, title = {Effect of Surface Treatment on Diffusion and Domain Formation in Supported Lipid Bilayers}, journal = {Biophysical Journal}, year = {2007}, volume = {92}, pages = {2445--2450}, number = {7}, month = apr, doi = {10.1529/biophysj.106.099721}, issn = {0006-3495}, owner = {paul}, publisher = {Cell Press}, refid = {S0006-3495(07)71049-4 DOI - 10.1529/biophysj.106.099721}, timestamp = {2012.10.22} } @ARTICLE{Shannon1984, author = {Shannon, C.E.}, title = {Communication in the presence of noise}, journal = {Proceedings of the IEEE}, year = {1984}, volume = {72}, pages = { 1192 - 1201}, number = {9}, month = {sept.}, doi = {10.1109/PROC.1984.12998}, issn = {0018-9219}, owner = {paul}, timestamp = {2012.11.12} } @ARTICLE{Skinner2005, author = {Joseph P Skinner and Yan Chen and Joachim D Müller}, title = {Position-sensitive scanning fluorescence correlation spectroscopy.}, journal = {Biophysical Journal}, year = {2005}, volume = {89}, pages = {1288--1301}, number = {2}, month = {Aug}, abstract = {Fluorescence correlation spectroscopy (FCS) uses a stationary laser beam to illuminate a small sample volume and analyze the temporal behavior of the fluorescence fluctuations within the stationary observation volume. In contrast, scanning FCS (SFCS) collects the fluorescence signal from a moving observation volume by scanning the laser beam. The fluctuations now contain both temporal and spatial information about the sample. To access the spatial information we synchronize scanning and data acquisition. Synchronization allows us to evaluate correlations for every position along the scanned trajectory. We use a circular scan trajectory in this study. Because the scan radius is constant, the phase angle is sufficient to characterize the position of the beam. We introduce position-sensitive SFCS (PSFCS), where correlations are calculated as a function of lag time and phase. We present the theory of PSFCS and derive expressions for diffusion, diffusion in the presence of flow, and for immobilization. To test PSFCS we compare experimental data with theory. We determine the direction and speed of a flowing dye solution and the position of an immobilized particle. To demonstrate the feasibility of the technique for applications in living cells we present data of enhanced green fluorescent protein measured in the nucleus of COS cells.}, doi = {10.1529/biophysj.105.060749}, institution = {School of Physics and Astronomy, University of Minnesota, Minneapolis, 55455, USA. 
josephs@physics.umn.edu}, keywords = {Algorithms; Image Enhancement, methods; Image Interpretation, Computer-Assisted, methods; Information Storage and Retrieval, methods; Microscopy, Confocal, methods; Reproducibility of Results; Sensitivity and Specificity; Spectrometry, Fluorescence, methods}, language = {eng}, medline-pst = {ppublish}, owner = {paul}, pii = {S0006-3495(05)72776-4}, pmid = {15894645}, timestamp = {2012.10.28} } @ARTICLE{Starr2001, author = {Tammy E. Starr and Nancy L. Thompson}, title = {Total Internal Reflection with Fluorescence Correlation Spectroscopy: Combined Surface Reaction and Solution Diffusion}, journal = {Biophysical Journal}, year = {2001}, volume = {80}, pages = {1575 - 1584}, number = {3}, doi = {10.1016/S0006-3495(01)76130-9}, issn = {0006-3495} } @ARTICLE{Sutherland1905, author = {Sutherland, William}, title = {A dynamical theory of diffusion for non-electrolytes and the molecular mass of albumin}, journal = {Philosophical Magazine Series 6}, year = {1905}, volume = {9}, pages = {781-785}, number = {54}, __markedentry = {[paul]}, doi = {10.1080/14786440509463331}, owner = {paul}, timestamp = {2012.11.14} } @ARTICLE{Tamm1985, author = {Tamm, L.K. and McConnell, H.M.}, title = {Supported phospholipid bilayers}, journal = {Biophysical Journal}, year = {1985}, volume = {47}, pages = {105--113}, number = {1}, month = jan, doi = {10.1016/S0006-3495(85)83882-0}, issn = {0006-3495}, owner = {paul}, publisher = {Cell Press}, refid = {S0006-3495(85)83882-0 DOI - 10.1016/S0006-3495(85)83882-0}, timestamp = {2012.10.29} } @INCOLLECTION{Thomps:bookFCS2002, author = {Thompson, Nancy}, title = {Fluorescence Correlation Spectroscopy}, booktitle = {Topics in Fluorescence Spectroscopy}, publisher = {Springer US}, year = {2002}, editor = {Lakowicz, Joseph and Geddes, Chris D. and Lakowicz, Joseph R.}, volume = {1}, series = {Topics in Fluorescence Spectroscopy}, pages = {337-378}, affiliation = {University of North Carolina at Chapel Hill Department of Chemistry Chapel Hill North Carolina 27599-3290 USA}, doi = {10.1007/0-306-47057-8_6}, isbn = {978-0-306-47057-8}, keyword = {Biomedical and Life Sciences}, owner = {paul}, timestamp = {2012.01.10} } @ARTICLE{Thompson1983, author = {N.L. Thompson and D. Axelrod}, title = {Immunoglobulin surface-binding kinetics studied by total internal reflection with fluorescence correlation spectroscopy}, journal = {Biophysical Journal}, year = {1983}, volume = {43}, pages = {103 - 114}, number = {1}, doi = {10.1016/S0006-3495(83)84328-8}, issn = {0006-3495}, owner = {paul}, timestamp = {2012.02.14} } @ARTICLE{Thompson1981, author = {N.L. Thompson and T.P. Burghardt and D. Axelrod}, title = {Measuring surface dynamics of biomolecules by total internal reflection fluorescence with photobleaching recovery or correlation spectroscopy}, journal = {Biophysical Journal}, year = {1981}, volume = {33}, pages = {435 - 454}, number = {3}, doi = {10.1016/S0006-3495(81)84905-3}, issn = {0006-3495}, owner = {paul}, timestamp = {2012.02.14} } @ARTICLE{Thompson1997, author = {Thompson, Nancy L. and Drake, Andrew W. 
and Chen, Lixin and Broek, Willem Vanden}, title = {Equilibrium, Kinetics, Diffusion and Self-Association of Proteins at Membrane Surfaces: Measurement by Total Internal Reflection Fluorescence Microscopy}, journal = {Photochemistry and Photobiology}, year = {1997}, volume = {65}, pages = {39--46}, number = {1}, doi = {10.1111/j.1751-1097.1997.tb01875.x}, issn = {1751-1097}, owner = {paul}, publisher = {Blackwell Publishing Ltd}, timestamp = {2012.02.14} } @ARTICLE{Thompson1997a, author = {Nancy L Thompson and B Christoffer Lagerholm}, title = {Total internal reflection fluorescence: applications in cellular biophysics}, journal = {Current Opinion in Biotechnology}, year = {1997}, volume = {8}, pages = {58 - 64}, number = {1}, doi = {10.1016/S0958-1669(97)80158-9}, issn = {0958-1669}, owner = {paul}, timestamp = {2012.02.14} } @ARTICLE{Toomre2001, author = {Derek Toomre and Dietmar J. Manstein}, title = {Lighting up the cell surface with evanescent wave microscopy}, journal = {Trends in Cell Biology}, year = {2001}, volume = {11}, pages = {298 - 303}, number = {7}, doi = {10.1016/S0962-8924(01)02027-X}, issn = {0962-8924}, keywords = {green-fluorescent protein (GFP)}, owner = {paul}, timestamp = {2012.02.14} } @ARTICLE{Unruh2008, author = {Unruh, Jay R. and Gratton, Enrico}, title = {Analysis of Molecular Concentration and Brightness from Fluorescence Fluctuation Data with an Electron Multiplied CCD Camera}, journal = {Biophysical Journal}, year = {2008}, volume = {95}, pages = {5385--5398}, number = {11}, month = dec, doi = {10.1529/biophysj.108.130310}, issn = {0006-3495}, owner = {paul}, publisher = {Cell Press}, refid = {S0006-3495(08)78962-8 DOI - 10.1529/biophysj.108.130310}, timestamp = {2012.09.21} } @ARTICLE{Vacha2009, author = {V\'{a}cha, Robert and Siu, Shirley W. I. and Petrov, Michal and Böckmann, Rainer A. and Barucha-Kraszewska, Justyna and Jurkiewicz, Piotr and Hof, Martin and Berkowitz, Max L. and Jungwirth, Pavel}, title = {Effects of Alkali Cations and Halide Anions on the DOPC Lipid Membrane}, journal = {The Journal of Physical Chemistry A}, year = {2009}, volume = {113}, pages = {7235-7243}, number = {26}, note = {PMID: 19290591}, doi = {10.1021/jp809974e}, owner = {paul}, timestamp = {2012.10.24} } @ELECTRONIC{VisserRol, author = {G. Visser and J. Rolinski}, year = {2010}, title = {Basic Photophysics}, note = {Photobiological Sciences Online (KC Smith, ed.) 
American Society for Photobiology \url{http://www.photobiology.info}.}, owner = {paul}, timestamp = {2012.02.14} } @ARTICLE{Widengren1995, author = {Widengren, Jerker and Mets, {\"U}lo and Rigler, Rudolf}, title = {Fluorescence correlation spectroscopy of triplet states in solution: a theoretical and experimental study}, journal = {The Journal of Physical Chemistry}, year = {1995}, volume = {99}, pages = {13368-13379}, number = {36}, doi = {10.1021/j100036a009}, owner = {paul}, timestamp = {2012.02.20} } @ARTICLE{Widengren1994, author = {Widengren, Jerker and Rigler, Rudolf and Mets, {\"U}lo}, title = {Triplet-state monitoring by fluorescence correlation spectroscopy}, journal = {Journal of Fluorescence}, year = {1994}, volume = {4}, pages = {255-258}, affiliation = {Department of Medical Biochemistry and Biophysics Karolinska Institute S-171 77 Stockholm Sweden}, doi = {10.1007/BF01878460}, issn = {1053-0509}, issue = {3}, keyword = {Biomedical and Life Sciences}, owner = {paul}, publisher = {Springer Netherlands}, timestamp = {2012.09.24} } @ARTICLE{Wohland2001, author = {Wohland, Thorsten and Rigler, Rudolf and Vogel, Horst}, title = {The Standard Deviation in Fluorescence Correlation Spectroscopy}, journal = {Biophysical Journal}, year = {2001}, volume = {80}, pages = {2987--2999}, number = {6}, month = jun, doi = {10.1016/S0006-3495(01)76264-9}, issn = {0006-3495}, owner = {paul}, timestamp = {2012.09.08} } @ARTICLE{Wohland2010, author = {Thorsten Wohland and Xianke Shi and Jagadish Sankaran and Ernst H.K. Stelzer}, title = {Single Plane Illumination Fluorescence Correlation Spectroscopy (SPIM-FCS) probes inhomogeneous three-dimensional environments}, journal = {Optics Express}, year = {2010}, volume = {18}, pages = {10627--10641}, number = {10}, month = {May}, abstract = {The life sciences require new highly sensitive imaging tools, which allow the quantitative measurement of molecular parameters within a physiological three-dimensional (3D) environment. Therefore, we combined single plane illumination microscopy (SPIM) with camera based fluorescence correlation spectroscopy (FCS). SPIM-FCS provides contiguous particle number and diffusion coefficient images with a high spatial resolution in homo- and heterogeneous 3D specimens and live zebrafish embryos. Our SPIM-FCS recorded up to 4096 spectra within 56 seconds at a laser power of 60~µW without damaging the embryo. This new FCS modality provides more measurements per time and more, less photo-toxic measurements per sample than confocal based methods. In essence, SPIM-FCS offers new opportunities to observe biomolecular interactions quantitatively and functions in a highly multiplexed manner within a physiologically relevant 3D environment.}, doi = {10.1364/OE.18.010627}, keywords = {Fluorescence microscopy; Three-dimensional microscopy; Spectroscopy, fluorescence and luminescence}, owner = {paul}, publisher = {OSA}, timestamp = {2012.11.07} } @ARTICLE{Yordanov2009, author = {Stoyan Yordanov and Andreas Best and Hans-J\"{u}rgen Butt and Kaloian Koynov}, title = {Direct studies of liquid flows near solid surfaces by total internal reflection fluorescence cross-correlation spectroscopy}, journal = {Optics Express}, year = {2009}, volume = {17}, pages = {21149--21158}, number = {23}, month = {Nov}, abstract = {We present a new method to study flow of liquids near solid surface: Total internal reflection fluorescence cross-correlation spectroscopy (TIR-FCCS).
Fluorescent tracers flowing with the liquid are excited by evanescent light, produced by epi-illumination through the periphery of a high numerical aperture oil-immersion objective. The time-resolved fluorescence intensity signals from two laterally shifted observation volumes, created by two confocal pinholes are independently measured. The cross-correlation of these signals provides information of the tracers' velocities. By changing the evanescent wave penetration depth, flow profiling at distances less than 200 nm from the interface can be performed. Due to the high sensitivity of the method fluorescent species with different size, down to single dye molecules can be used as tracers. We applied this method to study the flow of aqueous electrolyte solutions near a smooth hydrophilic surface and explored the effect of several important parameters, e.g. tracer size, ionic strength, and distance between the observation volumes.}, doi = {10.1364/OE.17.021149}, keywords = {Velocimetry; Fluorescence, laser-induced; Spectroscopy, surface}, owner = {paul}, publisher = {OSA}, timestamp = {2012.09.21} } @ARTICLE{Yordanov2011, author = {Stoyan Yordanov and Andreas Best and Klaus Weisshart and Kaloian Koynov}, title = {Note: An easy way to enable total internal reflection-fluorescence correlation spectroscopy (TIR-FCS) by combining commercial devices for FCS and TIR microscopy}, journal = {Review of Scientific Instruments}, year = {2011}, volume = {82}, pages = {036105}, number = {3}, eid = {036105}, doi = {10.1063/1.3557412}, keywords = {fluorescence spectroscopy; optical microscopy}, numpages = {3}, owner = {paul}, publisher = {AIP}, timestamp = {2012.05.02} } @ARTICLE{Zhang2007, author = {Bo Zhang and Josiane Zerubia and Jean-Christophe Olivo-Marin}, title = {Gaussian approximations of fluorescence microscope point-spread function models}, journal = {Applied Optics}, year = {2007}, volume = {46}, pages = {1819--1829}, number = {10}, month = {Apr}, abstract = {We comprehensively study the least-squares Gaussian approximations of the diffraction-limited 2D-3D paraxial-nonparaxial point-spread functions (PSFs) of the wide field fluorescence microscope (WFFM), the laser scanning confocal microscope (LSCM), and the disk scanning confocal microscope (DSCM). The PSFs are expressed using the Debye integral. Under an $L^{\infty}$ constraint imposing peak matching, optimal and near-optimal Gaussian parameters are derived for the PSFs. With an $L^{1}$ constraint imposing energy conservation, an optimal Gaussian parameter is derived for the 2D paraxial WFFM PSF. We found that (1) the 2D approximations are all very accurate; (2) no accurate Gaussian approximation exists for 3D WFFM PSFs; and (3) with typical pinhole sizes, the 3D approximations are accurate for the DSCM and nearly perfect for the LSCM. All the Gaussian parameters derived in this study are in explicit analytical form, allowing their direct use in practical applications.}, doi = {10.1364/AO.46.001819}, keywords = {Numerical approximation and analysis; Microscopy; Confocal microscopy; Fluorescence microscopy; Three-dimensional microscopy}, owner = {paul}, publisher = {OSA}, timestamp = {2012.09.20} } @BOOK{Rigler:FCSbook, title = {Fluorescence Correlation Spectroscopy, Theory and Applications}, publisher = {Springer Berlin Heidelberg}, year = {2001}, editor = {R. Rigler and E.S.
Elson}, edition = {1}, howpublished = {Paperback}, isbn = {978-3540674337}, owner = {paul}, timestamp = {2012.11.02} } @ELECTRONIC{AndorNeoSpec, title = {Andor Technology, Neo sCMOS Specifications}, organization = {Andor Technology}, note = {\url{http://www.andor.com/pdfs/specifications/Andor_Neo_sCMOS_Specifications.pdf} (Oct. 2012)}, citeseerurl = {http://www.andor.com/pdfs/specifications/Andor_Neo_sCMOS_Specifications.pdf}, owner = {paul}, timestamp = {2012.10.08} } @ELECTRONIC{HamamatsuOrcaSpec, title = {Hamamatsu, ORCA-Flash4.0 CMOS datasheet}, organization = {Hamamatsu}, note = {\url{http://sales.hamamatsu.com/assets/pdf/hpspdf/e_flash4.pdf} (Oct. 2012)}, citeseerurl = {http://sales.hamamatsu.com/assets/pdf/hpspdf/e_flash4.pdf}, owner = {paul}, timestamp = {2012.10.08} } @ELECTRONIC{InvitrogenDiO, month = {November}, title = {Invitrogen, catalog number D-275 (DiO)}, note = {\url{http://products.invitrogen.com/ivgn/product/D275} (Oct. 2012)}, owner = {paul}, timestamp = {2012.10.18} } @ELECTRONIC{vaxavis, title = {Dynamic viscosity of liquid water from 0~°C}, note = {\url{http://www.vaxasoftware.com/doc_eduen/qui/viscoh2o.pdf} (Oct. 2012)}, owner = {paul}, timestamp = {2012.10.29} } @ELECTRONIC{WikipediaBrown, title = {Brownian motion, Wikipedia - The Free Encyclopedia}, note = {\url{http://en.wikipedia.org/wiki/Brownian_motion} (Oct. 2012)}, owner = {paul}, timestamp = {2012.10.18} } @ELECTRONIC{AndorNeo, year = {2011}, title = {Andor Technology, Neo sCMOS Hardware Guide}, organization = {Andor Technology}, owner = {paul}, timestamp = {2012.10.06} } pycorrfit-0.8.1/doc-src/PyCorrFit_doc.tex0000755000175000017500000001274212262516600017064 0ustar toortoor\documentclass[a4paper,12pt]{scrartcl} % apt-get install texlive-science %We compile with PDFLaTeX. %In Texmaker: Tools > PDFLaTeX (F6) \usepackage[utf8x]{inputenc} %\usepackage{tipa} % apt-get install tipa %For German documents: %\usepackage[ngerman]{babel} %\usepackage{sistyle} %\SIstyle{German} %\usepackage{icomma} % comma as decimal separator (in math mode) % By default, LaTeX inserts some space after a comma in math mode, % which is wrong for numbers like 3,45. With icomma: % if the comma is followed by a space, a space is typeset; % if not, the comma is treated as an operator, e.g.
$f(x, y) = 3,45$ %For English documents: \usepackage[english]{babel} \usepackage{sistyle} \usepackage[top = 2cm, left = 2.5cm, right = 2cm, bottom = 2.5cm]{geometry} \usepackage{amsmath} \usepackage{amssymb} \usepackage{array} \usepackage{cite} %For citations and references \usepackage{url} \urlstyle{tt} %\usepackage{longtable} % multi-page tables %\usepackage{multirow} %merging of columns/rows \usepackage{subfig} % combine images into a single figure %Nicer captions for figures and tables %Set captions with commas and labels \DeclareCaptionLabelFormat{mycaption}{#1 #2} \DeclareCaptionLabelSeparator{comma}{, } \captionsetup{font=small,labelfont=bf,labelformat=mycaption,labelsep=comma} \setcapindent{0pt} % indentation from the second line on \newcommand{\mycaption}[2]{\caption[~#1]{\textbf{#1:} #2}} \usepackage{tabularx} \usepackage{textcomp} % special characters \usepackage{wrapfig} \usepackage{fancyvrb} \usepackage[svgnames]{xcolor} %colors such as DarkBlue %% %% %% Definitions for nice links within the document %%% graphicx: support for graphics \usepackage[pdftex]{graphicx} \pdfcompresslevel=9 %%% hyperref (hyperlinks in PDF): for more options or more detailed %%% explanations, see the documentation of the hyperref-package \usepackage[% %%% general options pdftex=true, %% sets up hyperref for use with the pdftex program %plainpages=false, %% set it to false, if pdflatex complains: ``destination with same identifier already exists'' % pdfstartview={XYZ 0 0 1.0} , %% opens the PDF at 100% zoom, i.e. original size %%% extension options backref, %% adds a backlink text to the end of each item in the bibliography pagebackref=false, %% if true, creates backward references as a list of page numbers in the bibliography colorlinks=true, %% turn on colored links (true is better for on-screen reading, false is better for printout versions) linkcolor=DarkBlue, %% change the link color urlcolor=DarkBlue, %% change the URL link color; other sober colors follow anchorcolor = black, citecolor = DarkGreen, filecolor = black, urlcolor = DarkBlue, breaklinks=false, % %%% PDF-specific display options bookmarks=true, %% if true, generate PDF bookmarks (requires two passes of pdflatex) bookmarksopen=true, %% if true, show all PDF bookmarks expanded bookmarksnumbered=false, %% if true, add the section numbers to the bookmarks %pdfstartpage={1}, %% determines, on which page the PDF file is opened %pdfpagemode=None %% None, UseOutlines (=show bookmarks), UseThumbs (show thumbnails), FullScreen ]{hyperref} %%% provide all graphics (also) in this format, so you don't have %%% to add the file extensions to the \includegraphics-command %%% and/or you don't have to distinguish between generating %%% dvi/ps (through latex) and pdf (through pdflatex) % \DeclareGraphicsExtensions{.pdf} %% %% %\newcommand{\kommentar}[1]{\marginpar{\textcolor{red}{#1}}} % comment command %\newcommand{\fehler}[3]{\SI{(#1}{}\SI{\pm #2)}{#3}} % error/uncertainty command %Locations of possible image files (subdirectories) \graphicspath{{bilder/}{messwerte/}{auswertung/}} %New commands \newcommand{\hyref}[2]{\hyperref[#2]{#1~\ref{#2}}} %Nicer link: instead of "link to figure \ref{im:bild}" write "\hyref{link to figure}{im:bild}" \newcommand{\mytilde}{\raisebox{-0.9ex}{\~{ }}} \setcounter{page}{1} % Tell latex how to break the program names \hyphenation{Py-Corr-Fit Py-Scan-FCS} % For non-italic greek letters \usepackage{upgreek} \usepackage{doi} \begin{document} \noindent \begin{tabularx}{\linewidth}{Xr} \textbf{PyCorrFit \newline FCS data
evaluation} \newline \textit{Software Guide} & \raisebox{-2em}{\includegraphics[angle=0,width=40mm]{Images/PyCorrFit_logo_dark.pdf}} \\ \\ Thomas Weidemann & \\ Max Planck Institute of Biochemistry, Martinsried, Germany & \\ \\ Paul Müller & \\ Biotechnology Center of the TU Dresden, Germany & \\ \\ \today & \\ \end{tabularx} \vspace{2em} \tableofcontents \newpage \graphicspath{{Images/}} \include{PyCorrFit_doc_content} \section*{Acknowledgements} \addcontentsline{toc}{section}{Acknowledgements} I thank André Scholich (TU Dresden, Germany) for initial proofreading of the manuscript and Grzegorz Chwastek, Franziska Thomas, and Thomas Weidemann (Biotec, TU Dresden, Germany) for critical feedback on PyCorrFit. %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% % Bibliography \pagestyle{plain} % page number only in the footer \bibliographystyle{plainurl} % citation style: alphadin = [Nam88]; apt-get install bibtex-extras \bibliography{Bibliography} % insert the name of the BibTeX file here without the .bib extension %\nocite{*} % lists all entries of the .bib file when active \end{document} pycorrfit-0.8.1/doc-src/PyCorrFit_doc_content.tex0000755000175000017500000015426412262516600020624 0ustar toortoor\section{Introduction} \subsection{Preface} \textit{PyCorrFit} emerged from my work in the Schwille Lab\footnote{\url{http://www.biochem.mpg.de/en/rd/schwille/}} at the Biotechnology Center of the TU Dresden in 2011/2012. The program source code is available at GitHub\footnote{\url{https://github.com/paulmueller/PyCorrFit}}. Please do not hesitate to sign up and add a feature request. If you find a bug, please let me know via GitHub.\\ \noindent \textit{PyCorrFit} was written to simplify the work with experimentally obtained correlation curves. These can be processed independently of operating system, location, and time. PyCorrFit supports commonly used file formats and enables users to allocate and organize their data in a simple way.\\ \noindent \textit{PyCorrFit} is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 2 of the License, or (at your option) any later version\footnote{\url{http://www.gnu.org/licenses/gpl.html}}. \subsubsection*{What \textit{PyCorrFit} can do} \begin{itemize} \item Load correlation curves from numerous correlators \item Process these curves (\hyref{Section}{sec:tm}) \item Fit a model function (many included) to an experimental curve \item Import user defined models for fitting \item Many batch processing features \item Save/load entire \textit{PyCorrFit} sessions \end{itemize} \subsubsection*{What \textit{PyCorrFit} is not} \begin{itemize} \item A multiple-$\tau$ correlator \item A software to operate hardware correlators \end{itemize} \subsection{System prerequisites} \subsubsection{Hardware} This documentation addresses the processing of correlation curves with \textit{PyCorrFit}. \textit{PyCorrFit} was successfully used with the following setups: \begin{itemize} \item[1.] APD: Photon Counting Device from PerkinElmer Optoelectronics, Model: \texttt{SPCM-CD3017}\\ Correlator: Flex02-01D/C from correlator.com with the shipped software \texttt{flex02-1dc.exe}. \item[2.] APD: Photon Counting Device from PerkinElmer Optoelectronics\\ Correlator: ALV-6000 \item[3.] LSM Confocor2 or Confocor3 setups from Zeiss, Germany.
\end{itemize} \subsubsection{Software} \label{cha:soft} The latest version of \textit{PyCorrFit} can be obtained from the internet at \url{http://pycorrfit.craban.de}. \begin{itemize} \item \textbf{Mac OS X}. Binary files for Mac OS X $>$ 10.6.8 are available from the download page but have not yet been fully tested for stability. \item \textbf{Windows}. For Windows XP or Windows 7, stand-alone binary executables are available from the download page. \item \textbf{Linux}. There are executable binaries for widely used distributions (e.g. Ubuntu). \item \textbf{Sources}. The program was written in Python, keeping the concept of cross-platform programming in mind. To run \textit{PyCorrFit} on any other operating system, the installation of Python v.2.7 is required. To obtain the latest source, visit \textit{PyCorrFit} at GitHub (\url{https://github.com/paulmueller/PyCorrFit}). \textit{PyCorrFit} depends on the following Python modules:\\ \texttt{\\ python-matplotlib ($\geq$ 1.0.1) \\ python-numpy ($\geq$ 1.5.1) \\ python-scipy ($\geq$ 0.8.0) \\ python-sympy ($\geq$ 0.7.2) \\ python-yaml \\ python-wxtools \\ python-wxgtk2.8-dbg \\ } \\ For older versions of Ubuntu, some of the above package versions are not listed in the package repository. To enable the use of \textit{PyCorrFit} on those systems, the following tasks have to be performed: \begin{itemize} \item[ ] \textbf{matplotlib}. The tukss-ppa includes version 1.0.1. After adding the repository (\texttt{apt-add-repository ppa:tukss/ppa}), matplotlib can be installed as usual. \item[ ] \textbf{numpy}. The package from a later version of Ubuntu can be installed: \url{https://launchpad.net/ubuntu/+source/python-numpy/} \item[ ] \textbf{scipy}. The package from a later version of Ubuntu can be installed: \url{https://launchpad.net/ubuntu/+source/python-scipy/} \item[ ] \textbf{sympy}. To enable importing external model functions, sympy is required. It is available from \url{http://code.google.com/p/sympy/downloads/list}. Unpacking the archive and executing \texttt{python setup.py install} within the unpacked directory will install sympy. \end{itemize} \end{itemize} Alternatively, \texttt{python-pip} (\url{http://pypi.python.org/pypi/pip}) can be used to install up-to-date Python modules. \noindent \textbf{\LaTeX}. \textit{PyCorrFit} can save correlation curves as images using matplotlib. It is also possible to utilize \LaTeX{} to generate these plots. On Windows, installing MiKTeX with ``automatic package download'' will enable this feature. On Mac OS X, the MacTeX distribution can be used. On other systems, the packages \LaTeX{}, dvipng, and Ghostscript as well as the scientific \LaTeX{} packages \texttt{texlive-science} and \texttt{texlive-math-extra} need to be installed. \subsection{Running \textit{PyCorrFit}} \label{sec:run} \paragraph*{Windows} Download the executable file and double-click on the \texttt{PyCorrFit.exe} icon. \paragraph*{Linux/Ubuntu} Make sure the binary has the executable bit set, then simply double-click on the binary \texttt{PyCorrFit}. \paragraph*{Mac OS X} When downloading the archive \texttt{PyCorrFit.zip}, the binary should be extracted automatically (if not, extract the archive) and you can double-click it to run \textit{PyCorrFit}. \paragraph*{From source} Invoke \texttt{python PyCorrFit.py} from the command line. \section{Working with \textit{PyCorrFit}} \subsection{Workflow} \label{cha_graphint} \label{sec:PyCorrFitUserInterface} The following chapter introduces the general idea of how to start and accomplish a fitting project.
FCS experiments produce different sets of experimental correlation functions which must be interpreted with appropriate physical models. Each correlation function refers to a single contiguous signal trace or ``run''. In \textit{PyCorrFit}, the user must assign a mathematical model function to each correlation function during the loading procedure. The assignment is irreversible in the sense that it cannot be changed within an existing \textit{PyCorrFit} session. This feature ensures the stability of the batch processing routine for automated fitting of large data sets. Nevertheless, the fit of different models to the same data can be explored by loading the data twice or simply by creating two different sessions. Let us briefly discuss a typical example: To determine the diffusion coefficient of a fluorescently labeled protein in free solution, one has to deal with two sets of autocorrelation data: measurements of a diffusion standard (e.g. free dye for which a diffusion coefficient has been published) to calibrate the detection volume and measurements of the protein sample. The protein sample may contain small amounts of slowly diffusing aggregates. While the calibration measurements can be fitted with a one-component diffusion model (T+3D), the protein sample displays two mobility states, monomers and aggregates, which are taken into account by a two-component diffusion model (T+3D+3D). With \textit{PyCorrFit} such a situation can be treated in three ways, having different pros and cons: \begin{enumerate} \item Create separate sessions for each type of sample and assign different model functions. \item Assign a one-component model to the dye measurements and a two-component model to the protein measurements when loading consecutively into the same session. \item Assign a two-component model for all data and, when appropriate, manually inactivate one component by fixing its contribution to 0\%. \end{enumerate} The first approach is straightforward; however, it requires homogeneous diffusion behavior for each data set. The second strategy has the advantage that the dye and the protein curves, as well as the obtained parameters, can be visually compared during the fitting analysis within the same session. In this case, batch fitting is still possible because it discriminates data sets assigned to different models. In the third case, simultaneous batch fitting is also possible. However, for each dye measurement one has to eliminate the second, slow diffusion species manually, which might be laborious. Inactivating components by fixing parameters is nevertheless a common way to evaluate heterogeneous data sets, for example, a protein sample for which only a subgroup of curves requires a second diffusion component due to the occasional appearance of aggregates. Such situations are frequently encountered in intracellular measurements. In conclusion, all three strategies or combinations thereof may be suitable. In any case, the user must decide on model functions beforehand; it is therefore advisable to group the data accordingly. The fitting itself is usually explored with a representative data set. Here, the user has to decide on starting parameters, the range in which they should be varied, corrections like background, and other fitting options. Once the fit looks good, the chosen settings can be transferred at once to all other pages assigned to the same model using the \textit{Batch control} tool (\hyref{Section}{sec:tm.bc}).
After flipping through the data for visual inspection, one may check the parameters across all pages in the \textit{Statistics view} tool and re-visit outliers (\hyref{Section}{sec:tm.sv}). From there, the numerical fit values and example correlation functions can be exported. \subsection{The \textit{main window}} %\hyref{Figure}{fig:PyCorrFitMain} shows the main window of PyCorrFit. It contains a menu bar to access all tools, a notebook with tabs, each tab representing a single curve, and a page - the content of the currently selected tab. When the program is started as described in \hyref{Section}{sec:run}, the \textit{main window} opens together with a terminal of the platform on which \textit{PyCorrFit} was installed (Windows, Linux, Mac OS X). The window title bar contains the version of \textit{PyCorrFit} and, if a session was re-opened or saved, the name of the fitting session. A menu bar provides access to many supporting tools and additional information as thoroughly described in \hyref{Chapter}{sec:mb}. There are three gateways for experimental data into a pre-existing or a new \textit{PyCorrFit} session (\textit{File / Load data}, \textit{File / Open session}, and \textit{Current page / Import data}). When a session has been opened or correlation data have been loaded, each correlation curve is displayed on a separate page of a notebook. For quick identification of the active data set, a tab specifies the page number, the correlated channels (AC/CC), and the run number in case there are multiple runs in one experimental data file. Clicking the little triangle at the far right opens a drop-down list of all page titles for direct access to a particular data set. Alternatively, the pages can be toggled by tapping the cursor keys (left/right). Only one page can be active at a time; its tab appears highlighted. \begin{figure}[h] \centering \includegraphics[width=\linewidth]{PyCorrFit_Screenshot_Main.png} \mycaption{user interface of PyCorrFit}{A circular scanning FCS (CS-FCS) curve of DiO on a supported lipid bilayer (glass substrate) is shown. The measurement yields a diffusion coefficient of \SI{0.28}{\mu m^2s^{-1}} ($F1=1$, so only one component is fitted). Note that a 2D diffusion model is used and not a 3D model (as shown in \hyref{figure}{fig:extxt}). \label{fig:PyCorrFitMain}} \end{figure} The page containing a correlation function is divided into two halves. The left-hand side of the page shows a stack of boxes containing values and fitting options associated with the current model and data set: \begin{itemize} \item \textit{Data set}, a unique identifier for each correlation curve which is automatically assembled from different fields during the loading procedure (\hyref{Section}{sec:fm.ld}). This field can also be edited manually, which allows renaming or flagging certain data during the fitting analysis. \item \textit{Model parameters} displays the values which determine the current shape of the assigned model function. Initially, starting values are loaded as they were defined in the model description (\hyref{Section}{sec:fm.im}). Small buttons allow a stepwise increase or decrease of each value in steps of one tenth. It is also possible to enter values directly. A checkbox is used to set the parameter status to ``varied'' (checked) or ``fixed'' (unchecked) during the fitting. At the end, when saving the session, the current set of values together with their indicated names are stored in the *.yaml file (\hyref{Section}{sec:fm.ss}).
\item \textit{Amplitude corrections} applies additional rescaling to amplitude-related parameters such as the number of particles $n$ or fractions thereof associated with different correlation times ($n_1$, $n_2$, etc.). Experimental values of non-correlated background intensity can be manually entered for each channel. In addition, the correlation curves can be normalized to facilitate a visual comparison of their time dependence. \item \textit{Fitting options} offers weighted fitting. The underlying idea is that data points with higher accuracy should also have a higher impact on the model parameters. To derive weights, \textit{PyCorrFit} calculates the variance of the difference between the actual data and a smooth, empirical representation of the curve within a certain neighborhood. The number of neighboring data points at each side ($j > 0$) can be set. For such a smooth representation, a 5-knot spline function or the model function with the current parameter set can be used. The latter improves with repeated fitting. A minimal sketch of this weighting scheme follows below this list. \end{itemize}
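To make the weighting procedure concrete, here is a minimal Python sketch of such a neighborhood-based weight calculation. It only illustrates the idea; the function name and implementation details are assumptions and do not reproduce \textit{PyCorrFit}'s actual code:
\begin{Verbatim}[frame = single]
import numpy as np

def fit_weights(g_exp, g_smooth, j):
    # variance of the residuals within a neighborhood of +-j channels
    res = np.asarray(g_exp) - np.asarray(g_smooth)
    var = np.empty(res.size)
    for i in range(res.size):
        lo, hi = max(0, i - j), min(res.size, i + j + 1)
        var[i] = np.var(res[lo:hi], ddof=1)
    # data points with a large local variance obtain a small weight
    return 1.0 / var

tau = np.logspace(-3, 2, 50)
g_smooth = 1.0 / (1.0 + tau)   # e.g. a spline or the current model
g_exp = g_smooth + np.random.normal(0, 0.01, tau.size)
print(fit_weights(g_exp, g_smooth, j=3)[:5])
\end{Verbatim}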
On the right-hand side are two graphics windows. The dimensionless correlation functions $G(\tau)$ are plotted against the lag time ($\tau$) on a logarithmic scale. Below, a second window shows the residuals, the actual numerical difference between the correlation data and the model function. Fitting with appropriate models will scatter the residuals symmetrically around zero (the $x$-axis). When weighted fitting was performed, the weighted residuals are shown. A good fit will not leave residuals systematically above or below the $x$-axis at any time scale. The main window can be rescaled as a whole to improve data representation. In addition, to zoom in, one can drag a rectangle within the plot area; a double click then restores the initial scale. Experimental data points are linked by grey lines; the current state of the model function is shown in blue. When a weighted fit was applied, the variance of the fit is calculated for each data point and displayed in cyan. \section{The menu bar} \label{sec:mb} PyCorrFit is organized into panels which group certain functions. The menu organizes data management (File), data analysis (Tools), display of correlation functions (Current Page), numerical examples (Model), software settings (Preferences), and software metadata (Help). \subsection{File menu} \label{sec:fm} The File menu organizes the import of theoretical models, experimental correlation data, and the opening and saving of entire \textit{PyCorrFit} fitting sessions. However, the numerical fit results are exported from the \textit{Statistics view} panel, which can be found under \textit{Tools} (\hyref{Section}{sec:tm.sv}). \subsubsection{File / Import model} \label{sec:fm.im} Correlation data must be fitted to models describing the underlying physical processes which give rise to a particular time dependence and magnitude of the recorded signal fluctuations. Models are mathematical expressions containing parameters with physical meaning, such as the molecular brightness or the dwell time within an illuminated volume. While a number of standard functions are built-in, the user can define new expressions. Some examples can be found at GitHub in the \textit{PyCorrFit} repository, e.g. circular scanning FCS \cite{Petrasek2008} or a combination of diffusion and directed flow \cite{Brinkmeier1999}. Model functions are imported as text files (*.txt) using the following syntax: \begin{itemize} \item \textbf{Encoding}: PyCorrFit can interpret the standard Unicode character set (UTF-8). \item \textbf{Comments}: Lines starting with a hash (\texttt{\#}), empty lines, or lines containing only white space characters are ignored. The only exception is the first line starting with a hash followed by a white space and a short name of the model. This line is evaluated to complement the list of models in the \textit{Choose model} dialogue when loading the data. \item \textbf{Units}: PyCorrFit works with internal units for: \begin{itemize} \item Time: \SI{1}{ms} \item Distance: \SI{100}{nm} \item Diffusion coefficient: \SI{10}{\mu m^2s^{-1}} \item Inverse time: \SI{1000}{s^{-1}} \item Inverse area: \SI{100}{\mu m^{-2}} \item Inverse volume: \SI{1000}{\mu m^{-3}} \end{itemize} \item \textbf{Parameters:} To define a new model function, new parameters can be introduced. Parameters are defined by a sequence of strings separated by white spaces containing the name, the dimension in square brackets, the equal sign, and a starting value which appears in the main window for fitting. For example: \texttt{D [\SI{10}{\mu m^2 s^{-1}}] = 5.0}. %It is important to note that when the dimensions differ from the internal units (\SI{10}{\mu m^2 s^{-1}}), the expression must contain some adjusting factor; here a factor of 5. %Thus, user defined dimensions are only for display and cannot be processed numerically. Parameter names may contain only alphabetic (not numeric) characters. \texttt{G} and \texttt{g}, as well as the constants \texttt{e} and \texttt{pi}, are already mapped and cannot be used freely. \item \textbf{Placeholder:} When defining composite mathematical expressions for correlation functions one can use placeholders. Placeholders start with a lowercase ‘g’. For example, the standard Gaussian 3D diffusion in free solution may be written as \begin{itemize} \item \texttt{gTrp = 1+ T/(1-T)*exp(-tau/tautrip)} \item \texttt{gTwoD = 1/(1+tau/taudiff)} \item \texttt{gThrD = 1/sqrt(1+tau/(taudiff*S**2))} \end{itemize} \end{itemize} The individual parts are then combined in the last line of the *.txt file, where the correlation function is defined starting with an uppercase ‘G’: \begin{equation} \texttt{G = 1/n * gTrp * gTwoD * gThrD} \notag \end{equation} For a reference of mathematical operators, see for example \href{http://www.tutorialspoint.com/python/python_basic_operators.htm}{www.tutorialspoint.com / python / python\_basic\_operators.htm}.
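Since \textit{PyCorrFit} parses imported models with the Python module sympy (see the Hacker's corner section), the final expression can be sanity-checked outside of \textit{PyCorrFit} before importing it. The following minimal sketch only assumes a Python installation with sympy; it is not part of \textit{PyCorrFit}:
\begin{Verbatim}[frame = single]
import sympy

# Parse the combined expression as it would appear in the last
# line of a model *.txt file (placeholders expanded):
expr = sympy.sympify(
    "1/n * (1 + T/(1-T)*exp(-tau/tautrip))"
    " * 1/(1+tau/taudiff) * 1/sqrt(1+tau/(taudiff*S**2))")
# A successful parse also reveals the free parameters:
print(sorted(str(s) for s in expr.free_symbols))
# ['S', 'T', 'n', 'tau', 'taudiff', 'tautrip']
\end{Verbatim}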
To illustrate a more complex example, see the model function for circular scanning FCS in \hyref{figure}{fig:extxt}. \begin{figure} % for case-sensitive Verbatim we need the package fancyvrb \begin{Verbatim}[frame = single] # CS-FCS 3D+S+T (Confocal) # Circular Scanning FCS model function. 3D diffusion + Triplet. ## Definition of parameters: # First, the parameters and their starting values for the model function # need to be defined. If the parameter has a unit of measurement, then it # may be added separated by a white space before the "=" sign. The starting # value should be a floating point number. Floating point abbreviations # like "1e-3" instead of "0.001" may be used. # Diffusion coefficient D [10 µm²/s] = 200.0 # Structural parameter w = 5.0 # Waist of the lateral detection area a [100 nm] = 1.0 # Particle number n = 5.0 # Scan radius R [100 nm] = 5.0 # Frequency f [kHz] = 20.0 # Triplet fraction T = 0.1 # Triplet time tautrip [ms] = 0.001 # The user may wish to substitute certain parts of the correlation function # with other values to keep the formula simple. This can be done by using the # prefix "g". All common mathematical functions, such as "sqrt()" or "exp()" # may be used. For convenience, "pi" and "e" are available as well. gTrip = 1. + T/(1-T)*exp(-tau/tautrip) gScan = exp(-(R*sin(pi*f*tau))**2/(a**2+D*tau)) gTwoD = 1./(1.+D*tau/a**2) gOneD = 1./sqrt(1.+D*tau/(w*a)**2) gThrD = gTwoD * gOneD # The final line with the correlation function should start with a "G" # before the "=" sign. G = 1./n * gThrD * gScan * gTrip \end{Verbatim} \mycaption{user defined model function for PyCorrFit}{The working example shows a model function for circular scanning FCS.\label{fig:extxt}} \end{figure} \subsubsection{File / Load data} \label{sec:fm.ld} \textit{Load data} is the first way to import multiple correlation data sets into a \textit{PyCorrFit} session. The supported file formats can be found in a drop-down list of file endings in the pop-up dialog \textit{Open data files}: \begin{tabular}{l l} \rule{0pt}{3ex} (1) All supported files & default \\ \rule{0pt}{3ex} (2) Confocor3 (*.fcs) & AIM 4.2, ZEN 2010, Zeiss, Germany \\ \rule{0pt}{3ex} (3) Correlator ALV6000 (*.ASC) & ALV Laser GmbH, Langen, Germany \\ \rule{0pt}{3ex} (4) Correlator.com (*.SIN) & www.correlator.com, USA \\ \rule{0pt}{3ex} (5) Matlab ‘Ries’ (*.mat) & EMBL Heidelberg, Germany \\ \rule{0pt}{3ex} (6) PyCorrFit (*.csv) & Paul Müller, TU Dresden, Germany \\ \rule{0pt}{3ex} (7) Zip files (*.zip) & Paul Müller, TU Dresden, Germany \\ \end{tabular} \vspace{3ex} \newline While (2)-(4) are file formats associated with commercial hardware, (5) refers to a MATLAB-based FCS evaluation software developed by Jonas Ries in the Schwille lab at TU Dresden, and (6) is a text file of comma-separated values (CSV) generated with \textit{PyCorrFit} via the command \textit{Current Page / Save data}. Zip files (7) are automatically decompressed and can be imported when matching one of the above-mentioned formats. In particular, loading zip files makes it possible to re-import correlation data from entire \textit{PyCorrFit} sessions. However, these data are treated as raw, which means that all fitting parameters and model assignments are lost. When loading data, the user is prompted to assign fit models in the \textit{Choose Models} dialogue window. There, curves are sorted according to channel (for example AC1, AC2, CC12, and CC21, as a typical outcome of a dual-color cross-correlation experiment). For each channel a fit model must be selected from the list (see \hyref{Section}{sec:models}). If a file format is not yet listed, the correlation data can be converted into a compatible text file (*.csv) or into bundles of *.csv files within a compressed archive (*.zip). For reformatting, the following points should be considered: \begin{itemize} \item \textbf{Encoding}: \textit{PyCorrFit} uses the standard Unicode character set (UTF-8). However, since no special characters are needed to save experimental data, other encodings may also work. New line characters are \texttt{{\textbackslash}r{\textbackslash}n} (Windows). \item \textbf{Comments}: Lines starting with a hash (\texttt{\#}), empty lines, or lines containing only white space characters are ignored. Exceptions are the keywords listed below. \item \textbf{Units}: PyCorrFit works with units/values for: \begin{itemize} \item Time: \SI{1}{ms} \item Intensity: \SI{1}{kHz} \item Amplitude offset: $G(0) = 0$ (not 1) \end{itemize} \item \textbf{Keywords:}\footnote{Keywords are case-insensitive.} \textit{PyCorrFit} reads the first two columns containing numerical values. The first table (non-hashed) is recognized as the correlation data, containing the lag times in the first and the correlation values in the second column. (If the *.csv file has been generated with \textit{PyCorrFit}, up to three additional columns containing the fit function are ignored.) The table ends when the keyword \texttt{\# BEGIN TRACE} appears. Below this line, the time and the signal values should be contained in the first two columns. If cross-correlation data are to be imported, a second trace can be entered after the keyword \texttt{\# BEGIN SECOND TRACE}. A minimal example file is sketched below this list. \item \textbf{Tags:}\footnote{Tags are case-insensitive.} Channel information can be entered using a defined syntax in the header. The keyword \begin{center} \vspace{-1em} \texttt{\# Type AC/CC Autocorrelation} \vspace{-1em} \end{center} assigns the tag \texttt{AC} and the keyword \begin{center} \vspace{-1em} {\texttt{\# Type AC/CC Crosscorrelation}} \vspace{-1em} \end{center} assigns the tag \texttt{CC} to the correlation curve. These strings are consistently displayed in the user interface of the respective data page in \textit{PyCorrFit}. If no data type is specified, autocorrelation is assumed. Tags may be specified with additional information like channel numbers, e.g. \begin{center} \vspace{-1em} \texttt{\# Type AC/CC Autocorrelation \_01}. \vspace{-1em} \end{center} In this case the tag \texttt{AC\_01} is generated. This feature is useful to keep track of the type of curve during the fitting and when post-processing the numerical fit results. \end{itemize}
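Putting these conventions together, a minimal, hypothetical *.csv file could look as follows. All numbers are purely illustrative, and the column-header lines are ordinary comments which \textit{PyCorrFit} ignores:
\begin{Verbatim}[frame = single]
# Type AC/CC Autocorrelation _01
# lag time [ms]    correlation G(tau)
1.0e-03    0.514
2.0e-03    0.512
4.0e-03    0.509
# BEGIN TRACE
# time [ms]    intensity [kHz]
0.0    85.2
1.0    84.7
2.0    85.9
\end{Verbatim}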
\subsubsection{File / Open session} \label{sec:fm.os} This command is the second way to import data into PyCorrFit. In contrast to \textit{Load data}, it opens an entire fitting project which was previously saved with \textit{PyCorrFit}. Sessions are bundles of files named *.fcsfit-session.zip. Sessions contain comments, model-assigned correlation data, and the current state of parameters for each data page (\hyref{Section}{sec:fm.ss}). \subsubsection{File / Comment session} \label{sec:fm.cs} This command opens a window to place text messages that can be used to annotate a fitting session. \subsubsection{File / Clear session} \label{sec:cls} This command closes all pages while \textit{PyCorrFit} keeps running. The user is prompted to save the session under the same or a different name. At this stage, both options \textit{No} and \textit{Cancel} lead to clearance and a potential loss of recent modifications. \subsubsection{File / Save session} \label{sec:fm.ss} In addition to displaying and fitting individual curves, a strong feature of PyCorrFit is to save an entire fitting project as a single session. Sessions allow the user to revisit and explore different models, fitting strategies, and data sets. Importantly, the work can be saved at any stage. The number of files bundled in a session varies depending on the number of data sets (pages), the number of models used, and what was done during the fitting. A detailed description can be found in the Readme.txt file attached to each session. For example, the numerical correlation and intensity data are saved separately as *.csv text files. However, in contrast to the \textit{Save data (*.csv)} command of the \textit{Current Page} menu, there are no metadata in the header, just tables containing the numerical values. In sessions, the fitting parameters are stored separately in the human-readable data serialization format *.yaml. \subsubsection{File / Exit} \label{sec:fm.e} This command closes down \textit{PyCorrFit}. The user is prompted to save the session under the same or a different name. At this stage, \textit{No} leads to the loss of recent changes, while \textit{Cancel} keeps \textit{PyCorrFit} running.
\subsection{Tools menu} \label{sec:tm} The \textit{Tools} menu provides access to a series of accessory panels which extend the capability of the main window. These accessory panels can stay open during the entire analysis. Open panels appear checked in the menu. Most operations can be executed across the entire data set with a single mouse click. \subsubsection{Tools / Data range} \label{sec:tm.dr} This panel limits the range of lag times which are displayed in the main window panel. At the same time it defines the range of points which are used for fitting. For example, this feature can be applied to remove dominant after-pulsing of the avalanche photo diodes (APDs) which may interfere with triplet blinking at short lag times. The user has the option to \textit{Apply} the channel settings only to the current page or to \textit{Apply to all pages}. In contrast to \textit{Batch control}, this operation ignores whether the data are assigned to different models. Power users who frequently load and remove data sets may take advantage of a checkbox to fix the channel selection for all newly loaded data sets. \subsubsection{Tools / Overlay curves} \label{sec:tm.oc} This window displays the correlation data (not the fit curves) of all pages in a single plot. The curves can be discriminated by color. If only one curve is selected, it appears in red. Curves with ambiguous shape can easily be identified, selected, and removed by clicking \textit{Apply}. A warning dialogue lists the pages which will be kept. Data representation is synchronized with the page display in the \textit{Main window}. For example, narrowing the range of lag times with \textit{Data range} is immediately updated in the \textit{Overlay curves} tool. The same holds for the normalization of the amplitudes to unity. Conversely, some tools directly respond to the selections made in the \textit{Overlay curves} tool: \textit{Global fitting}, \textit{Average curves}, and \textit{Statistics view} allow operations to be performed on an arbitrary selection of pages which can be specified by page number. Instead of manually typing their numbers, the curves may be selected within the \textit{Overlay curves} tool. The respective input fields are immediately updated. The tool is closed by the button \textit{Cancel}. All the listed data sets will be kept. However, the selections transferred to the \textit{Global fitting}, \textit{Average curves}, and \textit{Statistics view} tools are kept as well. \subsubsection{Tools / Batch control} \label{sec:tm.bc} By default, the current page is taken as a reference to perform automated fitting. A batch is defined as the ensemble of correlation data sets (pages) assigned to the same model function within a session. A session can therefore have several batches, even for the same data. For fitting, it is crucial to carefully define the starting parameters, whether parameters should be fixed or varied, the range of values which make physical sense, and other options offered within the \textit{Main window}. By executing \textit{Apply to applicable pages}, these settings are transferred to all other pages assigned to the same fit model. Note that this includes the range of lag times (lag time channels) which may have been changed with the \textit{Data range} tool for individual pages. The button \textit{Fit applicable pages} then performs several cycles of fitting on all pages of the same batch. Alternatively, the user can define an external source of parameters as a reference, i.e. the first page of some \textit{Other session} (*.fcsfit-session.zip). However, this assumes a consistent assignment of model functions. \subsubsection{Tools / Global fitting} \label{sec:tm.gf} Global fitting is useful when experimental curves share the same values for certain physical parameters. For example, due to physical constraints, in two-focus FCS both autocorrelation curves and the cross-correlation curves should adopt the same values for the diffusion time \textit{taudiff} and the number of particles \textit{n}. A global fit can be applied such that \textit{n} and \textit{taudiff} are identical for all data sets. All curves are added to a single array. In contrast to fixing the shared parameters across a batch, in \textit{Global fitting} Chi-square is minimized for all data sets simultaneously. To perform \textit{Global fitting}, a subset of curves has to be selected by typing the numbers into the input field or by highlighting the pages via the \textit{Overlay} tool. The underlying principle is sketched below.
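The principle can be illustrated with a few lines of Python using scipy, on which \textit{PyCorrFit} is built. The model, the simulated data, and all names below are invented for illustration and do not reflect \textit{PyCorrFit}'s internal code:
\begin{Verbatim}[frame = single]
import numpy as np
from scipy.optimize import leastsq

def g3d(tau, n, taudiff, S=5.0):
    # one-component 3D diffusion model (confocal, Gaussian)
    return 1/(n*(1 + tau/taudiff)*np.sqrt(1 + tau/(taudiff*S**2)))

tau = np.logspace(-3, 2, 100)   # lag times in ms
np.random.seed(1)
# two simulated curves sharing the parameters n and taudiff
curves = [g3d(tau, 2.0, 0.5) + np.random.normal(0, 0.005, tau.size)
          for _ in range(2)]

def residuals(p):
    n, taudiff = p
    # concatenating the residuals of all curves into a single array
    # means that one Chi-square is minimized for all data sets
    return np.concatenate([c - g3d(tau, n, taudiff) for c in curves])

popt, ier = leastsq(residuals, x0=[1.0, 1.0])
print(popt)   # approximately [2.0, 0.5]
\end{Verbatim}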
\subsubsection{Tools / Average data} \label{sec:tm.ad} Often in FCS, the measurement time at a particular spot is divided into several runs. This approach is taken when occasional, global intensity changes are superimposed on the molecular fluctuations of interest. Then the user has to sort out the bad runs. After fitting, one may want to re-combine the data to export a cleaned, averaged correlation function. This can be done with the tool \textit{Average data}, for which a subset of curves has to be selected by typing the numbers into the input field or by highlighting the pages via the \textit{Overlay curves} tool. For averaging, there are constraints: \begin{enumerate} \item Since the correlation curves are averaged point by point, all curves must have the same number of lag time channels. Runs of different length cannot be averaged. \item The tool can only average data sets which are exclusively autocorrelation or cross-correlation. \item The user can check a box to make the program accept only data sets with the same model as the current page. This may help to avoid mistakes when selecting pages. \end{enumerate} The averaged curve is shown on a separate page. The new \textit{Filename/title} receives the entry \textit{Average [numbers of pages]}. The assigned model is by default the same as for the individual pages. However, while averaging, the user can choose a different model from a drop-down list. \subsubsection{Tools / Trace view} \label{sec:tm.tv} FCS theory makes assumptions about the thermodynamic state of the system. Signal fluctuations can only be analyzed when the system is at equilibrium or at a sufficiently stable steady state. Global instabilities on the time scale of the measurement itself, e.g. photo-bleaching, have a dramatic effect on the shape of the measured correlation curve. Therefore, it is common practice to check the correlated intensity trace for each curve. Trace view simply displays the signal trace for each correlation function. The window stays open during the session and can be used to revisit and flag ambiguous data sets. \subsubsection{Tools / Statistics view} \label{sec:tm.sv} The goal of a correlation analysis is to determine experimental parameter values with sufficient statistical significance.
However, especially for large data sets, it can get quite laborious to check all of the individual values on each page. We designed the \textit{Statistics view} panel to review the state of parameters across the experimental batch (pages assigned to the same model) in a single plot, thereby facilitating the identification of outliers. The current page is taken as a reference for the type of model parameters which can be displayed. The user can choose different \textit{Plot parameters} from a drop-down list. A subset of pages within the batch can be explicitly defined by typing the page numbers into the input field or by highlighting pages in the \textit{Overlay curves} tool. Note that page numbers which refer to different models than the current page are ignored. The \textit{Statistics view} panel contains a separate \textit{Export} box, where parameters can be selected (checked) and saved as a comma-separated text file (*.csv). Only selected page numbers are included. \subsubsection{Tools / Page info} \label{sec:tm.pi} \textit{Page info} is the most verbose summary of a data set. The panel \textit{Page info} is synchronized with the current page. The following fields are listed: \begin{enumerate} \item Version of PyCorrFit \item Field values from the main window (filename/title, model specifications, page number, type of correlation, normalizations) \item Actual parameter values (as contained in the model function) \item Supplementary parameters (intensity, counts per particle, duration, etc.) \item Fitting-related information (Chi-square, channel selection, varied fit parameters) \item Model doc string (\hyref{Section}{sec:models}) \end{enumerate} The content of Page info is saved as a header when exporting correlation functions via the command \textit{Current page / Save data (*.csv)} (\hyref{Section}{sec:cp.sd}). \subsubsection{Tools / Slider simulation} \label{sec:tm.ss} This tool visualizes the impact of model parameters on the shape of the model function of the current page. Such insight may be useful to choose proper starting values for fitting or to develop new model functions. For example, in case two of the parameters trade off against each other during the fitting, one may explore to which extent a change in both values produces similar trends. Two variables (A and B) have to be assigned from a drop-down list of parameters associated with the current model function. For each of these, the \textit{Slider simulation} panel initially shows the starting value (x) as the middle position of a certain range (from 0.1*x to 1.9*x). The accessible range can be manually edited, and the actual value of the slider position is displayed at the right-hand side of the panel. Dragging the slider to lower (left) or higher (right) values changes the entry in the box \textit{Model parameters} of the \textit{Main window} and accordingly the shape of the model function in the plot. By default the checkbox \textit{Vary A and B} is active, meaning that both variables can be varied independently during \textit{Slider simulation}. In addition, the variables A and B can be linked by a mathematical relation. For this, a mathematical operator can be selected from a small list and the option \textit{Fix relation} must be checked. Then, the variable B appears inactivated (greyed out) and the new variable combining values for A and B can be explored by dragging. \subsection{Current Page} \label{sec:cp} This menu compiles import and export operations referring exclusively to the active page in the main window.
\subsubsection{Current Page / Import Data} \label{sec:cp.id} This command is the third way to import data into a pre-existing session. Single files containing correlation data can be imported as long as they have the right format (\hyref{Section}{sec:fm.ld}). In contrast to \textit{Load data} from the \textit{File} menu, the model assignment and the state of the parameters remain. The purpose of this command is to compare different data sets with the very same model function for a given set of parameter values. After successful import, the previous correlation data of this page are lost. To avoid this loss, one could first generate a new page via the \textit{Model} menu (\hyref{Section}{sec:models}), select a model function, and import data there. This also makes it possible to assign the very same data to different models within the same session. \subsubsection{Current Page / Save data (*.csv)} \label{sec:cp.sd} For documentation with a graphics software of choice, correlation curves can be exported as a text table (*.csv). A saved \textit{PyCorrFit} text file (*.csv) will contain a hashed header with metadata from the \textit{Page info} tool (\hyref{Section}{sec:tm.pi}), followed by the correlation and fitting values in tab-separated columns: \textit{Channel (tau [s])}, \textit{Experimental correlation}, \textit{Fitted correlation}, \textit{Residuals}, and \textit{Weights (fit)}. Below the columns, there are again 5 rows of hashed comments followed by the intensity data in two columns: \textit{Time [s]} and \textit{Intensity trace [kHz]}. Note that there are no assemblies of ``multiple runs'', since \textit{PyCorrFit} treats these as individual correlation functions. A *.csv file therefore contains only a single fitted correlation curve and one intensity trace for autocorrelation or two intensity traces for cross-correlation. \subsubsection{Current Page / Save correlation as image} \label{sec:cp.sc} For a quick documentation, the correlation curve can be exported as a compressed bitmap (*.png). The plot contains a legend and the actual values and errors of the varied parameters, but not the fixed parameters. Note that the variable $\tau$ cannot be displayed using Unicode on Windows. \subsubsection{Current Page / Save trace view as image} \label{sec:cp.st} For a quick documentation, the intensity trace from the \textit{Trace view} panel can be exported as a compressed bitmap (*.png). \subsubsection{Current Page / Close page} \label{sec:cp.cp} Closes the page; the data set is removed from the session. The page numbers of all other pages remain the same. The command is equivalent to clicking the closer (x) on the tab. \subsection{Models} \label{sec:models} When choosing a model from the \textit{Models} menu, a new page opens and the model function is plotted according to the set of starting values for parameters as they were defined in the model description. The list contains all of the implemented model functions, which can also be selected during \textit{File / Load data}. The parameters can be manipulated to explore different shapes; the tool \textit{Slider simulation} can also be used. Via \textit{Current page / Import data}, the model may then be fitted to an experimental data set.
Standard model functions for a confocal setup are: \begin{tabular}{l l} %Confocal (Gaussian): 3D \ \ \ \ \ \ [Free diffusion in three dimensions] \rule{0pt}{3ex} - Confocal (Gaussian): T+3D & Triplet blinking and 3D diffusion \\ \rule{0pt}{3ex} - Confocal (Gaussian): T+3D+3D & Triplet with two diffusive components \\ %Confocal (Gaussian): T+3D+3D+3D & [Triplet with three diffusive components] %Confocal (Gaussian): 2D & 2D diffusion, e.g. in membranes \\ \rule{0pt}{3ex} - Confocal (Gaussian): T+2D & Triplet blinking and 2D diffusion \\ \rule{0pt}{3ex} - Confocal (Gaussian): T+2D+2D & Triplet with two diffusive components \\ \rule{0pt}{3ex} - Confocal (Gaussian): T+3D+2D & Triplet with mixed 3D and 2D diffusion \\ \rule{0pt}{3ex} \end{tabular} There is also a collection of models for FCS setups with TIR excitation: \begin{tabular}{l l} \rule{0pt}{3ex} - TIR (Gaussian/Exp.): 3D & 3D diffusion \\ \rule{0pt}{3ex} - TIR (Gaussian/Exp.): T+3D+3D & Triplet with two diffusive components \\ \rule{0pt}{3ex} - TIR (Gaussian/Exp.): T+3D+2D & Triplet with mixed 3D and 2D diffusion \\ \rule{0pt}{3ex} \end{tabular} In addition, there may be user-defined model functions which have been imported previously via \textit{File / Import model} (\hyref{Section}{sec:fm.im}). \subsection{Preferences} \paragraph*{Latex} If the user has a \TeX{} distribution (e.g. MiKTeX for Windows) installed, checking the ``Latex'' option will open a separate, TeX-formatted plot panel (\textit{Figure 1}) via the \textit{Current page / Save […] as image} commands. The \textit{Figure 1} panel contains some interactive options for display. From there, in a second step, the image can be exported as *.png or *.svg. \paragraph*{Verbose} If checked, this will cause \textit{PyCorrFit} to display graphs that would be hidden otherwise. In weighted fitting with a spline, the spline function used for calculating the weights for each data point is displayed\footnote{For obvious reasons, such a plot is not generated when using the iteratively improved \textit{Model function} or the actual \textit{Average} correlation curve for weighted fitting.}. When saving the correlation curve as an image (\hyref{Section}{sec:cp.sc}), the plot will be displayed instead of saved. If ``Latex'' is active, these plots will also be TeX-formatted. The advantage of displaying plots is the ability to zoom or rescale the plot from within \textit{PyCorrFit}. \paragraph*{Show weights} Checking the option \textit{Show weights} will produce two lines showing the weights for each data point of the correlation function in the plot, as well as in the exported image. Note that the weights are always exported when using the \textit{Save data (*.csv)} command from the \textit{Current page} menu. \subsection{Help} \paragraph*{Documentation} This entry displays this documentation using the system's default PDF viewer. \paragraph*{Wiki} This entry displays the wiki of \textit{PyCorrFit} on \textit{GitHub}. Everyone who registers with \textit{GitHub} will be able to make additions and modifications. The wiki is intended for end-users of \textit{PyCorrFit} to share protocols or to add other useful information. \paragraph*{Update} establishes a link to the GitHub website to check for a new release; it also provides a few web links associated with \textit{PyCorrFit}. \paragraph*{Shell} This gives shell access to the functions of \textit{PyCorrFit}. It is particularly useful for troubleshooting.
\paragraph*{Software} This lists the exact versions of \textit{Python} and the corresponding modules with which \textit{PyCorrFit} is currently running. \paragraph*{About} Information about the participating developers, the documentation writers, and the license. \section{Hacker's corner} \paragraph*{New internal model functions} External models are imported with internal model function IDs starting at $7000$. Models are checked upon import by the Python module \texttt{sympy}; if the import fails, it might be a syntax error or just an error of \texttt{sympy}, since this module is still under development. An example of such a model file is sketched below. Additionally, new file formats can be implemented by extending the \texttt{readfiles} module of \textit{PyCorrFit}: first, edit the code of \texttt{\_\_init\_\_.py} and then add the script \texttt{read\_FileFormat.py}.
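As an illustration, a minimal user model file might look as follows. All names and starting values are arbitrary; lines starting with \texttt{g} define substitutions and the line starting with \texttt{G} defines the final correlation function, following the syntax parsed by \texttt{usermodel.py}:

\begin{verbatim}
# T+3D example model (illustrative)
n = 10.0
taud [ms] = 1.0
SP = 5.0
T = 0.1
tautrip [ms] = 0.002
gTrip = 1 + T/(1-T)*exp(-tau/tautrip)
gDiff = 1/((1+tau/taud)*sqrt(1+tau/(SP**2*taud)))
G = 1/n * gTrip * gDiff
\end{verbatim}

The first comment line is used as the model name, and \texttt{tau} denotes the lag time in the dimensionless units of \textit{PyCorrFit}.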
\section{Theoretical background} \subsection{Derivation of FCS model functions} This section introduces the calculation of FCS model functions. It supplies some background information and points out general properties of correlation functions. \subsubsection{General autocorrelation function for a single species} FCS model functions describe how the signal $F(t)$, emitted from a certain observation volume, is temporally dependent on its own past (autocorrelation) or on some other signal (cross-correlation). The autocorrelation $G(\tau)$ of a signal $F(t)$ is computed as follows: \newline \newline %\fbox{ { \begin{minipage}{\textwidth} %\textbf{Mathematical foundation - Autocorrelation function:} \begin{equation} G(\tau) = \frac{\langle \delta F(t) \delta F(t+\tau) \rangle}{\langle F(t) \rangle^2} = \frac{g(\tau)}{\langle F(t) \rangle^2}. \end{equation} \begin{itemize} \small \item[$G(\tau)$] normalized autocorrelation curve \item[$\tau$] lag time \item[$\langle F \rangle$] the expectation value of $F(t)$. Applying the ergodic theorem, this can be rewritten as the time average \[ \langle F(t) \rangle = \lim_{T \rightarrow \infty}\frac{1}{T} \int_0^T F(t) \mathrm{d}t. \] \item[$\delta F(t)$] $= F(t) - \langle F(t) \rangle$ fluctuation of the fluorescence signal \item[$g(\tau)$] non-normalized autocorrelation curve \end{itemize} \end{minipage} %} %} \newline \newline \newline The fluorescence signal depends on the size and shape of the detection volume (e.g. Gaussian-shaped for confocal setups or exponentially decaying for TIRF setups), on the propagator of the diffusing dye (free diffusion, diffusion with flow, etc.), and on the brightness and concentration of the dye under observation\cite{Burkhardt2010}. \\ \newline %\fbox{ { \begin{minipage}{\textwidth} %\textbf{General Correlation function for a single species:} \begin{equation} G(\tau) = \frac{ q^2 C \int \! \mathrm{d}^3 r \int \! \mathrm{d}^3 r' \, \Omega(\mathbf{r})\Phi(\mathbf{r}, \mathbf{r'}, \tau) \Omega(\mathbf{r'}) }{\langle F(t) \rangle^2} \end{equation} \begin{itemize} \small \item[$q$] molecular brightness, dependent on the excitation intensity, the quantum yield (i.e. emission properties and absorption cross sections of the dye), and the detection efficiency of the instrument. \item[$\Omega$] 3D molecule detection function, dependent on the shape of the pinholes used for detection and the excitation laser profile, i.e. the point spread function (PSF). \item[$\Phi$] diffusion propagator. The distribution of dyes in a liquid follows Fick's laws of diffusion. For free diffusion, this is a simple Gaussian distribution. \item[$F$] fluorescence signal of the sample. It is defined as \[ F(t) = q \int \! 
\mathrm{d}^3 r \, \Omega(\mathbf{r}) c(\mathbf{r}, t) \] with $c(\mathbf{r}, t)$ being the dye distribution (particle concentration) inside the detection volume. \item[$C$] average concentration of the dye following the dynamics of the propagator $\Phi$. Using the ergodic hypothesis and assuming a normalized molecule detection function (${V_\mathrm{eff} = \int \!\! d^3r \, \Omega(\mathbf{r}) = 1}$), the concentration computes to $ C = \langle F(t) \rangle / q$. \end{itemize} \end{minipage} %} %} \subsubsection{General autocorrelation function for multiple species} %Most experiments do not only include a single species of fluorescent dye. When considering a three dimensional detection volume with a freely diffusing dye, adding a lipid bilayer with a different fluorescent dye (diffusing in two dimensions inside the bilayer) will result in two distinct contributions to the fluorescence signal, namely 2D diffusion and 3D diffusion. For $n$ different species inside the detection volume, the autocorrelation function becomes: Most experiments include particles with more than one dynamic property. Labeled particles may have different sizes, or the temporal dynamics may include a triplet term. For $n$ different species inside the detection volume, the autocorrelation function becomes: \newline \newline %\fbox{ { \begin{minipage}{\textwidth} %\textbf{General Correlation function for n species:} \begin{equation} G(\tau) = \frac{g(\tau)}{\langle F(t) \rangle^2} = \frac{\sum_{i=1}^n \sum_{j=1}^n g_{ij}(\tau)}{\langle F(t) \rangle^2} \end{equation} \begin{equation} g_{ij}(\tau) = q_i q_j \int \! \mathrm{d}^3 r \int \! \mathrm{d}^3 r' \, \Omega(\mathbf{r})\Phi_{ij}(\mathbf{r}, \mathbf{r'}, \tau) \Omega(\mathbf{r'}) \end{equation} \begin{itemize} \small \item[$g(\tau)$] non-normalized correlation function \item[$g_{ij}(\tau)$] non-normalized cross-correlation between two species $i$ and $j$. For $n$ species, $i,j \in [1,...,n]$. \item[$q_i$] molecular brightness of species $i$ \item[$\Omega$] 3D molecule detection function \item[$\Phi_{ij}$] diffusion propagator computed from species $i$ with species $j$. If species $i$ and $j$ ($i \neq j$) are diffusing independently, $\Phi_{ij}$ is zero. $ C_{ij} \Phi_{ij}(\mathbf{r}, \mathbf{r'}, \tau) = \, \langle \delta c_i(\mathbf{r},0) \delta c_j(\mathbf{r'}, \tau) \rangle $ \item[$C_{ij}$] average concentration of objects following the dynamics of $\Phi_{ij}$. If $i=j$, $C_{ii}=C_i$ is the concentration of the dye $i$. \end{itemize} \end{minipage} %} %} \newline \newline If the propagators $\Phi_{ij}(x,y,z; x',y',z'; \tau)$ and the molecule detection function $\Omega(x,y,z)$ factorize into an axial ($z$) and a lateral ($x,y$) part, so will $g_{ij}(\tau)$: \begin{equation} g_{ij}(\tau) = q_i q_j \cdot g_{ij,z}(\tau) \cdot g_{ij,xy}(\tau) \end{equation} As an example, consider a freely diffusing species $A$ and a laterally diffusing species $B$ inside a membrane at $z = z_0$; then: \begin{eqnarray*} g_{AA}(\tau) = && q_A^2 \cdot g_{AA,z}(\tau) \cdot g_{AA,xy}(\tau) \\ g_{BB}(\tau) = && q_B^2 \cdot g_{BB,z_0}(\tau) \cdot g_{BB,xy}(\tau) \\ g_{AB}(\tau) = g_{BA} (\tau) = && q_A q_B \cdot g_{AB,z}(\tau) \cdot g_{AB,xy}(\tau) \\ g(\tau) = && g_{AA}(\tau) + 2 g_{AB}(\tau) + g_{BB}(\tau) \end{eqnarray*} To obtain the normalized autocorrelation function, the average $\langle F(t) \rangle$ has to be calculated: \begin{eqnarray*} F(t) = && \sum_{i=1}^n F_i(t) \\ F_A(t) = && q_A \int \! \mathrm{d}^3 r \, \Omega(\mathbf{r}) C_A(\mathbf{r}, t) \\ F_B(t) = && q_B \int \! 
\mathrm{d}x \! \int \! \mathrm{d}y \, \Omega(x,y,z=z_0) C_B(x,y, t) \\ \langle F(t) \rangle = && \langle F_A(t) \rangle + \langle F_B(t) \rangle \end{eqnarray*} Note that $C_B$ is a 2D concentration, whereas $C_A$ is a 3D concentration. Since there is no correlation between the two independently diffusing species $A$ and $B$, $g_{AB}(\tau)$ is zero. The normalized autocorrelation curve can now be calculated as: \begin{eqnarray*} G(\tau) = && \frac{g(\tau)}{\langle F(t) \rangle^2} \\ G(\tau) = && \frac{g_{AA}(\tau) + g_{BB}(\tau)}{(\langle F_A(t) \rangle + \langle F_B(t) \rangle)^2} \\ \end{eqnarray*} \subsubsection{Cross-correlation} Cross-correlation is a generalization of autocorrelation. Cross-correlation functions are derived in the same manner as autocorrelation functions. Here, signals recorded in two detection channels are cross-correlated to obtain the correlation function. \begin{equation} G_{XY}(\tau) = \frac{\langle \delta F_X(t) \delta F_Y(t+\tau) \rangle}{\langle F_X(t) \rangle \langle F_Y(t) \rangle} \end{equation} A cross-correlation analysis of two species labeled with two types of dyes, observed in two corresponding detection channels, can be used for binding assays. Only complexes giving simultaneous signal in both channels contribute to the cross-correlation amplitude. Thus, a finite cross-correlation amplitude indicates co-diffusion. \subsubsection{Extension of the theory} By modifying the propagator $\Phi$ and the detection volume $\Omega$, other effects, such as triplet blinking or binding reactions, can be quantified. In many cases, analytical solutions to the above integrals are not straightforward and approximations have to be made. For example, the Gaussian-shaped detection profile in confocal FCS is already an approximation. However, deviations from the true results are considered to be small \cite{Zhang2007}. \hyref{Section}{sec:mdls} introduces several model functions with various detection geometries and particle dynamics. \subsection{Non-linear least-squares fit} \label{cha:PyCorFit_leastsq} PyCorrFit uses the non-linear least-squares fitting capabilities of \texttt{scipy.optimize}. This package utilizes the Levenberg–Marquardt algorithm to minimize the sum of squares. More information on this topic can be obtained from the online documentation of \texttt{leastsq}\footnote{\url{http://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.leastsq.html#scipy.optimize.leastsq}}. One can define a distance $d(G,H)$ between two discrete functions $G$ and $H$ with the discrete domain of definition $\tau_1 \dots \tau_n$ as the sum of squares: \begin{equation} d(G,H) = \sum_{i=1}^n \left[ G(\tau_i) - H(\tau_i) \right]^2 \end{equation} The least-squares method minimizes this distance between the model function $G$ and the experimental values $H$ by varying the $k$ fitting parameters $\alpha_1, \dots, \alpha_k$: \begin{equation} \chi^2 = \min_{\alpha_1, \dots, \alpha_k} \sum_{i=1}^n \left[ G(\tau_i,\alpha_1, \dots, \alpha_k) - H(\tau_i) \right]^2 \end{equation} The minimum distance $\chi^2$ is used to characterize the success of a fit. Note that if the number of fitting parameters $k$ becomes too large, multiple values for $\chi^2$ can be found, depending on the starting values of the $k$ parameters. \subsection{Weighted fitting} In certain cases, it is useful to introduce weights (standard deviations) $\sigma_i$ in the calculation of $\chi^2$; for example, very noisy parts of a correlation curve can otherwise distort the resulting fit. In PyCorrFit, weighting is implemented as follows: \begin{equation} \chi^2_\mathrm{weighted} = \min_{\alpha_1, \dots, \alpha_k} \sum_{i=1}^n \frac{\left[ G(\tau_i,\alpha_1, \dots, \alpha_k) - H(\tau_i) \right]^2}{\sigma_i^2} \end{equation} PyCorrFit is able to calculate the weights $\sigma_i$ from the experimental data. The different approaches to this calculation of weights implemented in PyCorrFit are explained in \hyref{Section}{cha_graphint}. A sketch of such a weighted fit in terms of \texttt{scipy.optimize.leastsq} is given below.
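The following minimal sketch illustrates the procedure with \texttt{scipy.optimize.leastsq}. The model function, the synthetic data and the weights are placeholders for illustration and not part of \textit{PyCorrFit} itself:

\begin{verbatim}
# Weighted least-squares fit with scipy.optimize.leastsq (sketch).
import numpy as np
from scipy.optimize import leastsq

def G_model(parms, tau):
    # simple 2D diffusion model with parameters n and taudiff
    n, taudiff = parms
    return 1.0/n * 1.0/(1.0 + tau/taudiff)

def residuals(parms, tau, G_exp, sigma):
    # leastsq minimizes the sum of squares of this array
    return (G_model(parms, tau) - G_exp) / sigma

tau = np.logspace(-3, 3, 200)                 # lag times [ms]
G_exp = G_model([10.0, 1.0], tau) + 0.005*np.random.randn(200)
sigma = 0.005*np.ones_like(tau)               # per-point std. deviations

p0 = [5.0, 0.5]                               # starting values
popt, ier = leastsq(residuals, p0, args=(tau, G_exp, sigma))
chi2 = np.sum(residuals(popt, tau, G_exp, sigma)**2)
\end{verbatim}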
\input{PyCorrFit_doc_models} pycorrfit-0.8.1/doc-src/PyCorrFit_doc_models.tex0000644000175000017500000004551312262516600020426 0ustar toortoor\subsection{Implemented model functions} \label{sec:mdls} This is an overview of all the model functions that are currently\footnote{\today} implemented in PyCorrFit. To each model a unique model ID is assigned by PyCorrFit. Most of the following information is also accessible from within PyCorrFit using the \textbf{Page info} tool. \subsubsection{Confocal FCS} The confocal detection volume with the structural parameter \begin{align} \mathit{SP}= \frac{z_0}{r_0} \end{align} has an effective size of \begin{align} V = \pi^{3/2} r_0^2 z_0 \end{align} where $r_0$ is its lateral and $z_0$ its axial (in the case of 3D diffusion) extension. Thus, the effective number of particles is defined as \begin{align} N = C V \end{align} with the concentration $C$ given implicitly in the model functions. The diffusion coefficient is calculated from the diffusion time $\tau_\mathrm{diff}$ using \begin{align} D = \frac{1}{4 \tau_\mathrm{diff}} \left( \frac{z_0}{\mathit{SP}} \right)^2 = \frac{r_0^2}{4 \tau_\mathrm{diff}}. \end{align} The parameters in the equation above need to be calibrated to obtain the diffusion coefficient. Usually, a reference dye with a known diffusion coefficient is used to determine the lateral extension of the detection volume $r_0$ with a fixed structural parameter of e.g. $\mathit{SP}=4$.\\ \vspace{2em} % 2D diffusion %\noindent \begin{tabular}{lp{.7\textwidth}} %Name & \textbf{2D (Gauß)} \\ %ID & \textbf{6001} \\ %Descr. & Two-dimensional diffusion with a Gaussian laser profile\cite{Aragon1976, Qian1991, Rigler1993}. \\ %\end{tabular} %\begin{align} %G(\tau) = A_0 + \frac{1}{N} \frac{1}{(1+\tau/\tau_\mathrm{diff})} %\end{align} %\begin{center} %\begin{tabular}{ll} %$A_0$ & Offset \\ %$N$ & Effective number of particles in confocal area \\ %$\tau_\mathrm{diff}$ & Characteristic residence time in confocal area \\ %\end{tabular} \\ %\end{center} %\vspace{2em} % 3D diffusion %\noindent \begin{tabular}{lp{.7\textwidth}} %Name & \textbf{3D (Gauß)} \\ %ID & \textbf{6012} \\ %Descr. & Three-dimensional free diffusion with a Gaussian laser profile (elliptical)\cite{Aragon1976, Qian1991, Rigler1993}. \\ %\end{tabular} %\begin{align} %G(\tau) = A_0 + \frac{1}{N} \frac{1}{(1+\tau/\tau_\mathrm{diff})} \frac{1}{\sqrt{1+\tau/(\mathit{SP}^2 \tau_\mathrm{diff})}} %\end{align} %\begin{center} %\begin{tabular}{ll} %$A_0$ & Offset \\ %$N$ & Effective number of particles in confocal volume \\ %$\tau_\mathrm{diff}$ & Characteristic residence time in confocal volume \\ %$\mathit{SP}$ & Structural parameter, describes elongation of the confocal volume \\ %\end{tabular} %\end{center} %\vspace{2em} % 3D diffusion + triplet \noindent \begin{tabular}{lp{.7\textwidth}} Name & \textbf{Confocal (Gaussian) T+3D} \\ ID & \textbf{6011} \\ Descr. & Three-dimensional free diffusion with a Gaussian laser profile (elliptical), including a triplet component\cite{Widengren1994, Widengren1995, Haupts1998}. 
\\ \end{tabular} \begin{align} G(\tau) = A_0 + \frac{1}{N} \frac{1}{(1+\tau/\tau_\mathrm{diff})} \frac{1}{\sqrt{1+\tau/(\mathit{SP}^2 \tau_\mathrm{diff})}} \left(1 + \frac{T e^{-\tau/\tau_\mathrm{trip}}}{1-T} \right) \end{align} \begin{center} \begin{tabular}{ll} $A_0$ & Offset \\ $N$ & Effective number of particles in confocal volume \\ $\tau_\mathrm{diff}$ & Characteristic residence time in confocal volume \\ $\mathit{SP}$ & Structural parameter, describes elongation of the confocal volume \\ $T$ & Fraction of particles in triplet (non-fluorescent) state\\ $\tau_\mathrm{trip}$ & Characteristic residence time in triplet \\ \end{tabular} \end{center} \vspace{2em} % 3D+3D diffusion + triplett \noindent \begin{tabular}{lp{.7\textwidth}} Name & \textbf{Confocal (Gaussian) T+3D+3D} \\ ID & \textbf{6030} \\ Descr. & Two-component three-dimensional free diffusion with a Gaussian laser profile, including a triplet component\cite{Elson1974, Aragon1976, Palmer1987, Thomps:bookFCS2002}. \\ \end{tabular} \begin{align} G(\tau) &= A_0 + \frac{1}{N (F + \alpha (1-F))²} \left(1 + \frac{T e^{-\tau/\tau_\mathrm{trip}}}{1-T} \right) \times \\ \notag &\times \left[ \frac{F}{(1+\tau/\tau_1)} \frac{1}{\sqrt{1+\tau/(\mathit{SP}^2 \tau_1)}} + \alpha^2 \frac{1-F}{ (1+\tau/\tau_2) } \frac{1}{\sqrt{1+\tau/(\mathit{SP}^2 \tau_2)}} \right] \end{align} \begin{center} \begin{tabular}{ll} $A_0$ & Offset \\ $N$ & Effective number of particles in confocal volume ($N = N_1+N_2$) \\ $\tau_1$ & Diffusion time of particle species 1 \\ $\tau_2$ & Diffusion time of particle species 2 \\ $F$ & Fraction of molecules of species 1 ($N_1 = F N$) \\ $\alpha$ & Relative molecular brightness of particles 1 and 2 ($ \alpha = q_2/q_1$) \\ $\mathit{SP}$ & Structural parameter, describes elongation of the confocal volume \\ $T$ & Fraction of particles in triplet (non-fluorescent) state\\ $\tau_\mathrm{trip}$ & Characteristic residence time in triplet state \\ \end{tabular} \end{center} \vspace{2em} % 2D diffusion + triplett \noindent \begin{tabular}{lp{.7\textwidth}} Name & \textbf{Confocal (Gaussian) T+2D} \\ ID & \textbf{6002} \\ Descr. & Two-dimensional diffusion with a Gaussian laser profile, including a triplet component\cite{Aragon1976, Qian1991, Rigler1993,Widengren1994, Widengren1995, Haupts1998}. \\ \end{tabular} \begin{align} G(\tau) = A_0 + \frac{1}{N} \frac{1}{(1+\tau/\tau_\mathrm{diff})} \left(1 + \frac{T e^{-\tau/\tau_\mathrm{trip}}}{1-T} \right) \end{align} \begin{center} \begin{tabular}{ll} $A_0$ & Offset \\ $N$ & Effective number of particles in confocal area \\ $\tau_\mathrm{diff}$ & Characteristic residence time in confocal area \\ $T$ & Fraction of particles in triplet (non-fluorescent) state\\ $\tau_\mathrm{trip}$ & Characteristic residence time in triplet state \\ \end{tabular} \end{center} \vspace{2em} % 2D+2D diffusion + triplett \noindent \begin{tabular}{lp{.7\textwidth}} Name & \textbf{Confocal (Gaussian) T+2D+2D} \\ ID & \textbf{6031} \\ Descr. & Two-component, two-dimensional diffusion with a Gaussian laser profile, including a triplet component\cite{Elson1974, Aragon1976, Palmer1987, Thomps:bookFCS2002}. 
\\ \end{tabular} \begin{align} G(\tau) = A_0 + \frac{1}{N (F + \alpha (1-F))²} \left[ \frac{F}{1+\tau/\tau_1} + \alpha^2 \frac{1-F}{ 1+\tau/\tau_2 } \right] \left(1 + \frac{T e^{-\tau/\tau_\mathrm{trip}}}{1-T} \right) \end{align} \begin{center} \begin{tabular}{ll} $A_0$ & Offset \\ $N$ & Effective number of particles in confocal area ($N = N_1+N_2$) \\ $\tau_1$ & Diffusion time of particle species 1 \\ $\tau_2$ & Diffusion time of particle species 2 \\ $F$ & Fraction of molecules of species 1 ($N_1 = F N$) \\ $\alpha$ & Relative molecular brightness of particles 1 and 2 ($ \alpha = q_2/q_1$) \\ $T$ & Fraction of particles in triplet (non-fluorescent) state\\ $\tau_\mathrm{trip}$ & Characteristic residence time in triplet state \\ \end{tabular} \end{center} \vspace{2em} % 3D+2D diffusion + triplett \noindent \begin{tabular}{lp{.7\textwidth}} Name & \textbf{Confocal (Gaussian) T+3D+2D} \\ ID & \textbf{6032} \\ Descr. & Two-component, two- and three-dimensional diffusion with a Gaussian laser profile, including a triplet component\cite{Elson1974, Aragon1976, Palmer1987, Thomps:bookFCS2002}. \\ \end{tabular} \begin{align} G(\tau) = A_0 + \frac{1}{N (1 - F + \alpha F)²} \left[ \frac{1-F}{1+\tau/\tau_\mathrm{2D}} + \frac{ \alpha^2 F}{ (1+\tau/\tau_\mathrm{3D}) } \frac{1}{\sqrt{1+\tau/(\mathit{SP}^2 \tau_\mathrm{3D})}} \right] \left(1 + \frac{T e^{-\tau/\tau_\mathrm{trip}}}{1-T} \right) \end{align} \begin{center} \begin{tabular}{ll} $A_0$ & Offset \\ $N$ & Effective number of particles in confocal volume ($N = N_\mathrm{2D}+N_\mathrm{3D}$) \\ $\tau_\mathrm{2D}$ & Diffusion time of surface bound particles \\ $\tau_\mathrm{3D}$ & Diffusion time of freely diffusing particles \\ $F$ & Fraction of molecules of the freely diffusing species ($N_\mathrm{3D} = F N$) \\ $\alpha$ & Relative molecular brightness of particle species ($ \alpha = q_\mathrm{3D}/q_\mathrm{2D}$) \\ $\mathit{SP}$ & Structural parameter, describes elongation of the confocal volume \\ $T$ & Fraction of particles in triplet (non-fluorescent) state\\ $\tau_\mathrm{trip}$ & Characteristic residence time in triplet state \\ \end{tabular} \end{center} \vspace{2em} \subsubsection{Confocal TIR-FCS} The detection volume is axially confined by an evanescent field and has an effective size of \begin{align} V = \pi R_0^2 d_\mathrm{eva} \end{align} where $R_0$ is the lateral extent of the detection volume and $d_\mathrm{eva}$ is the evanescent field depth\footnote{Where the field has decayed to $1/e$}. From the concentration $C$, the effective number of particles is $N=CV$. The decay constant $\kappa$ is the inverse of the depth $d_\mathrm{eva}$ : \begin{align} d_\mathrm{eva} = \frac{1}{\kappa} \end{align} The model functions make use of the Faddeeva function (complex error function)\footnote{In user-defined model functions, the Faddeeva function is accessible through \texttt{wofz()}. For convenience, the function \texttt{wixi()} can be used which only takes $\xi$ as an argument and the imaginary $i$ can be omitted.}: \begin{align} w\!(i\xi) &= e^{\xi^2} \mathrm{erfc}(\xi) \\ \notag &= e^{\xi^2} \cdot \frac{2}{\sqrt{\pi}} \int_\xi^\infty \mathrm{e}^{-\alpha^2} \mathrm{d\alpha} \label{eq:faddeeva} \end{align} The lateral detection area has the same shape as in confocal FCS. Thus, correlation functions for two-dimensional diffusion of the confocal case apply and are not mentioned here. \\ \vspace{2em} % 3D diffusion (Gauß/exp) \noindent \begin{tabular}{lp{.7\textwidth}} Name & \textbf{TIR (Gaussian/Exp.) 3D} \\ ID & \textbf{6013} \\ Descr. 
& Three-dimensional free diffusion with a Gaussian lateral detection profile and an exponentially decaying profile in axial direction\cite{Starr2001, Hassler2005, Ohsugi2006}. \\ \end{tabular} \begin{align} G(\tau) = \frac{1}{C} \frac{ \kappa^2}{ \pi (R_0^2 +4D\tau)} \left( \sqrt{\frac{D \tau}{\pi}} + \frac{1 - 2 D \tau \kappa^2}{2 \kappa} w\!\left(i \sqrt{D \tau} \kappa\right) \right) \end{align} \begin{center} \begin{tabular}{ll} $C$ & Particle concentration in confocal volume \\ $\kappa$ & Evanescent decay constant ($\kappa = 1/d_\mathrm{eva}$)\\ $R_0$ & Lateral extent of the detection volume \\ $D$ & Diffusion coefficient \\ \end{tabular} \end{center} \vspace{2em} % 3D+3D+T diffusion (Gauß/exp) \noindent \begin{tabular}{lp{.7\textwidth}} Name & \textbf{TIR (Gaussian/Exp.) T+3D+3D} \\ ID & \textbf{6034} \\ Descr. & Two-component three-dimensional diffusion with a Gaussian lateral detection profile and an exponentially decaying profile in axial direction, including a triplet component\cite{Starr2001, Hassler2005, Ohsugi2006, Elson1974, Aragon1976, Palmer1987, Thomps:bookFCS2002}. \\ \end{tabular} \begin{align} G(\tau) = &A_0 + \frac{1}{N (1-F + \alpha F)^2} \left(1 + \frac{T e^{-\tau/\tau_\mathrm{trip}}}{1-T} \right) \times \\ \notag \times \Bigg[ \,\, & \frac{F \kappa}{1+ 4 D_1 \tau/R_0^2} \left( \sqrt{\frac{D_1 \tau}{\pi}} + \frac{1 - 2 D_1 \tau \kappa^2}{2 \kappa} w\!\left(i \sqrt{D_1 \tau} \kappa\right) \right) + \\ \notag + & \frac{(1-F) \alpha^2 \kappa}{1+ 4 D_2 \tau/R_0^2} \left( \sqrt{\frac{D_2 \tau}{\pi}} + \frac{1 - 2 D_2 \tau \kappa^2}{2 \kappa} w\!\left(i \sqrt{D_2 \tau} \kappa\right) \right) \,\, \Bigg] \end{align} \begin{center} \begin{tabular}{ll} $A_0$ & Offset \\ $N$ & Effective number of particles in confocal volume ($N = N_1+N_2$) \\ $D_1$ & Diffusion coefficient of species 1 \\ $D_2$ & Diffusion coefficient of species 2 \\ $F$ & Fraction of molecules of species 1 ($N_1 = F N$) \\ $\alpha$ & Relative molecular brightness of particle species ($ \alpha = q_2/q_1$) \\ $R_0$ & Lateral extent of the detection volume \\ $\kappa$ & Evanescent decay constant ($\kappa = 1/d_\mathrm{eva}$)\\ $T$ & Fraction of particles in triplet (non-fluorescent) state\\ $\tau_\mathrm{trip}$ & Characteristic residence time in triplet state \\ \end{tabular} \end{center} \vspace{2em} % 2D+3D+T diffusion (Gauß/exp) \noindent \begin{tabular}{lp{.7\textwidth}} Name & \textbf{TIR (Gaussian/Exp.) T+3D+2D} \\ ID & \textbf{6033} \\ Descr. & Two-component, two- and three-dimensional diffusion with a Gaussian lateral detection profile and an exponentially decaying profile in axial direction, including a triplet component\cite{Starr2001, Hassler2005, Ohsugi2006, Elson1974, Aragon1976, Palmer1987, Thomps:bookFCS2002}. 
\\ \end{tabular} \begin{align} G(\tau) &= A_0 + \frac{1}{N (1-F + \alpha F)^2} \left(1 + \frac{T e^{-\tau/\tau_\mathrm{trip}}}{1-T} \right) \times \\ & \notag \times \left[ \frac{1-F}{1+ 4 D_\mathrm{2D} \tau/R_0^2} + \frac{\alpha^2 F \kappa}{1+ 4 D_\mathrm{3D} \tau/R_0^2} \left( \sqrt{\frac{D_\mathrm{3D} \tau}{\pi}} + \frac{1 - 2 D_\mathrm{3D} \tau \kappa^2}{2 \kappa} w\!\left(i \sqrt{D_\mathrm{3D} \tau} \kappa\right) \right) \right] \end{align} \begin{center} \begin{tabular}{ll} $A_0$ & Offset \\ $N$ & Effective number of particles in confocal volume ($N = N_\mathrm{2D}+N_\mathrm{3D}$) \\ $D_\mathrm{2D}$ & Diffusion coefficient of surface bound particles \\ $D_\mathrm{3D}$ & Diffusion coefficient of freely diffusing particles \\ $F$ & Fraction of molecules of the freely diffusing species ($N_\mathrm{3D} = F N$) \\ $\alpha$ & Relative molecular brightness of particle species ($ \alpha = q_\mathrm{3D}/q_\mathrm{2D}$) \\ $R_0$ & Lateral extent of the detection volume \\ $\kappa$ & Evanescent decay constant ($\kappa = 1/d_\mathrm{eva}$)\\ $T$ & Fraction of particles in triplet (non-fluorescent) state\\ $\tau_\mathrm{trip}$ & Characteristic residence time in triplet state \\ \end{tabular} \end{center} \vspace{2em} \subsubsection{TIR-FCS with a square-shaped lateral detection volume} The detection volume is axially confined by an evanescent field of depth\footnote{Where the field has decayed to $1/e$} $d_\mathrm{eva} = 1 / \kappa$. The lateral detection area is a convolution of the point spread function of the microscope of size $\sigma$, \begin{align} \sigma = \sigma_0 \frac{\lambda}{\mathit{NA}}, \end{align} with a square of side length $a$. The model functions make use of the Faddeeva function (complex error function)\footnote{In user-defined model functions, the Faddeeva function is accessible through \texttt{wofz()}. For convenience, the function \texttt{wixi()} can be used which only takes $\xi$ as an argument and the imaginary $i$ can be omitted.}: \begin{align} w\!(i\xi) &= e^{\xi^2} \mathrm{erfc}(\xi) \\ \notag &= e^{\xi^2} \cdot \frac{2}{\sqrt{\pi}} \int_\xi^\infty \mathrm{e}^{-\alpha^2} \mathrm{d\alpha} \end{align}
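As a side note, this function can be evaluated numerically with \texttt{scipy.special.wofz}; the following sketch mirrors the helper \texttt{evalwixi()} defined in \texttt{usermodel.py}:

\begin{verbatim}
import numpy as np
import scipy.special as sps

def wixi(x):
    # w(i*x) = exp(x**2)*erfc(x), evaluated via the Faddeeva
    # function w(z) = exp(-z**2)*(1 - erf(-i*z)) with z = i*x
    return np.real_if_close(sps.wofz(1j * x))
\end{verbatim}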
\vspace{2em} % 3D TIRF diffusion (□xσ) \noindent \begin{tabular}{lp{.7\textwidth}} Name & \textbf{TIR (□x$\upsigma$/Exp.) 3D} \\ ID & \textbf{6010} \\ Descr. & Three-dimensional diffusion with a square-shaped lateral detection area taking into account the size of the point spread function; and an exponentially decaying profile in axial direction\cite{Ries2008390, Yordanov2011}. \\ \end{tabular} \begin{align} G(\tau) = \frac{\kappa^2}{C} & \left( \sqrt{\frac{D \tau}{\pi}} + \frac{1 - 2 D \tau \kappa^2}{2 \kappa} w\!\left(i \sqrt{D \tau} \kappa\right) \right) \times \\ \notag \times \Bigg[ & \frac{2 \sqrt{\sigma^2+D \tau}}{\sqrt{\pi} a^2} \left( \exp\left(-\frac{a^2}{4(\sigma^2+D \tau)}\right) - 1 \right) + \frac{1}{a} \, \mathrm{erf}\left(\frac{a}{2 \sqrt{\sigma^2+D \tau}}\right) \Bigg]^2 \end{align} \begin{center} \begin{tabular}{ll} $C$ & Particle concentration in detection volume \\ $\sigma$ & Lateral size of the point spread function \\ $a$ & Side length of the square-shaped detection area \\ $\kappa$ & Evanescent decay constant ($\kappa = 1/d_\mathrm{eva}$)\\ $D$ & Diffusion coefficient \\ \end{tabular} \\ \end{center} \vspace{2em} % 3D+3D TIRF diffusion (□xσ) \noindent \begin{tabular}{lp{.7\textwidth}} Name & \textbf{TIR (□x$\upsigma$/Exp.) 3D+3D} \\ ID & \textbf{6023} \\ Descr. 
& Two-component three-dimensional free diffusion with a square-shaped lateral detection area taking into account the size of the point spread function; and an exponentially decaying profile in axial direction. \newline The correlation function is a superposition of three-dimensional model functions of the type \textbf{3D (□x$\upsigma$)} (6010)\cite{Ries2008390, Yordanov2011}. \\ \end{tabular} \vspace{2em} % 2D TIRF diffusion (□xσ) \noindent \begin{tabular}{lp{.7\textwidth}} Name & \textbf{TIR (□x$\upsigma$) 2D} \\ ID & \textbf{6000} \\ Descr. & Two-dimensional diffusion with a square-shaped lateral detection area taking into account the size of the point spread function\cite{Ries2008390, Yordanov2011}\footnote{The reader is made aware that reference \cite{Ries2008390} contains several unfortunate misprints.}. \\ \end{tabular} \begin{align} G(\tau) = \frac{1}{C} \left[ \frac{2 \sqrt{\sigma^2+D \tau}}{\sqrt{\pi} a^2} \left( \exp\left(-\frac{a^2}{4(\sigma^2+D \tau)}\right) - 1 \right) + \frac{1}{a} \, \mathrm{erf}\left(\frac{a}{2 \sqrt{\sigma^2+D \tau}}\right) \right]^2 \end{align} \begin{center} \begin{tabular}{ll} $C$ & Particle concentration in detection area \\ $\sigma$ & Lateral size of the point spread function \\ $a$ & Side length of the square-shaped detection area \\ $D$ & Diffusion coefficient \\ \end{tabular} \\ \end{center} \vspace{2em} % 2D+2D TIRF diffusion (□xσ) \noindent \begin{tabular}{lp{.7\textwidth}} Name & \textbf{TIR (□x$\upsigma$) 2D+2D} \\ ID & \textbf{6022} \\ Descr. & Two-component two-dimensional diffusion with a square-shaped lateral detection area taking into account the size of the point spread function. \newline The correlation function is a superposition of two-dimensional model functions of the type \textbf{2D (□x$\upsigma$)} (6000)\cite{Ries2008390, Yordanov2011}. \\ \end{tabular} \vspace{2em} % 3D+2D TIRF diffusion (□xσ) \noindent \begin{tabular}{lp{.7\textwidth}} Name & \textbf{TIR (□x$\upsigma$/Exp.) 3D+2D} \\ ID & \textbf{6020} \\ Descr. & Two-component two- and three-dimensional diffusion with a square-shaped lateral detection area taking into account the size of the point spread function; and an exponentially decaying profile in axial direction. \newline The correlation function is a superposition of the two-dimensional model function \textbf{2D (□x$\upsigma$)} (6000) and the three-dimensional model function \textbf{3D (□x$\upsigma$)} (6010)\cite{Ries2008390, Yordanov2011}. \end{tabular} \vspace{2em} % 3D+2D+kin TIRF diffusion (□xσ) \noindent \begin{tabular}{lp{.7\textwidth}} Name & \textbf{TIR (□x$\upsigma$/Exp.) 3D+2D+kin} \\ ID & \textbf{6021} \\ Descr. & Two-component two- and three-dimensional diffusion with a square-shaped lateral detection area taking into account the size of the point spread function; and an exponentially decaying profile in axial direction. This model covers binding and unbinding kinetics. \newline The correlation function for this model was introduced in \cite{Ries2008390}. Because approximations are made in the derivation, please verify whether this model is applicable to your problem before using it. 
\end{tabular} \vspace{2em} pycorrfit-0.8.1/bin/0000755000175000017500000000000012262516600013041 5ustar toortoorpycorrfit-0.8.1/bin/pycorrfit0000644000175000017500000000053712262516600015012 0ustar toortoor#!/bin/sh if [ -f "/usr/share/pyshared/pycorrfit/PyCorrFit.py" ] then python /usr/share/pyshared/pycorrfit/PyCorrFit.py elif [ -f /usr/local/lib/python2.7/dist-packages/pycorrfit/PyCorrFit.py ] then python /usr/local/lib/python2.7/dist-packages/pycorrfit/PyCorrFit.py else echo "Could not find PyCorrFit.py. Please notify the author." fi pycorrfit-0.8.1/setup.py0000644000175000017500000000320512262516600014003 0ustar toortoor#!/usr/bin/env python from setuptools import setup, find_packages from os.path import join, dirname, realpath from warnings import warn # The next three lines are necessary for setup.py install to include # ChangeLog and Documentation of PyCorrFit from distutils.command.install import INSTALL_SCHEMES for scheme in INSTALL_SCHEMES.values(): scheme['data'] = scheme['purelib'] # Get the version of PyCorrFit from the Changelog.txt StaticChangeLog = join(dirname(realpath(__file__)), "ChangeLog.txt") try: clfile = open(StaticChangeLog, 'r') version = clfile.readline().strip() clfile.close() except: warn("Could not find 'ChangeLog.txt'. PyCorrFit version is unknown.") version = "0.0.0-unknown" setup( name='pycorrfit', author='Paul Mueller', author_email='paul.mueller@biotec.tu-dresden.de', url='https://github.com/paulmueller/PyCorrFit', version=version, packages=['pycorrfit', 'pycorrfit.models', 'pycorrfit.readfiles', 'pycorrfit.tools'], package_dir={'pycorrfit': 'src', 'pycorrfit.models': 'src/models', 'pycorrfit.readfiles': 'src/readfiles', 'pycorrfit.tools': 'src/tools'}, data_files=[('pycorrfit_doc', ['ChangeLog.txt', 'PyCorrFit_doc.pdf'])], license="GPL v2", long_description=open(join(dirname(__file__), 'README.md')).read(), scripts=['bin/pycorrfit'], include_package_data=True, install_requires=[ "NumPy >= 1.5.1", "SciPy >= 0.8.0", "sympy >= 0.7.2", "PyYAML >= 3.09", "wxPython >= 2.8.10.1", "matplotlib >= 1.1.0"] ) pycorrfit-0.8.1/src/0000755000175000017500000000000012262516600013060 5ustar toortoorpycorrfit-0.8.1/src/usermodel.py0000644000175000017500000002635412262516600015443 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module: user model: When the user wants to use his own functions. We are using sympy as function parser instead of writing our own, which might be safer. We only parse the function with sympy and test it once during import. After that, the function is evaluated using eval()! Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . 
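Typical call sequence from the frontend (illustrative sketch; `parent` is the main PyCorrFit frame and the file name is a placeholder): usermodel = UserModel(parent) usermodel.GetCode("my_model.txt") # read and parse the model file usermodel.TestFunction() # sympy-based sanity check usermodel.ImportModel() # register the model and add a menu entry 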
""" import numpy as np import scipy.special as sps try: import sympy from sympy.core.function import Function from sympy.core import S from sympy import sympify, I from sympy.functions import im except ImportError: print " Warning: module sympy not found!" # Define Function, so PyCorrFit will start, even if sympy is not there. # wixi needs Function. Function = object import wx import models as mdls class CorrFunc(object): """ Check the input code of a proposed user model function and return a function for fitting via GetFunction. """ def __init__(self, labels, values, substitutes, funcstring): self.values = values # a --> a # b [ms] --> b self.variables = list() for item in labels: self.variables.append(item.split(" ")[0].strip()) self.funcstring = funcstring for key in substitutes.keys(): # Don't forget to insert the "(" and ")"'s self.funcstring = self.funcstring.replace(key, "("+substitutes[key]+")") for otherkey in substitutes.keys(): substitutes[otherkey] = substitutes[otherkey].replace(key, "("+substitutes[key]+")") # Convert the function string to a simpification object self.simpification = sympify(self.funcstring, sympyfuncdict) self.simstring = str(self.simpification) self.vardict = evalfuncdict def GetFunction(self): # Define the function that will be calculated later def G(parms, tau): tau = np.atleast_1d(tau) for i in np.arange(len(parms)): self.vardict[self.variables[i]] = float(parms[i]) self.vardict["tau"] = tau # Function called with array/list # The problem here might be #for key in vardict.keys(): # symstring = symstring.replace(key, str(vardict[key])) #symstring = symstring.replace("####", "tau") g = eval(self.funcstring, self.vardict) ## This would be a safer way to do this, but it is too slow! # Once simpy supports arrays, we can use these. # # g = np.zeros(len(tau)) # for i in np.arange(len(tau)): # vardict["tau"] = tau[i] # g[i] = simpification.evalf(subs=vardict) return g return G def TestFunction(self): """ Test the function for parsibility with the given parameters. """ vardict = dict() for i in np.arange(len(self.variables)): vardict[self.variables[i]] = sympify(float(self.values[i])) for tau in np.linspace(0.0001, 10000, 10): vardict["tau"] = tau Number = self.simpification.evalf(subs=vardict) if Number.is_Number is False: raise SyntaxError("Function could not be parsed!") class UserModel(object): """ Class for importing txt files as models into PyCorrFit. """ def __init__(self, parent): " Define all important constants and variables. " # Current ID is the last model ID we gave away. # This will be set using self.SetCurrentID self.CurrentID = None # The file to be opened. This is a full path like # os.path.join(dirname, filename) self.filename = None # Imported models # Modelarray = [model1, model2] self.modelarray = [] # String that contains the executable code self.modelcode = None # Parent is main PyCorrFit program self.parent = parent # The string that identifies the user model menu self.UserStr="User" def GetCode(self, filename=None): """ Get the executable code from the file. Optional argument filename may be used. If not self.filename will be used. This automatically sets self.filename """ if filename is not None: self.filename = filename openedfile = open(self.filename, 'r') code = openedfile.readlines() # File should start with a comment #. # Remove everything before that comment (BOM). 
startfile = code[0].find("#") if startfile != -1: code[0] = code[0][startfile:] else: code[0] = "# "+code[0] # Returncode: True if model was imported, False if there was a problem. # See ModelImported in class CorrFunc self.AddModel(code) openedfile.close() def AddModel(self, code): """ *code* is a list with strings each string is one line. """ # a = 1 # b [ms] = 2.5 # gAlt = 1+tau/b # gProd = a*b # G = 1/gA * gB labels = list() values = list() substitutes = dict() for line in code: # We deal with comments and empty lines # We need to check line length first and then we look for # a hash. line = line.strip() if len(line) != 0 and line[0] != "#": var, val = line.split("=") var = var.strip() if var == "G": # Create a fuction that calculates G funcstring = val.strip() self.FuncClass = CorrFunc(labels, values, substitutes, funcstring) func = self.FuncClass.GetFunction() doc = code[0].strip() # Add whitespaces in model string (looks nicer) for olin in code[1:]: doc = doc + "\n "+olin.strip() func.func_doc = doc elif var[0] == "g": substitutes[var] = val.strip() else: # Add value and variable to our lists labels.append(var) values.append(float(val)) # Active Parameters we are using for the fitting # [0] labels # [1] values # [2] bool values to fit bools = list([False]*len(values)) bools[0] = True # Create Modelarray active_parms = [ labels, values, bools ] self.SetCurrentID() Modelname = code[0][1:].strip() definitions = [self.CurrentID, Modelname, Modelname, func] model = dict() model["Parameters"] = active_parms model["Definitions"] = definitions self.modelarray.append(model) def ImportModel(self): """ Do everything that is necessarry to import the models into PyCorrFit. """ # Set the model ids of the new model(s) # Normally, there is only one model. for i in np.arange(len(self.modelarray)): self.SetCurrentID() self.modelarray[i]["Definitions"][0] = self.CurrentID # We assume that the models have the correct ID for now mdls.AppendNewModel(self.modelarray) # Set variables and models # Is this still necessary? - We are doing this for compatibility! self.parent.value_set = mdls.values self.parent.valuedict = mdls.valuedict self.parent.models = mdls.models self.parent.modeldict = mdls.modeldict self.parent.modeltypes = mdls.modeltypes # Get menu menu = self.parent.modelmenudict[self.UserStr] # Add menu entrys for item in self.modelarray: # Get definitions Defs = item["Definitions"] # This is important if we want to save the session with # the imported model. mdls.modeltypes[self.UserStr].append(Defs[0]) menuentry = menu.Append(Defs[0], Defs[1], Defs[2]) self.parent.Bind(wx.EVT_MENU, self.parent.add_fitting_tab, menuentry) def TestFunction(self): """ Convenience function to test self.FuncClass """ self.FuncClass.TestFunction() def SetCurrentID(self): # Check last item or so of modelarray # Imported functions get IDs starting from 7000 theID = 7000 for model in mdls.models: theID = max(theID, model[0]) self.CurrentID = theID + 1 class wixi(Function): """ This is a ghetto solution for using wofz in sympy. It only returns the real part of the function. I am not sure, if the eval's are placed correctly. I only made it work for my needs. This might be wrong! For true use of wofz, I am not using sympy, anyhow. 
""" nargs = 1 is_real = True @classmethod def eval(csl,arg): return None #def _should_evalf(csl,arg): # return True def as_base_exp(cls): return cls,S.One def _eval_evalf(cls, prec): result = sps.wofz(1j*float(cls.args[0])) return sympy.numbers.Number(sympy.functions.re(result)) def evalwixi(x): """ Complex Error Function (Faddeeva/Voigt). w(i*x) = exp(x**2) * ( 1-erf(x) ) This function is called by other functions within this module. We are using the scipy.special.wofz module which calculates w(z) = exp(-z**2) * ( 1-erf(-iz) ) z = i*x """ z = x*1j result = sps.wofz(z) # We should have a real solution. Make sure nobody complains about # some zero-value imaginary numbers. return np.real_if_close(result) sympyfuncdict = dict() sympyfuncdict["wixi"] = wixi evalfuncdict = dict() evalfuncdict["wixi"] = evalwixi evalfuncdict["I"] = 1j scipyfuncs = ['wofz', 'erf', 'erfc'] numpyfuncs = ['abs', 'arccos', 'arcsin', 'arctan', 'arctan2', 'ceil', 'cos', 'cosh', 'degrees', 'e', 'exp', 'fabs', 'floor', 'fmod', 'frexp', 'hypot', 'ldexp', 'log', 'log10', 'modf', 'pi', 'power', 'radians', 'sin', 'sinh', 'sqrt', 'tan', 'tanh'] for func in scipyfuncs: evalfuncdict[func] = eval("sps."+func) for func in numpyfuncs: evalfuncdict[func] = eval("np."+func) pycorrfit-0.8.1/src/PyCorrFit.py0000755000175000017500000001103312262516600015314 0ustar toortoor#!/usr/bin/python # -*- coding: utf-8 -*- """ PyCorrFit A flexible tool for fitting and analyzing correlation curves. Dimensionless representation: unit of time : 1 ms unit of inverse time: 1000 /s unit of distance : 100 nm unit of Diff.coeff : 10 um^2/s unit of inverse area: 100 /um^2 unit of inv. volume : 1000 /um^3 Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import csv from distutils.version import LooseVersion import sys # Import matplotlib a little earlier. This way some problems with saving # dialogs that are not made by "WXAgg" are solved. ## On Windows XP I had problems with the unicode Characters. # I found this at # http://stackoverflow.com/questions/5419/python-unicode-and-the-windows-console # and it helped (needs to be done before import of matplotlib): import platform if platform.system() == 'Windows': reload(sys) sys.setdefaultencoding('utf-8') import matplotlib # We do catch warnings about performing this before matplotlib.backends stuff #matplotlib.use('WXAgg') # Tells matplotlib to use WxWidgets import warnings with warnings.catch_warnings(): warnings.simplefilter("ignore") matplotlib.use('WXAgg') # Tells matplotlib to use WxWidgets for dialogs import numpy as np # NumPy import os import scipy # A missing import hook prevented us from bundling PyCorrFit on Mac using # pyinstaller. The following imports solved that issue: try: from scipy.sparse.csgraph import shortest_path from scipy.sparse.csgraph import _validation except: pass # Sympy is optional: try: import sympy except ImportError: print "Importing sympy failed! Checking of external model functions" print "will not work!" 
# We create a fake module sympy with a __version__ property. # This way users can run PyCorrFit without having installed sympy. class Fake(object): def __init__(self): self.__version__ = "0.0 unknown" self.version = "0.0 unknown" sympy = Fake() # We must not import wx here. frontend/gui does that. If we do import wx here, # somehow unicode characters will not be displayed correctly on windows. # import wx import yaml ## Continue with the import: import doc import frontend as gui # The actual program def CheckVersion(given, required, name): """ For a given set of versions str *required* and str *given*, where version are usually separated by dots, print whether for the module str *name* the required verion is met or not. """ try: req = LooseVersion(required) giv = LooseVersion(given) except: print " WARNING: Could not verify version of "+name+"." return if req > giv: print " WARNING: You are using "+name+" v. "+given+\ " | Required: "+name+" "+ required else: print " OK: "+name+" v. "+given+" | "+required+" required" ## VERSION version = doc.__version__ __version__ = version print gui.doc.info(version) ## Check important module versions print "\n\nChecking module versions..." CheckVersion(csv.__version__, "1.0", "csv") CheckVersion(np.__version__, "1.5.1", "NumPy") CheckVersion(scipy.__version__, "0.8.0", "SciPy") CheckVersion(sympy.__version__, "0.7.2", "sympy") CheckVersion(gui.wx.__version__, "2.8.10.1", "wxPython") CheckVersion(yaml.__version__, "3.09", "PyYAML") ## Command line ? ## Start gui app = gui.wx.App(False) frame = gui.MyFrame(None, -1, version) # Before starting the main loop, check for possible session files # in the arguments. sysarg = sys.argv for arg in sysarg: if len(arg) >= 18: if arg[-18:] == "fcsfit-session.zip": print "\nLoading Session "+arg frame.OnOpenSession(sessionfile=arg) elif arg[:6] == "python": pass elif arg[-12:] == "PyCorrFit.py": pass else: print "I do not know what to do with this argument: "+arg # Now start the app app.MainLoop() pycorrfit-0.8.1/src/page.py0000644000175000017500000013344612262516600014361 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module frontend The frontend displays the GUI (Graphic User Interface). All functions and modules are called from here. Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ # Use DEMO for contrast-rich screenshots. # This enlarges axis text and draws black lines instead of grey ones. 
DEMO = False import wx # GUI interface wxPython from wx.lib.agw import floatspin # Float numbers in spin fields import wx.lib.plot as plot # Plotting in wxPython import wx.lib.scrolledpanel as scrolled import numpy as np # NumPy import sys # System stuff import edclasses # Cool stuf like better floatspin import leastsquaresfit as fit # For fitting import models as mdls import tools ## On Windows XP I had problems with the unicode Characters. # I found this at # http://stackoverflow.com/questions/5419/python-unicode-and-the-windows-console # and it helped: reload(sys) sys.setdefaultencoding('utf-8') class FittingPanel(wx.Panel): """ Those are the Panels that show the fitting dialogs with the Plots. """ def __init__(self, parent, counter, modelid, active_parms, tau): """ Initialize with given parameters. """ wx.Panel.__init__(self, parent=parent, id=wx.ID_ANY) self.parent = parent self.filename = "None" ## If IsCrossCorrelation is set to True, the trace and traceavg ## variables will not be used. Instead tracecc a list, of traces ## will be used. self.IsCrossCorrelation = False ## Setting up variables for plotting self.trace = None # The intensity trace, tuple self.traceavg = None # Average trace intensity self.tracecc = None # List of traces (in CC mode only) self.bgselected = None # integer, index for parent.Background self.bg2selected = None # integer, index for parent.Background # -> for cross-correlation self.bgcorrect = 1. # Background correction factor for dataexp self.normparm = None # Parameter number used for graph normalization # if greater than number of fitting parms, # then supplementary parm is used. self.normfactor = 1. # Graph normalization factor (e.g. value of n) self.startcrop = None # Where cropping of dataexp starts self.endcrop = None # Where cropping of dataexp ends self.dataexp = None # Experimental data (cropped) self.dataexpfull = None # Experimental data (not cropped) self.datacorr = None # Calculated data self.resid = None # Residuals self.data4weight = None # Data used for weight calculation # Fitting: #self.Fitbox=[ fitbox, weightedfitdrop, fittext, fittext2, fittextvar, # fitspin, buttonfit ] # chi squared - is also an indicator, if something had been fitted self.FitKnots = 5 # number of knots for spline fit or similiars self.chi2 = None self.weighted_fit_was_performed = False # default is no weighting self.weights_used_for_fitting = None # weights used for fitting self.weights_used_for_plotting = None # weights used for plotting self.weights_plot_fill_area = None # weight area in plot self.weighted_fittype_id = None # integer (drop down item) self.weighted_fittype = "Unknown" # type of fit used self.weighted_nuvar = None # bins for std-dev. (left and rigth) # dictionary for alternative variances from e.g. averaging self.external_std_weights = dict() # Errors of fit dictionary self.parmoptim_error = None # A list containing page numbers that share parameters with this page. # This parameter is defined by the global fitting tool and is saved in # sessions. self.GlobalParameterShare = list() # Counts number of Pages already created: self.counter = counter # Has inital plot been performed? # Call PlotAll("init") to set this to true. 
If it is true, then # nothing will be plotted if called with "init" self.InitialPlot = False # Model we are using self.modelid = modelid # modelpack: # [0] labels # [1] values # [2] bool values to fit # [3] labels human readable (optional) # [4] factors human readable (optional) modelpack = mdls.modeldict[modelid] # The string of the model in the menu self.model = modelpack[1] # Some more useless text about the model self.modelname = modelpack[2] # Function for fitting self.active_fct = modelpack[3] # Parameter verification function. # This checks parameters concerning their physical meaningfullness :) self.check_parms_model = mdls.verification[modelid] # active_parameters: # [0] labels # [1] values # [2] bool values to fit self.active_parms = active_parms # Parameter range for fitting (defaults to zero) self.parameter_range = np.zeros((len(active_parms[0]),2)) # Some timescale self.taufull = tau self.tau = 1*self.taufull # Tool statistics uses this list: self.StatisticsCheckboxes = None ### Splitter window # Sizes size = parent.notebook.GetSize() tabsize = 33 size[1] = size[1] - tabsize self.sizepanelx = 270 canvasx = size[0]-self.sizepanelx+5 sizepanel = (self.sizepanelx, size[1]) sizecanvas = (canvasx, size[1]) self.sp = wx.SplitterWindow(self, size=size, style=wx.SP_3DSASH) # This is necessary to prevent "Unsplit" of the SplitterWindow: self.sp.SetMinimumPaneSize(1) ## Settings Section (left side) #self.panelsettings = wx.Panel(self.sp, size=sizepanel) self.panelsettings = scrolled.ScrolledPanel(self.sp, size=sizepanel) self.panelsettings.SetupScrolling(scroll_x=False) ## Setting up Plot (correlation + chi**2) self.spcanvas = wx.SplitterWindow(self.sp, size=sizecanvas, style=wx.SP_3DSASH) # This is necessary to prevent "Unsplit" of the SplitterWindow: self.spcanvas.SetMinimumPaneSize(1) # y difference in pixels between Auocorrelation and Residuals cupsizey = size[1]*4/5 # Calculate initial data self.calculate_corr() # Draw the settings section self.settings() # Upper Plot for plotting of Correlation Function self.canvascorr = plot.PlotCanvas(self.spcanvas) self.canvascorr.setLogScale((True, False)) self.canvascorr.SetEnableZoom(True) self.PlotAll() self.canvascorr.SetSize((canvasx, cupsizey)) # Lower Plot for plotting of the residuals self.canvaserr = plot.PlotCanvas(self.spcanvas) self.canvaserr.setLogScale((True, False)) self.canvaserr.SetEnableZoom(True) self.canvaserr.SetSize((canvasx, size[1]-cupsizey)) self.spcanvas.SplitHorizontally(self.canvascorr, self.canvaserr, cupsizey) self.sp.SplitVertically(self.panelsettings, self.spcanvas, self.sizepanelx) ## Check out the DEMO option and make change the plot: try: if DEMO == True: self.canvascorr.SetFontSizeAxis(16) self.canvaserr.SetFontSizeAxis(16) except: # Don't raise any unnecessary erros pass # Bind resizing to resizing function. wx.EVT_SIZE(self, self.OnSize) def apply_parameters(self, event=None): """ Read the values from the form and write it to the pages parameters. This function is called when the "Apply" button is hit. """ parameters = list() # Read parameters from form and update self.active_parms[1] for i in np.arange(len(self.active_parms[1])): parameters.append(1*self.spincontrol[i].GetValue()) self.active_parms[2][i] = self.checkboxes[i].GetValue() # As of version 0.7.5: we want the units to be displayed # human readable - the way they are displayed # in the Page info tool. 
# Here: Convert human readable units to program internal # units e, self.active_parms[1] = mdls.GetInternalFromHumanReadableParm( self.modelid, np.array(parameters)) self.active_parms[1] = self.check_parms(1*self.active_parms[1]) # Fitting parameters self.weighted_nuvar = self.Fitbox[5].GetValue() self.weighted_fittype_id = self.Fitbox[1].GetSelection() if self.Fitbox[1].GetSelection() == -1: # User edited knot number Knots = self.Fitbox[1].GetValue() Knots = filter(lambda x: x.isdigit(), Knots) if Knots == "": Knots = "5" self.weighted_fittype_id = 1 self.FitKnots = str(Knots) elif self.Fitbox[1].GetSelection() == 1: Knots = self.Fitbox[1].GetValue() Knots = filter(lambda x: x.isdigit(), Knots) self.FitKnots = int(Knots) # If parameters have been changed because of the check_parms # function, write them back. self.apply_parameters_reverse() def apply_parameters_reverse(self, event=None): """ Read the values from the pages parameters and write it to the form. """ # check parameters self.active_parms[1] = self.check_parms(self.active_parms[1]) # # As of version 0.7.5: we want the units to be displayed # human readable - the way they are displayed # in the Page info tool. # # Here: Convert program internal units to # human readable units labels, parameters = \ mdls.GetHumanReadableParms(self.modelid, self.active_parms[1]) # Write parameters to the form on the Page for i in np.arange(len(self.active_parms[1])): self.spincontrol[i].SetValue(parameters[i]) self.checkboxes[i].SetValue(self.active_parms[2][i]) # Fitting parameters self.Fitbox[5].SetValue(self.weighted_nuvar) idf = self.weighted_fittype_id List = self.Fitbox[1].GetItems() List[1] = "Spline ("+str(self.FitKnots)+" knots)" self.Fitbox[1].SetItems(List) self.Fitbox[1].SetSelection(idf) def calculate_corr(self): """ Calculate correlation function Returns an array of tuples (tau, correlation) *self.active_f*: A function that is being calculated using *self.active_parms*: A list of parameters Uses variables: *self.datacorr*: Plotting data (tuples) of the correlation curve *self.dataexp*: Plotting data (tuples) of the experimental curve *self.tau*: "tau"-values for plotting (included) in dataexp. Returns: Nothing. Recalculation of the mentioned global variables is done. """ parameters = self.active_parms[1] # calculate correlation values y = self.active_fct(parameters, self.tau) # Create new plotting data self.datacorr = np.zeros((len(self.tau), 2)) self.datacorr[:, 0] = self.tau self.datacorr[:, 1] = y def check_parms(self, parms): """ Check parameters using self.check_parms_model and the user defined borders for each parameter. """ p = 1.*np.array(parms) p = self.check_parms_model(p) r = self.parameter_range for i in range(len(p)): if r[i][0] == r[i][1]: pass elif p[i] < r[i][0]: p[i] = r[i][0] elif p[i] > r[i][1]: p[i] = r[i][1] return p def crop_data(self): """ Crop the pages data for plotting This will create slices from *self.taufull* and *self.dataexpfull* using the values from *self.startcrop* and *self.endcrop*, creating *self.tau* and *self.dataexp*. """ if self.dataexpfull is not None: if self.startcrop == self.endcrop: # self.bgcorrect is background correction self.dataexp = 1*self.dataexpfull self.taufull = self.dataexpfull[:,0] self.tau = 1*self.taufull self.startcrop = 0 self.endcrop = len(self.taufull) else: self.dataexp = 1*self.dataexpfull[self.startcrop:self.endcrop] self.taufull = self.dataexpfull[:,0] self.tau = 1*self.dataexp[:,0] # If startcrop is larger than the lenght of dataexp, # We will not have an array. 
Prevent that. if len(self.tau) == 0: self.tau = 1*self.taufull self.dataexp = 1*self.dataexpfull try: self.taufull[self.startcrop] self.taufull[self.endcrop-1] except: self.startcrop = 0 self.endcrop = len(self.taufull) self.tau = 1*self.taufull self.dataexp = 1*self.dataexpfull else: # We have to check if the startcrop and endcrop parameters are # inside the taufull array. try: # Raises IndexError if index out of bounds self.taufull[self.startcrop] # Raises TypeError if self.endcrop is not an int. self.taufull[self.endcrop-1] except (IndexError, TypeError): self.tau = 1*self.taufull self.endcrop = len(self.taufull) self.startcrop = 0 else: self.tau = 1*self.taufull[self.startcrop:self.endcrop] ## ## Channel selection ## # Crops the array *self.dataexpfull* from *start* (int) to *end* (int) ## # and assigns the result to *self.dataexp*. If *start* and *end* are ## # equal (or not given), *self.dataexp* will be equal to ## # *self.dataexpfull*. ## self.parent.OnFNBPageChanged(e=None, Page=self) def CorrectDataexp(self, dataexp): """ Background correction Changes *self.bgcorrect*. Overwrites *self.dataexp*. For details see: Thompson, N. Lakowicz, J.; Geddes, C. D. & Lakowicz, J. R. (ed.) Fluorescence Correlation Spectroscopy Topics in Fluorescence Spectroscopy, Springer US, 2002, 1, 337-378 and (for cross-correlation) Weidemann et al. ...? """ # Make a copy. Do not overwrite the original. if dataexp is not None: modified = 1 * dataexp if self.IsCrossCorrelation: # Cross-Correlation if (self.bgselected is not None and self.bg2selected is not None ): if self.tracecc is not None: S = self.tracecc[0][:,1].mean() S2 = self.tracecc[1][:,1].mean() B = self.parent.Background[self.bgselected][0] B2 = self.parent.Background[self.bg2selected][0] self.bgcorrect = (S/(S-B)) * (S2/(S2-B2)) modified[:,1] *= self.bgcorrect else: # Autocorrelation if self.bgselected is not None: # self.bgselected if self.traceavg is not None: S = self.traceavg B = self.parent.Background[self.bgselected][0] # Calculate correction factor self.bgcorrect = (S/(S-B))**2 # self.dataexp should be set, since we have self.trace modified[:,1] *= self.bgcorrect return modified else: return None def Fit_enable_fitting(self): """ Enable the fitting button and the weighted fit control""" #self.Fitbox=[ fitbox, weightedfitdrop, fittext, fittext2, fittextvar, # fitspin, buttonfit ] self.Fitbox[0].Enable() self.Fitbox[1].Enable() self.Fitbox[-1].Enable() def Fit_create_instance(self, noplots=False): """ *noplots* prohibits plotting (e.g. splines) """ ### If you change anything here, make sure you ### take a look at the global fit tool! ## Start fitting class and fill with information. self.apply_parameters() Fitting = fit.Fit() # Verbose mode? if noplots is False: Fitting.verbose = self.parent.MenuVerbose.IsChecked() Fitting.uselatex = self.parent.MenuUseLatex.IsChecked() Fitting.check_parms = self.check_parms Fitting.dataexpfull = self.CorrectDataexp(self.dataexpfull) if self.Fitbox[1].GetSelection() == 1: # Knots = self.Fitbox[1].GetValue() # Knots = filter(lambda x: x.isdigit(), Knots) # self.FitKnots = Knots Fitting.fittype = "spline"+str(self.FitKnots) self.parent.StatusBar.SetStatusText("You can change the number"+ " of knots. Check 'Preference>Verbose Mode' to view the spline.") elif self.Fitbox[1].GetSelection() == 2: Fitting.fittype = "model function" if self is self.parent.notebook.GetCurrentPage(): self.parent.StatusBar.SetStatusText("This is iterative. Press"+ " 'Fit' multiple times. 
If it does not converge, use splines.") elif self.Fitbox[1].GetSelection() > 2: # This means we have some user defined std, for example from # averaging. This std is stored in self.external_std_weights # list, which looks like this: # self.external_std_weights["from average"] = 1D np.array std Fitting.fittype = "other" Fitlist = self.Fitbox[1].GetItems() FitValue = Fitlist[self.Fitbox[1].GetSelection()] Fitting.external_deviations = self.external_std_weights[FitValue] # Fitting will crop the variances according to # the Fitting.interval that we set below. if self is self.parent.notebook.GetCurrentPage(): self.parent.StatusBar.SetStatusText("") else: self.parent.StatusBar.SetStatusText("") Fitting.function = self.active_fct Fitting.interval = [self.startcrop, self.endcrop] Fitting.values = 1*self.active_parms[1] Fitting.valuestofit = 1*self.active_parms[2] Fitting.weights = self.Fitbox[5].GetValue() Fitting.ApplyParameters() # Set weighted_fit_was_performed variables if self.Fitbox[1].GetSelection() == 0: self.weighted_fit_was_performed = False self.weights_used_for_fitting = None self.tauweight = None else: self.weighted_fit_was_performed = True self.weights_used_for_fitting = Fitting.dataweights self.weighted_fittype_id = idf = self.Fitbox[1].GetSelection() self.weighted_fittype = self.Fitbox[1].GetItems()[idf] return Fitting def Fit_function(self, event=None, noplots=False): """ Call the fit function. """ # Make a busy cursor wx.BeginBusyCursor() # Apply parameters # This also applies the background correction, if present self.apply_parameters() # Create instance of fitting class Fitting = self.Fit_create_instance(noplots) # Reset the list of globally shared parameters self.GlobalParameterShare = list() try: Fitting.least_square() except ValueError: # I sometimes had this on Windows. It is caused by fitting to # a .SIN file without selecting proper channels first. print "There was an error fitting. Please make sure that you\n"+\ "are fitting in a proper channel domain." wx.EndBusyCursor() return parms = Fitting.valuesoptim # create an error dictionary p_error = Fitting.parmoptim_error if p_error is None: self.parmoptim_error = None else: self.parmoptim_error = dict() errcount = 0 for i in np.arange(len(parms)): if self.active_parms[2][i]: self.parmoptim_error[self.active_parms[0][i]] = p_error[errcount] errcount += 1 self.chi2 = Fitting.chi for i in np.arange(len(parms)): self.active_parms[1][i] = parms[i] # We need this for plotting self.calculate_corr() self.data4weight = 1.*self.datacorr # Update spin-control values self.apply_parameters_reverse() # Plot everything self.PlotAll() # Return cursor to normal wx.EndBusyCursor() def Fit_WeightedFitCheck(self, event=None): """ Enable or disable variance calculation, depending on the "Weighted Fit" checkbox """ #self.Fitbox=[ fitbox, weightedfitdrop, fittext, fittext2, fittextvar, # fitspin, buttonfit ] weighted = (self.Fitbox[1].GetSelection() != 0) # In the case of "Average" we do not enable the # "Calculation of variance" part. if weighted is True and self.Fitbox[1].GetValue() != "Average": self.Fitbox[2].Enable() self.Fitbox[3].Enable() self.Fitbox[4].Enable() self.Fitbox[5].Enable() else: self.Fitbox[2].Disable() self.Fitbox[3].Disable() self.Fitbox[4].Disable() self.Fitbox[5].Disable() def MakeStaticBoxSizer(self, boxlabel): """ Create a Box with check boxes (fit yes/no) and possibilities to change initial values for fitting.
Parameters: *boxlabel*: The name of the box (is being displayed) *self.active_parms[0]*: A list of things to put into the box Returns: *sizer*: The static Box *check*: The (un)set checkboxes *spin*: The spin text fields """ box = wx.StaticBox(self.panelsettings, label=boxlabel) sizer = wx.StaticBoxSizer(box, wx.VERTICAL) check = list() spin = list() # # As of version 0.7.5: we want the units to be displayed # human readable - the way they are displayed # in the Page info tool. # labels, parameters = mdls.GetHumanReadableParms(self.modelid, self.active_parms[1]) for label in labels: sizerh = wx.BoxSizer(wx.HORIZONTAL) checkbox = wx.CheckBox(self.panelsettings, label=label) # We needed to "from wx.lib.agw import floatspin" to get this: spinctrl = edclasses.FloatSpin(self.panelsettings, digits=10, increment=.01) sizerh.Add(spinctrl) sizerh.Add(checkbox) sizer.Add(sizerh) # Put everything into lists to be able to refer to it later check.append(checkbox) spin.append(spinctrl) return sizer, check, spin def OnAmplitudeCheck(self, event=None): """ Enable/Disable BG rate text line. New feature introduced in 0.7.8 """ ## Normalization to a certain parameter in plots # Find all parameters that start with an "N" # ? and "C" ? # Create List normlist = list() normlist.append("None") ## Add parameters parameterlist = list() for i in np.arange(len(self.active_parms[0])): label = self.active_parms[0][i] if label[0] == "n" or label[0] == "N": normlist.append("*"+label) parameterlist.append(i) ## Add supplementary parameters # Get them from models supplement = mdls.GetMoreInfo(self.modelid, self) if supplement is not None: for i in np.arange(len(supplement)): label = supplement[i][0] if label[0] == "n" or label[0] == "N": normlist.append("*"+label) # Add the id of the supplement starting at the # number of fitting parameters of current page. parameterlist.append(i+len(self.active_parms[0])) normsel = self.AmplitudeInfo[2].GetSelection() if event == "init": # Read everything from the page not from the panel # self.normparm was set and we need to set # self.normfactor # self.AmplitudeInfo[2] if self.normparm is not None: if self.normparm < len(self.active_parms[1]): # use fitting parameter from page self.normfactor = self.active_parms[1][self.normparm] else: # use supplementary parameter supnum = self.normparm - len(self.active_parms[1]) self.normfactor = supplement[supnum][1] # Set initial selection for j in np.arange(len(parameterlist)): if parameterlist[j] == self.normparm: normsel = j+1 else: self.normfactor = 1. normsel = 0 else: if normsel > 0: # Make sure we are not normalizing with a background # Use the parameter id from the internal parameterlist parameterid = parameterlist[normsel-1] if parameterid < len(self.active_parms[1]): # fitting parameter self.normfactor = self.active_parms[1][parameterid] else: # supplementary parameter supnum = parameterid - len(self.active_parms[1]) self.normfactor = supplement[supnum][1] #### supplement are somehow sorted !!!! # For parameter export: self.normparm = parameterid # No internal parameters will be changed # Only the plotting else: self.normfactor = 1. 
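# A short sketch of the lookup above, assuming a hypothetical page
# with two fitting parameters and one supplementary parameter
# (names and values are illustrative):
#
#     fit_values = [4.0, 0.2]        # e.g. n and taudiff
#     supplement = [("n1", 3.0)]     # as returned by mdls.GetMoreInfo
#     parameterid = 2                # ids count fitting parameters first
#     if parameterid < len(fit_values):
#         normfactor = fit_values[parameterid]
#     else:
#         normfactor = supplement[parameterid - len(fit_values)][1]
#     # -> normfactor == 3.0, taken from the supplementary parameter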
normsel = 0 # For parameter export self.normparm = None if len(parameterlist) > 0: self.AmplitudeInfo[2].Enable() self.AmplitudeInfo[3].Enable() else: self.AmplitudeInfo[2].Disable() self.AmplitudeInfo[3].Disable() # Set dropdown values self.AmplitudeInfo[2].SetItems(normlist) self.AmplitudeInfo[2].SetSelection(normsel) ## Plot intensities # Quick reminder: #self.AmplitudeInfo = [ [intlabel1, intlabel2], # [bgspin1, bgspin2], # normtoNDropdown, textnor] # Signal if self.IsCrossCorrelation: if self.tracecc is not None: S1 = self.tracecc[0][:,1].mean() S2 = self.tracecc[1][:,1].mean() self.AmplitudeInfo[0][0].SetValue("{:.4f}".format(S1)) self.AmplitudeInfo[0][1].SetValue("{:.4f}".format(S2)) else: self.AmplitudeInfo[0][0].SetValue("{:.4f}".format(0)) self.AmplitudeInfo[0][1].SetValue("{:.4f}".format(0)) else: if self.traceavg is not None: self.AmplitudeInfo[0][0].SetValue("{:.4f}".format( self.traceavg)) else: self.AmplitudeInfo[0][0].SetValue("{:.4f}".format(0)) self.AmplitudeInfo[0][1].SetValue("{:.4f}".format(0)) # Background ## self.parent.Background[self.bgselected][i] ## [0] average signal [kHz] ## [1] signal name (edited by user) ## [2] signal trace (tuple) ([ms], [kHz]) if self.bgselected is not None: self.AmplitudeInfo[1][0].SetValue( self.parent.Background[self.bgselected][0]) else: self.AmplitudeInfo[1][0].SetValue(0) if self.bg2selected is not None and self.IsCrossCorrelation: self.AmplitudeInfo[1][1].SetValue( self.parent.Background[self.bg2selected][0]) else: self.AmplitudeInfo[1][1].SetValue(0) # Disable the second line in amplitude correction, if we have # autocorrelation only. boolval = self.IsCrossCorrelation for item in self.WXAmplitudeCCOnlyStuff: item.Enable(boolval) def OnBGSpinChanged(self, e): """ Calls tools.background.ApplyAutomaticBackground to update background information """ # Quick reminder: #self.AmplitudeInfo = [ [intlabel1, intlabel2], # [bgspin1, bgspin2], # normtoNDropdown, textnor] if self.IsCrossCorrelation: # update both self.bgselected and self.bg2selected bg = [self.AmplitudeInfo[1][0].GetValue(), self.AmplitudeInfo[1][1].GetValue()] tools.background.ApplyAutomaticBackground(self, bg, self.parent) else: # Only update self.bgselected bg = self.AmplitudeInfo[1][0].GetValue() tools.background.ApplyAutomaticBackground(self, bg, self.parent) def OnTitleChanged(self, e): pid = self.parent.notebook.GetPageIndex(self) if self.tabtitle.GetValue() == "": text = self.counter + mdls.modeldict[self.modelid][1] else: # How many characters of the page title should be displayed # in the tab? We choose 9: AC1-012 plus 2 whitespaces text = self.counter + self.tabtitle.GetValue()[-9:] self.parent.notebook.SetPageText(pid,text) def OnSetRange(self, e): """ Open a new Frame where the parameter range can be set. Rewrites self.parameter_range Parameter ranges are treated like parameters: They are saved in sessions and applied in batch mode. """ # We write a separate tool for that. # This tool does not show up in the Tools menu. if self.parent.RangeSelector is None: self.parent.RangeSelector = tools.RangeSelector(self) self.parent.RangeSelector.Bind(wx.EVT_CLOSE, self.parent.RangeSelector.OnClose) else: try: self.parent.RangeSelector.OnClose() except: pass self.parent.RangeSelector = None def OnSize(self, event): """ Resize the fitting panel when the window is resized.
""" size = self.parent.notebook.GetSize() tabsize = 33 size[1] = size[1] - tabsize self.sp.SetSize(size) def PlotAll(self, event=None): """ This function plots the whole correlation and residuals canvas. We do: - Channel selection - Background correction - Apply Parameters (separate function) - Drawing of plots """ if event == "init": # We use this to have the page plotted at least once before # readout of parameters (e.g. startcrop, endcrop) # This is a performence tweak. self.crop_data() if self.InitialPlot is True: return else: self.InitialPlot = True ## Enable/Disable, set values frontend normalization self.OnAmplitudeCheck() self.crop_data() ## Calculate trace average if self.trace is not None: # Average of the current pages trace self.traceavg = self.trace[:,1].mean() # Perform Background correction self.dataexp = self.CorrectDataexp(self.dataexp) ## Apply parameters self.apply_parameters() # Calculate correlation function from parameters self.calculate_corr() ## Drawing of correlation plot # Plots self.dataexp and the calcualted correlation function # self.datacorr into the upper canvas. # Create a line @ y=zero: zerostart = self.tau[0] zeroend = self.tau[-1] datazero = [[zerostart, 0], [zeroend,0]] ## Check out the DEMO option and make change the plot: try: if DEMO == True: width = 4 colexp = "black" colfit = "red" else: width = 1 colexp = "grey" colfit = "blue" except: # Don't raise any unnecessary erros width = 1 colexp = "grey" colfit = "blue" colweight = "cyan" lines = list() linezero = plot.PolyLine(datazero, colour='orange', width=width) lines.append(linezero) if self.dataexp is not None: if self.weighted_fit_was_performed == True and \ self.weights_used_for_fitting is not None and \ self.parent.MenuShowWeights.IsChecked() and \ self.data4weight is not None: # Add the weights to the graph. # This is done by drawing two lines. w = 1*self.data4weight w1 = 1*w w2 = 1*w w1[:, 1] = w[:, 1] + self.weights_used_for_fitting w2[:, 1] = w[:, 1] - self.weights_used_for_fitting wend = 1*self.weights_used_for_fitting # crop w1 and w2 if self.dataexp does not include all # data points. 
if np.all(w[:,0] == self.dataexp[:,0]): pass else: start = np.min(self.dataexp[:,0]) end = np.max(self.dataexp[:,0]) idstart = np.argwhere(w[:,0]==start) idend = np.argwhere(w[:,0]==end) if len(idend) == 0: # dataexp is longer, do not change anything pass else: w1 = w1[:idend[0][0]+1] w2 = w2[:idend[0][0]+1] wend = wend[:idend[0][0]+1] if len(idstart) == 0: # dataexp starts earlier, do not change anything pass else: w1 = w1[idstart[0][0]:] w2 = w2[idstart[0][0]:] wend = wend[idstart[0][0]:] ## Normalization with self.normfactor w1[:,1] *= self.normfactor w2[:,1] *= self.normfactor self.weights_used_for_plotting = wend self.weights_plot_fill_area = [w1,w2] lineweight1 = plot.PolyLine(w1, legend='', colour=colweight, width=width) lines.append(lineweight1) lineweight2 = plot.PolyLine(w2, legend='', colour=colweight, width=width) lines.append(lineweight2) ## Plot Correlation curves # Plot both, experimental and calculated data # Normalization with self.normfactor, new feature in 0.7.8 datacorr_norm = 1*self.datacorr datacorr_norm[:,1] *= self.normfactor dataexp_norm = 1*self.dataexp dataexp_norm[:,1] *= self.normfactor linecorr = plot.PolyLine(datacorr_norm, legend='', colour=colfit, width=width) lineexp = plot.PolyLine(dataexp_norm, legend='', colour=colexp, width=width) # Draw linezero first, so it is in the background lines.append(lineexp) lines.append(linecorr) PlotCorr = plot.PlotGraphics(lines, xLabel=u'lag time τ [ms]', yLabel=u'G(τ)') self.canvascorr.Draw(PlotCorr) ## Calculate residuals self.resid = np.zeros((len(self.tau), 2)) self.resid[:, 0] = self.tau self.resid[:, 1] = self.dataexp[:, 1] - self.datacorr[:, 1] # Plot residuals # Normalization with self.normfactor, new feature in 0.7.8 resid_norm = np.zeros((len(self.tau), 2)) resid_norm[:, 0] = self.tau resid_norm[:, 1] = dataexp_norm[:, 1] - datacorr_norm[:, 1] lineres = plot.PolyLine(resid_norm, legend='', colour=colfit, width=width) # residuals or weighted residuals? if self.weighted_fit_was_performed: yLabelRes = "weighted \nresiduals" else: yLabelRes = "residuals" PlotRes = plot.PlotGraphics([linezero, lineres], xLabel=u'lag time τ [ms]', yLabel=yLabelRes) self.canvaserr.Draw(PlotRes) else: # Amplitude normalization, new feature in 0.7.8 datacorr_norm = 1*self.datacorr datacorr_norm[:,1] *= self.normfactor linecorr = plot.PolyLine(datacorr_norm, legend='', colour='blue', width=1) PlotCorr = plot.PlotGraphics([linezero, linecorr], xLabel=u'Lag time τ [ms]', yLabel=u'G(τ)') self.canvascorr.Draw(PlotCorr) self.parent.OnFNBPageChanged() def settings(self): """ Here we define, what should be displayed at the left side of the fitting page/tab. Parameters: """ horizontalsize = self.sizepanelx-10 # Title # Create empty tab title mddat = mdls.modeldict[self.modelid] modelshort = mdls.GetModelType(self.modelid) titlelabel = "Data set ({} {})".format(modelshort, mddat[1]) boxti = wx.StaticBox(self.panelsettings, label=titlelabel) sizerti = wx.StaticBoxSizer(boxti, wx.VERTICAL) sizerti.SetMinSize((horizontalsize, -1)) self.tabtitle = wx.TextCtrl(self.panelsettings, value="", size=(horizontalsize-20, -1)) self.Bind(wx.EVT_TEXT, self.OnTitleChanged, self.tabtitle) sizerti.Add(self.tabtitle) # Create StaticBoxSizer box1, check, spin = self.MakeStaticBoxSizer("Model parameters") # Make the check boxes and spin-controls available everywhere self.checkboxes = check self.spincontrol = spin # # As of version 0.7.5: we want the units to be displayed # human readable - the way they are displayed # in the Page info tool. 
# labels, parameters = mdls.GetHumanReadableParms(self.modelid, self.active_parms[1]) parameterstofit = self.active_parms[2] # Set initial values given by user/programmer for Diffusion Model for i in np.arange(len(labels)): self.checkboxes[i].SetValue(parameterstofit[i]) self.spincontrol[i].SetValue(parameters[i]) self.spincontrol[i].increment() # Put everything together self.panelsettings.sizer = wx.BoxSizer(wx.VERTICAL) self.panelsettings.sizer.Add(sizerti) self.panelsettings.sizer.Add(box1) # Add button "Apply" and "Set range" horzs = wx.BoxSizer(wx.HORIZONTAL) buttonapply = wx.Button(self.panelsettings, label="Apply") self.Bind(wx.EVT_BUTTON, self.PlotAll, buttonapply) horzs.Add(buttonapply) buttonrange = wx.Button(self.panelsettings, label="Set range") self.Bind(wx.EVT_BUTTON, self.OnSetRange, buttonrange) horzs.Add(buttonrange) box1.Add(horzs) # Set horizontal size box1.SetMinSize((horizontalsize, -1)) ## More info normbox = wx.StaticBox(self.panelsettings, label="Amplitude corrections") miscsizer = wx.StaticBoxSizer(normbox, wx.VERTICAL) miscsizer.SetMinSize((horizontalsize, -1)) # Intensities and Background sizeint = wx.FlexGridSizer(rows=3, cols=3, vgap=5, hgap=5) sizeint.Add(wx.StaticText(self.panelsettings, label="[kHz]")) sizeint.Add(wx.StaticText(self.panelsettings, label="Intensity")) sizeint.Add(wx.StaticText(self.panelsettings, label="Background")) sizeint.Add(wx.StaticText(self.panelsettings, label="Ch1")) intlabel1 = wx.TextCtrl(self.panelsettings) bgspin1 = floatspin.FloatSpin(self.panelsettings, increment=0.01, digits=4, min_val=0) self.Bind(floatspin.EVT_FLOATSPIN, self.OnBGSpinChanged, bgspin1) sizeint.Add(intlabel1) intlabel1.SetEditable(False) sizeint.Add(bgspin1) chtext2 = wx.StaticText(self.panelsettings, label="Ch2") sizeint.Add(chtext2) intlabel2 = wx.TextCtrl(self.panelsettings) intlabel2.SetEditable(False) bgspin2 = floatspin.FloatSpin(self.panelsettings, increment=0.01, digits=4, min_val=0) self.Bind(floatspin.EVT_FLOATSPIN, self.OnBGSpinChanged, bgspin2) sizeint.Add(intlabel2) sizeint.Add(bgspin2) miscsizer.Add(sizeint) ## Normalize to n? textnor = wx.StaticText(self.panelsettings, label="Plot normalization") miscsizer.Add(textnor) normtoNDropdown = wx.ComboBox(self.panelsettings) self.Bind(wx.EVT_COMBOBOX, self.PlotAll, normtoNDropdown) miscsizer.Add(normtoNDropdown) self.AmplitudeInfo = [ [intlabel1, intlabel2], [bgspin1, bgspin2], normtoNDropdown, textnor] self.WXAmplitudeCCOnlyStuff = [chtext2, intlabel2, bgspin2] self.panelsettings.sizer.Add(miscsizer) ## Add fitting Box fitbox = wx.StaticBox(self.panelsettings, label="Fitting options") fitsizer = wx.StaticBoxSizer(fitbox, wx.VERTICAL) fitsizer.SetMinSize((horizontalsize, -1)) # Add a checkbox for weighted fitting weightedfitdrop = wx.ComboBox(self.panelsettings) self.weightlist = ["No weights", "Spline (5 knots)", "Model function"] weightedfitdrop.SetItems(self.weightlist) weightedfitdrop.SetSelection(0) fitsizer.Add(weightedfitdrop) # WeightedFitCheck() Enables or Disables the variance part weightedfitdrop.Bind(wx.EVT_COMBOBOX, self.Fit_WeightedFitCheck) # Add the variance part. # In order to do a weighted fit, we need to calculate the variance # at each point of the experimental data array. # In order to do that, we need to know how many data points from left # and right of the interesting data point we want to include in that # calculation. 
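# A sketch of what "variance from 2j+1 data points" means, under the
# assumption that the deviation at point i is estimated from the
# points i-j .. i+j (this mirrors the description above, not the
# exact code of the fitting module):
#
#     import numpy as np
#     data = np.sin(np.linspace(0, 1, 100))   # illustrative curve
#     j = 3
#     stds = [np.std(data[i-j:i+j+1]) for i in range(j, len(data)-j)]
#     # stds[k] is the local standard deviation around point k+j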
fittext = wx.StaticText(self.panelsettings, label="Calculation of the variance") fitsizer.Add(fittext) fittext2 = wx.StaticText(self.panelsettings, label="from 2j+1 data points") fitsizer.Add(fittext2) fitsizerspin = wx.BoxSizer(wx.HORIZONTAL) fittextvar = wx.StaticText(self.panelsettings, label="j = ") fitspin = wx.SpinCtrl(self.panelsettings, -1, initial=3, min=1, max=100) fitsizerspin.Add(fittextvar) fitsizerspin.Add(fitspin) fitsizer.Add(fitsizerspin) # Add button "Fit" buttonfit = wx.Button(self.panelsettings, label="Fit") self.Bind(wx.EVT_BUTTON, self.Fit_function, buttonfit) fitsizer.Add(buttonfit) self.panelsettings.sizer.Add(fitsizer) # Squeeze everything into the sizer self.panelsettings.SetSizer(self.panelsettings.sizer) # This is also necessary in Windows self.panelsettings.Layout() self.panelsettings.Show() # Make all the stuff available for everyone self.Fitbox = [ fitbox, weightedfitdrop, fittext, fittext2, fittextvar, fitspin, buttonfit ] # Disable Fitting since no data has been loaded yet for element in self.Fitbox: element.Disable() x = self.panelsettings.GetSize()[0] y = self.parent.GetSize()[1] - 33 self.parent.SetSize((x,y)) self.parent.Layout() pycorrfit-0.8.1/src/openfile.py0000644000175000017500000010364112262516600015240 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module openfile This file is used to define operations on how to open some files. Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import csv from distutils.version import LooseVersion # For version checking import numpy as np import os import shutil import tempfile import wx import yaml import zipfile import doc import edclasses from tools import info # These imports are required for loading data from readfiles import Filetypes from readfiles import BGFiletypes def ImportParametersYaml(parent, dirname): """ Import the parameters from a parameters.yaml file from an PyCorrFit session. """ dlg = wx.FileDialog(parent, "Open session file", dirname, "", "*.fcsfit-session.zip", wx.OPEN) # user cannot do anything until he clicks "OK" if dlg.ShowModal() == wx.ID_OK: path = dlg.GetPath() # Workaround since 0.7.5 (dirname, filename) = os.path.split(path) #filename = dlg.GetFilename() #dirname = dlg.GetDirectory() dlg.Destroy() Arc = zipfile.ZipFile(os.path.join(dirname, filename), mode='r') # Get the yaml parms dump: yamlfile = Arc.open("Parameters.yaml") # Parms: Fitting and drawing parameters of correlation curve # The *yamlfile* is responsible for the order of the Pages #i. Parms = yaml.safe_load(yamlfile) yamlfile.close() Arc.close() return Parms, dirname, filename else: dirname=dlg.GetDirectory() return None, dirname, None def OpenSession(parent, dirname, sessionfile=None): """ Load a whole session that has previously been saved by PyCorrFit. 
Infodict may contain the following keys: "Backgrounds", list: contains the backgrounds "Comments", dict: "Session" comment and int keys to Page titles "Correlations", dict: page numbers, all correlation curves "External Functions", dict: modelids to external model functions "External Weights", dict: page numbers, external weights for fitting "Parameters", dict: page numbers, all parameters of the pages "Preferences", dict: not used yet "Traces", dict: page numbers, all traces of the pages """ Infodict = dict() fcsfitwildcard = ".fcsfit-session.zip" if sessionfile is None: dlg = wx.FileDialog(parent, "Open session file", dirname, "", "*"+fcsfitwildcard, wx.OPEN) # user cannot do anything until he clicks "OK" if dlg.ShowModal() == wx.ID_OK: path = dlg.GetPath() # Workaround since 0.7.5 (dirname, filename) = os.path.split(path) #filename = dlg.GetFilename() #dirname = dlg.GetDirectory() dlg.Destroy() else: # User did not press OK # stop this function dirname = dlg.GetDirectory() dlg.Destroy() return None, dirname, None else: (dirname, filename) = os.path.split(sessionfile) path = sessionfile # Workaround since 0.7.5 if filename[-19:] != fcsfitwildcard: # User specified wrong file print "Unknown file extension: "+filename # stop this function; no dialog exists in this branch return None, dirname, None Arc = zipfile.ZipFile(path, mode='r') try: ## Check PyCorrFit version: readmefile = Arc.open("Readme.txt") # e.g. "This file was created using PyCorrFit version 0.7.6" identifier = readmefile.readline() arcv = LooseVersion(identifier[46:].strip()) thisv = LooseVersion(parent.version.strip()) if arcv > thisv: errstring = "Your version of PyCorrFit ("+str(thisv)+")"+\ " is too old to open this session ("+\ str(arcv).strip()+").\n"+\ "Please download the latest version of "+\ "PyCorrFit from \n"+doc.HomePage+".\n"+\ "Continue opening this session?" dlg = edclasses.MyOKAbortDialog(parent, errstring, "Warning") returns = dlg.ShowModal() if returns == wx.ID_OK: dlg.Destroy() else: dlg.Destroy() return None, dirname, None except: pass # Get the yaml parms dump: yamlfile = Arc.open("Parameters.yaml") # Parameters: Fitting and drawing parameters of correlation curve # The *yamlfile* is responsible for the order of the Pages #i.
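# Sketch of how a page counter such as "#12: " is mapped back to an
# integer page id further below (input string is illustrative):
#
#     number = "#12: ".strip().strip(":").strip("#")   # -> "12"
#     pageid = int(number)                             # -> 12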
Infodict["Parameters"] = yaml.safe_load(yamlfile) yamlfile.close() # Supplementary data (errors of fit) supname = "Supplements.yaml" try: Arc.getinfo(supname) except: pass else: supfile = Arc.open(supname) supdata = yaml.safe_load(supfile) Infodict["Supplements"] = dict() for idp in supdata: Infodict["Supplements"][idp[0]] = dict() Infodict["Supplements"][idp[0]]["FitErr"] = idp[1] if len(idp) > 2: # As of version 0.7.4 we save chi2 and shared pages -global fit Infodict["Supplements"][idp[0]]["Chi sq"] = idp[2] Infodict["Supplements"][idp[0]]["Global Share"] = idp[3] ## Preferences: Reserved for a future version of PyCorrFit :) prefname = "Preferences.yaml" try: Arc.getinfo(prefname) except KeyError: pass else: yamlpref = Arc.open(prefname) Infodict["Preferences"] = yaml.safe_load(yamlpref) yamlpref.close() # Get external functions Infodict["External Functions"] = dict() key = 7001 while key <= 7999: # (There should not be more than 1000 functions) funcfilename = "model_"+str(key)+".txt" try: Arc.getinfo(funcfilename) except KeyError: # No more functions to import key = 8000 else: funcfile = Arc.open(funcfilename) Infodict["External Functions"][key] = funcfile.read() funcfile.close() key=key+1 # Get the correlation arrays Infodict["Correlations"] = dict() for i in np.arange(len(Infodict["Parameters"])): # The *number* is used to identify the correct file number = str(Infodict["Parameters"][i][0]).strip().strip(":").strip("#") pageid = int(number) expfilename = "data"+number+".csv" expfile = Arc.open(expfilename, 'r') readdata = csv.reader(expfile, delimiter=',') dataexp = list() tau = list() if str(readdata.next()[0]) == "# tau only": for row in readdata: # Exclude commentaries if (str(row[0])[0:1] != '#'): tau.append(float(row[0])) tau = np.array(tau) dataexp = None else: for row in readdata: # Exclude commentaries if (str(row[0])[0:1] != '#'): dataexp.append((float(row[0]), float(row[1]))) dataexp = np.array(dataexp) tau = dataexp[:,0] Infodict["Correlations"][pageid] = [tau, dataexp] del readdata expfile.close() # Get the Traces Infodict["Traces"] = dict() for i in np.arange(len(Infodict["Parameters"])): # The *number* is used to identify the correct file number = str(Infodict["Parameters"][i][0]).strip().strip(":").strip("#") pageid = int(number) # Find out, if we have a cross correlation data type IsCross = False try: IsCross = Infodict["Parameters"][i][7] except IndexError: # No Cross correlation pass if IsCross is False: tracefilenames = ["trace"+number+".csv"] else: # Cross correlation uses two traces tracefilenames = ["trace"+number+"A.csv", "trace"+number+"B.csv" ] thistrace = list() for tracefilename in tracefilenames: try: Arc.getinfo(tracefilename) except KeyError: pass else: tracefile = Arc.open(tracefilename, 'r') traceread = csv.reader(tracefile, delimiter=',') singletrace = list() for row in traceread: # Exclude commentaries if (str(row[0])[0:1] != '#'): singletrace.append((float(row[0]), float(row[1]))) singletrace = np.array(singletrace) thistrace.append(singletrace) del traceread del singletrace tracefile.close() if len(thistrace) != 0: Infodict["Traces"][pageid] = thistrace else: Infodict["Traces"][pageid] = None # Get the comments, if they exist commentfilename = "comments.txt" try: # Raises KeyError, if file is not present: Arc.getinfo(commentfilename) except KeyError: pass else: # Open the file commentfile = Arc.open(commentfilename, 'r') Infodict["Comments"] = dict() for i in np.arange(len(Infodict["Parameters"])): number = 
str(Infodict["Parameters"][i][0]).strip().strip(":").strip("#") pageid = int(number) # Strip line ending characters for all the Pages. Infodict["Comments"][pageid] = commentfile.readline().strip() # Now Add the Session Comment (the rest of the file). ComList = commentfile.readlines() Infodict["Comments"]["Session"] = '' for line in ComList: Infodict["Comments"]["Session"] += line commentfile.close() # Get the Backgroundtraces and data if they exist bgfilename = "backgrounds.csv" try: # Raises KeyError, if file is not present: Arc.getinfo(bgfilename) except KeyError: pass else: # Open the file Infodict["Backgrounds"] = list() bgfile = Arc.open(bgfilename, 'r') bgread = csv.reader(bgfile, delimiter='\t') i = 0 for bgrow in bgread: bgtracefilename = "bg_trace"+str(i)+".csv" bgtracefile = Arc.open(bgtracefilename, 'r') bgtraceread = csv.reader(bgtracefile, delimiter=',') bgtrace = list() for row in bgtraceread: # Exclude commentaries if (str(row[0])[0:1] != '#'): bgtrace.append((np.float(row[0]), np.float(row[1]))) bgtrace = np.array(bgtrace) Infodict["Backgrounds"].append([np.float(bgrow[0]), str(bgrow[1]), bgtrace]) i = i + 1 bgfile.close() # Get external weights if they exist WeightsFilename = "externalweights.txt" try: # Raises KeyError, if file is not present: Arc.getinfo(WeightsFilename) except: pass else: Wfile = Arc.open(WeightsFilename, 'r') Wread = csv.reader(Wfile, delimiter='\t') Weightsdict = dict() for wrow in Wread: Pkey = wrow[0] # Page of weights pageid = int(Pkey) # Do not overwrite anything try: Weightsdict[pageid] except: Weightsdict[pageid] = dict() Nkey = wrow[1] # Name of weights Wdatafilename = "externalweights_data"+Pkey+"_"+Nkey+".csv" Wdatafile = Arc.open(Wdatafilename, 'r') Wdatareader = csv.reader(Wdatafile) Wdata = list() for row in Wdatareader: # Exclude commentaries if (str(row[0])[0:1] != '#'): Wdata.append(np.float(row[0])) Weightsdict[pageid][Nkey] = np.array(Wdata) Infodict["External Weights"] = Weightsdict Arc.close() return Infodict, dirname, filename def SaveSession(parent, dirname, Infodict): """ Write whole Session into a zip file. Infodict may contain the following keys: "Backgrounds", list: contains the backgrounds "Comments", dict: "Session" comment and int keys to Page titles "Correlations", dict: page numbers, all correlation curves "External Functions, dict": modelids to external model functions "External Weights", dict: page numbers, external weights for fitting "Parameters", dict: page numbers, all parameters of the pages "Preferences", dict: not used yet "Traces", dict: page numbers, all traces of the pages We will also write a Readme.txt """ dlg = wx.FileDialog(parent, "Save session file", dirname, "", "*.fcsfit-session.zip", wx.SAVE|wx.FD_OVERWRITE_PROMPT) if dlg.ShowModal() == wx.ID_OK: path = dlg.GetPath() # Workaround since 0.7.5 (dirname, filename) = os.path.split(path) #filename = dlg.GetFilename() #dirname = dlg.GetDirectory() # Sometimes you have multiple endings... if filename.endswith(".fcsfit-session.zip") is not True: filename = filename+".fcsfit-session.zip" dlg.Destroy() # Change working directory returnWD = os.getcwd() tempdir = tempfile.mkdtemp() os.chdir(tempdir) # Create zip file Arc = zipfile.ZipFile(filename, mode='w') # Only do the Yaml thing for safe operations. # Make the yaml dump parmsfilename = "Parameters.yaml" # Parameters have to be floats in lists # in order for yaml.safe_load to work. 
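# Why this conversion matters - a short round-trip sketch:
# yaml.safe_load can restore plain lists and floats, but not numpy
# arrays, so arrays are converted before dumping (illustrative data):
#
#     import numpy as np
#     import yaml
#     values = np.array([1.0, 2.5]).tolist()   # -> [1.0, 2.5]
#     text = yaml.dump(values)
#     assert yaml.safe_load(text) == [1.0, 2.5]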
Parms = Infodict["Parameters"] ParmsKeys = Parms.keys() ParmsKeys.sort() Parmlist = list() for idparm in ParmsKeys: # Make sure we do not accidently save arrays. # This would not work correctly with yaml. Parms[idparm][2] = np.array(Parms[idparm][2],dtype="float").tolist() Parms[idparm][3] = np.array(Parms[idparm][3],dtype="bool").tolist() # Range of fitting parameters Parms[idparm][9] = np.array(Parms[idparm][9],dtype="float").tolist() Parmlist.append(Parms[idparm]) yaml.dump(Parmlist, open(parmsfilename, "wb")) Arc.write(parmsfilename) os.remove(os.path.join(tempdir, parmsfilename)) # Supplementary data (errors of fit) errsfilename = "Supplements.yaml" Sups = Infodict["Supplements"] SupKeys = Sups.keys() SupKeys.sort() Suplist = list() for idsup in SupKeys: error = Sups[idsup]["FitErr"] chi2 = Sups[idsup]["Chi sq"] globalshare = Sups[idsup]["Global Share"] Suplist.append([idsup, error, chi2, globalshare]) yaml.dump(Suplist, open(errsfilename, "wb")) Arc.write(errsfilename) os.remove(os.path.join(tempdir, errsfilename)) # Save external functions for key in Infodict["External Functions"].keys(): funcfilename = "model_"+str(key)+".txt" funcfile = open(funcfilename, 'wb') funcfile.write(Infodict["External Functions"][key]) funcfile.close() Arc.write(funcfilename) os.remove(os.path.join(tempdir, funcfilename)) # Save (dataexp and tau)s into separate csv files. for pageid in Infodict["Correlations"].keys(): # Since *Array* and *Parms* are in the same order (the page order), # we will identify the filename by the Page title number. number = str(pageid) expfilename = "data"+number+".csv" expfile = open(expfilename, 'wb') tau = Infodict["Correlations"][pageid][0] exp = Infodict["Correlations"][pageid][1] dataWriter = csv.writer(expfile, delimiter=',') if exp is not None: # Names of Columns dataWriter.writerow(['# tau', 'experimental data']) # Actual Data # Do not use len(tau) instead of len(exp[:,0])) ! # Otherwise, the experimental data will not be saved entirely, # if it has been cropped. Because tau might be smaller, than # exp[:,0] --> tau = exp[startcrop:endcrop,0] for j in np.arange(len(exp[:,0])): dataWriter.writerow(["%.20e" % exp[j,0], "%.20e" % exp[j,1]]) else: # Only write tau dataWriter.writerow(['# tau'+' only']) for j in np.arange(len(tau)): dataWriter.writerow(["%.20e" % tau[j]]) expfile.close() # Add to archive Arc.write(expfilename) os.remove(os.path.join(tempdir, expfilename)) # Save traces into separate csv files. for pageid in Infodict["Traces"].keys(): number = str(pageid) # Since *Trace* and *Parms* are in the same order, which is the # Page order, we will identify the filename by the Page title # number. 
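# Sketch of the trace layout written below, for a single illustrative
# autocorrelation trace of page "1" with two (time [ms], rate [kHz])
# points:
#
#     import csv
#     tracefile = open("trace1.csv", 'wb')
#     traceWriter = csv.writer(tracefile, delimiter=',')
#     traceWriter.writerow(['# time', 'count rate'])
#     for t, r in [(0.0, 12.3), (1.0, 12.7)]:
#         traceWriter.writerow(["%.20e" % t, "%.20e" % r])
#     tracefile.close()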
if Infodict["Traces"][pageid] is not None: if Parms[pageid][7] is True: # We have cross correlation: save two traces ## A tracefilenamea = "trace"+number+"A.csv" tracefile = open(tracefilenamea, 'wb') traceWriter = csv.writer(tracefile, delimiter=',') time = Infodict["Traces"][pageid][0][:,0] rate = Infodict["Traces"][pageid][0][:,1] # Names of Columns traceWriter.writerow(['# time', 'count rate']) # Actual Data for j in np.arange(len(time)): traceWriter.writerow(["%.20e" % time[j], "%.20e" % rate[j]]) tracefile.close() # Add to archive Arc.write(tracefilenamea) os.remove(os.path.join(tempdir, tracefilenamea)) ## B tracefilenameb = "trace"+number+"B.csv" tracefile = open(tracefilenameb, 'wb') traceWriter = csv.writer(tracefile, delimiter=',') time = Infodict["Traces"][pageid][1][:,0] rate = Infodict["Traces"][pageid][1][:,1] # Names of Columns traceWriter.writerow(['# time', 'count rate']) # Actual Data for j in np.arange(len(time)): traceWriter.writerow(["%.20e" % time[j], "%.20e" % rate[j]]) tracefile.close() # Add to archive Arc.write(tracefilenameb) os.remove(os.path.join(tempdir, tracefilenameb)) else: # Save one single trace tracefilename = "trace"+number+".csv" tracefile = open(tracefilename, 'wb') traceWriter = csv.writer(tracefile, delimiter=',') time = Infodict["Traces"][pageid][:,0] rate = Infodict["Traces"][pageid][:,1] # Names of Columns traceWriter.writerow(['# time', 'count rate']) # Actual Data for j in np.arange(len(time)): traceWriter.writerow(["%.20e" % time[j], "%.20e" % rate[j]]) tracefile.close() # Add to archive Arc.write(tracefilename) os.remove(os.path.join(tempdir, tracefilename)) # Save comments into txt file commentfilename = "comments.txt" commentfile = open(commentfilename, 'wb') # Comments[-1] is comment on whole Session Ckeys = Infodict["Comments"].keys() Ckeys.sort() for key in Ckeys: if key != "Session": commentfile.write(Infodict["Comments"][key]+"\r\n") commentfile.write(Infodict["Comments"]["Session"]) commentfile.close() Arc.write(commentfilename) os.remove(os.path.join(tempdir, commentfilename)) ## Save Background information: Background = Infodict["Backgrounds"] if len(Background) > 0: # We do not use a comma separated, but a tab separated file, # because a comma might be in the name of a bg. bgfilename = "backgrounds.csv" bgfile = open(bgfilename, 'wb') bgwriter = csv.writer(bgfile, delimiter='\t') for i in np.arange(len(Background)): bgwriter.writerow([str(Background[i][0]), Background[i][1]]) # Traces bgtracefilename = "bg_trace"+str(i)+".csv" bgtracefile = open(bgtracefilename, 'wb') bgtraceWriter = csv.writer(bgtracefile, delimiter=',') bgtraceWriter.writerow(['# time', 'count rate']) # Actual Data time = Background[i][2][:,0] rate = Background[i][2][:,1] for j in np.arange(len(time)): bgtraceWriter.writerow(["%.20e" % time[j], "%.20e" % rate[j]]) bgtracefile.close() # Add to archive Arc.write(bgtracefilename) os.remove(os.path.join(tempdir, bgtracefilename)) bgfile.close() Arc.write(bgfilename) os.remove(os.path.join(tempdir, bgfilename)) ## Save External Weights information WeightedPageID = Infodict["External Weights"].keys() WeightedPageID.sort() WeightFilename = "externalweights.txt" WeightFile = open(WeightFilename, 'wb') WeightWriter = csv.writer(WeightFile, delimiter='\t') for pageid in WeightedPageID: number = str(pageid) NestWeights = Infodict["External Weights"][pageid].keys() # The order of the types does not matter, since they are # sorted in the frontend and upon import. We sort them here, anyhow. 
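# Sketch of the file names produced below, assuming an illustrative
# page "3" with a single weight type "from average":
#
#     number, Nkey = "3", "from average"
#     name = "externalweights_data"+number+"_"+str(Nkey).strip()+".csv"
#     # -> "externalweights_data3_from average.csv"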
NestWeights.sort() for Nkey in NestWeights: WeightWriter.writerow([number, str(Nkey).strip()]) # Add data to a File WeightDataFilename = "externalweights_data"+number+\ "_"+str(Nkey).strip()+".csv" WeightDataFile = open(WeightDataFilename, 'wb') WeightDataWriter = csv.writer(WeightDataFile) wdata = Infodict["External Weights"][pageid][Nkey] for jw in np.arange(len(wdata)): WeightDataWriter.writerow([str(wdata[jw])]) WeightDataFile.close() Arc.write(WeightDataFilename) os.remove(os.path.join(tempdir, WeightDataFilename)) WeightFile.close() Arc.write(WeightFilename) os.remove(os.path.join(tempdir, WeightFilename)) ## Readme rmfilename = "Readme.txt" rmfile = open(rmfilename, 'wb') rmfile.write(ReadmeSession) rmfile.close() Arc.write(rmfilename) os.remove(os.path.join(tempdir, rmfilename)) # Close the archive Arc.close() # Move archive to destination directory shutil.move(os.path.join(tempdir, filename), os.path.join(dirname, filename) ) # Go to destination directory os.chdir(returnWD) os.rmdir(tempdir) return dirname, filename else: dirname = dlg.GetDirectory() dlg.Destroy() return dirname, None def saveCSV(parent, dirname, Page): """ Write relevant data into a comma separated list. Parameters: *parent* the parent window *dirname* directory to set on saving *Page* Page containing all necessary variables """ filename = Page.tabtitle.GetValue().strip()+Page.counter[:2] dlg = wx.FileDialog(parent, "Save curve", dirname, filename, "Correlation with trace (*.csv)|*.csv;*.CSV"+\ "|Correlation only (*.csv)|*.csv;*.CSV", wx.SAVE|wx.FD_OVERWRITE_PROMPT) # user cannot do anything until he clicks "OK" if dlg.ShowModal() == wx.ID_OK: path = dlg.GetPath() # Workaround since 0.7.5 (dirname, filename) = os.path.split(path) #filename = dlg.GetFilename() #dirname = dlg.GetDirectory() if filename.lower().endswith(".csv") is not True: filename = filename+".csv" openedfile = open(os.path.join(dirname, filename), 'wb') ## First, some doc text openedfile.write(ReadmeCSV.replace('\n', '\r\n')) # The infos InfoMan = info.InfoClass(CurPage=Page) PageInfo = InfoMan.GetCurFancyInfo() for line in PageInfo.splitlines(): openedfile.write("# "+line+"\r\n") openedfile.write("#\r\n#\r\n") # Get all the data we need from the Page # Modeled data # Since 0.7.8 the user may normalize the curves. The normalization # factor is set in *Page.normfactor*. corr = Page.datacorr[:,1]*Page.normfactor if Page.dataexp is not None: # Experimental data tau = Page.dataexp[:,0] exp = Page.dataexp[:,1]*Page.normfactor res = Page.resid[:,1]*Page.normfactor # Plotting! Because we only export plotted area. weight = Page.weights_used_for_plotting if weight is None: pass elif len(weight) != len(exp): text = "Weights have not been calculated for the "+\ "area you want to export. Pressing 'Fit' "+\ "again should solve this issue. Data will "+\ "not be saved." 
wx.MessageDialog(parent, text, "Error", style=wx.ICON_ERROR|wx.OK|wx.STAY_ON_TOP) return dirname, None else: tau = Page.datacorr[:,0] exp = None res = None # Include weights in data saving: # PyCorrFit thinks in [ms], but we will save as [s] timefactor = 0.001 tau = timefactor * tau ## Now we want to write all that data into the file # This is for csv writing: ## Correlation curve dataWriter = csv.writer(openedfile, delimiter='\t') if exp is not None: header = '# Channel (tau [s])'+"\t"+ \ 'Experimental correlation'+"\t"+ \ 'Fitted correlation'+ "\t"+ \ 'Residuals'+"\r\n" data = [tau, exp, corr, res] if Page.weighted_fit_was_performed is True \ and weight is not None: header = header.strip() + "\t"+'Weights (fit)'+"\r\n" data.append(weight) else: header = '# Channel (tau [s])'+"\t"+ \ 'Correlation function'+"\r\n" data = [tau, corr] # Write header openedfile.write(header) # Write data for i in np.arange(len(data[0])): # row-wise, data may have more than two elements per row datarow = list() for j in np.arange(len(data)): rowcoli = str("%.10e") % data[j][i] datarow.append(rowcoli) dataWriter.writerow(datarow) ## Trace # Only save the trace if user wants us to: if dlg.GetFilterIndex() == 0: # We will also save the trace in [s] # Intensity trace in kHz may stay the same if Page.trace is not None: # Mark beginning of Trace openedfile.write('#\r\n#\r\n# BEGIN TRACE\r\n#\r\n') # Columns time = Page.trace[:,0]*timefactor intensity = Page.trace[:,1] # Write openedfile.write('# Time [s]'+"\t" 'Intensity trace [kHz]'+" \r\n") for i in np.arange(len(time)): dataWriter.writerow([str("%.10e") % time[i], str("%.10e") % intensity[i]]) elif Page.tracecc is not None: # We have some cross-correlation here: # Mark beginning of Trace A openedfile.write('#\r\n#\r\n# BEGIN TRACE\r\n#\r\n') # Columns time = Page.tracecc[0][:,0]*timefactor intensity = Page.tracecc[0][:,1] # Write openedfile.write('# Time [s]'+"\t" 'Intensity trace [kHz]'+" \r\n") for i in np.arange(len(time)): dataWriter.writerow([str("%.10e") % time[i], str("%.10e") % intensity[i]]) # Mark beginning of Trace B openedfile.write('#\r\n#\r\n# BEGIN SECOND TRACE\r\n#\r\n') # Columns time = Page.tracecc[1][:,0]*timefactor intensity = Page.tracecc[1][:,1] # Write openedfile.write('# Time [s]'+"\t" 'Intensity trace [kHz]'+" \r\n") for i in np.arange(len(time)): dataWriter.writerow([str("%.10e") % time[i], str("%.10e") % intensity[i]]) dlg.Destroy() openedfile.close() return dirname, filename else: dirname = dlg.GetDirectory() dlg.Destroy() return dirname, None ReadmeCSV = """# This file was created using PyCorrFit version {}. # # Lines starting with a '#' are treated as comments. # The data is stored as CSV below this comment section. # Data usually consists of lag times (channels) and # the corresponding correlation function - experimental # and fitted values plus resulting residuals. # If this file is opened by PyCorrFit, only the first two # columns will be imported as experimental data. # """.format(doc.__version__) ReadmeSession = """This file was created using PyCorrFit version {}. The .zip archive you are looking at is a stored session of PyCorrFit. If you are interested in how the data is stored, you will find out here. Most important are the dimensions of units: Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ From there, the dimension of any parameter may be calculated. 
There are a number of files within this archive, depending on what was done during the session. backgrounds.csv - Contains the list of backgrounds used and - Averaged intensities in [kHz] bg_trace*.csv (where * is an integer) - The trace of the background corresponding to the line number in backgrounds.csv - Time in [ms], Trace in [kHz] comments.txt - Contains page titles and session comment - First n lines are titles, rest is session comment (where n is total number of pages) data*.csv (where * is (Number of page)) - Contains lag times [ms] - Contains experimental data, if available externalweights.txt - Contains names (types) of external weights other than from Model function or spline fit - Linewise: 1st element is page number, 2nd is name - According to this data, the following files are present in the archive externalweights_data_*PageID*_*Type*.csv - Contains weighting information of Page *PageID* of type *Type* model_*ModelID*.txt - An external (user-defined) model file with internal ID *ModelID* Parameters.yaml - Contains all Parameters for each page Block format: - - '#(Number of page): ' - (Internal model ID) - (List of parameters) - (List of checked parameters (for fitting)) - [(Min channel selected), (Max channel selected)] - [(Weighted fit method (0=None, 1=Spline, 2=Model function)), (No. of bins from left and right(, (No. of knots (of e.g. spline))] - [B1,B2] Background to use (line in backgrounds.csv) B2 is always *null* for autocorrelation curves - Data type is Cross-correlation? - Parameter id (int) used for normalization in plotting. This number first enumerates the model parameters and then the supplemental parameters (e.g. "n1"). - - [min, max] fitting parameter range of 1st parameter - [min, max] fitting parameter range of 2nd parameter - etc. - Order in Parameters.yaml defines order of pages in a session - Order in Parameters.yaml defines order in comments.txt Readme.txt (this file) Supplements.yaml - Contains errors of fitting Format: -- Page number -- [parameter id, error value] - [parameter id, error value] - Chi squared - [pages that share parameters] (from global fitting) trace*.csv (where * is (Number of page) | appendix "A" or "B" point to the respective channels (only in cross-correlation mode)) - Contains times [ms] - Contains countrates [kHz] """.format(doc.__version__) pycorrfit-0.8.1/src/frontend.py0000644000175000017500000020040012262516600015245 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module frontend The frontend displays the GUI (Graphic User Interface). All necessary functions and modules are called from here. Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . 
""" import os import webbrowser import wx # GUI interface wxPython import wx.lib.agw.flatnotebook as fnb # Flatnotebook (Tabs) import wx.lib.delayedresult as delayedresult import wx.py.shell import numpy as np # NumPy import platform import sys # System stuff import traceback # for Error handling try: # contains e.g. update and icon, but no vital things. import misc except ImportError: print " Some modules are not available." print " Update function will not work." # PyCorrFit modules import doc # Documentation/some texts import edclasses import models as mdls import openfile as opf # How to treat an opened file import page import plotting import readfiles import tools # Some tools import usermodel ## On Windows XP I had problems with the unicode Characters. # I found this at # http://stackoverflow.com/questions/5419/python-unicode-and-the-windows-console # and it helped: if platform.system() == 'Windows': reload(sys) sys.setdefaultencoding('utf-8') # ~paulmueller ########################################################### class FlatNotebookDemo(fnb.FlatNotebook): """ Flatnotebook class """ def __init__(self, parent): """Constructor""" style = fnb.FNB_SMART_TABS|fnb.FNB_NO_NAV_BUTTONS|\ fnb.FNB_DROPDOWN_TABS_LIST|fnb.FNB_NODRAG|\ fnb.FNB_TABS_BORDER_SIMPLE|\ fnb.FNB_X_ON_TAB|fnb.FNB_NO_X_BUTTON # Bugfix for Mac if platform.system().lower() in ["windows", "linux"]: style = style|fnb.FNB_HIDE_ON_SINGLE_TAB self.fnb = fnb.FlatNotebook.__init__(self, parent, wx.ID_ANY, agwStyle=style) ########################################################### class MyFrame(wx.Frame): def __init__(self, parent, id, version): ## Set initial variables that make sense tau = 10**np.linspace(-6,8,1001) self.version = version wx.Frame.__init__(self, parent, id, "PyCorrFit " + self.version) self.CreateStatusBar() # A Statusbar in the bottom of the window self.StatusBar.SetStatusText("Find help and updates online:"+ " 'Help > Update'") ## Properties of the Frame initial_size = (768,700) self.SetSize(initial_size) self.SetMinSize(initial_size) # Set this, so we know in which directory we are working in. # This will change, when we load a session or data file. self.dirname = os.curdir self.filename = None # Session Comment - may be edited and saved later self.SessionComment = "This is a session comment. It will be saved" +\ " as the session is saved." ## Set variables # The model module that can be changed by importing user defined # functions. # These are only for compatibility. # value_set and valuedict only for compatibility! # I should use mdls for anything, since it's globally imported # and modified by this program (e.g. adding new function) self.value_set = mdls.values self.valuedict = mdls.valuedict # Some standard time scale # We need this for the functions inside the "FittingPanel"s self.tau = tau # Tab Counter self.tabcounter = 1 # Background Correction List # Here, each item is a list containing three elements: # [0] average signal [kHz] # [1] signal name (edited by user) # [2] signal trace (tuple) ([ms], [kHz]) self.Background = list() # A dictionary for all the opened tool windows self.ToolsOpen = dict() # A dictionary for all the tools self.Tools = dict() # Range selector (None if inactive) # Fitting parameter range selection # New as of 0.7.9 self.RangeSelector = None ## Setting up the menus. # models, modeldict, modeltypes only for compatibility! # I should use mdls for anything, since it's globally imported # and modified by this program (e.g. 
adding new function) self.models = mdls.models self.modeldict = mdls.modeldict self.modeltypes = mdls.modeltypes self.modelmenudict = dict() self.MakeMenu() ## Create the Flatnotebook (Tabs Tabs Tabs!) panel = wx.Panel(self) self.panel = panel self.notebook = FlatNotebookDemo(panel) self.notebook.SetRightClickMenu(self.curmenu) #self.notebook.SetAGWWindowStyleFlag(FNB_X_ON_TAB) sizer = wx.BoxSizer(wx.VERTICAL) sizer.Add(self.notebook, 1, wx.ALL|wx.EXPAND, 5) panel.SetSizer(sizer) self.Layout() self.Show() # Notebook Handler self.notebook.Bind(fnb.EVT_FLATNOTEBOOK_PAGE_CLOSED, self.OnFNBClosedPage) self.notebook.Bind(fnb.EVT_FLATNOTEBOOK_PAGE_CHANGED, self.OnFNBPageChanged) # This is a hack since version 0.7.7: # When the "X"-button on a page is pressed, ask the user # if he really wants to close that page. self.notebook._pages.Unbind(wx.EVT_LEFT_UP) self.notebook._pages.Bind(wx.EVT_LEFT_UP, self.OnMyLeftUp) # If user hits the "x", ask if he wants to save the session self.Bind(wx.EVT_CLOSE, self.OnExit) # Set window icon try: self.MainIcon = misc.getMainIcon() wx.Frame.SetIcon(self, self.MainIcon) except: self.MainIcon = None def add_fitting_tab(self, event=None, modelid=None, counter=None): """ This function creates a new page inside the notebook. If the function is called from a menu, the modelid is known by the event. If not, the modelid should be specified by *modelid*. *counter* specifies which page number we should use for our new page. If it is None, we will simply use *self.tabcounter*. *event* - An event that has event.GetId() equal to a modelid *modelid* - optional, directly set the modelid *counter* - optional, set the "#" value of the page """ if modelid is None: # Get the model id from the menu modelid = event.GetId() if counter is not None: # Set the tabcounter right, so the tabs are counted continuously. counterint = int(counter.strip().strip(":").strip("#")) self.tabcounter = max(counterint, self.tabcounter) modelid = int(modelid) counter = "#"+str(self.tabcounter)+": " # Get the model for the page together valuepack = mdls.valuedict[modelid] active_labels = valuepack[0] active_values = 1*valuepack[1] active_fitting = 1*valuepack[2] active_parms = [active_labels, active_values, active_fitting] model = mdls.modeldict[modelid][1] # Create New Tab Newtab = page.FittingPanel(self, counter, modelid, active_parms, self.tau) #self.Freeze() self.notebook.AddPage(Newtab, counter+model, select=True) #self.Thaw() self.tabcounter = self.tabcounter + 1 # Enable the "Current" Menu self.EnableToolCurrent(True) # ####### # # This is a work-around to prevent a weird bug in version 0.7.8: # The statistics OnPageChanged function is called but the parameters # are displayed double if a new page is created and the statistics # window is open. # Find Tool Statistics # Get open tools toolkeys = self.ToolsOpen.keys() for key in toolkeys: tool = self.ToolsOpen[key] try: if tool.MyName=="STATISTICS": # Call the function properly. tool.OnPageChanged(Newtab) except: pass # ####### # return Newtab def EnableToolCurrent(self, enabled): """ Independent on order of menus, enable or disable tools and current menu. """ # Tools menu is now always enabled # tid = self.menuBar.FindMenu("&Tools") # self.menuBar.EnableTop(tid, enabled) cid = self.menuBar.FindMenu("Current &Page") self.menuBar.EnableTop(cid, enabled) def MakeMenu(self): self.filemenu = wx.Menu() # toolmenu and curmenu are public, because they need to be enabled/ # disabled when there are tabs/notabs. 
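# The tool menu built below maps each menu id to a tool class; a
# minimal sketch of that dispatch pattern with a hypothetical tool
# entry (names are illustrative):
#
#     menu = self.toolmenu.Append(wx.ID_ANY, "Example tool",
#                                 "Tooltip", kind=wx.ITEM_CHECK)
#     self.Tools[menu.GetId()] = ExampleToolClass
#     self.Bind(wx.EVT_MENU, self.OnTool, menu)
#     # self.OnTool can then resolve the class via event.GetId()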
self.toolmenu = wx.Menu() # curmenu needs to be public, because we want to call it from the right # click menu of a Page in fnb self.curmenu = wx.Menu() modelmenu = wx.Menu() prefmenu = wx.Menu() helpmenu = wx.Menu() # wx.ID_ABOUT and wx.ID_EXIT are standard IDs provided by wxWidgets. # self.filemenu menuAddModel = self.filemenu.Append(wx.ID_ANY, "&Import model", "Add a user defined model.") menuLoadBatch = self.filemenu.Append(wx.ID_ANY, "&Load data", "Loads one or multiple data files") menuOpen = self.filemenu.Append(wx.ID_OPEN, "&Open session", "Restore a previously saved session") self.filemenu.AppendSeparator() self.menuComm = self.filemenu.Append(wx.ID_ANY, "Co&mment session", "Add a comment to this session", kind=wx.ITEM_CHECK) self.filemenu.Check(self.menuComm.GetId(), False) menuClear = self.filemenu.Append(wx.ID_ANY, "&Clear session", "Remove all pages but keep imported model functions.") menuSave = self.filemenu.Append(wx.ID_SAVE, "&Save session", "Save entire Session") self.filemenu.AppendSeparator() menuExit = self.filemenu.Append(wx.ID_EXIT,"E&xit", "Terminate the program") # prefmenu self.MenuUseLatex = prefmenu.Append(wx.ID_ANY, "Use Latex", "Enables/Disables usage of Latex for image saving.", kind=wx.ITEM_CHECK) self.MenuVerbose = prefmenu.Append(wx.ID_ANY, "Verbose mode", "Enables/Disables output of additional information.", kind=wx.ITEM_CHECK) self.MenuShowWeights = prefmenu.Append(wx.ID_ANY, "Show weights", "Enables/Disables displaying weights of fit.", kind=wx.ITEM_CHECK) self.MenuShowWeights.Check() # toolmenu toolkeys = tools.ToolDict.keys() toolkeys.sort() for ttype in toolkeys: for tool in np.arange(len(tools.ToolDict[ttype])): menu = self.toolmenu.Append(wx.ID_ANY, tools.ToolName[ttype][tool][0], tools.ToolName[ttype][tool][1], kind=wx.ITEM_CHECK) self.toolmenu.Check(menu.GetId(), False) # Append tool to list of tools with menu ID self.Tools[menu.GetId()] = tools.ToolDict[ttype][tool] # Bindings # OnTool only needs the Id of the wx.EVT_MENU self.Bind(wx.EVT_MENU, self.OnTool, menu) if ttype != toolkeys[-1]: self.toolmenu.AppendSeparator() # curmenu menuImportData = self.curmenu.Append(wx.ID_ANY, "&Import Data", "Import experimental FCS curve") menuSaveData = self.curmenu.Append(wx.ID_ANY, "&Save data (*.csv)", "Save data (comma separated values)") menuSavePlotCorr = self.curmenu.Append(wx.ID_ANY, "&Save correlation as image", "Export current plot as image.") menuSavePlotTrace = self.curmenu.Append(wx.ID_ANY, "&Save trace as image", "Export current trace as image.") self.curmenu.AppendSeparator() menuClPa = self.curmenu.Append(wx.ID_ANY, "&Close Page", "Close Current Page") # model menu # Integrate models into menu keys = mdls.modeltypes.keys() keys.sort() for modeltype in keys: # Now we have selected a type of model # Create a submenu submenu = wx.Menu() modelmenu.AppendMenu(wx.ID_ANY, modeltype, submenu) # Append to menulist self.modelmenudict[modeltype] = submenu for modelid in mdls.modeltypes[modeltype]: # Now we add every model that belongs to that type model = mdls.modeldict[modelid] if platform.system().lower() == "darwin" and hasattr(sys, 'frozen'): ### ### Work-around for the frozen Mac version ### ### (strange UTF-8 decoding error, ### would work with misc.removewrongUTF8) b = model[1].split("(")[0].strip() c = misc.removewrongUTF8(model[2]) menuentry = submenu.Append(model[0],b,c) else: menuentry = submenu.Append(model[0], model[1], model[2]) self.Bind(wx.EVT_MENU, self.add_fitting_tab, menuentry) # help menu menuDocu = helpmenu.Append(wx.ID_ANY,
"&Documentation", "PyCorrFit documentation") menuWiki = helpmenu.Append(wx.ID_ANY, "&Wiki", "PyCorrFit wiki pages by users for users (online)") menuUpdate = helpmenu.Append(wx.ID_ANY, "&Update", "Check for new version"+ " (Web access required)") helpmenu.AppendSeparator() menuShell = helpmenu.Append(wx.ID_ANY, "S&hell", "A Python shell") helpmenu.AppendSeparator() menuSoftw = helpmenu.Append(wx.ID_ANY, "&Software", "Information about the software used") menuAbout = helpmenu.Append(wx.ID_ABOUT, "&About", "Information about this program") # Create the menubar. self.menuBar = wx.MenuBar() # Adding all the menus to the MenuBar self.menuBar.Append(self.filemenu,"&File") self.menuBar.Append(self.toolmenu,"&Tools") self.menuBar.Append(self.curmenu,"Current &Page") self.menuBar.Append(modelmenu,"&Model") self.menuBar.Append(prefmenu,"&Preferences") self.menuBar.Append(helpmenu,"&Help") self.SetMenuBar(self.menuBar) # Adding the MenuBar to the Frame content. self.EnableToolCurrent(False) ## Set events #File #self.Bind(wx.EVT_MENU, self.OnLoadSingle, menuLoadSingle) self.Bind(wx.EVT_MENU, self.OnLoadBatch, menuLoadBatch) self.Bind(wx.EVT_MENU, self.OnAddModel, menuAddModel) self.Bind(wx.EVT_MENU, self.OnCommSession, self.menuComm) self.Bind(wx.EVT_MENU, self.OnClearSession, menuClear) self.Bind(wx.EVT_MENU, self.OnOpenSession, menuOpen) self.Bind(wx.EVT_MENU, self.OnSaveSession, menuSave) self.Bind(wx.EVT_MENU, self.OnExit, menuExit) # Current self.Bind(wx.EVT_MENU, self.OnImportData, menuImportData) self.Bind(wx.EVT_MENU, self.OnSaveData, menuSaveData) self.Bind(wx.EVT_MENU, self.OnSavePlotCorr, menuSavePlotCorr) self.Bind(wx.EVT_MENU, self.OnSavePlotTrace, menuSavePlotTrace) self.Bind(wx.EVT_MENU, self.OnDeletePage, menuClPa) # Help self.Bind(wx.EVT_MENU, self.OnSoftware, menuSoftw) self.Bind(wx.EVT_MENU, self.OnAbout, menuAbout) self.Bind(wx.EVT_MENU, self.OnUpdate, menuUpdate) self.Bind(wx.EVT_MENU, self.OnDocumentation, menuDocu) self.Bind(wx.EVT_MENU, self.OnWiki, menuWiki) self.Bind(wx.EVT_MENU, self.OnShell, menuShell) def OnAbout(self, event=None): # Show About Information description = ("PyCorrFit is a data displaying, fitting "+ "and evaluation tool \nfor fluorescence correlation "+ "spectroscopy. \nPyCorrFit is written in Python.") licence = doc.licence() info = wx.AboutDialogInfo() #info.SetIcon(wx.Icon('hunter.png', wx.BITMAP_TYPE_PNG)) info.SetName('PyCorrFit') info.SetVersion(self.version) info.SetDescription(description) info.SetCopyright('(C) 2011 - 2012 Paul Müller') info.SetWebSite(doc.HomePage) info.SetLicence(licence) info.SetIcon(misc.getMainIcon(pxlength=64)) info.AddDeveloper('Paul Müller') info.AddDocWriter('Thomas Weidemann, Paul Müller') wx.AboutBox(info) def OnAddModel(self, event=None): """ Import a model from an external .txt file. See example model functions available on the web. """ # Add a model using the dialog. filters = "text file (*.txt)|*.txt" dlg = wx.FileDialog(self, "Open model file", self.dirname, "", filters, wx.OPEN) if dlg.ShowModal() == wx.ID_OK: NewModel = usermodel.UserModel(self) # Workaround since 0.7.5 (dirname, filename) = os.path.split(dlg.GetPath()) #filename = dlg.GetFilename() #dirname = dlg.GetDirectory() self.dirname = dirname # Try to import a selected .txt file try: NewModel.GetCode( os.path.join(dirname, filename) ) except: # The file does not seem to be what it seems to be. 
info = sys.exc_info() errstr = "Unknown file format:\n" errstr += str(filename)+"\n\n" errstr += str(info[0])+"\n" errstr += str(info[1])+"\n" for tb_item in traceback.format_tb(info[2]): errstr += tb_item errdlg = wx.MessageDialog(self, errstr, "Error", style=wx.ICON_ERROR|wx.OK|wx.STAY_ON_TOP) errdlg.ShowModal() errdlg.Destroy() del NewModel return # Test the code for sympy compatibility. # If you write your own parser, this might be easier. try: NewModel.TestFunction() except: # This means that the imported model file could be # contaminated. Ask the user how to proceed. text = "The model parsing check raised an error.\n"+\ "This could be the result of wrong syntax\n"+\ "or an error of the parser.\n"+\ "This might be dangerous. Proceed\n"+\ "only if you trust the source of the file.\n"+\ "Import the offending file: "+filename+"?" dlg2 = wx.MessageDialog(self, text, "Unsafe Operation", style=wx.ICON_EXCLAMATION|wx.YES_NO|wx.STAY_ON_TOP) if dlg2.ShowModal() == wx.ID_YES: NewModel.ImportModel() else: del NewModel return else: # The model was loaded correctly NewModel.ImportModel() else: dirname = dlg.GetDirectory() dlg.Destroy() self.dirname = dirname def OnClearSession(self,e=None,clearmodels=False): """Clear the current session: remove all pages and reset the session data. """ numtabs = self.notebook.GetPageCount() # Ask, if user wants to save current session. if numtabs > 0: dial = wx.MessageDialog(self, 'Do you wish to save this session first?', 'Save current session?', wx.ICON_QUESTION | wx.CANCEL | wx.YES_NO | wx.NO_DEFAULT ) # dial = edclasses.MyYesNoAbortDialog(self, # 'Do you wish to save this session first?', # 'Save current session?') result = dial.ShowModal() dial.Destroy() if result == wx.ID_CANCEL: return "abort" # stop this function - do nothing. elif result == wx.ID_YES: self.OnSaveSession() elif result == wx.ID_NO: pass # Delete all the pages self.notebook.DeleteAllPages() # Disable all the dialogs and menus self.EnableToolCurrent(False) self.OnFNBPageChanged() self.tabcounter = 1 self.filename = None self.SetTitleFCS(None) self.SessionComment = "You may enter some information here." self.Background = list() ## Do we want to keep user defined models after session clearing? if clearmodels == True: # Also reset user defined models for modelid in mdls.modeltypes["User"]: mdls.values.remove(mdls.valuedict[modelid]) del mdls.valuedict[modelid] mdls.models.remove(mdls.modeldict[modelid]) del mdls.modeldict[modelid] mdls.modeltypes["User"] = list() # Model Menu menu=self.modelmenudict["User"] for item in menu.GetMenuItems(): menu.RemoveItem(item) def OnCommSession(self,e=None): """ Dialog for commenting on session. """ try: self.EditCommentDlg.IsEnabled() except AttributeError: # Dialog is not opened self.EditCommentDlg = tools.EditComment(self) self.EditCommentDlg.Bind(wx.EVT_CLOSE, self.EditCommentDlg.OnClose) self.filemenu.Check(self.menuComm.GetId(), True) else: # Close Dialog self.EditCommentDlg.OnClose() def OnDeletePage(self, event=None): """ This method is based on the flatnotebook demo It removes a page from the notebook """ # Ask the user if he really wants to delete the page.
title = self.notebook.GetCurrentPage().tabtitle.GetValue() numb = self.notebook.GetCurrentPage().counter.strip().strip(":") text = "Do you really want to close page "+numb+"?\n"+title dlg = edclasses.MyOKAbortDialog(self, text, "Warning") if dlg.ShowModal() == wx.ID_OK: self.notebook.DeletePage(self.notebook.GetSelection()) self.OnFNBClosedPage() if self.notebook.GetPageCount() == 0: self.OnFNBPageChanged() def OnDocumentation(self, e=None): """ Get the documentation and view it with browser""" filename = doc.GetLocationOfDocumentation() if filename is None: # Now we have to tell the user that there is no documentation self.StatusBar.SetStatusText("...documentation not found.") else: self.StatusBar.SetStatusText("...documentation: "+filename) if platform.system().lower() == 'windows': os.system("start /b "+filename) elif platform.system().lower() == 'linux': os.system("xdg-open "+filename+" &") elif platform.system().lower() == 'darwin': os.system("open "+filename+" &") else: # defaults to linux style: os.system("xdg-open "+filename+" &") def OnExit(self,e=None): numtabs = self.notebook.GetPageCount() # Ask, if user wants to save current session. if numtabs > 0: dial = wx.MessageDialog(self, 'Do you wish to save this session first?', 'Save current session?', wx.ICON_QUESTION | wx.CANCEL | wx.YES_NO | wx.NO_DEFAULT ) result = dial.ShowModal() dial.Destroy() if result == wx.ID_CANCEL: return # stop this function - do nothing. elif result == wx.ID_YES: self.OnSaveSession() # Exit the Program self.Destroy() def OnFNBClosedPage(self,e=None): """ Called, when a page has been closed """ if self.notebook.GetPageCount() == 0: # Grey out tools self.EnableToolCurrent(False) def OnFNBPageChanged(self,e=None, Page=None): """ Called, when - Page focus switches to another Page - Page with focus changes significantly: - experimental data is loaded - weighted fit was done """ # Get the Page if Page is None: Page = self.notebook.GetCurrentPage() keys = self.ToolsOpen.keys() for key in keys: # Update the information self.ToolsOpen[key].OnPageChanged(Page) # parameter range selection tool for page. if self.RangeSelector is not None: try: self.RangeSelector.OnPageChanged(Page) except: pass # Bugfix-workaround for mac: # non-existing tabs are still displayed upon clearing session if platform.system().lower() == "darwin": if self.notebook.GetPageCount() == 0: self.notebook.Hide() else: self.notebook.Show() def OnImportData(self,e=None): """Import experimental data from all filetypes specified in *opf.Filetypes*. Is called by the curmenu and applies to the currently opened model. """ # Open a data file # Get Data SupFiletypes = opf.Filetypes.keys() SupFiletypes.sort() filters = "" for i in np.arange(len(SupFiletypes)): # Add to the filetype filter filters = filters+SupFiletypes[i] if i+1 != len(SupFiletypes): # Add a separator, but not behind the last entry # This is wx widgets stuff. filters = filters+"|" dlg = wx.FileDialog(self, "Open data file", self.dirname, "", filters, wx.OPEN) if dlg.ShowModal() == wx.ID_OK: # The filename the page will get path = dlg.GetPath() # Workaround since 0.7.5 (self.dirname, self.filename) = os.path.split(path) #self.filename = dlg.GetFilename() #self.dirname = dlg.GetDirectory() try: Stuff = readfiles.openAny(self.dirname, self.filename) except: # The file format is not supported.
info = sys.exc_info() errstr = "Unknown file format:\n" errstr += str(self.filename)+"\n\n" errstr += str(info[0])+"\n" errstr += str(info[1])+"\n" for tb_item in traceback.format_tb(info[2]): errstr += tb_item errdlg = wx.MessageDialog(self, errstr, "Error", style=wx.ICON_ERROR|wx.OK|wx.STAY_ON_TOP) errdlg.ShowModal() errdlg.Destroy() return else: dataexp = Stuff["Correlation"] trace = Stuff["Trace"] curvelist = Stuff["Type"] filename = Stuff["Filename"] # If curvelist is a list with more than one item, we are # importing more than one curve per file. Therefore, we # need to create more pages for this file. # # We want to give the user the possibility to choose from # several types of input functions. If curvelist contains # more than one type of data, like "AC1", "AC2", "CC1", ... # then the user may wish to only import "AC1" or "AC2" # functions. curvetypes = dict() for i in np.arange(len(curvelist)): try: curvetypes[curvelist[i]].append(i) except KeyError: curvetypes[curvelist[i]] = [i] # Now we have a dictionary curvetypes with keys that name # items in curvelist and which point to indices in curvelist. # We will display a dialog that lets the user choose what # to import. keys = curvetypes.keys() if len(keys) > 1: Chosen = tools.ChooseImportTypes(self, curvetypes) newcurvelist = list() newfilename = list() newdataexp = list() newtrace = list() if Chosen.ShowModal() == wx.ID_OK: keys = Chosen.keys if len(keys) == 0: # do not do anything return for key in keys: # create a new curvelist with the selected curves for index in curvetypes[key]: newcurvelist.append(curvelist[index]) newfilename.append(filename[index]) newdataexp.append(dataexp[index]) newtrace.append(trace[index]) curvelist = newcurvelist filename = newfilename dataexp = newdataexp trace = newtrace else: return Chosen.Destroy() # curvelist is a list of numbers or labels that correspond # to each item in dataexp or trace. Each curvelist/filename # item will be converted to a string and then added to the # page's title. num = len(curvelist) # Show a nice progress dialog: style = wx.PD_REMAINING_TIME|wx.PD_SMOOTH|wx.PD_AUTO_HIDE|\ wx.PD_CAN_ABORT dlg = wx.ProgressDialog("Import", "Loading pages...", maximum = num, parent=self, style=style) # Get current page and populate CurPage = self.notebook.GetCurrentPage() for i in np.arange(num): # Fill Page with data self.ImportData(CurPage, dataexp[i], trace[i], curvetype=curvelist[i], filename=filename[i], curveid=i) # Let the user abort, if he wants to: # We want to do this here before an empty page is added # to the notebook. if dlg.Update(i+1, "Loading pages...")[0] == False: dlg.Destroy() return if i+1 != num: # Create new page. # (Add n-1 pages while importing.) CurPage = self.add_fitting_tab(event=None, modelid=CurPage.modelid, counter=None) # We are finished here: return else: # User pressed "Cancel" - remember the directory and do # nothing else. self.dirname = dlg.GetDirectory() dlg.Destroy() return
""" # Get the page container pc = self.notebook._pages # forget the zone that was initially clicked self._nLeftClickZone = fnb.FNB_NOWHERE where, tabIdx = pc.HitTest(event.GetPosition()) FNB_X = 2 FNB_TAB_X = 3 if not pc.HasAGWFlag(fnb.FNB_NO_TAB_FOCUS): # Make sure selected tab has focus self.SetFocus() if where == FNB_X: # Make sure that the button was pressed before if pc._nXButtonStatus != fnb.FNB_BTN_PRESSED: return pc._nXButtonStatus = fnb.FNB_BTN_HOVER self.OnDeletePage(self.notebook.GetCurrentPage()) elif where == FNB_TAB_X: # Make sure that the button was pressed before if pc._nTabXButtonStatus != fnb.FNB_BTN_PRESSED: return pc._nTabXButtonStatus = fnb.FNB_BTN_HOVER self.OnDeletePage(self.notebook.GetCurrentPage()) else: # Call what should have been called. pc.OnLeftUp(event) def ImportData(self, Page, dataexp, trace, curvetype="", filename="", curveid="", run=""): CurPage = Page # Import traces. Traces are usually put into a list, even if there # is only one trace. The reason is, that for cross correlation, we # have two traces and thus we have to import both. # In case of cross correlation, save that list of (two) traces # in the page.tracecc variable. Else, save the trace for auto- # correlations directly into the page.trace variable. We are # doing this in order to keep data types clean. if curvetype[0:2] == "CC": # For cross correlation, the trace has two components CurPage.IsCrossCorrelation = True CurPage.tracecc = trace CurPage.trace = None else: CurPage.IsCrossCorrelation = False CurPage.tracecc = None if trace is not None: CurPage.trace = trace CurPage.traceavg = trace[:,1].mean() # Import correlation function CurPage.dataexpfull = dataexp # We need this to be able to work with the data. # It actually does nothing to the data right now. CurPage.startcrop = None CurPage.endcrop = None # It might be possible, that we want the channels to be # fixed to some interval. This is the case if the # checkbox on the "Channel selection" dialog is checked. self.OnFNBPageChanged() # Enable Fitting Button CurPage.Fit_enable_fitting() # Set new tabtitle value and strip leading or trailing # white spaces. if run != "": title = "{} r{:03d}-{}".format(filename, int(run), curvetype) else: title = "{} id{:03d}-{}".format(filename, int(curveid), curvetype) CurPage.tabtitle.SetValue(title.strip()) # Plot everything CurPage.PlotAll() # Call this function to allow the "Channel Selection" window that # might be open to update itself. # We are aware of the fact, that we just did that self.OnFNBPageChanged() def OnLoadBatch(self, e): """ Open multiple data files and apply a single model to them We will create a new window where the user may decide which model to use. 
""" ## Browse the file system SupFiletypes = opf.Filetypes.keys() # Sort them so we have "All suported filetypes" up front SupFiletypes.sort() filters = "" for i in np.arange(len(SupFiletypes)): # Add to the filetype filter filters = filters+SupFiletypes[i] if i+1 != len(SupFiletypes): # Add a separator if item is not last item filters = filters+"|" dlg = wx.FileDialog(self, "Open data files", self.dirname, "", filters, wx.OPEN|wx.FD_MULTIPLE) if dlg.ShowModal() == wx.ID_OK: Datafiles = dlg.GetFilenames() # We rely on sorted filenames Datafiles.sort() # Workaround since 0.7.5 paths = dlg.GetPaths() if len(paths) != 0: self.dirname = os.path.split(paths[0])[0] else: self.dirname = dlg.GetDirectory() dlg.Destroy() else: dlg.Destroy() return ## Get information from the data files and let the user choose ## which type of curves to load and the corresponding model. # List of filenames that could not be opened BadFiles = list() # Lists for correlation, trace, type and names Correlation = list() Trace = list() Type = list() Filename = list() # there might be zipfiles with additional name info #Run = list() # Run number connecting AC1 AC2 CC12 CC21 Curveid = list() # Curve ID of each curve in a file for afile in Datafiles: try: Stuff = readfiles.openAny(self.dirname, afile) except: # The file does not seem to be what it seems to be. BadFiles.append(afile) else: for i in np.arange(len(Stuff["Type"])): Correlation.append(Stuff["Correlation"][i]) Trace.append(Stuff["Trace"][i]) Type.append(Stuff["Type"][i]) Filename.append(Stuff["Filename"][i]) #Curveid.append(str(i+1)) # Add number of the curve within a file. nameold = None counter = 1 for name in Filename: if name == nameold: Curveid.append(counter) counter += 1 else: counter = 1 nameold = name Curveid.append(counter) counter += 1 # If there are any BadFiles, we will let the user know. if len(BadFiles) > 0: # The file does not seem to be what it seems to be. errstr = "The following files could not be processed:\n" for item in BadFiles: errstr += " "+item dlg = wx.MessageDialog(self, errstr, "Error", style=wx.ICON_WARNING|wx.OK|wx.CANCEL|wx.STAY_ON_TOP) if dlg.ShowModal() == wx.ID_CANCEL: return # Abort, if there are no curves left if len(Type) == 0: return # We want to give the user the possibility to choose from # several types of input functions. If curvelist contains # more than one type of data, like "AC1", "AC2", "CC1", ... # then the user may wish to only import "AC1" or "AC2" # functions. curvetypes = dict() for i in np.arange(len(Type)): try: curvetypes[Type[i]].append(i) except KeyError: curvetypes[Type[i]] = [i] # Fill in the Run information keys = curvetypes.keys() # This part is a little tricky. We assume at some point, that different # types of curves (AC1, AC2) belong to the same run. The only possible # chek/assumtion that we can make is: # If all curvetypes have the same amount of curves, then the curves # from different curvetypes belong together. # Unfortunately, we do not know how the curves are ordered. It could # be like this: # AC1-r1, AC2-r1, CC12-r1, CC21-r1, AC1-r2, AC1-r2, ... # or otherwise interlaced like this: # AC1-r1, AC2-r1, AC1-r2, AC1-r2, ... , CC12-r1, CC21-r1, ... # What we do know is that the first occurence of AC1 matches up with # the first occurences of AC2, CC12, etc. # We create the list/array *Run* whose elements are the run-number # at the position of the curves in *Types*. 
# Check if all curve types contain the same number of curves lentypes = np.zeros(len(keys), dtype=int) for i in range(len(keys)): lentypes[i] = len(curvetypes[keys[i]]) if len(np.unique(np.array(lentypes))) == 1 and lentypes[0] != 0: # We made sure that AC1, AC2, CC12 and CC21 have the same length # Create Runs such that they are matched. # We assume that the curves are somehow interlaced and that # the Nth occurrence of each key in *Type* corresponds to the # matching curves. # Also make sure that number starts at one for each selected file. coords = np.zeros(len(keys), dtype=np.int) Run = np.zeros(len(Curveid), dtype=np.int) WorkType = 1*Type for fname in np.unique(Filename): # unique returns sorted file names. for i in range(Filename.count(fname)/len(keys)): for k in range(len(keys)): coords[k] = WorkType.index(keys[k]) WorkType[coords[k]] = None Run[coords] = i + 1 #del WorkType else: Run = [""] * len(Curveid) # Now we have a dictionary curvetypes with keys that name # items in *Type* and which point to indices in *Type*. # We will display a dialog that lets the user choose what # to import. keys = curvetypes.keys() # Start the dialog for choosing types and model functions labels=list() for i in np.arange(len(Filename)): if Run[i] != "": labels.append("{} r{:03d}-{}".format(Filename[i], Run[i], Type[i])) else: labels.append("{} id{:03d}-{}".format(Filename[i], Curveid[i], Type[i])) Chosen = tools.ChooseImportTypesModel(self, curvetypes, Correlation, labels=labels) newCorrelation = list() newTrace = list() newType = list() newFilename = list() modelList = list() newCurveid = list() newRun = list() if Chosen.ShowModal() == wx.ID_OK: keys = Chosen.typekeys # keepcurvesindex is a list of indices pointing to Type or Correlation # of curves that are supposed to be kept. keepcurvesindex = Chosen.keepcurvesindex # modelids is a dictionary with chosen modelids. # The keys of modelids are indices in the *Type* etc. lists. modelids = Chosen.modelids if len(keys) == 0: # do not do anything return for key in keys: # create a new curvelist with the selected curves for index in curvetypes[key]: if keepcurvesindex.count(index) == 1: newCorrelation.append(Correlation[index]) newTrace.append(Trace[index]) newType.append(Type[index]) newFilename.append(Filename[index]) modelList.append(modelids[index]) newCurveid.append(Curveid[index]) newRun.append(Run[index]) Correlation = newCorrelation Trace = newTrace Type = newType Filename = newFilename Curveid = newCurveid Run = newRun else: return Chosen.Destroy() ## Import the data into new pages # curvelist is a list of numbers or labels that correspond # to each item in dataexp or trace. Each curvelist/filename # item will be converted to a string and then added to the # page's title. num = len(Type) # Show a nice progress dialog: style = wx.PD_REMAINING_TIME|wx.PD_SMOOTH|wx.PD_AUTO_HIDE|\ wx.PD_CAN_ABORT dlg = wx.ProgressDialog("Import", "Loading pages..." , maximum = num, parent=self, style=style) for i in np.arange(num): # create a new page CurPage = self.add_fitting_tab(event=None, modelid=modelList[i], counter=None) # Fill Page with data self.ImportData(CurPage, Correlation[i], Trace[i], curvetype=Type[i], filename=Filename[i], curveid=str(Curveid[i]), run=str(Run[i])) # Let the user abort, if he wants to: # We want to do this here before an empty page is added # to the notebook. if dlg.Update(i+1, "Loading pages...")[0] == False: dlg.Destroy() return # If the user did not select curves but chose a model, destroy # the dialog.
dlg.Destroy() def OnOpenSession(self,e=None,sessionfile=None): """Open a previously saved session. Optional parameter sessionfile defines the file that shall be automatically loaded (without a dialog) """ # We need to clear the session before opening one. # This will also ask, if user wants to save the current session. clear = self.OnClearSession(clearmodels=True) if clear == "abort": # User pressed abort when he was asked if he wants to save the # session. return "abort" Infodict, self.dirname, filename = \ opf.OpenSession(self, self.dirname, sessionfile=sessionfile) # Check, if a file has been opened if filename is not None: # Reset all Pages. We already gave the user the possibility to # save his session. # self.OnClearSession() self.filename = filename self.SetTitleFCS(self.filename) ## Background traces try: self.Background = Infodict["Backgrounds"] except: pass ## Preferences ## if Preferences is Not None: ## add them! # External functions for key in Infodict["External Functions"].keys(): NewModel = usermodel.UserModel(self) # NewModel.AddModel(self, code) # code is a list with strings # each string is one line NewModel.AddModel( Infodict["External Functions"][key].splitlines()) NewModel.ImportModel() # Internal functions: N = len(Infodict["Parameters"]) # Reset tabcounter self.tabcounter = 1 # Show a nice progress dialog: style = wx.PD_REMAINING_TIME|wx.PD_SMOOTH|wx.PD_AUTO_HIDE|\ wx.PD_CAN_ABORT dlg = wx.ProgressDialog("Import", "Loading pages..." , maximum = N, parent=self, style=style) for i in np.arange(N): # Let the user abort, if he wants to: if dlg.Update(i+1, "Loading pages...")[0] == False: dlg.Destroy() return # Add a new page to the notebook. This page is created with # variables from models.py. We will write our data to # the page later. counter = Infodict["Parameters"][i][0] modelid = Infodict["Parameters"][i][1] self.add_fitting_tab(modelid=modelid, counter=counter) # Get New Page, so we can add our stuff. 
Newtab = self.notebook.GetCurrentPage() # Add experimental Data # Import dataexp: number = counter.strip().strip(":").strip("#") pageid = int(number) [tau, dataexp] = Infodict["Correlations"][pageid] if dataexp is not None: # Write experimental data Newtab.dataexpfull = dataexp Newtab.dataexp = True # not None # As of 0.7.3: Add external weights to page try: Newtab.external_std_weights = \ Infodict["External Weights"][pageid] except KeyError: # No data pass else: # Add external weights to fitbox WeightKinds = Newtab.Fitbox[1].GetItems() wkeys = Newtab.external_std_weights.keys() wkeys.sort() for wkey in wkeys: WeightKinds += [wkey] Newtab.Fitbox[1].SetItems(WeightKinds) self.UnpackParameters(Infodict["Parameters"][i], Newtab) # Supplementary data try: Sups = Infodict["Supplements"][pageid] except KeyError: pass else: errdict = dict() for errInfo in Sups["FitErr"]: # Each item is a pair: [parameter position, standard error] errkey = mdls.valuedict[modelid][0][int(errInfo[0])] errval = float(errInfo[1]) errdict[errkey] = errval Newtab.parmoptim_error = errdict try: Newtab.GlobalParameterShare = Sups["Global Share"] except: pass try: Newtab.chi2 = Sups["Chi sq"] except: pass # Set Title of the Page try: Newtab.tabtitle.SetValue(Infodict["Comments"][pageid]) except: pass # no page title # Import the intensity trace try: trace = Infodict["Traces"][pageid] except: trace = None if trace is not None: if Newtab.IsCrossCorrelation is False: Newtab.trace = trace[0] Newtab.traceavg = trace[0][:,1].mean() else: Newtab.tracecc = trace # Plot everything Newtab.PlotAll() # Set Session Comment try: self.SessionComment = Infodict["Comments"]["Session"] except: pass try: Infodict["Preferences"] # not used yet except: pass if self.notebook.GetPageCount() > 0: # Enable the "Current" Menu self.EnableToolCurrent(True) self.OnFNBPageChanged() else: # There are no pages in the session. # Disable some menus and close some dialogs self.EnableToolCurrent(False) def OnSaveData(self,e=None): # Save the Data """ Save calculated Data including optional fitted exp. data. """ # What Data do we wish to save? Page = self.notebook.GetCurrentPage() # Export CSV # If no file has been selected, self.filename will be set to 'None'. self.dirname, self.filename = opf.saveCSV(self, self.dirname, Page) def OnSavePlotCorr(self, e=None): """ make some output """ # Saving dialog box. uselatex = self.MenuUseLatex.IsChecked() verbose = self.MenuVerbose.IsChecked() show_weights = self.MenuShowWeights.IsChecked() Page = self.notebook.GetCurrentPage() plotting.savePlotCorrelation(self, self.dirname, Page, uselatex, verbose, show_weights) def OnSavePlotTrace(self, e=None): """ make some output """ # Saving dialog box.
uselatex = 1*self.MenuUseLatex.IsChecked() verbose = 1*self.MenuVerbose.IsChecked() Page = self.notebook.GetCurrentPage() plotting.savePlotTrace(self, self.dirname, Page, uselatex, verbose) def OnSaveSession(self,e=None): """Save a session for later continuation.""" # Parameters are all in one dictionary: Infodict = dict() Infodict["Backgrounds"] = self.Background # Background list Infodict["Comments"] = dict() # session comment under "Session", page comments under int keys Infodict["Correlations"] = dict() # all correlation curves Infodict["External Functions"] = dict() # external model functions Infodict["External Weights"] = dict() # additional weights for the pages Infodict["Parameters"] = dict() # all parameters of all pages Infodict["Preferences"] = dict() # not used Infodict["Supplements"] = dict() # error estimates for fitting Infodict["Traces"] = dict() # all traces # Save each Page N = self.notebook.GetPageCount() # External functions for usermodelid in mdls.modeltypes["User"]: # Those models belong to external user functions. # The model code is stored in the docstring of the model function. modeldoc = mdls.modeldict[usermodelid][-1].func_doc modeldoc = modeldoc.splitlines() docnew="" for line in modeldoc: docnew = docnew+line.strip()+"\r\n" Infodict["External Functions"][usermodelid] = docnew for i in np.arange(N): # Set Page Page = self.notebook.GetPage(i) counter = int(Page.counter.strip().strip(":").strip("#")) # Apply currently set parameters Page.apply_parameters() # Set parameters Infodict["Parameters"][counter] = self.PackParameters(Page) # Set supplementary information, such as errors of fit if Page.parmoptim_error is not None: # == if Page.chi2 is not None Infodict["Supplements"][counter] = dict() Infodict["Supplements"][counter]["Chi sq"] = float(Page.chi2) PageList = list() for pagei in Page.GlobalParameterShare: PageList.append(int(pagei)) Infodict["Supplements"][counter]["Global Share"] = PageList Alist = list() for key in Page.parmoptim_error.keys(): position = mdls.GetPositionOfParameter(Page.modelid, key) Alist.append([ int(position), float(Page.parmoptim_error[key]) ]) Infodict["Supplements"][counter]["FitErr"] = Alist # Set exp data Infodict["Correlations"][counter] = [Page.tau, Page.dataexpfull] # Also save the trace if Page.IsCrossCorrelation is False: Infodict["Traces"][counter] = Page.trace # #Function_trace.append(Page.trace) else: # #Function_trace.append(Page.tracecc) Infodict["Traces"][counter] = Page.tracecc # Append title to Comments # #Comments.append(Page.tabtitle.GetValue()) Infodict["Comments"][counter] = Page.tabtitle.GetValue() # Add additional weights to Info["External Weights"] if len(Page.external_std_weights) != 0: Infodict["External Weights"][counter] = Page.external_std_weights # Append Session Comment: Infodict["Comments"]["Session"] = self.SessionComment # Save everything # If no file has been selected, self.filename will be set to 'None'.
self.dirname, self.filename = opf.SaveSession(self, self.dirname, Infodict) #Function_parms, Function_array, Function_trace, self.Background, #Preferences, Comments, ExternalFunctions, Info) # Set title of our window self.SetTitleFCS(self.filename) def OnShell(self, e=None): Shell = wx.py.shell.ShellFrame(self, title="PyCorrFit Shell", style=wx.DEFAULT_FRAME_STYLE|wx.FRAME_FLOAT_ON_PARENT, locals=locals()) # Set window icon if self.MainIcon is not None: wx.Frame.SetIcon(Shell, self.MainIcon) Shell.Show(True) def OnSoftware(self, event=None): # Show About Information text = doc.SoftwareUsed() wx.MessageBox(text, 'Software', wx.OK | wx.ICON_INFORMATION) def OnTool(self, event): eid = event.GetId() try: # Check if a tool is open self.ToolsOpen[eid] except KeyError: # eid is not in self.ToolsOpen # So we open the dialog and add it to the list self.ToolsOpen[eid] = self.Tools[eid](self) self.ToolsOpen[eid].MyID = eid self.ToolsOpen[eid].Bind(wx.EVT_CLOSE, self.ToolsOpen[eid].OnClose) self.toolmenu.Check(eid, True) else: # We close it then self.ToolsOpen[eid].OnClose() def OnUpdate(self, event): misc.Update(self) def OnWiki(self, e=None): """ Go to the GitHub Wiki page""" webbrowser.open(doc.GitWiki) def PackParameters(self, Page): """ Gets all parameters from a page and returns a list object that can be saved e.g. as a safe YAML file """ Page.apply_parameters() # Get Model ID modelid = Page.modelid # Get Page number counter = Page.counter active_numbers = Page.active_parms[1] # Array, Parameters active_fitting = Page.active_parms[2] crop = [Page.startcrop, Page.endcrop] Parms = [counter, modelid, active_numbers, active_fitting, crop] # Weighting: # Additional parameters as of v.0.2.0 # Splines and model function: # Additional parameters as of v.0.6.4 #self.Fitbox=[ fitbox, weightedfitdrop, fittext, fittext2, fittextvar, # fitspin, buttonfit ] # Some fits like Spline have a number of knots of the spline # that is important for fitting. If there is a number in the # Dropdown, save it. # knots = str(Page.FitKnots) knots = filter(lambda x: x.isdigit(), knots) if len(knots) == 0: knots = None else: knots = int(knots) weighted = Page.weighted_fittype_id weights = Page.weighted_nuvar Parms.append([weighted, weights, knots]) # Additional parameters as of v.0.2.9 # Which Background signal is selected? # The Background information is in the list *self.Background*. Parms.append([Page.bgselected, Page.bg2selected]) # Additional parameter as of v.0.5.8 # Is the Experimental data (if it exists) AC or CC? Parms.append(Page.IsCrossCorrelation) # Additional parameter as of v.0.7.8 # The selection of a normalization parameter (None or integer) if Page.normparm is not None: # We need to do this because yaml export would not work # in safe mode. Page.normparm=int(Page.normparm) Parms.append(Page.normparm) # Parameter ranges Parms.append(Page.parameter_range) return Parms def UnpackParameters(self, Parms, Page): """ Apply the given parameters to the Page in question. This function contains several *len(Parms) >= X* statements. These are used for opening sessions that were saved using earlier versions of PyCorrFit. """ modelid = Parms[1] if Page.modelid != modelid: print "Wrong model: "+str(Page.modelid)+" vs. "+str(modelid) return active_values = Parms[2] active_fitting = Parms[3] # As of version 0.7.0: square pinhole TIR-FCS models # use sigma instead of lambda, NA and sigma_0.
This # is for backwards compatibility: changeTIRF = False if modelid in [6000, 6010]: if len(Parms[2]) > len(mdls.valuedict[modelid][0]): lindex = 1 changeTIRF = True elif modelid in [6020, 6021, 6022, 6023]: if len(Parms[2]) > len(mdls.valuedict[modelid][0]): lindex = 2 changeTIRF = True if changeTIRF: lamb = active_values[lindex] NA = active_values[lindex+1] sigma = 0.21*lamb/NA active_values[lindex] = sigma active_values = np.delete(active_values,lindex+1) active_fitting = np.delete(active_fitting, lindex+1) # Cropping: What part of dataexp should be displayed. [cropstart, cropend] = Parms[4] # Add parameters and fitting to the created page. # We need to run Newtab.apply_parameters_reverse() in order # for the data to be displayed in the user interface. Page.active_parms[1] = active_values Page.active_parms[2] = active_fitting # Cropping Page.startcrop = cropstart Page.endcrop = cropend Page.crop_data() # Weighted fitting if len(Parms) >= 6: if len(Parms[5]) == 2: [weighted, weights] = Parms[5] knots = None else: # We have knots as of v. 0.6.5 [weighted, weights, knots] = Parms[5] if knots is not None: # This is done with apply_parameters_reverse: # text = Page.Fitbox[1].GetValue() # text = filter(lambda x: x.isalpha(), text) # Page.Fitbox[1].SetValue(text+str(knots)) Page.FitKnots = int(knots) if weighted is False: weighted = 0 elif weighted is True: weighted = 1 elif len(Page.Fitbox[1].GetItems())-1 < weighted: # This is the case, e.g. when we have an average std, # but this page is not an average. weighted = 0 Page.weighted_fittype_id = weighted Page.weighted_nuvar = weights Page.apply_parameters_reverse() if Page.dataexp is not None: Page.Fit_enable_fitting() Page.Fit_WeightedFitCheck() Page.Fit_create_instance() if Page.weighted_fit_was_performed: # We need this to plot std-dev Page.calculate_corr() Page.data4weight = 1.*Page.datacorr # Set which background correction the Page uses: if len(Parms) >= 7: # causality check: if len(self.Background) > Parms[6][0]: Page.bgselected = Parms[6][0] if len(Parms[6]) == 2: # New in 0.8.1: CC background correction Page.bg2selected = Parms[6][1] # New feature since 0.7.8: BG selection on Page panel Page.OnAmplitudeCheck("init") # Set if Newtab is of type cross-correlation: if len(Parms) >= 8: Page.IsCrossCorrelation = Parms[7] if len(Parms) >= 9: # New feature in 0.7.8 includes normalization to a fitting # parameter. Page.normparm = Parms[8] Page.OnAmplitudeCheck("init") if len(Parms) >= 10: Page.parameter_range = np.array(Parms[9]) ## If we want to add more stuff, we should do something like: ## if len(Parms) >= 11: ## nextvalue = Parms[10] ## Such that we are compatible with earlier versions of ## PyCorrFit sessions.
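# For reference - a sketch of the *Parms* list as assembled by # PackParameters above and consumed here (older sessions may contain # fewer entries, hence the *len(Parms) >= X* checks): # [0] counter, [1] modelid, [2] parameter values, [3] fitting flags, # [4] [startcrop, endcrop], [5] [weighted, weights, knots], # [6] [bgselected, bg2selected], [7] IsCrossCorrelation, # [8] normparm, [9] parameter_range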
def SetTitleFCS(self, title): if title is not None and len(title) != 0: title = " {"+title+"}" self.SetTitle('PyCorrFit ' + self.version + title) else: self.SetTitle('PyCorrFit ' + self.version) pycorrfit-0.8.1/src/icon.py0000644000175000017500000004460612262516600014374 0ustar toortoor#---------------------------------------------------------------------- # This file was generated by /usr/bin/img2py # from wx.lib.embeddedimage import PyEmbeddedImage Main = PyEmbeddedImage( "iVBORw0KGgoAAAANSUhEUgAAAQAAAAEACAYAAABccqhmAAAABHNCSVQICAgIfAhkiAAAAAlw" "SFlzAAAMqAAADKgBt04g1gAAABl0RVh0U29mdHdhcmUAd3d3Lmlua3NjYXBlLm9yZ5vuPBoA" "ACAASURBVHic7Z13nFTV2ce/d9r2Xcru0quAFDE2sASUJogGS9TYYosxFozGkMTo6/uuEzXF" "RoxYIomx94YGjBRpKioqGKQICNKWBRbYXdg27b5/nLne2WXuzOwyc++dmfP9fOYzs3PP7jwM" "9/zOc855nucoqqoikUiyE4fVBkgkEuuQAiCRZDFSACSSLEYKgESSxUgBkEiyGCkAEkkWIwVA" "IslipABIJFmMFACJJIuRAiCRZDFSACSSLEYKgESSxUgBkEiyGCkAEkkWIwVAIsliXFYbIJFk" "K4pXOR64GmgGvlQr1BdMt0EWBJFIzEfxKicD7wElEW/fpFaoj5pqhxQAicRcFK/iAbYB5a0u" "qUC5WqFWm2WLXAOQSMznfA7t/AAKJk/LpQBIJOZzg8H7KmI9wDSkAEgkJqJ4laOA0QaXP1Ar" "1P1m2iMFQCIxl+tjXHvcNCvCSAGQSExC8SqFwOUGl3cCs0w0B5ACIJGYyaVAscG1f6gVasBM" "Y0AKgERiJkaLf0FgppmGaEgBkEhMQPEqJwHHGFyerVao28y0R0MKgERiDkajP1iw+KchBUAi" "STGKVykFfmJweRPwvonmtEAKgESSem4Fcg2uPalWWBePLwVAIkkhilfpANxkcNkHPGWiOYcg" "BUAiSS03Y7z196Jaoe4x05jWSAGQSFKE4lWKgFsMLoeAP5loTlSkAEgkqeNGoJPBtVfUCnW9" "mcZEQwqARJICFK+SD/za4LIK3GuiOYZIAZBIUsMviJ7zD/CmWqGuNtMYI2RNQAmIQhRuwBPx" "rL0OIVar/eFn7bXpcevpguJVcoDfxmhyj1m2xEMKQOajIFahOwEdI56L0Dt6e+4DFV0QmoH9" "wL7ws/ba1OIWNuJnQHeDa++qFepKM42JhRSAzKII6Ino5FpH7wg4U/BZCpATfhQBpVHaNNBS" "GHYBlYjkl4xE8Spu4LYYTe42y5ZEkAKQ3uQDvYDe4ecOcdo7EP/nzvDDgejIkY/W74EY7UPh" "ZzXKzyFEpw6En7XItvzwo0eEDQGECGwDtiJEIdTGf7eduRzoY3DtfbVCXW6mMfGQVYHTixxa" "dvjOBu1c6B3dFfFQDNonG00IAq1eR+vofmA7uiDsNsnGpBOu9rsO6GfQZJRaoX5koklxkR6A" "/XEgbqghwBFEd+ddiLl8Tvg5XkdXEe65P8oj0OpnB2IxMPLhavVzbvhzI+1xhe2JJIC+ZuAL" "2+EO//u0TlMLrA0/TK2PlwRuwbjzL7Rb5wfpAdiZbsBQ4EgOTSRxoi/g5WC8nesD6sKP2vBD" "+znZbncu4pCL4vCjJPwoxFiQtJ2F5vDr1jfjToQQrAOakmxvCxSvogC5aoXa2M7fLwM20PKg" "j0jGqRXqwvbalyqkANiLEsRIP5RD5/MuIA/R0aJ5ASFgD1CFmFfXAu26mZOMA7FI2BnoihC2" "AoO2PoTNTbQUgxCwGViDSJ9t9yKi4lWKgcnA2LAt3cJ2dUEIaj1CeKrCjy3AbGBprJJdild5" "HOOCn0vUCvW09tqcSqQA2IMewIlA31bvO9A7vTvK7+1F3KQ7EXPndNmbL0IXg64c6uGoCBFo" "4tCtxCZgJfAlCXoFilfpDZwdfowh+ncZjxpgDvAuMEetUOsi/v4w4CuiC7MKjFQr1M/b8Zkp" "RwqAtfRFdPzIVXJtey2PQ+fQfsSItB3R8X2pN9EUOiDEoC9Q1upaCOEVNNJS4PyITvcFYtQ+" "BMWr9EVsu11GchdADwD3AdPVCrVe8SrvAxMN2j6rVqhXJvGzk4oUAGsYgOj4XSLecyJc4zxa" "3qwqsAPh+m4jg/fQwxQB/cOPolbXAojOHjlFCABfA8sRHRPFq3QG/geYSsvFyWRTBbyOcb5/" "AzBIrVB3pNCGw0IKgHkowGBgJC2371zoHT+SvcC3wHekeAHMxpQjhKAvLTtyECEEjehCEALW" "dHuw25Cqg1VejBfjzMSrVqh3WW1ELKQAmEMvYDwtU0NdiBXyyPlvAPgGsZpch0TDiYhwHELL" "BJsQQggaGgONyhnPn3Huki1LfmiFgVHYgRj9G6w2JBZSAFJLAXAaYuTX8ITfj5zf+9D3vjNl" "Xp8qugDDiYi131633X36c6dPWVe97ohE/4iiusivHYanqRvu5lLczaU4A0UEPPvx5+zBn7OH" "5vytNBZubK+dV6oV6rPt/WWzkIFAqcGBqAF/Crrr6kTsj0d2/CbE1tY3iEUtSXx2hR+lwPCv" "qr4aOun5ST/ZVb/LqPDG9ziCeRRXn0JJ9akUV5+CM1AY98N8uVXUli2mtmwJBzusBCWh8In9" "wIuJNLQa6QEknx4Id19LjlEQI37k3dYArAbWk/mLeiljRdWK3DFPj3mxrrluYKx2SshN6fbz" "6fLdlbj88dIljGkq/JbKI56grvTDRJr/Ta1QjcqB2QYpAMkjHzgVEcSjkYMY9bX9YR9iD3s9" "mZUAYzq+oE/p8VCPB6obqo223wDouOt0um2ciqepS6xmbaK+ZBU7Bk2noXhtvKbXqhXqP5L2" "wSlACkBy6A2ciRABiO7ub0LsWdshOi/tOXLGkdev37t+qtF1RXXQfcPNlG27KCWfH3L42Db0" "HvZ3mRermQ+YoFaoS1NiRBKQAnB4KMDJiD19LX02n5bx7zXAp4h5qyQJnPPyOaPe+eadxzAI" "7nEGCujz9d0U7z055bZU9XuKqv7/4NA0hu/ZAwxVK9TqlBvTDuQiYPspQIz6vcI/OxERbVqY" "aQARqbYW6e4njZqmGsfcb+f+BoPO7wjmMuDLR8k7cKQp9nTd/DOcwXx2DHzYqEkZ8H+I8wFs" 
"hywK2j56Iwo/aJ0/B7Hop3X+rcDbiIU+2fmTyBkvnHFOU6DJYLtPofea/zWt82uUbb2YTpU/" "itXkesWrDDDLnrYgBaBtKIitvfMRrr6CCFftGH4dBJYBixAr/ZIksmr3qpwvd35pOO/vuvlq" "OuweZ6ZJ39Prm99RUPMDo8tu4I8mmpMwUgASx4Po+CchOrsTEdmnpbbWItJGN1hiXRZwyeuX" "XOYP+qMu5+cdHEjXzdeYbdL3KCE3fVZXoIQMEw0vVLzKCDNtSgQpAImRjzjeuXf451xauvzf" "Ijp/jfmmZQ8b92883+ha9w1TQbX2dvY0daN0+wWxmlxhli2JIgUgPiXAxegx6EWIxT7N5f8o" "/EiXXPy05HfzfjegOdDcO9q1on0jKdp3otkmRaXrd1fFijA810xbEkEKQGzKgUvQO3wJustf" "A/wbMfpLUsw7698Zb3St66ZrzTQlJk5/cSwvoKfiVU4w0554SAEwpjfC7dcW+zqgp+xWIqrD" "1FpjWvaxvXZ71NU9d3MZBbVHmW1OTDrsHhvrsq28ACkA0RkEnIdeYbcjelTfZuADpMtvGk98" "/kR5vb9+aLRrJdWjzDYnLnkHBsUKPT7bTFviIQXgUI4GzkI/OKMzekbfOmApcm/fVOZvmm9U" "apviPaPNNCVhSvacanRpcLgCsS2QAtCSQYhMPm2brzN6tOQK4DOL7Mpqqg5WGZ2yS35dVMfA" "cvLrhhldchP9GDVLkAKg0xtRLjqy8zsRQd7LgFXWmZbd7G/a37pQKCD23l1+O1T+OhR3c8w+" "bnRwqOlIARCUI+ZmmtvfKfwcQkT1yeAeCznoOxjVA3D7bDOQHoK7OapmaUgBsBEdgB+jL/h1" "Qh/5lyAq8UospMHfELU3xRllLUV6AOlBAS3j+juiz/k/RST1SCQZSzYLQA5i5C9B3+fXVvu/" "QlTtkdiAfHf+nmjv+3NsmWIPxLWt0iw74pGtAqAAP0I/hSayes83CAGQ2IRCT2HUI8P9HjsL" "QFTN0pACYDEnAn3Cr4vQI/y2Irf6bEfH3I5Re5Pq8BNw2zMYU3oA9qUnoowXiKw+LbZ/FyLI" "R9ZIsxldC7tG9QAAGorXmGlKwjQUrza65Ads47pkmwDkIcp4aXv92ibyAWAhskS3LZnQf8Jm" "o2t1Zfast1lbtsTo0jq1wj6FOLNNACajF+zUMvxCwGLkiTy25foTrt9d4C6IOtTXJlaj31Qa" "i9bjyzWsAfuOmbbEI5sEYCTikEkQIqAV81gO7LPCIEni9Czp+UG09/05e6gv+dpsc2JSU74w" "1uW3zbIjEbJFAHoA2qGROejz/i2IVX+JzTl70NkLjK5V9Z9ppikxCbrrqO75utHl7WqF+rmZ" "9sQjGwQgF33e76DlvP9jq4yStI37Tr9vY44rJ2pg1oFOn3Gg06dmmxSVqr5PE3QdNLpsq9Ef" "skMARiG2+kDM+7UY/8XIAznTioGdBr5mdK1y4KOJHtyZMny5O2ON/irwLxPNSYhMF4CuiKOk" "Qbj9WqTfZ8h5f9rx1kVvvehxeqqiXWss3EBVv3+abdL3qA4/W4Z5UR2GY8oraoX6pZk2JUIm" "C4CCntvvQD+ddwcyzDctGdBpgO/4bsfPMLpe1e9f1JRHXStMOduOvI/6DoYBpD7gf0w0J2Ey" "WQB+AGh1mYrQq/jaY7IoaRfzLp/3bp4rb2P0qypbh95NY5G567p7er/Mvu7/NrzucrhmqhXq" "JhNNSphMFYB89FV/D3qo7yrAcIVGYn8KPAWhSQMm3Y9BxGbI2cTG46ZSV/qRCdaoVPWfyY6B" "fzNske/OPzj38rkxjxC2kkwVgNMQ230KItEHoA6w14axpF28ddFbHw/qPOgxo+tBVz2bjv4d" "u/u8kDIbQs5GNh99O1X9nsIoetypOIMPTHzgmbF9x/ZHTzyzFZkoAL2AIeHXBbTM75fFPDOE" "VTes+ntpfulcwwZKiMoBM9g8/HZ8eTuS+tkHOn3O+hE/p7Zsccx2t55867wbTrjhO8RANCGp" "RiQJRbVPWHKyuAJRdNEZflaA7xDVfSQZxIqqFbmjnhr1bIO/YUisdkrITemO8+iy+Wpc/g7t" "/rzGwg1UDniUA53jLyNNHjj5izmXzvkAaEQ/P+I/gK2ylzJNAAYCU8KvtVr+AUQAhjytNwN5" "bPljXabNnfaPpkBT33htHcE8iqtPoaT6VIqrT4l1hNf3+HJ3UVu2mNqyxRzssDKhWIPS/NK5" "227d9mKuK1crDbwPsROwH3gaG2WcZpoA/BRR4NOFXnr5C8AwN1OS/szfNL/wwtcufKCmqeaH" "8VsLFNVFfu0wPM3luJpLcfs64wwUEHDX4M/ZS8Czj+b8bTQWtqkerDqw88Anvr7h68c9To8D" "cbhMPqLza3En/8ZG29CZJAD9EF846KN/E/AGMs0347nwtQtPfH3N609i0bqWQ3E0jek75n8W" "XLEgcl1iMCIJDWAvIvJ0D/Cc2fYZkUmLgNrxsC708l5rkJ0/45m+bHr3t9e9fT8W3c95rryN" "N5xwwxWtOj+IcvKN4dfafKMM6G+acXFwxW+SFvREL7WsZfr5kJl+Gc8rq1/pePsHtz8ZCAU6" "mv3Zbqd793Hdjpux4PIFswo8BdEWB4KIQeh4xKDkQqxJnQjYIjAoUwRAG/2d6EE/a5HJPhnN" "su3L8q+Zdc3jzYHmPkZtclw5W5sDzb0Qu0FJwaE46gd1HvTUqxe++szw8uHNcZp/AxyFnoZe" "C3RDnERledn5TBCArugFPrXRP4AQAEmGsqVmi3vy85MfrvfXGx7CV5RT9OW6qeuueW3Na2VP" "r3x6zOaazWPrmupGqKhtvu+divNA5/zOS4eUDll015i7lo7pOybRiFLtXjwGMTgdRHgGI7GB" "AGTCIuA5wBGI0V+LtvoasF3mlSQ51PvqHX0f7nt/dUP1RKM2Hqdnxws/fuGSC4ZesD/y/UXf" "LSr0LvaO2rR/0zEN/oaypkBTuS/oK/eH/GWqqrodiqPe4/Ts8Tg9u3NduXsKPYU7j+9+/LIZ" "k2d8UV5Q3t71JA/iABo3LeMCXgJ2tvNvJoV0F4AC4BfoIb/5CHV9A7EDIMlAek7veeeOuh0X" "GV13OVz7Kk6ruPzOU+9s0wi7aveqnARc+vZyLHpq+h7Effo1YBzNaALpvgswGNH5FfS5/wZk" "589YBs0YdGOszu9UnAevO/6669va+QFS2Pmh5Y6Udq8OwuJpeLoLgHY4vJb4A2CQKipJd45/" "8viLN+zdcIPRdUVRfD8e+uNbZpw5w47rP83oB83mhp89wABrzBGkswCUos/5NUWtQVb6yUjG" "PTNu0pc7v7w9RpPQuL7jfv/qBa/a+WSnb8PPLvSq1EMN2ppCOguAlgDiQA/8+dagrSSNOefl" 
"c0Yt2rLoT8S4X0/ofsI986+Yb9u8+zCV6NNTbdDqg757ZTrpKgAKugBo7pQKGJ4gI0lPprw0" "ZfS76999WFVVt1GbI0uPnLH82uWGBUNtROQ9moe+fhUzmzGVpKsA9EIPrdSUdCcy4y+jOOvF" "s06bvWH2X1VV9Ri16VXc68V1U9f93Uy7DhPNS1XQi9RaNg1IVwHQvjAn+lzKFqGVkuQw+YXJ" "Y9/b+N70WJ2/rKDsvY03b/yzmXYlgX2ItSoQ29bQcj3LVNJRAFyIvH/QR/8ANoiqkiSHSc9P" "Gvf+xvcfiuX2l+SWLFt53co7PE5POgayaF5ADnofHGyFIekoAD3QR31t/r8FIQKSNOf0506f" "MO/beQ/GCtctyin68r3L3vtV96Lu6fp/HrlWpS1gG+YzpJJ0zAXoFX52oNu/3SJbMo511ety" "/r3+351X71ldVnmgsnO+O7+pf8f+e0b2GFl93uDzalI54o5/dvzEhZsX3qeiOo3aFOcUfz7/" "ivk3jug+otGoTRrQgJgKdEKsAzQipgC5mBzElo4C0Dv8nBPxnuFZzBJjfEGfctOcm4Yt2Lxg" "fNXBqlHNgeaeQTVoWCdLQQm6nK69RZ6i1YM6D/pg2snTFreOtW8v454ZN2nRd4v+Eqfzf7bo" "qkVTj+16bCZEeu5ECIB2HyuItHZTA9nSLRfAA0xFfFkliDWAGmx25rrdufLtK4/5YPMHZ+2q" "3zXOH/SXt/fvKCihwpzCFX079P3goYkPvTmh/4R2nbkw5ukxk5dsWfJnFdVwSlqSU/LpwqsW" "3pQhnR/EVHZ8+LWWG7ASMPVoo3QTgP7AueHXZYhdgLXAcsssSiN+N+93A/6x4h+/2t+4/7Rk" "/22nw1k7tHTok29e9ObLAzoN8CX6e6P/Nfqsj7Z+9MeYnT+3ZNnSq5f+MsWx+mbjAi5BDGa1" "iGnAXuAZM41INwE4DVFdJTL1dyF6jLUkCo8tf6zL3UvuvqnqYNXZpHjh1+P0VI7oPmLG+z99" "f7ZBlZzvOfrxo69YtXvVb4hRrKNDboePl12z7ObBpYMzqfNrTEbcx03oW4NPYGI8S7oJwOWI" "LywPMQUAeBlR/ksShXHPjpu4+LvF94bUUG781smjyFP05RM/euLWS4dfekhuhi/oUwY9Mug3" "W2q3XBHrb3TI7fDhsmuW/SpDOz/oKcIhYHf4vdmYWMounbYBc9FHfS04ZC+y80cl3MmmLty8" "8AGzOz/AAd+B466edfVL0+ZOGxj5/sZ9Gz09H+p5f7zO3zG349Ll1y6/JYM7P4B21HnkjlZv" "g7YpIZ0EoEfEa23lNOpZ8dnO8srleT0e6vHghn0brieJtfDaii/o6/7XT/76/OQXJo8FeH/j" "+0XHPXnck3sa9kyK9Xsd8zouXnHdilvaspaQpuxGP65Ou6d7mmlAOk0BRgCjEaKlrVzL+X8r" "VlStyB39r9FP1fvqh8dvbRqhYWXDHt60f9OUxkBjzPz30vzSeZ9f+/ltfTr0yZaCrmcg7ufI" "UmF/w6TAtnSKA+gUfo60uSZaw2zFF/QpE5+beK/NOj+AY/We1bfGa9SruNeLa6eu/Uu8xcMM" "Yz/6aVYaHRFbgyknnQUghKiwKgkz/PHh18UqlGlj1GFlw6Z/fePX/7LaEAuoCz9bIgDptAag" "HfygfVEHsNEhi1Yz4dkJp6/fu/5Gq+1oK4qi+Ef3Gf37LO38oAuAgt4fOxm0TTrpIgB56Ik/" "WqhorUHbrOPplU+XLvxu4T1YuODXHpyK8+D5Q86/YclVS+ZYbYuF1EW81gY30045ShcBiPxC" "tC+pLlrDbOTOD+68IaSG8uO3tA8uh2vvL0/85ZWvXfjap1bbYjEH0XcCtMFNegCt0L4QBekB" "tOCuRXf1qTxQeb7VdrSVwpzCtdMnTbfNMdkWoiKmsyA9AEM0AYjMFJMeAPDo8kdvjpVBZ1dq" "GmtGXfn2lcdYbYdNaL0Q6EEveZdS0kUAWi8AgvQAuGnOTUPSdNUfgFnrZt1ktQ02wWgnIOWk" "iwBoaqiNdD5kCDBzNsyZbLUNh0Ndc92IV1a/Yvqx3jZEE4BITy4vWsNkky4CoMX+a6vc2RIl" "FpPKg5XjrLbhcFBRHdM/mT7GajtsQORgpt3jhsVQk4kUgDTl9/N/f0RzoNmSOnLJZMPeDWkt" "Ykki8n6WAhAFrQioFIAws76ZNT5+K/uzv2n/ycsrl5vi7toYKQBxaO0BZP38f0fdjh9abUMy" "UFU1x7vIO8JqOyxGCkAMImvDa/amaznopNEcbO5utQ3JYufBnRnzb2knkQKg3eNSAMJEfhHS" "A0Bk/fmD/lKr7UgWtc21lpyKYyOkBxCDaAKQ1WsAr695vWOsgzPSjXpffbsrE2cIUgBiEDkF" "kAIAfLL9k4zqME2Bpmz3AFREWXCQAnAIkV+EZm9WC8CW2i0Z4/4D+IK+jPr3tBNtXUuuAbQi" "GPFay/9Pu9j3ZFKSU1JvtQ3JxKk4M+rf0060vqjd40Gjhqn4UDsTueCnfTmGp8ZmA0eVH7U7" "fqv0IdeVa0r1G5uj3dNaarApC91SANKQs488u9pqG5JJnjsvowStHUQu6Gr3uBSAMJHzfSkA" "wODSwc1OhzNjsiGLPEUZJWjtIPJ+lgLQCukBRMHj8GSM29wxr2O2ewBSAGIQQl8Q0eZHWS8A" "JbklX1ttQ7IY32/8KqttsBgpAHHQvgzpAYQ5pusxph4jnSpyXbmb7hpz1xar7bCYyPtZLgJG" "obUAmLJHamemT5r+sUNxNFptx+HSo6jHAqttsAHRPABTzkRMFwHQFgKlBxBmcOng5o55HT+y" "2o7DZVy/cRnhyRwmcgoQh6bws7YWkEP62J4yhpQOSevR0+1075px5ozVVtthAwrCz5EH3UgB" "iGB/+DkyXrrIIltsw/PnPf++x+nZYbUd7eXIzkc+53F65OlOUBx+jkxzN2WbN10EYF/4OfIL" "Ko7WMJvo06GPf2SPkY9YbUd78Dg9lW9d9NZLVtthE1oLgIpJB9+miwBoHkAI3U0qscgW2+AL" "+pRAKOBRUEyJG08mI7qPmDGg04CsrusQQWsBqMWkXIB0ySnfH/E6gFg0yWoPYNrcaQNnfjHz" "fw/4DhxrtS1tJd+dv+79n74/22o7bEJO+AF6p99v0DbppIsA1CBGfwe6AGSlB7B069KCy9+6" "/MatNVsvS8cTgRRF8V807KK7CzwFofits4LIgUzzAPZFa5gK0kUAtDlRJ3SVzDoPYNyz4yZ+" "uPXD2/xBf9oWBBnZfeQ9T53z1H+ttsNGRN7H0gOIwT6EAGgqqblOpgRMWMkfFv+hz/RPpt9R" "01RzitW2HA69S3o//8nPP3nTajtshiYAQfT1LekBREFTxcidgBIgYxNJ1lWvy5ny0pRrv93/" 
"7dWqqiYU/ViUU/TlQd/BoxJtbxYluSXLvrr+q/uttsOGaFPZyPtaegBRaB0LANAZAwH449I/" "9pq/ef6Q/Y37S+ua68oa/A2lTYGm0qAazM915u7Oc+dVF3oK95TkluztU9Jnx4MTH1zZvai7" "bcqNT3lpyuh53867oznY3DOR9h6np3J8v/F/mnPZnEVTXpoy+r0N790XVIOmnDAbjw65HT58" "7cLXftsht4Oc9x+KVg5Nu/eaAdMqJCmqmjZxGKXAFeHXnRD5ANuAhVqDqXOmDpn77dzxO+p2" "jG8MNA5oyx93Ks6DnfM7LxlWPmzBvePu/fDknic3JM3yNvDIZ4909S7y/n5v496ETv5RFMXf" "v0P/p9+6+K2/Dy8f/v106I4Fd/R/6JOHHmkONPdOnbXx6dOhz7Mrr1v5oOz8USkGzg2/3o/o" "/JuBt8wyIJ0EAOAGxKmpheGHb0vtljenvDTl4m+qv/mpL+jrlowPURTFV5ZfNv+XI3/56J2n" "3rk1GX8zHpUHKl0Tn5t4xdrqtdeH1FBCR2UV5xR/esuJt9zzh7F/+C7a9dkbZhdf9uZlD9Q2" "1Z6cVGMTQFEU/0k9TvJ+fM3Hs8z+7DRiEHBS+PUuxBrAYuALswxINwH4EeJL84TUUKe7Ft01" "9IGPHzi+MdDYNRUfpqAEuxd3f/2esfc8cdUxV6Wsas0lb1xywtvr3r6zKdB0RCLtXQ5X9Sm9" "Trl/8VWL58RrW9NU4zjpnyddu2HvhqtDaqggXvtkUOgp/Orioy7+y8wpM7M9zz8epwF9EMlu" "e8PvPY+J61rpJgBHAxP+/vnf+9+z9J4Lt9dtN2U7zKE4Go/oeMQzi65a9PdkrhM899/nOv92" "7m+n7arfNSWR9gpKqFdJr5eeOfeZGWP6jjnYls96cdWLnW6bf9sNO+p2XJiq+IEcV86WUb1G" "PTz/ivnzUvH3M5CLEDtZB8OPJuBxWiYFpZR0E4CO17577cynVjx1XkgNmR7GXJxT/Nlz5z33" "67OPPPuwEjXqffWOU58+9SdfVX11c1ANJpTUVOgp/O+VP7jynhlnzlh7OJ9916K7+jy2/LFf" "VjdWj03WToHH6akcWjb06dmXzn7NTgupNqcjoAn/PkT230bgHTONSBsBULyKC3gYuLEtv+cA" "OigKpQ6FMkUhF9irquxRVapDapuDCHJcOVt/c/Jvbrpn3D2b2/irAJz90tmj5m+aPy3RRUqn" "w1l7bNdj//rRzz56I5mZc8srl+fdseCOH67avWpsdUP1qcFQsEMbfl0tcBes6Vncc+GkAZMW" "PXzGw98ky64sYgignYqszf8XAivMNCItBEDxKh2AN4BxibQvUxRGuZyMdjk5+TyThgAAENxJ" "REFU1uWMWT1ke0hlaSDA4kCQ1cFQQr6XU3EePGvQWb+ddfGsDxOxB8QOxfP/fX5aXXPdiQn+" "itq1sOusRyY/8tAFQy9I6b5wTVONY9rcaUdt3Lex577GfWUHfAfKG/wNZc2B5jKn4mzMc+ft" "LnAX7CnJLdldll+257wh56259rhrMzb+wiTGAr0QI78W+PMM+lqAKdheABSv4gbmAmPitT3a" "6eAXOR5+4HR8f8BaW9irqrzqC/Cazx+3GoOiKP7Lj778Z8+c+8zKWO2mL5ve/c8f/fnm3fW7" "z4TEzMpz5224YMgFdz973rOmjgYS01AQ838P+vy/AXjCdEPSQAAeB66P1aavw8H1OW5GuZKz" "trVbVflns5/3/AFibV67HK69D5/x8EU3jrhxV+trszfMLr5pzk3Xbqndcmmic22H4mgYVjbs" "0flXzH+hvKA87VJ8JQnTE92b3YvYBfgGMD1D0tYCoHiVqcCMWG3Odrv4da4nJSGNK4JB7mz0" "URvjOypwF6xZ9vNlV2hBOBv3bfT8+JUfX7ymes11wVAw4YSl0vzSuXePvfsv159wvXStMx9t" "+y8AaNvLc4B1ZhtiWwFQvMoYYB4G4coO4KYcDz/xpDaauTKkcltjM5tDxr5AeUH5nG23bvv9" "hOcmTP50+6e3+IK+7on+/TxX3sZJAybd/9ZFb32cFIMldscD/ARxC2vuvw/h/pu+g2JLAVC8" "ihNYhVgpPQQHcG9eDqOT5PLHo0FVubWxmdVBYxHIdeVuaQo09Un0b7qd7t3HdTtuxoLLF8yS" "ufFZRWT03x5EbsvXiHUu07FrMtBVGHR+gJtzPKZ1foB8ReFPeTlcU9/EHgPBTLTzOxXnwYGd" "Bz716oWvPhsZuy/JGrRoTx96Ytsai2yxX01AxavkAV6j6+e6XVyQYrc/Gp3CItDeyBkFJdCr" "uNeLL1/w8plrp66dKTt/VlIElIVfa4e61AHbrTHHnh7ALUCPaBcGOBzcmmtdmvtgp4Obcj08" "1NS2Wpal+aVzbznxlofNSiyS2BZt9FfRz7o4rMjOw8VWAqB4FQfwa6PrN+a4sboI3jluF2/4" "AmyJsSioUZRT9OWFQy988J9n/1OWwJIA9A8/N6HH+1vm/oPNBAD4IbqL1ILjnU5GmjjvN8IJ" "3JDj5veNxh58rit305i+Y/763mXvLTRsJMk2eiNS2EF3/3diYvWfaNhNAM41unB9jn2OAxzl" "cjLc6WBVlF0Bh+JoXnHdigsHlw6WNe8lkQwPPwfQj/2y/Fg0uy0CRhWAng6FIU57mTrRHV07" "Q2oo5/YFt59gsjkSe9MdUb4OxL4/iNBfS91/sJEAKF7laPQ5UgtOddnNUYHRLqdhYP/yyuUJ" "JS1JsobI0V9b/PsCCwJ/WmMbAQBGGl0wc88/UUoVhaEGXklNY83wqBck2Ug50CX8Wiv22QR8" "ZY05LbGTAESt55erwDCbuf8axzijC5Mv6Iu6kCnJSo4OPwfRR/8VmHT8dzzs1LOixs+XKYqt" "jIykVIk+CQiEAp1rmmrsarbEPDqh39cHEVt/Pkwu+hELO92kUQWgVLGTiS0pdUQXABXV8cba" "NzpHvSjJJrTRP4Q++n8V8dpy7NS7onsABp3MDhh5AACfV34upwHZTVfE3j+Iub+KWPQzreR3" "IthJAKIWx8w324o2kBdDm/Y37jelBLfEljgArfRbALHlByLD1ZIDZ4ywkwBURXtzrw3TlTX2" "hYxtG9BpQMrOEZDYnqHoZ/4dQJ/7L7fMIgPsJACV0d6strEAxLJtQv8Je0w0RWIfCoAfhF83" "oZ9evQw9CMg22F4A9sQYZa3GyDtxKI7Gth7cIckYRiJSRlTE6A+i8MeXllkUA9sLwD5Vpc6m" "XsB2A3FyO92yrl920hNR6hvEaK8V/JiPiaf9tAU7CcCWaG+GgI8C9iuQGwKWGdiV68zdaa41" "EhvgRI9mjVz4+xqR9WdL7CQACxDlkQ9hiQ0FYHUwxD4Dz6RXSa+EDwyRZAzHoqf71qEX/Vhi" 
"mUUJYBsBUCvUGmBRtGufBYI02cyBWhpDlC456pIPTDRFYj09ESv/IHL9tTDfJdgo6CcathGA" "MG9He7MZmOW3PHHqew6qKrMN7Mlz5224Y/Qd20w2SWIdBYhCNiBc/7rw60qE+29rbCMAilcp" "AwYaXX/W56feJouBz/oChoeF9CjqIUf/7EEBTkUc8Q1Qg77n/x+rjGoLlifaK17leOBm9LPS" "o1KrqrzoC3CtxZWBdoZUXvNFXaoAUM8bfF5a/MdLksKx6CXs6tDz++cixMD2WCIA4QM/zwd+" "CZyS6O+94vczwe2kn8MaxyUEPNjsi75SCZQXlL933+n3bTTTJoll9ACOCr9uRF/1/wpYb4lF" "7cBUAVC8SjlwHeKwz4SPz9JoUuG2xmZm5udSEiMRJ1X8vdnPJwaLf4qi+KedPO1vJpsksYZ8" "YFT4deS8fw+w2BKL2okpR4MpXuUExGgf081PlGOcDv6an2uqer3vD3B3jPMAepf0fn7Lr7b8" "xUSTJNbgAiYCpeGfqxEi4Aeex+Iqv20lZX0ows2/GTg5mX97ZTDE/zY283+5OTEz8pLFPH+A" "P8fo/E7FefDBiQ8+mXpLJBajIE721Tp/Lfq8fx5p1vkhBR6A4lW6AL+gnW4+Yj71AuK01KfR" "51mH0N/h4C95OXRLUc2AEDCz2c9zxot+AKEzB5558+xLZ6eV6ydpF6PQC9fWo8f6r0IIQNqR" "FAEIn+gzCfg5MAVoz1L9VuAx4B9qhbo3/Hf7AZ9icFgIQImicFuuh1OTXDh0j6pyf5OPj+NE" "IQ4tGzp99Y2rn0rqh0vsyAm0DPapDb+uBF7HBhV+28NhCYDiVXoDPws/esVpbsRi4G/ALLVC" "PaS3KV5lNCKZIuahgD9wOpia4zGs1JsojSo87/Pzit8fN/qwS0GXf1f9pur2w/pASTpwFHBc" "+HUz+n7/XuAVbB7tF4s2C0B4bn82YrSfSPuCiTQ3/xG1Qo17bp7iVa4G/gmGpfi/51SXk9Pd" "Lk5yOtu0PrAhFGJpIMjbvoBhjH8khZ7Cr5Zfu/yawaWD5Sm/mc0A9K1qP7APPdX3ZfRpQFqS" "sAAoXuVI4BrgSkSt8/awBd3N39eWX1S8ynnAs+gJFzFxAyNcTo5zOihVHHRyiCO+c1HYp6rs" "VVX2qSrfhTt+VRvqDpQXlM+Zf8X8/5NHfGc8vRGLfgrCxd+HXuDz5fDPaU1cAVC8ykjgfkTI" "Y3tZhHDz34nm5ieK4lWGA7OAfodhy+GgDi4d/MjaqWtnWvT5EvMYCJyE6PwhhLsfRAjBa9g4" "xbctxBQAxaucCvwbg4KdcTgAvATMUCvUVe0zL6pNnYE3EMpsGk7F2Tih/4Tb/vPT/8gTfzOf" "o4Fjwq9DiJE+EH79DrDJIruSTjwB2EHbt/I+A2YCL6sVakrKYileRQEuBe4F+qTiMzQciiM0" "vt/4/04/Y/qiYWXD3kaMBJLMRAFGAIPDPwcQe/ua1/o+NjjRN5nEE4Aa9OqmsahBREHNTGRR" "L1koXiUHEWF4B9Ax2X//qPKjVj048cH5E4+YqG3xBBDTmajlyyRpjROxz68NKH5E5w+FH/8B" "1lljWuqIJwCVGJzZF+ZDxGj/mlqhNibZtoRRvEpH4CrgHMR/4uEEBWxzKI537x5796Y7Rt+h" "fTkehMBo88GPgM2H8RkSe+EGxiIO84CWW31+hNsftWRduhNPAE5ElOqKPORiL/AMYiV/bWrN" "azuKV+kEnIXYqjwOMYXJNWgeQiRwbEYo/Dtqhaqd26YAYxApnyDCpjuhb3uuQESASdKbQkTn" "1zzIRvSSXo3AWxicWZEJJLILcCIilj8f+BZ4W61Q02r7K+wh9ECIQR5iBbcSqFIr1HgRXCei" "V3xxIm4ULYdiB8ILSqvvQ/I9vRD/t1qQ2UH02v11iMXmtIvvbwumZANmAEcBpyO8AgfQAf2m" "aUBEM8qDQNIHB8I71EJ7VUSH16ax1cCb2PAgj2QjBSBx+gM/Qh/9C9GDklTEwQ8ZtUKcoRQg" "Ylq0/JIAYr6veYI7ELUps8KrkwLQNroh1heKwz97EN6Ati6wHbFAmBU3TxrSA7FIrNWkiJzv" "A/wXscuTlok97UEKQNvJQWQ+Dgj/7EBslWo3VT1iXWCX+aZJDHAiAnuGhX9u7fL7EHX80qaU" "V7KQAtB+jkW4ktqWY+SUAMSC6RekcaZYhtATcWKP9n/T2uXfjYh2TYsinslGCsDh0QWxLqAF" "S3nCrzVR8CG2C9dj07PhMphCRFRfZJp6a5d/BeLwDvsdPWUSUgAOnxzEDsGg8M8KYqEp0hvY" "C3yCDCM2AyfC1R+OLsRa4U6trlszwuXfYLp1NkMKQPL4AWJKoFVDciK8gchCJusRuwXGBQYl" "h0MPhLuvJa+piK28BvRRfycwG72Sb1YjBSC5FCGyFAdFvJeL2DXQdgqagbWIuHIpBMmhDJHB" "1yPivSZEJw+Ff24ElpIGx3WZiRSA1NAHGIceXqogpgSRIdV+4BtgDXKhsL10R7j6XSLeCyBS" "0bWtWBURsv0h8ns+BCkAqcOJKCR5InrwkAshBJG5CUHEXHQ1YgtREp/eiI7fOeK9EOL7i3T3" "dyFyWTI2lv9wkQKQeooRySZHRLznRAhBXsR7IUShia+R89NoKIhKUMNpmaIeRMzzm9A7fhNi" "xF+F3H2JiRQA8+iHSKrqGvGeEzEtyKNlwdMqhBhsAcOjCLOFjgjx7EdLwQygd/zI91Yhdlws" "S09PJ6QAmE9vxLQgcn/agRCCfFoKQRBxXsK3iNXrbPnPykN0+CM4tNCLH9HxI8OtfcBKROCV" "7PhtQAqAdXRDCEH/iPcUxPpAHoeeg9CI8Ao2kZkpqk6EKB6BWNyLFEItN7+JljsnjYht1ZXI" "/It2IQXAekoRQjCIlje9EyEEeRxa4ageMU2oQngGDaQfCuLf3hUhhmUc+u9sRnTyZlp6PweB" "zxHJO1mTuJMKpADYhw6IvezBHHr2gQfdM4h23MkBdDGowp7bXQrCne+G6PRdiH44rR9hfyP6" "Hj4IAdiG2Db9hiwO300mUgDsh4JYJxiKqE3vanXNE37kYHy6cy1imlCL2FHQns0aLfMRK/XF" "Ec+lRD/eTUW49T7ESN/axn2ITr+WND+Fx45IAbA3boQIDEXMj1uP/g50QfAQ/7j3BoQY1CLc" "aB9ixNXOt498BNBHWXf44Yp4HfleHi07eyw7tEKbzRGf35pG9CApuYefQqQApA9FwJGIKMMe" "RO9kkYLgQsypD6dCcoj2nf2ooSKERBMTrcNHu+lqEC7+JkSR1lCUNpIkIwUgPXEi5tG9ENOF" "bhh3dCV8zYUuCtrz4R2lLFARnVU7NivyEasTH0B0+G2IrU7p3luAFIDMwIXYOuuNEIVS9KzE" 
"eCitHo5WP4Po5FpHj3xuy81zALFIuRXR6TNxKzPtkAKQuRQhVt07Is4z6BR+XRzrlw4T7TSd" "/YjFu8jnbI9otCVSALIPF0IIihBegrZmEO21GzHaa3N3n8HrJsQcXrrxaYYUAIkki0nGIpBE" "IklTpABIJFmMFACJJIuRAiCRZDFSACSSLEYKgESSxUgBkEiyGCkAEkkWIwVAIslipABIJFmM" "FACJJIuRAiCRZDFSACSSLEYKgESSxUgBkEiyGCkAEkkWIwVAIslipABIJFmMFACJJIuRAiCR" "ZDFSACSSLEYKgESSxUgBkEiyGCkAEkkWIwVAIslipABIJFmMFACJJIuRAiCRZDH/D8PLUIX+" "BVXhAAAAAElFTkSuQmCC") getMainData = Main.GetData getMainImage = Main.GetImage getMainBitmap = Main.GetBitmap getMainIcon = Main.GetIcon pycorrfit-0.8.1/src/models/0000755000175000017500000000000012262516600014343 5ustar toortoorpycorrfit-0.8.1/src/models/MODEL_TIRF_gaussian_3D3D.py0000755000175000017500000002071412262516600021017 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit This file contains a 3D+3D+T TIR FCS model. Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import numpy as np # NumPy import scipy.special as sps def wixi(x): """ Complex Error Function (Faddeeva/Voigt). w(i*x) = exp(x**2) * ( 1-erf(x) ) This function is called by other functions within this module. We are using the scipy.special.wofz module which calculates w(z) = exp(-z**2) * ( 1-erf(-iz) ) z = i*x """ z = x*1j wixi = sps.wofz(z) # We should have a real solution. Make sure nobody complains about # some zero-value imaginary numbers. return np.real_if_close(wixi) # 3D + 3D + T def CF_Gxyz_3D3DT_gauss(parms, tau): u""" Two-component three-dimensional diffusion with a Gaussian lateral detection profile and an exponentially decaying profile in axial direction, including a triplet component. The triplet factor takes into account blinking according to triplet states of excited molecules. Set *T* or *τ_trip* to 0, if no triplet component is wanted. w(i*x) = exp(x²)*erfc(x) taud1 = r₀²/(4*D₁) κ = 1/d_eva x1 = sqrt(D₁*τ)*κ gz1 = κ * [ sqrt(D₁*τ/π) + (1 - 2*D₁*τ*κ)/(2*κ) * w(i*x1) ] g2D1 = 1 / [ 1+τ/taud1 ] particle1 = F₁ * g2D1 * gz1 taud2 = r₀²/(4*D₂) x2 = sqrt(D₂*τ)*κ gz2 = κ * [ sqrt(D₂*τ/π) + (1 - 2*D₂*τ*κ)/(2*κ) * w(i*x2) ] g2D2 = 1 / [ 1+τ/taud2 ] particle2 = α*(1-F₁) * g2D2 * gz2 triplet = 1 + T/(1-T)*exp(-τ/τ_trip) norm = (1-F₁ + α*F₁)² G = 1/n*(particle1 + particle2)*triplet/norm + offset *parms* - a list of parameters. 
Parameters (parms[i]): [0] n Effective number of particles in confocal volume (n = n₁+n₂) [1] D₁ Diffusion coefficient of species 1 [2] D₂ Diffusion coefficient of species 2 [3] F₁ Fraction of molecules of species 1 (n₁ = n*F₁) 0 <= F₁ <= 1 [4] r₀ Lateral extent of the detection volume [5] d_eva Evanescent field depth [6] α Relative molecular brightness of particle 2 compared to particle 1 (α = q₂/q₁) [7] τ_trip Characteristic residence time in triplet state [8] T Fraction of particles in triplet (non-fluorescent) state 0 <= T < 1 [9] offset *tau* - lag time """ n=parms[0] D1=parms[1] D2=parms[2] F=parms[3] r0=parms[4] deva=parms[5] alpha=parms[6] tautrip=parms[7] T=parms[8] off=parms[9] kappa = 1/deva ### 1st species tauD1 = r0**2/(4*D1) # 2D gauss component g2D1 = 1 / ( (1.+tau/tauD1) ) # 1d TIR component # Axial correlation x1 = np.sqrt(D1*tau)*kappa w_ix1 = wixi(x1) # Gz = 1/N1D * gz = kappa / Conc.1D * gz gz1 = kappa * (np.sqrt(D1*tau/np.pi) - (2*D1*tau*kappa**2 - 1)/(2*kappa) * w_ix1) particle1 = F * g2D1 * gz1 ### 2nd species tauD2 = r0**2/(4*D2) # 2D gauss component g2D2 = 1 / ( (1.+tau/tauD2) ) # 1d TIR component # Axial correlation x2 = np.sqrt(D2*tau)*kappa w_ix2 = wixi(x2) # Gz = 1/N1D * gz = kappa / Conc.1D * gz gz2 = kappa * (np.sqrt(D2*tau/np.pi) - (2*D2*tau*kappa**2 - 1)/(2*kappa) * w_ix2) particle2 = alpha**2*(1-F) * g2D2 * gz2 ### triplet triplet = 1 + T/(1-T)*np.exp(-tau/tautrip) ### Norm norm = (F + alpha*(1-F))**2 ### Correlation function G = 1/n*(particle1 + particle2)*triplet/norm return G + off def Checkme(parms): parms[0] = np.abs(parms[0]) parms[1] = D1 = np.abs(parms[1]) parms[2] = D2 = np.abs(parms[2]) F=parms[3] parms[4] = r0 = np.abs(parms[4]) parms[5]=np.abs(parms[5]) parms[6]=np.abs(parms[6]) tautrip=np.abs(parms[7]) T=parms[8] off=parms[9] # REMOVED (issue #2) ## Force triplet component to be smaller than diffusion times #tauD2 = r0**2/(4*D2) #tauD1 = r0**2/(4*D1) #tautrip = min(tautrip,tauD2*0.9, tauD1*0.9) # Triplet fraction is between 0 and one. T may not be one! T = (0.<=T<1.)*T + .99999999999999*(T>=1) # Fraction of molecules may also be one F = (0.<=F<=1.)*F + 1.*(F>1) parms[3] = F parms[7] = tautrip parms[8] = T return parms def MoreInfo(parms, countrate): u"""Supplementary parameters: Effective number of particle species 1: [10] n₁ = n*F₁ Effective number of particle species 2: [11] n₂ = n*(1-F₁) Value of the correlation function at lag time zero: [12] G(0) Effective measurement volume: [13] V_eff [al] = π * r₀² * d_eva Concentration of particle species 1: [14] C₁ [nM] = n₁/V_eff Concentration of particle species 2: [15] C₂ [nM] = n₂/V_eff """ # We can only give you the effective particle number n=parms[0] D1=parms[1] D2=parms[2] F=parms[3] r0=parms[4] deva=parms[5] alpha=parms[6] Info=list() # The enumeration of these parameters is very important for # plotting the normalized curve. Countrate must come out last! 
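# Unit bookkeeping for the supplementary values below (a sketch based on
# the dimensionless conventions used throughout PyCorrFit: lengths in
# units of 100 nm, inverse volumes in 1000/µm³):
# - r₀ and d_eva are stored in units of 100 nm, so Veff = π·r₀²·d_eva
#   comes out in (100 nm)³ = 10⁻³ µm³ = 1 al and can be reported as
#   "V_eff [al]" without further conversion.
# - n/Veff is then a concentration in units of 1000/µm³ = 10¹⁸/l;
#   dividing by Avogadro's constant gives 10¹⁸/(6.0221415×10²³) mol ≈
#   1.66 µmol, i.e. multiplying by 10000/6.0221415 ≈ 1660.5 yields the
#   "[nM]" entries appended below.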
Info.append([u"n\u2081", n*F]) Info.append([u"n\u2082", n*(1.-F)]) # Detection area: Veff = np.pi * r0**2 * deva C1 = n*F / Veff C2 = n*(1-F) / Veff # Correlation function at tau = 0 G_0 = CF_Gxyz_3D3DT_gauss(parms, 0) Info.append(["G(0)", G_0]) Info.append(["V_eff [al]", Veff]) Info.append(["C"+u"\u2081"+" [nM]", C1 * 10000/6.0221415]) Info.append(["C"+u"\u2082"+" [nM]", C2 * 10000/6.0221415]) if countrate is not None: # CPP cpp = countrate/n Info.append(["cpp [kHz]", cpp]) return Info # 3D + 3D + T model gauss m_gauss_3d_3d_t = [6034, "T+3D+3D", "Combined 3D diffusion + triplet w/ TIR", CF_Gxyz_3D3DT_gauss] labels = ["n", "D"+u"\u2081"+" [10 µm²/s]", "D"+u"\u2082"+" [10 µm²/s]", "F"+u"\u2081", "r₀ [100 nm]", "d_eva [100 nm]", u"\u03b1"+" (q"+u"\u2082"+"/q"+u"\u2081"+")", u"τ_trip [ms]", "T", "offset" ] values = [ 25, # n 25., # D1 0.9, # D2 0.45, # F1 9.44, # r0 1.0, # deva 1.0, # alpha 0.001, # tautrip 0.01, # T 0.0 # offset ] # Human readable stuff labelshr = ["n", "D"+u"\u2081"+u" [µm²/s]", "D"+u"\u2082"+u" [µm²/s]", "F"+u"\u2081", "r₀ [nm]", "d_eva [nm]", u"\u03b1"+" (q"+u"\u2082"+"/q"+u"\u2081"+")", u"τ_trip [µs]", "T", "offset" ] valueshr = [ 1., # n 10., # D1 10., # D2 1., # F1 100., # r0 100., # deva 1., # alpha 1000., # tautrip 1., # T 1. # offset ] valuestofit = [True, True, True, True, False, False, False, False, False, False] parms = [labels, values, valuestofit, labelshr, valueshr] model1 = dict() model1["Parameters"] = parms model1["Definitions"] = m_gauss_3d_3d_t model1["Verification"] = Checkme model1["Supplements"] = MoreInfo Modelarray = [model1] pycorrfit-0.8.1/src/models/MODEL_TIRF_2D2D.py0000755000175000017500000001234612262516600017125 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit This file contains 2D+2D TIR-FCS models. Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import numpy as np # NumPy import scipy.special as sps def wixi(x): """ Complex Error Function (Faddeeva/Voigt). w(i*x) = exp(x**2) * ( 1-erf(x) ) This function is called by other functions within this module. We are using the scipy.special.wofz module which calculates w(z) = exp(-z**2) * ( 1-erf(-iz) ) z = i*x """ z = x*1j wixi = sps.wofz(z) # We should have a real solution. Make sure nobody complains about # some zero-value imaginary numbers. return np.real_if_close(wixi) # 2D + 2D no binding TIRF def CF_Gxy_TIR_square_2d2d(parms, tau, wixi=wixi): u""" Two-component two-dimensional diffusion with a square-shaped lateral detection area taking into account the size of the point spread function. *parms* - a list of parameters. 
Parameters (parms[i]): [0] D_2D1 Diffusion coefficient of species 1 [1] D_2D2 Diffusion coefficient of species 2 [2] σ Lateral size of the point spread function σ = σ₀ * λ / NA [3] a Side size of the square-shaped detection area [4] d_eva Evanescent penetration depth [5] C_2D1 Two-dimensional concentration of species 1 [6] C_2D2 Two-dimensional concentration of species 2 [7] α Relative molecular brightness of particle 2 compared to particle 1 (α = q₂/q₁) *tau* - lag time """ D_2D1 = parms[0] D_2D2 = parms[1] sigma = parms[2] a = parms[3] kappa = 1/parms[4] Conc_2D1 = parms[5] Conc_2D2 = parms[6] alpha = parms[7] ## First the 2D-diffusion of species 1 var1 = sigma**2+D_2D1*tau AA1 = 2*np.sqrt(var1)/(a**2*np.sqrt(np.pi)) BB1 = np.exp(-a**2/(4*(var1))) - 1 CC1 = sps.erf(a/(2*np.sqrt(var1)))/a # gx = AA*BB+CC # gxy = gx**2 # g2D = Conc_2D * gxy g2D1 = Conc_2D1 * (AA1*BB1+CC1)**2 ## Second the 2D-diffusion of species 2 var2 = sigma**2+D_2D2*tau AA2 = 2*np.sqrt(var2)/(a**2*np.sqrt(np.pi)) BB2 = np.exp(-a**2/(4*(var2))) - 1 CC2 = sps.erf(a/(2*np.sqrt(var2)))/a # gx = AA*BB+CC # gxy = gx**2 # g2D = Conc_2D * gxy g2D2 = alpha**2 * Conc_2D2 * (AA2*BB2+CC2)**2 ## Finally the Prefactor F = Conc_2D1 + alpha * Conc_2D2 G = (g2D1 + g2D2) / F**2 return G # 2D-2D Model TIR m_tir_2d_2d_mix_6022 = [6022, u"2D+2D","Separate 2D diffusion, TIR", CF_Gxy_TIR_square_2d2d] labels_6022 = [ "D"+u"\u2081"+u" [10 µm²/s]", "D"+u"\u2082"+u" [10 µm²/s]", u"σ [100 nm]", "a [100 nm]", "d_eva [100 nm]", "C"+u"\u2081"+u" [100 /µm²]", "C"+u"\u2082"+u" [100 /µm²]", u"\u03b1"+" (q"+u"\u2082"+"/q"+u"\u2081"+")" ] values_6022 = [ 0.90, # D_2D₁ [10 µm²/s] 0.01, # D_2D₂ [10 µm²/s] 2.3, # σ [100 nm] 7.50, # a [100 nm] 1.0, # d_eva [100 nm] 0.01, # conc.2D₁ [100 /µm²] 0.03, # conc.2D₂ [100 /µm²] 1 # alpha ] # For user comfort we add values that are human readable. # Theese will be used for output that only humans can read. labels_human_readable_6022 = [ "D"+u"\u2081"+u" [µm²/s]", "D"+u"\u2082"+u" [µm²/s]", u"σ [nm]", "a [nm]", "d_eva [nm]", "C"+u"\u2081"+u" [1/µm²]", "C"+u"\u2082"+u" [1/µm²]", u"\u03b1"+" (q"+u"\u2082"+"/q"+u"\u2081"+")" ] values_factor_human_readable_6022 = [ 10, # D_2D₁ [10 µm²/s], 10, # D_2D₂ [10 µm²/s] 100, # σ [100 nm] 100, # a [100 nm] 100, # d_eva [100 nm] 100, # conc.2D₁ [100 /µm²] 100, # conc.2D₂ [100 /µm²] 1 ] valuestofit_6022 = [False, True, False, False, False, False, True, False] parms_6022 = [labels_6022, values_6022, valuestofit_6022, labels_human_readable_6022, values_factor_human_readable_6022] model1 = dict() model1["Parameters"] = parms_6022 model1["Definitions"] = m_tir_2d_2d_mix_6022 model1["Verification"] = lambda parms: np.abs(parms) Modelarray = [model1] pycorrfit-0.8.1/src/models/MODEL_TIRF_gaussian_3D2D.py0000755000175000017500000002001212262516600021005 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit This file contains a 3D+2D+T TIR FCS model. Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . 
""" import numpy as np # NumPy import scipy.special as sps def wixi(x): """ Complex Error Function (Faddeeva/Voigt). w(i*x) = exp(x**2) * ( 1-erf(x) ) This function is called by other functions within this module. We are using the scipy.special.wofz module which calculates w(z) = exp(-z**2) * ( 1-erf(-iz) ) z = i*x """ z = x*1j wixi = sps.wofz(z) # We should have a real solution. Make sure nobody complains about # some zero-value imaginary numbers. return np.real_if_close(wixi) # 3D + 2D + T def CF_Gxyz_3d2dT_gauss(parms, tau): """ Two-component, two- and three-dimensional diffusion with a Gaussian lateral detection profile and an exponentially decaying profile in axial direction, including a triplet component. The triplet factor takes into account blinking according to triplet states of excited molecules. Set *T* or *τ_trip* to 0, if no triplet component is wanted. kappa = 1/d_eva x = sqrt(D_3D*τ)*kappa w(i*x) = exp(x²)*erfc(x) gz = kappa * [ sqrt(D_3D*τ/pi) + (1 - 2*D_3D*τ*kappa²)/(2*kappa) * w(i*x) ] g2D3D = 1 / [ 1+4*D_3D*τ/r₀² ] particle3D = α*F * g2D3D * gz particle2D = (1-F)/ (1+4*D_2D*τ/r₀²) triplet = 1 + T/(1-T)*exp(-τ/τ_trip) norm = (1-F + α*F)² G = 1/n*(particle2D + particle3D)*triplet/norm + offset *parms* - a list of parameters. Parameters (parms[i]): [0] n Effective number of particles in confocal volume (n = n2D+n3D) [1] D_2D Diffusion coefficient of surface bound particles [2] D_3D Diffusion coefficient of freely diffusing particles [3] F Fraction of molecules of the freely diffusing species (n3D = n*F), 0 <= F <= 1 [4] r₀ Lateral extent of the detection volume [5] d_eva Evanescent field depth [6] α Relative molecular brightness of freely diffusing compared to surface bound particles (α = q3D/q2D) [7] τ_trip Characteristic residence time in triplet state [8] T Fraction of particles in triplet (non-fluorescent) state 0 <= T < 1 [9] offset *tau* - lag time """ n=parms[0] D2D=parms[1] D3D=parms[2] F=parms[3] r0=parms[4] deva=parms[5] alpha=parms[6] tautrip=parms[7] T=parms[8] off=parms[9] ### 2D species taud2D = r0**2/(4*D2D) particle2D = (1-F)/ (1+tau/taud2D) ### 3D species taud3D = r0**2/(4*D3D) # 2D gauss component g2D3D = 1 / ( (1.+tau/taud3D) ) # 1d TIR component # Axial correlation kappa = 1/deva x = np.sqrt(D3D*tau)*kappa w_ix = wixi(x) # Gz = 1/N1D * gz = kappa / Conc.1D * gz gz = kappa * (np.sqrt(D3D*tau/np.pi) - (2*D3D*tau*kappa**2 - 1)/(2*kappa) * w_ix) particle3D = alpha**2*F * g2D3D * gz ### triplet triplet = 1 + T/(1-T)*np.exp(-tau/tautrip) ### Norm norm = (1-F + alpha*F)**2 ### Correlation function G = 1/n*(particle2D + particle3D)*triplet/norm return G + off def Checkme(parms): parms[0] = np.abs(parms[0]) parms[1] = D2D = np.abs(parms[1]) parms[2] = D3D = np.abs(parms[2]) F=parms[3] parms[4] = r0 = np.abs(parms[4]) parms[5]=np.abs(parms[5]) parms[6]=np.abs(parms[6]) tautrip=np.abs(parms[7]) T=parms[8] off=parms[9] taud2D = r0**2/(4*D2D) taud3D = r0**2/(4*D3D) # We are not doing this anymore (Issue #2). ## Force triplet component to be smaller than diffusion times ## tautrip = min(tautrip,taud2D*0.9, taud3D*0.9) # Triplet fraction is between 0 and one. T may not be one! 
T = (0.<=T<1.)*T + .99999999999999*(T>=1) # Fraction of molecules may also be one F = (0.<=F<=1.)*F + 1.*(F>1) parms[3] = F parms[7] = tautrip parms[8] = T return parms def MoreInfo(parms, countrate): u"""Supplementary parameters: Effective number of freely diffusing particles in 3D: [10] n3D = n*F Effective number particles diffusing on 2D surface: [11] n2D = n*(1-F) Value of the correlation function at lag time zero: [12] G(0) Effective measurement volume: [13] V_eff [al] = π * r₀² * d_eva Concentration of the 2D species: [14] C_2D [1/µm²] = n2D / ( π * r₀² ) Concentration of the 3D species: [15] C_3D [nM] = n3D/V_eff """ # We can only give you the effective particle number n=parms[0] D2D=parms[1] D3D=parms[2] F=parms[3] r0=parms[4] deva=parms[5] alpha=parms[6] Info=list() # The enumeration of these parameters is very important for # plotting the normalized curve. Countrate must come out last! Info.append([u"n3D", n*F]) Info.append([u"n2D", n*(1.-F)]) # Detection area: Veff = np.pi * r0**2 * deva C3D = n*F / Veff C2D = n*(1-F) / ( np.pi*r0**2 ) # Correlation function at tau = 0 G_0 = CF_Gxyz_3d2dT_gauss(parms, 0) Info.append(["G(0)", G_0]) Info.append(["V_eff [al]", Veff]) Info.append(["C_2D [1/µm²]", C2D * 100]) Info.append(["C_3D [nM]", C3D * 10000/6.0221415]) if countrate is not None: # CPP cpp = countrate/n Info.append(["cpp [kHz]", cpp]) return Info # 3D + 3D + T model gauss m_gauss_3d_2d_t = [6033, "T+3D+2D", "Separate 3D and 2D diffusion + triplet w/ TIR", CF_Gxyz_3d2dT_gauss] labels = ["n", u"D_2D [10 µm²/s]", u"D_3D [10 µm²/s]", "F_3D", u"r₀ [100 nm]", "d_eva [100 nm]", u"\u03b1"+" (q_3D/q_2D)", u"τ_trip [ms]", "T", "offset" ] values = [ 25, # n 0.51, # D2D 25.1, # D3D 0.45, # F3D 9.44, # r0 1.0, # deva 1.0, # alpha 0.001, # tautrip 0.01, # T 0.0 # offset ] # Human readable stuff labelshr = ["n", u"D_2D [µm²/s]", u"D_3D [µm²/s]", "F_3D", u"r₀ [nm]", "d_eva [nm]", u"\u03b1"+" (q_3D/q_2D)", u"τ_trip [µs]", "T", "offset" ] valueshr = [ 1., # n 10., # D2D 10., # D3D 1., # F3D 100., # r0 100., # deva 1., # alpha 1000., # tautrip 1., # T 1. # offset ] valuestofit = [True, True, True, True, False, False, False, False, False, False] parms = [labels, values, valuestofit, labelshr, valueshr] model1 = dict() model1["Parameters"] = parms model1["Definitions"] = m_gauss_3d_2d_t model1["Verification"] = Checkme model1["Supplements"] = MoreInfo Modelarray = [model1] pycorrfit-0.8.1/src/models/MODEL_classic_gaussian_3D2D.py0000755000175000017500000001375612262516600021703 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit This file contains a 3D+2D+T confocal FCS model. Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import numpy as np # NumPy # 3D + 2D + T def CF_Gxyz_3d2dT_gauss(parms, tau): u""" Two-component, two- and three-dimensional diffusion with a Gaussian laser profile, including a triplet component. The triplet factor takes into account blinking according to triplet states of excited molecules. 
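Transitions to the dark triplet state show up as an additional fast decay at short lag times that multiplies the diffusive part of the correlation function.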
Set *T* or *τ_trip* to 0, if no triplet component is wanted. particle2D = (1-F)/ (1+τ/τ_2D) particle3D = α*F/( (1+τ/τ_3D) * sqrt(1+τ/(τ_3D*SP²))) triplet = 1 + T/(1-T)*exp(-τ/τ_trip) norm = (1-F + α*F)² G = 1/n*(particle1 + particle2)*triplet/norm + offset *parms* - a list of parameters. Parameters (parms[i]): [0] n Effective number of particles in confocal volume (n = n2D+n3D) [1] τ_2D Diffusion time of surface bound particls [2] τ_3D Diffusion time of freely diffusing particles [3] F Fraction of molecules of the freely diffusing species (n3D = n*F), 0 <= F <= 1 [4] SP SP=z₀/r₀ Structural parameter, describes elongation of the confocal volume [5] α Relative molecular brightness of particle 3D compared to particle 2D (α = q3D/q2D) [6] τ_trip Characteristic residence time in triplet state [7] T Fraction of particles in triplet (non-fluorescent) state 0 <= T < 1 [8] offset *tau* - lag time """ n=parms[0] taud2D=parms[1] taud3D=parms[2] F=parms[3] SP=parms[4] alpha=parms[5] tautrip=parms[6] T=parms[7] off=parms[8] particle2D = (1-F)/ (1+tau/taud2D) particle3D = alpha**2*F/( (1+tau/taud3D) * np.sqrt(1+tau/(taud3D*SP**2))) triplet = 1 + T/(1-T)*np.exp(-tau/tautrip) norm = (1-F + alpha*F)**2 G = 1/n*(particle2D + particle3D)*triplet/norm return G + off def Checkme(parms): parms[0] = np.abs(parms[0]) parms[1] = taud2D = np.abs(parms[1]) parms[2] = taud3D = np.abs(parms[2]) F=parms[3] parms[4]=np.abs(parms[4]) parms[5]=np.abs(parms[5]) tautrip=np.abs(parms[6]) T=parms[7] off=parms[8] # Triplet fraction is between 0 and one. T may not be one! T = (0.<=T<1.)*T + .99999999999999*(T>=1) # Fraction of molecules may also be one F = (0.<=F<=1.)*F + 1.*(F>1) parms[3] = F parms[6] = tautrip parms[7] = T return parms def MoreInfo(parms, countrate): u"""Supplementary parameters: Effective number of freely diffusing particles in 3D solution: [9] n3D = n*F Effective number particles diffusing on 2D surface: [10] n2D = n*(1-F) """ # We can only give you the effective particle number n = parms[0] F3d = parms[3] Info = list() # The enumeration of these parameters is very important for # plotting the normalized curve. Countrate must come out last! Info.append([u"n3D", n*F3d]) Info.append([u"n2D", n*(1.-F3d)]) if countrate is not None: # CPP cpp = countrate/n Info.append(["cpp [kHz]", cpp]) return Info # 3D + 3D + T model gauss m_gauss_3d_2d_t = [6032, "T+3D+2D", "Separate 3D and 2D diffusion + triplet, Gauß", CF_Gxyz_3d2dT_gauss] labels = ["n", "τ_2D [ms]", "τ_3D [ms]", "F_3D", "SP", u"\u03b1"+" (q_3D/q_2D)", "τ_trip [ms]", "T", "offset" ] values = [ 25, # n 100, # taud2D 0.1, # taud3D 0.45, # F3D 7, # SP 1.0, # alpha 0.001, # tautrip 0.01, # T 0.0 # offset ] # For user comfort we add values that are human readable. # Theese will be used for output that only humans can read. labels_human_readable = [ "n", "τ_2D [ms]", "τ_3D [ms]", "F_3D", "SP", u"\u03b1"+" (q_3D/q_2D)", "τ_trip [µs]", "T", "offset" ] values_factor_human_readable = [ 1., # "n", 1., # "τ_2D [ms]", 1., # "τ_3D [ms]", 1., # "F_3D", 1., # "SP", 1., # u"\u03b1"+" (q_3D/q_2D)", 1000., # "τ_trip [µs]", 1., # "T", 1. 
# "offset" ] valuestofit = [True, True, True, True, False, False, False, False, False] parms = [labels, values, valuestofit, labels_human_readable, values_factor_human_readable] model1 = dict() model1["Parameters"] = parms model1["Definitions"] = m_gauss_3d_2d_t model1["Verification"] = Checkme model1["Supplements"] = MoreInfo Modelarray = [model1] pycorrfit-0.8.1/src/models/MODEL_TIRF_3D2D.py0000755000175000017500000001263512262516600017127 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit This file contains 3D+2D TIR-FCS models. Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import numpy as np # NumPy import scipy.special as sps def wixi(x): """ Complex Error Function (Faddeeva/Voigt). w(i*x) = exp(x**2) * ( 1-erf(x) ) This function is called by other functions within this module. We are using the scipy.special.wofz module which calculates w(z) = exp(-z**2) * ( 1-erf(-iz) ) z = i*x """ z = x*1j wixi = sps.wofz(z) # We should have a real solution. Make sure nobody complains about # some zero-value imaginary numbers. return np.real_if_close(wixi) # 3D + 2D no binding TIRF def CF_Gxyz_TIR_square_3d2d(parms, tau, wixi=wixi): u""" Two-component two- and three-dimensional diffusion with a square-shaped lateral detection area taking into account the size of the point spread function; and an exponential decaying profile in axial direction. *parms* - a list of parameters. 
Parameters (parms[i]): [0] D_3D Diffusion coefficient of freely diffusing species [1] D_2D Diffusion coefficient of surface bound species [2] σ Lateral size of the point spread function σ = σ₀ * λ / NA [3] a Side size of the square-shaped detection area [4] d_eva Evanescent penetration depth [5] C_3D Concentration of freely diffusing species [6] C_2D Concentration of surface bound species [7] α Relative molecular brightness of 3D particle compared to 2D particle (α = q3D/q2D) *tau* - lag time """ D_3D = parms[0] D_2D = parms[1] sigma = parms[2] a = parms[3] kappa = 1/parms[4] Conc_3D = parms[5] Conc_2D = parms[6] alpha = parms[7] ## First the 2D-diffusion at z=0 var1 = sigma**2+D_2D*tau AA = 2*np.sqrt(var1)/(a**2*np.sqrt(np.pi)) BB = np.exp(-a**2/(4*(var1))) - 1 CC = sps.erf(a/(2*np.sqrt(var1)))/a # gx = AA*BB+CC # gxy = gx**2 # g2D = Conc_2D * gxy g2D = Conc_2D * (AA*BB+CC)**2 ## Second the 3D diffusion for z>0 # Axial correlation x = np.sqrt(D_3D*tau)*kappa w_ix = wixi(x) gz = np.sqrt(D_3D*tau/np.pi) - (2*D_3D*tau*kappa**2 - 1)/(2*kappa) * w_ix # Lateral correlation gx1 = 2/(a**2*np.sqrt(np.pi)) * np.sqrt(sigma**2+D_3D*tau) * \ ( np.exp(-a**2/(4*(sigma**2+D_3D*tau))) -1 ) gx2 = 1/a * sps.erf( a / (2*np.sqrt(sigma**2 + D_3D*tau))) gx = gx1 + gx2 gxy = gx**2 # Non normalized correlation function g3D = alpha**2 * Conc_3D * gxy * gz ## Finally the Prefactor F = alpha * Conc_3D / kappa + Conc_2D G = (g3D + g2D) / F**2 return G # 3D-2D Model TIR m_tir_3d_2d_mix_6020 = [6020, u"3D+2D", "Separate 3D and 2D diffusion, 3D TIR", CF_Gxyz_TIR_square_3d2d] labels_6020 = [u"D_3D [10 µm²/s]", u"D_2D [10 µm²/s]", u"σ [100 nm]", "a [100 nm]", "d_eva [100 nm]", u"C_3D [1000 /µm³]", u"C_2D [100 /µm²]", u"\u03b1"+" (q3D/q2D)" ] values_6020 = [ 50.0, # D_3D [10 µm²/s] 0.81, # D_2D [10 µm²/s] 2.3, # σ [100 nm] 7.50, # a [100 nm] 1.0, # d_eva [100 nm] 0.01, # conc.3D [1000 /µm³] 0.03, # conc.2D [100 /µm²] 1 # alpha ] # For user comfort we add values that are human readable. # Theese will be used for output that only humans can read. labels_human_readable_6020 = ["D_3D [µm²/s]", u"D_2D [µm²/s]", u"σ [nm]", "a [nm]", "d_eva [nm]", u"C_3D [1/µm³]", u"C_2D [1/µm²]", u"\u03b1"+" (q3D/q2D)" ] values_factor_human_readable_6020 = [ 10, # D_3D [µm²/s] 10, # D_2D [10 µm²/s] 100, # σ [100 nm] 100, # a [100 nm] 100, # d_eva [100 nm] 1000, # conc.3D [1000 /µm³] 100, # conc.2D [100 /µm²] 1 # alpha ] valuestofit_6020 = [False, True, False, False, False, False, True, False] parms_6020 = [labels_6020, values_6020, valuestofit_6020, labels_human_readable_6020, values_factor_human_readable_6020] model1 = dict() model1["Parameters"] = parms_6020 model1["Definitions"] = m_tir_3d_2d_mix_6020 model1["Verification"] = lambda parms: np.abs(parms) Modelarray = [model1] pycorrfit-0.8.1/src/models/MODEL_TIRF_3D2Dkin_Ries.py0000755000175000017500000003733712262516600020621 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit This file contains a TIR-FCS kineteics model function according to: "Total Internal Reflection Fluorescence Correlation Spectroscopy: Effects of Lateral Diffusion and Surface-Generated Fluorescence" Jonas Ries, Eugene P. Petrov, and Petra Schwille Biophysical Journal, Volume 95, July 2008, 390–399 Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. 
This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import numpy as np # NumPy import scipy.special as sps import numpy.lib.scimath as nps def wixi(x): """ Complex Error Function (Faddeeva/Voigt). w(i*x) = exp(x**2) * ( 1-erf(x) ) This function is called by other functions within this module. We are using the scipy.special.wofz module which calculates w(z) = exp(-z**2) * ( 1-erf(-iz) ) z = i*x """ z = x*1j wixi = sps.wofz(z) # We should have a real solution. Make sure nobody complains about # some zero-value imaginary numbers. return np.real_if_close(wixi) # Lateral correlation function def CF_gxy_square(parms, tau): """ 2D free diffusion measured with a square pinhole. For the square pinhole, the correlation function can readily be calculated for a TIR-FCS setup. This function is called by other functions within this module. Attention: This is NOT g2D (or gCC), the non normalized correlation function. g2D = gxy * eta**2 * Conc, where eta is the molecular brightness, Conc the concentration and gxy is this function. *parms* - a list of parameters. Parameters (parms[i]): [0] D Diffusion coefficient [1] sigma lateral size of the point spread function sigma = simga_0 * lambda / NA [2] a side size of the square pinhole *tau* - lag time Returns: Nonnormalized Lateral correlation function w/square pinhole. """ D = parms[0] sigma = parms[1] a = parms[2] var1 = sigma**2+D*tau AA = 2*np.sqrt(var1)/(a**2*np.sqrt(np.pi)) BB = np.exp(-a**2/(4*(var1))) - 1 CC = sps.erf(a/(2*np.sqrt(var1)))/a # gx = AA*BB+CC # gxy = gx**2 return (AA*BB+CC)**2 def CF_gz_CC(parms, tau, wixi=wixi): """ Axial (1D) diffusion in a TIR-FCS setup. From Two species (bound/unbound) this is the bound part. This function is called by other functions within this module. *parms* - a list of parameters. Parameters (parms[i]): [0] D_3D 3D Diffusion coefficient (species A) [1] D_2D 2D Diffusion coefficient of bound species C [2] sigma lateral size of the point spread function sigma = simga_0 * lambda / NA [3] a side size of the square pinhole [4] d_eva evanescent decay length (decay to 1/e) [5] Conc_3D 3-dimensional concentration of species A [6] Conc_2D 2-dimensional concentration of species C [7] eta_3D molecular brightness of species A [8] eta_2D molecular brightness of species C [9] k_a Surface association rate constant [10] k_d Surface dissociation rate constant *tau* - lag time """ D = parms[0] # D_2D = parms[1] sigma = parms[2] # a = parms[3] d_eva = parms[4] Conc_3D = parms[5] # ligand concentration in solution Conc_2D = parms[6] eta_3D = parms[7] eta_2D = parms[8] k_a = parms[9] k_d = parms[10] # Define some other constants: K = k_a/k_d # equilibrium constant Beta = 1/(1 + K*Conc_3D) # This is wrong in the Ries paper Re = D / d_eva**2 Rt = D * (Conc_3D / (Beta * Conc_2D))**2 Rr = k_a * Conc_3D + k_d # Define even more constants: sqrtR1 = -Rr/(2*nps.sqrt(Rt)) + nps.sqrt( Rr**2/(4*Rt) - Rr ) sqrtR2 = -Rr/(2*nps.sqrt(Rt)) - nps.sqrt( Rr**2/(4*Rt) - Rr ) R1 = sqrtR1 **2 R2 = sqrtR2 **2 # Calculate return function A1 = eta_2D * Conc_2D / (eta_3D * Conc_3D) * Beta A2 = sqrtR1 * wixi(-nps.sqrt(tau*R2)) - sqrtR2 * wixi(-nps.sqrt(tau*R1)) A3 = sqrtR1 - sqrtR2 Sol = A1 * A2 / A3 # There are some below numerical errors-imaginary numbers. 
# We do not want them. return np.real_if_close(Sol) def CF_gz_AC(parms, tau, wixi=wixi): """ Axial (1D) diffusion in a TIR-FCS setup. From Two species (bound/unbound) this is the cross correlation part. This function is called by other functions within this module. *parms* - a list of parameters. Parameters (parms[i]): [0] D_3D 3D Diffusion coefficient (species A) [1] D_2D 2D Diffusion coefficient of bound species C [2] sigma lateral size of the point spread function sigma = simga_0 * lambda / NA [3] a side size of the square pinhole [4] d_eva evanescent decay length (decay to 1/e) [5] Conc_3D 3-dimensional concentration of species A [6] Conc_2D 2-dimensional concentration of species C [7] eta_3D molecular brightness of species A [8] eta_2D molecular brightness of species C [9] k_a Surface association rate constant [10] k_d Surface dissociation rate constant *tau* - lag time """ D = parms[0] # D_2D = parms[1] sigma = parms[2] # a = parms[3] d_eva = parms[4] Conc_3D = parms[5] # ligand concentration in solution Conc_2D = parms[6] eta_3D = parms[7] eta_2D = parms[8] k_a = parms[9] k_d = parms[10] # Define some other constants: K = k_a/k_d # equilibrium constant Beta = 1/(1 + K*Conc_3D) Re = D / d_eva**2 Rt = D * (Conc_3D / (Beta * Conc_2D))**2 Rr = k_a * Conc_3D + k_d # Define even more constants: sqrtR1 = -Rr/(2*nps.sqrt(Rt)) + nps.sqrt( Rr**2/(4*Rt) - Rr ) sqrtR2 = -Rr/(2*nps.sqrt(Rt)) - nps.sqrt( Rr**2/(4*Rt) - Rr ) R1 = sqrtR1 **2 R2 = sqrtR2 **2 # And even more more: sqrtR3 = sqrtR1 + nps.sqrt(Re) sqrtR4 = sqrtR2 + nps.sqrt(Re) R3 = sqrtR3 **2 R4 = sqrtR4 **2 # Calculate return function A1 = eta_2D * Conc_2D * k_d / (eta_3D * Conc_3D) A2 = sqrtR4*wixi(-nps.sqrt(tau*R1)) - sqrtR3*wixi(-nps.sqrt(tau*R2)) A3 = ( sqrtR1 - sqrtR2 ) * wixi(nps.sqrt(tau*Re)) A4 = (sqrtR1 - sqrtR2) * sqrtR3*sqrtR4 Solution = A1 * ( A2 + A3 ) / A4 # There are some below numerical errors-imaginary numbers. # We do not want them. return np.real_if_close(Solution) def CF_gz_AA(parms, tau, wixi=wixi): """ Axial (1D) diffusion in a TIR-FCS setup. From Two species (bound/unbound) this is the unbound part. This function is called by other functions within this module. *parms* - a list of parameters. 
Parameters (parms[i]): [0] D_3D 3D Diffusion coefficient (species A) [1] D_2D 2D Diffusion coefficient of bound species C [2] sigma lateral size of the point spread function sigma = simga_0 * lambda / NA [3] a side size of the square pinhole [4] d_eva evanescent decay length (decay to 1/e) [5] Conc_3D 3-dimensional concentration of species A [6] Conc_2D 2-dimensional concentration of species C [7] eta_3D molecular brightness of species A [8] eta_2D molecular brightness of species C [9] k_a Surface association rate constant [10] k_d Surface dissociation rate constant *tau* - lag time """ D = parms[0] # D_2D = parms[1] sigma = parms[2] # a = parms[3] d_eva = parms[4] Conc_3D = parms[5] # ligand concentration in solution Conc_2D = parms[6] eta_3D = parms[7] eta_2D = parms[8] k_a = parms[9] k_d = parms[10] # Define some other constants: d = d_eva K = k_a/k_d # equilibrium constant Beta = 1/(1 + K*Conc_3D) Re = D / d_eva**2 Rt = D * (Conc_3D / (Beta * Conc_2D))**2 Rr = k_a * Conc_3D + k_d # Define even more constants: sqrtR1 = -Rr/(2*nps.sqrt(Rt)) + nps.sqrt( Rr**2/(4*Rt) - Rr ) sqrtR2 = -Rr/(2*nps.sqrt(Rt)) - nps.sqrt( Rr**2/(4*Rt) - Rr ) R1 = sqrtR1 **2 R2 = sqrtR2 **2 # And even more more: sqrtR3 = sqrtR1 + nps.sqrt(Re) sqrtR4 = sqrtR2 + nps.sqrt(Re) R3 = sqrtR3 **2 R4 = sqrtR4 **2 # Calculate return function Sum1 = d * nps.sqrt( Re*tau/np.pi ) Sum2 = -d/2*(2*tau*Re -1) * wixi(np.sqrt(tau*Re)) Sum3Mult1 = - eta_2D * Conc_2D * k_d / ( eta_3D * Conc_3D * (sqrtR1 - sqrtR2) ) S3M2S1M1 = sqrtR1/R3 S3M2S1M2S1 = wixi(-nps.sqrt(tau*R1)) + -2*nps.sqrt(tau*R3/np.pi) S3M2S1M2S2 = ( 2*tau*sqrtR1*nps.sqrt(Re) + 2*tau*Re -1 ) * \ wixi(nps.sqrt(tau*Re)) S3M2S2M1 = -sqrtR2/R4 S3M2S2M2S1 = wixi(-nps.sqrt(tau*R2)) + -2*nps.sqrt(tau*R4/np.pi) S3M2S2M2S2 = ( 2*tau*sqrtR2*nps.sqrt(Re) + 2*tau*Re -1 ) * \ wixi(nps.sqrt(tau*Re)) Sum3 = Sum3Mult1 * ( S3M2S1M1 * (S3M2S1M2S1 + S3M2S1M2S2) + S3M2S2M1 * (S3M2S2M2S1 + S3M2S2M2S2) ) Sum = Sum1 + Sum2 + Sum3 # There are some below numerical errors-imaginary numbers. # We do not want them. return np.real_if_close(Sum) # 3D-2D binding/unbinding TIRF def CF_Gxyz_TIR_square_ubibi(parms, tau, gAAz=CF_gz_AA, gACz=CF_gz_AC, gCCz=CF_gz_CC, gxy=CF_gxy_square): u""" Two-component two- and three-dimensional diffusion with a square-shaped lateral detection area taking into account the size of the point spread function; and an exponential decaying profile in axial direction. This model covers binding and unbinding kintetics. *parms* - a list of parameters. Parameters (parms[i]): [0] D_3D Diffusion coefficient of freely diffusing species A [1] D_2D Diffusion coefficient of bound species C [2] σ Lateral size of the point spread function σ = σ₀ * λ / NA [3] a Side size of the square-shaped detection area [4] d_eva Evanescent decay constant [5] C_3D Concentration of species A in observation volume [6] C_2D Concentration of species C in observation area [7] η_3D Molecular brightness of species A [8] η_2D Molecular brightness of species C [9] k_a Surface association rate constant [10] k_d Surface dissociation rate constant *tau* - lag time Returns: 3D correlation function for TIR-FCS w/square pinhole and surface binding/unbinding. Model introduced in: Jonas Ries, Eugene P. 
Petrov, and Petra Schwille Total Internal Reflection Fluorescence Correlation Spectroscopy: Effects of Lateral Diffusion and Surface-Generated Fluorescence Biophysical Journal, Vol.95, July 2008, 390–399 """ D_3D = parms[0] D_2D = parms[1] sigma = parms[2] a = parms[3] kappa = 1/parms[4] Conc_3D = parms[5] Conc_2D = parms[6] eta_3D = parms[7] eta_2D = parms[8] k_a = parms[9] k_d = parms[10] ## We now need to copmute a real beast: # Inter species non-normalized correlation functions # gAA = gAAz * gxy(D_3D) # gAC = gACz * np.sqrt ( gxy(D_3D) * gxy(D_2D) ) # gCC = gCCz * gxy(D_2D) # Nonnormalized correlation function # g = eta_3D * Conc_3D * ( gAA + 2*gAC + gCC ) # Expectation value of fluorescence signal # F = eta_3D * Conc_3D / kappa + eta_2D * Conc_2D # Normalized correlation function # G = g / F**2 ## # Inter species non-normalized correlation functions # The gijz functions take the same parameters as this function # The gxy function needs two different sets of parameters, depending # on the diffusion constant used. # [0] D: Diffusion coefficient # [1] sigma: lateral size of the point spread function # [3] a: side size of the square pinhole parms_xy_2D = [D_2D, sigma, a] parms_xy_3D = [D_3D, sigma, a] # Here we go. gAA = gAAz(parms, tau) * gxy(parms_xy_3D, tau) gAC = gACz(parms, tau) * nps.sqrt( gxy(parms_xy_3D, tau) * gxy(parms_xy_2D, tau) ) gCC = gCCz(parms, tau) * gxy(parms_xy_2D, tau) # Nonnormalized correlation function g = eta_3D * Conc_3D * ( gAA + 2*gAC + gCC ) # Expectation value of fluorescence signal F = eta_3D * Conc_3D / kappa + eta_2D * Conc_2D # Normalized correlation function G = g / F**2 # There are some below numerical errors-imaginary numbers. # We do not want them. return G.real #FNEW = eta_3D * Conc_3D / kappa #GNEW = eta_3D * Conc_3D * gCCz(parms, tau) / FNEW**2 #return GNEW.real # 3D-2D binding Model TIR m_tir_3d_2d_ubib6021 = [6021, u"3D+2D+kin", "Surface binding and unbinding, 3D TIR", CF_Gxyz_TIR_square_ubibi] labels_6021 = [u"D_3D [10 µm²/s]", u"D_2D [10 µm²/s]", u"σ [100 nm]", "a [100 nm]", "d_eva [100 nm]", u"C_3D [1000 /µm³]", u"C_2D[100 /µm²]", u"η_3D", u"η_2D", u"k_a [µm³/s]", u"k_d [10³ /s]" ] values_6021 = [ 9.0, # D_3D [10 µm²/s] 0.0, # D_2D [10 µm²/s] 2.3, # σ [100 nm] 7.50, # a [100 nm] 1.0, # d_eva [100 nm] 0.01, # conc.3D [1000 /µm³] 0.03, # conc.2D [100 /µm²] 1, # η_3D 1, # η_2D 0.00001, # k_a [µm³/s] 0.000064 # k_d [10³ /s] ] valuestofit_6021 = [False, True, False, False, False, False, True, False, False, False, False] # For user comfort we add values that are human readable. # Theese will be used for output that only humans can read. 
labels_human_readable_6021 = [ u"D_3D [µm²/s]", u"D_2D [µm²/s]", u"σ [nm]", "a [nm]", "d_eva [nm]", u"C_3D [1/µm³]", u"C_2D [1/µm²]", u"η_3D", u"η_2D", u"k_a [µm³/s]", "k_d [1/s]" ] values_factor_human_readable_6021 = [10, # "D_3D [µm²/s]", 10, # D_2D [10 µm²/s] 100, # σ [100 nm] 100, # a [100 nm] 100, # d_eva [100 nm] 1000, # conc.3D [1000 /µm³] 100, # conc.2D [100 /µm²] 1, # η_3D 1, # η_2D 1, # k_a [µm³/s] 1000 # k_d [10³ /s] ] parms_6021 = [labels_6021, values_6021, valuestofit_6021, labels_human_readable_6021, values_factor_human_readable_6021] model1 = dict() model1["Parameters"] = parms_6021 model1["Definitions"] = m_tir_3d_2d_ubib6021 model1["Verification"] = lambda parms: np.abs(parms) Modelarray = [model1] pycorrfit-0.8.1/src/models/MODEL_classic_gaussian_2D.py0000755000175000017500000002324712262516600021510 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit This file contains some simple 2D models for confocal microscopy Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import numpy as np # NumPy # 2D simple gauss def CF_Gxy_gauss(parms, tau): u""" Two-dimensional diffusion with a Gaussian laser profile. G(τ) = offset + 1/( n * (1+τ/τ_diff) ) Calculation of diffusion coefficient and concentration from the effective radius of the detection profile (r₀ = 2*σ): D = r₀²/(4*τ_diff) Conc = n/(π*r₀²) *parms* - a list of parameters. Parameters (parms[i]): [0] n Effective number of particles in confocal area [1] τ_diff Characteristic residence time in confocal area [3] offset *tau* - lag time """ n = parms[0] taudiff = parms[1] dc = parms[2] BB = 1 / ( (1.+tau/taudiff) ) G = dc + 1/n * BB return G def Check_xy_gauss(parms): parms[0] = np.abs(parms[0]) parms[1] = np.abs(parms[1]) return parms # 2D simple gauss def CF_Gxy_T_gauss(parms, tau): u""" Two-dimensional diffusion with a Gaussian laser profile, including a triplet component. The triplet factor takes into account a blinking term. Set *T* or *τ_trip* to 0, if no triplet component is wanted. triplet = 1 + T/(1-T)*exp(-τ/τ_trip) G(τ) = offset + 1/( n * (1+τ/τ_diff) )*triplet Calculation of diffusion coefficient and concentration from the effective radius of the detection profile (r₀ = 2*σ): D = r₀²/(4*τ_diff) Conc = n/(π*r₀²) *parms* - a list of parameters. Parameters (parms[i]): [0] n Effective number of particles in confocal area [1] τ_diff Characteristic residence time in confocal area [2] τ_trip Characteristic residence time in triplet state [3] T Fraction of particles in triplet (non-fluorescent) state 0 <= T < 1 [4] offset *tau* - lag time """ n = parms[0] taudiff = parms[1] tautrip = parms[2] T = parms[3] dc = parms[4] triplet = 1 + T/(1-T)*np.exp(-tau/tautrip) BB = 1 / ( (1.+tau/taudiff) ) G = dc + 1/n * BB * triplet return G def Check_xy_T_gauss(parms): parms[0] = np.abs(parms[0]) taudiff = parms[1] = np.abs(parms[1]) tautrip = np.abs(parms[2]) T=parms[3] # Triplet fraction is between 0 and one. T may not be one! 
T = (0.<=T<1.)*T + .99999999999999*(T>=1) parms[2] = tautrip parms[3] = T return parms # 2D + 2D + Triplet Gauß # Model 6031 def CF_Gxyz_gauss_2D2DT(parms, tau): u""" Two-component, two-dimensional diffusion with a Gaussian laser profile, including a triplet component. The triplet factor takes into account blinking according to triplet states of excited molecules. Set *T* or *τ_trip* to 0, if no triplet component is wanted. particle1 = F₁/(1+τ/τ₁) particle2 = α*(1-F₁)/(1+τ/τ₂) triplet = 1 + T/(1-T)*exp(-τ/τ_trip) norm = (F₁ + α*(1-F₁))² G = 1/n*(particle1 + particle2)*triplet/norm + offset *parms* - a list of parameters. Parameters (parms[i]): [0] n Effective number of particles in confocal area (n = n₁+n₂) [1] τ₁ Diffusion time of particle species 1 [2] τ₂ Diffusion time of particle species 2 [3] F₁ Fraction of molecules of species 1 (n₁ = n*F₁) 0 <= F₁ <= 1 [4] α Relative molecular brightness of particle 2 compared to particle 1 (α = q₂/q₁) [5] τ_trip Characteristic residence time in triplet state [6] T Fraction of particles in triplet (non-fluorescent) state 0 <= T < 1 [7] offset *tau* - lag time """ n=parms[0] taud1=parms[1] taud2=parms[2] F=parms[3] alpha=parms[4] tautrip=parms[5] T=parms[6] off=parms[7] particle1 = F/( 1+tau/taud1 ) particle2 = alpha**2*(1-F)/( 1+tau/taud2 ) # If the fraction of dark molecules is zero, we put the # whole triplet fraction to death. triplet = 1 + T/(1-T)*np.exp(-tau/tautrip) # For alpha == 1, *norm* becomes one norm = (F + alpha*(1-F))**2 G = 1/n*(particle1 + particle2)*triplet/norm + off return G def Check_6031(parms): parms[0] = np.abs(parms[0]) parms[1] = taud1 = np.abs(parms[1]) parms[2] = taud2 = np.abs(parms[2]) F=parms[3] parms[4] = np.abs(parms[4]) tautrip = np.abs(parms[5]) T=parms[6] off=parms[7] # Triplet fraction is between 0 and one. T may not be one! T = (0.<=T<1.)*T + .99999999999999*(T>=1) # Fraction of molecules may also be one F = (0.<=F<=1.)*F + 1.*(F>1) parms[3] = F parms[5] = tautrip parms[6] = T return parms def MoreInfo_6001(parms, countrate): # We can only give you the effective particle number n = parms[0] Info = list() if countrate is not None: # CPP cpp = countrate/n Info.append(["cpp [kHz]", cpp]) return Info def MoreInfo_6031(parms, countrate): u"""Supplementary parameters: [8] n₁ = n*F₁ Particle number of species 1 [9] n₂ = n*(1-F₁) Particle number of species 2 """ # We can only give you the effective particle number n = parms[0] F1 = parms[3] Info = list() # The enumeration of these parameters is very important for # plotting the normalized curve. Countrate must come out last! Info.append([u"n\u2081", n*F1]) Info.append([u"n\u2082", n*(1.-F1)]) if countrate is not None: # CPP cpp = countrate/n Info.append(["cpp [kHz]", cpp]) return Info # 2D Model Gauss m_twodga6001 = [6001, u"2D", u"2D confocal diffusion", CF_Gxy_gauss] labels_6001 = ["n", u"τ_diff [ms]", "offset"] values_6001 = [4.0, 0.4, 0.0] valuestofit_6001 = [True, True, False] parms_6001 = [labels_6001, values_6001, valuestofit_6001] # 2D Model Gauss with Triplet m_twodga6002 = [6002, u"T+2D", u"2D confocal diffusion with triplet", CF_Gxy_T_gauss] labels_6002 = ["n", u"τ_diff [ms]", u"τ_trip [ms]", u"T", u"offset"] values_6002 = [4.0, 0.4, 0.001, 0.01, 0.0] labels_hr_6002 = ["n", u"τ_diff [ms]", u"τ_trip [µs]", u"T", u"offset"] factors_hr_6002 = [1., 1., 1000., 1., 1.] 
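# Minimal usage sketch (illustrative values taken from values_6001 above;
# the lag-time grid is arbitrary): the model functions in this file are
# plain callables of (parms, tau) and can be evaluated without the GUI:
#
#     import numpy as np
#     tau = np.logspace(-3, 2, 200)             # lag times τ in ms
#     g = CF_Gxy_gauss([4.0, 0.4, 0.0], tau)    # n, τ_diff [ms], offset
#     # Amplitude check: G(τ=0) = offset + 1/n = 0.25 for these values.
#
# The same (parms, tau) calling convention holds for CF_Gxy_T_gauss and
# CF_Gxyz_gauss_2D2DT defined above.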
valuestofit_6002 = [True, True, True, True, False] parms_6002 = [labels_6002, values_6002, valuestofit_6002, labels_hr_6002, factors_hr_6002] # 2D + 2D + T model gauss m_gauss_2d_2d_t_mix_6031 = [6031, u"T+2D+2D", u"Separate 2D diffusion + triplet, Gauß", CF_Gxyz_gauss_2D2DT] labels_6031 = ["n", u"τ"+u"\u2081"+u" [ms]", u"τ"+u"\u2082"+u" [ms]", u"F"+u"\u2081", u"\u03b1"+u" (q"+u"\u2082"+"/q"+u"\u2081"+")", u"τ_trip [ms]", u"T", u"offset" ] values_6031 = [ 25, # n 5, # taud1 1000, # taud2 0.75, # F 1.0, # alpha 0.001, # tautrip 0.01, # T 0.0 # offset ] # For user comfort we add values that are human readable. # Theese will be used for output that only humans can read. labels_human_readable_6031 = [ u"n", u"τ"+u"\u2081"+u" [ms]", u"τ"+u"\u2082"+u" [ms]", u"F"+u"\u2081", u"\u03b1"+" (q"+u"\u2082"+"/q"+u"\u2081"+")", u"τ_trip [µs]", u"T", u"offset" ] values_factor_human_readable_6031 = [ 1., # "n", 1., # "τ"+u"\u2081"+" [ms]", 1., # "τ"+u"\u2082"+" [ms]", 1., # "F"+u"\u2081", 1., # u"\u03b1"+" (q"+u"\u2082"+"/q"+u"\u2081"+")", 1000., # "τ_trip [µs]", 1., # "T", 1. # "offset" ] valuestofit_6031 = [True, True, True, True, False, False, False, False] parms_6031 = [labels_6031, values_6031, valuestofit_6031, labels_human_readable_6031, values_factor_human_readable_6031] model1 = dict() model1["Parameters"] = parms_6001 model1["Definitions"] = m_twodga6001 model1["Supplements"] = MoreInfo_6001 model1["Verification"] = Check_xy_gauss model2 = dict() model2["Parameters"] = parms_6002 model2["Definitions"] = m_twodga6002 model2["Supplements"] = MoreInfo_6001 model2["Verification"] = Check_xy_T_gauss model3 = dict() model3["Parameters"] = parms_6031 model3["Definitions"] = m_gauss_2d_2d_t_mix_6031 model3["Supplements"] = MoreInfo_6031 model3["Verification"] = Check_6031 Modelarray = [model1, model2, model3] pycorrfit-0.8.1/src/models/MODEL_TIRF_1C.py0000755000175000017500000002044712262516600016736 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit This file contains TIR one component models Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import numpy as np # NumPy import scipy.special as sps def wixi(x): """ Complex Error Function (Faddeeva/Voigt). w(i*x) = exp(x**2) * ( 1-erf(x) ) This function is called by other functions within this module. We are using the scipy.special.wofz module which calculates w(z) = exp(-z**2) * ( 1-erf(-iz) ) z = i*x """ z = x*1j wixi = sps.wofz(z) # We should have a real solution. Make sure nobody complains about # some zero-value imaginary numbers. return np.real_if_close(wixi) def CF_Gxy_TIR_square(parms, tau): # Model 6000 u""" Two-dimensional diffusion with a square shaped lateral detection area taking into account the size of the point spread function. *parms* - a list of parameters. 
Parameters (parms[i]): [0] D Diffusion coefficient [1] σ Lateral size of the point spread function σ = σ₀ * λ / NA [2] a Side size of the square-shaped detection area [3] C_2D Particle concentration in detection area *tau* - lag time Please refer to the documentation of PyCorrFit for further information on this model function. Returns: Normalized Lateral correlation function w/square pinhole. """ D = parms[0] sigma = parms[1] a = parms[2] Conc = parms[3] var1 = sigma**2+D*tau AA = 2*np.sqrt(var1)/(a**2*np.sqrt(np.pi)) BB = np.exp(-a**2/(4*(var1))) - 1 CC = sps.erf(a/(2*np.sqrt(var1)))/a # gx = AA*BB+CC # gxy = gx**2 # g2D = gxy * eta**2 * Conc # F = 1/(eta*Conc) # G = g2D / F**2 G = 1/Conc * (AA*BB+CC)**2 return G # 3D free tir def CF_Gxyz_TIR_square(parms, tau, wixi=wixi): # Model 6010 u""" Three-dimensional diffusion with a square-shaped lateral detection area taking into account the size of the point spread function; and an exponential decaying profile in axial direction. *parms* - a list of parameters. Parameters (parms[i]): [0] D Diffusion coefficient [1] σ Lateral size of the point spread function σ = σ₀ * λ / NA [2] a Side size of the square-shaped detection area [3] d_eva Evanescent penetration depth [4] C_3D Particle concentration in detection volume *tau* - lag time Please refer to the documentation of PyCorrFit for further information on this model function. Returns: 3D correlation function for TIR-FCS w/square pinhole """ D = parms[0] sigma = parms[1] a = parms[2] kappa = 1/parms[3] Conc = parms[4] ### Calculate gxy # Axial correlation x = np.sqrt(D*tau)*kappa w_ix = wixi(x) gz = np.sqrt(D*tau/np.pi) - (2*D*tau*kappa**2 - 1)/(2*kappa) * w_ix # Lateral correlation gx1 = 2/(a**2*np.sqrt(np.pi)) * np.sqrt(sigma**2+D*tau) * \ ( np.exp(-a**2/(4*(sigma**2+D*tau))) -1 ) gx2 = 1/a * sps.erf( a / (2*np.sqrt(sigma**2 + D*tau))) gx = gx1 + gx2 gxy = gx**2 # Non normalized correlation function # We do not need eta after normalization # g = eta**2 * Conc * gxy * gz g = Conc * gxy * gz # Normalization: # F = eta * Conc / kappa F = Conc / kappa G = g / F**2 return G def MoreInfo_6000(parms, countrate): u"""Supplementary parameters: For a>>sigma, the correlation function at tau=0 corresponds to: [4] G(τ=0) = 1/(N_eff) * ( 1-2*σ/(sqrt(π)*a) )² Effective detection area: [5] A_eff [µm²] = a² Effective particle number in detection area: [6] N_eff = A_eff * C_2D """ D = parms[0] sigma = parms[1] a = parms[2] Conc = parms[3] Info=list() # Detection area: Aeff = a**2 # Particel number Neff = Aeff * Conc # Correlation function at tau = 0 G_0 = CF_Gxy_TIR_square(parms, 0) Info.append(["G(0)", G_0]) # 10 000 nm² = 0.01 µm² # Aeff * 10 000 nm² * 10**(-6)µm²/nm² = Aeff * 0.01 * µm² # Have to divide Aeff by 100 Info.append([u"A_eff [µm²]", Aeff / 100]) Info.append(["N_eff", Neff]) if countrate is not None: # CPP cpp = countrate/Neff Info.append(["cpp [kHz]", cpp]) return Info def MoreInfo_6010(parms, countrate): u"""Supplementary parameters: Molarity: [5] C_3D [nM] = C_3D [1000/µm³] * 10000/6.0221415 Effective detection volume: [6] V_eff = a² * d_eva Effective particle number: [7] N_eff = V_eff * C_3D For a>>σ, the correlation function at τ=0 corresponds to: [8] G(τ=0) = 1/(2*N_eff) * ( 1-2*σ/(sqrt(π)*a) )² """ # 3D Model TIR square # 3D TIR (□xσ/exp),Simple 3D diffusion w/ TIR, fct.CF_Gxyz_square_tir # D [10 µm²/s],σ [100 nm],a [100 nm],d_eva [100 nm],[conc.] 
[1000 /µm³] sigma = parms[1] a = parms[2] d_eva = parms[3] conc = parms[4] # Sigma Info = list() # Molarity [nM]: # 1000/(µm³)*10**15µm³/l * mol /(6.022*10^23) * 10^9 n cmol = conc * 10000/6.0221415 # Effective volume [al]: Veff = a**2 * d_eva # Effective particel number Neff = a**2 * d_eva * conc # Correlation function at tau = 0 G_0 = CF_Gxyz_TIR_square(parms, 0) Info.append(["G(0)", G_0]) Info.append(["C_3D [nM]", cmol]) # atto liters # 1 000 000 nm³ = 1 al Info.append(["V_eff [al]", Veff]) Info.append(["N_eff", Neff]) if countrate is not None: # CPP cpp = countrate/Neff Info.append(["cpp [kHz]", cpp]) return Info # 2D Model Square m_twodsq6000 = [6000, u"2D", u"2D diffusion w/ square pinhole", CF_Gxy_TIR_square] labels_6000 = [u"D [10 µm²/s]", u"σ [100 nm]", "a [100 nm]", u"C_2D [100 /µm²]"] values_6000 = [0.054, 2.3, 7.5, .6] # [D,lamb,NA,a,conc] # For user comfort we add values that are human readable. # Theese will be used for output that only humans can read. labels_human_readable_6000 = [u"D [µm²/s]", u"σ [nm]", "a [nm]", u"C_2D [1/µm²]"] values_factor_human_readable_6000 = [10, 100, 100, 100] valuestofit_6000 = [True, False, False, True] # Use as fit parameter? parms_6000 = [labels_6000, values_6000, valuestofit_6000, labels_human_readable_6000, values_factor_human_readable_6000] # 3D Model TIR square m_3dtirsq6010 = [6010, u"3D", "Simple 3D diffusion w/ TIR", CF_Gxyz_TIR_square] labels_6010 = [u"D [10 µm²/s]", u"σ [100 nm]","a [100 nm]", "d_eva [100 nm]", u"C_3D [1000 /µm³]"] values_6010 = [0.520, 2.3, 7.5, 1.0, .0216] # For user comfort we add values that are human readable. # Theese will be used for output that only humans can read. labels_human_readable_6010 = [u"D [µm²/s]", u"σ [nm]", "a [nm]", "d_eva [nm]", u"C_3D [1/µm³]"] values_factor_human_readable_6010 = [10, 100, 100, 100, 1000] valuestofit_6010 = [True, False, False, False, True] parms_6010 = [labels_6010, values_6010, valuestofit_6010, labels_human_readable_6010, values_factor_human_readable_6010] # Pack the models model1 = dict() model1["Parameters"] = parms_6000 model1["Definitions"] = m_twodsq6000 model1["Supplements"] = MoreInfo_6000 model1["Verification"] = lambda parms: np.abs(parms) model2 = dict() model2["Parameters"] = parms_6010 model2["Definitions"] = m_3dtirsq6010 model2["Supplements"] = MoreInfo_6010 model2["Verification"] = lambda parms: np.abs(parms) Modelarray = [model1, model2] pycorrfit-0.8.1/src/models/__init__.py0000644000175000017500000003133012262516600016454 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module models: Define all models and set initial parameters. Each model has a unique ID. This ID is very important: 1. It is a wxWidgets ID. 2. It is used in the saving of sessions to identify a model. It is very important, that model IDs do NOT change in newer versions of PyCorrFit, because it would not be possible to restore older PyCorrFit sessions. Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. 
This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ # This file is necessary for this folder to become a module that can be # imported from within Python/PyCorrFit. import numpy as np # NumPy import platform import sys ## On Windows XP I had problems with the unicode Characters. # I found this at # http://stackoverflow.com/questions/5419/python-unicode-and-the-windows-console # and it helped: reload(sys) sys.setdefaultencoding('utf-8') ## Models import MODEL_classic_gaussian_2D import MODEL_classic_gaussian_3D import MODEL_classic_gaussian_3D2D import MODEL_TIRF_gaussian_1C import MODEL_TIRF_gaussian_3D2D import MODEL_TIRF_gaussian_3D3D import MODEL_TIRF_1C import MODEL_TIRF_2D2D import MODEL_TIRF_3D2D import MODEL_TIRF_3D3D import MODEL_TIRF_3D2Dkin_Ries def AppendNewModel(Modelarray): """ Append a new model from a modelarray. *Modelarray* has to be a list whose elements have two items: [0] parameters [1] some info about the model See separate models for more information """ global values global valuedict global models global modeldict global verification for Model in Modelarray: # We can have many models in one model array parms = Model["Parameters"] texts = Model["Definitions"] values.append(parms) # model ID is texts[0] valuedict[texts[0]] = parms models.append(texts) modeldict[texts[0]] = texts # Suplementary Data might be there try: supper = Model["Supplements"] except KeyError: # Nothing to do pass else: supplement[texts[0]] = supper # Check functions - check for correct values try: verify = Model["Verification"] except KeyError: # Nothing to do. Return empty function, so we do not need to # do this try and error thing again. verification[texts[0]] = lambda parms: parms else: verification[texts[0]] = verify def GetHumanReadableParms(model, parameters): """ From a set of parameters that have internal units e.g. [100 nm], Calculate the parameters in human readable units e.g. [nm]. Uses modeldict from this module. *model* - an integer ID of a model *parameters* - a list/array of parameters (all parameters of that model) Returns: New Units, New Parameters """ stdparms = valuedict[model] if len(stdparms) == 5: # This means we have extra information on the model # Return some human readable stuff OldParameters = 1.*np.array(parameters) Facors = 1.*np.array(stdparms[4]) NewParameters = 1.*OldParameters*Facors NewUnits = stdparms[3] return NewUnits, NewParameters else: # There is no info about human readable stuff, or it is already human # readable. return stdparms[0], parameters def GetHumanReadableParameterDict(model, names, parameters): """ From a set of parameters that have internal units e.g. [100 nm], Calculate the parameters in human readable units e.g. [nm]. Uses modeldict from this module. In contrast to *GetHumanReadableParms* this function accepts single parameter names and does not need the full array of parameters. 
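Example (using the per-model factor tables): a triplet time stored internally as τ_trip = 0.001 [ms] with a display factor of 1000 is returned as 1.0 [µs].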
*model* - an integer ID of a model *names* - the names of the parameters to be translated; order should be the same as in *parameters* *parameters* - a list of parameters Returns: New Units, New Parameters """ stdparms = valuedict[model] if len(stdparms) == 5: # This means we have extra information on the model # Return some human readable stuff # Check for list: if isinstance(names, basestring): names = [names] parameters = [parameters] retstring = True else: retstring = False # Create new lists NewUnits = list() NewParameters = list() for i in np.arange(len(stdparms[0])): for j in np.arange(len(names)): if names[j] == stdparms[0][i]: NewUnits.append(stdparms[3][i]) NewParameters.append(stdparms[4][i]*parameters[j]) if retstring == True: NewUnits = NewUnits[0] NewParameters = NewParameters[0] return NewUnits, NewParameters else: # There is no info about human readable stuff, or it is already human # readable. return names, parameters def GetInternalFromHumanReadableParm(model, parameters): """ This is the inverse of *GetHumanReadableParms* *model* - an integer ID of a model *parameters* - a list/array of parameters Returns: New Units, New Parameters """ stdparms = valuedict[model] if len(stdparms) == 5: # This means we have extra information on the model # and can convert to internal values OldParameters = 1.*np.array(parameters) Factors = 1./np.array(stdparms[4]) # inverse NewParameters = 1.*OldParameters*Factors NewUnits = stdparms[0] return NewUnits, NewParameters else: # There is no info about human readable stuff. The given # parameters have not been converted before using # *GetHumanReadableParms*. return stdparms[0], parameters def GetModelType(modelid): """ Given a modelid, get the type of model function (Confocal, TIR-Conf., TIR-□, User) """ if modelid >= 7000: return u"User" else: shorttype = dict() shorttype[u"Confocal (Gaussian)"] = u"Confocal" shorttype[u"TIR (Gaussian/Exp.)"] = u"TIR Conf." shorttype[u"TIR (□xσ/Exp.)"] = u"TIR □xσ" for key in modeltypes.keys(): mlist = modeltypes[key] if mlist.count(modelid) == 1: try: return shorttype[key] except KeyError: return "" def GetMoreInfo(modelid, Page): """ This function is called by someone who has already calculated some stuff or wants to know more about the model he is looking at. *modelid* is an ID of a model. *Page* is a wx.flatnotebook page. Returns: More information about a model in form of a list. """ # Background signal average bgaverage = None # Signal countrate/average: # might become countrate - bgaverage countrate = Page.traceavg # Get the parameters from the current page. parms = Page.active_parms[1] Info = list() if Page.IsCrossCorrelation is False: ## First import the supplementary parameters of the model ## The order is important for plot normalization and session ## saving as of version 0.7.8 # Try to get the dictionary entry of a model # Background information if Page.bgselected is not None: # Background list consists of items with # [0] average # [1] name # [2] trace bgaverage = Page.parent.Background[Page.bgselected][0] # Now set the correct countrate # We already printed the countrate, so there's no harm done. if countrate is not None: # might be that there is no countrate. countrate = countrate - bgaverage try: # This function should return all important information # that can be calculated from the given parameters.
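# A minimal sketch of such a supplement callback (hypothetical
# example for illustration only; the shipped callbacks live in the
# individual MODEL_*.py files): it takes the parameter array and an
# optional countrate and returns a list of [label, value] pairs:
#
#     def MoreInfo_example(parms, countrate):
#         n = parms[0]                    # effective particle number
#         Info = [["G(0)", 1. / n]]       # amplitude at tau = 0
#         if countrate is not None:
#             Info.append(["cpp [kHz]", countrate / n])
#         return Info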
func_info = supplement[modelid] data = func_info(parms, countrate) for item in data: Info.append([item[0], item[1]]) except KeyError: # No information available pass # In case of cross correlation, we don't show this kind of # information. if Page.traceavg is not None: # Measurement time duration = Page.trace[-1,0]/1000 Info.append(["duration [s]", duration]) # countrate has to be printed before background. # Background might overwrite countrate. Info.append(["avg. signal [kHz]", Page.traceavg]) else: ## Cross correlation curves usually have two traces. Since we ## do not know how to compute the cpp, we will pass the argument ## "None" as the countrate. ## First import the supplementary parameters of the model ## The order is important for plot normalization and session ## saving as of version 0.7.8 # Try to get the dictionary entry of a model try: # This function should return all important information # that can be calculated from the given parameters. func_info = supplement[modelid] data = func_info(parms, None) for item in data: Info.append([item[0], item[1]]) except KeyError: # No information available pass if Page.tracecc is not None: # Measurement time duration = Page.tracecc[0][-1,0]/1000 Info.append(["duration [s]", duration]) # countrate has to be printed before background. # Background might overwrite countrate. avg0 = Page.tracecc[0][:,1].mean() avg1 = Page.tracecc[1][:,1].mean() Info.append(["avg. signal A [kHz]", avg0]) Info.append(["avg. signal B [kHz]", avg1]) if len(Info) == 0: # If nothing matched until now: return None else: return Info def GetPositionOfParameter(model, name): """ Returns an integer corresponding to the position of the label of a parameter in the model function """ stdparms = valuedict[model] for i in np.arange(len(stdparms[0])): if name == stdparms[0][i]: return int(i) # Pack all variables values = list() # Also create a dictionary, key is modelid valuedict = dict() # Pack all models models = list() # Also create a dictinary modeldict = dict() # A dictionary for supplementary data: supplement = dict() # A dictionary for checking for correct variables verification = dict() # 6001 6002 6031 AppendNewModel(MODEL_classic_gaussian_2D.Modelarray) # 6011 6012 6030 AppendNewModel(MODEL_classic_gaussian_3D.Modelarray) # 6032 AppendNewModel(MODEL_classic_gaussian_3D2D.Modelarray) # 6013 AppendNewModel(MODEL_TIRF_gaussian_1C.Modelarray) # 6033 AppendNewModel(MODEL_TIRF_gaussian_3D2D.Modelarray) # 6034 AppendNewModel(MODEL_TIRF_gaussian_3D3D.Modelarray) # 6000 6010 AppendNewModel(MODEL_TIRF_1C.Modelarray) # 6022 AppendNewModel(MODEL_TIRF_2D2D.Modelarray) # 6020 AppendNewModel(MODEL_TIRF_3D2D.Modelarray) # 6023 AppendNewModel(MODEL_TIRF_3D3D.Modelarray) # 6021 AppendNewModel(MODEL_TIRF_3D2Dkin_Ries.Modelarray) # Create a list for the differentiation between the models # This should make everything look a little cleaner modeltypes = dict() #modeltypes[u"Confocal (Gaussian)"] = [6001, 6002, 6012, 6011, 6031, 6032, 6030] #modeltypes[u"TIR (Gaussian/Exp.)"] = [6013, 6033, 6034] #modeltypes[u"TIR (□xσ/Exp.)"] = [6000, 6010, 6022, 6020, 6023, 6021] modeltypes[u"Confocal (Gaussian)"] = [6011, 6030, 6002, 6031, 6032] modeltypes[u"TIR (Gaussian/Exp.)"] = [6013, 6034, 6033] modeltypes[u"TIR (□xσ/Exp.)"] = [6010, 6023, 6000, 6022, 6020, 6021] modeltypes[u"User"] = list() pycorrfit-0.8.1/src/models/MODEL_classic_gaussian_3D.py0000755000175000017500000002510012262516600021477 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit This file contains TIR one component models + 
triplet Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see <http://www.gnu.org/licenses/>. """ import numpy as np # NumPy # 3D simple gauss def CF_Gxyz_gauss(parms, tau): # Model 6012 u""" Three-dimensional free diffusion with a Gaussian laser profile (elliptical). G(τ) = offset + 1/( n*(1+τ/τ_diff) * sqrt(1 + τ/(SP²*τ_diff)) ) Calculation of diffusion coefficient and concentration from the effective radius of the detection profile (r₀ = 2*σ): D = r₀²/(4*τ_diff) Conc = n/( sqrt(π³)*r₀²*z₀ ) r₀ lateral detection radius (waist of lateral gaussian) z₀ axial detection length (waist of axial gaussian) D Diffusion coefficient Conc Concentration of dye *parms* - a list of parameters. Parameters (parms[i]): [0] n Effective number of particles in confocal volume [1] τ_diff Characteristic residence time in confocal volume [2] SP SP=z₀/r₀ Structural parameter, describes the axis ratio of the confocal volume [3] offset *tau* - lag time """ n = parms[0] taudiff = parms[1] SP = parms[2] off = parms[3] BB = 1 / ( (1.+tau/taudiff) * np.sqrt(1.+tau/(SP**2*taudiff)) ) G = off + 1/n * BB return G # 3D blinking gauss # Model 6011 def CF_Gxyz_blink(parms, tau): u""" Three-dimensional free diffusion with a Gaussian laser profile (elliptical), including a triplet component. The triplet factor takes into account a blinking term. Set *T* or *τ_trip* to 0, if no triplet component is wanted. G(τ) = offset + 1/( n*(1+τ/τ_diff) * sqrt(1 + τ/(SP²*τ_diff)) ) * ( 1+T/(1.-T)*exp(-τ/τ_trip) ) Calculation of diffusion coefficient and concentration from the effective radius of the detection profile (r₀ = 2*σ): D = r₀²/(4*τ_diff) Conc = n/( sqrt(π³)*r₀²*z₀ ) *parms* - a list of parameters. Parameters (parms[i]): [0] n Effective number of particles in confocal volume [1] T Fraction of particles in triplet (non-fluorescent) state 0 <= T < 1 [2] τ_trip Characteristic residence time in triplet state [3] τ_diff Characteristic residence time in confocal volume [4] SP SP=z₀/r₀ Structural parameter, describes the axis ratio of the confocal volume [5] offset *tau* - lag time """ n = parms[0] T = parms[1] tautrip = parms[2] taudiff = parms[3] SP = parms[4] off = parms[5] AA = 1. + T/(1.-T) * np.exp(-tau/tautrip) BB = 1 / ( (1.+tau/taudiff) * np.sqrt(1.+tau/(SP**2*taudiff)) ) G = off + 1/n * AA * BB return G def Check_6011(parms): parms[0] = np.abs(parms[0]) T = parms[1] tautrip = np.abs(parms[2]) parms[3] = taudiff = np.abs(parms[3]) parms[4] = np.abs(parms[4]) off = parms[5] # Triplet fraction is between 0 and one. T = (0.<=T<1.)*T + .999999999*(T>=1) parms[1] = T parms[2] = tautrip return parms # 3D + 3D + Triplet Gauß # Model 6030 def CF_Gxyz_gauss_3D3DT(parms, tau): u""" Two-component three-dimensional free diffusion with a Gaussian laser profile, including a triplet component. The triplet factor takes into account a blinking term. Set *T* or *τ_trip* to 0, if no triplet component is wanted.
particle1 = F₁/( (1+τ/τ₁) * sqrt(1+τ/(τ₁*SP²))) particle2 = α²*(1-F₁)/( (1+τ/τ₂) * sqrt(1+τ/(τ₂*SP²))) triplet = 1 + T/(1-T)*exp(-τ/τ_trip) norm = (F₁ + α*(1-F₁))² G = 1/n*(particle1 + particle2)*triplet/norm + offset *parms* - a list of parameters. Parameters (parms[i]): [0] n Effective number of particles in confocal volume (n = n₁+n₂) [1] τ₁ Diffusion time of particle species 1 [2] τ₂ Diffusion time of particle species 2 [3] F₁ Fraction of molecules of species 1 (n₁ = n*F₁) 0 <= F₁ <= 1 [4] SP SP=z₀/r₀, Structural parameter, describes elongation of the confocal volume [5] α Relative molecular brightness of particle 2 compared to particle 1 (α = q₂/q₁) [6] τ_trip Characteristic residence time in triplet state [7] T Fraction of particles in triplet (non-fluorescent) state 0 <= T < 1 [8] offset *tau* - lag time """ n=parms[0] taud1=parms[1] taud2=parms[2] F=parms[3] SP=parms[4] alpha=parms[5] tautrip=parms[6] T=parms[7] off=parms[8] particle1 = F/( (1+tau/taud1) * np.sqrt(1+tau/(taud1*SP**2))) particle2 = alpha**2*(1-F)/( (1+tau/taud2) * np.sqrt(1+tau/(taud2*SP**2))) # If the fraction of dark molecules is zero, it's ok. Python can also handle # exp(-1/inf). triplet = 1 + T/(1-T)*np.exp(-tau/tautrip) # For alpha == 1, *norm* becomes one norm = (F + alpha*(1-F))**2 G = 1/n*(particle1 + particle2)*triplet/norm + off return G def Check_3D3DT(parms): parms[0] = np.abs(parms[0]) parms[1] = taud1 = np.abs(parms[1]) parms[2] = taud2 = np.abs(parms[2]) F=parms[3] parms[4]=np.abs(parms[4]) parms[5]=np.abs(parms[5]) tautrip=np.abs(parms[6]) T=parms[7] off=parms[8] # Triplet fraction is between 0 and one. T may not be one! T = (0.<=T<1.)*T + .99999999999999*(T>=1) # Fraction of molecules may also be one F = (0.<=F<=1.)*F + 1.*(F>1) parms[3] = F parms[6] = tautrip parms[7] = T return parms def MoreInfo_1C(parms, countrate): # We can only give you the effective particle number n = parms[0] Info = list() if countrate is not None: # CPP cpp = countrate/n Info.append(["cpp [kHz]", cpp]) return Info def MoreInfo_6030(parms, countrate): u"""Supplementary parameters: [9] n₁ = n*F₁ Particle number of species 1 [10] n₂ = n*(1-F₁) Particle number of species 2 """ # We can only give you the effective particle number n = parms[0] F1 = parms[3] Info = list() # The enumeration of these parameters is very important for # plotting the normalized curve. Countrate must come out last! Info.append([u"n\u2081", n*F1]) Info.append([u"n\u2082", n*(1.-F1)]) if countrate is not None: # CPP cpp = countrate/n Info.append(["cpp [kHz]", cpp]) return Info # 3D Model blink gauss m_3dblink6011 = [6011, "T+3D","3D confocal diffusion with triplet", CF_Gxyz_blink] labels_6011 = ["n","T","τ_trip [ms]", "τ_diff [ms]", "SP", "offset"] values_6011 = [4.0, 0.2, 0.001, 0.4, 5.0, 0.0] labels_hr_6011 = ["n","T","τ_trip [µs]", "τ_diff [ms]", "SP", "offset"] factors_hr_6011 = [1., 1., 1000., 1., 1., 1.]
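# Sketch of how these human-readable factors are applied (this mirrors
# GetHumanReadableParms in models/__init__.py; stand-alone example for
# illustration only):
#
#     import numpy as np
#     internal = np.array(values_6011)       # tau_trip in ms
#     factors  = np.array(factors_hr_6011)
#     human    = internal * factors          # tau_trip now in µs
#     # e.g. 0.001 ms * 1000. = 1.0 µs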
valuestofit_6011 = [True, True, True, True, False, False] parms_6011 = [labels_6011, values_6011, valuestofit_6011, labels_hr_6011, factors_hr_6011] # 3D Model gauss m_3dgauss6012 = [6012, "3D","3D confocal diffusion", CF_Gxyz_gauss] labels_6012 = ["n", "τ_diff [ms]", "SP", "offset"] values_6012 = [4.0, 0.4, 5.0, 0.0] valuestofit_6012 = [True, True, False, False] parms_6012 = [labels_6012, values_6012, valuestofit_6012] # 3D + 3D + T model gauss m_gauss_3d_3d_t_mix_6030 = [6030, "T+3D+3D", "Separate 3D diffusion + triplet, Gauß", CF_Gxyz_gauss_3D3DT] labels_6030 = ["n", "τ"+u"\u2081"+" [ms]", "τ"+u"\u2082"+" [ms]", "F"+u"\u2081", "SP", u"\u03b1"+" (q"+u"\u2082"+"/q"+u"\u2081"+")", "τ_trip [ms]", "T", "offset" ] values_6030 = [ 25, # n 5, # taud1 1000, # taud2 0.75, # F 5, # SP 1.0, # alpha 0.001, # tautrip 0.01, # T 0.0 # offset ] # For user comfort we add values that are human readable. # Theese will be used for output that only humans can read. labels_human_readable_6030 = [ "n", "τ"+u"\u2081"+" [ms]", "τ"+u"\u2082"+" [ms]", "F"+u"\u2081", "SP", u"\u03b1"+" (q"+u"\u2082"+"/q"+u"\u2081"+")", "τ_trip [µs]", "T", "offset" ] values_factor_human_readable_6030 = [ 1., # n 1., # taud1 1., # taud2 1., # F 1., # SP 1., # alpha 1000., # tautrip [µs] 1., # T 1. # offset ] valuestofit_6030 = [True, True, True, True, False, False, False, False, False] parms_6030 = [labels_6030, values_6030, valuestofit_6030, labels_human_readable_6030, values_factor_human_readable_6030] # Pack the models model1 = dict() model1["Parameters"] = parms_6011 model1["Definitions"] = m_3dblink6011 model1["Supplements"] = MoreInfo_1C model1["Verification"] = Check_6011 model2 = dict() model2["Parameters"] = parms_6012 model2["Definitions"] = m_3dgauss6012 model2["Supplements"] = MoreInfo_1C model3 = dict() model3["Parameters"] = parms_6030 model3["Definitions"] = m_gauss_3d_3d_t_mix_6030 model3["Supplements"] = MoreInfo_6030 model3["Verification"] = Check_3D3DT Modelarray = [model1, model2, model3] pycorrfit-0.8.1/src/models/MODEL_TIRF_3D3D.py0000755000175000017500000001344012262516600017123 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit This file contains 3D+3D TIR-FCS models. Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import numpy as np # NumPy import scipy.special as sps def wixi(x): """ Complex Error Function (Faddeeva/Voigt). w(i*x) = exp(x**2) * ( 1-erf(x) ) This function is called by other functions within this module. We are using the scipy.special.wofz module which calculates w(z) = exp(-z**2) * ( 1-erf(-iz) ) z = i*x """ z = x*1j wixi = sps.wofz(z) # We should have a real solution. Make sure nobody complains about # some zero-value imaginary numbers. 
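# Illustrative numerical check of the identity quoted above (not part
# of the original code):
#
#     >>> import numpy as np
#     >>> import scipy.special as sps
#     >>> x = 0.5
#     >>> sps.wofz(1j * x).real                  # w(i*x)
#     0.6156903...
#     >>> np.exp(x**2) * (1 - sps.erf(x))        # exp(x**2)*(1-erf(x))
#     0.6156903...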
return np.real_if_close(wixi) # 3D + 3D no binding TIRF def CF_Gxyz_TIR_square_3d3d(parms, tau, wixi=wixi): u""" Two-component three-dimensional free diffusion with a square-shaped lateral detection area taking into account the size of the point spread function; and an exponential decaying profile in axial direction. *parms* - a list of parameters. Parameters (parms[i]): [0] D_3D1 3D Diffusion coefficient (species 1) [1] D_3D2 3D Diffusion coefficient of bound species 2 [2] σ Lateral size of the point spread function σ = σ₀ * λ / NA [3] a Side size of the square-shaped detection area [4] d_eva Evanescent penetration depth [5] C_3D1 Concentration of species 1 [6] C_3D2 Concentration of species 2 [7] α Relative molecular brightness of particle 2 compared to particle 1 (α = q₂/q₁) *tau* - lag time """ D_3D1 = parms[0] D_3D2 = parms[1] sigma = parms[2] a = parms[3] kappa = 1/parms[4] Conc_3D1 = parms[5] Conc_3D2 = parms[6] alpha = parms[7] ## First, the 3D diffusion of species 1 # Axial correlation x1 = np.sqrt(D_3D1*tau)*kappa w_ix1 = wixi(x1) gz1 = np.sqrt(D_3D1*tau/np.pi) - (2*D_3D1*tau*kappa**2 - 1)/(2*kappa) * \ w_ix1 # Lateral correlation gx1_1 = 2/(a**2*np.sqrt(np.pi)) * np.sqrt(sigma**2+D_3D1*tau) * \ ( np.exp(-a**2/(4*(sigma**2+D_3D1*tau))) -1 ) gx2_1 = 1/a * sps.erf( a / (2*np.sqrt(sigma**2 + D_3D1*tau))) gx1 = gx1_1 + gx2_1 gxy1 = gx1**2 # Non normalized correlation function g3D1 = Conc_3D1 * gxy1 * gz1 ## Second, the 3D diffusion of species 2 # Axial correlation x2 = np.sqrt(D_3D2*tau)*kappa w_ix2 = wixi(x2) gz2 = np.sqrt(D_3D2*tau/np.pi) - (2*D_3D2*tau*kappa**2 - 1)/(2*kappa) * \ w_ix2 # Lateral correlation gx1_2 = 2/(a**2*np.sqrt(np.pi)) * np.sqrt(sigma**2+D_3D2*tau) * \ ( np.exp(-a**2/(4*(sigma**2+D_3D2*tau))) -1 ) gx2_2 = 1/a * sps.erf( a / (2*np.sqrt(sigma**2 + D_3D2*tau))) gx2 = gx1_2 + gx2_2 gxy2 = gx2**2 # Non normalized correlation function g3D2 = alpha**2 * Conc_3D2 * gxy2 * gz2 ## Finally the Prefactor F = (Conc_3D1 + alpha * Conc_3D2) / kappa G = (g3D1 + g3D2) / F**2 return G # 3D-3D Model TIR m_tir_3d_3d_mix_6023 = [6023, u"3D+3D", "Separate 3D diffusion, 3D TIR", CF_Gxyz_TIR_square_3d3d] labels_6023 = ["D"+u"\u2081"+u" [10 µm²/s]", "D"+u"\u2082"+u" [10 µm²/s]", u"σ [100 nm]", "a [100 nm]", "d_eva [100 nm]", "C"+u"\u2081"+u" [1000 /µm³]", "C"+u"\u2082"+u" [1000 /µm³]", u"\u03b1"+" (q"+u"\u2082"+"/q"+u"\u2081"+")" ] values_6023 = [ 9.0, # D_3D₁ [10 µm²/s] 0.5, # D_3D₂ [10 µm²/s] 2.3, # σ [100 nm] 7.50, # a [100 nm] 1.0, # d_eva [100 nm] 0.01, # conc.3D₁ [1000 /µm³] 0.03, # conc.3D₂ [1000 /µm³] 1 # alpha ] # For user comfort we add values that are human readable. # Theese will be used for output that only humans can read. 
labels_human_readable_6023 = ["D"+u"\u2081"+u" [µm²/s]", "D"+u"\u2082"+u" [µm²/s]", u"σ [nm]", "a [nm]", "d_eva [nm]", "C"+u"\u2081"+u" [1/µm³]", "C"+u"\u2082"+u" [1/µm³]", u"\u03b1"+" (q"+u"\u2082"+"/q"+u"\u2081"+")" ] values_factor_human_readable_6023 = [10, # "D_3D₁ [µm²/s]", 10, # D_3D₂ [10 µm²/s] 100, # σ [100 nm] 100, # a [100 nm] 100, # d_eva [100 nm] 1000, # conc.3D₁ [1000 /µm³] 1000, # conc.3D₂ [1000 /µm³] 1 # alpha ] valuestofit_6023 = [False, True, False, False, False, False, True, False] parms_6023 = [labels_6023, values_6023, valuestofit_6023, labels_human_readable_6023, values_factor_human_readable_6023] model1 = dict() model1["Parameters"] = parms_6023 model1["Definitions"] = m_tir_3d_3d_mix_6023 model1["Verification"] = lambda parms: np.abs(parms) Modelarray = [model1] pycorrfit-0.8.1/src/models/MODEL_TIRF_gaussian_1C.py0000755000175000017500000001131412262516600020621 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit This file contains TIR one component models + Triplet Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see <http://www.gnu.org/licenses/>. """ import numpy as np # NumPy import scipy.special as sps def wixi(x): """ Complex Error Function (Faddeeva/Voigt). w(i*x) = exp(x**2) * ( 1-erf(x) ) This function is called by other functions within this module. We are using the scipy.special.wofz module which calculates w(z) = exp(-z**2) * ( 1-erf(-iz) ) z = i*x """ z = x*1j wixi = sps.wofz(z) # We should have a real solution. Make sure nobody complains about # some zero-value imaginary numbers. return np.real_if_close(wixi) def CF_Gxyz_TIR_gauss(parms, tau): u""" Three-dimensional free diffusion with a Gaussian lateral detection profile and an exponentially decaying profile in axial direction. x = sqrt(D*τ)*κ κ = 1/d_eva w(i*x) = exp(x²)*erfc(x) gz = κ * [ sqrt(D*τ/π) + (1 - 2*D*τ*κ²)/(2*κ) * w(i*x) ] g2D = 1 / [ π (r₀² + 4*D*τ) ] G = 1/C_3D * g2D * gz *parms* - a list of parameters. Parameters (parms[i]): [0] D Diffusion coefficient [1] r₀ Lateral extent of the detection volume [2] d_eva Evanescent field depth [3] C_3D Particle concentration in the confocal volume *tau* - lag time Returns: Normalized 3D correlation function for TIRF. """ D = parms[0] r0 = parms[1] deva = parms[2] Conc = parms[3] # Effective detection volume and effective particle number Veff = np.pi * r0**2 * deva Neff = Conc * Veff taudiff = r0**2/(4*D) # 2D gauss component # G2D = 1/N2D * g2D = 1/(Aeff*Conc.2D) * g2D g2D = 1 / ( (1.+tau/taudiff) ) # 1d TIR component # Axial correlation kappa = 1/deva x = np.sqrt(D*tau)*kappa w_ix = wixi(x) # Gz = 1/N1D * gz = kappa / Conc.1D * gz gz = kappa * (np.sqrt(D*tau/np.pi) - (2*D*tau*kappa**2 - 1)/(2*kappa) * w_ix) # gz * g2D * 1/( deva *A2D) * 1 / Conc3D # Neff is not the actual particle number. This formula just looks nicer # this way.
# What would be easier to get is: # 1 / (Conc * deva * np.pi * r0) * gz * g2D return 1 / (Neff) * g2D * gz def MoreInfo_6013(parms, countrate): u"""Supplementary variables: Beware that the effective volume is chosen arbitrarily. Correlation function at lag time τ=0: [4] G(τ=0) Effective detection volume: [5] V_eff = π * r₀² * d_eva Effective particle concentration: [6] C_3D [nM] = C_3D [1000/µm³] * 10000/6.0221415 """ D = parms[0] r0 = parms[1] deva = parms[2] Conc = parms[3] Info=list() # Detection volume: Veff = np.pi * r0**2 * deva Neff = Conc * Veff # Correlation function at tau = 0 G_0 = CF_Gxyz_TIR_gauss(parms, 0) Info.append(["G(0)", G_0]) Info.append(["V_eff [al]", Veff]) Info.append(["C_3D [nM]", Conc * 10000/6.0221415]) if countrate is not None: # CPP cpp = countrate/Neff Info.append(["cpp [kHz]", cpp]) return Info # 3D Model TIR gaussian m_3dtirsq6013 = [6013, "3D","Simple 3D diffusion w/ TIR", CF_Gxyz_TIR_gauss] labels_6013 = ["D [10 µm²/s]", u"r₀ [100 nm]", "d_eva [100 nm]", "C_3D [1000/µm³]"] values_6013 = [2.5420, 9.44, 1.0, .03011] # For user comfort we add values that are human readable. # These will be used for output that only humans can read. labels_human_readable_6013 = ["D [µm²/s]", u"r₀ [nm]", "d_eva [nm]", "C_3D [1/µm³]"] values_factor_human_readable_6013 = [10, 100, 100, 1000] valuestofit_6013 = [True, False, False, True] parms_6013 = [labels_6013, values_6013, valuestofit_6013, labels_human_readable_6013, values_factor_human_readable_6013] # Pack the models model1 = dict() model1["Parameters"] = parms_6013 model1["Definitions"] = m_3dtirsq6013 model1["Supplements"] = MoreInfo_6013 model1["Verification"] = lambda parms: np.abs(parms) Modelarray = [model1] pycorrfit-0.8.1/src/plotting.py0000644000175000017500000004100712262516600015274 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module plotting Everything about plotting with matplotlib is located here. Be sure to install texlive-science and texlive-math-extra Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see <http://www.gnu.org/licenses/>. """ import codecs import numpy as np import matplotlib # We do catch warnings about performing this before matplotlib.backends stuff import warnings with warnings.catch_warnings(): warnings.simplefilter("ignore") matplotlib.use('WXAgg') # Tells matplotlib to use WxWidgets for dialogs import matplotlib.gridspec as gridspec import matplotlib.pyplot as plt # Text rendering with matplotlib from matplotlib import rcParams from matplotlib.backends.backend_wx import NavigationToolbar2Wx #We hack this ## In edclasses, we edited the wxWidgets version of the NavigationToolbar2Wx. ## This hack enables us to remember directories.
# import edclasses # NavigationToolbar2Wx = edclasses.NavigationToolbar2Wx import unicodedata # For finding latex tools from misc import findprogram import models as mdls def greek2tex(char): """ Converts greek UTF-8 letters to latex """ decchar = codecs.decode(char, "UTF-8") repres = unicodedata.name(decchar).split(" ") # GREEK SMALL LETTER ALPHA if repres[0] == "GREEK" and len(repres) == 4: letter = repres[3].lower() if repres[1] != "SMALL": letter = letter[0].capitalize() + letter[1:] return "\\"+letter else: return char def escapechars(string): """ For latex output, some characters have to be escaped with a "\\" """ string = codecs.decode(string, "UTF-8") escapechars = ["#", "$", "%", "&", "~", "_", "\\", "{", "}"] retstr = ur"" for char in string: if char in escapechars: retstr += "\\" retstr += char elif char == "^": # Make a hat in latex without $$? retstr += "$\widehat{~}$" else: retstr += char return retstr def latexmath(string): """ Format given parameters to nice latex. """ string = codecs.decode(string, "UTF-8") unicodechars = dict() #unicodechars[codecs.decode("τ", "UTF-8")] = r"\tau" #unicodechars[codecs.decode("µ", "UTF-8")] = r"\mu" unicodechars[codecs.decode("²", "UTF-8")] = r"^2" unicodechars[codecs.decode("³", "UTF-8")] = r"^3" unicodechars[codecs.decode("₁", "UTF-8")] = r"_1" unicodechars[codecs.decode("₂", "UTF-8")] = r"_2" unicodechars[codecs.decode("₀", "UTF-8")] = r"_0" #unicodechars[codecs.decode("α", "UTF-8")] = r"\alpha" # We need lambda in here, because unicode names it lamda sometimes. unicodechars[codecs.decode("λ", "UTF-8")] = r"\lambda" #unicodechars[codecs.decode("η", "UTF-8")] = r'\eta' items = string.split(" ", 1) a = items[0] if len(items) > 1: b = items[1] else: b = "" anew = r"" for char in a: if char in unicodechars.keys(): anew += unicodechars[char] elif char != greek2tex(char): anew += greek2tex(char) else: anew += char # lower case lcitems = anew.split("_",1) if len(lcitems) > 1: anew = lcitems[0]+"_{\\text{"+lcitems[1]+"}}" return anew + r" \hspace{0.3em} \mathrm{"+b+r"}" def savePlotCorrelation(parent, dirname, Page, uselatex=False, verbose=False, show_weights=True): """ Save plot from Page into file Parameters: *parent* the parent window *dirname* directory to set on saving *Page* Page containing all variables *uselatex* Whether to use latex for the ploting or not. This function uses a hack in misc.py to change the function for saving the final figure. We wanted save in the same directory as PyCorrFit was working and the filename should be the tabtitle. """ # Close all other plots before commencing try: plt.close() except: pass # As of version 0.7.8 the user may export data normalized to a certain # parameter. 
if Page.dataexp is not None: dataexp = 1*Page.dataexp resid = 1*Page.resid dataexp[:,1] *= Page.normfactor resid[:,1] *= Page.normfactor else: dataexp = Page.dataexp resid = Page.resid fit = 1*Page.datacorr fit[:,1] *= Page.normfactor weights = Page.weights_plot_fill_area tabtitle = Page.tabtitle.GetValue() #fitlabel = ur"Fit model: "+str(mdls.modeldict[Page.modelid][0]) fitlabel = Page.modelname labelweights = ur"Weights of fit" labels, parms = mdls.GetHumanReadableParms(Page.modelid, Page.active_parms[1]) # Error parameters with nice look errparmsblank = Page.parmoptim_error if errparmsblank is None: errparms = None else: errparms = dict() for key in errparmsblank.keys(): newkey, newparm = mdls.GetHumanReadableParameterDict(Page.modelid, key, errparmsblank[key]) errparms[newkey] = newparm parmids = np.where(Page.active_parms[2])[0] labels = np.array(labels)[parmids] parms = np.array(parms)[parmids] if dataexp is None: if tabtitle.strip() == "": fitlabel = Page.modelname else: fitlabel = tabtitle else: if tabtitle.strip() == "": tabtitle = "page"+str(Page.counter).strip().strip(":") if Page.normparm is not None: fitlabel += ur", normalized to "+Page.active_parms[0][Page.normparm] ## Check if we can use latex or plotting: (r1, path) = findprogram("latex") (r2, path) = findprogram("dvipng") # Ghostscript (r31, path) = findprogram("gs") (r32, path) = findprogram("mgs") # from miktex r3 = max(r31,r32) if r1+r2+r3 < 3: uselatex = False if uselatex == True: rcParams['text.usetex']=True rcParams['text.latex.unicode']=True rcParams['font.family']='serif' rcParams['text.latex.preamble']=[r"\usepackage{amsmath}"] fitlabel = ur"{\normalsize "+escapechars(fitlabel)+r"}" tabtitle = ur"{\normalsize "+escapechars(tabtitle)+r"}" labelweights = ur"{\normalsize "+escapechars(labelweights)+r"}" else: rcParams['text.usetex']=False # create plot # plt.plot(x, y, '.', label = 'original data', markersize=5) fig=plt.figure() if resid is not None: gs = gridspec.GridSpec(2, 1, height_ratios=[5,1]) ax = plt.subplot(gs[0]) else: ax = plt.subplot(111) # ax = plt.axes() ax.semilogx() if dataexp is not None: plt.plot(dataexp[:,0], dataexp[:,1], '-', color="darkgrey", label=tabtitle) else: plt.xlabel(r'lag time $\tau$ [ms]') # Plotting with error bars is very ugly if you have a lot of # data points. # We will use fill_between instead. #plt.errorbar(fit[:,0], fit[:,1], yerr=weights, fmt='-', # label = fitlabel, lw=2.5, color="blue") plt.plot(fit[:,0], fit[:,1], '-', label = fitlabel, lw=2.5, color="blue") if weights is not None and show_weights is True: plt.fill_between(weights[0][:,0],weights[0][:,1],weights[1][:,1], color='cyan') # fake legend: p = plt.Rectangle((0, 0), 0, 0, color='cyan', label=labelweights) ax.add_patch(p) plt.ylabel('correlation') if dataexp is not None: mind = np.min([ dataexp[:,1], fit[:,1]]) maxd = np.max([ dataexp[:,1], fit[:,1]]) else: mind = np.min(fit[:,1]) maxd = np.max(fit[:,1]) ymin = mind - (maxd - mind)/20. ymax = maxd + (maxd - mind)/20. ax.set_ylim(bottom=ymin, top=ymax) xmin = np.min(fit[:,0]) xmax = np.max(fit[:,0]) ax.set_xlim(xmin, xmax) # Add some nice text: if uselatex == True and len(parms) != 0: text = r"" text += r'\[' #every line is a separate raw string... 
text += r'\begin{split}' # ...but they are all concatenated # by the interpreter :-) for i in np.arange(len(parms)): text += r' '+latexmath(labels[i])+r" &= " + str(parms[i]) +r' \\ ' if errparms is not None: keys = errparms.keys() keys.sort() for key in keys: text += r' \Delta '+latexmath(key)+r" &= " + str(errparms[key]) +r' \\ ' text += r' \end{split} ' text += r' \] ' else: text = ur"" for i in np.arange(len(parms)): text += labels[i]+" = "+str(parms[i])+"\n" if errparms is not None: keys = errparms.keys() keys.sort() for key in keys: text += "Err "+key+" = " + str(errparms[key]) +"\n" # Add some more stuff to the text and append data to a .txt file #text = Auswert(parmname, parmoptim, text, savename) plt.legend() logmax = np.log10(xmax) logmin = np.log10(xmin) logtext = 0.6*(logmax-logmin)+logmin xt = 10**(logtext) yt = 0.3*ymax plt.text(xt,yt,text, size=12) if resid is not None: ax2 = plt.subplot(gs[1]) #ax2 = plt.axes() ax2.semilogx() if Page.weighted_fit_was_performed: if uselatex == True: lb = r"\newline \indent " else: lb = "\n" yLabelRes = "weighted "+ lb +"residuals" else: yLabelRes = "residuals" plt.plot(resid[:,0], resid[:,1], '-', color="darkgrey", label=yLabelRes) plt.xlabel(r'lag time $\tau$ [ms]') plt.ylabel(yLabelRes, multialignment='center') minx = np.min(resid[:,0]) maxx = np.max(resid[:,0]) miny = np.min(resid[:,1]) maxy = np.max(resid[:,1]) ax2.set_xlim(minx, maxx) maxy = max(abs(maxy), abs(miny)) ax2.set_ylim(-maxy, maxy) ticks = ax2.get_yticks() ax2.set_yticks([ticks[0], ticks[-1], 0]) ## Hack # We need this for hacking. See edclasses. fig.canvas.HACK_parent = parent fig.canvas.HACK_fig = fig fig.canvas.HACK_Page = Page fig.canvas.HACK_append = "" if verbose == True: plt.show() else: # If WXAgg is not used for some reason, then our hack does not work # and we must use e.g. TkAgg try: fig.canvas.toolbar.save() except AttributeError: fig.canvas.toolbar.save_figure() # Close all other plots before commencing try: plt.close() except: pass def savePlotTrace(parent, dirname, Page, uselatex=False, verbose=False): """ Save trace plot from Page into file Parameters: *parent* the parent window *dirname* directory to set on saving *Page* Page containing all variables *uselatex* Whether to use latex for the ploting or not. This function uses a hack in misc.py to change the function for saving the final figure. We wanted save in the same directory as PyCorrFit was working and the filename should be the tabtitle. """ # Close all other plots before commencing try: plt.close() except: pass # Trace must be displayed in s timefactor = 1e-3 tabtitle = Page.tabtitle.GetValue() if tabtitle.strip() == "": tabtitle = "page"+str(Page.counter).strip().strip(":") # Intensity trace in kHz may stay the same if Page.trace is not None: # Set trace traces = [Page.trace] labels = [tabtitle] elif Page.tracecc is not None: # We have some cross-correlation here. Two traces. 
traces = Page.tracecc labels = [tabtitle+" A", tabtitle+" B"] else: return ## Check if we can use latex or plotting: (r1, path) = findprogram("latex") (r2, path) = findprogram("dvipng") # Ghostscript (r31, path) = findprogram("gs") (r32, path) = findprogram("mgs") # from miktex r3 = max(r31,r32) if r1+r2+r3 < 3: uselatex = False if uselatex == True: rcParams['text.usetex']=True rcParams['text.latex.unicode']=True rcParams['font.family']='serif' rcParams['text.latex.preamble']=[r"\usepackage{amsmath}"] for i in np.arange(len(labels)): labels[i] = ur"{\normalsize "+escapechars(labels[i])+r"}" else: rcParams['text.usetex']=False # create plot # plt.plot(x, y, '.', label = 'original data', markersize=5) fig=plt.figure() ax = plt.subplot(111) for i in np.arange(len(traces)): # Columns time = traces[i][:,0]*timefactor intensity = traces[i][:,1] plt.plot(time, intensity, '-', label = labels[i], lw=1) plt.ylabel('count rate [kHz]') plt.xlabel('time [s]') # Add some more stuff to the text and append data to a .txt file plt.legend() ## Hack # We need this for hacking. See edclasses. fig.canvas.HACK_parent = parent fig.canvas.HACK_fig = fig fig.canvas.HACK_Page = Page fig.canvas.HACK_append = "_trace" if verbose == True: plt.show() else: # If WXAgg is not used for some reason, then our hack does not work # and we must use e.g. TkAgg try: fig.canvas.toolbar.save() except AttributeError: fig.canvas.toolbar.save_figure() # Close all other plots before commencing try: plt.close() except: pass def savePlotSingle(name, x, dataexp, datafit, dirname = ".", uselatex=False): """ CURRENTLY THIS FUNCTION IS NOT USED BY PYCORRFIT Show log plot of correlation function without residuals. Parameters: *name* name of curve in legend *x* tau-values to plot *dataexp* correlation data to plot *datafit* fitted curve to correlation data *dirname* initial directory for dialog (not used here) *uselatex* use latex for plotting This function uses a hack in misc.py to change the function for saving the final figure. We wanted save in the same directory as PyCorrFit was working and the filename should be the tabtitle. """ # This is a dirty hack to make sure no plots are opened try: plt.close() except: pass ## Check if we can use latex or plotting: (r1, path) = findprogram("latex") (r2, path) = findprogram("dvipng") # Ghostscript (r31, path) = findprogram("gs") (r32, path) = findprogram("mgs") # from miktex r3 = max(r31,r32) if r1+r2+r3 < 3: uselatex = False if uselatex == True: rcParams['text.usetex']=True rcParams['text.latex.unicode']=True rcParams['font.family']='serif' rcParams['text.latex.preamble']=[r"\usepackage{amsmath}"] name = ur"{\normalsize "+escapechars(name)+r"}" else: rcParams['text.usetex']=False # create plot # plt.plot(x, y, '.', label = 'original data', markersize=5) fig=plt.figure() ax = plt.subplot(111) # ax = plt.axes() ax.semilogx() plt.plot(x, dataexp,'-', color="darkgrey") plt.xlabel(r'lag time $\tau$ [ms]') plt.plot(x, datafit, '-', label = name, lw=2.5, color="blue") plt.ylabel('correlation') mind = np.min([ dataexp, datafit]) maxd = np.max([ dataexp, datafit]) ymin = mind - (maxd - mind)/20. ymax = maxd + (maxd - mind)/20. 
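# Pad the y-axis by 5 % of the data range (one twentieth) on either
# side so the curve does not touch the plot frame.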
ax.set_ylim(bottom=ymin, top=ymax) xmin = np.min(x) xmax = np.max(x) ax.set_xlim(xmin, xmax) # Add some more stuff to the text and append data to a .txt file #text = Auswert(parmname, parmoptim, text, savename) plt.legend() plt.show() pycorrfit-0.8.1/src/__init__.py0000644000175000017500000000316112262516600015172 0ustar toortoor# -*- coding: utf-8 -*- """ In current biomedical research, fluorescence correlation spectroscopy (FCS) is applied to characterize molecular dynamic processes in vitro and in living cells. Commercial FCS setups only permit data analysis that is limited to a specific instrument by the use of in-house file formats or a finite number of implemented correlation model functions. PyCorrFit is a general-purpose FCS evaluation software that, amongst other formats, supports the established Zeiss ConfoCor3 ~.fcs file format. PyCorrFit comes with several built-in model functions, covering a wide range of applications in standard confocal FCS. In addition, it contains equations dealing with different excitation geometries like total internal reflection (TIR). Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import doc import models import readfiles __version__ = doc.__version__ __author__ = "Paul Mueller" __email__ = "paul.mueller@biotec.tu-dresden.de" pycorrfit-0.8.1/src/leastsquaresfit.py0000644000175000017500000004047612262516600016664 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module leastsquaresfit Here are the necessary functions for computing a fit with given parameters. See included class "Fit" for more information. scipy.optimize.leastsq "leastsq" is a wrapper around MINPACK's lmdif and lmder algorithms. Those use the Levenberg-Marquardt algorithm. subroutine lmdif the purpose of lmdif is to minimize the sum of the squares of m nonlinear functions in n variables by a modification of the levenberg-marquardt algorithm. the user must provide a subroutine which calculates the functions. the jacobian is then calculated by a forward-difference approximation. Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import matplotlib.pyplot as plt import numpy as np from scipy import interpolate as spintp from scipy import optimize as spopt # If we use this module with PyCorrFit, we can plot things with latex using # our own special thing. 
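# Stand-alone sketch of the scipy.optimize.leastsq call pattern that
# this module builds on (hypothetical model and data, for illustration
# only; the real residual function is Fit.fit_function below):
#
#     import numpy as np
#     from scipy import optimize as spopt
#
#     def residuals(parms, x, data):
#         # single-exponential toy model
#         return parms[0] * np.exp(-x / parms[1]) - data
#
#     x = np.linspace(0.1, 10, 50)
#     data = 2.0 * np.exp(-x / 3.0)
#     popt, pcov, infodict, errmsg, ier = spopt.leastsq(
#         residuals, [1.0, 1.0], args=(x, data), full_output=1)
#     # popt -> approx. [2.0, 3.0]; ier in [1, 2, 3, 4] means success
#
# (The guarded import below pulls in the optional plotting module
# mentioned above.)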
try: import plotting except: pass class Fit(object): """ The class Fit needs the following parameters to perform a fit: check_parms - A function checking the parameters for plausibility. dataexpfull - Full experimental data *array of tuples* function - function to be used for fitting f(parms, x) interval - interval of dataexpfull to fit in. [a, b] values - starting parameters *parms* for fitting. *array* valuestofit - which parameter to use for fitting. *bool array* weights - no. of datapoints from left and right to use for weighting fittype - type of fit. Can be one of the following - "None" (standard) - no weights. (*weights* is ignored) - "splineX" - fit a Xth order spline and calulate standard deviation from that difference - "model function" - calculate std. dev. from difference of fit function and dataexpfull. - "other" - use an external std. dev.. The variable self.external_deviations has to be set before self.ApplyParameters is called. Cropping with *interval* is performed here. """ def __init__(self): """ Initial setting of needed variables via the given *fitset* """ self.check_parms = None self.dataexpfull = None self.function = None self.interval = None # Eventually use latex. This is passed # to each plotting command. Only when plotting # module is available. self.uselatex = False self.values = None self.valuestofit = None self.verbose = False # Verbose mode (shows e.g. spline fit) # The weights (data points from left and right of data array) have # to be chosen in a way, that the interval +/- weights will not # exceed self.dataexpfull!!!! self.weights = None # Changing fittype will change calculation of variances=dataweights**2. # None means dataweights is 1. self.fittype = "None" # Chi**2 Value self.chi = None # Messages from leastsq self.mesg = None # Optimal parameters found by leastsq self.parmoptim = None self.covar = None # covariance matrix self.parmoptim_error = None # Errors of fit # Variances for fitting self.dataweights = None # External std defined by the user self.external_deviations = None # It is possible to edit tolerance for fitting # ftol, xtol and gtol. # Those parameters could be added to the fitting routine later. # Should we do a weighted fit? # Standard is yes. If there are no weights # (self.fittype not set) then this value becomes False self.weightedfit=True def ApplyParameters(self): if self.interval is None: self.startcrop = self.endcrop = 0 else: [self.startcrop, self.endcrop] = self.interval # Get self.dataexp if self.startcrop == self.endcrop: self.dataexp = 1*self.dataexpfull self.startcrop = 0 self.endcrop = len(self.dataexpfull) else: self.dataexp = 1*self.dataexpfull[self.startcrop:self.endcrop] # If startcrop is larger than the lenght of dataexp, # We will not have an array. Prevent that. if len(self.dataexp) == 0: self.dataexp = 1*self.dataexpfull # Calculate x-values # (Extract tau-values from dataexp) self.x = self.dataexp[:, 0] # Experimental data self.data = self.dataexp[:,1] # Set fit parameters self.fitparms = np.zeros(sum(self.valuestofit)) index = 0 for i in np.arange(len(self.values)): if self.valuestofit[i]: self.fitparms[index] = np.float(self.values[i]) index = index + 1 # Assume we have a weighted fit. If this is not the case then # this is changed in the else statement of the following # "if"-statement: self.weightedfit=True if self.fittype[:6] == "spline": # Number of knots to use for spline try: knotnumber = int(self.fittype[6:]) except: print "Could not get knotnumber. Setting to 5." 
knotnumber = 5 # Number of neighbouring (left and right) points to include points = self.weights # Calculated dataweights datalen = len(self.dataexp[:,1]) dataweights = np.zeros(datalen) if self.startcrop < points: pmin = self.startcrop else: pmin = points if len(self.dataexpfull) - self.endcrop < points: pmax = (len(self.dataexpfull) - self.endcrop) else: pmax = points x = self.dataexpfull[self.startcrop-pmin:self.endcrop+pmax,0] xs = np.log10(x) y = self.dataexpfull[self.startcrop-pmin:self.endcrop+pmax,1] knots = np.linspace(xs[1], xs[-1], knotnumber+2)[1:-1] try: tck = spintp.splrep(xs,y,s=0,k=3,t=knots,task=-1) ys = spintp.splev(xs,tck,der=0) except: print "Could not find spline with "+str(knotnumber)+" knots." return if self.verbose == True: try: # If plotting module is available: name = "Spline fit: "+str(knotnumber)+" knots" plotting.savePlotSingle(name, 1*x, 1*y, 1*ys, dirname = ".", uselatex=self.uselatex) except: plt.xscale("log") plt.plot(x,ys, x,y) plt.show() ## Calculation of variance # In some cases, the actual cropping interval from self.startcrop to # self.endcrop is chosen, such that the dataweights must be # calculated from unknown datapoints. # (e.g. points+endcrop > len(dataexpfull) # We deal with this by multiplying dataweights with a factor # corresponding to the missed points. for i in np.arange(datalen): # Define start and end positions of the sections from # where we wish to calculate the dataweights. # Offset at beginning: if i + self.startcrop < points: # The offset that occurs offsetstart = points - i - self.startcrop offsetcrop = 0 elif self.startcrop > points: offsetstart = 0 offsetcrop = self.startcrop - points else: offsetstart = 0 offsetcrop = 0 # i: counter on dataexp array # start: counter on y array start = i - points + offsetstart + self.startcrop - offsetcrop end = start + 2*points + 1 - offsetstart dataweights[i] = (y[start:end] - ys[start:end]).std() # The standard deviation at the end and the start of the # array are multiplied by a factor corresponding to the # number of bins that were not used for calculation of the # standard deviation. if offsetstart != 0: reference = 2*points + 1 dividor = reference - offsetstart dataweights[i] *= reference/dividor # Do not substitute len(y[start:end]) with end-start! # It is not the same! backset = 2*points + 1 - len(y[start:end]) - offsetstart if backset != 0: reference = 2*points + 1 dividor = reference - backset dataweights[i] *= reference/dividor elif self.fittype == "model function": # Number of neighbouring (left and right) points to include points = self.weights if self.startcrop < points: pmin = self.startcrop else: pmin = points if len(self.dataexpfull) - self.endcrop < points: pmax = (len(self.dataexpfull) - self.endcrop) else: pmax = points x = self.dataexpfull[self.startcrop-pmin:self.endcrop+pmax,0] y = self.dataexpfull[self.startcrop-pmin:self.endcrop+pmax,1] # Calculated dataweights datalen = len(self.dataexp[:,1]) dataweights = np.zeros(datalen) for i in np.arange(datalen): # Define start and end positions of the sections from # where we wish to calculate the dataweights. 
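# For interior data points the section is simply
# [i - points, i + points + 1] in the coordinates of *y*, i.e.
# 2*points + 1 samples centred on data point i; the offset variables
# below handle windows that would reach beyond the available data.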
# Offset at beginning: if i + self.startcrop < points: # The offset that occurs offsetstart = points - i - self.startcrop offsetcrop = 0 elif self.startcrop > points: offsetstart = 0 offsetcrop = self.startcrop - points else: offsetstart = 0 offsetcrop = 0 # i: counter on dataexp array # start: counter on dataexpfull array start = i - points + offsetstart + self.startcrop - offsetcrop end = start + 2*points + 1 - offsetstart #start = self.startcrop - points + i #end = self.startcrop + points + i + 1 diff = y - self.function(self.values, x) dataweights[i] = diff[start:end].std() # The standard deviation at the end and the start of the # array are multiplied by a factor corresponding to the # number of bins that were not used for calculation of the # standard deviation. if offsetstart != 0: reference = 2*points + 1 dividor = reference - offsetstart dataweights[i] *= reference/dividor # Do not substitute len(diff[start:end]) with end-start! # It is not the same! backset = 2*points + 1 - len(diff[start:end]) - offsetstart if backset != 0: reference = 2*points + 1 dividor = reference - backset dataweights[i] *= reference/dividor elif self.fittype == "other": # This means that the user knows the dataweights and already # gave it to us. if self.external_deviations is not None: dataweights = \ self.external_deviations[self.startcrop:self.endcrop] else: raise ValueError, \ "self.external_deviations not set for fit type 'other'." else: # The fit.Fit() class will divide the function to minimize # by the dataweights only if we have weights self.weightedfit=False dataweights=None self.dataweights = dataweights def fit_function(self, parms, x): """ Create the function to be minimized via least squares. The old function *function* has more parameters than we need for the fitting. So we use this function to set only the necessary parameters. Returns what *function* would have done. """ # Reorder the needed variables from *spopt.leastsq* for *function*. index = 0 for i in np.arange(len(self.values)): if self.valuestofit[i]: self.values[i] = parms[index] index = index + 1 # Only allow physically correct parameters self.values = self.check_parms(self.values) tominimize = (self.function(self.values, x) - self.data) # Check if we have a weighted fit if self.weightedfit is True: # Check dataweights for zeros and don't use these # values for the least squares method. with np.errstate(divide='ignore'): tominimize = np.where(self.dataweights!=0, tominimize/self.dataweights, 0) ## There might be NaN values because of zero weights: #tominimize = tominimize[~np.isinf(tominimize)] return tominimize def get_chi_squared(self): # Calculate Chi**2 degrees_of_freedom = len(self.x) - len(self.parmoptim) - 1 return np.sum( (self.fit_function(self.parmoptim, self.x))**2) / \ degrees_of_freedom def least_square(self): """ This will minimize *self.fit_function()* using least squares. *self.values*: The values with which the function is called. *valuestofit*: A list with bool values that indicate which values should be used for fitting. Function *self.fit_function()* takes two parameters: self.fit_function(parms, x) where *x* are x-values of *dataexp*. """ if np.sum(self.valuestofit) == 0: print "No fitting parameters selected." 
self.valuesoptim = 1*self.values return # Begin fitting res = spopt.leastsq(self.fit_function, self.fitparms[:], args=(self.x,), full_output=1) (popt, pcov, infodict, errmsg, ier) = res self.parmoptim = popt if ier not in [1,2,3,4]: print "Optimal parameters not found: " + errmsg # Now write the optimal parameters to our values: index = 0 for i in np.arange(len(self.values)): if self.valuestofit[i]: self.values[i] = self.parmoptim[index] index = index + 1 # Only allow physically correct parameters self.values = self.check_parms(self.values) # Write optimal parameters back to this class. self.valuesoptim = 1*self.values # This is actually a redundant array self.chi = self.get_chi_squared() try: self.covar = pcov * self.chi # The covariance matrix except: print "PyCorrFit Warning: Error estimate not possible, because we" print " could not calculate covariance matrix. Please try" print " reducing the number of fitting parameters." self.parmoptim_error = None else: # Error estimation of fitted parameters if self.covar is not None: self.parmoptim_error = np.diag(self.covar) pycorrfit-0.8.1/src/tools/0000755000175000017500000000000012262516600014220 5ustar toortoorpycorrfit-0.8.1/src/tools/globalfit.py0000644000175000017500000003270012262516600016537 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module tools - globalfit Perform global fitting on pages which share parameters. Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see <http://www.gnu.org/licenses/>. """ import wx import numpy as np from scipy import optimize as spopt import misc import models as mdls # Menu entry name MENUINFO = ["&Global fitting", "Interconnect parameters from different measurements."] class GlobalFit(wx.Frame): # This tool is derived from a wx.frame. def __init__(self, parent): # Define a unique name that identifies this tool # Do not change this value. It is important for the Overlay tool # (selectcurves.py, *Wrapper_Tools*). self.MyName="GLOBALFIT" # parent is the main frame of PyCorrFit self.parent = parent # Get the window positioning correctly pos = self.parent.GetPosition() pos = (pos[0]+100, pos[1]+100) wx.Frame.__init__(self, parent=self.parent, title="Global fitting", pos=pos, style=wx.DEFAULT_FRAME_STYLE|wx.FRAME_FLOAT_ON_PARENT) ## MYID # This ID is given by the parent for an instance of this class self.MyID = None # Page - the currently active page of the notebook. self.Page = self.parent.notebook.GetCurrentPage() ## Content self.panel = wx.Panel(self) self.topSizer = wx.BoxSizer(wx.VERTICAL) textinit = """Fitting of multiple data sets with different models. Parameter names have to match. Select pages (e.g. 1,3-5,7), check parameters on each page and start 'Global fit'.
""" self.topSizer.Add(wx.StaticText(self.panel, label=textinit)) ## Page selection self.WXTextPages = wx.TextCtrl(self.panel, value="", size=(330,-1)) # Set initial value in text control pagenumlist = list() for i in np.arange(self.parent.notebook.GetPageCount()): Page = self.parent.notebook.GetPage(i) pagenumlist.append(int(filter(lambda x: x.isdigit(), Page.counter))) valstring=misc.parsePagenum2String(pagenumlist) self.WXTextPages.SetValue(valstring) self.topSizer.Add(self.WXTextPages) ## Weighted fitting # The weighted fit of the current page will be applied to # all other pages. self.weightedfitdrop = wx.ComboBox(self.panel) ## Bins from left and right: We also don't edit that. self.topSizer.Add(self.weightedfitdrop) ## Button btnfit = wx.Button(self.panel, wx.ID_ANY, 'Global fit') # Binds the button to the function - close the tool self.Bind(wx.EVT_BUTTON, self.OnFit, btnfit) self.topSizer.Add(btnfit) self.panel.SetSizer(self.topSizer) self.topSizer.Fit(self) self.SetMinSize(self.topSizer.GetMinSizeTuple()) self.OnPageChanged(self.Page) # Icon if parent.MainIcon is not None: wx.Frame.SetIcon(self, parent.MainIcon) self.Show(True) def fit_function(self, parms): """ *parms*: Parameters to fit, array needs: self.parmstofit - list (strings) of parameters to fit (corresponding to *parms*) self.PageData (dict with dict item = self.PageData["PageNumber"]): item["x"] item["data"] item["modelid"] item["values"] """ # The list containing arrays to be minimized minimize = list() for key in self.PageData.keys(): # Get the function item = self.PageData[key] modelid = item["modelid"] function = mdls.modeldict[modelid][3] values = self.PageData[key]["values"] # Set parameters for each function (Page) for i in np.arange(len(self.parmstofit)): p = self.parmstofit[i] labels = mdls.valuedict[modelid][0] if p in labels: index = labels.index(p) values[index] = parms[i] # Check parameters, if there is such a function check_parms = mdls.verification[modelid] values = check_parms(values) # Write parameters back? # self.PageData[key]["values"] = values # Calculate resulting correlation function # corr = function(item.values, item.x) # Subtract data. This is the function we want to minimize minimize.append( (function(values, item["x"]) - item["data"]) / item["dataweights"] ) # Flatten the list and make an array out of it. return np.array([item for sublist in minimize for item in sublist]) def OnClose(self, event=None): # This is a necessary function for PyCorrFit. # Do not change it. self.parent.toolmenu.Check(self.MyID, False) self.parent.ToolsOpen.__delitem__(self.MyID) self.Destroy() def OnFit(self, e=None): # process a string like this: "1,2,4-9,10" strFull = self.WXTextPages.GetValue() PageNumbers = misc.parseString2Pagenum(self, strFull) if PageNumbers is None: # Something went wrong and parseString2Pagenum already displayed # an error message. return ## Get the corresponding pages, if they exist: self.PageData = dict() self.parmstofit = list() fitparms = list() for i in np.arange(self.parent.notebook.GetPageCount()): Page = self.parent.notebook.GetPage(i) j = filter(lambda x: x.isdigit(), Page.counter) if int(j) in PageNumbers: dataset = dict() try: dataset["x"] = Page.dataexp[:,0] dataset["data"] = Page.dataexp[:,1] except: print "No experimental data in page #"+j+"!" 
else: dataset["modelid"] = Page.modelid Page.apply_parameters() dataset["values"] = Page.active_parms[1] # Get weights weighttype = self.weightedfitdrop.GetSelection() Page.Fitbox[1].SetSelection(weighttype) weightname = self.weightedfitdrop.GetValue() setweightname = Page.Fitbox[1].GetValue() if setweightname.count(weightname) == 0: print "Page "+Page.counter+" has no fitting type '"+ \ weightname+"'!" Page.Fit_WeightedFitCheck() Fitting = Page.Fit_create_instance(noplots=True) if Fitting.dataweights is None: dataset["dataweights"] = 1. else: dataset["dataweights"] = Fitting.dataweights self.PageData[int(j)] = dataset # Get the parameters to fit from that page labels = Page.active_parms[0] parms = 1*Page.active_parms[1] tofit = 1*Page.active_parms[2] for i in np.arange(len(labels)): if tofit[i]: if self.parmstofit.count(labels[i]) == 0: self.parmstofit.append(labels[i]) fitparms.append(parms[i]) fitparms = np.array(fitparms) # Now we can perform the least squares fit if len(fitparms) == 0: return res = spopt.leastsq(self.fit_function, fitparms[:], full_output=1) (popt, pcov, infodict, errmsg, ier) = res #self.parmoptim, self.mesg = spopt.leastsq(self.fit_function, # fitparms[:]) self.parmoptim = res[0] # So we have the optimal parameters. # We would like to give each page a chi**2 and its parameters back: # Create a clean list of PageNumbers # UsedPages = dict.fromkeys(PageNumbers).keys() UsedPages = self.PageData.keys() UsedPages.sort() for key in UsedPages: # Get the Page: for i in np.arange(self.parent.notebook.GetPageCount()): aPage = self.parent.notebook.GetPage(i) j = filter(lambda x: x.isdigit(), aPage.counter) if int(j) == int(key): Page = aPage Page.GlobalParameterShare = UsedPages # Get the function item = self.PageData[key] modelid = item["modelid"] function = mdls.modeldict[modelid][3] values = 1*Page.active_parms[1] # Set parameters for each Page) for i in np.arange(len(self.parmstofit)): p = self.parmstofit[i] labels = mdls.valuedict[modelid][0] if p in labels: index = labels.index(p) values[index] = self.parmoptim[i] Page.active_parms[2][index] = True # Check parameters, if there is such a function check_parms = mdls.verification[modelid] values = check_parms(values) # Write parameters back? Page.active_parms[1] = 1*values # Calculate resulting correlation function # corr = function(item.values, item.x) # Subtract data. This is the function we want to minimize residual = function(values, item["x"]) - item["data"] # Calculate chi**2 # Set the parameter error estimates for all pages minimized = self.fit_function(self.parmoptim) degrees_of_freedom = len(minimized) - len(self.parmoptim) - 1 self.chi = Page.chi2 = np.sum((minimized)**2) / degrees_of_freedom try: self.covar = pcov * self.chi except: self.parmoptim_error = None else: parmoptim_error = list() if self.covar is not None: self.parmoptim_error = np.diag(self.covar) p_error = self.parmoptim_error if p_error is None: Page.parmoptim_error = None else: Page.parmoptim_error = dict() for i in np.arange(len(p_error)): Page.parmoptim_error[self.parmstofit[i]] = p_error[i] Page.apply_parameters_reverse() # Because we are plotting the weights, we need to update # the corresponfing info in each page: weightid = self.weightedfitdrop.GetSelection() if weightid != 0: # We have weights. # We need the following information for correct plotting. 
Page.weighted_fit_was_performed = True Page.weights_used_for_fitting = Fitting.dataweights Page.calculate_corr() Page.data4weight = 1.*Page.datacorr Page.PlotAll() def OnPageChanged(self, page): # When parent changes # This is a necessary function for PyCorrFit. # This is stuff that should be done when the active page # of the notebook changes. if self.parent.notebook.GetPageCount() == 0: self.panel.Disable() return self.panel.Enable() self.Page = page if self.Page is not None: weightlist = self.Page.Fitbox[1].GetItems() # Do not display knot number for spline. May be different for each page. # Remove everything after a "(" in the weightlist string. # This way, e.g. the list does not show the knotnumber, which # we don't use anyhow. # We are doing this for all elements, because in the future, other (?) # weighting methods might be implemented. #for i in np.arange(len(weightlist)): # weightlist[i] = weightlist[i].split("(")[0].strip() weightlist[1] = weightlist[1].split("(")[0].strip() self.weightedfitdrop.SetItems(weightlist) try: # if there is no data, this could go wrong self.Page.Fit_create_instance(noplots=True) FitTypeSelection = self.Page.Fitbox[1].GetSelection() except: FitTypeSelection = 0 self.weightedfitdrop.SetSelection(FitTypeSelection) ## Knotnumber: we don't want to interfere # The user might want to edit the knotnumbers. # self.FitKnots = Page.FitKnots # 5 by default def SetPageNumbers(self, pagestring): self.WXTextPages.SetValue(pagestring) pycorrfit-0.8.1/src/tools/statistics.py0000644000175000017500000005552612262516600017001 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module tools - statistics Provide the user with tab-separated statistics of their curves. Values are sorted according to the page number. Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import wx import wx.lib.plot as plot # Plotting in wxPython import numpy as np from info import InfoClass import misc # Menu entry name MENUINFO = ["&Statistics view", "Show some session statistics."] def run_once(f): def wrapper(*args, **kwargs): if not wrapper.has_run: wrapper.has_run = True return f(*args, **kwargs) wrapper.has_run = False return wrapper class Stat(wx.Frame): # This tool is derived from a wx.frame. 
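# Usage sketch for the run_once decorator defined above (the function
# name below is invented for illustration): the wrapped body executes
# on the first call only; every later call returns None.
@run_once
def greet_user():
    print "This line is printed only once."

greet_user()  # executes and prints
greet_user()  # silently does nothing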
def __init__(self, parent): self.MyName="STATISTICS" # parent is the main frame of PyCorrFit self.boxsizerlist = list() self.parent = parent # Get the window positioning correctly pos = self.parent.GetPosition() pos = (pos[0]+100, pos[1]+100) wx.Frame.__init__(self, parent=self.parent, title="Statistics", pos=pos, style=wx.DEFAULT_FRAME_STYLE|wx.FRAME_FLOAT_ON_PARENT) ## MYID # This ID is given by the parent for an instance of this class self.MyID = None self.MyName = "STATISTICS" # List of parameters that are plotted or not self.PlotParms = list(["None", 0]) # Page - the currently active page of the notebook. self.Page = self.parent.notebook.GetCurrentPage() # Pagenumbers self.PageNumbers = np.arange(self.parent.notebook.GetPageCount()) ## Splitter window. left side: checkboxes ## right side: plot with parameters self.sp = wx.SplitterWindow(self, style=wx.SP_3DSASH) # This is necessary to prevent "Unsplit" of the SplitterWindow: self.sp.SetMinimumPaneSize(1) ## Content # We will display a dialog that conains all the settings # - Which model we want statistics on # - What kind of parameters should be printed # (We will get the parameters from the current page) # If on another page, the parameter is not available, # do not make a mess out of it. # Then the user presses a button and sees/saves the table # with all the info. self.panel = wx.Panel(self.sp) # Parameter settings. if self.parent.notebook.GetPageCount() != 0: self.InfoClass = InfoClass(CurPage=self.Page) else: self.panel.Disable() # A dropdown menu for the source Page: text = wx.StaticText(self.panel, label="Create a table with all the selected\n"+ "variables below from pages with the\n"+ "same model as the current page.") ## Page selection as in average tool Pagetext = wx.StaticText(self.panel, label="Curves ") Psize = text.GetSize()[0] - Pagetext.GetSize()[0] self.WXTextPages = wx.TextCtrl(self.panel, value="", size=(Psize,-1)) # Set number of pages pagenumlist = list() for i in np.arange(self.parent.notebook.GetPageCount()): Page = self.parent.notebook.GetPage(i) pagenumlist.append(int(filter(lambda x: x.isdigit(), Page.counter))) valstring=misc.parsePagenum2String(pagenumlist) self.WXTextPages.SetValue(valstring) ## Plot parameter dropdown box self.PlotParms = self.GetListOfPlottableParms() Parmlist = self.PlotParms DDtext = wx.StaticText(self.panel, label="Plot parameter ") DDsize = text.GetSize()[0] - DDtext.GetSize()[0] self.WXDropdown = wx.ComboBox(self.panel, -1, "", size=(DDsize,-1), choices=Parmlist, style=wx.CB_DROPDOWN|wx.CB_READONLY) self.Bind(wx.EVT_COMBOBOX, self.OnDropDown, self.WXDropdown) self.Bind(wx.EVT_TEXT, self.OnDropDown, self.WXTextPages) self.WXDropdown.SetSelection(0) # Create space for parameters self.box = wx.StaticBox(self.panel, label="Export parameters") self.masterboxsizer = wx.StaticBoxSizer(self.box, wx.VERTICAL) self.masterboxsizer.Add(text) self.boxsizer = wx.BoxSizer(wx.HORIZONTAL) self.masterboxsizer.Add(self.boxsizer) self.Checkboxes = list() self.Checklabels = list() if self.parent.notebook.GetPageCount() != 0: self.OnChooseValues() self.btnSave = wx.Button(self.panel, wx.ID_ANY, 'Save') self.Bind(wx.EVT_BUTTON, self.OnSaveTable, self.btnSave) # Add elements to sizer self.topSizer = wx.BoxSizer(wx.VERTICAL) #self.topSizer.Add(text) Psizer = wx.BoxSizer(wx.HORIZONTAL) Psizer.Add(Pagetext) Psizer.Add(self.WXTextPages) DDsizer = wx.BoxSizer(wx.HORIZONTAL) DDsizer.Add(DDtext) DDsizer.Add(self.WXDropdown) self.topSizer.Add(Psizer) self.topSizer.Add(DDsizer) 
self.topSizer.Add(self.masterboxsizer) self.topSizer.Add(self.btnSave) # Set size of window self.panel.SetSizer(self.topSizer) self.topSizer.Fit(self) (px, py) = self.topSizer.GetMinSizeTuple() ## Plotting panel self.canvas = plot.PlotCanvas(self.sp) self.sp.SplitVertically(self.panel, self.canvas, px+5) self.SetMinSize((px+400, py)) ## Icon if parent.MainIcon is not None: wx.Frame.SetIcon(self, parent.MainIcon) self.Show(True) self.OnDropDown() def GetListOfAllParameters(self, e=None, return_std_checked=False): """ Returns sorted list of parameters. If return_std_checked is True, then a second list with standart checked parameters is returned. """ self.InfoClass.CurPage = self.Page # Now that we know our Page, we may change the available # parameter options. Infodict = self.InfoClass.GetCurInfo() # We want to sort the information and have some prechecked values # in the statistics window afterwards. # new iteration keys = Infodict.keys() body = list() tail = list() for key in keys: # "title" - filename/title first if key == "title": for item in Infodict[key]: if len(item) == 2: if item[0] == "filename/title": headtitle = [item] else: tail.append(item) # "title" - filename/title first elif key == "parameters": headparm = list() bodyparm = list() for parm in Infodict[key]: parminlist = False try: for fitp in Infodict["fitting"]: parmname = parm[0] errname = "Err "+parmname if fitp[0] == errname: headparm.append(parm) parminlist = True headparm.append(fitp) except: # Maybe there was not fit... pass if parminlist == False: bodyparm.append(parm) elif key == "fitting": for fitp in Infodict[key]: # We added the error data before in the parameter section if str(fitp[0])[0:4] != "Err ": tail.append(fitp) elif key == "supplement": body += Infodict[key] # Append all other items elif key == "background": body += Infodict[key] else: for item in Infodict[key]: if item is not None and len(item) == 2: tail.append(item) # Bring lists together head = headtitle + headparm body = bodyparm + body Info = head + body + tail # List of default checked parameters: checked = np.zeros(len(Info), dtype=np.bool) checked[:len(head)] = True # A list with additional strings that should be default checked if found # somewhere in the data. checklist = ["cpp", "duration", "bg rate"] for i in range(len(Info)): item = Info[i] for checkitem in checklist: if item[0].count(checkitem): checked[i] = True if return_std_checked: return Info, checked else: return Info def GetListOfPlottableParms(self, e=None, return_values=False): """ Returns sorted list of parameters that can be plotted. (This means that the values are convertable to floats) If return_values is True, then a second list with the corresponding values is returned. """ if self.parent.notebook.GetPageCount() != 0: #Info = self.InfoClass.GetPageInfo(self.Page) Info = self.GetListOfAllParameters() #keys = Info.keys() #keys.sort() parmlist = list() parmvals = list() for item in Info: if item is not None and len(item) == 2: try: val = float(item[1]) except: pass else: # save the key so we can find the parameter later parmlist.append(item[0]) parmvals.append(val) else: parmlist = [""] parmvals = [0] if return_values: return parmlist, parmvals else: return parmlist def GetWantedParameters(self): strFull = self.WXTextPages.GetValue() PageNumbers = misc.parseString2Pagenum(self, strFull) # Get the wanted parameters from the selection. 
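# Minimal sketch (invented helper, not in statistics.py) of the float
# filter behind GetListOfPlottableParms above: an entry qualifies for
# plotting only if its value converts cleanly to a float.
def plottable(info_items):
    # info_items: iterable of [label, value] pairs
    parmlist, parmvals = list(), list()
    for label, value in info_items:
        try:
            val = float(value)
        except (TypeError, ValueError):
            continue              # e.g. the filename/title entry
        parmlist.append(label)
        parmvals.append(val)
    return parmlist, parmvals

# plottable([["n", "1.7"], ["filename/title", "a.csv"]]) -> (["n"], [1.7])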
checked = list() for i in np.arange(len(self.Checkboxes)): if self.Checkboxes[i].IsChecked() == True: checked.append(self.Checklabels[i]) # Collect all the relevant pages pages = list() for i in np.arange(self.parent.notebook.GetPageCount()): Page = self.parent.notebook.GetPage(i) if Page.modelid == self.Page.modelid: # Only pages with same modelid if int(Page.counter.strip("#: ")) in PageNumbers: # Only pages selected in self.WXTextPages pages.append(Page) self.InfoClass.Pagelist = pages AllInfo = self.InfoClass.GetAllInfo() self.SaveInfo = list() # Some nasty iteration through the dictionaries. # Collect all checked variables. pagekeys = AllInfo.keys() # If pagenumber is larger than 10, # pagekeys.sort will not work, because we have strings # Define new compare function cmp_func = lambda a,b: cmp(int(a.strip().strip("#")), int(b.strip().strip("#"))) pagekeys.sort(cmp=cmp_func) #for Info in pagekeys: # pageinfo = list() # for item in AllInfo[Info]: # for subitem in AllInfo[Info][item]: # if len(subitem) == 2: # for label in checked: # if label == subitem[0]: # pageinfo.append(subitem) # # We want to replace the above iteration with an iteration that # covers missing values. This means checking for "label == subitem[0]" # and iteration over AllInfo with that consition. for Info in pagekeys: pageinfo = list() for label in checked: label_in_there = False for item in AllInfo[Info]: for subitem in AllInfo[Info][item]: if subitem is not None and len(subitem) == 2: if label == subitem[0]: label_in_there = True pageinfo.append(subitem) if label_in_there == False: # No data available pageinfo.append([label, "NaN"]) self.SaveInfo.append(pageinfo) def OnCheckboxChecked(self, e="restore"): """ Write boolean data of checked checkboxes to Page variable *StatisticsCheckboxes*. If e=="restore", then we will attempt to get the info back from the page. """ # What happens if a checkbox has been checked? # We write the data to the Page (it will not be saved in the session). if e=="restore": checklist = self.Page.StatisticsCheckboxes if checklist is not None: if len(checklist) <= len(self.Checkboxes): for i in np.arange(len(checklist)): self.Checkboxes[i].SetValue(checklist[i]) else: checklist = list() for cb in self.Checkboxes: checklist.append(cb.GetValue()) self.Page.StatisticsCheckboxes = checklist def OnChooseValues(self, event=None): Info, checked = self.GetListOfAllParameters(return_std_checked=True) #headcounter = 0 #headlen = len(head) # We will sort the checkboxes in more than one column if there # are more than *maxitemsincolumn* maxitemsincolumn = np.float(25) Sizernumber = int(np.ceil(len(Info)/maxitemsincolumn)) self.boxsizerlist = list() for i in np.arange(Sizernumber): self.boxsizerlist.append(wx.BoxSizer(wx.VERTICAL)) # Start at -1 so the indexes will start at 0 (see below). 
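# Sorting sketch for the page keys above (invented data): counters are
# strings such as "# 10", so a plain sort() orders them
# lexicographically ("# 10" before "# 2"). The cmp= idiom used above is
# Python 2 only; the equivalent key= form shown here works as well.
pagekeys = ["# 10", "# 2", "# 1"]
pagekeys.sort(key=lambda a: int(a.strip().strip("#")))
# -> ["# 1", "# 2", "# 10"]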
#itemcount = -1 for i in range(len(Info)): #itemcount += 1 #headcounter += 1 checkbox = wx.CheckBox(self.panel, label=Info[i][0]) #if headcounter <= headlen: # checkbox.SetValue(True) # Additionally default checked items #for checkitem in checklist: # if item[0].count(checkitem): # checkbox.SetValue(True) checkbox.SetValue(checked[i]) # Add checkbox to column sizers sizern = int(np.floor(i/maxitemsincolumn)) self.boxsizerlist[sizern].Add(checkbox) self.Checkboxes.append(checkbox) self.Checklabels.append(Info[i][0]) self.Bind(wx.EVT_CHECKBOX, self.OnCheckboxChecked, checkbox) # Add sizers to boxsizer for sizer in self.boxsizerlist: self.boxsizer.Add(sizer) self.OnCheckboxChecked("restore") self.AllPlotParms = Info def OnClose(self, event=None): # This is a necessary function for PyCorrFit. # Do not change it. self.parent.toolmenu.Check(self.MyID, False) self.parent.ToolsOpen.__delitem__(self.MyID) self.Destroy() def OnDropDown(self, e=None): """ Plot the parameter selected in WXDropdown Uses info stored in self.PlotParms and self.InfoClass """ if self.parent.notebook.GetPageCount() == 0 or self.Page is None: self.canvas.Clear() return # Get valid pages strFull = self.WXTextPages.GetValue() try: PageNumbers = misc.parseString2Pagenum(self, strFull, nodialog=True) except: PageNumbers = self.PageNumbers else: self.PageNumbers = PageNumbers # Get plot parameters DDselid = self.WXDropdown.GetSelection() #[label, key] = self.PlotParms[DDselid] label = self.PlotParms[DDselid] # Get potential pages pages = list() for i in np.arange(self.parent.notebook.GetPageCount()): Page = self.parent.notebook.GetPage(i) if Page.modelid == self.Page.modelid: # Only pages with same modelid if int(Page.counter.strip("#: ")) in PageNumbers: # Only pages selected in self.WXTextPages pages.append(Page) plotcurve = list() for page in pages: self.Page = page pllabel, pldata = self.GetListOfPlottableParms(return_values=True) # Get the labels and make a plot of the parameters if len(pllabel)-1 >= DDselid and pllabel[DDselid] == label: x = int(page.counter.strip("#: ")) y = pldata[DDselid] plotcurve.append([x,y]) else: # try to get the label by searching for the first instance for k in range(len(pllabel)): if pllabel[k] == label: x = int(page.counter.strip("#: ")) y = pldata[k] plotcurve.append([x,y]) # Prepare plotting self.canvas.Clear() linesig = plot.PolyMarker(plotcurve, size=1.5, fillstyle=wx.TRANSPARENT, marker='circle') plotlist = [linesig] # average line try: avg = np.average(np.array(plotcurve)[:,1]) maxpage = np.max(np.array(plotcurve)[:,0]) except: maxpage = 0 else: plotavg = [[0, avg], [maxpage, avg]] lineclear = plot.PolyLine(plotavg, colour="black", style= wx.SHORT_DASH) plotlist.append(lineclear) # Draw self.canvas.Draw(plot.PlotGraphics(plotlist, xLabel='page number', yLabel=label)) # Correctly set x-axis minticks = 2 self.canvas.SetXSpec(max(maxpage, minticks)) # Zoom out such that we can see the end of all curves try: xcenter = np.average(np.array(plotcurve)[:,0]) ycenter = np.average(np.array(plotcurve)[:,1]) scale = 1.1 self.canvas.Zoom((xcenter,ycenter), (scale, scale)) except: pass # Redraw result self.canvas.Redraw() def OnPageChanged(self, page): # When parent changes # This is a necessary function for PyCorrFit. # This is stuff that should be done when the active page # of the notebook changes. 
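# Sketch of the average-line construction in OnDropDown above
# (standalone form with assumed input): the mean of all plotted values
# becomes a horizontal dashed line from page 0 to the last page.
import numpy as np

def average_line(plotcurve):
    # plotcurve: list of [page number, parameter value] pairs
    arr = np.array(plotcurve)
    avg = np.average(arr[:, 1])
    maxpage = np.max(arr[:, 0])
    return [[0, avg], [maxpage, avg]]

# average_line([[1, 2.0], [2, 4.0]]) -> [[0, 3.0], [2, 3.0]]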
# # Prevent this function to be run twice at once: # oldsize = self.GetSizeTuple() if self.WXTextPages.GetValue() == "": # Set number of pages pagenumlist = list() for i in np.arange(self.parent.notebook.GetPageCount()): Page = self.parent.notebook.GetPage(i) pagenumlist.append(int(filter(lambda x: x.isdigit(), Page.counter))) valstring=misc.parsePagenum2String(pagenumlist) self.WXTextPages.SetValue(valstring) DDselection = self.WXDropdown.GetValue() self.Page = page self.InfoClass = InfoClass(CurPage=self.Page) self.PlotParms = self.GetListOfPlottableParms() # Make sure the selection stays the same DDselid = 0 for i in range(len(self.PlotParms)): if DDselection == self.PlotParms[i]: DDselid = i Parmlist = self.PlotParms self.WXDropdown.SetItems(Parmlist) self.WXDropdown.SetSelection(DDselid) self.panel.Enable() for i in np.arange(len(self.Checkboxes)): self.Checkboxes[i].Destroy() del self.Checkboxes #self.Checklabels[i].Destroy() # those cannot be destroyed. for i in np.arange(len(self.boxsizerlist)): self.boxsizer.Remove(0) self.boxsizer.Layout() self.boxsizerlist = list() self.Checkboxes = list() self.Checklabels = list() # Disable if there are no pages left if self.parent.notebook.GetPageCount() == 0: self.panel.Disable() self.canvas.Clear() return self.OnChooseValues() self.boxsizer.Layout() self.topSizer.Fit(self) (ax, ay) = self.GetSizeTuple() (px, py) = self.topSizer.GetMinSizeTuple() self.sp.SetSashPosition(px+5) self.SetSize((np.max([px+400,ax,oldsize[0]]), np.max([py,ay,oldsize[1]]))) self.SetMinSize((px+400, py)) # Replot self.OnDropDown() def OnSaveTable(self, event=None): dirname = self.parent.dirname dlg = wx.FileDialog(self.parent, "Choose file to save", dirname, "", "Text file (*.txt)|*.txt;*.TXT", wx.SAVE|wx.FD_OVERWRITE_PROMPT) # user cannot do anything until he clicks "OK" if dlg.ShowModal() == wx.ID_OK: filename = dlg.GetPath() if filename.lower().endswith(".txt") is not True: filename = filename+".txt" dirname = dlg.GetDirectory() openedfile = open(filename, 'wb') # Get Parameterlist of all Pages with same model id as # Self.Page # This creates self.SaveInfo: self.GetWantedParameters() # Write header linestring = "" for atuple in self.SaveInfo[0]: linestring += str(atuple[0])+"\t" # remove trailing "\t" openedfile.write(linestring.strip()+"\r\n") # Write data for item in self.SaveInfo: linestring = "" for btuple in item: linestring += str(btuple[1])+"\t" openedfile.write(linestring.strip()+"\r\n") openedfile.close() else: dirname = dlg.GetDirectory() dlg.Destroy() # Give parent the current dirname self.parent.dirname = dirname def SetPageNumbers(self, pagestring): self.WXTextPages.SetValue(pagestring) pycorrfit-0.8.1/src/tools/datarange.py0000644000175000017500000002441612262516600016527 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module tools - channels Let the user choose time domains. Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import wx import numpy as np # Menu entry name MENUINFO = ["&Data range", "Select an interval of lag times to be used for fitting."] class SelectChannels(wx.Frame): def __init__(self, parent): # parent is main frame self.parent = parent # Get the window positioning correctly pos = self.parent.GetPosition() pos = (pos[0]+100, pos[1]+100) wx.Frame.__init__(self, parent=self.parent, title="Data range selection", pos=pos, style=wx.DEFAULT_FRAME_STYLE|wx.FRAME_FLOAT_ON_PARENT) ## MYID # This ID is given by the parent for an instance of this class self.MyID = None ## Start drawing panel = wx.Panel(self) self.panel = panel # Page self.Page = self.parent.notebook.GetCurrentPage() self.Calc_init(self.Page) text1 = wx.StaticText(panel, label=u"The lag times τ are stored as an "+ u"array of length ") self.textend = wx.StaticText(panel, label="%d." % self.lentau) text2 = wx.StaticText(panel, label=u"You may wish to confine this array. "+ u"This can be done here.") ##Spincontrols: FlexSpinSizer = wx.FlexGridSizer(rows=2, cols=4, vgap=5, hgap=5) FlexSpinSizer.Add(wx.StaticText(panel, label="Channels:")) self.spinstart = wx.SpinCtrl(panel, -1, initial=self.left, min=self.start0, max=self.end0-1) FlexSpinSizer.Add(self.spinstart) FlexSpinSizer.Add(wx.StaticText(panel, label=" - ")) self.spinend = wx.SpinCtrl(panel, -1, initial=self.right, min=self.start0+1, max=self.end0) FlexSpinSizer.Add(self.spinend) FlexSpinSizer.Add(wx.StaticText(panel, label="Times [ms]:")) self.TextTimesStart = wx.StaticText(panel, label="None") FlexSpinSizer.Add(self.TextTimesStart) FlexSpinSizer.Add(wx.StaticText(panel, label=" - ")) self.TextTimesEnd = wx.StaticText(panel, label="None") FlexSpinSizer.Add(self.TextTimesEnd) # Buttons btnapply = wx.Button(panel, wx.ID_ANY, 'Apply') btnapplyall = wx.Button(panel, wx.ID_ANY, 'Apply to all pages') self.ButtonApply = btnapply self.ButtonApplyAll = btnapplyall self.Bind(wx.EVT_BUTTON, self.OnApply, btnapply) self.Bind(wx.EVT_BUTTON, self.OnApplyAll, btnapplyall) self.Bind(wx.EVT_SPINCTRL, self.OnChangeChannels, self.spinend) self.Bind(wx.EVT_SPINCTRL, self.OnChangeChannels, self.spinstart) # Checkbox self.fixcheck = wx.CheckBox(panel, -1, label="Fix current channel selection for all pages.") self.Bind(wx.EVT_CHECKBOX, self.OnCheckbox, self.fixcheck) # Text channelsel = "Leave this window open for a fixed selection." text3 = wx.StaticText(panel, label=channelsel) # Sizer topSizer = wx.BoxSizer(wx.VERTICAL) buttonsizer = wx.BoxSizer(wx.HORIZONTAL) buttonsizer.Add(btnapply, 1) buttonsizer.Add(btnapplyall, 1) text1sizer = wx.BoxSizer(wx.HORIZONTAL) text1sizer.Add(text1) text1sizer.Add(self.textend) topSizer.Add(text1sizer) topSizer.Add(text2) topSizer.AddSpacer(5) topSizer.Add(FlexSpinSizer) topSizer.Add(self.fixcheck) topSizer.Add(text3) topSizer.AddSpacer(5) topSizer.Add(buttonsizer) panel.SetSizer(topSizer) topSizer.Fit(self) self.SetMinSize(topSizer.GetMinSizeTuple()) # Get times. self.OnChangeChannels() #Icon if parent.MainIcon is not None: wx.Frame.SetIcon(self, parent.MainIcon) # Show window self.Show(True) self.OnPageChanged(self.Page) def Calc_init(self, parent): ## Variables # Parent should be the fitting panel - # The tab, where the fitting is done. 
self.Page = parent if self.Page == None: # dummy info taufull = np.arange(100) self.left = self.right = None self.panel.Disable() else: self.left = self.Page.startcrop # starting position self.right = self.Page.endcrop # ending position if self.Page.dataexpfull is not None: taufull = self.Page.dataexpfull[:,0] else: # then we only have tau taufull = self.Page.taufull self.lentau = len(taufull) self.start0 = 0 # left border of interval # The interval starts at 0! self.end0 = self.lentau - 1 # right border of interval if self.left is None or self.left > self.end0: # This means, that either left = right = None # or the dataexp-array is too small self.left = self.start0 if self.right is None: # set the maximum possible value self.right = self.end0 else: self.right -=1 def OnApply(self, event=None): self.SetValues() self.Page.PlotAll() def OnApplyAll(self, event=None): start = self.spinstart.GetValue() end = self.spinend.GetValue() + 1 # +1, [sic] if start > end: # swap the variables, we are not angry at the user start, end = end, start # Get all the Pages N = self.parent.notebook.GetPageCount() for i in np.arange(N): # Set Page Page = self.parent.notebook.GetPage(i) # Find out maximal length if Page.dataexpfull is not None: maxlen = len(Page.dataexpfull[:,0]) else: # then we only have tau maxlen = len(Page.taufull) # Use the smaller one of both, so we do not get an # index out of bounds error Page.endcrop = min(end, maxlen) Page.startcrop = start*(start < maxlen - 1 ) Page.PlotAll() # Page.PlorAll() calls this function. This results in the wrong data # being displayed in an open "View Info" Window. We call it again. self.parent.OnFNBPageChanged() def OnChangeTimes(self, e=None): """ Called, whenever data range in seconds is changed. This updates the data range in channels in the window. This function might be used in later versions of PyCorrFit. """ pass def OnChangeChannels(self, e=None): """ Called, whenever data range in channels is changed. This updates the data range in seconds in the window. """ if self.Page == None: return N = len(self.Page.taufull) start = self.spinstart.Value end = self.spinend.Value # If the initial boundaries are outside of the experimental # data array of length N, change the start and end variables. start = start*(start < N-2) end = min(end, N-1) t1 = 1.*self.Page.taufull[start] t2 = 1.*self.Page.taufull[end] self.TextTimesStart.SetLabel("%.4e" % t1) self.TextTimesEnd.SetLabel("%.4e" % t2) self.OnCheckbox() def OnCheckbox(self, event=None): """ Set the correct value in the spincontrol, if the checkbox is not checked. """ state = self.fixcheck.GetValue() if state == True: self.OnApplyAll() self.ButtonApply.Disable() self.ButtonApplyAll.Disable() else: self.ButtonApply.Enable() self.ButtonApplyAll.Enable() #self.OnPageChanged(self.Page) def OnClose(self, event=None): self.parent.toolmenu.Check(self.MyID, False) self.parent.ToolsOpen.__delitem__(self.MyID) self.Destroy() def OnPageChanged(self, page): # We do not need the *Range* Commands here yet. # We open and close the SelectChannelsFrame every time we # import some data. # # Check if we have a fixed channel selection if self.parent.notebook.GetPageCount() == 0: self.panel.Disable() else: self.panel.Enable() # There is a page. We may continue. 
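# Clamping sketch (standalone form of the bounds logic in OnApplyAll
# above): the selection is forced into each page's valid index range so
# shorter curves cannot raise an IndexError. The expression
# start*(start < maxlen-1) resets start to 0 whenever it would fall
# outside the array.
def clamp_crop(start, end, maxlen):
    if start > end:
        start, end = end, start   # tolerate swapped input
    end = min(end, maxlen)
    start = start * (start < maxlen - 1)
    return start, end

# clamp_crop(40, 60, 30) -> (0, 30)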
state = self.fixcheck.GetValue() if state == True: # We do not need to run Calc_init self.Page = page self.SetValues() self.Page.PlotAll(event="init") else: # We will run it self.Calc_init(page) self.spinstart.SetRange(self.start0, self.end0-1) self.spinstart.SetValue(self.left) self.spinend.SetRange(self.start0+1, self.end0) self.spinend.SetValue(self.right) self.textend.SetLabel("%d." % self.lentau) self.OnChangeChannels() def SetValues(self): start = self.spinstart.GetValue() end = self.spinend.GetValue() if start > end: # swap the variables, we are not angry at the user start, end = end, start self.Page.startcrop = start self.Page.endcrop = end + 1 # +1, because arrays are accessed like this pycorrfit-0.8.1/src/tools/trace.py0000644000175000017500000000773312262516600015702 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module tools - trace Show the trace of a file. Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import wx import wx.lib.plot as plot # Menu entry name MENUINFO = ["&Trace view", "Show the trace of an opened file."] class ShowTrace(wx.Frame): def __init__(self, parent): # parent is main frame self.parent = parent # Get the window positioning correctly pos = self.parent.GetPosition() pos = (pos[0]+100, pos[1]+100) wx.Frame.__init__(self, parent=self.parent, title="Trace view", pos=pos, style=wx.DEFAULT_FRAME_STYLE|wx.FRAME_FLOAT_ON_PARENT) ## MYID # This ID is given by the parent for an instance of this class self.MyID = None # Page self.Page = self.parent.notebook.GetCurrentPage() ## Canvas self.canvas = plot.PlotCanvas(self) self.canvas.SetEnableZoom(True) if self.parent.notebook.GetPageCount() == 0: # We do not need to disable anything here. user input. pass else: self.OnDraw() initial_size = (350,150) self.SetSize(initial_size) self.SetMinSize(initial_size) ## Icon if parent.MainIcon is not None: wx.Frame.SetIcon(self, parent.MainIcon) self.Show(True) def OnClose(self, event=None): self.parent.toolmenu.Check(self.MyID, False) self.parent.ToolsOpen.__delitem__(self.MyID) self.Destroy() def OnDraw(self): if self.Page.trace is not None: self.trace = 1*self.Page.trace # We want to have the trace in [s] here. 
self.trace[:,0] = self.trace[:,0]/1000 line = plot.PolyLine(self.trace, legend='', colour='blue', width=1) lines = [line] self.canvas.SetEnableLegend(False) elif self.Page.tracecc is not None: # This means that we have two (CC) traces to plot self.tracea = 1*self.Page.tracecc[0] self.tracea[:,0] = self.tracea[:,0]/1000 self.traceb = 1*self.Page.tracecc[1] self.traceb[:,0] = self.traceb[:,0]/1000 linea = plot.PolyLine(self.tracea, legend='channel 1', colour='blue', width=1) lineb = plot.PolyLine(self.traceb, legend='channel 2', colour='red', width=1) lines = [linea, lineb] self.canvas.SetEnableLegend(True) else: self.canvas.Clear() return # Plot lines self.canvas.Draw(plot.PlotGraphics(lines, xLabel='time [s]', yLabel='count rate [kHz]')) def OnPageChanged(self, page=None): self.Page = page # When parent changes if self.parent.notebook.GetPageCount() == 0: # Nothing to do try: self.canvas.Clear() except: pass return self.OnDraw() pycorrfit-0.8.1/src/tools/simulation.py0000644000175000017500000004461012262516600016763 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module tools - simulation Enables the user to change plotting parameters and replotting fast. Might be useful for better understanding. Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import wx import numpy as np import edclasses # edited floatspin import models as mdls # Menu entry name MENUINFO = ["S&lider simulation", "Fast plotting for different parameters."] class Slide(wx.Frame): # This tool is derived from a wx.frame. def __init__(self, parent): # parent is the main frame of PyCorrFit self.parent = parent # Get the window positioning correctly pos = self.parent.GetPosition() pos = (pos[0]+100, pos[1]+100) wx.Frame.__init__(self, parent=self.parent, title="Simulation", pos=pos, style=wx.DEFAULT_FRAME_STYLE|wx.FRAME_FLOAT_ON_PARENT) # Starting positions/factors for spinctrls and sliders self.slidemax = 1000 self.slidestart = 500 self.spinstartfactor = 0.1 self.spinendfactor = 1.9 ## MYID # This ID is given by the parent for an instance of this class self.MyID = None # Page - the currently active page of the notebook. 
self.Page = self.parent.notebook.GetCurrentPage() ## Content self.panel = wx.Panel(self) self.rbtnB = wx.RadioButton (self.panel, -1, 'Vary A and B', style = wx.RB_GROUP) self.rbtnOp = wx.RadioButton (self.panel, -1, 'Fix relation') self.btnreset = wx.Button(self.panel, wx.ID_ANY, 'Reset') # Set starting variables self.SetStart() # Populate panel dropsizer = wx.FlexGridSizer(rows=2, cols=3, vgap=5, hgap=5) dropsizer.Add( wx.StaticText(self.panel, label="Parameter A")) dropsizer.Add( wx.StaticText(self.panel, label="Operator")) dropsizer.Add( wx.StaticText(self.panel, label="Parameter B")) self.droppA = wx.ComboBox(self.panel, -1, self.labelA, (15, 20), wx.DefaultSize, self.parmAlist, wx.CB_DROPDOWN|wx.CB_READONLY) self.droppA.SetSelection(0) self.Bind(wx.EVT_COMBOBOX, self.Ondrop, self.droppA) self.dropop = wx.ComboBox(self.panel, -1, "", (10, 20), wx.DefaultSize, self.oplist, wx.CB_DROPDOWN|wx.CB_READONLY) self.dropop.SetSelection(0) self.opfunc = self.opdict[self.opdict.keys()[0]] self.Bind(wx.EVT_COMBOBOX, self.Ondrop, self.dropop) self.droppB = wx.ComboBox(self.panel, -1, self.labelB, (15, 30), wx.DefaultSize, self.parmBlist, wx.CB_DROPDOWN|wx.CB_READONLY) self.Bind(wx.EVT_COMBOBOX, self.Ondrop, self.droppB) self.droppB.SetSelection(1) dropsizer.Add(self.droppA) dropsizer.Add(self.dropop) dropsizer.Add(self.droppB) textfix = wx.StaticText(self.panel, label="\nEdit intervals and drag the slider.\n") # Parameter A slidesizer = wx.FlexGridSizer(rows=3, cols=5, vgap=5, hgap=5) self.textstartA = wx.StaticText(self.panel, label=self.labelA) slidesizer.Add(self.textstartA) self.startspinA = edclasses.FloatSpin(self.panel, digits=7, increment=.1) slidesizer.Add(self.startspinA) self.sliderA = wx.Slider(self.panel, -1, self.slidestart, 0, self.slidemax, wx.DefaultPosition, (250, -1), wx.SL_HORIZONTAL) slidesizer.Add(self.sliderA) self.endspinA = edclasses.FloatSpin(self.panel, digits=7, increment=.1) slidesizer.Add(self.endspinA) self.textvalueA = wx.StaticText(self.panel, label= "%.5e" % self.valueA) slidesizer.Add(self.textvalueA) # Parameter B self.textstartB = wx.StaticText(self.panel, label=self.labelB) slidesizer.Add(self.textstartB) self.startspinB = edclasses.FloatSpin(self.panel, digits=7, increment=.1) slidesizer.Add(self.startspinB) self.sliderB = wx.Slider(self.panel, -1, self.slidestart, 0, self.slidemax, wx.DefaultPosition, (250, -1), wx.SL_HORIZONTAL) slidesizer.Add(self.sliderB) self.endspinB = edclasses.FloatSpin(self.panel, digits=7, increment=.1) slidesizer.Add(self.endspinB) self.textvalueB = wx.StaticText(self.panel, label= "%.5e" % self.valueB) slidesizer.Add(self.textvalueB) # Result of operation self.textstartOp = wx.StaticText(self.panel, label=self.labelOp) slidesizer.Add(self.textstartOp) self.startspinOp = edclasses.FloatSpin(self.panel, digits=7, increment=.1) slidesizer.Add(self.startspinOp) self.sliderOp = wx.Slider(self.panel, -1, self.slidestart, 0, self.slidemax, wx.DefaultPosition, (250, -1), wx.SL_HORIZONTAL) slidesizer.Add(self.sliderOp) self.endspinOp = edclasses.FloatSpin(self.panel, digits=7, increment=.1) slidesizer.Add(self.endspinOp) self.textvalueOp = wx.StaticText(self.panel, label= "%.5e" % self.valueOp) slidesizer.Add(self.textvalueOp) # Bindings for slider self.Bind(wx.EVT_SLIDER, self.OnSlider, self.sliderA) self.Bind(wx.EVT_SLIDER, self.OnSlider, self.sliderB) self.Bind(wx.EVT_SLIDER, self.OnSlider, self.sliderOp) # Bindings for radiobuttons self.Bind(wx.EVT_RADIOBUTTON, self.OnRadio, self.rbtnB) self.Bind(wx.EVT_RADIOBUTTON, self.OnRadio, 
self.rbtnOp) self.Bind(wx.EVT_BUTTON, self.OnReset, self.btnreset) # Bindings for spin controls # Our self-made spin controls already have wx.EVT_SPINCTRL bound to # the increment function. We will call that function manually here. self.startspinA.Unbind(wx.EVT_SPINCTRL) self.startspinB.Unbind(wx.EVT_SPINCTRL) self.startspinOp.Unbind(wx.EVT_SPINCTRL) self.endspinA.Unbind(wx.EVT_SPINCTRL) self.endspinB.Unbind(wx.EVT_SPINCTRL) self.endspinOp.Unbind(wx.EVT_SPINCTRL) self.Bind(wx.EVT_SPINCTRL, self.OnSlider, self.startspinA) self.Bind(wx.EVT_SPINCTRL, self.OnSlider, self.startspinB) self.Bind(wx.EVT_SPINCTRL, self.OnSlider, self.startspinOp) self.Bind(wx.EVT_SPINCTRL, self.OnSlider, self.endspinA) self.Bind(wx.EVT_SPINCTRL, self.OnSlider, self.endspinB) self.Bind(wx.EVT_SPINCTRL, self.OnSlider, self.endspinOp) # Set values self.SetValues() ## Sizers self.topSizer = wx.BoxSizer(wx.VERTICAL) self.topSizer.Add(dropsizer) self.topSizer.Add(self.rbtnB) self.topSizer.Add(self.rbtnOp) self.topSizer.Add(self.btnreset) self.topSizer.Add(textfix) self.topSizer.Add(slidesizer) self.panel.SetSizer(self.topSizer) self.topSizer.Fit(self) #self.SetMinSize(self.topSizer.GetMinSizeTuple()) self.OnRadio() self.OnPageChanged(self.Page, init=True) #Icon if parent.MainIcon is not None: wx.Frame.SetIcon(self, parent.MainIcon) self.Show(True) def CalcFct(self, A, B, C): if self.rbtnB.Value == True: func = self.opfunc[0] try: C = func(A,B) except ZeroDivisionError: pass else: return B, C else: func = self.opfunc[1] try: B = func(A,C) except ZeroDivisionError: pass else: return B, C def Increment(self): # Set the correct increment for each spinctrl self.startspinA.increment() self.startspinB.increment() self.startspinOp.increment() self.endspinA.increment() self.endspinB.increment() self.endspinOp.increment() def FillOpDict(self): # Dictionaries: [Calculate C, Calculate B] self.opdict["A/B"] = [lambda A,B: A/B, lambda A,C: A/C] self.opdict["B/A"] = [lambda A,B: B/A, lambda A,C: C*A] self.opdict["A*B"] = [lambda A,B: A*B, lambda A,C: C/A] self.opdict["A+B"] = [lambda A,B: A+B, lambda A,C: C-A] self.opdict["A-B"] = [lambda A,B: A-B, lambda A,C: A-C] self.opdict["A*exp(B)"] = [lambda A,B: A*np.exp(B), lambda A,C: np.log(C/A)] self.opdict["B*exp(A)"] = [lambda A,B: B*np.exp(A), lambda A,C: C/np.exp(A)] def OnClose(self, event=None): # This is a necessary function for PyCorrFit. # Do not change it. self.parent.toolmenu.Check(self.MyID, False) self.parent.ToolsOpen.__delitem__(self.MyID) self.Destroy() def Ondrop(self, event=None): self.labelOp = self.oplist[self.dropop.GetSelection()] self.labelA = self.parmAlist[self.droppA.GetSelection()] self.labelB = self.parmBlist[self.droppB.GetSelection()] self.textstartOp.SetLabel(self.labelOp) self.textstartA.SetLabel(label=self.labelA) self.textstartB.SetLabel(self.labelB) self.sliderB.SetValue(self.slidestart) self.sliderOp.SetValue(self.slidestart) self.sliderA.SetValue(self.slidestart) self.SetValues() self.OnSize() def OnPageChanged(self, page=None, init=False): #if init: # # Get the parameters of the current page. # self.SavedParms = self.parent.PackParameters(self.Page) # When parent changes # This is a necessary function for PyCorrFit. # This is stuff that should be done when the active page # of the notebook changes. if self.parent.notebook.GetPageCount() == 0: self.panel.Disable() return try: # wx._core.PyDeadObjectError: The C++ part of the FittingPanel # object has been deleted, attribute access no longer allowed.
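# Consistency sketch for FillOpDict above (hypothetical check, not in
# the original file): every entry pairs a forward function C = f(A, B)
# with an inverse B = g(A, C), so applying g after f must recover B.
import numpy as np
f = lambda A, B: A * np.exp(B)     # forward:  C from A and B
g = lambda A, C: np.log(C / A)     # inverse:  B from A and C
A, B = 2.0, 0.5
assert abs(g(A, f(A, B)) - B) < 1e-12  # round trip recovers B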
oldcounter = self.Page.counter except: oldcounter = -1 if page is not None: if page.counter != oldcounter: self.Page = page self.SetStart() self.droppA.SetItems(self.parmAlist) self.droppB.SetItems(self.parmBlist) self.droppA.SetSelection(0) self.droppB.SetSelection(1) self.dropop.SetSelection(0) # Set labels self.Ondrop() else: self.Page = page self.panel.Enable() def OnRadio(self, event=None): if self.rbtnB.Value == True: # Parameter B is variable self.sliderOp.Enable(False) self.startspinOp.Enable(False) self.endspinOp.Enable(False) self.sliderB.Enable(True) self.startspinB.Enable(True) self.endspinB.Enable(True) else: # Operation result is variable self.sliderOp.Enable(True) self.startspinOp.Enable(True) self.endspinOp.Enable(True) self.sliderB.Enable(False) self.startspinB.Enable(False) self.endspinB.Enable(False) self.Ondrop() def OnReset(self, e=None): self.parent.UnpackParameters(self.SavedParms, self.Page) self.Page.apply_parameters_reverse() #self.OnPageChanged(self.Page) self.SetStart() self.Ondrop() def OnSize(self, event=None): # We need this function, because contents of the flexgridsizer # may change in size. self.panel.SetSizer(self.topSizer) self.topSizer.Fit(self) self.panel.SetSize(self.GetSize()) def OnSlider(self, event=None): ## Set the slider values idmax = self.sliderA.GetMax() slideA = self.sliderA.GetValue() startA = self.startspinA.GetValue() endA = self.endspinA.GetValue() self.valueA = startA + (endA-startA)*slideA/idmax self.textvalueA.SetLabel( "%.5e" % self.valueA) if self.rbtnB.Value == True: slideB = self.sliderB.GetValue() startB = self.startspinB.GetValue() endB = self.endspinB.GetValue() self.valueB = startB + (endB-startB)*slideB/idmax else: # Same thing slideOp = self.sliderOp.GetValue() startOp = self.startspinOp.GetValue() endOp = self.endspinOp.GetValue() self.valueOp = startOp + (endOp-startOp)*slideOp/idmax self.valueB, self.valueOp = self.CalcFct(self.valueA, self.valueB, self.valueOp) self.textvalueB.SetLabel( "%.5e" % self.valueB) self.textvalueOp.SetLabel( "%.5e" % self.valueOp) self.Increment() self.SetResult() self.OnSize() def SetResult(self, event=None): if self.parent.notebook.GetPageCount() == 0: # Nothing to do return # And Plot idA = self.droppA.GetSelection() idB = self.droppB.GetSelection() # As of version 0.7.5: we want the units to be displayed # human readable - the way they are displayed # in the Page info tool. # Convert from human readable to internal units # The easiest way is to make a copy of all parameters and # only write back those that have been changed: # parms_0 = 1.*np.array(mdls.valuedict[self.modelid][1]) parms_0[idA] = self.valueA # human readable units parms_0[idB] = self.valueB # human readable units label, parms_i =\ mdls.GetInternalFromHumanReadableParm(self.modelid, parms_0) self.Page.active_parms[1][idA] = parms_i[idA] self.Page.active_parms[1][idB] = parms_i[idB] self.Page.apply_parameters_reverse() self.Page.PlotAll() def SetStart(self): # Sets first and second variable of a page to # Parameters A and B respectively.
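# Mapping sketch for OnSlider above (standalone form, invented name):
# an integer slider position in [0, idmax] is mapped linearly onto the
# user-defined interval [start, end]; the float() cast avoids Python 2
# integer division if all inputs happen to be integers.
def slider_to_value(slide, idmax, start, end):
    return start + (end - start) * float(slide) / idmax

# slider_to_value(500, 1000, 0.1, 1.9) -> 1.0, the interval midpoint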
if self.parent.notebook.GetPageCount() == 0: self.modelid = 6000 ParmLabels, ParmValues = \ mdls.GetHumanReadableParms(self.modelid, mdls.valuedict[6000][1]) else: self.SavedParms = self.parent.PackParameters(self.Page) self.modelid = self.Page.modelid ParmLabels, ParmValues = \ mdls.GetHumanReadableParms(self.modelid, self.Page.active_parms[1]) self.parmAlist = ParmLabels self.parmBlist = ParmLabels # Operators # Calculation of variable A with fixed B self.opdict = dict() self.FillOpDict() self.oplist = self.opdict.keys() self.oplist.sort() self.labelA = self.parmAlist[0] self.labelB = self.parmBlist[1] self.labelOp = self.oplist[0] self.opfunc = self.opdict[self.labelOp] self.valueA = ParmValues[0] self.valueB = ParmValues[1] self.valueB, self.valueOp = self.CalcFct(self.valueA, self.valueB, 0) def SetValues(self, event=None): # Set the values for spin and slider # As of version 0.7.5: we want the units to be displayed # human readable - the way they are displayed # in the Page info tool. # # Parameter A idA = self.droppA.GetSelection() # Parameter B idB = self.droppB.GetSelection() # self.valueB = self.Page.active_parms[1][idB] # self.valueA = self.Page.active_parms[1][idA] if self.parent.notebook.GetPageCount() == 0: self.modelid = 6000 ParmLabels, ParmValues = \ mdls.GetHumanReadableParms(self.modelid, mdls.valuedict[6000][1]) else: self.modelid = self.Page.modelid ParmLabels, ParmValues = \ mdls.GetHumanReadableParms(self.modelid, self.Page.active_parms[1]) self.valueA = ParmValues[idA] self.valueB = ParmValues[idB] # Operator idop = self.dropop.GetSelection() keys = self.opdict.keys() opkey = self.oplist[idop] self.opfunc = self.opdict[opkey] # Parameter A startA = self.valueA*self.spinstartfactor endA = self.valueA*self.spinendfactor self.startspinA.SetValue(startA) self.endspinA.SetValue(endA) # Parameter B startB = self.valueB*self.spinstartfactor endB = self.valueB*self.spinendfactor self.startspinB.SetValue(startB) self.endspinB.SetValue(endB) # Operation result self.valueOp = self.opfunc[0](self.valueA, self.valueB) startOp = self.valueOp*self.spinstartfactor endOp = self.valueOp*self.spinendfactor self.startspinOp.SetValue(startOp) self.endspinOp.SetValue(endOp) # Set text self.textvalueA.SetLabel( "%.5e" % self.valueA) self.textvalueB.SetLabel( "%.5e" % self.valueB) self.textvalueOp.SetLabel( "%.5e" % self.valueOp) self.Increment() self.SetResult() pycorrfit-0.8.1/src/tools/info.py0000644000175000017500000003262212262516600015532 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module tools - info Open a text window with lots of information. Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . 
""" import wx import numpy as np import models as mdls # Menu entry name MENUINFO = ["Page &info", "Display some information on the current page."] class InfoClass(object): """ This class get's all the Info possible from a Page and makes it available through a dictionary with headings as keys. """ def __init__(self, CurPage=None, Pagelist=None ): # A list of all Pages currently available: self.Pagelist = Pagelist # The current page we are looking at: self.CurPage = CurPage def GetAllInfo(self): """ Get a dictionary with page titles and an InfoDict as value. """ MultiInfo = dict() for Page in self.Pagelist: # Page counter includes a whitespace and a ":" which we do not want. MultiInfo[Page.counter[:-2]] = self.GetPageInfo(Page) return MultiInfo def GetCurInfo(self): """ Get all the information about the current Page. Added for convenience. You may use GetPageInfo. """ return self.GetPageInfo(self.CurPage) def GetCurFancyInfo(self): """ For convenience. """ return self.GetFancyInfo(self.CurPage) def GetFancyInfo(self, Page): """ Get a nice string representation of the Info """ InfoDict = self.GetPageInfo(Page) # Version Version = "PyCorrFit v."+InfoDict["version"][0]+"\n" # Title Title = "\n" for item in InfoDict["title"]: Title = Title + item[0]+"\t"+ item[1]+"\n" # Parameters Parameters = "\nParameters:\n" for item in InfoDict["parameters"]: Parameters = Parameters + " "+item[0]+"\t"+ str(item[1])+"\n" # Supplementary parameters Supplement = "\nSupplementary parameters:\n" try: for item in InfoDict["supplement"]: Supplement = Supplement + " "+item[0]+"\t"+ str(item[1])+"\n" except KeyError: Supplement = "" # Fitting Fitting = "\nFitting:\n" try: for item in InfoDict["fitting"]: Fitting = Fitting + " "+item[0]+"\t"+str(item[1])+"\n" except KeyError: Fitting = "" # Background Background = "\nBackground:\n" try: for item in InfoDict["background"]: Background = Background + " "+item[0]+"\t"+str(item[1])+"\n" except KeyError: Background = "" # Function doc string ModelDoc = "\n\nModel doc string:\n " + InfoDict["modeldoc"][0] # Supplementary variables try: SupDoc = "\n"+8*" "+InfoDict["modelsupdoc"][0] except: SupDoc = "" PageInfo = Version+Title+Parameters+Supplement+Fitting+Background+\ ModelDoc+SupDoc return PageInfo def GetPageInfo(self, Page): """ Needs a Page and gets all information from it """ Page.PlotAll("init") # A dictionary with headings as keys and lists of singletts/tuples as # values. If it is a tuple, it might me interesting for a table. InfoDict = dict() # Get model information model = [Page.model, Page.tabtitle.GetValue(), Page.modelid] parms = Page.active_parms[1] fct = Page.active_fct.__name__ InfoDict["version"] = [Page.parent.version] Title = list() # The tool statistics relys on the string "filename/title". # Do not change it! 
if len(model[1]) == 0: # Prevent saving no title model[1] = "NoName" Title.append(["filename/title", model[1] ]) Title.append(["Model ID", str(model[2]) ]) Title.append(["Model name", model[0] ]) Title.append(["Model function", fct ]) Title.append(["Page number", Page.counter[1:-2] ]) ## Parameters Parameters = list() # Use this function to determine human readable parameters, if possible Units, Newparameters = mdls.GetHumanReadableParms(model[2], parms) # Add Parameters for i in np.arange(len(parms)): Parameters.append([ Units[i], Newparameters[i] ]) InfoDict["parameters"] = Parameters # Add some more information if available # Info is a dictionary or None MoreInfo = mdls.GetMoreInfo(model[2], Page) if MoreInfo is not None: InfoDict["supplement"] = MoreInfo # Try to get the dictionary entry of a model try: # This function should return all important information # that can be calculated from the given parameters. func_info = mdls.supplement[model[2]] except KeyError: # No information available pass else: InfoDict["modelsupdoc"] = [func_info.func_doc] ## Fitting weightedfit = Page.weighted_fit_was_performed weightedfit_type = Page.weighted_fittype fittingbins = Page.weighted_nuvar # from left and right Fitting = list() if Page.dataexp is not None: # Mode AC vs CC if Page.IsCrossCorrelation is True: Title.append(["Type AC/CC", "Cross-correlation" ]) else: Title.append(["Type AC/CC", "Autocorrelation" ]) Fitting.append([ u"\u03c7"+"²", Page.chi2 ]) if Page.weighted_fit_was_performed: Chi2type = "reduced "+u"\u03c7"+"²" else: Chi2type = "reduced sum of squares" Fitting.append([ u"\u03c7"+"²-type", Chi2type ]) Fitting.append([ "Weighted fit", weightedfit_type ]) if len(Page.GlobalParameterShare) != 0: shared = str(Page.GlobalParameterShare[0]) for item in Page.GlobalParameterShare[1:]: shared += ", "+str(item) Fitting.append(["Shared parameters with Pages", shared]) if weightedfit is True: Fitting.append([ "Std. channels", 2*fittingbins+1 ]) # Fitting range: t1 = 1.*Page.taufull[Page.startcrop] t2 = 1.*Page.taufull[Page.endcrop-1] Fitting.append([ "Interval start [ms]", "%.4e" % t1 ]) Fitting.append([ "Interval end [ms]", "%.4e" % t2 ]) # Fittet parameters and errors somuch = sum(Page.active_parms[2]) if somuch >= 1: fitted = "" for i in np.arange(len(Page.active_parms[2])): if np.bool(Page.active_parms[2][i]) is True: errorvar = Page.active_parms[0][i] # variable name fitted=fitted+errorvar+ ", " fitted = fitted.strip().strip(",") # remove trailing comma Fitting.append(["fit par.", fitted]) # Fitting error included in v.0.7.3 Errors_fit = Page.parmoptim_error if Errors_fit is not None: errkeys = Errors_fit.keys() errkeys.sort() for key in errkeys: savekey, saveval = \ mdls.GetHumanReadableParameterDict(model[2], [key], [Errors_fit[key]]) # The tool statistics relys on the string "Err ". # Do not change it! 
Fitting.append(["Err "+savekey[0], saveval[0]]) InfoDict["fitting"] = Fitting ## Normalization if Page.normparm is None: normparmtext = "None" elif Page.normparm < len(Page.active_parms[0]): normparmtext = Page.active_parms[0][Page.normparm] else: # supplementary parameters supnum = Page.normparm - len(Page.active_parms[1]) normparmtext = MoreInfo[supnum][0] Title.append(["Normalization", normparmtext ]) ## Background Background = list() if Page.IsCrossCorrelation: if ( Page.bgselected is not None and Page.bg2selected is not None ): # Channel 1 bgname = Page.parent.Background[Page.bgselected][1] if len(bgname) == 0: # Prevent saving no name bgname = "NoName" Background.append([ "bg name Ch1", bgname]) Background.append([ "bg rate Ch1 [kHz]", Page.parent.Background[Page.bgselected][0] ]) # Channel 2 bg2name = Page.parent.Background[Page.bg2selected][1] if len(bg2name) == 0: # Prevent saving no name bg2name = "NoName" Background.append([ "bg name Ch2", bg2name]) Background.append([ "bg rate Ch2 [kHz]", Page.parent.Background[Page.bg2selected][0] ]) InfoDict["background"] = Background else: if Page.bgselected is not None: bgname = Page.parent.Background[Page.bgselected][1] if len(bgname) == 0: # Prevent saving no name bgname = "NoName" bgrate = Page.parent.Background[Page.bgselected][0] Background.append([ "bg name", bgname ]) Background.append([ "bg rate [kHz]", bgrate ]) InfoDict["background"] = Background ## Function doc string InfoDict["modeldoc"] = [Page.active_fct.func_doc] InfoDict["title"] = Title return InfoDict class ShowInfo(wx.Frame): def __init__(self, parent): # parent is main frame self.parent = parent # Get the window positioning correctly pos = self.parent.GetPosition() pos = (pos[0]+100, pos[1]+100) wx.Frame.__init__(self, parent=self.parent, title="Info", pos=pos, style=wx.DEFAULT_FRAME_STYLE|wx.FRAME_FLOAT_ON_PARENT) ## MYID # This ID is given by the parent for an instance of this class self.MyID = None # Page self.Page = self.parent.notebook.GetCurrentPage() # Size initial_size = wx.Size(650,700) initial_sizec = (initial_size[0]-6, initial_size[1]-30) self.SetMinSize(wx.Size(200,200)) self.SetSize(initial_size) ## Content self.panel = wx.Panel(self) self.control = wx.TextCtrl(self.panel, style=wx.TE_MULTILINE, size=initial_sizec) self.control.SetEditable(False) font1 = wx.Font(10, wx.MODERN, wx.NORMAL, wx.NORMAL, False, u'Monospace') self.control.SetFont(font1) btncopy = wx.Button(self.panel, wx.ID_CLOSE, 'Copy to clipboard') self.Bind(wx.EVT_BUTTON, self.OnCopy, btncopy) self.topSizer = wx.BoxSizer(wx.VERTICAL) self.topSizer.Add(btncopy) self.topSizer.Add(self.control) self.panel.SetSizer(self.topSizer) self.topSizer.Fit(self) #Icon if parent.MainIcon is not None: wx.Frame.SetIcon(self, parent.MainIcon) self.Show(True) wx.EVT_SIZE(self, self.OnSize) self.Content() def Content(self): # Fill self.control with content. # Parameters and models if self.parent.notebook.GetPageCount() == 0: self.control.SetValue("") self.panel.Disable() return self.panel.Enable() Page = self.Page InfoMan = InfoClass(CurPage=Page) PageInfo = InfoMan.GetCurFancyInfo() self.control.SetValue(PageInfo) def OnClose(self, event=None): self.parent.toolmenu.Check(self.MyID, False) self.parent.ToolsOpen.__delitem__(self.MyID) self.Destroy() def OnCopy(self, event): if not wx.TheClipboard.IsOpened(): clipdata = wx.TextDataObject() clipdata.SetText(self.control.GetValue()) wx.TheClipboard.Open() wx.TheClipboard.SetData(clipdata) wx.TheClipboard.Close() else: print "Other application has lock on clipboard." 
def OnPageChanged(self, page=None): # When parent changes self.Page = page self.Content() def OnSize(self, event): size = event.GetSize() sizec = wx.Size(size[0]-5, size[1]-30) self.panel.SetSize(size) self.control.SetSize(sizec) pycorrfit-0.8.1/src/tools/comment.py0000755000175000017500000000625312262516600016245 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module tools - comment Just edit the sessions comment. Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import wx class EditComment(wx.Frame): """ Little Dialog to edit the comment on the session. """ def __init__(self, parent): ## Variables # parent is main frame self.parent = parent # Get the window positioning correctly pos = self.parent.GetPosition() pos = (pos[0]+100, pos[1]+100) wx.Frame.__init__(self, parent=parent, title="Session comment", pos=pos, style=wx.DEFAULT_FRAME_STYLE|wx.FRAME_FLOAT_ON_PARENT) initial_size = (400,300) initial_sizec = (initial_size[0], initial_size[1]-50) self.SetSize(initial_size) self.SetMinSize((400,300)) ## Content self.panel = wx.Panel(self) self.control = wx.TextCtrl(self.panel, style=wx.TE_MULTILINE, size=initial_sizec, value=self.parent.SessionComment) text = wx.StaticText(self.panel, label="Session comments will be saved in the session file.") # buttons btnclose = wx.Button(self.panel, wx.ID_ANY, 'Close') btnokay = wx.Button(self.panel, wx.ID_ANY, 'OK') self.Bind(wx.EVT_BUTTON, self.OnClose, btnclose) self.Bind(wx.EVT_BUTTON, self.OnOkay, btnokay) #sizers self.topSizer = wx.BoxSizer(wx.VERTICAL) buttonsizer = wx.BoxSizer(wx.HORIZONTAL) buttonsizer.Add(btnclose, 1) buttonsizer.Add(btnokay, 1) self.topSizer.Add(text) self.topSizer.Add(buttonsizer) self.topSizer.Add(self.control) self.panel.SetSizer(self.topSizer) self.topSizer.Fit(self) #Icon if parent.MainIcon is not None: wx.Frame.SetIcon(self, parent.MainIcon) self.Show(True) wx.EVT_SIZE(self, self.OnSize) def OnSize(self, event): size = event.GetSize() sizec = (size[0], size[1]-50) self.panel.SetSize(size) self.control.SetSize(sizec) def OnClose(self, event=None): self.parent.filemenu.Check(self.parent.menuComm.GetId(), False) self.Destroy() def OnOkay(self, event): self.parent.SessionComment = self.control.GetValue() self.OnClose() pycorrfit-0.8.1/src/tools/background.py0000644000175000017500000006465412262516600016730 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module tools - background We make some background corection here. Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. 
volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import numpy as np import os import sys import traceback # for Error handling import wx from wx.lib.agw import floatspin # Float numbers in spin fields import wx.lib.plot as plot import doc import misc import openfile as opf # How to treat an opened file import readfiles # Menu entry name MENUINFO = ["&Background correction", "Open a file for background correction."] class BackgroundCorrection(wx.Frame): def __init__(self, parent): self.MyName="BACKGROUND" # Parent is main frame self.parent = parent # Get the window positioning correctly pos = self.parent.GetPosition() pos = (pos[0]+100, pos[1]+100) wx.Frame.__init__(self, parent=parent, title="Background correction", pos=pos, style=wx.DEFAULT_FRAME_STYLE|wx.FRAME_FLOAT_ON_PARENT) ## MYID # This ID is given by the parent for an instance of this class self.MyID = None # Current trace we are looking at self.activetrace = None # Importet trace self.trace = None # Importet trace after user decides to cange radio buttons self.oldtrace = None self.oldfilename = None self.average = None ## Start drawing # Splitter Window self.sp = wx.SplitterWindow(self, style=wx.SP_NOBORDER) ## Controls panel = wx.Panel(self.sp) # text1 backgroundinit = ( "Correct the amplitude for non-correlated background.\n"+ "The background intensity can be either imported\n"+ "from a blank measurement or set manually.") textinit = wx.StaticText(panel, label=backgroundinit) # Radio buttons self.rbtnfile = wx.RadioButton(panel, -1, 'Blank measurement: ', style = wx.RB_GROUP) self.rbtnfile.SetValue(True) self.btnbrowse = wx.Button(panel, wx.ID_ANY, 'Browse ...') self.rbtnhand = wx.RadioButton (panel, -1, 'Manual, [kHz]: ') # Spincontrol self.spinctrl = floatspin.FloatSpin(panel, digits=4, min_val=0, increment=.01) self.spinctrl.Enable(False) # Verbose text self.textfile = wx.StaticText(panel, label="No blank measurement file selected.") textmeanavg = wx.StaticText(panel, label="Average background signal [kHz]: ") self.textmean = wx.StaticText(panel, label="") # name textname = wx.StaticText(panel, label="User defined background name: ") sizeTextn = textname.GetSize()[0] self.bgname = wx.TextCtrl(panel, value="", size=(sizeTextn,-1)) self.bgname.Enable(False) self.btnimport = wx.Button(panel, wx.ID_ANY, 'Import into session') self.btnimport.Enable(False) # Dropdown self.BGlist = ["File/User"] # updated by self.UpdateDropdown() textdropdown = wx.StaticText(panel, label="Show background: ") self.dropdown = wx.ComboBox(panel, -1, "File/User", (15, -1), wx.DefaultSize, self.BGlist, wx.CB_DROPDOWN|wx.CB_READONLY) self.UpdateDropdown() # Radio buttons Channel1 and 2 self.rbtnCh1 = wx.RadioButton (panel, -1, 'Ch1 ', style = wx.RB_GROUP) self.rbtnCh1.SetValue(True) self.rbtnCh2 = wx.RadioButton (panel, -1, 'Ch2') # Apply buttons self.btnapply = wx.Button(panel, wx.ID_ANY, 'Apply') textor = wx.StaticText(panel, label=" or ") self.btnrem = wx.Button(panel, 
wx.ID_ANY, 'Dismiss') textpages = wx.StaticText(panel, label=" correction for pages: ") self.WXTextPages = wx.TextCtrl(panel, value="") # Initial value for WXTextPages pagenumlist = list() for i in np.arange(self.parent.notebook.GetPageCount()): Page = self.parent.notebook.GetPage(i) pagenumlist.append(int(filter(lambda x: x.isdigit(), Page.counter))) valstring=misc.parsePagenum2String(pagenumlist) self.WXTextPages.SetValue(valstring) textyma = wx.StaticText(panel, label="Shortcut - ") self.btnapplyall = wx.Button(panel, wx.ID_ANY, 'Apply to all pages') textor2 = wx.StaticText(panel, label=" or ") self.btnremyall = wx.Button(panel, wx.ID_ANY, 'Dismiss from all pages') # Bindings self.Bind(wx.EVT_BUTTON, self.OnBrowse, self.btnbrowse) self.Bind(wx.EVT_RADIOBUTTON, self.OnRadioFile, self.rbtnfile) self.Bind(wx.EVT_RADIOBUTTON, self.OnRadioHand, self.rbtnhand) self.Bind(wx.EVT_SPINCTRL, self.SpinCtrlChange, self.spinctrl) self.Bind(wx.EVT_BUTTON, self.OnImport, self.btnimport) self.Bind(wx.EVT_COMBOBOX, self.OnDraw, self.dropdown) self.Bind(wx.EVT_BUTTON, self.OnApply, self.btnapply) self.Bind(wx.EVT_BUTTON, self.OnApplyAll, self.btnapplyall) self.Bind(wx.EVT_BUTTON, self.OnRemove, self.btnrem) self.Bind(wx.EVT_BUTTON, self.OnRemoveAll, self.btnremyall) # Sizers topSizer = wx.BoxSizer(wx.VERTICAL) text1sizer = wx.BoxSizer(wx.HORIZONTAL) text1sizer.Add(self.rbtnfile) text1sizer.Add(self.btnbrowse) text2sizer = wx.BoxSizer(wx.HORIZONTAL) text2sizer.Add(self.rbtnhand) text2sizer.Add(self.spinctrl) textmeansizer = wx.BoxSizer(wx.HORIZONTAL) textmeansizer.Add(textmeanavg) textmeansizer.Add(self.textmean) dropsizer = wx.BoxSizer(wx.HORIZONTAL) dropsizer.Add(textdropdown) droprightsizer = wx.BoxSizer(wx.VERTICAL) dropsizer.Add(droprightsizer) droprightsizer.Add(self.dropdown) #droprightsizer.Add(self.textafterdropdown) applysizer = wx.BoxSizer(wx.HORIZONTAL) applysizer.Add(self.btnapply) applysizer.Add(textor) applysizer.Add(self.btnrem) applysizer.Add(textpages) applysizer.Add(self.WXTextPages) applysizer.Add(self.rbtnCh1) applysizer.Add(self.rbtnCh2) allsizer = wx.BoxSizer(wx.HORIZONTAL) allsizer.Add(textyma) allsizer.Add(self.btnapplyall) allsizer.Add(textor2) allsizer.Add(self.btnremyall) topSizer.Add(textinit) topSizer.Add(text1sizer) topSizer.Add(text2sizer) topSizer.Add(self.textfile) topSizer.Add(textmeansizer) topSizer.Add(textname) topSizer.Add(self.bgname) topSizer.Add(self.btnimport) topSizer.Add(dropsizer) topSizer.Add(applysizer) topSizer.Add(allsizer) panel.SetSizer(topSizer) topSizer.Fit(self) self.SetMinSize(topSizer.GetMinSizeTuple()) self.Show(True) ## Canvas self.canvas = plot.PlotCanvas(self.sp) # Sizes psize = panel.GetBestSize() initial_size = (psize[0],psize[1]+200) self.SetSize(initial_size) sashsize = psize[1]+3 # This is also necessary to prevent unsplitting self.sp.SetMinimumPaneSize(sashsize) self.sp.SplitHorizontally(panel, self.canvas, sashsize) # If there is no page, disable ourselves: self.OnPageChanged(self.parent.notebook.GetCurrentPage()) #Icon if parent.MainIcon is not None: wx.Frame.SetIcon(self, parent.MainIcon) def OnApply(self, event): strFull = self.WXTextPages.GetValue() PageNumbers = misc.parseString2Pagenum(self, strFull) if PageNumbers is None: # Something went wrong and parseString2Pagenum already displayed # an error message. 
return # BG number item = self.dropdown.GetSelection() # Apply to corresponding pages for i in np.arange(self.parent.notebook.GetPageCount()): Page = self.parent.notebook.GetPage(i) j = filter(lambda x: x.isdigit(), Page.counter) if int(j) in PageNumbers: if self.rbtnCh1.GetValue() == True: Page.bgselected = item else: Page.bg2selected = item if Page.IsCrossCorrelation is False: # Autocorrelation only has one background! Page.bg2selected = None Page.OnAmplitudeCheck("init") Page.PlotAll() # Clean up unused backgrounds CleanupAutomaticBackground(self.parent) def OnApplyAll(self, event): self.btnrem.Enable(True) self.btnremyall.Enable(True) N = self.parent.notebook.GetPageCount() item = self.dropdown.GetSelection() for i in np.arange(N): # Set Page Page = self.parent.notebook.GetPage(i) Page.bgselected = item if Page.IsCrossCorrelation: Page.bg2selected = item else: Page.bg2selected = None try: Page.OnAmplitudeCheck("init") Page.PlotAll() except OverflowError: errstr = "Could not apply background to Page "+Page.counter+\ ". \n Check the value of the trace average and the background." dlg = wx.MessageDialog(self, errstr, "Error", style=wx.ICON_ERROR|wx.OK|wx.STAY_ON_TOP) dlg.ShowModal() Page.bgselected = None Page.bg2selected = None # Clean up unused backgrounds CleanupAutomaticBackground(self.parent) def OnClose(self, event=None): self.parent.toolmenu.Check(self.MyID, False) self.parent.ToolsOpen.__delitem__(self.MyID) self.Destroy() def OnBrowse(self, event): # opf.BGFiletypes is a dictionary with filetypes that have some # trace signal information. SupFiletypes = opf.BGFiletypes.keys() SupFiletypes.sort() filters = "" for i in np.arange(len(SupFiletypes)): # Add to the filetype filter filters = filters+SupFiletypes[i] if i+1 != len(SupFiletypes): # Add a separator filters = filters+"|" dlg = wx.FileDialog(self, "Choose a data file", self.parent.dirname, "", filters, wx.OPEN) if dlg.ShowModal() == wx.ID_OK: # Workaround since 0.7.5 (dirname, filename) = os.path.split(dlg.GetPath()) #filename = dlg.GetFilename() #dirname = dlg.GetDirectory() # Set parent dirname for user comfort self.parent.dirname = dirname try: # [data, trace, curvelist] stuff = readfiles.openAnyBG(dirname, filename) except: # The file does not seem to be what it seems to be. info = sys.exc_info() errstr = "Unknown file format:\n" errstr += str(filename)+"\n\n" errstr += str(info[0])+"\n" errstr += str(info[1])+"\n" for tb_item in traceback.format_tb(info[2]): errstr += tb_item wx.MessageDialog(self, errstr, "Error", style=wx.ICON_ERROR|wx.OK|wx.STAY_ON_TOP) return # Usually we will get a bunch of traces. Let the user select which # one to take. if len(stuff["Filename"]) > 1: choices = list() for i2 in np.arange(len(stuff["Filename"])): choices.append(str(i2)+". " + stuff["Filename"][i2] + " " + stuff["Type"][i2]) dlg = wx.SingleChoiceDialog(self, "Choose a curve", "Curve selection", choices=choices) if dlg.ShowModal() == wx.ID_OK: selindex = dlg.GetSelection() else: return else: selindex = 0 # If we accidentally recorded a cross correlation curve # as the background, let the user choose which trace he wants: channelindex = None if ( len(stuff["Type"][selindex]) >= 2 and stuff["Type"][selindex][0:2] == "CC" ): choices = ["Channel 1", "Channel 2"] label = "From which channel do you want to use the trace?" 
dlg = wx.SingleChoiceDialog(self, label, "Curve selection", choices=choices) if dlg.ShowModal() == wx.ID_OK: channelindex = dlg.GetSelection() trace = stuff["Trace"][selindex][channelindex] else: return else: trace = stuff["Trace"][selindex] if trace is None: print "WARNING: I did not find any trace data." return # Display filename and some of the directory self.textfile.SetLabel("File: ..."+dirname[-10:]+"/"+filename) name = str(selindex)+". "+stuff["Filename"][selindex]+" "+\ stuff["Type"][selindex] if channelindex is not None: name += " "+str(channelindex+1) self.bgname.SetValue(name) self.trace = trace # Calculate average self.average = self.trace[:,1].mean() # Display average self.textmean.SetLabel(str(self.average)+" kHz") self.spinctrl.SetValue(self.average) # Let the user see the opened file self.dropdown.SetSelection(0) # show trace self.OnDraw() # Enable button and editable name self.bgname.Enable(True) self.btnimport.Enable(True) else: # User pressed "Abort" - do nothing. self.parent.dirname = dlg.GetDirectory() dlg.Destroy() return def OnDraw(self, event=None): item = self.dropdown.GetSelection() if item < 0: # Disable Apply Buttons self.btnapply.Enable(False) self.btnapplyall.Enable(False) # Draw the trace that was just imported if self.trace != None: # Calculate average self.average = self.trace[:,1].mean() self.activetrace = self.trace #self.textafterdropdown.SetLabel(" Avg: "+str(self.average)+ # " kHz") self.textmean.SetLabel(str(self.average)) self.spinctrl.SetValue(self.average) else: # Clear the canvas. Looks better. self.canvas.Clear() # Don't show the average #self.textafterdropdown.SetLabel("") self.textmean.SetLabel("") return else: # Enable Apply Buttons self.btnapply.Enable(True) self.btnapplyall.Enable(True) # Draw a trace from the list self.activetrace = self.parent.Background[item-1][2] #self.textafterdropdown.SetLabel(" Avg: "+ # str(self.parent.Background[item-1][0])) # We want to have the trace in [s] here. trace = 1.*self.activetrace trace[:,0] = trace[:,0]/1000 linesig = plot.PolyLine(trace, legend='', colour='blue', width=1) self.canvas.Draw(plot.PlotGraphics([linesig], xLabel='time [s]', yLabel='background signal [kHz]')) def OnImport(self, event): self.parent.Background.append([self.average, self.bgname.GetValue(), self.trace]) # Next two lines are taken care of by UpdateDropdown #name = "{} ({:.2f} kHz)".format(self.bgname.GetValue(), self.average) #self.BGlist.append(name) self.UpdateDropdown() self.btnremyall.Enable(True) self.btnrem.Enable(True) self.btnapplyall.Enable(True) self.btnapply.Enable(True) self.OnDraw() # Update BG dropdown of each page for i in np.arange(self.parent.notebook.GetPageCount()): self.parent.notebook.GetPage(i).OnAmplitudeCheck() def OnPageChanged(self, page=None): # We do not need the *Range* Commands here yet. # We open and close the SelectChannelsFrame every time we # import some data. 
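        # Layout sketch for parent.Background (assumed from OnImport above):
        # each entry is a list [average_kHz, name, trace], where trace is a
        # 2D array of (time, countrate) pairs, e.g.
        #
        #     trace = np.array([[0.0, 12.30], [1.0, 12.52]])
        #     parent.Background.append([trace[:, 1].mean(), "blank", trace])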
if len(self.parent.Background) == 0: self.BGlist = list() self.UpdateDropdown() self.dropdown.SetValue("File/User") if self.parent.notebook.GetPageCount() == 0: self.sp.Disable() return self.sp.Enable() if len(self.BGlist) <= 0: self.btnrem.Enable(False) self.btnremyall.Enable(False) self.btnapply.Enable(False) self.btnapplyall.Enable(False) else: self.btnrem.Enable(True) self.btnremyall.Enable(True) self.btnapply.Enable(True) self.btnapplyall.Enable(True) if (self.WXTextPages.GetValue() == "" and self.parent.notebook.GetPageCount() != 0): # Initial value for WXTextPages pagenumlist = list() for i in np.arange(self.parent.notebook.GetPageCount()): Page = self.parent.notebook.GetPage(i) pagenumlist.append(int(filter(lambda x: x.isdigit(), Page.counter))) valstring=misc.parsePagenum2String(pagenumlist) self.WXTextPages.SetValue(valstring) def OnRadioFile(self, event): # Do not let the user change the spinctrl # setting. self.spinctrl.Enable(False) self.btnbrowse.Enable(True) # Restor the old trace self.trace = self.oldtrace if self.oldfilename is not None: self.textfile.SetLabel(self.oldfilename) if self.trace is None: # Disable button and editable name self.bgname.Enable(False) self.btnimport.Enable(False) # Let us draw self.dropdown.SetSelection(0) self.OnDraw() def OnRadioHand(self, event): # Let user enter a signal. self.spinctrl.Enable(True) self.btnbrowse.Enable(False) # save the old trace. We might want to switch back to it. if self.trace is not None: self.oldtrace = 1.*self.trace self.oldfilename = self.textfile.GetLabel() self.SpinCtrlChange() # Do not show the filename self.textfile.SetLabel("No file selected.") # Enable button and editable name self.bgname.Enable(True) self.btnimport.Enable(True) if len(self.bgname.GetValue()) == 0: # Enter something as name self.bgname.SetValue("User") def OnRemove(self, event): strFull = self.WXTextPages.GetValue() PageNumbers = misc.parseString2Pagenum(self, strFull) if PageNumbers is None: # Something went wrong and parseString2Pagenum already displayed # an error message. 
return # BG number item = self.dropdown.GetSelection() # Apply to corresponding pages for i in np.arange(self.parent.notebook.GetPageCount()): Page = self.parent.notebook.GetPage(i) j = filter(lambda x: x.isdigit(), Page.counter) if int(j) in PageNumbers: if self.rbtnCh1.GetValue() == True: Page.bgselected = None else: Page.bg2selected = None Page.bgselected = None Page.OnAmplitudeCheck("init") Page.PlotAll() # Clean up unused backgrounds CleanupAutomaticBackground(self.parent) def OnRemoveAll(self, event): N = self.parent.notebook.GetPageCount() for i in np.arange(N): Page = self.parent.notebook.GetPage(i) Page.bgselected = None Page.bg2selected = None Page.OnAmplitudeCheck("init") Page.PlotAll() # Clean up unused backgrounds CleanupAutomaticBackground(self.parent) def SetPageNumbers(self, pagestring): self.WXTextPages.SetValue(pagestring) def SpinCtrlChange(self, event=None): # Let user see the continuous trace we will generate self.average = self.spinctrl.GetValue() self.trace = np.array([[0,self.average],[1,self.average]]) self.textmean.SetLabel(str(self.average)) self.OnDraw() def UpdateDropdown(self, e=None): self.BGlist = list() #self.BGlist.append("File/User") for item in self.parent.Background: bgname = "{} ({:.2f} kHz)".format(item[1],item[0]) self.BGlist.append(bgname) if len(self.BGlist) == 0: ddlist = ["File/User"] else: ddlist = 1*self.BGlist self.dropdown.SetItems(self.BGlist) # Show the last item self.dropdown.SetSelection(len(self.BGlist)-1) def ApplyAutomaticBackground(page, bg, parent): """ Creates an "automatic" background with countrate in kHz *bg* and applies it to the given *page* object. If an automatic background with the same countrate exists, uses it. Input: *page* - page to which the background should be applied *bg* - background that should be applied to that page float or list of 1 or two elements -> if the page is cross-correlation, the second background will be applied as well. *parent* - parent containing *Background* list """ bglist = 1*np.atleast_1d(bg) # minus 1 to identify non-set background id bgid = np.zeros(bglist.shape, dtype=int) - 1 for b in xrange(len(bglist)): # Check if exists: for i in xrange(len(parent.Background)): if parent.Background[i][0] == bglist[b]: bgid[b] = i if bgid[b] == -1: # Add new background bgname = "AUTO: {:e} kHz \t".format(bglist[b]) trace = np.array([[0,bglist[b]],[1,bglist[b]]]) parent.Background.append([bglist[b], bgname, trace]) bgid[b] = len(parent.Background) - 1 # Apply background to page # Last item is id of background page.bgselected = bgid[0] if page.IsCrossCorrelation: if len(bgid) != 2: raise NotImplementedError("Cross-correlation data needs"+ "exactly two signals for background-correction!") # Apply second background page.bg2selected = bgid[1] else: page.bg2selected = None CleanupAutomaticBackground(parent) page.OnAmplitudeCheck("init") page.PlotAll() def CleanupAutomaticBackground(parent): """ Goes through the pagelist *parent.notebook.GetPageCount()* and checks *parent.Background* for unnused automatic backgrounds. Removes these and updates the references to all backgrounds within the pages. 
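    Example (sketch of the assumed renumbering): if the pages reference
    only the automatic backgrounds at indices 0 and 2, the unused
    automatic entry at index 1 is removed and the *bgselected* /
    *bg2selected* attributes of the pages are rewritten to 0 and 1.
    User-defined backgrounds are kept even when unused.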
""" # Create a dictionary with keys: indices of old background list - # and elements: list of pages having this background BGdict = dict() BG2dict = dict() # cross-correlation for i in xrange(len(parent.Background)): BGdict[i] = list() BG2dict[i] = list() # Append pages to the lists inside the dictionary for i in xrange(parent.notebook.GetPageCount()): Page = parent.notebook.GetPage(i) if Page.bgselected is not None: BGdict[Page.bgselected].append(Page) if Page.bg2selected is not None: BG2dict[Page.bg2selected].append(Page) # Sort the keys and create a new background list NewBGlist = list() keyID = 0 keys = BGdict.keys() keys.sort() for key in keys: # Do not delete user-generated backgrounds if len(BGdict[key]) == 0 and parent.Background[key][1][-1]=="\t": # This discrads auto-generated backgrounds that have no # pages assigned to them pass else: for page in BGdict[key]: page.bgselected = keyID NewBGlist.append(parent.Background[key]) keyID += 1 # Same thing for cross-correlation (two bg signals) #keyID = 0 keys = BG2dict.keys() keys.sort() for key in keys: # Do not delete user-generated backgrounds if len(BG2dict[key]) == 0 and parent.Background[key][1][-1]=="\t": # This discrads auto-generated backgrounds that have no # pages assigned to them pass elif parent.Background[key][1][-1]=="\t": # We already added the user-defined backgrounds # Therefore, we only check for aut-generated backgrounds # ("\t") for page in BG2dict[key]: page.bg2selected = keyID NewBGlist.append(parent.Background[key]) keyID += 1 # Finally, write back background list parent.Background = NewBGlist # If the background correction tool is open, update the list # of backgrounds. # (self.MyName="BACKGROUND") toolkeys = parent.ToolsOpen.keys() if len(toolkeys) == 0: pass else: for key in toolkeys: tool = parent.ToolsOpen[key] try: if tool.MyName == "BACKGROUND": tool.UpdateDropdown() tool.OnPageChanged() except: pass pycorrfit-0.8.1/src/tools/average.py0000644000175000017500000003757412262516600016224 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module tools - average Creates an average of curves. Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import numpy as np import wx import misc import models as mdls # Menu entry name MENUINFO = ["&Average data", "Create an average curve from whole session."] class Average(wx.Frame): # This tool is derived from a wx.frame. def __init__(self, parent): # Define a unique name that identifies this tool # Do not change this value. It is important for the Overlay tool # (selectcurves.py, *Wrapper_Tools*). 
self.MyName="AVERAGE" # parent is the main frame of PyCorrFit self.parent = parent # Get the window positioning correctly pos = self.parent.GetPosition() pos = (pos[0]+100, pos[1]+100) wx.Frame.__init__(self, parent=self.parent, title="Average curves", pos=pos, style=wx.DEFAULT_FRAME_STYLE|wx.FRAME_FLOAT_ON_PARENT) ## MYID # This ID is given by the parent for an instance of this class self.MyID = None # Page - the currently active page of the notebook. self.Page = self.parent.notebook.GetCurrentPage() ## Content self.panel = wx.Panel(self) self.topSizer = wx.BoxSizer(wx.VERTICAL) textinit = wx.StaticText(self.panel, label="Create an average from the following pages:") self.topSizer.Add(textinit) ## Page selection self.WXTextPages = wx.TextCtrl(self.panel, value="", size=(textinit.GetSize()[0],-1)) self.topSizer.Add(self.WXTextPages) ## Chechbox asking for Mono-Model self.WXCheckMono = wx.CheckBox(self.panel, label="Only use pages with the same model as the first page.") self.WXCheckMono.SetValue(True) self.topSizer.Add(self.WXCheckMono) ## Model selection Dropdown textinit2 = wx.StaticText(self.panel, label="Select a model for the average:") self.topSizer.Add(textinit2) self.WXDropSelMod = wx.ComboBox(self.panel, -1, "", (15,30), wx.DefaultSize, [], wx.CB_DROPDOWN|wx.CB_READONLY) self.topSizer.Add(self.WXDropSelMod) textinit3 = wx.StaticText(self.panel, label="This tool averages only over pages with the same type"+\ "\n(auto- or cross-correlation). Intensity data are"+\ "\nappended sequentially.") self.topSizer.Add(textinit3) # Set all values of Text and Strin self.SetValues() btnavg = wx.Button(self.panel, wx.ID_CLOSE, 'Create average') # Binds the button to the function - close the tool self.Bind(wx.EVT_BUTTON, self.OnAverage, btnavg) self.topSizer.Add(btnavg) self.panel.SetSizer(self.topSizer) self.topSizer.Fit(self) self.SetMinSize(self.topSizer.GetMinSizeTuple()) #Icon if parent.MainIcon is not None: wx.Frame.SetIcon(self, parent.MainIcon) self.Show(True) self.OnPageChanged(self.Page) def OnClose(self, event=None): # This is a necessary function for PyCorrFit. # Do not change it. self.parent.toolmenu.Check(self.MyID, False) self.parent.ToolsOpen.__delitem__(self.MyID) self.Destroy() def OnPageChanged(self, page): # When parent changes # This is a necessary function for PyCorrFit. # This is stuff that should be done when the active page # of the notebook changes. idsel = self.WXDropSelMod.GetSelection() self.SetValues() # Set back user selection: self.WXDropSelMod.SetSelection(idsel) if self.parent.notebook.GetPageCount() == 0: self.panel.Disable() return self.panel.Enable() self.Page = page def OnAverage(self, evt=None): strFull = self.WXTextPages.GetValue() PageNumbers = misc.parseString2Pagenum(self, strFull) if PageNumbers is None: # Something went wrong and parseString2Pagenum already displayed # an error message. return pages = list() UsedPagenumbers = list() # Reference page is the first page of the selection! #referencePage = self.parent.notebook.GetCurrentPage() referencePage = None for i in np.arange(self.parent.notebook.GetPageCount()): Page = self.parent.notebook.GetPage(i) if Page.counter.strip(" :#") == str(PageNumbers[0]): referencePage = Page break if referencePage is not None: # If that did not work, we have to raise an error. 
raise IndexError("PyCorrFit could not find the first"+ " page for averaging.") return for i in np.arange(self.parent.notebook.GetPageCount()): Page = self.parent.notebook.GetPage(i) j = filter(lambda x: x.isdigit(), Page.counter) if int(j) in PageNumbers: # Get all pages with the same model? if self.WXCheckMono.GetValue() == True: if (Page.modelid == referencePage.modelid and Page.IsCrossCorrelation == referencePage.IsCrossCorrelation): ## Check if the page has experimental data: # If there is an empty page somewhere, don't bother if Page.dataexpfull is not None: pages.append(Page) UsedPagenumbers.append(int(j)) else: if Page.IsCrossCorrelation == referencePage.IsCrossCorrelation: # If there is an empty page somewhere, don't bother if Page.dataexpfull is not None: pages.append(Page) UsedPagenumbers.append(int(j)) # If there are no pages in the list, exit gracefully if len(pages) <= 0: texterr_a = "At least one page with experimental data is\n"+\ "required for averaging. Please check the pages\n"+\ "that you selected for averaging." if self.WXCheckMono.GetValue() == True: texterr_a += " Note: You selected\n"+\ "to only use pages with same model as the first page." wx.MessageDialog(self, texterr_a, "Error", style=wx.ICON_ERROR|wx.OK|wx.STAY_ON_TOP) return # Now get all the experimental data explist = list() # Two components in case of Cross correlation tracetime = [np.array([]), np.array([])] tracerate = [np.array([]), np.array([])] TraceNumber = 0 TraceAvailable = False # turns True, if pages contain traces for page in pages: # experimental correlation curve # (at least 1d, because it might be None) explist.append(np.atleast_1d(1*page.dataexpfull)) # trace # We will put together a trace from all possible traces # Stitch together all the traces. if page.IsCrossCorrelation is False: trace = [page.trace] # trace has one element TraceNumber = 1 else: trace = page.tracecc # trace has two elements TraceNumber = 2 if trace is not None and trace[0] is not None: TraceAvailable = True # Works with one or two traces. j = 0 or 1. for j in np.arange(TraceNumber): if len(tracetime[j]) != 0: # append to the trace oldend = tracetime[j][-1] newtracetime = 1.*trace[j][:,0] newtracetime = newtracetime + oldend tracetime[j] = np.append(tracetime[j], newtracetime) del newtracetime tracerate[j] = np.append(tracerate[j], trace[j][:,1]) else: # Initiate the trace tracetime[j] = 1.*trace[j][:,0] tracerate[j] = 1.*trace[j][:,1] # Now check if the length of the correlation arrays are the same: len0 = len(explist[0]) for item in explist[1:]: if len(item) != len0: # print an error message wx.MessageDialog(self, "Averaging over curves with different lengths is not"+\ "\nsupported. When measuring, please make sure that"+\ "\nthe measurement time for all curves is the same.", "Error", style=wx.ICON_ERROR|wx.OK|wx.STAY_ON_TOP) return # Now shorten the trace, because we want as little memory usage as # possible. I used this algorithm in read_FCS_Confocor3.py as well. 
newtraces = list() if TraceAvailable is True: for j in np.arange(TraceNumber): tracej = np.zeros((len(tracetime[j]),2)) tracej[:,0] = tracetime[j] tracej[:,1] = tracerate[j] if len(tracej) >= 500: # We want about 500 bins # We need to sum over intervals of length *teiler* teiler = int(len(tracej)/500) newlength = len(tracej)/teiler newsignal = np.zeros(newlength) # Simultaneously sum over all intervals for k in np.arange(teiler): newsignal = \ newsignal+tracej[k:newlength*teiler:teiler][:,1] newsignal = 1.* newsignal / teiler newtimes = tracej[teiler-1:newlength*teiler:teiler][:,0] if len(tracej)%teiler != 0: # We have a rest signal # We average it and add it to the trace rest = tracej[newlength*teiler:][:,1] lrest = len(rest) rest = np.array([sum(rest)/lrest]) newsignal = np.concatenate((newsignal, rest), axis=0) timerest = np.array([tracej[-1][0]]) newtimes = np.concatenate((newtimes, timerest), axis=0) newtrace=np.zeros((len(newtimes),2)) newtrace[:,0] = newtimes newtrace[:,1] = newsignal else: # Declare newtrace - # otherwise we have a problem down three lines ;) newtrace = tracej newtraces.append(newtrace) else: newtraces=[None,None] # Everything is cleared for averaging exparray = np.array(explist) averagedata = exparray.sum(axis=0)[:,1]/len(exparray) # Create a copy from the first page average = 1*exparray[0] # Set average data average[:,1] = averagedata # create new page self.IsCrossCorrelation = self.Page.IsCrossCorrelation interval = (self.Page.startcrop, self.Page.endcrop) # Obtain the model ID from the dropdown selection. idsel = self.WXDropSelMod.GetSelection() modelid = self.DropdownIndex[idsel] self.parent.add_fitting_tab(modelid = modelid) self.AvgPage = self.parent.notebook.GetCurrentPage() (self.AvgPage.startcrop, self.AvgPage.endcrop) = interval self.AvgPage.dataexpfull = average self.AvgPage.IsCrossCorrelation = self.IsCrossCorrelation if self.IsCrossCorrelation is False: newtrace = newtraces[0] if newtrace is not None and len(newtrace) != 0: self.AvgPage.trace = newtrace self.AvgPage.traceavg = newtrace.mean() else: self.AvgPage.trace = None self.AvgPage.traceavg = None else: if newtraces[0] is not None and len(newtraces[0][0]) != 0: self.AvgPage.tracecc = newtraces else: self.AvgPage.tracecc = None self.AvgPage.PlotAll() self.AvgPage.Fit_enable_fitting() if len(pages) == 1: # Use the same title as the first page newtabti = referencePage.tabtitle.GetValue() else: # Create a new tab title newtabti = "Average ["+misc.parsePagenum2String(UsedPagenumbers)+"]" self.AvgPage.tabtitle.SetValue(newtabti) # Set the addition information about the variance from averaging Listname = "Average" standarddev = exparray.std(axis=0)[:,1] if np.sum(np.abs(standarddev)) == 0: # The average sd is zero. We probably made an average # from only one page. In this case we do not enable # average weighted fitting pass else: self.AvgPage.external_std_weights[Listname] = standarddev WeightKinds = self.AvgPage.Fitbox[1].GetItems() # Attention! Average weights and other external weights should # be sorted (for session saving). extTypes = self.AvgPage.external_std_weights.keys() extTypes.sort() # sorting for key in extTypes: try: WeightKinds.remove(key) except: pass LenInternal = len(WeightKinds) IndexAverag = extTypes.index(Listname) IndexInList = LenInternal + IndexAverag for key in extTypes: WeightKinds += [key] self.AvgPage.Fitbox[1].SetItems(WeightKinds) self.AvgPage.Fitbox[1].SetSelection(IndexInList) # Keep the average tool open. 
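        # (Averaging sketch: with the selected curves stacked into an array
        #  of shape (n, len, 2), the average and its standard deviation as
        #  computed above reduce to
        #
        #      averagedata = exparray[:, :, 1].mean(axis=0)
        #      standarddev = exparray[:, :, 1].std(axis=0)
        #
        #  which is what enables the "Average" weighted-fitting entry.)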
# self.OnClose() def SetPageNumbers(self, pagestring): self.WXTextPages.SetValue(pagestring) def SetValues(self, e=None): # Text input pagenumlist = list() for i in np.arange(self.parent.notebook.GetPageCount()): Page = self.parent.notebook.GetPage(i) pagenumlist.append(int(filter(lambda x: x.isdigit(), Page.counter))) valstring=misc.parsePagenum2String(pagenumlist) self.WXTextPages.SetValue(valstring) # Dropdown modelkeys = mdls.modeltypes.keys() modelkeys.sort() try: current_model = self.parent.notebook.GetCurrentPage().modelid except: current_model = -1 i = 0 DropdownList = list() self.DropdownIndex = list() # Contains model ids with same index current_index = 0 for modeltype in modelkeys: for modelid in mdls.modeltypes[modeltype]: DropdownList.append(modeltype+": "+mdls.modeldict[modelid][1]) self.DropdownIndex.append(str(modelid)) if str(current_model) == str(modelid): current_index = i i+=1 self.WXDropSelMod.SetItems(DropdownList) self.WXDropSelMod.SetSelection(current_index) pycorrfit-0.8.1/src/tools/batchcontrol.py0000644000175000017500000001602412262516600017257 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module tools - batch Stuff that concerns batch processing. Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . 
""" import numpy as np import wx import openfile as opf # How to treat an opened file import models as mdls # Menu entry name MENUINFO = ["B&atch control", "Batch fitting."] class BatchCtrl(wx.Frame): def __init__(self, parent): # Parent is main frame self.parent = parent # Get the window positioning correctly pos = self.parent.GetPosition() pos = (pos[0]+100, pos[1]+100) wx.Frame.__init__(self, parent=parent, title="Batch control", pos=pos, style=wx.DEFAULT_FRAME_STYLE|wx.FRAME_FLOAT_ON_PARENT) ## MYID # This ID is given by the parent for an instance of this class self.MyID = None ## Controls panel = wx.Panel(self) self.panel = panel text1 = wx.StaticText(panel, label="Choose source of parameters:") self.rbtnhere = wx.RadioButton(panel, -1, 'This session', style = wx.RB_GROUP) self.rbtnhere.SetValue(True) self.rbtnthere = wx.RadioButton(panel, -1, 'Other session') self.dropdown = wx.ComboBox(panel, -1, "Current page", (15, 30), wx.DefaultSize, [], wx.CB_DROPDOWN|wx.CB_READONLY) # Create the dropdownlist self.OnPageChanged() text2 = wx.StaticText(panel, label='This will affect all pages'+ '\nwith the same model.'+ '\nApply parameters:') btnapply = wx.Button(panel, wx.ID_ANY, 'Apply to applicable pages') btnfit = wx.Button(panel, wx.ID_ANY, 'Fit applicable pages') # Bindings self.Bind(wx.EVT_BUTTON, self.OnApply, btnapply) self.Bind(wx.EVT_BUTTON, self.OnFit, btnfit) self.Bind(wx.EVT_RADIOBUTTON, self.OnRadioHere, self.rbtnhere) self.Bind(wx.EVT_RADIOBUTTON, self.OnRadioThere, self.rbtnthere) # self.Bind(wx.EVT_COMBOBOX, self.OnSelect, self.dropdown) topSizer = wx.BoxSizer(wx.VERTICAL) topSizer.Add(text1) topSizer.Add(self.rbtnhere) topSizer.Add(self.rbtnthere) topSizer.AddSpacer(5) topSizer.Add(self.dropdown) topSizer.AddSpacer(5) topSizer.Add(text2) topSizer.AddSpacer(5) topSizer.Add(btnapply) topSizer.Add(btnfit) panel.SetSizer(topSizer) topSizer.Fit(self) self.SetMinSize(topSizer.GetMinSizeTuple()) # Check if we even have pages. 
self.OnPageChanged() #Icon if parent.MainIcon is not None: wx.Frame.SetIcon(self, parent.MainIcon) self.Show(True) def OnApply(self, event): # Get the item from the dropdown list item = self.dropdown.GetSelection() if self.rbtnhere.Value == True: # Get parameters from this session if item <= 0: Page = self.parent.notebook.GetCurrentPage() else: Page = self.parent.notebook.GetPage(item-1) # First apply the parameters of the page Page.apply_parameters() # Get all parameters Parms = self.parent.PackParameters(Page) else: # Get Parameters from different session Parms = self.YamlParms[item] modelid = Parms[1] # Set all parameters for all pages for i in np.arange(self.parent.notebook.GetPageCount()): OtherPage = self.parent.notebook.GetPage(i) if OtherPage.modelid == modelid and OtherPage.dataexp is not None: self.parent.UnpackParameters(Parms, OtherPage) OtherPage.PlotAll() def OnClose(self, event=None): self.parent.toolmenu.Check(self.MyID, False) self.parent.ToolsOpen.__delitem__(self.MyID) self.Destroy() def OnFit(self, event): item = self.dropdown.GetSelection() if self.rbtnhere.Value == True: if item <= 0: Page = self.parent.notebook.GetCurrentPage() else: Page = self.parent.notebook.GetPage(item) # Get internal ID modelid = Page.modelid else: # Get external ID modelid = self.YamlParms[item][1] # Fit all pages with right modelid for i in np.arange(self.parent.notebook.GetPageCount()): OtherPage = self.parent.notebook.GetPage(i) if (OtherPage.modelid == modelid and OtherPage.dataexpfull is not None): #Fit OtherPage.Fit_function(noplots=True) def OnPageChanged(self, Page=None): if self.parent.notebook.GetPageCount() == 0: self.panel.Disable() return else: self.panel.Enable() # We need to update the list of Pages in self.dropdown if self.rbtnhere.Value == True: DDlist = list() DDlist.append("Current page") for i in np.arange(self.parent.notebook.GetPageCount()): aPage = self.parent.notebook.GetPage(i) DDlist.append(aPage.counter+aPage.model) self.dropdown.SetItems(DDlist) self.dropdown.SetSelection(0) def OnRadioHere(self, event=None): self.OnPageChanged() def OnRadioThere(self, event=None): # If user clicks on pages in main program, we do not want the list # to be changed. self.YamlParms, dirname, filename = \ opf.ImportParametersYaml(self.parent, self.parent.dirname) if filename == None: # User did not select any sesion file self.rbtnhere.SetValue(True) else: DDlist = list() for i in np.arange(len(self.YamlParms)): # Rebuild the list modelid = self.YamlParms[i][1] modelname = mdls.modeldict[modelid][1] DDlist.append(self.YamlParms[i][0]+modelname) self.dropdown.SetItems(DDlist) # Set selection text to first item self.dropdown.SetSelection(0) pycorrfit-0.8.1/src/tools/overlaycurves.py0000644000175000017500000003602512262516600017511 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module tools - selectcurves Let the user choose which correlation curves to use. Contains wrappers for file import and tools. Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. 
This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ from matplotlib import cm import numpy as np import platform import wx import wx.lib.plot as plot # Plotting in wxPython import edclasses import misc # Menu entry name MENUINFO = ["&Overlay curves", "Select experimental curves."] class Wrapper_OnImport(object): """ Wrapper for import function. parent: wx.Frame curvedict: dictionary with curves onselected: external function that is called with two arguments: *kept keys* and *unwanted keys* as lists referring to curvedict. selkeys: preselected values for curves in curvedict """ def __init__(self, parent, curvedict, onselected, selkeys=None, labels=None): self.onselected = onselected self.parent = parent self.Selector = UserSelectCurves(parent, curvedict, wrapper=self, selkeys=selkeys, labels=labels) self.Selector.Show() self.Selector.MakeModal(True) self.Selector.Bind(wx.EVT_CLOSE, self.OnClose) def OnClose(self, event=None): self.Selector.MakeModal(False) self.Selector.Destroy() def OnResults(self, keyskeep, keysrem): """ Here we will close (or disable?) pages that are not wanted by the user. It is important that we do not close pages that do not contain any experimental data (Page.dataeyp is None), because we ignored those pages during import. """ self.OnClose() self.onselected(keyskeep,keysrem) class Wrapper_Tools(object): def __init__(self, parent): """ Wrapper for tools menu. Gets curvedict from parent and starts curve selection. See *UserSelectCurves* class. """ # parent is the main frame of PyCorrFit self.parent = parent ## MYID # This ID is given by the parent for an instance of this class self.MyID = None ## Wrapping curvedict, labels = self.GetCurvedict() self.labels = labels self.Selector = UserSelectCurves(parent, curvedict, wrapper=self, labels=labels) # This is necessary for parent to deselect and select the tool # in the tools menu. self.Bind = self.Selector.Bind if self.parent.notebook.GetPageCount() == 0: self.Selector.sp.Disable() def Disable(self): self.Selector.Disable() def Enable(self, par=True): self.Selector.Enable(par) def GetCurvedict(self, e=None): curvedict = dict() labels = dict() N = self.parent.notebook.GetPageCount() for i in np.arange(N): Page = self.parent.notebook.GetPage(i) key = Page.counter if Page.dataexp is not None: curve = 1*Page.dataexp curve[:,1] *= Page.normfactor curvedict[key] = curve labels[key] = Page.tabtitle.GetValue() return curvedict, labels def OnClose(self, event=None): # This is a necessary function for PyCorrFit. # Do not change it. self.parent.toolmenu.Check(self.MyID, False) self.parent.ToolsOpen.__delitem__(self.MyID) self.Selector.Destroy() def OnPageChanged(self, page=None): # When parent changes # This is a necessary function for PyCorrFit. # This is stuff that should be done when the active page # of the notebook changes. 
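        # (curvedict sketch, assumed from GetCurvedict above: keys are page
        #  counters, values are normalized experimental curves, e.g.
        #
        #      curvedict["#1:"] = np.array([[1e-3, 0.97], [2e-3, 0.95]])
        #      labels["#1:"] = "tab title"
        #  )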
if self.parent.notebook.GetPageCount() == 0: self.Selector.SelectBox.SetItems([]) self.Selector.sp.Disable() else: # Sticky behavior cleaned up in 0.7.8 curvedict, labels = self.GetCurvedict() self.Selector.curvedict = curvedict self.Selector.labels = labels self.Selector.ProcessDict() self.labels = labels self.Selector.SelectBox.SetItems(self.Selector.curvelabels) for i in np.arange(len(self.Selector.curvekeys)): self.Selector.SelectBox.SetSelection(i) self.Selector.OnUpdatePlot() def OnResults(self, keyskeep, keysrem): """ Here we will close (or disable?) pages that are not wanted by the user. It is important that we do not close pages that do not contain any experimental data (Page.dataeyp is None), because we ignored those pages during import. """ if len(keysrem) == 0: self.OnClose() return # warn the user! # First make a list of all pages that need to be removed and then # delete those pages. overtext = "Keep only pages in this list?" textlist = "" for key in keyskeep: textlist += "- "+key+" "+self.labels[key]+"\n" dlg = edclasses.MyScrolledDialog(self.parent, overtext, textlist, "Warning") if dlg.ShowModal() == wx.ID_OK: N = self.parent.notebook.GetPageCount() pagerem = list() for i in np.arange(N): Page = self.parent.notebook.GetPage(i) key = Page.counter if keysrem.count(key) == 1: pagerem.append(Page) for Page in pagerem: j = self.parent.notebook.GetPageIndex(Page) self.parent.notebook.DeletePage(j) dlg.Destroy() self.OnPageChanged() def OnSelectionChanged(self, keylist): if len(keylist) == 0: return # integer type list with page number pagelist = list() N = self.parent.notebook.GetPageCount() for i in np.arange(N): Page = self.parent.notebook.GetPage(i) key = Page.counter if keylist.count(key) == 1: pagelist.append(int(key.strip("#: "))) # Get open tools toolkeys = self.parent.ToolsOpen.keys() if len(toolkeys) == 0: return # Fill string = misc.parsePagenum2String(pagelist) for key in toolkeys: tool = self.parent.ToolsOpen[key] try: tool.SetPageNumbers(string) except: # tool does not have this function and hence does not # need numbers. pass class UserSelectCurves(wx.Frame): # This tool is derived from a wx.frame. def __init__(self, parent, curvedict, wrapper=None, selkeys=None, labels=None): """ *curvedict* is a dictionary that contains the curves. Keys serve as identifiers in the curve selection. e.g. curvelist["#1:"] = np.array[ np.array[0.0,1], np.array[0.0,.971] ...] *parent* is the main frame *wrapper* is the object to which the chosen keys are given back. If it is not None, it must provide a function *OnResults*, accepting a list of keys as an argument. *selkeys* items in the list *curvedict* that are preelected. *labels* dictionary with same keys as *curvelist* - labels of the entries in the list. If none, the keys of *curvedict* will be used. 
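    Usage sketch (with a hypothetical wrapper object):

        curvedict = {"#1:": curve1, "#2:": curve2}
        UserSelectCurves(parent, curvedict, wrapper=mywrapper)

    where *mywrapper* provides OnResults(keyskeep, keysrem) and
    OnClose().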
""" # parent is the main frame of PyCorrFit self.parent = parent self.wrapper = wrapper self.curvedict = curvedict self.selkeys = selkeys self.labels = labels # can be None self.curvelabels = None # filled by self.ProcessDict() if self.selkeys is not None: newselkeys = list() for item in self.selkeys: newselkeys.append(str(item)) self.selkeys = newselkeys # Get the window positioning correctly pos = self.parent.GetPosition() pos = (pos[0]+100, pos[1]+100) wx.Frame.__init__(self, parent=self.parent, title="Curve selection", pos=pos, style=wx.DEFAULT_FRAME_STYLE|wx.FRAME_FLOAT_ON_PARENT, size=(800,500)) ## Pre-process self.ProcessDict() ## Content self.sp = wx.SplitterWindow(self, size=(500,500), style=wx.SP_NOBORDER) self.sp.SetMinimumPaneSize(1) # Top panel panel_top = wx.Panel(self.sp, size=(500,200)) self.upperSizer = wx.BoxSizer(wx.VERTICAL) if platform.system().lower() == 'darwin': ctrl = "Apple" else: ctrl = "Ctrl" text = "Select the curves to keep. \n" +\ "By holding down the '"+ctrl+"' key, single curves can be \n" +\ "selected or deselected. The 'Shift' key can be used \n" +\ "to select groups." self.upperSizer.Add(wx.StaticText(panel_top, label=text)) # Bottom Panel self.bottom_sp = wx.SplitterWindow(self.sp, size=(500,300), style=wx.SP_NOBORDER) self.bottom_sp.SetMinimumPaneSize(1) sizepanelx = 250 panel_bottom = wx.Panel(self.bottom_sp, size=(sizepanelx,300)) self.boxSizer = wx.BoxSizer(wx.VERTICAL) # Box selection style = wx.LB_EXTENDED self.SelectBox = wx.ListBox(panel_bottom, size=(sizepanelx,300), style=style, choices=self.curvelabels) for i in np.arange(len(self.curvekeys)): self.SelectBox.SetSelection(i) # Deselect keys that are not in self.selkeys if self.selkeys is not None: for i in np.arange(len(self.curvekeys)): if self.selkeys.count(self.curvekeys[i]) == 0: self.SelectBox.Deselect(i) self.Bind(wx.EVT_LISTBOX, self.OnUpdatePlot, self.SelectBox) self.boxSizer.Add(self.SelectBox) # Button APPLY btnok = wx.Button(panel_bottom, wx.ID_ANY, 'Apply') self.Bind(wx.EVT_BUTTON, self.OnPushResults, btnok) self.boxSizer.Add(btnok) # Button CANCEL btncancel = wx.Button(panel_bottom, wx.ID_ANY, 'Cancel') self.Bind(wx.EVT_BUTTON, self.OnCancel, btncancel) self.boxSizer.Add(btncancel) # Finish off sizers panel_top.SetSizer(self.upperSizer) panel_bottom.SetSizer(self.boxSizer) self.upperSizer.Fit(panel_top) self.boxSizer.Fit(panel_bottom) minsize = np.array(self.boxSizer.GetMinSizeTuple()) +\ np.array(self.upperSizer.GetMinSizeTuple()) +\ np.array((300,30)) self.SetMinSize(minsize) #self.SetSize(minsize) #self.SetMaxSize((9999, self.boxSizer.GetMinSizeTuple()[1])) # Canvas self.canvas = plot.PlotCanvas(self.bottom_sp) self.canvas.setLogScale((True, False)) self.canvas.SetEnableZoom(True) # Splitter window self.bottom_sp.SplitVertically(panel_bottom, self.canvas, sizepanelx) sizetoppanel = self.upperSizer.GetMinSizeTuple()[1] self.sp.SplitHorizontally(panel_top, self.bottom_sp, sizetoppanel) self.OnUpdatePlot() # Icon if parent.MainIcon is not None: wx.Frame.SetIcon(self, parent.MainIcon) self.Show(True) def ProcessDict(self, e=None): # Define the order of keys used. 
# We want to sort the keys, such that #10: is not before #1: self.curvekeys = self.curvedict.keys() # Sorting key function applied to each key before sorting: page_num = lambda counter: int(counter.strip().strip(":").strip("#")) try: for item in self.curvekeys: page_num(item) except: fstr = lambda x: x else: fstr = page_num self.curvekeys.sort(key = fstr) if self.labels is None: self.curvelabels = self.curvekeys else: # Use given labels instead of curvekeys. self.curvelabels = list() for key in self.curvekeys: self.curvelabels.append("#"+str(key).strip(":# ")+" "+self.labels[key]) def OnCancel(self, e=None): """ Close the tool """ self.wrapper.OnClose() def OnPushResults(self, e=None): # Get keys from selection keyskeep = list() for i in self.SelectBox.GetSelections(): keyskeep.append(self.curvekeys[i]) keysrem = list() for key in self.curvekeys: if keyskeep.count(key) == 0: keysrem.append(key) self.wrapper.OnResults(keyskeep, keysrem) def OnUpdatePlot(self, e=None): """ What should happen when the selection in *self.SelectBox* is changed? This function will alsy try to call the function *self.parent.OnSelectionChanged* and hand over the list of currently selected curves. This is an addon for 0.7.8 where we will control the page selection in the average tool. """ # Get selected curves curves = list() legends = list() selection = self.SelectBox.GetSelections() for i in selection: curves.append(self.curvedict[self.curvekeys[i]]) legends.append(self.curvekeys[i]) # Set color map cmap = cm.get_cmap("gist_rainbow") # Clear Plot self.canvas.Clear() # Draw Plot lines = list() for i in np.arange(len(curves)): color = cmap(1.*i/(len(curves)), bytes=True) color = wx.Colour(color[0], color[1], color[2]) line = plot.PolyLine(curves[i], legend=legends[i], colour=color, width=1) lines.append(line) self.canvas.SetEnableLegend(True) if len(curves) != 0: self.canvas.Draw(plot.PlotGraphics(lines, xLabel=u'lag time τ [s]', yLabel=u'G(τ)')) ## This is an addon for 0.7.8 keyskeep = list() for i in self.SelectBox.GetSelections(): keyskeep.append(self.curvekeys[i]) try: self.wrapper.OnSelectionChanged(keyskeep) except: pass pycorrfit-0.8.1/src/tools/__init__.py0000644000175000017500000000665112262516600016341 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module tools This file contains useful tools, such as dialog boxes and other stuff, that we need in PyCorrFit. Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ # This file is necessary for this folder to become a module that can be # imported by PyCorrFit or other people. import importlib import numpy as np # NumPy import sys ## On Windows XP I had problems with the unicode Characters. 
# I found this at # http://stackoverflow.com/questions/5419/python-unicode-and-the-windows-console # and it helped: reload(sys) sys.setdefaultencoding('utf-8') import datarange import background import overlaycurves import batchcontrol import globalfit import average import simulation import info import statistics import trace # Load all of the classes # This also defines the order of the tools in the menu ImpA = [ ["datarange", "SelectChannels"], ["overlaycurves", "Wrapper_Tools"], ["batchcontrol", "BatchCtrl"], ["globalfit", "GlobalFit"], ["average", "Average"], ["background", "BackgroundCorrection"] ] ImpB = [ ["trace", "ShowTrace"], ["statistics", "Stat"], ["info", "ShowInfo"], ["simulation", "Slide"] ] ModuleActive = list() ToolsActive = list() for i in np.arange(len(ImpA)): # We have to add "tools." because this is a relative import ModuleActive.append(__import__(ImpA[i][0], globals(), locals(), [ImpA[i][1]], -1)) ToolsActive.append(getattr(ModuleActive[i], ImpA[i][1])) ModulePassive = list() ToolsPassive = list() for i in np.arange(len(ImpB)): ModulePassive.append(__import__(ImpB[i][0], globals(), locals(), [ImpB[i][1]], -1)) ToolsPassive.append(getattr(ModulePassive[i], ImpB[i][1])) #ModulePassive.append(importlib.import_module("tools."+ImpB[i][0])) #ToolsPassive.append(getattr(ModulePassive[i], ImpB[i][1])) # This is in the file menu and not needed in the dictionaries below. from chooseimport import ChooseImportTypes from chooseimport import ChooseImportTypesModel from comment import EditComment # the "special" tool RangeSelector from parmrange import RangeSelector ToolDict = dict() ToolDict["A"] = ToolsActive ToolDict["P"] = ToolsPassive # Make the same for Menu Names in Tools NameActive = list() for i in np.arange(len(ImpA)): NameActive.append(ModuleActive[i].MENUINFO) NamePassive = list() for i in np.arange(len(ImpB)): NamePassive.append(ModulePassive[i].MENUINFO) ToolName = dict() ToolName["A"] = NameActive ToolName["P"] = NamePassive pycorrfit-0.8.1/src/tools/plotexport.py0000644000175000017500000000547512262516600017025 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module tools - plotexport Let the user create nice plots of our data. --currently not used Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import wx class Tool(wx.Frame): # This tool is derived from a wx.frame. 
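    # (Interface sketch, assumed from the comments in this template: a tool
    #  stores the parent-assigned self.MyID, implements OnClose() to untick
    #  its menu entry and remove itself from parent.ToolsOpen, and
    #  OnPageChanged(page) to track the active notebook page.)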
def __init__(self, parent): # parent is the main frame of PyCorrFit self.parent = parent # Get the window positioning correctly pos = self.parent.GetPosition() pos = (pos[0]+100, pos[1]+100) wx.Frame.__init__(self, parent=self.parent, title="Example Tool", pos=pos, style=wx.DEFAULT_FRAME_STYLE|wx.FRAME_FLOAT_ON_PARENT) ## MYID # This ID is given by the parent for an instance of this class self.MyID = None # Page - the currently active page of the notebook. self.Page = self.parent.notebook.GetCurrentPage() ## Content self.panel = wx.Panel(self) btnexample = wx.Button(self.panel, wx.ID_ANY, 'Example button') # Binds the button to the function - close the tool self.Bind(wx.EVT_BUTTON, self.OnClose, btnexample) self.topSizer = wx.BoxSizer(wx.VERTICAL) self.topSizer.Add(btnexample) self.panel.SetSizer(self.topSizer) self.topSizer.Fit(self) self.SetMinSize(self.topSizer.GetMinSizeTuple()) #Icon if parent.MainIcon is not None: wx.Frame.SetIcon(self, parent.MainIcon) self.Show(True) def OnClose(self, event=None): # This is a necessary function for PyCorrFit. # Do not change it. self.parent.toolmenu.Check(self.MyID, False) self.parent.ToolsOpen.__delitem__(self.MyID) self.Destroy() def OnPageChanged(self, page): # When parent changes # This is a necessary function for PyCorrFit. # This is stuff that should be done when the active page # of the notebook changes. self.Page = page pycorrfit-0.8.1/src/tools/chooseimport.py0000644000175000017500000002464412262516600017317 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module tools - chooseimport Displays a window that lets the user choose what type of data (AC1, AC2, CC12, CC21) he wants to import. Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import numpy as np import wx import models as mdls import doc import overlaycurves class ChooseImportTypes(wx.Dialog): """ This class is used for importing single files from the "Current" menu. The model function is defined by the model that is in use. """ # This tool is derived from a wx.Dialog. def __init__(self, parent, curvedict): # parent is the main frame of PyCorrFit self.parent = parent # init #super(ChooseImportTypes, self).__init__(parent=parent, # title="Choose types", size=(250, 200)) wx.Dialog.__init__(self, parent, -1, "Choose models") self.keys = list() ## Content self.panel = wx.Panel(self) self.sizer = wx.BoxSizer(wx.VERTICAL) self.boxes = dict() # For the selection of types to import when doing import Data chooseimport = ("Several types of data were found in\n"+ "the chosen file. 
Please select what\n"+ "type(s) you would like to import.") textinit = wx.StaticText(self.panel, label=chooseimport) self.sizer.Add(textinit) thekeys = curvedict.keys() thekeys.sort() for key in thekeys: label = key + " (" + str(len(curvedict[key])) + " curves)" check = wx.CheckBox(self.panel, label=label) self.boxes[key] = check self.sizer.Add(check) self.Bind(wx.EVT_CHECKBOX, self.OnSetkeys, check) btnok = wx.Button(self.panel, wx.ID_OK, 'OK') # Binds the button to the function - close the tool self.Bind(wx.EVT_BUTTON, self.OnClose, btnok) self.sizer.Add(btnok) self.panel.SetSizer(self.sizer) self.sizer.Fit(self) #Icon if parent.MainIcon is not None: wx.Dialog.SetIcon(self, parent.MainIcon) #self.Show(True) self.SetFocus() def OnClose(self, event=None): # This is a necessary function for PyCorrFit. # Do not change it. self.EndModal(wx.ID_OK) #self.Destroy() def OnSetkeys(self, event = None): self.keys = list() for key in self.boxes.keys(): if self.boxes[key].Value == True: self.keys.append(key) class ChooseImportTypesModel(wx.Dialog): """ This class shows a dialog displaying options to choose model function on import of data """ # This tool is derived from a wx.frame. def __init__(self, parent, curvedict, correlations, labels=None): """ curvedict - dictionary, contains indexes to correlations and labels. The keys are different types of curves correlations - list of correlations labels - list of labels for the correlations (e.g. filename+run) if none, index numbers will be used for labels """ # parent is the main frame of PyCorrFit self.parent = parent # init #super(ChooseImportTypesModel, self).__init__(parent=parent, # title="Choose types", size=(250, 200)) wx.Dialog.__init__(self, parent, -1, "Choose models") self.curvedict = curvedict self.kept_curvedict = curvedict.copy() # Can be edited by user self.correlations = correlations self.labels = labels # List of keys that will be imported by our *parent* self.typekeys = list() # Dictionary of modelids corresponding to indices in curvedict self.modelids = dict() ## Content self.panel = wx.Panel(self) self.sizer = wx.BoxSizer(wx.VERTICAL) self.boxes = dict() labelim = "Select a fitting model for each correlation channel (AC,CC)." 
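# Illustration (sketch, not in the original source): the checkbox dialog
# ChooseImportTypes above reports the ticked curve types via its *keys*
# attribute, e.g. for a curvedict with the types "AC1", "AC2" and "CC12":
#
#     >>> dlg = ChooseImportTypes(parent, curvedict)
#     >>> retval = dlg.ShowModal()      # user ticks "AC1" and "AC2"
#     >>> sorted(dlg.keys)
#     ['AC1', 'AC2']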
textinit = wx.StaticText(self.panel, label=labelim) self.sizer.Add(textinit) curvekeys = curvedict.keys() curvekeys.sort() self.curvekeys = curvekeys # Dropdown model selections: DropdownList = ["No model selected"] # Contains string in model dropdown self.DropdownIndex = [None] # Contains corresponsing model modelkeys = mdls.modeltypes.keys() modelkeys.sort() for modeltype in modelkeys: for modelid in mdls.modeltypes[modeltype]: DropdownList.append(modeltype+": "+mdls.modeldict[modelid][1]) self.DropdownIndex.append(modelid) self.ModelDropdown = dict() dropsizer = wx.FlexGridSizer(rows=len(modelkeys), cols=3, vgap=5, hgap=5) self.Buttons = list() i = 8000 for key in curvekeys: # Text with keys and numer of curves dropsizer.Add( wx.StaticText(self.panel, label=str(key)) ) label=" ("+str(len(curvedict[key]))+" curves)" button = wx.Button(self.panel, i, label) i += 1 self.Bind(wx.EVT_BUTTON, self.OnSelectCurves, button) self.Buttons.append(button) dropsizer.Add(button) # Model selection dropdown dropdown = wx.ComboBox(self.panel, -1, DropdownList[0], (15,30), wx.DefaultSize, DropdownList, wx.CB_DROPDOWN|wx.CB_READONLY) dropsizer.Add( dropdown ) self.ModelDropdown[key] = dropdown self.Bind(wx.EVT_COMBOBOX, self.OnSetkeys, dropdown) self.sizer.Add(dropsizer) btnok = wx.Button(self.panel, wx.ID_OK, 'OK') # Binds the button to the function - close the tool self.Bind(wx.EVT_BUTTON, self.OnClose, btnok) self.sizer.Add(btnok) self.panel.SetSizer(self.sizer) self.sizer.Fit(self) #self.Show(True) self.SetFocus() if parent.MainIcon is not None: wx.Dialog.SetIcon(self, parent.MainIcon) def OnClose(self, event=None): # This is a necessary function for PyCorrFit. # Do not change it. self.keepcurvesindex = list() for key in self.kept_curvedict.keys(): self.keepcurvesindex += self.kept_curvedict[key] for i in np.arange(len(self.keepcurvesindex)): self.keepcurvesindex[i] = int(self.keepcurvesindex[i]) self.EndModal(wx.ID_OK) #self.Show #self.Destroy() def OnSelectCurves(self, buttonevent): # Get the type of curves we want to look at index = buttonevent.GetId() - 8000 self.buttonindex = index key = self.curvekeys[index] # Get correlation curves for corresponding type corrcurves = dict() if self.labels is None: labeldict = None else: labeldict = dict() for i in self.curvedict[key]: corrcurves[str(i)] = self.correlations[int(i)] if self.labels is not None: labeldict[str(i)] = self.labels[int(i)] prev_selected = list() for item in self.kept_curvedict.keys(): prev_selected += self.kept_curvedict[item] overlaycurves.Wrapper_OnImport(self.parent, corrcurves, self.OnSelected, prev_selected, labels=labeldict) def OnSelected(self, keep, remove): # Set new button label for i in np.arange(len(keep)): keep[i] = int(keep[i]) #button = self.Buttons[self.buttonindex] label = " ("+str(len(keep))+" curves)" #button.SetLabel(label) # Add new content to selected key SelectedKey = self.curvekeys[self.buttonindex] #self.kept_curvedict[SelectedKey] = keep # If there are keys with the same amount of correlations, # these are assumed to be AC2, CC12, CC21 etc., so we will remove # items from them accordingly. 
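# Worked example (hypothetical indices, not in the original source) for
# the removal step described above: with
#     curvedict = {"AC1": [0, 1, 2], "AC2": [3, 4, 5],
#                  "CC12": [6, 7, 8], "CC21": [9, 10, 11]}
# keeping [0, 2] of "AC1" keeps the *positions* 0 and 2 of every key of
# equal length, so kept_curvedict becomes
#     {"AC1": [0, 2], "AC2": [3, 5], "CC12": [6, 8], "CC21": [9, 11]}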
diff = set(keep).intersection(set(self.curvedict[SelectedKey])) indexes = list() for i in np.arange(len(self.curvedict[SelectedKey])): for number in diff: if number == self.curvedict[SelectedKey][i]: indexes.append(i) for j in np.arange(len(self.curvekeys)): key = self.curvekeys[j] if len(self.curvedict[key]) == len(self.curvedict[SelectedKey]): newlist = list() for index in indexes: newlist.append(self.curvedict[key][index]) self.kept_curvedict[key] = newlist # Also update buttons button = self.Buttons[j] button.SetLabel(label) def OnSetkeys(self, event = None): # initiate objects self.typekeys = list() self.modelids = dict() # iterate through all given keys (AC1, AC2, CC12, etc.) for key in self.curvedict.keys(): # get the dropdown selection for a given key modelindex = self.ModelDropdown[key].GetSelection() # modelindex is -1 or 0, if no model has been chosen if modelindex > 0: # Append the key to a list of to be imported types self.typekeys.append(key) # Append the modelid to a dictionary that has indexes # belonging to the imported curves in *parent* modelid = self.DropdownIndex[modelindex] for index in self.curvedict[key]: # Set different model id for the curves self.modelids[index] = modelid self.typekeys.sort() pycorrfit-0.8.1/src/tools/example.py0000644000175000017500000000623212262516600016230 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module tools - example This is an example tool. You will need to edit __init__.py inside this folder to activate it. Add the filename (*example*) and class (*Tool*) to either of the lists *ImpA* or *ImpB* in __init__.py. Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import wx import numpy as np class Tool(wx.Frame): # This tool is derived from a wx.frame. def __init__(self, parent): # parent is the main frame of PyCorrFit self.parent = parent # Get the window positioning correctly pos = self.parent.GetPosition() pos = (pos[0]+100, pos[1]+100) wx.Frame.__init__(self, parent=self.parent, title="Example tool", pos=pos, style=wx.DEFAULT_FRAME_STYLE|wx.FRAME_FLOAT_ON_PARENT) ## MYID # This ID is given by the parent for an instance of this class self.MyID = None # Page - the currently active page of the notebook. 
self.Page = self.parent.notebook.GetCurrentPage() ## Content self.panel = wx.Panel(self) btncopy = wx.Button(self.panel, wx.ID_ANY, 'Example button') # Binds the button to the function - close the tool self.Bind(wx.EVT_BUTTON, self.OnClose, btncopy) self.topSizer = wx.BoxSizer(wx.VERTICAL) self.topSizer.Add(btncopy) self.panel.SetSizer(self.topSizer) self.topSizer.Fit(self) self.SetMinSize(self.topSizer.GetMinSizeTuple()) # Icon if parent.MainIcon is not None: wx.Frame.SetIcon(self, parent.MainIcon) self.Show(True) def OnClose(self, event=None): # This is a necessary function for PyCorrFit. # Do not change it. self.parent.toolmenu.Check(self.MyID, False) self.parent.ToolsOpen.__delitem__(self.MyID) self.Destroy() def OnPageChanged(self, page): # When parent changes # This is a necessary function for PyCorrFit. # This is stuff that should be done when the active page # of the notebook changes. if self.parent.notebook.GetPageCount() == 0: # Do something when there are no pages left. self.panel.Disable() return self.panel.Enable() self.Page = page pycorrfit-0.8.1/src/tools/parmrange.py0000644000175000017500000001424012262516600016547 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module tools - RangeSelector Select the range in which the parameter should reside for fitting. This is only the frontend. Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import wx import numpy as np import edclasses # edited floatspin import models as mdls class RangeSelector(wx.Frame): # This tool is derived from a wx.frame. def __init__(self, Page): # parent is the main frame of PyCorrFit self.parent = Page.parent # Get the window positioning correctly pos = self.parent.GetPosition() pos = (pos[0]+100, pos[1]+100) wx.Frame.__init__(self, parent=self.parent, title="Parameter Range", pos=pos, style=wx.DEFAULT_FRAME_STYLE|wx.FRAME_FLOAT_ON_PARENT) # Page - the currently active page of the notebook. 
self.Page = self.parent.notebook.GetCurrentPage() ## Content self.panel = wx.Panel(self) self.topSizer = wx.BoxSizer(wx.VERTICAL) self.WXboxsizerlist = list() self.WXparmlist = list() self.OnPageChanged(self.Page) #Icon if self.parent.MainIcon is not None: wx.Frame.SetIcon(self, self.parent.MainIcon) self.Show(True) def FillPanel(self): """ Fill the panel with parameters from the page """ self.parameter_range = np.zeros(self.Page.parameter_range.shape) labels, parmleft = mdls.GetHumanReadableParms(self.Page.modelid, self.Page.parameter_range[:,0]) labels, parmright = mdls.GetHumanReadableParms(self.Page.modelid, self.Page.parameter_range[:,1]) self.parameter_range[:,0] = np.array(parmleft) self.parameter_range[:,1] = np.array(parmright) # create line # = wx.BoxSizer(wx.VERTICAL) self.WXboxsizer = wx.FlexGridSizer(rows=len(labels), cols=4, vgap=5, hgap=5) for i in range(len(labels)): left = edclasses.FloatSpin(self.panel, digits=7, increment=.1) right = edclasses.FloatSpin(self.panel, digits=7, increment=.1) left.SetValue(self.parameter_range[i][0]) right.SetValue(self.parameter_range[i][1]) left.Bind(wx.EVT_SPINCTRL, self.OnSetParmRange) right.Bind(wx.EVT_SPINCTRL, self.OnSetParmRange) text = wx.StaticText(self.panel, label=u'< '+labels[i]) text2 = wx.StaticText(self.panel, label=u' <') self.WXboxsizer.Add(left) self.WXboxsizer.Add(text) self.WXboxsizer.Add(text2) self.WXboxsizer.Add(right) self.WXparmlist.append([left, [text, text2], right]) self.topSizer.Add(self.WXboxsizer) self.btnapply = wx.Button(self.panel, wx.ID_ANY, 'Apply') self.Bind(wx.EVT_BUTTON, self.OnSetParmRange, self.btnapply) self.topSizer.Add(self.btnapply) def OnClose(self, event=None): # This is a necessary function for PyCorrFit. # Do not change it. self.parent.RangeSelector = None self.Destroy() def OnPageChanged(self, page=None): # When parent changes # This is a necessary function for PyCorrFit. # This is stuff that should be done when the active page # of the notebook changes. self.Page = page if self.parent.notebook.GetPageCount() == 0: self.panel.Disable() return self.panel.Enable() try: self.btnapply.Destroy() except: pass for i in np.arange(len(self.WXparmlist)): self.WXparmlist[i][0].Destroy() #start self.WXparmlist[i][1][0].Destroy() #pname self.WXparmlist[i][1][1].Destroy() #pname self.WXparmlist[i][2].Destroy() #end del self.WXparmlist for i in np.arange(len(self.WXboxsizerlist)): self.WXboxsizer.Remove(0) self.WXboxsizerlist = list() self.WXparmlist = list() self.FillPanel() self.WXboxsizer.Layout() self.topSizer.Layout() self.SetMinSize(self.topSizer.GetMinSizeTuple()) self.topSizer.Fit(self) def OnSetParmRange(self, e): """ Called whenever something is edited in this frame. 
Writes back parameter ranges to the page """ # Read out parameters from all controls for i in range(len(self.WXparmlist)): self.parameter_range[i][0] = self.WXparmlist[i][0].GetValue() self.parameter_range[i][1] = self.WXparmlist[i][2].GetValue() if self.parameter_range[i][0] > self.parameter_range[i][1]: self.parameter_range[i][1] = 1.01*np.abs(self.parameter_range[i][0]) self.WXparmlist[i][2].SetValue(self.parameter_range[i][1]) # Set parameters l, parm0 = mdls.GetInternalFromHumanReadableParm(self.Page.modelid, self.parameter_range[:,0]) l, parm1 = mdls.GetInternalFromHumanReadableParm(self.Page.modelid, self.parameter_range[:,1]) self.Page.parameter_range[:,0] = np.array(parm0) self.Page.parameter_range[:,1] = np.array(parm1) #self.Page.PlotAll() pycorrfit-0.8.1/src/doc.py0000755000175000017500000001575412262516600014216 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module doc *doc* is the documentation. Functions for various text output point here. Dimensionless representation: unit of time : 1 ms unit of inverse time: 10³ /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import sys import csv import matplotlib # We do catch warnings about performing this before matplotlib.backends stuff #matplotlib.use('WXAgg') # Tells matplotlib to use WxWidgets import warnings with warnings.catch_warnings(): warnings.simplefilter("ignore") matplotlib.use('WXAgg') # Tells matplotlib to use WxWidgets for dialogs import numpy import os import platform import scipy # This is a fake class for modules not available. class Fake(object): def __init__(self): self.__version__ = "N/A" self.version = "N/A" try: import sympy except ImportError: print " Warning: module sympy not found!" sympy = Fake() try: import urllib2 except ImportError: print " Warning: module urllib not found!" urllib = Fake() try: import webbrowser except ImportError: print " Warning: module webbrowser not found!" 
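# A minimal sketch (hypothetical module name, not in the original file)
# of the optional-dependency pattern used here: when an import fails,
# the Fake() placeholder keeps later version queries from raising:
#
#     >>> try:
#     ...     import somemodule
#     ... except ImportError:
#     ...     somemodule = Fake()
#     >>> somemodule.__version__   # 'N/A' whenever the import failed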
webbrowser = Fake() import wx import yaml import readfiles def GetLocationOfChangeLog(filename = "ChangeLog.txt"): locations = list() fname1 = os.path.realpath(__file__) # Try one directory up dir1 = os.path.dirname(fname1)+"/../" locations.append(os.path.realpath(dir1)) # In case of distribution with .egg files (pip, easy_install) dir2 = os.path.dirname(fname1)+"/../pycorrfit_doc/" locations.append(os.path.realpath(dir2)) ## freezed binaries: if hasattr(sys, 'frozen'): try: dir2 = sys._MEIPASS + "/doc/" except: dir2 = "./" locations.append(os.path.realpath(dir2)) for loc in locations: thechl = os.path.join(loc,filename) if os.path.exists(thechl): return thechl break # if this does not work: return None def GetLocationOfDocumentation(filename = "PyCorrFit_doc.pdf"): """ Returns the location of the documentation if there is any.""" ## running from source locations = list() fname1 = os.path.realpath(__file__) # Documentation is usually one directory up dir1 = os.path.dirname(fname1)+"/../" locations.append(os.path.realpath(dir1)) # In case of distribution with .egg files (pip, easy_install) dir2 = os.path.dirname(fname1)+"/../pycorrfit_doc/" locations.append(os.path.realpath(dir2)) ## freezed binaries: if hasattr(sys, 'frozen'): try: dir2 = sys._MEIPASS + "/doc/" except: dir2 = "./" locations.append(os.path.realpath(dir2)) for loc in locations: thedoc = os.path.join(loc,filename) if os.path.exists(thedoc): return thedoc break # if this does not work: return None def info(version): """ Returns a little info about our program and what it can do. """ textwin = u""" Copyright 2011-2012 Paul Müller, Biotec - TU Dresden A versatile tool for fitting and analyzing correlation curves. Dimensionless representation: unit of time : 1 ms unit of inverse time: 1000 /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm^3 """ textlin = """ © 2011-2012 Paul Müller, Biotec - TU Dresden A versatile tool for fitting and analyzing correlation curves. Dimensionless representation: unit of time : 1 ms unit of inverse time: 1000 /s unit of distance : 100 nm unit of Diff.coeff : 10 µm²/s unit of inverse area: 100 /µm² unit of inv. volume : 1000 /µm³ """ if platform.system() != 'Linux': texta = textwin else: texta = textlin one = " PyCorrFit version "+version+"\n\n" two = "\n\n Supported file types:" for item in readfiles.Filetypes.keys(): if item.split("|")[0] != readfiles.Allsupfilesstring: two = two + "\n - "+item.split("|")[0] lizenz = "" for line in licence().splitlines(): lizenz += " "+line+"\n" return one + lizenz + texta + two def licence(): return """PyCorrFit is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation, either version 2 of the License, or (at your option) any later version. PyCorrFit is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . 
""" def SoftwareUsed(): """ Return some Information about the software used for this program """ text = "Python "+sys.version+\ "\n\nModules:"+\ "\n - csv "+csv.__version__+\ "\n - matplotlib "+matplotlib.__version__+\ "\n - NumPy "+numpy.__version__+\ "\n - os "+\ "\n - platform "+platform.__version__+\ "\n - SciPy "+scipy.__version__+\ "\n - sympy "+sympy.__version__ +\ "\n - sys "+\ "\n - tempfile" +\ "\n - urllib2 "+ urllib2.__version__ +\ "\n - webbrowser"+\ "\n - wxPython "+wx.__version__+\ "\n - yaml "+yaml.__version__ if hasattr(sys, 'frozen'): pyinst = "\n\nThis executable has been created using PyInstaller." text = text+pyinst return text # Standard homepage HomePage = "http://pycorrfit.craban.de/" # Changelog filename ChangeLog = "ChangeLog.txt" StaticChangeLog = GetLocationOfChangeLog(ChangeLog) # Check if we can extract the version try: clfile = open(StaticChangeLog, 'r') __version__ = clfile.readline().strip() clfile.close() except: __version__ = "0.0.0-unknown" # Github homepage GitChLog = "https://raw.github.com/paulmueller/PyCorrFit/master/ChangeLog.txt" GitHome = "https://github.com/paulmueller/PyCorrFit" GitWiki = "https://github.com/paulmueller/PyCorrFit/wiki" pycorrfit-0.8.1/src/misc.py0000644000175000017500000002311312262516600014365 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module misc Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import codecs from distutils.version import LooseVersion # For version checking import numpy as np import os import sys import tempfile import urllib2 import webbrowser import wx # GUI interface wxPython import wx.html import wx.lib.delayedresult as delayedresult import doc # Documentation/some texts # The icon file was created with # img2py -i -n Main PyCorrFit_icon.png icon.py import icon # Contains the program icon class UpdateDlg(wx.Frame): def __init__(self, parent, valuedict): description = valuedict["Description"] homepage = valuedict["Homepage"] githome = valuedict["Homepage_GIT"] changelog = valuedict["Changelog"] pos = parent.GetPosition() pos = (pos[0]+100, pos[1]+100) wx.Frame.__init__(self, parent, wx.ID_ANY, title="Update", size=(250,180), pos=pos) self.changelog = changelog # Fill html content html = wxHTML(self) string = '' +\ " PyCorrFit
" +\ "Your version: " + description[0]+"
" +\ "Latest version: " + description[1]+"
" +\ "(" + description[2]+")

" if len(homepage) != 0: string = string + 'Homepage
' if len(githome) != 0: string = string + 'Repository
' if len(changelog) != 0: string = string + \ 'Change Log' string = string+'

' html.SetPage(string) self.Bind(wx.EVT_CLOSE, self.Close) # Set window icon ico = getMainIcon() wx.Frame.SetIcon(self, ico) def Close(self, event): if len(self.changelog) != 0: # Cleanup downloaded file, if it was downloaded if self.changelog != doc.StaticChangeLog: os.remove(self.changelog) self.Destroy() class wxHTML(wx.html.HtmlWindow): def OnLinkClicked(parent, link): webbrowser.open(link.GetHref()) def parseString2Pagenum(parent, string, nodialog=False): """ Parse a string with a list of pagenumbers to an integer list with page numbers. e.g. "1-3,5,7" --> [1,2,3,5,7] parent is important """ listFull = string.split(",") PageNumbers = list() try: for item in listFull: pagerange = item.split("-") start = pagerange[0].strip() start = int(filter(type(start).isdigit, start)) end = pagerange[-1].strip() end = int(filter(type(end).isdigit, end)) for i in np.arange(end-start+1)+start: PageNumbers.append(i) PageNumbers.sort() return PageNumbers except: if nodialog is False: errstring = "Invalid syntax in page selection: "+string+\ ". Please use a comma separated list with"+\ " optional dashes, e.g. '1-3,6,8'." try: wx.MessageDialog(parent, errstring, "Error", style=wx.ICON_ERROR|wx.OK|wx.STAY_ON_TOP) except: raise ValueError(errstring) else: raise ValueError(errstring) return None def parsePagenum2String(pagenumlist): """ Make a string with dashes and commas from a list of pagenumbers. e.g. [1,2,3,5,7] --> "1-3,5,7" """ if len(pagenumlist) == 0: return "" # Make sure we have integers newlist = list() for num in pagenumlist: newlist.append(int(num)) newlist.sort() # begin string string = str(newlist[0]) # iteration through list: dash = False for i in np.arange(len(newlist)-1)+1: if dash == True: if newlist[i]-1 == newlist[i-1]: pass else: string += "-"+str(newlist[i-1])+", "+str(newlist[i]) dash = False else: if newlist[i]-1 == newlist[i-1]: if newlist[i]-2 == newlist[i-2]: dash = True elif len(newlist) != i+1 and newlist[i]+1 == newlist[i+1]: dash = True else: string += ", "+str(newlist[i]) dash = False else: dash = False string += ", "+str(newlist[i]) # Put final number if newlist[i] == newlist[-1]: if parseString2Pagenum(None, string)[-1] != newlist[i]: if dash == True: string += "-"+str(newlist[i]) else: string += ", "+str(newlist[i]) return string def removewrongUTF8(name): newname = u"" for char in name: try: codecs.decode(char, "UTF-8") except: pass else: newname += char return newname def getMainIcon(pxlength=32): """ *pxlength* is the side length in pixels of the icon """ # Set window icon iconBMP = icon.getMainBitmap() # scale image = wx.ImageFromBitmap(iconBMP) image = image.Scale(pxlength, pxlength, wx.IMAGE_QUALITY_HIGH) iconBMP = wx.BitmapFromImage(image) iconICO = wx.IconFromBitmap(iconBMP) return iconICO def findprogram(program): """ Uses the systems PATH variable find executables""" path = os.environ['PATH'] paths = path.split(os.pathsep) for d in paths: if os.path.isdir(d): fullpath = os.path.join(d, program) if sys.platform[:3] == 'win': for ext in '.exe', '.bat': program_path = fullpath + ext if os.path.isfile(fullpath + ext): return (1, program_path) else: if os.path.isfile(fullpath): return (1, fullpath) return (0, None) def Update(parent): """ This is a thread for _Update """ parent.StatusBar.SetStatusText("Connecting to server...") delayedresult.startWorker(_UpdateConsumer, _UpdateWorker, wargs=(parent,), cargs=(parent,)) def _UpdateConsumer(delayedresult, parent): results = delayedresult.get() dlg = UpdateDlg(parent, results) dlg.Show() 
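# Illustration (sketch, not part of the original source): the two page
# selection helpers defined above invert each other:
#
#     >>> parseString2Pagenum(None, "1-3,5,7")
#     [1, 2, 3, 5, 7]
#     >>> parsePagenum2String([1, 2, 3, 5, 7])
#     '1-3, 5, 7'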
parent.StatusBar.SetStatusText("...update status: "+results["Description"][2]) def _UpdateWorker(parent): changelog = "" hpversion = None # I created this TXT record to keep track of the current web presence. try: urlopener = urllib2.urlopen(doc.HomePage, timeout=2) homepage = urlopener.geturl() except: homepage = doc.HomePage try: urlopener2 = urllib2.urlopen(doc.GitHome, timeout=2) githome = urlopener2.geturl() except: githome = "" # Find the changelog file try: responseCL = urllib2.urlopen(homepage+doc.ChangeLog, timeout=2) except: CLfile = doc.GitChLog else: fileresponse = responseCL.read() CLlines = fileresponse.splitlines() # We have a transition between ChangeLog.txt on the homepage # containing the actual changelog or containing a link to # the ChangeLog file. if len(CLlines) == 1: CLfile = CLlines[0] else: hpversion = CLlines[0] CLfile = doc.GitChLog # Continue version comparison if True continuecomp = False try: responseVer = urllib2.urlopen(CLfile, timeout=2) except: if hpversion == None: newversion = "unknown" action = "cannot connect to server" else: newversion = hpversion continuecomp = True else: continuecomp = True changelog = responseVer.read() newversion = changelog.splitlines()[0] if continuecomp: new = LooseVersion(newversion) old = LooseVersion(parent.version) if new > old: action = "update available" elif new < old: action = "whoop you rock!" else: action = "state of the art" description = [parent.version, newversion, action] if len(changelog) != 0: changelogfile = tempfile.mktemp()+"_PyCorrFit_ChangeLog"+".txt" clfile = open(changelogfile, 'wb') clfile.write(changelog) clfile.close() else: changelogfile=doc.StaticChangeLog results = dict() results["Description"] = description results["Homepage"] = homepage results["Homepage_GIT"] = githome results["Changelog"] = changelogfile return results pycorrfit-0.8.1/src/edclasses.py0000644000175000017500000002022012262516600015374 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit EditedClasses Contains classes that we edited. Should make our classes more useful. Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . 
""" # Matplotlib plotting capabilities import matplotlib # We do catch warnings about performing this before matplotlib.backends stuff #matplotlib.use('WXAgg') # Tells matplotlib to use WxWidgets import warnings with warnings.catch_warnings(): warnings.simplefilter("ignore") matplotlib.use('WXAgg') # Tells matplotlib to use WxWidgets for dialogs # We will hack this toolbar here from matplotlib.backends.backend_wx import NavigationToolbar2Wx import numpy as np import sys import traceback from wx.lib.agw import floatspin # Float numbers in spin fields import wx class FloatSpin(floatspin.FloatSpin): def __init__(self, parent, digits=10, increment=.01): floatspin.FloatSpin.__init__(self, parent, digits=digits, increment = increment) self.Bind(wx.EVT_SPINCTRL, self.increment) #self.Bind(wx.EVT_SPIN, self.increment) #self.increment() def increment(self, event=None): # Find significant digit # and use it as the new increment x = self.GetValue() if x == 0: incre = 0.1 else: digit = int(np.ceil(np.log10(abs(x)))) - 2 incre = 10**digit self.SetIncrement(incre) class ChoicesDialog(wx.Dialog): def __init__(self, parent, dropdownlist, title, text): # parent is main frame self.parent = parent #super(ChoicesDialog, self).__init__(parent=parent, # title=title) wx.Dialog.__init__(self, parent, -1, title) ## Controls panel = wx.Panel(self) # text1 textopen = wx.StaticText(panel, label=text) btnok = wx.Button(panel, wx.ID_OK) btnabort = wx.Button(panel, wx.ID_CANCEL) # Dropdown self.dropdown = wx.ComboBox(panel, -1, "", (15, 30), wx.DefaultSize, dropdownlist, wx.CB_DROPDOWN|wx.CB_READONLY) self.dropdown.SetSelection(0) # Bindings self.Bind(wx.EVT_BUTTON, self.OnOK, btnok) self.Bind(wx.EVT_BUTTON, self.OnAbort, btnabort) # Sizers topSizer = wx.BoxSizer(wx.VERTICAL) topSizer.Add(textopen) topSizer.Add(self.dropdown) btnSizer = wx.BoxSizer(wx.HORIZONTAL) btnSizer.Add(btnok) btnSizer.Add(btnabort) topSizer.Add(btnSizer) panel.SetSizer(topSizer) topSizer.Fit(self) #self.Show(True) self.SetFocus() def OnOK(self, event=None): self.SelcetedID = self.dropdown.GetSelection() self.EndModal(wx.ID_OK) def OnAbort(self, event=None): self.EndModal(wx.ID_CANCEL) def save_figure(self, evt=None): """ A substitude function for save in: matplotlib.backends.backend_wx.NavigationToolbar2Wx We want to be able to give parameters such as dirname and filename. """ try: parent=self.canvas.HACK_parent fig=self.canvas.HACK_fig Page = self.canvas.HACK_Page add = self.canvas.HACK_append dirname = parent.dirname filename = Page.tabtitle.GetValue().strip()+Page.counter[:2]+add formats = fig.canvas.get_supported_filetypes() except: dirname = "." filename = "" formats = self.canvas.get_supported_filetypes() parent = self fieltypestring = "" keys = formats.keys() keys.sort() for key in keys: fieltypestring += formats[key]+"(*."+key+")|*."+key+"|" # remove last | fieltypestring = fieltypestring[:-1] dlg = wx.FileDialog(parent, "Save figure", dirname, filename, fieltypestring, wx.SAVE|wx.OVERWRITE_PROMPT) # png is default dlg.SetFilterIndex(keys.index("png")) # user cannot do anything until he clicks "OK" if dlg.ShowModal() == wx.ID_OK: wildcard = keys[dlg.GetFilterIndex()] filename = dlg.GetPath() haswc = False for key in keys: if filename.lower().endswith("."+key) is True: haswc = True if haswc == False: filename = filename+"."+wildcard dirname = dlg.GetDirectory() #savename = os.path.join(dirname, filename) savename = filename try: self.canvas.figure.savefig(savename) except: # RuntimeError: # The file does not seem to be what it seems to be. 
info = sys.exc_info() errstr = "Could not latex output:\n" errstr += str(filename)+"\n\n" errstr += str(info[0])+"\n" errstr += str(info[1])+"\n" for tb_item in traceback.format_tb(info[2]): errstr += tb_item wx.MessageDialog(parent, errstr, "Error", style=wx.ICON_ERROR|wx.OK|wx.STAY_ON_TOP) else: dirname = dlg.GetDirectory() try: parent.dirname = dirname except: pass class MyScrolledDialog(wx.Dialog): def __init__(self, parent, overtext, readtext, title): wx.Dialog.__init__(self, parent, title=title) overtext = wx.StaticText(self, label=overtext) text = wx.TextCtrl(self, -1, readtext, size=(500,400), style=wx.TE_MULTILINE | wx.TE_READONLY) sizer = wx.BoxSizer(wx.VERTICAL ) btnsizer = wx.BoxSizer() btn = wx.Button(self, wx.ID_OK)#, "OK ") btnsizer.Add(btn, 0, wx.ALL, 5) btnsizer.Add((5,-1), 0, wx.ALL, 5) btn = wx.Button(self, wx.ID_CANCEL)#, "Abort ") btnsizer.Add(btn, 0, wx.ALL, 5) sizer.Add(overtext, 0, wx.EXPAND|wx.ALL, 5) sizer.Add(text, 0, wx.EXPAND|wx.ALL, 5) sizer.Add(btnsizer, 0, wx.EXPAND|wx.ALIGN_CENTER_VERTICAL|wx.ALL, 5) self.SetSizerAndFit(sizer) class MyOKAbortDialog(wx.Dialog): def __init__(self, parent, text, title): wx.Dialog.__init__(self, parent, title=title) overtext = wx.StaticText(self, label=text) sizer = wx.BoxSizer(wx.VERTICAL ) btnsizer = wx.BoxSizer() btn = wx.Button(self, wx.ID_OK)#, "OK ") btnsizer.Add(btn, 0, wx.ALL, 5) btnsizer.Add((5,-1), 0, wx.ALL, 5) btn = wx.Button(self, wx.ID_CANCEL)#, "Abort ") btnsizer.Add(btn, 0, wx.ALL, 5) sizer.Add(overtext, 0, wx.EXPAND|wx.ALL, 5) sizer.Add(btnsizer, 0, wx.EXPAND|wx.ALIGN_CENTER_VERTICAL|wx.ALL, 5) self.SetSizerAndFit(sizer) class MyYesNoAbortDialog(wx.Dialog): def __init__(self, parent, text, title): wx.Dialog.__init__(self, parent, title=title) overtext = wx.StaticText(self, label=text) sizer = wx.BoxSizer(wx.VERTICAL) btnsizer = wx.BoxSizer() btn1 = wx.Button(self, wx.ID_YES) #btn1.Bind(wx.EVT_BTN, self.YES) btnsizer.Add(btn1, 0, wx.ALL, 5) btnsizer.Add((1,-1), 0, wx.ALL, 5) btn2 = wx.Button(self, wx.ID_NO) btnsizer.Add(btn2, 0, wx.ALL, 5) btnsizer.Add((1,-1), 0, wx.ALL, 5) btn3 = wx.Button(self, wx.ID_CANCEL) btnsizer.Add(btn3, 0, wx.ALL, 5) sizer.Add(overtext, 0, wx.EXPAND|wx.ALL, 5) sizer.Add(btnsizer, 0, wx.EXPAND|wx.ALIGN_CENTER_VERTICAL|wx.ALL, 5) self.SetSizerAndFit(sizer) self.SetFocus() self.Show() def YES(self, e): self.EndModal(wx.ID_YES) # Add the save_figure function to the standard class for wx widgets. matplotlib.backends.backend_wx.NavigationToolbar2Wx.save = save_figure pycorrfit-0.8.1/src/readfiles/0000755000175000017500000000000012262516600015016 5ustar toortoorpycorrfit-0.8.1/src/readfiles/read_mat_ries.py0000644000175000017500000002010312262516600020162 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit functions in this file: *openMAT* Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import numpy as np # On the windows machine the matlab binary import raised a warning. 
# We want to catch that warning, since importing ries's files works. import warnings with warnings.catch_warnings(): warnings.simplefilter("ignore") try: # scipy.io might not work on OSX (wrong architecture) import scipy.io as spio import scipy.io.matlab # streams is not available in older versions # of scipy. We catch this, so PyCorrFit will start # without problems. import scipy.io.matlab.streams except: print " Error: import error in scipys 'matlab' submodule." print " Try upgrading python-scipy or ignore this" print " error if you are not using .mat files that" print " were generated by programs by Jonas Ries." import os def openMAT(dirname, filename): """ Read mat files that Jonas Ries used in his programs. For opening .mat files, this helped a lot: http://stackoverflow.com/questions/7008608/ scipy-io-loadmat-nested-structures-i-e-dictionaries The structure has been derived from "corrSFCS.m" from the SFCS.m program from Jonas Ries. """ # initiate lists correlations = list() traces = list() curvelist = list() # Import everything inside the mat file as big iterated dictionary f = os.path.join(dirname, filename) alldata = loadmat(f) # Correlation functions are stored in "g" g = alldata["g"] # Get all Autocorrelation functions try: # ac for autocorrelation ac = g["ac"] except KeyError: pass else: N = len(ac) # Workaround for single ACs, they are not stored in a separate list, # but directly inserted into g["ac"]. We put it in a list. # This is not the case for the trace averages. # There are a maximum of 4 autocorrelation functions in one file, # as far as I know. if len(ac) > 4: N=1 ac = [ac] g["act"] = [g["act"]] for i in np.arange(len(ac)): corr = ac[i] try: times = g["act"][i] except KeyError: pass else: # Another workaround # Sometimes, there's just one curve, which # means that corr[0] has no length. 
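# Quick check (not part of the original file): np.atleast_1d tells the
# single-curve case apart from the several-runs case:
#
#     >>> import numpy as np
#     >>> len(np.atleast_1d(np.float64(0.3)))        # one curve: a scalar
#     1
#     >>> len(np.atleast_1d(np.array([0.3, 0.2])))   # one value per run
#     2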
if len( np.atleast_1d(corr[0]) ) == 1: final = np.zeros((len(corr), 2)) final[:,0] = times final[:,1] = corr correlations.append(final) curvelist.append("AC"+str(i+1)) try: # only trace averages are saved traceavg = g["trace"][i] except: # No trace traces.append(None) else: trace = np.zeros((2,2)) trace[1,0] = 1.0 trace[:,1] = traceavg traces.append(trace) elif len(corr) == len(times): for j in np.arange(len(corr[0])): final = np.zeros((len(corr), 2)) final[:,0] = times final[:,1] = corr[:,j] correlations.append(final) curvelist.append("AC"+str(i+1)) try: # only trace averages are saved traceavg = g["trace"][i][j] except: # No trace traces.append(None) else: trace = np.zeros((2,2)) trace[1,0] = 1.0 trace[:,1] = traceavg traces.append(trace) # Get dc "dual color" functions try: dc = g["dc"] except KeyError: pass else: for i in np.arange(len(dc)): corr = dc[i] try: times = g["dct"][i] except KeyError: pass else: if len(corr) == len(times): for j in np.arange(len(corr[0])): final = np.zeros((len(corr), 2)) final[:,0] = times final[:,1] = corr[:,j] correlations.append(final) curvelist.append("CC dual color "+str(i+1)) traces.append(None) # Get twof "two focus" functions try: twof = g["twof"] except KeyError: pass else: for i in np.arange(len(dc)): corr = twof[i] try: times = g["twoft"][i] except KeyError: pass else: if len(corr) == len(times): for j in np.arange(len(corr[0])): final = np.zeros((len(corr), 2)) final[:,0] = times final[:,1] = corr[:,j] correlations.append(final) curvelist.append("CC two foci "+str(i+1)) traces.append(None) # Get dc2f "dual color two focus" functions try: g["dc2f"] except KeyError: pass else: for i in np.arange(len(dc)): corr = twof[i] try: times = g["dc2ft"][i] except KeyError: pass else: if len(corr) == len(times): for j in np.arange(len(corr[0])): final = np.zeros((len(corr), 2)) final[:,0] = times final[:,1] = corr[:,j] correlations.append(final) curvelist.append("CC dual color two foci "+str(i+1)) traces.append(None) dictionary = dict() dictionary["Correlation"] = correlations dictionary["Trace"] = traces dictionary["Type"] = curvelist filelist = list() for i in curvelist: filelist.append(filename) dictionary["Filename"] = filelist return dictionary def loadmat(filename): ''' this function should be called instead of direct spio.loadmat as it cures the problem of not properly recovering python dictionaries from mat files. It calls the function check keys to cure all entries which are still mat-objects ''' data = spio.loadmat(filename, struct_as_record=False, squeeze_me=True) return _check_keys(data) def _check_keys(dict): ''' checks if entries in dictionary are mat-objects. 
If yes todict is called to change them to nested dictionaries ''' for key in dict: if isinstance(dict[key], spio.matlab.mio5_params.mat_struct): dict[key] = _todict(dict[key]) return dict def _todict(matobj): ''' A recursive function which constructs from matobjects nested dictionaries ''' dict = {} for strg in matobj._fieldnames: elem = matobj.__dict__[strg] if isinstance(elem, spio.matlab.mio5_params.mat_struct): dict[strg] = _todict(elem) else: dict[strg] = elem return dict pycorrfit-0.8.1/src/readfiles/read_CSV_PyCorrFit.py0000644000175000017500000001332212262516600020760 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit functions in this file: *openCSV* Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import os import csv import numpy as np def openCSV(dirname, filename): """ Read relevant data from a file looking like this: [...] # Comment # Data type: Autocorrelation [...] 1.000000e-006 3.052373e-001 1.020961e-006 3.052288e-001 1.042361e-006 3.052201e-001 1.064209e-006 3.052113e-001 1.086516e-006 3.052023e-001 1.109290e-006 3.051931e-001 [...] # BEGIN TRACE [...] 10.852761 31.41818 12.058624 31.1271 13.264486 31.27305 14.470348 31.33442 15.676211 31.15861 16.882074 31.08564 18.087936 31.21335 [...] Data type: If Data type is "Cross-correlation", we will try to import two traces after "# BEGIN SECOND TRACE" 1st section: First column denotes tau in seconds and the second row the correlation signal. 2nd section: First column denotes tau in seconds and the second row the intensity trace in kHz. Returns: 1. A list with tuples containing two elements: 1st: tau in ms 2nd: corresponding correlation signal 2. None - usually is the trace, but the trace is not saved in the PyCorrFit .csv format. 3. A list with one element, indicating, that we are opening only one correlation curve. """ # Check if the file is correlation data csvfile = open(os.path.join(dirname, filename), 'r') firstline = csvfile.readline() if firstline.lower().count("this is not correlation data") > 0: csvfile.close() return None csvfile.close() # Define what will happen to the file timefactor = 1000 # because we want ms instead of s csvfile = open(os.path.join(dirname, filename), 'r') readdata = csv.reader(csvfile, delimiter=',') data = list() trace = None traceA = None DataType="AC" # May be changed numtraces = 0 for row in readdata: if len(row) == 0 or len(str(row[0]).strip()) == 0: # Do nothing with empty/whitespace lines pass # Beware that the len(row) statement has to be called first # (before the len(str(row[0]).strip()) ). Otherwise some # error would be raised. 
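# The order of the two operands is essential (sketch, not part of the
# original source): for an empty row, the short-circuit of *or* prevents
# the IndexError that row[0] would otherwise raise:
#
#     >>> row = []
#     >>> len(row) == 0 or len(str(row[0]).strip()) == 0
#     True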
elif str(row[0])[:12].lower() == "# Type AC/CC".lower(): corrtype = str(row[0])[12:].strip().strip(":").strip() if corrtype[:17].lower() == "cross-correlation": # We will later try to import a second trace DataType="CC" DataType += corrtype[17:].strip() elif corrtype[0:15].lower() == "autocorrelation": DataType="AC" DataType += corrtype[15:].strip() elif str(row[0])[0:13].upper() == '# BEGIN TRACE': # Correlation is over. We have a trace corr = np.array(data) data=list() numtraces = 1 elif str(row[0])[0:20].upper() == '# BEGIN SECOND TRACE': # First trace is over. We have a second trace traceA = np.array(data) data = list() numtraces = 2 # Exclude commentaries elif str(row[0])[0:1] != '#': # Read the 1st section # On Windows we had problems importing nan values that # had some white-spaces around them. Therefore: strip() ## As of version 0.7.8 we are supporting white space ## separated values as well if len(row) == 1: row = row[0].split() data.append((np.float(row[0].strip())*timefactor, np.float(row[1].strip()))) # Collect the rest of the trace, if there is any: rest = np.array(data) if numtraces == 0: corr = rest elif numtraces >= 1: trace = rest del data ## Remove any NaN numbers from thearray # Explanation: # np.isnan(data) # finds the position of NaNs in the array (True positions); 2D array, bool # any(1) # finds the rows that have True in them; 1D array, bool # ~ # negates them and is given as an argument (array type bool) to # select which items we want. corr = corr[~np.isnan(corr).any(1)] # Also check for infinities. corr = corr[~np.isinf(corr).any(1)] csvfile.close() Traces=list() # Set correct trace data for import if numtraces == 1 and DataType[:2] == "AC": Traces.append(trace) elif numtraces == 2 and DataType[:2] == "CC": Traces.append([traceA, trace]) elif numtraces == 1 and DataType[:2] == "CC": # Should not happen, but for convenience: Traces.append([trace, trace]) else: Traces.append(None) dictionary = dict() dictionary["Correlation"] = [corr] dictionary["Trace"] = Traces dictionary["Type"] = [DataType] dictionary["Filename"] = [filename] return dictionary pycorrfit-0.8.1/src/readfiles/__init__.py0000644000175000017500000002250312262516600017131 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit Module readfiles: Import correlation data from data files. Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ # This file is necessary for this folder to become a module that can be # imported by PyCorrFit. import csv import numpy as np import os import tempfile import yaml import zipfile # To add a filetype add it here and in the # dictionaries at the end of this file. 
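# A hypothetical new reader would follow the two steps named above
# (sketch; module, function and wildcard are invented for illustration):
#
#     from read_XYZ_myformat import openXYZ           # 1. import it here
#     Filetypes["My format (*.xyz)|*.xyz"] = openXYZ  # 2. register it in
#                                                     #    the dictionary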
from read_ASC_ALV_6000 import openASC from read_CSV_PyCorrFit import openCSV from read_SIN_correlator_com import openSIN from read_FCS_Confocor3 import openFCS from read_mat_ries import openMAT def AddAllWildcard(Dictionary): wildcard = "" keys = Dictionary.keys() N = len(keys) i = 0 for key in keys: newwc = key.split("|")[1] wildcard = wildcard + newwc i = i + 1 if i != N: wildcard = wildcard + ";" Dictionary[Allsupfilesstring+"|"+wildcard] = openAny return Dictionary # To increase user comfort, we will now create a file opener thingy that # knows how to open all files we know. def openAny(dirname, filename): """ Using the defined Filetypes and BGFiletypes, open the given file """ wildcard = filename.split(".")[-1] for key in Filetypes.keys(): # Recurse into the wildcards wildcardstring = key.split("|") # We do not want to recurse if wildcardstring[0] != Allsupfilesstring: otherwcs = wildcardstring[1].split(";") for string in otherwcs: if string[-3:] == wildcard: return Filetypes[key](dirname, filename) # If we could not find the correct function in Filetypes, try again # in BGFiletypes: return openAnyBG(dirname, filename) ## For convenience in openZIP #return None # already in openAnyBG def openAnyBG(dirname, filename): wildcard = filename.split(".")[-1] for key in BGFiletypes.keys(): wildcardstring = key.split("|") # We do not want to recurse if wildcardstring[0] != Allsupfilesstring: otherwcs = wildcardstring[1].split(";") for string in otherwcs: if string[-3:] == wildcard: return BGFiletypes[key](dirname, filename) # For convenience in openZIP return None def openZIP(dirname, filename): """ Get everything inside a .zip file that could be an FCS curve. Will use any wildcard in Filetypes dictionary. """ # It's a rather lengthy import of the session file. The code is copied # from openfile.OpenSession. The usual zip file packed curves are # imported on the few code lines after the else statement. ## Open the archive: Arc = zipfile.ZipFile(os.path.join(dirname, filename), mode='r') Correlations = list() # Correlation information Curvelist = list() # Type information Filelist = list() # List of filenames corresponding to *Curvelist* Trace = list() # Corresponding traces ## First test, if we are opening a session file fcsfitwildcard = ".fcsfit-session.zip" if len(filename)>19 and filename[-19:] == fcsfitwildcard: # Get the yaml parms dump: yamlfile = Arc.open("Parameters.yaml") # Parms: Fitting and drawing parameters of the correlation curve # The *yamlfile* is responsible for the order of the Pages #i. # The parameters are actually useless to us right now. Parms = yaml.safe_load(yamlfile) yamlfile.close() # Get the correlation arrays ImportedNum = list() for i in np.arange(len(Parms)): # The *number* is used to identify the correct file number = str(Parms[i][0]) expfilename = "data"+number[1:len(number)-2]+".csv" expfile = Arc.open(expfilename, 'r') readdata = csv.reader(expfile, delimiter=',') dataexp = list() tau = list() if str(readdata.next()[0]) == "# tau only": # We do not have a curve here pass else: Filelist.append(filename+"/#"+number[1:len(number)-2]) for row in readdata: # Exclude commentaries if (str(row[0])[0:1] != '#'): dataexp.append((float(row[0]), float(row[1]))) dataexp = np.array(dataexp) Correlations.append(dataexp) ImportedNum.append(i) del readdata expfile.close() # Get the Traces for i in ImportedNum: # Make sure we only import those traces that had a corresponding # correlation curve. 
(ImportedNum) # # The *number* is used to identify the correct file number = str(Parms[i][0]) # Find out, if we have a cross correlation data type IsCross = False try: IsCross = Parms[i][7] except IndexError: # No Cross correlation pass if IsCross is False: tracefilenames = ["trace"+number[1:len(number)-2]+".csv"] Curvelist.append("AC") else: # Cross correlation uses two traces tracefilenames = ["trace"+number[1:len(number)-2]+"A.csv", "trace"+number[1:len(number)-2]+"B.csv" ] Curvelist.append("CC") thistrace = list() for tracefilename in tracefilenames: try: Arc.getinfo(tracefilename) except KeyError: # No correlation curve, but add a None pass else: tracefile = Arc.open(tracefilename, 'r') traceread = csv.reader(tracefile, delimiter=',') singletrace = list() for row in traceread: # Exclude commentaries if (str(row[0])[0:1] != '#'): singletrace.append((float(row[0]), float(row[1]))) singletrace = np.array(singletrace) thistrace.append(singletrace) del traceread del singletrace tracefile.close() if len(thistrace) == 1: Trace.append(thistrace[0]) elif len(thistrace) == 2: Trace.append(thistrace) else: Trace.append(None) else: # We are not importing from a session but from a zip file with # probably a mix of all filetypes we know. This works # recursively (e.g. a zip file in a zipfile). allfiles = Arc.namelist() # Extract data to temporary folder tempdir = tempfile.mkdtemp() for afile in allfiles: Arc.extract(afile, path=tempdir) ReturnValue = openAny(tempdir, afile) if ReturnValue is not None: cs = ReturnValue["Correlation"] ts = ReturnValue["Trace"] ls = ReturnValue["Type"] fs = ReturnValue["Filename"] for i in np.arange(len(cs)): Correlations.append(cs[i]) Trace.append(ts[i]) Curvelist.append(ls[i]) Filelist.append(filename+"/"+fs[i]) # Delte file os.remove(os.path.join(tempdir,afile)) os.removedirs(tempdir) Arc.close() dictionary = dict() dictionary["Correlation"] = Correlations dictionary["Trace"] = Trace dictionary["Type"] = Curvelist dictionary["Filename"] = Filelist return dictionary # The string that is shown when opening all supported files Allsupfilesstring = "All supported files" # Dictionary with filetypes that we can open # The wildcards point to the appropriate functions. Filetypes = { "Correlator.com (*.SIN)|*.SIN;*.sin" : openSIN, "Correlator ALV-6000 (*.ASC)|*.ASC" : openASC, "PyCorrFit (*.csv)|*.csv" : openCSV, "Matlab 'Ries (*.mat)|*.mat" : openMAT, "Confocor3 (*.fcs)|*.fcs" : openFCS, "zip files (*.zip)|*.zip" : openZIP } # For user comfort, add "All supported files" wildcard: Filetypes = AddAllWildcard(Filetypes) # Dictionary with filetypes we can open that have intensity traces in them. BGFiletypes = { "Correlator.com (*.SIN)|*.SIN;*.sin" : openSIN, "Correlator ALV-6000 (*.ASC)|*.ASC" : openASC, "PyCorrFit (*.csv)|*.csv" : openCSV, "Confocor3 (*.fcs)|*.fcs" : openFCS, "zip files (*.zip)|*.zip" : openZIP } BGFiletypes = AddAllWildcard(BGFiletypes) pycorrfit-0.8.1/src/readfiles/read_ASC_ALV_6000.py0000755000175000017500000001467512262516600020160 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit functions in this file: *openASC* Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. 
This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see . """ import os import csv import numpy as np def openASC(dirname, filename): """ Read data from a .ASC file, created by some ALV-6000 correlator. ALV-6000/E-WIN Data Date : "2/20/2012" ... "Correlation" 1.25000E-004 3.00195E-001 2.50000E-004 1.13065E-001 3.75000E-004 7.60367E-002 5.00000E-004 6.29926E-002 6.25000E-004 5.34678E-002 7.50000E-004 4.11506E-002 8.75000E-004 4.36752E-002 1.00000E-003 4.63146E-002 1.12500E-003 3.78226E-002 ... 3.35544E+004 -2.05799E-006 3.77487E+004 4.09032E-006 4.19430E+004 4.26295E-006 4.61373E+004 1.40265E-005 5.03316E+004 1.61766E-005 5.45259E+004 2.19541E-005 5.87202E+004 3.26527E-005 6.29145E+004 2.72920E-005 "Count Rate" 1.17188 26.77194 2.34375 26.85045 3.51563 27.06382 4.68750 26.97932 5.85938 26.73694 7.03125 27.11332 8.20313 26.81376 9.37500 26.82741 10.54688 26.88801 11.71875 27.09710 12.89063 27.13209 14.06250 27.02200 15.23438 26.95287 16.40625 26.75657 17.57813 26.43056 ... 294.14063 27.22597 295.31250 26.40581 296.48438 26.33497 297.65625 25.96457 298.82813 26.71902 1. We are interested in the "Correlation" section, where the first column denotes tau in ms and the second row the correlation signal. Values are separated by a tabulator "\t" (some " "). 2. We are also interested in the "Count Rate" section. Here the times are saved as seconds and not ms like above. 3. There is some kind of mode where the ALV exports five runs at a time and averages them. The sole correlation data is stored in the file, but the trace is only stored as average or something. So I would not recommend this. However, I added support for this. PyCorrFit then only imports the average data. ~ Paul, 2012-02-20 Correlation data starts at "Correlation (Multi, Averaged)". Returns: [0]: An array with tuples containing two elements: 1st: tau in ms 2nd: corresponding correlation signal [1]: Intensity trace: 1st: time in ms 2nd: Trace in kHz [2]: An array with N elements, indicating, how many curves we are opening from the file. Elements can be names and must be convertible to strings. """ openfile = open(os.path.join(dirname, filename), 'r') Alldata = openfile.readlines() ## Correlation function # Find out where the correlation function is for i in np.arange(len(Alldata)): if Alldata[i][0:13] == '"Correlation"': # Start of correlation function StartC = i+1 if Alldata[i][0:31] == '"Correlation (Multi, Averaged)"': # Start of AVERAGED correlation function !!! # There are several curves now. StartC = i+2 if Alldata[i][0:12] == '"Count Rate"': # End of correlation function EndC = i-2 # Start of trace (goes until end of file) StartT = i+1 EndT = Alldata.__len__() # Get the header Namedata = Alldata.__getslice__(StartC-1, StartC) ## Define *curvelist* curvelist = csv.reader(Namedata, delimiter='\t').next() if len(curvelist) <= 2: # Then we have just one single correlation curve curvelist = [""] else: # We have a number of correlation curves. We need to specify # names for them. We take these names from the headings. 
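# Worked example (hypothetical heading, not part of the original file):
# an averaged five-run measurement may carry a tab-separated heading like
#     'Lag time [ms]\tRun 1\t...\tRun 5\t'
# which csv.reader(..., delimiter='\t') turns into
#     ['Lag time [ms]', 'Run 1', ..., 'Run 5', '']
# Removing the first entry (the lag times) and the last (the empty
# trailing field), as done below, leaves one name per correlation curve.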
# The lag-time column is not a curve name curvelist.remove(curvelist[0]) # Last column is empty curvelist.remove(curvelist[-1]) ## Correlation function Truedata = Alldata.__getslice__(StartC, EndC) readdata = csv.reader(Truedata, delimiter='\t') data = list() # Add lists to *data* according to the length of *curvelist* for item in curvelist: data.append(list()) # Work through the rows in the read data for row in readdata: for i in np.arange(len(curvelist)): data[i].append( (np.float(row[0]), np.float(row[i+1])) ) ## Trace # Trace is stored in two columns # 1st column: time [s] # 2nd column: trace [kHz] # Get the trace Tracedata = Alldata.__getslice__(StartT, EndT) timefactor = 1000 # because we want ms instead of s readtrace = csv.reader(Tracedata, delimiter='\t') trace = list() # Add lists to *trace* according to the length of *curvelist* for item in curvelist: trace.append(list()) # Work through the rows for row in readtrace: # time in ms, trace in kHz trace[0].append((np.float(row[0])*timefactor, np.float(row[1]))) for i in np.arange(len(curvelist)-1): trace[i+1].append((np.float(row[0])*timefactor, 0)) # return as an array openfile.close() dictionary = dict() dictionary["Correlation"] = np.array(data) dictionary["Trace"] = np.array(trace) dictionary["Type"] = curvelist filelist = list() for i in curvelist: filelist.append(filename) dictionary["Filename"] = filelist return dictionary pycorrfit-0.8.1/src/readfiles/read_FCS_Confocor3.py0000644000175000017500000004442312262516600020720 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit functions in this file: *openFCS*, *openFCS_Single*, *openFCS_Multiple* Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see <http://www.gnu.org/licenses/>. """ import os import csv import numpy as np import warnings def openFCS(dirname, filename): """ Load data from Zeiss Confocor3 Data is imported sequentially from the file. PyCorrFit will give each curve an id which corresponds to the position of the curve in the .fcs file. The AIM software can save data as multiple or single data files. The type is identified by the first line of the .fcs file. This works with files from the Confocor2, Confocor3 (AIM) and files created by the newer ZEN software. This function is a wrapper combining *openFCS_Single* and *openFCS_Multiple* """ openfile = open(os.path.join(dirname, filename), 'r') identitystring = openfile.readline().strip()[:20] openfile.close() if identitystring == "Carl Zeiss ConfoCor3": return openFCS_Multiple(dirname, filename) else: return openFCS_Single(dirname, filename) def openFCS_Multiple(dirname, filename): """ Load data from Zeiss Confocor3 Data is imported sequentially from the file. PyCorrFit will give each curve an id which corresponds to the position of the curve in the .fcs file. This works with files from the Confocor2, Confocor3 (AIM) and files created by the newer ZEN software. """ openfile = open(os.path.join(dirname, filename), 'r') Alldata = openfile.readlines() # Start progressing through the file. i is the line index.
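# A schematic "FcsDataSet" section (made-up values, not verbatim AIM
# output) could look like this:
#     FcsDataSet ...
#     Channel = Auto-correlation detector 1
#     CountRateArray = 30000 ...
#     <30000 tab-separated count-rate rows>
#     CorrelationArraySize = 180
#     <180 tab-separated correlation rows>
# Only the "Channel", "CountRateArray" and "CorrelationArraySize"
# entries are evaluated below; everything else is skipped.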
# We are searching for "FcsDataSet" sections that contain # all the information we want. # Index i is the line number i = 0 # A parameter to check whether we are in a "FcsDataSet" section # and should import something fcsset = False # The names of the traces aclist = list() # All autocorrelation functions cclist = list() # All cross-correlation functions # The intensity traces traces = list() # The correlation curves ac_correlations = list() cc_correlations = list() while i <= len(Alldata)-1: if Alldata[i].count("FcsDataSet") == 1: # We are in a "FcsDataSet" section fcsset = True gottrace = False if fcsset == True: if Alldata[i].partition("=")[0].strip() == "Channel": # Find out what type of correlation curve we have. # Might be interesting to the user. FCStype = Alldata[i].partition("=")[2].strip() FoundType = False for chnum in np.arange(4)+1: if FCStype == "Auto-correlation detector "+str(chnum): FoundType = "AC"+str(chnum) aclist.append(FoundType) elif FCStype == "Auto-correlation detector Meta"+str(chnum): FoundType = "AC"+str(chnum) aclist.append(FoundType) else: for ch2num in np.arange(4)+1: if FCStype == "Cross-correlation detector "+\ str(chnum)+" versus detector "+\ str(ch2num): FoundType = "CC"+str(chnum)+str(ch2num) cclist.append(FoundType) elif FCStype == "Cross-correlation detector Meta"+\ str(chnum)+" versus detector Meta"+\ str(ch2num): FoundType = "CC"+str(chnum)+str(ch2num) cclist.append(FoundType) if FoundType is False: # Jump out of this set. We will continue at # the next "FcsDataSet" section. print "Unknown channel configuration in .fcs file: "+FCStype fcsset = False if Alldata[i].partition("=")[0].strip() == "CountRateArray": # Start importing the trace. This is a little difficult, since # traces in those files are usually very large. We will bin # the trace and import a lighter version of it. tracelength = \ int(Alldata[i].partition("=")[2].strip().partition(" ")[0]) if tracelength != 0: tracedata = Alldata.__getslice__(i+1, i+tracelength+1) # Jump forward in the index i = i + tracelength readtrace = csv.reader(tracedata, delimiter='\t') trace = list() for row in readtrace: # time in ms, trace in kHz # So we need to apply some factors here trace.append( (np.float(row[3])*1000, np.float(row[4])/1000) ) trace = np.array(trace) # The trace is too big. We need to bin it. if len(trace) >= 500: # We want about 500 bins # We need to sum over intervals of length *teiler* teiler = int(len(trace)/500) newlength = len(trace)/teiler newsignal = np.zeros(newlength) # Simultaneously sum over all intervals for j in np.arange(teiler): newsignal = \ newsignal+trace[j:newlength*teiler:teiler][:,1] newsignal = 1.* newsignal / teiler newtimes = trace[teiler-1:newlength*teiler:teiler][:,0] if len(trace)%teiler != 0: # We have a rest signal # We average it and add it to the trace rest = trace[newlength*teiler:][:,1] lrest = len(rest) rest = np.array([sum(rest)/lrest]) newsignal = np.concatenate((newsignal, rest), axis=0) timerest = np.array([trace[-1][0]]) newtimes = np.concatenate((newtimes, timerest), axis=0) newtrace=np.zeros((len(newtimes),2)) newtrace[:,0] = newtimes newtrace[:,1] = newsignal else: # Declare newtrace - # otherwise we have a problem down three lines ;) newtrace = trace # Finally add the trace to the list traces.append(newtrace) if FoundType[:2] != "AC": # For every trace there is an entry in aclist print "Trace data saved in CC section. "+ \ "I cannot handle that."
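# Worked example for the binning scheme above (hypothetical trace
# lengths): len(trace) == 1234 gives teiler == 2 and newlength == 617,
# i.e. 617 bins averaged over 2 points each with no remainder
# (1234 % 2 == 0); for len(trace) == 1235 the single leftover point is
# averaged separately and appended as one extra bin.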
gottrace = True if Alldata[i].partition("=")[0].strip() == "CorrelationArraySize": # Get the correlation information corrlength = int(Alldata[i].partition("=")[2].strip()) if corrlength != 0: # For cross-correlation data there sometimes is # no trace information. if gottrace == False and FoundType[:2] == "AC": # We think we know that there is no trace in CC curves traces.append(None) corrdata = Alldata.__getslice__(i+2, i+corrlength+2) # Jump forward i = i + corrlength readcorr = csv.reader(corrdata, delimiter='\t') corr = list() for row in readcorr: # tau in ms, corr-function corr.append( (np.float(row[3])*1000, np.float(row[4])-1) ) if FoundType[:2] == "AC": ac_correlations.append(np.array(corr)) elif FoundType[:2] == "CC": cc_correlations.append(np.array(corr)) else: # There is no correlation data in the file # Fill in some dummy data. These will be removed. if FoundType[:2] == "AC": # append a dummy correlation curve ac_correlations.append(None) if gottrace == False: # append a dummy trace traces.append(None) elif FoundType[:2] == "CC": # append a dummy correlation curve # cc_correlations do not have traces cc_correlations.append(None) # We reached the end of this "FcsDataSet" section. fcsset = False i = i + 1 # finished. openfile.close() # We now have: # aclist: a list of AC curve names mentioned in the file. # cclist: a list of CC curve names mentioned in the file. # traces: All traces corresponding to non-"None"-type entries in # ac_correlations; not in cc_correlations, # because cross-correlations are not saved with traces. # # ac_correlations: AC-correlation data in a list. # cc_correlations: CC-correlation data in a list. # # ac_correlations or cc_correlations can have items that are "None". # These items come from averaging inside the Confocor software and # do not contain any data. # These "None" type items should be at the end of these lists. # If the user created .fcs files with averages between the curves, # the *traces* list contains *None* values at those positions. ## We now create: # curvelist: All actually used data # tracelist: Traces brought into the right form (also for CCs) # corrlist: Correlation curves # The index in curvelist defines the index in trace and correlation. curvelist = list() tracelist = list() corrlist = list() for i in np.arange(len(ac_correlations)): if ac_correlations[i] is not None: curvelist.append(aclist[i]) tracelist.append(1*traces[i]) corrlist.append(ac_correlations[i]) else: if traces[i] is not None: warnings.warn("File {} curve {} does not contain AC data.".format(filename, i)) ## The CC traces are more tricky: # Add traces to CC-correlation functions. # It seems reasonable that, if the numbers of AC1, AC2 and CC curves # are equal, the CC curves get the traces accordingly. n_ac1 = aclist.count("AC1") n_ac2 = aclist.count("AC2") n_cc12 = cclist.count("CC12") n_cc21 = cclist.count("CC21") if n_ac1==n_ac2==n_cc12==n_cc21>0: CCTraces = True else: CCTraces = False # Commence swapping, if necessary # We want to have CC12 first and the corresponding trace to AC1 as well. if len(cc_correlations) != 0: if cclist[0] == "CC12": if aclist[0] == "AC2": for i in np.arange(len(traces)/2): traces[2*i], traces[2*i+1] = traces[2*i+1], traces[2*i] # Everything is OK elif cclist[0] == "CC21": # Switch the order of CC correlations a = cc_correlations for i in np.arange(len(a)/2): a[2*i], a[2*i+1] = a[2*i+1], a[2*i] cclist[2*i], cclist[2*i+1] = cclist[2*i+1], cclist[2*i] if aclist[2*i] == "AC2": traces[2*i], traces[2*i+1] = traces[2*i+1], traces[2*i] # Add cc-curves with (if CCTraces) trace.
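# Example of the pairing assumption (hypothetical file content): with
# aclist == ["AC1", "AC2"], cclist == ["CC12", "CC21"] and one trace
# per AC entry, the counts n_ac1 == n_ac2 == n_cc12 == n_cc21 == 1
# hold, so CCTraces is True and the loop below assigns CC12 the trace
# pair [AC1 trace, AC2 trace] (and CC21 the reversed pair).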
for i in np.arange(len(cc_correlations)): if cc_correlations[i] is not None: curvelist.append(cclist[i]) corrlist.append(cc_correlations[i]) if CCTraces == True: if cclist[i] == "CC12": tracelist.append([traces[i], traces[i+1]]) elif cclist[i] == "CC21": tracelist.append([traces[i-1], traces[i]]) else: tracelist.append(None) dictionary = dict() dictionary["Correlation"] = corrlist dictionary["Trace"] = tracelist dictionary["Type"] = curvelist filelist = list() for i in curvelist: filelist.append(filename) dictionary["Filename"] = filelist return dictionary def openFCS_Single(dirname, filename): """ Load data from Zeiss Confocor3 files containing only one curve. This works with files from the Confocor2, Confocor3 (AIM) and files created by the newer ZEN software. """ openfile = open(os.path.join(dirname, filename), 'r') Alldata = openfile.readlines() # Start progressing through the file. i is the line index. # We are searching for the "##DATA TYPE" and "##NPOINTS" entries # that contain all the information we want. # Index i is the line number i = 0 # Indicates if trace or FCS curve should be imported in loop fcscurve = False tracecurve = False while i <= len(Alldata)-1: if Alldata[i].partition("=")[0].strip() == "##DATA TYPE": # Find out what type of correlation curve we have. # Might be interesting to the user. Type = Alldata[i].partition("=")[2].strip() if Type == "FCS Correlogram": fcscurve = True tracecurve = False elif Type == "FCS Count Rates": tracecurve = True fcscurve = False else: raise SyntaxError("Unknown file syntax: "+Type) i = i + 1 if tracecurve == True: if Alldata[i].partition("=")[0].strip() == "##NPOINTS": # Start importing the trace. This is a little difficult, since # traces in those files are usually very large. We will bin # the trace and import a lighter version of it. tracelength = int(Alldata[i].partition("=")[2].strip()) # Trace starts 3 lines after this. i = i + 3 if tracelength != 0: tracedata = Alldata.__getslice__(i, i+tracelength) # Jump forward in the index i = i + tracelength readtrace = csv.reader(tracedata, delimiter=',') trace = list() for row in readtrace: # time in ms, trace in kHz # So we need to apply some factors here trace.append( (np.float(row[0])*1000, np.float(row[1])) ) trace = np.array(trace) # The trace is too big. We need to bin it.
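# (Same ~500-bin scheme as in openFCS_Multiple above: e.g. a
# hypothetical trace of 1235 points is averaged pairwise into 617 bins
# plus one extra bin for the leftover point.)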
if len(trace) >= 500: # We want about 500 bins # We need to sum over intervals of length *teiler* teiler = int(len(trace)/500) newlength = len(trace)/teiler newsignal = np.zeros(newlength) # Simultaneously sum over all intervals for j in np.arange(teiler): newsignal = \ newsignal+trace[j:newlength*teiler:teiler][:,1] newsignal = 1.* newsignal / teiler newtimes = trace[teiler-1:newlength*teiler:teiler][:,0] if len(trace)%teiler != 0: # We have a rest signal # We average it and add it to the trace rest = trace[newlength*teiler:][:,1] lrest = len(rest) rest = np.array([sum(rest)/lrest]) newsignal = np.concatenate((newsignal, rest), axis=0) timerest = np.array([trace[-1][0]]) newtimes = np.concatenate((newtimes, timerest), axis=0) newtrace=np.zeros((len(newtimes),2)) newtrace[:,0] = newtimes newtrace[:,1] = newsignal else: # Declare newtrace - # otherwise we have a problem down three lines ;) newtrace = trace tracecurve = False if fcscurve == True: if Alldata[i].partition("=")[0].strip() == "##NPOINTS": # Get the correlation information corrlength = int(Alldata[i].partition("=")[2].strip()) i = i + 2 if corrlength != 0: corrdata = Alldata.__getslice__(i, i+corrlength) # Jump forward i = i + corrlength readcorr = csv.reader(corrdata, delimiter=',') corr = list() for row in readcorr: # tau in ms, corr-function corr.append( (np.float(row[0]), np.float(row[1])-1) ) corr = np.array(corr) fcscurve = False openfile.close() dictionary = dict() dictionary["Correlation"] = [corr] dictionary["Trace"] = [newtrace] dictionary["Type"] = [""] dictionary["Filename"] = [filename] return dictionary pycorrfit-0.8.1/src/readfiles/read_SIN_correlator_com.py0000644000175000017500000002424412262516600022114 0ustar toortoor# -*- coding: utf-8 -*- """ PyCorrFit functions in this file: *openSIN* Copyright (C) 2011-2012 Paul Müller This program is free software; you can redistribute it and/or modify it under the terms of the GNU General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details. You should have received a copy of the GNU General Public License along with this program. If not, see <http://www.gnu.org/licenses/>. """ import os import csv import numpy as np def openSIN(dirname, filename): """ Read data from a .SIN file, usually created by the software using correlators from correlator.com. FLXA Version= 1d [Parameters] ... Mode= Single Auto ... [CorrelationFunction] 1.562500e-09 0.000000e+00 3.125000e-09 0.000000e+00 4.687500e-09 0.000000e+00 ... 1.887435e+01 1.000030e+00 1.929378e+01 1.000141e+00 1.971321e+01 9.999908e-01 2.013264e+01 9.996810e-01 2.055207e+01 1.000047e+00 2.097150e+01 9.999675e-01 2.139093e+01 9.999591e-01 2.181036e+01 1.000414e+00 2.222979e+01 1.000129e+00 2.264922e+01 9.999285e-01 2.306865e+01 1.000077e+00 ... 3.959419e+02 0.000000e+00 4.026528e+02 0.000000e+00 4.093637e+02 0.000000e+00 4.160746e+02 0.000000e+00 4.227854e+02 0.000000e+00 4.294963e+02 0.000000e+00 [RawCorrelationFunction] ...
[IntensityHistory] TraceNumber= 458 0.000000 9.628296e+03 9.670258e+03 0.262144 1.001358e+04 9.971619e+03 0.524288 9.540558e+03 9.548188e+03 0.786432 9.048462e+03 9.010315e+03 1.048576 8.815766e+03 8.819580e+03 1.310720 8.827210e+03 8.861542e+03 1.572864 9.201050e+03 9.185791e+03 1.835008 9.124756e+03 9.124756e+03 2.097152 9.059906e+03 9.029389e+03 ... 1. We are interested in the "[CorrelationFunction]" section, where the first column denotes tau in seconds and the second column the correlation signal. Values are separated by a tabulator "\t". We do not import anything from the "[Parameters]" section. We have to subtract "1" from the correlation function, since it is a correlation function that converges to "1" and not to "0". 2. We are also interested in the "[IntensityHistory]" section. If we are only interested in autocorrelation functions: an email from Jixiang Zhu - Correlator.com (2012-01-22) said that "For autocorrelation mode, the 2nd and 3 column represent the same intensity series with slight delay. Therefore, they are statistically the same but numerically different." It is therefore perfectly fine to just use the 2nd column. Different acquisition modes: Mode [CorrelationFunction] [IntensityHistory] Single Auto 2 Columns (tau,AC) 1 significant Single Cross 2 Columns (tau,CC) 2 Dual Auto 3 Columns (tau,AC1,AC2) 2 Dual Cross 3 Columns (tau,CC12,CC21) 2 Quad 5 Columns (tau,AC1,AC2,CC12,CC21) 2 Returns: [0]: N arrays with tuples containing two elements: 1st: tau in ms 2nd: corresponding correlation signal [1]: N intensity traces: 1st: time in ms 2nd: trace in kHz [2]: A list with N elements, indicating how many correlation curves we are importing. """ openfile = open(os.path.join(dirname, filename), 'r') Alldata = openfile.readlines() # Find out where the correlation function and trace are for i in np.arange(len(Alldata)): if Alldata[i][0:4] == "Mode": Mode = Alldata[i].split("=")[1].strip() if Alldata[i][0:21] == "[CorrelationFunction]": StartC = i+1 if Alldata[i][0:24] == "[RawCorrelationFunction]": EndC = i-2 if Alldata[i][0:18] == "[IntensityHistory]": # plus 2, because there's a line with the trace length StartT = i+2 if Alldata[i][0:11] == "[Histogram]": EndT = i-2 curvelist = list() correlations = list() traces = list() # Get the correlation function Truedata = Alldata.__getslice__(StartC, EndC) timefactor = 1000 # because we want ms instead of s readcorr = csv.reader(Truedata, delimiter='\t') # Trace # The trace is stored in three columns # 1st column: time [s] # 2nd column: trace [Hz] # 3rd column: trace [Hz] - Single Auto: equivalent to 2nd # Get the trace Tracedata = Alldata.__getslice__(StartT, EndT) # timefactor = 1000 # because we want ms instead of s timedivfac = 1000 # because we want kHz instead of Hz readtrace = csv.reader(Tracedata, delimiter='\t') openfile.close() # Process all data: if Mode == "Single Auto": curvelist.append("AC") corrdata = list() for row in readcorr: # tau in ms, corr-function minus "1" corrdata.append((np.float(row[0])*timefactor, np.float(row[1])-1)) correlations.append(np.array(corrdata)) trace = list() for row in readtrace: # time in ms, trace in kHz trace.append((np.float(row[0])*timefactor, np.float(row[1])/timedivfac)) traces.append(np.array(trace)) elif Mode == "Single Cross": curvelist.append("CC") corrdata = list() for row in readcorr: # tau in ms, corr-function minus "1" corrdata.append((np.float(row[0])*timefactor, np.float(row[1])-1)) correlations.append(np.array(corrdata)) trace1 = list() trace2 = list() for row in readtrace:
# time in ms, trace in kHz trace1.append((np.float(row[0])*timefactor, np.float(row[1])/timedivfac)) trace2.append((np.float(row[0])*timefactor, np.float(row[2])/timedivfac)) traces.append([np.array(trace1), np.array(trace2)]) elif Mode == "Dual Auto": curvelist.append("AC1") curvelist.append("AC2") corrdata1 = list() corrdata2 = list() for row in readcorr: # tau in ms, corr-function minus "1" corrdata1.append((np.float(row[0])*timefactor, np.float(row[1])-1)) corrdata2.append((np.float(row[0])*timefactor, np.float(row[2])-1)) correlations.append(np.array(corrdata1)) correlations.append(np.array(corrdata2)) trace1 = list() trace2 = list() for row in readtrace: # time in ms, trace in kHz trace1.append((np.float(row[0])*timefactor, np.float(row[1])/timedivfac)) trace2.append((np.float(row[0])*timefactor, np.float(row[2])/timedivfac)) traces.append(np.array(trace1)) traces.append(np.array(trace2)) elif Mode == "Dual Cross": curvelist.append("CC12") curvelist.append("CC21") corrdata1 = list() corrdata2 = list() for row in readcorr: # tau in ms, corr-function minus "1" corrdata1.append((np.float(row[0])*timefactor, np.float(row[1])-1)) corrdata2.append((np.float(row[0])*timefactor, np.float(row[2])-1)) correlations.append(np.array(corrdata1)) correlations.append(np.array(corrdata2)) trace1 = list() trace2 = list() for row in readtrace: # time in ms, trace in kHz trace1.append((np.float(row[0])*timefactor, np.float(row[1])/timedivfac)) trace2.append((np.float(row[0])*timefactor, np.float(row[2])/timedivfac)) traces.append([np.array(trace1), np.array(trace2)]) traces.append([np.array(trace1), np.array(trace2)]) elif Mode == "Quad": curvelist.append("AC1") curvelist.append("AC2") curvelist.append("CC12") curvelist.append("CC21") corrdata1 = list() corrdata2 = list() corrdata12 = list() corrdata21 = list() for row in readcorr: # tau in ms, corr-function minus "1" corrdata1.append((np.float(row[0])*timefactor, np.float(row[1])-1)) corrdata2.append((np.float(row[0])*timefactor, np.float(row[2])-1)) corrdata12.append((np.float(row[0])*timefactor, np.float(row[3])-1)) corrdata21.append((np.float(row[0])*timefactor, np.float(row[4])-1)) correlations.append(np.array(corrdata1)) correlations.append(np.array(corrdata2)) correlations.append(np.array(corrdata12)) correlations.append(np.array(corrdata21)) trace1 = list() trace2 = list() for row in readtrace: # time in ms, trace in kHz trace1.append((np.float(row[0])*timefactor, np.float(row[1])/timedivfac)) trace2.append((np.float(row[0])*timefactor, np.float(row[2])/timedivfac)) traces.append(np.array(trace1)) traces.append(np.array(trace2)) traces.append([np.array(trace1), np.array(trace2)]) traces.append([np.array(trace1), np.array(trace2)]) dictionary = dict() dictionary["Correlation"] = correlations dictionary["Trace"] = traces dictionary["Type"] = curvelist filelist = list() for i in curvelist: filelist.append(filename) dictionary["Filename"] = filelist return dictionary pycorrfit-0.8.1/setup.cfg0000644000175000017500000000007312262516600014112 0ustar toortoor[egg_info] tag_build = tag_date = 0 tag_svn_revision = 0 pycorrfit-0.8.1/external_model_functions/0000755000175000017500000000000012262516600017363 5ustar toortoorpycorrfit-0.8.1/external_model_functions/ExampleFunc_SFCS_1C_2D_Autocorrelation.txt0000755000175000017500000000125712262516600027263 0ustar toortoor# 2D SFCS AC # 2D one-component correlation function for perpendicular SFCS. # Model function for PyCorrFit.
# http://fcstools.dyndns.org/pycorrfit/ # http://fcstools.dyndns.org/pyscanfcs/ # The detection profile is elliptical, as the focus passes the membrane # perpendicular to its axis of symmetry. # The axis ratio / structure parameter is defined as: # SP = semi-major-axis / semi-minor-axis (wz/w0) ## Parameters # Number of particles Nob = 40.0 # Diffusion time taudiff [ms] = 1.0 # Axis ratio / structural parameter SP = 5 gFirst = 1/sqrt(1+tau/taudiff) gSecond = 1/sqrt(1+tau/(taudiff*SP**2)) # Correlation function G = 1/Nob * gFirst * gSecond pycorrfit-0.8.1/external_model_functions/Model_AC_3D+T_confocal.txt0000755000175000017500000000052312262516600024123 0ustar toortoor# 3D+T (Gauss) # Autocorrelation function for 3D diffusion + triplet ## Parameters # Particle number n = 10.0 # Triplet fraction T = 0.2 # Triplet time tautrip [ms] = 0.02 # Diffusion time taudiff [ms] = 0.4 # Structural parameter SP = 5 G = 1/( n*(1+tau/taudiff) * sqrt(1 + tau/(SP**2*taudiff)) ) * ( 1+T/(1.-T)*exp(-tau/tautrip) ) pycorrfit-0.8.1/external_model_functions/ExampleFunc_CS_2D+2D+S+T.txt0000755000175000017500000000573712262516600024116 0ustar toortoor# CS-FCS 2D+2D+S+T (Confocal) # Circular scanning FCS model function for two 2D-diffusing species # including a triplet component. # Further reading: # Precise Measurement of Diffusion Coefficients using Scanning # Fluorescence Correlation Spectroscopy # Petrasek and Schwille, BiophysJ 2008, 1437-1448 # http://dx.doi.org/10.1529/biophysj.107.108811 # Visit http://fcstools.dyndns.org/pyscanfcs/ for more information. # The first line of this file will be treated as the name of the model # inside PyCorrFit. PyCorrFit will enumerate user-imported models with IDs # starting at 7001. You can save a session and the user-defined models # like this one will be saved as well. Lines starting with a hash "#" # are treated as comments. Empty lines and lines with only white space # characters are ignored. # Note that if your code does not work, it might be that some variables # have another meaning. This includes using "n" instead of "N". # If you get a syntax error, it might be that your starting values # do not yield a reasonable starting function. PyCorrFit tests the # function with sympy (for safety) and evaluates it for # different values of tau. ## Definition of parameters: # First, define the parameters and their starting values for your model # function. If a parameter has a unit of measurement, it may be # added, separated by a white space, before the "=" sign. The starting # value should be a floating point number. You may use abbreviations # like "1e-3" instead of "0.001". # Note that PyCorrFit has its own unit system: # unit of time : 1 ms # unit of inverse time: 1000 /s # unit of distance : 100 nm # unit of diff.coeff : 10 µm²/s # unit of inverse area: 100 /µm² # unit of inv. volume : 1000 /µm³ # Diffusion coefficient of first component D1 [10 µm²/s] = 200.0 # Diffusion coefficient of second component D2 [10 µm²/s] = 20.0 # Fraction of species One F1 = 1.0 # Half waist of the lateral detection area (w0 = 2*a) a [100 nm] = 1.0 # Particle number n = 5.0 # Scan radius R [100 nm] = 3.850 # Frequency f [kHz] = .2 # Triplet fraction T = 0.1 # Triplet time tautrip [ms] = 0.001 # You may choose to substitute certain parts of the correlation function # with other values for easy reading. This can be done by using the # prefix "g". You may use all common mathematical functions, # such as "sqrt()" or "exp()". For convenience, "pi" and "e" may also # be used.
If you are paranoid, always use float numbers with a dot "." # to be sure the program doesn't accidentally do integer division. gTriplet = 1. + T/(1-T)*exp(-tau/tautrip) gScan1 = exp(-(R*sin(pi*f*tau))**2/(a**2+D1*tau)) gScan2 = exp(-(R*sin(pi*f*tau))**2/(a**2+D2*tau)) gTwoD1 = F1/(1.+D1*tau/a**2) gTwoD2 = (1-F1)/(1.+D2*tau/a**2) # The final line with the correlation function should start with a "G" # before the "=" sign. G = 1./n * (gTwoD1 * gScan1 + gTwoD2 * gScan2) * gTriplet pycorrfit-0.8.1/external_model_functions/ExampleFunc_SFCS_1C_2D_Cross-correlation.txt0000755000175000017500000000161312262516600027515 0ustar toortoor# 2D SFCS CC # 2D one-component correlation function for perpendicular SFCS. # Model function for PyCorrFit. # http://fcstools.dyndns.org/pycorrfit/ # http://fcstools.dyndns.org/pyscanfcs/ # The detection profile is elliptical, as the focus passes the membrane # perpendicular to its axis of symmetry. # The axis ratio / structure parameter is defined as: # SP = semi-major-axis / semi-minor-axis (wz/w0) # This model describes the cross-correlation for two-focus FCS ## Parameters # Number of particles Nob = 40.0 # Axis ratio / structural parameter SP = 5 # Diffusion time taudiff [ms] = 1.0 # Beam waist radius w0 [100nm] = 2.3 # Distance between the foci d [100nm] = 5.0 gFirst = 1/sqrt(1+tau/taudiff) gSecond = 1/sqrt(1+tau/(taudiff*SP**2)) gac = 1/Nob * gFirst * gSecond # Diffusion coefficient: gD = w0**2/(4*taudiff) gcc = exp(-d**2/(w0**2+4*gD*tau)) G = gac*gcc pycorrfit-0.8.1/external_model_functions/ExampleFunc_CS_3D+S+T.txt0000755000175000017500000000542412262516600023653 0ustar toortoor# CS-FCS 3D+S+T (Confocal) # Circular scanning FCS (3D diffusion with triplet). # Further reading: # Precise Measurement of Diffusion Coefficients using Scanning # Fluorescence Correlation Spectroscopy # Petrasek and Schwille, BiophysJ 2008, 1437-1448 # http://dx.doi.org/10.1529/biophysj.107.108811 # Visit http://fcstools.dyndns.org/pycorrfit/ for more information. # The first line of this file will be treated as the name of the model # inside PyCorrFit. PyCorrFit will enumerate user-imported models with IDs # starting at 7001. You can save a session and the user-defined models # like this one will be saved as well. Lines starting with a hash "#" # are treated as comments. Empty lines and lines with only white space # characters are ignored. # Note that if your code does not work, it might be that some variables # have another meaning. This includes using "n" instead of "N". # If you get a syntax error, it might be that your starting values # do not yield a reasonable starting function. PyCorrFit tests the # function with sympy (for safety) and evaluates it for # different values of tau. ## Definition of parameters: # First, define the parameters and their starting values for your model # function. If a parameter has a unit of measurement, it may be # added, separated by a white space, before the "=" sign. The starting # value should be a floating point number. You may use abbreviations # like "1e-3" instead of "0.001". # Note that PyCorrFit has its own unit system: # unit of time : 1 ms # unit of inverse time: 1000 /s # unit of distance : 100 nm # unit of diff.coeff : 10 µm²/s # unit of inverse area: 100 /µm² # unit of inv.
volume : 1000 /µm³ # Diffusion coefficient D [10 µm²/s] = 200.0 # Structural parameter w = 5.0 # Half waist of the lateral detection area (w0 = 2*a) a [100 nm] = 1.0 # Particle number n = 5.0 # Scan radius R [100 nm] = 5.0 # Frequency f [kHz] = 20.0 # Triplet fraction T = 0.1 # Triplet time tautrip [ms] = 0.001 # You may choose to substitute certain parts of the correlation function # with other values for easy reading. This can be done by using the # prefix "g". You may use all common mathematical functions, # such as "sqrt()" or "exp()". For convenience, "pi" and "e" may also # be used. If you are paranoid, always use float numbers with a dot "." # to be sure the program doesn't accidentally do integer division. gTrip = 1. + T/(1-T)*exp(-tau/tautrip) gScan = exp(-(R*sin(pi*f*tau))**2/(a**2+D*tau)) gTwoD = 1./(1.+D*tau/a**2) gOneD = 1./sqrt(1.+D*tau/(w*a)**2) gThrD = gTwoD * gOneD # The final line with the correlation function should start with a "G" # before the "=" sign. G = 1./n * gThrD * gScan * gTrip pycorrfit-0.8.1/external_model_functions/Model_Flow_AC_3D_confocal.txt0000755000175000017500000000217412262516600024717 0ustar toortoor# AC flow 3D (gauss) # Autocorrelation function including flow for confocal setups with # a freely 3D-diffusing species. # This file was gladly provided by Thomas Kuckert, Schwille Lab, Biotec, # Tatzberg 47-51, 1307 Dresden, Germany. # For more information about this model function, see: # Staroske, Wolfgang: In Vitro and In Vivo Applications of Fluorescence # Cross-Correlation Spectroscopy, TU Dresden, Diss., June 2010 # # Brinkmeier, M. ; Dörre, K. ; Stephan, J. ; Eigen, M.: Two-beam cross- # correlation: A method to characterize transport phenomena in micrometer- # sized structures. In: Anal Chem 71 (1999), Feb, Nr. 3, 609-616. # http://dx.doi.org/10.1021/ac980820i - DOI 10.1021/ac980820i ## Parameters # Diffusion coefficient D [10 µm²/s] = 10.0 # Structural parameter w = 6.5 # Waist of the lateral detection area a [100 nm] = 3.25 # Particle number n = 10.0 # Flow velocity v [100 µm/s] = 0.5 ## Calculation of the correlation function gFlow = exp(-((v**2) * (tau**2))/(a**2+4*D*tau)) gTwoD = 1./(1.+4*D*tau/a**2) gOneD = 1./sqrt(1.+4*D*tau/(w*a)**2) gThrD = gTwoD * gOneD G = 1./n * gThrD * gFlow pycorrfit-0.8.1/external_model_functions/Model_Flow_CC_Backward_3D_confocal.txt0000755000175000017500000000244112262516600026514 0ustar toortoor# CC bw flow 3D (gauss) # Backward cross-correlation function including flow for confocal setups with # a freely 3D-diffusing species. # This file was gladly provided by Thomas Kuckert, Schwille Lab, Biotec, # Tatzberg 47-51, 1307 Dresden, Germany. # For more information about this model function, see: # Staroske, Wolfgang: In Vitro and In Vivo Applications of Fluorescence # Cross-Correlation Spectroscopy, TU Dresden, Diss., June 2010 # # Brinkmeier, M. ; Dörre, K. ; Stephan, J. ; Eigen, M.: Two-beam cross- # correlation: A method to characterize transport phenomena in micrometer- # sized structures. In: Anal Chem 71 (1999), Feb, Nr. 3, 609-616. # http://dx.doi.org/10.1021/ac980820i -
DOI 10.1021/ac980820i ## Parameters # Diffusion coefficient D [10 µm²/s] = 10.0 # Structural parameter w = 6.5 # Waist of the lateral detection area a [100 nm] = 3.25 # Particle number n = 10.0 # Focal distance R [100 nm] = 5.0 # Flow velocity v [100 µm/s] = 0.5 # Angular difference between flow and foci vector alpha = 0.0000001 ## Calculation of the correlation function gFlowT = (v**2)*(tau**2)+R**2 gAng = 2*R*v*tau*cos(alpha) gC2Flow = exp(-(gFlowT+gAng)/(a**2+4*D*tau)) gTwoD = 1./(1.+D*tau/a**2) gOneD = 1./sqrt(1.+D*tau/(w*a)**2) gThrD = gTwoD * gOneD G = 1./n * gThrD * gC2Flow pycorrfit-0.8.1/external_model_functions/ExampleFunc_Exp_correlated_noise.txt0000755000175000017500000000050412262516600026552 0ustar toortoor# Exponentially correlated noise # Model function for PyCorrFit. # http://fcstools.dyndns.org/pycorrfit/ # This is a test function used to check decay times of exponentially # correlated noise. # Fraction Purity = 0.5 # Exp time tauexp [ms] = 2.0 gTrip = Purity/(1-Purity)*exp(-tau/tauexp) G = gTrip pycorrfit-0.8.1/external_model_functions/Model_Flow_CC_Forward_3D_confocal.txt0000755000175000017500000000244312262516600026404 0ustar toortoor# CC fw flow 3D (gauss) # Forward cross-correlation function including flow for confocal setups with # a freely 3D-diffusing species. # This file was gladly provided by Thomas Kuckert, Schwille Lab, Biotec, # Tatzberg 47-51, 1307 Dresden, Germany. # For more information about this model function, see: # Staroske, Wolfgang: In Vitro and In Vivo Applications of Fluorescence # Cross-Correlation Spectroscopy, TU Dresden, Diss., June 2010 # # Brinkmeier, M. ; Dörre, K. ; Stephan, J. ; Eigen, M.: Two-beam cross- # correlation: A method to characterize transport phenomena in micrometer- # sized structures. In: Anal Chem 71 (1999), Feb, Nr. 3, 609-616. # http://dx.doi.org/10.1021/ac980820i - DOI 10.1021/ac980820i ## Parameters # Diffusion coefficient D [10 µm²/s] = 10.0 # Structural parameter w = 6.5 # Waist of the lateral detection area a [100 nm] = 3.25 # Particle number n = 10.0 # Focal distance R [100 nm] = 5.0 # Flow velocity v [100 µm/s] = 0.5 # Angular difference between flow and foci vector alpha = 0.000001 ## Calculation of the correlation function gFlowT = (v**2)*(tau**2)+R**2 gAng = 2*R*v*tau*cos(alpha) gC1Flow = exp(-(gFlowT-gAng)/(a**2+4*D*tau)) gTwoD = 1./(1.+4*D*tau/a**2) gOneD = 1./sqrt(1.+4*D*tau/(w*a)**2) gThrD = gTwoD * gOneD G = 1./n * gThrD * gC1Flow pycorrfit-0.8.1/external_model_functions/ExampleFunc_TIRF_zOnly.txt0000755000175000017500000000336412262516600024363 0ustar toortoor# Axial diffusion (TIRF) # This model function describes fictional one-dimensional diffusion # in TIR-FCS setups. It demonstrates the mathematical functions available # in PyCorrFit. # Visit http://fcstools.dyndns.org/pycorrfit/ for more information. # The first line of this file will be treated as the name of the model # inside PyCorrFit. PyCorrFit will enumerate user-imported models with IDs # starting at 7001. You can save a session and the user-defined models # like this one will be saved as well. Lines starting with a hash "#" # are treated as comments. Empty lines and lines with only white space # characters are ignored. ## Definition of parameters: # First, define the parameters and their starting values for your model # function. If a parameter has a unit of measurement, it may be # added, separated by a white space, before the "=" sign. The starting # value should be a floating point number. You may use abbreviations # like "1e-3" instead of "0.001".
# Note that PyCorrFit has its own unit system: # unit of time : 1 ms # unit of inverse time: 1000 /s # unit of distance : 100 nm # unit of diff.coeff : 10 µm²/s # unit of inverse area: 100 /µm² # unit of inv. volume : 1000 /µm³ D [10 µm²/s] = 5e-5 d [100 nm] = 1.0 # The final line with the correlation function should start with a "G" # before the "=" sign. You may use all common mathematical functions, # such as "sqrt()" or "exp()". For convenience, "pi" and "e" may also # be used. If you need to use the Faddeeva function you can do so by # typing "wofz()". A commonly used version with an imaginary argument is # also available: wofz(i*x) = wixi(x) G = (sqrt(D*tau/pi) - (2*D*tau/d**2 - 1)/(2/d) * wixi(sqrt(D*tau)/d))/d**2 pycorrfit-0.8.1/PKG-INFO0000644000175000017500000000363312262516600013373 0ustar toortoorMetadata-Version: 1.0 Name: pycorrfit Version: 0.8.0 Summary: UNKNOWN Home-page: https://github.com/paulmueller/PyCorrFit Author: Paul Mueller Author-email: paul.mueller@biotec.tu-dresden.de License: GPL v2 Description: ![PyCorrFit](https://raw.github.com/paulmueller/PyCorrFit/master/doc-src/Images/PyCorrFit_logo_dark.png) ========= This repository contains the source code of PyCorrFit - a scientific tool for fitting correlation curves on a logarithmic plot. In current biomedical research, fluorescence correlation spectroscopy (FCS) is applied to characterize molecular dynamic processes in vitro and in living cells. Commercial FCS setups only permit data analysis that is limited to a specific instrument by the use of in-house file formats or a finite number of implemented correlation model functions. PyCorrFit is a general-purpose FCS evaluation software that, amongst other formats, supports the established Zeiss ConfoCor3 ~.fcs file format. PyCorrFit comes with several built-in model functions, covering a wide range of applications in standard confocal FCS. In addition, it contains equations dealing with different excitation geometries like total internal reflection (TIR). For more information, visit the official homepage at http://pycorrfit.craban.de. - [Download the latest version](https://github.com/paulmueller/PyCorrFit/releases) - [Documentation](https://github.com/paulmueller/PyCorrFit/raw/master/PyCorrFit_doc.pdf) - [Run PyCorrFit from source](https://github.com/paulmueller/PyCorrFit/wiki/Running-PyCorrFit-from-source) - [Write model functions](https://github.com/paulmueller/PyCorrFit/wiki/Writing-model-functions) - [Need help?](https://github.com/paulmueller/PyCorrFit/wiki/Creating-a-new-issue) Platform: UNKNOWN
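# For reference, every reader function in src/readfiles/ above returns the
# same dictionary layout. A minimal sketch of consuming it (Python 2, like
# the sources; the directory, file name and import path are hypothetical
# and assume the module is importable from src/readfiles/):
#
#     from read_ASC_ALV_6000 import openASC
#     data = openASC("/tmp", "measurement.ASC")
#     for curvetype, corr in zip(data["Type"], data["Correlation"]):
#         # corr holds (tau [ms], G(tau)) pairs as described in the docstrings
#         print curvetype, len(corr)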