Rmetrics::COPYRIGHT

Rmetrics Copyrights


2005-12-18 Built 221.10065

  
________________________________________________________________________________
Copyrights (C) for 

    R:  
      see R's copyright and license file
      
    Version R 2.0.0 claims:
    - The stub packages from 1.9.x have been removed.
    - All the datasets formerly in packages 'base' and 'stats' have
      been moved to a new package 'datasets'. 
    - Package 'graphics' has been split into 'grDevices' (the graphics
      devices shared between base and grid graphics) and 'graphics'
      (base graphics). 
    - Packages must have been re-installed for this version, and
      library() will enforce this.
    - Package names must now be given exactly in library() and
      require(), regardless of whether the underlying file system is
      case-sensitive or not.    

________________________________________________________________________________
for 
    
    Rmetrics:
      (C) 1999-2005, Diethelm Wuertz, GPL
      Diethelm Wuertz 
      www.rmetrics.org
      info@rmetrics.org
 
________________________________________________________________________________
for non default loaded basic packages part of R's basic distribution

    MASS:    
      Main Package of Venables and Ripley's MASS.
      We assume that MASS is available. 
      Package 'lqs' has been returned to 'MASS'.  
      S original by Venables & Ripley.
      R port by Brian Ripley .
      Earlier work by Kurt Hornik and Albrecht Gebhardt.
    methods: 
      Formally defined methods and classes for R objects, plus other 
      programming tools, as described in the reference "Programming 
      with Data" (1998), John M. Chambers, Springer NY. 
      R Development Core Team.
    mgcv:   
      Routines for GAMs and other generalized ridge regression
      with multiple smoothing parameter selection by GCV or UBRE.
      Also GAMMs by REML or PQL. Includes a gam() function.
      Simon Wood 
    nnet: 
      Feed-forward Neural Networks and Multinomial Log-Linear Models
      Original by Venables & Ripley. 
      R port by Brian Ripley .
      Earlier work by Kurt Hornik and Albrecht Gebhardt.
      
________________________________________________________________________________
for the code partly included as builtin functions from other R ports:

    fBasics:CDHSC.F
      GRASS program for distributional testing.
      By James Darrell McCauley 
      Original Fortran source by Paul Johnson (EZ006244@ALCOR.UCDAVIS.EDU)
    fBasics:nortest
      Five omnibus tests for the composite hypothesis of normality
      R-port by Juergen Gross 
    fBasics:SYMSTB.F
      Fast numerical approximation to the Symmetric Stable distribution 
      and density functions.  
      By Hu McCulloch 
    fBasics:tseries
      Functions for time series analysis and computational finance.
      Compiled by Adrian Trapletti 
         
    fCalendar:date     
      The tiny C program from Terry Therneau is used.
      R port by Th. Lumley,
      K. Halvorsen, and 
      Kurt Hornik 
    fCalendar:holidays
      The holiday information was collected from the internet and 
      governmental sources obtained from a few dozen websites
    fCalendar:libical
      Libical is an Open Source implementation of the IETF's 
      iCalendar Calendaring and Scheduling protocols. (RFC 2445, 2446, 
      and 2447). It parses iCal components and provides a C API for 
      manipulating the component properties, parameters, and subcomponents.
    fCalendar:vtimezone
      Olsen's VTIMEZONE database consists of data files that are released under 
      the GNU General Public License, in keeping with the license options of 
      libical. 
     
    fSeries:bdstest.c
      C Program to compute the BDS Test.
      Blake LeBaron
    fSeries:fracdiff  
      R functions, help pages and the Fortran Code for the 'fracdiff' 
      function are included. 
      S original by Chris Fraley 
      R-port by Fritz Leisch 
      since 2003-12: Martin Maechler
    fSeries:lmtest
      R functions and help pages for the linear modelling tests are included.
      Compiled by Torsten Hothorn ,
      Achim Zeileis , and
      David Mitchell
    fSeries:mda    
      R functions, help pages and the Fortran code for the 'mars' function
      are implemented.
      S original by Trevor Hastie & Robert Tibshirani,
      R port by Friedrich Leisch, Kurt Hornik and Brian D. Ripley 
    fSeries:modreg
      Brian Ripley and the R Core Team
    fSeries:polspline   
      R functions, help pages and the C/Fortran code for the 'polymars' 
      function are implemented.
      Charles Kooperberg 
    fSeries:systemfit
      Simultaneous Equation Estimation Package.
      R port by Jeff D. Hamann  and 
      Arne Henningsen 
    fSeries:tseries
      Functions for time series analysis and computational finance.
      Compiled by Adrian Trapletti 
    fSeries:UnitrootDistribution:
      The program uses the Fortran routine and the tables 
      from J.G. McKinnon. 
    fSeries:urca
      Unit root and cointegration tests for time series data.
      R port by Bernhard Pfaff .
     
    fExtremes:evd
      Functions for extreme value distributions.
      R port by Alec Stephenson 
      Function 'fbvpot' by Chris Ferro.
    fExtremes:evir
      Extreme Values in R
      Original S functions (EVIS) by Alexander McNeil 
      R port by Alec Stephenson   
    fExtremes:ismev
      An Introduction to Statistical Modeling of Extreme Values
      Original S functions by Stuart Coles 
      R port/documentation by Alec Stephenson 
      
    fOptions
      Option pricing formulas are implemented following the book and 
      the Excel spreadsheets of E.G. Haug, "The Complete Guide to Option 
      Pricing"; documentation is partly taken from www.derivicom.com, which 
      implements a C library based on Haug. For non-academic and commercial 
      use we recommend the professional software from "www.derivicom.com".  
    fOptions:SOBOL.F
      ACM Algorithm 659 by P. Bratley and B.L. Fox
      Extension on Algorithm 659 by S. Joe and F.Y. Kuo
    fOptions:CGAMA.F
      Complex gamma and related functions.
      Fortran routines by Jianming Jin.
    fOptions:CONHYP.F
      Confluent Hypergeometric and related functions.
      ACM Algorithm 707 by Mark Nardin, W.F. Perger, A. Bhalla
             
    fPortfolio:mvtnorm
      Multivariate Normal and T Distribution.
      Alan Genz , 
      Frank Bretz 
      R port by Torsten Hothorn 
    fPortfolio:quadprog
      Functions to solve Quadratic Programming Problems.
      S original by Berwin A. Turlach  
      R port by Andreas Weingessel 
    fPortfolio:sn
      The skew-normal and skew-t distributions.
      R port by Adelchi Azzalini 
    fPortfolio:tseries
      Functions for time series analysis and computational finance.
      Compiled by Adrian Trapletti 
 
fExtremes/inst/unitTests/0000755000176000001440000000000012251673345015233 5ustar ripleyusersfExtremes/inst/unitTests/Makefile0000644000176000001440000000042111370220752016657 0ustar ripleyusersPKG=fExtremes TOP=../.. SUITE=doRUnit.R R=R all: inst test inst: # Install package -- but where ?? -- will that be in R_LIBS ? cd ${TOP}/..;\ ${R} CMD INSTALL ${PKG} test: # Run unit tests export RCMDCHECK=FALSE;\ cd ${TOP}/tests;\ ${R} --vanilla --slave < ${SUITE}fExtremes/inst/unitTests/runit.GevMdaEstimation.R0000644000176000001440000000561111370220752021710 0ustar ripleyusers # This library is free software; you can redistribute it and/or # modify it under the terms of the GNU Library General Public # License as published by the Free Software Foundation; either # version 2 of the License, or (at your option) any later version. # # This library is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Library General Public License for more details. # # You should have received a copy of the GNU Library General # Public License along with this library; if not, write to the # Free Foundation, Inc., 59 Temple Place, Suite 330, Boston, # MA 02111-1307 USA # Copyrights (C) # for this R-port: # 1999 - 2007, Diethelm Wuertz, GPL # Diethelm Wuertz # info@rmetrics.org # www.rmetrics.org # for the code accessed (or partly included) from other R-ports: # see R's copyright and license files # for the code accessed (or partly included) from contributed R-ports # and other sources # see Rmetrics's copyright file ################################################################################ # FUNCTION: MDA ESTIMATORS: # hillPlot Plot Hill's estimator # shaparmPlot Pickands, Hill & Decker-Einmahl-deHaan Estimator # shaparmPickands Auxiliary function called by shaparmPlot # shaparmHill ... called by shaparmPlot # shaparmDehaan ... 
called by shaparmPlot ################################################################################ test.hillPlot = function() { # hillPlot Plot Hill's estimator # Graph Frame: par(mfrow = c(2, 2), cex = 0.7) par(ask = FALSE) # Hill Plot: hillPlot(gevSim(n=1000), plottype = "alpha") hillPlot(gevSim(n=1000), plottype = "xi"); grid() # Don't Plot Return Value: hillPlot(gevSim(n=1000), plottype = "alpha", doplot = FALSE) hillPlot(gevSim(n=1000), plottype = "xi", doplot = FALSE); grid() # Return Value: return() } # ------------------------------------------------------------------------------ test.shaparmPlot = function() { # shaparmPlot Pickands, Hill & Decker-Einmahl-deHaan Estimator # Graph Frame: par(mfrow = c(2, 2), cex = 0.7) par(ask = FALSE) # shaparmPlot(x, p = 0.01*(1:10), xiRange = NULL, alphaRange = NULL, # doplot = TRUE, plottype = c("both", "upper")) # Graph Frame: par(mfcol = c(3, 2), cex = 0.7) par(ask = FALSE) shaparmPlot(as.timeSeries(data(bmwRet))) # Print (Results: shaparmPlot(as.timeSeries(data(bmwRet)), doplot = FALSE) # Tailored p: shaparmPlot(as.timeSeries(data(bmwRet)), p = 0.005*(2:20)) # Return Value: return() } ################################################################################ fExtremes/inst/unitTests/runit.ExtremeIndex.R0000644000176000001440000001115311370220752021107 0ustar ripleyusers # This library is free software; you can redistribute it and/or # modify it under the terms of the GNU Library General Public # License as published by the Free Software Foundation; either # version 2 of the License, or (at your option) any later version. # # This library is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Library General Public License for more details. 
# # You should have received a copy of the GNU Library General # Public License along with this library; if not, write to the # Free Foundation, Inc., 59 Temple Place, Suite 330, Boston, # MA 02111-1307 USA # Copyrights (C) # for this R-port: # 1999 - 2004, Diethelm Wuertz, GPL # Diethelm Wuertz # info@rmetrics.org # www.rmetrics.org # for the code accessed (or partly included) from other R-ports: # see R's copyright and license files # for the code accessed (or partly included) from contributed R-ports # and other sources # see Rmetrics's copyright file ################################################################################ # FUNCTION: DESCRIPTION: # 'fTHETA' Class representation for extremal index # show.fTHETA S4: Print Method for extremal index # thetaSim Simulates a time series with known theta # FUNCTION: DESCRIPTION: # blockTheta Computes theta from Block Method # clusterTheta Computes theta from Reciprocal Cluster Method # runTheta Computes theta from Run Method # ferrosegersTheta Computes Theta according to Ferro and Seegers # FUNCTION: DESCRIPTION: # exindexesPlot Computes and Plot Theta(1,2,3) # exindexPlot Computes Theta(1,2) and Plot Theta(1) ################################################################################ test.fTHETA = function() { # Slot Names: slotNames("fTHETA") # [1] "call" "data" "theta" "title" "description" # Return Value: return() } # ------------------------------------------------------------------------------ test.thetaSim = function() { # Simulation: # thetaSim(model = c("max", "pair"), n = 100, theta = 0.5) # Max Frechet Series: x = thetaSim("max") class(x) print(x) # Paired Exponential Series: x = thetaSim("pair") class(x) print(x) # Return Value: return() } # ------------------------------------------------------------------------------ test.thetaFit = function() { # Parameter Estimation: x.ts = thetaSim("max", n=22000) class(x.ts) # Parameter Estimation: # blockTheta(x, block = 22, quantiles = seq(0.95, 0.995, 
length = 10), # title = NULL, description = NULL) # clusterTheta(x, block = 22, quantiles = seq(0.95, 0.995, length = 10), # title = NULL, description = NULL) # runTheta(x, block = 22, quantiles = seq(0.95, 0.995, length = 10), # title = NULL, description = NULL) # ferrosegersTheta(x, quantiles = seq(0.95, 0.995, length = 10), # title = NULL, description = NULL) # time series ts as input: blockTheta(x.ts) clusterTheta(x.ts) runTheta(x.ts) ferrosegersTheta(x.ts) # Numeric Vector as input: x.vec = as.vector(x.ts) blockTheta(x.vec) clusterTheta(x.vec) runTheta(x.vec) ferrosegersTheta(x.vec) # timeSeries object as input: x.tS = as.timeSeries(x.ts) blockTheta(x.tS) clusterTheta(x.tS) runTheta(x.tS) ferrosegersTheta(x.tS) # Return Value: return() } # ------------------------------------------------------------------------------ test.exindexesPlot = function() { # Graphics Frame: par(mfrow = c(2, 2), cex = 0.7) par(ask = FALSE) # Parameter Estimation: x = thetaSim("max", n = 22000) exindexesPlot(x) # Parameter Estimation: y = thetaSim("pair", n = 22000) exindexesPlot(y) # Return Value: return() } # ------------------------------------------------------------------------------ test.exindexPlot = function() { # Graphics Frame: par(mfrow = c(2, 2), cex = 0.7) par(ask = FALSE) # Parameter Estimation: x = thetaSim("max", n=22000) exindexPlot(x, block = 22) # Parameter Estimation: y = thetaSim("pair", n=22000) exindexPlot(y, block = 22) # Return Value: return() } ################################################################################ fExtremes/inst/unitTests/runit.GpdModelling.R0000644000176000001440000001077211370220752021061 0ustar ripleyusers # This library is free software; you can redistribute it and/or # modify it under the terms of the GNU Library General Public # License as published by the Free Software Foundation; either # version 2 of the License, or (at your option) any later version. 
# # This library is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Library General Public License for more details. # # You should have received a copy of the GNU Library General # Public License along with this library; if not, write to the # Free Foundation, Inc., 59 Temple Place, Suite 330, Boston, # MA 02111-1307 USA # Copyrights (C) # for this R-port: # 1999 - 2007, Diethelm Wuertz, GPL # Diethelm Wuertz # info@rmetrics.org # www.rmetrics.org # for the code accessed (or partly included) from other R-ports: # see R's copyright and license files # for the code accessed (or partly included) from contributed R-ports # and other sources # see Rmetrics's copyright file # ############################################################################## # FUNCTION: GPD SIMULATION: # gpdSim Simulates a GPD distributed process # FUNCTION: GPD PARAMETER ESTIMATION: # 'fGPDFIT' S4 class representation # gpdFit Fits Parameters of GPD distribution # METHODS: PRINT, PLOT, AND SUMMARY: # show.fGPDFIT S4 Print Method for object of class "fGPDFIT" # plot.fGPDFIT S3 Plot Method for object of class "fGPDFIT" # summary.fGPDFIT S3 Summary Method for object of class "fGPDFIT" ################################################################################ test.gpdSim = function() { # Generate Artificial Data Set: x = gpdSim(model = list(xi = 0.25, mu = 0, beta = 1), n = 1000, seed = 4711) class(x) # Plot Series: par(mfrow = c(2, 1), cex = 0.7) par(ask = FALSE) seriesPlot(as.timeSeries(x)) # Return Value: return() } # ------------------------------------------------------------------------------ test.fGPDFIT = function() { # Slot names: slotNames("fGPDFIT") # [1] "call" "method" "parameter" "data" "fit" # [6] "residuals" "title" "description" # Return Value: return() } # ------------------------------------------------------------------------------ 
test.gpdFit = function() { # Generate Artificial Data Set: model = list(xi = -0.25, mu = 0, beta = 1) ts = gpdSim(model = model, n = 5000, seed = 4711) class(ts) # Transform As timeSeries: tS = as.timeSeries(ts) class(tS) # As numeric vector: x = as.vector(ts) class(x) # GPD Fit: # gpdFit(x, u = quantile(x, 0.95), type = c("mle", "pwm"), # information = c("observed", "expected"), title = NULL, # description = NULL, ...) # PWM Fit: fit = gpdFit(tS, u = min(series(tS)), "pwm") print(fit) fit = gpdFit(ts, u = min(ts), "pwm") print(fit) fit = gpdFit(x, u = min(x), "pwm") print(fit) # MLE Fit: fit = gpdFit(tS, u = min(series(tS)), "mle") print(fit) fit = gpdFit(ts, u = min(ts), "mle") print(fit) fit = gpdFit(x, u = min(x), "mle") print(fit) # Information: fit = gpdFit(tS, u = min(series(tS)), type = "mle", information = "observed") print(fit) fit = gpdFit(tS, u = min(series(tS)), type = "mle", information = "expected") print(fit) # Return Value: return() } # ------------------------------------------------------------------------------ test.plot = function() { # Artificial Data Set: model = list(xi = -0.25, mu = 0, beta = 1) ts = gpdSim(model = model, n = 5000, seed = 4711) class(ts) # Fit: fit = gpdFit(ts, u = min(ts), type = "mle") print(fit) par(mfrow = c(2, 2), cex = 0.7) par(ask = FALSE) plot(fit, which = "all") # Try: # plot(fit, which = "ask") # Return Value: return() } # ------------------------------------------------------------------------------ test.summary = function() { # Artificial Data Set: model = list(xi = -0.25, mu = 0, beta = 1) ts = gpdSim(model = model, n = 5000, seed = 4711) class(ts) # Fit: fit = gpdFit(ts, u = min(ts), type = "mle") summary(fit, doplot = FALSE) # Return Value: return() } ################################################################################ fExtremes/inst/unitTests/runit.GpdDistribution.R0000644000176000001440000000602411370220752021621 0ustar ripleyusers # This library is free software; you can redistribute it and/or 
# modify it under the terms of the GNU Library General Public # License as published by the Free Software Foundation; either # version 2 of the License, or (at your option) any later version. # # This library is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Library General Public License for more details. # # You should have received a copy of the GNU Library General # Public License along with this library; if not, write to the # Free Foundation, Inc., 59 Temple Place, Suite 330, Boston, # MA 02111-1307 USA # Copyrights (C) # for this R-port: # 1999 - 2007, Diethelm Wuertz, GPL # Diethelm Wuertz # info@rmetrics.org # www.rmetrics.org # for the code accessed (or partly included) from other R-ports: # see R's copyright and license files # for the code accessed (or partly included) from contributed R-ports # and other sources # see Rmetrics's copyright file ################################################################################ # FUNCTION: GPD DISTRIBUTION FAMILY: # dgpd Density for the Generalized Pareto DF [USE FROM EVIS] # pgpd Probability for the Generalized Pareto DF # qgpd Quantiles for the Generalized Pareto DF # rgpd Random variates for the Generalized Pareto DF # gpdMoments Computes true statistics for GPD distribution # gpdSlider Displays distribution and rvs for GPD distribution ################################################################################ test.gpd = function() { # Check Distribution: set.seed(1985) .distCheck(fun = "gpd", n = 500, xi = 1, mu = 0, beta = 1) # Return Value: return() } # ------------------------------------------------------------------------------ test.gpdMoments = function() { # gpdMoments(xi = 1, mu = 0, beta = 1) # Compute Moments: xi = seq(-2, 2, length = 401) mom = gpdMoments(xi) # Plot Mean: par(mfrow = c(2, 1), cex = 0.7) par(ask = FALSE) plot(xi, mom$mean, main = "Mean", 
pch = 19, cex = 0.5) abline(v = 1, col = "red", lty = 3) abline(h = 0, col = "red", lty = 3) # Plot Variance: plot(xi, log(mom$var), main = "log Variance", pch = 19, cex = 0.5) abline(v = 1/2, col = "red", lty = 3) abline(h = 0.0, col = "red", lty = 3) # Return Value: return() } # ------------------------------------------------------------------------------ test.gpdSlider = function() { # Distribution Slider: # print("Activate Slider manually!") # gpdSlider(method = "dist") # Random Variates Slider: # print("Activate Slider manually!") # gpdSlider(method = "rvs") NA # Return Value: return() } ################################################################################ fExtremes/inst/unitTests/runit.GevRisk.R0000644000176000001440000000370211370220752020061 0ustar ripleyusers # This library is free software; you can redistribute it and/or # modify it under the terms of the GNU Library General Public # License as published by the Free Software Foundation; either # version 2 of the License, or (at your option) any later version. # # This library is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Library General Public License for more details. 
# # You should have received a copy of the GNU Library General # Public License along with this library; if not, write to the # Free Foundation, Inc., 59 Temple Place, Suite 330, Boston, # MA 02111-1307 USA # Copyrights (C) # for this R-port: # 1999 - 2007, Diethelm Wuertz, GPL # Diethelm Wuertz # info@rmetrics.org # www.rmetrics.org # for the code accessed (or partly included) from other R-ports: # see R's copyright and license files # for the code accessed (or partly included) from contributed R-ports # and other sources # see Rmetrics's copyright file ################################################################################ # FUNCTION: ADDITIONAL FUNCTIONS: # gevrlevelPlot Calculates Return Levels Based on GEV Fit # .gevrlevelLLH Computes log-likelihood function for gevrlevelPlot ################################################################################ test.returnLevel = function() { # gevrlevelPlot(object, kBlocks = 20, ci = c(0.90, 0.95, 0.99), # plottype = c("plot", "add"), labels = TRUE,...) # Artificial Data Set: model = list(xi = -0.25, mu = 0, beta = 1) x = gevSim(model = model, n = 1000, seed = 4711) class(x) # Empirical distribution plot: fit = gevFit(x) gevrlevelPlot(fit) # Return Value: return() } ################################################################################ fExtremes/inst/unitTests/runit.DataPreprocessing.R0000644000176000001440000001116411370220752022125 0ustar ripleyusers # This library is free software; you can redistribute it and/or # modify it under the terms of the GNU Library General Public # License as published by the Free Software Foundation; either # version 2 of the License, or (at your option) any later version. # # This library is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Library General Public License for more details. 
# # You should have received a copy of the GNU Library General # Public License along with this library; if not, write to the # Free Foundation, Inc., 59 Temple Place, Suite 330, Boston, # MA 02111-1307 USA # Copyrights (C) # for this R-port: # 1999 - 2007, Diethelm Wuertz, GPL # Diethelm Wuertz # info@rmetrics.org # www.rmetrics.org # for the code accessed (or partly included) from other R-ports: # see R's copyright and license files # for the code accessed (or partly included) from contributed R-ports # and other sources # see Rmetrics's copyright file ################################################################################ # FUNCTION DATA PREPROCESSING: # blockMaxima Returns block maxima from a time series # findThreshold Upper threshold for a given number of extremes # pointProcess Returns peaks over a threshold from a time series # deCluster Declusters a point process ################################################################################ test.blockMaxima = function() { # blockMaxima - Returns block maxima from a time series # blockMaxima(x, block = c("monthly", "quarterly"), doplot = FALSE) # Time Series Data: x = MSFT[, "Close"] x.ret = 100*returns(x) head(x.ret) class(x.ret) # Monthly Block Maxima: ans = blockMaxima(x.ret, block = "monthly", doplot = TRUE) print(ans) # Quarterly Block Maxima: ans = blockMaxima(x.ret, block = "q", doplot = TRUE) print(ans) # 20-Days Block Maxima: ans = blockMaxima(x.ret, block = 20, doplot = TRUE) print(ans) # Numerical Data Vector: x.ret = as.vector(x.ret) head(x.ret) ans = blockMaxima(x.ret, block = 20, doplot = TRUE) print(ans) # Stops by stopifnot() - Check: # blockMaxima(x.ret, block = "month", doplot = TRUE) # Return Value: return() } # ------------------------------------------------------------------------------ test.findThreshold = function() { # findThreshold - Upper threshold for a given number of extremes # findThreshold(x, n = floor(0.05*length(as.vector(x))), doplot = FALSE) # Time Series Data: 
x = MSFT[, "Close"] x.ret = 100*returns(x) head(x.ret) class(x.ret) # Find 99% Threshold: par(mfrow = c(2, 2), cex = 0.7) par(ask = FALSE) findThreshold(x.ret, n = floor(0.01*length(as.vector(x))), doplot = TRUE) # Remark - Alternative use ... quantile(x.ret, probs = 1 - 0.01) quantile(x.ret, probs = 1 - 0.01, type = 1) # Find 95% Threshold: findThreshold(x.ret, doplot = TRUE) # Find 90% Threshold: findThreshold(x.ret, n = floor(0.1*length(as.vector(x))), doplot = TRUE) # Try if x is a numeric vector: findThreshold(as.vector(x.ret), doplot = TRUE) # Return Value: return() } # ------------------------------------------------------------------------------ test.pointProcess = function() { # pointProcess - Returns peaks over a threshold from a time series # pointProcess(x, u = quantile(x, 0.95), doplot = FALSE) # Time Series Data: x = MSFT[, "Close"] x.ret = 100*returns(x) head(x.ret) class(x.ret) # Plot Series: par(mfrow = c(2, 1), cex = 0.7) par(ask = FALSE) # plot(x.ret, type = "l", main = "Series") # abline(h = 0, col = "red", lty = 3) # or use ... seriesPlot(x.ret) # Point Process: pp = pointProcess(x.ret, u = quantile(x.ret, 0.8)) pp plot(pp, type = "b", main = "Point Process") abline(h = 0, col = "red", lty = 3) # Try seriesPlot(pp) # ... 
add points in graph # Return Value: return() } # ------------------------------------------------------------------------------ test.deCluster = function() { # deCluster - Declusters a point process # deCluster(x, run = 20, doplot = TRUE) # Time Series Data: x = MSFT[, "Close"] x.ret = 100*returns(x) head(x.ret) class(x.ret) # Decluster Time Series: tS = deCluster(x = x.ret, run = 3) print(tS) # Return Value: return() } ################################################################################ fExtremes/inst/unitTests/runit.GevModelling.R0000644000176000001440000002327712157313044021075 0ustar ripleyusers # This library is free software; you can redistribute it and/or # modify it under the terms of the GNU Library General Public # License as published by the Free Software Foundation; either # version 2 of the License, or (at your option) any later version. # # This library is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Library General Public License for more details. 
#
# You should have received a copy of the GNU Library General
# Public License along with this library; if not, write to the
# Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston,
# MA 02111-1307 USA

# Copyrights (C)
# for this R-port:
#   1999 - 2007, Diethelm Wuertz, GPL
#   Diethelm Wuertz
#   info@rmetrics.org
#   www.rmetrics.org
# for the code accessed (or partly included) from other R-ports:
#   see R's copyright and license files
# for the code accessed (or partly included) from contributed R-ports
# and other sources:
#   see Rmetrics's copyright file


################################################################################
# FUNCTION:          GEV SIMULATION:
#  gevSim             Simulates a GEV distributed process
#  gumbelSim          Simulates a Gumbel distributed process
# FUNCTION:          GEV PARAMETER ESTIMATION:
#  'fGEVFIT'          S4 class representation
#  gevFit             Fits parameters of GEV distribution
#  gumbelFit          Fits parameters of Gumbel distribution
# METHODS:           PRINT, PLOT, AND SUMMARY:
#  show.fGEVFIT       S4 show method for object of class "fGEVFIT"
#  plot.fGEVFIT       S3 plot method for object of class "fGEVFIT"
#  summary.fGEVFIT    S3 summary method for object of class "fGEVFIT"
################################################################################


test.gevSim =
function()
{
    # gevSim(model = list(xi = -0.25, mu = 0, beta = 1), n = 1000, seed = NULL)

    # RVs:
    RNGkind(kind = "Marsaglia-Multicarry", normal.kind = "Inversion")
    set.seed(4711, kind = "Marsaglia-Multicarry")

    # Artificial Data Set:
    model = list(xi = -0.25, mu = 0, beta = 1)
    x.ts = gevSim(model, n = 50, seed = 4711)
    class(x.ts)
    print(x.ts)

    # Create a daily timeSeries object with dummy dates:
    as.timeSeries(x.ts)

    # Create a daily timeSeries object starting 2007-01-01:
    Calendar = timeSequence(from = "2007-01-01", length.out = length(x.ts))
    x.tS = timeSeries(data = x.ts, charvec = Calendar, units = "x")
    print(x.tS)

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.gumbelSim =
function()
{
    # gumbelSim(model = list(mu = 0, beta = 1), n = 1000, seed = NULL)

    # RVs:
    RNGkind(kind = "Marsaglia-Multicarry", normal.kind = "Inversion")
    set.seed(4711, kind = "Marsaglia-Multicarry")

    # Artificial Data Set:
    model = list(mu = 0, beta = 1)
    x.ts = gumbelSim(model, n = 50, seed = 4711)
    class(x.ts)
    print(x.ts)

    # Create a daily timeSeries object with dummy dates:
    x.tS = as.timeSeries(x.ts)
    print(x.tS)

    # Create a daily timeSeries object starting 2007-01-01:
    Calendar = timeSequence(from = "2007-01-01", length.out = length(x.ts))
    x.tS = timeSeries(data = x.ts, charvec = Calendar, units = "x")
    print(x.tS)

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.numericVectorBlocks =
function()
{
    # RVs:
    RNGkind(kind = "Marsaglia-Multicarry", normal.kind = "Inversion")
    set.seed(4711, kind = "Marsaglia-Multicarry")

    # Check numeric vector as input:
    X = rt(5000, df = 4)
    x.vec = blockMaxima(X, 20)
    class(x.vec)
    head(x.vec)

    # Internal Fit - GEV PWM:
    fit = .gevpwmFit(x.vec)
    fit
    fit$par.ests

    # Internal Fit - GEV MLE:
    fit = .gevmleFit(x.vec)
    fit
    fit$par.ests

    # Internal Fit - Gumbel PWM:
    fit = .gumpwmFit(x.vec)
    fit
    fit$par.ests

    # Internal Fit - Gumbel MLE:
    fit = .gummleFit(x.vec)
    fit
    fit$par.ests

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.timeSeriesBlocks =
function()
{
    # RVs:
    RNGkind(kind = "Marsaglia-Multicarry", normal.kind = "Inversion")
    set.seed(4711, kind = "Marsaglia-Multicarry")

    # Create an artificial timeSeries with dummy positions:
    xx <- rt(5000, df = 4)
    charvec <- timeSequence("2013-01-01", length.out = NROW(xx))
    X = timeSeries(xx, charvec = charvec)

    # Compute Block Maxima:
    x.tS = blockMaxima(X, "monthly")
    class(x.tS)
    head(x.tS)

    # Convert to Vector:
    x.vec = as.vector(x.tS)

    # Internal Fit - GEV PWM:
    fit = .gevpwmFit(x.vec)
    fit
    fit$par.ests

    # Internal Fit - GEV MLE:
    fit = .gevmleFit(x.vec)
    fit
    fit$par.ests

    # Internal Fit - Gumbel PWM:
    fit = .gumpwmFit(x.vec)
    fit
    fit$par.ests

    # Internal Fit - Gumbel MLE:
    fit = .gummleFit(x.vec)
    fit
    fit$par.ests

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.gevFit =
function()
{
    # gevFit(x, block = 1, type = c("mle", "pwm"),
    #   title = NULL, description = NULL, ...)

    # RVs:
    RNGkind(kind = "Marsaglia-Multicarry", normal.kind = "Inversion")
    set.seed(4711, kind = "Marsaglia-Multicarry")

    # Simulate Series:
    model = list(xi = -0.25, mu = 0, beta = 1)
    x.ts = gevSim(model = model, n = 5000, seed = 4711)
    class(x.ts)

    # Check time series input:
    fit = gevFit(x.ts, block = 1, type = "pwm")
    class(fit)
    print(fit)
    fit = gevFit(x.ts, block = 1, type = "mle")
    class(fit)
    print(fit)

    # Check numeric vector input:
    fit = gevFit(as.vector(x.ts), block = 1, type = "pwm")
    class(fit)
    print(fit)
    fit = gevFit(as.vector(x.ts), block = 1, type = "mle")
    class(fit)
    print(fit)

    # Check timeSeries object input:
    fit = gevFit(as.timeSeries(x.ts), block = 1, type = "pwm")
    class(fit)
    print(fit)
    fit = gevFit(as.timeSeries(x.ts), block = 1, type = "mle")
    class(fit)
    print(fit)

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.gumbelFit =
function()
{
    # gumbelFit(x, block = 1, type = c("mle", "pwm"),
    #   title = NULL, description = NULL, ...)
    # RVs:
    RNGkind(kind = "Marsaglia-Multicarry", normal.kind = "Inversion")
    set.seed(4711, kind = "Marsaglia-Multicarry")

    # Simulate Series:
    model = list(mu = 0, beta = 1)
    x.ts = gumbelSim(model = model, n = 5000, seed = 4711)
    class(x.ts)

    # Check time series input:
    fit = gumbelFit(x.ts, block = 1, type = "pwm")
    class(fit)
    print(fit)
    fit = gumbelFit(x.ts, block = 1, type = "mle")
    class(fit)
    print(fit)

    # Check numeric vector input:
    fit = gumbelFit(as.vector(x.ts), block = 1, type = "pwm")
    class(fit)
    print(fit)
    fit = gumbelFit(as.vector(x.ts), block = 1, type = "mle")
    class(fit)
    print(fit)

    # Check timeSeries object input:
    fit = gumbelFit(as.timeSeries(x.ts), block = 1, type = "pwm")
    class(fit)
    print(fit)
    fit = gumbelFit(as.timeSeries(x.ts), block = 1, type = "mle")
    class(fit)
    print(fit)

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.gevFitByBlocks <-
function()
{
    # gevFit(x, block = 1, type = c("mle", "pwm"),
    #   title = NULL, description = NULL, ...)

    # RVs:
    RNGkind(kind = "Marsaglia-Multicarry", normal.kind = "Inversion")
    set.seed(4711, kind = "Marsaglia-Multicarry")

    # Simulate Series:
    model = list(xi = -0.25, mu = 0, beta = 1)
    x.ts = gevSim(model = model, n = 5000, seed = 4711)
    class(x.ts)
    x.vec = as.vector(x.ts)
    class(x.vec)
    charvec <- timeSequence("2013-01-01", length.out = NROW(x.vec))
    x.tS = timeSeries(x.vec, charvec)

    # ts as input & 20 Days Blocks:
    fit = gevFit(x.ts, block = 20, type = "pwm")
    fit
    fit = gevFit(x.ts, block = 20, type = "mle")
    fit

    # Numeric vector as input & 20 Days Blocks:
    fit = gevFit(x.vec, block = 20, type = "pwm")
    fit
    fit = gevFit(x.vec, block = 20, type = "mle")
    fit

    # timeSeries object as input & Monthly/Quarterly Blocks:
    fit = gevFit(x.tS, block = "monthly", type = "pwm")
    fit
    fit = gevFit(x.tS, block = "quarterly", type = "mle")
    fit

    # timeSeries object as input & 20 Days Blocks:
    fit = gevFit(x.tS, block = 20, type = "pwm")
    fit
    fit = gevFit(x.tS, block = 20, type = "mle")
    fit

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.plot =
function()
{
    # Load Data:
    x = as.timeSeries(data(danishClaims))

    # Parameter Estimation with Declustering:
    # gevFit(x, block = 1, type = c("mle", "pwm"),
    #   title = NULL, description = NULL, ...)
    fit = gevFit(x, block = "month")
    print(fit)

    # Plot:
    par(mfrow = c(2, 2), cex = 0.7)
    par(ask = FALSE)
    plot(fit, which = 1:4)

    # Try Interactive:
    # plot(fit)

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.summary =
function()
{
    # Summary Report:
    # summary(object, doplot = TRUE, which = "all", ...)
    # Load Data:
    x = as.timeSeries(data(danishClaims))

    # Parameter Estimation with Declustering:
    fit = gevFit(x, block = "month")
    print(fit)

    # Summary:
    summary(fit, doplot = FALSE)

    # Return Value:
    return()
}


################################################################################

fExtremes/inst/unitTests/runit.GpdRisk.R

# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Library General Public
# License as published by the Free Software Foundation; either
# version 2 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU Library General
# Public License along with this library; if not, write to the
# Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston,
# MA 02111-1307 USA

# Copyrights (C)
# for this R-port:
#   1999 - 2007, Diethelm Wuertz, GPL
#   Diethelm Wuertz
#   info@rmetrics.org
#   www.rmetrics.org
# for the code accessed (or partly included) from other R-ports:
#   see R's copyright and license files
# for the code accessed (or partly included) from contributed R-ports
# and other sources:
#   see Rmetrics's copyright file


################################################################################
# FUNCTION:          ADDITIONAL PLOTS:
#  gpdTailPlot        Plots tail estimate from GPD model
#  gpdQuantPlot       Plots GPD tail estimate of a high quantile
#  gpdShapePlot       Plots for GPD shape parameter
#  gpdQPlot           Adds quantile estimates to plot.gpd
#  gpdSfallPlot       Adds expected shortfall estimates to a GPD plot
#  gpdRiskMeasures    Calculates quantiles and expected shortfalls
# FUNCTION:          NEW STYLE FUNCTIONS:
#  tailPlot           Plots GPD VaR and expected shortfall risk
#  tailSlider         Interactive view to find a proper threshold value
#  tailRiskMeasures   Calculates VaR and expected shortfall risks
################################################################################


test.gpdTailPlot =
function()
{
    # Artificial Data Set:
    x = gpdSim(seed = 1985)
    fit = gpdFit(x)
    par(mfrow = c(1, 1))
    par(ask = FALSE)
    gpdTailPlot(fit)

    # Danish Fire Claims:
    x = as.timeSeries(data(danishClaims))
    fit = gpdFit(x)
    par(mfrow = c(1, 1))
    par(ask = FALSE)
    gpdTailPlot(fit)

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.gpdQuantPlot =
function()
{
    # Artificial Data Set:
    x = gpdSim(seed = 1985)
    par(mfrow = c(1, 1))
    par(ask = FALSE)
    gpdQuantPlot(x)

    # Danish Fire Claims:
    x = as.timeSeries(data(danishClaims))
    fit = gpdFit(x)
    par(mfrow = c(1, 1))
    par(ask = FALSE)
    gpdQuantPlot(x)

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.gpdShapePlot =
function()
{
    # Artificial Data Set:
    x = gpdSim(seed = 1985)
    par(mfrow = c(1, 1))
    par(ask = FALSE)
    gpdShapePlot(x)

    # Danish Fire Claims:
    x = as.timeSeries(data(danishClaims))
    par(mfrow = c(1, 1))
    par(ask = FALSE)
    gpdShapePlot(x)

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.gpdQPlot =
function()
{
    # Artificial Data Set:
    x = gpdSim(seed = 1985)
    fit = gpdFit(x)
    tp = gpdTailPlot(fit)
    gpdQPlot(tp)

    # Danish Fire Claims:
    x = as.timeSeries(data(danishClaims))
    fit = gpdFit(x, u = 10)
    tp = gpdTailPlot(fit)
    gpdQPlot(tp)

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.gpdSfallPlot =
function()
{
    # Artificial Data Set:
    x = gpdSim(seed = 1985)
    fit = gpdFit(x)
    ### tp = gpdTailPlot(fit)    # CHECK
    ### gpdSfallPlot(tp)         # CHECK

    # Danish Fire Claims:
    x = as.timeSeries(data(danishClaims))
    fit = gpdFit(as.vector(x), u = 10)
    ### tp = gpdTailPlot(fit)    # CHECK
    ### gpdSfallPlot(tp)         # CHECK

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.tailPlot =
function()
{
    # Danish Fire Claims:
    x = as.timeSeries(data(danishClaims))
    fit = gpdFit(x, u = 10)
    ### tailPlot(fit)            # CHECK

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.tailSlider =
function()
{
    # Danish Fire Claims:
    # x = as.timeSeries(data(danishClaims))
    # tailSlider(x)
    NA

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.tailRisk =
function()
{
    # Danish Fire Claims:
    x = as.timeSeries(data(danishClaims))
    fit = gpdFit(x, u = 10)
    tailRisk(fit)

    # Return Value:
    return()
}


################################################################################

fExtremes/inst/unitTests/runit.ExtremesData.R

# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Library General Public
# License as published by the Free Software Foundation; either
# version 2 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU Library General
# Public License along with this library; if not, write to the
# Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston,
# MA 02111-1307 USA

# Copyrights (C)
# for this R-port:
#   1999 - 2007, Diethelm Wuertz, GPL
#   Diethelm Wuertz
#   info@rmetrics.org
#   www.rmetrics.org
# for the code accessed (or partly included) from other R-ports:
#   see R's copyright and license files
# for the code accessed (or partly included) from contributed R-ports
# and other sources:
#   see Rmetrics's copyright file


################################################################################
# FUNCTION:        EXPLORATIVE DATA ANALYSIS:
#  emdPlot          Creates an empirical distribution plot
#  qqparetoPlot     Creates an exploratory QQ plot for EV analysis
#  mePlot           Creates a sample mean excess function plot
#  mxfPlot          Creates another view of a sample mean excess plot
#  mrlPlot          Returns a mean residual life plot with confidence levels
#  recordsPlot      Plots records development
#  ssrecordsPlot    Plots records development of data subsamples
#  msratioPlot      Plots ratio of maxima and sums
#  sllnPlot         Verifies Kolmogorov's Strong Law of Large Numbers
#  lilPlot          Verifies Hartman-Wintner's Law of the Iterated Logarithm
#  xacfPlot         Plots autocorrelations of exceedances
################################################################################


test.emd =
function()
{
    # emdPlot - Creates an empirical distribution plot

    # Artificial Data Set:
    RNGkind(kind = "Marsaglia-Multicarry", normal.kind = "Inversion")
    set.seed(4711, kind = "Marsaglia-Multicarry")
    x = rgpd(1000)

    # Empirical distribution plot:
    par(ask = FALSE)
    par(mfrow = c(2, 2))
    emdPlot(x, plottype = "xy")
    emdPlot(x, plottype = "x")
    emdPlot(x, plottype = "y")
    # emdPlot(x, plottype = " ")    # CHECK !!!

    # Artificial Data Set:
    x = rt(1000, df = 3)

    # Empirical distribution plot:
    par(ask = FALSE)
    par(mfrow = c(2, 2))
    emdPlot(x, plottype = "xy")
    emdPlot(x, plottype = "x")
    emdPlot(x, plottype = "y")
    # emdPlot(x, plottype = " ")    # CHECK !!!

    # Artificial Data Set:
    x = rnorm(1000)

    # Empirical distribution plot:
    par(ask = FALSE)
    par(mfrow = c(2, 2))
    emdPlot(x, plottype = "xy")
    emdPlot(x, plottype = "x")
    emdPlot(x, plottype = "y")
    # emdPlot(x, plottype = " ")    # CHECK !!!

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.qqpareto =
function()
{
    # qqparetoPlot - Creates an exploratory QQ plot for EV analysis

    # Artificial Data Set:
    RNGkind(kind = "Marsaglia-Multicarry", normal.kind = "Inversion")
    set.seed(4711, kind = "Marsaglia-Multicarry")
    r0 = rgpd(n = 1000, xi = 0)
    r1 = rgpd(n = 1000, xi = 1)

    # Graph Frame:
    par(ask = FALSE)
    par(mfrow = c(2, 2))

    # Empirical Pareto Distribution Plot:
    qqparetoPlot(x = r0, xi = 0)
    qqparetoPlot(x = r1, xi = 1)

    # Empirical Normal Distribution Plot:
    qqnormPlot(x = r0)
    qqnormPlot(x = r1)

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.me =
function()
{
    # mePlot  - Creates a sample mean excess function plot
    # mxfPlot - Creates another view of a sample mean excess plot
    # mrlPlot - Returns a mean residual life plot with confidence levels

    # Artificial Data Set:
    RNGkind(kind = "Marsaglia-Multicarry", normal.kind = "Inversion")
    set.seed(4711, kind = "Marsaglia-Multicarry")
    r = rgpd(n = 1000)

    # Mean Excess Function Plot:
    par(ask = FALSE)
    par(mfrow = c(2, 2))
    mePlot(x = r)    # Check, the largest point is missing ...
    mxfPlot(x = r)
    mrlPlot(x = r)

    # No Labels:
    par(mfrow = c(2, 2))
    par(ask = FALSE)
    mePlot(x = r, labels = FALSE)
    mxfPlot(x = r, labels = FALSE)
    mrlPlot(x = r, labels = FALSE)

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.records =
function()
{
    # recordsPlot   - Plots records development
    # ssrecordsPlot - Plots records development of data subsamples

    # Artificial Data Set:
    RNGkind(kind = "Marsaglia-Multicarry", normal.kind = "Inversion")
    set.seed(4711, kind = "Marsaglia-Multicarry")
    r = rgpd(n = 1000)

    # Records Plot:
    par(mfrow = c(2, 2))
    par(ask = FALSE)
    recordsPlot(x = r)
    recordsPlot(x = r, ci = 0.99)
    ans = recordsPlot(x = r, labels = FALSE)
    print(ans)

    # Subrecords Plot:
    set.seed(1985)
    r = rgpd(n = 10000)
    par(mfrow = c(2, 2))
    par(ask = FALSE)
    recordsPlot(r)
    ssrecordsPlot(r, subsamples = 1)
    ssrecordsPlot(r, subsamples = 1, plottype = "log")
    ans = ssrecordsPlot(r, subsamples = 1, plottype = "lin")
    print(ans)

    # Subrecords Plot:
    set.seed(1985)
    r = rgpd(n = 10000)
    par(mfrow = c(2, 2))
    par(ask = FALSE)
    ssrecordsPlot(r, subsamples = 10)
    ssrecordsPlot(r, subsamples = 50)
    ssrecordsPlot(r, subsamples = 10, plottype = "log")
    ans = ssrecordsPlot(r, subsamples = 50, plottype = "log", labels = FALSE)
    print(ans)

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.msratio =
function()
{
    # msratioPlot - Plots ratio of maxima and sums

    # Artificial Data Set:
    RNGkind(kind = "Marsaglia-Multicarry", normal.kind = "Inversion")
    set.seed(4711, kind = "Marsaglia-Multicarry")
    r = rgpd(n = 1000)

    # Maximum/Sum Ratio Plot:
    par(ask = FALSE)
    par(mfrow = c(2, 2))
    msratioPlot(x = r, p = 1:4)
    ans = msratioPlot(x = r, p = 1:4, labels = FALSE)
    print(head(ans))

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.laws =
function()
{
    # sllnPlot - Verifies Kolmogorov's Strong Law of Large Numbers
    # lilPlot  - Verifies Hartman-Wintner's Law of the Iterated Logarithm

    # Artificial Data Set:
    RNGkind(kind = "Marsaglia-Multicarry", normal.kind = "Inversion")
    set.seed(4711, kind = "Marsaglia-Multicarry")
    r = rgpd(n = 1000)

    # Strong Law of Large Numbers:
    par(ask = FALSE)
    par(mfrow = c(2, 2))
    sllnPlot(x = r)
    ans = sllnPlot(x = r, labels = FALSE)
    print(ans)

    # Law of the Iterated Logarithm:
    lilPlot(x = r)
    ans = lilPlot(x = r, labels = FALSE)
    print(ans)

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.xacf =
function()
{
    # xacfPlot - Plots autocorrelations of exceedances

    # Create an Artificial Data Set:
    RNGkind(kind = "Marsaglia-Multicarry", normal.kind = "Inversion")
    set.seed(4711, kind = "Marsaglia-Multicarry")
    r = rgpd(n = 1000)

    # ACF of Exceedances Plot:
    par(ask = FALSE)
    par(mfrow = c(2, 2))
    ans = xacfPlot(x = r)
    print(ans)

    # ACF of Exceedances Plot:
    par(ask = FALSE)
    par(mfrow = c(2, 2))
    xacfPlot(x = r, labels = FALSE)

    # ACF of Exceedances Plot:
    par(ask = FALSE)
    par(mfrow = c(2, 2))
    xacfPlot(x = r, labels = FALSE, which = 1); title(main = "1")
    xacfPlot(x = r, labels = FALSE, which = 2); title(main = "2")
    xacfPlot(x = r, labels = FALSE, which = "3"); title(main = "3")
    xacfPlot(x = r, labels = FALSE, which = "4"); title(main = "4")

    # Return Value:
    return()
}


################################################################################

fExtremes/inst/unitTests/runTests.R

pkg <- "fExtremes"

if(require("RUnit", quietly = TRUE)) {

    library(package = pkg, character.only = TRUE)
    if(!(exists("path") && file.exists(path)))
        path <- system.file("unitTests", package = pkg)

    ## --- Testing ---

    ## Define tests
    testSuite <- defineTestSuite(name = paste(pkg, "unit testing"),
                                 dirs = path)

    if(interactive()) {
        cat("Now have RUnit Test Suite 'testSuite' for package '",
            pkg, "' :\n", sep = '')
        str(testSuite)
        cat('', "Consider doing",
            "\t tests <- runTestSuite(testSuite)", "\nand later",
            "\t printTextProtocol(tests)", '', sep = "\n")
    } else {
        ## run from shell / Rscript / R CMD Batch / ...
        ## Run
        tests <- runTestSuite(testSuite)

        if(file.access(path, 02) != 0) {
            ## cannot write to path -> use writable one
            tdir <- tempfile(paste(pkg, "unitTests", sep = "_"))
            dir.create(tdir)
            pathReport <- file.path(tdir, "report")
            cat("RUnit reports are written into ",
                tdir, "/report.(txt|html)", sep = "")
        } else {
            pathReport <- file.path(path, "report")
        }

        ## Print Results:
        printTextProtocol(tests, showDetails = FALSE)
        printTextProtocol(tests, showDetails = FALSE,
                          fileName = paste(pathReport, "Summary.txt", sep = ""))
        printTextProtocol(tests, showDetails = TRUE,
                          fileName = paste(pathReport, ".txt", sep = ""))

        ## Print HTML Version to a File:
        ## printHTMLProtocol has problems on Mac OS X
        if (Sys.info()["sysname"] != "Darwin")
            printHTMLProtocol(tests,
                              fileName = paste(pathReport, ".html", sep = ""))

        ## stop() if there are any failures i.e. FALSE to unit test.
        ## This will cause R CMD check to return error and stop
        tmp <- getErrors(tests)
        if(tmp$nFail > 0 | tmp$nErr > 0) {
            stop(paste("\n\nunit testing failed (#test failures: ", tmp$nFail,
                       ", R errors: ", tmp$nErr, ")\n\n", sep = ""))
        }
    }
} else {
    cat("R package 'RUnit' cannot be loaded -- no unit tests run\n",
        "for package", pkg, "\n")
}

################################################################################

fExtremes/inst/unitTests/runit.GevDistribution.R

# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Library General Public
# License as published by the Free Software Foundation; either
# version 2 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
# See the GNU Library General Public License for more details.
#
# You should have received a copy of the GNU Library General
# Public License along with this library; if not, write to the
# Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston,
# MA 02111-1307 USA

# Copyrights (C)
# for this R-port:
#   1999 - 2007, Diethelm Wuertz, GPL
#   Diethelm Wuertz
#   info@rmetrics.org
#   www.rmetrics.org
# for the code accessed (or partly included) from other R-ports:
#   see R's copyright and license files
# for the code accessed (or partly included) from contributed R-ports
# and other sources:
#   see Rmetrics's copyright file


################################################################################
# FUNCTION:      GEV DISTRIBUTION FAMILY: [CALLING EVD]
#  dgev           Density for the GEV distribution
#  pgev           Probability for the GEV distribution
#  qgev           Quantiles for the GEV distribution
#  rgev           Random variates for the GEV distribution
#  gevMoments     Computes true statistics for the GEV distribution
#  gevSlider      Displays distribution and rvs for the GEV distribution
################################################################################


test.gev =
function()
{
    # Check Distribution:
    RNGkind(kind = "Marsaglia-Multicarry", normal.kind = "Inversion")
    set.seed(4711, kind = "Marsaglia-Multicarry")
    .distCheck(fun = "gev", n = 2000, xi = 0.0, mu = 0, beta = 1)

    # Check Distribution:
    RNGkind(kind = "Marsaglia-Multicarry", normal.kind = "Inversion")
    set.seed(4711, kind = "Marsaglia-Multicarry")
    .distCheck(fun = "gev", n = 5000, xi = 0.3, mu = 0, beta = 2)

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.gevMoments =
function()
{
    # gevMoments(xi = 0, mu = 0, beta = 1)

    # Compute Moments:
    xi = seq(-4.5, 1.5, by = 0.25)
    mom = gevMoments(xi)
    print(mom)

    # Plot Mean:
    par(mfrow = c(2, 1), cex = 0.7)
    par(ask = FALSE)
    xi = seq(-5, 2, length = 351)
    mom = gevMoments(xi)
    plot(xi, mom$mean, main = "Mean GEV", pch = 19, col = "steelblue")
    abline(v = 1, col = "red", lty = 3)
    abline(h = 0, col = "red", lty = 3)

    # Plot Variance:
    plot(xi, log(mom$var), main = "log Variance GEV", pch = 19,
         col = "steelblue")
    abline(v = 1/2, col = "red", lty = 3)
    abline(h = 0.0, col = "red", lty = 3)

    # Check gevMoments for specific values:
    xi <- c(-1, 0, 0.3)
    mu <- c(-1, 0, 1)
    beta <- c(0.5, 1, 10)
    for (i in seq(length(xi))) {
        for (j in seq(length(xi))) {
            for (k in seq(length(xi))) {
                rg <- rgev(1000000, xi = xi[i], mu = mu[j], beta = beta[k])
                rgMoments <- gevMoments(xi = xi[i], mu = mu[j], beta = beta[k])
                checkEqualsNumeric(mean(rg), rgMoments$mean, tolerance = 0.1)
                checkEqualsNumeric(var(rg), rgMoments$var, tolerance = 0.1)
            }
        }
    }

    # Return Value:
    return()
}


# ------------------------------------------------------------------------------


test.gevSlider =
function()
{
    # Distribution Slider:
    # print("Activate Slider manually!")
    # gevSlider(method = "dist")
    NA

    # Random Variates Slider:
    # print("Activate Slider manually!")
    # gevSlider(method = "rvs")
    NA

    # Return Value:
    return()
}


################################################################################

fExtremes/tests/doRUnit.R

#### doRUnit.R --- Run RUnit tests
####------------------------------------------------------------------------
### Originally follows Gregor Gojanc's example in CRAN package 'gdata'
### and the corresponding section in the R Wiki:
### http://wiki.r-project.org/rwiki/doku.php?id=developers:runit
### MM: Vastly changed: This should also be "runnable" for *installed*
##  package which has no ./tests/
##  ----> put the bulk of the code e.g. in ../inst/unitTests/runTests.R :

if(require("RUnit", quietly = TRUE)) {

    ## --- Setup ---
    wd <- getwd()
    pkg <- sub("\\.Rcheck$", '', basename(dirname(wd)))

    library(package = pkg, character.only = TRUE)

    path <- system.file("unitTests", package = pkg)
    stopifnot(file.exists(path), file.info(path.expand(path))$isdir)
    source(file.path(path, "runTests.R"), echo = TRUE)
}

fExtremes/NAMESPACE

################################################
## Note this file has been automatically
## generated --- Do not edit it.
################################################

################################################
## import name space
################################################

import("methods")
import("timeDate")
import("timeSeries")
import("fBasics")
import("fGarch")
import("fTrading")

################################################
## S4 classes
################################################

exportClasses("fGEVFIT", "fGPDFIT", "fTHETA")
exportMethods("$", "$<-", "+", "-", "[", "[<-",
    "cummax", "cummin", "cumprod", "cumsum",
    "dim", "dim<-", "dimnames", "dimnames<-",
    "is.na", "names", "names<-", "show")

################################################
## S3 classes
################################################

S3method("plot", "fGEVFIT")
S3method("plot", "fGPDFIT")
S3method("summary", "fGEVFIT")
S3method("summary", "fGPDFIT")

################################################
## functions
################################################

export(
    ".depd", ".devd", ".garch11MetricsPlot",
    ".gev1Plot", ".gev2Plot", ".gev3Plot", ".gev4Plot",
    ".gevFit", ".gevLLH", ".gevmleFit", ".gevpwmFit", ".gevrlevelLLH",
    ".gpd1Plot", ".gpd2Plot", ".gpd3Plot", ".gpd4Plot",
    ".gpdLLH", ".gpdmleFit", ".gpdmleFitCheck",
    ".gpdpwmFit", ".gpdpwmFitCheck",
    ".gumLLH", ".gummleFit", ".gumpwmFit",
    ".meanExcessPlot",
    ".pepd", ".pevd", ".qepd", ".qevd", ".repd", ".revd",
    ".riskMetricsPlot",
    "CVaR", "VaR",
    "blockMaxima", "blockTheta", "clusterTheta", "deCluster",
    "dgev", "dgpd",
    "emdPlot", "exindexPlot", "exindexesPlot",
    "ferrosegersTheta", "findThreshold",
    "gevFit", "gevMoments", "gevSim", "gevSlider", "gevrlevelPlot",
    "ghMeanExcessFit", "ghtMeanExcessFit",
    "gpdFit", "gpdMoments", "gpdQPlot", "gpdQuantPlot",
    "gpdRiskMeasures", "gpdSfallPlot", "gpdShapePlot",
    "gpdSim", "gpdSlider", "gpdTailPlot",
    "gumbelFit", "gumbelSim",
    "hillPlot", "hypMeanExcessFit",
    "lilPlot",
    "mePlot", "mrlPlot", "msratioPlot", "mxfPlot",
    "nigMeanExcessFit", "normMeanExcessFit",
    "pgev", "pgpd", "pointProcess",
    "qgev", "qgpd", "qqparetoPlot",
    "recordsPlot", "rgev", "rgpd", "runTheta",
    "shaparmDEHaan", "shaparmHill", "shaparmPickands", "shaparmPlot",
    "sllnPlot", "ssrecordsPlot",
    "tailPlot", "tailRisk", "tailSlider",
    "thetaSim",
    "xacfPlot"
)

fExtremes/data/danishClaims.csv.gz
fExtremes/data/bmwRet.csv.gz
h´?îÜ¿;ŒL2‚¹Ö®zËybçRJJÈü_ÿÏÏÿúÿýü¯ÿïÿþÏþóü_ÿ÷ÿü¿ÿÏÿÏÿ«ÞÝJý)íÿ§üO{—QîÞ¼ìÚvk]øxñsNï%=?ÿ÷áµQæºÂσ—nà]Uøµrø«–·ÔºG9e¯ï_ÿÁÛy~~J]ÿå¯÷ó×½9íß„K« ¿Þú8·ªi­½ÅϱöZê¢Öܺ§ÞµÕEííºÑö­§ræ[}fê ¶Þ&ÞÖÆééëš»œ1‰÷ò~€9ìñT›^Ÿr~áÍþé-gÕqïÂßVYcÇÜùù÷ÏÙæU/þvr-u+Mxtrëg*<Ä™]¥a@<ïmkÍ9ôÞøÔkÛ{WþÐvÇ-Õ§Ž·>}ž³úþÖ¿í5÷áÀm$OÿÜ{˨S¸×§îº—ŠiïHµZÞO5[õjX®ªÓ|~ÙLjS¥û©ó,k@*æ­M+oMøÛ›­.›©—ۉѰVQÍί»ËÞ»má>fßï8…OŸî«—Þšp…Ý*4Û¾}Ú›k|¼V­7Õæúâ°ë±bˆû·­¶jô˱ÐãÛÚ[[_*½¾È&D;µ©xÿ´e^ëŸ\ ;Ó&\Kåøš§ÏÆNîø¶O5ë¶ =Ïo;l0«ÏüÓ–~ëØ¹˜ßÄùNøò™e¯ÍÕ·oûöN¹³ õq;>sË]åè¶´tö˜÷ÖWö18Ï¿´­BöY‡p¯Ý°a×¹~ ,ìoílÅ\oG†=mx _\‡æÚM¸wî6ø•_£žÕVº®òc‰oûöÕUNŒÙÚ½-=ßcÝÂçV{cŸn«å¼ÕY#VøÞÎmKmñ¾–6—jüLuýêŸ_Ãk}Ë-©5ÍW[®lUUküÛ×5{=C_ÃçõÀ×?éÝÝ7D+¿¬­œ˜×¬ÅŒÍ»Û;4¢f|cû¾­•žp_ëÙgqƒ›˜ÍO­mBœÑî#ÐÖ^[çð˜ÏµW🸮kŸ€2ñ‰ßÅÅ–4-ù3–j›ÌßC¸w¢íÚ¶¬%<#Û=WIålж.v_ûÝ ÊèÚ 'æóÓ=¶rÍžÊ÷ msÜFQW7´îÝl³Ù¦p·…¬±f=5ᾡÛ?œ‘úá™ÐúåŒÙÝÖEâ1¡×´½g©Ôž>ýо ›c“ûÓ ûÌÖ0k6ß½Ò4.6ËÒó>m°T]öÑÒoÛñ¨éyßpƵåŠ|½ßX¿bÓž¶ÊrX¯XØ«›e»¿pïÉn»F] w«Ò·UEµðylkë²ÔÖ;‘ù+öjÛ¸ÚHokŸºÆ2nëÉ>=+¦²­««½"frµR36÷©cÚ?ᢷljUZ;æ´u°ÅJï× ×¯˜É¶žÈjÙ4Ãí_lö¤RWÌÖ]\ù6Wëc¦á¢ý°_ƒ\¿ªh3bìË ÷=æž;WKík4Öâµ¾Â<°1È=iǶ•qÛ†œðè»ÕZ;jeþÌ;¶íÒG{驰ç îÛÔÏ´Õ/_l¬»wú~åiÓÎV‹¹W˜¸¯×ñ·ç¦Yawöè÷Ó×¾ŒŸ*ÿ…Gw‚£ ¸Ûä}Ø® Ç}·¶ÅñÌTûð¹¬QÖ wsÈ––VÕǾ胕êw«Ãý¶Em–ªÊ´à2oòÒÅøs·:§GgŽ Üñúšï˜C»›~…7Ÿ£VPgªV‘.ÏöÜ…¯è4›½7•ã–ðü„£mëxÐµÆ 1|›½¨zúÆn6¡¹¼Eï­¤¶Úî1§¿cvÙ’QG¬µÆ é6Ú4ÖªÆÈž»U3ó}êùXæh6µªpr•|œÝfÇM•êcàÝJIxìñV¤:%&¾ ÷]ÒGò‰ßnÕÔsÛrªÙj9¹Ôß,Hx삃¼f2…E_-÷ƒê‡SE¼Yk;MŽÚb TÛL±’¯Q³…{zÞ÷¿¸IX"k¥r¢ú¿q7I×¶-%là¾/™ÁgëŠÚû€Ù¶­&ü­ÒÌî›Äc`nŽÔ?ÎÎ4›´óLus2óðk~~­Ï/ÿàwÏÛ†z8ùköËçô²õÝ|ûgXŒ‡ÉŽA8œO- Ä÷]æFæÓÍ;°¥"ýµ/PíéŸ-Ü{mZºé;œOÿ5xéô§3­G¯ ®í6îÖó¾ÚÛÚ:1™…;ÅvŽuÄ;ñ‰õ§['V*'ØSs€}âD±?|ÇH҉eÙ|×Ë÷½;ž-9*>|³cîBêLÎhƒ÷N¸›yfÅ‚"îË=<7i^Üg´m`儇ofÞÙíê|7ùšõþ¾Uå¸Éw@Éïø¶¢Ø™vm-áá¼ÙV5֮Ȩ{²:«ý:6ÂÏΙuÌöOå„…5Á}®i ’W¯ÎIÕËè»Ù>±ø¥9½m`›õzªðð8—m K• ßmÚ–P–:%öõ¿dè½aîÿÁƒÉ>Ã솄»Y'|¥÷Ò̳*žð{Kk³èc„ßßàSŽwâ­Ó-‡‘ˆu`£÷“ðf•ùa} ŸêôÎ5ÛÈá‡#ÑéfyY÷vÁNWöi6ШăPÃ!\™ª„SÛ¡àüoá=W"¾¤m3ÆŽpn4VnK¥ÆBm&æè ¿Á0sK/ñø’Çê7›é õÃs·«.ò/ùt¨FV笽°‹zŴںܒʇž×NÏGýç#=îÕ7Íì®è†ñYÏòföÚàÛÌþ/<¼!™g§çc5or: ÎÕkfËâd%n=‚§]hw«œë´v›ZS[nMŸ_oGšÛf]Ó;‚äXpJ·ð·®6Ö­½ƒGef¾‡øàíóî8òz<ú®ž ½›uwøùE¥ãT 
÷÷î÷WP÷6l×Q]Ãø²=eÖuŸ¯+½tó’¸ïLîÖf‰…-öâ¤ñm”qöOvš]dáHz±/ådìHüùX¯¸ˆ‡of‹lLá¾ß™`ݧrb&×ÙmçQu|&ײÇÒÆ@nèºMÕŒnën½GÍ “Ë&Z;é½±&ÿÁ£;§z´SdÌÈ´ù2×qnÕŽlÌøå#‘åÕ–\zz#00ÿ©¥A˜³¸p.‘¨Ã¹ò÷ÜÀœI~4QåfU_Mù¥mØ,¸~§p_wسôå ÊïRc;‡åµÓã+ø¢¹ûI»/Ýöô½áÙf7l°j¶|ð#qæf꬈¸‰3™„È#â>½§9°cþræÏ„³ÿ® çÌ'hÚ_^ÎÜ¿}21ææ7ÎQ®`¯Ì2‡õÞTÈ g‹“ùåË_¯å´KjâåËÿ®âË¿C x8‚½›¹–žŸÎ¬Àª(ꂪÙó²äïŽÕFáyYò÷mÍŒ•­·ñÄ–ð±ÕiÙñwÎܺc»xÉñçe6 WéjrìâÇv}íyñ ê­/VúŠi)9ñ·¡s4U´Ëînãji㬕Óö4ÁGx„ÕäeýåÁßæws§cc®bÏl8ÀÎøŸ‘%ga­ö5ƒV«Š3µie»Bî3,8×—JúlXG› žð0)/¢úÒ{½>cØJžß•uN‹%ˆpý Ÿ߉ßO¬™Ù …s fÃÕ ‡„ûê7mÌÒgMDø2[ˆ>ÔK„ÿüÙ€D„oœªìجk{g³~Å6Ráp4á± &s9Ñß¿á? G;ÂÛÒ]χߌؾvU·DŠU›Z/pç«âÄž5´©Ê\±m”õT¹˜ÅV³y†*WIÞ®¶oz>Ža²]›(osÂǸ ï±VíE66SÞãCyå=åý§Æ1Œ{jÑ9}ç‡òž)„¼áxf¾kÓL”wŸýØŽ¿…¯ð6ÍKhéy·†¹ŽçµŽg¢¼ÁÒÚîQ…û–·w½ÎÜÍÄyã¸Þ '=Ïm÷7þ¼QÍæw?žÎyÿ3QÞÏ’àNýÌ”÷±¡·_ãûÁÝüK¹üXj&λ „B uOÌÞm¾þLÍbX)ö§™ž3ü7îõ±·Ô*Úaæ8”w œÎyãq«#b Õ;1"=3ó~ËA¼ÿˆîÑé–T=la:éý–£8¨é¤·ûn†«pç*±ÖÙm$½Ûmëªþ'Û>° Og½þ‰»q¶ôyîõ¿¶ v“ãÝ«-îéù8-ĹPMõñ¹¾­*G?ãʱëšÛˆ·°wÏZ~à=s`¹m3¿7>»uCõx€éü÷Ó.óóZŸ w|Úb58ëÄ›ßyNªgŒ“{ˆåt^â0œõéú¾8ØæwéŠH³•†¯í$Sl4$=Õ)ðäõ|˜a6ªu$ÜwnNÍZn Òéœ -Ÿ0mêfŠn*ÇOÔ­š3?œ2Xô¡Þ‰¯»Ïµõªg0ÎÛR5ƒÿƒÇ± ²RùašÁtý”¡îO8F*Ç»Óúþ8Ç5[Î_yuaœ•ÇT™CxCÓ¬§}„÷é»kAàGWÀ$ÔmËX`·Áý¼aoI/%ÅbþÊ9Kxth»[ËÌàÁ&â-{~žË…ã$Ü7©Ž0Ï£F‘bQ$ÅL‘å áK;µ–‡£ŸÞ·ðØ4­¢4fþP½'âkgbÉ6\Ë YòŠÀ-Û#¦¯ME[?§ú-ør„Û÷©çc™·¥çx´Öüðå3ñåà§=Zô…³›¸û l*J°³~f&î¾Ó«ÎçUVÁ^â¤I=Æ™!ŽÜìw€Âé"œJuŽsK+½x¬ÉL9˜\sŠRù7׮Ů8Àµláá]shÚ›:.aÍ:…‡CkîËIÝNkm:3þ˜°™=ªw:Gþöd»e§—}ªþ˜­œ÷r [ô±ÏI9…¿#pàÒÜÙéù¸XÕ¶­„Cx®?ÞwzÞ;Òf­­WMxPù6.ÛÞă._æeOÎÜÅ Ü*i\Â}ï»”›÷µÍ¡îƒ3m8T5IœÏX>Ð)µ hº‰!U‚×Á:œƒ„‡±f‹S®Eçúw7u:‰œu•ô>"ÊBDÚ%×FlóÁ‘¥ã"ÎûÄéîÎÛŸf”Õ„G€ÍÙS9^ý;¬¦\6×ô÷ša½;­Y]c`‹lÁ^Äc—·jš›„û–„ØÐ9ÒóÑæjõ{…ÓZ¶Táq„c¬pÝŒQÂù—Œ ÛóàþõÞ0ÊVïª÷2(á7îŽïšº9FÃsŒ_»p?¿»§Ùj[6î„ÞT÷„ꂱ#<õ‹8UODìUçŽMVý ¸\ÎKúCÁTäíô¼ûÅã‰@Ix„''Ü|į³ì´9N}3àìûê½:/Ù»q¸Í}3NëJåxýÍI·Ù8…Gý;h¶pïg|˜ÑS9¾ üÆãøÌ ‘»¶ÊÑN'õgœ“¢]·ê½ôاó꯿ÚϨ©vœû¶LÎô².]'GÁ‰ï¢Û{É‚8¿¼1ÁŽ#x”Âÿ¹°2~ÝÖœmc`Ëjïä÷^ºóÇF7½7BØŽ9:UÏÇ¡øÞìÔó¼8hw¥!Àøõ_þ¥;¿à_uõCeL޹Ÿ¹|ïf½Ýšž÷Á±qI ªHâôç"ñXÀ7ûñËtÞ÷µ+óŠÃ™‰€GÔ½Ùeé}_øîGWA-§îÓT*/•ÕwOg"à»’•ÿcŠY¯xë]©˜EçoÔX°^ þ]çA’Ç:, Þª;¥[¸,Qïdÿ¸*•'âÓ w_Óm*55!&÷ìˆÕ»“!p[8à PÿƒÇÈõ{ÔæØ:€Ây©÷×5\8Ì»Âcþ4„y«5± 
ØÚ}FøŽçûÈõk*Û^|fÂÃS²e½,Õ?l‚ˆÑ“ð{UÆ©zÌQÚá2Š›‡i¦NÂãLÍŒ.lææó"bÔû{…8=ü±e§&nÞö_³<õ^F6MgäßZ#D¼¤¿šq 1ë+Ù:œ{ß½[,Ž_82k6ÿ+9=óƒö¸Guj\“¾o ²¦ÜW©qSÔ–lÜ 7]{Y7=CÂÆOO]q¬:휷 p|Ûj¥ÄC½0ZÃÂOÌü=¸±Ó„GâŸv*Ÿ26VFzþÄ¢‚½¨ ÷¾âÚØ­ÄÃò¿-ŽG‘éÙÔULx€ÓL”»ôZÞHY¸ª¢f…Ñg“Ñæ×L¢­|{ªYÁÜ™[µjø§béÛå­ÞæýQ³ô[×kãÖp;paÓóZà3I??$ýûëæ_²ÐÑ×$y³>ÔüÊÔ|ïæl¼¬Ärµï’5ý¼wejÞÞoOÏÇ:5—«^u_ÎV,§æßOçß¹\·åýtÓ ¥©÷†á–BV G¯ÏÙ÷»&¬ŽÒqÂWn=çìJ<|Ù6öÊK­ÄÃW\Uö³¶•å]l”ù½¤å4ü;×@wuELY:n¦,§áß.Ú8TQ¹ã"HM]M}x:Mïí< ÂAB¼7®-gßÿñ6Mám憟¯Ä¾WÐû7Á±þÀ¿,]øÍ/KnŸ§bÞÚNq¯D¹›«S=Hråˆsm5,öéïØÒ}’ÝqÃe ÷Ugâ ñ{P¿RÀyAäûÍJçjßú¿ÛÞÆy~æ>áÊýp´SxX“¸‚0ˆÓ-•¾»pßš—¹m£'Ü?tµùÜ–Åëü¦Þªg¬Ð8ulç a‡aÿ0îñˆl©½ÏÖŸ¤÷Æý»âS#T}¤~ˆOýÁ}ëj/5Zmmëã†ëµ­2‡³Qš-ß9Þu¦fÆIáåŒü?–Çñ™ûƒ_{nxbG¸û‘f±o'7–ñïÂ7 º”žݙ߸;8ÓjágËCÌl™5—O3}%¶œ¦§b+2{659¯äã³’'a–å »/¤»ºß¸œaš`K®ß$ØÁU…*ÀJ -FÂåÔ¢BKéw67ÛW"ØíO@Ÿð¸qÍÊÙjõ¹pŒseÆŒ¶nìʱÔjWµ`€ão|ä·‘Mûôð≠.³¨'ꧦñõ™¶¥©¦šÜP²P™­å¿×]±iY, ôT܈J=?í3ùà(÷¤§â¨l˜'Øwú~=ÈÐåû;5·rpºm¶_¥çãFùÁÎQ¯,Õ‚hš2¾cÿ…{õ¡[ØFÂãcÇ-‹E<ÈSèLÁMzÙh¯œPûó‘’®À•‰u[ÙØ üæ?Žx&sRq&<´Û‚Ï%¼ý9[+x-Þ¶<Òüí¸¾®Gû¯¤ÙòÈÏÔ<"}%F„GЯ­˜­ªñá/. qŒÎ]6„šðÐø«VŸð°Åm¬žÚ…Ç€€¸Õƒ°ú3Öà[g±ùårtŽ.žÜrîU=i–/ĦnáìgIJªþyk>ô¢÷A¨…jâÐÕѾsÿboOχ9Ž»·éùÖóÛx~6A³ª®<·ÝcÒ¡=÷0/´Þ„ýó ï%¿-VqÛ`nçS¤ÎqnªŽ+SçàÂס›¾²€ x™ÝÒ_û8…óvjz[Ð) ײ·ð§øœGåÄÎ~¦¾ ~zXxô©M¿Z|J{nî ¦ox›Ì½Ä—o„× Iÿ’_WxÄO3¨:1éqÕ¹¦fÅ´¶§î\>TáûßÝÌë Çf}SsIãÎ\'~÷y6½ÄœCPh±7ŒÞ¡ƒ°cg~å[ü+‚‹«Â}4\3‡öM8GCê6é½<NSÅûi4jm3éÂ]Ãöú9¦ŠáäùØf’}Åy±ÃæÆlÎê-gX9êZm¶øÕ¢Þ‰ÁI9UNéf²Mz\‰Lÿƒóä®Í³Õ 1ßq¨OõfC>œdx¶9¤ýbSýÃsƒ†çŠUJëÐ&„dƒð8‘¬êìÂÃ~›«•ðxk–}{³ûIÌ Ø2$U†xK*>–€_xCþÁÅJåÅ&Y¯£ëm¡Úb^^á²WùŠÛ}×K¹týÒáé:¹Ô¸©kž9;ɶX]mmz·ù>î®z[Ìw0\$RtûïÚµÿRkjú˜]9Ó§ G až“õ—šÃÆÙ0ŦãF’ÍÔ&¼sjÜM%·†¯u%ý‰Üäšøô•ÛË{kLµ‹S3ÈZ–jW8A0©raëÙ†bÞ½ÉöýuiÍÅ[75]ôk~~yïj¼—•6õ^ø‹ó×ü3¿(°³:®¹ÀÃμ:?ÏÓ„£±µ¯ŽPìægy;ǰ,·-á±é@Zá5Ü·3ìîx™ùnŽ;sí¶ÙÙ¦£çÃП`¾zÂÃ}¯& ç™=®ëTáÿ¥cÏÿÝ®XæmÜø=t[ üc&NŸ´Ñç=~Yd“kƒmFwá]/—™ýUø‘8Áë¹á¬Â)ˆgÖ]Â#æLAI;ÑîAÆ‘¿s¤ûšÆXÂ#Êh㫪5´ïêŠËw›Šèú–òä1úNú.íaË®ZÉÐ(Yè; ¡Wh¥Ý:…ˬë.Á°3ÝþBŒoÿõ‰“XÓÎ2-sÙää7­nßw±„ópr&G8Ólèº ^YÍÍ¡íÄú»wT¨q'œC­{çvbÝç­9YGÅךC݆“GÞì,Øò§Ìê®§§(·Üu]7IwI¯˜W]û´™7G“E$ú´9”Û¬}Ú6®‘ʉx³jënƒÇè{{üÂÎ$zk5bw&Ñÿàqj|l©ç'SXûcÐÝ!<¦+Žl80E§£¢t:bq3Nx(v³'Êžzw|¾¬DÎëmAqlRçÏ¥ ¼-<ÌF[—æNχsQŸ# 
áqqÝV„2Ô6†³šõs»Ê'³¾¿±ë¶êÏ.œQà3ÄnwŽE·}~¸¥µ?Š-;êf´˜åÕÔ±C£ }¤RÝN;PAä4•ÊÀ ©,grSÆ M嬘RûøG¬è¹A„ï½!©¥Ké¶7WÚÙN¨¿{ÀŒoƒ{ƒu±EÜÍœí|:;kò†aÅí®édÑQëUnª„³.]|ñþD¯ïÄCþ²¸÷±s4: ô±õ6®Åð€Sñ<쎡e&/ ©6¶pš¸Ö¡*¾—\9Î_¤"Øñ½—H«uë ¹‚k«ÌYÂE.½ØYÆ¥aHq»] N­ð”»pF™·ÁQ½t¾?Ôù“®a '5Ý¡(<,0D½Á¡Üc&t+jd†ú„° ß¹í¿T‚aI!ªDLä?xì†ø²ê‰¸ó—>Mî_pØ\g¿³³ð¹57މÑÒ7B"âùM¢bT‹ãaÓ…¶2ÎÌÏûZˆ@™Î­|‹W—êÞμz7»¤íTNˆ¾,ÛÙ¦ž§™…› \ýÄ«#ª·×.<®ânLOÏ'/A¼zC\CÂ9ú~ánoã•JËیɫJÐ*£ÊäNûs+sµ‘{ôxÒîø lÕ&†Àã-•ôüÉmŒ•|jÆ_Qi¹>–p^z³™ðPÒ¶uù›‰ukã™üçã1ؼàfCçÞG ôúˆrqK}2üfÚÎtºº6 ÔšØÓqÒKÂ9.yc};|änù½æ½kçš{’ßôãÍiõŽãÓÔö°ÚpÞz¶ZÛzîÎìÈÂô×alÏÙj.5´g‡­¡©í1·‘’é¦T:n·rxÞC¨ ébø]ˆÿ‰K·øus·oÙ §!¼ÚànxµÔ›7h^I*}€ø:é½¼®²«Ì+1íwhšJ*ýw9”ÈÅYùIϧ)5ås½-û_—1-f±U×úÙ"SÀβè黊ú‚Lêµõ‰Ÿì¦°d[ê{*'¢&שkª¯™ç’Ú#•Ï»)¶ s9¾œôÝlÅ5T†%Cœ+ú.I¼Ü'<æ k¿qMn¢ã:ç™é¡øàcàl†8åqáCèqÆ@`+]x :ÂŒ[…³!¾˜ÊœòwÞTwÌ»hÞ'6}È<‚ƒåµYÕR1íßµoÿ¥ö:(ùUÌüw1ºzöI°mØhÂã¼o™¥ªÂ#¤É–Õ{ÂÐI 0P8á°‘Ž£ÚÍý.¢Òõ+bæëC `—i®ÉM±{yǶÅô¶ˆ‡5g{Y›ª\el‰ù_-á 27¤×Òš;µîüüʉрôv\ÕD¤cv7’q)È|Ù>ÂÁCù]‡whtó{ÎU¨0Ì ×Kyáü†z„ilBK~sSF-¨)E¢#†IjP‡ó’JÈu?Y®pw‡”ÖƒçwŽD cäçã»Cš8?œ²šuÕ§FÌØÁYÜ.z(M©”l°CkGõ¯iQRŠÁŠ„5=Kû/œKûgiLRIU» ÖžôZÁ„c®_ëó‹ÓÇš©¬Ð£]Ëöö#üþc5hrÝN 7+æ©&ÓäÞžZ_/ñ8Mþrt¹eŽpŸ5·ÉßAtrpzÅý¶’ð ðq^|UŽw~´ã$zàažŸ”TÔfÊñœŒ'“裃´¾Â#¶`ž¸ŸpœD’Þ†é~r_Õ´ôÌô×ìîÚ'µ’Ç¿z‹óøW+µª[©eý¾síx_L;Ýɪ/·p=ŒãŒ¹;WË/¦žD˜ÛºûŒMááC@ë¦b–*ÑÉj6yIOÅiî—Ìô¶8´ý…‡ T[= æäÀó‡z=³ó!ÌÍñö8aþ®2›Kþ©5 s³ Wê¢öwmÀÊ=ÔF+žŸ”]žŸg¾ï‹¾¡+x23þ Îk²èðï$œ‚ß6 ^ž$[^»ÙÓ®sœÏOí?›ÆIbåiÓ8?Jùi.SCŽ)áÁä™ExkÂÃHجJÔOUCt WÝÝ–;‰ÇE:\µ Ž~§*Òp¸jW­™læü¤ HÌ3{úëùyÊwþ…%‰ÃUø.Ìõr2nõwùÀóa9aѯˆê i;ÂÇç© t6N«ðH瘷æ$¾òÙT4þk[˜:Õ8™ÚNîС,¹~­Ï/Þv5ÿñ¤’Ù#·®æ1[Øg!˜ºmýÿ— 2?³n~>ÞL‡Ìyí˜òhá~TUImó¬XðG½…k cÁm·±qþ×(>Yhåž:rÉÕ½ÐWZÂãVRµœ+<¹R'Ë‘·c¶ó®Âã" jz>üóÙ ‰#<«Ro¤rÂS¯Èª.Ø/Y‚G˜©xúºÓʸǽm‡syKÔ6¤uºÊçq3î%—&<²êAÅ}OáÁl>÷ Òó‘ a?rÎÂ4¾B§ì$v»‚™we–ói=1ÈpZÎUk‹Ë´Ír>ìtɧÖUÓóŠÊ,~°}’2‹µ·ïVôÞš6 qÚz©é¯#BìM)Ò„ó€`ö3T»¼"SUÌ}÷å“äÈq 0$üƒÅŸ+{øñÅÉaãHÔ5FÂkØu¸ ¨ÚÅÚ|‘à„c…²*ÍŒÑÅ5 Å/dÎØÂ#àþ7~r#™î <¸™º¼ ‹y­Âée èü§äŽ-Ð%=TÇ<¡Þ}’– ŽéïÞ©®èVнÂc»ƒ¸;QôöÀ¥®*‰ÞæµW•GÐæžÌ}R9aQÍqo‚à²o]ùÄn[÷„ZæÉôvC?Õ^œÖŽ˜„có¿©st‡³Ÿ¹»ð8¶}45*/ü'ËìÌšºŠi¡– ¸ÔF Ú’µªº—€öy‡ÅnÛæÑ;§°„T¾]"Ñò…¬A¬´âÈ'7÷TÃ™Ž©bL'y#S8¯k·È”|²ZJ{²«š×P·V9É]VúÎ? 
NW> –n·ÎqïÛ­‰Ù<4Fªô>°4U`Hß’p®”¸Žz“ .wv­òír˜¤œ~ ²< çgÿâÚ¦ópnyõŽœú5>¿Â5<«“J>Ÿ§î¿¾UŠîÎfBpÕúFÏ‚ž³z-í»™¡>ÎP¿‹”y¤ÄPÿš`I-å~tPîG¢Ü\Ï­tSwšI]¸SÓfBÎÛÒóóÏ‚z“D9Ž_¿™šî6BßóŸëÌt8©‘ðå’™Ö¯ O‘)ê¨rÔ%Ÿu¹Ës“09¢!¯g÷º9¨zеª1º¨aîmjÖ¢GÙI8oÜõ;JS%"ʳõ†îG~j>Cï­ŸJ3ôï÷Sá£Üq¹,ŠìKz<2ûXÖ´PDŠ#¾ßão"Åë#ü±î2”et·ðû/Dvôe"Ó2æÜi©æüFº†TψŸÄBÄg3tZu.{)^Ï…˜ÿîf9D;Wh…|CôöÐÎ&)Þqy¿sÇØIÿi*'nÖu(ׄ¿ý¹ÌÚ°ò€³ÒóA?vÔGíÊk¸DÇ¡†R«jí“ÛÆ§M®ÚÊÖ‰´ôwêm\Íã‘%jaµQù±œãg¤ZÓ»™òÊô×á^Ãìkú61¹o[Ü$5þÆZóS’ÿ‹G—"ãUMxÚ Ž¼n¦Âm.Ãa!*iÈë>Âc}º7ÙG¢Â‘}®r'¡¬82 ¬6NÕ½Yµf(Ò™©úRBiz­yÓÓþס3)œAàSèV)m'd3JzÜ—øó±ë¥AŸØ6vºú†ƒà_÷3&%~fܺY\ü7NöäW9ºœW®,ô«K[¥·É↭† ‰ûïÂÓX½1Ï­#Wvü¶¥jë¯9Ï×BÒ&á`‡‡+=Ï0&ûÒ4¯{d­á&p?‹½„Qp±Åµ7oN(OQ_0”Bâ‚[É…†j)ôô<õÍj(léCM#MŠDÈ©ÉÙ“fÊ]æ™5ˆ;=nÁFG&ùð aÝ*8ÎŽ7ެŽpMâÌŠo3Ÿb‹–RÊs>²KzY!HÓ-á1'2«´ÃóL™;‘¥Ð/&]gÅÿó¯ZóâÆ1«¡§ç#E´,S}bBlŽ]p\.@vˆ’§``Ýk¨ø˜Ð¶›V:éˆÃè:%½–|èžëÞô|(· pà¨>±¹C,Ü£Ío–>é¸K’êßKþx$Yžœ">þ-ŽXãÛÆK‹¿câ[x0jû–¦J•NBÚ»åaãE…—eöó ³®VEÝdøÍ’áËüÛõ7ŒdèÛ¬G5­ÎÏ_GÒL(l5”aEX¶SCƒFÁv<ù9«ˆÒKêûÙÊàTêÝýÙ!2ÛeMÑœˆTlª…»ÜzX5Án =W¸WªÄ§›xž 5>ʨÍjŸªŸ ¥þí ÙØÂcÜf€T~Dï?鲄ǾlÞ!ï"î»=ʱì§(î?x( |F¶´v‡:)š<Ÿ˜†’»r®„‡™k;X 5qäg! 
Jõgä¿öçWœZõµ5Dzw›ãuèýÁ‹ú¯úù·‹'$øUV#ÓLáŒ:×°•çåèåÛ¤_eHǽ¯@ÜÍ£ç}5¿z¹qäx¬(­ã®¸ð0¹:îë5z|•ô¼Sߥ†QÜtlwè[ðvô/¼§¾›ŸmY¬÷wT@÷w㣠¸µý¾\½ãqísq&î³(1«&2ЛQ‘Ê÷k¤t.'Ρå«J<جµz}ßiѽ¶Nvá‘o \ì$Ÿ“Û0ym{ùR9Œ¯ÃAqz//ãÚkë~Õ»û³$+…f·Ípјۺ ’±%<Ø9“Tú®àzjCþÎ)qæ6÷àªEœ«XŽJm 4‡ªpÂÃpÁÇ\ÂÃî¿ðµ®ð_.à0H -ê0FôbëPéÌyÇоSÎ{Ü Œïq>K¶"¶ÛF"ª%œ™îûyD§v:¤;¶ðý)õä_þ5í#tçGgŽn3Š9â•ÓóÓJzž4°UjªNy¥VPöÅÒ0¯ðû¯q{hKA£äPr¼†!pû+.ìxCÅÜ5:8’™¶ ÈÌ•ª· –•J¡&–­ë%•Iª>³4‰• |š¦×e|¶¬µi^‘Ym ’"<¸k[|+gÖÕê içÞ„{«ªùr½¦rÜŒÿ¡…xµïáA_œòÇDTbL«_}q_ýȘKQg*'΀æõ Çã’-ùÆš£¼W‡Ë­Þ‘$¬íþCÍâ®Y”{ÙŠá'5ÑÚyWª™×6gt¿¡}À;çh…pNÆ#”w>‰™…3÷wD&:®9m~ƾþ5çÁýÄ—ýÉÄ<…Gº’O»”øòî á}bÔ®ó_û_ß@úÜþ‹Ñݣͪ¶1NöÕÔ_K/æîóaohð¨ïâ–4ôOòã᥊`®Î_¿lB닺ˆæUÍ*$iw®‰¶ÎF[%mý,² ï¾ÓØdÒ‡eŒ6‚)‘AøV-˜Ù©0«ïk5‘ÚOËÎÄùùGñ¬RŽ÷è¢ áÑ¥RQ‚½Gïs“ï ÃÆ±]ÓóŸJ·òù#ïÉ¢¢ªÒ­š¸«Þ…3a¬¹ªiãT]¯nŒIlóUµO%Þþjˆ ¹çg ·ùß»G%æg#žZˆqŸ¹᱑!¼³-á/¿§fJÛB{…~?ÿ )³åçëO ÷" Pzo¥õ‚äìUx8& ’rIÕÒó<~¨[R¶Ê¹™pŽÇÇÓ"s¤Xp]U…¯Ý¶ços<<Þ‰Ôä[8»tÕNP wA ÃØìe+ }¥“nÒEмã”RäÔ'q@µÇ8²›$i 4¸Ó{#d·Õ–š7é…ÜÞWk¾¸(în»Á¥ýýïÿåùHezqNr„ÇD²j^:1W/8î#wáÀ煮Pn졈S8å&‘c gZ=ˆë¨½ÌefWS=c!°=ÑåƒÀÓ?½7Ä×ëìMÅó¾ÍAvÓäô“`ïM[3ÖèêM_@G1T4ANCóý®pq1‡}ŽªVùúok#"ÔË6ç ¶º:ëíT(Õ+Ü×1›c´„»=«p{Çd*O8DzÞ'Ý€VV0/ è6'ÐŒîØï‚ç/Þ‡^H^ÂC0É>6wEÑÞH®×hµD·~Åý0^—p<"¨â^Ä sÂÛìåÀÊd÷oÜý$YN=‘¸‘m#þî•Ib[êŽ{á6ýê‰X÷Ñ=Î̥ݶõØž²BÉm ÄÕ=V¡Ä¶!\2bXŠênA0åÐm„b‡y …’ E–˜ûI Jj»ªøˆ;È+„Š˜nè\ûµaÇã„·w¯Vè sÄv57™s)œ[W<^\Wh‡_Ãq<®„˜}ùÞ”p…G6ô¹Î›¹Éñ¸Ë²¡¡žÊœ‘ms.PåÉWý%i‚Ý·–!<ö½dÂܬÁ¾W# ìVåùIâ» çâ›mUxú¤¢ÃmJG]K‘ÝÚZ¯:‹ãAŒÿ•`¹ÿÞóÛxq™…®ÚÀ£†°Æ.œ¶æEOu4Ó›)(¸eõ¬%ut^¾Å‡ÛÔ¸oÇ#˜ÕÖ÷žÛOJRy°Ÿr¡U’Ê ®žÃFJÝ ¿”˜q[9Æ›ìÐñ›kGß»9 þ–*Ü{Wô¸ê•ˆµ;ªí33uµj¥•PDåX“x Ò1Œžð`%û±‰X…‡œò:7wQìÙ6£m«Õ{ãÃ"—Ô{]×qg±ÌÊžï}IǃnžwÎÞò0æÙ¿ /μ¥OR (gê´²%’¼!wÌ‚½@wOúc™=ïMt‡}UĆ;ÙvéxWÄý-œyZFôµL‘[‡›6…GÐV/œ‰üV¸Ã¼6Àò çux WxÜ Ã1EQ/ðH¤eŠ)™ÏR¥Û¿+цYÚQ[¸2Û~õÞÛu|ÿ»rÔj7K¿§çó„W>ì` ŽxW#s0ÚGU&…HógHöTN\ØÞ3­¤I΄ò'ŽÇ²þù‰?o9ü .-Õ"Œqs„š"4çÏݹÙE«Š4¾ÿàûó¶0”LqÞ¼AôUjskù¯ez›_3Õ“-ºæ9lµ7°p/¨¤Rãñ}¤&…Gêæ_8(á|SµF3ާZoEð7'ÎÃÑÂÑ·ðˆh]æ/±ÿÄœ#ïíâw w^í»ÚˆÇ¹ÇÁ`]*GÊŠ#š–Xô?m«ŸZ“E¿fkÜ-<î_xžµÊHùû<Øú÷Ï_‡9\ÍÈ»ššI¾ü3yvZ¡£¥WõTÑ£8Íz–™øòÃr„“.Zã2]>’Y-ºÜ¶ÎSJ”Ô¾¼©R$QrÌÖ^Âk®2µûm-ë©jµž¢®2³¥²ÞND`y¹\S#J+类Œ7ãµ§ç0»+—;dòzÒ…;jt•3öõG†q޳L’q*üBÏnª?±Å½¹zp’ä Ž‡V 
‘ä:Á\E’úÑjw5»Wïç¦òƒd^~èÂO¸&sÊ5a8FÐ*>†D}5—ð`uZGªZáî±`à^¦£„$i×PT¸m0¥çrÂ<æ%œÞõXw¤úP·1ÒóqZÖA°©]áqY}ÅabÓvTŽG”Í|6êfžØžçluK,y}TšºÚ›fV8±¥¬ÆžÒQNð»ñ‘R:JÿCt-¼)%ò¬è#1âˆøDÖ~”²š‰¼üzLË’%Ûê°b±IÜøÝ6SÃN!áÈœ»TiE„#¦$áçŸ2)zÿÆÿùéKn{®­4U¸¯Sð §žœ©0ÿÂwîtª†þ~êxýŽûQ&J#ÒGxðÌsŸL€hòï8NáXxh KáĞǹÔ~þK9ѹ{þ!ÁH ÙfUÒ)³ÒqK,ù8õ:KÞ²¤÷Å¡õJÏû%BdQ¿1K"9¥~ù°ë&U"í§m“GÍÍGùCo‹¹];V•ô|¼U(pê1+eÃå©TNð•¶1”«¾j7ךÉìࣰԖNA;²_Xz$'K¡á±¢áÝ÷pq™=¸•¤gRÏ*=<Ô•òs­%¹è…G‚ð«ú¤¼ìa[6IJŒ›#ŽûHj µžåÀq%Àå zÖ:1§©x|PÿQ¦Jkn-΀ôeª,`4Îk)õL£#šÝãzÖ:øæ‚t‡÷šAÔ÷€JÏ*æèÇÝž¤Qjyжð%Á[u@$Ú‚Ìûâ‚—•Tlyjr|a\j(Wå´O‡ñžæs¹C3wŒ§Xáëó×aë”^gêÖ`a°´åŽQÖŽ»üšJÏ*àh­ÂSG2ØüïSá1nH=_áaWb˜ § CŸS¼ù+š|ˆ“|Áàß*'âRí­±Çۇ{*'T£pN}U¦ŽïË¥ ß¹õq$¶@9W½­…ƒ³×xÕv°þèÖ Z›…#k€\zÇýÑšj*“ЂéE­gji d\™%”x,â (× L_7$&Ç4Ãç >Õ+bÕÍ(ZSȪ›ÏgÞCMo‹e»Àí^#ºó>˜ðæ<鬕sT¤úÄ=ÀôZÞ¸>‰­Â?•ŽÃœÎ¸¯Ú“ø7°·ë7õ$þ À«è‰^¯Èš«w1’åtÛüU7F+üÆ#”ïÌVN*}ç:“_·eÌã{ŽGG<‡LïÃ)m+cužµ¿mrMšóYÇůÛB>×XÂCM )²Jz›÷(\êÅÁ"ðùFÂçiû$<8¸9ÎUxóÉÑûH8ûª'ïåÎÕj.?Ì !÷TNDícå«·bDüƹ!?–¾Ú«ÈÄÑuáqL zý¨ßbïwŸsÔÞØÀkÇyzžÙ1ÍÈHÏÇAÙÄ=ª¡zÆ‚¿¬>“õOò+f·µ.8®#–ä¦ÇÓô×Þn²Lá¾q 4…^‘_ˆƒéT/ƒ¦†4Tþàý¿Ú,¶¸ôT~죵ù&>r<¤LÇ“@Jx¤‚øGÜÒóóÏü™kGöR?ˆ‰kGà+ò·§ æ6·£ªß¦³Ï;ʈçuÒ6®¸.á±^ÙÇrv{|"Öï쌤0Þ®îøŒ¬0>Ìï¢ŽÂøH»-æ ŸF<Ž_Ì>¸8’K}Ò¶Ü„‡À´Õz©aÔ(ÕT~ð‡HH~¯ðÍå."S‡k±¼£ aé½±êÛè(7~RÆÍkæûê­XõŸh½­ÞåÛøQòM$ ùÛÄüŸ×¼4U:œ:øä“/ë"VG¢ÚL9»´Klâ\ºìö…ÛìGx,NÉn«ð8ãGFõ•jA'äÄýï‘"Ñͬ¶%®¨>´Ýö4ÓPõ©#·&6qÛS! (<òAVý6á± ÙÆâ¼ÅøH´˜U³ÆúÓ †A~±>ô à‘]Îì§ 5’ëúHÂâ¸Gßò'àm“ýÜóÎJ÷ÚÓ‡¥ÿf^¹SVãGi7±ƒ/waF’hëw ³}sÂß\‰ŽOÕZë°>ïIO1žùŠ\¹”lÓ–.Úƒú²X¥ðB‚õÉZééÝZˆí®ÂÝÆÕA¿•1\¡å·q>~R"ÎkRMm§(Äø(º Æ¡ëWÜ^Eb£^mŸ^bŒºÍë±ÒSqÍ 7mÓû©wyIu~&íä}_3¦¦Ÿ™ òèKPÏäðW†Ín^vfØ»XÔ†©IûXGKxÈï Ã(×;që8ýÖº¦¬›‰óI˜;CsŶ‘„YÊ#™“ÞËÃO3qüŽÎøIY7“ƒ9r˜:ÕÚU9ÁµAm`NõnÌßY!甞g\g”Ö34~RþÍŠ Bc¬™G³ò[oþ¦A·ÿ®[O_ž${ÙÖâÉEäæ”ÎñŸ´Zlf”–ž§H”™6ÜI·¿÷æ÷î¶ùB˜:[¼Î:\¶å]ïÎ…“,œ 3y0L½X¿ í«"Ù‘C+Ÿ²n"Ð@+ÓR óFNÕ:–ñ‚Sê¦çÛ§Öñõ¿_9«¹<ÂEÂ#Îz7•ê6.¤•£oÀûíÄíÑHv=ì­N¢äŽX\6ÅçÍ 9Ø‘EɧyŸú6;¶ð1! 
õ5ÑíÏÒ!œ7¨‹gzñð°mÉm‹¶"éö y¥êÇÕÑÛvÛ©:TALV.>N~p¥ù¨¹1¹!Æ÷æþsœ‡¶º9Â)^ý>÷Hš0¸øWöÑkésÕ5çQ/ð:Bå]ΑË7ˆåÝ…Ç­¨“×Ü-.›ªµ«¸ÂÛVÜ–ª9Ô \ÃgP²üõd[î«ð%4#6¡”ÂB¨ûóý³+#ç5ïc#ü =Áô|Ä<.œ/â¡ÿ3½M?Fræ8SåÇKýȱyûVQž„Çþ¾ºÍåT>׫;çT}è‡Í;FêÍØ¬ÛÆ<ªã§5€–ÝI—V~áÁÇ}ÆDÎÿÁƒ±A”þMåÄu/ûºdr^ý_{©¹œÐª1¯æ6ÕŸ4=<ר6DÓ›©xj™j—ïþs Þ6fZÊÞùYqMÿYqDÓ#èxÝ\>†¡§¯ðˆ“BB•®ç'eÚU|GÖ=ß¶Ë”ïeT|ƒ\yq.ž¾>+NŠrG´|,°’=ÿ.,’=‡Æh[Aø¥ øtŽ62M^ô<%éø‘Å`pŒqFz>º¹OÚ¹UÊc­II;„:ý¬ ³ÌæàWLÊ¿qY†™¥ŸHÓ«Rã½Ìf¹ ZL¤c:£qòrÕHÂÍ:|#³ô¡`©Ó;÷±&)tèŽ>ϘW¤8Ý£¤çÃ+ø|¼–,C[]ºàXy×¾4MR¸û³Â^_iÊ 9BAØëWÜJ¯{õ¥J»SÑÂÚŽjÁ­Þv‘›`Þîb‚°ç¯äÉ'Õ³`´„~è¶,3>TüÈÑîçѹ‰Š¯ž*I'n~¤_f"àŸ¾ûR33ðš33ðö¡›k†Í¥ó¬ÇõFÁMW~ñ`££¦ç¯Û•І˜ÎÀ¿»µm_~ñxfÇãï2<”×:á:O—€y? º^«~fþB¤ñ¤âTÈÕ.<>/«‡Êi©…ù“R|¦síéü3ª üæäéüQŠOëe©÷iîC«§rN Ä&§r¸Ð,§Eç2|âtr¤ÖRÚq®ãùÓé÷§–ÝÚî½ Ý'¿=>“Lvbf’D/OššžhqÛŒ<}åtúýkÈK2…S.äžÕØû’€A,†oŽ3ñ ÄPš)Ú øÀ §f‡mËäMÇ.æÞÔ^7ê` íµô^æ1‚çU?„Q—ÌèÉpwý¢)ǘ‡™%`RÌÃÌD<±ºBÈt"þ]nq“½Eí:U ´æC3‡­§ZÑ:qÿÏ/CdfÆ=E9L ÅðWå1>Í’™_’ƒ4gç›8oÒ^ÛB8W¹•0Xiæõ…eü^áquʾëj³¾ ¯oN§ÜÝcÈÚÌZ1Ö<¤ Lo‰ÛRù¤ó·”*:†Î½ª¿âc ­ö’|ÿàƒ¡r ãlOá\Ý‘Âo ïÿZƇô#¶ ‹•ÊOcb|¾ñÐaª}€ÍÅGªèuväk'¤Žêc–ZkC7‡«ÉH1PŒ™šYô™¢ÚË:cì’žŠSÄÒêeíoàÖüè£j ¤›m…;·Â?Æ`vs±Î7â§}ü¥ž`bu¨Ýr=™Ÿ[áìÓfßÞŸ‰†ÇùåÙ Ô~Ö×MÏ“*¢Ë8³>:ØÔVG$LÇuÔMœ—Ó‘Á‘§bw«f>r[a;­¾ÒTøËL„;‚y mIœæ4®vµ ®]„•Ü%<òUÊTOð˜ü³^“poÈVSÚεS4sG ¼pWÁ®YuXê —s’rÿŸ7páh‘P:¨ÇN[gqwþ®Kg¨×†lKxêSE¸§ˆÁ™#Ömˈû›3i¾,¤ðØç ÷ðÖÞŸÁ; Zâ>¥ç\˜–‚™[ §äª¾/à÷nêP5k¤W;Sñ±*Âë?©œ8ÌŸ88U³Âoèµ*öôf›V¹êe^*¸Ô êÇR^ŸFG8µ“úÚ'=£)ú2Š,È`wÔ,¬M'Üß>GÁé1¬¹!J<OÂý ØÓX=ºÇ2”#´áï¤ÏL´„é•ô×¼bÎlô3ífö@Ø'áA´±«üàQÌu€Ý(œ8 £OãqSóTÈjgª“éLÅL1ì0weÅH9¦Aà$œ4ûˆúé4ûKV)Li¦w\ÌŸ%—9ˆGšéùË ïªÏD³ãÆÝjÜH³›»2@ÙïqlûE=Öưã7áäZ'u|¡ÕC]xfÅ™ÞmWÙê,ʸ¦êªEœ£œ¹¡ã,|ÇŒ*n6Zö—¨u³ä '}E5‰'1DÐ 9Éh•(b½íZ*·ÉD¬Cº¥¶Tzdõ´.,¹ø¸xw™éy:³þZNfäOÃõ¾Žë‰fŽ„ÿƒ{þ«MÓ`œ«½s¾RùT 3-—³l9z† ܪ~œ¾ãC/êºw¿:¿ÉÍÄûüï3ï);ªJæÉÈž{-Áᯯï¾2 íh+kË\Dbµ„¿ÖÏ: ê…‡X8’œ¾KßJ’ë6ÃjŒÌ•Bßë5{„KêuÙj\…‡Òè‚ðn)] ¨ê0¶Ù|€•!Fû­ô<#²mv¾‹ÙúIYF‘ïØ#¾W"Þ¡7r]Õw%íu[™ÌXKÝãû>®i#œIxÜÃé1SV dŽÄ\ìw%ÞÝ|^èÛ©wx¡E¢]+ïæMáJÕ¡oÁÆ.Õ˜i®G™Mx0P"äWi²ÿm¹è;•öÿy‚Ô…‡žó/œ²PXÜ[xQ!/G•˜w³gºßº\μ?ÝlbÝ„38 
Ët*?ÌIÕçZ?#½ÏÊ"ìˆCoõ‰wÐ23ë'r’>K㜠ügc=þjÉi‡U<•sR¿Ò±rè;¶O´³’lL§ä‰ßVæáŸ+3•Á}ÈYÖ¯p×O‹£y%F7FzodÇZc¶²gªˆÏ¬*Ÿ!2¶U‡³¼_\óñ´æh\ÚÍ•¹¹Û("cmíKÕgåD.åèÎùÙÞgÞÞÃRXYEÆv©Ã}‡Ü¼}#ö  Þmu+ ?Ÿ—ÃŒïÞ Y>7W¦äá¾ÝžðÁ–þ_rÞ¶¿Âƒê3Ðck+Á7× K/ŽÄThæðЄkÆ6qó¸Ü–_Ÿ—…1tŸ}Qx0ã¦öQ7¶OUÃYo˜X%þz‘Žùnb‹ÑÏcKx¨§ÚöÜ9`—èÄöž„ÇÇ¿X£šðû¯1´hªýÁƒV.#n°®DÍ'Å­•Ãâ1k9¾)%Ó‡mo³>ÌüJÌü/³s) ‡ú¤“€]¥Rá­›%¶Õ'mä·qîZ´”,Q°¬fúkÆÂÃ)Õó¼{º²;>TŽ–o¦Û mK|š‡E­¼.öY‡pw ­Æ¶¥&<äßv Ö~%Uö'™ãè¹CçyªœØÇ$í¹»l¬­D¾#"l®ªÆøäE\N¹7•êÔ‹-][ÙNNØ™Ú-”Ü)ŒfÆÿµõí¤è—·>’ïÕ>ã®j{¨Àîn´7)%ó\ž+=ï+áY;Ùé {oVÉ]õí¨&UÁAFu$=ÅXOʱ’¨L{Â8ŒÅÇÿÁ s¯ìô#‚n™#3Žpçã ¾ÂeŽ¢2#ºV=Ü˲}£N½×çös_f¤ú‡=Ž|ŠÚ[¹-µŒF"Á ií*܃·ì§pÃaØ;Îúü~âúIÙNP«Š‰¥~ YÍPçDfy3×nú& eµéî*+±ñÖ‡ÒÂ#´¹¶ÓrubYµŽì{ Ê Ä¬.—˼ƒÉ%ŠAïÏ¥-¹Àdãm§Üö|*Çù¹sa@°›sO¼!›•dÜÍñÛªeMK˜(øßù5ÖORM¼ðÕ'* c&œ 9­Gv~>uˆÃØõzˆjÇšÕ¿#ÓÖzÄ;¥ÇÃ|_Õ3Á­ç±ŒžJ ÓÃ|™*ÒrßV)±³÷:" ÓJ<|³uk,ÑÁÃëWNÄÛ6á±û@ñ+*§<¦O4Έ%T컀ºvIxðXó–ðërh{nã'¡iGæá< ø…3£ûµ¸vâÛqÐüáô|¬FfëûÐÞ)ÐÝæ•´û渞@ÃIœ;t³‰]ô|©AýŽÈévœš¹Á»3ߎé×Nz>DÎÖ<¾+ï$Òq¼âGJ;+ÇTÜc;ê²m8›ÌÏ¿(•®î¡}ͬ½©œ¸Aypߨ lyxqKåÿ\ÖiïÜ?JuÚ'bJ‹Š“Õë-Š×‡8×=SxØnõ‰}î¾ÜÄyý¨Â“ÅÍŠÕ¯pçd‘á\ÝÏ æY­á²`;óí¸žä[;óíGá©>tÝ! 
t—ðpEË•!<âgp¸3UÏ emˆ‡„ÌþIiO¡Ü´Vª§†·îl±²é®_qéac©ÖŒ«3ûÁWŠ™v»z×Αn1ïÄ´#aÛì-©Æ,³¯}eÙY ÝÌŸªÁG¦½!I§Y߉i/æŠNgàwR™tÛY]:e»Ê§¥†ƒÈº„×õCŠ~/OEáaÙu{qW{ÃRkÐ÷¾jo˜þî‡XÁ‡Õrµ×'ƒ!;¦ÞK½(ºsrRªùâzöNZí¸ û@xlƒò¦w¢Ú±2v*߉j·õ v¿]³³”Ì#ˆ bÂYGÞ,™ÛÔj‡sŒ@ß¹…GÆ(+dz¾ìü¾»µw§çC¦ç /Q*?N4ì›TÛÁÝ’ŒždmˆøG CåD4 ’ÂL¡>‰&@®JO:À{µTJ2"Ú¹¥râŠÁ‚h…ZËÍ}4ev’jÇÝVdçþö®9‚¡2HŸMQj4wt ôtfæ ÏôÅZaìÇÐÎ ßâîÄ*{ÊXó³ÅO)QtÜ98Â}ÄyÚ® ÷ïâ}ºp¦S¼s°ç§„Ÿ¡¼Ûñ°ÎúxdN„ÿ^oš†Iͽž®=>…Äïĺ›¡1ûàw¦ˆû“ðlT‹¸µ¹¡N¢V)³ÁM=ÏD<²ÝwŽŸ§µ£×ÒNۤݟ•—¿R¡¼™ØK[é¯Ïç¯# $·¹-¹ãgšûÒ…‡*WEÞ-œ¹VfäôÜIŽtÙœìàEm @&Ÿ*·h›0yé©°Íÿà5˜ËðÞw£HãÜÏ «lt•ôx8‰JTjJž½KÄËNQðÕl°]R3ã "_]iê^E½-,ÅÂ#]ëoºùdKd¸#ê—jCÃ̱ÚSŸû¼Gf ä}lÇ¢:ÅA ¥[ôîðÏ µáiªÎrŸB‚ðôO­ƒ“ôÙÁ­>I-¦`zêÖãœûÏ/ ásׯ Ǭ邷_F*4â2¡›ÏèD;ñsîÛgúkzc°¶àñùcÿ†f!?#=I“ê-ááçâÚIO¥F4‘mš~£ëäpô'™ÕR¿´ÿÒ_±]wÄÑu5&.¢vë:'UNR”i "O.?8~Û“Zú~!é™N¶NæÔÓIØùê||Öò™~\ Æ?ç5«ê E‚sëòBO"á¼>¹®›'G¦ãr‘'«??J€Ú;Kz>L cÕÙ*?q1_—ÚÚç¯ççW°?÷ K¾Ù3{ ÕœO¸ú¹Ó»÷T4¨…>qvË…mÊè2ÿ¾pðL±*ç¤]ƒQë¿>Ô¤Ý=¯Ù?ùù”¸È 5ˆÇñç“Ë]sËnÈ7¢êÔÔs¤Íë£Á£:øD‘h#½+̆jÖÿì{Éž»hÜ:³¥(•Þ2Ši=W-kÛ¸®L¥@…—ẻÇéóè·í×¥ÎGPæ¤xô_ÛbÌO1ϱÅ@}Ž•ÈËó“S ž,#Ól ô‘þ:&¯ÙÕ;ÁqAoØŠÅ9±tøµêàfÌz}N0{%Î#R3#\ää ô?xPØbo*'îåoaçÏß‘e+†ì¬õÙ’…ŽÃ®¦M’ú0`­émÜœ!;Av‰<ýU;æHDí¢N Þlãħ©v²ºÌ½Úê\†¨B<µè½Áµ|¦ù¦1Ö÷s:!<¼‡œ¤6†ð¸a¶zp0'E²ãhÑÞ´¥'Æ|mâŒJ½°ÎõnÚ•C™œó ½7€… -UíeûŽs$}.’.gíê7Æ´ëÀùIIR+‚…¹Aœtz‚H„W>›‹Ž_Þch°Ó¸q+¦±ƒ ’cÒ½‡“cleØ‘Rîü(_*ò#×É•DùR¿XÊ—j•qªÿ$‘ö' ävÁ1j!³sT )÷<š¯ô<¡t1¾Â^=·ìT|œQ™¹Ó›Š§€ÀoœN¯M]uÅ ¡C¼Óã1HÌGÞ[»?’@–‘ðý9zÔjncã–'BÞ¶ßÙSßËl@Vmïc…±×­÷›IDÆzìÜüüäyȘ¤ü©ý‚ªºÂýÒ•90{EsSþÔq ö«çe6¥«ùDÉ#Yö-áÁ&/©ž LfâØ¡¡´ïÝÂ]Trw´ _JÞ¹6ˆp¨þ 8"\ê·Ø(®z25Sëìô|l>Wu¨Øß˜%æ$J¾"¯À-©˜´ìO´#ñÌ%ÒÑFí«6 ð±±¢fvÜ’ÍX”<"'Wìõn ÃÝOí8é4ã` §pò¬´xªä'l$Ђ!Y‡ÈÇv>š3v¬*F|¬YÀK¯àÃÌä4ªçó~g™±ê&©™dˆ$ ÜuSucmlã©-’gOLOJ¢Úž —Wxè4žZ²'³›UÑä$TÆÀB»¢ª¶3‹êg¼+‹jƒæÂã^î+Äz£`vœ"RM8WQ=•³Ãäßä|Rì;~kz:‚DÎEôñäë%uö'™¿{“è4ßùšÎZÏOJ¢Š-~sÌ+‚ÝzrçÄC6rÚg!$¡öçÊêêQ&ÜÛÓ,qÕNœ™_%=¿ríbb#°£Å(Fþ/µ¶2×Ï”Hõ~ôdr[^áLÊÖýÆÅurÞ·23_Zé&rižª"ßeRE€ÛrÖ›tÜat\ü»IN¦B ù®³óN_öåÄñuzþm$ì‹÷ƒßDÏC1|xf½›…ÜqÓoíߤ'åTÛ<Ô®Díì³R9q<$/ãæxøÄLÞ¤'óJ§/½7©OÍx?üwÉ×&õv3™][öæø'qWáq(½\éþ&í˜ÌlÞvŒ¸Ì›‰yHz¨±Ê›ú¸yM8cfªÍ¢•ð0!ÿ>³s†k_Ý¬æ¾ ä »ð˜ì ±ÜéyÞÅYÛ•un–ŽAôgkêËø´àOúP9ñi‘EÍ•´/ãßõk|~íÏ/&Á Óý&2þ7L 
}#K¾]çNg!õb…çÍÅbHÁ?÷îHôŽ™Ä\3‡„ŸoŠjÿóókAùy$<"b;¸2 ¹ì³yfŽû©Rõ‹¶÷ó›4bÊØ£ÏÇÁ€x´›EÛ“ñp3%d’S“ÊYk¥ncÔ±©³R)‘ˆáWËy–ŠPÞ©&6žj­å!]7)Ê üÅF¥`õ›ÔÛó…®›¨yF.º¤æõ‹Û ®èá$Ávõ¸Ÿ›yò”= óÒ鱉²+»Ié Ž;kÛÜߣÂi|ݬ ³poœ«¯øøñP\]xˆæ>k™ÑÍ8sp7ó~âÓq^µUNž¬SËï#Ö­·O]ãË&"ô¦u„ Bë’xÜ#ß0™›JeB•ûÈž8¾´x¿ZÓÂ}¾OTqÂÃôÞl­Â™ê ^-$KÛðoÜûŽìLÅ0KL‹dr7Ǫ—‡ƒO¸ÇFàp³5+4!jÔû^!N¼Ò[ƒ}±¾wQ€›XúŽþr¾ùf[Õo£/(‘ŒûP0›½yR¬›Yzdú™Sï¥SŽë‰[ÕµùúnIÏ¡½q6«oÒh+"ÃfzoF8É^ `â9ÍÓºÄiˆ# /ñôðÉ;×Ì£“ÙO}ÈÓ׋ œ5•èD l"qVøá*yb#€àk2Nøk JE‹ƒMDý88Ë]ƒäw—®zÒ&áÅ›izè+žüô ·èT›0àpãq4õCgn–sG(ÝL}Ë;ÈàGFÂãZê³©ÔÆe«÷£¦óš‹ù’S-gª­ Ÿ%á¼»JpçÙ\=»]=Ñð¶B[æ*NÒ6ÔKÇqÁkôÕ]ÕÙëá~yO} ê}¥rÇroiéy®eX„ô¼Ì;$ Ô{«âXîi C;høŸTN8Âç°zo8k··ªUëêD¦õNóQÔ<®OÚ¶¢æë²wõ&å䎭7ùù˜g}JÝb3c¥ç}q½ð³S73³.ÄUf®„lè áapaYŒy¬ü©0uèŸ%wÛ1äWåô©fÁöÍimÅPNŠ4ãB¥ çQÁ–¦â\g a´…œCxã´”9šÇÅ®"Þm¬/Ü RÝ|GÀwh%ì¤é~q¶ÕFÞnÃÅ¢ <’ Íê¸u¯çX… Ñ“ð¸›ZÛ&õ Mw4p‡Ù©y¤jªw&<®¾Ì:sõ —Ò¤šén “ÄÂÃÕ*1 ÅÂÛÈ)‘/ìæÀøñ„W¦çCFùàÚÖÃâÊ1K ß7*4bÄO…Gšd@ôhÿË8wýŠÛ‚Ö¾^ÅÈX3ÎzÆh÷ ÄC£[]3¼#ªS•#Û‘OLmi-W.EMš}¤&$ß=%Dí6pëN¯>Ÿ§î¿¿e§Ó”qrnö±B_á<99§‡VíŠ0ØRÃZ®Ò”xŽrÃO<» âÕA×,~5vկȄ°+ÙañìÈ»Õ4ÚE¹™¯Fz™"çú(aÃ&‰v3V{méyF½nˆ"'ùöÜV9ÌÜbߘËxbÜÍ¿/dFrLûýðì7ñì©4NOeÅEœkžônò»%Dzû¯ ÿ6˜%üo6xáÍy  öô“ãÌsk6•‡z` ž}ÁñØï£,H<¤¶íË÷ªÒëšØS8=¼3Þ3ÇcW†hÀMå»:ä“áðÙß¹W¤3ÑÊÌ¥2ŸÔpõó§ÁÁ¿­RÃu÷_‘÷HÎq3 $<Ò#ésUß1¹=’>.Ë‹óšOLžÃ |N0guG mëW¸äCZâå ÷!åb+_Â#çÎ@¸)̘ӷ qvšyQ~çÍsûzwÏ.<¢PÀ¿Ö-|þãuY[ðšøº.¬\D<áç__E¤:¤ùûR5%!`®ÃÔ{3i3ùeØ=‰=ÖWñÔaUzïs„ßpŽžÜ Ä™ˆ¡õukÂÃ_ÀÊÔUéP›¸°=›ðQ\Hë¦>—=Œ*œø|“X:¸Ç™J¹ÿ.%¬3ü¢;d_l á~ÉÜáüÀʆZnAWáa~#“ÀMϯÏÛ¤EÙßl?/F÷éÛ³:¶£YÎov>ÇyÚ<®f•vÿõywh8"XqNÁLíž,½:T›+Ò3©ª´µï…‚™pW¾6°6¡áú=§+z+¯¦¬=_éÇÙËöfµr|‡¿8 áô°m»½œæó³U3æÝ¦Æ=‹mŸRè ë‡pw¸/¢öû'±”u*ǯê!ýeUxÓÃcMåÇç.Y¦ûâ¹pÒOe:FxkIÅP]rÏ^õÚüÝE²#ñÐäÊœ¢Þ!†ËmcRs™CµLE8|„TØ¿hÒ¾ö_q]œÚR-hy5é¯}§~Üâ©çãƒÃÇ]Wm²Í|´»cü-º`å9\ªNüW)ÿgG’cGF9WÆs<8vتw?¹Ôp™q­qëc · 묖ð Bh#Ýú,á‹+þ+"î!ÇÌ‘”d`V\5(¢Zn5Ïkꔆ°é1òóÌí‚—.\yÀ Ä̬Tþþ/åÇÙ*8„.œÝøœÔ±!„SÒ!<¸’=·*Âc—: Œ–¾cÉBì :¤ ÷%jAš¶máJ9Zê4#Œ²ÎGl(®;Ý+<¢—aüTÁUÛͨKµW;ÈŠ„GšëºÕ Œ™Õãü‚ÁZÚ†jë½­HgU' 8›+OËÎ[¢Ôý×úü¢5Ú+™-Qê¸nd[»ú%äanýPê5Qêý‰à~÷´ê”úÛõ xíΚ#ÖÍïôàÇ]cöá‹ß)W3¥ ×{ëãÆÁì!ÎX¸þ(Ô »ÁOæ£-ÜwIÜWy»yv×®rœBb¦ðw(J¿ÚŽÇÝ`\¤V70¡1L“T©ð›¡TÕÜàR/VžðH¸´Ì(/zm,ð¿›«[h}½AއïŽ{JUõ¤Þ¼ÀTÿ•3ó~Œ—‰¨‰gO¾JÍR1ùØ^]9P…GÀúo<$( 
56Òk#Næú]°fŽL¤PïSxèíÙ¶ò ;î3ݶ2\ 9MÐ÷©˜·“!>U|·¬”Œy´‹°¤ê0¡í£SŇŠ䈦ÜlCv*>âä°Ø½ÁTŽGö*ë™÷¼ßqßÔÇ“[ô ggº…ÃnÚò2|#©‰n·5gàÞºp‚ ª7áqñäðªÂý¥õðûkŽa¯°(ïîŸò+‹ÕL²1˜rsgš¼û^•u8tg¡áPR1^{ì?¯’’ã!r¿/.YUüÂJåÇ ”ƒL™ÿ:Ìš bGç,©>œGÑ{[¤“5“ê±rœ×¹®‡C9C¶øVý‰ü§O̪- ]yͲ1þ+â\Í)$ܡٱßôŽù9Oy¥7ŸŸRȃp‡CÀù\÷ {¹í…‡ ãÙŸ”î{¨a·OØvÜ”þô–³^ rÇ#¶mA³S0 »¦%f@‹ö;S)í ¹®"<¾*b=÷P9a¶ƒ¿ôüÞ YšTfÁ…8ðT/PjØÔoê…°Ûô,K'ÎàkÕâ 5?ß=ëΘÝÉÍ(·ãÀü áa×B¼oNáAgCã“ßHJìØ5µ NêF½¿âÃã²àLuÝßÔZÂÃR¼ðøÕæˆlùöѤ· QÈ!1©½ï¿€k¾þºÂ™9v‹ÞÙæ«u"Qï5 °Ã‡N«pæsêeÌô×Vkqž©2Ò]¿"¾}= [—»­‹“3Qúë¶öÎ)§ðö³qñg ˆæ…lÒCxÌ!´fWâaª™‹…üLÂ)=‰Œ· ]Ûܬ7Yã1 žJÛEñí°LgQ=®zqÅRõŒ¥÷zMõ¤ôÈn5µ—d#f¾ƒýÑ8P0î yá×è½á¡=ÙÁ¼é+CNR¸g@F¦/íp”•ÁÂ9/MµM&r•#QÄ|ãZ—ðˆ]þG€;¤é†àØ(·ù|7½6¸C25 ´# |åžÁwž€Ðñƒ™òG’ˆÞ8BDËÛm먩<_ÅÝ…T—HKŠ ™Tv˜o¶÷¥RxâòÄ{ëyž²á&mv%I]feÞ®¦ò¼ tÁROJfjcUN3:†êçmOdóTùôþà nï÷9wÎûtVÿ3KšùWÜÈÔãÈvtzR)¶ýÜ¡ ±í==—¤Œ-ëeë½ a6¬p=f’T(ßJëD±í êÀMï!‚»'¯¨‰ãÑÍȆ³Sù®> ûbÑ‚£f;d€_!‡C¾ wGÂ=Ë#ÌUS1a˜>—v¦ð8Ø:öÙª7ýoŠåµì¤«OcäZt^±3£ŸÚM*%™Q—‹U­ n¦‰¡Ç¦Ì±­øö¶ì/¸'†þ7cØ,€v„ÓÚ³Ýëž ”«p©êl¼·¡×yÓß„ŽRLr (/êwý¾)®x̽ãšoM¾ðÕ>€„f齌•>lU9ÔþD«¯úšóÿÉQªoó#}êIÏs#¨®ßøâŒŒ«ö¿ëïÍZî8슜ƒÙ­;ã½›ˆùƒôØ.¤ànc?´”&Ô› éù »l9£3‘ôe°é¿5çÕÛÅcä$½û‡þêã8·¹Ù« éméÆêšôed÷ÜT3eCŠL½ØjôKßWt<ÜǺ "®[u‹Œj 7;w*†j¥PžW6:›ÓC(kÒmn“_ˆÔ¨þ«2¾Âz³ßðE’4ŒmÁV‰-œ‘O XáëSj¬òms ]‡Ð>Ç45Ú±ìÓØAU'móÞWÇÔñ¸ˆ8óŒÇ­»6ü.¤ã6wÀs Ž©ÝqÔ)8tÃn ª6ÉÔÏ|{ͺ1}O¸ªÂC#l [1Õ±ž/D¾Ï„S™ÑÆëHïfÞk›“W-“”óõlzÀ™#Ñ3[ñ9뛈XˆOƶ¬xvÈù•3ÇiÂsIå§Ï-%Äqï©·)‡Þ²ŽQ©ñ½m ]+6×D¹Û¤\´}ŹWÜqíGµf‚¥’l§È}Šò‘v)vãJ*ΤžHåG®µr§ÖÇZnÖæ¥3XË ÝºJ+“îõCº×é^³^ÌÂñâV¿’a­æì\µ”“¶}H÷–Tbpíý¾Ù„^HTšðXûNñ›@އ"¤}ê7‰’ã¡bñ„W¤rÜ4³5º{è^K¤;lwSñäÜE»&œ Ñö“² .äšMÍ¢DôRõã£w[êëkR5W}Iî"ïß}«1Š]¿âûÂH¯æ÷mIÖ”^*üñ”¼j(ïžì³^m´ èÈoçñ¯-±êÍV§î¤]Kº.õ¹ Ù»ð :Ì-áÞ‘H’òÞŸqœd¡ý·‡ðúֽĩû8Æñð’–ØqK²—ô¼tû«OâxW/6‰ÂÎò \9œtívW;HT5†çf6¥^9‘€¹'¥E‚à †¾ßQ_é:éõ WŽÏ\çX½;ú_ÇaN±o-‡¬ï'ßÃR_Dýj.÷‚ævÂÃåþópê”à ¤Pöý„:áWmIyNqŠJeô 2 ΄Kcn¢R©ä|·k M‡;³ºðˆÀÜ= [–dGÐÐ{yÎqf‘¼£§FÆ÷…ÆËܪm® D糧]䆅ÓP|to…ǧÆV5†ð0·Ÿëz^i¬;’d§‰cž~‹zÚ_ÏíNf²ìfÚ¬ÝMxÐ[O¼!œ!÷mܒʧ̯íy¹œ°}V\¤rœú=bGZR©Ûõ£Ë–eÚHÛ£úsUG|—P1ƒ"[¹âÔ,ÞMj ]ׯˆ­_H¬W“l5¡q¾ ]&ýGަS•}Gb-ëµ?i”¯êÊõ½BLx 
Øóî³´½Þ~”è¶Æª\-%ÓŽs޾êO¶™4pÑëßIœöàD§fõ¹z®ã±:=:;C¸Ÿ=o³hKS}b‡rªNœœáÈà6U'NE¡0ÛjÂàÅUsÕÆ§y³µqjÂÉ·ý&¡rÜorA¬±WU‡’Ë¾ÜØÂE÷|ñðga/sïTîSÿÅ4j½ío[âÒ°<•*œ+¿9WlåRäèÛ/5çÞßE?3•Ïë@¿ðT;Ê·cMÅ!.q¥üÆë¿ßûùîsÓ|‘@ rWú)ûÉÙOW¿‡+¹w¤Ééj#o›þ® Sg­ù*l¾8=.Ü4Y*‡©³ÌÖ9 f¶»ºňhs{ÚMÅÝ~Êäx“< ²¢Nà5äunQ ó¡V|çC{bë0íÞâ×ÌZ¢ÛëÉr3¡(‚|¶ðXÝ‘º²¤ç}uŸ éµÔ ¸³²úŠ„G„^ëªå̉wN¼¹BŒ›eqvIÆÝÖª^Sé¨e®îY‚}nÝ7AñHІ¢¶ô|KëåVl $y¶ªFÿy[Í¥†*ú¹µ©nq¶–(Ж5aø§Ôƒ‘æFÈ]ú²’àBaL4EÆ×ö†¹ kýfÞŒ!œ÷”Í~oMÕ©q­v^ôÎKÀ¿CÇ\ª¦Ä£J„§6çßqÖ-ǹ¯6j«ªMœ®=¡©µœöÞ‰å&ñ8ÅX3•ó/JLzí¹UçJÏ3 Í:¿¤çã^‘Uˆ^ª"ã!kÞýDK\<Â1æ #>åPœèîÛƒÙ¢‹^¤r¨‚ûjí+o^^›aÜKZ†EÝ/êî`a`?º=_ËU}¸= –£M=®'½Ú†ŽÇö†ä~+á<¼fpCKº1q²)‹ªíì‡Cñ¸à¥U5ź·Ó]ÄÜñð´:ò˜§râ´·è·:7|l\K}B÷¶ãÒãÔ“lç¼¥·ŠTÂCÅM-á<äÀ•¶TNTß î»Ë(n5ìTN¨Ûµã71[æìmh¬võ‡³GüiÂã²ÆÅ¦œp‚Ì­kToƒâNÏåÇ ÓT‚)™apI§  ØnE¶Q…sñs\UB‡®kö›žŸRçç—¯”f#Ý´ðH,²m<]õ;ÜaIpX«ýÃØ÷¤$S‘×­³ž„Úë¾ý.gý'eQ]`×ÞN뉱¯OÚÏ•ð«5ÛØÃL{{ÇÝÆWåÈq&kÇáRÚÝ §R}Hî &æ Ž´ŒQ̯õ(SÜq¢¦'ö†˜®bš.a÷¦{¦îûÁ5+õïµã.˜*éû86ßúŠî9&3Ž Sé$ôÌlééñ $!$qS-/§ó87ú¾iGôÜÚÂ#Êo<.ßçʇððÌ´\¬OS6Ÿç.X*'üçG¤-áÌ º¦º¡ÑÚ»ˆÙážÙ|䛨Mõa®Ý‹íê ŸÙœÝàãjåˆá-ЄÖÁém!'[´g:qBs¯ï ‰Rzžª°P0Øjeè­Ëªé^\¿{'sOÌ~…bû÷=+¶ÛÌlª'9|Dý¸Ú²ãQÈï®)<gÁ¥ Ÿ2(\.¦gÍv›#Ë/’trø¯º òHçÍY¿JÜÍ÷Ñ:WåÌW||Ã-¿’Ê!-b–}z/¿¯}GwlzRm7ƒb×µSùG££ó«Ú—Ûõê¯ã«ât»Ö#Tmõ÷˜Êþáðnõ¤rèv€P¾Ä)Op t’ðÈ€ø¸®rÄá#qÒ'Œ{"È[¸gwGükoª's2@€y¦÷Æžc⮩œ`ú£»Bœç7=éÔ<Ñ:~ÂÐRâÕ1ö|³,;;_1ÃýláA4Ãz©émÁ9! 
Wª5Ïçëî«vÄû/ Ï<×99t)ónµ†ç±‘—š=ÇÍ7Û|çN¸_ÚsÂIæý¹ŠÛÇù§bÈ:ÃæõKñ *)]£rüDÎUýò£ïŽxœ›þ:ògÜÝJª£r‘4ì5èGÒ­)Ï­ø×‰ÁÇÕƒ½S9M¡âsdÜîm’ ^x>÷Dºp11;­¿y±x(.K­çMyÝCõ§h¸ˆ7­÷‹»½·Í€j§þZ˦l]Â#è 2Ý­ çA rY\á!ýׯ¸#á±;MdÈkÂcòϾœXNâ»y¼“xH‚”÷]t8‰ÿÖs“øIº5÷Mäã8ÕYp'?= ¦ÐÈål·i:Tçt>Îçšpê›l\FW»ÄEaŸ‘Ø|°êH7 D»óµã#v3]üe$©øfUï~B3’v ´f#jg$©ø†ÀðÁÎWþUÛRÍG<Â%sFQ9Ô%‚Y1TÞ‚´‰ºðØ™Ìß=•CÑ×áù&òÔj§z“Y\žãõ&oÂã{%˜têRñ±`÷¿GÕŒë{šËZÒkýTq<äFÂC*4©›)<ýD|©>Ì¡ˆúñQ½”×/ïRÛŒKKa$ »îÎè·SœŽ)ßlW«(ÇUJË:.ty„‡ùf.Ï›‘×ñ0?!×S9ÁGÿÆ#Òáw9qôÝqê¦vÅHä%ö0•‘¥ä¡M´Óó±ÝW¤jKíUd~ÑLÒŠ»8sö.·:%ÂolYeéy*RDêÃ?Ü~Yn$Yù† ï™šñ× µüÈddRÿ@i¶ g~N,+CåtÍ=¿>³ÛI}ý öƬ³ýaþ€úwq3G޲.4nŒ·68•®\­EÉß<¤KªgXp«­}¹™,eì2—H[ÒRØÍ…|‘*IÙoV\Ç™›¾×ÚTÏÂ.¯Ÿ#ÿ‰ë8´(_ƒ-ìrÍß±øáœíp_ÖN}#Dƪ:Áå58ºCpœ×˜…Õ¶jC¢¶¢3ÕZNößø Â2†©šÌùeµ<„„cS£6Õžüù».M22›o#~ÌÔiq ÛúØ&<4ØÀ]u>?úáÝž‘elÌnÇÿ… f öâT!›)ÑdˆÍ?32œŒ¤3Ìç®ös³›Ša–V8qMxˆ×ÂçØ›ÿ„Êv=í$<¼£9‚™ÍÇMÞ¡ÖêPoÇÙÅp6ß}JóïØkWg¶÷êkIØf6H¹«:±2@ðe¦n£ùõNœî™q~æHÅD”­9¿o:SÇCcË@l¾í[»ÏôÖ·öKÌ.ª}°ù*Ñ;YÖf$Yø£kLj‡„‡¥¦Ü¬8ïÊSnótÒRµéOȪbB™v ëxÆ#²ÑßÂÞT>’©•¡×†w¹sîåIÔæI׋´Dml` 'ìî©­¬­UŸw­‘S9– qùV›ÙZO8 žûúÂãò ’Ã5·Ùxû¡úÐ:¸çù!=[qëSÅPx™”)_+¤¥FøëIn¥+ŒÂׯÈõñdoM¥FäÛn2Q$uóhHœ™ðHоAvâaû÷ƒðJ•Ï bO~IÁƒ³îÌ–b\ó-j$㵆ó÷¾MÒ’_n›Ò´BCs×£"{&“ß<~ˆQª[ìí¹UœÊð(臦 Á#ÛôR×?Å A<æ8Î!wV/‘ï6÷Xž@s$"•ÙáEJoÞ¶1doÂãXáÉG'8ÎÈ‘M/vÃwÿç­áLÃJ ÓBÙ\ ŽŒiÓ*«á ÂC8ÃÀYS9Q}\ˆ.©œ0½Ù?UÏàp¿ÓH\>R)w¡ ‰yömá2^SÜ}g]ô®¶>OÅ2y ’Ê ×ÉL©¾Ó_{ÏÍç:¥Z§ù½ìæQ@3©ÑÚŠóùU£G`Üù1`d½Svf&?}á™…t’¢¿ÆæÌÂ8¶•…JåüQ‚W3rÍ ~C¾fâòqŒQüVÃ$—ñŒ4jV $Fm#Ñ„æÍ¿4âtNß÷áÛ\1ef-zû@ÝïZÍÄé#ÖrúIÓLœ~]Ý“ðÌ$•SóÖR›¨mqœ¡>ˆ¹¾ž *ª%GÆÌ<þï§xËaØz3tg8àh,&<¢6vóC‘™#ñ‘ö½·&Ù$Ø+êjb,¨Y¨Wb@èvMïmië%qzs ‘“Gø 'î w{¶]ÄÊ.âAçAšË¥‹'Åoô+r2îeý5„ÇIòÅ­:ÕŽW±Ì#,7½-Ü)›_éq²´8>êD×C­± ç%ë-Õ9h¼r«íL\v-¡×ÝÜŒ›p!t³‹U{nû™¦Aª)zGb9 Y¹wƒHR ÇL:õ Eçáq@g¦Õ¸ ¬OH%¦žH{èuî²U~Ì÷n¶Iz/3 øg_áqóGh'áá¦Bh¥rhÂî’ûA×°V›ðÈQ @—™HûG²eUµ7†öRä)Å”¦÷&O/ÓöóG cŸhÛ;RYat[¤RÛxéÑ ïü_Mœ•Uì‘RÄGÈJa÷°AÌÌ;ÂC¢]ó—ªð¸Üjñsº•UìÓÈY™¬oÈ5ÕKÉ\®w-^ÎÕûüi½§ê0Ѥ=—¡hÕUãåñWë¯)<ap7ý¨úL@bV€ëè.gëýÃB;„pLó…h¸T|{ Qu ö“ãmsúB±€Qñtåð{¤()©/ÃóïûvwXV¦í‘F ø•Âïm¢Y?¬(ŸvX‰÷êSxè$âJÅìÂ#~´È?»sÓðúH]?)q,5¸2sÛ¨~Che 
‹,‘Cõ‰þýX)_…}tusË꘳_ÕSñ¸¿p*&ÒÔ^±{ã÷ª|^³}nÿoái¦iÎCó¤§Öyo>h=S½ÅëTÈ|“ŸÀQÍX]¿ArŒqTk…ï@"0¾¶Xý°Y…G0FS˜ž·ð8¶ÅêãF¨«c¬o½¸‡ç­¬­s°ŠF<Œ|ëÅíŠI+iåԅମzòN5–è;„3³ÁF¼³p¦±‹+…×çЕ”ïa‹ö¡Zò[CCú¨µTÕæÎe<4vvq™¶•£ë`:Ô7ÉšÛï­¬zߟ ÏTJ$CpDê‚Xú‘,íî¨üà'ÇYw¿‚•ÿenvÁÐ9Îo\ᛡW°ž¯_çó‹÷è–ß&^‰ÏGxÐíœjCbØfZ´©W0Xs¥€úzûؾ~”Gö×ZåZ·ð¸Wß ž:„Ç€\ôTù±Ì›O®M¸~”Iö/Î fPŒ„3…„íG$|=sèJ¿éù8i薘ĕlé䣽ó³ÕOÝ¢‚LØÙÂyÿsv¿/ºr=Ôã*{eJîv%Fý£O0ù½ÁV®­ŠµOQw+EÑã¸`æJ(î¶j±“¾ýssê¤âW®[xvû‘HQ¡ý…ˆ•)<ÌM¤Œ+@´>JöÈ}“ñ`ïëŒÌ+Ó÷ð¸Y5‘'[dîX,UµæR^mÒ”ôüe}ÊM8U­q°[ô^ùfìÚ)I¼gïÂ)Ö…Ô·©œõl{æzr9¸Ú“žgž,Þƒ[IÍíŠÃ·•õp:Ô)¹Ìmó@ké ýjˆsÑÒØ¸´›RöØöÈ =œ…t zžsJ±Sõ¢yãè½Îß@Êyj„SÏüàHm cv©è¢Û?ù=â•y|ûºÖä*œyšû‘!¦²ðgšÿÿjAÛ¬Ò%fExtremes/R/0000755000176000001440000000000012251673345012455 5ustar ripleyusersfExtremes/R/GpdSim.R0000644000176000001440000000437011370220751013755 0ustar ripleyusers # This library is free software; you can redistribute it and/or # modify it under the terms of the GNU Library General Public # License as published by the Free Software Foundation; either # version 2 of the License, or (at your option) any later version. # # This library is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Library General Public License for more details. 
#
# You should have received a copy of the GNU Library General
# Public License along with this library; if not, write to the
# Free Foundation, Inc., 59 Temple Place, Suite 330, Boston,
# MA 02111-1307 USA

# Copyrights (C)
# for this R-port:
#   1999 - 2007, Diethelm Wuertz, GPL
#   Diethelm Wuertz
#   info@rmetrics.org
#   www.rmetrics.org
# for the code accessed (or partly included) from other R-ports:
#   see R's copyright and license files
# for the code accessed (or partly included) from contributed R-ports
# and other sources
#   see Rmetrics's copyright file


################################################################################
# FUNCTION:             GPD SIMULATION:
#  gpdSim                Simulates a GPD distributed process
################################################################################


gpdSim = function(model = list(xi = 0.25, mu = 0, beta = 1), n = 1000,
    seed = NULL)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Generates random variates from a GPD distribution

    # Arguments:
    #   model - a list of model parameters, xi, mu and beta.
    #   n - an integer, the number of simulated random variates
    #   seed - an integer, the random number generator seed

    # FUNCTION:

    # Seed:
    if (is.null(seed)) seed = NA else set.seed(seed)

    # Simulate:
    ans = rgpd(n = n, xi = model$xi, mu = model$mu, beta = model$beta)

    # DW: ans = as.ts(ans)
    ans = timeSeries(ans, units = "GPD")

    # Control:
    attr(ans, "control") = data.frame(t(unlist(model)), seed = seed,
        row.names = "")

    # Return Value:
    ans
}


################################################################################

fExtremes/R/MeanExcessFit.R

# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Library General Public
# License as published by the Free Software Foundation; either
# version 2 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Library General Public License for more details.
#       - optional parameters passed to the function mePlot()

    # FUNCTION:

    # Settings:
    x = as.vector(x)
    U = mePlot(x, doplot = doplot, ...)[, 1]
    U = U[!is.na(U)]
    U = seq(min(U), max(U), length = 51)
    if (trace) print(U)

    # Fit Parameters:
    fit = nFit(x, doplot = FALSE, trace = FALSE)
    param = fit@fit$estimate

    # Compute Mean Excess Function:
    func <- function(x, u, param) { (x-u) * dnorm(x, param[1], param[2]) }
    Y = NULL
    for (u in U) {
        y1 = integrate(func, lower = u, upper = Inf, u = u,
            param = param)[[1]]
        y2 = integrate(dnorm, lower = u, upper = Inf, mean = param[1],
            sd = param[2])[[1]]
        Y = c(Y, y1/y2)
    }

    # Plot:
    if (doplot) lines(U, Y, lwd = 2)

    # Result:
    result = data.frame(threshold = U, me = Y)
    attr(result, "control") <- fit

    # Return Value:
    invisible(result)
}


# ------------------------------------------------------------------------------


ghMeanExcessFit =
function(x, doplot = TRUE, trace = TRUE, ...)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Fits mean excesses with a generalized hyperbolic density

    # Arguments:
    #   x - a univariate 'timeSeries' object
    #   doplot - a logical flag. Should a mean excess plot be displayed?
    #   ...
#       - optional parameters passed to the function mePlot()

    # FUNCTION:

    # Settings:
    x = as.vector(x)
    U = mePlot(x, doplot = doplot, ...)[, 1]
    U = U[!is.na(U)]
    U = seq(min(U), max(U), length = 51)
    if (trace) print(U)

    # Fit Parameters:
    fit = ghFit(x, doplot = FALSE, trace = FALSE)
    param = fit@fit$estimate

    # Compute Mean Excess Function:
    func <- function(x, u, param) {
        (x-u) * dgh(x, param[1], param[2], param[3], param[4], param[5]) }
    Y = NULL
    for (u in U) {
        y1 = integrate(func, lower = u, upper = Inf, u = u,
            param = param)[[1]]
        if (trace) print(c(u, y1))
        y2 = integrate(dgh, lower = u, upper = Inf, alpha = param[1],
            beta = param[2], delta = param[3], mu = param[4],
            lambda = param[5])[[1]]
        if (trace) print(c(u, y2))
        Y = c(Y, y1/y2)
    }

    # Plot:
    if (doplot) lines(U, Y, lwd = 2)

    # Result:
    result = data.frame(threshold = U, me = Y)
    attr(result, "control") <- fit

    # Return Value:
    invisible(result)
}


# ------------------------------------------------------------------------------


hypMeanExcessFit =
function(x, doplot = TRUE, trace = TRUE, ...)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Fits mean excesses with a hyperbolic density

    # Arguments:
    #   x - a univariate 'timeSeries' object
    #   doplot - a logical flag. Should a mean excess plot be displayed?
    #   ...
#       - optional parameters passed to the function mePlot()

    # FUNCTION:

    # Settings:
    x = as.vector(x)
    U = mePlot(x, doplot = FALSE)[, 1]
    U = U[!is.na(U)]
    U = seq(min(U), max(U), length = 51)

    # Fit Parameters:
    fit = hypFit(x, doplot = FALSE, trace = FALSE)
    param = fit@fit$estimate

    # Compute Mean Excess Function:
    func <- function(x, u, param) {
        (x-u) * dhyp(x, param[1], param[2], param[3], param[4]) }
    Y = NULL
    for (u in U) {
        y = integrate(func, lower = u, upper = Inf, u = u,
            param = param)[[1]]
        Y = c(Y, y)
    }

    # Plot:
    if (doplot) lines(U, Y, lwd = 2)

    # Result:
    result = data.frame(threshold = U, me = Y)
    attr(result, "control") <- fit

    # Return Value:
    invisible(result)
}


# ------------------------------------------------------------------------------


nigMeanExcessFit =
function(x, doplot = TRUE, trace = TRUE, ...)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Fits mean excesses with a normal inverse Gaussian density

    # Arguments:
    #   x - a univariate 'timeSeries' object
    #   doplot - a logical flag. Should a mean excess plot be displayed?
    #   ...
#       - optional parameters passed to the function mePlot()

    # FUNCTION:

    # Settings:
    x = as.vector(x)
    U = mePlot(x, doplot = doplot, ...)[, 1]
    U = U[!is.na(U)]
    U = seq(min(U), max(U), length = 51)
    if (trace) print(U)

    # Fit Parameters:
    fit = nigFit(x, doplot = FALSE, trace = FALSE, scale = FALSE)
    param = fit@fit$estimate

    # Compute Mean Excess Function:
    func <- function(x, u, param) {
        (x-u) * dnig(x, param[1], param[2], param[3], param[4]) }
    Y = NULL
    for (u in U) {
        y1 = integrate(func, lower = u, upper = Inf, u = u,
            param = param)[[1]]
        if (trace) print(c(u, y1))
        y2 = integrate(dnig, lower = u, upper = Inf, alpha = param[1],
            beta = param[2], delta = param[3], mu = param[4])[[1]]
        if (trace) print(c(u, y2))
        Y = c(Y, y1/y2)
    }

    # Plot:
    if (doplot) lines(U, Y, lwd = 2)

    # Result:
    result = data.frame(threshold = U, me = Y)
    attr(result, "control") <- fit

    # Return Value:
    invisible(result)
}


# ------------------------------------------------------------------------------


ghtMeanExcessFit =
function(x, doplot = TRUE, trace = TRUE, ...)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Fits mean excesses with a generalized hyperbolic Student-t density

    # Arguments:
    #   x - a univariate 'timeSeries' object
    #   doplot - a logical flag. Should a mean excess plot be displayed?
    #   ...
#       - optional parameters passed to the function mePlot()

    # FUNCTION:

    # Settings:
    x = as.vector(x)
    U = mePlot(x, doplot = doplot, ...)[, 1]
    U = U[!is.na(U)]
    U = seq(min(U), max(U), length = 51)
    if (trace) print(U)

    # Fit Parameters:
    fit = ghtFit(x, doplot = FALSE, trace = FALSE, scale = FALSE)
    param = fit@fit$estimate

    # Compute Mean Excess Function:
    func <- function(x, u, param) {
        (x-u) * dght(x, param[1], param[2], param[3], param[4]) }
    Y = NULL
    for (u in U) {
        y1 = integrate(func, lower = u, upper = Inf, u = u,
            param = param)[[1]]
        if (trace) print(c(u, y1))
        y2 = integrate(dght, lower = u, upper = Inf, beta = param[1],
            delta = param[2], mu = param[3], nu = param[4])[[1]]
        if (trace) print(c(u, y2))
        Y = c(Y, y1/y2)
    }

    # Plot:
    if (doplot) lines(U, Y, lwd = 2)

    # Result:
    result = data.frame(threshold = U, me = Y)
    attr(result, "control") <- fit

    # Return Value:
    invisible(result)
}

################################################################################


fExtremes/R/ExtremesData.R

# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Library General Public
# License as published by the Free Software Foundation; either
# version 2 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Library General Public License for more details.
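Each of the fitting functions above evaluates the conditional mean excess e(u) = E[X - u | X > u] by numerical integration of a fitted density. A self-contained base-R sketch of the same computation for a normal density (the helper name `meNorm` is illustrative, not part of the package):

```r
## e(u) = E[X - u | X > u] for a N(mean, sd) density, computed with
## integrate() in the same way as the fitting functions above.
meNorm <- function(u, mean = 0, sd = 1) {
    num <- integrate(function(x) (x - u) * dnorm(x, mean, sd),
        lower = u, upper = Inf)$value
    den <- integrate(dnorm, lower = u, upper = Inf,
        mean = mean, sd = sd)$value
    num / den
}

meNorm(1)   # ~ 0.525 for a standard normal: dnorm(1)/(1 - pnorm(1)) - 1
```

The closed form for the standard normal, e(u) = dnorm(u)/(1 - pnorm(u)) - u, gives a quick check on the two `integrate()` calls.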
#
# You should have received a copy of the GNU Library General
# Public License along with this library; if not, write to the
# Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston,
# MA 02111-1307 USA

# Copyrights (C)
# for this R-port:
#   1999 - 2007, Diethelm Wuertz, GPL
#   Diethelm Wuertz
#   info@rmetrics.org
#   www.rmetrics.org
# for the code accessed (or partly included) from other R-ports:
#   see R's copyright and license files
# for the code accessed (or partly included) from contributed R-ports
#   and other sources
#   see Rmetrics's copyright file


################################################################################
# FUNCTION:            EXPLORATIVE DATA ANALYSIS:
#  emdPlot              Creates an empirical distribution plot
#  qqparetoPlot         Creates exploratory QQ plot for EV analysis
#  mePlot               Creates a sample mean excess plot
#  mxfPlot              Creates another view of a sample mean excess plot
#  mrlPlot              Returns a mean residual life plot with confidence levels
#  recordsPlot          Plots records development
#  ssrecordsPlot        Plots records development of data subsamples
#  msratioPlot          Plots ratio of maximums and sums
#  sllnPlot             Verifies Kolmogorov's Strong Law of Large Numbers
#  lilPlot              Verifies Hartman-Wintner's Law of the Iterated Logarithm
#  xacfPlot             Plots autocorrelations of exceedances
################################################################################


emdPlot =
function(x, doplot = TRUE, plottype = c("xy", "x", "y", " "),
    labels = TRUE, ...)
{
    # A function imported from R-package evir

    # Description:
    #   Plots empirical distribution function

    # Arguments:
    #   x - any object which can be transformed by the function
    #       as.vector() into a numeric vector
    #   doplot - a logical flag, should a plot be returned?
# plottype - which axes should be on a log scale: # "x" denotes x-axis only; "y" denotes y-axis only, # "xy" || "yx" both axes, "" denotes neither of the # axis # FUNCTION: # Convert Type: x = as.vector(x) # Settings: plottype = match.arg(plottype) # Convert x to a vector, if the input is a data.frame. if (is.data.frame(x)) x = x[, 1] xs = x = sort(as.numeric(x)) ys = y = 1 - ppoints(x) if (plottype == "x") { xs = x[x > 0] ys = y[x > 0] } if (plottype == "y") { xs = x[y > 0] ys = y[y > 0] } if (plottype == "xy") { xs = x[x > 0 & y > 0] ys = y[x > 0 & y > 0] } # Plot: if (doplot) { if (labels) { xlab = "x" ylab = "1-F(x)" main = "Empirical Distribution" if (plottype == "xy") main = paste("log-log", main) if (plottype == "x") main = paste("log-lin", main) if (plottype == "y") main = paste("lin-log", main) if (plottype == "") main = paste("lin-lin", main) } else { xlab = "" ylab = "" main = "" } if (labels) { plot(xs, ys, pch = 19, col = "steelblue", log = plottype, xlab = xlab, ylab = ylab, main = main, ...) grid() } else { plot(xs, ys, log = plottype, xlab = xlab, ylab = ylab, main = main, ...) } } # Result: result = data.frame(x, y) # Return Value: if (doplot) return(invisible(result)) else return(result) } # ------------------------------------------------------------------------------ qqparetoPlot = function(x, xi = 0, trim = NULL, threshold = NULL, doplot = TRUE, labels = TRUE, ...) { # A function imported from R-package evir # Description: # Creates an exploratory QQ-plot for Extreme Value Analysis. # Arguments: # x - any object which can be transformed by the function # as.vector() into a numeric vector # doplot - a logical flag, should a plot be returned ? # FUNCTION: # Convert Type: x = as.vector(x) # Convert x to a vector, if the input is a data.frame. 
if(is.data.frame(x)) x = x[, 1] # qPlot: x = as.numeric(x) if (!is.null(threshold)) x = x[x >= threshold] if (!is.null(trim)) x = x[x < trim] if (xi == 0) { y = qexp(ppoints(x)) } if( xi != 0) { y = qgpd(ppoints(x), xi = xi) } # Plot: if (doplot) { if (labels) { xlab = "Ordered Data" ylab = "Quantiles" if (xi == 0) { ylab = paste("Exponential", ylab) } if (xi != 0) { ylab = paste("GPD(xi=", xi, ") ", ylab, sep = "") } main = "Exploratory QQ Plot" } else { xlab = "" ylab = "" main = "" } z = sort(x) plot(z, y, pch = 19, col = "steelblue", xlab = xlab, ylab = ylab, main = main, ...) rug(z, ticksize = 0.01, side = 3) rug(y, ticksize = 0.01, side = 4) abline(lsfit(z, y)) if (labels) { grid() text = paste("xi =", as.character(round(xi, 3))) mtext(text, side = 4, adj = 0, cex = 0.7) } } # Result: result = data.frame(x = sort(x), y) # Return Value: if (doplot) return(invisible(result)) else return(result) } # ------------------------------------------------------------------------------ mxfPlot = function (x, u = quantile(x, 0.05), doplot = TRUE, labels = TRUE, ...) { # A function written by Diethelm Wuertz # Description: # Creates a simple mean excess function plot. # Arguments: # FUNCTION: # Convert Type: x = as.vector(x) # Convert x to a vector, if the input is a data.frame. if(is.data.frame(x)) x = x[, 1] # mxf: tail = length(x[x < u])/length(x) u = rev(sort(x)) n = length(x) u = u[1:floor(tail*n)] n = length(u) e = (cumsum(u)-(1:n)*u)/(1:n) # Plot if (doplot) { if (labels) { xlab = "Threshold: u" ylab = "Mean Excess: e" main = "Mean Excess Function" } else { main = xlab = ylab = "" } plot (u, e, pch = 19, col = "steelblue", xlab = xlab, ylab = ylab, main = main, ...) 
        if (labels) grid()
    }

    # Result:
    result = data.frame(threshold = u, excess = e)

    # Return Values:
    if (doplot) return(invisible(result)) else return(result)
}


# ------------------------------------------------------------------------------


mrlPlot =
function(x, ci = 0.95, umin = mean(x), umax = max(x), nint = 100,
    doplot = TRUE, plottype = c("autoscale", ""), labels = TRUE, ...)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Create a mean residual life plot with
    #   confidence intervals.

    # Arguments:

    # References:
    #   A function originally written by S. Coles

    # FUNCTION:

    # Convert Type:
    x = as.vector(x)

    # Settings:
    plottype = plottype[1]

    # Convert x to a vector, if the input is a data.frame.
    if (is.data.frame(x)) x = x[, 1]
    sx = xu = xl = rep(NA, nint)
    u = seq(umin, umax, length = nint)
    for (i in 1:nint) {
        x = x[x >= u[i]]
        sx[i] = mean(x - u[i])
        sdev = sqrt(var(x))
        n = length(x)
        xu[i] = sx[i] + (qnorm((1 + ci)/2) * sdev) / sqrt(n)
        xl[i] = sx[i] - (qnorm((1 + ci)/2) * sdev) / sqrt(n)
    }

    # Plot:
    if (doplot) {
        if (labels) {
            xlab = "Threshold: u"
            ylab = "Mean Excess: e"
            main = "Mean Residual Life Plot"
        } else {
            main = xlab = ylab = ""
        }
        if (plottype == "autoscale") {
            ylim = c(min(xl[!is.na(xl)]), max(xu[!is.na(xu)]))
            plot(u, sx, type = "o", pch = 19, col = "steelblue",
                xlab = xlab, ylab = ylab, ylim = ylim, main = main, ...)
        } else {
            plot(u[!is.na(xl)], sx[!is.na(xl)], type = "o", pch = 19,
                col = "steelblue", xlab = xlab, ylab = ylab,
                main = main, ...)
        }
        lines(u[!is.na(xl)], xl[!is.na(xl)], col = "brown")
        lines(u[!is.na(xu)], xu[!is.na(xu)], col = "brown")
        if (labels) {
            grid()
            text = paste("ci =", as.character(round(ci, 3)))
            mtext(text, side = 4, adj = 0, cex = 0.7)
        }
    }

    # Result:
    result = data.frame(threshold = u, mrl = sx)

    # Return Value:
    if (doplot) return(invisible(result)) else return(result)
}


# ------------------------------------------------------------------------------


mePlot =
function(x, doplot = TRUE, labels = TRUE, ...)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Create a Mean Excess Plot

    # Arguments:
    #   x - a univariate time series object or any other object which
    #       can be transformed by the function as.vector() into a numeric
    #       vector.
    #   doplot - a logical flag, should a plot be created?
    #   labels - a logical flag, should the plot be automatically labeled?
    #       If TRUE, then default values to xlab, ylab, main, pch and col
    #       are assigned.

    # Reference:
    #   A function imported from R-package evir

    # FUNCTION:

    # Convert Type:
    x = as.vector(x)

    # Settings:
    omit = 0

    # Internal Function:
    myrank = function(x, na.last = TRUE) {
        ranks = sort.list(sort.list(x, na.last = na.last))
        if (is.na(na.last)) x = x[!is.na(x)]
        for (i in unique(x[duplicated(x)])) {
            which = x == i & !is.na(x)
            ranks[which] = max(ranks[which])
        }
        ranks
    }

    # Convert x to a vector, if the input is a data.frame.
    if (is.data.frame(x)) x = x[, 1]
    x = as.numeric(x)
    n = length(x)
    x = sort(x)
    n.excess = unique(floor(length(x) - myrank(x)))
    points = unique(x)
    nl = length(points)
    n.excess = n.excess[-nl]
    points = points[-nl]
    excess = cumsum(rev(x))[n.excess] - n.excess * points
    y = excess/n.excess
    xx = points[1:(nl-omit)]
    yy = y[1:(nl-omit)]

    # Plot:
    if (doplot) {
        if (labels) {
            xlab = "Threshold: u"
            ylab = "Mean Excess: e"
            main = "Mean Excess Plot"
            plot(xx, yy, pch = 19, col = "steelblue",
                xlab = xlab, ylab = ylab, main = main, ...)
            grid()
        } else {
            plot(xx, yy, ...)
        }
    }

    # Results:
    result = data.frame(threshold = xx, me = yy)

    # Return Value:
    if (doplot) return(invisible(result)) else return(result)
}


# ------------------------------------------------------------------------------


recordsPlot =
function(x, ci = 0.95, doplot = TRUE, labels = TRUE, ...)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Creates a records plot.
# Note: # A function imported from R-package evir, # original name in EVIR: records # FUNCTION: # Convert Type: x = as.vector(x) # Settings: conf.level = ci # Convert x to a vector, if the input is a data.frame. if (is.data.frame(x)) x = x[,1] # Records: record = cummax(x) expected = cumsum(1/(1:length(x))) se = sqrt(expected - cumsum(1/((1:length(x))^2))) trial = (1:length(x))[!duplicated(record)] record = unique(record) number = 1:length(record) expected = expected[trial] se = se[trial] # Plot: if (doplot) { if (labels) { xlab = "Trials" ylab = "Records" main = "Plot of Record Development" } else { xlab = "" ylab = "" main = "" } ci = qnorm(0.5 + conf.level/2) upper = expected + ci * se lower = expected - ci * se lower[lower < 1] = 1 yr = range(upper, lower, number) plot(trial, number, log = "x", ylim = yr, pch = 19, col = "steelblue", xlab = xlab, ylab = ylab, main = main, ...) lines(trial, expected) lines(trial, upper, lty = 2, col = "brown") lines(trial, lower, lty = 2, col = "brown") if (labels) { grid() text = paste("ci =", as.character(conf.level)) mtext(text, side = 4, adj = 0, col = "grey", cex = 0.7) } } # Result: result = data.frame(number, record, trial, expected, se) # Return Value: if (doplot) return(invisible(result)) else return(result) } # ------------------------------------------------------------------------------ ssrecordsPlot = function (x, subsamples = 10, doplot = TRUE, plottype = c("lin", "log"), labels = TRUE, ...) { # A function implemented by Diethelm Wuertz # Description: # Creates a plot of records on subsamples. 
# note: # Changes: # 2003/09/06 - argument list made consistent # FUNCTION: # Convert Type: x = as.vector(x) # Plot type: plottype = match.arg(plottype) # Labels: xlab = ylab = main = "" # Records: save = x cluster = floor(length(save)/subsamples) records = c() for (i in 1:subsamples) { x = save[((i-1)*cluster+1):(i*cluster)] y = 1:length(x) u = x[1] v = x.records = 1 while (!is.na(v)) { u = x[x > u][1] v = y[x > u][1] if(!is.na(v)) x.records = c(x.records, v) } if (i == 1) { nc = 1:length(x) csmean = cumsum(1/nc) cssd = sqrt(cumsum(1/nc-1/(nc*nc))) ymax = csmean[length(x)] + 2*cssd[length(x)] # Plot: if (doplot) { if (plottype == "log") { nc = log(nc) } if (labels) { if (plottype == "lin") xlab = "n" if (plottype == "log") xlab = "log(n)" ylab = "N(n)" main = "Subsample Records Plot" } plot (nc, csmean+cssd, type = "l", ylim = c(0, ymax), lty = 2, col = "brown", xlab = xlab, ylab = ylab, main = main, ...) lines(nc, csmean) lines(nc, csmean-cssd, lty = 2, col = "brown") if (labels) { grid() text = paste("subsamples =", as.character(subsamples)) mtext(text, side = 4, adj = 0, col = "grey", cex = 0.7) } } } y.records = 1:length(x.records) x.records = x.records[y.records < ymax] if (doplot) { if (plottype == "log") { x.records = log(x.records) } points(x.records, y.records[y.records u] Distances = diff((1:length(x))[x > u]) # Plot: if (doplot) { if (which == "all" | which == "1") plot (Heights, type = "h", xlab = xlab[1], ylab = ylab[1], main = main[1], ...) if (which == "all" | which == "2") plot (Distances, type = "h", xlab = xlab[1], ylab = ylab[2], main = main[2], ...) 
    }

    # Correlations:
    if (which == "all" | which == "3")
        Heights = as.vector(acf(Heights, lag.max = lag.max, plot = doplot,
            xlab = xlab[2], ylab = ylab[3], main = main[3], ...)$acf)
    if (which == "all" | which == "4")
        Distances = as.vector(acf(Distances, lag.max = lag.max, plot = doplot,
            xlab = xlab[2], ylab = ylab[3], main = main[4], ...)$acf)

    # Result:
    if (which == "all") {
        lag = as.vector(0:(lag.max))
        result = data.frame(lag, Heights, Distances)
    } else {
        result = NULL
    }

    # Return Value:
    if (doplot) return(invisible(result)) else return(result)
}

################################################################################


fExtremes/R/GevRisk.R

# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Library General Public
# License as published by the Free Software Foundation; either
# version 2 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Library General Public License for more details.
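The return-level machinery in this file starts from the (1 - 1/kBlocks) quantile of the fitted GEV distribution. That quantile has a closed form, so it can be sketched in base R without the package's `qgev()` (the helper name `gevReturnLevel` is illustrative only):

```r
## GEV quantile at p = 1 - 1/kBlocks:
##   q(p) = mu + beta/xi * ((-log p)^(-xi) - 1)   for xi != 0
##   q(p) = mu - beta * log(-log p)               for xi  = 0  (Gumbel)
gevReturnLevel <- function(kBlocks, xi, mu, beta) {
    p <- 1 - 1/kBlocks
    if (xi == 0) mu - beta * log(-log(p))
    else mu + (beta/xi) * ((-log(p))^(-xi) - 1)
}

gevReturnLevel(20, xi = 0.1, mu = 0, beta = 1)
```

With annual block maxima, `kBlocks = 20` gives the level exceeded on average once every 20 years.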
#
# You should have received a copy of the GNU Library General
# Public License along with this library; if not, write to the
# Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston,
# MA 02111-1307 USA

# Copyrights (C)
# for this R-port:
#   1999 - 2007, Diethelm Wuertz, GPL
#   Diethelm Wuertz
#   info@rmetrics.org
#   www.rmetrics.org
# for the code accessed (or partly included) from other R-ports:
#   see R's copyright and license files
# for the code accessed (or partly included) from contributed R-ports
#   and other sources
#   see Rmetrics's copyright file


################################################################################
# FUNCTION:            ADDITIONAL FUNCTIONS:
#  gevrlevelPlot        Calculates Return Levels Based on GEV Fit
#  .gevrlevelLLH        Computes log-likelihood function for gevrlevelPlot
################################################################################


gevrlevelPlot =
function(object, kBlocks = 20, ci = c(0.90, 0.95, 0.99),
    plottype = c("plot", "add"), labels = TRUE, ...)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Calculates Return Levels Based on GEV Fit

    # Arguments:
    #   object - an object of class "fGEVFIT" as returned by the
    #       function gevFit().
# kBlocks - specifies the particular return level to be # estimated; default set arbitrarily to 20 # Note: # Partial copy from R package evir # Examples: # ans = gevFit(gevSim(), type = "mle", gumbel = FALSE) # ans = gevrlevelPlot(ans); ans@fit$rlevel # ans = gevFit(.gumbelSim(), type = "mle", gumbel = TRUE) # ans = gevrlevelPlot(ans); ans@fit$rlevel # # BMW annual (12 month) Return Level: # ans = gevFit(as.timeSeries(data(bmwRet)), "m"); gevrlevelPlot(ans, 12) # FUNCTION: # Check: stopifnot(object@method[1] == "gev") stopifnot(object@method[2] == "mle") stopifnot(kBlocks > 1) stopifnot(max(ci) < 1) stopifnot(min(ci) > 0) # Settings: out = object@fit conf = ci[1] plottype = plottype[1] # Data: par.ests = out$par.ests mu = par.ests["mu"] beta = par.ests["beta"] xi = par.ests["xi"] pp = 1/kBlocks v = qgev((1 - pp), xi, mu, beta) if (plottype[1] == "add") abline(h = v) data = out$data overallmax = out$llh # DW: out$nllh.final beta0 = sqrt(6 * var(data))/pi xi0 = 0.01 theta = c(xi0, beta0) # Return Levels: parmax = NULL rl = v * c(0.5, 0.6, 0.7, 0.8, 0.85, 0.9, 0.95, 1, 1.1, 1.2, 1.25, 1.5, 1.75, 2, 2.25, 2.5, 2.75, 3, 3.25, 3.5, 4.5) for (i in 1:length(rl)) { fit = optim(theta, .gevrlevelLLH, tmp = data, pp = pp, rli = rl[i]) parmax = rbind(parmax, fit$value) } parmax = -parmax overallmax = -overallmax crit = overallmax - qchisq(0.9999, 1)/2 cond = parmax > crit rl = rl[cond] parmax = parmax[cond] smth = spline(rl, parmax, n = 200) aalpha = qchisq(conf[1], 1) # Labels: if (labels) { main = paste(kBlocks, "Blocks Return Level") xlab = "rl" ylab = "parmax" } else { main = xlab = ylab = "" } # Plot ? if (plottype[1] == "plot") { plot(rl, parmax, type = "p", pch = 19, col = "steelblue", main = main, xlab = xlab, ylab = ylab, ...) h = overallmax - aalpha/2 abline(h = h, lty = 3, col = "brown") abline(v = v, lty = 3, col = "brown") lines(smth, ...) 
        if (labels) {
            ciText = paste(as.character(100*conf[1]), "%", sep = "")
            span = 0.025*abs(max(parmax)-min(parmax))
            text(max(rl), h+span, ciText)
        }
        if (length(ci) > 1) {
            for (i in 2:length(ci)) {
                gevrlevelPlot(object = object, kBlocks = kBlocks,
                    ci = ci[i], plottype = c("nextconf"),
                    labels = labels, ...)
            }
        }
    }

    # Internally used to add further confidence level lines ...
    if (plottype[1] == "nextconf") {
        h = overallmax - aalpha/2
        abline(h = h, lty = 3, col = "brown")
        abline(v = v, lty = 3, col = "brown")
        lines(smth, ...)
        if (labels) {
            ciText = paste(as.character(100*conf[1]), "%", sep = "")
            span = 0.025*abs(max(parmax)-min(parmax))
            text(max(rl), h+span, ciText)
        }
    }

    # Or Add ?
    ind = smth$y > overallmax - aalpha/2
    ci = range(smth$x[ind])
    if (plottype[1] == "add") {
        abline(v = ci[1], lty = 2, col = "orange")
        abline(v = ci[2], lty = 2, col = "orange")
    }

    # Result:
    ans = as.numeric(c(ci[1], v, ci[2]))
    ans = data.frame(cbind(min = ans[1], v = ans[2], max = ans[3],
        kBlocks = kBlocks), row.names = "GEV Return Level")

    # Return Value:
    ans
}


# ------------------------------------------------------------------------------


.gevrlevelLLH =
function(theta, tmp, pp, rli)
{
    # A copy from evir

    # Description:
    #   Computes log-likelihood function for gevrlevelPlot

    # Arguments:

    # FUNCTION:

    # LLH:
    mu = rli + (theta[2]*(1-(-log(1-pp))^(-theta[1])))/theta[1]
    y = 1 + (theta[1]*(tmp-mu))/theta[2]
    if ((theta[2] < 0) | (min(y) < 0)) {
        ans = NA
    } else {
        term1 = length(tmp) * log(theta[2])
        term2 = sum((1 + 1/theta[1]) * log(y))
        term3 = sum(y^(-1/theta[1]))
        ans = term1 + term2 + term3
    }

    # Return Value:
    ans
}

################################################################################


fExtremes/R/GpdFit.R

# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Library General Public
# License as published by the Free Software Foundation; either
# version 2 of the License, or (at your
option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU Library General
# Public License along with this library; if not, write to the
# Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston,
# MA 02111-1307 USA

# Copyrights (C)
# for this R-port:
#   1999 - 2007, Diethelm Wuertz, GPL
#   Diethelm Wuertz
#   info@rmetrics.org
#   www.rmetrics.org
# for the code accessed (or partly included) from other R-ports:
#   see R's copyright and license files
# for the code accessed (or partly included) from contributed R-ports
#   and other sources
#   see Rmetrics's copyright file


################################################################################
# FUNCTION:            GPD SIMULATION:
#  gpdSim               Simulates a GPD distributed process
# FUNCTION:            GPD PARAMETER ESTIMATION:
#  'fGPDFIT'            S4 class representation
#  gpdFit               Fits Parameters of GPD distribution
#  .gpdpwmFit           Fits GPD with probability weighted moments
#  .gpdmleFit           Fits GPD with max log-likelihood approach
#  .gpdLLH              Computes GPD log-likelihood function
################################################################################


gpdSim <-
function(model = list(xi = 0.25, mu = 0, beta = 1), n = 1000, seed = NULL)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Generates random variates from a GPD distribution

    # FUNCTION:

    # Seed:
    if (is.null(seed)) seed = NA else set.seed(seed)

    # Simulate:
    ans = rgpd(n = n, xi = model$xi, mu = model$mu, beta = model$beta)
    ans = as.ts(ans)

    # Control:
    attr(ans, "control") = data.frame(t(unlist(model)), seed = seed,
        row.names = "")

    # Return Value:
    ans
}

################################################################################


setClass("fGPDFIT",
    representation(
        call = "call",
        method = "character",
        parameter =
"list", data = "list", fit = "list", residuals = "numeric", title = "character", description = "character" ) ) # ------------------------------------------------------------------------------ gpdFit <- function(x, u = quantile(x, 0.95), type = c("mle", "pwm"), information = c("observed", "expected"), title = NULL, description = NULL, ...) { # A function implemented by Diethelm Wuertz # Description: # Fits a generalized Pareto model to excesses # Arguments: # Details: # Returns an object of class "fGPDFIT" representing the fit of a # generalized Pareto model to excesses over a high threshold # Notes: # This is a wrapper to EVIR's 'gpd' function. # FUNCTION: # Settings: call = match.call() type = match.arg(type) information = match.arg(information) # Check Type and Convert: X = x xClass = class(x) if (xClass == "timeSeries") stopifnot(isUnivariate(x)) x = as.vector(x) N = length(x) # Compute Exceedances: exceedances = x[x > u] Names = as.character((1:N)[x > u]) exceedances = as.vector(exceedances) names(exceedances) = Names # Estimate Parameters: if (type == "mle") { fit = .gpdmleFit(x, u, information) fit$llh = fit$fit$value fit$convergence = fit$fit$convergence } else if (type == "pwm") { fit = .gpdpwmFit(x, u) fit$llh = NA fit$convergence = NA } fit$p.less.thresh = fit$prob = 1 - length(x[x > u]) / length(x) fit$threshold = u fit$data = x # Compute Residuals: xi = fit$par.ests["xi"] beta = fit$par.ests["beta"] residuals = log(1 + (xi * (as.vector(exceedances)-u))/beta)/xi # Add title and description: if (is.null(title)) title = "GPD Parameter Estimation" if (is.null(description)) description = description() # Compose Parameter List: parameter = list(u = u, type = type) if (information == "mle") parameter$information = information # Return Value: new("fGPDFIT", call = call, method = c("gpd", type), parameter = parameter, data = list(x = X, exceedances = exceedances), fit = fit, residuals = residuals, title = title, description = description) } # 
------------------------------------------------------------------------------


.gpdmleFit <-
function(x, u, information = c("observed", "expected"), ...)
{
    # A Copy from Evir

    data = x
    threshold = u
    exceedances <- data[data > threshold]
    excess <- exceedances - threshold
    Nu <- length(excess)
    xbar <- mean(excess)
    s2 <- var(excess)
    xi0 <- -0.5 * (((xbar * xbar)/s2) - 1)
    beta0 <- 0.5 * xbar * (((xbar * xbar)/s2) + 1)
    theta <- c(xi0, beta0)
    negloglik <- function(theta, tmp) {
        xi <- theta[1]
        beta <- theta[2]
        cond1 <- beta <= 0
        cond2 <- (xi <= 0) && (max(tmp) > (-beta/xi))
        if (cond1 || cond2) {
            f <- 1e+06
        } else {
            y <- logb(1 + (xi * tmp)/beta)
            y <- y/xi
            f <- length(tmp) * logb(beta) + (1 + xi) * sum(y)
        }
        f
    }
    fit <- optim(theta, negloglik, hessian = TRUE, ..., tmp = excess)
    names(fit$par) = c("xi", "beta")
    if (fit$convergence)
        warning("Optimization may not have succeeded.")
    par.ests <- fit$par
    converged <- fit$convergence
    nllh.final <- fit$value
    information <- match.arg(information)
    if (information == "observed") varcov <- solve(fit$hessian)
    if (information == "expected") {
        one <- (1 + par.ests[1])^2/Nu
        two <- (2 * (1 + par.ests[1]) * par.ests[2]^2)/Nu
        cov <- -((1 + par.ests[1]) * par.ests[2])/Nu
        varcov <- matrix(c(one, cov, cov, two), 2)
    }
    par.ses <- sqrt(diag(varcov))
    names(par.ses) = c("xi", "beta")
    list(par.ests = par.ests, par.ses = par.ses, fit = fit, varcov = varcov)
}


# ------------------------------------------------------------------------------


.gpdmleFitCheck =
function(x, u = quantile(x, 0.95),
    information = c("observed", "expected"), ...)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Fits GPD with max log-likelihood approach

    # FUNCTION:
    x = as.vector(x)
    excess = x[x > u] - u
    theta = .gpdpwmFit(x = x, u = u)$par.ests

    # Parameter Estimation:
    fit = optim(theta, .gpdLLH, hessian = TRUE, excess = excess)
    names(fit$par) = c("xi", "beta")

    # Error Estimates:
    if (information[1] == "observed") {
        varcov = solve(fit$hessian)
    }
    if (information[1] == "expected") {
        Nu = length(excess)
        one = (1 + fit$par[1])^2 / Nu
        two = (2 * (1 + fit$par[1]) * fit$par[2]^2) / Nu
        cov = -((1 + fit$par[1]) * fit$par[2]) / Nu
        varcov = matrix(c(one, cov, cov, two), 2)
    }
    par.ses = sqrt(diag(varcov))
    names(par.ses) = c("xi", "beta")

    # Return Value:
    list(par.ests = fit$par, par.ses = par.ses, fit = fit, varcov = varcov)
}


# ------------------------------------------------------------------------------


.gpdpwmFit <- function(x, u)
{
    # A Copy from Evir

    data = x
    threshold = u
    data <- as.numeric(data)
    n <- length(data)
    exceedances <- data[data > threshold]
    excess <- exceedances - threshold
    Nu <- length(excess)
    xbar <- mean(excess)
    a0 <- xbar
    gamma <- -0.35
    delta <- 0
    pvec <- ((1:Nu) + gamma)/(Nu + delta)
    a1 <- mean(sort(excess) * (1 - pvec))
    xi <- 2 - a0/(a0 - 2 * a1)
    beta <- (2 * a0 * a1)/(a0 - 2 * a1)
    par.ests <- c(xi, beta)
    names(par.ests) = c("xi", "beta")
    denom <- Nu * (1 - 2 * xi) * (3 - 2 * xi)
    if (xi > 0.5) {
        denom <- NA
        warning("Asymptotic standard errors not available for PWM when xi > 0.5.")
    }
    one <- (1 - xi) * (1 - xi + 2 * xi^2) * (2 - xi)^2
    two <- (7 - 18 * xi + 11 * xi^2 - 2 * xi^3) * beta^2
    cov <- beta * (2 - xi) * (2 - 6 * xi + 7 * xi^2 - 2 * xi^3)
    varcov <- matrix(c(one, cov, cov, two), 2)/denom
    information <- "expected"
    converged <- NA
    nllh.final <- NA
    par.ses <- sqrt(diag(varcov))
    names(par.ses) = c("xi", "beta")
    list(par.ests = par.ests, par.ses = par.ses, fit = NA, varcov = NA)
}


# ------------------------------------------------------------------------------


.gpdpwmFitCheck <- function(x, u = quantile(x, 0.95))
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Fits GPD with probability weighted moments

    # FUNCTION:

    # PWM Fit:
    x = as.vector(x)
    excess = x[x > u] - u
    Nu = length(excess)
    a0 = mean(excess)
    pvec = ((1:Nu) - 0.35)/Nu
    a1 = mean(sort(excess) * (1 - pvec))
    xi = 2 - a0/(a0 - 2 * a1)
    beta = (2 * a0 * a1)/(a0 - 2 * a1)
    par.ests = c(xi = xi, beta = beta)
    names(par.ests) = c("xi", "beta")
    par.ses = c(xi = NA, beta = NA)
    names(par.ses) = c("xi", "beta")

    # Return Value:
    list(par.ests = par.ests, par.ses = par.ses, fit = NA, varcov = NA)
}


################################################################################


.gpdLLH = function(theta, excess)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Computes GPD log-likelihood function

    # FUNCTION:

    # LLH:
    xi = theta[1]
    beta = theta[2]
    cond = (beta <= 0) || ((xi <= 0) && (max(excess) > (-beta/xi)))
    if (cond) {
        # Penalize infeasible parameters so that optim() remains stable,
        # following the 1e+06 convention used elsewhere in this package:
        func = 1e+06
    } else {
        y = log(1 + (xi*excess)/beta) / xi
        func = length(excess) * log(beta) + (1+xi)*sum(y)
    }

    # Return Value:
    func
}


################################################################################
fExtremes/R/metrics.R0000644000176000001440000000646711370220751014251 0ustar ripleyusers
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Library General Public
# License as published by the Free Software Foundation; either
# version 2 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU Library General
# Public License along with this library; if not, write to the
# Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston,
# MA 02111-1307 USA

# Copyrights (C)
# for this R-port:
#   1999 - 2007, Diethelm Wuertz, GPL
#   Diethelm Wuertz <wuertz@itp.phys.ethz.ch>
#   info@rmetrics.org
#   www.rmetrics.org
# for the code accessed (or partly included) from other R-ports:
#   see R's copyright and license files
# for the code accessed (or partly included) from contributed R-ports
# and other sources
#   see Rmetrics's copyright file


################################################################################
# FUNCTION:             METRICS:
#  .riskMetricsPlot
#  .garch11MetricsPlot
################################################################################


.riskMetricsPlot <- function(x, labels = TRUE, lambda = 0.94, ...)
{
    # A function implemented by Diethelm Wuertz

    # Description:

    # Arguments:
    #   x - a univariate timeSeries object

    # FUNCTION:

    # Check:
    stopifnot(isUnivariate(x))

    # Units:
    units = colnames(x)

    # Filter:
    riskMetrics = sqrt(emaTA(x^2, 1-lambda))

    # Plot:
    if (labels) {
        plot(riskMetrics, type = "l", col = "steelblue",
            main = paste(units, "RiskMetrics[TM]"),
            xlab = "Time", ylab = "Volatility", ...)
        abline(h = sd(riskMetrics), col = "grey")
        SD = paste("StDev =", round(sd(x), 3))
        mtext(text = SD, side = 4, adj = 0, col = "grey", cex = 0.7)
        grid()
    } else {
        plot(riskMetrics, main = "", xlab = "", ylab = "", ...)
    }

    # Return Value:
    invisible(riskMetrics)
}


# ------------------------------------------------------------------------------


.garch11MetricsPlot <- function(x, labels = TRUE, ...)
{
    # A function implemented by Diethelm Wuertz

    # Description:

    # Arguments:
    #   x - a univariate timeSeries object

    # FUNCTION:

    # Check:
    stopifnot(isUnivariate(x))

    # Units:
    units = colnames(x)

    # Filter:
    fit = garchFit(~garch(1,1), x, trace = FALSE)
    garch11 = volatility(fit)
    attr(garch11, "fit") <- fit

    # Plot:
    if (labels) {
        plot(garch11, type = "l", col = "steelblue",
            main = paste(units, "GARCH11 Volatility"),
            xlab = "Time", ylab = "Volatility", ...)
        abline(h = sd(x), col = "grey")
        SD = paste("StDev =", round(sd(x), 3))
        mtext(text = SD, side = 4, adj = 0, col = "grey", cex = 0.7)
        grid()
    } else {
        plot(garch11, main = "", xlab = "", ylab = "", ...)
    }

    # Return Value:
    invisible(garch11)
}


################################################################################
fExtremes/R/GpdPlot.R0000644000176000001440000001615311370220751014145 0ustar ripleyusers
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Library General Public
# License as published by the Free Software Foundation; either
# version 2 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU Library General
# Public License along with this library; if not, write to the
# Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston,
# MA 02111-1307 USA

# Copyrights (C)
# for this R-port:
#   1999 - 2007, Diethelm Wuertz, GPL
#   Diethelm Wuertz <wuertz@itp.phys.ethz.ch>
#   info@rmetrics.org
#   www.rmetrics.org
# for the code accessed (or partly included) from other R-ports:
#   see R's copyright and license files
# for the code accessed (or partly included) from contributed R-ports
# and other sources
#   see Rmetrics's copyright file


################################################################################
# METHODS:              PRINT, PLOT, AND SUMMARY:
#  plot.fGPDFIT          S3 Plot Method for object of class "fGPDFIT"
#  .gpd1Plot             Empirical Distribution Plot
#  .gpd2Plot             Tail of Underlying Distribution
#  .gpd3Plot             Scatterplot of GPD Residuals
#  .gpd4Plot             Quantile-Quantile Plot of GPD Residuals
################################################################################


plot.fGPDFIT = function(x, which = "ask", ...)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Plot method for objects of class 'fGPDFIT'

    # Example:
    #   x = as.timeSeries(danishClaims); plot(gpdFit(x, 4), "ask")

    # FUNCTION:

    # Plot:
    interactivePlot(
        x = x,
        choices = c(
            "Excess Distribution",
            "Tail of Underlying Distribution",
            "Scatterplot of Residuals",
            "QQ-Plot of Residuals"),
        plotFUN = c(
            ".gpd1Plot",
            ".gpd2Plot",
            ".gpd3Plot",
            ".gpd4Plot"),
        which = which)

    # Return Value:
    invisible(x)
}


# ------------------------------------------------------------------------------


.gpd1Plot = function(x, labels = TRUE, ...)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Empirical Distribution Plot

    # Arguments:
    #   x - an object of class fGPDFIT
    #   labels - a logical flag. Should labels be printed?
    # FUNCTION:

    # Data:
    extend = 1.5
    u = x@parameter$u
    data = as.vector(x@data$exceedances)
    sorted = sort(data)
    shape = xi = x@fit$par.ests["xi"]
    scale = beta = x@fit$par.ests["beta"]
    ypoints = ppoints(sorted)
    U = max(sorted)*extend
    z = qgpd(seq(0, 1, length = 1000), xi, u, beta)
    z = pmax(pmin(z, U), u)
    y = pgpd(z, xi, u, beta)

    # Labels:
    if (labels) {
        xlab = "Fu(x-u)"
        ylab = "x [log Scale]"
        main = "Excess Distribution"
    } else {
        xlab = ylab = main = ""
    }

    # Plot:
    plot(x = sorted, y = ypoints,
        xlim = range(u, U), ylim = range(ypoints, y, na.rm = TRUE),
        main = main, xlab = xlab, ylab = ylab,
        log = "x", axes = TRUE,
        col = "steelblue", pch = 19, ...)
    lines(z[y >= 0], y[y >= 0], col = "brown")

    # Addon:
    if (labels) {
        u = signif(x@parameter$u, 3)
        text = paste("u =", u)
        mtext(text, side = 4, adj = 0, cex = 0.7)
        grid()
    }

    # Return Value:
    invisible()
}


# ------------------------------------------------------------------------------


.gpd2Plot = function(x, labels = TRUE, ...)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Tail of Underlying Distribution

    # Arguments:
    #   x - an object of class fGPDFIT
    #   labels - a logical flag. Should labels be printed?

    # FUNCTION:

    # Settings:
    extend = 1.5
    u = x@parameter$u
    data = as.vector(x@data$x)
    sorted = sort(data[data > u])
    prob = x@fit$prob
    shape = xi = x@fit$par.ests["xi"]
    beta = x@fit$par.ests["beta"]
    scale = beta * (1-prob)^xi
    location = u - (scale*((1 - prob)^(-xi)-1))/xi

    # Labels:
    if (labels) {
        xlab = "x [log scale]"
        ylab = "1-F(x) [log scale]"
        main = "Tail of Underlying Distribution"
    } else {
        xlab = ylab = main = ""
    }

    # Plot:
    U = max(data) * extend
    ypoints = ppoints(sorted)
    ypoints = (1 - prob) * (1 - ypoints)
    z = qgpd(seq(0, 1, length = 1000), xi, u, beta)
    z = pmax(pmin(z, U), u)
    y = pgpd(z, xi, u, beta)
    y = (1 - prob) * (1 - y)
    plot(x = sorted, y = ypoints,
        xlim = range(u, U), ylim = range(ypoints, y[y>0], na.rm = TRUE),
        main = main, xlab = xlab, ylab = ylab,
        log = "xy", axes = TRUE,
        col = "steelblue", pch = 19, ...)
    # Line:
    lines(z[y >= 0], y[y >= 0], col = "brown")
    if (labels) grid()

    # Return Value:
    invisible(list(x = sorted, y = ypoints))
}


# ------------------------------------------------------------------------------


.gpd3Plot = function(x, labels = TRUE, ...)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Scatterplot of GPD Residuals

    # Arguments:
    #   x - an object of class fGPDFIT
    #   labels - a logical flag. Should labels be printed?

    # FUNCTION:

    # Residuals:
    residuals = x@residuals

    # Labels:
    if (labels) {
        ylab = "Residuals"
        xlab = "Ordering"
        main = "Scatterplot of Residuals"
    } else {
        xlab = ylab = main = ""
    }

    # Plot:
    plot(residuals, main = main, ylab = ylab, xlab = xlab,
        col = "steelblue", pch = 19, ...)
    lines(lowess(1:length(residuals), residuals), col = "brown")
    if (labels) grid()

    # Return Value:
    invisible(list(x = 1:(length(residuals)), y = residuals))
}


# ------------------------------------------------------------------------------


.gpd4Plot = function(x, labels = TRUE, ...)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Quantile-Quantile Plot of GPD Residuals

    # Arguments:
    #   x - an object of class fGPDFIT
    #   labels - a logical flag. Should labels be printed?

    # FUNCTION:

    # Data:
    data = x@residuals
    sorted = sort(data)

    # Labels:
    if (labels) {
        xlab = "Ordered Data"
        ylab = "Exponential Quantiles"
        main = "QQ-Plot of Residuals"
    } else {
        xlab = ylab = main = ""
    }

    # Plot:
    y = qexp(ppoints(data))
    plot(x = sorted, y = y,
        main = main, xlab = xlab, ylab = ylab,
        col = "steelblue", pch = 19, ...)
    abline(lsfit(sorted, y), col = "brown")
    if (labels) grid()

    # Return Value:
    invisible(list(x = sorted, y = y))
}


################################################################################
fExtremes/R/GpdRisk.R0000644000176000001440000007770612157313044014154 0ustar ripleyusers
# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Library General Public
# License as published by the Free Software Foundation; either
# version 2 of the License, or (at your option) any later version.
#
# This library is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Library General Public License for more details.
#
# You should have received a copy of the GNU Library General
# Public License along with this library; if not, write to the
# Free Software Foundation, Inc., 59 Temple Place, Suite 330, Boston,
# MA 02111-1307 USA

# Copyrights (C)
# for this R-port:
#   1999 - 2007, Diethelm Wuertz, GPL
#   Diethelm Wuertz <wuertz@itp.phys.ethz.ch>
#   info@rmetrics.org
#   www.rmetrics.org
# for the code accessed (or partly included) from other R-ports:
#   see R's copyright and license files
# for the code accessed (or partly included) from contributed R-ports
# and other sources
#   see Rmetrics's copyright file


################################################################################
# FUNCTION:             ADDITIONAL PLOTS:
#  gpdTailPlot           Plots Tail Estimate From GPD Model
#  gpdQPlot              Adds Quantile Estimates to gpdTailPlot()
#  gpdSfallPlot          Adds Expected Shortfall Estimates to a GPD Plot
#  gpdQuantPlot          Plots of GPD Tail Estimate of a High Quantile
#  gpdShapePlot          Plots for GPD Shape Parameter
#  gpdRiskMeasures       Calculates Quantiles and Expected Shortfalls
# FUNCTION:             NEW STYLE FUNCTIONS:
#  tailPlot              Plots GPD VaR and Expected Shortfall risk
#  tailSlider            Interactive view to find proper threshold value
#  tailRisk              Calculates VaR and Expected Shortfall risks
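The plotting and risk functions below all evaluate the same two closed-form GPD tail quantities: with threshold u, shape xi, scale beta, and lambda = 1/(1 - F(u)), the p-quantile is q = u + (beta/xi)*((lambda*(1-p))^(-xi) - 1), and for xi < 1 the expected shortfall is es = (q + beta - xi*u)/(1 - xi). A minimal standalone sketch of these formulas in plain R, using illustrative parameter values that are not taken from any fit in this file:

```r
# Hypothetical GPD tail parameters, for illustration only:
u      = 10        # threshold
xi     = 0.5       # shape
beta   = 7         # scale
lambda = 1/0.05    # 1/(1 - F(u)): one in twenty observations exceeds u

p  = 0.99
q  = u + (beta/xi) * ((lambda * (1 - p))^(-xi) - 1)   # tail quantile (VaR)
es = (q + beta - xi * u)/(1 - xi)                     # expected shortfall
```

The shortfall always exceeds the quantile; for xi >= 1 the mean excess is infinite and the shortfall formula no longer applies.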
################################################################################


gpdTailPlot = function(object, plottype = c("xy", "x", "y", ""),
    doplot = TRUE, extend = 1.5, labels = TRUE, ...)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Plots tail estimate from GPD model

    # Arguments:
    #   object - an object of class 'fGPDFIT'

    # Note:
    #   Code partly copied from R package evir

    # Example:
    #   object = gpdFit(as.timeSeries(data(danishClaims)), u = 10)
    #   gpdTailPlot(object)

    # FUNCTION:

    # Settings:
    threshold = object@fit$threshold
    x = as.vector(object@data$x)
    data = x[x > threshold]
    xi = as.numeric(object@fit$par.ests["xi"])
    beta = as.numeric(object@fit$par.ests["beta"])

    # Points:
    plotmin = threshold
    plotmax = max(data) * max(1, extend)
    z = qgpd(seq(from = 0, to = 1, length = 501), xi, threshold, beta)
    z = pmax(pmin(z, plotmax), plotmin)
    invProb = 1 - length(data)/length(x)
    ypoints = invProb*(1-ppoints(sort(data)))
    y = invProb*(1-pgpd(z, xi, threshold, beta))

    # Parameters:
    shape = xi
    scale = beta * invProb^xi
    location = threshold - (scale*(invProb^(-xi)-1))/xi

    # Show Plot:
    if (doplot) {
        # Plot
        plot(sort(data), ypoints,
            xlim = range(plotmin, plotmax),
            ylim = range(ypoints, y, na.rm = TRUE),
            col = "steelblue", pch = 19,
            xlab = "", ylab = "",
            log = plottype[1], axes = TRUE, ...)
        lines(z[y >= 0], y[y >= 0])
        grid()
        # Labels:
        alog = plottype[1]
        if (labels) {
            xLab = "x"
            if (alog == "x" || alog == "xy" || alog == "yx")
                xLab = paste(xLab, "(on log scale)")
            yLab = "1-F(x)"
            if (alog == "xy" || alog == "yx" || alog == "y")
                yLab = paste(yLab, "(on log scale)")
            title(xlab = xLab, ylab = yLab)
            title(main = "Tail Estimate Plot")
        }
    }

    # Object:
    object@fit$n = length(x)
    object@fit$data = object@data$exceedances
    object@fit$n.exceed = length(object@fit$data)
    if (object@method[2] == "mle") {
        object@fit$method = "ml"
    } else {
        object@fit$method = ""
    }

    # Last Fit:
    lastfit = object@fit
    class(lastfit) = "gpd"

    # Result:
    ans = list(lastfit = lastfit, type = "tail", dist = "gpd",
        plotmin = plotmin, plotmax = plotmax, alog = plottype[1],
        location = location, shape = shape, scale = scale)

    # Return Value:
    invisible(ans)
}


# ------------------------------------------------------------------------------


gpdQPlot = function(x, p = 0.99, ci = 0.95, type = c("likelihood", "wald"),
    like.num = 50)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Adds Quantile Estimates to a GPD Plot

    # Arguments:
    #   x - an object of class 'gpdFit'
    #   p - the probability level

    # Example:
    #   par(mfrow=c(1,1)); x=as.timeSeries(data(danishClaims))
    #   tp = gpdTailPlot(gpdFit(x, 10)); gpdQPlot(tp)

    # FUNCTION:

    # Check Argument:
    par(new = TRUE)
    ci.p = ci
    pp = p
    ci.type = type[1]

    # A copy from evir:
    if (x$dist != "gpd")
        stop("This function is used only with GPD curves")
    if (length(pp) > 1)
        stop("One probability at a time please")
    threshold = x$lastfit$threshold
    par.ests = x$lastfit$par.ests
    xihat = par.ests["xi"]
    betahat = par.ests["beta"]
    varcov = x$lastfit$varcov
    p.less.thresh = x$lastfit$p.less.thresh
    lambda = 1
    if (x$type == "tail") lambda = 1/(1 - p.less.thresh)
    a = lambda * (1 - pp)
    gfunc = function(a, xihat) (a^(-xihat) - 1)/xihat
    gfunc.deriv = function(a, xihat)
        (-(a^(-xihat) - 1)/xihat - a^(-xihat) * logb(a))/xihat
    q = q.keep = threshold + betahat * gfunc(a, xihat)
    # if (q < x$plotmax) abline(v = q, lty = 2)
    out = as.numeric(q)
    if (ci.type == "wald") {
        if (class(x$lastfit) != "gpd")
            stop("Wald method requires model be fitted with gpd (not pot)")
        scaling = threshold
        betahat = betahat/scaling
        xivar = varcov[1, 1]
        betavar = varcov[2, 2]/(scaling^2)
        covar = varcov[1, 2]/scaling
        term1 = betavar * (gfunc(a, xihat))^2
        term2 = xivar * (betahat^2) * (gfunc.deriv(a, xihat))^2
        # Delta-method cross term: 2 * Cov(beta, xi) * betahat * g * g'
        term3 = 2 * covar * betahat * gfunc(a, xihat) * gfunc.deriv(a, xihat)
        qvar = term1 + term2 + term3
        if (qvar < 0) stop("Negative estimate of quantile variance")
        qse = scaling * sqrt(qvar)
        qq = qnorm(1 - (1 - ci.p)/2)
        upper = q + qse * qq
        lower = q - qse * qq
        abline(v = upper, lty = 2, col = 2)
        abline(v = lower, lty = 2, col = 2)
        out = as.numeric(c(lower, q, qse, upper))
        names(out) = c("Lower CI", "Estimate", "Std.Err", "Upper CI")
    }
    if (ci.type == "likelihood") {
        parloglik = function(theta, tmp, a, threshold, xpi) {
            beta = (theta * (xpi - threshold))/(a^(-theta) - 1)
            if ((beta <= 0) || ((theta <= 0) && (max(tmp) > (-beta/theta))))
                f = 1e+06
            else {
                y = logb(1 + (theta * tmp)/beta)
                y = y/theta
                f = length(tmp) * logb(beta) + (1 + theta) * sum(y)
            }
            if (is.na(f)) f = 1e+6
            f
        }
        theta = xihat
        parmax = NULL
        xp = exp(seq(from = logb(threshold), to = logb(x$plotmax),
            length = like.num))
        excess = as.numeric(x$lastfit$data - threshold)
        for (i in 1:length(xp)) {
            fit2 = optim(theta, parloglik, method = "BFGS", hessian = FALSE,
                tmp = excess, a = a, threshold = threshold, xpi = xp[i])
            parmax = rbind(parmax, fit2$value)
        }
        parmax = -parmax
        overallmax = -parloglik(xihat, excess, a, threshold, q)
        crit = overallmax - qchisq(0.999, 1)/2
        cond = parmax > crit
        xp = xp[cond]
        parmax = parmax[cond]
        par(new = TRUE)
        dolog = ""
        if (x$alog == "xy" || x$alog == "x") dolog = "x"
        plot(xp, parmax, type = "n", xlab = "", ylab = "", axes = FALSE,
            xlim = range(x$plotmin, x$plotmax),
            ylim = range(overallmax, crit), log = dolog)
        axis(4, at = overallmax - qchisq(c(0.95, 0.99), 1)/2,
            labels = c("95", "99"), tick = TRUE)
        aalpha = qchisq(ci.p, 1)
        abline(h = overallmax - aalpha/2, lty = 2, col = 2)
        cond = !is.na(xp) & !is.na(parmax)
        smth = spline(xp[cond], parmax[cond], n = 200)
        lines(smth, lty = 2, col = 2)
        ci = smth$x[smth$y > overallmax - aalpha/2]
        abline(v = q.keep, lty = 2)
        out = c(min(ci), q, max(ci))
        names(out) = c("Lower CI", "Estimate", "Upper CI")
    }

    # Return Value:
    out
}


# ------------------------------------------------------------------------------


gpdSfallPlot = function(x, p = 0.99, ci = 0.95, like.num = 50)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Adds Expected Shortfall Estimates to a GPD Plot

    # Arguments:
    #   x - an object of class 'gpdFit'
    #   p - the desired probability for expected shortfall
    #       estimate (e.g. 0.99 for the 99th percentile)
    #   ci - probability for confidence interval (must be
    #       less than 0.999)
    #   like.num - number of times to evaluate profile likelihood

    # FUNCTION:

    # Settings:
    par(new = TRUE)
    pp = p
    ci.p = ci
    object = x

    # A copy from evir:
    if (x$dist != "gpd")
        stop("This function is used only with GPD curves")
    if (length(pp) > 1)
        stop("One probability at a time please")
    threshold = x$lastfit$threshold
    par.ests = x$lastfit$par.ests
    xihat = par.ests["xi"]
    betahat = par.ests["beta"]
    varcov = x$lastfit$varcov
    p.less.thresh = x$lastfit$p.less.thresh
    lambda = 1
    # if (x$type == "tail") lambda = 1/(1 - p.less.thresh)
    a = lambda * (1 - pp)
    gfunc = function(a, xihat) (a^(-xihat) - 1) / xihat
    q = threshold + betahat * gfunc(a, xihat)
    s = s.keep = q + (betahat + xihat * (q - threshold))/(1 - xihat)
    # if (s < x$plotmax) abline(v = s, lty = 2)
    out = as.numeric(s)
    parloglik = function(theta, tmp, a, threshold, xpi) {
        beta = ((1 - theta) * (xpi - threshold)) /
            (((a^(-theta) - 1)/theta) + 1)
        if ((beta <= 0) || ((theta <= 0) && (max(tmp) > (-beta/theta)))) {
            f = 1e+06
        } else {
            y = log(1 + (theta * tmp)/beta)
            y = y/theta
            f = length(tmp) * log(beta) + (1 + theta) * sum(y)
        }
        f
    }
    theta = xihat
    parmax = NULL
    xp = exp(seq(from = log(threshold), to = log(x$plotmax),
        length = like.num))
    excess = as.numeric(x$lastfit$data - threshold)
    for (i in 1:length(xp)) {
        fit2 = optim(theta, parloglik, method = "BFGS", hessian = FALSE,
            tmp = excess, a = a, threshold = threshold, xpi = xp[i])
        parmax = rbind(parmax, fit2$value)
    }
    parmax = -parmax
    overallmax = -parloglik(xihat, excess, a, threshold, s)
    crit = overallmax - qchisq(0.999, 1)/2
    cond = parmax > crit
    xp = xp[cond]
    parmax = parmax[cond]
    dolog = ""
    if (x$alog == "xy" || x$alog == "x") dolog = "x"
    par(new = TRUE)
    plot(xp, parmax, type = "n", xlab = "", ylab = "", axes = FALSE,
        xlim = range(x$plotmin, x$plotmax),
        ylim = range(overallmax, crit), log = dolog)
    axis(4, at = overallmax - qchisq(c(0.95, 0.99), 1)/2,
        labels = c("95", "99"), tick = TRUE)
    aalpha = qchisq(ci.p, 1)
    abline(h = overallmax - aalpha/2, lty = 2, col = 2)
    cond = !is.na(xp) & !is.na(parmax)
    smth = spline(xp[cond], parmax[cond], n = 200)
    lines(smth, lty = 2, col = 2)
    ci = smth$x[smth$y > overallmax - aalpha/2]
    abline(v = s.keep, lty = 2)
    out = c(min(ci), s, max(ci))
    names(out) = c("Lower CI", "Estimate", "Upper CI")

    # Return Value:
    out
}


# ------------------------------------------------------------------------------


gpdQuantPlot = function(x, p = 0.99, ci = 0.95, models = 30, start = 15,
    end = 500, doplot = TRUE, plottype = c("normal", "reverse"),
    labels = TRUE, ...)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Plots of GPD Tail Estimate of a High Quantile

    # Example:
    #   Danish = as.timeSeries(data(danishClaims))
    #   gpdQuantPlot(Danish)

    # Note:
    #   Code partly copied from R package evir

    # FUNCTION:

    # Settings:
    data = as.vector(x)
    n = length(data)
    exceed = trunc(seq(from = min(end, n), to = start, length = models))
    p = max(p, 1 - min(exceed)/n)
    start = max(start, ceiling(length(data) * (1 - p)))

    # Internal Function:
    .quantFit = function(nex, data) {
        prob = 1 - nex/length(as.vector(data))
        fit = gpdFit(data, u = quantile(data, prob))@fit
        c(fit$threshold, fit$par.ests, fit$par.ses,
            as.vector(fit$varcov)[c(1,4,2)])
    }

    # Compute:
    mat = apply(as.matrix(exceed), 1, .quantFit, data = data)
    thresh = mat[1, ]
    xihat = mat[2, ]
    betahat = mat[3, ]
    lambda = length(data)/exceed
    a = lambda * (1 - p)
    gfunc = function(a, xihat) (a^(-xihat) - 1) / xihat
    qest = thresh + betahat * gfunc(a, xihat)
    l = u = qest
    yrange = range(qest)

    # Add Confidence Intervals:
    if (ci) {
        qq = qnorm(1 - (1 - ci)/2)
        xivar = mat[4, ]
        betavar = mat[5, ]
        covar = mat[6, ]
        scaling = thresh
        betahat = betahat/scaling
        betavar = betavar/(scaling^2)
        covar = covar/scaling
        gfunc.deriv = function(a, xihat)
            (-(a^(-xihat)-1)/xihat - a^(-xihat)*log(a))/xihat
        term1 = betavar * (gfunc(a, xihat))^2
        term2 = xivar * (betahat^2) * (gfunc.deriv(a, xihat))^2
        # Delta-method cross term: 2 * Cov(beta, xi) * betahat * g * g'
        term3 = 2 * covar * betahat * gfunc(a, xihat) * gfunc.deriv(a, xihat)
        qvar = term1 + term2 + term3
        if (min(qvar) < 0)
            stop(paste(
                "Conditioning problems lead to estimated negative",
                "quantile variance", sep = "\n"))
        qse = scaling * sqrt(qvar)
        u = qest + qse * qq
        l = qest - qse * qq
        yrange = range(qest, u, l)
    }

    # Result matrix:
    mat = rbind(thresh, qest, exceed, l, u)
    rownames(mat) = c("threshold", "qest", "exceedances", "lower", "upper")
    colnames(mat) = paste(1:dim(mat)[2])

    # Plot:
    if (doplot) {
        if (plottype[1] == "normal") {
            index = exceed
        } else if (plottype[1] == "reverse") {
            index = -exceed
        }
        plot(index, qest, ylim = yrange, type = "l",
            xlab = "", ylab = "", axes = FALSE)
        axis(1, at = index, labels = paste(exceed))
        axis(2)
        axis(3, at = index, labels = paste(format(signif(thresh, 3))))
        box()
        if (ci) {
            lines(index, l, lty = 2, col = "steelblue")
            lines(index, u, lty = 2, col = "steelblue")
        }
        if (labels) {
            title(xlab = "Exceedances",
                ylab = paste("Quantiles:", substitute(x)))
            mtext("Threshold", side = 3, line = 3)
        }
        p = round(p, 3)
        ci = round(ci, 3)
        text = paste("p =", p, "| ci =", ci, "| start =", start,
            "| end =", end)
        mtext(text, side = 4, adj = 0, cex = 0.7)
    }

    # Add Attributes:
    mat = t(mat)
    attr(mat, "control") = data.frame(
        cbind(p = p, ci = ci, start = start, end = end), row.names = "")

    # Return Value:
    invisible(mat)
}


# ------------------------------------------------------------------------------


gpdShapePlot = function(x, ci = 0.95, models = 30, start = 15, end = 500,
    doplot = TRUE, plottype = c("normal", "reverse"), labels = TRUE, ...)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Plots for GPD Shape Parameter

    # Example:

    # FUNCTION:

    # Settings:
    data = as.vector(x)
    X = trunc(seq(from = min(end, length(data)), to = start,
        length = models))

    # Internal Function:
    .shapeFit = function(nex, data) {
        prob = 1 - nex/length(as.vector(data))
        fit = gpdFit(data, u = quantile(data, prob),
            information = "expected")@fit
        c(fit$threshold, fit$par.ests[1], fit$par.ses[1])
    }

    # Result Matrix:
    mat = apply(as.matrix(X), 1, .shapeFit, data = data)
    mat = rbind(mat, X)
    rownames(mat) = c("threshold", "shape", "se", "exceedances")
    colnames(mat) = paste(1:dim(mat)[2])

    # Plot:
    if (doplot) {
        thresh = mat[1, ]
        y = mat[2, ]
        yrange = range(y)
        if (plottype[1] == "normal") {
            index = X
        } else if (plottype[1] == "reverse") {
            index = -X
        }
        if (ci) {
            sd = mat[3, ] * qnorm(1 - (1 - ci)/2)
            yrange = range(y, y + sd, y - sd)
        }
        plot(index, y, ylim = yrange, type = "l",
            xlab = "", ylab = "", axes = FALSE)
        axis(1, at = index, labels = paste(X))
        axis(2)
        axis(3, at = index, labels = paste(format(signif(thresh,
3)))
        box()
        grid()
        if (ci) {
            sd = mat[3, ] * qnorm(1 - (1 - ci)/2)
            yrange = range(y, y + sd, y - sd)
            lines(index, y + sd, lty = 2, col = "steelblue")
            lines(index, y - sd, lty = 2, col = "steelblue")
        }
        if (labels) {
            title(xlab = "Exceedances",
                ylab = paste("Shape:", substitute(x)))
            mtext("Threshold", side = 3, line = 3)
        }
        text = paste("ci =", ci, "| start =", start, "| end =", end)
        mtext(text, side = 4, adj = 0, cex = 0.7)
    }

    # Add Attributes:
    attr(mat, "control") = data.frame(
        cbind(ci = ci, start = start, end = end), row.names = "")
    mat = t(mat)

    # Return Value:
    invisible(mat)
}


# ------------------------------------------------------------------------------


gpdRiskMeasures = function(object,
    prob = c(0.99, 0.995, 0.999, 0.9995, 0.9999))
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Calculates Quantiles and Expected Shortfalls

    # Arguments:
    #   object - an object of class 'gpdFit'
    #   prob - a numeric value or vector of probability levels

    # FUNCTION:

    # Settings:
    u = object@parameter$u
    par.ests = object@fit$par.ests
    xi = par.ests["xi"]
    beta = par.ests["beta"]
    lambda = 1/(1 - object@fit$prob)

    # Quantile Risk:
    q = u + (beta * ((lambda * (1 - prob))^(-xi) - 1))/xi

    # Shortfall Risk:
    es = (q * (1 + (beta - xi * u)/q)) / (1 - xi)

    # Risk Matrix:
    ans = data.frame(p = prob, quantile = q, shortfall = es)

    # Return Value:
    ans
}


################################################################################


tailPlot <- function(object, p = 0.99, ci = 0.95, nLLH = 25, extend = 1.5,
    grid = TRUE, labels = TRUE, ...)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Plots GPD VaR and Expected Shortfall risk

    # Arguments:
    #   object - an object of class 'fGPDFIT'

    # Note:
    #   Code partly copied from R package evir

    # Example:
    #   object = gpdFit(as.timeSeries(data(danishClaims)), u = 10)
    #   tailPlot(object)

    # FUNCTION:

    # Settings:
    ci.p = ci
    pp = p
    like.num = nLLH
    threshold = object@fit$threshold
    x = as.vector(object@data$x)
    data = x[x > threshold]
    xi = as.numeric(object@fit$par.ests["xi"])
    beta = as.numeric(object@fit$par.ests["beta"])

    # Points:
    plotmin = threshold
    plotmax = max(data) * max(1, extend)
    z = qgpd(seq(from = 0, to = 1, length = 501), xi, threshold, beta)
    z = pmax(pmin(z, plotmax), plotmin)
    invProb = 1 - length(data)/length(x)
    ypoints = invProb*(1-ppoints(sort(data)))
    y = invProb*(1-pgpd(z, xi, threshold, beta))

    # Parameters:
    shape = xi
    scale = beta * invProb^xi
    location = threshold - (scale*(invProb^(-xi)-1))/xi

    # Show Plot:
    xlim = range(plotmin, plotmax)
    ylim = range(ypoints, y[y>0], na.rm = TRUE)
    plot(sort(data), ypoints,
        xlim = xlim, ylim = ylim,
        col = "steelblue", pch = 19,
        xlab = "", ylab = "",
        log = "xy", axes = TRUE, ...)
    lines(z[y >= 0], y[y >= 0])
    if (grid) grid()

    # Labels:
    alog = "xy"
    if (labels) {
        xLab = "x"
        if (alog == "x" || alog == "xy" || alog == "yx")
            xLab = paste(xLab, "(on log scale)")
        yLab = "1-F(x)"
        if (alog == "xy" || alog == "yx" || alog == "y")
            yLab = paste(yLab, "(on log scale)")
        title(xlab = xLab, ylab = yLab)
        title(main = "Tail Estimate Plot")
    }

    # Object:
    object@fit$n = length(x)
    object@fit$data = object@data$exceedances
    object@fit$n.exceed = length(object@fit$data)

    # Tail Plot:
    lastfit = object@fit
    x = list(lastfit = lastfit, type = "tail", dist = "gpd",
        plotmin = plotmin, plotmax = plotmax, alog = "xy",
        location = location, shape = shape, scale = scale)
    threshold = lastfit$threshold
    par.ests = lastfit$par.ests
    xihat = par.ests["xi"]
    betahat = par.ests["beta"]
    varcov = lastfit$varcov
    p.less.thresh = lastfit$p.less.thresh
    par(new = TRUE)

    # GPD Quantiles:
    a = 1/(1 - p.less.thresh) * (1 - pp)
    gfunc = function(a, xihat) (a^(-xihat) - 1)/xihat
    gfunc.deriv = function(a, xihat)
        (-(a^(-xihat)-1)/xihat - a^(-xihat)*logb(a))/xihat
    q = q.keep = threshold + betahat * gfunc(a, xihat)
    # if (q < x$plotmax) abline(v = q, lty = 2)
    out1 = as.numeric(q)

    # Log Likelihood:
    parloglik = function(theta, tmp, a, threshold, xpi) {
        beta = (theta * (xpi - threshold))/(a^(-theta) - 1)
        if ((beta <= 0) || ((theta <= 0) && (max(tmp) > (-beta/theta))))
            f = 1e+06
        else {
            y = logb(1 + (theta * tmp)/beta)
            y = y/theta
            f = length(tmp) * logb(beta) + (1 + theta) * sum(y)
        }
        if (is.na(f)) f = 1e+6
        f
    }

    # x Value:
    theta = xihat
    parmax = NULL
    xp = exp(seq(from = logb(threshold), to = logb(x$plotmax),
        length = like.num))

    # y Value:
    excess = as.numeric(x$lastfit$data - threshold)
    for (i in 1:length(xp)) {
        fit2 = optim(theta, parloglik, method = "BFGS", hessian = FALSE,
            tmp = excess, a = a, threshold = threshold, xpi = xp[i])
        parmax = rbind(parmax, fit2$value)
    }
    parmax = -parmax
    overallmax = -parloglik(xihat, excess, a, threshold, q)
    crit = overallmax - qchisq(0.999, 1)/2
    cond = parmax > crit
    xp = xp[cond]
    parmax = parmax[cond]

    # Plot:
    par(new = TRUE)
    plot(xp, parmax, type = "n", xlab = "", ylab = "", axes = FALSE,
        xlim = range(x$plotmin, x$plotmax),
        ylim = range(overallmax, crit), log = "x")
    axis(4, at = overallmax - qchisq(c(0.95, 0.99), 1)/2,
        labels = c("95", "99"), tick = TRUE)
    aalpha = qchisq(ci.p, 1)
    abline(h = overallmax - aalpha/2, lty = 2, col = 2)
    cond = !is.na(xp) & !is.na(parmax)
    smth = spline(xp[cond], parmax[cond], n = 200)
    lines(smth, lty = 2, col = 2)
    ci = smth$x[smth$y > overallmax - aalpha/2]
    abline(v = q.keep, lty = 2)

    # Result:
    out1 = c(min(ci), q, max(ci))
    names(out1) = c("Lower CI", "Estimate", "Upper CI")

    # GPD Shortfall:
    a = 1/(1 - p.less.thresh) * (1 - pp)
    gfunc = function(a, xihat) (a^(-xihat) - 1) / xihat
    q = threshold + betahat * gfunc(a, xihat)
    s = s.keep = q + (betahat + xihat * (q - threshold))/(1 - xihat)
    out = as.numeric(s)

    # Log Likelihood:
    parloglik = function(theta, tmp, a, threshold, xpi) {
        beta = ((1-theta)*(xpi-threshold)) / (((a^(-theta)-1)/theta)+1)
        if ((beta <= 0) || ((theta <= 0) && (max(tmp) > (-beta/theta)))) {
            f = 1e+06
        } else {
            y = log(1 + (theta * tmp)/beta)
            y = y/theta
            f = length(tmp) * log(beta) + (1 + theta) * sum(y)
        }
        f
    }

    # x Values:
    theta = xihat
    parmax = NULL
    xp = exp(seq(from = log(threshold), to = log(x$plotmax),
        length = like.num))
    excess = as.numeric(x$lastfit$data - threshold)

    # y Values:
    for (i in 1:length(xp)) {
        fit2 = optim(theta, parloglik, method = "BFGS", hessian = FALSE,
            tmp = excess, a = a, threshold = threshold, xpi = xp[i])
        parmax = rbind(parmax, fit2$value)
    }
    parmax = -parmax
    overallmax = -parloglik(xihat, excess, a, threshold, s)
    crit = overallmax - qchisq(0.999, 1)/2
    cond = parmax > crit
    xp = xp[cond]
    parmax = parmax[cond]

    # Plot:
    par(new = TRUE)
    plot(xp, parmax, type = "n", xlab = "", ylab = "", axes = FALSE,
        xlim = range(x$plotmin, x$plotmax),
        ylim = range(overallmax, crit), log = "x")
    axis(4, at = overallmax - qchisq(c(0.95, 0.99), 1)/2,
        labels = c("95", "99"), tick = TRUE)
    aalpha = qchisq(ci.p, 1)
    abline(h = overallmax - aalpha/2, lty = 2, col = 2)
    cond = !is.na(xp) & !is.na(parmax)
    smth = spline(xp[cond], parmax[cond], n = 200)
    lines(smth, lty = 2, col = 2)
    ci = smth$x[smth$y > overallmax - aalpha/2]
    abline(v = s.keep, lty = 2)

    # Result:
    out2 = c(min(ci), s, max(ci))
    names(out2) = c("Lower CI", "Estimate", "Upper CI")

    # Return Value:
    ans = list(var = out1, sfall = out2)
    invisible(ans)
}


# ------------------------------------------------------------------------------


.tailSlider.last.Quantile = NA
.tailSlider.last.nThresholds = NA
.tailSlider.param = NA
.tailSlider.conf = NA
.tailSlider.counter = NA
.tailSlider.Thresholds = NA


tailSlider = function(x)
{
    # A function implemented by Diethelm Wuertz

    # Description
    #   Interactive view to find proper threshold value

    # Arguments:
    #   x - a univariate timeSeries object or any other object which
    #       can be transformed by the function as.vector() into a
    #       numeric vector.

    # FUNCTION:

    # Transform to Vector:
    x = as.vector(x)

    # Exit: add = TRUE keeps the earlier handlers in place
    on.exit(rm(.tailSlider.last.Quantile))
    on.exit(rm(.tailSlider.last.nThresholds), add = TRUE)
    on.exit(rm(.tailSlider.param), add = TRUE)
    on.exit(rm(.tailSlider.conf), add = TRUE)
    on.exit(rm(.tailSlider.counter), add = TRUE)
    on.exit(rm(x), add = TRUE)

    # Internal Function:
    refresh.code = function(...)
{ .tailSlider.counter <<- .tailSlider.counter + 1 # Sliders: u = thresholdStart = .sliderMenu(no = 1) du = .sliderMenu(no = 2) max.x = .sliderMenu(no = 3) nThresholds = .sliderMenu(no = 4) Quantile = .sliderMenu(no = 5) pp = .sliderMenu(no = 6) if (.tailSlider.counter > 5) { # Plot data: par(mfrow = c(2, 2), cex = 0.7) # Figure 1: ans = mxfPlot(x, u = quantile(x, 1), xlim = c(min(x), max.x), labels = FALSE) grid() # Add thresholds: U = min(c(u+du, max(x))) abline(v = u, lty = 3, col = "red") abline(v = U, lty = 3, col = "red") # Fit line to mean excess within thresolds: X = as.vector(ans[, 1]) Y = as.vector(ans[, 2]) Y = Y[X > u & X < U] X = X[X > u & X < U] lineFit = lsfit(X, Y) abline(lineFit, col = "red", lty = 2) c = lineFit$coef[[1]] m = lineFit$coef[[2]] # Compute parameters xi and beta: xi = c(xi = m/(1+m)) beta = c(beta = c/(1+m)) Xi = signif(xi, 3) Beta = signif(beta, 3) # Add Title: Main = paste("Fig 1: xi = ", Xi, "| beta =", Beta) title(main = Main, xlab = "Threshold", ylab = "Mean Excess") # GPD Fit: if (.tailSlider.last.Quantile != Quantile | .tailSlider.last.nThresholds != nThresholds) { .tailSlider.param <<- NULL .tailSlider.conf <<- NULL .tailSlider.Thresholds <<- seq(quantile(x, Quantile), quantile(x, 1-Quantile), length = nThresholds) for (threshold in .tailSlider.Thresholds) { ans = gpdFit(x, threshold)@fit .tailSlider.param <<- rbind(.tailSlider.param, c(u = threshold, ans$par.ests)) .tailSlider.conf <<- rbind(.tailSlider.conf, c(u = threshold, ans$par.ses)) } .tailSlider.last.Quantile <<- Quantile .tailSlider.last.nThresholds <<- nThresholds } # Figure 2: ymax = max(c(.tailSlider.param[, 2] + .tailSlider.conf[, 2])) ymin = min(c(.tailSlider.param[, 2] - .tailSlider.conf[, 2])) plot(.tailSlider.Thresholds, .tailSlider.param[, 2], xlab = "Threshold", ylab = "xi", ylim = c(ymin, ymax), col = "steelblue", type = "l", main = "xi Estimation") grid() points(.tailSlider.Thresholds, .tailSlider.param[, 2], pch = 19, col = "steelblue") 
lines(.tailSlider.Thresholds, .tailSlider.param[, 2] + .tailSlider.conf[, 2], lty = 3) lines(.tailSlider.Thresholds, .tailSlider.param[, 2] - .tailSlider.conf[, 2], lty = 3) abline(h = xi, lty = 3, col = "red") abline(v = u, lty = 3, col = "red") abline(v = U, lty = 3, col = "red") # Figure 3: ymax = max(c(.tailSlider.param[, 3] + .tailSlider.conf[, 3])) ymin = min(c(.tailSlider.param[, 3] - .tailSlider.conf[, 3])) plot(.tailSlider.Thresholds, .tailSlider.param[, 3], xlab = "Threshold", ylab = "beta", ylim = c(ymin, ymax), col = "steelblue", type = "l", main = "beta Estimation") grid() points(.tailSlider.Thresholds, .tailSlider.param[, 3], pch = 19, col = "steelblue") lines(.tailSlider.Thresholds, .tailSlider.param[, 3] + .tailSlider.conf[, 3], lty = 3) lines(.tailSlider.Thresholds, .tailSlider.param[, 3] - .tailSlider.conf[, 3], lty = 3) abline(h = beta, lty = 3, col = "red") abline(v = u, lty = 3, col = "red") abline(v = U, lty = 3, col = "red") # Figure 4: # <<- fit = gpdFit(x, u) tailPlot(object = fit, p = pp) # Refresh Frame: par(mfrow = c(2, 2), cex = 0.7) } } # Save x globally: x <<- as.vector(x) # Slider Menu - x Series Settings: xmax = max(x) delta.x = (max(x)-min(x))/200 start.x = par()$usr[2] # Slider Menu - Threshold/Quantiles Settings: qmin = quantile(x, 0.25) qmax = quantile(x, 0.995) delta.q = (qmax-qmin)/200 start.q = (qmin+qmax)/2 # Save Globally: .tailSlider.last.Quantile <<- 0.05*(1+1e-4) .tailSlider.last.nThresholds <<- 10+1 .tailSlider.param <<- NA .tailSlider.conf <<- NA .tailSlider.counter <<- 0 # Open Slider Menu: .sliderMenu(refresh.code, names = c("1 thresholdStart", "1 thresholdInterval", "1 max(x)", "2|3 nThresholds", "2|3 Quantile", "pp"), minima = c( qmin, 0, min(x), 5, 0.005, 0.900), maxima = c( qmax, qmax, max(x), 50, 0.500, 0.999), resolutions = c( delta.q, delta.x, delta.x, 5, 0.005, 0.001), starts = c( start.q, start.x, max(x), 10, 0.050, 0.990)) } # ------------------------------------------------------------------------------ 
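# Aside (not part of the original package): tailSlider()'s Figure 1 recovers
# the GPD parameters from a least-squares line through the mean excess plot,
# using xi = m/(1+m) and beta = c/(1+m), since for a GPD tail the mean excess
# is linear in the threshold, e(t) = (beta + xi*t)/(1 - xi). A minimal,
# self-contained sketch of that back-transformation on simulated GPD data;
# all names below (xiHat, betaHat, ...) are illustrative only.

```r
set.seed(42)
xi <- 0.25; beta <- 1
u <- runif(5000)
x <- beta * ((1 - u)^(-xi) - 1) / xi       # GPD(xi, beta) via inverse transform

# Empirical mean excess e(t) = E[X - t | X > t] on a grid of thresholds:
thresholds <- quantile(x, seq(0.05, 0.90, by = 0.05))
meanExcess <- sapply(thresholds, function(t) mean(x[x > t] - t))

# Straight-line fit, then the same back-transformation as in tailSlider():
fit <- lsfit(thresholds, meanExcess)
c0 <- fit$coef[[1]]; m <- fit$coef[[2]]
xiHat   <- m / (1 + m)        # slope     m = xi/(1-xi)    =>  xi   = m/(1+m)
betaHat <- c0 / (1 + m)       # intercept c0 = beta/(1-xi) =>  beta = c0/(1+m)
```

# The estimates should be close to the simulated xi = 0.25, beta = 1; as in
# the slider code, the fit degrades if the threshold window includes too few
# exceedances.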
tailRisk = function(object, prob = c(0.99, 0.995, 0.999, 0.9995, 0.9999), ...) { # A function implemented by Diethelm Wuertz # Description: # Calculates Quantiles VaR and Expected Shortfall Risks # Arguments: # x - an object of class 'gpdFit' # prob - a numeric value or vector of probability levels # FUNCTION: # Settings: u = object@parameter$u par.ests = object@fit$par.ests xi = par.ests["xi"] beta = par.ests["beta"] lambda = 1/(1 - object@fit$prob) # Quantile Risk: q = u + (beta * ((lambda * (1 - prob))^( - xi) - 1))/xi # Shortfall Risk: es = (q * (1 + (beta - xi * u)/q)) / (1 - xi) # Risk Matrix: ans = data.frame(Prob = prob, VaR = q, ES = es) # Return Value: ans } ################################################################################ fExtremes/R/GevDistribution.R0000644000176000001440000002766411370220751015726 0ustar ripleyusers # This library is free software; you can redistribute it and/or # modify it under the terms of the GNU Library General Public # License as published by the Free Software Foundation; either # version 2 of the License, or (at your option) any later version. # # This library is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Library General Public License for more details. 
# # You should have received a copy of the GNU Library General # Public License along with this library; if not, write to the # Free Foundation, Inc., 59 Temple Place, Suite 330, Boston, # MA 02111-1307 USA # Copyrights (C) # for this R-port: # 1999 - 2009, Diethelm Wuertz, GPL # Diethelm Wuertz # info@rmetrics.org # www.rmetrics.org # for the code accessed (or partly included) from other R-ports: # see R's copyright and license files # for the code accessed (or partly included) from contributed R-ports # and other sources # see Rmetrics's copyright file ################################################################################ # FUNCTION: GEV DISTRIBUTION FAMILY: [CALLING EVD] # dgev Density for the GEV Distribution # pgev Probability for the GEV Distribution # qgev Quantiles for the GEV Distribution # rgev Random variates for the GEV Distribution # gevMoments Computes true statistics for GEV distribution # gevSlider Displays distribution and rvs for GEV distribution # FUNCTION: GEV DISTRIBUTION FAMILY: [USE FROM EVD] # .devd Density for the GEV Distribution # .pevd Probability for the GEV Distribution # .qevd Quantiles for the GEV Distribution # .revd Random variates for the GEV Distribution ################################################################################ dgev = function(x, xi = 1, mu = 0, beta = 1, log = FALSE) { # A function implemented from package evd # Description: # GEV Density Function # Note: 1 + xi*(x-mu)/beta > 0 # xi > 0 Frechet # xi = 0 Gumbel # xi < 0 weibl # FUNCTION: # Density: d = .devd(x, location = mu, scale = beta, shape = xi, log = FALSE) # Add Attribute: attr(d, "control") = data.frame(xi = xi, mu = mu, beta = beta, log = log, row.names = "") # Return Value: d } # ------------------------------------------------------------------------------ pgev = function(q, xi = 1, mu = 0, beta = 1, lower.tail = TRUE) { # A function implemented from package evd # Description: # GEV Probability Function # Note: 1 + xi*(x-mu)/beta > 0 # 
#          xi > 0  Frechet
    #          xi = 0  Gumbel
    #          xi < 0  Weibull

    # FUNCTION:

    # Probability:
    p = .pevd(q, location = mu, scale = beta, shape = xi,
        lower.tail = lower.tail)

    # Add Attribute:
    attr(p, "control") = data.frame(xi = xi, mu = mu, beta = beta,
        lower.tail = lower.tail, row.names = "")

    # Return Value:
    p
}


# ------------------------------------------------------------------------------


qgev = function(p, xi = 1, mu = 0, beta = 1, lower.tail = TRUE)
{   # A function implemented from package evd

    # Description:
    #   GEV Quantile Function
    #   Note: 1 + xi*(x-mu)/beta > 0
    #          xi > 0  Frechet
    #          xi = 0  Gumbel
    #          xi < 0  Weibull

    # FUNCTION:

    # Quantiles:
    q = .qevd(p, location = mu, scale = beta, shape = xi,
        lower.tail = lower.tail)

    # Add Attribute:
    attr(q, "control") = data.frame(xi = xi, mu = mu, beta = beta,
        lower.tail = lower.tail, row.names = "")

    # Return Value:
    q
}


# ------------------------------------------------------------------------------


rgev = function(n, xi = 1, mu = 0, beta = 1)
{   # A function implemented from package evd

    # Description:
    #   GEV Random Variables
    #   Note: 1 + xi*(x-mu)/beta > 0
    #          xi > 0  Frechet
    #          xi = 0  Gumbel
    #          xi < 0  Weibull

    # FUNCTION:

    # Random Variates:
    r = .revd(n = n, location = mu, scale = beta, shape = xi)

    # Add Attribute:
    attr(r, "control") = data.frame(xi = xi, mu = mu, beta = beta)

    # Return Value:
    r
}


# ------------------------------------------------------------------------------


gevMoments = function(xi = 0, mu = 0, beta = 1)
{   # A function implemented by Diethelm Wuertz

    # Description:
    #   Compute true statistics for Generalized Extreme Value distribution

    # Value:
    #   Returns true mean for xi < 1 and variance for xi < 1/2
    #   of the GEV distribution, otherwise NaN is returned

    # FUNCTION:

    # MEAN: Returns NaN for xi >= 1:
    g = c(1, 0, NaN)
    xinv = 1 / (xi + sign(abs(xi)) - 1)

    # For xi = 0 the result is Euler's constant:
    euler = 0.57721566490153286060651209008240243104
    xi0 = c(0, beta*euler, 0)

    # Suppress warnings for NaN's from the Gamma function:
    warn.old <- getOption("warn")
options(warn = -1) gevMean = mu + beta * xinv * (gamma(1-xi)-1) * g[sign(xi-1)+2] + xi0[(sign(xi)+2)] options(warn = warn.old) # VAR: Returns for x >= 1 NaN: g = c(1, 0, NaN) xinv = 1/ ( xi + sign(abs(xi)) - 1 ) xi0 = c(0, (beta*pi)^2 / 6, 0) # Supress warning for NaN's from Gamma Function: warn.old <- getOption("warn") options(warn = -1) gevVar = (beta*xinv)^2 * (gamma(1-2*xi) - gamma(1-xi)^2 ) * g[sign(2*xi-1)+2] + xi0[(sign(xi)+2)] options(warn = warn.old) # Result: param = c(xi = xi, mu = mu, beta = beta) ans = list(param = param, mean = gevMean, var = gevVar) # Return Value: ans } # ------------------------------------------------------------------------------ gevSlider = function(method = c("dist", "rvs")) { # A function implemented by Diethelm Wuertz # Description: # Displays distribution and rvs for GEV distribution # FUNCTION: # Settings: method = match.arg(method) # Internal Function: refresh.code = function(...) { # Sliders: N = .sliderMenu(no = 1) xi = .sliderMenu(no = 2) mu = .sliderMenu(no = 3) beta = .sliderMenu(no = 4) # Compute Data: if (xi > 0) { pmin = 0 pmax = 0.999 } else if (xi < 0) { pmin = 0.001 pmax = 1 } else if (xi == 0) { pmin = 0.001 pmax = 0.999 } xmin = round(qgev(pmin, xi, mu, beta), digits = 2) xmax = round(qgev(pmax, xi, mu, beta), digits = 2) s = seq(xmin, xmax, length = N) y1 = dgev(s, xi, mu, beta) y2 = pgev(s, xi, mu, beta) Moments = gevMoments(xi, mu, beta) Mean = round(Moments$mean, 2) Var = round(Moments$var, 2) mText = paste("Mean =", Mean, " | Variance = ", Var) main1 = paste("GEV Density\n", "xi = ", as.character(xi), " | ", "mu = ", as.character(mu), " | ", "beta = ", as.character(beta) ) main2 = paste("GEV Probability\n", "xmin [0.00] = ", as.character(xmin), " | ", "xmax [0.99] = ", as.character(xmax) ) Median = qgev(0.5, xi, mu, beta) # Frame: par(mfrow = c(2, 1), cex = 0.7) # Density: if (method == "rvs") { x = rgev(N, xi, mu, beta) hist(x, probability = TRUE, col = "steelblue", border = "white", xlim = c(xmin, 
xmax), ylim = c(0, 1.1*max(y1)), main = main1, breaks = "FD" ) lines(s, y1, col = "orange") mtext(mText, side = 4, col = "grey", cex = 0.7) } else { plot(s, y1, type = "l", xlim = c(xmin, xmax), col = "steelblue") abline(h = 0, lty = 3) abline(v = Median, lty = 3, col = "red") abline(v = Mean, lty = 3, col = "darkgreen") title(main = main1) mtext(mText, side = 4, col = "grey", cex = 0.7) } # Probability: plot(s, y2, type = "l", xlim = c(xmin, xmax), ylim = c(0, 1), col = "steelblue" ) abline(h = 0, lty = 3) abline(h = 0.5, lty = 3, col = "red") abline(v = Median, lty = 3, col = "red") abline(v = Mean, lty = 3, col = "darkgreen") title(main = main2) mtext(mText, side = 4, col = "grey", cex = 0.7) # Reset Frame: par(mfrow = c(1, 1), cex = 0.7) } # Open Slider Menu: .sliderMenu(refresh.code, names = c( "N", "xi", "mu", "beta"), minima = c( 50, -1.50, -5.00, 0.10 ), maxima = c( 1000, 1.50, +5.00, 5.00 ), resolutions = c( 50, 0.01, 0.10, 0.10 ), starts = c( 500, 0.00, 0.00, 1.00 ) ) } ################################################################################ .devd = function(x, location = 0, scale = 1, shape = 0, log = FALSE) { # A modified copy from contributed R package evd # FUNCTION: # Check: stopifnot(min(scale) > 0) stopifnot(length(shape) == 1) # Density: x = (x - location) / scale if (shape == 0) { d = log(1/scale) - x - exp(-x) } else { nn = length(x) xx = 1 + shape * x xxpos = xx[xx > 0 | is.na(xx)] scale = rep(scale, length.out = nn)[xx > 0 | is.na(xx)] d = numeric(nn) d[xx > 0 | is.na(xx)] = log(1/scale) - xxpos^(-1/shape) - (1/shape + 1) * log(xxpos) d[xx <= 0 & !is.na(xx)] = -Inf } # Log: if (!log) { d = exp(d) } # Add Attribute: attr(d, "control") = data.frame(location = location[1], scale = scale[1], shape = shape[1], log = log, row.names = "") # Return Value: d } # ------------------------------------------------------------------------------ .pevd = function(q, location = 0, scale = 1, shape = 0, lower.tail = TRUE) { # A modified copy from 
contributed R package evd # FUNCTION: # Check: stopifnot(min(scale) > 0) stopifnot(length(shape) == 1) # Probabilities: q = (q - location)/scale if (shape == 0) { p = exp(-exp(-q)) } else { p = exp(-pmax(1 + shape * q, 0)^(-1/shape)) } # Lower Tail: if (!lower.tail) { p = 1 - p } # Add Attribute: attr(p, "control") = data.frame(location = location[1], scale = scale[1], shape = shape[1], lower.tail = lower.tail, row.names = "") # Return Value: p } # ------------------------------------------------------------------------------ .qevd = function(p, location = 0, scale = 1, shape = 0, lower.tail = TRUE) { # A modified copy from contributed R package evd # FUNCTION: # Check: stopifnot(min(scale) > 0) stopifnot(length(shape) == 1) stopifnot(min(p, na.rm = TRUE) >= 0) stopifnot(max(p, na.rm = TRUE) <= 1) # Quantiles: if (!lower.tail) p = 1 - p if (shape == 0) { q = location - scale * log(-log(p)) } else { q = location + scale * ((-log(p))^(-shape) - 1)/shape } # Add Attribute: attr(q, "control") = data.frame(location = location[1], scale = scale[1], shape = shape[1], lower.tail, row.names = "") # Return Value: q } # ------------------------------------------------------------------------------ .revd = function(n, location = 0, scale = 1, shape = 0) { # A modified copy from contributed R package evd # FUNCTION: # Check: stopifnot(min(scale) > 0) stopifnot(length(shape) == 1) # Random Variates: if (shape == 0) { r = location - scale * log(rexp(n)) } else { r = location + scale * (rexp(n)^(-shape) - 1)/shape } # Add Attribute: attr(r, "control") = data.frame(location = location[1], scale = scale[1], shape = shape[1], row.names = "") # Return Value: r } ################################################################################ fExtremes/R/GpdSow.R0000644000176000001440000000441211370220751013772 0ustar ripleyusers # This library is free software; you can redistribute it and/or # modify it under the terms of the GNU Library General Public # License as published by the 
Free Software Foundation; either # version 2 of the License, or (at your option) any later version. # # This library is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Library General Public License for more details. # # You should have received a copy of the GNU Library General # Public License along with this library; if not, write to the # Free Foundation, Inc., 59 Temple Place, Suite 330, Boston, # MA 02111-1307 USA # Copyrights (C) # for this R-port: # 1999 - 2007, Diethelm Wuertz, GPL # Diethelm Wuertz # info@rmetrics.org # www.rmetrics.org # for the code accessed (or partly included) from other R-ports: # see R's copyright and license files # for the code accessed (or partly included) from contributed R-ports # and other sources # see Rmetrics's copyright file ################################################################################ # METHODS: PRINT, PLOT, AND SUMMARY: # show.fGPDFIT S4 Print Method for object of class "fGPDFIT" ################################################################################ setMethod("show", "fGPDFIT", function(object) { # A function implemented by Diethelm Wuertz # Description: # Print Method for an object of class 'gpdFit' # Arguments: # object - an object of class fGPDFIT # FUNCTION: # Title: cat("\nTitle:\n ", object@title, "\n") # Function Call: cat("\nCall:\n ") cat(paste(deparse(object@call), sep = "\n", collapse = "\n"), "\n", sep = "") # Estimation Type: cat("\nEstimation Method:\n ", object@method, "\n") # Estimated Parameters: cat("\nEstimated Parameters:\n") print(object@fit$par.ests) # Desription: cat("\nDescription\n ", object@description, "\n\n") # Return Value: invisible(object) }) # ------------------------------------------------------------------------------ ################################################################################ 
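# Aside (not part of the original package): the show() method above follows
# the standard S4 pattern of defining a class and attaching a method to the
# "show" generic, so that auto-printing produces the formatted report. A toy
# stand-alone illustration; the class name "demoFit" and its slots are
# invented for this sketch and do not belong to fExtremes.

```r
library(methods)

setClass("demoFit", representation(title = "character", fit = "list"))

setMethod("show", "demoFit", function(object) {
    # Title:
    cat("\nTitle:\n ", object@title, "\n")
    # Estimated Parameters:
    cat("\nEstimated Parameters:\n")
    print(object@fit$par.ests)
    # Return Value:
    invisible(object)
})

obj <- new("demoFit", title = "Demo GPD Fit",
           fit = list(par.ests = c(xi = 0.2, beta = 1.1)))
obj    # auto-printing dispatches to the show() method
```

# The same dispatch is what makes typing a fitted "fGPDFIT" object at the
# prompt print its title, call, and parameter estimates.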
fExtremes/R/GevPrintPlotSummary.R0000644000176000001440000002304011370220751016540 0ustar ripleyusers # This library is free software; you can redistribute it and/or # modify it under the terms of the GNU Library General Public # License as published by the Free Software Foundation; either # version 2 of the License, or (at your option) any later version. # # This library is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Library General Public License for more details. # # You should have received a copy of the GNU Library General # Public License along with this library; if not, write to the # Free Foundation, Inc., 59 Temple Place, Suite 330, Boston, # MA 02111-1307 USA # Copyrights (C) # for this R-port: # 1999 - 2007, Diethelm Wuertz, GPL # Diethelm Wuertz # info@rmetrics.org # www.rmetrics.org # for the code accessed (or partly included) from other R-ports: # see R's copyright and license files # for the code accessed (or partly included) from contributed R-ports # and other sources # see Rmetrics's copyright file ################################################################################ # METHODS: PRINT, PLOT, AND SUMMARY: # show.fGEVFIT S4 Show method for object of class "fGEVFIT" # plot.fGEVFIT S3 Plot method for object of class "fGEVFIT" # .gev1Plot Block Maxima Plot # .gev2Plot Scatterplot of Residuals # .gev3Plot Histogram of Residuals # .gev4Plot Quantile-Quantile Plot # summary.fGEVFIT S3 Summary Method for object of class "fGEVFIT" ################################################################################ setMethod("show", "fGEVFIT", function(object) { # A function implemented by Diethelm Wuertz # Description: # Print Method for an object of class "fGEVFIT". 
# Arguments: # object - an object of class fGEVFIT # FUNCTION: # Title: cat("\nTitle:\n" , object@title, "\n") # Function Call: cat("\nCall:\n ") cat(paste(deparse(object@call), sep = "\n", collapse = "\n"), "\n", sep = "") # Estimation Type: cat("\nEstimation Type:\n ", object@method, "\n") # Estimated Parameters: cat("\nEstimated Parameters:\n") print(object@fit$par.ests) # Desription: cat("\nDescription\n ", object@description, "\n\n") # Return Value: invisible(object) }) ################################################################################ plot.fGEVFIT = function(x, which = "ask", ...) { # A function implemented by Diethelm Wuertz # Description: # Plot method for an object of class "gevFit". # Arguments: # Details: # plot.gev: # Data are converted to unit exponentially distributed residuals # under null hypothesis that GEV fits. Two diagnostics for iid # exponential data are offered: # "Scatterplot of Residuals" and "QQplot of Residuals" # Example: # fit = gevFit(gevSim(), type = "mle", gumbel = FALSE) # par(mfrow = c(2, 2)); plot(fit) # par(mfrow = c(1, 1)); plot(fit, which = "ask") # # fit = gevFit(gevSim(), type = "mle", gumbel = TRUE) # par(mfrow = c(1, 1)); plot(fit, which = "ask") # # fit = gevFit(gevSim(), type = "pwm", gumbel = FALSE) # par(mfrow = c(1, 1)); plot(fit, which = "ask") # # fit = gevFit(gevSim(), type = "pwm", gumbel = TRUE) # par(mfrow = c(1, 1)); plot(fit, which = "ask") # FUNCTION: # Plot: interactivePlot( x = x, choices = c( "Block Maxima Plot", "Scatterplot of Residuals", "Histogram of Residuals", "Quantile Quantile Plot"), plotFUN = c( ".gev1Plot", ".gev2Plot", ".gev3Plot", ".gev4Plot"), which = which) # Return Value: invisible(x) } # ------------------------------------------------------------------------------ .gev1Plot = function(x, labels = TRUE, ...) 
{ # A function implemented by Diethelm Wuertz # Description: # Time Series Plot of Block Maxima # Arguments: # x - an object of class "fGEVFIT" as returned by the # function gevFit # labels - a logical, should labels be added to the plot # ... - optional arguments passed to the function plot # Example: # .gev1Plot(gevFit(gevSim())) # FUNCTION: # Data: data = x@data$blockmaxima # Labels: if (labels) { main = "Block Maxima" xlab = "Index" ylab = "Data" } else { main = xlab = ylab = "" } # Plot: plot(data, type = "h", main = main, xlab = xlab, ylab = ylab, col = "steelblue", ...) # Add Grid: if (labels) grid() # Return Value: invisible() } # ------------------------------------------------------------------------------ .gev2Plot = function(x, labels = TRUE, ...) { # A function implemented by Diethelm Wuertz # Description: # Scatterplot of Residuals: # Arguments: # x - an object of class "fGEVFIT" as returned by the # function gevFit # labels - a logical, should labels be added to the plot # ... - optional arguments passed to the function plot # Example: # .gev2Plot(gevFit(gevSim())) # FUNCTION: # Data: residuals = x@residuals # Labels: if (labels) { main = "Scatterplot of Residuals" xlab = "Ordering" ylab = "Residuals" } else { main = xlab = ylab = "" } # Plot: plot(residuals, main = main, xlab = xlab, ylab = ylab, pch = 19, col = "steelblue", ...) lines(lowess(1:length(residuals), residuals), col = "brown") # Add Grid: if (labels) grid() # Return Value: invisible() } # ------------------------------------------------------------------------------ .gev3Plot = function(x, labels = TRUE, ...) { # A function implemented by Diethelm Wuertz # Description: # Histogram Plot of Residuals with Gaussian Fit: # Arguments: # x - an object of class "fGEVFIT" as returned by the # function gevFit # labels - a logical, should labels be added to the plot # ... 
#       - optional arguments passed to the function hist

    # Example:
    #   .gev3Plot(gevFit(gevSim()))

    # FUNCTION:

    # Data:
    residuals = x@residuals

    # Labels:
    if (labels) {
        if (x@method[1] == "gev") {
            dist = "GEV"
        } else if (x@method[1] == "gum") {
            dist = "Gumbel"
        }
        main = paste(dist, "Residual Histogram")
        xlab = "Residuals"
        ylab = "Density"
    } else {
        main = xlab = ylab = ""
    }

    # Plot:
    hist(residuals, probability = TRUE, breaks = "FD",
        main = main, xlab = xlab, ylab = ylab,
        col = "steelblue", border = "white", ...)

    # Return Value:
    invisible()
}


# ------------------------------------------------------------------------------


.gev4Plot = function(x, labels = TRUE, ...)
{   # A function implemented by Diethelm Wuertz

    # Description:
    #   Quantile-Quantile Plot:

    # Arguments:
    #   x - an object of class "fGEVFIT" as returned by the
    #       function gevFit
    #   labels - a logical, should labels be added to the plot
    #   ... - optional arguments passed to the function plot

    # Example:
    #   .gev4Plot(gevFit(gevSim()))

    # FUNCTION:

    # Data:
    data = x@residuals
    sorted = sort(data)
    y <- qexp(ppoints(data))

    # Labels:
    if (labels) {
        main = "QQ Plot of Residuals"
        xlab = "Ordered Data"
        ylab = "Exponential Quantiles"
    } else {
        main = xlab = ylab = ""
    }

    # Plot:
    plot(sorted, y, main = main, xlab = xlab, ylab = ylab,
        pch = 19, col = "steelblue", ...)
    abline(lsfit(sorted, y))

    # Add Grid:
    if (labels) grid()

    # Return Value:
    invisible()
}


################################################################################


summary.fGEVFIT = function(object, doplot = TRUE, which = "all", ...)
{   # A function implemented by Diethelm Wuertz

    # Description:
    #   Summary method for an object of class "gevFit".
# Arguments: # object - an object of class "fGEVFIT" as returned by the # function gevFit # doplot - a logical, should a plot be returned # which - which plot(s) should be returned # optional arguments passed to the function plot # Example: # fit = gevFit(gevSim(), type = "mle") # par(mfrow = c(2, 2)); summary(fit) # FUNCTION: # Title: cat("\nTitle:\n", object@title, "\n") # Function Call: cat("\nCall:\n ") cat(paste(deparse(object@call), sep = "\n", collapse = "\n"), "\n", sep = "") # Estimation Type: cat("\nEstimation Type:\n ", object@method, "\n") # Estimated Parameters: cat("\nEstimated Parameters:\n") print(object@fit$par.ests) # Summary: if (object@method[2] == "mle") { cat("\nStandard Deviations:\n "); print(object@fit$par.ses) cat("\nLog-Likelihood Value:\n ", object@fit$llh, "\n") cat("\nType of Convergence:\n ", object@fit$converged, "\n") } # Desription: cat("\nDescription\n ", object@description, "\n\n") # Plot: if (doplot) { plot(object, which = which, ...) } # Return Value: invisible(object) } ################################################################################ fExtremes/R/ExtremeIndex.R0000644000176000001440000004225712157313044015203 0ustar ripleyusers # This library is free software; you can redistribute it and/or # modify it under the terms of the GNU Library General Public # License as published by the Free Software Foundation; either # version 2 of the License, or (at your option) any later version. # # This library is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Library General Public License for more details. 
# # You should have received a copy of the GNU Library General # Public License along with this library; if not, write to the # Free Foundation, Inc., 59 Temple Place, Suite 330, Boston, # MA 02111-1307 USA # Copyrights (C) # for this R-port: # 1999 - 2007, Diethelm Wuertz, GPL # Diethelm Wuertz # info@rmetrics.org # www.rmetrics.org # for the code accessed (or partly included) from other R-ports: # see R's copyright and license files # for the code accessed (or partly included) from contributed R-ports # and other sources # see Rmetrics's copyright file ################################################################################ # FUNCTION: DESCRIPTION: # 'fTHETA' Class representation for extremal index # show.fTHETA S4: Print Method for extremal index # thetaSim Simulates a time series with known theta # FUNCTION: DESCRIPTION: # blockTheta Computes theta from Block Method # clusterTheta Computes theta from Reciprocal Cluster Method # runTheta Computes theta from Run Method # ferrosegersTheta Computes Theta according to Ferro and Seegers # FUNCTION: DESCRIPTION: # exindexesPlot Computes and Plot Theta(1,2,3) # exindexPlot Computes Theta(1,2) and Plot Theta(1) ################################################################################ setClass("fTHETA", representation( call = "call", data = "list", theta = "data.frame", title = "character", description = "character") ) # ------------------------------------------------------------------------------ setMethod("show", "fTHETA", function(object) { # A function implemented by Diethelm Wuertz # FUNCTION: # Unlike print the argument for show is 'object'. 
x = object # Title: cat("\nTitle:\n ", x@title, "\n", sep = "") # Call: cat("\nCall:\n ", deparse(x@call), "\n", sep = "") # Extremal Index: cat("\nExtremal Index:\n") print(object@theta) # Description: cat("\nDescription:\n ", x@description, sep = "") cat("\n\n") # Return Value: invisible() }) # ------------------------------------------------------------------------------ thetaSim = function(model = c("max", "pair"), n = 1000, theta = 0.5) { # A function implemented by Diethelm Wuertz # Description: # Simulates a time series with known theta # Arguments: # model - a character string denoting the model # "max" - Max Frechet Series # "pair" - Paired Exponential Series # FUNCTION: # Model Argument: model = match.arg(model) # Simulate: model = model[1] X = rep(0, n) if (model == "max") { # Frechet rvs: eps = 1/(-log(runif(n))) X[1] = theta*eps[1] for ( i in 2:n ) X[i] = max( (1-theta)*X[i-1], theta*eps[i] ) } else if (model == "pair") { theta = 0.5 eps = rexp(n+1) for ( i in 1:n ) X[i] = max(eps[i], eps[i+1]) } # As time series: X = as.ts(X) attr(X, "control") = c(model = model, theta = as.character(theta)) # Return Value: X } ################################################################################ blockTheta = function (x, block = 22, quantiles = seq(0.950, 0.995, length = 10), title = NULL, description = NULL) { # A function implemented by Diethelm Wuertz # Description: # Calculates theta from Block method, i.e. theta1. # Example: # blockTheta(thetaSim(n=10000)) # FUNCTION: # Check if block is numeric: stopifnot(is.numeric(block)) # Number of blocks and number of data points: X = as.vector(x) ordered = sort(X) k = floor(length(X)/block) n = k*block # Now organize your X: # 1) truncate the rest of the time series, # 2) arrange them in matrix form, # 3) sort them in reverse order, ie. 
    #      from high (pos) to low (neg)
    X = matrix(X[1:(k*block)], ncol = block, byrow = TRUE)

    # Threshold values associated to quantiles:
    thresholds = ordered[floor(quantiles*length(X))]

    # Presettings:
    theta1 = rep(0, times = length(quantiles))

    # Calculate Extremal Index:
    run = 0
    keepK = keepN = NULL
    for ( u in thresholds ) {
        run = run + 1
        # N # of exceedances | K # of blocks with exceedances:
        N = length(X[X > u])
        K = floor(sum(sign(apply(X, 1, max) - u) + 1) / 2)
        if (K/k < 1) {
            theta1[run] = (k/n) * log(1-K/k) / log(1-N/n)
        } else {
            theta1[run] = NA
        }
        keepK = c(keepK, K)
        keepN = c(keepN, N)
    }

    # Theta Values:
    ans = data.frame(quantiles = quantiles, thresholds = thresholds,
        N = keepN, K = keepK, theta = theta1)

    # Add title and description:
    if (is.null(title)) title = "Extremal Index from Block Method"
    if (is.null(description)) description = description()

    # Return Value:
    new("fTHETA", call = match.call(), data = list(x = x, block = block),
        theta = ans, title = title, description = description)
}


# ------------------------------------------------------------------------------


clusterTheta =
    function (x, block = 22, quantiles = seq(0.950, 0.995, length = 10),
    title = NULL, description = NULL)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Calculates theta from the Reciprocal Mean Cluster Size method,
    #   i.e. theta2.

    # Example:
    #   clusterTheta(thetaSim(n = 10000))

    # FUNCTION:

    # Check if block is numeric:
    stopifnot(is.numeric(block))

    # Number of blocks and number of data points:
    X = as.vector(x)
    ordered = sort(X)
    k = floor(length(X)/block)
    n = k*block

    # Now organize your X:
    #   1) truncate the rest of the time series,
    #   2) arrange them in matrix form,
    #   3) sort them in reverse order, i.e.
    #      from high (pos) to low (neg)
    X = matrix(X[1:(k*block)], ncol = block, byrow = TRUE)

    # Threshold values associated to quantiles:
    thresholds = ordered[floor(quantiles*length(X))]

    # Presettings:
    theta2 = rep(0, times = length(quantiles))

    # Calculate Extremal Index:
    run = 0
    keepK = keepN = NULL
    for ( u in thresholds ) {
        run = run + 1
        # N # of exceedances | K # of blocks with exceedances:
        N = length(X[X > u])
        K = floor(sum(sign(apply(X, 1, max) - u) + 1) / 2)
        theta2[run] = K/N
        keepK = c(keepK, K)
        keepN = c(keepN, N)
    }

    # Theta Values:
    ans = data.frame(quantiles = quantiles, thresholds = thresholds,
        N = keepN, K = keepK, theta = theta2)

    # Add title and description:
    if (is.null(title)) title = "Extremal Index from Reciprocal Cluster Method"
    if (is.null(description)) description = description()

    # Return Value:
    new("fTHETA", call = match.call(), data = list(x = x, block = block),
        theta = ans, title = title, description = description)
}


# ------------------------------------------------------------------------------


runTheta =
    function (x, block = 22, quantiles = seq(0.950, 0.995, length = 10),
    title = NULL, description = NULL)
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Calculates theta from the Run method, i.e. theta3.

    # Example:
    #   runTheta(thetaSim(n = 10000))

    # FUNCTION:

    # Check if block is numeric:
    stopifnot(is.numeric(block))

    # Number of blocks and number of data points:
    X = as.vector(x)
    ordered = sort(X)
    k = floor(length(X)/block)
    n = k*block
    Count = 1:n

    # Now organize your X:
    #   1) truncate the rest of the time series,
    #   2) arrange them in matrix form,
    #   3) sort them in reverse order, i.e.
    #      from high (pos) to low (neg)
    X = matrix(X[1:(k*block)], ncol = block, byrow = TRUE)

    # Threshold values associated to quantiles:
    thresholds = ordered[floor(quantiles*length(X))]

    # Presettings:
    theta3 = rep(0, times = length(quantiles))

    # Calculate Extremal Index:
    run = 0
    keepN = NULL
    for ( u in thresholds ) {
        run = run + 1
        # N # of exceedances:
        N = length(X[X > u])
        Y = diff(Count[X > u])
        Y = Y[Y > block]
        theta3[run] = length(Y)/N
        keepN = c(keepN, N)
    }

    # Theta Values:
    ans = data.frame(quantiles = quantiles, thresholds = thresholds,
        N = keepN, theta = theta3)

    # Add title and description:
    if (is.null(title)) title = "Extremal Index from Run Method"
    if (is.null(description)) description = description()

    # Return Value:
    new("fTHETA", call = match.call(), data = list(x = x, block = block),
        theta = ans, title = title, description = description)
}


# ------------------------------------------------------------------------------


ferrosegersTheta =
    function (x, quantiles = seq(0.950, 0.995, length = 10),
    title = NULL, description = NULL)
{
    # Description:
    #   Estimates the extremal index based on the intervals estimator
    #   due to Ferro and Segers (2003).

    # Note:
    #   Adapted from the function 'extremalindex' in the contributed
    #   R package 'extRemes', written and maintained by ...
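As a cross-check of the intervals estimator described above — this is an illustrative Python sketch, not code from the package, and the function and variable names are mine. It mirrors the `theta.hat` / `theta.tilde` branching used below: for iid data the extremal index is 1, so the estimate should come out close to 1.

```python
import random

def intervals_estimator(x, u):
    """Ferro-Segers (2003) intervals estimator of the extremal index."""
    exceed = [i + 1 for i, v in enumerate(x) if v > u]   # exceedance times
    n_exc = len(exceed)
    t = [b - a for a, b in zip(exceed, exceed[1:])]      # interexceedance times
    if max(t) <= 2:
        # theta.hat: no interexceedance time larger than 2
        theta = 2 * sum(t) ** 2 / ((n_exc - 1) * sum(ti ** 2 for ti in t))
    else:
        # theta.tilde: bias-corrected version for larger gaps
        num = 2 * sum(ti - 1 for ti in t) ** 2
        den = (n_exc - 1) * sum((ti - 1) * (ti - 2) for ti in t)
        theta = num / den
    return min(theta, 1.0)   # cap at 1, as the R code does

random.seed(42)
x = [random.random() for _ in range(20000)]
u = sorted(x)[int(0.95 * len(x))]   # 95% quantile threshold
theta = intervals_estimator(x, u)
# iid data: theta should be near 1.
```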
# FUNCTION: # Settings: x = as.vector(x) n = length(x) N = floor(quantiles*n) sorted = sort(x) U = sorted[N] ans = NULL # Extremal Index: for ( u in U ) { msg = 0 id = x > u N = sum(id) S = (1:n)[id] TT = diff(S) if (!any(TT > 2)) { theta = 2*sum(TT, na.rm = TRUE)^2/((N-1) * sum(TT^2, na.rm = TRUE)) # msg = paste("theta.hat used because no values of T>2.") msg = msg + 1 if (theta > 1) { theta = 1 # msg = paste(msg, "Using theta = 1 because theta.hat > 1.", # sep = "\n") msg = msg + 10 } } else { theta = 2 * sum(TT-1, na.rm = TRUE)^2/((N-1) * sum((TT-1) * (TT-2), na.rm = TRUE)) # msg = paste("theta.tilde used because a value(s) exists of T>2.") msg = msg + 100 if (theta > 1) { theta = 1 # msg = paste(msg, "Using theta = 1 as theta.hat > 1.") msg = msg + 1000 } } K = ifelse(round(theta*N) != theta*N, floor(theta*N) + 1, theta*N) T.order = order(TT, na.last = TRUE, decreasing = TRUE) T.ranked = TT[T.order] T.K = T.ranked[K] if (sum(TT == T.K, na.rm = TRUE) > 1) { for (i in 1:K) { K = K - 1 T.K = T.ranked[K] if (sum(TT == T.K, na.rm = TRUE) > 1) { next } else { break } } } ans = rbind(ans, c(T.K, K, msg, theta)) } # Result: ans = data.frame(quantiles, U, ans) colnames(ans) = c("Threshold", "Quantiles", "RunLength", "Clusters", "messageNo", "theta") # Add title and description: if (is.null(title)) title = "Extremal Index from Ferro-Segers Method" if (is.null(description)) description = description() # Return Value: new("fTHETA", call = match.call(), data = list(x = x), theta = ans, title = title, description = description) } ################################################################################ exindexesPlot = function (x, block = 22, quantiles = seq(0.950, 0.995, length = 10), doplot = TRUE, labels = TRUE, ...) 
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Calculates and plots Theta(1,2,3) for numeric block lengths

    # Arguments:
    #   x - a univariate time series, or any other object which can be
    #       transformed by the function as.vector into a numeric vector.
    #   block - an integer value which denotes the length of the blocks.
    #   quantiles - a numeric vector of quantile values.
    #   doplot - a logical flag. Should a plot be produced?

    # Example:
    #   exindexesPlot(as.timeSeries(data(bmwRet)), 20)

    # FUNCTION:

    # Settings:
    if (!is.numeric(block)) stop("Argument block must be an integer value.")
    doprint = FALSE

    # Block Size:
    blocklength = block     # argument renamed
    # Note, in finance the x's should be residuals:
    resid = as.vector(x)

    # Extremal Index - Theta_1, Theta_2 and Theta_3:
    k = floor(length(resid)/blocklength)    # Number of blocks
    n = k*blocklength                       # Number of data points

    # Now organize your residuals:
    #   1) truncate the rest of the time series,
    #   2) arrange them in matrix form,
    #   3) sort them in reverse order, i.e.
    #      from high (pos) to low (neg)
    resid1 = resid[1:(k*blocklength)]
    resid1 = matrix(resid1, ncol = blocklength, byrow = TRUE)
    ordered1 = sort(resid1)

    # Threshold values associated to quantiles:
    z0 = ordered1[floor(quantiles*length(resid1))]

    # Presettings:
    theta1 = theta2 = theta3 = rep(0, times = length(quantiles))

    # Calculate Extremal Index:
    run = 0
    for ( z in z0 ) {
        run = run + 1
        # N - number of exceedances:
        N = length(resid1[resid1 > z])
        # K - number of blocks with exceedances:
        # DW: floor()
        K = floor(sum(sign(apply(resid1, 1, max)-z)+1) / 2)
        if (K/k < 1) {
            theta1[run] = (k/n) * log(1-K/k) / log(1-N/n)
        } else {
            theta1[run] = NA
        }
        theta2[run] = K/N
        x = 1:n
        xx = diff(x[resid1 > z])
        xx = xx[xx > blocklength]
        theta3[run] = length(xx)/N
        # Printing:
        if (doprint) {
            print(c(N, K, quantiles[run], z))
            print(c(theta1[run], theta2[run], theta3[run]))
        }
    }

    # Plotting:
    if (doplot) {
        plot(quantiles, theta1,
            xlim = c(quantiles[1], quantiles[length(quantiles)]),
            ylim = c(0, 1.2), type = "b", pch = 1,
            xlab = "", ylab = "", main = "", ...)
        points(quantiles, theta2, pch = 2, col = 3)
        points(quantiles, theta3, pch = 4, col = 4)
        if (labels) {
            title(main = "Extremal Index")
            title(xlab = "Quantile", ylab = "Theta 1,2,3")
            mtext("Threshold", side = 3, line = 3)
            grid()
            mtext(text = paste("Blocklength: ", as.character(block)),
                adj = 0, side = 4, cex = 0.7)
        }
    }

    # Theta Values:
    ans = data.frame(quantiles = quantiles, thresholds = z0,
        theta1 = theta1, theta2 = theta2, theta3 = theta3)

    # Return Value:
    ans
}


# ------------------------------------------------------------------------------


exindexPlot =
    function(x, block = c("monthly", "quarterly"), start = 5, end = NA,
    doplot = TRUE, plottype = c("thresh", "K"), labels = TRUE, ...)
{
    # Example:
    #   exindexPlot(as.timeSeries(data(bmwRet)), 20)
    #   exindexPlot(as.timeSeries(data(bmwRet)), "monthly")
    #   exindexPlot(as.vector(as.timeSeries(data(bmwRet))), 20)

    # Settings:
    plottype = match.arg(plottype)
    reverse = FALSE
    if (plottype == "K") reverse = TRUE

    # Extremal Index - following A. McNeil:
    b.maxima = rev(sort(as.vector(blockMaxima(x, block))))
    data = as.vector(x)
    sorted = rev(sort(data))
    n = length(sorted)
    if (!is.numeric(block)) block = round(length(data)/length(b.maxima))
    k = round(n/block)
    un = unique(b.maxima)[-1]
    K = match(un, b.maxima) - 1
    N = match(un, sorted) - 1
    if (is.na(end)) end = k
    cond = (K < end) & (K >= start)
    un = un[cond]
    K = K[cond]
    N = N[cond]
    theta2 = K/N
    theta = logb(1 - K/k)/(block * logb(1 - N/n))
    ans = data.frame(N = N, K = K, un = un, theta2 = theta2, theta = theta)
    yrange = range(theta)
    index = K
    if (reverse) index = - K

    # Plot:
    if (doplot) {
        plot(index, theta, ylim = yrange, type = "b",
            xlab = "", ylab = "", axes = FALSE, ...)
        IDX = round(seq(1, length(index), length = 10))
        axis(1, at = index[IDX], labels = paste(K)[IDX])
        axis(2)
        axis(3, at = index[IDX], labels = paste(format(signif(un, 3)))[IDX])
        box()
        if (labels) {
            ylabel = paste("theta (", k, " blocks of size ", block, ")", sep = "")
            title(xlab = "K", ylab = ylabel)
            mtext("Threshold", side = 3, line = 3)
            lines(index, theta, col = "steelblue")
            grid()
            mtext(text = paste("Blocklength: ", as.character(block)),
                adj = 0, side = 4, cex = 0.7)
        }
    }

    # Return Value:
    ans
}


################################################################################
# fExtremes/R/GpdDistribution.R


# This library is free software; you can redistribute it and/or
# modify it under the terms of the GNU Library General Public
# License as published by the Free Software Foundation; either
# version 2 of the License, or (at your option) any later version.
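The blocks estimator used in exindexPlot and blockTheta above, theta = log(1 - K/k) / (b * log(1 - N/n)) with k blocks of length b, K blocks containing an exceedance, and N exceedances overall, can be sanity-checked on iid data, where the extremal index equals 1. A minimal Python sketch (illustrative only; names are mine, not package API):

```python
import math
import random

def blocks_theta(x, block, u):
    """Blocks estimator of the extremal index over blocks of length `block`."""
    k = len(x) // block                 # number of complete blocks
    n = k * block
    xs = x[:n]                          # truncate the incomplete tail block
    N = sum(1 for v in xs if v > u)     # exceedances of threshold u
    blocks = [xs[i * block:(i + 1) * block] for i in range(k)]
    K = sum(1 for b in blocks if max(b) > u)   # blocks with an exceedance
    # theta = (k/n) * log(1-K/k) / log(1-N/n); note k/n == 1/block here
    return (k / n) * math.log(1 - K / k) / math.log(1 - N / n)

random.seed(123)
x = [random.random() for _ in range(20000)]
u = sorted(x)[int(0.98 * len(x))]       # 98% quantile threshold
theta = blocks_theta(x, block=20, u=u)
# iid data: theta should be near 1.
```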
# # This library is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Library General Public License for more details. # # You should have received a copy of the GNU Library General # Public License along with this library; if not, write to the # Free Foundation, Inc., 59 Temple Place, Suite 330, Boston, # MA 02111-1307 USA # Copyrights (C) # for this R-port: # 1999 - 2007, Diethelm Wuertz, GPL # Diethelm Wuertz # info@rmetrics.org # www.rmetrics.org # for the code accessed (or partly included) from other R-ports: # see R's copyright and license files # for the code accessed (or partly included) from contributed R-ports # and other sources # see Rmetrics's copyright file ################################################################################ # FUNCTION: GPD DISTRIBUTION FAMILY: # dgpd Density for the Generalized Pareto DF [USE FROM EVIS] # pgpd Probability for the Generalized Pareto DF # qgpd Quantiles for the Generalized Pareto DF # rgpd Random variates for the Generalized Pareto DF # FUNCTION: GPD MOMENTS: # gpdMoments Computes true statistics for GPD distribution # FUNCTION: GPD SLIDER: # gpdSlider Displays distribution and rvs for GPD distribution # FUNCTION: INTERNAL GPD DISTRIBUTION FAMILY: # .depd Density for the Generalized Pareto DF # .pepd Probability for the Generalized Pareto DF # .qepd Quantiles for the Generalized Pareto DF # .repd Random variates for the Generalized Pareto DF ################################################################################ dgpd <- function(x, xi = 1, mu = 0, beta = 1, log = FALSE) { # A function written by Diethelm Wuertz # Description: # Density for the Generalized Pareto DF # Arguments: # FUNCTION: # Transform: shape = xi location = mu scale = beta # Density: d = .depd(x, location, scale, shape, log) # Add Control: attr(d, "control") = data.frame(xi = xi, mu = mu, beta = 
beta[1], log = log, row.names = "") # Return Value: d } # ------------------------------------------------------------------------------ pgpd <- function(q, xi = 1, mu = 0, beta = 1, lower.tail = TRUE) { # A function written by Diethelm Wuertz # Description: # Probability for the Generalized Pareto DF # Arguments: # FUNCTION: # Transform: shape = xi location = mu scale = beta # Probability: p = .pepd(q, location, scale, shape, lower.tail) # Add Control: attr(p, "control") = data.frame(xi = xi, mu = mu, beta = beta[1], lower.tail = lower.tail, row.names = "") # Return Value: p } # ------------------------------------------------------------------------------ qgpd <- function(p, xi = 1, mu = 0, beta = 1, lower.tail = TRUE) { # A function written by Diethelm Wuertz # Description: # Quantiles for the Generalized Pareto DF # Arguments: # FUNCTION: # Transform: shape = xi location = mu scale = beta # Quantiles: q = .qepd(p, location, scale, shape, lower.tail) # Add Control: attr(q, "control") = data.frame(xi = xi, mu = mu, beta = beta[1], lower.tail = lower.tail, row.names = "") # Return Value: q } # ------------------------------------------------------------------------------ rgpd <- function(n, xi = 1, mu = 0, beta = 1) { # A function written by Diethelm Wuertz # Description: # Random variates for the Generalized Pareto DF # Arguments: # FUNCTION: # Transform: shape = xi location = mu scale = beta # Random Variates: r = .repd(n, location, scale, shape) # Add Control: attr(r, "control") = data.frame(xi = xi, mu = mu, beta = beta[1], row.names = "") # Return Value: r } # ------------------------------------------------------------------------------ gpdMoments <- function(xi = 1, mu = 0, beta = 1) { # A function implemented by Diethelm Wuertz # Description: # Compute true statistics for Generalized Pareto distribution # Arguments: # Value: # Returns true mean of Generalized Pareto distribution # for xi < 1 else NaN # Returns true variance of Generalized Pareto 
#   distribution (finite for xi < 1/2, NaN otherwise)

    # FUNCTION:

    # MEAN: a[.] selects 1 when xi < 1 and NaN otherwise:
    a = c(1, NaN, NaN)
    gpdMean = mu + beta/(1-xi)*a[sign(xi-1)+2]

    # VAR: a[.] selects 1 when xi < 1/2 and NaN otherwise:
    a = c(1, NaN, NaN)
    gpdVar = beta*beta/(1-xi)^2/(1-2*xi) * a[sign(2*xi-1)+2]

    # Result:
    param = c(xi = xi, mu = mu, beta = beta)
    ans = list(param = param, mean = gpdMean, var = gpdVar)

    # Return Value:
    ans
}


# ------------------------------------------------------------------------------


gpdSlider =
    function(method = c("dist", "rvs"))
{
    # A function implemented by Diethelm Wuertz

    # Description:
    #   Displays distribution and rvs for GPD distribution

    # Arguments:

    # FUNCTION:

    # Settings:
    method = match.arg(method)

    # Internal Function:
    refresh.code = function(...)
    {
        # Sliders:
        N = .sliderMenu(no = 1)
        xi = .sliderMenu(no = 2)
        mu = .sliderMenu(no = 3)
        beta = .sliderMenu(no = 4)

        # Compute Data:
        pmin = 0.00
        pmax = 0.99
        xmin = round(qgpd(pmin, xi, mu, beta), digits = 2)
        xmax = round(qgpd(pmax, xi, mu, beta), digits = 2)
        s = seq(xmin, xmax, length = N)
        y1 = dgpd(s, xi, mu, beta)
        y2 = pgpd(s, xi, mu, beta)
        Moments = gpdMoments(xi, mu, beta)
        Mean = round(Moments$mean, 2)
        Var = round(Moments$var, 2)
        mText = paste("Mean =", Mean, " | Variance = ", Var)
        main1 = paste("GPD Density\n",
            "xi = ", as.character(xi), " | ",
            "mu = ", as.character(mu), " | ",
            "beta = ", as.character(beta))
        main2 = paste("GPD Probability\n",
            "xmin [0.00] = ", as.character(xmin), " | ",
            "xmax [0.99] = ", as.character(xmax))
        Median = qgpd(0.5, xi, mu, beta)

        # Frame:
        par(mfrow = c(2, 1), cex = 0.7)

        # Density:
        if (method == "rvs") {
            x = rgpd(N, xi, mu, beta)
            hist(x, probability = TRUE, col = "steelblue", border = "white",
                xlim = c(xmin, xmax), ylim = c(0, 1.1*max(y1)),
                main = main1, breaks = "FD")
            lines(s, y1, col = "orange")
            mtext(mText, side = 4, col = "grey", cex = 0.7)
        } else {
            plot(s, y1, type = "l", xlim = c(xmin, xmax), col = "steelblue")
            abline(h = 0, lty = 3)
            abline(v = Median, lty = 3, col = "red")
abline(v = Mean, lty = 3, col = "darkgreen") title(main = main1) mtext(mText, side = 4, col = "grey", cex = 0.7) } # Probability: plot(s, y2, type = "l", xlim = c(xmin, xmax), ylim = c(0, 1), col = "steelblue" ) abline(h = 0, lty = 3) abline(h = 0.5, lty = 3, col = "red") abline(v = Median, lty = 3, col = "red") abline(v = Mean, lty = 3, col = "darkgreen") title(main = main2) mtext(mText, side = 4, col = "grey", cex = 0.7) # Reset Frame: par(mfrow = c(1, 1), cex = 0.7) } # Open Slider Menu: .sliderMenu(refresh.code, names = c( "N", "xi", "mu", "beta"), minima = c( 50, 0.00, -5.00, 0.10 ), maxima = c( 1000, 1.50, +5.00, 5.00 ), resolutions = c( 50, 0.01, 0.10, 0.10 ), starts = c( 500, 1.00, 0.00, 1.00 ) ) } ################################################################################ .depd <- function(x, location = 0, scale = 1, shape = 0, log = FALSE) { # Description: # Arguments: # FUNCTION: # Check: stopifnot(min(scale) > 0) stopifnot(length(shape) == 1) # Density: d <- (x - location)/scale nn <- length(d) scale <- rep(scale, length.out = nn) index <- (d > 0 & ((1 + shape * d) > 0)) | is.na(d) if (shape == 0) { d[index] <- log(1/scale[index]) - d[index] d[!index] <- -Inf } else { d[index] <- log(1/scale[index]) - (1/shape+1)*log(1+shape*d[index]) d[!index] <- -Inf } # Log: if (!log) d <- exp(d) # Add Control: attr(d, "control") = data.frame(location = location[1], scale = scale[1], shape = shape[1], log = log, row.names = "") # Return Value: d } # ------------------------------------------------------------------------------ .pepd <- function(q, location = 0, scale = 1, shape = 0, lower.tail = TRUE) { # Description: # Arguments: # FUNCTION: # Check: stopifnot(min(scale) > 0) stopifnot(length(shape) == 1) # Probability: q <- pmax(q - location, 0)/scale if (shape == 0) p <- 1 - exp(-q) else { p <- pmax(1 + shape * q, 0) p <- 1 - p^(-1/shape) } # Lower Tail: if (!lower.tail) p <- 1 - p # Add Control: attr(p, "control") = data.frame(location = location[1], scale = 
scale[1], shape = shape[1], lower.tail = lower.tail, row.names = "") # Return Value: p } # ------------------------------------------------------------------------------ .qepd <- function(p, location = 0, scale = 1, shape = 0, lower.tail = TRUE) { # Description: # Arguments: # FUNCTION: # Check: stopifnot(min(scale) > 0) stopifnot(length(shape) == 1) stopifnot(min(p, na.rm = TRUE) >= 0) stopifnot(max(p, na.rm = TRUE) <= 1) # Lower Tail: if (lower.tail) p <- 1 - p # Quantiles: if (shape == 0) { q = location - scale * log(p) } else { q = location + scale * (p^(-shape) - 1)/shape } # Add Control: attr(q, "control") = data.frame(location = location[1], scale = scale[1], shape = shape[1], lower.tail = lower.tail, row.names = "") # Return Value: q } # ------------------------------------------------------------------------------ .repd <- function(n, location = 0, scale = 1, shape = 0) { # Description: # Arguments: # FUNCTION: # Check: stopifnot(min(scale) > 0) stopifnot(length(shape) == 1) # Random Variates: if (shape == 0) { r = location + scale * rexp(n) } else { r = location + scale * (runif(n)^(-shape) - 1)/shape } # Add Control: attr(r, "control") = data.frame(location = location[1], scale = scale[1], shape = shape[1], row.names = "") # Return Value: r } ################################################################################fExtremes/R/GevSim.R0000644000176000001440000000556311370220751013771 0ustar ripleyusers # This library is free software; you can redistribute it and/or # modify it under the terms of the GNU Library General Public # License as published by the Free Software Foundation; either # version 2 of the License, or (at your option) any later version. # # This library is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Library General Public License for more details. 
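The closed forms in `.pepd` and `.qepd` above are mutual inverses, which makes them easy to cross-check. Below is a small Python transcription of the same formulas (illustrative only, not the package API); the quantile-then-CDF round trip must recover the original probability for any shape, including the exponential limit xi = 0.

```python
import math

def pgpd(q, xi=1.0, mu=0.0, beta=1.0):
    """GPD distribution function, following the .pepd formulas."""
    z = max(q - mu, 0.0) / beta
    if xi == 0.0:
        return 1.0 - math.exp(-z)                 # exponential limit
    return 1.0 - max(1.0 + xi * z, 0.0) ** (-1.0 / xi)

def qgpd(p, xi=1.0, mu=0.0, beta=1.0):
    """GPD quantile function (lower tail), following the .qepd formulas."""
    pp = 1.0 - p
    if xi == 0.0:
        return mu - beta * math.log(pp)
    return mu + beta * (pp ** (-xi) - 1.0) / xi

# Round trip: quantile then CDF recovers the probability.
for xi in (0.0, 0.5, 1.0):
    for p in (0.1, 0.5, 0.9):
        assert abs(pgpd(qgpd(p, xi), xi) - p) < 1e-12
```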
# # You should have received a copy of the GNU Library General # Public License along with this library; if not, write to the # Free Foundation, Inc., 59 Temple Place, Suite 330, Boston, # MA 02111-1307 USA # Copyrights (C) # for this R-port: # 1999 - 2007, Diethelm Wuertz, GPL # Diethelm Wuertz # info@rmetrics.org # www.rmetrics.org # for the code accessed (or partly included) from other R-ports: # see R's copyright and license files # for the code accessed (or partly included) from contributed R-ports # and other sources # see Rmetrics's copyright file ################################################################################ # FUNCTION: GEV SIMULATION: # gevSim Simulates a GEV distributed process # gumbelSim Simulates a Gumbel distributed process ################################################################################ gevSim = function(model = list(xi = -0.25, mu = 0, beta = 1), n = 1000, seed = NULL) { # A function implemented by Diethelm Wuertz # Description: # Generates random variates from a GEV distribution # Arguments: # Examples: # gevSim(n = 100) # gevSim(n = 100, seed = 4711) # gevSim(model = list(xi = -0.15, mu = 0, beta = 0.02)) # FUNCTION: # Seed: if (is.null(seed)) seed = NA else set.seed(seed) # Simulate: ans = rgev(n = n, xi = model$xi, mu = model$mu, beta = model$beta) # DW: ans = as.ts(ans) ans = timeSeries(ans, units = "GEV") # Control: attr(ans, "control") = data.frame(t(unlist(model)), seed = seed, row.names = "control") # Return Value: ans } # ------------------------------------------------------------------------------ gumbelSim = function(model = list(mu = 0, beta = 1), n = 1000, seed = NULL) { # A function implemented by Diethelm Wuertz # Description: # Generates random variates from a GEV distribution # Arguments: # Examples: # gumbelSim(n = 100) # gumbelSim(n = 100, seed = 4711) # FUNCTION: # Simulate: ans = gevSim(model = list(xi = 0, mu = model$mu, beta = model$beta), n = n, seed = seed) colnames(ans) = "GUMBEL" # 
Return Value: ans } ################################################################################ fExtremes/R/GevMdaEstimation.R0000644000176000001440000004302512157313044015774 0ustar ripleyusers # This library is free software; you can redistribute it and/or # modify it under the terms of the GNU Library General Public # License as published by the Free Software Foundation; either # version 2 of the License, or (at your option) any later version. # # This library is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Library General Public License for more details. # # You should have received a copy of the GNU Library General # Public License along with this library; if not, write to the # Free Foundation, Inc., 59 Temple Place, Suite 330, Boston, # MA 02111-1307 USA # Copyrights (C) # for this R-port: # 1999 - 2007, Diethelm Wuertz, GPL # Diethelm Wuertz # info@rmetrics.org # www.rmetrics.org # for the code accessed (or partly included) from other R-ports: # see R's copyright and license files # for the code accessed (or partly included) from contributed R-ports # and other sources # see Rmetrics's copyright file ################################################################################ # FUNCTION: MDA ESTIMATORS: # hillPlot Plot Hill's estimator # shaparmPlot Pickands, Hill & Decker-Einmahl-deHaan Estimator # shaparmPickands Auxiliary function called by shaparmPlot # shaparmHill ... called by shaparmPlot # shaparmDehaan ... called by shaparmPlot ################################################################################ hillPlot = function(x, start = 15, ci = 0.95, doplot = TRUE, plottype = c("alpha", "xi"), labels = TRUE, ...) { # A function implemented by Diethelm Wuertz # Description: # Plots the results from the Hill Estimator. 
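The Hill computation in hillPlot below (cumulative mean of the ordered log values minus the log of the current order statistic) can be sketched in a few lines of Python. This is an illustrative re-derivation with my own names, not package code; for an exact Pareto sample with tail index alpha = 2 the estimate of xi should be close to 1/alpha = 0.5.

```python
import math
import random

def hill_xi(data, k):
    """Hill estimate of xi from the k largest order statistics:
    mean of the top-k logs minus the log of the k-th largest value."""
    ordered = sorted((v for v in data if v > 0), reverse=True)
    logs = [math.log(v) for v in ordered[:k]]
    return sum(logs) / k - math.log(ordered[k - 1])

random.seed(7)
# U**(-1/2) with U ~ Uniform(0,1) is Pareto with alpha = 2, i.e. xi = 0.5.
x = [random.random() ** (-0.5) for _ in range(10000)]
xi_hat = hill_xi(x, k=200)
# xi_hat should be close to 0.5 (standard error roughly xi/sqrt(k)).
```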
# Note: # Code partly adapted from R package evir # Examples: # par(mfrow = c(2, 2)) # hillPlot(gevSim(n=1000), plottype = "alpha") # hillPlot(gevSim(n=1000), plottype = "xi") # NYI: hillPlot(gevSim(n=1000), plottype = "alpha", reverse = TRUE) # NYI: hillPlot(gevSim(n=1000), plottype = "xi", reverse = TRUE) # hillPlot(gevSim(n=1000), plottype = "alpha", doplot = FALSE) # hillPlot(gevSim(n=1000), plottype = "xi", doplot = FALSE) # Check Type: stopifnot(NCOL(x)==1) x = as.vector(x) # Settings: reverse = FALSE option = match.arg(plottype) data = x # MDA: ordered = rev(sort(data)) ordered = ordered[ordered > 0] n = length(ordered) k = 1:n loggs = log(ordered) avesumlog = cumsum(loggs)/(1:n) xihat = c(NA, (avesumlog - loggs)[2:n]) y = switch(option, alpha = 1/xihat, xi = xihat) ses = y / sqrt(k) x = trunc(seq(from = min(n, length(data)), to = start)) y = y[x] qq <- qnorm(1 - (1 - ci)/2) u <- y + ses[x] * qq l <- y - ses[x] * qq yrange <- range(u, l) if (reverse) index = -x else index = x # Plot: if (doplot) { plot(index, y, ylim = yrange, type = "l", xlab = "", ylab = "", axes = FALSE, ...) 
pos = floor(seq(1, length(index), length = 10)) axis(1, at = index[pos], labels = paste(x[pos]), tick = TRUE) axis(2) threshold = signif(findThreshold(data, x), 3) axis(3, at = index[pos], labels = paste(format(threshold[pos])), tick = TRUE) box() lines(index, u, lty = 2, col = "steelblue") lines(index, l, lty = 2, col = "steelblue") if (labels) { title(xlab = "Order Statistics", ylab = option) mtext("Threshold", side = 3, line = 3) } } # Result: ans = list(x = index, y = y) control = data.frame(plottype = option[1], start = start, ci = ci, reverse = FALSE, row.names = "control") attr(ans, "control") = control # Return Value: if (doplot) return(invisible(ans)) else ans } # ------------------------------------------------------------------------------ shaparmPlot = function(x, p = 0.01*(1:10), xiRange = NULL, alphaRange = NULL, doplot = TRUE, plottype = c("both", "upper")) { # A function written by Diethelm Wuertz # Description: # Displays Pickands, Einmal-Decker-deHaan, and Hill estimators # Example: # par(mfcol=c(3,2)); shaparmPlot(as.timeSeries(data(daxRet))) # shaparmPlot(as.timeSeries(data(daxRet)), doplot = FALSE) # shaparmPlot(as.timeSeries(data(daxRet)), 0.005*(1:20)) # FUNCTION: # Settings: x = as.vector(x) tails = p if (is.null(xiRange)) xiRange = c(-0.5, 1.5) if (is.null(alphaRange)) alphaRange = c(0, 10) plottype = match.arg(plottype) if (plottype == "both") bothTails = TRUE else bothTails = FALSE # Median Plot: index = which.min(abs(tails-median(tails))) DOPLOT = rep(FALSE, length(tails)) DOPLOT[index] = TRUE selected.tail = tails[index] if (!doplot) DOPLOT[index] = FALSE # Which estimator ? 
which = c(TRUE, TRUE, TRUE) # Settings: select.doplot = which ylim1 = xiRange ylim2 = alphaRange z = rep(mean(ylim2), length(tails)) ylim1 = xiRange ylim2 = alphaRange # Estimates: p1 = p2 = h1 = h2 = d1 = d2 = m1 = m2 = rep(0, length(tails)) for ( i in (1:length(tails)) ) { tail = tails[i] # Plotting Staff: if (select.doplot[1]) { xi = shaparmPickands(x, tail, ylim1, doplot = FALSE, plottype = plottype) p1[i] = xi$xi[1] p2[i] = xi$xi[3] } if (select.doplot[2]) { xi = shaparmHill(x, tail, ylim1, doplot = FALSE, plottype = plottype) h1[i] = xi$xi[1] h2[i] = xi$xi[3] } if (select.doplot[3]) { xi = shaparmDEHaan(x, tail, ylim1, doplot = FALSE, plottype = plottype) d1[i] = xi$xi[1] d2[i] = xi$xi[3] } } # Plot Pickands' Summary: if (select.doplot[1] & doplot) { plot (tails, z, type = "n", xlab = "tail depth", ylab = "alpha", ylim = ylim2, main = "Pickands Summary") grid() abline(v = selected.tail, lty = 3) y1 = 1/p1 x1 = tails [y1 > ylim2[1] & y1 < ylim2[2]] y1 = y1[y1 > ylim2[1] & y1 < ylim2[2]] points (x1, y1, col = "steelblue") lines(x1, y1, col = "steelblue") if (bothTails) { y1 = 1/p2 x1 = tails [y1 > ylim2[1] & y1 < ylim2[2]] y1 = y1 [y1 > ylim2[1] & y1 < ylim2[2]] points (x1, y1, col = "brown") lines(x1, y1, col = "brown") } } # Plot Hill Summary: if (select.doplot[2] & doplot) { plot (tails, z, type = "n", xlab = "tail depth", ylab = "alpha", ylim = ylim2, main = "Hill Summary") grid() abline(v = selected.tail, lty = 3) y1 = 1/h1 x1 = tails [y1 > ylim2[1] & y1 < ylim2[2]] y1 = y1 [y1 > ylim2[1] & y1 < ylim2[2]] points (x1, y1, col = "steelblue") lines(x1, y1, col = "steelblue") if (bothTails) { y1 = 1/h2 x1 = tails [y1 > ylim2[1] & y1 < ylim2[2]] y1 = y1 [y1 > ylim2[1] & y1 < ylim2[2]] points (x1, y1, col = "brown") lines(x1, y1, col = "brown") } } # Plot Deckers-Einmahl-deHaan Summary if (select.doplot[3] & doplot) { plot (tails, z, type = "n", xlab = "tail depth", ylab = "alpha", ylim = ylim2, main = "Deckers-Einmahl-deHaan Summary") grid() abline(v = 
selected.tail, lty = 3)
        y1 = 1/d1
        x1 = tails[y1 > ylim2[1] & y1 < ylim2[2]]
        y1 = y1[y1 > ylim2[1] & y1 < ylim2[2]]
        points(x1, y1, col = "steelblue")
        lines(x1, y1, col = "steelblue")
        if (bothTails) {
            y1 = 1/d2
            x1 = tails[y1 > ylim2[1] & y1 < ylim2[2]]
            y1 = y1[y1 > ylim2[1] & y1 < ylim2[2]]
            points(x1, y1, col = "brown")
            lines(x1, y1, col = "brown")
        }
    }

    # Plot Estimates:
    resultUpper = resultLower = NULL
    for ( i in (1:length(tails)) ) {
        tail = tails[i]
        # Plotting Stuff:
        if (select.doplot[1]) {
            xi = shaparmPickands(x, tail, ylim1, doplot = DOPLOT[i],
                plottype = plottype)
            p1[i] = xi$xi[1]
            p2[i] = xi$xi[3]
        }
        if (select.doplot[2]) {
            xi = shaparmHill(x, tail, ylim1, doplot = DOPLOT[i],
                plottype = plottype)
            h1[i] = xi$xi[1]
            h2[i] = xi$xi[3]
        }
        if (select.doplot[3]) {
            xi = shaparmDEHaan(x, tail, ylim1, doplot = DOPLOT[i],
                plottype = plottype)
            d1[i] = xi$xi[1]
            d2[i] = xi$xi[3]
        }
        resultUpper = rbind(resultUpper, c(tails[i], p1[i], h1[i], d1[i]))
        if (bothTails) resultLower = rbind(resultLower,
            c(tails[i], p2[i], h2[i], d2[i]))
    }
    colnames(resultUpper) = c("Upper", "Pickands", "Hill", "DEHaan")
    resultUpper = data.frame(resultUpper)
    if (bothTails) {
        colnames(resultLower) = c("Lower", "Pickands", "Hill", "DEHaan")
        resultLower = data.frame(resultLower)
    }

    # Result:
    ans = list(Upper = resultUpper)
    if (bothTails) ans$Lower = resultLower

    # Return Value:
    if (doplot) return(invisible(ans)) else ans
}


# ------------------------------------------------------------------------------


shaparmPickands =
    function(x, p = 0.05, xiRange = NULL, doplot = TRUE,
    plottype = c("both", "upper"), labels = TRUE, ...)
{ # A function written by Diethelm Wuertz # FUNCTION: # Order Residuals: x = as.vector(x) tail = p if (is.null(xiRange)) xiRange = c(-0.5, 1.5) yrange = xiRange plottype = match.arg(plottype) if (plottype == "both") bothTails = TRUE else bothTails = FALSE ordered1 = rev(sort(abs(x[x < 0]))) if (bothTails) ordered2 = rev(sort(abs(x[x > 0]))) n1 = length(ordered1) if (bothTails) n2 = length(ordered2) ordered1 = ordered1[1:floor(tail*n1)] if (bothTails) ordered2 = ordered2[1:floor(tail*n2)] n1 = length(ordered1) if (bothTails) n2 = length(ordered2) # Pickands Estimate: k1 = 1:(n1%/%4) if (bothTails) k2 = 1:(n2%/%4) pickands1 = log ((c(ordered1[k1])-c(ordered1[2*k1])) / (c(ordered1[2*k1])-c(ordered1[4*k1]))) / log(2) if (bothTails) pickands2 = log ((c(ordered2[k2])-c(ordered2[2*k2])) / (c(ordered2[2*k2])-c(ordered2[4*k2]))) / log(2) # Prepare Plot: y1 = pickands1[pickands1 > yrange[1] & pickands1 < yrange[2]] x1 = log10(1:length(pickands1))[pickands1 > yrange[1] & pickands1 < yrange[2]] if (bothTails) { y2 = pickands2[pickands2 > yrange[1] & pickands2 < yrange[2]] x2 = log10(1:length(pickands2))[pickands2 > yrange[1] & pickands2 < yrange[2]] } # Labels: if (labels) { main = "Pickands Estimator" xlab = "log scale" ylab = "xi" } else { main = xlab = ylab = "" } # Plot: if (doplot) { par(err = -1) plot (x1, y1, xlab = xlab, ylab = ylab, ylim = yrange, main = main, type = "n") title(sub = paste("tail depth:", as.character(tail))) lines(x1, y1, type = "p", pch = 2, col = "steelblue") if (bothTails) lines(x2, y2, type = "p", pch = 6, col = "brown") if (labels) grid() } # Calculate invers "xi": my1 = mean(y1, na.rm = TRUE) if (bothTails) my2 = mean(y2, na.rm = TRUE) sy1 = sqrt(var(y1, na.rm = TRUE)) if (bothTails) sy2 = sqrt(var(y2, na.rm = TRUE)) # Plot: if (doplot) { par(err = -1) lines(c(x1[1], x1[length(x1)]), c(my1,my1), type = "l", lty = 1, col = "steelblue") if (bothTails) lines(c(x2[1], x2[length(x2)]), c(my2, my2), type = "l", lty = 1, col = "brown") } # Result: 
result = list(xi = c(my1, sy1)) if (bothTails) result = list(xi = c(my1, sy1, my2, sy2)) # Return Result: result } # ------------------------------------------------------------------------------ shaparmHill = function(x, p = 0.05, xiRange = NULL, doplot = TRUE, plottype = c("both", "upper"), labels = TRUE, ...) { # A Function written by Diethelm Wuertz # ORDER RESIDUALS: x = as.vector(x) tail = p if (is.null(xiRange)) xiRange = c(-0.5, 1.5) yrange = xiRange plottype = match.arg(plottype) if (plottype == "both") bothTails = TRUE else bothTails = FALSE ordered1 = rev(sort(abs(x[x < 0]))) if (bothTails) ordered2 = rev(sort(abs(x[x > 0]))) n1 = length(ordered1) if (bothTails) n2 = length(ordered2) ordered1 = ordered1[1:floor(tail*n1)] if (bothTails) ordered2 = ordered2[1:floor(tail*n2)] n1 = length(ordered1) if (bothTails) n2 = length(ordered2) # HILLS ESTIMATE: hills1 = c((cumsum(log(ordered1))/(1:n1)-log(ordered1))[2:n1]) if (bothTails) hills2 = c((cumsum(log(ordered2))/(1:n2) - log(ordered2))[2:n2]) # PREPARE PLOT: y1 = hills1[hills1 > yrange[1] & hills1 < yrange[2]] x1 = log10(1:length(hills1))[hills1 > yrange[1] & hills1 < yrange[2]] if (bothTails) { y2 = hills2[hills2 > yrange[1] & hills2 < yrange[2]] x2 = log10(1:length(hills2))[hills2 > yrange[1] & hills2 < yrange[2]] } # Labels: if (labels) { main = "Hill Estimator" xlab = "log scale" ylab = "xi" } else { main = xlab = ylab = "" } # Plot: if (doplot) { par(err = -1) plot (x1, y1, xlab = xlab, ylab = ylab, ylim = yrange, main = main, type="n") if (labels) title(sub = paste("tail depth:", as.character(tail))) lines(x1, y1, type = "p", pch = 2, col = "steelblue") if (bothTails) lines(x2, y2, type = "p", pch = 6, col = "brown") if (labels) grid() } # CALCULATE INVERSE XI: my1 = mean(y1, na.rm = TRUE) if (bothTails) my2 = mean(y2, na.rm = TRUE) sy1 = sqrt(var(y1, na.rm = TRUE)) if (bothTails) sy2 = sqrt(var(y2, na.rm = TRUE)) if (doplot) { par(err=-1) lines(c(x1[1], x1[length(x1)]), c(my1,my1), type="l", lty = 1, 
col = "steelblue") if (bothTails) lines(c(x2[1], x2[length(x2)]), c(my2,my2), type = "l",lty = 1, col = "brown") } # Result: result = list(xi = c(my1, sy1)) if (bothTails) result = list(xi = c(my1, sy1, my2, sy2)) # Return Result: result } # ------------------------------------------------------------------------------ shaparmDEHaan = function(x, p = 0.05, xiRange = NULL, doplot = TRUE, plottype = c("both", "upper"), labels = TRUE, ...) { # A function written by Diethelm Wuertz # ORDER RESIDUALS: x = as.vector(x) tail = p if (is.null(xiRange)) xiRange = c(-0.5, 1.5) yrange = xiRange plottype = match.arg(plottype) if (plottype == "both") bothTails = TRUE else bothTails = FALSE ordered1 = rev(sort(abs(x[x < 0]))) if (bothTails) ordered2 = rev(sort(abs(x[x > 0]))) n1 = length(ordered1) if (bothTails) n2 = length(ordered2) ordered1 = ordered1[1:floor(tail*n1)] if (bothTails) ordered2 = ordered2[1:floor(tail*n2)] n1 = length(ordered1) if (bothTails) n2 = length(ordered2) # DECKERS-EINMAHL-deHAAN ESTIMATE: ns0 = 1 n1m = n1-1; ns1 = ns0; ns1p = ns1+1 bod1 = c( cumsum(log(ordered1))[ns1:n1m]/(ns1:n1m) - log(ordered1)[ns1p:n1] ) bid1 = c( cumsum((log(ordered1))^2)[ns1:n1m]/(ns1:n1m) - 2*cumsum(log(ordered1))[ns1:n1m]*log(ordered1)[ns1p:n1]/(ns1:n1m) + ((log(ordered1))^2)[ns1p:n1] ) dehaan1 = ( 1.0 + bod1 + ( 0.5 / ( bod1^2/bid1 - 1 ) )) if (bothTails) { n2m = n2-1; ns2 = ns0; ns2p = ns2+1 bod2 = c( cumsum(log(ordered2))[ns2:n2m]/(ns2:n2m) - log(ordered2)[ns2p:n2] ) bid2 = c( cumsum((log(ordered2))^2)[ns2:n2m]/(ns2:n2m) - 2*cumsum(log(ordered2))[ns2:n2m]*log(ordered2)[ns2p:n2]/(ns2:n2m) + ((log(ordered2))^2)[ns2p:n2] ) dehaan2 = ( 1.0 + bod2 + ( 0.5 / ( bod2^2/bid2 - 1 ) )) } # PREPARE PLOT: y1 = dehaan1[dehaan1 > yrange[1] & dehaan1 < yrange[2]] x1 = log10(1:length(dehaan1))[dehaan1 > yrange[1] & dehaan1 < yrange[2]] if (bothTails) { y2 = dehaan2[dehaan2 > yrange[1] & dehaan2 < yrange[2]] x2 = log10(1:length(dehaan2))[dehaan2 > yrange[1] & dehaan2 < yrange[2]] } # Labels: 
if (labels) { main = "Deckers - Einmahl - de Haan Estimator" xlab = "log scale" ylab = "xi" } else { main = xlab = ylab = "" } # Plot: if (doplot) { par(err = -1) plot (x1, y1, xlab = xlab, ylab = ylab, ylim = yrange, main = main, type = "n") if (labels) title(sub = paste("tail depth:", as.character(tail))) lines(x1, y1, type = "p", pch = 2, col = "steelblue") if (bothTails) lines(x2, y2, type = "p", pch = 6, col = "brown") if (labels) grid() } # CALCULATE INVERSE XI: my1 = mean(y1, na.rm = TRUE) if (bothTails) my2 = mean(y2, na.rm = TRUE) sy1 = sqrt(var(y1, na.rm = TRUE)) if (bothTails) sy2 = sqrt(var(y2, na.rm = TRUE)) if (doplot) { par(err = -1) lines(c(x1[1], x1[length(x1)]), c(my1,my1), type = "l", lty = 1, col = "steelblue") if (bothTails) lines(c(x2[1], x2[length(x2)]), c(my2, my2), type = "l", lty = 1, col = "brown") } # Result: result = list(xi = c(my1, sy1)) if (bothTails) result = list(xi = c(my1, sy1, my2, sy2)) # Return Result: result } ################################################################################ fExtremes/R/meanExcessPlot.R0000644000176000001440000001142311370220751015521 0ustar ripleyusers # This library is free software; you can redistribute it and/or # modify it under the terms of the GNU Library General Public # License as published by the Free Software Foundation; either # version 2 of the License, or (at your option) any later version. # # This library is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Library General Public License for more details. 
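The Hill estimator that shaparmHill computes above can be illustrated in isolation. This is a minimal standalone sketch (base R, no plotting) of the same per-order-statistic estimate, xi_hat(k) = mean(log of the k largest values) - log(k-th largest value); it is not the package function itself:

```r
# Hill tail-index sketch: for descending order statistics
# x_(1) >= x_(2) >= ..., the estimate at level k is
#   xi_hat(k) = (1/k) * sum_{i=1..k} log x_(i)  -  log x_(k)
# which matches the cumsum(log(ordered))/(1:n) - log(ordered)
# expression used in shaparmHill.
set.seed(42)
x <- abs(rt(1000, df = 3))          # heavy-tailed sample; true xi is about 1/3
ordered <- rev(sort(x))
k <- 2:floor(0.05 * length(x))      # sweep over the upper 5% of the sample
hill <- sapply(k, function(kk)
    mean(log(ordered[1:kk])) - log(ordered[kk]))
mean(hill)                          # rough point estimate of xi
```

Averaging xi_hat(k) over a range of k, as shaparmHill does over the plotted window, trades the strong k-dependence of any single estimate for a more stable summary.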
# # You should have received a copy of the GNU Library General # Public License along with this library; if not, write to the # Free Foundation, Inc., 59 Temple Place, Suite 330, Boston, # MA 02111-1307 USA # Copyrights (C) # for this R-port: # 1999 - 2007, Diethelm Wuertz, GPL # Diethelm Wuertz # info@rmetrics.org # www.rmetrics.org # for the code accessed (or partly included) from other R-ports: # see R's copyright and license files # for the code accessed (or partly included) from contributed R-ports # and other sources # see Rmetrics's copyright file ################################################################################ # FUNCTION: MEAN EXCESS FUNCTION PLOT: # meanExcessPlot Plot mean excesses to a normal/nig/ght density ################################################################################ .meanExcessPlot <- function(x, labels = TRUE, title = FALSE, grid = TRUE, col = "steelblue", ...) { # A function implemented by Diethelm Wuertz # Description: # Plots and fits mean excess function # Arguments: # FUNCTION: # Common Range: DIM = NCOL(x) xRange = yRange = NULL for (i in 1:DIM) { xRange = c(xRange, range(mePlot(-scale(x[, i]), doplot = FALSE)[, 1], na.rm = TRUE)) yRange = c(yRange, range(mePlot(-scale(x[, i]), doplot = FALSE)[, 2], na.rm = TRUE)) } xLim = range(xRange) yLim = c(0, max(yRange)) xPos = min(xLim) + 0.075*diff(xLim) yPos = 0.05*diff(yLim) # Colors: if (length(col) == 1) col = rep(col, times = DIM) # Labels: if (title) { xlab = "Threshold" ylab = "Mean Excess" main = colnames(X) } else { xlab = ylab = main = "" } # Mean Excess: for (i in 1:DIM) { # Scale Tail of Series: X = -scale(x[, i]) if (labels) main = colnames(X) # Normal Fit: me = normMeanExcessFit(X, doplot = TRUE, trace = FALSE, lwd = 2, labels = FALSE, col = col[i], xlim = xLim, ylim = yLim, main = main, xlab = xlab, ylab = ylab, ...) 
normLLH = attr(me, "control")@fit$minimum if (grid) { grid(col = "darkgrey") } if (title) { mtext("Scaled Mean Excess", line = 0.5, cex = 0.7) } # Add 95% and 99% Sample Quantiles: abline(v = quantile(X, 0.95, type = 1), col = "darkgrey") abline(v = quantile(X, 0.99, type = 1), col = "darkgrey") # If Normality rejected, add NIG and GH Student-t: test = jbTest(X)@test$p.value[3] nigLLH = ghtLLH = -9.99e99 if (test == 0) { # NIG Fit: me = nigMeanExcessFit(X, doplot = FALSE, trace = FALSE) lines(me, col = "green", lwd = 2) nigLLH = attr(me, "control")@fit$minimum param = attr(me, "control")@fit$estimate abline(v = qnig(0.95, param[1], param[2], param[3], param[4]), col = "green") abline(v = qnig(0.99, param[1], param[2], param[3], param[4]), col = "green") # GH Student-t Fit: me = ghtMeanExcessFit(X, doplot = FALSE, trace = FALSE) lines(me, col = "red", lwd = 2) ghtLLH = attr(me, "control")@fit$minimum } # Finish: if (title) { LLH = c("NORM", "NIG", "GHT") colorsLLH = c("black", "green", "red") if (test == 0) { mText = paste( "logLLH: NORM = ", signif(normLLH, 5), " | NIG = ", signif(nigLLH, 5), " | GHT = ", signif(ghtLLH, 5), sep = "") mtext(mText, side = 4, adj = 0, col = "darkgrey", cex = 0.7) } else { mText = paste( "logLLH: NORM = ", signif(normLLH, 5), sep = "") mtext(mText, side = 4, adj = 0, col = "darkgrey", cex = 0.7) } indexLLH = which.max(c(normLLH, nigLLH, ghtLLH)) maxLLH = LLH[indexLLH] colLLH = colorsLLH[indexLLH] text(xPos, yPos, maxLLH, col = colLLH) } } # Return Value: invisible() } ################################################################################ fExtremes/R/GevFit.R0000644000176000001440000003247611370220751013766 0ustar ripleyusers # This library is free software; you can redistribute it and/or # modify it under the terms of the GNU Library General Public # License as published by the Free Software Foundation; either # version 2 of the License, or (at your option) any later version. 
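The empirical mean excess function e(u) = E[X - u | X > u] underlying .meanExcessPlot can be computed directly. A minimal sketch follows; the helper name meanExcess is illustrative only, not part of the package API:

```r
# Empirical mean excess: e(u) = mean of (x - u) over the x > u.
# For a GPD tail, e(u) is linear in u with slope xi/(1 - xi), so an
# approximately linear, upward-sloping plot suggests a heavy tail.
meanExcess <- function(x, thresholds) {
    sapply(thresholds, function(u) {
        exceed <- x[x > u]
        if (length(exceed) < 5) return(NA)   # too few exceedances
        mean(exceed - u)
    })
}
set.seed(1)
x <- abs(rt(2000, df = 4))
u <- quantile(x, seq(0.50, 0.98, by = 0.02))
cbind(threshold = u, meanExcess = meanExcess(x, u))
```

.meanExcessPlot goes one step further than this sketch: it overlays parametric mean excess curves (normal, and NIG/GH Student-t when normality is rejected) and compares their log-likelihoods.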
# # This library is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Library General Public License for more details. # # You should have received a copy of the GNU Library General # Public License along with this library; if not, write to the # Free Foundation, Inc., 59 Temple Place, Suite 330, Boston, # MA 02111-1307 USA # Copyrights (C) # for this R-port: # 1999 - 2009, Diethelm Wuertz, GPL # Diethelm Wuertz # info@rmetrics.org # www.rmetrics.org # for the code accessed (or partly included) from other R-ports: # see R's copyright and license files # for the code accessed (or partly included) from contributed R-ports # and other sources # see Rmetrics's copyright file ################################################################################ # FUNCTION: GEV PARAMETER ESTIMATION: # 'fGEVFIT' S4 class representation # gevFit Fits parameters of GEV distribution # gumbelFit Fits parameters of Gumbel distribution # FUNCTION: FOR INTERNAL USE: # .gumpwmFit Fits Gumbel with probability weighted moments # .gevpwmFit Fits GEV with probability weighted moments # .gummleFit Fits Gumbel with max log-likelihood approach # .gumLLH Computes Gumbel log-likelihood function # .gevmleFit Fits GEV with max log-likelihood approach # .gevLLH Computes GEV log-likelihood function ################################################################################ setClass("fGEVFIT", representation( call = "call", method = "character", parameter = "list", data = "list", fit = "list", residuals = "numeric", title = "character", description = "character" ) ) # ------------------------------------------------------------------------------ gevFit = function(x, block = 1, type = c("mle", "pwm"), title = NULL, description = NULL, ...) 
{ # A function implemented by Diethelm Wuertz # Description: # Fits the parameters of GEV distribution # Arguments: # x - an object of class timeSeries # block - an integer value, the block size # type - a character string, which type of method should be used, # max log-likelihood estimation, "mle", or partial weighted # moments estimation, "pwm". # Examples: # gevFit(gevSim()) # par(mfrow = c(2,2)); summary(gevFit(gevSim())) # FUNCTION: # Match Call: call = match.call() # Match Arguments: type = match.arg(type) # Fit: ans = .gevFit(x = x, block = block, type = type, gumbel = FALSE, title = title, description = description, ...) ans@call = call # Return Value: ans } # ------------------------------------------------------------------------------ gumbelFit = function(x, block = 1, type = c("mle", "pwm"), title = NULL, description = NULL, ...) { # A function implemented by Diethelm Wuertz # Description: # Fits the parameters of Gumbel distribution # Arguments: # x - an object of class timeSeries # block - an integer value, the block size # type - a character string, which type of method should be used, # max log-likelihood estimation, "mle", or partial weighted # moments estimation, "pwm". # Examples: # gumbelFit(gumbelSim()) # par(mfrow = c(2,2)); summary(gumbelFit(gumbelSim())) # FUNCTION: # Match Call: call = match.call() # Match Arguments: type = match.arg(type) # Fit: ans = .gevFit(x = x, block = block, type = type, gumbel = TRUE, title = title, description = description, ...) ans@call = call # Return Value: ans } # ------------------------------------------------------------------------------ .gevFit = function(x, block = 1, type = c("mle", "pwm"), gumbel = FALSE, title = NULL, description = NULL, ...) 
{ # A function implemented by Diethelm Wuertz # Description: # Fits parameters to a GEV distribution # Arguments: # x - a numeric vector of Block Maxima # Examples: # fit = gevFit(gevSim(), type = "mle", gumbel = FALSE); print(fit) # fit = gevFit(gevSim(), type = "pwm", gumbel = FALSE); print(fit) # fit = gevFit(gevSim(), type = "mle", gumbel = TRUE); print(fit) # fit = gevFit(gevSim(), type = "pwm", gumbel = TRUE); print(fit) # x = rnorm(500); block = 20; type="mle"; gumbel=FALSE # x = as.ts(rnorm(500)); block = 20; type = "mle"; gumbel = FALSE # x = dummyDailySeries(rnorm(500)); block = 20; type = "mle"; gumbel=FALSE # Note: # Argument named "method is already used for the selection # of the MLE optimization algorithm, therfore we use here # "type". # FUNCTION: # Match Call: call = match.call() # Match Arguments: type = match.arg(type) # Check Type and Convert: X = x xClass = class(x) x = as.timeSeries(x) stopifnot(isUnivariate(x)) # Block Maxima: if (is.numeric(block)) { if (block == 1) { blockmaxima = x Names = paste(1:dim(blockmaxima)[1]) } else { blockmaxima = blockMaxima(x, block, doplot = FALSE) Names = blockmaxima@recordIDs[, 3] } } else { blockmaxima = blockMaxima(x, block, doplot = FALSE) Names = rownames(series(blockmaxima)) } if (xClass == "numeric") { blockmaxima = as.vector(blockmaxima) names(blockmaxima) = Names } if (xClass == "ts") { blockmaxima = as.ts(blockmaxima) names(blockmaxima) = Names } x = as.vector(blockmaxima) # Estimate Parameters: if (gumbel) { # GUMBEL: Add Call and Type if (length(type) > 1) type = type # Probability Weighted Moment Estimation: if (type == "pwm") { fit = .gumpwmFit(data = x, ...) } # Maximum Log Likelihood Estimation: # Use Alexander McNeils EVIS from evir Package ... if (type == "mle") { fit = .gummleFit(data = x, ...) } } else { # GEV: Add Call and Type if (length(type) > 1) type = type # Probability Weighted Moment Estimation: if (type == "pwm") { fit = .gevpwmFit(data = x, ...) 
} # Maximum Log Likelihood Estimation: # Use Alexander McNeils EVIS from evir Package ... if (type == "mle") { fit = .gevmleFit(data = x, ...) } } class(fit) = "list" # Compute Residuals: if (gumbel) { # GUMBEL: xi = 0 beta = fit$par.ests["beta"] mu = fit$par.ests["mu"] residuals = exp( - exp( - (x - mu)/beta)) } else { # GEV: xi = fit$par.ests["xi"] beta = fit$par.ests["beta"] mu = fit$par.ests["mu"] residuals = (1 + (xi * (x - mu))/beta)^(-1/xi) } # Make Unique: fit$llh = fit$nllh.final # Add title and description: if (is.null(title)) { if (gumbel) { title = "Gumbel Parameter Estimation" } else { title = "GEV Parameter Estimation" } } if (is.null(description)) { description = as.character(date()) } # Add Counts to x: # Return Value: new("fGEVFIT", call = match.call(), method = c(if (gumbel) "gum" else "gev", type[1]), parameter = list(block = block, type = type[1], gumbel = gumbel), data = list(x = X, blockmaxima = blockmaxima), fit = fit, residuals = residuals, title = title, description = description) } # ------------------------------------------------------------------------------ .gumpwmFit = function(data, ...) { # A function implemented by Diethelm Wuertz # Description: # Arguments: # FUNCTION: # "Probability Weighted Moment" method. data = as.numeric(data) n = length(data) # Sample Moments: x = rev(sort(data)) lambda = c(mean(x), 0) for (i in 1:n) { weight = (n-i)/(n-1)/n lambda[2] = lambda[2] + weight*x[i] } # Calculate Parameters: xi = 0 beta = lambda[2]/log(2) mu = lambda[1] - 0.5772*beta # Output: fit = list( n = n, data = data, par.ests = c(mu = mu, beta = beta), par.ses = c(mu = NA, beta = NA), varcov = matrix(rep(NA, 4), 2, 2), converged = NA, nllh.final = NA, call = match.call(), selected = "pwm") class(fit) = "gev" # not gumbel! # Return Value: fit } # ------------------------------------------------------------------------------ .gevpwmFit = function(data, block = NA, ...) 
{ # A function implemented by Diethelm Wuertz # Description: # Arguments: # FUNCTION: # Probability Weighted Moment method. data = as.numeric(data) n = length(data) # Internal Function: y = function(x, w0, w1, w2) { (3^x-1)/(2^x-1) - (3*w2 - w0)/(2*w1 - w0) } # Moments: nmom = 3 x = rev(sort(data)) moments = rep(0, nmom) moments[1] = mean(x) n = length(x) for (i in 1:n) { weight = 1/n for (j in 2:nmom) { weight = weight*(n-i-j+2)/(n-j+1) moments[j] = moments[j] + weight*x[i] } } w0 = moments[1] w1 = moments[2] w2 = moments[3] # Parameters: xi = uniroot(f = y, interval = c(-5,+5), w0 = w0, w1 = w1, w2 = w2)$root beta = (2*w1-w0)*xi / gamma(1-xi) / (2^xi-1) mu = w0 + beta*(1-gamma(1-xi))/xi # Output: fit = list( n = n, data = data, par.ests = c(xi = xi, mu = mu, beta = beta), par.ses = c(xi = NA, mu = NA, beta = NA), varcov = matrix(rep(NA, 9), 3, 3), converged = NA, nllh.final = NA, call = match.call(), selected = "pwm") class(fit) = "gev" # Return Value: fit } # ------------------------------------------------------------------------------ .gummleFit = function(data, block = NA, ...) 
{ # A copy from evir # Description: # Arguments: # FUNCTION: # Data: data = as.numeric(data) n = length(data) # Generate EVIR Start Values: # beta0 = sqrt(6 * var(data))/pi # mu0 = mean(data) - 0.57722 * beta0 # theta = c(mu = mu0, beta = beta0) # We use PWM Start Values: theta = .gumpwmFit(data)$par.ests # Fit: fit = optim(theta, .gumLLH, hessian = TRUE, ..., tmp = data) if( fit$convergence) warning("optimization may not have succeeded") par.ests = fit$par varcov = solve(fit$hessian) par.ses = sqrt(diag(varcov)) # Result: ans = list( n = n, data = data, par.ests = par.ests, par.ses = par.ses, varcov = varcov, converged = fit$convergence, nllh.final = fit$value) class(ans) = "gev" # Return Value: ans } # ------------------------------------------------------------------------------ .gumLLH = function(theta, tmp) { # A copy from evir # Description: # Arguments: # FUNCTION: # Gumbel Log-Likelihood: y = (tmp - theta[1])/theta[2] if(theta[2] < 0) { ans = 1.0e+6 } else { term1 = length(tmp) * logb(theta[2]) term2 = sum(y) term3 = sum(exp( - y)) ans = term1 + term2 + term3 } # Return Value: ans } # ------------------------------------------------------------------------------ .gevmleFit = function(data, block = NA, ...) 
{ # A copy from evir # Description: # Arguments: # FUNCTION: # Data: data = as.numeric(data) n = length(data) # EVIR Start Values: beta0 = sqrt(6 * var(data))/pi mu0 = mean(data) - 0.57722 * beta0 xi0 = 0.1 # We use PWM Start Values: theta = .gevpwmFit(data)$par.ests # Fit: fit = optim(theta, .gevLLH, hessian = TRUE, ..., tmp = data) if (fit$convergence) warning("optimization may not have succeeded") par.ests = fit$par varcov = solve(fit$hessian) par.ses = sqrt(diag(varcov)) # Result: ans = list( n = n, data = data, par.ests = par.ests, par.ses = par.ses, varcov = varcov, converged = fit$convergence, nllh.final = fit$value) class(ans) = "gev" # Return Value: ans } # ------------------------------------------------------------------------------ .gevLLH = function(theta, tmp) { # A copy from evir # Description: # Computes log-likelihood for GEV distribution # Arguments: # FUNCTION: # GEV Log-likelihood: y = 1 + (theta[1] * (tmp - theta[2]))/theta[3] if((theta[3] < 0) || (min(y) < 0)) { ans = 1e+06 } else { term1 = length(tmp) * logb(theta[3]) term2 = sum((1 + 1/theta[1]) * logb(y)) term3 = sum(y^(-1/theta[1])) ans = term1 + term2 + term3 } # Return Value: ans } ################################################################################ fExtremes/R/ValueAtRisk.R0000644000176000001440000000726711370220751014774 0ustar ripleyusers # This library is free software; you can redistribute it and/or # modify it under the terms of the GNU Library General Public # License as published by the Free Software Foundation; either # version 2 of the License, or (at your option) any later version. # # This library is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Library General Public License for more details. 
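Typical use of the estimators defined in GevFit.R follows the examples embedded in the function headers above. A short usage sketch (assumes the fExtremes package is attached; parameter values are illustrative):

```r
library(fExtremes)
set.seed(123)

# Simulate GEV block maxima and fit by maximum likelihood and by
# probability weighted moments; both should roughly recover the
# parameters xi, mu, beta.
x <- gevSim(model = list(xi = 0.25, mu = 0, beta = 1), n = 1000)
fitMLE <- gevFit(x, type = "mle")
fitPWM <- gevFit(x, type = "pwm")
fitMLE@fit$par.ests     # named estimates: xi, mu, beta
fitPWM@fit$par.ests     # PWM gives no standard errors (par.ses are NA)

# The Gumbel special case (xi = 0) has its own wrapper:
fitGum <- gumbelFit(gumbelSim(n = 1000))
summary(fitGum)
```

Note that .gummleFit and .gevmleFit seed the optimizer with the PWM estimates rather than the evir moment-based start values, so the two types are complementary: PWM is closed-form and robust, MLE refines it and supplies a covariance matrix.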
# # You should have received a copy of the GNU Library General # Public License along with this library; if not, write to the # Free Foundation, Inc., 59 Temple Place, Suite 330, Boston, # MA 02111-1307 USA # Copyrights (C) # for this R-port: # 1999 - 2007, Diethelm Wuertz, GPL # Diethelm Wuertz # info@rmetrics.org # www.rmetrics.org # for the code accessed (or partly included) from other R-ports: # see R's copyright and license files # for the code accessed (or partly included) from contributed R-ports # and other sources # see Rmetrics's copyright file ################################################################################ # FUNCTION: DESCRIPTION: # VaR Computes Value-at-Risk # CVaR Computes Conditional Value-at-Risk ################################################################################ VaR = function(x, alpha = 0.05, type = "sample", tail = c("lower", "upper")) { # A function implemented by Diethelm Wuertz # Description: # Computes Value-at-Risk # Arguments: # x - an uni- or multivariate timeSeries object # alpha - a numeric value, the confidence interval # type - a character string, the type to calculate the value-at-risk # tail - a character string denoting which tail will be # considered, either \code{"lower"} or \code{"upper"}. # If \code{tail="lower"}, then alpha will be converted to # \code{alpha=1-alpha}. # FUNCTION: # Settings: x = as.matrix(x) tail = match.arg(tail) # Value-at-Risk: if (type == "sample") { if (tail == "upper") alpha = 1-alpha # Important: use type=1 ! 
VaR = quantile(x, probs = alpha, type = 1) } else if (type == "gpd") { VaR = "Not yet Implemented" } else if (type == "obre") { VaR = "Not yet Implemented" } # Return Value: VaR } # ------------------------------------------------------------------------------ CVaR = function(x, alpha = 0.05, type = "sample", tail = c("lower", "upper")) { # A function implemented by Diethelm Wuertz # Description: # Computes Conditional Value-at-Risk # Arguments: # x - an uni- or multivariate timeSeries object # alpha - a numeric value, the confidence interval # type - a character string, the type to calculate the value-at-risk # tail - a character string denoting which tail will be considered, # either "lower" or upper", if tail="lower", then alpha will be # converted to alpha=1-alpha. # FUNCTION: # Settings: x = as.matrix(x) tail = match.arg(tail) # Sample VaR: VaR = VaR(x, alpha, type, tail) # Sample CVaR: if (tail == "upper") alpha = 1-alpha if (type == "sample") { CVaR = NULL for (i in 1:ncol(x)) { X = as.vector(x[, i]) CVaR = c(CVaR, VaR[i] - 0.5 * mean(((VaR[i]-X) + abs(VaR[i]-X))) / alpha ) } } # Return Value: CVaR } ################################################################################ fExtremes/R/zzz.R0000644000176000001440000000346512157313044013435 0ustar ripleyusers # This library is free software; you can redistribute it and/or # modify it under the terms of the GNU Library General Public # License as published by the Free Software Foundation; either # version 2 of the License, or (at your option) any later version. # # This library is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Library General Public License for more details. 
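The sample-quantile definitions used by VaR() and CVaR() above can be checked by hand. A minimal base-R sketch for the lower tail (the plain tail average below agrees with the Rockafellar–Uryasev form used in CVaR() up to discreteness of the empirical quantile):

```r
# Lower-tail sample VaR is the empirical alpha-quantile (type = 1),
# and sample CVaR (expected shortfall) averages the losses beyond it.
set.seed(7)
returns <- rnorm(10000, mean = 0, sd = 0.01)
alpha <- 0.05

VaR.sample  <- quantile(returns, probs = alpha, type = 1)
CVaR.sample <- mean(returns[returns <= VaR.sample])

# For a normal sample these should be close to the theoretical values
# sigma * qnorm(alpha) and -sigma * dnorm(qnorm(alpha)) / alpha:
c(VaR = VaR.sample, CVaR = CVaR.sample)
c(0.01 * qnorm(alpha), -0.01 * dnorm(qnorm(alpha)) / alpha)
```

The type = 1 quantile (inverse of the empirical distribution function, no interpolation) matters here, as the "Important: use type=1" comment in VaR() stresses: it guarantees the reported VaR is an actually observed data point.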
# # You should have received a copy of the GNU Library General # Public License along with this library; if not, write to the # Free Foundation, Inc., 59 Temple Place, Suite 330, Boston, # MA 02111-1307 USA # Copyrights (C) # for this R-port: # 1999 - 2008, Diethelm Wuertz, Rmetrics Foundation, GPL # Diethelm Wuertz # info@rmetrics.org # www.rmetrics.org # for the code accessed (or partly included) from other R-ports: # see R's copyright and license files # for the code accessed (or partly included) from contributed R-ports # and other sources # see Rmetrics's copyright file ################################################################################ ## .First.lib = ## function(lib, pkg) ## { ## # Startup Mesage and Desription: ## MSG <- if(getRversion() >= "2.5") packageStartupMessage else message ## dsc <- packageDescription(pkg) ## if(interactive() || getOption("verbose")) { ## # not in test scripts ## MSG(sprintf("Rmetrics Package %s (%s) loaded.", pkg, dsc$Version)) ## } ## # Load dll: ## # library.dynam("fExtremes", pkg, lib) ## } if(!exists("Sys.setenv", mode = "function")) # pre R-2.5.0, use "old form" Sys.setenv <- Sys.putenv ################################################################################ fExtremes/R/DataPreprocessing.R0000644000176000001440000002172611370220751016213 0ustar ripleyusers # This library is free software; you can redistribute it and/or # modify it under the terms of the GNU Library General Public # License as published by the Free Software Foundation; either # version 2 of the License, or (at your option) any later version. # # This library is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Library General Public License for more details. 
# # You should have received a copy of the GNU Library General # Public License along with this library; if not, write to the # Free Foundation, Inc., 59 Temple Place, Suite 330, Boston, # MA 02111-1307 USA # Copyrights (C) # for this R-port: # 1999 - 2007, Diethelm Wuertz, GPL # Diethelm Wuertz # info@rmetrics.org # www.rmetrics.org # for the code accessed (or partly included) from other R-ports: # see R's copyright and license files # for the code accessed (or partly included) from contributed R-ports # and other sources # see Rmetrics's copyright file ################################################################################ # FUNCTIONé DESCRIPTION: # blockMaxima Returns block maxima from a time series # findThreshold Upper threshold for a given number of extremes # pointProcess Returns peaks over a threshold from a time series # deCluster Declusters a point process ################################################################################ blockMaxima <- function (x, block = c("monthly", "quarterly"), doplot = FALSE) { # A function implemented by Diethelm Wuertz # Description: # Compute block maxima from a time series or numeric vector # Arguments: # x - an univariate 'timeSeries' object or any other object # which can be coerced in a numeric vector by the function # as.vector(). # block - block size, either a "monthl" or "quarterly" # calendrical block, or an integer value, specifying the # length of the block. # Note: # This function was implemented for daily recorded data sets. 
# Example: # data(bmwRet) # blockMaxima(bmwRet, 200) # data(bmwRet); x = bmwRet[5100:5280, ]; x; block = "monthly" # FUNCTION: # Check Type: if (class(x) == "timeSeries") { stopifnot(isUnivariate(x)) } else { x = as.vector(x) stopifnot(is.numeric(block[1])) } # Maxima: if (is.numeric(block[1])) { block = block[1] } else { block = match.arg(block) } if (is(x, "timeSeries")) { if (is.numeric(block)) { from = blockStart(time(x), block = block) to = blockEnd(time(x), block = block) } else if (block == "monthly") { from = unique(timeFirstDayInMonth(time(x))) to = unique(timeLastDayInMonth(time(x))) } else if (block == "quarterly") { from = unique(timeFirstDayInQuarter(time(x))) to = unique(timeLastDayInQuarter(time(x))) } else { stop("Unknown block size for timeSeries Object") } maxValue = applySeries(x, from, to, FUN = max) maxIndex = as.matrix(applySeries(x, from, to, FUN = which.max)) toIndex = as.matrix(applySeries(x, from, to, FUN = length)) # maxPosition = rownames(series(x))[cumsum(toIndex)-toIndex+maxIndex-1] maxPosition = rownames(series(x))[cumsum(toIndex)-toIndex+maxIndex] # Add Attributes: Update rownames, colnames and recordIDs rownames(maxValue) <- as.character(maxPosition) colnames(maxValue) <- paste("max.", x@units, sep = "") maxValue@recordIDs = data.frame( from = as.character(from), to = as.character(to), cumsum(toIndex)-toIndex+maxIndex ) } else { if (is.numeric(block)) { data = as.vector(x) nblocks = (length(data) %/% block) + 1 grouping = rep(1:nblocks, rep(block, nblocks))[1:length(data)] maxValue = as.vector(tapply(data, grouping, FUN = max)) maxIndex = as.vector(tapply(as.vector(data), grouping, FUN = which.max)) names(maxValue) = paste(maxIndex) } else { stop("For non-timeSeries Objects blocks must be numeric") } } if (doplot) { plot(maxValue, type = "h", col = "steelblue", main = "Block Maxima") grid() } # Return Value: maxValue } # ------------------------------------------------------------------------------ findThreshold = function(x, n = 
floor(0.05*length(as.vector(x))), doplot = FALSE) { # A function implemented by Diethelm Wuertz # Description: # Upper threshold for a given number of extremes # Arguments: # x - an univariate 'timeSeries' object or any other object # which can be coerced in a numeric vector by the function # as.vector(). # n - a numeric value giving number of extremes # above the threshold, by default 5%. # Example: # findThreshold(x = as.timeSeries(data(bmwRet)), # n = floor(c(0.05, 0.10)*length(as.vector(x)))) # FUNCTION: # Check Type: if (class(x) == "timeSeries") { stopifnot(isUnivariate(x)) } else { x = as.vector(x) } # Threshold: X = rev(sort(as.vector(x))) thresholds = unique(X) indices = match(X[n], thresholds) indices = pmin(indices + 1, length(thresholds)) # Result: ans = thresholds[indices] names(ans) = paste("n=", as.character(n), sep = "") # Plot: if (doplot) { plot(x, type = "h", col = "steelblue", main = "Threshold Value") grid() rug(as.vector(x), ticksize = 0.01, side = 4) for (u in ans) abline (h = u, lty = 3, col = "red") } # Return Value: ans } # ------------------------------------------------------------------------------ pointProcess = function(x, u = quantile(x, 0.95), doplot = FALSE) { # A function implemented by Diethelm Wuertz # Description: # Returns peaks over a threshold from a time series # Arguments: # x - an univariate 'timeSeries' object or any other object # which can be coerced in a numeric vector by the function # as.vector(). 
# u - threshold value # Examples: # pointProcess(as.timeSeries(data(daxRet))) # Point Process: CLASS = class(x) if (CLASS == "timeSeries") { stopifnot(isUnivariate(x)) X = x[x > u,] } else { X = as.vector(x) X = X[X > u] N = length(x) IDX = (1:N)[x > u] attr(X, "index") <- IDX } # Plot: if (doplot) { if (CLASS == "timeSeries") { plot(X, type = "h", xlab = "Series") } else { plot(IDX, X, type = "h", xlab = "Series") } mText = paste("Threshold =", u, "| N =", length(as.vector(X))) mtext(mText, side = 4, cex = 0.7, col = "grey") abline(h = u, lty = 3, col = "red") title(main = "Point Process") grid() } # Return Value: X } # ------------------------------------------------------------------------------ deCluster = function(x, run = 20, doplot = TRUE) { # A function implemented by Diethelm Wuertz # Description: # Decluster a Point Process. # Arguments: # x - an univariate 'timeSeries' object # Example: # deCluster(pointProcess(as.timeSeries(daxRet))) # FUNCTION: # Check: stopifnot(class(x) == "timeSeries") stopifnot(isUnivariate(x)) # Decluster time Series: positions = time(x) data = series(x) gapLengths = c(0, diff(positions)) # / (24*3600) clusterNumbers = cumsum(gapLengths > run) + 1 N = length(data) fromIndex = (1:N)[c(1, diff(clusterNumbers)) == 1] toIndex = c(fromIndex[-1]-1, N) from = positions[fromIndex] to = positions[toIndex] # Maximum Values: maxValue = applySeries(x, from, to, FUN = max) maxIndex = as.matrix(applySeries(x, from, to, FUN = which.max)) lengthIndex = as.matrix(applySeries(x, from, to, FUN = length)) maxPosition = rownames(series(x))[cumsum(lengthIndex)-lengthIndex+maxIndex] # Add Attributes: Update rownames, colnames and recordIDs rownames(maxValue) = rownames(maxValue) = as.character(maxPosition) colnames(maxValue) = colnames(maxValue) = paste("max.", x@units, sep = "") maxValue@recordIDs = data.frame( from = as.character(from), to = as.character(to) ) # Plot: if (doplot) { plot(maxValue, type = "h", xlab = "Series") title(main = "Declustered 
Point Process") mText = paste("Run Length =", run, "| N =", length(as.vector(maxValue))) mtext(mText, side = 4, cex = 0.7, col = "grey") abline(h = min(as.vector(maxValue)), lty = 3, col = "red") grid() } # Return Value: maxValue } ################################################################################ fExtremes/R/GpdSummary.R0000644000176000001440000000522111370220751014656 0ustar ripleyusers # This library is free software; you can redistribute it and/or # modify it under the terms of the GNU Library General Public # License as published by the Free Software Foundation; either # version 2 of the License, or (at your option) any later version. # # This library is distributed in the hope that it will be useful, # but WITHOUT ANY WARRANTY; without even the implied warranty of # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the # GNU Library General Public License for more details. # # You should have received a copy of the GNU Library General # Public License along with this library; if not, write to the # Free Foundation, Inc., 59 Temple Place, Suite 330, Boston, # MA 02111-1307 USA # Copyrights (C) # for this R-port: # 1999 - 2007, Diethelm Wuertz, GPL # Diethelm Wuertz # info@rmetrics.org # www.rmetrics.org # for the code accessed (or partly included) from other R-ports: # see R's copyright and license files # for the code accessed (or partly included) from contributed R-ports # and other sources # see Rmetrics's copyright file ################################################################################ # METHODS: PRINT, PLOT, AND SUMMARY: # summary.fGPDFIT S3 Summary Method for object of class "fGPDFIT" ################################################################################ summary.fGPDFIT = function(object, doplot = TRUE, which = "all", ...) 
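The preprocessing steps in DataPreprocessing.R are typically chained: extract block maxima for a GEV fit, or extract exceedances with findThreshold()/pointProcess() and thin them with deCluster() before a GPD fit. A sketch of that pipeline on the BMW returns data shipped with the package, following the examples in the function headers (assumes fExtremes is attached):

```r
library(fExtremes)

# Daily BMW log returns shipped with the package:
x <- as.timeSeries(data(bmwRet))

# Monthly block maxima, e.g. as input for gevFit():
bm <- blockMaxima(x, block = "monthly", doplot = FALSE)

# Threshold leaving the largest 5% of observations above it:
u <- findThreshold(x, n = floor(0.05 * length(as.vector(x))))

# Exceedances over u, then declustered with a run length of 20,
# e.g. as input for gpdFit():
pp <- pointProcess(x, u = as.numeric(u), doplot = FALSE)
dc <- deCluster(pp, run = 20, doplot = FALSE)
```

Declustering matters because exceedances of financial returns arrive in volatility clusters; keeping only each cluster's maximum brings the data closer to the independence assumption behind the GPD likelihood.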
{ # A function written by Diethelm Wuertz # Description: # Summary method for objects of class "gpdFit" # Arguments: # FUNCTION: # Title: cat("\nTitle:\n" , object@title, "\n") # Function Call: cat("\nCall:\n") cat(paste(deparse(object@call), sep = "\n", collapse = "\n"), "\n", sep = "") # Estimation Type: cat("\nEstimation Type:\n ", object@method, "\n") # Estimated Parameters: cat("\nEstimated Parameters:\n") print(object@fit$par.ests) # Summary - For MLE print additionally: if (object@method[2] == "mle") { cat("\nStandard Deviations:\n"); print(object@fit$par.ses) if (!is.na(object@fit$llh)) cat("\nLog-Likelihood Value:\n ", object@fit$llh, "\n") if (!is.na(object@fit$convergence)) cat("\nType of Convergence:\n ", object@fit$convergence, "\n") } # Plot: if (doplot) { plot(object, which = which, ...) } # Desription: cat("\nDescription\n ", object@description, "\n\n") # Return Value: invisible(object) } ################################################################################ fExtremes/MD50000644000176000001440000000544312254146514012566 0ustar ripleyusers672b53fc80213aafdfdd007853a5421c *ChangeLog 3987f8cfc80322f61583909a7451f2c8 *DESCRIPTION 64c4693dbfe914d62b556004d72de71a *NAMESPACE cd387c13bac92b586162a5059ea2dbee *R/DataPreprocessing.R fe5bf844eb0bc3441061eb801aa1fbf7 *R/ExtremeIndex.R f1cb59a33d956d21bbc91cbc569092b7 *R/ExtremesData.R 7d5db3590dcec86b250070e66da49d87 *R/GevDistribution.R 38508e385c29a2f84000efa6c6ca5007 *R/GevFit.R 5cc1498e266d7ffcc92057e5b357f26d *R/GevMdaEstimation.R d799ea168869bdd87cd8eed58422679e *R/GevPrintPlotSummary.R 71af4bfffaf98ca07023825eebe7685f *R/GevRisk.R d593fa76a1e6f6e5f71b1389a5b15dba *R/GevSim.R 783b89d266aedf7e3af32000c76fe18e *R/GpdDistribution.R c798700713a81b4cf7a27b37010022f1 *R/GpdFit.R 731f08df1253dc0dfff3a0095941031c *R/GpdPlot.R 7cba287ce4ef0ca866c260fbcdbec9c0 *R/GpdRisk.R 78b2f292ae9ac4d18eeee6d5fb161481 *R/GpdSim.R 11aba9bc6b7051e535c17b1629559dc1 *R/GpdSow.R 11b69cb92329efc96d03615d481850b3 
*R/GpdSummary.R 4be5dd80c4b45ecfd74b01aac992049a *R/MeanExcessFit.R eedf28bb1b693bfc23cf90a84c9cbdfb *R/ValueAtRisk.R 69559d2b10e05416534c7be296325501 *R/meanExcessPlot.R 2e2118e30bc041ead36e0d87c27c38de *R/metrics.R 1aa4c307071784e345d62d7429091974 *R/zzz.R bb453207f5e2fb79f844c9816db5d2d8 *data/bmwRet.csv.gz 71b678e389195b7b3921e3c000b5436b *data/danishClaims.csv.gz 6042b9c5e5bec3ecc1b6959cd2858b64 *inst/COPYRIGHT.html d1001bf398f94e5ca67bceac26317ce3 *inst/unitTests/Makefile ef6862244177aa0c9690e037b2823bbd *inst/unitTests/runTests.R f0e530000f96bc971e3fb89a25272c17 *inst/unitTests/runit.DataPreprocessing.R 3d8fa2217c7cdf803edc8cb3f531ad9c *inst/unitTests/runit.ExtremeIndex.R 5c4d0101ecf88db0530fd53f5c2f7b32 *inst/unitTests/runit.ExtremesData.R df54978979460d89018b2c7e8d9ed82e *inst/unitTests/runit.GevDistribution.R aeed009908a914715e5d4947d8c4e098 *inst/unitTests/runit.GevMdaEstimation.R 2cb05c2f27012611715b74e9daf89ba0 *inst/unitTests/runit.GevModelling.R 553d134fc982fe578b1c4013b728a50e *inst/unitTests/runit.GevRisk.R 9f0952a7794071690346f36d017c8c7f *inst/unitTests/runit.GpdDistribution.R 9ccbeecb2cd8580869caac1b8dae64c6 *inst/unitTests/runit.GpdModelling.R 69a26b861bd2d71074818864bd74029a *inst/unitTests/runit.GpdRisk.R e32d5a7ef90d5460ad2490e8e51bdc5e *man/DataPreprocessing.Rd 889ac9f50477b81bd1e5f26e54f08e47 *man/ExtremeIndex.Rd 053bd8de255d74a0c1dfd70121968a33 *man/ExtremesData.Rd 92957faa1992251453a248bca84052bc *man/GevDistribution.Rd 45054a04d74c5dc6bb8fe4d063716779 *man/GevMdaEstimation.Rd 2cb16bcd6748e1b232bde82f6dc7ae0a *man/GevModelling.Rd 1733b69b25decb4ff580afbd698f235f *man/GevRisk.Rd fbb6df0146b8d12bcd1a9328cfc323f4 *man/GpdDistribution.Rd c652b0dc4bf2b3950578b0d7b4f73efc *man/GpdModelling.Rd 0aa223d959e41d950bd2ce2e57d180e8 *man/GpdRisk.Rd 79605e94b04f8ac2e743bd30d4d44e58 *man/ValueAtRisk.Rd 246ce117b00d8707bef1634cb9768a5a *man/data.Rd ca566e590ec30abd0718c5375e1a446f *tests/doRUnit.R 
fExtremes/DESCRIPTION0000644000176000001440000000146112254146514013760 0ustar ripleyusersPackage: fExtremes Version: 3010.81 Revision: 5532 Date: 2013-12-17 Title: Rmetrics - Extreme Financial Market Data Author: Diethelm Wuertz and many others, see the SOURCE file Depends: R (>= 2.4.0), timeDate, timeSeries, fBasics, fGarch, fTrading Imports: methods Suggests: RUnit, tcltk Maintainer: Yohan Chalabi Description: Environment for teaching "Financial Engineering and Computational Finance" Note: Several parts are still preliminary and may be changed in the future. this typically includes function and argument names, as well as defaults for arguments and return values. LazyData: yes License: GPL (>= 2) URL: http://www.rmetrics.org Packaged: 2013-12-17 20:40:23 UTC; yohan NeedsCompilation: no Repository: CRAN Date/Publication: 2013-12-17 23:16:44 fExtremes/ChangeLog0000644000176000001440000000377712254133267014041 0ustar ripleyusers2013-12-10 chalabi * DESCRIPTION: 2013-04-30 chalabi * ChangeLog, DESCRIPTION: Updated ChangeLog and DSC files * DESCRIPTION, inst/unitTests/runit.GevModelling.R: Updating unit tests 2013-04-02 chalabi * R/zzz.R: removed .First.lib() * ChangeLog, DESCRIPTION: Updated ChangeLog and DESC files * DESCRIPTION: Updated version number. * inst/unitTests/runit.GevModelling.R: Updated unit test to avoid troubles when running in the last two days of the month as reported by Brian Ripley. 2012-12-01 chalabi * ChangeLog, DESCRIPTION: Updated ChangeLog and DESCRIPTION files. 2012-11-30 chalabi * DESCRIPTION: Updated version number. * NAMESPACE: Added NAMESPACE * DESCRIPTION: Updated maintainer field. * R/ExtremeIndex.R, R/GevMdaEstimation.R, R/GpdRisk.R: Fixed partial argument match. * inst/unitTests/runit.GevModelling.R: Updated unit test to avoid troubles when running in the last two days of the month as reported by Brian Ripley. 
2011-09-23 mmaechler * DESCRIPTION: remove deprecated "LazyLoad" entry 2010-07-23 chalabi * inst/DocCopying.pdf: removed DocCopying.pdf license is already specified in DESCRIPTION file 2009-10-01 chalabi * DESCRIPTION: updated version number 2009-09-29 chalabi * ChangeLog, DESCRIPTION: updated DESC and ChangeLog 2009-05-21 chalabi * R/GevFit.R: assignment beta = fit$par.ests["xi"] should read : beta = fit$par.ests["beta"]. (Dodzi Attimu) 2009-04-09 wuertz * R/DataPreprocessing.R, R/GevDistribution.R, R/GevFit.R, R/GevPrintPlotSummary.R, R/GevSim.R, R/GpdDistribution.R, R/GpdSim.R, R/GpdSow.R, R/MeanExcessFit.R, R/ValueAtRisk.R, man/DataPreprocessing.Rd, man/GevDistribution.Rd, man/GevModelling.Rd: The *Sim functions now return signal timeSeries, and some smaller beautifies 2009-04-02 chalabi * DESCRIPTION: more explicit depends and suggests field in DESC file. 2009-04-01 chalabi * DESCRIPTION: updated DESC file 2009-01-28 chalabi * man/ExtremeIndex.Rd: updated manual pages to new Rd parser fExtremes/man/0000755000176000001440000000000012251673345013027 5ustar ripleyusersfExtremes/man/GpdModelling.Rd0000644000176000001440000002260011370220751015651 0ustar ripleyusers\name{GpdModelling} \alias{GpdModelling} \alias{fGPDFIT} \alias{fGPDFIT-class} \alias{show,fGPDFIT-method} \alias{gpdSim} \alias{gpdFit} \alias{plot.fGPDFIT} \alias{summary.fGPDFIT} \title{GPD Distributions for Extreme Value Theory} \description{ A collection and description of functions to compute the generalized Pareto distribution and to estimate its parameters. The functions compute density, distribution function, quantile function and generate random deviates for the GPD. Two approaches for parameter estimation are provided: Maximum likelihood estimation and the probability weighted moment method.
\cr The GPD modelling functions are: \tabular{ll}{ \code{gpdSim} \tab generates data from the GPD, \cr \code{gpdFit} \tab fits empirical or simulated data to the distribution, \cr \code{print} \tab print method for a fitted GPD object of class ..., \cr \code{plot} \tab plot method for a fitted GPD object, \cr \code{summary} \tab summary method for a fitted GPD object. } } \usage{ gpdSim(model = list(xi = 0.25, mu = 0, beta = 1), n = 1000, seed = NULL) gpdFit(x, u = quantile(x, 0.95), type = c("mle", "pwm"), information = c("observed", "expected"), title = NULL, description = NULL, \dots) \S4method{show}{fGPDFIT}(object) \method{plot}{fGPDFIT}(x, which = "ask", \dots) \method{summary}{fGPDFIT}(object, doplot = TRUE, which = "all", \dots) } \arguments{ \item{description}{ a character string which allows for a brief description. } \item{doplot}{ a logical. Should the results be plotted? } \item{information}{ whether standard errors should be calculated with \code{"observed"} or \code{"expected"} information. This only applies to the maximum likelihood method; for the probability-weighted moments method \code{"expected"} information is used if possible. } \item{model}{ [gpdSim] - \cr a list with components \code{xi}, \code{mu} and \code{beta} giving the shape, location and scale parameters of the GPD distribution, as in the usage above. By default the shape parameter has the value 0.25, the location is zero and the scale is one.} \item{n}{ [rgpd][gpdSim] - \cr the number of observations to be generated. } \item{object}{ [summary] - \cr a fitted object of class \code{"gpdFit"}. } \item{seed}{ [gpdSim] - \cr an integer value to set the seed for the random number generator. } \item{title}{ a character string which allows for a project title. } \item{type}{ a character string selecting the desired estimation method, either \code{"mle"} for the maximum likelihood method or \code{"pwm"} for the probability weighted moment method. By default, the first will be selected.
Note, the function \code{gpd} uses \code{"ml"}. } \item{u}{ the threshold value. } \item{which}{ if \code{which} is set to \code{"ask"} the function will interactively ask which plot should be displayed. By default this value is set to \code{FALSE} and then those plots will be displayed for which the elements in the logical vector \code{which} are set to \code{TRUE}; by default all four elements are set to \code{"all"}. } \item{x}{ [dgpd] - \cr a numeric vector of quantiles. \cr [gpdFit] - \cr the data vector. Note, there are two different names for the first argument, \code{x} and \code{data}, depending on which function name is used, either \code{gpdFit} or the EVIS synonym \code{gpd}. \cr [print][plot] - \cr a fitted object of class \code{"gpdFit"}. } \item{xi, mu, beta}{ \code{xi} is the shape parameter, \code{mu} the location parameter, and \code{beta} is the scale parameter. } \item{\dots}{ control parameters and plot parameters optionally passed to the optimization and/or plot function. Parameters for the optimization function are passed to components of the \code{control} argument of \code{optim}. } } \value{ \code{gpdSim} \cr returns a vector of data points from the simulated series. \code{gpdFit} \cr returns an object of class \code{"gpd"} describing the fit, including parameter estimates and standard errors. \code{gpdQuantPlot} \cr returns invisibly a table of results. \code{gpdShapePlot} \cr returns invisibly a table of results. \code{gpdTailPlot} \cr returns invisibly a list object containing details of the plot. This object should be used as the first argument of \code{gpdqPlot} or \code{gpdsfallPlot} to add quantile estimates or expected shortfall estimates to the plot. } \details{ \bold{Generalized Pareto Distribution:} \cr\cr Compute density, distribution function, quantile function and generate random variates for the Generalized Pareto Distribution.
\bold{Simulation:} \cr\cr \code{gpdSim} simulates data from a Generalized Pareto distribution. \cr \bold{Parameter Estimation:} \cr\cr \code{gpdFit} fits the model parameters either by the probability weighted moment method or by the maximum log likelihood method. The function returns an object of class \code{"gpd"} representing the fit of a generalized Pareto model to excesses over a high threshold. The fitting functions use the probability weighted moment method if \code{method="pwm"} was selected, and the general purpose optimization function \code{optim} when the maximum likelihood estimation, \code{method="mle"} or \code{method="ml"}, is chosen. \cr \bold{Methods:} \cr\cr \code{print.gpd}, \code{plot.gpd} and \code{summary.gpd} are print, plot, and summary methods for a fitted object of class \code{gpdFit}. The plot method provides four different plots for assessing the fitted GPD model. \cr \bold{gpd* Functions:} \cr\cr \code{gpdqPlot} calculates quantile estimates and confidence intervals for high quantiles above the threshold in a GPD analysis, and adds a graphical representation to an existing plot. The GPD approximation in the tail is used to estimate the quantile. The \code{"wald"} method uses the observed Fisher information matrix to calculate the confidence interval. The \code{"likelihood"} method reparametrizes the likelihood in terms of the unknown quantile and uses profile likelihood arguments to construct a confidence interval. \cr \code{gpdquantPlot} creates a plot showing how the estimate of a high quantile in the tail of a dataset based on the GPD approximation varies with threshold or number of extremes. For every model \code{gpdFit} is called. Evaluation may be slow. Confidence intervals by the Wald method may be fastest. \cr \code{gpdriskmeasures} makes a rapid calculation of point estimates of prescribed quantiles and expected shortfalls using the output of the function \code{gpdFit}.
This function simply calculates point estimates and (at present) makes no attempt to calculate confidence intervals for the risk measures. If confidence levels are required use \code{gpdqPlot} and \code{gpdsfallPlot} which interact with graphs of the tail of a loss distribution and are much slower. \cr \code{gpdsfallPlot} calculates expected shortfall estimates, in other words tail conditional expectation and confidence intervals for high quantiles above the threshold in a GPD analysis. A graphical representation to an existing plot is added. Expected shortfall is the expected size of the loss, given that a particular quantile of the loss distribution is exceeded. The GPD approximation in the tail is used to estimate expected shortfall. The likelihood is reparametrised in terms of the unknown expected shortfall and profile likelihood arguments are used to construct a confidence interval. \cr \code{gpdshapePlot} creates a plot showing how the estimate of shape varies with threshold or number of extremes. For every model \code{gpdFit} is called. Evaluation may be slow. \cr \code{gpdtailPlot} produces a plot of the tail of the underlying distribution of the data. } \author{ Alec Stephenson for the functions from R's \code{evd} package, \cr Alec Stephenson for the functions from R's \code{evir} package, \cr Alexander McNeil for the EVIS functions underlying the \code{evir} package, \cr Diethelm Wuertz for this \R-port. } \references{ Embrechts, P., Klueppelberg, C., Mikosch, T. (1997); \emph{Modelling Extremal Events}, Springer. Hosking J.R.M., Wallis J.R., (1987); \emph{Parameter and quantile estimation for the generalized Pareto distribution}, Technometrics 29, 339--349. 
} \examples{ ## gpdSim - x = gpdSim(model = list(xi = 0.25, mu = 0, beta = 1), n = 1000) ## gpdFit - par(mfrow = c(2, 2), cex = 0.7) fit = gpdFit(x, u = min(x), type = "pwm") print(fit) summary(fit) } \keyword{distribution} fExtremes/man/GevMdaEstimation.Rd0000644000176000001440000002052711370220751016512 0ustar ripleyusers\name{GevMdaEstimation} \alias{GevMdaEstimation} \alias{hillPlot} \alias{shaparmPlot} \alias{shaparmPickands} \alias{shaparmHill} \alias{shaparmDEHaan} \title{Generalized Extreme Value Modelling} \description{ A collection and description of functions to estimate the parameters of the GEV distribution. To model the GEV three types of approaches for parameter estimation are provided: Maximum likelihood estimation, the probability weighted moment method, and estimation by the MDA approach. MDA includes functions for the Pickands, Einmal-Decker-deHaan, and Hill estimators together with several plot variants. \cr Maximum Domain of Attraction estimators: \tabular{ll}{ \code{hillPlot} \tab shape parameter and Hill estimate of the tail index, \cr \code{shaparmPlot} \tab variation of shape parameter with tail depth. } } \usage{ hillPlot(x, start = 15, ci = 0.95, doplot = TRUE, plottype = c("alpha", "xi"), labels = TRUE, \dots) shaparmPlot(x, p = 0.01*(1:10), xiRange = NULL, alphaRange = NULL, doplot = TRUE, plottype = c("both", "upper")) shaparmPickands(x, p = 0.05, xiRange = NULL, doplot = TRUE, plottype = c("both", "upper"), labels = TRUE, \dots) shaparmHill(x, p = 0.05, xiRange = NULL, doplot = TRUE, plottype = c("both", "upper"), labels = TRUE, \dots) shaparmDEHaan(x, p = 0.05, xiRange = NULL, doplot = TRUE, plottype = c("both", "upper"), labels = TRUE, \dots) } \arguments{ \item{alphaRange, xiRange}{ [shaparmPlot] - \cr plotting ranges for \code{alpha} and \code{xi}. By default the values are automatically selected. } \item{ci}{ [hillPlot] - \cr probability for asymptotic confidence band; for no confidence band set \code{ci} to zero.
} \item{doplot}{ a logical. Should the results be plotted? \cr [shaparmPlot] - \cr a vector of logicals of the same length as tails, defining for which tail depths plots should be created; by default plots will be generated for a tail depth of 5 percent. By default \code{c(FALSE, FALSE, FALSE, FALSE, TRUE, FALSE, FALSE, FALSE, FALSE, FALSE)}. } \item{labels}{ [hillPlot] - \cr whether or not axes should be labelled. } \item{plottype}{ [hillPlot] - \cr whether \code{alpha}, \code{xi} (1/alpha) or \code{quantile} (a quantile estimate) should be plotted. } \item{p}{ [qgev] - \cr a numeric vector of probabilities. [hillPlot] - \cr probability required when option \code{quantile} is chosen. } \item{start}{ [hillPlot] - \cr lowest number of order statistics at which to plot a point. } \item{x}{ [dgev][devd] - \cr a numeric vector of quantiles. \cr [gevFit] - \cr data vector. In the case of \code{method="mle"} the interpretation depends on the value of block: if no block size is specified then data are interpreted as block maxima; if block size is set, then data are interpreted as raw data and block maxima are calculated. \cr [hillPlot][shaparmPlot] - \cr the data from which to calculate the shape parameter, a numeric vector. \cr [print][plot] - \cr a fitted object of class \code{"gevFit"}. } \item{\dots}{ [gevFit] - \cr control parameters optionally passed to the optimization function. Parameters for the optimization function are passed to components of the \code{control} argument of \code{optim}. \cr [hillPlot] - \cr other graphics parameters. \cr [plot][summary] - \cr arguments passed to the plot function. } } \value{ \code{gevSim} \cr returns a vector of data points from the simulated series. \cr \code{gevFit} \cr returns an object of class \code{gev} describing the fit. \cr \code{print.summary} \cr prints a report of the parameter fit. \cr \code{summary} \cr performs diagnostic analysis.
The method provides two different residual plots for assessing the fitted GEV model. \cr \code{gevrlevelPlot} \cr returns a vector containing the lower 95\% bound of the confidence interval, the estimated return level and the upper 95\% bound. \cr \code{hillPlot} \cr displays a plot. \cr \code{shaparmPlot} \cr returns a list with one or two entries, depending on the selection of the input variable \code{both.tails}. The two entries \code{upper} and \code{lower} determine the position of the tail. Each of the two variables is again a list with entries \code{pickands}, \code{hill}, and \code{dehaan}. If one of the three methods is discarded the printout will display zeroes. } \details{ \bold{Parameter Estimation:} \cr\cr \code{gevFit} and \code{gumbelFit} estimate the parameters either by the probability weighted moment method, \code{method="pwm"}, or by maximum log likelihood estimation, \code{method="mle"}. The summary method produces diagnostic plots for fitted GEV or Gumbel models. \cr \bold{Methods:} \cr\cr \code{print.gev}, \code{plot.gev} and \code{summary.gev} are print, plot, and summary methods for a fitted object of class \code{gev}. Concerning the summary method, the data are converted to unit exponentially distributed residuals under the null hypothesis that the GEV fits. Two diagnostics for iid exponential data are offered. The plot method provides two different residual plots for assessing the fitted GEV model. \cr \bold{Return Level Plot:} \cr\cr \code{gevrlevelPlot} calculates and plots the k-block return level and 95\% confidence interval based on a GEV model for block maxima, where \code{k} is specified by the user. The k-block return level is that level exceeded once every \code{k} blocks, on average. The GEV likelihood is reparameterized in terms of the unknown return level and profile likelihood arguments are used to construct a confidence interval.
\cr \bold{Hill Plot:} \cr\cr The function \code{hillPlot} investigates the shape parameter and plots the Hill estimate of the tail index of heavy-tailed data, or of an associated quantile estimate. This plot is usually calculated from the alpha perspective. For a generalized Pareto analysis of heavy-tailed data using the \code{gpdFit} function, it helps to plot the Hill estimates for \code{xi}. \cr \bold{Shape Parameter Plot:} \cr\cr The function \code{shaparmPlot} investigates the shape parameter and plots for the upper and lower tails the shape parameter as a function of the taildepth. Three approaches are considered, the \emph{Pickands} estimator, the \emph{Hill} estimator, and the \emph{Decker-Einmal-deHaan} estimator. } \note{ \bold{GEV Parameter Estimation:} \cr\cr If method \code{"mle"} is selected the parameter fitting in \code{gevFit} is passed to the internal function \code{gev.mle} or \code{gumbel.mle} depending on the value of \code{gumbel}, \code{FALSE} or \code{TRUE}. On the other hand, if method \code{"pwm"} is selected the parameter fitting in \code{gevFit} is passed to the internal function \code{gev.pwm} or \code{gumbel.pwm} again depending on the value of \code{gumbel}, \code{FALSE} or \code{TRUE}. } \references{ Coles S. (2001); \emph{Introduction to Statistical Modelling of Extreme Values}, Springer. Embrechts, P., Klueppelberg, C., Mikosch, T. (1997); \emph{Modelling Extremal Events}, Springer. } \author{ Alec Stephenson for R's \code{evd} and \code{evir} package, and \cr Diethelm Wuertz for this \R-port. 
} \examples{ ## Load Data: x = as.timeSeries(data(danishClaims)) colnames(x) <- "Danish" head(x) ## hillPlot - # Hill plot of heavy-tailed Danish fire insurance data par(mfrow = c(1, 1)) hillPlot(x, plottype = "xi") grid() } \keyword{models} fExtremes/man/ValueAtRisk.Rd0000644000176000001440000000250211370220751015477 0ustar ripleyusers\name{ValueAtRisk} \alias{ValueAtRisk} \alias{VaR} \alias{CVaR} \title{Value-at-Risk} \description{ A collection and description of functions to compute Value-at-Risk and conditional Value-at-Risk. \cr The functions are: \tabular{ll}{ \code{VaR} \tab Computes Value-at-Risk, \cr \code{CVaR} \tab Computes conditional Value-at-Risk. } } \usage{ VaR(x, alpha = 0.05, type = "sample", tail = c("lower", "upper")) CVaR(x, alpha = 0.05, type = "sample", tail = c("lower", "upper")) } \arguments{ \item{x}{ a uni- or multivariate timeSeries object. } \item{alpha}{ a numeric value, the confidence level. } \item{type}{ a character string, the type used to calculate the value-at-risk. } \item{tail}{ a character string denoting which tail will be considered, either \code{"lower"} or \code{"upper"}. If \code{tail="lower"}, then alpha will be converted to \code{alpha=1-alpha}. } } \value{ \code{VaR}\cr \code{CVaR}\cr \cr return a numeric vector or value with the (conditional) value-at-risk for each time series column. } \seealso{ \code{hillPlot}, \code{gevFit}. } \author{ Diethelm Wuertz for this \R-port. } \keyword{models} fExtremes/man/DataPreprocessing.Rd0000644000176000001440000001356211370220751016730 0ustar ripleyusers\name{DataPreprocessing} \alias{DataPreprocessing} \alias{blockMaxima} \alias{findThreshold} \alias{pointProcess} \alias{deCluster} \title{Extremes Data Preprocessing} \description{ A collection and description of functions for data preprocessing of extreme values. This includes tools to separate data beyond a threshold value, to compute blockwise data like block maxima, and to decluster point process data.
\cr The functions are: \tabular{ll}{ \code{blockMaxima} \tab Block Maxima from a vector or a time series, \cr \code{findThreshold} \tab Upper threshold for a given number of extremes, \cr \code{pointProcess} \tab Peaks over Threshold from a vector or a time series, \cr \code{deCluster} \tab Declusters clustered point process data. } } \usage{ blockMaxima(x, block = c("monthly", "quarterly"), doplot = FALSE) findThreshold(x, n = floor(0.05*length(as.vector(x))), doplot = FALSE) pointProcess(x, u = quantile(x, 0.95), doplot = FALSE) deCluster(x, run = 20, doplot = TRUE) } \arguments{ \item{block}{ the block size. A numeric value is interpreted as the number of data values in each successive block. All the data is used, so the last block may not contain \code{block} observations. If the \code{data} has a \code{times} attribute containing (in an object of class \code{"POSIXct"}, or an object that can be converted to that class, see \code{\link{as.POSIXct}}) the times/dates of each observation, then \code{block} may instead take the character values \code{"month"}, \code{"quarter"}, \code{"semester"} or \code{"year"}. By default monthly blocks from daily data are assumed. } \item{doplot}{ a logical value. Should the results be plotted? By default \code{TRUE}. } \item{n}{ a numeric value or vector giving number of extremes above the threshold. By default, \code{n} is set to an integer representing 5\% of the data from the whole data set \code{x}. } \item{run}{ parameter to be used in the runs method; any two consecutive threshold exceedances separated by more than this number of observations/days are considered to belong to different clusters. } \item{u}{ a numeric value at which level the data are to be truncated. By default the threshold value which belongs to the 95\% quantile, \code{u=quantile(x,0.95)}. } \item{x}{ a numeric data vector from which \code{findThreshold} and \code{blockMaxima} determine the threshold values and block maxima values. 
For the function \code{deCluster} the argument \code{x} represents a numeric vector of threshold exceedances with a \code{times} attribute which should be a numeric vector containing either the indices or the times/dates of each exceedance (if times/dates, the attribute should be an object of class \code{"POSIXct"} or an object that can be converted to that class; see \code{\link{as.POSIXct}}). } } \details{ \bold{Computing Block Maxima:} \cr\cr The function \code{blockMaxima} calculates block maxima from a vector or a time series, whereas the function \code{blocks} is more general and allows for the calculation of an arbitrary function \code{FUN} on blocks. \cr \bold{Finding Thresholds:} \cr\cr The function \code{findThreshold} finds a threshold so that a given number of extremes lie above. When the data are tied a threshold is found so that at least the specified number of extremes lie above. \cr \bold{De-Clustering Point Processes:} \cr\cr The function \code{deCluster} declusters clustered point process data so that Poisson assumption is more tenable over a high threshold. } \value{ \code{blockMaxima} \cr returns a timeSeries object or a numeric vector of block maxima data. \code{findThreshold} \cr returns a numeric value or vector of suitable thresholds. \code{pointProcess} \cr returns a timeSeries object or a numeric vector of peaks over a threshold. \code{deCluster} \cr returns a timeSeries object or a numeric vector for the declustered point process. } \references{ Coles S. (2001); \emph{Introduction to Statistical Modelling of Extreme Values}, Springer. Embrechts, P., Klueppelberg, C., Mikosch, T. (1997); \emph{Modelling Extremal Events}, Springer. 
} \author{ Some of the functions were implemented from Alec Stephenson's R-package \code{evir} ported from Alexander McNeil's S library \code{EVIS}, \emph{Extreme Values in S}, some from Alec Stephenson's R-package \code{ismev} based on Stuart Coles code from his book, \emph{Introduction to Statistical Modeling of Extreme Values} and some were written by Diethelm Wuertz. } \examples{ ## findThreshold - # Threshold giving (at least) fifty exceedances for Danish data: x = as.timeSeries(data(danishClaims)) findThreshold(x, n = c(10, 50, 100)) ## blockMaxima - # Block Maxima (Minima) for left tail of BMW log returns: BMW = as.timeSeries(data(bmwRet)) colnames(BMW) = "BMW.RET" head(BMW) x = blockMaxima( BMW, block = 65) head(x) y = blockMaxima(-BMW, block = 65) head(y) y = blockMaxima(-BMW, block = "monthly") head(y) ## pointProcess - # Return Values above threshold in negative BMW log-return data: PP = pointProcess(x = -BMW, u = quantile(as.vector(x), 0.75)) PP nrow(PP) ## deCluster - # Decluster the 200 exceedances of a particular DC = deCluster(x = PP, run = 15, doplot = TRUE) DC nrow(DC) } \keyword{programming} fExtremes/man/GpdDistribution.Rd0000644000176000001440000000764511370220751016432 0ustar ripleyusers\name{GpdDistribution} \alias{GpdDistribution} \alias{dgpd} \alias{pgpd} \alias{qgpd} \alias{rgpd} \alias{gpdMoments} \alias{gpdSlider} \title{Generalized Pareto Distribution} \description{ A collection and description of functions to compute the generalized Pareto distribution. The functions compute density, distribution function, quantile function and generate random deviates for the GPD. In addition functions to compute the true moments and to display the distribution and random variates changing parameters interactively are available. 
\cr The GPD distribution functions are: \tabular{ll}{ \code{dgpd} \tab Density of the GPD Distribution, \cr \code{pgpd} \tab Probability function of the GPD Distribution, \cr \code{qgpd} \tab Quantile function of the GPD Distribution, \cr \code{rgpd} \tab random variates from the GPD distribution, \cr \code{gpdMoments} \tab computes true mean and variance, \cr \code{gpdSlider} \tab displays density or rvs from a GPD.} } \usage{ dgpd(x, xi = 1, mu = 0, beta = 1, log = FALSE) pgpd(q, xi = 1, mu = 0, beta = 1, lower.tail = TRUE) qgpd(p, xi = 1, mu = 0, beta = 1, lower.tail = TRUE) rgpd(n, xi = 1, mu = 0, beta = 1) gpdMoments(xi = 1, mu = 0, beta = 1) gpdSlider(method = c("dist", "rvs")) } \arguments{ \item{log}{ a logical, if \code{TRUE}, the log density is returned. } \item{lower.tail}{ a logical, if \code{TRUE}, the default, then probabilities are \code{P[X <= x]}, otherwise, \code{P[X > x]}. } \item{method}{ [gpdSlider] - \cr a character string denoting what should be displayed. Either the density \code{"dist"} or random variates \code{"rvs"}. } \item{n}{ [rgpd][gpdSim] - \cr the number of observations to be generated. } \item{p}{ a vector of probability levels, the desired probability for the quantile estimate (e.g. 0.99 for the 99th percentile). } \item{q}{ [pgpd] - \cr a numeric vector of quantiles. } \item{x}{ [dgpd] - \cr a numeric vector of quantiles. } \item{xi, mu, beta}{ \code{xi} is the shape parameter, \code{mu} the location parameter, and \code{beta} is the scale parameter. } } \value{ All values are numeric vectors: \cr \code{d*} returns the density, \cr \code{p*} returns the probability, \cr \code{q*} returns the quantiles, and \cr \code{r*} generates random deviates. } \author{ Alec Stephenson for the functions from R's \code{evd} package, \cr Alec Stephenson for the functions from R's \code{evir} package, \cr Alexander McNeil for the EVIS functions underlying the \code{evir} package, \cr Diethelm Wuertz for this \R-port.
} \references{ Embrechts, P., Klueppelberg, C., Mikosch, T. (1997); \emph{Modelling Extremal Events}, Springer. } \examples{ ## rgpd - par(mfrow = c(2, 2), cex = 0.7) r = rgpd(n = 1000, xi = 1/4) plot(r, type = "l", col = "steelblue", main = "GPD Series") grid() ## dgpd - # Plot empirical density and compare with true density: # Omit values greater than 500 from plot hist(r, n = 50, probability = TRUE, xlab = "r", col = "steelblue", border = "white", xlim = c(-1, 5), ylim = c(0, 1.1), main = "Density") box() x = seq(-5, 5, by = 0.01) lines(x, dgpd(x, xi = 1/4), col = "orange") ## pgpd - # Plot df and compare with true df: plot(sort(r), (1:length(r)/length(r)), xlim = c(-3, 6), ylim = c(0, 1.1), pch = 19, cex = 0.5, ylab = "p", xlab = "q", main = "Probability") grid() q = seq(-5, 5, by = 0.1) lines(q, pgpd(q, xi = 1/4), col = "steelblue") ## qgpd - # Compute quantiles, a test: qgpd(pgpd(seq(-1, 5, 0.25), xi = 1/4 ), xi = 1/4) } \keyword{distribution} fExtremes/man/ExtremeIndex.Rd0000644000176000001440000001365211370220751015714 0ustar ripleyusers\name{ExtremeIndex} \alias{ExtremeIndex} \alias{fTHETA} \alias{fTHETA-class} \alias{show,fTHETA-method} \alias{thetaSim} \alias{blockTheta} \alias{clusterTheta} \alias{runTheta} \alias{ferrosegersTheta} \alias{exindexPlot} \alias{exindexesPlot} \title{Extremal Index Estimation} \description{ A collection and description of functions to simulate time series with a known extremal index, and to estimate the extremal index by four different kind of methods, the blocks method, the reciprocal mean cluster size method, the runs method, and the method of Ferro and Segers. 
\cr The functions are: \tabular{ll}{ \code{thetaSim} \tab Simulates a time series with known theta, \cr \code{blockTheta} \tab Computes theta from Block Method, \cr \code{clusterTheta} \tab Computes theta from Reciprocal Cluster Method, \cr \code{runTheta} \tab Computes theta from Run Method, \cr \code{ferrosegersTheta} \tab Computes Theta according to Ferro and Segers, \cr \code{exindexPlot} \tab Calculate and Plot Theta(1,2,3), \cr \code{exindexesPlot} \tab Calculate Theta(1,2) and Plot Theta(1). } } \usage{ \S4method{show}{fTHETA}(object) thetaSim(model = c("max", "pair"), n = 1000, theta = 0.5) blockTheta(x, block = 22, quantiles = seq(0.950, 0.995, length = 10), title = NULL, description = NULL) clusterTheta(x, block = 22, quantiles = seq(0.950, 0.995, length = 10), title = NULL, description = NULL) runTheta(x, block = 22, quantiles = seq(0.950, 0.995, length = 10), title = NULL, description = NULL) ferrosegersTheta(x, quantiles = seq(0.950, 0.995, length = 10), title = NULL, description = NULL) exindexPlot(x, block = c("monthly", "quarterly"), start = 5, end = NA, doplot = TRUE, plottype = c("thresh", "K"), labels = TRUE, \dots) exindexesPlot(x, block = 22, quantiles = seq(0.950, 0.995, length = 10), doplot = TRUE, labels = TRUE, \dots) } \arguments{ \item{block}{ [*Theta] - \cr an integer value, the block size. Currently only integer-specified block sizes are supported. \cr [exindex*Plot] - \cr the block size. Either \code{"monthly"}, \code{"quarterly"} or an integer value. An integer value is interpreted as the number of data values in each successive block. The default value is \code{"monthly"}, which corresponds for daily data to approximately 22-day periods. } \item{description}{ a character string which allows for a brief description. } \item{doplot}{ a logical, should the results be plotted? } \item{labels}{ whether or not axes should be labelled. If set to \code{FALSE} then user-specified labels can be passed through the \code{"..."} argument.
} \item{model}{ [thetaSim] - \cr a character string denoting the name of the model. Either \code{"max"} or \code{"pair"}, the first representing the maximum Frechet series, and the second the paired exponential series. } \item{n}{ [thetaSim] - \cr an integer value, the length of the time series to be generated. } \item{object}{ an object of class \code{"fTHETA"} as returned by the functions \code{*Theta}. } \item{plottype}{ [exindexPlot] - \cr whether the plot is to be by increasing threshold (\code{thresh}) or increasing K value (\code{K}). } \item{quantiles}{ [exindexesPlot] - \cr a numeric vector of quantile values. } \item{start, end}{ [exindexPlot] - \cr \code{start} is the lowest value of \code{K} at which to plot a point, and \code{end} the highest value; \code{K} is the number of blocks in which a specified threshold is exceeded. } \item{theta}{ [thetaSim] - \cr a numeric value between 0 and 1 setting the value of the extremal index for the maximum Frechet time series. (Not used in the case of the paired exponential series.) } \item{title}{ a character string which allows for a project title. } \item{x}{ a 'timeSeries' object or any other object which can be transformed by the function \code{as.vector} into a numeric vector. \code{"monthly"} and \code{"quarterly"} blocks require \code{x} to be an object of class \code{"timeSeries"}. } \item{\dots}{ additional arguments passed to the plot function. } } \value{ \code{exindexPlot} \cr returns a data frame of results with the following columns: \code{N}, \code{K}, \code{un}, \code{theta2}, and \code{theta}. A plot with \code{K} on the lower x-axis and threshold values on the upper x-axis versus the extremal index is displayed. \code{exindexesPlot} \cr returns a data.frame with four columns: \code{thresholds}, \code{theta1}, \code{theta2}, and \code{theta3}. A plot of the extremal indexes versus the quantiles on the x-axis is displayed. } \references{ Embrechts, P., Klueppelberg, C., Mikosch, T.
(1997); \emph{Modelling Extremal Events}, Springer. Chapter 8, 413--429. } \seealso{ \code{hillPlot}, \code{gevFit}. } \author{ Alexander McNeil, for parts of the \code{exindexPlot} function, and \cr Diethelm Wuertz for the \code{exindexesPlot} function. } \examples{ ## Extremal Index for the right and left tails ## of the BMW log returns: data(bmwRet) par(mfrow = c(2, 2), cex = 0.7) exindexPlot( as.timeSeries(bmwRet), block = "quarterly") exindexPlot(-as.timeSeries(bmwRet), block = "quarterly") ## Extremal Index for the right and left tails ## of the BMW log returns: exindexesPlot( as.timeSeries(bmwRet), block = 65) exindexesPlot(-as.timeSeries(bmwRet), block = 65) } \keyword{hplot} fExtremes/man/data.Rd \name{TimeSeriesData} \alias{TimeSeriesData} \alias{bmwRet} \alias{danishClaims} \title{Time Series Data Sets} \description{ Data sets used in the examples of the timeSeries package. } fExtremes/man/GevDistribution.Rd \name{GevDistribution} \alias{GevDistribution} \alias{dgev} \alias{pgev} \alias{qgev} \alias{rgev} \alias{gevMoments} \alias{gevSlider} \title{Generalized Extreme Value Distribution} \description{ Density, distribution function, quantile function, random number generation, and true moments for the GEV including the Frechet, Gumbel, and Weibull distributions.
\cr The GEV distribution functions are: \tabular{ll}{ \code{dgev} \tab density of the GEV distribution, \cr \code{pgev} \tab probability function of the GEV distribution, \cr \code{qgev} \tab quantile function of the GEV distribution, \cr \code{rgev} \tab random variates from the GEV distribution, \cr \code{gevMoments} \tab computes true mean and variance, \cr \code{gevSlider} \tab displays density or rvs from a GEV.} } \usage{ dgev(x, xi = 1, mu = 0, beta = 1, log = FALSE) pgev(q, xi = 1, mu = 0, beta = 1, lower.tail = TRUE) qgev(p, xi = 1, mu = 0, beta = 1, lower.tail = TRUE) rgev(n, xi = 1, mu = 0, beta = 1) gevMoments(xi = 0, mu = 0, beta = 1) gevSlider(method = c("dist", "rvs")) } \arguments{ \item{log}{ a logical, if \code{TRUE}, the log density is returned. } \item{lower.tail}{ a logical, if \code{TRUE}, the default, then probabilities are \code{P[X <= x]}, otherwise, \code{P[X > x]}. } \item{method}{ a character string denoting what should be displayed: either the density, \code{"dist"}, or random variates, \code{"rvs"}. } \item{n}{ the number of observations. } \item{p}{ a numeric vector of probabilities. [hillPlot] - \cr probability required when option \code{quantile} is chosen. } \item{q}{ a numeric vector of quantiles. } \item{x}{ a numeric vector of quantiles. } \item{xi, mu, beta}{ \code{xi} is the shape parameter, \code{mu} the location parameter, and \code{beta} is the scale parameter. The default values are \code{xi=1}, \code{mu=0}, and \code{beta=1}. Note, if \code{xi=0} the distribution is of type Gumbel. } } \value{ \code{d*} returns the density, \cr \code{p*} returns the probability, \cr \code{q*} returns the quantiles, and \cr \code{r*} generates random variates. \cr All values are numeric vectors. } \references{ Coles S. (2001); \emph{Introduction to Statistical Modelling of Extreme Values}, Springer. Embrechts, P., Klueppelberg, C., Mikosch, T. (1997); \emph{Modelling Extremal Events}, Springer.
} \author{ Alec Stephenson for R's \code{evd} and \code{evir} package, and \cr Diethelm Wuertz for this \R-port. } \examples{ ## rgev - # Create and plot 1000 Weibull distributed rvs: r = rgev(n = 1000, xi = -1) plot(r, type = "l", col = "steelblue", main = "Weibull Series") grid() ## dgev - # Plot empirical density and compare with true density: hist(r[abs(r)<10], nclass = 25, freq = FALSE, xlab = "r", xlim = c(-5,5), ylim = c(0,1.1), main = "Density") box() x = seq(-5, 5, by = 0.01) lines(x, dgev(x, xi = -1), col = "steelblue") ## pgev - # Plot df and compare with true df: plot(sort(r), (1:length(r)/length(r)), xlim = c(-3, 6), ylim = c(0, 1.1), cex = 0.5, ylab = "p", xlab = "q", main = "Probability") grid() q = seq(-5, 5, by = 0.1) lines(q, pgev(q, xi = -1), col = "steelblue") ## qgev - # Compute quantiles, a test: qgev(pgev(seq(-5, 5, 0.25), xi = -1), xi = -1) ## gevMoments: # Returns true mean and variance: gevMoments(xi = 0, mu = 0, beta = 1) ## Slider: # gevSlider(method = "dist") # gevSlider(method = "rvs") } \keyword{models} fExtremes/man/GpdRisk.Rd \name{gpdRisk} \alias{gpdRisk} \alias{gpdQPlot} \alias{gpdQuantPlot} \alias{gpdSfallPlot} \alias{gpdShapePlot} \alias{gpdTailPlot} \alias{gpdRiskMeasures} \alias{tailPlot} \alias{tailSlider} \alias{tailRisk} \title{GPD Distributions for Extreme Value Theory} \description{ A collection and description of functions to compute tail risk under the GPD approach.
\cr The GPD modelling functions are: \tabular{ll}{ \code{gpdQPlot} \tab estimation of high quantiles, \cr \code{gpdQuantPlot} \tab variation of high quantiles with threshold, \cr \code{gpdRiskMeasures} \tab prescribed quantiles and expected shortfalls, \cr \code{gpdSfallPlot} \tab expected shortfall with confidence intervals, \cr \code{gpdShapePlot} \tab variation of shape with threshold, \cr \code{gpdTailPlot} \tab plot of the tail, \cr \code{tailPlot} \tab tail plot with quantile and shortfall estimates, \cr \code{tailSlider} \tab interactive display of the tail fit, \cr \code{tailRisk} \tab tail risk measures. } } \usage{ gpdQPlot(x, p = 0.99, ci = 0.95, type = c("likelihood", "wald"), like.num = 50) gpdQuantPlot(x, p = 0.99, ci = 0.95, models = 30, start = 15, end = 500, doplot = TRUE, plottype = c("normal", "reverse"), labels = TRUE, \dots) gpdSfallPlot(x, p = 0.99, ci = 0.95, like.num = 50) gpdShapePlot(x, ci = 0.95, models = 30, start = 15, end = 500, doplot = TRUE, plottype = c("normal", "reverse"), labels = TRUE, \dots) gpdTailPlot(object, plottype = c("xy", "x", "y", ""), doplot = TRUE, extend = 1.5, labels = TRUE, \dots) gpdRiskMeasures(object, prob = c(0.99, 0.995, 0.999, 0.9995, 0.9999)) tailPlot(object, p = 0.99, ci = 0.95, nLLH = 25, extend = 1.5, grid = TRUE, labels = TRUE, \dots) tailSlider(x) tailRisk(object, prob = c(0.99, 0.995, 0.999, 0.9995, 0.9999), \dots) } \arguments{ \item{ci}{ the probability for asymptotic confidence band; for no confidence band set to zero. } \item{doplot}{ a logical. Should the results be plotted? } \item{extend}{ optional argument for plots 1 and 2 expressing how far the x-axis should extend as a multiple of the largest data value. This argument must take values greater than 1 and is useful for showing estimated quantiles beyond data. } \item{grid}{ a logical flag, should grid lines be added to the plot? } \item{labels}{ optional argument for plots 1 and 2 specifying whether or not axes should be labelled. } \item{like.num}{ the number of times to evaluate the profile likelihood. } \item{models}{ the number of consecutive GPD models to be fitted. } \item{nLLH}{ the number of evaluations of the profile likelihood.
} \item{object}{ [summary] - \cr a fitted object of class \code{"gpdFit"}. } \item{p}{ a vector of probability levels, the desired probability for the quantile estimate (e.g. 0.99 for the 99th percentile). } \item{reverse}{ should the plot be by increasing threshold (\code{TRUE}) or number of extremes (\code{FALSE}). } \item{prob}{ a numeric value. } \item{plottype}{ a character string. } \item{start, end}{ the lowest and highest number of exceedances to be considered. } \item{type}{ a character string selecting the desired estimation method, either \code{"mle"} for the maximum likelihood method or \code{"pwm"} for the probability weighted moment method. By default, the first will be selected. Note, the function \code{gpd} uses \code{"ml"}. } \item{x}{ [dgpd] - \cr a numeric vector of quantiles. \cr [gpdFit] - \cr the data vector. Note, there are two different names for the first argument, \code{x} and \code{data}, depending on which function name is used, either \code{gpdFit} or the EVIS synonym \code{gpd}. \cr [print][plot] - \cr a fitted object of class \code{"gpdFit"}. } \item{\dots}{ control parameters and plot parameters optionally passed to the optimization and/or plot function. Parameters for the optimization function are passed to components of the \code{control} argument of \code{optim}. } } \value{ \code{gpdSim} \cr returns a vector of data points from the simulated series. \code{gpdFit} \cr returns an object of class \code{"gpd"} describing the fit, including parameter estimates and standard errors. \code{gpdQuantPlot} \cr invisibly returns a table of results. \code{gpdShapePlot} \cr invisibly returns a table of results. \code{gpdTailPlot} \cr invisibly returns a list object containing details of the plot. This object should be used as the first argument of \code{gpdqPlot} or \code{gpdsfallPlot} to add quantile estimates or expected shortfall estimates to the plot.
} \details{ \bold{Generalized Pareto Distribution:} \cr\cr Computes the density, distribution function, and quantile function, and generates random variates for the Generalized Pareto Distribution. \bold{Simulation:} \cr\cr \code{gpdSim} simulates data from a Generalized Pareto distribution. \cr \bold{Parameter Estimation:} \cr\cr \code{gpdFit} fits the model parameters either by the probability weighted moment method or the maximum log likelihood method. The function returns an object of class \code{"gpd"} representing the fit of a generalized Pareto model to excesses over a high threshold. The fitting functions use the probability weighted moment method if \code{method="pwm"} was selected, and the general purpose optimization function \code{optim} when maximum likelihood estimation, \code{method="mle"} or \code{method="ml"}, is chosen. \cr \bold{Methods:} \cr\cr \code{print.gpd}, \code{plot.gpd} and \code{summary.gpd} are print, plot, and summary methods for a fitted object of class \code{gpdFit}. The plot method provides four different plots for assessing the fitted GPD model. \cr \bold{gpd* Functions:} \cr\cr \code{gpdqPlot} calculates quantile estimates and confidence intervals for high quantiles above the threshold in a GPD analysis, and adds a graphical representation to an existing plot. The GPD approximation in the tail is used to estimate the quantile. The \code{"wald"} method uses the observed Fisher information matrix to calculate the confidence interval. The \code{"likelihood"} method reparametrizes the likelihood in terms of the unknown quantile and uses profile likelihood arguments to construct a confidence interval. \cr \code{gpdquantPlot} creates a plot showing how the estimate of a high quantile in the tail of a dataset based on the GPD approximation varies with threshold or number of extremes. For every model \code{gpdFit} is called. Evaluation may be slow. Confidence intervals by the Wald method may be fastest.
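The tail-quantile calculation described above can be sketched directly from the GPD tail approximation. The following is a minimal illustration, not the package's implementation; the names \code{xi}, \code{beta}, \code{u} and \code{pu} are illustrative stand-ins for a fitted shape, scale, threshold, and the sample proportion of threshold exceedances:

```r
# Invert the tail approximation F(x) = 1 - pu * (1 + xi*(x - u)/beta)^(-1/xi)
# to obtain the p-quantile (xi != 0 assumed):
gpdQuantile <- function(p, xi, beta, u, pu) {
  u + (beta / xi) * (((1 - p) / pu)^(-xi) - 1)
}
# e.g. a 99% quantile with illustrative parameter values:
gpdQuantile(p = 0.99, xi = 0.5, beta = 7, u = 10, pu = 0.05)
```

A function such as \code{gpdqPlot} additionally wraps this point estimate in a Wald or profile-likelihood confidence interval.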
\cr \code{gpdriskmeasures} makes a rapid calculation of point estimates of prescribed quantiles and expected shortfalls using the output of the function \code{gpdFit}. This function simply calculates point estimates and (at present) makes no attempt to calculate confidence intervals for the risk measures. If confidence levels are required, use \code{gpdqPlot} and \code{gpdsfallPlot}, which interact with graphs of the tail of a loss distribution and are much slower. \cr \code{gpdsfallPlot} calculates expected shortfall estimates, in other words the tail conditional expectation, and confidence intervals for high quantiles above the threshold in a GPD analysis. A graphical representation is added to an existing plot. Expected shortfall is the expected size of the loss, given that a particular quantile of the loss distribution is exceeded. The GPD approximation in the tail is used to estimate expected shortfall. The likelihood is reparametrised in terms of the unknown expected shortfall and profile likelihood arguments are used to construct a confidence interval. \cr \code{gpdshapePlot} creates a plot showing how the estimate of shape varies with threshold or number of extremes. For every model \code{gpdFit} is called. Evaluation may be slow. \cr \code{gpdtailPlot} produces a plot of the tail of the underlying distribution of the data. } \references{ Embrechts, P., Klueppelberg, C., Mikosch, T. (1997); \emph{Modelling Extremal Events}, Springer. Hosking J.R.M., Wallis J.R., (1987); \emph{Parameter and quantile estimation for the generalized Pareto distribution}, Technometrics 29, 339--349. } \author{ Alec Stephenson for the functions from R's \code{evd} and \code{evir} packages, \cr Alexander McNeil for the EVIS functions underlying the \code{evir} package, \cr Diethelm Wuertz for this \R-port.
} \examples{ ## Load Data: danish = as.timeSeries(data(danishClaims)) ## Tail Plot: x = as.timeSeries(data(danishClaims)) fit = gpdFit(x, u = 10) tailPlot(fit) ## Try Tail Slider: # tailSlider(x) ## Tail Risk: tailRisk(fit) } \keyword{distribution} fExtremes/man/GevRisk.Rd \name{GevRisk} \alias{GevRisk} \alias{gevrlevelPlot} \title{Generalized Extreme Value Modelling} \description{ A collection and description of functions to estimate the parameters of the GEV distribution. To model the GEV, three types of approaches for parameter estimation are provided: maximum likelihood estimation, the probability weighted moment method, and estimation by the MDA approach. MDA includes functions for the Pickands, Einmal-Decker-deHaan, and Hill estimators together with several plot variants. \cr The GEV modelling functions are: \tabular{ll}{ \code{gevrlevelPlot} \tab k-block return level with confidence intervals. } } \usage{ gevrlevelPlot(object, kBlocks = 20, ci = c(0.90, 0.95, 0.99), plottype = c("plot", "add"), labels = TRUE,...) } \arguments{ \item{add}{ [gevrlevelPlot] - \cr whether the return level should be added graphically to a time series plot; if \code{FALSE} a graph of the profile likelihood curve showing the return level and its confidence interval is produced. } \item{ci}{ [hillPlot] - \cr probability for asymptotic confidence band; for no confidence band set \code{ci} to zero. } \item{kBlocks}{ [gevrlevelPlot] - \cr specifies the particular return level to be estimated; default set arbitrarily to 20. } \item{labels}{ [hillPlot] - \cr whether or not axes should be labelled. } \item{object}{ [summary][gevrlevelPlot] - \cr a fitted object of class \code{"gevFit"}. } \item{plottype}{ [hillPlot] - \cr whether \code{alpha}, \code{xi} (1/alpha) or \code{quantile} (a quantile estimate) should be plotted. } \item{\dots}{ arguments passed to the plot function.
} } \value{ \code{gevSim} \cr returns a vector of data points from the simulated series. \cr \code{gevFit} \cr returns an object of class \code{gev} describing the fit. \cr \code{print.summary} \cr prints a report of the parameter fit. \cr \code{summary} \cr performs diagnostic analysis. The method provides two different residual plots for assessing the fitted GEV model. \cr \code{gevrlevelPlot} \cr returns a vector containing the lower 95\% bound of the confidence interval, the estimated return level and the upper 95\% bound. \cr \code{hillPlot} \cr displays a plot. \cr \code{shaparmPlot} \cr returns a list with one or two entries, depending on the selection of the input variable \code{both.tails}. The two entries \code{upper} and \code{lower} determine the position of the tail. Each of the two variables is again a list with entries \code{pickands}, \code{hill}, and \code{dehaan}. If one of the three methods will be discarded the printout will display zeroes. } \details{ \bold{Parameter Estimation:} \cr\cr \code{gevFit} and \code{gumbelFit} estimate the parameters either by the probability weighted moment method, \code{method="pwm"} or by maximum log likelihood estimation \code{method="mle"}. The summary method produces diagnostic plots for fitted GEV or Gumbel models. \cr \bold{Methods:} \cr\cr \code{print.gev}, \code{plot.gev} and \code{summary.gev} are print, plot, and summary methods for a fitted object of class \code{gev}. Concerning the summary method, the data are converted to unit exponentially distributed residuals under null hypothesis that GEV fits. Two diagnostics for iid exponential data are offered. The plot method provides two different residual plots for assessing the fitted GEV model. Two diagnostics for iid exponential data are offered. \cr \bold{Return Level Plot:} \cr\cr \code{gevrlevelPlot} calculates and plots the k-block return level and 95\% confidence interval based on a GEV model for block maxima, where \code{k} is specified by the user. 
The k-block return level is that level exceeded once every \code{k} blocks, on average. The GEV likelihood is reparameterized in terms of the unknown return level and profile likelihood arguments are used to construct a confidence interval. \cr \bold{Hill Plot:} \cr\cr The function \code{hillPlot} investigates the shape parameter and plots the Hill estimate of the tail index of heavy-tailed data, or of an associated quantile estimate. This plot is usually calculated from the alpha perspective. For a generalized Pareto analysis of heavy-tailed data using the \code{gpdFit} function, it helps to plot the Hill estimates for \code{xi}. \cr \bold{Shape Parameter Plot:} \cr\cr The function \code{shaparmPlot} investigates the shape parameter and plots for the upper and lower tails the shape parameter as a function of the taildepth. Three approaches are considered, the \emph{Pickands} estimator, the \emph{Hill} estimator, and the \emph{Decker-Einmal-deHaan} estimator. } \note{ \bold{GEV Parameter Estimation:} \cr\cr If method \code{"mle"} is selected the parameter fitting in \code{gevFit} is passed to the internal function \code{gev.mle} or \code{gumbel.mle} depending on the value of \code{gumbel}, \code{FALSE} or \code{TRUE}. On the other hand, if method \code{"pwm"} is selected the parameter fitting in \code{gevFit} is passed to the internal function \code{gev.pwm} or \code{gumbel.pwm} again depending on the value of \code{gumbel}, \code{FALSE} or \code{TRUE}. } \references{ Coles S. (2001); \emph{Introduction to Statistical Modelling of Extreme Values}, Springer. Embrechts, P., Klueppelberg, C., Mikosch, T. (1997); \emph{Modelling Extremal Events}, Springer. } \author{ Alec Stephenson for R's \code{evd} and \code{evir} package, and \cr Diethelm Wuertz for this \R-port. 
} \examples{ ## Load Data: # BMW Stock Data - negative returns x = -as.timeSeries(data(bmwRet)) colnames(x) <- "BMW" head(x) ## gevFit - # Fit GEV to monthly Block Maxima: fit = gevFit(x, block = "month") print(fit) ## gevrlevelPlot - # Return Level Plot: gevrlevelPlot(fit) } \keyword{models} fExtremes/man/GevModelling.Rd \name{GevModelling} \alias{GevModelling} \alias{fGEVFIT} \alias{fGEVFIT-class} \alias{show,fGEVFIT-method} \alias{gevSim} \alias{gumbelSim} \alias{gevFit} \alias{gumbelFit} \alias{plot.fGEVFIT} \alias{summary.fGEVFIT} \title{Generalized Extreme Value Modelling} \description{ A collection and description of functions to estimate the parameters of the GEV distribution. To model the GEV, three types of approaches for parameter estimation are provided: maximum likelihood estimation, the probability weighted moment method, and estimation by the MDA approach. MDA includes functions for the Pickands, Einmal-Decker-deHaan, and Hill estimators together with several plot variants. \cr The GEV modelling functions are: \tabular{ll}{ \code{gevSim} \tab generates data from the GEV distribution, \cr \code{gumbelSim} \tab generates data from the Gumbel distribution, \cr \code{gevFit} \tab fits data to the GEV distribution, \cr \code{gumbelFit} \tab fits data to the Gumbel distribution, \cr \code{print} \tab print method for a fitted GEV object, \cr \code{plot} \tab plot method for a fitted GEV object, \cr \code{summary} \tab summary method for a fitted GEV object, \cr \code{gevrlevelPlot} \tab k-block return level with confidence intervals.
} } \usage{ gevSim(model = list(xi = -0.25, mu = 0, beta = 1), n = 1000, seed = NULL) gumbelSim(model = list(mu = 0, beta = 1), n = 1000, seed = NULL) gevFit(x, block = 1, type = c("mle", "pwm"), title = NULL, description = NULL, \dots) gumbelFit(x, block = 1, type = c("mle", "pwm"), title = NULL, description = NULL, \dots) \S4method{show}{fGEVFIT}(object) \method{plot}{fGEVFIT}(x, which = "ask", \dots) \method{summary}{fGEVFIT}(object, doplot = TRUE, which = "all", \dots) } \arguments{ \item{block}{ block size. } \item{description}{ a character string which allows for a brief description. } \item{doplot}{ a logical. Should the results be plotted? \cr [shaparmPlot] - \cr a vector of logicals of the same length as tails, defining for which tail depths plots should be created; by default plots will be generated for a tail depth of 5 percent. By default \code{c(FALSE, FALSE, FALSE, FALSE, TRUE, FALSE, FALSE, FALSE, FALSE, FALSE)}. } \item{model}{ [gevSim][gumbelSim] - \cr a list with components \code{xi}, \code{mu} and \code{beta} giving the parameters of the GEV distribution. By default the shape parameter has the value -0.25, the location is zero and the scale is one. To fit random deviates from a Gumbel distribution set \code{xi=0}. } \item{n}{ [gevSim][gumbelSim] - \cr number of generated data points, an integer value. \cr [rgev] - \cr the number of observations. } \item{object}{ [summary][gevrlevelPlot] - \cr a fitted object of class \code{"gevFit"}. } \item{seed}{ [gevSim] - \cr an integer value to set the seed for the random number generator. } \item{title}{ [gevFit] - \cr a character string which allows for a project title. } \item{type}{ a character string denoting the type of parameter estimation, either by maximum likelihood estimation \code{"mle"}, the default value, or by the probability weighted moment method \code{"pwm"}. } \item{which}{ [plot][summary] - \cr a vector of logicals, one for each plot, denoting which plot should be displayed.
Alternatively, if \code{which="ask"} the user will be interactively asked which of the plots should be displayed. By default \code{which="all"}. } \item{x}{ [dgev][devd] - \cr a numeric vector of quantiles. \cr [gevFit] - \cr data vector. In the case of \code{method="mle"} the interpretation depends on the value of block: if no block size is specified then data are interpreted as block maxima; if block size is set, then data are interpreted as raw data and block maxima are calculated. \cr [hillPlot][shaparmPlot] - \cr the data from which to calculate the shape parameter, a numeric vector. \cr [print][plot] - \cr a fitted object of class \code{"gevFit"}. } \item{xi, mu, beta}{ [*gev] - \cr \code{xi} is the shape parameter, \code{mu} the location parameter, and \code{beta} is the scale parameter. The default values are \code{xi=1}, \code{mu=0}, and \code{beta=1}. Note, if \code{xi=0} the distribution is of type Gumbel. } \item{\dots}{ [gevFit] - \cr control parameters optionally passed to the optimization function. Parameters for the optimization function are passed to components of the \code{control} argument of \code{optim}. \cr [hillPlot] - \cr other graphics parameters. \cr [plot][summary] - \cr arguments passed to the plot function. } } \value{ \code{gevSim} \cr returns a vector of data points from the simulated series. \cr \code{gevFit} \cr returns an object of class \code{gev} describing the fit. \cr \code{print.summary} \cr prints a report of the parameter fit. \cr \code{summary} \cr performs diagnostic analysis. The method provides two different residual plots for assessing the fitted GEV model. \cr \code{gevrlevelPlot} \cr returns a vector containing the lower 95\% bound of the confidence interval, the estimated return level and the upper 95\% bound. \cr \code{hillPlot} \cr displays a plot. \cr \code{shaparmPlot} \cr returns a list with one or two entries, depending on the selection of the input variable \code{both.tails}.
The two entries \code{upper} and \code{lower} determine the position of the tail. Each of the two variables is again a list with entries \code{pickands}, \code{hill}, and \code{dehaan}. If one of the three methods will be discarded the printout will display zeroes. } \details{ \bold{Parameter Estimation:} \cr\cr \code{gevFit} and \code{gumbelFit} estimate the parameters either by the probability weighted moment method, \code{method="pwm"} or by maximum log likelihood estimation \code{method="mle"}. The summary method produces diagnostic plots for fitted GEV or Gumbel models. \cr \bold{Methods:} \cr\cr \code{print.gev}, \code{plot.gev} and \code{summary.gev} are print, plot, and summary methods for a fitted object of class \code{gev}. Concerning the summary method, the data are converted to unit exponentially distributed residuals under null hypothesis that GEV fits. Two diagnostics for iid exponential data are offered. The plot method provides two different residual plots for assessing the fitted GEV model. Two diagnostics for iid exponential data are offered. \cr \bold{Return Level Plot:} \cr\cr \code{gevrlevelPlot} calculates and plots the k-block return level and 95\% confidence interval based on a GEV model for block maxima, where \code{k} is specified by the user. The k-block return level is that level exceeded once every \code{k} blocks, on average. The GEV likelihood is reparameterized in terms of the unknown return level and profile likelihood arguments are used to construct a confidence interval. \cr \bold{Hill Plot:} \cr\cr The function \code{hillPlot} investigates the shape parameter and plots the Hill estimate of the tail index of heavy-tailed data, or of an associated quantile estimate. This plot is usually calculated from the alpha perspective. For a generalized Pareto analysis of heavy-tailed data using the \code{gpdFit} function, it helps to plot the Hill estimates for \code{xi}. 
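A minimal sketch of the Hill estimate that such a plot is built from may help fix ideas. This is an illustration only, not the package's \code{hillPlot} code; \code{k} is the number of upper order statistics used, and conventions (k versus k+1) differ slightly between implementations:

```r
# Hill estimate of the shape parameter xi = 1/alpha from the k largest values:
hillXi <- function(x, k) {
  xs <- sort(x, decreasing = TRUE)    # order statistics, largest first
  mean(log(xs[1:k])) - log(xs[k])     # average log-excess over the k-th value
}
```

A Hill plot then simply displays \code{hillXi(x, k)} (or its reciprocal, alpha) over a range of \code{k} values.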
\cr \bold{Shape Parameter Plot:} \cr\cr The function \code{shaparmPlot} investigates the shape parameter and plots for the upper and lower tails the shape parameter as a function of the taildepth. Three approaches are considered, the \emph{Pickands} estimator, the \emph{Hill} estimator, and the \emph{Decker-Einmal-deHaan} estimator. } \note{ \bold{GEV Parameter Estimation:} \cr\cr If method \code{"mle"} is selected the parameter fitting in \code{gevFit} is passed to the internal function \code{gev.mle} or \code{gumbel.mle} depending on the value of \code{gumbel}, \code{FALSE} or \code{TRUE}. On the other hand, if method \code{"pwm"} is selected the parameter fitting in \code{gevFit} is passed to the internal function \code{gev.pwm} or \code{gumbel.pwm} again depending on the value of \code{gumbel}, \code{FALSE} or \code{TRUE}. } \references{ Coles S. (2001); \emph{Introduction to Statistical Modelling of Extreme Values}, Springer. Embrechts, P., Klueppelberg, C., Mikosch, T. (1997); \emph{Modelling Extremal Events}, Springer. } \author{ Alec Stephenson for R's \code{evd} and \code{evir} package, and \cr Diethelm Wuertz for this \R-port. 
} \examples{ ## gevSim - # Simulate GEV Data, use default length n=1000 x = gevSim(model = list(xi = 0.25, mu = 0, beta = 1), n = 1000) head(x) ## gumbelSim - # Simulate Gumbel Data, use default length n=1000 x = gumbelSim(model = list(mu = 0, beta = 1)) ## gevFit - # Fit GEV Data by Probability Weighted Moments: fit = gevFit(x, type = "pwm") print(fit) ## summary - # Summarize Results: par(mfcol = c(2, 2)) summary(fit) } \keyword{models} fExtremes/man/ExtremesData.Rd \name{ExtremesData} \alias{ExtremesData} \alias{emdPlot} \alias{qqparetoPlot} \alias{mePlot} \alias{mrlPlot} \alias{mxfPlot} \alias{msratioPlot} \alias{recordsPlot} \alias{ssrecordsPlot} \alias{sllnPlot} \alias{lilPlot} \alias{xacfPlot} \alias{normMeanExcessFit} \alias{ghMeanExcessFit} \alias{hypMeanExcessFit} \alias{nigMeanExcessFit} \alias{ghtMeanExcessFit} \title{Explorative Data Analysis} \description{ A collection and description of functions for explorative data analysis. The tools include plot functions for empirical distributions, quantile plots, graphs exploring the properties of exceedences over a threshold, plots for the mean/sum ratio and for the development of records.
\cr The functions are: \tabular{ll}{ \code{emdPlot} \tab Plot of empirical distribution function, \cr \code{qqparetoPlot} \tab Exponential/Pareto quantile plot, \cr \code{mePlot} \tab Plot of mean excesses over a threshold, \cr \code{mrlPlot} \tab another variant, mean residual life plot, \cr \code{mxfPlot} \tab another variant, with confidence intervals, \cr \code{msratioPlot} \tab Plot of the ratio of maximum and sum, \cr \code{recordsPlot} \tab Record development compared with iid data, \cr \code{ssrecordsPlot} \tab another variant, investigates subsamples, \cr \code{sllnPlot} \tab verifies Kolmogorov's strong law of large numbers, \cr \code{lilPlot} \tab verifies Hartman-Wintner's law of the iterated logarithm, \cr \code{xacfPlot} \tab ACF of exceedences over a threshold, \cr \code{normMeanExcessFit} \tab fits mean excesses with a normal density, \cr \code{ghMeanExcessFit} \tab fits mean excesses with a GH density, \cr \code{hypMeanExcessFit} \tab fits mean excesses with a HYP density, \cr \code{nigMeanExcessFit} \tab fits mean excesses with a NIG density, \cr \code{ghtMeanExcessFit} \tab fits mean excesses with a GHT density. 
} } \usage{ emdPlot(x, doplot = TRUE, plottype = c("xy", "x", "y", " "), labels = TRUE, \dots) qqparetoPlot(x, xi = 0, trim = NULL, threshold = NULL, doplot = TRUE, labels = TRUE, \dots) mePlot(x, doplot = TRUE, labels = TRUE, \dots) mrlPlot(x, ci = 0.95, umin = mean(x), umax = max(x), nint = 100, doplot = TRUE, plottype = c("autoscale", ""), labels = TRUE, \dots) mxfPlot(x, u = quantile(x, 0.05), doplot = TRUE, labels = TRUE, \dots) msratioPlot(x, p = 1:4, doplot = TRUE, labels = TRUE, \dots) recordsPlot(x, ci = 0.95, doplot = TRUE, labels = TRUE, \dots) ssrecordsPlot(x, subsamples = 10, doplot = TRUE, plottype = c("lin", "log"), labels = TRUE, \dots) sllnPlot(x, doplot = TRUE, labels = TRUE, \dots) lilPlot(x, doplot = TRUE, labels = TRUE, \dots) xacfPlot(x, u = quantile(x, 0.95), lag.max = 15, doplot = TRUE, which = c("all", 1, 2, 3, 4), labels = TRUE, \dots) normMeanExcessFit(x, doplot = TRUE, trace = TRUE, \dots) ghMeanExcessFit(x, doplot = TRUE, trace = TRUE, \dots) hypMeanExcessFit(x, doplot = TRUE, trace = TRUE, \dots) nigMeanExcessFit(x, doplot = TRUE, trace = TRUE, \dots) ghtMeanExcessFit(x, doplot = TRUE, trace = TRUE, \dots) } \arguments{ \item{ci}{ [recordsPlot] - \cr a confidence level. By default 0.95, i.e. 95\%. } \item{doplot}{ a logical value. Should the results be plotted? By default \code{TRUE}. } \item{labels}{ a logical value. Whether or not x- and y-axes should be automatically labelled and a default main title should be added to the plot. By default \code{TRUE}. } \item{lag.max}{ [xacfPlot] - \cr maximum number of lags at which to calculate the autocorrelation functions. The default value is 15. } \item{nint}{ [mrlPlot] - \cr the number of intervals, see \code{umin} and \code{umax}. The default value is 100. } \item{p}{ [msratioPlot] - \cr the power exponents, a numeric vector. By default a sequence from 1 to 4 in unit integer steps. 
} \item{plottype}{ [emdPlot] - \cr which axes should be on a log scale: \code{"x"} x-axis only; \code{"y"} y-axis only; \code{"xy"} both axes; \code{""} neither axis. \cr [msratioPlot] - \cr a character string; if set to \code{"autoscale"}, the scale of the plots is determined automatically, any other string allows user-specified scale information to be passed through the \code{\dots} argument. \cr [ssrecordsPlot] - \cr one of two options can be selected, either \code{"lin"} or \code{"log"}. The default creates a plot on a linear scale. } \item{subsamples}{ [ssrecordsPlot] - \cr the number of subsamples, by default 10, an integer value. } \item{threshold, trim}{ [qqparetoPlot][xacfPlot] - \cr numeric values at which the data are to be left-truncated (\code{threshold}) or right-truncated (\code{trim}); the threshold defaults to the 95\% quantile. } \item{trace}{ a logical flag, by default \code{TRUE}. Should the calculations be traced? } \item{u}{ a numeric value, the level at which the data are to be truncated. By default the threshold which belongs to the 95\% quantile, \code{u=quantile(x,0.95)}. } \item{umin, umax}{ [mrlPlot] - \cr range of threshold values. If \code{umin} and/or \code{umax} are not available, then by default they are set to \code{umin=mean(x)} and \code{umax=max(x)}. } \item{which}{ [xacfPlot] - \cr a numeric or character value; if \code{which="all"}, then all four plots are displayed; if \code{which} is an integer between one and four, then only the corresponding plot is displayed. } \item{x, y}{ numeric data vectors, or, in the case of \code{x}, an object to be plotted. } \item{xi}{ the shape parameter of the generalized Pareto distribution. } \item{\dots}{ additional arguments passed to the underlying fit or plot functions. } } \details{ \bold{Empirical Distribution Function:} \cr\cr The function \code{emdPlot} is a simple exploratory function. A straight line on the double log scale indicates Pareto tail behaviour. 
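The idea behind this diagnostic can be sketched in a few lines of plain R (a minimal illustration, not the \code{fExtremes} implementation; the simulated data and variable names are invented for the example): the empirical survivor function \eqn{1 - F(x)} is plotted on doubly logarithmic axes, where a Pareto tail with index \eqn{\alpha} shows up as a line of slope \eqn{-\alpha}.

```r
## Minimal sketch of the emdPlot idea (not the package code):
## plot the empirical survivor function 1 - F(x) on log-log axes.
## For a Pareto(alpha) tail, log(1 - F(x)) is approximately linear
## in log(x) with slope -alpha.
set.seed(42)
alpha <- 2
x  <- (1 - runif(1000))^(-1/alpha)               # simulated Pareto(alpha) sample, x >= 1
xs <- sort(x)
surv <- 1 - (seq_along(xs) - 0.5) / length(xs)   # empirical survivor values
plot(log(xs), log(surv), pch = ".",
     xlab = "log(x)", ylab = "log(1 - F(x))",
     main = "Empirical survivor function, log-log scale")
fit <- lm(log(surv) ~ log(xs))                   # fitted slope estimates -alpha
abline(fit, col = "red")
```

The fitted slope recovers roughly \eqn{-\alpha}; a markedly curved cloud of points instead suggests a non-Pareto tail.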
\cr \bold{Quantile--Quantile Pareto Plot:} \cr\cr \code{qqparetoPlot} creates a quantile-quantile plot for threshold data. If \code{xi} is zero, the reference distribution is the exponential; if \code{xi} is non-zero, it is the generalized Pareto distribution with shape parameter \code{xi}. In the exponential case the plot is interpreted as follows: concave departures from a straight line are a sign of heavy-tailed behaviour, convex departures of thin-tailed behaviour. \cr \bold{Mean Excess Function Plot:} \cr\cr Three variants of the mean excess function plot are available: a sample mean excess plot over increasing thresholds, and two mean excess function plots with confidence intervals for discriminating between the tails of distributions. In general, an upward trend in a mean excess function plot indicates heavy-tailed behaviour; in particular, a straight line with positive gradient above some threshold is a sign of Pareto behaviour in the tail. A downward trend indicates thin-tailed behaviour, whereas a line with zero gradient indicates an exponential tail. A hint: because the uppermost plotting points are the average of only a handful of extreme excesses, they may be omitted for a prettier plot. \code{mrlPlot} and \code{mxfPlot} investigate the upper tail; for the lower tail, reverse the sign of the data vector \code{x}. \cr \bold{Plot of the Maximum/Sum Ratio:} \cr\cr The ratio of the maximum to the sum is a simple tool for detecting heavy tails of a distribution and for giving a rough estimate of the order of its finite moments. Sharp increases in the curves of an \code{msratioPlot} are a sign of heavy-tailed behaviour. \cr \bold{Plot of the Development of Records:} \cr\cr These functions investigate the development of records in a dataset and calculate the expected behaviour for iid data. \code{recordsPlot} counts records and reports the observations at which they occur. 
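The record-counting idea and its iid benchmark can be sketched as follows (a rough illustration, not the \code{recordsPlot} code; the sample and variable names are invented for the example): in an iid sample of size \eqn{n} from a continuous distribution, the expected number of upper records is the harmonic number \eqn{H_n = 1 + 1/2 + \dots + 1/n}, which grows like \eqn{\log(n)}.

```r
## Rough sketch of record counting (not the package implementation):
## observation i is an upper record if it exceeds all previous
## observations; for iid data the expected count is the harmonic
## number H_n, approximately log(n) + 0.5772.
set.seed(1)
n <- 5000
x <- rnorm(n)
records  <- sum(x == cummax(x))    # count of upper records in the sample
expected <- sum(1 / seq_len(n))    # H_n, the iid benchmark
```

Many more records than \eqn{H_n} in real data hint at trends or dependence, which is exactly the deviation these plots are meant to reveal.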
In addition, subsamples can be investigated with the help of the function \code{ssrecordsPlot}. \cr \bold{Plot of Kolmogorov's and Hartman-Wintner's Laws:} \cr\cr The function \code{sllnPlot} verifies Kolmogorov's strong law of large numbers, and the function \code{lilPlot} verifies Hartman-Wintner's law of the iterated logarithm. \cr \bold{ACF Plot of Exceedances over a Threshold:} \cr\cr This function plots the autocorrelation functions of the heights and distances of exceedances over a threshold. \cr } \value{ The functions return a plot. } \note{ The plots are labeled by default with an x-label, a y-label and a main title. If the argument \code{labels} is set to \code{FALSE}, neither an x-label, a y-label nor a main title will be added to the graph. To add user-defined label strings, use the function \code{title(xlab="\dots", ylab="\dots", main="\dots")}. } \references{ Coles S. (2001); \emph{An Introduction to Statistical Modeling of Extreme Values}, Springer. Embrechts, P., Klueppelberg, C., Mikosch, T. (1997); \emph{Modelling Extremal Events}, Springer. } \author{ Some of the functions were implemented from Alec Stephenson's R package \code{evir}, ported from Alexander McNeil's S library \code{EVIS}, \emph{Extreme Values in S}, some from Alec Stephenson's R package \code{ismev} based on Stuart Coles' code from his book, \emph{An Introduction to Statistical Modeling of Extreme Values}, and some were written by Diethelm Wuertz. } \examples{ ## Danish fire insurance data: data(danishClaims) danishClaims = as.timeSeries(danishClaims) ## emdPlot - # Show Pareto tail behaviour: par(mfrow = c(2, 2), cex = 0.7) emdPlot(danishClaims) ## qqparetoPlot - # QQ-Plot of heavy-tailed Danish fire insurance data: qqparetoPlot(danishClaims, xi = 0.7) ## mePlot - # Sample mean excess plot of heavy-tailed Danish fire: mePlot(danishClaims) ## ssrecordsPlot - # Record fire insurance losses in Denmark: ssrecordsPlot(danishClaims, subsamples = 10) } \keyword{hplot}